Author Topic: Loading failed message without more explanation  (Read 2168 times)

dustfr

  • Guest
Loading failed message without more explanation
« on: February 07, 2014, 09:05:00 AM »
Hi all

Using CMOD 8.4.1 on AIX.
We are trying a workaround to store big files (more than 2 GB for a group);
see IBM CMOD Technote #1170676 - What is the maximum input file and document size Content Manager OnDemand supports?
The customer does not allow us to split the file as a workaround,
so we are testing the IBM workaround of using smaller groups.

Application group for the POC is POC-DA7104,
Storage: Cache Only - Library Server
Application for text "docs" is POC-NA-TXT, with compression enabled
Application for tar "docs" is POC-NA-TAR, without compression

Storing a "document" as a 2.6 GB text file works fine with 250 MB, 500 MB, or 1000 MB groups in the index:

x bigfile.ind, 2601 bytes, 6 media blocks.
x bigfile.dat, 2620000000 bytes, 5117188 media blocks.
...
INFO: Current dir is >/data1/standalone_WMMS61YAN1/ay2/ay2admaa/data/batch/load/TXT-250-bigfile.757762<
INFO: OnDemand index file is >bigfile<
/usr/lpp/ars/bin/arsload -X G -u <> -p <> -h <> -n -fv -g POC-DA7104 -a POC-NA-TXT bigfile
arsload: Processing file >bigfile<
arsload: 02/07/14 14:31:07 -- Loading started, --UNKNOWN-- bytes to process
OnDemand Load Id = >5185-1-0-71FAA-16029-16029<
Loaded 11 rows into the database
Document compression type used - OD77.  Bytes Stored = >13981281< Rows = >11<
arsload: 02/07/14 14:32:25 Loading completed
arsload: Processing successful for file >bigfile<
INFO: Return code for ARSLOAD = 0
arsload: Processing successful for file >bigfile<

Storing a "document" as a 2.6 GB tar file fails with 250 MB groups (bigger groups not yet tested):

x bigfile.ind, 2601 bytes, 6 media blocks.
x bigfile.dat, 2648657920 bytes, 5173160 media blocks.
...
INFO: Current dir is >/data1/standalone_WMMS61YAN1/ay2/ay2admaa/data/batch/load/TAR-250-bigfile.544874<
INFO: OnDemand index file is >bigfile<
/usr/lpp/ars/bin/arsload -X G -u <> -p <> -h <> -n -fv -g POC-DA7104 -a POC-NA-TAR bigfile
arsload: Processing file >bigfile<

arsload: 02/07/14 15:42:49 -- Loading started, --UNKNOWN-- bytes to process
OnDemand Load Id = >5185-1-0-75FAA-16029-16029<
Loaded 0 rows into the database
arsload: 02/07/14 15:42:54 Loading failed
arsload: Processing failed for file >bigfile<
arsload: 02/07/14 15:42:54 -- Unloading started
OnDemand UnLoad Successful - LoadId(5185-1-0-75FAA-16029-16029) Rows Deleted(0)

02/07/14 15:42:55 -- Unloading of data was successful
arsload: Processing has stopped.  The remaining files will NOT be processed.
INFO: Return code for ARSLOAD = 0


As you can see:
- the "Loading failed" message does not explain why the load failed
- the arsload return code appears to be zero
- there is no further information in the system log

According to ulimit, available physical memory, and nmon, memory is not saturated.
Loads that fail for lack of memory usually say so explicitly.
The file system is not full (less than 5% used).
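The checks above can be sketched roughly as follows (AIX commands; exact flags and output vary by platform, and /data1 is the staging file system from the load log):

```shell
# Per-process limits -- a capped fsize/data value can itself block a 2 GB+ write
ulimit -a

# Free space on the file system holding the arsload staging directory
df -k /data1

# Quick look at memory and paging activity (3 samples, 2 seconds apart)
vmstat 2 3
```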

Any idea where I should look to find out why it fails?
Regards

Justin Derrick

  • IBM Content Manager OnDemand Consultant
  • Administrator
  • Hero Member
  • Posts: 2231
  • CMOD Guru for hire...
Re: Loading failed message without more explanation
« Reply #1 on: February 08, 2014, 01:18:41 PM »
Check DB2 - sqllib/db2dump/db2diag.log.
Check TSM - 'query actlog' in the administrative client.
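A rough sketch of those two checks; the DB2 instance home and TSM admin credentials below are placeholders, so substitute your own, and the load ID comes from the failed load in the original post:

```shell
# Search the DB2 diagnostic log for entries around the failed load
# (/home/db2inst1 is a placeholder for your DB2 instance home)
grep -n '5185-1-0-75FAA' /home/db2inst1/sqllib/db2dump/db2diag.log

# Pull the TSM activity log for the failure window (placeholder credentials)
dsmadmc -id=admin -password=secret \
  "query actlog begindate=02/07/2014 begintime=15:40 endtime=15:45"
```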

It's likely that the text file loads because of compression.  Even though it's 2.6 GB, it stores only 13 MB after compression, and there are 11 documents in it, so no individual document exceeds the 2 GB mark.

What is your AG configuration for the AG that you're loading the 2.6 GB tar file into?  (Also, have you tried compressing the tar file with gzip or bzip2?)
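For the gzip experiment, something like this would show whether the tar compresses below the 2 GB document limit before re-running the load (file names follow the original post; compression ratio depends entirely on the tar's contents):

```shell
# Compress a copy so the original load input stays untouched
gzip -c bigfile.dat > bigfile.dat.gz

# Compare sizes -- the .gz needs to land under the 2 GB document limit
ls -l bigfile.dat bigfile.dat.gz
```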

-JD.
IBM CMOD Professional Services: http://TenaciousConsulting.com
Call:  +1-866-533-7742  or  eMail:  jd@justinderrick.com
IBM CMOD Wiki:  https://CMOD.wiki/
FREE IBM CMOD Education & Webinars:  https://CMOD.Training/

Interests: #AIX #Linux #Multiplatforms #DB2 #TSM #SP #Performance #Security #Audits #Customizing #Availability #HA #DR