Author Topic: Very Large Line Data File Failing  (Read 7189 times)

jeffs42885

  • Guest
Very Large Line Data File Failing
« on: December 11, 2013, 10:43:12 AM »
Hi All,

I am attempting to load a very large (1.2GB) line data file into OnDemand. I am getting the following error message in my 88 record:

arsload: 12/10/13 13:23:10 Indexing completed
arsload: 12/10/13 13:23:10 -- Loading started, 1174555350 bytes to process
OnDemand Load Id = >16227-7-0-221FAA-16044-16044<
An error occurred.  Contact your system administrator and/or consult the System Log.  File=arsadmp.c, Line=1669
Failed while attempting to load the database
The last row successfully loaded was 30000
Loaded 30000 rows into the database
arsload: 12/10/13 13:26:08 Loading failed
arsload: Processing failed for file >/od/REPORT.1386149548219.REPORT.REPORT_0.0.0.ARD<
arsload: 12/10/13 13:26:08 -- Unloading started
OnDemand UnLoad Successful - LoadId(16227-7-0-221FAA-16044-16044) Rows Deleted(30000)

I am also seeing this in the system log associated with the event id:

DB Error: [IBM][CLI Driver][DB2/AIX64] SQLSTATE 01517: A character that could not be converted was replaced with a substitute character.  -- SQLSTATE=01517, SQLCODE=0, File=arsdoc.c, Line=4216

DB Error: Warning:  Unexpected SQL_SUCCESS_WITH_INFO -- SQLSTATE=Not Defined, SQLCODE=1, File=arsdoc.c, Line=4216

Nothing in the error reports, nothing in the console. The file loads for about 8 hours and then fails. We loaded this same file approximately one month ago; it was around the same size and loaded 140k rows into the database. Ulimits are all set to unlimited.

kasim

  • Guest
Re: Very Large Line Data File Failing
« Reply #1 on: December 11, 2013, 08:37:54 PM »
Your field positions might not be consistent. Try putting a default value in each field (e.g. a space).

ewirtz

  • Full Member
  • Posts: 134
Re: Very Large Line Data File Failing
« Reply #2 on: December 11, 2013, 11:52:06 PM »
Hi,

It sounds like a code page problem. You might be trying to insert an index value with a character that is not defined in the code page.

regards

Egon

jeffs42885

  • Guest
Re: Very Large Line Data File Failing
« Reply #3 on: December 12, 2013, 06:26:05 AM »
From the business-

This file has not changed at all and ran successfully in October and November. I have also opened up a PMR for this issue.

jeffs42885

  • Guest
Re: Very Large Line Data File Failing
« Reply #4 on: December 26, 2013, 12:53:53 PM »
Just wondering if anyone has feedback on this. Not really getting anywhere with my PMR..

ewirtz

  • Full Member
  • Posts: 134
Re: Very Large Line Data File Failing
« Reply #5 on: December 27, 2013, 12:28:47 AM »
Hi,

I'm still thinking it's a code page problem.


'A character that could not be converted was replaced with a substitute character'


Did you have any change in the CMOD or DB2 version after the successful load of the report?

Please search for a character in the report that is not defined in the code page.
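
A minimal way to do that search, as a sketch only: it assumes the report is a newline-delimited single-byte file, and both code page names below are placeholders to replace with the code page the report is produced in and the code page of the OnDemand database.

  # scan_codepage.py: report records that do not convert cleanly
  # from an assumed source code page into an assumed DB2 code page.
  import sys

  SRC_CODEPAGE = "cp1252"    # assumption: code page the report is written in
  DB_CODEPAGE = "iso8859-1"  # assumption: DB2 database code page

  def find_bad_records(path, src=SRC_CODEPAGE, dst=DB_CODEPAGE):
      """Yield (record_number, byte_offset) for records that fail conversion."""
      with open(path, "rb") as f:
          for recno, raw in enumerate(f, start=1):
              try:
                  raw.decode(src).encode(dst)
              except (UnicodeDecodeError, UnicodeEncodeError) as exc:
                  yield recno, exc.start

  if __name__ == "__main__":
      for recno, offset in find_bad_records(sys.argv[1]):
          print(f"record {recno}: unconvertible character at offset {offset}")

Something as small as a cp1252 curly apostrophe in a NAME value would fail a conversion like this and could produce the kind of substitution warning shown in the system log.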

regards

Egon

jeffs42885

  • Guest
Re: Very Large Line Data File Failing
« Reply #6 on: December 27, 2013, 06:24:47 AM »
Hi- No changes were made to CMOD since the last successful load.

Greg Ira

  • Full Member
  • Posts: 240
Re: Very Large Line Data File Failing
« Reply #7 on: December 31, 2013, 08:29:30 AM »
I get the feeling it's not a code page problem.  The thing that jumps out at me is the 30,000 rows loaded message.  That is too neat a number for me.  If you rerun the load, does it always die at 30,000 rows loaded?  One of the first things I would do is look at the load data to see what is being loaded for row 30,001 and verify that the file is indeed unchanged from previous runs.  Be sure to check hex values for any unprintable characters present in the record.  We've had instances where everyone swears nothing was changed, only to find a programmer added some seemingly innocuous modification that messed with our indexing.  Since this worked previously, and yes this is stating the obvious, something had to have changed.  The key is finding what changed: maintenance (DB2, CMOD, OS), resource contention, the way the file is created, a TCP/IP timeout.
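
A quick way to eyeball that region of the file, as a sketch: it assumes the input is newline-delimited, and keep in mind that database rows do not necessarily map one-to-one onto physical records in the file, so treat the record numbers as a rough starting point.

  # dump_window.py: hex-dump a few records around the point where the load died
  import sys

  def dump_records(path, first=29999, last=30002):
      with open(path, "rb") as f:
          for recno, raw in enumerate(f, start=1):
              if recno > last:
                  break
              if recno >= first:
                  hexed = " ".join(f"{b:02X}" for b in raw.rstrip(b"\r\n"))
                  print(f"record {recno}: {hexed}")

  if __name__ == "__main__":
      dump_records(sys.argv[1])

Anything outside the printable range (0x20 through 0x7E for ASCII data) sitting in an indexed field is a good candidate for the conversion error DB2 is reporting.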

jeffs42885

  • Guest
Re: Very Large Line Data File Failing
« Reply #8 on: December 31, 2013, 01:15:40 PM »
I looked at the file with a hex editor and didn't find anything different. This particular issue happens when the NAME field is indexed (e.g. Bob Smith).

I also ran a post processor against this file that writes all the indexes to a text file, and again, I didn't see anything.

IBM's suggestion- Upgrade to 9.0

 ??? ??? ??? ???

ewirtz

  • Full Member
  • Posts: 134
Re: Very Large Line Data File Failing
« Reply #9 on: January 02, 2014, 01:08:42 AM »
Hi,

The DB2 message says that a character is being processed that is not defined in the code page. Besides a wrong character in the report, it might also be that memory is corrupt at the time of the insert (I think it's an insert). If you enable the ODBC trace, you will see the failing SQL.
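
For reference, one common way to turn on the DB2 CLI/ODBC trace is through db2cli.ini on the machine where the failing connection is made (here, the server running arsload); the trace path below is just a placeholder and must be an existing, writable directory. The trace slows the load down considerably, so set Trace back to 0 once you have captured the failing insert.

  [COMMON]
  Trace=1
  TracePathName=/tmp/clitrace
  TraceFlush=1
  TraceComm=1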

regards Egon

Alessandro Perucchi

  • Global Moderator
  • Hero Member
  • Posts: 1002
Re: Very Large Line Data File Failing
« Reply #10 on: February 03, 2014, 08:00:25 AM »
Hello all,

I'm not sure I can help, but maybe I can give some hints on where else to look... if you haven't already...

If nothing changed between October/November and December... then maybe there are some other aspects to check, like:
- Have you checked whether you are running into a disk space problem while doing the load? (A quick sketch for this check follows the list.)
- Have you tried closing your segment table before doing the load? If so, maybe the problem is in creating a new segment table.
- Do you have a problem with your database (log files need to be backed up)... something on the DB side...
- ...
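
For the disk space point, a trivial sketch; the mount points listed are purely hypothetical and would be replaced with your cache, temporary and DB2 log filesystems:

  # check_space.py: print free space on the filesystems the load touches
  import shutil

  for mount in ("/od", "/arstmp", "/db2/logs"):   # hypothetical paths
      total, used, free = shutil.disk_usage(mount)
      print(f"{mount}: {free // 2**30} GiB free of {total // 2**30} GiB")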

I don't know if that helps a little bit...

Sincerely yours,
Alessandro
Alessandro Perucchi

#Install #Migrations #Conversion #Educate #Repair #Upgrade #Migrate #Enhance #Optimize #AIX #Linux #Multiplatforms #DB2 #Windows #Oracle #TSM #Tivoli #Performance #Audits #Customizing #Availability #HA #DR #JavaApi #ContentNavigator #ICN #WEBi #ODWEK #Services #PDF #AFP #XML

Justin Derrick

  • IBM Content Manager OnDemand Consultant
  • Administrator
  • Hero Member
  • Posts: 2230
  • CMOD Guru for hire...
    • Tenacious Consulting
Re: Very Large Line Data File Failing
« Reply #11 on: February 03, 2014, 09:45:29 AM »
There's something seriously wrong if a 1.2GB file takes 8 hours to load.  I've built systems that load a terabyte a day...  Something else is very, very wrong here.  How big is each document?  How many records are you EXPECTING to load with this file?  What platform / software versions are you running with?  I'm also wondering if your hardware is way, way, way out of date.  Also, provide indexing parameters, info on User Exits, etc.

-JD.
IBM CMOD Professional Services: http://TenaciousConsulting.com
Call:  +1-866-533-7742  or  eMail:  jd@justinderrick.com
IBM CMOD Wiki:  https://CMOD.wiki/
FREE IBM CMOD Education & Webinars:  https://CMOD.Training/

Interests: #AIX #Linux #Multiplatforms #DB2 #TSM #SP #Performance #Security #Audits #Customizing #Availability #HA #DR

jeffs42885

  • Guest
Re: Very Large Line Data File Failing
« Reply #12 on: February 17, 2014, 08:03:25 AM »
Hardware is definitely outdated. We are limited by the licensing for other tools, which keeps us on this server with these cores...

But this was back in November/December. I am sure it has been resolved and the boxes are much faster.

I worked with my colleague in the area that creates the file (they came from OMNI); we indexed on load date instead and he was happy. Kind of a band-aid fix, but it's all we could do with the resources/time we had.