OnDemand User Group
Support Forums => Other => Topic started by: Gobi21 on October 30, 2015, 12:24:52 AM
-
Hi All,
I am facing the issue below during OnDemand load ingestion.
arsload: Processing file >/sbcimp/dyn/data/RAR/OD_App1/arsload_GI/Blackbox_GI/BLACKBOX.LOGS.20151023<
arsload: 10/29/15 21:19:15 -- Loading started, --UNKNOWN-- bytes to process
OnDemand Load Id = >6026-6-0-1397FAA-16732-16732<
Unable to allocate enough memory. File=arslacif.c, Line=711
An unexpected error occurred. Contact your System Administrator and/or consult the System Log. File=arsadmp.c, Line=426
The last row successfully loaded was 1
Loaded 1 rows into the database
arsload: 10/29/15 21:32:12 Loading failed
arsload: Processing failed for file >/sbcimp/dyn/data/RAR/OD_App1/arsload_GI/Blackbox_GI/BLACKBOX.LOGS.20151023<
arsload: 10/29/15 21:32:12 -- Unloading started
OnDemand UnLoad Successful - LoadId(6026-6-0-1397FAA-16732-16732) Rows Deleted(1)
My ARS configuration file points to the following temp path:
ARS_TMP=/sbcimp/dyn/data/RAR/OD_App1/temp
There are no files in that directory.
The ulimit -a output also looks good to me:
ulimit -a
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) unlimited
stack(kbytes) 8192
coredump(blocks) unlimited
nofiles(descriptors) 8192
vmemory(kbytes) unlimited
Can someone please suggest how to resolve this issue?
Thanks
-
Hi,
Ulimit -a output also looks good.
ulimit -a
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) unlimited
stack(kbytes) 8192
coredump(blocks) unlimited
nofiles(descriptors) 8192
vmemory(kbytes) unlimited
Well, it doesn't look good to me!
You should increase the stack limit to "unlimited":
ulimit -s unlimited
and try the load again to see if that is enough.
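The suggestion above can be wrapped into the load script itself, so the limit is raised in the same shell that starts arsload and is inherited by it. A minimal sketch, assuming a POSIX shell on the server; the actual arsload invocation is left as a comment because command-line options vary by install:

```shell
#!/bin/sh
# Sketch: raise the soft stack limit before starting the load.

echo "stack limit before: $(ulimit -s)"

# Try to lift the soft limit; this fails if the hard limit is capped,
# in which case the hard limit must be raised by root first.
if ulimit -s unlimited 2>/dev/null; then
    echo "stack limit now: $(ulimit -s)"
else
    echo "could not raise stack limit; hard limit is $(ulimit -Hs)" >&2
fi

# arsload runs in this shell's process tree and inherits the new limit.
# Run your usual arsload command here, e.g. against
# /sbcimp/dyn/data/RAR/OD_App1/arsload_GI/Blackbox_GI/BLACKBOX.LOGS.20151023
```

Note that a ulimit change made in an interactive shell does not affect loads started elsewhere (e.g. by cron or the arsload daemon), which is why putting it in the script is safer.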
-
Hi,
I agree with Alessandro. Though it went against the wishes of our Unix admins, we went with unlimited for ulimit -Ha and ulimit -a, and we had no problems loading files after that. We had been getting these errors when trying to load 50 MB Excel files.
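One follow-up point: a ulimit change made at the shell only lasts for that session. To make it persistent for the user that runs the loads, the limit has to be set in the OS configuration. A sketch, assuming a hypothetical instance owner named archive; the ulimit -a output earlier in the thread looks like AIX, whose format differs from Linux:

```
# AIX: /etc/security/limits (-1 means unlimited)
archive:
        stack = -1
        stack_hard = -1

# Linux: /etc/security/limits.conf
archive   soft   stack   unlimited
archive   hard   stack   unlimited
```

The user must log out and back in (or the daemon must be restarted) for the new limits to take effect.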