The most likely cause of failure is running out of memory (not disk space); that would best explain the sudden crash, the missing event information, and the absence of logs, and it fits the scenario you describe.
You mention this happens during a stress test. Is this also a realistic production scenario? If so, you should consider splitting the output into multiple XML files, or you may need to get creative in how you generate a single XML output.
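To illustrate the chunked-output idea, here is a minimal sketch in plain Python (the record data and file naming are hypothetical; in practice the split would happen inside your Data Transformation project or in a post-processing step). Each chunk is streamed to its own file, so no single file or in-memory structure grows without bound:

```python
import xml.sax.saxutils as su

def write_chunks(records, chunk_size, prefix="output"):
    """Write records to multiple XML files so memory use stays
    bounded by chunk_size rather than the total record count."""
    paths = []
    for i in range(0, len(records), chunk_size):
        path = f"{prefix}_{i // chunk_size:04d}.xml"
        with open(path, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n<records>\n')
            for rec in records[i:i + chunk_size]:
                # Escape XML special characters in the record text
                f.write(f"  <record>{su.escape(rec)}</record>\n")
            f.write("</records>\n")
        paths.append(path)
    return paths

# Example: 10 records split into chunks of 4 produces 3 files
files = write_chunks([f"row {n}" for n in range(10)], chunk_size=4)
```

A downstream consumer can then process the chunk files one at a time instead of loading a single 1 GB document.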
Thanks for the reply.
For 1.25 million records, the XML file size is 650 MB, so for 1.6 million records it should be around 1 GB.
My server has almost 50 GB free, so as I understand it, this cannot be a disk space issue in the XML output folder.
You said that "Any thing above 1.25 million it is failing with the same error."
This strongly suggests that the issue is caused by event generation, which consumes a lot of memory when the input file is large.
To resolve this issue, disable event generation while running the Data Transformation project. Do the following:
- In Data Transformation Studio, go to Project > Properties.
- Open the Output Control tab.
- Uncheck Create event log.
Otherwise, for optimal performance, ensure that the Java heap size is larger than the size of the file being processed.
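For reference, Java heap limits are controlled with the standard JVM flags below. Where you set them depends on how your Data Transformation engine is launched (service configuration, wrapper script, etc.); the jar name here is only a placeholder:

```shell
# -Xms sets the initial heap, -Xmx sets the maximum heap.
# For a ~1 GB input, a 2 GB maximum heap leaves headroom for
# parsing overhead and event generation.
java -Xms512m -Xmx2048m -jar engine.jar
```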