Please look for any core files generated under $INFA_HOME/server/bin; they may have built up and be using disk space.
Also review any sessions that failed around the same time: temporary files from those runs may not have been deleted. Check the session logs for the full paths.
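The core-file check above can be scripted. This is a hedged sketch, assuming a UNIX host: `list_cores` lists core files in a directory, largest first. The default path (`/opt/Informatica`) is only an example fallback; adjust for your install.

```shell
# Sketch: list core files in a directory, largest first. The default
# path is a hypothetical $INFA_HOME/server/bin -- adjust for your install.
list_cores() {
    dir="${1:-${INFA_HOME:-/opt/Informatica}/server/bin}"
    # du -k prints sizes in KB; sort -rn puts the biggest files first.
    find "$dir" -maxdepth 1 -type f -name 'core*' -exec du -k {} + 2>/dev/null | sort -rn
}
```

Run it as `list_cores` (or `list_cores /some/other/dir`); empty output means no core files were found there.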
Thanks & Regards,
This is a question to discuss with your network administrator / Linux administrator. The only exception is what Segolene has pointed out: temporary files ($PMTempDir, $PMCacheDir) can fill the file system completely, and core dumps in .../server/bin can consume a lot of disk space as well.
Other than that, this is a typical task for system monitoring, not for the Informatica software. In terms of disk space usage, the Informatica platform is nothing but an application, so thorough and constant system monitoring is in order here.
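A minimal sketch of the kind of monitoring check meant here, assuming a POSIX `df`. The mount point and the 90% threshold are examples, not Informatica defaults; a cron job running something like this would have flagged the file system before it hit 100%.

```shell
# Sketch: warn when a file system's usage crosses a threshold.
# Mount point and limit are example values -- tune for your environment.
check_fs() {
    mount="${1:-/}"; limit="${2:-90}"
    # df -P guarantees one line per filesystem; column 5 is Use%.
    used=$(df -P "$mount" | awk 'NR==2 { gsub(/%/, "", $5); print $5 }')
    if [ "$used" -ge "$limit" ]; then
        echo "WARNING: $mount at ${used}% (limit ${limit}%)"
        return 1
    fi
}
```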
What files, and in what directory, were using all of the space?
Can you please let us know what makes you think that this issue was caused by Informatica and not some other process?
Can you please share your analysis?
The symptom you described is expected: the Integration Service, and the whole PowerCenter domain, can die if it doesn't have enough space to create logs and temp files on the file system.
Have you identified the files that consumed disk space? That might give some clues as to what happened.
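A quick way to answer that question is to rank the directories by size. A sketch, assuming POSIX `du`; point it at whichever mount filled up:

```shell
# Sketch: show the ten largest entries (in KB) directly under a directory,
# to narrow down what consumed the space.
top_usage() {
    du -sk "${1:?directory required}"/* 2>/dev/null | sort -rn | head -10
}
```

Repeat it on the biggest hit (e.g. `top_usage /opt/Informatica` was an example start, then drill into the top entry) until you reach the offending files.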
Another possible cause is mapping executions generating errors.
The log of a mapping processing millions of rows can be huge when there are transformation errors.
In those cases it is not unusual to get logs of several GB, so check that as well.
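Hunting for such oversized logs can be done with `find`. A sketch: the directory argument and the 500M default threshold are assumptions, so pass your actual session/workflow log directory ($PMSessionLogDir) and whatever size you consider suspicious.

```shell
# Sketch: list log files above a size threshold, largest first.
# Directory and threshold are caller-supplied; +500M is an example default.
large_logs() {
    dir="$1"; threshold="${2:-+500M}"
    find "$dir" -type f -size "$threshold" -exec du -k {} + 2>/dev/null | sort -rn
}
```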
Is your issue solved? If so, please share the solution so others can benefit from it.
Disk space usage at 100% can cause issues for any application; a stopped Informatica process is one typical symptom.
Whatever files are suddenly taking up the full disk space will need to be investigated and addressed.
A core file, and analysis of its stack trace, would tell a lot about what happened at the time the service stopped. Ensure core files are enabled and the core-size ulimit is set to unlimited on UNIX, so a full core file is obtained if/when one is generated.
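Checking and raising the limit looks like this; run it in the shell that starts the services (or set it persistently, e.g. in limits.conf) so the Informatica processes inherit it. These are generic UNIX commands, not Informatica-specific settings.

```shell
# Sketch: inspect and raise the core-size limit for this shell and
# its children. The soft limit can only be raised up to the hard limit.
ulimit -c            # current soft limit; "0" means no cores are written
ulimit -c unlimited  # allow full core files (if the hard limit permits)
ulimit -c            # verify the new value
```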
I don't think that the OP will ever respond.
Can one of the admins please close this topic out?
Below is a summary of all the possible PowerCenter-related reasons that can cause this issue:
1- Temporary files ($PMTempDir, $PMCacheDir) may fill up the file system completely.
2- Core dumps in .../server/bin may consume a lot of disk space as well.
3- Mapping executions generating errors: the log of a mapping processing millions of rows can be huge when there are transformation errors.
4- You might be running a session that writes to a flat file on the server, and the flat-file targets are huge and take up a lot of space.