You might have to check the complete workflow log and the Integration Service log to see why the command task failed.
Could you verify whether the Integration Service is able to expand the environment variable PMHOMEDIR?
You can include a command like the one below to verify:
echo $PMHOMEDIR > /home/user1/echo.txt
Also, did you recycle the Integration Service after setting the environment variable?
You would need to run the above command in a command task and check whether the variable is expanded.
Also, kindly check the Integration Service log, as mentioned above.
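As a sketch, the echo check above can be made a little more explicit, so the log shows clearly whether the variable ever reached the command task's shell (the variable name is from this thread; everything else is illustrative):

```shell
# Report whether PMHOMEDIR reached the command task's shell.
# Run this in the command task; the output lands in the session log
# (or redirect it to a file as in the echo example above).
if [ -n "$PMHOMEDIR" ]; then
  echo "PMHOMEDIR expands to: $PMHOMEDIR"
else
  echo "PMHOMEDIR is empty or unset"
fi
```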
If this is critical, we advise you to open a support case so that you can work directly with one of our support engineers to resolve this issue.
This confirms that the variable is expanded correctly.
I did a similar test and the command failed in-house as well.
Following this, I checked the Integration Service log, as I had suggested, and it gave me the cause:
serviceType serviceName severity timestamp threadName messageCode message
IS IS INFO 05/29/2014 16:18:35.253 PM 140569214961408 CMN_1954 [Command task instance [cmd]:] Process id 7039. Standard output and error:
sh: /home/user1/.infa.sh: No such file or directory
So, the command failed because there is no file named .infa.sh at that location.
Then I modified the command task, and it succeeded.
I request you to kindly try the same. In case it fails, you can always check the IS log for more details.
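To avoid running into this failure blind, the command task can also be wrapped in a small existence check. This is only a sketch: the helper name run_infa_script is mine, and the .infa.sh path layout follows this thread:

```shell
# Hypothetical helper: run <dir>/.infa.sh only if it actually exists,
# otherwise report the full path that was tried instead of the bare
# "No such file or directory" from sh.
run_infa_script() {
  script="$1/.infa.sh"
  if [ -f "$script" ]; then
    sh "$script"
  else
    echo "Not found: $script" >&2
    return 1
  fi
}

# In the command task you would then call, for example:
# run_infa_script "$PMHOMEDIR"
```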
Let us know if this helps you.
The file .infa.sh is present at the exact location.
When I run the command task with the command /home/gdint/wdint/.infa.sh, it executes successfully. But when I use the variable and run $PMHomeDir/.infa.sh, it throws an error.
You also said that the variable is expanded successfully, so why is the command task failing?
Can you please tell me where I can get the Integration Service log, since I do not have permission on the Admin Console? Do I need to contact my Informatica admins to get the log file?
I'm not able to understand what I have missed here.
Could you please also try creating one more Integration Service and retrying the logic?
You can fetch the IS log from the Admin Console -> Logs.
You can use infacmd isp GetLog to get the Integration Service log.
Kindly refer to the Client Help guide for details on this command.
I have tried the same logic on two Integration Services (DEV, INT), but it didn't work at all. I am getting the same error in both places.
Do you have any idea why it is not running ?
Try setting PMHomeDir under the General Properties tab instead of the Environment Properties, and see what happens.
Amit, do you have any specific reason for setting the root/home directory under the env variables rather than the general properties of the IS?
If you are keen on environment variables, why set them at the application service level? You can set them on the box itself.
I'm not Amit, but there are good reasons why setting environment variables globally may not be appropriate.
First, they may negatively impact other applications running on the same box.
Second, you cannot always set environment variables globally. For example, when using PowerExchange for Essbase on a machine where you also work with Oracle databases, you have to set a couple of environment variables (such as PATH and LD_LIBRARY_PATH, or their counterparts on other Unix systems) differently for the Integration Services running against an Essbase server than for all other Integration Services exchanging data with Oracle. Unfortunately, several library files exist with identical names in both products, the Oracle client and the Essbase client software, so you have to distinguish cleanly between the two.
Third, there are environment variables which you might want to set only for specific services (such as LD_PRELOAD on HP-UX). Setting these globally may render your whole system unusable.
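The underlying mechanism here is ordinary per-process environment inheritance; a quick sketch in any POSIX shell shows that a variable set for a single child process never leaks to its siblings (the path is just an example):

```shell
# The child process sees the variable; the parent shell (and therefore
# any other service started from it) does not.
child=$(LD_LIBRARY_PATH=/opt/essbase/lib sh -c 'echo "$LD_LIBRARY_PATH"')
echo "child saw:    $child"
echo "parent keeps: ${LD_LIBRARY_PATH:-<unset>}"
```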
You can find the server logs (in this case, the log file of the Integration Service) in the file system. Unfortunately, the exact location depends on the PowerCenter version, so you might have to look around a bit to find them.
You will most likely find them in a subdirectory named after the date (for the example case of today, Nov 12th 2018, that would be 2018-11-12); under 10.1 this is in the subdirectory $INFA_HOME/isp/logs, while under other versions it will often differ a bit.
In this subdirectory there's one further level of subdirectories per node in the INFA domain.
Here you will find loads of files named 0_0.dat, 0_0.dat.idx, and so forth.
You can convert these files to readable versions using the command infacmd.sh (or infacmd.bat on Windows) with its subfunction ConvertLogFile, as in this example:
infacmd.sh ConvertLogFile -in 0_0.dat -fm text -lo 0_0.txt
("-in" = input file for the conversion, "-fm" = format of the output file, "-lo" = log goes to the named output file)
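If there are many of these binary files, a small loop saves typing. This is only a sketch: it assumes infacmd.sh is on the PATH, and the guards just keep it from failing noisily when there is nothing to convert:

```shell
# Convert every 0_0.dat-style binary log in the current directory to text.
for f in *.dat; do
  [ -e "$f" ] || continue                  # no .dat files here, nothing to do
  if ! command -v infacmd.sh >/dev/null 2>&1; then
    echo "infacmd.sh not found on PATH" >&2
    break
  fi
  # ${f%.dat}.txt turns e.g. 0_0.dat into 0_0.txt
  infacmd.sh ConvertLogFile -in "$f" -fm text -lo "${f%.dat}.txt"
done
```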
Of course that requires that you have access to the file system.
If you can't access the file system but you have infacmd.bat installed locally (namely on the Windows PC running the PowerCenter clients), you might be able to use the following command on your PC in a cmd.exe window (after doing a "cd" into the subdirectory where infacmd.bat is located):
infacmd GetLog -dn DomainName -un UserName -pd Password -ro -fm text -lo intsvc.log -st IS -sn IntSvcName
(of course you have to enter the correct details instead of "DomainName" and so on)