Do these files have the same name each day?
Who will ensure that these four files are removed / archived / moved somewhere else after the workflow has processed them?
This information will help in designing a suitable easy solution for you.
You can read multiple files from a remote location through an FTP connection using a parameter.
To create a workflow that reads multiple remote files from a remote server over an FTP connection, first define a file containing the list of remote file names, then read them as an indirect source file through the FTP connection.
Do the following:
- Create a file with the list of file names on the remote server.
- Use that file name as the Remote File Name under the FTP connection properties.
- Select Input Type = File and Source filetype = Indirect under the source properties.
- Run the session.
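As a sketch of the first step, the indirect (list) file is just a plain text file with one source file name per line; the file names below are placeholders for your actual remote files:

```shell
# Build the indirect (list) file on the remote server: one source file
# name per line. The file names here are placeholders.
printf '%s\n' file1.txt file2.txt file3.txt > filelist.txt
```

This `filelist.txt` is the file you then point the Remote File Name property at.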
Alternatively, parameterize the Remote File Name. Do the following:
- Create a file with the list of file names on the remote server.
- Create a parameter in a Parameter file.
- Reference the parameter file under the workflow properties.
- Make use of the parameter as part of the FTP connection.
- Run the session.
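A minimal sketch of such a parameter file, assuming hypothetical folder, workflow, and session names (the parameter name $InputFile_Remote is also a placeholder):

```
[MyFolder.WF:wf_DailyLoad.ST:s_LoadFiles]
$InputFile_Remote=filelist.txt
```

You would then enter $InputFile_Remote as the Remote File Name in the FTP connection properties.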
You can parameterize the Remote File Name because it is within the scope of the application; however, you cannot parameterize the remote file names that are part of the remote file list:

> cat filelist.txt
file1.txt
file2.txt
We rely on the operating system to read the file from the specified location. Informatica sees the file names in the list as regular file names, not as parameters. This second level of parameterization is not supported in Informatica.
So basically you have to wait for four files which (for example) are named ABC_ddmmyyyy.CSV, DEF_ddmmyyyy.CSV, GHI_ddmmyyyy.CSV, and JKL_ddmmyyyy.CSV (ddmmyyyy means "day number with two digits, month number with two digits, year number with four digits")?
The one problem in this case is that we don't know yet whether there may be more files available in the source directory which have similar names to your actual source files. So the big question is, are these four files named in a unique fashion, or may there be more files with similar names?
In other words: what makes these file names uniquely identifiable? That's a very important point here.
Now about the actual design. What would be interesting is first the operating system on which PowerCenter runs and whether you are willing to use a batch file / shell script in this context.
If a shell script / batch file is OK for you, you could use it to look for those four input files. If they are there, the script simply starts the workflow; if they are not there, it waits for some time (let's say, five seconds) and looks for the files again.
And this whole process would repeat for at most two hours.
That's the easiest way to implement this two-hour waiting time.
There are other methods, but this would indeed be the easiest way.
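The polling approach above could be sketched like this. The directory, the Integration Service / domain / credential names in the pmcmd call, and the workflow name are all placeholders to adapt to your environment:

```shell
#!/bin/sh
# Hypothetical polling wrapper: wait up to two hours for the four daily
# files, then start the workflow. All names and paths are placeholders.
SRC_DIR=${SRC_DIR:-/data/inbound}   # directory where the four files arrive
STAMP=$(date +%d%m%Y)               # ddmmyyyy, matching the file names

# Succeed only when all four expected files exist.
files_present() {
    for f in ABC DEF GHI JKL; do
        [ -f "$SRC_DIR/${f}_${STAMP}.CSV" ] || return 1
    done
}

# Poll every 5 seconds for at most two hours (7200 s), then give up.
wait_and_start() {
    elapsed=0
    while [ "$elapsed" -lt 7200 ]; do
        if files_present; then
            # All four files are there: start the workflow.
            # PMUSER / PMPASS are environment variables holding credentials.
            pmcmd startworkflow -sv IntSvc -d Domain -uv PMUSER -pv PMPASS \
                -f MyFolder wf_LoadDailyFiles
            return $?
        fi
        sleep 5
        elapsed=$((elapsed + 5))
    done
    echo "Timed out after two hours waiting for source files" >&2
    return 1
}
```

You would call `wait_and_start` from your scheduler (e.g. cron) at the start of the daily window.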
We managed to resolve this by creating an audit table to keep track of the daily file status. Based on the file names inserted into the audit table, we created a link condition on target success rows >= 4 to proceed to the next sessions.
This is how we resolved the issue.
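For reference, the link condition in the Workflow Manager could look like the following, assuming a hypothetical session name s_InsertAudit for the session that loads the audit table:

```
$s_InsertAudit.TgtSuccessRows >= 4
```

The downstream sessions then run only once at least four rows (one per expected file) have been written to the audit table.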