You can create a mapping with multiple pipelines (all parameterized) and then create a mapping task for it. That way you can configure multiple sources/targets in the MCT. If you are already aware of this approach and it does not suit your use case, I don't think there is currently an option that allows a single pipeline in a mapping while using multiple sources/targets in the mapping task.
You have two options here. What Nithy suggested may work, but it is not dynamic because you have a static number of data pipelines in the mapping; if anything changes, you have to change the mapping.
Option 1: one mapping and X number of mapping tasks.
In this option you have a single mapping where the source object, target object and the target field mapping are completely parameterized.
You then create a mapping task for each source/target combo where the mapping task has the parameter details.
You can also create these mapping tasks via the API, so if you design a process to handle new requests you can automatically add new source/target combos.
Yes, you end up with multiple tasks, but it is an option.
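To sketch how the API side of option 1 could work: below is a rough Python outline for creating mapping tasks against the IICS v2 REST API. The base URL, the field names in the mttask body, and the mapping id are assumptions/placeholders, so confirm them against the Informatica Cloud REST API reference for your org before relying on this.

```python
import json
import urllib.request

# Hypothetical pod URL -- replace with your org's Informatica Cloud base URL.
BASE_URL = "https://dm-us.informaticacloud.com/ma"

def login_payload(username, password):
    """Body for the IICS v2 login call (POST /api/v2/user/login)."""
    return {"@type": "login", "username": username, "password": password}

def mttask_payload(name, mapping_id, param_file):
    """Body for creating a mapping task (POST /api/v2/mttask).
    Field names here are a sketch; verify them in the REST API reference."""
    return {
        "@type": "mtTask",
        "name": name,
        "mappingId": mapping_id,
        "parameterFileName": param_file,
    }

def post(url, body, session_id=None):
    """POST a JSON body and return the parsed JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json", "Accept": "application/json"},
    )
    if session_id:
        req.add_header("icSessionId", session_id)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Log in once, then create one mapping task per source/target combo.
    session = post(BASE_URL + "/api/v2/user/login",
                   login_payload("user@example.com", "secret"))
    for src, tgt in [("ORDERS", "DW_ORDERS"), ("ITEMS", "DW_ITEMS")]:
        task = mttask_payload(f"mt_{src}_to_{tgt}", "<mapping-id>", f"{src}.param")
        post(session["serverUrl"] + "/api/v2/mttask", task, session["icSessionId"])
```

The same pattern works from Postman or any HTTP client: log in, capture `icSessionId` and `serverUrl` from the response, and send them on subsequent calls.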
Option 2: This is the better option. You need to use the "dynamic mapping task", a new task type that is currently in tech preview but will be generally available in the next release. It allows you to define multiple configurations using the same mapping with different parameters, so you have one mapping and one mapping task.
I would highly recommend you reach out to your sales rep to discuss the requirement and inquire about getting this added to your org.
Thank you for your response, Nithy S. Here I'm looking for a generic mapping for all the tables.
I want to know whether we can parameterize the source and target and run the parameterized file using Python scripting, looping our mapping N times for N tables to load the data into the target DB.
Hi Scott, thank you for your response. I like your methods, but we don't have a license for API Manager or the dynamic mapping task asset. Instead of creating N mappings, can I create a generic mapping by parameterizing the source and target and running the parameter file via Python scripting N times for N tables, to load the data into the target tables? If that is possible, can you please let me know how to go about it?
Thanks and Regards,
Well, let's start with API Manager. To use the IICS REST APIs you don't need that, or Application Integration; you just need something like Postman or anything else that can call an API.
As for the dynamic mapping task, you could ask your rep for it; it is not going to cost anything.
Python requires you to hand-code something, have somewhere to run that Python, and then inside the script either call the CLI or the API to start a job. You could do it, but it is going to be single-threaded.
Python will have to loop through the source/target table combos one by one. Each loop will update the single mapping template's mapping task to use the correct parameter file, execute that mapping task, and wait for it to finish. Once done, you update the parameter file the mapping task uses for the next table pair, execute again, and so on.
Not very efficient but doable.
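The loop described above could be sketched roughly like this in Python. The `#USE_SECTIONS` header and `[Project].[Folder].[Task]` section format follow the usual IICS parameter-file convention, but the `$$SourceObject`/`$$TargetObject` names and the `start_job`/`wait_for_job` hooks are placeholders for your own mapping's parameters and whichever CLI or REST call you use to run and monitor the task:

```python
from pathlib import Path

def write_param_file(path, task_section, src, tgt):
    """Rewrite the IICS parameter file for one source/target pair.
    $$SourceObject / $$TargetObject are example parameter names --
    use whatever your mapping actually defines."""
    text = (
        "#USE_SECTIONS\n"
        f"[{task_section}]\n"
        f"$$SourceObject={src}\n"
        f"$$TargetObject={tgt}\n"
    )
    Path(path).write_text(text)
    return text

def run_all(pairs, start_job, wait_for_job, param_path, task_section):
    """Process each table pair sequentially: rewrite the parameter file,
    start the mapping task, and block until it finishes before moving on."""
    for src, tgt in pairs:
        write_param_file(param_path, task_section, src, tgt)
        run_id = start_job()   # e.g. a REST call or CLI invocation that starts the task
        wait_for_job(run_id)   # e.g. poll the activity monitor until the run completes
```

Note the single-threaded nature Scott describes: because every pair reuses the same mapping task and parameter file, each run must finish before the file can be rewritten for the next pair.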
With approach 1 you are only creating one mapping and X mapping tasks. It's a one-time activity, and then you can run them over and over.