I am looking for a way in Informatica to pull data from a table in a database, load it into Snowflake, and then move on to the next table in that same database, repeating the process for the remaining tables.

We currently have this set up in Matillion, where an orchestration grabs the names of all tables in a database and then loops through each table to send its data into Snowflake.
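For reference, the control flow of that Matillion orchestration (list the tables, then copy each one) can be sketched in plain Python. This is only an illustration of the looping logic: it uses the stdlib `sqlite3` module as a stand-in for both the source database and Snowflake, and the table names and data are made up. A real job would swap in the source system's driver and `snowflake-connector-python` for the two connections.

```python
import sqlite3

def list_tables(conn):
    """Return the names of all user tables in the source database."""
    rows = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    return [r[0] for r in rows]

def copy_table(src, dst, table):
    """Copy one table's rows from src to dst (target schema assumed to exist)."""
    rows = src.execute(f'SELECT * FROM "{table}"').fetchall()
    if rows:
        placeholders = ", ".join("?" * len(rows[0]))
        dst.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', rows)
    dst.commit()
    return len(rows)

# Demo with in-memory databases standing in for the source DB and Snowflake.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.executescript(
    "CREATE TABLE account(id INTEGER, name TEXT);"
    "INSERT INTO account VALUES (1, 'Acme'), (2, 'Globex');"
    "CREATE TABLE contact(id INTEGER, email TEXT);"
    "INSERT INTO contact VALUES (10, 'a@acme.com');"
)
for table in list_tables(src):
    # Mirror the source schema on the target, then load the rows.
    ddl = src.execute(
        "SELECT sql FROM sqlite_master WHERE name = ?", (table,)
    ).fetchone()[0]
    dst.execute(ddl)
    copied = copy_table(src, dst, table)
    print(f"{table}: {copied} rows")
```

The point is that the per-table work is identical; only the table name changes, which is what makes the scripted loop (or a metadata-driven tool) a good fit.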

My team and I have asked Informatica Global Support, but they have not been very helpful in figuring out how to accomplish this. They have suggested things like Dynamic Mapping, which I do not think will work for our particular case, since we are essentially just moving data from one database to a Snowflake database and do not need any other transformations.

Please let me know if any additional clarification is needed.

Perdue
  • Probably not possible using Informatica. PowerCenter works on a fixed set of metadata columns, so if your table structure changes, Informatica will not be able to process them. `Dynamic Mapping` is related to Data Services, and I'm not sure it applies to PowerCenter. For this kind of requirement, I believe you can create a shell script or Python script to move data from one DB to another. – Koushik Roy Aug 31 '22 at 08:33
    @KoushikRoy That was what I was afraid of. So far I've been exploring Data Transfer Tasks to at least get Salesforce data into Snowflake, and that seems to work. The main downside though is that we would have to map out 60+ tables this way and then create a taskflow to run them all. – Perdue Aug 31 '22 at 14:00
  • What DB source? I feel like your ideal solution will rely on querying the information schema of the source DB, then looping through the objects to dynamically populate your source and target in the mapping task. But you are still going to need to create the schema for each table in SF. – Jatin Morar Aug 31 '22 at 21:14
  • @JatinMorar This is what I am thinking as well. My source is Salesforce itself and I need the data to go into Snowflake. There is a Replication Task within Informatica, but it doesn't work with Snowflake. So I think I am pretty much stuck with mapping out each object in Salesforce to a table in Snowflake to get what I need. – Perdue Sep 01 '22 at 15:14
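The schema-creation step mentioned in the comments above (each target table must exist in Snowflake before loading) can itself be driven from source metadata. Below is a minimal sketch of generating a Snowflake `CREATE TABLE` statement from a source table's column metadata. It uses stdlib `sqlite3` and `PRAGMA table_info` purely as a stand-in for a real information-schema query, and the type mapping is illustrative only; a real job would need a fuller mapping for the source system's actual types.

```python
import sqlite3

# Illustrative mapping from source column types to Snowflake types.
TYPE_MAP = {"INTEGER": "NUMBER", "TEXT": "VARCHAR", "REAL": "FLOAT"}

def snowflake_ddl(conn, table):
    """Build a Snowflake CREATE TABLE statement from source column metadata."""
    # PRAGMA table_info returns (cid, name, type, notnull, default, pk) rows.
    cols = conn.execute(f'PRAGMA table_info("{table}")').fetchall()
    col_defs = ", ".join(
        f'"{name}" {TYPE_MAP.get(ctype, "VARCHAR")}'
        for _, name, ctype, *_ in cols
    )
    return f'CREATE TABLE IF NOT EXISTS "{table}" ({col_defs})'

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE account(id INTEGER, name TEXT, revenue REAL)")
print(snowflake_ddl(src, "account"))
# → CREATE TABLE IF NOT EXISTS "account" ("id" NUMBER, "name" VARCHAR, "revenue" FLOAT)
```

Running the generated statement against Snowflake before each table's load is what lets the per-table loop stay fully generic.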

1 Answer


Dynamic Mapping Task is your answer. You create one mapping, with or without transformations, as you need. Then you set up a Dynamic Mapping Task to execute that mapping across the whole set of your 60+ different sources and targets.

Please note that this is available as part of Cloud Data Integration module of IICS. It's not available in PowerCenter.

Maciejg
  • I really don't think this is the way to go, because I have multiple tables that all have different columns, so the same mapping cannot be applied to them. What I need is just to copy tables from one database to another, but if you have any examples where a Dynamic Mapping Task is used for my scenario, I would be happy to see them, since my research has yet to find anything. – Perdue Sep 01 '22 at 15:13
  • In IICS - as opposed to PowerCenter - there is no need to specify individual columns. Give this a try. Build & run the solution, see where you're able to get and let us know if you face errors or issues. – Maciejg Sep 02 '22 at 12:27
  • We have tried to get explanations on how to do this from Informatica and what we could find on the web. Per my manager, I have been instructed to use Data Transfer Tasks and map each table individually since we are wasting too much time trying to figure out the Dynamic Mapping. Do you have any examples that I could use that show how we can accomplish this? I have tried on my own without success to get this to work. – Perdue Sep 03 '22 at 13:30
  • There are some videos available online. I'm happy to help you answer some questions or help overcome issues, but I will not be able to spend time doing this kind of research, unfortunately. Try, describe the issues that you're facing, and let's take it from there. – Maciejg Sep 07 '22 at 13:16
  • I was able to get something set up with a Dynamic Mapping Task, but the issue I am now facing is that if a new column is added in Salesforce, it does not get imported right away through the Dynamic Mapping Task into Snowflake. I have to add the column manually, and I've tried to find something to point me in the right direction, but everything I've looked at says this is not possible. Have you done something like this before? – Perdue Sep 07 '22 at 16:11
  • Have you tried this with some simple DB table? Just to take Salesforce out of the picture and see if it all works. – Maciejg Sep 07 '22 at 17:07