We converted each set of statements to one ‘Execute SQL’ task. Is this the right approach?
No. Given your statement that "the long procedures are not easy to manage. Also, wanted to improve the performance" — by calling the stored procedures within the context of an SSIS package, all you have accomplished is adding a layer of overhead to your calls.
How do you make it better? That's going to depend greatly on what you are doing. Your general approach will probably look like this:
- Create a source and destination OLE DB Connection Manager
- Create a variable of type String and use it to store the name of the table being created
- Execute SQL Task - this actually creates your destination table. I think I've read that an explicit `CREATE TABLE` is more efficient than creating the table implicitly with a `SELECT ... INTO` statement
- Wire a Data Flow Task to that Execute SQL Task. Use an OLE DB Source and change the data access mode from table to SQL command, then call your stored procedure. The procedure might need to be modified so it no longer creates the destination table itself. Drop an OLE DB Destination onto the canvas, change the data access mode to "Table name or view name variable - fast load" (name approximate), and select the variable created above
- The update is probably best left to the existing logic. Just put that in an Execute SQL Task
- The index creation is also going to be an Execute SQL Task
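The statements those Execute SQL Tasks run might look something like the sketch below. This is a minimal illustration only — the table, column, and procedure names (`dbo.StagingCustomer`, `dbo.usp_GetCustomers`, etc.) are hypothetical placeholders, not objects from your system:

```sql
-- 1) Execute SQL Task: explicitly create the destination table.
--    The table name here is what you would store in the String variable.
CREATE TABLE dbo.StagingCustomer
(
    CustomerID   int           NOT NULL,
    CustomerName nvarchar(100) NULL
);

-- 2) OLE DB Source (inside the Data Flow Task): call the procedure
--    as a SQL command; its result set feeds the OLE DB Destination.
EXEC dbo.usp_GetCustomers;

-- 3) Execute SQL Task: the existing update logic, kept as-is.
--    (Placeholder logic shown; substitute your real UPDATE statement.)
UPDATE dbo.StagingCustomer
SET    CustomerName = LTRIM(RTRIM(CustomerName));

-- 4) Execute SQL Task: build the index after the load completes,
--    so the fast load isn't slowed by index maintenance.
CREATE INDEX IX_StagingCustomer_CustomerID
    ON dbo.StagingCustomer (CustomerID);
```

Creating the index after the load, rather than before, is the usual pattern: bulk inserts into a heap (or an unindexed table) avoid per-row index maintenance.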
Things still going slow with all those joins? That'll probably be a tuning operation. We'd need to see the table structures, the queries, and the estimated query plans.