I’ve been tasked with writing a Java process to move some data from a legacy AS/400 system to our new SQL Server 2008 installation. Because the data is managed on the AS/400, this needs to be an “on demand” solution (i.e. someone kicks it off once or twice a day and it populates SQL Server with the newest version of the data). It’s a pretty heavily used set of data, so I think it would be a good idea to clear the table and rebuild it inside a single transaction. The transaction should prevent anybody from accidentally picking up an incomplete list by requesting the data halfway through my process. Is this the correct way to handle a situation like this? A rough sketch of what I have in mind is below.
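This is only a minimal sketch of the transactional clear-and-reload I’m picturing; the connection string, `dbo.MyTable`, the column names, and the `LegacyRow` holder class are all placeholders for whatever actually comes back from the AS/400 extract:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.List;

public class ReloadJob {

    /** Placeholder for whatever shape a row from the AS/400 extract takes. */
    public static class LegacyRow {
        private final int id;
        private final String name;
        public LegacyRow(int id, String name) { this.id = id; this.name = name; }
        public int getId() { return id; }
        public String getName() { return name; }
    }

    public void reload(List<LegacyRow> rowsFromAs400) throws SQLException {
        String url = "jdbc:sqlserver://sqlhost;databaseName=MyDb;user=...;password=...";
        try (Connection conn = DriverManager.getConnection(url)) {
            conn.setAutoCommit(false); // everything below runs in one transaction
            try (Statement clear = conn.createStatement();
                 PreparedStatement insert = conn.prepareStatement(
                         "INSERT INTO dbo.MyTable (Id, Name) VALUES (?, ?)")) {

                // Wipe the old copy of the data.
                clear.executeUpdate("DELETE FROM dbo.MyTable");

                // Reload everything we pulled from the AS/400 in one batch.
                for (LegacyRow row : rowsFromAs400) {
                    insert.setInt(1, row.getId());
                    insert.setString(2, row.getName());
                    insert.addBatch();
                }
                insert.executeBatch();

                conn.commit(); // readers only ever see the old data or the new data
            } catch (SQLException e) {
                conn.rollback(); // on any failure, leave the old data in place
                throw e;
            }
        }
    }
}
```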
Further explanation if needed:
I’m one of the newer developers on the team, and one of the much more senior developers insists we should do an “in-memory replacement” instead: update every record that exists in both tables, insert the new ones that don’t yet exist, and then delete the old ones that no longer exist (roughly what I show in the second sketch below). I’m not convinced this is a good idea, mainly because any request that arrives midway through the process will get back a mishmash of old and new data. I just want to know how strongly I should push back from a technical standpoint.
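For comparison, here is my understanding of the approach he’s proposing, expressed as three set-based statements run against a staging table. The staging table `dbo.MyTable_Staging`, the column names, and the step that loads it are all hypothetical (he may intend to diff row-by-row in Java instead). Note there is no transaction around the three statements, which is exactly my concern: a reader can hit the table between any two of them.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class InPlaceMergeJob {

    public void merge() throws SQLException {
        String url = "jdbc:sqlserver://sqlhost;databaseName=MyDb;user=...;password=...";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {

            // 1. Update rows that exist in both tables.
            stmt.executeUpdate(
                "UPDATE t SET t.Name = s.Name "
              + "FROM dbo.MyTable t JOIN dbo.MyTable_Staging s ON t.Id = s.Id");

            // 2. Insert rows that are new on the AS/400 side.
            stmt.executeUpdate(
                "INSERT INTO dbo.MyTable (Id, Name) "
              + "SELECT s.Id, s.Name FROM dbo.MyTable_Staging s "
              + "WHERE NOT EXISTS (SELECT 1 FROM dbo.MyTable t WHERE t.Id = s.Id)");

            // 3. Delete rows that no longer exist on the AS/400 side.
            stmt.executeUpdate(
                "DELETE t FROM dbo.MyTable t "
              + "WHERE NOT EXISTS (SELECT 1 FROM dbo.MyTable_Staging s WHERE s.Id = t.Id)");
        }
    }
}
```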