
I'm working on an application which will be updating multiple databases (SQL Server 2008 and Oracle 11g). TransactionScope seemed like the logical way to ensure updates were always committed correctly, but it seems that installing MSDTC is not going to be an option. In the future, it's also possible this application could be using data sources which don't support distributed transactions.
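For reference, this is roughly the pattern I was hoping to use (connection strings, table names, and SQL below are just placeholders, not our real code). The problem is that as soon as the second connection enlists in the scope, the transaction is promoted to a distributed transaction and MSDTC has to be available:

```csharp
using System.Data.SqlClient;
using System.Transactions;
using Oracle.DataAccess.Client;   // ODP.NET -- assuming this Oracle provider

public static class DualUpdateExample
{
    public static void UpdateLastName(string sqlServerConnStr, string oracleConnStr)
    {
        using (var scope = new TransactionScope())
        {
            using (var sqlConn = new SqlConnection(sqlServerConnStr))
            using (var oraConn = new OracleConnection(oracleConnStr))
            {
                sqlConn.Open();   // first resource: lightweight local transaction
                oraConn.Open();   // second resource: scope is promoted, MSDTC required

                using (var cmd = new SqlCommand(
                    "UPDATE Students SET LastName = 'Smith' WHERE StudentId = 1", sqlConn))
                    cmd.ExecuteNonQuery();

                using (var cmd = new OracleCommand(
                    "UPDATE STUDENTS SET LAST_NAME = 'Smith' WHERE STUDENT_ID = 1", oraConn))
                    cmd.ExecuteNonQuery();
            }

            scope.Complete();     // both commits are coordinated by the DTC
        }
    }
}
```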

I've spent many hours trying to come up with another solution but nothing seems like it will work. All searches point back to TransactionScope and distributed transactions.

The application is written in C#, using the Entity Framework. Does anyone have any suggestions which won't require escalating to distributed transactions? Here's a list of ideas I've had which have gone nowhere.

+TransactionScope: Can't use MSDTC. Future data sources may not support distributed transactions.

+Manually track and roll back transactions: I haven't found a good way to do this within Entity Framework (see the sketch after this list for the kind of best-effort pattern I mean).

+Queue/log failures so they can be re-committed by another process: Can't come up with a good way to store the failed commits generically. Also need to make sure the re-commit doesn't overwrite newer data.
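To illustrate the second idea above, this is the best-effort pattern I was considering (the context and entity classes here are made up for the example, not from the real project). Each context gets its own local transaction, and both are committed only after both SaveChanges calls succeed. The gap between the two Commit() calls is exactly the part I can't close: if the Oracle commit fails after the SQL Server commit has already succeeded, the systems are left out of sync.

```csharp
using System.Data.Entity.Infrastructure;

public static class BestEffortUpdate
{
    public static void UpdateLastName(int studentId, string newLastName)
    {
        using (var sqlCtx = new AdmissionsContext())   // hypothetical SQL Server context
        using (var oraCtx = new GradesContext())       // hypothetical Oracle context
        {
            // Open the EntityConnections ourselves so SaveChanges reuses our transactions
            // instead of creating (and immediately committing) its own.
            var sqlConn = ((IObjectContextAdapter)sqlCtx).ObjectContext.Connection;
            var oraConn = ((IObjectContextAdapter)oraCtx).ObjectContext.Connection;
            sqlConn.Open();
            oraConn.Open();

            using (var sqlTx = sqlConn.BeginTransaction())
            using (var oraTx = oraConn.BeginTransaction())
            {
                sqlCtx.Students.Find(studentId).LastName = newLastName;
                oraCtx.Students.Find(studentId).LastName = newLastName;

                sqlCtx.SaveChanges();   // writes under sqlTx, not yet committed
                oraCtx.SaveChanges();   // writes under oraTx, not yet committed

                sqlTx.Commit();
                oraTx.Commit();         // a failure here leaves SQL Server already committed
            }
        }
    }
}
```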

nitedmn
  • Sorry, but which of them doesn't support distributed transactions? – Hamlet Hakobyan Oct 04 '13 at 18:48
  • As of right now all the sources do support them (although we can't turn on MSDTC). I work at a public school district. We have multiple different systems for different things (Admissions, Grades, Attendance, etc.). I'm trying to write a class library we can use to update all of the systems at once (if a kid changes his name, we can use the library to update the name in all the applications at once). A new system, with XXXXX for its data store, could be introduced at any time. – nitedmn Oct 04 '13 at 19:03
  • I think that all applications must use the same shared data. – Hamlet Hakobyan Oct 04 '13 at 19:06
  • In an ideal world, yes. However, these are all third-party applications which we don't always (most of the time) have much control over. – nitedmn Oct 04 '13 at 19:16
  • I think you need to look into an XA-compliant transaction manager, probably in a middle tier. MS has [DTC](http://msdn.microsoft.com/en-us/library/windows/desktop/ms679938%28v=vs.85%29.aspx), although I've never used it personally. Oracle has [WebLogic](http://www.oracle.com/technetwork/middleware/weblogic/documentation/index.html), and an overview of XA is [here](http://docs.oracle.com/cd/E11882_01/timesten.112/e21637/xa_dtp.htm#BHAGCDGE). What you are trying to do is coordinate distributed transactions, so I'm not sure how you'll do this "without using distributed transactions". – tbone Oct 04 '13 at 19:55
  • I agree with @tbone. Also, please review the **Related** links on the right side of the page, e.g. [this one](http://stackoverflow.com/a/887426/232279) ... – ThinkJet Oct 04 '13 at 23:25

1 Answer


@ThinkJet: That related link is an interesting opinion. In my case, a small failure like the one described would not be a huge deal. We currently have other processes in place which try to keep all these systems in sync (not always successfully). If one or two transactions did fail, they should be picked up by those processes.

After reading through these comments, I might try having this library write the data to its own database, then sync those changes to the other sources so that the other applications can see them. It would cause a slight delay in some updates, but even that would be better than what we have now.
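Something like this is what I'm picturing (all of the class names below are made up for illustration): the library records each change in its own database, and a separate sync job pushes the change to every downstream system, remembering which targets have already been applied so failures can be retried later. Each target would check the change's timestamp against its own copy, so a retried change never overwrites newer data.

```csharp
using System;
using System.Collections.Generic;

// A change recorded in the library's own database before it is pushed out.
public class PendingChange
{
    public PendingChange() { AppliedTo = new HashSet<string>(); }

    public int StudentId { get; set; }
    public string Field { get; set; }           // e.g. "LastName"
    public string NewValue { get; set; }
    public DateTime ChangedAtUtc { get; set; }  // used by targets for a newer-wins check
    public HashSet<string> AppliedTo { get; private set; }
}

// One downstream system (Admissions, Grades, Attendance, ...).
public interface ITargetSystem
{
    string Name { get; }

    // Apply the change only if the target's current value is older than
    // change.ChangedAtUtc, so a retried change never clobbers newer data.
    void Apply(PendingChange change);
}

public class SyncJob
{
    private readonly IEnumerable<ITargetSystem> _targets;

    public SyncJob(IEnumerable<ITargetSystem> targets) { _targets = targets; }

    public void Run(IEnumerable<PendingChange> pending)
    {
        foreach (var change in pending)
        {
            foreach (var target in _targets)
            {
                if (change.AppliedTo.Contains(target.Name))
                    continue;                           // already synced to this system

                try
                {
                    target.Apply(change);
                    change.AppliedTo.Add(target.Name);  // persist this back to the library's DB
                }
                catch (Exception)
                {
                    // Leave it unapplied; the next run of the job retries this target.
                }
            }
        }
    }
}
```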

nitedmn