
For a data migration project (10 million rows per day) I need a Java framework that can handle multithreaded processing of the data transfer. The processes should be schedulable, monitorable, and dynamically configurable.

I have read several articles on how to cover these requirements with a collection of custom implementations, but I think I can avoid reinventing the wheel by using an existing framework. Which framework is 'state of the art'? I read about Spring Integration, but my data source is a database and I don't see how an event-driven approach would apply.

Even though this sounds like a job for an ETL tool, using an ETL tool is not an option.
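For context, the hand-rolled approach the question is trying to avoid might look roughly like the following sketch: a fixed thread pool that transfers rows in batches and exposes a counter for monitoring. All class, method, and variable names here are illustrative, not from the question, and the JDBC work is stubbed out.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical hand-rolled migrator: transfers batches of rows on a
// fixed thread pool and tracks progress for monitoring.
public class BatchMigrator {
    private final ExecutorService pool;
    private final AtomicLong transferred = new AtomicLong();

    public BatchMigrator(int threads) {
        this.pool = Executors.newFixedThreadPool(threads);
    }

    // Submit one batch of rows for transfer; returns a Future so the
    // caller can wait for completion or check for failures.
    public Future<Long> submitBatch(List<String> rows) {
        return pool.submit(() -> {
            // Placeholder for the real JDBC insert into the target DB.
            return transferred.addAndGet(rows.size());
        });
    }

    // Simple monitoring hook: total rows transferred so far.
    public long transferredSoFar() {
        return transferred.get();
    }

    public void shutdown() throws InterruptedException {
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }

    public static void main(String[] args) throws Exception {
        BatchMigrator migrator = new BatchMigrator(4);
        // Simulate 10 batches of 1,000 "rows" each.
        List<Future<Long>> futures = new ArrayList<>();
        for (int b = 0; b < 10; b++) {
            List<String> batch = new ArrayList<>();
            for (int r = 0; r < 1000; r++) {
                batch.add("row-" + b + "-" + r);
            }
            futures.add(migrator.submitBatch(batch));
        }
        for (Future<Long> f : futures) {
            f.get(); // wait for every batch to finish
        }
        migrator.shutdown();
        System.out.println("transferred=" + migrator.transferredSoFar());
    }
}
```

Scheduling, dynamic reconfiguration, and error handling would all still have to be built on top of this, which is exactly the wheel-reinventing the question wants to avoid.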

ABX
  • This question looks very off-topic here, but I'd suggest Apache Camel. You'll be able to choose between batch record processing and JDBC result-set row streaming, and apply any EIP you need to transform and save your data. It also has good monitoring capabilities via JMX and the hawtio console. – Konstantin V. Salikhov Nov 08 '14 at 08:10
  • Off-topic? Oh, why that? Anyway, your comment points in the direction I was aiming for. While checking Apache Camel I found a very good blog entry http://www.kai-waehner.de/blog/2012/01/10/spoilt-for-choice-which-integration-framework-to-use-spring-integration-mule-esb-or-apache-camel/ which compares the three open-source 'big players' in the integration-pattern area. – ABX Nov 08 '14 at 12:06

1 Answer


OK, Apache Camel is the one I was looking for. Have a look at What exactly is Apache Camel? for further details.
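Building on the comment above, a Camel route for this scenario could be sketched as below. This is an assumption-laden illustration, not a confirmed solution: it assumes the `camel-sql` and `camel-quartz` components are on the classpath, and the endpoint URIs, table name, and the `transferProcessor` bean are all hypothetical placeholders.

```java
import org.apache.camel.builder.RouteBuilder;

// Sketch of a scheduled, multithreaded DB-to-DB route in Camel's Java DSL.
// URIs, table names, and bean names are illustrative assumptions.
public class MigrationRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("quartz://migrate?cron=0+0+2+*+*+?")              // scheduled trigger (2 AM daily)
            .to("sql:select * from source_table?outputType=StreamList")
            .split(body()).streaming()                          // stream rows instead of loading all 10M
            .threads(8)                                         // process rows on a thread pool
            .bean("transferProcessor")                          // hypothetical transform/insert bean
            .routeId("dbMigration");                            // route is then visible via JMX for monitoring
    }
}
```

The `streaming()` split and `threads()` DSL calls cover the multithreading requirement, the Quartz endpoint covers scheduling, and Camel's JMX support covers monitoring, which matches the three requirements in the question.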

ABX