
I am required to set up a web application that will interact with an existing ERP system (WinMagi). The ERP is essentially a front-end to an xBase (FoxPro) database located on an in-house server. As far as I'm aware the ERP doesn't have an API, but it can accept purchase orders, etc. through an EDI module. The web application should be able to accept online orders and query data for reporting.

My plan so far:

  1. Synchronize the xBase DB to a SQL server instance on a cloud hosted VM.
    • (one-way from ERP -> SQL Server)
  2. Use this sync process as an interface between the ERP and web application.
  3. Push purchase orders back to the ERP using EDI.
    • My thinking here is that it would be safer from a data concurrency perspective to create or update data in the ERP through a controlled and accepted (by the ERP) interface.

Questions/Concerns:

  1. What is the best way to update the SQL DB from the xBase DB? Are there any pre-existing libraries that can do this so I don't have to reinvent the wheel?
  2. Would the xBase DB become locked during sync? Or otherwise cause an issues for the live ERP?
  3. How do I avoid data concurrency / integrity problems during the sync?
  4. This system wouldn't be serving live data to the web app. What sort of issues can I expect due to this?
  5. Should I prefer one language over another for this sort of project? My plan was to use Java/Hibernate MVC.

Am I perhaps going about this the wrong way? Would I be better off interfacing my web app directly with the xBase DB? Some problems that immediately spring to mind with this approach are networking issues between the office and the cloud-based VM and potential security vulnerabilities from opening up the ERP directly to the internet.

Any advice or suggestions you might be able to provide would be greatly appreciated! Thanks in advance.

UPDATE - 3 Sep 2012

How I'm currently doing the data copy (it's not a synchronization) - runs nightly:

  1. A linux box in the office copies the required DBFs from a read-only share on the ERP server to local storage.
  2. The DBFs are converted to CSV using Dave Burton's fantastic dbf2csv perl script
  3. The resulting CSVs are rsync'd to the remote VM. There are only small changes in the data so this is quite fast.
  4. Once the rsync is complete the remote VM does a mysqlimport to the production DB.
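The nightly run above could be scripted as a simple command pipeline. A minimal sketch in Python follows; the share path, staging directory, rsync target, and database name are all hypothetical stand-ins, and `dbf2csv` here refers to the Perl script mentioned in step 2:

```python
import subprocess
from pathlib import Path

DBF_SHARE = Path("/mnt/erp_share")        # read-only share on the ERP server (hypothetical path)
WORK_DIR = Path("/var/local/erp_copy")    # local staging area (hypothetical path)
REMOTE = "webapp@remote-vm:/srv/import/"  # rsync target on the cloud VM (hypothetical)

def build_commands(tables):
    """Return the shell commands for one nightly run, in order."""
    cmds = []
    for t in tables:
        # 1. copy the DBF from the read-only share to local storage
        cmds.append(["cp", str(DBF_SHARE / f"{t}.dbf"), str(WORK_DIR)])
        # 2. convert the DBF to CSV with the dbf2csv Perl script
        cmds.append(["dbf2csv", str(WORK_DIR / f"{t}.dbf"),
                     str(WORK_DIR / f"{t}.csv")])
    # 3. rsync only the changed bytes to the remote VM
    cmds.append(["rsync", "-az", str(WORK_DIR) + "/", REMOTE])
    # 4. on the remote VM: bulk-load the CSVs into the production DB
    #    (mysqlimport derives each table name from its file name)
    cmds.append(["mysqlimport", "--local", "erp_mirror"] +
                [f"/srv/import/{t}.csv" for t in tables])
    return cmds

def run(tables):
    for cmd in build_commands(tables):
        subprocess.run(cmd, check=True)  # abort the nightly run on first failure
```

Separating command construction from execution makes the run order easy to inspect (or dry-run) before anything touches the live data.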

Advantages of this approach

  • The ERP cannot be damaged in any way as the network access is read-only.
  • No custom logic has to be implemented to sync data and hence there are no concerns that the data could be wrong on the remote VM.
  • As the data copy runs at night the run time isn't too important.
    • Current run time is approx 7 minutes for over 1 million records with approx 20-30 fields per record.
    • Longest phases are the DBF copy and conversion to CSV.

Disadvantages

  • The DBFs have to be copied in full every time.
  • The DBFs have to be converted in full every time.
  • Tables that are being copied are locked during the mysqlimport. This isn't really too much of an issue though as the import runs during the night and the mysqlimport only takes about 20 seconds.
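The full-copy and full-convert disadvantages could be softened by skipping tables whose DBF hasn't changed since the last run. A minimal sketch, assuming checksums from the previous run are kept in a local state file (the file name is hypothetical):

```python
import hashlib
import json
from pathlib import Path

STATE_FILE = Path("dbf_checksums.json")  # hypothetical per-run state file

def file_digest(path):
    """MD5 of a file's contents, read in chunks to handle large DBFs."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def changed_tables(dbf_paths):
    """Return only the DBFs whose contents differ from the previous run."""
    old = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
    new, changed = {}, []
    for p in dbf_paths:
        digest = file_digest(p)
        new[p.name] = digest
        if old.get(p.name) != digest:
            changed.append(p)
    STATE_FILE.write_text(json.dumps(new))
    return changed
```

Note this still reads each DBF in full to hash it, so it saves conversion and copy time rather than read time.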
max
  • I'm thinking an ETL tool like Talend, CloverETL or Pentaho would be a good fit for you. This way you aren't reinventing the wheel and are able to control the sync. – Andrew Mar 12 '12 at 16:04
  • Thanks for the suggestion @Andrew. I'll look into it. Do you have any other ideas about the other questions/concerns? Do you know if the xBase DB would become locked and cause issues for the in-house ERP? – max Mar 12 '12 at 21:22
  • if you're selecting from the xBase DB, you should be able to use NOLOCK statements. This, of course, is a "dirty read", and has some drawbacks, but it is an option. – Andrew Mar 13 '12 at 13:21
  • I need to build something similar. Did you manage to find a better solution? – Gurinder Singh Jun 11 '15 at 08:01

2 Answers

  1. If you are using Visual FoxPro 3.0 or greater, you could use the built-in database container (DBC) to create a connection to the SQL Server DB. The views in the .DBC would then do the heavy lifting of reading and updating the SQL Server tables.

  2. I would envision a routine that loops through your FoxPro table, reads the rows, and then makes the updates to the SQL Server DB. That way the FoxPro tables shouldn't be locked. To ensure this, you could first query the DBFs into a cursor, then loop through the cursor.

  3. I would suggest adding a procedure to do concurrency checking.

Another option to serve live FoxPro data in your web apps would be to create a linked server in SQL Server to your FoxPro database. That way your FoxPro data could be accessed in real time.

Jerry
  • Thanks @Jerry. I don't really want to go down the road of using Visual Foxpro. I do like the idea of the linked SQL Server though. – max Sep 03 '12 at 07:28

I am currently doing something similar - I have to make invoice transactions from a FoxPro-based system available through a web application that will be on a remote, hosted VM running SQL Server.

I will answer your first point based on what I'm doing - you can decide for yourself whether it would work for you!

What is the best way to update the SQL DB from the xBase DB? Are there any pre-existing libraries that can do this so I don't have to reinvent the wheel?

I didn't really look for any shared libraries. What I did was (somewhat simplified):

  1. Added a field to the ERP-side transaction table that holds a CRC32 value based on other fields that I want to detect changes to (for example, the transaction balance).

  2. Wrote a standalone EXE that scans the ERP-side transaction table on a timer, calculates a CRC32 value based on some fields, compares this to the last CRC32 value stored in the new field from point 1, and if different then something has changed and the transaction needs to be re-sent. This EXE was written in VFP for simplicity in accessing DBF files, and it runs as a Windows service. When I get time it will be re-done in C#.

  3. Still in this EXE, once I have a list of new or changed transactions I convert them to JSON. I rolled my own JSON functions, but you could use Craig Boyd's from Sweet Potato Software or a number of others. There may be a PDF document associated with the transaction; if so, it is encoded and embedded in the JSON.

  4. I send the JSON to a web service on the remote side using a class that leverages the standard Windows WinHTTP library (WinHttp.WinHttpRequest.5.1). The remote web service runs on Java; it decodes everything and updates the SQL Server.
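The CRC32 comparison in steps 1 and 2 can be sketched as follows (here in Python rather than VFP, purely for illustration; the field names and the `sync_crc` column are hypothetical stand-ins for the extra column added to the ERP-side table):

```python
import json
import zlib

# Fields whose changes should trigger a re-send (hypothetical names)
WATCHED_FIELDS = ("invoice_no", "balance", "status")

def row_crc(row):
    """CRC32 over the watched fields of one transaction row."""
    payload = "|".join(str(row[f]) for f in WATCHED_FIELDS)
    return zlib.crc32(payload.encode("utf-8"))

def rows_to_resend(rows):
    """Yield (row, new_crc) for rows that are new or changed.

    Each row is a dict; 'sync_crc' holds the CRC stored during the
    last scan (the extra column from step 1), or None for new rows."""
    for row in rows:
        crc = row_crc(row)
        if row.get("sync_crc") != crc:
            yield row, crc

def to_json(row):
    """Serialise a changed transaction for the remote web service."""
    return json.dumps({f: row[f] for f in WATCHED_FIELDS})
```

After a successful send, the new CRC would be written back to the `sync_crc` column so the row is not re-sent on the next scan.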

Alan B
  • Thanks for the suggestion @Alan. I could probably implement something similar but want to steer clear of making changes to the ERP tables. Please see the update in my question for my implementation. – max Sep 03 '12 at 07:30