
I have done extensive research and I feel that I have good candidates, but I still lack enough knowledge to decide which one I should implement. Ideally, I would like to hear from someone who has actually implemented a solution to a similar problem.

The Problem

Our project consists of a community of 25 distributed nodes. The nodes run on Linux computers and are installed in typical residential settings (behind NAT), widely dispersed both geographically and across ISPs.

Our software on each node collects a variety of its own unique data, which is logged to a MySQL DB on the local host (the node); this DB is not directly WAN-accessible. We also have a web interface for each node that uses the local node DB to let the local user visualize certain data and parameters; this is only accessible on the LAN.

We typically set up and maintain an open SSH port from our labs to each node. All remote node DBs have exactly the same schema but completely different data. We need an automated way to collect all data from all the nodes and get it to our WAN-accessible lab servers (currently Windows 7 servers, but they can be Linux if that provides a better solution). We have narrowed the options down as follows:

Solutions:

  1. Create a .bat (or shell) script that sequentially connects to each node over SSH to import the data (a rough sketch of what I have in mind follows this list).
  2. Use the web interface that runs on each node to periodically query the local DB and save that data to a central MySQL server. I know I can connect to two DBs in PHP, so this seems doable.
  3. Use the MySQL supported “master-slave” replication setup, which would duplicate all the remote databases on the server.
  4. Use the MySQL supported FEDERATED engine setup, which would link local tables on the server to the remote ones (also sketched after this list).
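
For option 1, here is a rough sketch of the kind of script I have in mind (bash rather than .bat, so it assumes a Linux lab server; all host names, credentials, and the database name `nodedb` are placeholders, not our real ones):

```bash
#!/usr/bin/env bash
# Option 1 sketch: pull each node's MySQL DB over the existing SSH port
# with mysqldump, then load it into a per-node schema on the central server.
set -euo pipefail

NODES=(node01.example.net node02.example.net)   # ...all 25 nodes

for i in "${!NODES[@]}"; do
    host="${NODES[$i]}"
    central_db=$(printf 'node_%02d' $((i + 1)))   # e.g. node_01, node_02, ...

    # --single-transaction keeps the dump consistent without locking the
    # tables that the node software is still writing to.
    ssh "collector@${host}" \
        "mysqldump --single-transaction -u reader -psecret nodedb" \
        > "/tmp/${central_db}.sql"

    # Import into a per-node schema on the central MySQL server.
    mysql -h central-db.lab -u loader -psecret \
        -e "CREATE DATABASE IF NOT EXISTS ${central_db}"
    mysql -h central-db.lab -u loader -psecret "${central_db}" \
        < "/tmp/${central_db}.sql"
done
```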
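
And for option 4, this is roughly what I understand a FEDERATED setup would look like on the central server. Since the node DBs are not WAN-accessible, I assume the remote MySQL port would have to be reached through an SSH tunnel (here 127.0.0.1:3307), and the table and column names are invented for illustration:

```bash
#!/usr/bin/env bash
# Option 4 sketch: create a FEDERATED table on the central server that
# transparently reads one node's table. Requires the FEDERATED engine to
# be enabled (federated in my.cnf) and, in our case, an SSH tunnel such as:
#   ssh -f -N -L 3307:127.0.0.1:3306 collector@node01.example.net
mysql -h central-db.lab -u admin -p <<'SQL'
CREATE TABLE node01_readings (
    id        INT NOT NULL AUTO_INCREMENT,
    logged_at DATETIME,
    value     DOUBLE,
    PRIMARY KEY (id)
) ENGINE=FEDERATED
CONNECTION='mysql://reader:secret@127.0.0.1:3307/nodedb/readings';
SQL
```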

Questions:

  1. Are all these viable solutions?
  2. Are there any major cons I should be aware of for the viable ones?
  3. Are there better solutions available (paid or otherwise)?