3

I'm creating a collaborative web music platform.

At present, I have a simple drum machine working locally, with a JSON file logging all the beats. After punching in a pattern, the data looks like this when logged to the console; a scheduler and play function then iterate through it and play each sound whose step is 'on' at the current beat (a simplified sketch of that loop follows the pattern).

  "kick": [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0],
    "snare": [0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0],
    "hat": [0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 1, 1],
    "tom1": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
    "tom2": [0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0],
    "tom3": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]

So next I want to network this web app, so that two people can edit the pattern at the same time. I'm really struggling and would love some help here. I've looked at Meteor and ShareJS, but have become quite confused.

How can I have a JSON file living on the server that is edited by two users? (They will take it in turns to edit the pattern, like in this game: http://sharejs.org/hex.html#9TGAyPGFOy.) This JSON file needs to be kept up to date in the code at all times, so that the latest version of the track can be played to both users.

Any tips would be great; I feel I'm overcomplicating things here...

Thanks

futureRobin
  • Wouldn't source control such as git (http://git-scm.com/) do the trick? When you pull the other person's work you can merge the files together so that you each have the latest. – Sumner Evans Mar 09 '15 at 13:18
  • I'm guessing the OP wants to have two users edit the pattern and have the server play the adjusted beat back to them both in realtime. – Andy Mar 09 '15 at 13:21
  • The concept is like this game: http://sharejs.org/hex.html#9TGAyPGFOy. The users click the UI, which changes a JSON file, which they can both then replay the latest version of. So both users have control over the main backbone log file on the server. – futureRobin Mar 09 '15 at 13:25

3 Answers

2

I would suggest using Socket.io for this project.

You could have a simple socket server set up, with the following code:

(don't forget to npm install --save socket.io!)

// server.js

// we're using filesystem stuff here
var fs = require('fs');

// set up socket.io
var io = require('socket.io').listen(80);

// load the current json file in (parse it so we're working with a plain object)
var currentJSON = JSON.parse(fs.readFileSync('./beat.json', 'utf8'));

io.on('connection', function(clientSocket) {

   console.log('Client connected!');

   // tell the client that just connected what the current JSON is
   clientSocket.emit('stateUpdate', {currentJSON: currentJSON});

   // listen for an "update" event from a client
   clientSocket.on('update', function(payload) {
       if(payload.updatedJSON) {

           console.log('We got updated JSON!');

           // validate the incoming JSON here (validateBeatJSON is a placeholder - a rough sketch is below the server code)
           if (!validateBeatJSON(payload.updatedJSON)) { return; }

           // update the "currentJSON" variable to reflect what the client sent
           currentJSON = payload.updatedJSON;

           // save the json file so the server will have it when it next starts.
           // (try not to use the *Sync methods for this - they block the server - but they're used here because they're shorter)
           fs.writeFileSync('./beat.json', JSON.stringify(payload.updatedJSON), 'utf8');

           // tell the other clients what the current state is
           clientSocket.broadcast.emit('stateUpdate', {currentJSON: currentJSON});

       }
   });

});
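
A quick note: validateBeatJSON isn't defined above - it's a placeholder for whatever checks you want to run. A rough sketch, based on the six-track, sixteen-step pattern in the question, might be:

// rough sketch of validateBeatJSON - returns true only if the payload is an
// object of 16-step arrays containing nothing but 0s and 1s
function validateBeatJSON(json) {
    if (typeof json !== 'object' || json === null) {
        return false;
    }
    return Object.keys(json).every(function(track) {
        var steps = json[track];
        return Array.isArray(steps) &&
            steps.length === 16 &&
            steps.every(function(step) { return step === 0 || step === 1; });
    });
}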

//client.html
<script src="socket.io.js"></script>
<script>
 var currentJSON = [];
 var socket = io(); // TIP: io() with no args does auto-discovery
  socket.on('stateUpdate', function (payload) {
     currentJSON = payload.currentJSON;
  });

  // blah blah, set event for when the json is updated
  window.addEventListener('updatedJSONLocally', function(e) {
    socket.emit('update', {updatedJSON: currentJSON});
  });

</script>
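
For completeness, the 'updatedJSONLocally' event above is just a stand-in for however your UI signals a change - for example, dispatching a custom event after toggling a step:

// somewhere in your drum machine UI code, after toggling a step:
currentJSON['kick'][0] = 1;
window.dispatchEvent(new CustomEvent('updatedJSONLocally'));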

I basically just typed this into the answer box - this hasn't been tested or whatever, but I think it gives you a good idea of the fundamentals of the Socket.io library for your project.

As mentioned in the comments, I would not advise using the *Sync methods for filesystem operations. They are a little easier to read, but they block the entire process while a file is read or written, which will lead to problems once more than a couple of users are on the project. Look into the asynchronous versions (fs.readFile / fs.writeFile) for ways to combat this. They're not much harder to implement.
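
For example, the synchronous write above could be swapped for something like this (just a sketch - in a real app you'd also want to queue or debounce writes so overlapping updates don't interleave):

// inside the 'update' handler, instead of fs.writeFileSync(...):
fs.writeFile('./beat.json', JSON.stringify(payload.updatedJSON), 'utf8', function(err) {
    if (err) {
        // the write failed - log it, but don't crash the server
        console.error('Could not save beat.json:', err);
        return;
    }
    console.log('beat.json saved');
});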

Good luck!

Tom Hallam
  • Don't forget that writing the file synchronously is an easy way to implement this. In a real production environment, one should not block the whole process with synchronous file writes. With async writes, you should have some kind of locking/queueing mechanism in place. – Capaj Mar 10 '15 at 12:46
  • @freshnode done. Working well synchronously currently, but I will be switching to async in the near future. Thanks again, really helpful – futureRobin Mar 11 '15 at 13:55
0

Let's simplify this as much as possible.

The simplest solution would be to keep a JavaScript object in memory (in your Node.js server) and let the users edit it through a single API call, like:

app.put('/composition', updateJSONObject)

The updateJSONObject function will read the in-memory JavaScript object, edit it, and return the updated one.

app.get('/composition', getJSONObject)

This route will return the latest JSON object.

(I'm assuming you're using express.js here)

Then set up a simple setInterval to periodically save your JavaScript object to the file system using fs.createWriteStream (once every minute?). This persists the object as something like filename.json, which you can read back in if the in-memory object is blank (for example, after a server restart). A sketch of the whole thing is below.

Handling collisions is a bigger problem, but since you don't want to overcomplicate your solution, this is simple and gets the job done unless you're expecting 100 edits a second.
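
Putting the above together, a minimal sketch might look something like this (route handlers, port and file name are just placeholders, and error handling is omitted):

// rough sketch - assumes Express and body-parser are installed
var express = require('express');
var bodyParser = require('body-parser');
var fs = require('fs');

var app = express();
app.use(bodyParser.json()); // parse JSON request bodies

var composition = {}; // the in-memory JavaScript object

// PUT /composition - replace the in-memory object with whatever the client sends
function updateJSONObject(req, res) {
    composition = req.body;
    res.json(composition);
}

// GET /composition - return the latest in-memory object
function getJSONObject(req, res) {
    res.json(composition);
}

app.put('/composition', updateJSONObject);
app.get('/composition', getJSONObject);

// persist to disk once a minute so a restart doesn't lose everything
setInterval(function() {
    var stream = fs.createWriteStream('./composition.json');
    stream.end(JSON.stringify(composition));
}, 60 * 1000);

app.listen(3000);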

ncabral
-2

I think a simple PHP script on your server would do the trick here.

You can have a JSON/txt file for each session (maybe generate a random ID for each one) and use AJAX to send the data from both users to the server, which will then update the JSON file.

You can use jQuery for AJAX:

$.post("myscript.php", { data: 'json goes here' });

In your PHP, you can then get the POST data:

$data = $_POST['data'];

This will be a string that you can save on your server. Here is a tutorial on saving files with PHP.

To get the data back, you simply need to reload the JSON file from the server. I'd suggest saving the whole thing as a string (JSON.stringify() in JavaScript), and then using JSON.parse() to decode it when you get it back.
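
On the client side, the round trip might look roughly like this (the URLs are placeholders for your own scripts, and this assumes the PHP echoes the saved string back as plain text):

// send the current pattern up as a string
$.post("myscript.php", { data: JSON.stringify(currentJSON) });

// later, pull the latest saved pattern back down and decode it
$.get("getlatest.php", function(response) {
    currentJSON = JSON.parse(response);
});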

Given that the users will take turns, this architecture should work just fine.

LMK if there's anything I'm being unclear about, and good luck!

  • The OP is already using Node.js by the looks of things - no need to add more languages into the mix. This also doesn't really help in terms of the collaboration being realtime (which you would expect a music generation project to be). – Tom Hallam Mar 09 '15 at 15:08