
I'm designing a database monitoring application. The database will be hosted in the cloud, and record-level access to it will be provided via custom-written clients for Windows, iOS, Android, etc. The basic scenario can be implemented via web services (ASP.NET Web API); for example, the client makes a GET request to the web service to fetch an entry. However, one of the requirements is that the client should automatically refresh its UI when another user (using a different instance of the client) updates the same record, and the auto-refresh needs to happen within a second of the record being updated, so that the info is always up to date.

Polling could be an option, but the active clients could number in the hundreds of thousands, so I'm looking for a more robust solution that is lightweight on the server. I'm versed in .NET and C++/Windows, and I could roll out a complete solution in C++/Windows using I/O Completion Ports, but that feels like overkill and would require too much development time. I looked into ASP.NET Web API, but its limitation is that it cannot push notifications out to clients. Are there any frameworks/technologies in the Windows ecosystem that can address this scenario and also scale easily? Any good options outside the Windows ecosystem, e.g. Node.js?

John Saunders
tunafish24
  • SignalR? Have the POST to the WebAPI push a notification out to all connected clients that they are to refresh their data. – Brendan Green Dec 11 '14 at 03:46
  • SignalR is a layer on top of various other technologies (WebSockets, polling, etc.). I'm looking for a lower-level solution. I found PushStreamContent, but can't figure out whether it will scale to hundreds of thousands of connections. Any ideas? – tunafish24 Dec 15 '14 at 06:21
  • Use SignalR or a Node.js-based solution. Analyse how stock trading applications do it. – Amit Dec 16 '14 at 07:15
  • What's the client type, a web page or a standalone application? – Steve Dec 16 '14 at 21:36
  • @Steve Clients will be Mobile and desktop applications, written in C# or C++. – tunafish24 Dec 17 '14 at 00:02
  • Is a P2P network allowed for your project? – Steve Dec 17 '14 at 00:07
  • @Steve Updates have to come directly from the server. While I know low-level socket programming and can implement such a solution, it obviously would require much more development/testing time, so I'm trying to avoid going down that path. A solution using ASP.NET or some other .NET technology would be much preferred. – tunafish24 Dec 17 '14 at 03:07
  • @tunafish24 A web server is not possible due to the way HTTP works. Sockets are your only way out. – Steve Dec 17 '14 at 03:09
  • Check out how Firebase works. They use sockets and create a custom method to handle one of five events; Firebase on() events include: val, child_added, child_changed, child_removed, child_moved. Val events are always triggered last. From what I've read, you should be able to handle 1,400 to 1,800 concurrent connections per server without performance issues. – zipzit Mar 12 '15 at 18:00

2 Answers


You did not specify a database, so if you are able to use MS SQL Server, you may want to look up its SQL Dependency feature. If configured and used correctly, you will be notified of any changes in the database.

Pair this with SignalR, or any real-time front-end framework of your choice, and you'll have the real-time updates you described.
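For concreteness, here is a minimal sketch of that pairing in the SignalR 2.x style of this question's era. The names `RecordsHub`, `RecordsController`, and the client-side `recordUpdated` handler are illustrative, not anything SignalR prescribes; the idea is simply that the Web API action which updates a record also broadcasts the changed record's id, and each connected client re-fetches it.

```csharp
using Microsoft.AspNet.SignalR;   // SignalR 2.x server package
using System.Web.Http;

// An empty hub is enough: clients connect to it, and the server
// only uses it as a broadcast channel.
public class RecordsHub : Hub { }

public class RecordsController : ApiController
{
    // PUT api/records/5 -- persist the update, then notify everyone.
    public void Put(int id)
    {
        // ... save the record to the database here ...

        var hub = GlobalHost.ConnectionManager.GetHubContext<RecordsHub>();
        // Dynamic dispatch: invokes the client-side "recordUpdated"
        // handler, which should GET the record again and refresh the UI.
        hub.Clients.All.recordUpdated(id);
    }
}
```

Broadcasting only the id keeps the push payload tiny; each client can then check whether it is currently displaying that record before deciding to re-fetch.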

One catch, though, is that SQL Dependency only tells you that something changed. You are responsible for tracking down which record it was. That adds an extra layer of difficulty, but it is still much better than polling.
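A rough sketch of the subscription side, assuming SQL Server with Service Broker enabled and a connection string of your own (the table and column names here are made up). Note the query restrictions SqlDependency imposes (explicit column list, two-part table names, no `SELECT *`), and that each notification fires only once, so the handler must re-subscribe:

```csharp
using System.Data.SqlClient;

static class RecordWatcher
{
    // Call SqlDependency.Start(connStr) once at application startup.
    public static void Watch(string connStr)
    {
        var conn = new SqlConnection(connStr);
        var cmd = new SqlCommand(
            "SELECT Id, Payload, UpdatedAt FROM dbo.Records", conn);

        var dep = new SqlDependency(cmd);
        dep.OnChange += (sender, e) =>
        {
            // e.Info says only *that* rows were inserted/updated/deleted,
            // never *which* row: re-query and diff against a cached copy
            // to find the changed record, then broadcast it (e.g. SignalR).
            Watch(connStr);   // subscriptions are one-shot; renew here
        };

        conn.Open();
        using (cmd.ExecuteReader()) { }  // executing registers the subscription
        conn.Close();
    }
}
```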

You may want to search through the sqldependency tag here on SO to get from here to where you want your app to be.

Andy Refuerzo
  • SignalR can't handle hundreds of thousands of concurrent connections; or should I say, the web server itself can't. – Steve Dec 17 '14 at 20:17
  • SignalR itself can be configured to handle "hundreds of thousands" of concurrent connections. See the docs. But one could assume that since it will be deployed in the cloud, you can beef up the web server to a certain degree to handle that kind of load. – Andy Refuerzo Dec 17 '14 at 21:53
  • Sure... it can claim that it would be able to handle hundreds of thousands of connections at once, but remember, it's not possible at the hardware level. – Steve Dec 17 '14 at 23:45
  • Oh come on @Steve :) In .NET 4.0, the default `maxConcurrentRequestsPerCPU` is 5000. Get a 32-core EC2 instance or something like that and you easily have a theoretical 160,000. :P – Andy Refuerzo Dec 18 '14 at 00:03
  • You might as well do something like making an HTTP request every 1 s to pull info and see if the server likes it. I am pretty sure the server would die fairly quickly after it went over 10k concurrent connections. BTW, the property is called PerCPU, not PerCore; not sure if that's just a bad name. Just look at the number of concurrent WebSockets allowed by the hosting providers, and you will see most of them are under 1k. I am sure you know why that is. – Steve Dec 18 '14 at 00:07
  • All I'm saying is that "it's not possible at the hardware level" is no longer true with the advent of the cloud. Well, we can't force MS and Amazon to use the same naming convention when referring to the processing unit, right? – Andy Refuerzo Dec 18 '14 at 00:10
  • To me, CPU refers to the actual CPU itself, not the cores inside it; note you can have multiple CPUs in one workstation, or in the cloud. And the limitation still exists in cloud computing: the host provider definitely imposes limits to keep you from stressing the cloud. So a web server is not a feasible solution, since it is just not the right tool for the job; if you want persistent connections, get a private server. – Steve Dec 18 '14 at 00:31
  • OK, if it's a matter of terminology, let me correct myself: "Get a 32-vCPU EC2 instance..." Why would they let you get a pre-configured instance and then limit you from maxing it out? It just doesn't make sense. I'm sticking with the requirements set by the OP. But hey, if you say it's not feasible, that is your opinion and I'll respect it. :) Cheers! – Andy Refuerzo Dec 18 '14 at 00:41

My first thought was a web service call that "stays alive", or the HTML5 protocol called WebSockets. You can maintain a lot of connections that way, but hundreds of thousands seems too large, so the web service needs a way to contact clients over stateless connections. One option is to build a web service into the client that the server can communicate with, though this may be a problem due to firewalls.

If firewalls are not an issue, you may not need a web service in the client; you can instead implement a server socket on the client.
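If you do go the client-side-listener route (only viable when the client is directly reachable, i.e. no NAT or firewall in the way), the client's end is just an ordinary listening socket. A minimal sketch with `System.Net.Sockets`; the port number and the convention of pushing a bare record id are assumptions for illustration:

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;

// Client side: listen for a push from the server, then refresh.
var listener = new TcpListener(IPAddress.Any, 8765);  // port is an assumption
listener.Start();
while (true)
{
    using (var peer = listener.AcceptTcpClient())
    {
        var buf = new byte[4096];
        int n = peer.GetStream().Read(buf, 0, buf.Length);
        string recordId = Encoding.UTF8.GetString(buf, 0, n);
        // ... re-fetch record `recordId` from the Web API and refresh the UI ...
    }
}
```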

For mobile clients, if implementing a server socket is not possible, use push notifications instead. See https://stackoverflow.com/a/6676586/4350148 for a similar issue.

Finally, you may want to consider a content delivery network.

One last point: hopefully you don't need to contact all 100,000 users within one second. I am assuming that with so many users you have quite a few servers.
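To make the server-count point concrete, a back-of-envelope calculation (both numbers are assumptions, not measurements): if each box sustains on the order of 20,000 open connections, "hundreds of thousands" of clients already implies a small fleet.

```csharp
// Ceiling division: how many servers to hold every connection at once?
int activeClients = 300_000;   // assumed "hundreds of thousands"
int connsPerServer = 20_000;   // assumed sustainable connections per box
int serversNeeded = (activeClients + connsPerServer - 1) / connsPerServer;
System.Console.WriteLine(serversNeeded);   // prints 15
```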

Take a look at Maximum concurrent Socket.IO connections regarding the maximum number of open WebSocket connections.

Also consider whether your estimate of on the order of 100,000 simultaneous users is accurate.

dan b