
Background - I am not a programmer. I trade spot forex on an intraday basis and am willing to learn programming.

Specific Query - I would like to know how to export into Excel, in real time, the 'top of book' price and volume data as displayed on the LMAX level 2 widget/frame at - https://s3-eu-west-1.amazonaws.com/lmax-widget/website-widget-quote-prof-flex.html?a=rTWcS34L5WRQkHtC

In essence I am looking to export

  1. price and volume data where the coloured flashes occur.
  2. price and volume data for when the coloured flashes do not occur.

I understand that 1) and 2) together will encompass all of the top-of-book prices and volumes. However, I would like to keep 1) and 2) separate/distinguished as far as data collection is concerned.

Time period for which the collected data is intended to be stored -> 2-3 hours.

What kind of languages do I need to know to do the above? I understand that I will need to be an advanced Excel user too.

Long term goals - I intend to use the above information to make discretionary intraday trading decisions. In the long run I will get more involved in creating an algo or indicator to help with the decision-making process, which would include the information above.

I have understood that one needs to know how to code to get involved in activities such as the above, hence I have started learning C++ - mostly to get a feel for coding. I have been searching all over the web for where to start in this endeavour, but I am quite confused and overwhelmed by all the information. So, apart from the specific data-export query, any additional guidance would also be helpful.

As of now I use MT4 to trade, so I believe that to do the above I will need more than just MT4.

Any help would be highly appreciated.


1 Answer


Yes, MetaTrader4 is still not able (in spite of all the white-labelled terminals' OrderBook add-on marketing and PR efforts) to provide OrderBook L2/DoM data to your MQL4 / New-MQL4 algorithm for any decision making. Third-party software integration is needed to make MQL4 code aware of the real-time L2/DoM data.

The LMAX widget has an impressive look & feel; however, for your Excel export it would require a lot of programming effort to re-use it as an automated scanner producing the data for 1. and 2., and there may be further, non-technical trouble in the form of legal/operational restrictions on operating an automated scanner against such a data source. As an example, one data publisher's policy restricts automated options-pricing scanners for options on { FTSE | CAC | AMS | DAX }: they may re-visit the online published data source no more than once per quarter of an hour, and get blocked/black-listed otherwise. So care and proper data-source engineering are in order.
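Respecting such an access policy can be handled with a simple throttle around the polling loop. A minimal sketch (the 900-second interval matches the once-per-quarter-hour example above; the class name and the injectable clock are illustrative choices, not part of any real API):

```python
import time

class ThrottledPoller:
    """Permits a poll of a data source no more often than
    min_interval_s seconds, to respect publisher access policies."""

    def __init__(self, min_interval_s, clock=time.monotonic):
        self.min_interval_s = min_interval_s
        self._clock = clock        # injectable for testing
        self._last_poll = None

    def may_poll(self):
        """True if enough time has elapsed; records the poll time."""
        now = self._clock()
        if self._last_poll is None or now - self._last_poll >= self.min_interval_s:
            self._last_poll = now
            return True
        return False

# e.g. poller = ThrottledPoller(900)  # at most once per 15 minutes
```

The actual scanning call would only run when `may_poll()` returns `True`; everything else is skipped, so the source is never hit more often than its policy allows.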

The size of the data collection is another issue. Excel has restrictions on the number of rows/columns that can be imported, and the larger the data files, the more likely a CSV import is to hit those limits. L2/DoM data collected for 2-3 hours for just one single FX Major may go beyond such a limit, as there are many records per second (tens, if not hundreds, with just a few milliseconds between them). A static file of the collected data records typically takes several minutes just to be written to disk, so a properly distributed processing data-flow design and non-blocking file-IO engineering are a must.
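One simple way around the import limit is to split the record stream across several CSV files, each small enough for one worksheet. A sketch, assuming modern `.xlsx` Excel (1,048,576 rows per sheet; older `.xls` allows only 65,536) and a simple `(timestamp, price, volume)` record shape:

```python
import csv

EXCEL_ROW_LIMIT = 1_048_576          # rows per .xlsx worksheet

def write_chunked_csv(records, basename, max_rows=EXCEL_ROW_LIMIT - 1):
    """Writes an iterable of (timestamp, price, volume) records into a
    series of CSV files, each importable into one Excel sheet
    (one row of the limit is reserved for the header)."""
    header = ("timestamp", "price", "volume")
    paths, writer, fh, count, part = [], None, None, 0, 0
    for rec in records:
        if writer is None or count >= max_rows:
            if fh:
                fh.close()
            part += 1
            path = f"{basename}_{part:03d}.csv"
            fh = open(path, "w", newline="")
            writer = csv.writer(fh)
            writer.writerow(header)
            paths.append(path)
            count = 0
        writer.writerow(rec)
        count += 1
    if fh:
        fh.close()
    return paths
```

Note this only addresses the Excel-side limit; the non-blocking write path discussed above is a separate concern.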

Real-time system design is the right angle from which to view the problem, rather than treating it as just a programming-language exercise. Mastering a programming language is a great move, nevertheless; robust real-time system design - and trading software is such a domain - requires, with all respect, a lot more insight and hands-on experience than it takes to make MQL4 code run multi-threaded and multi-processed with a few DLL services in a cloud/grid-based distributed processing system.
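The non-blocking file-IO point above can be illustrated with the classic producer/consumer split: the capture loop only enqueues records, and a background thread drains the queue to disk, so a slow write never stalls the feed. A minimal sketch (class and method names are illustrative):

```python
import queue
import threading

class NonBlockingWriter:
    """Decouples tick capture from disk writes: the capture thread only
    enqueues lines; a background thread drains the queue to the file."""

    def __init__(self, fh):
        self._fh = fh
        self._q = queue.Queue()
        self._t = threading.Thread(target=self._drain, daemon=True)
        self._t.start()

    def write(self, line):
        """Called from the capture loop; never blocks on file IO."""
        self._q.put(line)

    def close(self):
        self._q.put(None)            # sentinel: drain remainder, then stop
        self._t.join()
        self._fh.flush()

    def _drain(self):
        while True:
            line = self._q.get()
            if line is None:
                return
            self._fh.write(line + "\n")
```

A production pipeline would add batching, back-pressure and error handling, but the separation of concerns is the same.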

How much real-time traffic is expected to be there?

For just a raw idea of what the Market can produce per second, per millisecond, per microsecond, let's view a NYNEX traffic analysis for one instrument:

One second can have this wild relief: [figure: real-time traffic per second]

And once looking into 5-msec sampling: [figure: real-time traffic at 5-millisecond resolution]

How to export

  1. Check whether the data-source owner legally permits your automated processing.
  2. Create your own real-time DataPump software, independent of the HTML-wrapped widget.
  3. Create your own 'DB-store' to efficiently off-load the scanned data records from the real-time DataPump.
  4. Test the live data-source >> DataPump >> DB-store performance and robustness, verifying it can serve an error-free 24/6 duty for several FX Majors in parallel.
  5. Integrate your DataPump-fed DB-store as a local data source for on-line/off-line interactions with your preferred { MT4 | Excel | quantitative-analytics } package.
  6. Add monitoring for any production-environment irregularity in your real-time processing pipeline, which may range from network issues, VPN/hosting issues and data-source availability issues to an unexpected change in the scanned data source's format or access conditions.
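Steps 2. and 3. can be sketched end-to-end in a few lines. This is a toy illustration only - the table layout, the `flashed` flag (marking requirement 1. vs 2. from the question) and the function names are assumptions, and the real feed access from step 2. is stubbed out entirely:

```python
import sqlite3

def make_store(path=":memory:"):
    """Step 3: a minimal 'DB-store' for scanned top-of-book records."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS top_of_book (
                      ts_ms   INTEGER NOT NULL,
                      side    TEXT    NOT NULL,   -- 'bid' or 'ask'
                      price   REAL    NOT NULL,
                      volume  REAL    NOT NULL,
                      flashed INTEGER NOT NULL    -- 1 = coloured flash, 0 = none
                  )""")
    return db

def pump(db, records):
    """Step 2 (off-load side): the DataPump writes a batch of
    (ts_ms, side, price, volume, flashed) records into the store."""
    db.executemany("INSERT INTO top_of_book VALUES (?,?,?,?,?)", records)
    db.commit()
```

Keeping `flashed` as a column means the two data sets stay in one store but remain separable with a trivial `WHERE flashed = 1` / `WHERE flashed = 0` query, which is exactly the 1./2. distinction the question asks for.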
  • That was a big help and very insightful. Would it be possible to suggest/recommend a few languages to look into for the above work? As I understand it, different languages have different strengths. For example, I am quite sure I will need to create some form of database, and based on my personal research (which can be flawed) I will have to learn something like SQL. In this case I guess it would be relevant for the 'DB-store' component. If there is any other software or application in which I will be able to perform all of the above in an integrated manner, please do feel free to make suggestions. – upliftingmania Oct 03 '14 at 12:36
  • **Good to know you consider it helpful**. Programming languages typically spark battles, each having many fans, a lot of evangelists, a few fanatics, many opinion-based opponents and perhaps some fact-based opponents. The language is IMHO the last thing to decide on for any Project. There are more important, language-dependent questions: Is the language capable? What will be the cost of learning the new language and adapting all the tools to get solid craftsmanship with it? Will the Project be developed by a small or big number of people? What is their previous practice with the respective language(s)? – user3666197 Oct 03 '14 at 12:46
  • Well, your comment grew in length; it will require a bit longer text to answer it. Be rather careful about the real-time vs. off-line tasks. A single `DB-store` typically cannot perform reasonably well on both, so "database" need not mean any classical steam-powered RDBMS. For real-time-sensitive operations, some in-memory data structures will outperform any general-purpose RDBMS, as they do not carry its "overhead", while for off-line work (typically analytics, quantitative modelling and back-testing) there are a lot of tools well suited for data processing that are not an SQL interpreter on an RDBMS. – user3666197 Oct 03 '14 at 12:58
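To make the in-memory vs. RDBMS point in the last comment concrete: the real-time side often only needs a bounded buffer of the most recent ticks, which a plain `collections.deque` provides with O(1) appends and a fixed memory footprint - no database involved at all. A minimal sketch (class name and record shape are illustrative):

```python
from collections import deque

class RecentTicks:
    """In-memory ring buffer for the real-time side: O(1) appends and a
    fixed memory footprint, with none of an RDBMS's overhead. The
    off-line side would read from the persisted DB-store instead."""

    def __init__(self, maxlen=100_000):
        self._buf = deque(maxlen=maxlen)   # oldest ticks drop off automatically

    def add(self, ts_ms, price, volume):
        self._buf.append((ts_ms, price, volume))

    def last(self, n):
        """Up to the n most recent ticks, oldest first."""
        return list(self._buf)[-n:]
```

The deque silently discards the oldest entries once `maxlen` is reached, which is exactly the behaviour a live indicator window wants and a general-purpose database table does not give for free.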