
I have an application which obtains data in JSON format from one of our other servers. The problem I am facing is that there is a significant delay when requesting this information. Since a lot of data is passed (approx. 1000 records per request, where each record is pretty huge), is there a way that compression could help reduce the transfer time? If so, which compression scheme would you recommend?

I read on another thread that the pattern of the data also matters a lot for the type of compression that should be used. The pattern of the data is consistent and resembles the following:

 :desc=>some_description
 :url=>some_url
 :content=>some_content
 :score=>some_score
 :more_attributes=>more_data

Can someone recommend a way to reduce this delay? The delay is approx. 6-8 seconds. I'm using Ruby on Rails to develop this application, and the server providing the data uses Python for the most part.

Sid

2 Answers


I would first look at how much of this 8s delay is related to:

  1. Server-side processing (how long it takes for the data to be generated). There are a lot of techniques to improve this time, including:

    • DB indexes

    • caching

    • a faster to_json library (see the sketch after this list)

Some excellent resources are the NewRelic podcasts on Rails scalability, e.g. http://railslab.newrelic.com/2009/02/09/episode-7-fragment-caching

  2. Transmission delay (how long it takes for the data to be sent between the server and the client)
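
On the serialization point, here is a minimal sketch of swapping in a faster JSON library, assuming the oj gem (my example, not something this answer names) and hypothetical records shaped like the pattern in the question:

    # Oj is a C-extension serializer that is usually much faster than
    # the default to_json on large collections. (Assumed gem: oj)
    require 'oj'

    # Hypothetical records mirroring the pattern in the question.
    records = Array.new(1000) do |i|
      { desc: 'some_description', url: "http://example.com/#{i}",
        content: 'some_content', score: i, more_attributes: 'more_data' }
    end

    # :compat mode keeps the output compatible with the standard to_json.
    json = Oj.dump(records, mode: :compat)

Measuring each side first, as suggested above, tells you whether this generation step or the transfer itself dominates the 6-8 seconds.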

Vlad Zloteanu

gzip can significantly reduce the size of text data and improve load times. It's also recommended by YSlow.
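
As a minimal sketch of the client side in Ruby (the endpoint URL is hypothetical), you can request gzip explicitly and decompress the body with Zlib; when you set the Accept-Encoding header yourself, Net::HTTP returns the raw compressed body, so the conditional below handles both cases:

    require 'net/http'
    require 'zlib'
    require 'stringio'
    require 'json'

    uri = URI('http://data-server.example.com/records.json') # hypothetical
    req = Net::HTTP::Get.new(uri)
    req['Accept-Encoding'] = 'gzip'

    res = Net::HTTP.start(uri.host, uri.port) { |http| http.request(req) }

    # Decompress only if the server actually responded with gzip.
    body = if res['Content-Encoding'] == 'gzip'
             Zlib::GzipReader.new(StringIO.new(res.body)).read
           else
             res.body
           end

    records = JSON.parse(body)

The Python server has to opt in as well, typically by enabling gzip in the framework or in the fronting web server, which is configuration rather than application code.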

Darin Dimitrov