
I'm looking to store a lot of data in a URL hash parameter without exceeding URL character limits.

Are there any conventional ways of compressing a string which could then be decoded on another page load?

I've seen LZW encoding used for similar solutions; however, would the special characters it produces be valid for this use?

Curtis
  • An alternative solution if you have a DB would be to store the uncompressed string, give it an id and use that in the hash. The client could then retrieve the initial arguments through an AJAX request. That's an additional roundtrip, but it might be acceptable. Is the state shareable between clients? If not you could even store that in `localStorage`. – plalx Feb 09 '15 at 16:14
  • @plalx Thanks, the state is shareable between clients, but requirements are for no back-end storage unfortunately. – Curtis Feb 09 '15 at 16:27
  • The problem with an encoding algorithm is that you cannot be sure that you will never exceed the limit, unless you know all potential combinations and that implies static data. I believe that the level of potential compression will also depend on what you are actually encoding. E.g. gzip seems quite good at handling repetitive fragments. – plalx Feb 09 '15 at 16:27
  • What kind of strings are you going to transfer? Which characters are allowed? What is the average length? – georg Feb 09 '15 at 16:55
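A minimal sketch of the `localStorage` variant plalx describes in the comments above (note that it fails the shareable-between-clients requirement, so it's only shown for completeness):

```js
// plalx's localStorage fallback: store the full state locally and put
// only a short key in the hash. Only viable when the state does NOT
// need to be shared between clients (which rules it out here).
const state = { filters: ['price', 'brand'], page: 3 }; // example state
const key = 's' + Date.now().toString(36);              // hypothetical key scheme
localStorage.setItem(key, JSON.stringify(state));
location.hash = key;

// On a later page load:
const restored = JSON.parse(localStorage.getItem(location.hash.slice(1)));
```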

3 Answers


LZW encoding technically works; you'll just need to convert the LZW-encoded binary into URL-safe base64 so that the output doesn't contain special characters. MDN has an article on base64 in JavaScript; the URL-safe variant of base64 just replaces `+` with `-` and `/` with `_`. Of course, you're not likely to reduce the size of your string by much this way unless the data you want to store is extremely compressible.
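A minimal sketch of that base64url step, assuming `bytes` is the `Uint8Array` your compressor produced (the LZW step itself is left to whatever implementation you use):

```js
// Sketch: convert compressed bytes to URL-safe base64 and back.
// Browser environment assumed (btoa/atob).
function toBase64Url(bytes) {
  // Spreading is fine for URL-sized payloads, but would overflow the
  // call stack for very large arrays.
  const base64 = btoa(String.fromCharCode(...bytes));
  // '+' -> '-', '/' -> '_', and drop '=' padding (base64url convention).
  return base64.replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
}

function fromBase64Url(str) {
  const base64 = str.replace(/-/g, '+').replace(/_/g, '/');
  // Browsers' atob() accepts unpadded input, so no re-padding is needed.
  return Uint8Array.from(atob(base64), c => c.charCodeAt(0));
}

// e.g. location.hash = toBase64Url(compressedBytes);
```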

Adam R. Nelson
    Just as a pointer: library [lz-string](https://pieroxy.net/blog/pages/lz-string/index.html)'s `compressToEncodedURIComponent` seems to implement roughly this approach. – ojdo Oct 24 '19 at 13:20
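For what it's worth, the lz-string route from the comment above would look roughly like this, assuming the library is loaded as the global `LZString`:

```js
// Round-trip page state through the URL hash with lz-string.
const state = JSON.stringify({ filters: ['price', 'brand'], page: 3 });
location.hash = LZString.compressToEncodedURIComponent(state);

// ...and on the next page load:
const restored = JSON.parse(
  LZString.decompressFromEncodedURIComponent(location.hash.slice(1))
);
```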

You can look at smaz or shoco, which are designed for the compression of short strings. Most compression methods don't really get rolling until well after your URL length limit, so you need a specialized compressor for this case if you expect to get any gain. You can then encode the binary result using a scheme like Base 64 or a more efficient coding that uses all of the URI-safe characters.
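To make the "more efficient coding" idea concrete, here's a rough sketch that packs bytes into all 66 unreserved URI characters (RFC 3986) instead of base64's 64. The density gain over base64url is under 1%, and this naive big-integer scheme loses leading zero bytes, so treat it as illustrative only; a real coder would length-prefix or pad the input:

```js
// Illustrative only: encode bytes in base 66 using the unreserved
// URI character set (ALPHA / DIGIT / "-" / "." / "_" / "~").
const ALPHABET =
  'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-._~';

function encodeBase66(bytes) {
  // Interpret the byte string as one big integer...
  let n = 0n;
  for (const b of bytes) n = (n << 8n) | BigInt(b);
  // ...then emit base-66 digits, most significant first.
  let out = '';
  while (n > 0n) {
    out = ALPHABET[Number(n % 66n)] + out;
    n /= 66n;
  }
  return out || ALPHABET[0];
}
```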

Mark Adler

You can try JSONCrush, which compresses JSON into URL-friendly strings.
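If your data is JSON, usage looks roughly like this, hedged against the project's README, which documents a `crush`/`uncrush` pair:

```js
// Assumes JSONCrush's documented crush/uncrush API; check the project
// README for the current module format.
import JSONCrush from 'jsoncrush';

const crushed = JSONCrush.crush(JSON.stringify({ filters: ['a', 'b'], page: 3 }));
location.hash = crushed; // output is designed to be URL-friendly

// On the next page load:
const data = JSON.parse(JSONCrush.uncrush(location.hash.slice(1)));
```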

João Melo