
I've written a program that reads a text file from my site containing the MD5 checksum of, and the latest download link for, the needed jar file. The program then compares the given MD5 checksum against that of the local file and downloads the jar if they differ. There is one problem with this, though: if updates come in quick succession, the program keeps re-downloading because it receives a cached copy of the text file rather than the live version. Is there any way to fix this?
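For illustration, here is a minimal sketch of that checksum comparison in Java; the file name, expected hash, and class name are placeholders, not the asker's actual code:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class ChecksumCheck {
    // Computes the MD5 digest of a local file as a lowercase hex string.
    static String md5Of(String path) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        try (InputStream in = new FileInputStream(path)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                md.update(buf, 0, n);
            }
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        // Placeholder: in practice this value comes from the remote text file.
        String expected = "d41d8cd98f00b204e9800998ecf8427e";
        boolean needsDownload = !expected.equalsIgnoreCase(md5Of("app.jar"));
        System.out.println("Download needed: " + needsDownload);
    }
}
```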

grundyboy34

1 Answer


You can add certain cache-control headers to the HTTP request and/or response to prevent the use of cached copies. It's also possible to suppress caching for an entire site, although this is generally ill-advised. See the HTTP caching spec for details about what to add.

Specifically, you can add this header:

Cache-Control: no-cache

to the request headers. See the Cache-Control documentation for the full syntax.
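As a sketch of how that might look in Java with `HttpURLConnection` (the URL is a placeholder):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class NoCacheFetch {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/latest.txt");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // Ask caches along the way not to serve a stored copy.
        conn.setRequestProperty("Cache-Control", "no-cache");
        // Older HTTP/1.0 caches look at Pragma instead.
        conn.setRequestProperty("Pragma", "no-cache");
        // URLConnection also has its own local-cache switch.
        conn.setUseCaches(false);
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            System.out.println(in.readLine()); // first line: the MD5 checksum
        }
    }
}
```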

Note that some HTTP caches do not always respect these headers, so you may need to take other steps, such as adding so-called "cache-busting" extra data to the URI. See, for instance, this thread for one such technique.

If you can use POST instead of GET, that should eliminate most caching problems, because POST responses are not supposed to be cached.
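A minimal sketch of the POST variant, again with a placeholder URL, and assuming the server is willing to answer POST requests for this resource:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class PostFetch {
    public static void main(String[] args) throws Exception {
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://example.com/latest.txt").openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.getOutputStream().close(); // send an empty request body
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            System.out.println(in.readLine());
        }
    }
}
```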

Ted Hopp
    Thank you very much! Cache-Control didn't work, but cache busting did; I can't believe I didn't think of using it here. I simply modified the file URL by appending the current time in millis: `URL url = new URL(fileUrl + "?t=" + System.currentTimeMillis());` – grundyboy34 Jul 21 '13 at 17:26