
I have used Knockout for several projects recently. I have just been working on my first mobile project and have noticed an issue with my network provider (O2) stripping comments out of the source code. This was causing a significant problem on the page, as all of Knockout's virtual elements were being removed along with the comments. After some research, this appears to be caused by software from Bytemobile, which compresses images, strips comments, etc. (more details on this question).

I'm struggling to find out how this software decides to strip out the comments. The "full" sites we have working seem to keep their comments/virtual elements when loaded over mobile data, but the mobile sites have them stripped out.

Does anyone know how Bytemobile/O2 decide whether to leave a page alone? Or know of any workarounds/fixes (other than the one linked) for the problem?

Any help or advice would be appreciated!

Manatherin
    "The software injects a javascript file to compress images, strip comments, etc." ... and how, exactly, would compressing a website *on the client side* work? :) This must be something a transparent proxy between your device and your server does. – Tomalak Aug 15 '13 at 14:47
  • But that doesn't make any sense. JS - injected or not - is only run on the client/device. – Tomalak Aug 15 '13 at 16:25
  • @Tomalak: You've not heard of Node.JS? – Matt Burland Aug 15 '13 at 19:32
  • From reading the link question, it appears that the injected JS has nothing to do with the comment stripping but actually loads the original version of images after they've been stripped out and replaced by poorly compressed version. – Matt Burland Aug 15 '13 at 19:35
  • @Matt LOL! Yes I've heard, and no, that's not how they are doing it. – Tomalak Aug 15 '13 at 19:45
  • @Tomalak Fair enough, I have removed the "injection" bit from the question. But the question stands: is there a way to tell Bytemobile to ignore the site, or how does Bytemobile know the site is a mobile site? I am working around it at the moment by removing the virtual elements, but this does lead to messier markup because of empty divs and such (where a div with a `data-bind="if"` replaces what was a virtual element before). – Manatherin Aug 16 '13 at 07:58
  • Well, that's what I would have recommended. It's not as nice as KO's virtual elements, but it's pragmatic and more robust. And, it's the same number of lines of code, strictly speaking. I'm really not sure if configuring/tricking those transparent web optimization services is a) possible and b) useful (there are N others, you just happened to notice O2's one). I say: Don't waste your time. Virtual elements are too brittle. – Tomalak Aug 16 '13 at 09:00
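The workaround discussed in the comments above — replacing Knockout's comment-based (containerless) bindings with bindings on real elements — looks roughly like this. The `isVisible` and `name` observables are hypothetical, for illustration only:

```html
<!-- Virtual element: lives inside HTML comments, so a comment-stripping
     proxy destroys the binding entirely. -->
<!-- ko if: isVisible -->
<span data-bind="text: name"></span>
<!-- /ko -->

<!-- Equivalent real-element binding: survives comment stripping, at the
     cost of an extra wrapper div in the markup. -->
<div data-bind="if: isVisible">
  <span data-bind="text: name"></span>
</div>
```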

1 Answer


You mentioned that you don't want answers from the linked question, but I think setting the `Cache-Control` header to `no-transform` is the correct way of telling the proxy not to interfere with your page. What part of that solution don't you like?

sean
  • I'm not particularly against it, I just wanted to know if I have other options. There is also no guarantee that other similar services would honor the header. I'd prefer removing the virtual elements in favor of tags rather than relying on that. – Manatherin Aug 16 '13 at 09:38
  • I'd just say that if they don't honor the header, which is at least a standard HTTP 1.1 header, their proxy is simply broken. I guess there's not much you can do about that, especially not in JavaScript land, because the page is already broken by the time it reaches the client. SSL would also solve your problem, but that's even more invasive to your server config. – sean Aug 16 '13 at 13:45