No, unfortunately. PNG encoding is a relatively slow process in itself (you can see this with large images saved in apps such as Photoshop too).
The only effective way to speed things up is to reduce the size of the canvas bitmap you want to encode - the browser is not the optimal tool for handling large image files (dimension-wise), and it was never meant for this sort of thing.
Breaking the canvas down into slices can help you unblock the UI. But since you cannot encode the slices in parallel, and there is overhead in producing each Base64-encoded data-URI, you would need to use an async setTimeout to give the browser time to process the event queue between slices, and of course you would have to piece the slices back together at some point, somewhere. All in all this makes it slower, more error-prone and more complex (complexity isn't necessarily a bad thing in cases such as these, though). It's probably your best bet if reducing the bitmap size is not an option. And as Max points out, there are size limits for data-URIs.
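A minimal sketch of the slicing approach. The `computeSlices` helper and the `encodeInSlices`/`onDone` names are my own illustrations, not a standard API - the idea is just to encode one horizontal strip per timeout tick:

```javascript
// Pure helper: split a bitmap height into horizontal strips.
function computeSlices(totalHeight, sliceHeight) {
  const slices = [];
  for (let y = 0; y < totalHeight; y += sliceHeight) {
    slices.push({ y: y, h: Math.min(sliceHeight, totalHeight - y) });
  }
  return slices;
}

// Browser-only part: encode one slice per timeout tick so the UI
// gets a chance to breathe between the (synchronous) toDataURL calls.
function encodeInSlices(srcCanvas, sliceHeight, onDone) {
  const slices = computeSlices(srcCanvas.height, sliceHeight);
  const tmp = document.createElement('canvas');
  const ctx = tmp.getContext('2d');
  tmp.width = srcCanvas.width;
  const dataURIs = [];
  let i = 0;

  (function next() {
    const s = slices[i++];
    tmp.height = s.h;
    // Copy one strip of the source onto the temp canvas.
    ctx.drawImage(srcCanvas, 0, s.y, srcCanvas.width, s.h,
                             0, 0,   srcCanvas.width, s.h);
    dataURIs.push(tmp.toDataURL('image/png')); // still blocks, but briefly
    if (i < slices.length) setTimeout(next, 0); // yield to the event queue
    else onDone(dataURIs); // caller must stitch the strips back together
  })();
}
```

Note that each `toDataURL` call still blocks for the duration of that slice's encoding; the timeouts only let the browser handle events in between.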
You could dump the raw buffer using getImageData(), but then you would end up with an uncompressed buffer, which has a chain of other implications (size being the main one).
Shared web workers could in theory do the encoding in parallel, but they would very likely be much slower than letting the browser do it in compiled code. You would also have to provide the encoding code yourself... And as with many new features, workers do not yet have full or complete support everywhere.
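If you did go that route, it could look roughly like this. Everything here is a sketch: `worker.js` is assumed to contain your own PNG encoder, and only the row-partitioning helper is concrete:

```javascript
// Pure helper: divide the bitmap's rows as evenly as possible
// among a number of workers.
function partitionRows(totalRows, workerCount) {
  const base = Math.floor(totalRows / workerCount);
  const extra = totalRows % workerCount;
  const ranges = [];
  let start = 0;
  for (let i = 0; i < workerCount; i++) {
    const rows = base + (i < extra ? 1 : 0);
    ranges.push({ start: start, rows: rows });
    start += rows;
  }
  return ranges;
}

if (typeof Worker !== 'undefined' && typeof document !== 'undefined') {
  // Browser-only: hand each worker its strip of the RGBA buffer.
  const canvas = document.querySelector('canvas');
  const ctx = canvas.getContext('2d');
  const image = ctx.getImageData(0, 0, canvas.width, canvas.height);
  partitionRows(canvas.height, 4).forEach(function (range) {
    const worker = new Worker('worker.js'); // must contain the encoder
    worker.postMessage({
      width: canvas.width,
      rows: range.rows,
      // Copy out this worker's strip of rows (4 bytes per pixel).
      data: image.data.slice(range.start * canvas.width * 4,
                             (range.start + range.rows) * canvas.width * 4)
    });
  });
}
```

You would still have to merge the encoded strips afterwards, which is non-trivial for PNG, so this mostly demonstrates why the worker route is more work than it first appears.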