
The example in advanced_outputs.ipynb that sends data from the kernel to the client is almost exactly what I need, and that example references the Jupyter comms documentation. But am I right that, currently, although I can open a new connection to the client from the kernel with data attached, I cannot yet send multiple messages over the same connection?

The workflow that I think I want is e.g., in javascript:

  (async () => {
    google.colab.kernel.comms.registerTarget('example_comms', (comm, message) => {
      document.body.appendChild(document.createTextNode('comm opened.'));
      comm.send('comm opened');
      console.log(comm);
      comm.on_msg(function(msg) {
        var p = document.createElement("p");
        // Read from the incoming msg, not the opening message.
        var t = document.createTextNode(msg.data.foo);
        p.appendChild(t);
        document.body.appendChild(p);
      });
    });
  })()

and in python

from ipykernel.comm import Comm

channel = Comm(target_name='example_comms')

def handle_message(msg):
  print(f"python received message: {msg['content']['data']}")

channel.on_msg(handle_message)

for i in range(10):
  channel.send(data={'foo': i})

But it seems that google.colab.kernel.comms.Comm does not export on_msg, nor, as far as I can see, any other way to send multiple messages over the same channel. The ipykernel.comm.Comm object has a send method, but I don't think it can be used at present?

Here is a notebook with minimal modifications from the advanced_outputs example. Running it throws an error in the console that on_msg is not defined.

I suspect (but admittedly have not yet measured the performance to confirm) that making a new connection for every message is overhead that I want to avoid. Any help is appreciated!

Russ Tedrake
  • Profiling confirms that the cost of opening a new comm on each send is a showstopper for me:

        Line #  Hits    Time      Per Hit  % Time  Line Contents
        ==============================================================
            84   501    895220.0   1786.9   100.0  comm.Comm(target_name="meshcat", data=command.lower())
    – Russ Tedrake Aug 06 '20 at 10:25

1 Answer


I think this is what you are looking for:

from IPython.display import Javascript

comm_ = None

def target_func(comm, msg):
  global comm_
  comm_ = comm  # The connection you are looking for!

  # Register handler for later messages
  @comm.on_msg
  def _recv(msg):
    # Use msg['content']['data'] for the data in the message
    comm.send({'echo': msg['content']['data']})

  @comm.on_close
  def _close(msg):
    global comm_
    comm_ = None

get_ipython().kernel.comm_manager.register_target('comm_target', target_func)

Javascript('''
(async () => {
  const channel = await google.colab.kernel.comms.open('comm_target');

  (async function() {
    for await (const message of channel.messages) {
      console.log(message.data)
    }
  })();
})()
''')

Unlike your approach, this establishes the comm channel from the client to the kernel, which in my opinion is far more natural. At that point you have a persistent connection comm_ that you can use to send messages to the javascript frontend:

comm_.send(data="test")
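With comm_ captured, the questioner's original loop works unchanged over the single channel. Here is a minimal, self-contained sketch; the RecordingComm stub is a hypothetical stand-in for a live ipykernel Comm so the snippet runs outside a notebook:

```python
class RecordingComm:
  """Stand-in for a live ipykernel Comm; just records each send."""
  def __init__(self):
    self.sent = []

  def send(self, data=None):
    self.sent.append(data)

comm_ = RecordingComm()  # in the notebook, target_func sets this instead

# Ten messages over one persistent channel -- no per-message comm setup.
for i in range(10):
  comm_.send(data={'foo': i})

print(len(comm_.sent))  # all ten messages went through the same comm
```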

It is also possible to list every open connection:

get_ipython().kernel.comm_manager.comms

So you can get rid of the ugly global variable and instead monitor changes to this attribute. It can also be used to wait for a connection to be established, for instance.
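As a sketch of that idea (assuming only that the comm manager's .comms dict maps comm ids to open Comm objects, per the ipykernel API; wait_for_comm and the poll interval are hypothetical names, not part of any library):

```python
import time

def wait_for_comm(comm_manager, timeout=10.0, poll=0.1):
  """Block until the frontend has opened a comm, then return it.

  comm_manager is e.g. get_ipython().kernel.comm_manager; its .comms
  dict maps comm ids to open Comm objects.
  """
  deadline = time.monotonic() + timeout
  while time.monotonic() < deadline:
    comms = list(comm_manager.comms.values())
    if comms:
      return comms[0]  # first (oldest) open connection
    time.sleep(poll)
  raise TimeoutError("no comm connection was established")
```

In a notebook you would call wait_for_comm(get_ipython().kernel.comm_manager) right after emitting the Javascript that opens the channel.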

Note that in classic Jupyter Notebook the Python code is the same, but the Javascript is not. Here is the corresponding implementation:

const channel = await Jupyter.notebook.kernel.comm_manager.new_comm('comm_target')

channel.on_msg(function(message) {
    console.log(message.content.data);
});
milembar
  • Thank you! I just (finally) spent some time trying this. But I'm never able to actually get any `console.log` output from inside the `for await (const message of channel.messages)` loop. Do you happen to have a colab notebook where this works? – Russ Tedrake Feb 21 '21 at 16:57
  • I don't have an example out-of-the-box, but yes, I was able to make it work on Colab, though the throughput turned out to be quite limited in my setup (combining a backend zmq server with a javascript notebook frontend, using python to forward messages in both directions). I suggest having a look at my other post about a similar [topic](https://stackoverflow.com/questions/63651823/direct-communication-between-javascript-in-jupyter-and-server-via-ipython-kernel/63666477#63666477). I will try to find some time to write an example based on my simulator [jiminy](https://github.com/Wandercraft/jiminy). – milembar Feb 22 '21 at 10:00