
I am able to connect two ipython console sessions to one kernel by:

$ ipython console
In [1]: %connect_info  
{
   ... Content of JSON with info for connecting ...
}

Paste the above JSON into a file, and connect with:
    $> ipython <app> --existing <file>
or, if you are local, you can connect with just:
    $> ipython <app> --existing kernel-43204.json
or even just:
    $> ipython <app> --existing
    if this is the most recent IPython session you have started.

Accordingly, I can substitute <app> with console again:

$ ipython console --existing kernel-43204.json

However, I want to share my kernel with ipython notebook so I can visualize my data. I tried and failed with:

$ ipython notebook --existing kernel-43204.json
[C 13:35:01.025 NotebookApp] Unrecognized flag: '--existing'

Any suggestions on how I can work with and switch between ipython console and ipython notebook?

Oplatek

4 Answers


There is no UI or API to do that with the notebook; for code simplicity, there is an assumption that the notebook is the one that owns and starts the kernel. To be able to select an already existing kernel, you would have to write your own KernelManager subclass and configure IPython to use it (plus write a bit of UI code, if you want it to be easy to use).
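As a rough illustration of such a subclass (a sketch only, assuming jupyter_client's KernelManager API; the connection-file name is hypothetical):

```python
from jupyter_client import KernelManager

class ExistingKernelManager(KernelManager):
    """Sketch of a manager that attaches to an already running kernel
    instead of launching a new process. Not a supported notebook API."""

    def start_kernel(self, **kwargs):
        # Instead of spawning a kernel subprocess, load the connection
        # info of an existing kernel (file name is illustrative).
        self.load_connection_file("kernel-43204.json")

    def shutdown_kernel(self, now=False, restart=False):
        # Leave the external kernel running when the notebook closes.
        pass
```

You would still need to configure the notebook server to use this class, and add some UI for picking which kernel to attach to, as the answer notes.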

Matt
  • Do you think that this unimplemented feature is against the ipython notebook "ideology"? ... or might it become maintained if implemented? – Oplatek Jul 15 '15 at 17:32
  • I doubt it would be maintained if implemented; there is some inherent abstraction breakage if you do that. Though in a limited fashion, I know that it would be interesting for some people. – Matt Jul 15 '15 at 23:00
  • So, if I understand, notebook allows multiple kernel instances (one per notebook), and multiple kernel types (Kernel->Change Kernel menu), but all kernels must be initiated from the HTML interface. Since the kernel ZeroMQ interface has some security, I was hoping to run only the kernel on a shared server machine and attach a locally running notebook to the existing kernel. Motivated by no solution to this [question](http://stackoverflow.com/questions/32914669/can-an-http-server-listening-on-localhost-be-made-accessible-to-just-one-user). I'm overwhelmed by the steps to configure HTTPS. – NoahR Oct 08 '15 at 21:11
  • Another use case for Jupyter notebooks to support connecting to an already running kernel is to aid in debugging the kernel itself. As it is currently not possible to launch a kernel in a debugger and then have Jupyter notebook connect to that kernel instance being debugged, we have to resort to having Jupyter launch the kernel, then attach a debugger to this already running kernel. This means that it is very difficult if not impossible to debug early kernel startup unless we modify the kernel to somehow wait for a debugger to attach to it. – ack Mar 18 '21 at 16:22

I'll give you a solution the other way around. Instead of connecting a notebook to an existing kernel, you can easily connect an ipython session to a kernel that was started by a notebook.

  1. Start your notebook. Now you have a running kernel.
  2. In a code cell, run the magic command %qtconsole

Now you have a console and the notebook connected to the same kernel. You can run the magic command multiple times and have multiple consoles.

BTW, qtconsole is a very smart console. It is even better than the terminal one, especially if you are a Windows user.

neves

Here's an example of a custom kernel manager that allows a Jupyter notebook to connect to a kernel created externally.

https://github.com/ebanner/extipy

It's a hacky solution at best.

Hopefully the Jupyter folks can create such a custom kernel class, include it in the package, and enable it via a simple --existing switch. I don't see any reason why they can't do that.

Shital Shah

None of the other answers here worked for our use case for various reasons, but after a fair amount of hacking we found the following solution. Some work is still needed to put this to use, but it should be enough to get started; just run each of the following blocks in a Jupyter notebook.

First, spool up a remote kernel for us to connect to:

import IPython
import inspect

test_var = "HELLO WORLD"

def embed():
    # Build a namespace from the caller's globals and locals so the
    # embedded kernel can see everything defined at the call site.
    caller_frame = inspect.stack()[1][0]
    scope = {}
    scope.update(caller_frame.f_globals)
    scope.update(caller_frame.f_locals)

    # Block here and serve the kernel; this prints the connection
    # file name needed to attach a client.
    IPython.embed_kernel(local_ns=scope)
embed()

Now we connect to the remote kernel using a BlockingKernelClient:

import jupyter_client
connection_file = jupyter_client.find_connection_file("kernel-1337.json")

client = jupyter_client.BlockingKernelClient()
client.load_connection_file(connection_file)
client.start_channels()

At this point, we're able to send text to the remote kernel and have it interpreted as Python. Note that test_var was never defined in our notebook's kernel, but we can still print it out like normal:

client.execute_interactive("print(test_var)")

(screenshot: execute_interactive in a shell)

Plotting works like normal too, even though we're sending data to the remote shell:

client.execute_interactive("import matplotlib.pyplot as plt; plt.scatter([1], [1])");

(screenshot: plotting via execute_interactive)

Now we just need to hide this hack from the user. We can do that using IPython input transformer hooks:

def parse_to_remote(lines):
    new_lines = []
    for line in lines:
        line_wrapped = f"client.execute_interactive({repr(line)});"
        new_lines.append(line_wrapped)
    return new_lines

ipy = get_ipython()
ipy.input_transformers_post.append(parse_to_remote)
del parse_to_remote

The del parse_to_remote is important; without it, the local kernel will crash.
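To see what the transformer actually does, here is a standalone sketch (plain Python, no kernel required) of how a cell's lines get rewritten; the `client` name in the generated code refers to the BlockingKernelClient from earlier:

```python
# Standalone copy of the transformer: each source line is rewritten
# into a call that ships it to the remote kernel for execution.
def parse_to_remote(lines):
    return [f"client.execute_interactive({repr(line)});" for line in lines]

# A two-line cell as IPython would pass it in (newline-terminated).
cell = ["x = 41\n", "print(x + 1)\n"]
for rewritten in parse_to_remote(cell):
    print(rewritten)
# First line printed: client.execute_interactive('x = 41\n');
```

The local kernel only ever sees these `client.execute_interactive(...)` wrapper calls; the original source runs remotely.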

After this point, any cells you run in your notebook will be executed by the remote kernel instead of the local one, and outputs should just work like normal. If we print the remote-side test_var again, we can see that the code is running in the remote kernel (not the local, jupyter-spawned one):

print(test_var)

(screenshot: printing the remote variable test_var)

At this point, the Jupyter notebook will act as if it were connected directly to the remote kernel. Restarting the notebook will just reconnect to the remote kernel without restarting it.

John Aaron
  • Wow, super nice job! I've been trying out different methods so far but yours is something I haven't seen yet. Have you considered kernel provisioning as well? Meaning subclassing from LocalProvisioner and overriding the launch_kernel method to return an existing kernel. This would only be possible after v7.0 as stated here: https://jupyter-client.readthedocs.io/en/stable/provisioning.html – roboto1986 May 21 '23 at 19:53
  • We’d definitely like to connect directly to the kernel instead of going with this hack, but this was the first method we found that worked. If we do end up figuring out how to get the notebook to connect directly to the kernel I’ll make sure to add another answer to this post. – John Aaron May 23 '23 at 00:47