3

I have a question regarding Python (pylab) and plotting. I'm able to load and display an image (the code below loads the image shown below), but I'm unable to plot it as a contour in 3D. I understand an array is required for pylab.contourf(x,y,z), but I'm unsure how to obtain that array from the loaded image data.

Any suggestions or assistance would be appreciated. My code:

from PIL import Image
import pylab

fileName = "image1.png"
im = Image.open(fileName)
#pylab.contourf(im)  # doesn't work - needs an array, but how do I get one from the image?
pylab.axis('off')
pylab.imshow(im)
pylab.show()

image1.png

Harry Lime

3 Answers

5

The reason your image can be represented in a contour plot is that it is clearly a pseudocolor image, that is, an image that uses the full RGB color spectrum to represent a single variable. Contour plots also represent data that have a single variable that determines the color (i.e., the Z axis), and therefore you can probably represent your image data as a contour plot as well.

This is the reason I suggested that you use a contour plot in the first place. (What you're actually asking for in this question, though, generally does not exist: there is no generally valid way to convert a color image into a contour plot, since a color image in general has three independent colors, RGB, and a contour plot has only one (the Z-axis), i.e., this only works for pseudocolor images.)

To specifically solve your problem:

1) If you have the z-axis data that was used to create the pseudocolor image you show, just use that data in the contour plot (a minimal sketch follows this list). This is the best solution.

2) If you don't have the z-data, it's more of a hassle, since you need to invert the colors in the image back to a z-value and then put that into the contour plot. The image you show is almost certainly using the colormap matplotlib.cm.jet, and I can't see a better way to invert it than the one unutbu describes here.
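
For case 1), here is a minimal sketch (the x, y, z below are made-up stand-in data, since the original z-values aren't posted in the question):

import numpy as np
import matplotlib.pyplot as plt

# Stand-in z data; replace with the array actually used to make the pseudocolor image.
y, x = np.mgrid[0:1:100j, 0:1:100j]
z = np.sin(4 * np.pi * x) * np.cos(4 * np.pi * y)

plt.contourf(x, y, z, 20, cmap=plt.cm.jet)  # 20 filled contour levels
plt.colorbar()
plt.show()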

In the end, you will need to understand the difference between a contour plot and an image to get the details to work.

Demo of why convert doesn't work:
Here I run through a full test case using a ramp of z-values from left to right. As is clear, the z-values are now totally messed up because the values that were the largest are now the smallest, etc.

That is, the goal is that fig. 2 matches fig. 4, but they are very different. The problem, of course, is that convert doesn't correctly map jet to the original set of z-values.

[output figure: the four panels produced by the code below]

import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

fig, axs = plt.subplots(4,1)

x = np.repeat(np.linspace(0, 1, 100)[np.newaxis,:], 20, axis=0)

axs[0].imshow(x, cmap=plt.cm.gray)
axs[0].set_title('1: original z-values as grayscale')

axs[1].imshow(x, cmap=plt.cm.jet)
axs[1].set_title('2: original z-values as jet')
plt.imsave('temp01.png', x, cmap=plt.cm.jet)  # write the jet-coloured data to a PNG file

im = Image.open('temp01.png').convert('L')  # use 'convert' on image to get grayscale
data = np.asarray(im)  # make image into numpy data
axs[2].imshow(data, cmap=plt.cm.gray)
axs[2].set_title("3: 'convert' applied to jet image")

img = Image.open('temp01.png').convert('L')
z   = np.asarray(img)
mydata = z[::1,::1]  # copied from the other answer; this slicing is a no-op
axs[3].imshow(mydata,interpolation='nearest',cmap=plt.cm.jet)
axs[3].set_title("4: the code that Jake French suggests")

plt.show()

But, it's not so hard to do this correctly, as I suggest above.
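
As a rough sketch of that colormap-inversion idea (a brute-force nearest-colour lookup against the jet lookup table, in the spirit of the unutbu answer linked above; 'image1.png' is just the file name from the question):

import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

# Build a 256-entry RGB lookup table for the jet colormap.
lut = plt.cm.jet(np.linspace(0.0, 1.0, 256))[:, :3]

# Load the pseudocolor image as RGB values in [0, 1].
rgb = np.asarray(Image.open('image1.png').convert('RGB')) / 255.0

# For each pixel, pick the LUT entry with the smallest squared RGB distance;
# its index (0-255) serves as the recovered z-value.
dist = ((rgb[:, :, np.newaxis, :] - lut[np.newaxis, np.newaxis, :, :]) ** 2).sum(axis=-1)
z = dist.argmin(axis=-1)

plt.contourf(z[::-1], 20, cmap=plt.cm.jet)  # flip rows so the origin matches the image
plt.colorbar()
plt.show()

This lookup is slow and memory-hungry for large images, but it shows why recovering z is possible for a pseudocolor image and not for an arbitrary RGB one.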

tom10
  • Simplifying the code, the key is convert('L'), i.e. RGB to greyscale; then your code works: img = Image.open('40.jpg').convert('L') – Harry Lime Mar 24 '13 at 22:07
  • This is not a generally valid solution. In general, there is no reason to suspect that `convert('L')` will be the inverse of `jet`, and in particular, the jet colormap is dark at both ends (giving z~=0 at both ends), whereas the z-values are monotonic. I see in your answer that your picture looks correct, but I don't think that `convert('L')` is a generally valid solution. See, for example, here: http://stackoverflow.com/questions/7440340/jet-colormap-to-grayscale – tom10 Mar 27 '13 at 04:26
  • For clarity on the "convert" question, I've updated my answer with a full example of the conversion steps to show why it doesn't work. – tom10 Mar 29 '13 at 16:44
4

OK, after some research and simplifying the code, the key is convert('L'), i.e. RGB to greyscale; then ali_m's code works:

from mpl_toolkits.mplot3d import Axes3D
from matplotlib import pylab as pl
from PIL import Image
import numpy as np

img = Image.open('40.jpg').convert('L')
z   = np.asarray(img)
mydata = z[::1,::1]  # a no-op slice; use a larger step (e.g. [::5,::5]) to downsample big images
fig = pl.figure(facecolor='w')
ax1 = fig.add_subplot(1,2,1)
im = ax1.imshow(mydata,interpolation='nearest',cmap=pl.cm.jet)
ax1.set_title('2D')

ax2 = fig.add_subplot(1,2,2,projection='3d')
x,y = np.mgrid[:mydata.shape[0],:mydata.shape[1]]
ax2.plot_surface(x,y,mydata,cmap=pl.cm.jet,rstride=1,cstride=1,linewidth=0.,antialiased=False)
ax2.set_title('3D')
ax2.set_zlim3d(0,100)
pl.show()

[output figure: the 2D image alongside the 3D surface plot]

Harry Lime
  • -1: I'm happy to change my vote, but as presented, I don't believe the code you show gets you from the image you show in your question to the one in your answer. – tom10 Mar 29 '13 at 20:51
  • I'm sure it does. Change / don't change vote - as long as questions get answered I couldn't give a toss. – Harry Lime Apr 02 '13 at 10:00
  • 3
    No offense intended. I down voted so others wouldn't think this was a generally valid solution, since I don't believe it is, and I wrote the explanation so you would have a chance to explain it. Obviously, "I'm sure it does", doesn't explain anything, but that's your choice. In my answer I gave a fairly detailed explanation of why `convert` won't work, and I'd like to figure this out. Personally, I answer questions here not to help a specific individual, but to help build a body of generally useful answers, which is what I was also trying to do with this question. – tom10 Apr 02 '13 at 16:55
  • it's only an image - try to relax – Harry Lime Apr 03 '13 at 08:55
2

Edit: sorry, I misunderstood the OP's original question. To get a numpy array from a PIL Image object you can usually just call np.array(im). However, I work with a lot of microscopy data, and I find that for some image formats (particularly 16-bit TIFFs) this syntax doesn't always work, in which case I would use np.asarray(im.getdata()).reshape(im.size[::-1]).
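
A minimal sketch of the two conversions (using the question's 'image1.png' as a stand-in file name; the getdata fallback assumes a single-channel image such as a 16-bit TIFF):

import numpy as np
from PIL import Image

im = Image.open('image1.png')

# Usual case: np.array handles most PIL images directly.
data = np.array(im)
print(data.shape, data.dtype)  # (rows, cols) for greyscale, (rows, cols, channels) for RGB(A)

# Fallback for awkward single-channel formats (e.g. some 16-bit TIFFs):
# im.size is (width, height), so reverse it to get numpy's (rows, cols) order.
# (Only valid for single-channel images; an RGB image would need its channels handled separately.)
data2 = np.asarray(im.getdata()).reshape(im.size[::-1])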

Here's a revised example:

import numpy as np
from matplotlib import pylab as pl
from mpl_toolkits.mplot3d import Axes3D
from PIL import Image

def getimarray(path):
    im = Image.open(path,'r')
    return np.array(im)

def doplots(path='tmp/cell.png'):

    mydata = getimarray(path)
    mydata = mydata[::5,::5]  # downsample by a factor of 5 in each direction
    fig = pl.figure(facecolor='w')
    ax1 = fig.add_subplot(1,2,1)
    im = ax1.imshow(mydata,interpolation='nearest',cmap=pl.cm.jet)
    ax1.set_title('2D')
    ax2 = fig.add_subplot(1,2,2,projection='3d')
    x,y = np.mgrid[:mydata.shape[0],:mydata.shape[1]]
    ax2.plot_surface(x,y,mydata,cmap=pl.cm.jet,rstride=1,cstride=1,linewidth=0.,antialiased=False)
    ax2.set_title('3D')
    ax2.set_zlim3d(0,255)

    return fig,ax1,ax2

if __name__ == '__main__':
    doplots()
    pl.show()
ali_m
  • I think the OP's essential question is how to convert `im` (see his code) into a numpy array to pass into `contourf`. – Warren Weckesser Mar 23 '13 at 04:07
  • Yes, how to convert, im = Image.open("image1.png"), to an np array? I see I could use the above code if im was an array. – Harry Lime Mar 23 '13 at 12:48
  • Ali_m, doesn't work - I get this -Traceback (most recent call last): File "C:\Users\Jake\Desktop\img.py", line 27, in doplots() File "C:\Users\Jake\Desktop\img.py", line 20, in doplots ax2.plot_surface(x,y,mydata,cmap=pl.cm.jet,rstride=1,cstride=1,linewidth=0.,antialiased=False) File "C:\Python27\lib\site-packages\mpl_toolkits\mplot3d\axes3d.py", line 1351, in plot_surface rows, cols = Z.shape ValueError: too many values to unpack – Harry Lime Mar 24 '13 at 18:02
  • What does `mydata` look like? It should be 2D with the same dimensions as `x` and `y`. If your input image was RGB(A) rather than grayscale then you'll get a 3D array back with the colour channels in the 3rd dimension, which would account for the error you're seeing. – ali_m Mar 25 '13 at 01:06
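
A small sketch of the check described in the last comment (again using the question's 'image1.png' as a stand-in file name):

import numpy as np
from PIL import Image

im = Image.open('image1.png')
mydata = np.asarray(im)
print(mydata.shape)  # (rows, cols) is fine; (rows, cols, 3 or 4) means RGB(A)

if mydata.ndim == 3:
    # Collapse the colour channels (here by converting to greyscale) so that
    # plot_surface gets the 2-D Z array it expects.
    mydata = np.asarray(im.convert('L'))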