Suppose I want to automatically choose the size of square symbols in a scatter plot so that their borders align (a question like that has been asked). In my answer to that question, I suggested using the distance between two data points, measured in pixels, to set the symbol size.
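To make the idea concrete, here is a minimal, self-contained sketch of the data-to-pixel transform the approach relies on (the axis limits and the two points are made up purely for illustration):

```python
import matplotlib
matplotlib.use('Agg')  # render off-screen; chosen here only so the sketch runs anywhere
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.set_xlim(0, 10)
ax.set_ylim(0, 10)
fig.canvas.draw()  # make sure the transform reflects the final layout

# transData maps data coordinates to display (pixel) coordinates
pix = ax.transData.transform([(2, 5), (3, 5)])
dx = pix[1, 0] - pix[0, 0]  # horizontal distance between the two points, in pixels
print(dx)
```

The distance `dx` is what the approach below uses as the side length of the square markers.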
This is my approach (it was inspired by this answer):
import numpy as np
import matplotlib.pyplot as plt

fig = plt.figure()
ax = fig.add_subplot(111, aspect='equal')
# initialize a plot to determine the distance between the data points in pixels:
x = [1, 2, 3, 4, 2, 3, 3]
y = [0, 0, 0, 0, 1, 1, 2]
s = 0.0
points = ax.scatter(x,y,s=s,marker='s')
ax.axis([min(x)-1., max(x)+1., min(y)-1., max(y)+1.])
# retrieve the pixel information:
xy_pixels = ax.transData.transform(np.vstack([x,y]).T)
xpix, ypix = xy_pixels.T
# In matplotlib, (0, 0) is the lower-left corner, whereas it's usually the
# upper left in most image software, so flip the y coordinates:
width, height = fig.canvas.get_width_height()
ypix = height - ypix
# this assumes that your data-points are equally spaced
s1 = xpix[1]-xpix[0]
# the marker size is given as points^2, hence s1**2.
points = ax.scatter(x,y,s=s1**2.,marker='s',edgecolors='none')
ax.axis([min(x)-1., max(x)+1., min(y)-1., max(y)+1.])
fig.savefig('test.png', dpi=fig.dpi)
However, with this approach the symbols overlap. I can manually tweak the symbol size so that the borders align without overlapping:
s1 = xpix[1]-xpix[0] - 13.
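One thing I noticed while experimenting, in case it is relevant: scatter's s argument is an area in points squared, while transData returns pixels, and the two differ by a factor of dpi/72. A minimal sketch of that conversion as I understand it (the dpi and pixel values are made-up examples):

```python
dpi = 100.0          # example figure dpi (matplotlib's default)
side_pixels = 50.0   # hypothetical side length of a marker, in pixels

# 1 point = 1/72 inch, so pixels and points differ by a factor of dpi/72
side_points = side_pixels * 72.0 / dpi

# scatter's s is an area in points^2, so the size argument would be
s = side_points ** 2
print(s)  # 1296.0
```

I am not sure whether this fully accounts for the adjustment, hence the questions below.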
- Can the adjustment (in this case the 13) be determined beforehand?
- What is the flaw in the general approach that requires an adjustment?