I am making a scatter plot in Python using matplotlib.pyplot (imported as plt).
The function plt.scatter() takes two one-dimensional arrays of x and y values, so, for example, the following code produces a figure like this:
import matplotlib.pyplot as plt

a = [1, 2, 3, 4, 5]
b = [3, 4, 5, 6, 7]
plt.scatter(a, b)
plt.show()
It also has an optional argument, s, which sets the size of the points. For example:
import matplotlib.pyplot as plt
import numpy as np

a = [1, 2, 3, 4, 5]
b = [3, 4, 5, 6, 7]
area = [np.pi, np.pi, np.pi, np.pi, np.pi]  # one size value per point
plt.scatter(a, b, s=area)
plt.show()
This creates the following figure: [scatter plot of the five points]
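If I widen the axis limits, the dots stay the same size on screen, so the size clearly does not track the data coordinates (a small check, reusing a and b from above):

import matplotlib.pyplot as plt
import numpy as np

a = [1, 2, 3, 4, 5]
b = [3, 4, 5, 6, 7]
area = [np.pi, np.pi, np.pi, np.pi, np.pi]

plt.scatter(a, b, s=area)
# Widening the axis limits does not shrink the dots on screen:
# s is a display-unit size, not a data-unit size.
plt.xlim(-50, 50)
plt.ylim(-50, 50)
plt.show()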
I would really like each marker to be a circle with radius 1 in data units (which would correspond to an area of pi), but it seems that this argument is interpreted in display units (points squared), not in units tied to the axes of my plot.
Does anybody know if there is a way to achieve this? (I do not want to manually measure the point-to-data-unit ratio, because the project I am doing this for will produce many different types of plots.)
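For reference, here is roughly the effect I am after, sketched with matplotlib.patches.Circle (whose radius is given in data units). I would rather get this behaviour from scatter itself so I do not have to special-case every plot:

import matplotlib.pyplot as plt
from matplotlib.patches import Circle

a = [1, 2, 3, 4, 5]
b = [3, 4, 5, 6, 7]

fig, ax = plt.subplots()
for x, y in zip(a, b):
    # Circle takes its radius in data coordinates, so each marker
    # really spans from x - 1 to x + 1 on the x-axis.
    ax.add_patch(Circle((x, y), radius=1))

ax.set_aspect('equal')  # keep the circles circular
ax.autoscale_view()     # patches alone do not always trigger autoscaling
plt.show()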
Thank you in advance.