I am given a dataset and I am supposed to plot: How much does each player get paid per game on average?
I converted the dataset into NumPy arrays:
import numpy as np
import matplotlib.pyplot as plt

Salary = np.array([KobeBryant_Salary, JoeJohnson_Salary, LeBronJames_Salary,
                   CarmeloAnthony_Salary, DwightHoward_Salary, ChrisBosh_Salary,
                   ChrisPaul_Salary, KevinDurant_Salary, DerrickRose_Salary,
                   DwayneWade_Salary])
Games = np.array([KobeBryant_G, JoeJohnson_G, LeBronJames_G, CarmeloAnthony_G,
                  DwightHoward_G, ChrisBosh_G, ChrisPaul_G, KevinDurant_G,
                  DerrickRose_G, DwayneWade_G])
After that, I wrote a for loop that iterates over these arrays and plots each player's salary divided by games played:
for i in range(10):
    plt.plot(Salary[i] / Games[i])
Since one of the players played 0 games, the division raises a ZeroDivisionError and my plot comes out wrong.
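One workaround I was considering is to skip the zero-games entries during the division. This is just a rough sketch, assuming Salary and Games are 2-D arrays (players × seasons) of numbers:

for i in range(10):
    # Avoid dividing by zero: divide by max(games, 1), then blank out (NaN)
    # the entries where games is actually 0 so matplotlib leaves a gap there.
    per_game = np.where(Games[i] > 0, Salary[i] / np.maximum(Games[i], 1), np.nan)
    plt.plot(per_game)
plt.show()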
I wanted to know: is this the right approach? Also, if it is, how can I format the y-axis so that the lower values are more visible? (I tried playing with yticks, but it didn't help much.)
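For the y-axis, is something like a log scale (or a tick formatter) the right direction? This is roughly what I had in mind, but I'm not sure it's the proper way:

from matplotlib.ticker import FuncFormatter

# A log scale spreads out the smaller per-game values
plt.yscale('log')
# ...and this formats the ticks as plain dollar amounts instead of 1e6-style notation
plt.gca().yaxis.set_major_formatter(FuncFormatter(lambda y, _: '${:,.0f}'.format(y)))
plt.show()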