This code estimates the value of pi by drawing random points in the unit square: the fraction that lands inside the quarter circle approximates pi/4, so 4*n/d is an estimate of pi. The estimate is compared to the real value of pi against a relative-accuracy threshold, defined as 'c'. Then 'c' is decreased to a smaller number and the calculation runs again.
The threshold c starts at 0.1 and is divided by 10 after each pass, so the printed values of c are 0.01, 0.001, 0.0001, and 0.00001.
What I am trying to do is run the whole process 10 times and find the average of 'd', the number of draws it takes to reach each accuracy level I want (see the sketch after the code below).
import math
import random

pi = math.pi
n = 0          # draws that landed inside the quarter circle
d = 0          # total draws so far
ratios = []    # running estimates of pi
xs = []        # x coordinates of the draws
ys = []        # y coordinates of the draws
c = 0.1        # relative-accuracy threshold

while c >= 0.0001:
    while True:
        x = random.random()
        y = random.random()
        xs.append(x)
        ys.append(y)
        if x**2 + y**2 <= 1.0:
            n += 1
        d += 1
        ratio = 4.0 * n / d      # current Monte Carlo estimate of pi
        ratios.append(ratio)
        if abs(ratio - pi) / pi <= c:
            print "Draws Needed: ", d
            break
    c = c * 0.1
    print c
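
For the averaging, one option is to wrap a single run in a helper function and call it once per trial. Below is a minimal sketch of that idea; draws_needed is a hypothetical name, not something from the original code, and unlike the code above it resets n and d for every trial and every threshold, so each result counts only the draws for that run:

import math
import random

def draws_needed(c):
    # Hypothetical helper: draw points until the relative error of the
    # pi estimate is within c, then return the number of draws used.
    n = 0   # draws inside the quarter circle
    d = 0   # total draws
    while True:
        x = random.random()
        y = random.random()
        if x**2 + y**2 <= 1.0:
            n += 1
        d += 1
        estimate = 4.0 * n / d
        if abs(estimate - math.pi) / math.pi <= c:
            return d

trials = 10
c = 0.1
while c >= 0.0001:
    total = sum(draws_needed(c) for _ in range(trials))
    print("c = %g, average draws = %.1f" % (c, total / float(trials)))
    c = c * 0.1

This averages d over 10 independent runs for each value of c; the print call uses a single formatted string so it behaves the same under Python 2 and Python 3.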