I'm a Python beginner trying to write simple but neat code for the following problem: run 10,000 experiments of 100 coin flips each and estimate how often a run contains a streak of six heads or six tails in a row. I've written the code below, but it always prints a chance of around 325%, which obviously can't be right. What might be wrong?
import random
numberOfStreaks = 0
flip = []
for experimentNumber in range(10000):
    # Code that creates a list of 100 'heads' or 'tails' values.
    for i in range(101):
        flip.append(random.randint(0,1))
    # Code that checks if there is a streak of 6 heads or tails in a row.
    for i in flip:
        if flip[i] == flip[i+1]:
            if flip[i] == flip[i+2]:
                if flip[i] == flip[i+3]:
                    if flip[i] == flip[i+4]:
                        if flip[i] == flip[i+5]:
                            numberOfStreaks += 1
    flip = []
print('Chance of streak: %s%%' % (numberOfStreaks / 100))
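In case it helps, I also tried pulling just the streak check out into its own small function so I could test it on tiny hand-made lists. This is only a rough sketch of what I think the check should do; the has_streak name, the default length=6 argument, and the example lists are things I made up for debugging, not part of the exercise:

import random

def has_streak(flips, length=6):
    # Walk the list once, counting how many identical values appear in a row.
    run = 1
    for j in range(1, len(flips)):
        if flips[j] == flips[j - 1]:
            run += 1
            if run >= length:
                return True
        else:
            run = 1
    return False

# Tiny hand-made lists so I can check the logic by eye.
print(has_streak([0, 0, 0, 0, 0, 0, 1]))  # I expect True here
print(has_streak([0, 1, 0, 1, 0, 1, 0]))  # I expect False here
print(has_streak([random.randint(0, 1) for _ in range(100)]))  # one random 100-flip list

The running counter is just my guess at a cleaner way to express the streak check than the nested ifs; I don't know whether it is actually correct or idiomatic, so feel free to point out problems with it too.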