Assume I have a really basic script that requires a lot of calculation:
c = 2
result = 0
for i in range(0,10000):
    c += 5
    c = i*c
print(c)  # just added this, sorry for confusion!
...takes about 15 seconds in IDLE on my MacBook Pro. How can I get this exact script to run on the GPU instead of the CPU for faster results? Also, how (if at all) would the code need to change to work on a GPU?
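
For reference, here is a minimal sketch of what I understand a GPU version could look like, using Numba's vectorize with target='cuda' (this assumes an NVIDIA GPU with a CUDA toolchain and the numba package installed, which a MacBook Pro may not have; target='cpu' runs the same code without a GPU). Note that the loop above is sequential, since each iteration reads the c from the previous one, so it cannot run on a GPU as-is; GPUs speed up independent per-element work, so the sketch uses a hypothetical element-wise function f instead:

import numpy as np
from numba import vectorize

# Compile an element-wise function into a GPU ufunc. Every element of the
# input is processed independently, which is what makes GPU execution pay off.
@vectorize(['float64(float64)'], target='cuda')  # use target='cpu' if no CUDA GPU
def f(x):
    # hypothetical per-element computation, standing in for real work
    return x * (x + 5.0)

data = np.arange(10000, dtype=np.float64)
out = f(data)   # Numba copies the array to the device, computes, copies back
print(out[:5])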
UPDATE: sorry, I meant 15 seconds with the print statement at the end there. This turns out to be a bad example because IDLE executes it unusually slowly; I just tried it in Terminal and it was instant.
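
In case it helps anyone measuring the same thing, a standard-library timing sketch like the one below times the loop itself rather than IDLE's shell overhead (which seems to be where the 15 seconds went):

import time

start = time.perf_counter()
c = 2
for i in range(10000):
    c += 5
    c = i * c
elapsed = time.perf_counter() - start
# prints a tiny elapsed time when run from Terminal, matching the update above
print(f"loop took {elapsed:.6f} s")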