So I am building a list of primes using the "sieve" method, with a Python list comprehension that collects the composites:
no_primes = [j for i in range(2,sqrt_n) for j in range(i*2, n, i)]
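For context, n is the upper bound of the sieve and sqrt_n is assumed to be something like int(n ** 0.5) + 1 (those definitions aren't shown above); a minimal runnable version:

n = 100                       # upper bound for the sieve (example value, my assumption)
sqrt_n = int(n ** 0.5) + 1    # every composite below n has a factor no larger than sqrt(n)
no_primes = [j for i in range(2, sqrt_n) for j in range(i * 2, n, i)]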
The problem is that this generates tons of duplicates in the no_primes list. It was recommended that I use locals()['_[1]'] to get access to the list as it is being built and skip the dups as they occur:
no_primes = [j for i in range(2,sqrt_n) for j in range(i*2, n, i) if j not in locals()['_[1]']]
The problem is that this ability was removed as of Python 2.7, so it no longer works.
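For context, what that hack effectively expressed can be written as a plain loop that checks membership in the list while it is being built (a sketch, assuming the same n and sqrt_n as above):

no_primes = []
for i in range(2, sqrt_n):
    for j in range(i * 2, n, i):
        if j not in no_primes:    # same membership test the comprehension was trying to do
            no_primes.append(j)

The membership test against a growing list costs O(len(no_primes)) per check, so it gets slow, but it does keep the duplicates from ever entering the list.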
I understand that this method may be "evil" (Dr. Evil with his pinky at his lips), but I need to remove the dups before they blow up memory with a massive list. Yes, I could filter the result or pass it through set() to remove dups, but by then the list would have already taken over my computer's memory, and filter/set() would face a massive task.
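That is, the after-the-fact approach I'm describing would look something like this, with the dedup happening only once the full duplicated list already exists:

no_primes = [j for i in range(2, sqrt_n) for j in range(i * 2, n, i)]  # full list, duplicates included
no_primes = set(no_primes)    # dedup only after the big list is already sitting in memory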
So how do I get this ability back? I promise not to take over the world with it.
Thanks.