
I am trying to query e-mails from a database that holds approximately 170,000 rows. I have a basic MySQL/PHP script that works, but it only shows the first 34,000 or so rows.

My ultimate goal is to purge a large database of duplicates. How could I go about this?

<?php
$query = "SELECT * FROM guess ORDER BY id";
$result = mysql_query($query);
$num = mysql_num_rows($result);   // total rows in the result set
mysql_close();
?>

First

<?php
$b = 30000;
while ($b < 60000) {
    $id    = mysql_result($result, $b, "id");
    $email = mysql_result($result, $b, "Email");
?>

<?php echo $b; ?>. <?php echo $email; ?> <br />
<?php $b++; } ?>

Second

By using the above, I was going to store all the rows that contain an email in a PHP array.

Then I would write a script that finds duplicate emails and strips them. Finally, I would print the list in a browser, copy and paste it into a text file, and upload it to my MailChimp account.
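Since the data already lives in MySQL, the duplicate-finding and duplicate-stripping steps can be pushed into the database itself instead of a PHP array. A sketch, assuming the table and column names from the code above (`guess`, `id`, `Email`):

```sql
-- List each address exactly once (ready to export for MailChimp):
SELECT DISTINCT Email FROM guess;

-- Or inspect which addresses are duplicated, and how often:
SELECT Email, COUNT(*) AS copies
FROM guess
GROUP BY Email
HAVING copies > 1;

-- Purge duplicates in place, keeping the row with the lowest id
-- for each address (back up the table before running this):
DELETE g1 FROM guess g1
JOIN guess g2
  ON g1.Email = g2.Email
 AND g1.id > g2.id;
```

This avoids pulling all 170,000 rows into PHP memory at all; the `SELECT DISTINCT` result can be dumped straight to a file for upload.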

Splat ty
    Show your code so we don't guess what is going on. – Blake Apr 21 '12 at 19:40
  • Do you really need all of the data in code, or are you just looking to export it to a file? MySQL for example has a utility called 'mysqldump' that might be easier for you. – AJ. Apr 21 '12 at 19:41
  • Hi, the reason is that there are a lot of duplicate emails. So I was going to store all the rows that contain an email in a PHP array. Then write a script that will find duplicate emails and strip them. Then print the list in a browser, copy and paste into text, and upload to my MailChimp. – Splat ty Apr 21 '12 at 19:51
  • You can follow this question for your need: http://stackoverflow.com/questions/688549/finding-duplicate-values-in-mysql – Hüseyin Z. Apr 21 '12 at 19:55
  • I've made some edits to your question for formatting, and moved your subsequent comments into the question for clarification. Please double check my edits and improve them if needed. – Tim Post Apr 22 '12 at 13:48

1 Answer


This may not be the best approach, but if you can only read about 34K rows at once and you know the total number of rows, using LIMIT in your query to page through the table could be a solution.
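A sketch of that LIMIT approach, using the same legacy `mysql_*` API as the question (an open connection is assumed); `mysql_fetch_assoc` also replaces the per-cell `mysql_result` calls, which are very slow over large result sets:

```php
<?php
// Page through the table 10,000 rows at a time instead of
// fetching all 170,000 rows in one result set.
$pageSize = 10000;
$offset   = 0;
$emails   = array();

do {
    $result = mysql_query(
        "SELECT id, Email FROM guess ORDER BY id LIMIT $offset, $pageSize");
    $rows = mysql_num_rows($result);
    while ($row = mysql_fetch_assoc($result)) {
        // Using the address as the array key de-duplicates as we go.
        $emails[$row['Email']] = true;
    }
    $offset += $pageSize;
} while ($rows == $pageSize);

// Unique addresses, one per line, ready to copy into MailChimp:
echo implode("<br />\n", array_keys($emails));
?>
```

Because duplicates are dropped as each page is read, peak memory stays proportional to the number of unique addresses rather than the full row count.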

Hüseyin Z.