
With https://dev.twitter.com/docs/api/1/get/statuses/user_timeline I can get the 3,200 most recent tweets from a user. However, certain sites like http://www.mytweet16.com/ seem to bypass that limit, and my browsing through the API documentation turned up nothing.

How do they do it, or is there another API that doesn't have the limit?
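For context, even reaching the documented 3,200 cap requires paging backwards with max_id. A minimal sketch against the v1.1 REST endpoint the question refers to (the bearer token and screen name are placeholders):

```python
import requests

BEARER_TOKEN = "YOUR_BEARER_TOKEN"  # placeholder -- use your own app token
URL = "https://api.twitter.com/1.1/statuses/user_timeline.json"
HEADERS = {"Authorization": f"Bearer {BEARER_TOKEN}"}

def fetch_timeline(screen_name):
    """Page backwards through a user's timeline until the API returns
    nothing more (roughly 3,200 tweets at most)."""
    tweets, max_id = [], None
    while True:
        params = {"screen_name": screen_name, "count": 200, "include_rts": 1}
        if max_id is not None:
            params["max_id"] = max_id
        batch = requests.get(URL, headers=HEADERS, params=params).json()
        if not batch:
            break
        tweets.extend(batch)
        # Next page: everything strictly older than the oldest tweet seen.
        max_id = batch[-1]["id"] - 1
    return tweets
```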

apscience

5 Answers


You can use the Twitter search page to bypass the 3,200 limit, although you have to scroll down many times in the search results page. For example, I searched for tweets from @beyinsiz_adam. This is the link to the search results:

https://twitter.com/search?q=from%3Abeyinsiz_adam&src=typd&f=realtime

Now, to scroll down automatically, you can use the following JavaScript code.

// Scroll to the bottom once a second so the search page keeps loading older tweets.
var myVar = setInterval(function() { myTimer(); }, 1000);
function myTimer() {
    window.scrollTo(0, document.body.scrollHeight);
}
// When you have enough tweets, stop it with: clearInterval(myVar);

Just run it in the browser's developer console (e.g. the Firebug console) and wait a while for all the tweets to load.

sevenkul
  • This doesn't seem to work for some accounts, such as [Shedletsky](https://twitter.com/search?q=from%3AShedletsky&src=typd&f=realtime). – Buge Aug 09 '14 at 06:15
  • I voted up, but this doesn't bypass the hashtag search limit. – sertaconay Aug 19 '14 at 14:23
  • This doesn't work: at a certain point (it varies), you're unable to scroll further. – Tyler Biscoe May 31 '17 at 04:11
  • After 10K tweets my laptop with 12 GB of RAM began to lag, but yes, this works. The remaining problem is stopping the script: since the interval handle is kept in `myVar` above, you can stop it with `clearInterval(myVar)`. – WesternGun Jul 16 '18 at 07:22

The only way to see more is to start saving tweets before the user's count hits 3,200. Services that show more than 3,200 tweets have saved them in their own databases; there's currently no way to get more than that through any Twitter API.

http://www.quora.com/Is-there-a-way-to-get-more-than-3200-tweets-from-a-twitter-user-using-Twitters-API-or-scraping

https://dev.twitter.com/discussions/276

Note from that second link: "…the 3,200 limit is for browsing the timeline only. Tweets can always be requested by their ID using the GET statuses/show/:id method."
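A minimal sketch of that by-ID lookup against the v1.1 statuses/show endpoint (the bearer token and tweet ID are placeholders; this only helps if you recorded the IDs before the tweets fell out of the 3,200-tweet window):

```python
import requests

BEARER_TOKEN = "YOUR_BEARER_TOKEN"  # placeholder
TWEET_ID = "210462857140252672"     # placeholder: an ID you saved earlier

# Look up a single tweet by its ID; this works regardless of how far
# back in the user's timeline the tweet sits.
resp = requests.get(
    "https://api.twitter.com/1.1/statuses/show.json",
    headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
    params={"id": TWEET_ID},
)
print(resp.json()["text"])
```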

meetar

I've been in this (Twitter) industry for a long time and have witnessed many changes in the Twitter API and its documentation. To clarify one thing: there is no way to get past the 3,200-tweet limit. Twitter doesn't provide this data even in its new premium API.

The only way someone can get around this limit is by saving an individual user's tweets over time.

There are tools that claim to have a wide database of saved tweets and can provide more than 3,200 of them. A few that I know of are followersanalysis.com and keyhole.co.

Arjun Jain
  • Dear Arjun, which variable do you suggest using to remove duplicate tweets when merging two saved datasets from the same user? In other words, I want to update the tweets of user @a that I saved a week ago, so I merge the old dataset with the 3,200 most recent tweets I collect now. However, after removing rows with duplicated tweet IDs, many duplicate tweets still remain in the dataframe. Shouldn't the tweet ID identify each unique tweet? Thank you so much. – Marco Mello May 04 '19 at 19:44
  • You need to use the id_str variable to remove duplicate tweets; two tweets can't have the same id_str (or id). – Arjun Jain May 09 '19 at 01:21
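A minimal pandas sketch of that merge-and-deduplicate step, assuming both saved datasets are CSV files with an id_str column (the filenames are hypothetical). Reading id_str as a string matters: the numeric id column can silently lose precision if parsed as a float, which is one way spurious "duplicates" survive deduplication.

```python
import pandas as pd

# Read id_str as a string so 64-bit IDs are not mangled into floats.
old = pd.read_csv("tweets_last_week.csv", dtype={"id_str": str})  # hypothetical file
new = pd.read_csv("tweets_today.csv", dtype={"id_str": str})      # hypothetical file

# id_str uniquely identifies a tweet, so exactly one row per tweet remains.
merged = (pd.concat([old, new])
            .drop_duplicates(subset="id_str")
            .reset_index(drop=True))
```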

You can use a tool I wrote that bypasses the limit.

It saves the tweets in JSON format.

https://github.com/pauldotknopf/twitter-dump

Paul Knopf

You can use the Python library snscrape to do it, or you can use the ExportData tool to get all of a user's tweets, which returns preprocessed CSV and spreadsheet files. The first option is free but has less information and requires more manual work.
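A minimal snscrape sketch (the account name is just an example; attribute names vary a little between snscrape versions, and scrapers like this break whenever Twitter changes its internals):

```python
import snscrape.modules.twitter as sntwitter

# Iterate over a user's full timeline; not capped at 3,200 tweets.
for i, tweet in enumerate(sntwitter.TwitterUserScraper("jack").get_items()):
    print(tweet.date, tweet.url)
    if i >= 100:  # stop wherever you like
        break
```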

Zilvinas