
I'm using Abraham's TwitterOAuth to gain access, and I'm building a database with my followers' information: screen name, user name, and Twitter ID. I referenced Twitter's "cursoring" page, especially the pseudo code, to make my code. For those who don't want to click the link to see said pseudo code, it looks like the following:

    cursor = -1
    api_path = "https://api.twitter.com/1.1/endpoint.json?screen_name=targetUser"
    do {
        url_with_cursor = api_path + "&cursor=" + cursor
        response_dictionary = perform_http_get_request_for_url( url_with_cursor )
        cursor = response_dictionary['next_cursor']
    }
    while (cursor != 0)

With every request, the end user gets a "cursor" which allows them to navigate through "pages" of results. Each page holds 20, and if you have 200 followers you have to go through 10 pages.

I modified it to look like the following:

    include('config.php');        // db connection
    include('twitter_oauth.php'); // oauth connection
    $tweet = new TwitterOAuth($consumerKey, $consumerSecret, $OAuthToken, $OAuthSecret);
    $followers = $tweet->get('followers/list', array('screen_name' => 'my_screen_name', 'cursor' => $cursor));
    $followersQ = mysql_query("SELECT * FROM followers WHERE tw_id = '".$users->id."'") or die(mysql_error());
    $followersR = mysql_query($followersQ2) or die(mysql_error());

The above code calls the Twitter followers/list endpoint and gets the first 20 users. It then gets a cursor, goes to the next page, and repeats. Only, it seems after about 80 users it gives me the lovely:

    => Array

I could manually get the next cursor, wait 15 minutes for the rate limit to go down, call the function again with the cursor, get the next 80 items, then get that key and repeat, but I want to set up a script that can call it over and over on its own. I imagine it's pretty easy to do, but I can't figure out what I'm doing wrong.
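To make the goal concrete, this is a sketch of the loop I'm aiming for, not working code (it assumes, as above, that Abraham's TwitterOAuth returns decoded JSON objects; code 88 is Twitter's "Rate limit exceeded" error, and next_cursor_str comes back with every page):

    <?php
    // Sketch: follow cursors until Twitter returns cursor 0, sleeping out
    // each 15-minute rate-limit window instead of dying on the error.
    include('config.php');        // db connection
    include('twitter_oauth.php'); // oauth connection
    set_time_limit(0);            // this can run for a long while

    $tweet  = new TwitterOAuth($consumerKey, $consumerSecret, $OAuthToken, $OAuthSecret);
    $cursor = '-1'; // -1 asks for the first page

    do {
        $followers = $tweet->get('followers/list', array(
            'screen_name' => 'my_screen_name',
            'cursor'      => $cursor,
        ));

        // Code 88 is "Rate limit exceeded": wait out the window, then retry.
        if (isset($followers->errors) && $followers->errors[0]->code == 88) {
            sleep(15 * 60);
            continue;
        }

        foreach ($followers->users as $user) {
            // INSERT $user->id_str, $user->screen_name, $user->name here
        }

        $cursor = $followers->next_cursor_str;
    } while ($cursor !== '0'); // cursor 0 means the last page was reached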

I don't think there is any way round the limitations imposed. Even Tweetbot has this limitation, as it's a limitation Twitter imposes. You could create a note in the database of the current status and set a cron job to run every 15 minutes, which would run a group of requests again. It will take time, but it could notify you via email when it's finished. You'd cache those results in your database, of course.
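A rough sketch of that cron-driven approach, assuming a hypothetical crawl_state table that holds the last cursor (the table, column, and email address here are invented for illustration):

    <?php
    // Run every 15 minutes from cron. Each run resumes from the saved cursor,
    // burns one 15-request window of followers/list calls, and saves its place.
    include('config.php');        // db connection
    include('twitter_oauth.php'); // oauth connection

    $tweet = new TwitterOAuth($consumerKey, $consumerSecret, $OAuthToken, $OAuthSecret);

    // Pick up where the previous run stopped; -1 means start from page one.
    $row    = mysql_fetch_assoc(mysql_query("SELECT cursor FROM crawl_state WHERE id = 1"));
    $cursor = $row ? $row['cursor'] : '-1';

    // followers/list allows 15 requests per 15-minute window in v1.1.
    for ($i = 0; $i < 15 && $cursor !== '0'; $i++) {
        $followers = $tweet->get('followers/list', array(
            'screen_name' => 'my_screen_name',
            'cursor'      => $cursor,
        ));
        if (isset($followers->errors)) break; // window used up; next run resumes

        foreach ($followers->users as $user) {
            mysql_query("INSERT INTO followers (tw_id, screen_name, name) VALUES ('"
                . mysql_real_escape_string($user->id_str) . "', '"
                . mysql_real_escape_string($user->screen_name) . "', '"
                . mysql_real_escape_string($user->name) . "')") or die(mysql_error());
        }
        $cursor = $followers->next_cursor_str;
    }

    mysql_query("UPDATE crawl_state SET cursor = '$cursor' WHERE id = 1");
    if ($cursor === '0') {
        mail('you@example.com', 'Follower crawl finished', 'All pages fetched.');
    }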

Solution 2

This way is faster, but there are rate-limit concerns here too:

1- Make a request to get all follower ids (followers/ids), paging with 5000 ids per page.
2- Loop on the ids and send each 100 ids in a comma-separated string (to users/lookup) to get their info.
3- Now you can get 1500 user objects instead of 300 user objects every 15 minutes.

But you also need to set a timer every 15 requests, in case the followers list is more than 1500. A sketch of those steps follows below.
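Here is a sketch of those three steps with the same TwitterOAuth object as in the question (followers/ids, users/lookup, and the stringify_ids parameter are all from Twitter's v1.1 API; the sleep implements the "timer every 15 requests"):

    <?php
    include('config.php');        // db connection
    include('twitter_oauth.php'); // oauth connection

    $tweet = new TwitterOAuth($consumerKey, $consumerSecret, $OAuthToken, $OAuthSecret);

    // 1- Collect every follower id, 5000 per page.
    //    stringify_ids keeps the large ids as strings.
    $ids    = array();
    $cursor = '-1';
    do {
        $page   = $tweet->get('followers/ids', array(
            'screen_name'   => 'my_screen_name',
            'stringify_ids' => 'true',
            'cursor'        => $cursor,
        ));
        $ids    = array_merge($ids, $page->ids);
        $cursor = $page->next_cursor_str;
    } while ($cursor !== '0');

    // 2- Hydrate the ids 100 at a time with users/lookup.
    $requests = 0;
    foreach (array_chunk($ids, 100) as $chunk) {
        if (++$requests > 15) { // the "timer every 15 requests"
            sleep(15 * 60);
            $requests = 1;
        }
        $users = $tweet->post('users/lookup', array('user_id' => implode(',', $chunk)));
        foreach ($users as $user) {
            // store $user->id_str, $user->screen_name, $user->name in the db here
        }
    }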
