Tracking and Hunting Furries in the Wild - A Comprehensive Guide to Discovery and Digging on Furs

  • Thread starter: GS 281
What about that FA Breach (not the index leak) that exposed usernames and emails? You could get some good info from there, but I couldn't find the DB anywhere.
 
Yiffyleaks? That leak is quite old now and I forget who did it, but they may have removed things if people messaged them.

Depends how willing the person behind it is to cover up stuff for certain people.
 
Having not known this prior, I'm glad I archived some shit.
 
I believe Vivisector had a link to the leaked email database, but unfortunately I can't find any archives from around that time. Everything I've found so far is from 2015 or earlier, and the leak happened May 2016.

Edit: This was the link to the database, but it no longer seems to work.
http://fapassap77jfeffk.onion.link/
(Screenshot: fadatabasearchive.png)
https://archive.fo/bWaMx
 
I know this thread is probably dead, but I have a question: what happens if one (say, me) is turning into a furry? Are there any symptoms I should know about? Thank you!
  • Do you like furry porn?
  • Do you like gay furry porn?
  • Do you think every movie would be better if instead of humans it was anthro dogs or cats or whatever?
  • Did you start to use terms like "OwO" or "UwU" ironically but over time it became unironic?
Congrats. You got yiffed, faggot.
 
If you roleplay as a furry character for sex.
If you find yourself thinking "I'd tap that" to a real life animal.
If you want to fuck cartoon anthros.
If you jerk off to furry porn.
If your dick twitches to it.
If you have a fursuit.
If you make a fursona (not just an anthro character, but your murrsona).
If you commission anything furry porn related.
If you draw or write anything furry porn related for your personal sexual pleasure.

Mostly sexual qualifications. There are people out there who can dig anthro art (think DarkNatasha) but not be a furfag.
 
My friend has a really good cow. I'll ask their permission to make the thread, as they do not want to do it themselves and I am not affiliated with the furry community, so there is less risk of retribution.
 
In April/May 2015 there was a mass archiving of Furaffinity on the Wayback Machine. The group behind it was able to archive their login sessions, meaning they could properly access and archive everything that's normally inaccessible to guests: mature-rated submissions, adult-rated submissions, and, most usefully, the profiles of users who have restricted guest access to their pages, which hampers most archival attempts.

If you're looking into someone with a Furaffinity account, check the WBM. If their account predates April/May 2015, all their pages will have been saved.
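If you want to check coverage in bulk rather than clicking around, the Wayback Machine's public CDX API will list every capture of a page. Rough sketch below; "example_user" is a placeholder, and the date range just brackets that 2015 archiving run:

Python:
# Sketch: list Wayback Machine captures of a Furaffinity user page via the
# public CDX API. "example_user" is a placeholder; swap in the account you're
# looking into.
import requests

username = "example_user"
resp = requests.get(
    "http://web.archive.org/cdx/search/cdx",
    params={
        "url": f"furaffinity.net/user/{username}",
        "output": "json",
        "from": "2015",
        "to": "2016",
    },
)
rows = resp.json() if resp.text.strip() else []

# First row is the header; every later row has a timestamp and the original
# URL, which together form a replayable snapshot link.
for row in rows[1:]:
    timestamp, original = row[1], row[2]
    print(f"https://web.archive.org/web/{timestamp}/{original}")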
 
This info is outdated; Twitter killed the app I was using.

Recently I wanted to try archiving some furries' twitter timelines, as well as their followers and following lists. I went looking around online for tools that would help with this, but Google was interested only in showing me shady shit with price tags upwards of $40 per use. These tools also require you to have a Twitter account and access to Twitter's developer interface, which requires an application and has an approval process, and fuck that noise.

So, I went open source. And lo and behold, I got things to work properly. I'm happy to share with you a guide on how to go full NSA on somebody's public twitter feed, without even needing to be logged into a twitter account. This also will work on any operating system that can support Python, so basically any operating system not developed by a lolcow.

DISCLAIMER: I am not a programmer. This guide covers use of coding tools that I barely know how to use, let alone how to use safely. I am literally a script kiddie playing with scripts. Fuck around with these tools at your own risk.

We'll be covering how to use Twint, an open-source Twitter scraping tool coded in Python. The developers are working on a desktop app of this, but for now, you need to have Python installed in order to run it.

You can download Python from python.org. If you're installing it on Windows, you'll want to make sure that these two options are checked during install:

(Installer screenshots: the pip feature and the option that adds Python to PATH.)

The first, pip, lets you download and install Python shit with a single command. The second lets you run Python shit from the command line. Get Python installed, then open up a command line interface (cmd if you're neurotypical; PowerShell, bash, or god knows what else otherwise) and type in the following command:

Code:
pip3 install twint

If you see a series of messages relating to downloading and installing shit from the internet and System32 doesn't vanish, you've done it right. Wait for the gears to stop whirring, then use the command prompt to navigate to a folder you know how to find again, such as Downloads if you're a heathen who saves everything to Downloads. Then, refer to the documentation in Twint's github wiki to build a command to harvest the Twitter account of your choice. As a note, this doesn't work on protected Twitters, so you won't be able to trawl somebody's AD timeline. What a shame.
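If you want to sanity-check the install before pointing it at anyone, these two commands should print twint's package info and its help text, assuming pip and the twint entry point both ended up on your PATH:

Code:
pip3 show twint
twint -h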

Here are some examples of commands you can use:

Code:
 twint -u khordkitty -o khord_tweets.csv --csv
This command will pull everything down from KhordKitty's Twitter timeline and save it to a comma-separated values (CSV) file. CSV files are wonderful contraptions that can be opened in Excel or another spreadsheet editor, where you can run all manner of analytics on them. See attached for a sample of the results!
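If you'd rather not fire up Excel, here's a rough sketch of tallying tweets per year straight from that CSV. I'm assuming the file has a "date" column in YYYY-MM-DD format; check the header of your own output and adjust if it differs:

Python:
# Sketch: count tweets per year from a twint CSV. Assumes a "date" column
# formatted YYYY-MM-DD; verify against your own file's header.
import csv
from collections import Counter

per_year = Counter()
with open("khord_tweets.csv", encoding="utf8") as f:
    for row in csv.DictReader(f):
        per_year[row["date"][:4]] += 1

for year, count in sorted(per_year.items()):
    print(year, count)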

Code:
 twint -u khordkitty --followers -o khord_followers.csv --csv
This command will rip somebody's follower list from start to finish, saving every follower's username as a nice list.

Code:
 twint -u khordkitty --followers --user-full -o khord_followers.csv --csv
This command will rip somebody's follower list from start to finish, saving every follower's name, username, bio, location, join date, and various other shit as a nice list. WARNING: It also takes a lot fuckin longer to work.

Code:
 twint -u khordkitty --following -o khord_following.csv --csv
Same as the command to rip a follower list, except that this time, it collects the list of everybody they're following instead so you can see how many AD twitters they're jerking off to. The --user-full argument works here too, with the same caveat about taking longer.
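Once you've ripped both lists, a few lines of Python will tell you who they mutually follow. This assumes both CSVs have a "username" column; again, check the header of whatever twint spat out:

Python:
# Sketch: intersect a follower list and a following list to find mutuals.
# Assumes both CSVs have a "username" column.
import csv

def usernames(path):
    with open(path, encoding="utf8") as f:
        return {row["username"] for row in csv.DictReader(f)}

followers = usernames("khord_followers.csv")
following = usernames("khord_following.csv")
mutuals = followers & following

print(len(mutuals), "mutuals")
for name in sorted(mutuals):
    print(name)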

That's just a few of the wonderful things you can do with twint. However, as I have learned, twint is not without its limitations. One such limitation I have observed is that Twitter does not like being scraped and will stop responding after roughly 14,400 tweets have been pulled in succession. Once Twitter lashes out at you like that, it'll take a few minutes before Twint starts working again.
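One way to soften that is to drive twint from Python instead of the command line and pull the account in smaller date windows, so no single run slams into the wall. Rough sketch below; the Config attribute names are what I make of twint's wiki, so double-check them before trusting it:

Python:
# Sketch: scrape one year at a time via twint's Python interface so each run
# stays under the point where Twitter stops responding. Attribute names taken
# from twint's documentation as I understand it; verify before relying on this.
import twint

for year in range(2015, 2021):
    c = twint.Config()
    c.Username = "khordkitty"
    c.Since = f"{year}-01-01"
    c.Until = f"{year + 1}-01-01"
    c.Store_csv = True
    c.Output = f"khord_tweets_{year}.csv"
    c.Hide_output = True          # keep the console quiet
    twint.run.Search(c)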

Twint has some workarounds for this -- such as allowing you to use the --year argument to only pull tweets from before a given year -- but it's still annoying as hell. I'll have to experiment further with it. Also, as you may have guessed, this only grabs the text of tweets (including image URLs, but not the images themselves), so if you really want to go full archivist on some faggot you'll need some additional code. Thankfully, @Warecton565 comes to the rescue, with some code in a wonderful post. The code is not perfect; I did have to make a bit of a change just to get it to work at all:

Python:
import csv
from sys import argv
import requests
import json

# argv[1] is the CSV twint produced; argv[2] is an existing folder to save
# the images into (the script won't create it for you).
tweets = csv.DictReader(open(argv[1], encoding="utf8"))

for tweet in tweets:
    # The "photos" column holds a Python-style list literal, so swap the
    # single quotes for double quotes before handing it to the JSON parser.
    pics = json.loads(tweet["photos"].replace("'", '"'))
    for pic in pics:
        r = requests.get(pic)
        # Name each file after the part of the URL following /media/.
        open(argv[2] + "/" + pic.split("/media/")[1], "wb").write(r.content)

And I had to make the folder it was going to output to first and run the damn thing in IDLE before it would do anything. However, once you get it working, you can wind up with a folder containing hundreds of furry porn images and fursuit photos in mere minutes. Maybe even a photo of a dong or two.
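For reference, the invocation looks something like this once you've saved the script (grab_pics.py is just a placeholder for whatever you named it) and made the output folder:

Code:
mkdir khord_pics
python grab_pics.py khord_tweets.csv khord_pics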

Have fun! If you have anything you would like added to this post, please let me know via PM or some other means. Again, I'm just a fucking script kiddie.
 

The general twitter API limit is 5000 per 90 minutes, so I'm surprised twint can pull 14,400 tweets.

If you know how to modify twint, or if it has an option for it, I'd recommend putting a wait-timer in play that will pause it for a period of time after 5000 tweets.
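Something along these lines might do it, if twint's Limit and Resume options behave the way its wiki describes (untested on my end):

Python:
# Sketch of the pause-and-resume idea: cap each pass, sleep out the window,
# and let twint continue from a resume file. Limit/Resume semantics are an
# assumption based on twint's wiki; verify before use.
import time
import twint

c = twint.Config()
c.Username = "khordkitty"
c.Store_csv = True
c.Output = "khord_tweets.csv"
c.Limit = 5000                  # stop each pass after roughly 5000 tweets
c.Resume = "khord_resume.txt"   # scroll position carried between passes

for _ in range(10):             # up to ten passes
    twint.run.Search(c)
    time.sleep(90 * 60)         # wait out the 90-minute window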
 
One of the perks of twint is that it doesn't actually use the Twitter API at all; it's just a scraper. This also lets it work without needing a login of any kind.

It does have a wait-timer built in if Twitter gets defiant, too; it pauses for increasingly long periods of time before retrying. I'll need to experiment with it again to see if that works versus just restarting the damn thing.
 
It uses the API, even if it's not touching it directly. Web-calls also touch the API and can hit the limitations. It's why people complain about the twitterblockchain extension. It hits about 5K blocks then fucks up.

Still, thanks for compiling a bit of information on how to use twint. It'll be helpful.
 