yt-dlp appreciation thread

It doesn't download youtube videos for me, got anything that does on desktop?
I've never been able to get it to work anywhere but YouTube, but it seems to work okay for me there.

Mostly just download livestreams in M4A to listen to on my commute. Occasionally music.
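For anyone curious, grabbing just the m4a audio track looks something like this (the video ID is a placeholder, and the /bestaudio part is only a fallback in case no m4a track is offered):
Code:
yt-dlp -f "bestaudio[ext=m4a]/bestaudio" "https://www.youtube.com/watch?v=<livestream VOD ID>"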
 
I have used yt-dlp, and YouTube-DL before that; love them so much. Never knew they could work on other sites, though! Are there any guides on which sites work and on the principles of how it does this? Does it just scoop up and extract the video from any page with embedded video?
 
They have a list of supported sites here on their GitHub. For most of these sites it is as simple as yt-dlp <URL> and it will do the best it can to download the media on the page. For some other sites (pirate/free streaming sites come to mind) where you cannot get a specific URL for the video file, you can hunt down the m3u8 playlist file with an extension that catches them as they are requested/loaded (or just with the network tab in the browser's dev console, if that does not break the site), and then feed that URL to yt-dlp to download the content.
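To make that concrete (both URLs below are placeholders, not real links), the two cases look like this:
Code:
# Simple case: hand yt-dlp the page URL and let its extractor do the work
yt-dlp "https://www.example.com/watch/12345"

# Stream-sniffing case: paste the m3u8 playlist URL caught in the network tab
yt-dlp "https://cdn.example.com/streams/master.m3u8"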
 
I originally started using yt-dlp years ago in order to save videos from controversial YouTubers that I knew would be deleted one day, but ever since Invidious stopped working entirely, I've been using it to download videos from all my favorite channels. If the video just consists of a guy talking over a still image, or if it's a podcast, I can save on storage by doing this:

Code:
yt-dlp --extract-audio --audio-format m4a <URL>

Had no idea it could work on sites other than YouTube, though.
 
I am wondering if anyone has any suggestions on how I should handle using yt-dlp and Freetube together. The way I currently do it is by marking recent videos I want to watch as favorites in Freetube. Then I go to the favorites playlist on Freetube and copy and paste each youtube link one by one into a document with a yt-dlp command. I then paste the entire yt-dlp command into the CLI.

The above process is a little inefficient. I would prefer to be able to copy and paste all the YouTube links at once so that I can run yt-dlp on them in one go. But if I export my playlists as a .db file, instead of just giving me a list of YouTube links, it gives me all this information that is useless and not well-ordered for my purposes. So if anyone knows a better way to do this, or a way to automatically convert the .db information into a list of simple, clean YouTube links, please let me know. Thank you for your help.
 
If you are willing to create a YouTube account, you can add the videos to a public playlist and simply point yt-dlp at that with the --download-archive switch, which keeps a record of all the videos it has downloaded to prevent it from downloading multiple copies. That would reduce your yt-dlp input to just
Code:
yt-dlp --download-archive archive.txt <YouTube playlist URL>

If you want to use the output from the FreeTube playlist export, the "videoId" value is the identifier from the YouTube video URL (youtube.com/watch?v=<videoId>), and a bare ID works on the yt-dlp command line, so the below works:
Code:
yt-dlp dQw4w9WgXcQ
So you could just copy those into a text file as you are doing now, one per line, and then use the --batch-file <file> switch to load them all at once.
Code:
yt-dlp --batch-file <text file with one ID per line>.txt

Depending on how much effort you are happy to put into this, this would be a good start, I guess? There are hundreds of options of varying usefulness on the GitHub to poke around with.
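For anyone poking around, a few of the options that tend to get combined (the output template and archive filename here are just examples, not anything you have to use) look like this:
Code:
yt-dlp -f "bestvideo+bestaudio/best" -o "%(uploader)s/%(title)s [%(id)s].%(ext)s" --download-archive archive.txt <URL or playlist URL>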
 
Thank you for the effort. I think I need to be more clear about what I am looking for.

Your suggestion to create a youtube account is appreciated. But if I did so, that would defeat almost the entire point of me using FreeTube. I want to protect my privacy/anonymity. Unless I can create a youtube account entirely through tor, without using javascript (a powerful fingerprinting avenue and possible malware vector), and without needing to give them a phone number (not-anonymous), that option is not available to me. So I need other solutions.

Copying the videoIDs into a text file and then using --batch-file is an interesting idea. However, copying each videoID one by one into a text file would be just as labor-intensive as copying and pasting each URL one by one into a text file with my yt-dlp command and then running the command. What I would need is some sort of script that would automatically scan and copy those videoIDs into a batch file, so that I would not have to copy and paste them one by one manually. Unfortunately, I am tech-illiterate and do not know how to do any coding. I do not know where I would even start in order to figure out how to do this.

So if anyone knows a better way to do this or a way to automatically convert the .db information into a list of simple clean youtube links, please let me know. Thank you for your help.
 
I assumed as much; I just wanted to make you aware of the option of pointing it at a public playlist.
As the .db exported from FreeTube is just a JSON file, it would be relatively simple to create something in, say, Python that would output the file/format needed for bulk downloading. Do you have an example .db file (not your own personal/identifying one) at all?
 
Sure, here is my totally non-personal, non-identifying .db file full of totally random videos! (Apparently .db files are not supported in kiwifarms file attachments?)
Yeah it probably does not like the .db extension in particular, probably just filtered by default.
I threw the below together, which just exports a .txt file from a dummy export I grabbed. If you are comfortable enough to install Python, you can place the FreeTube .db in a folder with this "playlist.py" file and run it. It will generate one playlist .txt file per playlist in your .db file (it will skip playlists that are empty, like favourites if you have not added videos).
Python:
import json
import os

if __name__ == "__main__":
    pldb = {}

    # Find the exported FreeTube playlists file in the current folder.
    # If more than one matches, the last one listed wins.
    fname = None
    for n in os.listdir(os.curdir):
        if "freetube-playlists-" in n:
            fname = n
    if fname is None:
        raise SystemExit("No freetube-playlists-*.db file found in this folder.")

    # The export has one JSON object per line, one object per playlist.
    with open(fname, "rb") as file:
        for line in file.readlines():
            d = json.loads(line)
            pldb[d['playlistName']] = d

    # Write one "playlist-<name>.txt" per non-empty playlist,
    # with one videoId per line, ready for --batch-file.
    for pl in pldb:
        if len(pldb[pl]["videos"]) == 0:
            continue

        vlist = []
        for v in pldb[pl]['videos']:
            vlist.append(v['videoId'])
        with open(f"playlist-{pl}.txt", "w") as output:
            for v in vlist:
                output.write(v + "\n")

Basically:
Install Python.
Copy your exported freetube-playlists-*.db into a folder.
Place the above into a Python .py file (e.g. playlist.py) in the same folder as the .db.
Run the file; it should output a "playlist-<playlistname>.txt" for each playlist in your .db.
You can then use that text file as an argument to
Code:
yt-dlp --batch-file <file>
This does not check for the newest file; it just assumes whichever matching file it finds is the right one. So delete your .db file after running it and export a fresh one for subsequent runs, I guess.
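To tie it back to the earlier suggestion (the filenames here are just examples), the generated list can then be combined with a download archive so repeat runs skip anything already grabbed:
Code:
yt-dlp --batch-file playlist-Favorites.txt --download-archive archive.txt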
 
Poor Invidious instances getting taken down, but at least yt-dlp still works. It's sad, because I enjoyed using FreeTube.
It's a blessing in disguise, to be honest. Invidious doesn't serve any purpose besides offloading effort from the end user and bringing more heat on working methods of video archival from Jewgle. The video stream itself isn't proxied through the instance; you are accessing googlevideo.com regardless. Frontend redesign is best left to browser addons or Tamper/Violentmonkey userscripts.
 
I figured it would serve as a proxy; that's why I'd rather use something a bit more anonymous than private browsing or my Google account, so that they collect less data or none at all. Looks like I can't escape YouTube surveillance other than by not using YT at all.
 
yt-dlp should be part of a stack alongside streamlink, mpv, gallery-dl and ffmpeg (plus curl/wget, I guess).
Guaranteed you'd be able to download and archive whatever you need.
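As a rough sketch of how those pieces fit together (the channel name and filenames are placeholders, and mpv needs yt-dlp on the PATH for the first line to work):
Code:
# mpv streams a page's video directly by calling yt-dlp under the hood
mpv "https://www.youtube.com/watch?v=dQw4w9WgXcQ"

# streamlink pulls a live stream to a file (or pipes it to a player)
streamlink "https://www.twitch.tv/somechannel" best -o stream.ts

# ffmpeg remuxes whatever you end up with, without re-encoding
ffmpeg -i stream.ts -c copy stream.mp4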
 
Thank you. I will try it at some point.
 
Appears not to download from Xitter, but there are methods for that. Anyhow, it's a very fine application.
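If the methods in question are something like gallery-dl from the stack mentioned above, a sketch of that route (placeholder tweet URL) would be:
Code:
gallery-dl "https://twitter.com/someuser/status/1234567890"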
 
Looks like you can't download age restricted videos without cookies or oauth now, which means you risk getting your account cucked.

Well damn, so much for downloading those age restricted videos. I was taking a look at the old WW1 channel that had quite a few playlists filled with videos that have been age restricted for one reason or another. I'd rather not risk my account if I can help it, and doubtless YouTube will pick up on any attempts to use a burner and ban your IP and anything associated with it wholesale. VPNs might be an option, but I am not certain that's worth pursuing considering the state of things.

Right now, I'll be content if I can find at least one site that would allow me to download age restricted vids with no bullshit attached.


Using this quick one as an example, I've tried a couple of online downloaders, yet no dice thus far.
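For reference, the cookie route mentioned above (only sensible with a throwaway browser profile; the URL and filenames here are placeholders) looks like this:
Code:
# Borrow cookies straight from a logged-in browser profile
yt-dlp --cookies-from-browser firefox "https://www.youtube.com/watch?v=<age-restricted ID>"

# Or use a Netscape-format cookies.txt exported by a browser extension
yt-dlp --cookies cookies.txt "https://www.youtube.com/watch?v=<age-restricted ID>"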
 