Can I undo the tide of history?
May 9, 2010 11:43 AM   Subscribe

Can anyone think of a way to hack in podcast XML from Youtube?

I have about 8 youtube feeds of interest, and would probably have more if I had a way of making them work with my podcast client (currently Banshee) to download and queue them up.

Here's the problem: feeds like this link to youtube URLs but not to directly downloadable items. This is a known problem; people have written and maintain tools to take that step, and that works for me with some manual steps. However, what I'd like to do is create a script to translate the feed into podcast XML with enclosures, and I'm running into a roadblock.

An example will help. youtube-dl has an option to return a download URL:
youtube-dl -g "http://www.youtube.com/watch?v=A1R8KGKkDjU&feature=youtube_gdata"

yields

http://www.youtube.com/get_video?video_id=A1R8KGKkDjU&t=vjVQa1PpcFNZKeUR1fuXgulAnl7cMvONhjSATO3PEK8=&eurl=&el=detailpage&ps=default&gl=US&hl=en

So far so good. But when I pass this to wget I come up with this error:

Cannot write to `videoplayback?ip=75.0.0.0&sparams=id,expire,ip,ipbits,itag,algorithm,burst,factor&fexp=907004&algorithm=throttle-factor&itag=5&ipbits=8&burst=40&sver=3&expire=1273449600&key=yt1&signature=7BE776FCEC9DFCE73FB8B29F2705BCA9EBE69AD6.AFE3051140A582D6FF0C33111CF1FD26EA94307F&factor=1.25&id=03547c2862a40e35'

The above filename is 302 bytes long, but ext4 only supports filenames up to 255 bytes. Youtube-dl has flags to choose filenames, but I don't believe podcast clients offer anything similar. Short of downloading all the videos to another webserver and inserting that URL instead, is there a way to convert these feeds to proper enclosures?
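For what it's worth, here's a minimal sketch of the translation step I have in mind: walk the feed's items and attach an enclosure whose URL comes from resolving the watch link. The resolver below is a self-contained stand-in for shelling out to `youtube-dl -g`, and the `video/x-flv` type is an assumption; a real script would probe the actual content type.

```python
import xml.etree.ElementTree as ET

def resolve_download_url(watch_url):
    # Stand-in for: subprocess.check_output(["youtube-dl", "-g", watch_url])
    # Here we just echo the video id back in a placeholder URL.
    return "http://example.com/get_video?video_id=" + watch_url.split("v=")[-1]

SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>Example channel</title>
  <item>
    <title>Example video</title>
    <link>http://www.youtube.com/watch?v=A1R8KGKkDjU</link>
  </item>
</channel></rss>"""

def add_enclosures(feed_xml):
    # For each item, resolve its watch link and attach an <enclosure>
    # element so a podcast client sees something downloadable.
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        watch_url = item.findtext("link")
        enclosure = ET.SubElement(item, "enclosure")
        enclosure.set("url", resolve_download_url(watch_url))
        enclosure.set("type", "video/x-flv")  # assumed; probe the real type
    return ET.tostring(root, encoding="unicode")

print(add_enclosures(SAMPLE_FEED))
```

The catch, as described above, is that the resolved URL still yields an unusably long filename on the client side.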
posted by pwnguin to Computers & Internet (6 answers total) 1 user marked this as a favorite
 
wget has the "-O filename" option, would that help?
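Since `-O` needs a name to write to, one approach is deriving a short one from the URL itself: the `video_id` query parameter is short and unique. A sketch, assuming the `get_video`-style URL shown above (the `.flv` extension is an assumption):

```python
from urllib.parse import urlparse, parse_qs

def short_filename(download_url, ext=".flv"):
    # Pull the video_id parameter out of the long get_video URL and
    # use it as a short, filesystem-safe filename.
    qs = parse_qs(urlparse(download_url).query)
    return qs["video_id"][0] + ext

url = ("http://www.youtube.com/get_video?video_id=A1R8KGKkDjU"
       "&t=abc&eurl=&el=detailpage&ps=default&gl=US&hl=en")
print(short_filename(url))  # A1R8KGKkDjU.flv
```

That name could then be passed along as `wget -O A1R8KGKkDjU.flv "$url"`.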
posted by jozxyqk at 12:35 PM on May 9, 2010


Have you looked into Yahoo! Pipes?
posted by jeffmilner at 12:40 PM on May 9, 2010


Response by poster: Yes, I'm familiar with Pipes. I have a couple of popularly cloned pipes, but I'm getting to the point where Pipes's flaws frustrate me (caching, regex bugs, no composition, etc.). This particular problem seems independent of which program I use to translate feeds. I could make some youtube URL translator web API for Pipes to use and I'd still be stumped on how to get the desktop end to choose a smart filename.

The problem comes down to dealing with podcast clients, I think. Wget and youtube-dl have options for filenames, but Banshee doesn't, and I assume most other podcast clients don't either.
posted by pwnguin at 12:59 PM on May 9, 2010


pwnguin, perhaps the source code to Miro could help you out; IIRC Miro tries to download YouTube videos either individually or from YouTube users' RSS feeds. Good luck!
posted by brainwane at 4:49 PM on May 9, 2010


I'm not sure I really understand the problem enough here, but what if you just passed the video through another server, which downloaded the file from youtube, and then streamed it back to you, but with a shorter filename? This obviously doesn't scale, but if you're just looking for something for your personal use, you'd be fine.
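A minimal sketch of that idea using only the standard library: a tiny HTTP handler that maps a short local path like `/A1R8KGKkDjU.flv` to the long upstream URL and streams the bytes back. The `upstream_url` mapping is hypothetical; a real version would call `youtube-dl -g` to resolve the current download URL.

```python
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def upstream_url(path):
    # Hypothetical mapping: /A1R8KGKkDjU.flv -> the long get_video URL.
    video_id = path.lstrip("/").rsplit(".", 1)[0]
    return "http://www.youtube.com/get_video?video_id=" + video_id

class ProxyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Fetch from upstream and stream it back under the short local name.
        with urllib.request.urlopen(upstream_url(self.path)) as upstream:
            self.send_response(200)
            self.send_header("Content-Type", "video/x-flv")  # assumed type
            self.end_headers()
            self.wfile.write(upstream.read())

# To run it: HTTPServer(("", 8080), ProxyHandler).serve_forever()
```

As noted, this trades the filename problem for bandwidth and storage on your own box.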
posted by !Jim at 8:09 PM on May 9, 2010


Response by poster: That was one option, but the problems are twofold:

* it's a lot of video to store, though I suppose a proxy/cache might work...
* ideally the solution is something I can share without the world destroying my server
posted by pwnguin at 8:58 PM on May 9, 2010

