

shaving off the time sinks
April 19, 2012 9:08 AM   Subscribe

I am trying to find more time in my life, and am looking for tools and strategies to minimize the time I spend on the internet without totally disconnecting.

I have gotten pretty good at time management, but I need it to be better.
Sleeping, waking up, and work take up midnight to 6pm. The remaining 6 hours of my day are split between: making/eating/cleaning up dinner, house chores, the gym (adding gym visits 2-3 times/week has made me much more productive, thanks old time-management AskMe questions!), time with my partner, working on events/projects I run, and working on music. Those are all good things. And INTERNET.

At work I collect the links from metafilter (and RSS feeds, via google reader) that I think are interesting and I probably spend about 60-90 minutes/evening running through those saved URLs and going to other internet sites.
Optimally, I'd like to spend no more than 45 minutes on internet randomness while at home, and am hoping to find ways to skim through the new information more efficiently.
I don't want to entirely unplug since I do get a lot of really useful information, and enjoy staying on top of the latest weird things on youtube, memes and debates. (I don't need to be involved in any of those things in depth)

Some things I was thinking about:
Is there a youtube download tool I can just feed a list of youtube URLs and have it download them all for me automatically? (Offline viewing makes for easier skimming, and fewer distractions via comments and sidebars.)
Is there a tool/site I can feed a list of URLs that would highlight those with the highest level of response on popular aggregation sites like mefi, reddit, etc.?
Should I just build those things and/or have someone make them?

For most news and site updates I use Google Reader. I use a robot to scrape MP3s off music blogs so I don't have to browse those. I try to compose responses to emails on my work breaks and send them when I get home. I moved all torrents to RSS subscriptions.

Any strategies and tools to more efficiently sift through the internet would be appreciated. I want more automation and executive summaries.
posted by Theta States to Computers & Internet (7 answers total) 24 users marked this as a favorite
 
I've found Instapaper -- and especially the addition recently of a Wifi-only Kindle -- very useful for Internet reading that isn't surfing.

(Since I can watch YouTube from my TiVo or XBox but not at work, I'd welcome any useful tools that way as well.)
posted by MCMikeNamara at 9:38 AM on April 19, 2012


Lo-tech efficiency tool: buy a cube timer. Install batteries. Flip it to 30. Start reading. When it beeps, flip it to 15, disconnect and/or finish your current pursuit and/or bookmark stuff for next time. When it beeps again, close your laptop (or shut down your computer, or sleep your monitor, or throw a blanket over the cage of the genetically modified cyberparakeet who trawls the Internets & reads them aloud to you).
posted by feral_goldfish at 3:34 PM on April 19, 2012 [1 favorite]


I don't have a problem with the timing per se; I can always unplug when needed. If I don't get through all of the interesting links I saved in the day, they go into a text file of old links. Sometimes on a weekend I'll dip back into that list, but it just bloats and bloats (about 2000 unchecked links at this point...)
I want to get through the URLs I do have, faster. I want to skim for relevance.
posted by Theta States at 6:25 AM on April 20, 2012


Hi everyone, I figured out a way to solve the youtube problem.
I am using ActivePerl for the script and DownloadHelper for the bulk downloading. The script is needed because DownloadHelper's batch function requires a page of youtube links to operate on.

The program takes a file of youtube links (and youtu.be links) and builds a quick-and-dirty HTML file. Load that in Firefox, CTRL-A to highlight the links, right-click and select "Download videos from youtube link(s)". DONE.
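For the curious, the link-file-to-HTML step is tiny. Here's a rough sketch in shell (my actual script is Perl; `links.txt` is a stand-in name for the saved-links file, seeded here with sample URLs so the sketch runs on its own):

```shell
# links.txt is assumed to hold one youtube.com or youtu.be URL per line;
# created here with sample data so the sketch runs standalone
printf '%s\n' 'https://youtu.be/dQw4w9WgXcQ' \
              'https://www.youtube.com/watch?v=oHg5SJYRHA0' > links.txt

{
  echo '<html><body>'
  while IFS= read -r url; do
    # wrap each URL in an anchor so DownloadHelper sees it as a page link
    echo "<a href=\"$url\">$url</a><br>"
  done < links.txt
  echo '</body></html>'
} > links.html
```

Open the resulting links.html in Firefox and DownloadHelper can grab everything on the page in one batch.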

I am having trouble posting it here due to the markup, but send me an email if you want a copy.
posted by Theta States at 6:53 PM on April 24, 2012


I tested this last night and found it extremely useful. I had 70 youtube links in my "links to check eventually" file. Piped them into my program, set DownloadHelper to snatch them, and came back an hour later to a giant folder of videos I could sift through offline.
No youtube loading, no Flash crashing, no distraction from sidebar videos or comments.
It also allowed me to skim through videos much more readily. An 8-minute video of someone climbing something awesome? Skimmed through in 90 seconds.

Only caveat: 1 link was to the "24 hours of the Starship Enterprise idling" video. Luckily I caught that one and cancelled the download. :)

Next up: build program to give me executive summaries on a wide selection of random links...
posted by Theta States at 6:48 AM on April 25, 2012 [1 favorite]


If you are interested in streamlining and simplifying your program, youtube-dl works straight from the shell. If you want videos from non-youtube links, scripting Firefox may still be your best bet. On a related note, on Linux systems flashcache (download link has that name on the linked page) is a script that detects and copies flash videos from google-chrome.
posted by idiopath at 7:12 AM on April 25, 2012 [1 favorite]


Thanks, I will check that out when I get home. Having it done via shell will be even more efficient.
posted by Theta States at 11:39 AM on April 25, 2012


This thread is closed to new comments.