preprocessing media to save bandwidth

{almost forgot: Merry Christmas, everybody!}

There are several bandwidth-related concerns for nomads:

  1. limits on data plans

  2. spotty mobile data

  3. not abusing open wifi

These concerns can be ameliorated by preprocessing media Elsewhere, down to “good enough” quality levels, before downloading.  In this case, “Elsewhere” is a linux VPS.


Podcasts are often overproduced and overencoded.  It’s common for single-voice talking-head ’casts to be encoded in stereo at outrageous bitrates.

My process:

  1. grab the podcasts Elsewhere (newsbeuter + aria2c)

  2. disassemble them to .wav one by one (ffmpeg)

  3. do any processing like voxxing, normalizing, etc.

  4. re-encode with a voice-friendly encoder (opusenc).  A 90% filesize reduction is common.

  5. download
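The middle steps can be sketched as a shell loop.  This is a dry run — it echoes the commands rather than running them — and the episode filenames are hypothetical placeholders; the ffmpeg/opusenc flags are the interesting part:

```shell
# Dry-run sketch of steps 2 and 4.  Drop the "echo"s to run for real.
for f in ep01.mp3 ep02.mp3; do
  base="${f%.mp3}"
  # step 2: disassemble to .wav
  echo ffmpeg -i "$f" "$base.wav"
  # step 4: downmix to mono and re-encode at a voice-friendly bitrate
  echo opusenc --downmix-mono --bitrate 24 "$base.wav" "$base.opus"
done
```

Opus is tuned for speech at low bitrates, so 24 kbit/s mono sounds fine for talking heads even when the source was 128+ kbit/s stereo.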


I live in a van and don’t have any big-screen devices.  240p looks fine on my phone, chromebook, and pi-based dvr.  Fast-moving action scenes do pixelate, but I’m not an action flick fan.  On-screen text can be hard to read.

  1.  download Elsewhere

  2. convert to 240p .mp4 files (ffmpeg).  A 75% filesize reduction is common.

  3. download
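Step 2 boils down to one ffmpeg invocation per file.  A sketch, echoed as a dry run with a hypothetical filename (the crf and audio bitrate values here are my guesses at reasonable settings, not anything sacred):

```shell
in="show_s01e01.mkv"            # hypothetical source file
out="${in%.*}.240p.mp4"
# scale height to 240 and let the width follow the aspect ratio;
# -2 keeps the width even, which libx264 requires.  crf 28 trades
# quality for size; 64k AAC audio is plenty for phone speakers.
echo ffmpeg -i "$in" -vf scale=-2:240 -c:v libx264 -crf 28 \
     -c:a aac -b:a 64k "$out"
```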

Talking-head YT videos are downloaded as audio only (youtube-dl) and processed with the podcasts above.  If there is visual content I want to see, I pull the video down as “worstvideo+worstaudio” with youtube-dl.  If something requires higher quality (rare), I have a separate script to handle that.
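The two youtube-dl cases look something like this (echoed as a dry run; the URL is a placeholder):

```shell
url="https://www.youtube.com/watch?v=PLACEHOLDER"   # hypothetical URL
fmt="worstvideo+worstaudio"
# talking heads: grab audio only, then feed it to the podcast pipeline
echo youtube-dl -f worstaudio -o "%(title)s.%(ext)s" "$url"
# visual content: smallest video+audio pair youtube-dl can mux
echo youtube-dl -f "$fmt" "$url"
```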

Video from Amazon’s Prime Video service is downloaded (not streamed) at non-peak times at the Data Saver rate.  They do a great job with the compression.  Similar sizes to my 240p conversions but better quality.  They do have access to the whole AWS infrastructure.  :-)

TV that comes in over-the-air via antenna (to the MythTV pi rig) obviously uses no mobile data.  Even then I prefer SD to HD for diskspace and playback purposes.


I only started preprocessing ebooks yesterday.  Commercial ebooks tend to come with all kinds of bloaty crap in them.  I’ve been locally processing them in Calibre after download, but I finally started doing it from the linux command line Elsewhere.

  1.  download Elsewhere

  2. run ebook-convert (calibre) to [re]convert to epub.  A 66% filesize reduction is common.

  3. run ebook-polish (calibre) for a final pass.  Usually it makes only a tiny further reduction.

  4. download


3025975 There_Are_Places...epub
 989351 There_Are_Places...converted.epub
 988687 There_Are_Places...polished.epub

Conversion options:

--change-justification left \
--smarten-punctuation \
--subset-embedded-fonts \ <-- removes unused font symbols
--insert-blank-line \  <-- blank line between paragraphs
--output-profile kindle

Polish options:

--compress-images  <-- losslessly
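Put together, the two calibre passes look something like this (echoed as a dry run; the filename is a made-up stand-in, and the flags are the ones listed above):

```shell
in="book.epub"                  # hypothetical input file
conv="${in%.epub}.converted.epub"
pol="${in%.epub}.polished.epub"
# pass 1: re-convert, which strips bloat and subsets embedded fonts
echo ebook-convert "$in" "$conv" \
     --change-justification left --smarten-punctuation \
     --subset-embedded-fonts --insert-blank-line \
     --output-profile kindle
# pass 2: final polish with lossless image recompression
echo ebook-polish --compress-images "$conv" "$pol"
```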