Getting a full offline travel copy using wget on macOS

A guide to using the standard wget tool to download a full offline archive of a website on macOS High Sierra. Includes a description of all arguments used.

Paul

August 02, 2017 · 3 min read

TL;DR: wget -r -E -k -p -np -nc --random-wait http://example.com

There are cases when you want to download a website as an offline copy. For me it's usually a site I want to keep using while I'm offline, either by choice or because I'm on a plane that doesn't have Wi-Fi available (Gen-1 planes ;)).

When we download a website for local offline use, we need it in full: not just the HTML of a single page, but all required links, sub-pages, etc. Of course any cloud-based functionality will not work, but documentation sites in particular are usually mostly static.

wget is (of course) the tool for the job.

To install wget on your Mac (if you don't have it yet), you can use Homebrew in a Terminal window:

$ brew install wget
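
You can confirm the install worked by asking wget for its version:

$ wget --version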

As with most terminal commands, you can get an overview of the (very extensive) list of options using:

$ wget --help
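
The output runs to several screens, so it helps to page through it, or to grep for the option you're after:

$ wget --help | less
$ wget --help | grep clobber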

In its most basic form, you just run:

$ wget example.com

But that only gets you so far. You now have the HTML of the root page of that website, but you can't really use that to view the site on your laptop: you are missing all the supporting CSS files, images, JavaScript, etc. In addition, all links still point to the old 'context' of the online website.
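
You can see this for yourself: fetching just the root page leaves you with a single index.html, and opening it in a browser makes the missing pieces obvious:

$ wget example.com
$ open index.html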

So we'll use some options to get wget to do what we want.

$ wget -r -E -k -p -np -nc --random-wait http://example.com

Let's go through those command line options to see what they do:

-r (--recursive) makes wget follow links and download pages recursively, so you get sub-pages instead of just the root page.

-E (--adjust-extension) saves files with a .html extension where needed, so they open correctly from your local disk.

-k (--convert-links) rewrites the links in the downloaded documents after the fact, so they point at your local copies and keep working offline.

-p (--page-requisites) downloads everything a page needs to display properly: CSS files, images, JavaScript, etc.

-np (--no-parent) stops wget from ascending to the parent directory, so you only mirror the part of the site you started from.

-nc (--no-clobber) skips files that already exist locally, which lets you resume an interrupted download.

--random-wait makes wget pause a random amount of time between requests, which is friendlier to the server.
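
Putting it all together on a hypothetical documentation site (substitute the site you actually want to take along), the download plus a quick check looks like this:

$ wget -r -E -k -p -np -nc --random-wait https://docs.example.com/   # hypothetical docs site
$ open docs.example.com/index.html                                   # browse the offline copy

wget mirrors the site into a directory named after the host, and thanks to -k the links inside the local copy now point at local files.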
