If you ever want to archive a site for offline use, wget can make a local copy of it for you. Thanks to minimallinux, who outlines how to do so.
wget \
      --recursive \
      --no-clobber \
      --page-requisites \
      --html-extension \
      --convert-links \
      --restrict-file-names=windows \
      <url-of-site>
The command works well; it's amazing that an operating system has a built-in tool to download an entire website.
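A quick rundown of the options: --recursive follows links and downloads the pages they point to, --no-clobber skips files that have already been downloaded, --page-requisites grabs the images, CSS and JavaScript each page needs to render, --html-extension saves pages with an .html suffix, --convert-links rewrites links so they work offline, and --restrict-file-names=windows avoids characters that are invalid on Windows filesystems.

If you mirror sites regularly, it may be handy to wrap the command in a small script. Here is a minimal sketch, assuming a hypothetical archive-site.sh file name; it swaps --html-extension for --adjust-extension (the newer spelling of the same option) and adds --no-parent so wget doesn't climb above the starting URL.

#!/bin/sh
# archive-site.sh (hypothetical name): mirror a site for offline browsing.
# Usage: ./archive-site.sh https://example.com/
set -eu

url="$1"

wget \
      --recursive \
      --no-clobber \
      --page-requisites \
      --adjust-extension \
      --convert-links \
      --restrict-file-names=windows \
      --no-parent \
      "$url"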