08.21.2019

Download a Web Site using wget

Sometimes you want to download a website for off-line viewing, and wget can do this for you.

Here is an example command:
$ wget \
     --recursive \
     --no-clobber \
     --page-requisites \
     --html-extension \
     --convert-links \
     --restrict-file-names=windows \
     --domains example.com \
     --no-parent \
         www.example.com/tutorials/html/

Explanation of the options:
  • --recursive: download the entire Web site.

  • --domains example.com: don't follow links outside example.com.

  • --no-parent: don't follow links outside the directory tutorials/html/.

  • --page-requisites: get all the elements that compose the page (images, CSS and so on).

  • --html-extension: save files with the .html extension.

  • --convert-links: convert links so that they work locally, off-line.

  • --restrict-file-names=windows: modify filenames so that they will work in Windows as well.

  • --no-clobber: don't overwrite any existing files (useful in case the download is interrupted and resumed).
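
For reference, most of these options also have short forms (only --restrict-file-names lacks one; -E is the short form of --html-extension, which newer wget releases call --adjust-extension), so the following should be equivalent to the command above:
$ wget -r -nc -p -E -k -np \
     -D example.com \
     --restrict-file-names=windows \
     www.example.com/tutorials/html/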

Example command to download all the HTML files in a directory named vector from a website:
$ wget --recursive --no-parent --domains example.com \
     http://example.com/vector/
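
Note that --recursive on its own fetches everything it can reach under vector/, not only HTML files. If you want only pages ending in .html, wget's --accept option can filter by suffix; a minimal sketch, assuming the same example.com layout:
$ wget --recursive --no-parent --domains example.com \
     --accept html \
     http://example.com/vector/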
