Ever wanted to save a copy of a certain website to review it later when offline?

In the early 2000s, we used to copy whole websites into static HTML, together with their image and script assets, so that we could access them when disconnected.

Believe it or not, many people still do this, for a variety of reasons.

What is a Website Copier?

To get a copy of a website, you need a special web crawler called a website copier, which saves the entire site as static HTML files along with its images, stylesheets, and JavaScript files.

Website Copier is an offline browser utility that allows users to download a website from the Internet to a local directory. This enables users to browse the downloaded site from link to link as if they were viewing it online, even when not connected to the Internet. It maintains the original site's relative link structure and can update mirrored sites.
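The core mechanics can be sketched in a few lines of Python. This is a hypothetical illustration, not the code of any tool below: it finds asset references in a page and rewrites them to local relative paths so the page keeps working offline.

```python
# Minimal sketch of the core idea behind a website copier (illustration only):
# collect asset URLs from a page and rewrite them to local relative paths.
from html.parser import HTMLParser
from urllib.parse import urlparse
import posixpath

class AssetRewriter(HTMLParser):
    """Collects asset URLs and maps each one to a local file name."""
    def __init__(self):
        super().__init__()
        self.assets = {}  # original URL -> local relative path

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if (tag, name) in {("img", "src"), ("script", "src"), ("link", "href")}:
                filename = posixpath.basename(urlparse(value).path)
                self.assets[value] = "assets/" + filename

page = '<img src="https://example.com/logo.png"><script src="/js/app.js"></script>'
rewriter = AssetRewriter()
rewriter.feed(page)

# Rewrite every discovered URL so the saved page points at local copies
local_page = page
for url, local in rewriter.assets.items():
    local_page = local_page.replace(url, local)

print(local_page)
# → <img src="assets/logo.png"><script src="assets/app.js"></script>
```

A real copier would also download each asset, follow links recursively, and respect a depth limit; the tools below handle all of that for you.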

In this post, we offer you a list of the best website copiers. Note that some of them are low-level tools, while others are aimed at non-technical users.

1- ArchiveBox


ArchiveBox is a free, open-source web archiving tool that takes any website and saves it for offline usage.

Although it is primarily a self-hosted web archiving tool, it also offers command-line and desktop editions.

ArchiveBox takes a snapshot of the website and saves it in several file formats. It can extract media files from sites like YouTube, improve article readability, and more.
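A typical ArchiveBox session from the command line looks roughly like this (assuming you already installed it, for example with pip; exact commands may differ between versions):

```shell
# Create a new archive collection in the current directory
archivebox init

# Save a snapshot of a URL (HTML, screenshot, PDF, media, etc.)
archivebox add 'https://example.com'

# Browse the archive through the built-in web UI
archivebox server 0.0.0.0:8000
```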

GitHub - ArchiveBox/ArchiveBox: 🗃 Open source self-hosted web archiving. Takes URLs/browser history/bookmarks/Pocket/Pinboard/etc., saves HTML, JS, PDFs, media, and more...

2- HTTrack

HTTrack is free, open-source software that helps you get an offline copy of any website. The graphical edition (WinHTTrack) targets Windows, while the command-line version also runs on Linux, macOS, and other Unix-like systems.

With HTTrack, you can get a mirrored version of any website, which can be updated when the original site publishes new content.

HTTrack comes with a simple multilingual interface, supports proxies, uses an integrated DNS cache, and offers complete support for HTTPS and IPv6.
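With the command-line edition, mirroring and later refreshing a site looks roughly like this (the URL and output path are examples):

```shell
# Mirror a site into ./example-mirror
httrack "https://example.com/" -O ./example-mirror

# Later, refresh the existing mirror with any new content
httrack --update -O ./example-mirror
```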

If you have a Windows machine, this app is a must-have.

HTTrack The Website Copier: Copy Any Website to Your Desktop For Offline Browsing

3- Getleft

Getleft is a free, open-source website grabber that still sees weekly downloads years after its first release. It is available for Windows, Linux, and macOS.

Download Getleft for free. Getleft is a Web site grabber, it downloads complete web sites according to the options set by the user.

4- Archivarix

Archivarix is a different kind of website grabber/copier: it is a complete CMS (content management system) combined with a website downloader and a smart Wayback Machine downloader.

Many site owners use Archivarix to keep backup copies of their websites and restore them when required.

Archivarix works as a WordPress plugin.

5- Website Downloader

This is a customized website downloader solution that takes a copy of any website including JavaScript, Stylesheets, and Images.

This project, "Website Downloader", is the brainchild of Ahmad Ibrahiim, who released it as open source under the MIT License.

GitHub - AhmadIbrahiim/Website-downloader: 💡 Download the complete source code of any website (including all assets). [ Javascripts, Stylesheets, Images ] using Node.js

6- Goclone

Goclone clones any website to your local disk in a matter of minutes. It is built with the Go programming language.

Goclone is a command-line tool, but it is extremely easy to use, even for non-technical users. The project is released under the MIT License.
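Usage is a single command (assuming the goclone binary is on your PATH):

```shell
# Clone the site into a local folder for offline browsing
goclone https://example.com
```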

GitHub - imthaghost/goclone: Website Cloner - Utilizes powerful Go routines to clone websites to your computer within seconds.

7- PC Website Grabber

PC Website Grabber is yet another tool to help you copy websites to a local disk. It downloads all assets and saves the whole website as a zip archive.

This project is released under the GPL-2.0 License. It is part of the PageCarton website builder.

GitHub - pagecarton/website-downloader: PC Website Grabber enables you to save website files from a remote server straight in your browser. Useful for offline browsing of an online resources. Opensource, Super intelligent and absolutely customizable PageCarton Plugin. https://plugins.pagecarton.org/2018/07/12/website-downloader-website-copier.html

8- goscrape

The "goscrape" tool is yet another website scraper that lets users create an offline, readable version of a website along with its assets.

Unlike its competitors on this list, it converts large image files into smaller versions and fetches files from external domains.

Goscrape works seamlessly from the command line, so do not expect any GUI version.
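Basic usage follows the same one-command pattern as other Go-based scrapers (options such as crawl depth and image quality exist, but their exact names are best checked in the project's help output):

```shell
# Create an offline copy of the site in the current directory
goscrape https://example.com
```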

GitHub - cornelk/goscrape: Web scraper that can create an offline readable version of a website

9- Website Cloner


Website Cloner is a Visual Basic .NET tool that creates a copy of any working website, including all of its pages, with full recursive directory support.

However, there are no pre-built releases for this tool, so you need to build the app from the source code.

GitHub - X-SLAYER/Website-Cloner: It allows you to download a website from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer.

10- Monolith

Unlike the other tools in this list, Monolith takes a different approach: it saves a complete web page, together with all of its assets, as a single HTML file.

Monolith is built with the Rust programming language and can be installed with Rust's package manager (Cargo), with Homebrew or MacPorts on macOS, and with several package managers on Linux and FreeBSD.
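For example, installing with Cargo and saving a page (the `-o` flag names the output file):

```shell
# Install monolith via Cargo
cargo install monolith

# Bundle a page and everything it references into one HTML file
monolith https://example.com -o example.html
```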

GitHub - Y2Z/monolith: ⬛️ CLI tool for saving complete web pages as a single HTML file

11- Node-site-downloader

Node.js is a popular choice for many developers to build their web, mobile, desktop or command-line apps.

Here, Node.js is used to build this command-line script which allows anyone to copy a functional website with one command.

GitHub - gnir-work/node-site-downloader: An easy to use CLI for downloading websites for offline usage

12- MiniCopier

MiniCopier is a GUI-based, multi-platform website downloader that supports multiple transfers, pausing downloads, speed limits, and a download queue.

MiniCopier works seamlessly on Windows, Linux, and macOS. It is released under GPLv2.0.

GitHub - jalagamv/Download-Website-Copier-Project-in-JAVA: Java based source code

13- Website-Downloader

This is a simple PHP script for downloading a copy of a website to your local disk. It does not come with detailed instructions or tutorials, but it is not hard for experienced developers to use.

GitHub - badrshs/Website-Downloader

Final note

A website copier is a program that helps you keep a functional copy of a website on your disk. We have listed the most popular open-source website copiers here, and we hope this list comes in handy for anyone looking for such tools.

If you know of any other free, open-source website copier that we did not mention here, let us know.