Website Copier and Website Cloner apps are tools designed to download websites and their content for offline browsing purposes. They allow users to create local copies of websites, including HTML, CSS, JavaScript, images, and other files, so that they can be accessed without an internet connection.

Why might you need a site copier?

These apps serve a variety of use cases, including education, productivity, web archiving, and more.

Use cases:

  1. Offline Browsing: Website Copier and Website Cloner apps enable users to browse websites offline, without relying on an internet connection. This is particularly beneficial in situations where internet access is limited or unreliable.
  2. Web Archiving: These apps are commonly used for web archiving purposes, allowing users to create preserved copies of websites for historical or research purposes. Web archivists can use these tools to capture and store web content that may change or disappear over time.
  3. Education and Research: Website Copier and Website Cloner apps can be valuable for students, researchers, and educators who need to access online resources offline. By downloading websites, they can have uninterrupted access to relevant information, articles, papers, and other educational materials.
  4. Productivity: These apps can enhance productivity by enabling users to access websites and their content offline. This can be beneficial for professionals who frequently reference online documentation, tutorials, or other resources while working.

Audience

The audience for Website Copier and Website Cloner apps is diverse and can include:

  • Students and researchers who require offline access to web resources for academic purposes.
  • Web archivists who aim to preserve websites for future reference or historical documentation, often using tools such as ArchiveBox.
  • Professionals in various fields who rely on web content for their work and need offline access.
  • Individuals with limited internet connectivity, such as those in remote areas or during travel.
  • Anyone who wants to create a personal offline archive of their favorite websites or online content.

Overall, Website Copier and Website Cloner apps provide a convenient solution for downloading websites and accessing their content offline, catering to the needs of different users in education, productivity, web archiving, and more.


In this list, we offer the best open-source website copier apps that you can download and use completely for free. Note that they vary in features, and some require technical skill to install and run.

1- HTTrack

HTTrack is a free and open-source website copying utility that allows you to download a website from the Internet to a local directory.

It creates a replica of the website's directory structure on your computer, recursively fetching HTML, images, and other files from the server while preserving the site's relative link structure. Open any page of the mirrored site in your browser and you can follow it from link to link as if you were viewing it online, which makes it useful for offline browsing, website archiving, and backups.

HTTrack can also update an existing mirror and resume interrupted downloads. It is released under the GPL and ships as WinHTTrack for Windows and WebHTTrack for Linux/Unix/BSD.
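
A typical command-line invocation looks like this; the URL, output directory, and filter are placeholders:

    httrack "https://example.com/" -O ./example-mirror "+*.example.com/*" -v

Here -O sets the output directory, the +filter keeps the crawl within the site's own domain, and -v prints verbose progress.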

2- Getleft

Getleft is a free and open-source website download tool similar to HTTrack. It allows you to download complete websites or parts of websites to your local computer, enabling offline browsing and archiving.

Getleft is available for multiple platforms and provides features such as resuming interrupted downloads and filtering files based on size or type.


3- GoClone

GoClone is an open-source command-line tool written in Go that clones websites. It provides functionality similar to HTTrack and Getleft, letting you download entire websites or specific parts of them for offline browsing or archiving.

GoClone offers features such as recursive downloading, filtering options, and customizable configuration settings.

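Usage is minimal: you pass the target URL and goclone mirrors the site into a local folder. This assumes the goclone binary is installed and on your PATH:

    goclone https://example.com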

4- HTTraQt

HTTraQt is a software clone of WinHTTrack for downloading internet sites and their content. It offers features such as easy language switching, adding language files without changing the program code, selecting browsers and user agents, and extended file-extension selection.

It was developed for Linux/Unix/BSD but can be built for Windows and Mac OS X. The software has a multilingual user interface and is compatible with Qt4 and Qt5.

The app is released under the GNU General Public License version 3.0 (GPLv3).

5- Website Scraper

The open-source website-scraper is a Node.js module that downloads a website to a local directory, including all CSS, images, and JS files. However, it does not execute JavaScript, so dynamic websites may not be saved correctly.

For downloading dynamic websites, consider using website-scraper-puppeteer.

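A minimal sketch of the module's API, assuming the website-scraper package is installed and the code runs in an ES module context; the URL and output directory are placeholders:

    import scrape from 'website-scraper';

    // Download the page plus its CSS, JS, and image assets into ./example-copy
    await scrape({
      urls: ['https://example.com'],
      directory: './example-copy', // the directory must not already exist
    });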

6- Web Book Downloader

The Web Book Downloader application allows users to download chapters from a website in three ways: from the table of contents, by specifying a range of chapters, or by crawling from the first chapter.

The program supports customizing language and input settings, and it can generate PDF and EPUB files. The application features a Java Swing interface and can crawl through HTML links.


7- Monolith

Monolith is a CLI tool that allows you to save complete web pages as a single HTML file. It embeds CSS, image, and JavaScript assets, producing a single HTML5 document that can be stored and shared.

It also includes features like excluding audio sources, saving with custom charset, extracting contents of NOSCRIPT elements, ignoring network errors, and adjusting network request timeout.

You can run it directly using Docker, either locally or on a remote server.

Features

  • Every release contains pre-built binaries for Windows and GNU/Linux
  • Exclude audio sources
  • Save document using custom charset
  • Extract contents of NOSCRIPT elements
  • Ignore network errors
  • Adjust network request timeout
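
A basic invocation looks like this, assuming the monolith binary is installed; the URL and output filename are placeholders:

    # save the page and all of its assets as one self-contained HTML file
    monolith https://example.com -o example.html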

8- Complete Website Downloader

This is a web-based website downloader that combines wget and archiver to download all of a website's assets, compress them, and send the resulting archive back to the user over a socket channel.

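The heavy lifting is done by wget; a roughly equivalent stand-alone command would be the following, although the exact flags the tool passes may differ:

    wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://example.com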

9- Website-cloner

This basic website cloner is a Python script that downloads all the files of a website and saves them in a folder. It scrapes the links on each page, saves their targets locally, and rewrites the links to point to the local paths.


10- Website Downloader

Website Downloader is a tool that crawls through web pages, examining and saving every link as a local file. It only follows links within the same hostname and can be run with the .NET Core SDK or as a compiled binary.

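If you run it from source, the standard .NET workflow applies; the argument form below is hypothetical, so check the repository's README for the exact invocation:

    # run from inside the cloned repository (hypothetical argument form)
    dotnet run -- https://example.com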

11- PyWebCopy

PyWebCopy is a free Python tool for copying websites onto your hard disk for offline viewing. It scans the specified website and downloads its content, automatically remapping links to resources such as style sheets, images, and other pages to match the local paths. It can crawl an entire website and download all linked resources to create a replica of the source site, and its extensive configuration lets you define which parts of a website are copied and how.
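
A minimal sketch using the library's save_website helper; the keyword arguments reflect recent PyWebCopy releases and may differ in yours:

    from pywebcopy import save_website

    # crawl the whole site and mirror it under ./copies/example
    save_website(
        url="https://example.com",
        project_folder="./copies",
        project_name="example",
        bypass_robots=True,
    )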


12- Crystal Web Archiver

Crystal is a web archiving tool that downloads websites for long-term preservation. It is most effective for static websites like blogs and wikis, but can also handle dynamic sites with infinite scrolling feeds, such as social media platforms.


13- Website Downloader

This is a simple Node.js script called "Website Downloader" that can be used to download a website's source code for offline browsing.


14- Website Cloner (Desktop)

The Website Cloner is a desktop app that lets users download website files, including HTML, CSS, JS, and images, onto their computer. It is packaged for Linux distributions such as Ubuntu, Debian, and Linux Mint.

15- Website-Copier PHP

Website-Copier is a PHP script that can be run on a terminal to download an entire website using link chaining. The script is available on GitHub.

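Being a plain PHP script, it runs through the PHP CLI; the script filename and argument form below are hypothetical, so see the repository for the actual entry point:

    # hypothetical filename and argument form
    php copier.php https://example.com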

16- Web to PDF Converter

Web to PDF Converter is a script that lets you create beautiful PDFs using your favorite JavaScript and CSS frameworks.

It fully supports JavaScript, offers a content-replacement system for dynamic content, and can insert page numbers dynamically.

Features

  • 💥 JS is fully supported, meaning you can use your favorite frameworks to generate your PDF.
  • 🔄 Comes with a powerful content replacement system that allows for dynamic content.
  • 🔢 Insert page numbers in your pages dynamically.
  • 💃 Full SCSS support
  • 👸 Support for headers and footers
  • 🔗 Support for reusable HTML chunks
  • 🎥 Real time mode with hot reloading, meaning you can build your PDF in real time
  • 🌏 Support for rendering remote pages (You can even inject your own css and js!)
  • 🚦 Queueing system so you can render thousands of PDFs with a single script

17- SingleFile (Browser Extension)

SingleFile is a web extension, also available as a CLI tool, for saving a faithful copy of a complete web page in a single HTML file. Its fork, SingleFileZ, saves pages as self-extracting HTML files that are valid ZIP archives containing all of the page's resources. Both are compatible with Firefox, Chrome, and Microsoft Edge.

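The CLI version, distributed as single-file-cli, can be scripted; a basic invocation saves one page to one file (URL and filename are placeholders):

    single-file https://example.com example.html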

18- WebimgDL

WebimgDL is a Python tool that downloads all images from a website. It supports automatic downloading and lets you filter images by size, width, and height.


19- Sponge

Sponge is a command-line tool for crawling websites and downloading the links they contain.

It lets you explore and analyze websites, extract data, and download linked files for offline browsing, which makes it useful for web developers and researchers.

