How to download an entire website: an overview of methods and tools


Nowadays, when the Internet is available almost anywhere, there is still sometimes a need to download an entire site. Why? The reasons vary: a wish to save important information for the future, the need to reach that data when there is no network access, or simple curiosity about how the pages are laid out. There may be other reasons as well. What matters is knowing how to do it, and below we look at several ways to make a copy of a site for later offline use.

Online services

As you can easily guess from the name, these are special sites that let you download other sites from the Internet. The advantage of this approach is that nothing needs to be installed, so it does not matter which operating system you run. It all sounds great: you paste in the site's address, click download, and receive a ready-made archive. In reality, things turn out far less rosy.

The disadvantage of such services is that there are few of them, and those that exist either work poorly or charge money. Personally, I have never managed to download even a one-page website with them: the services showed a loading indicator and then froze. Nevertheless, here is the list:

  • WebSiteDownloader - supposedly lets you download a site as an archive; English-language.
  • R-Tools - a paid service with a tariff schedule. At the time of writing it worked poorly: https was not supported and the downloaded site opened crookedly. The project is still developing, and there is a demo limited to 25 pages.

Overall, I got the impression that these services are either buggy, do not download exactly what you want, or are only suitable for small websites.

Download the entire site - why is it necessary?

Often there is a need to download the entire site so that you can view it offline. For example, if a site has a lot of text information, it is easier to download it in its entirety than to take a screenshot of each individual page. This can be useful if you need to download documentation containing 500-1000 web pages.

The need to download an entire site also arises for programmers, particularly those working in frontend development. With a local copy, a specialist does not have to visit the site every time to see exactly how its pages are laid out; the site can be downloaded once and browsed offline.

WinHTTrack WebSite Copier program

Everything decent tends to be paid, but there are options. A free solution is WinHTTrack WebSite Copier. Although it is not a domestic product, a Russian interface is available; you need to select it after installing the program.

You can watch the video or keep reading the article:

Using the program is quite easy; a “wizard” will help us with this. After starting the program, a tree of folders and files is displayed on the left, and a wizard invitation on the right. The tree is only needed to open a previously saved project. Click “Next” to start a new project or resume an interrupted download:

Enter any name and path to save the site. It is better to change the default path “C:\My Web Sites” to “C:\Downloads”:

In the third step you need to enter the domain. For example, I will download my own website, it-like.ru. The project type can be set to “Download site(s)” for a new project, or “Continue interrupted download” if you need to resume downloading the site. The “Update existing download” option is useful for those who already have a copy of the site but want to update it to the latest version.

Here you can set additional parameters that can reduce download time. By default nothing needs to be changed and the program will work fine, but I still recommend paying attention to a few parameters:

  • Filters. You can limit the types of files to be downloaded, for example prohibit or allow downloading of videos or archives (ZIP, RAR, CAB). By default, png, jpg and gif images, css style files and js scripts are included.
  • Restrictions. Set the maximum scanning depth. Depth is the number of clicks on links away from the start page. For example, if the home page is the starting point and the depth is 2, the program downloads the main page plus the pages it links to, and nothing further. With a depth of 3, pages one more level down are fetched as well. A depth of 3-4 is usually optimal (the sketch after this list shows how depth works).
  • Links. Check the box “Get HTML files first!”; this downloads the main text content of the site first, and only then pictures and other files.
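To make the idea of depth more concrete, here is a minimal sketch in Python. It is not how WinHTTrack works internally, just an illustration using only the standard library; the function and variable names are made up for the example. Starting from one page, it follows links level by level and stops after the chosen number of clicks.

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_depth=2):
    """Return the set of same-site URLs within max_depth clicks of start_url.

    Depth 1 is the start page alone; depth 2 adds the pages it links to, etc.
    """
    seen = {start_url}
    frontier = [start_url]
    site = urlparse(start_url).netloc
    for _level in range(1, max_depth):
        next_frontier = []
        for url in frontier:
            try:
                html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except OSError:
                continue  # skip pages that fail to load
            parser = LinkCollector()
            parser.feed(html)
            for link in parser.links:
                absolute = urljoin(url, link)
                # stay on the same site and avoid revisiting pages
                if urlparse(absolute).netloc == site and absolute not in seen:
                    seen.add(absolute)
                    next_frontier.append(absolute)
        frontier = next_frontier
    return seen

# Example: everything within two clicks of the home page
# print(crawl("https://example.com/", max_depth=2))
```

A real downloader would also save each page and its images to disk and rewrite the links, but the depth logic is the same.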

Moving on: in the fourth step you can allow the program to establish the Internet connection automatically and disconnect when the download is complete. You can also have the computer shut down automatically when the job is done (check “Shutdown PC when finished”).

Click “Done” and go get a cup of coffee while WinHTTrack WebSite Copier works for us.

The finished website can be opened in two ways.

  • from the folder where the project was saved by clicking on the index.html file. In this case, a page will open with a list of all projects. You need to choose the one you are interested in.
  • from the folder with the name of the project itself by clicking on the index.html file. In this case, the site will open immediately.

Manually saving pages

This is the simplest and most familiar way to save site pages to your computer. Many people have heard of it but never had a reason to use it.

To do this, simply press “Ctrl” + “S”. In the window that opens, change the name of the page being saved if needed and choose the folder where it should be stored.

Sounds as easy as can be, right? But this approach has a significant drawback: it saves only one page at a time, while a site usually has many.

The method is fine if the site consists of a single page, but what if there are more? Then you have to save every page separately, which takes a lot of time. A small script can take over that routine for a known list of pages; see the sketch at the end of this section.

In short, this option is for those who are not yet familiar with more serious ways of downloading a site.
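If you do know the handful of addresses you need, automating the per-page saving is straightforward. Below is a minimal sketch using only the Python standard library; the URLs and folder name are placeholders, and it saves only the HTML itself, not the images and styles a browser's Ctrl+S would also store.

```python
import pathlib
import urllib.request
from urllib.parse import urlparse

# Placeholder list of pages to save
pages = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/contacts",
]

out_dir = pathlib.Path("saved_pages")
out_dir.mkdir(exist_ok=True)

for url in pages:
    # Build a file name from the URL path ("/" becomes "index")
    name = urlparse(url).path.strip("/").replace("/", "_") or "index"
    target = out_dir / f"{name}.html"
    with urllib.request.urlopen(url, timeout=10) as response:
        target.write_bytes(response.read())
    print(f"saved {url} -> {target}")
```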

Offline Explorer

A paid, feature-packed program; a demo version is available. Its obvious differences from WinHTTrack:

  • built-in Web browser and Web server for local preview;
  • recognizes and loads links from HTML files, Java and VB scripts, Java classes, Macromedia Flash (SWF), CSS, XML/XSL/DTD, TOC, PDF, M3U, AAM, RealMedia (SMIL, RAM, RPM) and MS NetShow Channel (.NSC) files;
  • search for loaded sites;
  • export to various formats (including for burning websites to CD);
  • removing scripts from web.archive.org pages.

The interface may seem more complicated, but there is nothing difficult about it. Launch the wizard:

Choose one of 12 templates:

For example, the “Download only one page with images and scripts” template is designed to download just one page. If you don’t know which to choose, go with “Default Template”. Next, enter the desired site address, any project name and the path to a folder on the disk:

Click “Next” and exit the wizard. The download process should begin; if nothing happens, click the “Continue” button.

When the download is complete, a notification will be sent to your desktop.

Among the paid options you can also try the once-legendary Teleport Pro (in English), but in my opinion they ask far more for it than it is worth, and it is already outdated.

Table: general description

Name | Description | License / Price
Site2Zip | Simple, clear Russian-language interface; a site downloads in two clicks, no registration required. | Free
Webparse | A handy tool for creating an offline version of a site; convenient and easy to use. Registration required; there is a mobile version of the site. | Free (1 download)
Web2PDFConvert | Converts individual pages to PDF, JPG or PNG. Very convenient when you need to save a recipe or a manual. | Free
WinHTTrack | Makes copies of sites for offline viewing. A Russian version of the program is available. | Free
Cyotek WebCopy | A powerful tool for downloading websites to your computer. | Free
Teleport Pro | Another capable tool for downloading websites, in full or by section, to a PC or external drive. | Free (40 launches), then $50
Offline Explorer | A good program that allows up to 500 sites to be downloaded simultaneously; very broad functionality with automated processes. | Free (30 days), then 1,500 rub.
Webcopier | A browser for downloading and viewing websites offline. | Free (15 days), then $30

What are the limitations of copies?

Let me make one thing clear: even if a copied project looks exactly like the original, that does not mean every function will work. Anything that runs on the server will not work: calculators, surveys, parameter-based selection - 99% of such features will fail. Functionality implemented purely in JavaScript, however, will keep working.

Downloading the .php scripts themselves from the server is simply impossible: the server executes them and sends the browser only the resulting HTML. Feedback and application forms will likewise not work without manual modification (DollySites, for example, handles this). Note also that some sites have download protection; in that case you will get a blank page or an error message.
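You can check this yourself: requesting a .php address returns only the HTML the server has already rendered, never the PHP source. A small illustration in Python (the URL is a placeholder for some real server-side page):

```python
import urllib.request

url = "https://example.com/feedback.php"  # placeholder server-side page
with urllib.request.urlopen(url, timeout=10) as response:
    body = response.read().decode("utf-8", "replace")
    print(response.headers.get("Content-Type"))  # typically "text/html; charset=..."

print("<?php" in body)  # False: the PHP source never leaves the server
```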

Bottom line

In 2022, practically any user, even one who knows nothing about programming or code, can copy a website: there is a wide choice of services for this. They handle one-page landing pages well and preserve their functionality, but they do not always cope with complex sites.

If you need a high-quality website or a copy of one, we recommend turning to specialists: this significantly increases the likelihood that all elements will work correctly, the site will do its job, and the traffic will earn you money.

To avoid surprises, be sure to check that the site works on all devices before putting it to use!

Continuation of the table comparing programs for creating offline copies of websites.

Program | Supported technologies | Custom filtering | Operating system | Version | Year of release | Price
A1 Website Download | - | + | Windows, Mac OS X | - | - | from $39
BackStreet Browser | - | - | Windows | 3.2 | 2011 | $19
Cyotek WebCopy | many visual settings and modes | + | Windows | 1.1.1.4 | 2016 | free
Darcy Ripper | - | - | cross-platform | - | - | free
GetLeft | - | - | Windows (with Tcl/Tk), Linux, Mac OS X | 2.5 | - | free
GNU Wget | - | - | Linux | - | - | free
HTTrack | - | - | cross-platform | - | - | free
Local Website Archive | - | - | Windows | - | - | 29.95 euros
Offline Downloader | - | - | Windows | 4.2 | - | $29.95
Offline Explorer | all | + | Windows | - | - | from $60
QuadSucker/Web | - | - | Windows | 3.5 | 2007 | free
SurfOffline | CSS, Flash, HTTPS, JS | + | Windows | - | - | $29.95
Teleport Pro | HTML5, CSS3, DHTML | + | Windows | 1.72 | 2015 | $49.95
Visual Web Ripper | AJAX | + | Windows | 3.0.16 | 2016 | $349
Web Content Extractor | - | + | Windows | 8.3 | 2016 | $49
Web2Disk | - | - | Windows | - | - | $39.95
WebTransporter | http | + | Windows | - | - | unavailable
WebZIP | - | - | Windows | 7.1.2.1052 | 2008 | $39.95
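Of the free tools in the table, GNU Wget is the only purely command-line option. As a rough illustration, here is its classic set of mirroring flags called from Python; this assumes the wget binary is installed and on the PATH, and the URL is a placeholder.

```python
import subprocess

subprocess.run(
    [
        "wget",
        "--mirror",            # recurse through the site and keep timestamps
        "--convert-links",     # rewrite links so the copy works offline
        "--adjust-extension",  # add .html to pages that need it
        "--page-requisites",   # also fetch the images, CSS and JS each page uses
        "--no-parent",         # never climb above the starting directory
        "https://example.com/",
    ],
    check=True,
)
```

The same flags work directly in a terminal; wrapping them in a script is only convenient when the download needs to be scheduled or repeated.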

GetLeft

This open source grabber has been around for a long time, and for good reason. GetLeft is a small utility that allows you to download various website components, including HTML and images.

GetLeft is very user-friendly, which explains its longevity. To get started, simply launch the program and enter the site's URL; GetLeft then analyzes the website automatically and gives you a breakdown of its pages, listing subpages and links. You can then manually select which parts of the site to download by ticking the appropriate boxes.

Once you have chosen which parts of the site to download, click the button and GetLeft will download the site into the folder you select. Unfortunately, GetLeft has not been updated for some time.


SiteSucker

If you're firmly in the Apple ecosystem and only have access to Macs, you should try SiteSucker. The aptly named program copies all of a website's files to your hard drive. Users can start the process in just a few clicks, which makes it one of the easiest tools to use. SiteSucker also copies and saves site content quite quickly, although the actual download speed will vary from user to user.

Unfortunately, SiteSucker is not without drawbacks. First of all, it is a paid application: at the time of writing, SiteSucker costs $4.99 on the App Store. It also downloads every file on the site it can find, which means a large download with many potentially useless files.

How to completely copy a website page or a website with source code?

The answer is simple: COPYRON.RU. We give you several ways to place an order for a site copy.

  • The fastest method: online 24/7, with prompt feedback; you can always ask a question.
  • The most reliable: write to us by email; we are always in touch and your letter will not get lost.
  • After placing an order through the website, you can track its progress using a tracking number.

Manual for copying a website

You can track your order using the “order number” assigned automatically when you place it. If the site is small, you will receive a link to the archive within 30 minutes to an hour. If the site is large, we will let you know and tell you roughly how long to expect. In any case, stay in touch.

The first method: do it yourself

The most traditional option is to do everything yourself. You won't need any third-party tools, just your hands and a browser. First, open the site you are interested in; I'll take my own blog as an example. Go to the main page, right-click anywhere on it, and in the menu that opens select “Save page as...”:

The saving process takes a few seconds. As a result, I get the main page's HTML file and a folder with all of its constituent elements: pictures, PHP and JS files, and styles. The HTML file can be opened in Notepad to view the source code.

If you think the saved files can simply be transferred to your own resource, you are very much mistaken. This is a very crude option, really only useful for viewing a page's source code, which you can do in a browser window anyway without saving anything. I do not recommend this method: it is of little practical use, and there is simply no way to transfer the result into WordPress (for example).
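To see what those constituent elements actually are without reading the file by hand, you can list the images, stylesheets and scripts the saved page refers to. A minimal sketch using only the Python standard library; the file name is a placeholder for whatever your browser produced.

```python
from html.parser import HTMLParser

class AssetCollector(HTMLParser):
    """Collects the images, stylesheets and scripts a page references."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and attrs.get("src"):
            self.assets.append(("image", attrs["src"]))
        elif tag == "script" and attrs.get("src"):
            self.assets.append(("script", attrs["src"]))
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.assets.append(("style", attrs["href"]))

# Placeholder file name: whatever "Save page as..." produced
with open("saved_page.html", encoding="utf-8", errors="replace") as f:
    collector = AssetCollector()
    collector.feed(f.read())

for kind, path in collector.assets:
    print(kind, path)
```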

Why would you want to do that?

Contrary to popular belief, not everything on the Internet is there forever. Sites are closed, censored, acquired, redesigned, or simply lost. The idea comes from the data-hoarding community on Reddit, where creating archives for fun is nothing new. While we can't predict or prevent a catastrophic event on our favorite website, we can certainly keep it as it is.
There are many possible uses and reasons to download an entire site, whether the target site is yours or not. As a side note, be careful what you download. You may want to preserve an era of a site with a particular design, or take an informative website with you somewhere without Internet access. Downloading it ensures the site stays with you.
