<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://freemwiki.com/index.php?action=history&amp;feed=atom&amp;title=How_to_Archive_Websites_on_Unix_Like_Systems</id>
	<title>How to Archive Websites on Unix Like Systems - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://freemwiki.com/index.php?action=history&amp;feed=atom&amp;title=How_to_Archive_Websites_on_Unix_Like_Systems"/>
	<link rel="alternate" type="text/html" href="https://freemwiki.com/index.php?title=How_to_Archive_Websites_on_Unix_Like_Systems&amp;action=history"/>
	<updated>2026-05-16T09:01:53Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.3</generator>
	<entry>
		<id>https://freemwiki.com/index.php?title=How_to_Archive_Websites_on_Unix_Like_Systems&amp;diff=8585&amp;oldid=prev</id>
		<title>Lukegao1: Created page with "Archiving websites on Unix-like systems can be accomplished using a few different tools and methods. Here are some steps you can follow to archive websites on Unix-like systems:  1. Install wget: wget is a command-line utility for retrieving files from the web using HTTP, HTTPS, and FTP protocols. Most Unix-like systems come with wget pre-installed, but if it&#039;s not installed on your system, you can install it using your system&#039;s package manager. For example, on…"</title>
		<link rel="alternate" type="text/html" href="https://freemwiki.com/index.php?title=How_to_Archive_Websites_on_Unix_Like_Systems&amp;diff=8585&amp;oldid=prev"/>
		<updated>2023-03-21T16:02:23Z</updated>

		<summary type="html">&lt;p&gt;Created page with "Archiving websites on Unix-like systems can be accomplished using a few different tools and methods. Here are some steps you can follow to archive websites on Unix-like systems:  1. Install wget: wget is a command-line utility for retrieving files from the web using HTTP, HTTPS, and FTP protocols. Most Unix-like systems come with wget pre-installed, but if it&amp;#039;s not installed on your system, you can install it using your system&amp;#039;s package manager. For example, on…"&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;Archiving websites on Unix-like systems can be accomplished using a few different tools and methods. Here are some steps you can follow to archive websites on Unix-like systems:&lt;br /&gt;
&lt;br /&gt;
1. Install wget: wget is a command-line utility for retrieving files from the web using HTTP, HTTPS, and FTP protocols. Most Unix-like systems come with wget pre-installed, but if it&amp;#039;s not installed on your system, you can install it using your system&amp;#039;s package manager. For example, on Debian-based systems like Ubuntu, you can run the following command to install wget:&lt;br /&gt;
&lt;br /&gt;
   ```&lt;br /&gt;
   sudo apt-get install wget&lt;br /&gt;
   ```&lt;br /&gt;
&lt;br /&gt;
2. Use wget to download the website: Once wget is installed, you can use it to mirror the site. The following command downloads the entire website and its assets recursively:&lt;br /&gt;
&lt;br /&gt;
   ```&lt;br /&gt;
   wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains website.com --no-parent https://website.com/&lt;br /&gt;
   ```&lt;br /&gt;
&lt;br /&gt;
   Here&amp;#039;s what each option in the command does:&lt;br /&gt;
&lt;br /&gt;
   * `--recursive`: download the website recursively.&lt;br /&gt;
   * `--no-clobber`: don&amp;#039;t re-download files that already exist (handy when restarting an interrupted mirror; note that recent wget versions ignore this option when `--convert-links` is also given).&lt;br /&gt;
   * `--page-requisites`: download all the necessary files to display the page, such as images and CSS.&lt;br /&gt;
   * `--html-extension`: save pages with a `.html` extension even when the server serves them under another name, such as `.php` or `.asp` (newer wget versions call this option `--adjust-extension`).&lt;br /&gt;
   * `--convert-links`: convert links to be relative to the downloaded files.&lt;br /&gt;
   * `--restrict-file-names=windows`: restrict the file names to Windows-compatible names.&lt;br /&gt;
   * `--domains website.com`: only follow links from this domain.&lt;br /&gt;
   * `--no-parent`: don&amp;#039;t download files from the parent directory.&lt;br /&gt;
&lt;br /&gt;
   You can adjust these options to suit your needs.&lt;br /&gt;
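&lt;br /&gt;
   For example, a lightly adjusted variant that throttles the crawl so it is gentler on the server (the `--wait` and `--limit-rate` values here are illustrative, not recommendations):&lt;br /&gt;
&lt;br /&gt;
   ```
   wget --recursive --page-requisites --html-extension --convert-links --wait=1 --limit-rate=200k --domains website.com --no-parent https://website.com/
   ```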
&lt;br /&gt;
3. Compress the archive: Once you have downloaded the website, you can compress it to save space. You can use the `tar` command to create a compressed archive:&lt;br /&gt;
&lt;br /&gt;
   ```&lt;br /&gt;
   tar -czvf website.tar.gz website.com/&lt;br /&gt;
   ```&lt;br /&gt;
&lt;br /&gt;
   This command creates a compressed archive called `website.tar.gz` of the downloaded website.&lt;br /&gt;
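&lt;br /&gt;
   To restore the archive later, extract it with `tar -x`. The sketch below round-trips a small stand-in directory (the `website.com` directory here is a placeholder created for the example, not a real download):&lt;br /&gt;
&lt;br /&gt;
   ```shell
   # stand-in for the downloaded site (placeholder content, not a real mirror)
   mkdir -p website.com
   echo 'placeholder page' > website.com/index.html

   # pack it, then unpack into a separate directory to verify the archive
   tar -czf website.tar.gz website.com/
   mkdir -p restored
   tar -xzf website.tar.gz -C restored
   ls restored/website.com
   ```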
&lt;br /&gt;
4. Store the archive: Finally, you can store the archive in a safe place, such as an external hard drive or cloud storage.&lt;br /&gt;
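&lt;br /&gt;
   Before copying the archive off the machine (for example with `scp` or `rsync`), it&amp;#039;s worth recording a checksum so the copy can be verified at the destination. A sketch using `sha256sum` (from GNU coreutils; on BSD/macOS, `shasum -a 256` works instead). The placeholder file stands in for the real archive so the example is self-contained:&lt;br /&gt;
&lt;br /&gt;
   ```shell
   # stand-in archive so the example is self-contained
   printf 'archive bytes' > website.tar.gz

   # record a checksum, then verify it (run the same check again after copying)
   sha256sum website.tar.gz > website.tar.gz.sha256
   sha256sum -c website.tar.gz.sha256
   ```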
&lt;br /&gt;
That&amp;#039;s it! With these steps, you can archive a website on a Unix-like system using wget and tar.&lt;/div&gt;</summary>
		<author><name>Lukegao1</name></author>
	</entry>
</feed>