How To Download Files On Debian Using Curl And Wget On The Command Line

Working on the Linux command line gives you more control and flexibility than the GUI. The command line has many uses and is widely employed in server administration. You can automate tasks with it, and it also consumes fewer resources than the GUI. Downloading files from the command line is easier and faster as well, since it only requires a single command, whereas the GUI usually involves a longer sequence of steps.

In this article, we will explain how to download files from the Linux command line using two different utilities. Both are free utilities for non-interactive downloading of files from the web, and both can keep working in the background even when you are not logged in.

We will use Debian 10 to describe the process mentioned in this article.

Method #1 Download files with Curl

Curl is a command line utility used to transfer files to and from a server. We can use it to download files from the web. It is designed in such a way that you can run it without user interaction. It supports various protocols including HTTP, HTTPS, TELNET, SCP, FTP, etc. It is not installed by default in the Debian operating system. Therefore, we must install it first. To do so, follow these steps:

Install Curl

Launch the Terminal application in Debian. For that, go to Activities in the top left corner of the screen. Then in the search bar, type terminal. When the Terminal icon appears, click on it to launch it.

In Terminal, type the following command to switch to the superuser account.

$ su

When prompted for a password, enter the superuser password.

Then run the following command in Terminal to install the Curl utility.

$ apt install curl

Once the installation is complete, we can use Curl to download the file.
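
Once installed, you can confirm that Curl is available by printing its version:

$ curl --version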

General syntax of Curl

To download a file using Curl, use the following syntax in Terminal:

$ curl [options] [URL]

With [options], you can specify different functions, such as saving the download under a specific name, resuming an interrupted download, limiting the transfer rate, and more.

With [URL], you specify the address of the file on the remote server.
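
For instance, several options can be combined in one command. The example below is only an illustration (the URL and output filename are placeholders): it saves the download as archive.zip while limiting the transfer rate to 1 MB per second.

$ curl --limit-rate 1M -o archive.zip https://example.com/archive.zip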

Download and save the file using the source filename

To download and save a file with the same name as the source filename, use the following syntax:

$ curl -O [URL]

An example of this would be:

$ curl -O https://gemmei.ftp.acc.umu.se/debian-cd/current/amd64/iso-dvd/debian-10.0.0-amd64-DVD-1.iso

It will save the downloaded file as debian-10.0.0-amd64-DVD-1.iso.

Download and save the file with the source filename using curl

Alternatively, you can use --remote-name instead of -O; both save the file under its remote filename.
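
For example, the long form of the earlier command looks like this and behaves exactly like -O:

$ curl --remote-name https://gemmei.ftp.acc.umu.se/debian-cd/current/amd64/iso-dvd/debian-10.0.0-amd64-DVD-1.iso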

Download and save the file with a different name

To download and save a file with a name other than the source file name, use the following syntax:

$ curl [URL] -o [filename]

In [filename], specify a new name for the downloaded file.

An example of this would be:

$ curl https://gemmei.ftp.acc.umu.se/debian-cd/current/amd64/iso-dvd/debian-10.0.0-amd64-DVD-1.iso -o debian.iso

It will save the downloaded file as debian.iso.

Download and save the file with a different name using curl

Download multiple files simultaneously

Instead of downloading files one by one, you can download them all simultaneously by running a single command. To download multiple files at once, use -O followed by the URL of each file you want to download.

Use the following syntax for this purpose:

$ curl -O [URL1] -O [URL2]

An example of this would be:

$ curl -O https://www.debian.org/doc/manuals/debian-reference/debian-reference.en.pdf -O https://gemmei.ftp.acc.umu.se/debian-cd/current/amd64/iso-dvd/debian-10.0.0-amd64-DVD-1.iso

The above command will download both files.

Download multiple files simultaneously with curl

There is another way to do this. Specify a list of URLs in a file, then use the Curl command along with xargs according to the following syntax:

$ xargs -n 1 curl -O < [filename]

An example of this would be:

$ xargs -n 1 curl -O < files.txt

Our files.txt contains two URLs:

Download all urls from a text file
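
Such a list file simply contains one URL per line. For illustration, files.txt might hold the two URLs used in the previous example:

https://www.debian.org/doc/manuals/debian-reference/debian-reference.en.pdf
https://gemmei.ftp.acc.umu.se/debian-cd/current/amd64/iso-dvd/debian-10.0.0-amd64-DVD-1.iso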

The above Curl command will download all the URLs specified in files.txt.

File download results

Download files from FTP Server

We can also download files from an FTP server using the Curl utility. To do so, run the command in Terminal using the following syntax:

$ curl -u ftp_user:ftp_pass -O ftp://ftp_url/file_name.zip

ftp_user and ftp_pass are used to specify the FTP credentials. However, you can omit them in the case of anonymous FTP connections.
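
For example, an anonymous download could look like the following sketch, where the server address and file name are placeholders:

$ curl -O ftp://ftp.example.com/pub/file_name.zip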

Pause and resume downloads

You can also resume downloads that have been paused manually or interrupted for some other reason. To pause a download manually, press Ctrl+C.

To resume a paused download, navigate to the folder where you previously downloaded the file, then use the following syntax to resume.

$ curl -C - [options] [URL]

An example of this would be:

To resume the paused download of debian-10.0.0-amd64-DVD-1.iso, we used this command:

$ curl -C - -O https://gemmei.ftp.acc.umu.se/debian-cd/current/amd64/iso-dvd/debian-10.0.0-amd64-DVD-1.iso

From the following result, you can see that it has continued to download.

Resume a paused download with curl

Method #2 Download files with Wget

Similar to Curl, there is another command line utility, Wget, that can be used to download files and content from the web. The name Wget is derived from the World Wide Web and the word get. It supports protocols such as FTP, SFTP, HTTP, and HTTPS. In addition, it supports recursive downloads, which is useful if you want to download an entire website for offline viewing or to create a backup of a static website.

Install Wget

If wget is not installed on your system, you can install it by following these steps:

Launch the Terminal application in the same way as discussed earlier in this article. In Terminal, enter the following command to switch to the superuser account.

$ su

When prompted for a password, enter the superuser password.

Then run the following command in Terminal to install the Wget utility.

$ apt-get install wget

Install wget on Debian 10
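
Once installed, you can confirm that Wget is available by printing its version:

$ wget --version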

General syntax of Wget

To download a file using Wget, use the following syntax:

$ wget [URL]

Download and save the file using the source filename

Using the above syntax without any additional options will save the file with the same name as the source file. An example of this would be downloading the debian-10.0.0-amd64-DVD-1.iso file:

$ wget https://gemmei.ftp.acc.umu.se/debian-cd/current/amd64/iso-dvd/debian-10.0.0-amd64-DVD-1.iso

It will save the download as debian-10.0.0-amd64-DVD-1.iso.

Download and save the file with the source filename using wget

Download and save the file with a different name

To download and save a file with a name other than the source file name, use the -O option according to the following syntax:

$ wget -O [filename] [URL]

An example of this would be:

$ wget -O debian10 https://gemmei.ftp.acc.umu.se/debian-cd/current/amd64/iso-dvd/debian-10.0.0-amd64-DVD-1.iso

It will save the download as debian10.

Download and save the file with a different name using wget

Download files via FTP

To download files from an FTP server that requires authentication, use the syntax below:

$ wget --ftp-user=[ftp_user] --ftp-password=[ftp_pass] [ftp_URL]

ftp_user and ftp_pass are used to specify the FTP credentials. However, you can omit them in the case of anonymous FTP connections.
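
As an illustration, with placeholder credentials, server address, and file name, an authenticated download could look like this:

$ wget --ftp-user=myuser --ftp-password=mypassword ftp://ftp.example.com/backup.zip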

Download multiple files

To download multiple files using Wget, create a text file with a list of the file’s URLs, then use the syntax below to download all the files at once.

$ wget -i [filename.txt]

For example, we created a text file files.txt containing two URLs, as shown in the image below.

File contains multiple urls

Then we ran the following command:

$ wget -i files.txt

Download all files from url file

Running the above command will automatically download both URLs contained in files.txt.

Pause and resume downloads

To resume a paused download, navigate to the folder where you previously downloaded the file, then use the following syntax to resume.

$ wget -c [URL]

An example of this would be resuming the download of debian-10.0.0-amd64-DVD-1.iso by running the following command.

$ wget -c https://gemmei.ftp.acc.umu.se/debian-cd/current/amd64/iso-dvd/debian-10.0.0-amd64-DVD-1.iso

wget: Pause and Resume Downloads

Download files recursively

Wget supports recursive downloads, a key feature that distinguishes it from Curl. The recursive download feature allows you to download everything under a specified URL.

To recursively download a web page or FTP site, use the following syntax:

$ wget -r [URL]

An example of this would be the following full web page download.

$ wget -r https://helpingbox.net/debian

Recursively download files using wget
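
For offline viewing, the recursive download is often combined with a few additional options. The command below is only a sketch using the same URL as above: -np prevents Wget from ascending to parent directories, -k converts links so the pages can be browsed locally, and -p also fetches page requisites such as images and stylesheets.

$ wget -r -np -k -p https://helpingbox.net/debian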

So, in this article, we have looked at two different non-interactive command line utilities that allow you to download files directly from the command line. Both utilities come in handy and serve a similar purpose. I hope this will be useful whenever you need to download files from the internet.
