
Resume Partially Downloaded File Using Wget In Linux

By sk

Did you just end up with a partially downloaded file due to a slow Internet connection? Worry not! This short tutorial explains how to resume a partially downloaded file using the wget command in Linux.


The other day I was testing FreeTube, an open source desktop YouTube player, on my Arch Linux desktop. I tried to download the latest version of FreeTube from GitHub using the wget command.

Due to the poor Internet speed, the download kept terminating every few minutes. Here is the wget command that I used to download the file.

$ wget https://github.com/FreeTubeApp/FreeTube/releases/download/v0.1.3-beta/FreeTube-linux-x64.tar.xz

Sample output:

--2018-03-09 15:41:44-- https://github.com/FreeTubeApp/FreeTube/releases/download/v0.1.3-beta/FreeTube-linux-x64.tar.xz
Loaded CA certificate '/etc/ssl/certs/ca-certificates.crt'
Resolving github.com (github.com)...,, 64:ff9b::c01e:fd70, ...
Connecting to github.com (github.com)||:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://github-production-release-asset-2e65be.s3.amazonaws.com/123220152/39a2c92c-2277-11e8-8ca4-895487b6ddb0?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20180309%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20180309T101016Z&X-Amz-Expires=300&X-Amz-Signature=bf0fcb4643f0719239e3a006c9fbc055b6b9805e6ef03dcfc27334d18fdf994c&X-Amz-SignedHeaders=host&actor_id=0&response-content-disposition=attachment%3B%20filename%3DFreeTube-linux-x64.tar.xz&response-content-type=application%2Foctet-stream [following]
--2018-03-09 15:41:47-- https://github-production-release-asset-2e65be.s3.amazonaws.com/123220152/39a2c92c-2277-11e8-8ca4-895487b6ddb0?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20180309%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20180309T101016Z&X-Amz-Expires=300&X-Amz-Signature=bf0fcb4643f0719239e3a006c9fbc055b6b9805e6ef03dcfc27334d18fdf994c&X-Amz-SignedHeaders=host&actor_id=0&response-content-disposition=attachment%3B%20filename%3DFreeTube-linux-x64.tar.xz&response-content-type=application%2Foctet-stream
Resolving github-production-release-asset-2e65be.s3.amazonaws.com (github-production-release-asset-2e65be.s3.amazonaws.com)..., 64:ff9b::34d8:e1c8
Connecting to github-production-release-asset-2e65be.s3.amazonaws.com (github-production-release-asset-2e65be.s3.amazonaws.com)||:443... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 39318740 (37M), 24626833 (23M) remaining [application/octet-stream]
Saving to: ‘FreeTube-linux-x64.tar.xz’

FreeTube-linux-x64. 48%[+++++++=> ] 18.04M 4.63KB/s in 3m 6s

2018-03-09 15:49:59 (22.2 KB/s) - Read error at byte 18921544/39318740 (Error in the pull function.). Retrying.

--2018-03-09 15:50:00-- (try: 2) https://github-production-release-asset-2e65be.s3.amazonaws.com/123220152/39a2c92c-2277-11e8-8ca4-895487b6ddb0?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20180309%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20180309T101016Z&X-Amz-Expires=300&X-Amz-Signature=bf0fcb4643f0719239e3a006c9fbc055b6b9805e6ef03dcfc27334d18fdf994c&X-Amz-SignedHeaders=host&actor_id=0&response-content-disposition=attachment%3B%20filename%3DFreeTube-linux-x64.tar.xz&response-content-type=application%2Foctet-stream
Connecting to github-production-release-asset-2e65be.s3.amazonaws.com (github-production-release-asset-2e65be.s3.amazonaws.com)||:443... connected.
HTTP request sent, awaiting response... 403 Forbidden
2018-03-09 15:50:04 ERROR 403: Forbidden.

I checked the size of the downloaded file.

$ ls -lh FreeTube-linux-x64.tar.xz 
-rw-r--r-- 1 sk users 19M Mar 9 15:44 FreeTube-linux-x64.tar.xz

Well, the actual size of the FreeTube archive is around 37 MB, but only about 19 MB had been downloaded so far.

I re-ran the wget command hoping it would resume the partially downloaded file, but it didn't. Wget started downloading the file from the beginning, and after a few minutes I got the same error. No matter how many times I tried to download the file, the download kept getting interrupted after a few minutes.

Resume Partially Downloaded File using wget

After a few Google searches and a look through the wget manual page, I discovered that wget has an option to resume partially downloaded files.

To resume a partially downloaded file, change to the directory where the partial file was saved and pass the -c or --continue option to wget, like below.

$ wget -c https://github.com/FreeTubeApp/FreeTube/releases/download/v0.1.3-beta/FreeTube-linux-x64.tar.xz


$ wget --continue https://github.com/FreeTubeApp/FreeTube/releases/download/v0.1.3-beta/FreeTube-linux-x64.tar.xz

Now wget resumed the download from where it left off in the previous attempt. The -c or --continue option continues getting a partially downloaded file. This is quite useful when you want to finish a download started by a previous instance of wget, or by another program.
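On a connection that keeps dropping, -c can be combined with wget's retry options so the whole fail-and-resume cycle happens automatically. Here is a minimal sketch, assuming GNU wget; the specific option values are illustrative, not required:

```shell
# Resume the partial file and keep retrying until the download finishes.
# --tries=0 means retry indefinitely (the default is 20 attempts),
# --waitretry=10 caps the backoff between retries at 10 seconds, and
# --read-timeout=30 aborts a stalled transfer so the next retry can start.
wget --continue --tries=0 --waitretry=10 --read-timeout=30 \
    https://github.com/FreeTubeApp/FreeTube/releases/download/v0.1.3-beta/FreeTube-linux-x64.tar.xz
```

With these options, a single invocation survives the periodic drops described above instead of exiting with a read error.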

Please be mindful that if a file with the same name already exists in the current directory, wget will assume it is the first portion of the remote file and will ask the server to continue the retrieval from an offset equal to the length of the local file. So make sure the partial file in the current directory really is the beginning of the file you want, and delete any other leftover partial files first.
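Before resuming, you can sanity-check how much of the file you already have against the remote file's full size. A rough sketch, assuming GNU wget and GNU coreutils; the awk parsing of the Content-Length header is my own addition, and the filename matches the FreeTube example above:

```shell
URL=https://github.com/FreeTubeApp/FreeTube/releases/download/v0.1.3-beta/FreeTube-linux-x64.tar.xz
FILE=FreeTube-linux-x64.tar.xz

# Size of the partial file on disk, in bytes.
local_size=$(stat -c %s "$FILE")

# --spider makes wget issue the request without downloading anything;
# --server-response prints the HTTP headers. The awk script keeps the
# last Content-Length it sees, i.e. the one from the final response
# after any redirects.
remote_size=$(wget --spider --server-response "$URL" 2>&1 \
    | awk '/Content-Length/ {len=$2} END {print len}')

echo "have $local_size of $remote_size bytes"
```

If the local size already equals the remote size, there is nothing left to resume; if the local file is larger, it is probably not a partial copy of this download and should be removed first.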

For more details, refer to the wget man page.

$ man wget


$ wget --help

