Re: fail to download big files correctly
From: Jeffrey Walton
Subject: Re: fail to download big files correctly
Date: Fri, 17 Nov 2023 19:16:36 -0500
On Fri, Nov 17, 2023 at 6:29 PM Tim Rühsen <tim.ruehsen@gmx.de> wrote:
>
> On 11/17/23 20:34, grafgrimm77@gmx.de wrote:
> > I use Linux, so no exe files. I use Gentoo Linux.
> >
> > Command line example:
> > One line (wget and the url):
> >
> > wget http://releases.mozilla.org/pub/firefox/releases/119.0.1/source/firefox-119.0.1.source.tar.xz
> >
> > result: a file with a wrong checksum.
>
> Just a guess:
>
> If you have a bad network and your connection drops, wget does retries
> by default.
> These retries may result in multiple incomplete files with different
> checksums. Can you run 'ls -la' to see what sizes these files have?
>
> I currently can't simulate it - none of the "bad network" emulators for
> Linux do random connection drops.
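
The check Tim suggests can be sketched as a small shell helper. This is a sketch only: the function name is made up, and the expected checksum must come from the SHA256SUMS file Mozilla publishes alongside the release, not from this post.

```shell
# Assumes POSIX sh and coreutils (sha256sum, ls, awk).
# verify_download <file> <expected-sha256>
verify_download() {
    file="$1"
    expected="$2"

    # Interrupted retries can leave extra copies (file.1, file.2, ...);
    # list them so differing sizes are visible at a glance.
    ls -la "$file"* 2>/dev/null

    # Compare the actual checksum against the published one.
    actual=$(sha256sum "$file" | awk '{print $1}')
    if [ "$actual" = "$expected" ]; then
        echo "checksum OK"
    else
        echo "checksum MISMATCH: expected $expected got $actual"
        return 1
    fi
}

# Example (placeholder checksum -- substitute the real value from SHA256SUMS):
# verify_download firefox-119.0.1.source.tar.xz <expected-sha256>
```

If `ls` shows files like `firefox-119.0.1.source.tar.xz.1`, those are leftovers from retried downloads; the mismatching file may simply be truncated.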
This one has always made me laugh:
<https://github.com/tylertreat/comcast>. If it is as bad as it sounds,
then you should be able to experience a dropped connection without
unplugging your Ethernet cable.
(Comcast has a bad reputation in the US; I experienced it firsthand
in the past.)
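
Short of a tool like that, plain netem gets close. It cannot drop a TCP connection outright, but heavy random packet loss can stall a transfer long enough to trip wget's read timeout and trigger the retry behavior Tim describes. A rough sketch, assuming root and an interface named eth0 (adjust both to your setup):

```shell
# Sketch only -- requires root, iproute2, and the right interface name.
# Inject 30% random packet loss on eth0.
tc qdisc add dev eth0 root netem loss 30%

# Re-run the download with a short read timeout and limited retries,
# resuming (-c) so partial files are continued rather than duplicated.
wget --read-timeout=10 --tries=3 -c \
  http://releases.mozilla.org/pub/firefox/releases/119.0.1/source/firefox-119.0.1.source.tar.xz

# Remove the impairment afterwards.
tc qdisc del dev eth0 root
```
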
Jeff