
From: Darshit Shah
Subject: Re: [Wget-dev] wget2 | Recursive downloads with HTTP/2 not uniformly divided (#397)
Date: Fri, 21 Sep 2018 13:28:40 +0000

I don't think I agree with this idea. Even with HTTP/2, being able to use 
multiple connections is a better idea, mostly because some bad ISPs throttle 
bandwidth on a per-connection basis. This is never official policy, but I've 
seen it all too often that multiple connections to the same server get a 
higher total bandwidth than a single connection.

Also, this raises the question: what are the precise defaults we want to use? 
Right now, we have 5 threads and 30 files per thread with HTTP/2. Maybe we 
should revisit these defaults, or at the very least discuss them to see what 
makes the most sense.

I think we should distribute the download queue equally across all the 
available threads. Also, a suggestion: We should have an option, 
`--max-parallel-downloads` which defines how many files / chunks are downloaded 
in parallel. This option is valid for HTTP/1, HTTP/2 and Metalink downloads. 
When used with HTTP/1, `nthreads = max_parallel`. With HTTP/2, `nthreads = 
max_parallel / http2_window_size`.

