From: Vadim Zeitlin
Subject: Re: [lmi] wget: "Unable to establish SSL connection"
Date: Tue, 6 Apr 2021 23:52:08 +0200
On Tue, 6 Apr 2021 21:29:09 +0000 Greg Chicares <gchicares@sbcglobal.net> wrote:
GC> I wanted to create a brand-new chroot on our corporate server,
GC> but the 'lmi_setup*.sh' scripts rely on fetching the latest
GC> versions of various files using 'wget', and today we encounter
GC> seemingly random 'wget' failures. Is there anything I should
GC> try beyond the options demonstrated below? I'd rather not
GC> rewrite the scripts, but maybe there's no choice.
I'm afraid rewriting the scripts wouldn't help at all, but you can test
whether the problem is wget-specific by running

$ printf 'GET /let-me-illustrate/lmi/raw/master/gwc/.vim/spell/en.utf-8.add HTTP/1.0\nHost: github.com\n\n' \
    | openssl s_client -connect github.com:443 -crlf -quiet
If this fails even occasionally (it's worth running it many times and
discarding the output with ">/dev/null"; note that "repeat 100" is a zsh
builtin, so in other shells use an explicit loop, as sketched below), it
would definitely mean that the problem is not wget-specific, but is due to
a bug in the corporate proxy/firewall.
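For instance, a minimal sketch of that repeated test in plain POSIX shell
(same URL as above; the failure counter is only there for illustration):

    fails=0
    for i in $(seq 100); do
        printf 'GET /let-me-illustrate/lmi/raw/master/gwc/.vim/spell/en.utf-8.add HTTP/1.0\nHost: github.com\n\n' \
          | openssl s_client -connect github.com:443 -crlf -quiet >/dev/null 2>&1 \
          || fails=$((fails + 1))
    done
    # Any nonzero count points at flaky connectivity, not at wget.
    echo "failures out of 100: $fails"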
GC> I tried running one particular 'wget' command with various options, in
GC> the hope of discovering some voodoo that works, but they all fail at
GC> least sometimes. Failing examples:
These options only take effect after the connection is established. If it
can't connect in the first place, they are unlikely to be helpful.
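One workaround that doesn't depend on wget options at all is to retry at
the shell level; a rough sketch (the helper name and the retry count are
made up for illustration):

    fetch() {
        # Try the given wget command line up to five times, pausing
        # briefly between attempts.
        for attempt in 1 2 3 4 5; do
            wget "$@" && return 0
            sleep 2
        done
        return 1
    }

    fetch https://github.com/let-me-illustrate/lmi/raw/master/gwc/.vim/spell/en.utf-8.add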
GC> If there's really no other choice than to rewrite the scripts,
GC> which approach seems best:
GC> - Establish a local cache directory for all files that are
GC> downloaded by wget; then run a script to refresh all of
GC> them, until all finally get refreshed? Just writing that
GC> idea down makes it seem clearly poor.
Doesn't lmi only download files that are absent? If so, just copying an
existing cache directory from another machine could be at least a
temporary solution.
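If you go that way, a one-line sketch, assuming the cache lives under
/srv/cache_for_lmi on both machines (both the path and the host name here
are assumptions to adapt):

    $ rsync -av goodhost:/srv/cache_for_lmi/ /srv/cache_for_lmi/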
GC> - Use 'curl' instead of 'wget'.
This is trivial to test (basically just do WGET="curl -LO"), but I don't
think it's going to help.
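A sketch of that test, assuming the scripts really do route every download
through a single WGET variable (the retry flags are standard curl options,
added on the off chance that retrying helps):

    WGET='curl --retry 5 --retry-delay 2 -L -O'
    $WGET https://github.com/let-me-illustrate/lmi/raw/master/gwc/.vim/spell/en.utf-8.add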
GC> - Use git. All the files to be downloaded are in git. It
GC> seems that git works reliably where wget fails. It seems
GC> wasteful to clone a whole repository just to fetch a
GC> few scripts, but that's okay if it solves the problem.
This point is curious: normally Git uses the same system OpenSSL library
and so should be affected by the same issues. But Git also uses libcurl, I
believe, so perhaps it is a wget-specific issue after all, hard though
that is to believe.
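If cloning does turn out to be the answer, a shallow clone at least keeps
the waste down to a single commit's worth of data; a sketch (the
destination directory is arbitrary):

    $ git clone --depth=1 https://github.com/let-me-illustrate/lmi.git /tmp/lmi
    $ cp /tmp/lmi/gwc/.vim/spell/en.utf-8.add .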
In short, I suspect that the issue is outside the machine you control, but
trying curl would still be worthwhile, if only because it's so simple to
do. Of course, for a very simple test, you could just run (in zsh) "repeat
100 curl -L http://github.com/whatever" and see if it ever fails.
Good luck, I'm afraid you will need it,
VZ