guix-devel

Re: When substitute download + decompression is CPU-bound


From: Ludovic Courtès
Subject: Re: When substitute download + decompression is CPU-bound
Date: Tue, 15 Dec 2020 12:42:10 +0100
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/27.1 (gnu/linux)

Hi,

Pierre Neidhardt <mail@ambrevar.xyz> writes:

> Another option is plzip (parallel Lzip, an official part of Lzip).
>
>> decompression of ungoogled-chromium from the LAN completes in 2.4s for
>> gzip vs. 7.1s for lzip.  On a low-end ARMv7 device, also on the LAN, I
>> get 32s (gzip) vs. 53s (lzip).
>
> With four cores, plzip would beat gzip in the first case.
> With only two cores, plzip would beat gzip in the second case.
>
> What's left to do to implement plzip support?  That's the good news:
> almost nothing!
>
> - On the Lzip binding side, we need to add support for multi-member
>   archives.  It's a bit of work, but not that much.
> - On the Guix side, there is nothing to do.

Well, ‘guix publish’ would first need to create multi-member archives,
right?
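To illustrate the point about multi-member archives, here is a minimal
sketch — in Python with zlib rather than lzip/lzlib, purely as a
stand-in — of why member boundaries matter: each member is an
independently compressed block, so a decoder can hand members to
separate threads.  plzip produces multi-member lzip archives in the
same spirit; nothing below reflects the actual lzlib bindings or
‘guix publish’ code.

```python
# Hypothetical illustration: independently compressed "members"
# (zlib here, standing in for lzip members) can be decompressed
# in parallel, unlike a single monolithic stream.
import zlib
from concurrent.futures import ThreadPoolExecutor


def compress_members(data: bytes, member_size: int) -> list[bytes]:
    """Split DATA into chunks and compress each one independently."""
    return [zlib.compress(data[i:i + member_size])
            for i in range(0, len(data), member_size)]


def decompress_parallel(members: list[bytes]) -> bytes:
    """Decompress members concurrently; map() preserves order."""
    with ThreadPoolExecutor() as pool:
        return b"".join(pool.map(zlib.decompress, members))


data = b"nar contents " * 10_000
members = compress_members(data, member_size=16 * 1024)
assert decompress_parallel(members) == data
```

A single-member archive offers no such split points, which is why the
producer (‘guix publish’) has to emit multi-member output before any
consumer-side parallelism can help.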

Also, lzlib (which is what we use) does not implement parallel
decompression, AIUI.

Even if it did, would we be able to take advantage of it?  Currently
‘restore-file’ expects to read an archive stream sequentially.

Even if I’m wrong :-), decompression speed would at best be doubled on
multi-core machines (wouldn’t help much on low-end ARM devices), and
that’s very little compared to the decompression speed achieved by zstd.

Ludo’.


