lilypond-user

Re: Large file


From: David Kastrup
Subject: Re: Large file
Date: Sat, 16 Jun 2012 09:00:48 +0200
User-agent: Gnus/5.13 (Gnus v5.13) Emacs/24.1.50 (gnu/linux)

Ralph Palmer <address@hidden> writes:

> Greetings -
>
>
> I'm using LilyPond Book to print out a large set of short (mostly 16
> bar) contra dance and Celtic tunes. I'm working on an older Dell
> laptop, using LY 14.2 under Linux / Ubuntu. I show 494.8 MiB of
> memory, but I think I may have some additional space set aside as
> virtual memory (I'm not sure about that). I have 510 individual files,
> one for each score, which are "\include"ed in my master file, which
> has a separate score entry for each.
>
> If I try to run more than twenty or thirty scores, things get badly
> bogged down. If I have all my page breaks set up the way I want them,
> and I try to run the whole file (all 510 scores), am I likely to lock
> up the memory, or (if I'm patient), will the computer likely produce
> the .ly and .pdf files in 2 or 3 hours?
>
> I appreciate your time and attention,

Lilypond-book runs LilyPond with large command lines, on a lot of
files at once.  Without memory leaks, this should not be much of a
problem.  However, your setup very much sounds as if you are _not_ doing
this as a proper LilyPond-book job, but rather making one large LilyPond
file.  That kind of setup requires considerable resources.  Things
get worse if LilyPond can't flush out pages in between because of global
page-break optimization.

So you should try to change your setup so that you have one
LilyPond-book fragment for each score.  This may mean that every such
fragment does an \include of the settings now in the LilyPond master
file, rather than the LilyPond master file including every LilyPond
source.  With this kind of setup, the performance loss due to memory
starvation is likely considerably larger than the loss due to reloading
the master file's settings for every snippet.
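As a rough sketch of what such a per-score setup could look like, assuming a LaTeX-based lilypond-book document and a hypothetical `common-settings.ily` file holding the shared paper and layout definitions (names are illustrative, not from the original post):

```latex
% tunes.lytex -- process with: lilypond-book --output=out tunes.lytex
\documentclass{article}
\begin{document}

\section*{Tune One}
% each lilypond environment becomes its own small LilyPond run
\begin{lilypond}
\include "common-settings.ily"  % hypothetical shared settings file
\score { \relative c' { c4 d e f | g1 } }
\end{lilypond}

\section*{Tune Two}
\begin{lilypond}
\include "common-settings.ily"
\score { \relative c' { g4 f e d | c1 } }
\end{lilypond}

\end{document}
```

Because every fragment is compiled as a separate small job, memory is released between scores instead of accumulating across all 510 of them.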

Current development versions are considerably more efficient in such
multi-file situations (we are talking about shaving 30% off execution
time for small files), but again, you will need to separate this into
separate LilyPond-book fragments rather than one large chunk containing
everything.

-- 
David Kastrup



