
Re: [Gcl-devel] CLtL1 (was Axiom build error)


From: Michael Koehne
Subject: Re: [Gcl-devel] CLtL1 (was Axiom build error)
Date: Wed, 30 Jun 2004 04:18:48 +0200
User-agent: Mutt/1.3.28i

Moin Camm Maguire,

> Now be nice :-)

  *oups* that was a longer rant - in short: Axiom must become able to
  build from /usr/local/bin/gcl, else it's impossible to ensure that
  every stage of unstable GCL is able to support it. Getting rid
  of the strict Axiom/GCL version requirement will benefit both sides,
  and enable normal people to build Axiom.

> I don't find a lot of ansi ifdefs in the C code.  Even these I think
> can be eliminated.

  I mainly thought about those broken #ifdef UNIX places. I eliminated
  the #ifdef UNIX blocks in pathname.d and related places, and they looked
  as if they broke long ago, e.g.:

            if (d == -1) {
                /*  no file type  */
#ifdef UNIX
                if (i-j == 1 && s->st.st_self[j] == '*')
#endif
                        vs_push(sKwild);
                else    /* with UNIX undefined, this else has no if left to attach to */
                        make_one(&s->st.st_self[j], i-j);

                vs_push(Cnil);
            } else if (d == j) {

  This could never have compiled with #undef UNIX: the if line disappears
  and the following else is left dangling. #ifdef UNIX is the code
  graveyard of the LGPL cleanup, I think, and I feared that #ifdef
  ANSI might become a similar code graveyard in a few years, if
  everybody compiles GCL with #define ANSI, because most applications
  will require some new ANSI features in a few years. Even classical
  applications like Maxima will improve - take a look at f2cl in the Maxima
  source, it's commented out because it requires pprint.

> You can use (specific-error ... ) here.

  I found it shortly thereafter.

> By in large, ansi is cltl1 + stuff.  The
> exceptions we can deal with in init_ansi.lsp with a (fmakunbound ...)
> for all obsoleted functions (e.g. lsp/serror) followed by (gbc t).

  Are you sure that this will free up the space of the compiled functions?
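
  Something like this is what I imagine for init_ansi.lsp - just a sketch,
  and the symbol names below are placeholders, not the real list of
  obsoleted lsp/serror functions:

    ;; placeholder names - whatever serror et al. no longer need under ANSI
    (dolist (sym '(si::old-serror-handler si::old-error-set))
      (when (fboundp sym)
        (fmakunbound sym)))
    ;; full GC afterwards - whether this really reclaims the space of the
    ;; compiled function objects is exactly what I am asking
    (si::gbc t)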

  I would prefer it if GCL became more and more ANSI CL while providing
  backward compatibility at the same time. This means that the ANSI
  part must be implemented in a usable way, else people will still
  be right when they claim that GCL has poor ANSI compatibility.
  
  The CLOS that comes with PCL is unusable because it calls the
  (wrong) compiler at the (wrong) time, and the CLCS would fit better
  into ./o/conditions.d plus some small ./lsp/gcl_conditions.lsp
  to interface it, providing the trivial functions in Lisp ... but the
  "big GCL cleanup dream" is more than I could commit to right now,
  so I'll stick with the necessary patches.

> Problem is that most of the large sophisticated lisp programs which
> have come into the open source world or conceivably might in the near
> future most likely don't use clos, and to them it is considerable
> bloat.  

makhno:/usr/share/common-lisp/source $ fgrep -i defmethod */*.l* |wc
   3883   21611  326808
makhno:/usr/share/common-lisp/source $ fgrep -il defmethod */*.l* |
awk -F/ '{print $1}' | sort | uniq | wc
     42      42     364

  So 42 out of 124 new applications define methods. The number
  is similar to the bunch of UFFI applications - I am still considering
  implementing UFFI on top of my half-year-old elf-loader attempt in
  order to try them. This would be slow, and someone who understands bfd
  might improve it once the concept has worked out and the new
  applications can be seen and tested.

> I'd prefer to go the other way and have the newer stuff keyed by
> :ansi-cl in features.  This is again because logically speaking, with
> a few exceptions, cltl1 is a subset of ansi.

  CLtL1 is the precursor of ANSI - it is not a complete subset, e.g.
  in-package is really different, even though it has the same name.
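
  A small illustration - the package name is made up, but the behaviour
  is straight from CLtL1 versus the ANSI spec:

    ;; CLtL1: IN-PACKAGE is a function; it evaluates its arguments,
    ;; creates the package if it does not exist, and accepts :use/:nicknames.
    (in-package "MY-APP" :use '("LISP"))

    ;; ANSI: IN-PACKAGE is a macro; the name is not evaluated, no extra
    ;; keywords are allowed, and the package must already exist.
    (defpackage "MY-APP" (:use "COMMON-LISP"))
    (in-package "MY-APP")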

> This is a good idea.  Like cmucl, and will remove one ifdef from
> packages.d. 

  *oups* there is even one additional #ifdef in there, as I might need
  to provide the old in-package in addition when compiling for ANSI:

#ifdef ANSI_COMMON_LISP
        make_si_function("KCL-IN-PACKAGE", Lin_package);  /* keep the old function under a new name */
        make_special_form("IN-PACKAGE", Fin_package);     /* ANSI in-package as a special form */
#else
        make_function("IN-PACKAGE", Lin_package);         /* CLtL1 build: old function keeps its name */
#endif
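
  If I read that right, old function-style callers could still reach the
  CLtL1 behaviour in an ANSI image through the renamed entry - a sketch
  only, assuming make_si_function interns it in the SI package and that
  the ANSI build sets an :ansi-cl feature:

    #+ansi-cl
    (si::kcl-in-package "LEGACY-APP" :use '("LISP"))  ; old function, new name
    #-ansi-cl
    (in-package "LEGACY-APP" :use '("LISP"))          ; CLtL1 build keeps the old name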

  There must be some additional problem in package management (perhaps
  in the compiler), as some CLC systems (e.g. 'paid) load into the wrong
  package (or are compiled into the wrong one - I am not really sure here).

> In general, though, I feel it important that we continue to support
> cltl1 going forward.

  I think that it is important for an old application like GCL to keep
  its user base. Stanford & UCSD Pascal had been a good thing,
  because every version was able to run the old programs plus a lot
  of new fresh shiny stuff - Turbo Pascal instead managed to destroy
  my Pascal code base - first I had to remove lots of features from all
  that good UCSD code, which had been based on old code for Stanford Pascal.
  But the next Turbo Pascal version was completely different, so I had
  to edit every piece of code again. I dropped Pascal when Turbo Pascal 3.0
  turned out to be unable to compile TP 2.0 code. A similar thing happened to
  me with TCL - I don't like it because they don't even manage backward
  compatibility within their own branch. I still like Java, because
  some Java code that had been compiled using the Netscape 2.0 browser
  and the jdk 1.0 zip is still working - the same binary for nearly
  10 years! The current jdk tells me that some functions are deprecated,
  but I can still compile the code.

> My reasons stem basically from the observation that most of the heavy
> lifting in the lisp world appears to have been done under cltl1, and
> most of the talent that produced it originally have left lisp for
> other languages.

  History turns in cycles - some may come back. I had not considered
  writing Lisp 20 years ago, because I wasn't able to afford it. So I
  wrote applications for several midrange systems and mainframes in
  COBOL&ASM to earn some money, while hacking on VolX4th for my own fun.

> It is unclear to me if this new crop of interest in
> lisp will be able to muster the sustained coordinated and concentrated
> effort to produce from scratch new programs of the quality and
> complexity of maxima, acl2 and axiom.

  It's clear to me that the crop of VisualBasic programmers can never
  deliver the closed-shop quality that good old COBOL code had ;)

  But there are a lot more programmers these days, and a lot more
  people who can afford a system able to run some Lisp today, compared
  to the small elite who had access to a Lisp system in the 'good old
  days when computers were big and expensive and programmers were either
  nice, smart, cheap and female, or geeks who slept through the day to
  hack the nightshift for a pizza and the fun of using big iron'.
  I was not in the math department at that time, but I think that
  it was similar - perhaps not with that many women who went from being
  secretary to data typist to programmer.

  As a result, most modern Lisp programs in ASDF&CLC are more like Perl
  CPAN code - try it out to see if it's usable, then look again to see if
  it is reusable code. CPAN still has a better coding standard (requiring
  regression tests and documentation to follow the same conventions), but
  ASDF&CLC is new compared to CPAN and will improve (I hope) in setting
  its own coding standards.

> This is not to say that full ansi compliance in our ansi build is not
> important.  It is rather now our top priority.

  The good thing about open source is that everybody can set his own top
  priority, and the merged result should give a synergy effect where
  everybody's top priority is coded right. pfdietz's top priority is
  the ansi-tests, mine is getting new CLC applications up and running,
  and yours might be to ensure that old applications still run and, of
  course, to maintain a balance between upstream ideas and stable code.

  Take Debian as an example - Debian has been able to sustain a high
  quality and even improve it, because every maintainer has his
  private toys: obscure things like Maxima and Axiom that no sane
  Linux user would even consider installing *nudge* are suddenly of
  a quality as if they were the top priority of the manager. And even
  more obscure things like MPICH or LAM install out of the box.

  Compare the effort to set up a computing cluster. You'll have
  much more work if you try it on some other Linux, even if you know
  what the topic is about. The result of using SCore to batch IntelMPI
  over Myrinet on a custom cluster is of course better than connecting
  a classroom of Debian Linux PCs with LAM - but you'll need to hire
  a consultant to do so, as no sane Linux user has the MPI skills, and
  the MPI users are happy if their Fortran programs run through the
  compiler. So Debian offers an out-of-the-box MPI quality that only
  a few very skilled consultants would be able to improve. And they
  would need to add obscure non-free compilers and libraries that are
  expensive and might not exist at the next university or institute.
  And the consultant might even choose Debian, because it's easier to
  maintain and known for its stability ;)

Bye Michael
-- 
  mailto:address@hidden             UNA:+.? 'CED+2+:::Linux:2.4.22'UNZ+1'
  http://www.xml-edifact.org/           CETERUM CENSEO WINDOWS ESSE DELENDAM



