
application framework


From: Tom Lord
Subject: application framework
Date: Thu, 13 Feb 2003 21:23:57 -0800 (PST)


May I suggest an architectural direction for Guile?

An "extension language toolkit" is (IMO) much more than just "an
interpreter that you can connect to your application."

Succinctly expressed, my original intention with Guile was to
generalize the Emacs framework.  For example, in Emacs, buffers are
always text buffers -- there's a string you can edit that represents
the buffer's contents.  My intention was to generalize that -- perhaps
a buffer holds a drawing, or a spreadsheet, or a pixmap.

The idea was that you'd get all of the Emacsish mechanisms for
keymaps, modes, frames, windows, buffers, etc.... but that you could
apply that framework to any particular application area.

So, I'd like to see Guile go in that direction.

I've recently been resurrecting an old X11 toolkit of mine in support
of that.

I guess the question is: does anyone else want to play?

-t







* Introduction: A New Way of Writing GUIs

  GTk, Qt, FLTK -- all the rest -- they all suck.

  This is how GUI applications _should_ be architected.


* The WSI Application Model


  A WSI application consists of four components:

        * application model
        * interactive functions
        * keymap engine
        * redisplay engine

  which are connected to an external component:

        * display/input devices



        interactive <--- (called by) --------  keymap engine --------
        functions                             ^      ^               `
           |                                 /       |               |
           |           ---------------------'  (input events)  (location
           |          /                              |            queries)
       (modifies)  (provides current     display/input devices       |
           |        /      keymap)                   | ^             |
           |       /                                 | |             |
           |      /                     (graphics events/primitives) |
           |     /                                   | |             |
           V    /                                    v |             |
      application ---- (sends change notices) --->  redisplay <-----'
      model       <--- (makes queries) -----------  engine


** The Application Model

    The application model is the data structure that is the "subject"
    of the program -- the primary data that users want to manipulate.

    For a text editor, the application model includes the contents of
    files being edited.  For a spreadsheet -- the model includes
    the array of cells and formulae.

    The application model does no user interaction directly.  It may,
    however, produce output which can help the redisplay engine
    optimize its work (e.g., in a text editor, a record of which
    regions of the files have changed).
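
    A minimal sketch in Guile Scheme of such a model for a text editor
    -- all names here (`make-buffer-model', `model-note-change!', etc.)
    are illustrative assumptions, not part of any existing API:

        ;; A toy application model: the text being edited, an insertion
        ;; point, and a list of pending change notices that the
        ;; redisplay engine can consume later.
        (define (make-buffer-model)
          (list (cons 'text "")        ; contents of the "file being edited"
                (cons 'point 0)        ; current insertion point
                (cons 'changes '())))  ; pending change notices

        (define (model-ref model key)
          (cdr (assq key model)))

        (define (model-set! model key value)
          (set-cdr! (assq key model) value))

        (define (model-note-change! model start end)
          ;; Record that the region [start, end) changed; redisplay
          ;; reads and clears this list when it updates the display.
          (model-set! model 'changes
                      (cons (cons start end) (model-ref model 'changes))))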


** Interactive Functions

   Interactive functions are the "primitives" by which users operate
   on the application model.  

   For a text editor, for example, one interactive function might be:

        (insert-char c)

   which inserts the character `c' into the "file currently being
   edited" at the "current insertion point" (where the "file currently
   being edited" and the "current insertion point" are elements of the
   application model).

   Interactive functions do not, for the most part, interact directly
   with users:

   Rather than producing direct output, the "output" of an interactive
   function is the side effects it has on the application model.  The
   redisplay engine has the job of providing that output to users.

   Rather than reading input directly, interactive functions usually
   get their input from the next component: the keymap engine.
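
   Continuing the toy model sketched above, `insert-char' might look
   like this -- note that its only "output" is the side effect on the
   model plus a change notice for the redisplay engine (the helper
   names are the illustrative ones from the previous sketch):

        ;; Interactive function: insert character C at the current
        ;; insertion point of the model.  No display work happens here.
        (define (insert-char model c)
          (let ((text  (model-ref model 'text))
                (point (model-ref model 'point)))
            (model-set! model 'text
                        (string-append
                         (substring text 0 point)
                         (string c)
                         (substring text point (string-length text))))
            (model-set! model 'point (+ point 1))
            (model-note-change! model point (+ point 1))))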


** Keymap Engine

  The keymap engine reads input events and interprets them.  Its job
  is to translate input events into calls to interactive functions.

  A keymap engine typically works by reading a "key sequence" -- a
  series of one or more input events -- and on the basis of that
  key sequence, selecting an interactive function to invoke.
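
  As a sketch (again with made-up names), a keymap can be as simple as
  a table mapping complete key sequences to interactive functions:

        ;; A keymap: an alist from key sequences (lists of input
        ;; events) to interactive functions.  A real engine would read
        ;; events one at a time and walk nested keymaps; this sketch
        ;; just looks up a complete sequence.
        (define sample-keymap
          (list (cons '(#\a)           (lambda (m) (insert-char m #\a)))
                (cons '(ctrl-x ctrl-f) (lambda (m) (display "load-file\n")))))

        (define (dispatch-key-sequence keymap model keyseq)
          (let ((entry (assoc keyseq keymap)))
            (if entry
                ((cdr entry) model)        ; invoke the interactive function
                (display "undefined key sequence\n"))))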


** Redisplay Engine

  The redisplay engine has the job of keeping the display consistent
  with the application model.
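
  In this design, redisplay can be driven by the change notices the
  model accumulates.  A minimal sketch, where `draw-region' stands in
  for whatever primitives the display layer actually provides:

        ;; Consume the model's pending change notices and redraw only
        ;; the affected regions, then mark the model as caught up.
        (define (redisplay model draw-region)
          (for-each
           (lambda (change)
             (draw-region (model-ref model 'text)
                          (car change) (cdr change)))
           (reverse (model-ref model 'changes)))
          (model-set! model 'changes '()))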


* Architectural Observations

  Writing and maintaining a good application model is like writing and
  maintaining a good data structure library.  It's also a critical 
  UI issue -- can you make a model that users can easily think about?

  Writing the interactive functions is a similar task, and an even
  more UI-intensive one.

  Every GUI, in this architecture, is a sort of "editor" for the
  application model.   You can think of the useful work done by a 
  GUI of this sort as a series of calls to interactive functions.
  The interactive functions might be called directly as a result of 
  user interaction -- or indirectly, as from an extension language.

  Extension languages fit into this architecture nicely.   They can
  adopt interactive functions as primitives, and they can define new
  interactive functions.
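
  For example, an extension could define a new interactive function
  purely in terms of existing ones (using the illustrative primitives
  sketched earlier):

        ;; A user-level extension: a new interactive function built
        ;; entirely from existing primitives.
        (define (insert-string model s)
          (string-for-each (lambda (c) (insert-char model c)) s))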

  Only the redisplay engine is tightly coupled to particular display
  devices -- the application model, interactive functions, and keymap
  engine are largely device independent.


* The Recursive Edit Hack

  So, the job of the keymap engine is to select and invoke interactive
  functions.  In this view of the application, the user acts as a kind
  of "main loop" that makes a series of calls to interactive
  functions.

  One problem is that some interactive functions require parameters.
  For example, a `load-file' command would want the name of the file
  to load.

  GNU Emacs provides an elegant solution:  When the user selects
  the function `load-file', the keymap manager notes that it requires
  one or more parameters.  Instead of calling `load-file' directly,
  the keymap manager invokes an interactive function that will
  collect those arguments.  When the arguments have been collected,
  the keymap loop invokes `load-file'.

  How does that work?  It works in a manner similar to expression
  evaluation in programming languages.  The initial call to 
  `load-file' is pushed onto a stack.  The application model is
  changed so that the current keymaps reflect the goal of collecting
  parameters.  Eventually an application model state is reached
  in which the parameter collection is complete.  At that point the
  keymap manager pops the stack and invokes `load-file'.
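
  A sketch of that stack discipline -- the collecting mode and every
  helper name here are assumptions for illustration, not how GNU Emacs
  actually implements it:

        ;; Stack of (command args-still-needed collected-args).
        (define pending '())

        (define (call-interactively model command n-args enter-collect-mode)
          (if (zero? n-args)
              (command model)
              (begin
                ;; Push the call, then switch into an argument-collecting
                ;; "mode" (e.g. a minibuffer-style keymap).
                (set! pending (cons (list command n-args '()) pending))
                (enter-collect-mode model))))

        (define (argument-collected! model arg)
          ;; Called by the collecting mode for each completed argument.
          (let* ((top       (car pending))
                 (command   (car top))
                 (remaining (- (cadr top) 1))
                 (args      (append (caddr top) (list arg))))
            (set! pending (cdr pending))
            (if (zero? remaining)
                (apply command model args)       ; pop the stack and invoke
                (set! pending
                      (cons (list command remaining args) pending)))))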
  



* Critique of Other Approaches

  Most toolkits define `widgets' which are data structures that
  combine: 

        - graphics output for one region of the screen
        - application state
        - callbacks for high-level semantics (e.g., "OK button was pushed")
        - input event processing
        - hooks and meta-data for interface builders

  Applications are then expected to build their interfaces out of
  those components.

  That's a _terrible_ design.   For example:

        1) Since each widget combines application state with 
           graphical output for one region of the screen,
           displaying the same state in multiple locations is
           awkward.

        2) Since application state is mixed up with GUI display,
           it is not "natural" to write applications that work
           both on graphical displays and in other contexts -- and
           few people do.

        3) Since application state is mixed up with the GUI, and since
           communication with applications is via callbacks, 
           control flow is disrupted by the toolkit.   Programs must
           either be written in a callback-driven style or resort
           to threads or a mixture of those techniques.

        4) Since input event processing is divided up among a tree of 
           widgets, and distribution of events is complicated, it is
           difficult or impossible to make global changes (such as
           modal changes) to how input is interpreted.

        5) 

        6) I could go on for days about this.


* The WSI X11 Toolkit

  The WSI X11 toolkit is designed to provide the interfaces:

        A) between a keymap engine and X11 input events
        B) between a graphical redisplay engine and X11





