
Re: [Groff] Using C with glib instead of C++ for groff


From: Egil Kvaleberg
Subject: Re: [Groff] Using C with glib instead of C++ for groff
Date: 28 Nov 2002 15:16:58 +0100

On Thu, 2002-11-28 at 14:59, Bernd Warken wrote:

> > This is incorrect.  ANSI C does not mandate `int' has 16 bits.
> > 
> In my version of the ANSI C book of Kernighan/Ritchie, chapter B.11
> on <limits.h> says
> 
> INT_MAX     +32767
> INT_MIN     -32767
> UINT_MAX     65535
> 
> This makes int and uint 16 bits types in ANSI C.

No, definitely not.

In every brand of C I've known since the dawn of time, the requirement
for an int has been that it be at least 16 bits.

short is likewise at least 16 bits, while long is at least 32.
Additionally, with respect to bit counts, it is required that
long >= int >= short.

In other words, it is OK if all are 64 bits, for instance.

If an integer may take on values larger than 16 bits can represent, it
is an error to use int; long should be used instead.

"int" is meant to represent an *efficient* implementation of integers on
the architecture at hand. "long" can also be expected to be efficient.

Types with explicit bit widths should only be used where the number of
bits represented is a strict requirement, for instance for external
interfaces.

Egil
-- 
Email: address@hidden  
Voice/videophone: +47 22523641 Voice: +47 92022780 Fax: +47 22525899
Mail:  Egil Kvaleberg, Husebybakken 14A, 0379 Oslo, Norway
Home:  http://www.kvaleberg.com/
