Re: [avr-libc-dev] delay in microseconds [Was: Adding new device to Libc]
Wed, 15 Jan 2003 13:02:08 -0800 (PST)
I see... so someone has already worked on it!
BTW, one idea I thought of, if you wanted to do the math
at compile time and still keep to 3 or 4 inline
instructions, is to keep the current delay functions
the same but add a simple define like
#define uS *((0.000001 * F_CLK)/4)
then use it when you call the function:
PS: It seems there are a few different implementations
possible (malloc, hardware, etc.), but I think a lot of
people will just be using these as general delays
that don't have to be accurate, and the inline ASM
might be best for them...
However, I'm sure some people WILL require more
accurate delays, so maybe have a few different
implementations available in the file?
--- "E. Weddington" <address@hidden> wrote:
> On 15 Jan 2003 at 17:24, Joerg Wunsch wrote:
> > As Colin O'Flynn wrote:
> > > PS: Is anyone already working on this TODO:
> > > - In include/avr/delay.h, add macros to allow
> > > specifying delays directly in microseconds (with
> > > clock frequency defined by the user). With
> > > constant delays, all floating point math would
> > > be done
> > > at compile time.
> > Yes, someone recently posted a proposal to the
> > list. Should be in the archives... Yep:
> > Nobody seems to have reviewed it so far.
> Jörg, Ted, do you remember what this todo item was
> all about?
> Especially the tag about the floating point math.
> And should these delays be implemented with loops or
> using the
> processor's timers? Whenever I use delays I always
> prefer the latter.
> AVR-libc-dev mailing list