Re: How to interface with DAQ driver


From: Przemek Klosowski
Subject: Re: How to interface with DAQ driver
Date: Thu, 9 Feb 2006 13:56:37 -0500 (EST)

   Although it is likely off-topic here, I just wanted to warn you
   that with the standard Linux kernel you cannot get real-time
   performance, at least not hard real-time. There do exist RT
   extensions, though,

This statement is true in a strict sense: the Linux kernel doesn't
guarantee consistent latencies at the level of machine-instruction
execution times (microseconds and below). However, with judicious
design it is quite realistic to get fairly good performance, good
enough for many data acquisition tasks up to repetition rates of tens
of kilohertz. After all, real time is about satisfying YOUR timing
constraints, not about hitting some absolute number. Many genuine
real-time systems run on slow (e.g. 20 MHz) processors; what they
guarantee is consistent latency, not the absolute best latency.

So you can do interesting near-real-time work on a regular Linux
kernel; you just have to design it right:

 - be careful which drivers you load (no serial ports, etc.)
 - avoid disk and network use, or at least make sure it uses DMA
 - make sure you have enough memory to avoid swapping
 - use near-real-time scheduling (FIFO/round robin); see the sketch
   after this list
 - don't run anything that isn't strictly necessary (no sendmail,
   no cron jobs, etc.)
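
For the scheduling and memory-locking items, here is a minimal sketch
(my illustration, not from the original post; the priority value 50 is
an arbitrary choice, and both calls require root privileges):

#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <sched.h>
#include <sys/mman.h>

int main(void) {
   struct sched_param sp;
   memset(&sp, 0, sizeof(sp));
   sp.sched_priority = 50;      /* arbitrary mid-range FIFO priority */

   /* move this process to the FIFO real-time scheduler
      (pid 0 means the calling process; requires root) */
   if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0)
        fprintf(stderr, "sched_setscheduler: %s\n", strerror(errno));

   /* lock all current and future pages in RAM to rule out swapping */
   if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0)
        fprintf(stderr, "mlockall: %s\n", strerror(errno));

   /* ... acquisition loop goes here ... */
   return 0;
}

Note that a SCHED_FIFO process runs until it blocks or yields, so keep
the loop well-behaved or the rest of the system will starve.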

To give you a feeling for what's realistic to expect on a typical
Linux system, I ran a timing loop on my AMD-64 desktop. The program
periodically prints the value of the CPU's Time Stamp Counter, which
is incremented at the processor's main clock rate, derived from a
crystal oscillator (probably a few ppm stability). I then use Octave
to calculate statistics on this measurement (so all of this is not
entirely off-topic).

First, I ran the experiment on an idle system, just staring at the
screen; the results follow (I load the consecutive Time Stamp Counter
values into array 'a' and calculate the intervals in 'b').
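
The loading step itself does not appear in the transcript below;
assuming the program's output was redirected to a file, hypothetically
named ticks.txt, it would look something like:

octave:1> a = load('ticks.txt');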

octave:4> b = diff(a);
octave:5> mean(b)
ans =  1.0040e+09
octave:6> std(b)
ans = 3882.4

so the time interval is on the order of 1e9 counts (it checks out: my
CPU clock is 1 GHz and the loop sleeps for one second per iteration),
and the standard deviation of the interval, i.e. the jitter, is around
3900 counts, or about 4 microseconds. That is pretty good!
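
For reference, the conversion from counter ticks to seconds is just a
division by the clock rate:

octave:7> std(b) / 1e9      # ticks / 1 GHz clock -> seconds
ans = 3.8824e-06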

When I played a compressed movie on the system while transferring
large amounts of data and encrypting it, the numbers were:

octave:2> b = diff(a);
octave:3> mean(b)
ans =  1.0041e+09
octave:4> std(b)
ans =  1.1072e+06

Still, not too bad: about 1.1 ms of jitter, or roughly 0.1% of the
one-second interval.

Here's the program I used:

#include <stdio.h>
#include <unistd.h>
#include <asm/msr.h>   /* kernel header providing the rdtscll() macro */

/* build, e.g.: gcc -O2 timer.c -o timer && ./timer > ticks.txt */
int main(void) {
   int i = 0;
   unsigned long long t;
   while (i++ < 600) {
        sleep(1);              /* nominal one-second period */
        rdtscll(t);            /* read the Time Stamp Counter */
        printf("%llu\n", t);   /* %llu: t is unsigned */
   }
   return 0;
}
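
On systems where the rdtscll macro from asm/msr.h is not available in
user space, a portable variant of the same measurement could use
clock_gettime instead (a sketch under that assumption, not the program
used above; the intervals then come out directly in nanoseconds):

#include <stdio.h>
#include <time.h>
#include <unistd.h>

/* older glibc may need linking with -lrt for clock_gettime */
int main(void) {
   int i = 0;
   struct timespec ts;
   while (i++ < 600) {
        sleep(1);
        /* CLOCK_MONOTONIC: not affected by time-of-day adjustments */
        clock_gettime(CLOCK_MONOTONIC, &ts);
        printf("%lld\n", (long long)ts.tv_sec * 1000000000LL + ts.tv_nsec);
   }
   return 0;
}

The Octave analysis is identical, except that no division by the clock
rate is needed.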


