Stopwatch



Quicksilver
06-10-2011, 12:53 AM
I'm making a stopwatch, but I don't know exactly how to get an accurate second.
_delay_ms() is inaccurate.

avinash_g
06-10-2011, 08:20 AM
_delay_ms() is accurate; people just don't know how to use it correctly. The maximum delay possible with _delay_ms() is:


The maximal possible delay is 262.14 ms / F_CPU in MHz.

as clearly stated in the manual:

http://www.nongnu.org/avr-libc/user-manual/group__util__delay.html

So what is the F_CPU for your design?

Otherwise, don't get stuck on the basics; move on and make an AVR stopwatch.

If you Google "AVR stop watch", this page comes up first:

http://extremeelectronics.co.in/avr-projects/avr-project-digital-stop-watch-with-atmega8/

It may be worth reading.

If you Google "avr 1ms time base", this page is at the top:

http://extremeelectronics.co.in/avr-tutorials/timers-in-compare-mode-part-ii/

It shows how to make an accurate time base.
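
Roughly, the time-base technique in that tutorial boils down to a compare-match interrupt. Below is a minimal sketch of the idea (my own illustration, not the tutorial's exact code), assuming an ATmega8 running from its 1 MHz internal RC oscillator; register names differ on other AVRs.

#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

volatile uint16_t seconds  = 0;   /* stopwatch seconds, updated by the ISR      */
volatile uint16_t ms_count = 0;   /* milliseconds within the current second     */

/* Compare match fires every 1000 CPU cycles = 1 ms at 1 MHz */
ISR(TIMER1_COMPA_vect)
{
    if (++ms_count >= 1000)
    {
        ms_count = 0;
        seconds++;                /* one accurate second has elapsed            */
    }
}

int main(void)
{
    TCCR1A = 0;
    OCR1A  = 999;                           /* counts 0..999 -> 1000 ticks      */
    TCCR1B = (1 << WGM12) | (1 << CS10);    /* CTC mode, prescaler = 1          */
    TIMSK |= (1 << OCIE1A);                 /* enable compare-match A interrupt */
    sei();

    for (;;)
    {
        /* CPU is free here for display updates, buttons, etc. */
    }
    return 0;                               /* never reached */
}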

allbits
06-10-2011, 12:25 PM
@Quicksilver:

What is _delay_ms?
Alright, it is obvious, but we have no clue what processor you are talking about, at what frequency, what kind of oscillator you are using, or what compiler you are using. So there is no way to know whether the delay is accurate or not.

@avinash:
Software delays CAN be very inaccurate. VERY.

avinash_g
06-10-2011, 12:51 PM
@allbits

May I know why a software delay can be inaccurate?

If I tell the CPU to execute 1,000,000 cycles, it will take exactly 1 second on an AVR running at 1 MHz, provided interrupts are disabled.

I have generated PAL (TV) signals using asm and software delays; they are really time sensitive and work great!

What you miss is how _delay_ms() in avr-libc works. You can see its source. The calculation it performs to estimate the number of cycles required for a given delay is approximate, and so is the function itself.

As far as a one-second delay is concerned, it can be generated quite accurately in software (so you are wrong here). If not, I will post code that is at least 99.99% accurate using only a software loop.

@Quicksilver

Compiler optimizations must be enabled and F_CPU must be defined for _delay_ms() to work.

If your F_CPU is 1 MHz, you cannot achieve more than 262.14 ms.

This delay is exactly the same as calling

_delay_loop_2(0)

Passing 0 to _delay_loop_2() requests a delay equivalent to 65536 (2^16) iterations. Since _delay_loop_2() burns 4 CPU cycles per iteration, it burns a total of 4 x 65536 = 262144 CPU cycles (launch a calculator and see for yourself). Since the CPU is running at 1,000,000 cycles/sec, it takes

262144 / 1000000 = 0.262144 s, i.e. 262.144 ms.
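
For instance, here is a minimal sketch (my own illustration, assuming F_CPU = 1 MHz) of how those numbers can be used to build an almost exact one-second busy-wait out of _delay_loop_2() alone; the handful of loop-overhead cycles is the only error:

#include <stdint.h>
#include <util/delay_basic.h>

/* 62500 iterations x 4 cycles = 250,000 cycles = 250 ms at 1 MHz,
 * so four calls give ~1 s (plus a few cycles of loop overhead). */
static void delay_1s_at_1MHz(void)
{
    uint8_t i;
    for (i = 0; i < 4; i++)
        _delay_loop_2(62500);
}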

The error you must be making is probably calling

_delay_ms(1000); and expecting a 1 s delay! But look at the source of _delay_ms() to see how it works.

It just calculates a tick value and calls _delay_loop_2() ONCE!

Newer versions of avr-libc, though, implement a version of _delay_ms() that can achieve delays of up to 6.5535 seconds with a resolution of 1/10 ms.

Below is the code from WinAVR-20100110:



/**
 \ingroup util_delay

 Perform a delay of \c __ms milliseconds, using _delay_loop_2().

 The macro F_CPU is supposed to be defined to a
 constant defining the CPU clock frequency (in Hertz).

 The maximal possible delay is 262.14 ms / F_CPU in MHz.

 When the user request delay which exceed the maximum possible one,
 _delay_ms() provides a decreased resolution functionality. In this
 mode _delay_ms() will work with a resolution of 1/10 ms, providing
 delays up to 6.5535 seconds (independent from CPU frequency). The
 user will not be informed about decreased resolution.
 */
void
_delay_ms(double __ms)
{
    uint16_t __ticks;
    double __tmp = ((F_CPU) / 4e3) * __ms;
    if (__tmp < 1.0)
        __ticks = 1;
    else if (__tmp > 65535)
    {
        // __ticks = requested delay in 1/10 ms
        __ticks = (uint16_t) (__ms * 10.0);
        while (__ticks)
        {
            // wait 1/10 ms
            _delay_loop_2(((F_CPU) / 4e3) / 10);
            __ticks--;
        }
        return;
    }
    else
        __ticks = (uint16_t)__tmp;
    _delay_loop_2(__ticks);
}


The above code may look too heavy for an 8-bit CPU (all those double calculations going on!), but the trick is that _delay_ms() is almost always called with constant values. So the exact argument passed to _delay_loop_2() (the last line of the function), i.e. __ticks, can be calculated at compile time (yes, by a monster 32-bit CPU running avr-gcc and your OS).

So all those heavy calculations and variables are NOT present on your 8-bit MCU at all; only a _delay_loop_2() call with a constant value is left. This magic is called compiler optimization.

If you compile code that uses _delay_ms() with the -O0 compiler setting, it will issue a warning. Now you know why!

See the attached screenshot:
http://extremeelectronics.co.in/images/delay_ms_warning.png
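
For what it is worth, a minimal usage sketch along those lines (my own example, assuming an LED on PB0 of an ATmega8 clocked at 1 MHz): define F_CPU before including <util/delay.h>, pass a compile-time constant, and build with optimization enabled, e.g. avr-gcc -mmcu=atmega8 -Os.

#define F_CPU 1000000UL          /* 1 MHz, e.g. the stock internal RC clock */
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= (1 << PB0);          /* hypothetical LED pin set as output */
    for (;;)
    {
        PORTB ^= (1 << PB0);     /* toggle the LED */
        _delay_ms(500);          /* constant argument -> folded at compile time */
    }
    return 0;                    /* never reached */
}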

allbits
06-10-2011, 01:23 PM
There are a lot of conditions under which software delays work without issues. A software delay works well only when it runs alone.

If they were really that accurate, why do people rely on timers?

A software delay depends a lot on the compiler and the program logic. There are lots and lots of cases where a software delay fails; as a person who knows programming, you should be very much aware of that.

And yes, the OP did not even mention the processor or the compiler, nor whether he was using assembly or C!

So you cannot blindly say that a software delay works without stating the conditions. That will mislead beginners.

avinash_g
06-10-2011, 01:42 PM
Please explain at least five of those cases.

If they were really that accurate, why do people rely on timers?
Software delays are avoided NOT because they are inaccurate but because they block the CPU. Timers free the CPU for other tasks.

Ask me to generate any delay between 1 ms and 1 year and I will do it in software with 99.99 percent accuracy.

Compiler developers are not fools!

In commercial compilers like HI-TECH C for PIC, the delay functions are handled in a different way. Those areas of the compiler are hand crafted, and they generate 100% accurate software delays.

Here is how they work: when the compiler finds a _delay() call, the code generator emits exactly the number of CPU cycles that was passed to _delay(), so it is 100.000000% accurate.



Synopsis
#include <htc.h>
void _delay(unsigned long cycles);
Description
This is an inline function that is expanded by the code generator. When called, this routine expands
to an inline assembly delay sequence. The sequence will consist of code that delays for the number
of cycles that is specified as argument. The argument must be a literal constant.

The above is from the HI-TECH C user manual, p. 207.
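
Based only on the synopsis quoted above (so treat the exact usage as an assumption), a call might look like this; the argument must be a literal constant number of instruction cycles:

#include <htc.h>

void pause_1ms(void)
{
    /* 1000 instruction cycles = 1 ms at a 1 MHz instruction clock
       (i.e. a 4 MHz oscillator on a typical mid-range PIC). */
    _delay(1000);
}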

The compiler makers are NOT fools, which is why avr-gcc issues a warning when optimizations are disabled.

This lets the user know in advance that _delay_ms() will not function correctly.

What do you say now?

Quicksilver
06-10-2011, 06:11 PM
Hey Avinash! Thanks a lot. Your website is terrific. Thanks a ton! Is F_CPU 1 MHz by default for an ATmega8L?

allbits
06-13-2011, 09:07 AM
Software delays are avoided NOT because they are inaccurate but because they block the CPU. Timers free the CPU for other tasks.

Well, that explains everything! It answers almost all of your queries.

Whatever cycles the compiler puts in, the moment you have an event or a time lapse, especially in the case of a stopwatch, you miss it because the processor is busy executing the delay code! It's as simple as that. That's why people use a timer! It is possible to write a program with software delays, but it depends a lot on the clock frequency and the calculations, and it is sometimes inaccurate when you use a non-standard frequency, whatever compiler you use, because there is always truncation and rounding off on 8-bit controllers.

Remember: however the compiler lays the code out, the final behaviour depends on the program logic. By using timers with interrupts, you remove a lot of the timing uncertainty, especially in the case of a stopwatch. Don't forget the time required for an LCD write, by which point your millisecond has long gone. (This is just one example of how you can miss events; there are plenty of other routines that make you lose time!) That is the situation with your stopwatch built on software delays with interrupts disabled. You can keep on arguing, but that will not help the OP, who still has no clue what he is doing.
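
To make that concrete, here is a minimal sketch of the interrupt-driven pattern (my own illustration, reusing the 1 ms Timer1 CTC tick from earlier in the thread on an ATmega8 at 1 MHz, with lcd_show_time() as a stand-in stub for a real, slow LCD routine): the ISR keeps counting no matter how long the display update takes.

#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

volatile uint32_t ms_elapsed = 0;

ISR(TIMER1_COMPA_vect)
{
    ms_elapsed++;                        /* timekeeping never pauses */
}

static void lcd_show_time(uint32_t ms)
{
    (void)ms;                            /* stub: a real LCD write may take several ms */
}

int main(void)
{
    TCCR1A = 0;
    OCR1A  = 999;                        /* 1000 cycles = 1 ms at 1 MHz */
    TCCR1B = (1 << WGM12) | (1 << CS10); /* CTC mode, prescaler = 1     */
    TIMSK |= (1 << OCIE1A);
    sei();

    for (;;)
    {
        uint32_t now;
        cli();                           /* atomic copy of the 32-bit counter */
        now = ms_elapsed;
        sei();
        lcd_show_time(now);              /* however slow this is, no ticks are lost */
    }
    return 0;                            /* never reached */
}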

@OP: even after all this discussion, you did not mention what processor or compiler, or even which controller you use! So what you are missing is a better, more efficient, and accurate way to write your code. And people may be less inclined to help you on a later post.

mcufreak
06-21-2011, 11:21 PM
For designing a STOPWATCH, software delays are a poor fit.
If we consider a commercial stopwatch, like the one in our wristwatches, the timer keeps running in the background even while the 'current time' is displayed.
@Quicksilver: you should clearly understand the point allbits is making.
If you design a multifunction stopwatch, then you have to consider the 'switch debouncing delay' too. In my experience, for most locally available pushbuttons it is as high as 25 ms! (So where is your stopwatch?)
Software delays can be accurate, but why should we tie up the processor just making delays? Moreover, there can still be accuracy issues with software delays. Delays made with timers are accurate regardless, so what is the point of engaging the processor to make delays that may not be?
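
Picking up the 25 ms debounce point, a minimal sketch (my own illustration, assuming the 1 ms ms_elapsed tick from the earlier timer sketch and a hypothetical active-low start/stop button on PD2 with the internal pull-up already enabled) that samples the button only every 25 ms so contact bounce is ignored:

#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

extern volatile uint32_t ms_elapsed;          /* 1 ms tick from the timer ISR sketch */

#define DEBOUNCE_MS 25                        /* longer than the worst contact bounce */

/* Call from the main loop; returns 1 exactly once per clean button press. */
uint8_t button_pressed(void)
{
    static uint32_t last_sample = 0;
    static uint8_t  last_level  = 1;          /* released (pull-up reads high) */
    uint8_t pressed = 0;
    uint32_t now;

    cli();                                    /* atomic copy of the 32-bit tick */
    now = ms_elapsed;
    sei();

    if (now - last_sample >= DEBOUNCE_MS)     /* sample only every 25 ms */
    {
        uint8_t level = (PIND >> PD2) & 1;
        if (level == 0 && last_level == 1)    /* high -> low edge = new press */
            pressed = 1;
        last_level  = level;
        last_sample = now;
    }
    return pressed;
}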