SCO Unix, ALR FlexCACHE losing time

Geoffrey S. Mendelson gsm at gsm001.uucp
Thu Jan 3 09:15:27 AEST 1991


In Message-ID: <1991Jan01.162400.6155 at litwin.com>
Dr. Victor L. Rice writes:
>What gives ??? Why am I losing over a second a day ??

This is a very common problem with the IBM PC design.  It comes from
"features" of the original IBM PC that have been carried over ever since.

Time loss may be caused by three different things, but I think that yours
is the second or third:

1:  The battery backed up clock loses (or gains) time.
    This is caused by the clock chip running "off" frequency.  The problem
    may or may not be fixable by your motherboard vendor (in this case
    ALR); they probably do not warranty clock accuracy.

    This problem usually shows up on systems that are powered off most of the
    time.

    If it is really far off, losing hours a day, or dead entirely, replace
    the battery.

    A friend of mine had an early Tandy 3000 (Mitsubishi motherboard) that
    lost 11 seconds a day.  Tandy fixed it by replacing the motherboard.
    The new one lost 12 seconds a day.  On the third try they said it was
    within specs.

    I know of no published specs for clock accuracy. 
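
    You can at least measure the drift yourself: note the battery clock
    against a trusted reference (WWV, a good wristwatch) a few days apart
    and divide.  A minimal C sketch of that arithmetic; the two elapsed
    times are made-up readings, not from any real machine:

        /* drift.c - estimate battery-clock drift in seconds per day.
         * The readings below are hypothetical, not measured.
         */
        #include <stdio.h>

        int main(void)
        {
            double ref_elapsed = 5.0 * 86400.0;        /* 5 days by the reference */
            double clk_elapsed = 5.0 * 86400.0 - 55.0; /* battery clock 55s slow  */

            double days  = ref_elapsed / 86400.0;
            double drift = (clk_elapsed - ref_elapsed) / days;  /* signed s/day */

            printf("drift: %+.2f seconds/day (%+.1f ppm)\n",
                   drift, drift / 86400.0 * 1e6);
            return 0;
        }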

2:  When UNIX (and MS-DOS) boot, they read the battery backed up clock
    once.  From then on they keep time by counting "clock tick" interrupts.
    Most device drivers, especially disk drivers, turn off interrupts
    while they are running.

    The clock will be "off" by 1/60 of a second until the tick interrupt
    gets processed.  Since disk drivers don't want to be interrupted
    during transfers, they mask interrupts; if for some reason they stay
    busy for more than 1/60 of a second, you lose every tick after the
    first.

    If you have a tape or SCSI driver that is hit very hard, you may see
    this.  A serial card driver may also block interrupts, but is not
    likely to do so for that long; see the sketch below.
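
    As a rough sketch of how masked interrupts become lost wall-clock
    time, here is the arithmetic in C.  The 60 Hz tick rate matches the
    discussion above; the mask duration and stretches-per-day figures are
    pure assumptions, picked only to show the shape of the calculation:

        /* lostticks.c - ticks dropped while interrupts are masked.
         * All workload numbers here are assumptions, not measurements.
         */
        #include <stdio.h>

        int main(void)
        {
            double hz        = 60.0;   /* ticks per second (1 tick = 1/60 s)       */
            double mask_time = 0.020;  /* seconds interrupts stay off, per stretch */
            double stretches = 300.0;  /* masked stretches per day (assumed)       */

            /* only one pending tick survives a masked stretch;
               every tick after the first is simply lost */
            double lost = mask_time * hz - 1.0;
            if (lost < 0.0)
                lost = 0.0;

            printf("lost %.0f ticks/day = %.2f seconds/day\n",
                   lost * stretches, lost * stretches / hz);
            return 0;
        }

    With these particular guesses the loss works out to about one second
    a day, the same order as the problem reported above.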
      
3:  Of course, the clock tick generator on your motherboard may be off;
    it is usually driven by a crystal originally used for vertical sync.
    In fact, it's supposed to be off: NTSC video actually refreshes at
    59.94 Hz, not sixty hertz.  That number is a sub-multiple of the
    color carrier frequency, which is 3.579545 MHz.
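
    For the curious, here is the derivation of those numbers written out
    in C.  The frequencies are the standard published NTSC and PC values:

        /* ntsc.c - where the PC's odd clock frequencies come from */
        #include <stdio.h>

        int main(void)
        {
            double subcarrier = 315e6 / 88.0;        /* color subcarrier, 3.579545 MHz */
            double crystal    = 4.0 * subcarrier;    /* PC crystal, 14.31818 MHz       */
            double timer_in   = crystal / 12.0;      /* 8253 timer input, 1.19318 MHz  */
            double line_rate  = subcarrier / 227.5;  /* horizontal, 15734.27 Hz        */
            double field_rate = line_rate / 262.5;   /* vertical, 59.94 Hz             */

            printf("color subcarrier = %.6f MHz\n", subcarrier / 1e6);
            printf("PC crystal       = %.5f MHz\n", crystal / 1e6);
            printf("timer input      = %.5f MHz\n", timer_in / 1e6);
            printf("vertical refresh = %.2f Hz\n", field_rate);
            return 0;
        }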

Also a note on accuracy:
 
    1%   would be 864  seconds a day, or 14 minutes 24 seconds
    .1%  would be 86.4 seconds a day, or  1 minute  26 seconds
    .01% would be 8.64 seconds a day
    1 second a day is 1 part in 86,400, or almost 1 part in one
    hundred thousand.

How many scientific instruments can boast that accuracy?
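
If you want to check the table above, here are the same figures
recomputed in a few lines of C:

    /* accuracy.c - clock error percentages as seconds per day */
    #include <stdio.h>

    int main(void)
    {
        double day   = 86400.0;            /* seconds in a day */
        double pct[] = { 1.0, 0.1, 0.01 };
        int i;

        for (i = 0; i < 3; i++)
            printf("%5.2f%% error = %6.2f seconds/day\n",
                   pct[i], day * pct[i] / 100.0);

        printf("1 second/day = 1 part in %.0f\n", day);
        return 0;
    }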
     
-- 
Geoffrey S. Mendelson
(215) 242-8712
uunet!gsm001!gsm


