Computing
In computing, a jiffy was originally the time between two ticks of the system timer interrupt.[7] It is not an absolute unit of time, since its duration depends on the clock interrupt frequency of the particular hardware platform.[8][dubious – discuss]
Many older game consoles (which used a television as the display device) commonly synchronized the system interrupt timer with the vertical refresh frequency of the local television standard: either 59.94 Hz (≈16.7 ms) for NTSC systems, or 50.0 Hz (20 ms) for most PAL systems.[citation needed]
1980s 8-bit Commodore computers, such as the PET, VIC-20 and C64, had a jiffy of 1/60 second, which was not dependent on the mains AC or video vertical refresh rate.[9] A hardware timer in the computer generated the 60 Hz rate, triggering an interrupt service routine every 1/60 second that incremented a 24-bit jiffy counter, scanned the keyboard, and handled other housekeeping.[10]
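As a rough illustration of how such a counter works, the following C sketch models a 60 Hz interrupt service routine incrementing a 24-bit jiffy count; it is not the original 6502 KERNAL routine, and all names here are hypothetical.

```c
#include <stdint.h>

/* Illustrative model of a jiffy-counting interrupt handler, assuming a
 * hardware timer invokes this routine 60 times per second. Not the
 * original Commodore KERNAL code; names are for illustration only. */
static uint32_t jiffy_counter = 0;      /* only the low 24 bits are kept */

void timer_interrupt_handler(void)
{
    /* Increment the jiffy count, wrapping at 24 bits (16,777,216 ticks,
     * roughly 77.7 hours at 60 Hz). */
    jiffy_counter = (jiffy_counter + 1) & 0xFFFFFF;

    /* scan_keyboard();  ...other periodic housekeeping would follow here */
}

/* Convert the current jiffy count to whole seconds since the counter
 * last wrapped or was reset. */
uint32_t jiffies_to_seconds(void)
{
    return jiffy_counter / 60;
}
```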
Jiffy values for various Linux versions and platforms have typically varied between about 1 ms and 10 ms, with the Jargon File reporting 10 ms (1/100 s) as an increasingly common standard.[11]
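The following standalone C sketch shows the arithmetic relating jiffies to milliseconds under a fixed tick rate; the HZ value of 100 is an assumed configuration, and the helper names mirror (but do not reproduce) the conversion functions the Linux kernel provides internally.

```c
#include <stdio.h>

/* The kernel's tick rate is a compile-time constant; 100 (a 10 ms jiffy)
 * and 1000 (a 1 ms jiffy) are common choices. HZ = 100 is assumed here. */
#define HZ 100

static unsigned long jiffies_to_msecs(unsigned long j)
{
    return j * 1000UL / HZ;              /* each jiffy lasts 1000 / HZ milliseconds */
}

static unsigned long msecs_to_jiffies(unsigned long ms)
{
    return (ms * HZ + 999UL) / 1000UL;   /* round up so the delay is at least ms */
}

int main(void)
{
    printf("250 jiffies at HZ=%d is %lu ms\n", HZ, jiffies_to_msecs(250));
    printf("a 30 ms delay needs %lu jiffies\n", msecs_to_jiffies(30));
    return 0;
}
```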
Stratus VOS (Virtual Operating System) uses a jiffy of 1/65,536 second to express date and time (number of jiffies elapsed since 1 January 1980 00:00 Greenwich Mean Time). Stratus also defines the microjiffy, being 1/65,536 of a regular jiffy.[12]
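A minimal sketch of the arithmetic behind such a timestamp follows, assuming a 64-bit count of 1/65,536-second jiffies measured from 1 January 1980; the constants and function names are illustrative assumptions, not the actual VOS API. Because 65,536 is 2^16, the whole-second part of the time occupies the upper bits and the fractional jiffies occupy the low 16 bits.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical conversion from Unix time to a VOS-style jiffy count. */
#define JIFFIES_PER_SECOND 65536ULL

/* Seconds between the Unix epoch (1970-01-01) and 1980-01-01, both UTC:
 * 3,652 days (two leap days, 1972 and 1976) * 86,400 s/day. */
#define SECONDS_1970_TO_1980 315532800ULL

uint64_t unix_seconds_to_vos_jiffies(uint64_t unix_seconds)
{
    return (unix_seconds - SECONDS_1970_TO_1980) * JIFFIES_PER_SECOND;
}

int main(void)
{
    /* 2000-01-01 00:00:00 UTC is 946,684,800 in Unix time. */
    printf("%llu\n",
           (unsigned long long)unix_seconds_to_vos_jiffies(946684800ULL));
    return 0;
}
```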
The term jiffy is sometimes used in computer animation to define the playback rate, with the delay between individual frames specified in jiffies of 1/100 second (10 ms), particularly in Autodesk Animator .FLI sequences (one global frame rate setting) and animated CompuServe .GIF images (each frame having an individually defined display time measured in 1/100 s).[citation needed]
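The sketch below shows the frame-delay arithmetic for formats that store per-frame timing in 1/100 s units, as the GIF Graphics Control Extension does; the rounding policy and function name are assumptions for illustration.

```c
#include <stdio.h>

/* Convert a desired frame rate to a per-frame delay in hundredth-of-a-second
 * jiffies. Because the delay is an integer, many frame rates can only be
 * approximated. */
static unsigned int fps_to_centisecond_delay(double fps)
{
    unsigned int delay = (unsigned int)(100.0 / fps + 0.5);  /* round to nearest jiffy */
    return delay ? delay : 1;   /* a zero delay is often treated specially by viewers */
}

int main(void)
{
    /* 30 fps cannot be represented exactly: 100/30 rounds to 3 jiffies,
     * which plays back at about 33.3 frames per second. */
    printf("30 fps -> %u jiffies per frame\n", fps_to_centisecond_delay(30.0));
    printf("25 fps -> %u jiffies per frame\n", fps_to_centisecond_delay(25.0));
    return 0;
}
```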