How do computers have a sense of time?

with clocks of course!

strap on your boots for this one

computers, in the context of this question, can be understood as networks of wires connecting a handful of simple components (capacitors and resistors, maybe an inductor here & there). electricity flows through these wires, but that flow isn’t exactly like the flow of water through a pipe. the drift velocity of the electrons is actually very slow and fluctuates wildly with temperature (which is simultaneously being generated by the electricity moving through a resistive substrate), which means you’d have to model drift velocity as an incredibly complicated differential equation that - yadda yadda yadda, none of this matters. the take-away of this paragraph is that you can’t measure time reliably with simple electrical components alone

so how do computers keep track of time? a 2.3GHz CPU, which is around the frequency of the CPU in the computer you’re using to view this post, has a clock in it that ‘ticks’ 2.3 billion times a second with extraordinary accuracy. that is quite a feat!
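
to put a number on how short those ticks are (using the 2.3GHz figure above), one tick is just the reciprocal of the frequency:

```python
# how long is one tick of a 2.3 GHz clock? period = 1 / frequency
frequency_hz = 2.3e9                    # 2.3 billion ticks per second
period_ns = 1 / frequency_hz * 1e9      # convert seconds to nanoseconds
print(f"one tick lasts about {period_ns:.2f} nanoseconds")   # ~0.43 ns
```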

how is this accomplished? with crystals. ‘crystal’ here refers to any compound whose atoms are arranged in an extremely precise and repeating geometric lattice. look:

[image: the atomic lattice of table salt]


this is what table salt looks like at the atomic scale. it’s made up of two kinds of atoms in an alternating pattern, creating a very consistent & simple shape. this pattern repeats, with almost zero divergence, to form crystals your greasy ape eyeballs are actually able to see. it makes for an extremely reliable & calculable substrate that occurs naturally

now

these crystals exhibit something called the piezoelectric effect. when you apply mechanical stress to (pretty much) anything, you elicit a momentary (or permanent) change in shape. in a piezoelectric material, that change in shape shifts around once-still geometries of atoms and disrupts the incredibly delicate arrangement of electrons surrounding them. a disruption of a once-stable & homogeneous body of electrons results in a momentarily unstable & uneven distribution of electrons. consider this a dipole: any system where there are more negatively-charged particles in one region than another, creating a positive and a negative pole. just like the compacted air molecules in a balloon “want” to explode outwards because of pressure, the electrons in a dipole “want” to move from an area of dense negative charge to an area of less dense negative charge. “less dense negative charge” is a fancy way of saying “positive charge”, by the way

this concept is known as a voltage differential, and it’s the core principle describing how “electricity” can “flow”

so

to summarize: when you squeeze a piezoelectric material in your hand, you are deforming its atomic structure, which creates a dipole & a voltage differential. conversely, when you apply a voltage to a piezoelectric medium, you get a momentary deformation of its atomic structure, which then causes another, weaker voltage differential, which causes another, weaker deformation, and so on and so forth

if you kick off this type of reaction with a single pulse of electricity, your piezoelectric medium just switches (or oscillates) between shape deformation & voltage generation over and over, with less and less vigor each time, until it finally settles back into a steady state:

[image: a damped oscillation ringing down to nothing]
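
if you’d rather see that ring-down as numbers than as a picture, here’s a toy sketch (the frequency & decay rate are made-up illustration values, nothing like real quartz):

```python
import math

# toy model of the ring-down: amplitude decays exponentially while the oscillation continues
freq_hz = 5.0         # made-up oscillation frequency
decay_per_s = 1.0     # made-up damping rate (bigger = dies out faster)

def damped(t):
    return math.exp(-decay_per_s * t) * math.sin(2 * math.pi * freq_hz * t)

# sample the wave at successive crests to watch the swings shrink toward zero
for t in (0.05, 0.25, 0.45, 1.05, 2.05, 4.05):
    print(f"t={t:4.2f}s  height={damped(t):+.3f}")
```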

however

if you measure the voltage produced by the piezoelectric medium, wire it to a powered amplifier, and feed the amplified signal back into the medium, you arrive at a situation wherein the medium oscillates harder and harder (the swings get bigger, not faster) thanks to the amplified feedback, thereby creating a stronger voltage, which goes through the amp and back to the medium to create an even stronger voltage, and so on and so forth. the waveform doesn’t just taper off and die like in the image above

eventually, you reach an equilibrium where the voltage fed back into the medium no longer creates a stronger oscillation & a stronger output voltage. the voltage going out of the medium into the amplifier stops changing, so the output of the amplifier doesn’t change either

you have arrived at a steady state

the output waveform is no longer tapering off and dying. it remains constant so long as the amplifier is powered. it’s also oscillating extremely quickly, smoothed & damped by the capacitance (the “elasticity”) of the amplifier. so your output waveform looks like this:

[image: a clean, constant sine wave]

instead of this:

[image: the damped wave from before, tapering off and dying]

cool, ok

this clean-looking waveform two images above is called a sine wave. it’s the most fundamental waveform there is: its frequency and amplitude stay perfectly constant and even. most piezoelectric compounds found in nature are impure and uneven and extremely hesitant to provide such a waveform, even with the amplifier’s “elasticity”, except for one category of compounds:

crystals

the extremely consistent and even atomic structure of crystals (quartz, in practice) provides a correspondingly consistent & predictable piezoelectric effect, which can be exploited to produce nice smooth sine waves like the one above

this is nice, but even an analog sinusoidal waveform at an almost-perfect, stable frequency is not enough for a computer to be able to tell the time

the critical property of a sine wave is that it rises from zero, climbing more and more slowly as it approaches its positive peak. for an infinitely short point in time the waveform is flat (right at the peak), then it falls back down through zero to the negative peak, where the cycle mirrors itself. rising & falling, over and over again.

computers only understand ‘1’s and ‘0’s, though, which is way fewer states than a smoothly-varying sine wave represents. so we build a special type of transistor (light switch) that follows the waveform very precisely and toggles its switch only once the analog waveform stops heading in the direction it was heading and starts going in the opposite direction. that happens at each peak & valley of the wave:

[image: a sine wave with its crests & troughs labeled]

in the image above, just replace “crest” with “peak” & “trough” with “valley”. 

now we have a transistor (light switch) that toggles to one of its two possible states (on or off) at a very even frequency. with a 1MHz sine wave (the wave goes up and down a million times a second), you now have a transistor switching on & off a million times a second, or once every microsecond
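
here’s a rough sketch of that peak-and-valley trick in python, with a sampled sine wave standing in for the analog signal (the sample rate is a made-up illustration value; real hardware does this continuously in analog, not with samples):

```python
import math

# pretend analog input: a 1 MHz sine wave, sampled much faster than it oscillates
freq_hz = 1_000_000
sample_rate_hz = 64_000_000
samples = [math.sin(2 * math.pi * freq_hz * n / sample_rate_hz)
           for n in range(200)]

state = False          # our "light switch", currently off
toggles = 0
prev_slope = None
for a, b in zip(samples, samples[1:]):
    slope = b - a                                   # which way is the wave heading?
    if prev_slope is not None and (slope > 0) != (prev_slope > 0):
        state = not state                           # direction flipped: we just passed a peak or valley
        toggles += 1
    prev_slope = slope

print(f"toggled {toggles} times in {len(samples) / sample_rate_hz * 1e6:.2f} microseconds")
```

six toggles over roughly three microseconds works out to one full on-off cycle per microsecond, which is the million-times-a-second switching described above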

cool! the “switch” being either on or off (with nothing in between) is where the concept of binary states comes from: a switch in the on position is analogous to a ‘1’ & off is analogous to a ‘0’.

so now that we have a digital signal that toggles at an extremely accurate rate because of shit happening in real life with magic crystals, we’re done, right?

no

this consistently-alternating pattern of ‘1’s and ‘0’s is used to create a corresponding waveform, one that stays at a high voltage for an even, predictable period of time and then drops to zero, seemingly instantly, once that period is over. and then back to high voltage, then low, and so on. it looks like this:

[image: a square wave]

this is called a square wave or, more contextually specific, a clock signal. it’s amplified and distributed to many hundreds of components in your computer through strong, low-impedance drivers, meaning that even though it’s feeding a bunch of shit that’s sucking up the strength of the signal, the signal doesn’t deform or fade. this alternating “on-off” signal is what drives pretty much every digital component

such as a simple component that counts each on-off cycle. this is crucial. since you now have a signal that undergoes a quantifiable change at a very constant rate (which you know), and you have a device that can count those changes, you can use the total number of changes to describe the amount of time that has elapsed in the real world. for example, with a 1MHz clock (ticking a million times per second, or once every microsecond) and a counter capable of holding any number from 0 to 1 million, you can now keep track of exactly how far into the current second you are
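
the arithmetic there is exactly as simple as it sounds. here’s the 1MHz example worked out in python (the 750,000-tick reading is just a made-up counter value):

```python
# elapsed time = ticks counted / ticks per second
tick_rate_hz = 1_000_000          # the 1 MHz example clock from above
ticks_counted = 750_000           # made-up counter reading
elapsed_seconds = ticks_counted / tick_rate_hz
print(f"{ticks_counted:,} ticks at {tick_rate_hz:,} Hz = {elapsed_seconds} seconds")   # 0.75 seconds
```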

we’re still not done, because you can’t yet keep track of amounts of time above a second, only your position within the current second. so what do you do? add another counter, of course. it counts the number of times your zero-to-one-million counter has rolled over from 1 million back to zero, effectively counting the seconds.
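
here’s a toy sketch of that two-counter cascade in python: a fast counter that rolls over once per second, feeding a slow counter that tallies the rollovers (pure illustration, not how any particular chip actually wires it up):

```python
class CascadedCounter:
    """fast counter counts clock ticks; every rollover bumps the slow (seconds) counter."""

    def __init__(self, ticks_per_second=1_000_000):
        self.ticks_per_second = ticks_per_second
        self.microticks = 0   # 0 .. ticks_per_second - 1, position inside the current second
        self.seconds = 0      # how many times the fast counter has rolled over

    def tick(self):
        self.microticks += 1
        if self.microticks == self.ticks_per_second:
            self.microticks = 0     # roll over...
            self.seconds += 1       # ...and count that rollover as one elapsed second

# drive it with 2.5 million ticks of our pretend 1 MHz clock
clock = CascadedCounter()
for _ in range(2_500_000):
    clock.tick()
print(clock.seconds, clock.microticks)   # 2 seconds, plus 500000 ticks into the third
```

real hardware does this with a handful of flip-flops rather than a python class, obviously, but the rollover logic is the same idea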

in reality, you have a chain of counters that translates from billions-of-ticks-per-second to perhaps millions-of-ticks-per-second, then from millions-of-ticks-per-second to thousands-of-yadda yadda yadda
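
the exact ratios vary by machine, but the shape of that chain looks something like this (the divide-by numbers below are made up purely for illustration):

```python
# made-up divider chain: each counter emits one tick for every N ticks it receives
freq_hz = 2_300_000_000            # start at billions of ticks per second
for divide_by in (2_300, 1_000, 1_000):
    freq_hz //= divide_by
    print(f"divide by {divide_by:>5} -> {freq_hz:>13,} ticks per second")
# billions -> millions -> thousands -> one tick per second
```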

these hardware counters are super expensive though, so you don’t want to keep adding counters to describe longer and longer timespans. the computer i’m typing this long-winded answer on is capable of keeping track of time until the end of the god damn universe and a trillion-billion years after that. how?

eventually, as your counter chain steps down from extremely fast ticks to slower and slower ticks, you reach a threshold where the circuitry driven by the fastest clocks is easily able to keep up with the slower, longer counts, and you cross a boundary that my entire university education is premised upon:

the hardware-software interface

your ‘1’s and ‘0’s are no longer stored as inaccessible charges inside your counters. they’re stored in memory, of which you have a lot. in fact, the amount of memory/bits/‘1’s-and-‘0’s taken up by the following group of characters:

12345678

is exactly equal to the size of memory used to keep track of the number of seconds between the beginning of time (january 1st 1970, a story for another time) and now. it has enough space to keep on tracking the seconds until the end of the universe (and long after that too)
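
you can check that size claim yourself: eight characters of text is eight bytes, i.e. 64 bits, and a signed 64-bit count of seconds runs absurdly far into the future (the math below is just that arithmetic, assuming the 64-bit seconds counter described above):

```python
# "12345678" is eight ascii characters -> eight bytes -> 64 bits,
# the same size as the seconds-since-1970 counter on a modern system
stamp_size_bytes = len("12345678".encode("ascii"))
print(stamp_size_bytes, "bytes =", stamp_size_bytes * 8, "bits")   # 8 bytes = 64 bits

# a signed 64-bit count of seconds since january 1st 1970 tops out at 2**63 - 1
max_seconds = 2**63 - 1
seconds_per_year = 60 * 60 * 24 * 365.25
print(f"it rolls over after roughly {max_seconds / seconds_per_year:.2e} years")  # ~2.9e11 years
```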

once these time intervals become available in memory, you can do very easy manipulations & computations based on them, and therefore keep track of time over huge timescales without the cost of adding more hardware counters. again, this works because the stuff doing those calculations & manipulations runs much faster than the relatively-long increments of time it’s tracking. those calculations complete before another relatively-long increment has passed, so the internal circuitry is always a step ahead of what’s happening in the software realm. realistically, this means that while my computer operates physically on the timescale of billions of ticks per second, software i write can at most only deal with timescales of millions or tens of millions of ticks per second
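
you can poke at that boundary from the software side. python’s standard library, for instance, will report the resolution of the clocks it exposes, which gives a feel for the granularity software actually gets to work with (the exact numbers depend on your machine & OS):

```python
import time

# ask the OS-facing clocks how fine-grained they actually are
for name in ("time", "monotonic", "perf_counter"):
    info = time.get_clock_info(name)
    print(f"{name:12s} resolution ~ {info.resolution} seconds")

# seconds since january 1st 1970: the number that 64-bit counter is tracking
print(int(time.time()))
```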

you can take your boots off now