Modern computers were invented around World War II. According to Walter Isaacson’s book The Innovators, these machines have four main characteristics:
- Digital – using discrete numbers rather than continuous parameters such as voltage levels
- Binary – the underlying number scheme is based only on zeros and ones, rather than decimal digits or some other choice. It turns out this matches well with electronic methods of computation.
- Electronic – Well of course. But, at the time, many computing machines used electromechanical relays or combinations of different types of devices. The invention of the transistor settled that one.
- General Purpose – Alan Turing provided the theoretical background here. Yes, this is the same genius portrayed breaking the German Enigma codes in the movie The Imitation Game, and also the inventor of the Turing test for determining whether a machine exhibits intelligence, as portrayed in the movie Ex Machina and others. But the theoretical Turing machine is perhaps his most enduring legacy (movies aside). Earlier work had shown that any system of logic has problems it cannot solve, and that there is no way to determine in advance which ones are unsolvable. Turing was able to prove that his conceptual machine could solve all the rest, every problem that is computable at all. The machines we use today are logically equivalent to his Turing machine, and hence “general purpose” in a very strong sense of the term.
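A Turing machine is simple enough that its essence fits in a few lines of code. The sketch below is purely illustrative (the function names, rule encoding, and the bit-flipping example machine are invented for this post, not any historical formulation): a tape of symbols, a read/write head, a current state, and a table of transition rules are all it takes.

```python
# A minimal Turing machine simulator. All names and encodings here are
# illustrative choices for this sketch, not standard notation.

def run_turing_machine(tape, rules, state="start", halt="halt"):
    """Run transition rules until the machine reaches the halt state.

    rules maps (state, symbol) -> (symbol_to_write, head_move, new_state),
    where head_move is -1 (left), 0 (stay), or +1 (right).
    """
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    while state != halt:
        symbol = tape.get(head, "0")                 # blank cells read as "0"
        write, move, state = rules[(state, symbol)]  # look up the rule
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape))

# A one-state machine that inverts every bit of its input, then halts
# when it reaches the end-of-input marker "_".
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

print(run_turing_machine("1011_", flip))  # -> 0100_
```

The remarkable result is that this austere model, given enough tape and the right rule table, can carry out any computation at all; everything else in modern hardware is a matter of speed and convenience.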
One other key figure in the early development of computers was John von Neumann. Among his many contributions was the realization that both data and computer programs could be represented as binary strings and thus could be stored in the same memory. Again this was not obvious at the time, but today’s machines are almost universally built on the “von Neumann architecture” of shared storage.
These basics have remained the foundation of computers for decades. Of course there have been dramatic changes, and not only as a result of Moore’s Law, which describes the exponential growth in the number of transistors on a chip, and with it computational power, over time. First, computer scientists figured out how to get single computers to run multiple programs “simultaneously”, making them more efficient and also allowing multiple users on a single machine. The von Neumann concept of a single storage component became increasingly specialized as memory chips evolved, with hierarchies of faster, more expensive storage near the computational engines and slower, cheaper memory further away. More recently we have seen the change from a single computing core to parallel processing with multiple cores (most new machines have 4 or more) along with specialized graphics engines and other auxiliary computational components. Each change in technology has also led to changes in the operating systems, the specialized software that controls how the basic computer functions are carried out.
Interesting history, but why is this a topic for cybersecurity awareness (apart from the obvious fact that hackers want to go where the data lives)? Basically, each increase in complexity is another opportunity, another potential vulnerability for someone to exploit.
Take the von Neumann architecture. Computers keep track of what instruction to execute next, and where to find the data for that instruction, through the use of pointers. They let the computer know to go to address “abcd” for the next instruction and address “wxyz” for the next piece of data. But if someone finds a way to load a set of nefarious instructions that look like data and then change the pointer, the computer doesn’t know the difference…you’re pwned. Or, since we have multiple users on a single machine, if someone gains access as a normal user and then escalates their privileges so the machine believes they are an administrator…poof, they’re an administrator. Even newer memory types, such as the Non-Volatile RAM that retains data when the power goes off, open up a whole new range of in-memory persistent hacks.
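To make the pointer trick concrete, here is a toy “von Neumann” machine in Python. This is a deliberately simplified sketch with an invented four-instruction set, not a model of any real processor, but it shows the essential hazard: program and data live in the same memory, so a buggy write aimed at a data cell can just as easily rewrite an instruction, and a corrupted jump target sends the machine straight into bytes the attacker supplied “as data”.

```python
# A toy von Neumann machine: instructions and data share one memory array.
# Invented instruction set (each instruction is an opcode/operand pair):
#   1 addr -> LOAD the value at addr into the accumulator
#   2 addr -> STORE the accumulator to addr
#   3 addr -> JUMP: set the instruction pointer to addr
#   0 _    -> HALT

def run(memory):
    acc, ip, steps = 0, 0, 0
    while steps < 100:                 # guard against runaway loops
        op, arg = memory[ip], memory[ip + 1]
        ip += 2
        if op == 0:                    # HALT
            break
        elif op == 1:                  # LOAD
            acc = memory[arg]
        elif op == 2:                  # STORE
            memory[arg] = acc
        elif op == 3:                  # JUMP
            ip = arg
        steps += 1
    return acc

def build_memory(user_value, buffer):
    mem = [0] * 20
    mem[0:6] = [1, 12,   # LOAD the user-supplied value from address 12
                2, 5,    # STORE it -- bug: this writes to address 5,
                         # which is the operand of the JUMP below
                3, 10]   # JUMP to the intended exit routine at address 10
    # Address 10 holds 0 (HALT): the intended end of the program.
    mem[12] = user_value                 # "input" cell the user controls
    mem[14:14 + len(buffer)] = buffer    # a "data" buffer the user fills
    mem[18] = 777                        # a secret the program never exposes
    return mem

# Normal use: an input of 10 leaves the jump pointing at the real exit.
print(run(build_memory(10, [])))             # returns 10

# Attack: the "data" buffer holds instructions (LOAD 18; HALT), and the
# input value 14 redirects the corrupted jump into that buffer.
print(run(build_memory(14, [1, 18, 0, 0])))  # returns the secret, 777
```

Real exploits such as stack buffer overflows follow the same shape on real hardware: overwrite a stored pointer, and execution lands wherever the attacker chose. Defenses like marking memory pages non-executable and randomizing address layouts exist precisely to break this pattern.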
This is a recurring theme in cybersecurity (and elsewhere, for that matter). The systems we use have become so complex that they are beyond the capabilities of any individual to fully understand, and they provide abundant opportunities for failure or exploitation. Even at the level of individual computers, the digital, binary, electronic string of events that takes place in the billions of transistors that make up the general purpose computational engine in today’s computers is hard to imagine…whether it is sending an e-mail, playing a game, or presenting your next great concept.
What can you do as a user, faced with this complexity? Four main things:
- Buy from reputable sources. The major choices for laptops are fairly limited, but there are many more suppliers of peripherals, and any of these devices can be a potential attack vector.
- Make sure your machine is properly configured. This is good advice both for new machines and as a periodic check on your existing machines. Microsoft, for example, created significant controversy when it shipped new Windows 10 machines this past summer with default settings and apps that some users considered violations of their privacy. As a user you can generally check configuration settings both in the operating system and in your browser and other basic programs.
- Keep it updated. Maintaining software to current patch levels is becoming easier, but is still a time-consuming nuisance. Nevertheless, this is the primary method that software manufacturers use to correct recently discovered errors, and unpatched systems continue to be a primary exploitable vulnerability.
- Periodically clean up your machine. Delete unused apps. Defragment your drives. Back up your data. Update and run your anti-malware programs. This type of maintenance will make your machine run better, and also reduce your vulnerability to a variety of attacks and unpleasantries.