COS 126 Lecture 13: Computer architecture

Overview - slide 1

This slide outlines the major levels of abstraction in computer architecture.

FETCH-INCREMENT-EXECUTE refers to the sequence of fetching an instruction, incrementing the PC (program counter), and executing the instruction. An instruction must be 'fetched' from memory before it can be executed. The encoding of the instruction will be used to control the operation of the CPU. The PC must be automatically incremented so that the next instruction can be fetched. Execution of the instruction will rely on the underlying architecture. This cycle simply continues as long as the computer runs.
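The cycle can be sketched in a few lines of Python. The instruction format and opcodes below are hypothetical (a tiny accumulator machine), not the actual machine from lecture; the point is only the shape of the loop.

```python
# A toy machine: each memory word holds an (opcode, argument) pair.
# LOAD, ADD, and HALT are made-up opcodes for illustration.
memory = {0: ("LOAD", 5), 1: ("ADD", 3), 2: ("HALT", 0)}

def run(memory):
    pc = 0           # program counter
    acc = 0          # a single accumulator register
    while True:
        op, arg = memory[pc]   # FETCH the instruction at address pc
        pc += 1                # INCREMENT the PC toward the next instruction
        if op == "LOAD":       # EXECUTE: decode the opcode and act on it
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "HALT":
            return acc

print(run(memory))   # loads 5, adds 3, halts with 8
```

Notice that the PC is incremented before the instruction executes, so by the time execution happens the PC already points at the next instruction.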


slide 2 - Building blocks

We've seen most of these building blocks except for...

Multiplexer: a multiplexer has n control lines and up to 2^n input lines in addition to a single output bit. The output bit will 'get' the value of one of the input bits. The control lines specify which input bit to select. You can think of a multiplexer as a 'selector' - it selects one of the input lines as its output. An analogy that might be helpful in understanding the multiplexer's operation is that of a switch in a train yard. The switch specifies which train track is connected to the main line. In a multiplexer, the control lines specify a particular input line with a binary encoding. If there are 3 control lines, then 000 specifies the first line, 001 specifies the second, 010 specifies the third, etc.
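The selection behavior described above can be sketched in Python (this simulates the logic; real multiplexers are of course built from gates):

```python
def mux(control_bits, inputs):
    """2^n-to-1 multiplexer: the control lines, read as a binary
    number, pick which input line drives the single output."""
    index = int("".join(str(b) for b in control_bits), 2)
    return inputs[index]

# 3 control lines select among up to 2^3 = 8 input lines.
inputs = [0, 1, 1, 0, 1, 0, 0, 1]
print(mux([0, 0, 0], inputs))   # 000 selects the 1st input line -> 0
print(mux([0, 1, 0], inputs))   # 010 selects the 3rd input line -> 1
```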

CONTROL LINES: Control lines specify things like which function to execute. They're used to specify things like the controls of multiplexers, decoders, read/write bits of memory, etc.


slide 3 - Register Transfer

This slide demonstrates how data can be moved through the computer system. It's pretty straightforward.


slide 4 - Register Selection

The implementation of a simple selector circuit is given - you can verify its validity with a truth table. A series of these circuits can be used to select entire registers.
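As a sketch of the verification idea, here is one common form of a 1-bit, two-way selector built from AND, OR, and NOT (this is an assumed circuit for illustration; the slide's exact circuit may differ). The loop checks it against its full truth table:

```python
def NOT(x): return 1 - x
def AND(x, y): return x & y
def OR(x, y): return x | y

def select(s, a, b):
    """Output a when the select bit s is 1, otherwise output b:
    out = (s AND a) OR ((NOT s) AND b)."""
    return OR(AND(s, a), AND(NOT(s), b))

# Verify against the truth table: all 8 input combinations.
for s in (0, 1):
    for a in (0, 1):
        for b in (0, 1):
            assert select(s, a, b) == (a if s == 1 else b)
```

Selecting an entire k-bit register is just k copies of this circuit in parallel, one per bit, all sharing the same select line.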


slide 5 - Basic Machine Organization

The main idea is that we can build a machine from primitive components, but this requires a well thought-out design. The picture looks quite complicated. But try to imagine how complicated it would be if the picture consisted entirely of AND, OR, and NOT gates (or transistors). Abstraction is quite useful!


slide 6 - Control

We have glossed over many important details, but you should have a pretty good idea of how a machine is built.


slide 7 - Final Step

Notice the use of decoders in this example. In exercises, we've looked at decoders that take a binary number as input and output a 1 on the corresponding line. (Example: 110 will cause the 7th line to have value 1, where the line corresponding to 000 is considered to be the 1st line.) Decoders in general often have different functions. They can be implemented to decode any defined encoding. The important thing to remember is that exactly one output line will have value 1 (the result of the function).
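The n-to-2^n decoder from the exercises can be simulated directly:

```python
def decode(bits):
    """n-to-2^n decoder: exactly one output line carries a 1, the line
    whose index matches the binary encoding of the input."""
    n = len(bits)
    index = int("".join(str(b) for b in bits), 2)
    return [1 if i == index else 0 for i in range(2 ** n)]

# 110 = 6 in binary, so the 7th line (index 6) is 1.
print(decode([1, 1, 0]))   # [0, 0, 0, 0, 0, 0, 1, 0]
```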

I think the rest of this slide is fairly self-explanatory.


slide 8 - Summary

The fundamental building block of a machine is an abstract switch. This can be implemented in many ways (vacuum tubes, transistors, mechanical switches) but it always serves the same purpose. Using these switches, we can build basic logic gates, including AND, OR, and NOT. Now, using these logic gates, we can build more complicated Boolean circuits (e.g., using the sum-of-products method). Higher-level circuits can be built up from these Boolean circuits (e.g., we can build an adder circuit using the majority and odd-parity Boolean circuits). Ultimately, a machine is (conceptually) built from these high-level circuits. In reality, the machine consists only of switches, but it would be a nightmare to try to design a computer thinking only about switches. By looking at the machine from the right level of abstraction, we gain perspective and insight.
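The adder example in the summary can be made concrete: a one-bit full adder's sum bit is the odd parity of its three inputs, and its carry-out is their majority. A quick sketch:

```python
def majority(a, b, c):
    """1 when at least two of the three input bits are 1."""
    return 1 if a + b + c >= 2 else 0

def odd_parity(a, b, c):
    """1 when an odd number of the three input bits are 1."""
    return (a + b + c) % 2

def full_adder(a, b, carry_in):
    """Return (sum_bit, carry_out) for one bit position."""
    return odd_parity(a, b, carry_in), majority(a, b, carry_in)

print(full_adder(1, 1, 0))   # (0, 1): 1 + 1 = binary 10, sum 0 carry 1
```

Chaining k of these, feeding each carry-out into the next stage's carry-in, gives a k-bit ripple-carry adder.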

If you've found this section of the course to be interesting, you might want to look into the history or some of the new technologies listed. We'll also discuss the history of computers in Lecture 26.