Newly annotated words*:
sequential storage. A term for devices where the next item accessed is the item stored at the next sequential address. Tapes work this way.
direct storage. A term for the storage mechanism utilized by disks. In fact, IBM has always referred to their disks as DASD (direct access storage devices). The notion is that the device first needs to find the starting address of the access; the locations accessed after that are the following sequential addresses, i.e., sequential locations on the track that is rotating under the disk head. I have added another buzzword, disk terminology, to help make this clear.
Disk terminology. Information is stored on disk in the following way: The disk consists of multiple rotating platters, much like an old phonograph record (ask your grandmother!). If you don't like that analogy, think pizza trays. All rotate about the same vertical axis at their center. Each platter consists of concentric tracks (think circles of different radii). Data is picked off the platters by locating a disk head close to, but not touching (ouch!), each platter. There is one head per platter, and they concurrently access one bit from each platter. Each bit is on a track, and the set of tracks at the same radius (one per platter) forms what they call a cylinder. The disk heads move as a unit; at any point in time they are concurrently accessing the n bits of an n-platter cylinder. (Hope this is clear. If not, google Wikipedia. I am sure they probably even have a picture. Hopefully you do not need a picture, since what I put on the white board was clear!)
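To make the geometry above concrete, here is a small sketch of classic cylinder/head/sector addressing. The geometry numbers are made up for illustration (not from any real disk or from these notes); the point is just that a "direct" access turns a 3-D position on the platters into a linear block address.

```python
# Hypothetical, simplified disk geometry -- numbers chosen only for illustration.
CYLINDERS = 1024        # concentric track positions (one per head position)
HEADS = 8               # one head per platter surface
SECTORS_PER_TRACK = 64  # fixed-size chunks along each track
BYTES_PER_SECTOR = 512

def chs_to_lba(cylinder, head, sector):
    """Map a (cylinder, head, sector) triple to a linear block address.
    By convention sectors are numbered starting at 1, cylinders/heads at 0."""
    return (cylinder * HEADS + head) * SECTORS_PER_TRACK + (sector - 1)

def capacity_bytes():
    """Total capacity implied by the geometry above."""
    return CYLINDERS * HEADS * SECTORS_PER_TRACK * BYTES_PER_SECTOR

print(chs_to_lba(0, 0, 1))   # first sector on the disk -> 0
print(chs_to_lba(1, 0, 1))   # first sector of the next cylinder -> 512
print(capacity_bytes())      # 268435456 bytes = 256 MB for this toy geometry
```

Notice that moving to the next head within a cylinder costs nothing mechanically (the heads move as a unit), while moving to the next cylinder requires a seek, which is why data is laid out cylinder by cylinder.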
irregular parallelism. To differentiate it from regular parallelism: regular parallelism is when the code can be executed in a very orderly parallel way that is obvious from looking at the algorithm. The simplest example I know of is taking the inner product of two vectors: one multiplies a_i times b_i for all i (and then sums the products). Very "regular." Irregular parallelism is the case where, if you draw the data flow graph of the algorithm, it is not obvious where the parallelism is. I appreciate that this definition is not precise, reflecting the notion that the concept is not precise.
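The inner-product example can be sketched in Python. The thread pool here is just to make the "regularity" explicit: every multiply is independent of the others and could run at the same time, with only the final sum (a reduction) depending on the partial results. The variable names are mine, not from the notes.

```python
from concurrent.futures import ThreadPoolExecutor

a = [1.0, 2.0, 3.0, 4.0]
b = [5.0, 6.0, 7.0, 8.0]

# Regular parallelism: each a[i] * b[i] depends only on element i,
# so all the multiplications can proceed concurrently.
with ThreadPoolExecutor() as pool:
    products = list(pool.map(lambda i: a[i] * b[i], range(len(a))))

# Only the reduction at the end needs all the partial results.
inner_product = sum(products)
print(inner_product)  # 5 + 12 + 21 + 32 -> 70.0
```

In an irregular algorithm (say, traversing a pointer-based graph), no such uniform "do the same thing to every index i" structure jumps out of the code, even when plenty of parallelism is hiding in the data flow graph.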
*A list of all the annotated buzzwords can be found in the glossary on the handouts page.
List of Buzzwords:
row major / column major
SRAM / DRAM
row address register
row buffer hit
page in residence
head / track in disks
Kernel, executive, supervisor, user privilege
Segment Registers (Code / Stack / Data)
linear address space
random access memory
content addressable memory
HASH function (key transformation)
cache line / block
n-way set associative cache
direct mapped cache
fully associative cache
cache hit ratio