Newly annotated words*:
sequential storage. A term given to devices where the next item accessed on the device is the item stored at the next sequential address. Tapes work this way.

direct storage. A term given to the storage mechanism used by disks. In fact, IBM has always referred to its disks as DASD (direct access storage devices). The notion is that the device first needs to find the starting address of the access; the locations accessed after that are the ones at the following sequential addresses, i.e., sequential locations on the track that is rotating under the disk head. I have added another buzzword, disk terminology, to help make this clear.
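To make the contrast with sequential storage concrete, here is a minimal C sketch (the file name and the offset are made up for illustration, not from the handout). A tape-like device is read by consuming bytes in order from the beginning; a direct (disk-like) access first seeks to the starting address and then reads the sequentially following bytes:

    #include <stdio.h>

    int main(void) {
        unsigned char buf[512];
        FILE *f = fopen("disk.img", "rb");   /* hypothetical disk image */
        if (f == NULL) return 1;

        /* Direct access: first locate the starting address ...          */
        fseek(f, 40960L, SEEK_SET);          /* seek to byte offset 40960 */

        /* ... then the locations that follow are read sequentially.     */
        fread(buf, 1, sizeof buf, f);

        fclose(f);
        return 0;
    }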

Disk terminology. Information is stored on disk in the following way: The disk consists of multiple rotating platters, much like an old phonograph record (ask your grandmother!). If you don't like that analogy, think pizza trays. All rotate about the same vertical access at their center. Each platter consists of concentric tracks (think circles of different radii). Data is picked off the platters by locating a disk head close to but not touching (ouch!) the platters. There is one head for each platter, and they concurrently access one bit from each platter. Each bit is on a track. The set of tracks form what they call a cylinder. The disk head move as a unit. At any point in time they are concurrently accessing the n bits of an n platter cylinder. (Hope this is clear. If not, google Wikipedia. I am 99% sure they probably even have a picture. Hopefully you do not need a picture, since what I put on the white board was clear!)
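If a picture is not handy, a few lines of C may help fix the terms. This sketch maps a logical block number to (cylinder, head, sector) for an imaginary disk; the geometry constants are assumptions for illustration, not any real drive:

    #include <stdio.h>

    /* Made-up geometry for illustration only. */
    #define HEADS              8    /* one head per platter            */
    #define SECTORS_PER_TRACK 64    /* sectors on each track           */

    int main(void) {
        long block    = 100000;                    /* logical block number */
        long cylinder = block / (HEADS * SECTORS_PER_TRACK);
        long head     = (block / SECTORS_PER_TRACK) % HEADS;
        long sector   = block % SECTORS_PER_TRACK;
        printf("cylinder %ld, head %ld, sector %ld\n", cylinder, head, sector);
        return 0;
    }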

irregular parallelism. Best understood by contrast with regular parallelism, which is when the code can be executed in a very orderly parallel way that is obvious just from looking at the algorithm. The simplest example I know of is taking the inner product of two vectors: one multiplies a(i) times b(i) for all i. Very "regular." Irregular parallelism is the case where, if you draw the data flow graph of the algorithm, it is not obvious where the parallelism is. I appreciate that this definition is not precise, reflecting the fact that the concept is not precise.
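As a concrete illustration of the regular case (my own sketch, with made-up values), the inner product below multiplies a[i] by b[i] for every i; each multiply is independent of the others, so the parallelism is obvious from the loop itself:

    #include <stdio.h>

    int main(void) {
        double a[4] = {1, 2, 3, 4};
        double b[4] = {5, 6, 7, 8};
        double products[4];
        double sum = 0.0;
        int i;

        /* Regular parallelism: every iteration does the same independent
           work, so all four multiplies could be done at once (e.g., on a
           vector machine).                                              */
        for (i = 0; i < 4; i++)
            products[i] = a[i] * b[i];

        /* The additions form a reduction; they too have an orderly
           structure.                                                    */
        for (i = 0; i < 4; i++)
            sum += products[i];

        printf("inner product = %g\n", sum);
        return 0;
    }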

*A list of all the annotated buzzwords can be found in the glossary on the handouts page.

List of Buzzwords:
HPS
node
irregular parallelism
non-determinism
back-up registers
row major / column major
stride
bank
interleaving
vector chaining
CRAY-1
CRAY X-MP
unaligned access
address space
overlay
parity bit
ECC bits
Burst Error
SRAM / DRAM
refreshing DRAM
row address register
RAS
CAS
row buffer hit
page mode
channels
Hamming code
checksum
CRC
virtual memory
page
frame
page in residence
system space
page fault
head / track in disks
seek time
rotation time
PTE
page table
PFN
Modified bit
Reference bit
protection field
Kernel, executive, supervisor, user privilege
Working Set
Balance Set
TLB
Segmentation
Segment Registers (Code / Stack / Data)
Real address
linear address space
Descriptor table
cache
sequential storage
direct storage
random access memory
content addressable memory
associative memory
HASH function (key transformation)
temporal locality
spatial locality
cache line / block
index bits
tag bits
tag store
n-way set associative cache
direct mapped cache
fully associative cache
cache hit
cache hit ratio