CACHE MEMORY MAPPING
Use of a cache memory
- When a read request is received from the CPU, the contents of a
block of memory words containing the location specified are
transferred into the cache one word at a time.
- When the program references any of the locations in the block, the
desired contents are read directly from the cache.
- The correspondence between the main memory blocks and those in
the cache is specified by a mapping function.
- When the cache memory is full and a memory word that is not in the
cache is referenced, the cache control hardware must decide which
block should be removed to create space for the new one. This
decision is made using one of the following techniques:
- Direct mapping technique
- Associative mapping technique
- Set-associative mapping technique
1. Direct-mapped cache
- Consider a cache consisting of 128 blocks of 16 words each, for a
total of 2048 (2K) words.
- Main memory is addressable by a 16-bit address.
- The main memory has 64K words, which we view as 4K
blocks of 16 words each.
- In this technique, block j of the main memory maps onto
block j modulo 128 of the cache.
- Direct mapping is easy to implement, but not very flexible.
- The cache position is determined directly from the memory address.
- The low-order 4 bits select one of the 16 words in a block.
- When a new block enters the cache, the 7-bit cache block field
determines the cache position in which this block must be stored.
- The high-order 5 bits of the memory address of the block are
stored in 5 tag bits associated with its location in the cache.
- On a read, the high-order 5 bits of the address are compared with
the tag bits associated with that cache location. If they match, then
the desired word is in that block of the cache.
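The address split described above can be sketched in code. This is a minimal illustration (function names are my own) using the parameters from the text: a 16-bit address divided into a 5-bit tag, a 7-bit block field, and a 4-bit word field.

```python
WORD_BITS = 4    # low-order 4 bits select one of 16 words in a block
BLOCK_BITS = 7   # next 7 bits select one of 128 cache blocks
TAG_BITS = 5     # high-order 5 bits are stored as the tag

def split_direct(addr: int):
    """Split a 16-bit address into (tag, cache block, word) fields."""
    word = addr & 0xF                        # bits 0-3
    block = (addr >> WORD_BITS) & 0x7F       # bits 4-10
    tag = addr >> (WORD_BITS + BLOCK_BITS)   # bits 11-15
    return tag, block, word

def cache_block_for(mem_block: int) -> int:
    """Main memory block j maps onto cache block j modulo 128."""
    return mem_block % 128

# Example: memory block 129 maps onto cache block 129 mod 128 = 1.
```

A hit is detected by comparing the tag field of the address with the 5 tag bits stored at the selected cache block.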
2. Associative-mapped cache
- Much more flexible, but higher cost: all 128 tag
patterns must be searched to determine whether a given block is in
the cache, and all tags must be searched in parallel.
- A main memory block can be placed into any cache block position.
- Existing blocks need to be ejected only if the cache is full.
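A rough software sketch of an associative lookup follows (class and method names are illustrative). Note that real hardware compares all tags simultaneously; the linear scan here only models the logic, not the parallelism.

```python
class AssociativeCache:
    """Toy model of a fully associative cache with 128 blocks.

    With a 16-bit address and 16 words per block, the tag is the
    entire 12-bit block number (address without the 4 word bits).
    """

    def __init__(self, num_blocks: int = 128):
        self.tags = [None] * num_blocks

    def lookup(self, addr: int):
        tag = addr >> 4                  # block number serves as the tag
        for i, t in enumerate(self.tags):
            if t == tag:
                return i                 # hit: cache block i holds the word
        return None                      # miss

    def insert(self, addr: int) -> bool:
        tag = addr >> 4
        if None in self.tags:            # any free block position will do
            self.tags[self.tags.index(None)] = tag
            return True
        return False                     # full: a replacement policy must choose a victim
```

Because any block can go anywhere, a replacement decision is needed only when the cache is completely full.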
3. Set-associative mapped cache
- The blocks of the cache are grouped into sets.
- A block of main memory can reside in any block of a specific set.
- This reduces the hardware necessary for searching tag addresses in
parallel, since only the tags within one set are compared.
- A cache with k blocks per set is called a k-way set-associative cache.