Chapter 4: Memory Hierarchy and Cache

Memory hierarchy organizes different types of memory to balance speed, size, and cost.

Memory Hierarchy

Levels (Fastest to Slowest)

  1. Registers: Fastest, smallest, most expensive
  2. Cache (L1, L2, L3): Very fast, small, expensive
  3. Main Memory (RAM): Medium speed, medium size, medium cost
  4. Secondary Storage (Disk): Slow, large, cheap

Principle of Locality

Temporal Locality: Recently accessed items are likely to be accessed again soon.

Spatial Locality: Items near recently accessed items are likely to be accessed soon.

Exploited by:

  • Cache (temporal and spatial)
  • Prefetching (spatial)
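
A minimal sketch of both kinds of locality in an ordinary summation loop (Python is used only to make the access pattern explicit; the array size is an arbitrary choice):

  data = list(range(1024))

  total = 0
  for i in range(len(data)):
      # Spatial locality: data[i + 1] sits next to data[i] in memory, so it is
      # likely already in the cache block fetched for data[i].
      total += data[i]
      # Temporal locality: i and total are reused on every iteration, so they
      # stay in registers or the fastest cache level.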

Cache Memory

Cache is a small, fast memory that stores frequently accessed data.

Cache Organization

Cache Parameters

  • Block Size (Line Size): Size of the unit of data transferred between cache and main memory
  • Cache Size: Total cache capacity
  • Associativity: Number of blocks per set
  • Number of Sets: Cache size / (Block size × Associativity)
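
A quick sanity check of these relationships, using an assumed 32 KB, 4-way set associative cache with 64-byte blocks and 32-bit addresses (all values chosen for illustration):

  import math

  # Assumed example parameters (illustrative only)
  cache_size = 32 * 1024      # 32 KB total capacity
  block_size = 64             # 64-byte blocks (lines)
  associativity = 4           # 4 blocks (ways) per set

  num_blocks = cache_size // block_size                    # 512 blocks
  num_sets = cache_size // (block_size * associativity)    # 128 sets

  offset_bits = int(math.log2(block_size))    # 6 bits select a byte within a block
  index_bits = int(math.log2(num_sets))       # 7 bits select a set
  tag_bits = 32 - index_bits - offset_bits    # 19 remaining bits of the address

  print(num_blocks, num_sets, offset_bits, index_bits, tag_bits)   # 512 128 6 7 19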

Cache Mapping Techniques

1. Direct Mapped Cache

Each memory block maps to exactly one cache block.

Mapping: Block number mod (Number of cache blocks)

Advantages:

  • Simple hardware
  • Fast access

Disadvantages:

  • Conflict misses
  • Poor utilization

Example:

  • Cache has 8 blocks (0-7)
  • Memory block 10 maps to block 10 mod 8 = 2
  • Memory block 18 maps to block 18 mod 8 = 2 (conflict!)
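
The same mapping written out as a tiny sketch (the 8-block cache comes from the example above):

  NUM_CACHE_BLOCKS = 8

  def cache_block(memory_block):
      # Direct mapped: each memory block has exactly one possible location.
      return memory_block % NUM_CACHE_BLOCKS

  print(cache_block(10))   # 2
  print(cache_block(18))   # 2 -> same cache block as 10, so the two conflict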

2. Fully Associative Cache

Any memory block can go to any cache block.

Advantages:

  • No conflict misses
  • Best utilization

Disadvantages:

  • Complex hardware (needs search)
  • Slower access

3. Set Associative Cache

Cache divided into sets. Each set has multiple blocks (ways).

Mapping: Block number mod (Number of sets)

Example: 4-way set associative

  • Cache has 4 sets
  • Each set has 4 blocks
  • Memory block maps to a set, can go to any block in that set

Advantages:

  • Balance between direct mapped and fully associative
  • Good performance

Disadvantages:

  • More complex than direct mapped
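
A sketch of the placement rule for the 4-way example above (4 sets of 4 ways, 16 blocks in total):

  NUM_SETS = 4

  def cache_set(memory_block):
      # A block maps to exactly one set but may occupy any of the 4 ways in it.
      return memory_block % NUM_SETS

  # Blocks 10 and 18 fall in the same set but no longer conflict immediately:
  # the set has room for up to 4 blocks.
  print(cache_set(10), cache_set(18))   # 2 2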

Cache Replacement Policies

When the cache (or a set) is full, which block should be replaced?

1. LRU (Least Recently Used)

Replace the block that has not been used for the longest time.

Advantages:

  • Good performance
  • Exploits temporal locality

Disadvantages:

  • Complex to implement
  • Requires tracking the recency of every block in the set

2. FIFO (First In First Out)

Replace oldest block.

Advantages:

  • Simple implementation

Disadvantages:

  • May replace frequently used block

3. Random

Replace random block.

Advantages:

  • Very simple

Disadvantages:

  • Poor performance
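
A minimal sketch of LRU, the policy most often asked about, for a single set (or a fully associative cache). The capacity and access trace are illustrative values, not taken from the chapter:

  from collections import OrderedDict

  class LRUCacheSet:
      def __init__(self, capacity):
          self.capacity = capacity
          self.blocks = OrderedDict()   # ordered from least to most recently used

      def access(self, block):
          if block in self.blocks:
              self.blocks.move_to_end(block)      # hit: mark as most recently used
              return "hit"
          if len(self.blocks) >= self.capacity:
              self.blocks.popitem(last=False)     # evict the least recently used block
          self.blocks[block] = True               # miss: bring the block in
          return "miss"

  cache = LRUCacheSet(capacity=2)
  for b in [1, 2, 1, 3, 2]:
      print(b, cache.access(b))
  # 1 miss, 2 miss, 1 hit, 3 miss (evicts 2), 2 miss (evicts 1)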

Cache Write Policies

Write-Through

Write to both cache and memory immediately.

Advantages:

  • Memory always consistent
  • Simple

Disadvantages:

  • Slower (every write goes to memory)

Write-Back

Write only to cache. Write to memory when block is replaced.

Advantages:

  • Faster (fewer memory writes)
  • Better performance

Disadvantages:

  • More complex (needs a dirty bit per block)
  • Memory may hold stale data until the modified block is written back
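
A rough sketch of why write-back saves memory traffic: count the memory writes caused by a short write sequence (the addresses and values are illustrative):

  # Each tuple is (block address, value written).
  writes = [(0x10, 1), (0x10, 2), (0x10, 3), (0x20, 4)]

  # Write-through: every single write also goes to main memory.
  write_through_mem_writes = len(writes)                       # 4

  # Write-back: a block is written to memory at most once, when it is evicted,
  # no matter how many times it was modified while in the cache.
  write_back_mem_writes = len({addr for addr, _ in writes})    # 2

  print(write_through_mem_writes, write_back_mem_writes)       # 4 2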

Cache Performance

Hit Rate: Fraction of accesses that are hits.

Miss Rate: Fraction of accesses that are misses. (Miss Rate = 1 - Hit Rate)

Average Access Time:

  • T_avg = Hit Rate × T_cache + Miss Rate × (T_cache + T_memory)

Example:

  • T_cache = 1ns
  • T_memory = 100ns
  • Hit Rate = 0.95
  • T_avg = 0.95 × 1 + 0.05 × (1 + 100) = 0.95 + 5.05 = 6ns
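
The same calculation as a small helper, using the model above in which the cache is probed first and main memory is accessed only on a miss:

  def average_access_time(hit_rate, t_cache, t_memory):
      # Miss penalty = t_cache + t_memory: the miss is detected after the cache
      # lookup, then the data is fetched from main memory.
      miss_rate = 1 - hit_rate
      return hit_rate * t_cache + miss_rate * (t_cache + t_memory)

  print(round(average_access_time(0.95, 1, 100), 2))   # 6.0 ns, as in the example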

Virtual Memory

Virtual memory lets programs use a larger address space than the physical memory actually installed.

Address Translation

Virtual Address → Physical Address

Components:

  • Page Number: Part of virtual address
  • Page Offset: Part of virtual address
  • Frame Number: From page table
  • Frame Offset: Same as page offset

Physical Address = (Frame Number × Page Size) + Offset
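
A small sketch of the translation, assuming a 4 KB page size and a hypothetical two-entry page table (both are illustrative choices):

  PAGE_SIZE = 4096   # 4 KB pages -> 12 offset bits

  def translate(virtual_address, page_table):
      page_number = virtual_address // PAGE_SIZE    # upper bits of the virtual address
      offset = virtual_address % PAGE_SIZE          # lower bits, copied unchanged
      frame_number = page_table[page_number]        # looked up in the page table
      return frame_number * PAGE_SIZE + offset

  page_table = {0: 5, 1: 2}                   # page 0 -> frame 5, page 1 -> frame 2
  print(hex(translate(0x0ABC, page_table)))   # 0x5abc
  print(hex(translate(0x1ABC, page_table)))   # 0x2abc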

TLB (Translation Lookaside Buffer)

A small, fast cache of recently used page table entries.

TLB Hit: Fast translation (no page table access)

TLB Miss: Must access page table

Effective Access Time:

  • EAT = TLB_time + Memory_time × (1 + TLB_miss_rate × Page_table_accesses)
  • Page_table_accesses is the number of memory references needed to walk the page table (1 for a single-level table)
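
A quick evaluation of the formula with assumed numbers (1 ns TLB lookup, 100 ns memory access, 2% TLB miss rate, single-level page table):

  def effective_access_time(t_tlb, t_mem, tlb_miss_rate, page_table_accesses=1):
      # Every reference pays the TLB lookup plus the data access; a TLB miss adds
      # one extra memory access per page-table level.
      return t_tlb + t_mem * (1 + tlb_miss_rate * page_table_accesses)

  print(round(effective_access_time(1, 100, 0.02), 1))   # 103.0 ns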

GATE CS Important Points

  1. Cache Mapping: Know direct, fully associative, set associative
  2. Cache Performance: Calculate hit rate, miss rate, average access time
  3. Replacement Policies: Understand LRU, FIFO, Random
  4. Write Policies: Know write-through vs write-back
  5. Virtual Memory: Understand address translation

Practice Tips

  1. Calculate Cache Parameters: Practice finding block number, set number
  2. Trace Cache Access: Manually trace memory accesses
  3. Calculate Performance: Practice hit/miss rate calculations
  4. Understand Mapping: Know how addresses map to cache
  5. Previous Year Questions: Solve GATE memory hierarchy questions