Protected Mode



When the 80286 was introduced, it supported the legacy 8086 Segmentation (now called "Real Mode") and added a new mode called "Protected Mode". This mode has been in every x86 processor since, albeit enhanced with various improvements such as 32- and 64-bit addressing.


In Protected Mode, the simple "Add address to Shifted Segment Register value" was done away with completely. They kept the Segment Registers, but instead of using them to calculate an address, they used them to index into a table (actually, one of two...) which defined the Segment to be accessed. This definition not only described where in memory the Segment was (using Base and Limit), but also what type of Segment it was (Code, Data, Stack or even System) and what kinds of programs could access it (OS Kernel, normal program, Device Driver, etc.).

Segment Register

Each 16-bit Segment Register took on the following form:

| Desc Index | G/L | Priv |
 Desc Index = 13-bit index into a Descriptor Table (described below)
 G/L        = 1-bit flag for which Descriptor Table to Index: Global or Local
 Priv       = 2-bit field defining the Privilege level for access

Global / Local

The Global/Local bit defined whether the access was into a Global Table of descriptors (called, unsurprisingly, the Global Descriptor Table, or GDT), or a Local Descriptor Table (LDT). The idea for the LDT was that every program could have its own Descriptor Table - the OS would define a Global set of Segments, and each program would have its own set of Local Code, Data and Stack Segments. The OS would manage the memory between the different Descriptor Tables.

Descriptor Table

Each Descriptor Table (Global or Local) was a 64 KiB array of 8,192 Descriptors: each an 8-byte record that defined multiple aspects of the Segment that it was describing. The Segment Registers' 13-bit Descriptor Index fields allowed for exactly 8,192 descriptors: no coincidence!


A Descriptor held the following information - note that the format of the Descriptor changed as new processors were released, but the same sort of information was kept in each:

  • Base
    This defined the start address of the memory segment.
  • Limit
    This defined the size of the memory segment - sort of. They had to make a decision: would a size of 0x0000 mean a size of 0, so not accessible? Or maximum size?
    Instead they chose a third option: the Limit field was the last addressable location within the Segment. That meant that a one-byte Segment could be defined (Limit = 0x0000), or a maximum-sized one for the address width.
  • Type
    There were multiple types of Segments: the traditional Code, Data and Stack (see below), but other System Segments were defined as well:
    • Local Descriptor Table Segments defined how many Local Descriptors could be accessed;
    • Task State Segments could be used for hardware-managed context switching;
    • Controlled "Call Gates" that could allow programs to call into the Operating System - but only through carefully managed entry points.
  • Attributes
    Certain attributes of the Segment were also maintained, where relevant:
    • Read-Only vs Read-Write;
    • Whether the Segment was currently Present or not - allowing for on-demand memory management;
    • What level of code (OS vs Driver vs program) could access this Segment.

True protection at last!

If the OS kept the Descriptor Tables in Segments that couldn't be accessed by mere programs, then it could tightly manage which Segments were defined, and what memory was assigned and accessible to each. A program could manufacture whatever Segment Register value it liked - but if it had the audacity to actually load it into a Segment Register, the CPU hardware would recognise that the proposed Descriptor value broke any one of a large number of rules, and instead of completing the request, it would raise an Exception to the Operating System to allow it to handle the errant program.

This Exception was usually #13, the General Protection Fault - made world famous by Microsoft Windows... (Anyone think an Intel engineer was superstitious?)


The sorts of errors that could happen included:

  • If the proposed Descriptor Index was larger than the size of the table;

  • If the proposed Descriptor was a System Descriptor rather than Code, Data or Stack;

  • If the proposed Descriptor was more privileged than the requesting program;

  • If the proposed Descriptor was marked as Not Readable (such as a Code Segment), but a Read was attempted rather than an Execute;

  • If the proposed Descriptor was marked Not Present.

    Note that the last may not be a fatal problem for the program: the OS could note the flag, reinstate the Segment, mark it as now Present then allow the faulting instruction to proceed successfully.

Or, perhaps the Descriptor was successfully loaded into a Segment Register, but then a future access with it broke one of a number of rules:

  • The Segment Register held Descriptor Index 0x0000 for the GDT, which the hardware reserved as the NULL selector - any memory access through it faulted;
  • If the loaded Descriptor was marked Read-Only, but a Write was attempted to it.
  • If any part of the access (1, 2, 4 or more bytes) was outside the Limit of the Segment.