When the 80286 was introduced, it supported the legacy 8086 Segmentation model (now called "Real Mode"), and added a new mode called "Protected Mode". This mode has been in every x86 processor since, enhanced along the way with various improvements such as 32- and 64-bit addressing.
In Protected Mode, the simple "Add address to Shifted Segment Register value" was done away with completely. They kept the Segment Registers, but instead of using them to calculate an address, they used them to index into a table (actually, one of two...) which defined the Segment to be accessed. This definition not only described where in memory the Segment was (using Base and Limit), but also what type of Segment it was (Code, Data, Stack or even System) and what kinds of programs could access it (OS Kernel, normal program, Device Driver, etc.).
Each 16-bit Segment Register took on the following form:
+------------+-----+------+
| Desc Index | G/L | Priv |
+------------+-----+------+
Desc Index = 13-bit index into a Descriptor Table (described below)
G/L = 1-bit flag for which Descriptor Table to Index: Global or Local
Priv = 2-bit field defining the Privilege level for access
The Global/Local bit defined whether the access was into a Global Table of descriptors (called, unsurprisingly, the Global Descriptor Table, or GDT), or a Local Descriptor Table (LDT). The idea for the LDT was that every program could have its own Descriptor Table - the OS would define a Global set of Segments, and each program would have its own set of Local Code, Data and Stack Segments. The OS would manage the memory between the different Descriptor Tables.
Each Descriptor Table (Global or Local) was a 64K array of 8,192 Descriptors: each an 8-byte record that defined multiple aspects of the Segment that it was describing. The Segment Registers' Descriptor Index fields allowed for 8,192 descriptors: no coincidence!
A Descriptor held the following information - note that the format of the Descriptor changed as new processors were released, but the same sort of information was kept in each:
The Base: where in memory the Segment started;
The Limit: how large the Segment was (would a Limit of 0x0000 mean a size of 0, so not accessible? Or maximum size?);
The Type: Code, Data, Stack or System;
The Privilege level required to access it;
Whether the Segment was currently Present in memory.
If the OS kept the Descriptor Tables in Segments that couldn't be accessed by mere programs, then it could tightly manage which Segments were defined, and what memory was assigned and accessible to each. A program could manufacture whatever Segment Register value it liked - but if it had the audacity to actually load it into a Segment Register, the CPU hardware would recognise that the proposed Descriptor value broke any one of a large number of rules, and instead of completing the request, it would raise an Exception to the Operating System to allow it to handle the errant program.
This Exception was usually #13, the General Protection Exception - made world famous by Microsoft Windows... (Anyone think an Intel engineer was superstitious?)
The sorts of errors that could happen included:
If the proposed Descriptor Index was larger than the size of the table;
If the proposed Descriptor was a System Descriptor rather than Code, Data or Stack;
If the proposed Descriptor was more privileged than the requesting program;
If the proposed Descriptor was marked as Not Readable (such as a Code Segment), but it was attempted to be Read rather than Executed;
If the proposed Descriptor was marked Not Present.
Note that the last may not be a fatal problem for the program: the OS could note the flag, reinstate the Segment, mark it as now Present, then allow the faulting instruction to proceed successfully.
Or, perhaps the Descriptor was successfully loaded into a Segment Register, but then a future access with it broke one of a number of rules:
If the access reached beyond the Segment's Limit;
If the access tried to Write to a Segment that wasn't Writable;
If the Segment Register held the value 0x0000: Descriptor Index 0 for the GDT was reserved by the hardware as NULL, so any access through it faulted.