Every computer display is made up of pixels: small square or rectangular elements that emit light and color and can be switched on and off at varying brightness levels. Display resolutions are measured as a certain number of pixels wide by a certain number of pixels tall – for example, a 2013-vintage HDTV is 1920 pixels by 1080 pixels, while many computer monitors of the same era are 1920 by 1200. When images drawn on the screen are perfectly vertical or perfectly horizontal, their edges line up with this grid. When they are drawn at an angle, the squares of the grid cannot follow the edge exactly, and the result is a jagged, staircase-like edge.
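To see where that staircase comes from, here is a minimal sketch in plain Python (not from the original article) that lights the single nearest pixel in each column for a shallow diagonal line; the grid size and slope are arbitrary values chosen for illustration.

```python
WIDTH, HEIGHT = 16, 8
SLOPE = (HEIGHT - 1) / (WIDTH - 1)   # a shallow diagonal, roughly 0.47

grid = [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]

for x in range(WIDTH):
    # Without anti-aliasing, each column simply switches on the nearest pixel.
    y = round(x * SLOPE)
    grid[HEIGHT - 1 - y][x] = "#"

# The printed output shows the '#' characters stepping up in a staircase.
for row in grid:
    print("".join(row))
```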
When a program or video card applies anti-aliasing, it calculates an average color for the pixels along the "edge" of each line being drawn, making that edge slightly blurrier or softer. These intermediate shades fill in the "gaps" along the staircase edge and create an optical illusion: because pixels are so small, the human brain blends the shades together into a smooth line or curve. The same perceptual effect is used for halftoning in newspapers, where photographs are reproduced by varying the size of ink dots to produce subtle gradations in tone. One way to look at anti-aliasing is that the scene is rendered at two, four, or eight times the resolution of the display and then shrunk back down before it is shown.
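As an illustration of that "render larger, then shrink" idea, here is a minimal supersampling sketch in plain Python (not from the original article). It averages a 4x4 block of subpixels into each display pixel, so pixels straddling a diagonal edge end up with intermediate shades; the sizes and the character ramp are illustrative choices.

```python
WIDTH, HEIGHT, FACTOR = 16, 8, 4             # display size, supersampling factor
HI_W, HI_H = WIDTH * FACTOR, HEIGHT * FACTOR
SLOPE = HI_H / HI_W

# High-resolution render: fill everything on one side of a diagonal edge.
hi = [[1 if hy <= hx * SLOPE else 0 for hx in range(HI_W)] for hy in range(HI_H)]

# Downsample: each display pixel's shade is the average of its 4x4 subpixels.
shades = " .:-=+*#"                           # darkest to brightest
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        block = [hi[y * FACTOR + dy][x * FACTOR + dx]
                 for dy in range(FACTOR) for dx in range(FACTOR)]
        coverage = sum(block) / len(block)    # fraction of subpixels covered
        row += shades[min(int(coverage * len(shades)), len(shades) - 1)]
    print(row)
```

Pixels entirely inside or outside the shape come out solid or blank; only the pixels crossed by the edge pick up the in-between shades that fool the eye into seeing a smooth boundary.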
Anti-aliasing demands a certain amount of processing power. Every pixel that defines an edge requires sampling the hue and brightness of adjacent pixels to find the correct value. For text or static images, this only needs to be done once; for video and games, it must be redone every time something changes between frames. This is one reason why enabling anti-aliasing in computer games can cause a noticeable drop in frames per second, and why many video cards have silicon dedicated to running anti-aliasing algorithms. While early anti-aliasing video cards made only two passes per screen, running top to bottom, the number of passes (the sampling rate) has increased as performance has improved.
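To get a rough sense of why higher sampling rates hurt frame rates, here is a back-of-the-envelope sketch in Python (not from the original article). It assumes every pixel is sampled at the full rate, which overstates the cost of smarter schemes such as multisample anti-aliasing that concentrate work on edges, and the 60 frames-per-second figure is an illustrative assumption.

```python
# Rough cost estimate: samples that must be computed per frame and per second.
# Resolution is the 1920x1080 example used above; the frame rate is assumed.
WIDTH, HEIGHT, FPS = 1920, 1080, 60

for samples_per_pixel in (1, 2, 4, 8):
    per_frame = WIDTH * HEIGHT * samples_per_pixel
    per_second = per_frame * FPS
    print(f"{samples_per_pixel}x: {per_frame:>10,} samples/frame, "
          f"{per_second:>13,} samples/second")
```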
While the performance hit from anti-aliasing has been countered with ever more capable video card hardware, advances in display technology may ultimately render it moot. The iPhone 4 brought a 326 pixels-per-inch (PPI) display to the mainstream in 2010, and such very high-resolution displays, with pixels too small for the human eye to pick out, have been gaining adoption since. At roughly 300 PPI and a typical handheld viewing distance, the human eye really can't make out individual pixels. With pixels that small, the need for anti-aliasing (and the extra video processing it requires) goes away. As of late summer 2013, only a handful of desktop and laptop displays offer this kind of resolution, but it's clear that they're going to become widespread in the near future.
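For readers curious where the "roughly 300 PPI" threshold comes from, here is a hedged sketch in Python (not from the original article). It assumes normal 20/20 visual acuity of about one arcminute and a handful of illustrative viewing distances; the article itself doesn't specify these numbers.

```python
import math

ONE_ARCMINUTE = math.radians(1 / 60)          # approximate limit of 20/20 acuity

for distance_inches in (10, 12, 18, 24):      # assumed handheld to desktop distances
    # Smallest pixel pitch the eye can resolve at this distance, in inches.
    pixel_pitch = distance_inches * math.tan(ONE_ARCMINUTE)
    ppi = 1 / pixel_pitch
    print(f"{distance_inches} in viewing distance -> ~{ppi:.0f} PPI to hide pixels")
```

At a handheld distance of 10 to 12 inches this works out to roughly 290 to 340 PPI, which is why densities around 300 PPI are where individual pixels stop being visible.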