A variety of graphics settings are available in PC games that can improve graphics quality and detail or increase game performance. This lets games take advantage of the latest high-performance computers while still allowing acceptable performance on slower computers. Some common options are detailed below. Note that these options are set by the game, not by the graphics driver. Not all games allow changing all of these settings; usually, a subset of these options is available in an options screen for the game.
When enabling or disabling different features, there is usually a tradeoff between graphics quality and game performance (or frame rate). Try enabling a feature or increasing the detail level, then check the quality and performance that results to find the best balance for that particular game.
Screen Resolution
Increased screen resolution improves graphics quality by increasing the number of pixels displayed at once. This allows for sharper graphics detail and reduces stair-step patterns on the edges of polygons. In most cases, the higher the screen resolution, the lower the frame rate for the game. 640x480, 800x600, and 1024x768 are common screen resolutions for games. 640x480 is good for network play, where the frame rate must stay high in order to compete with other players. 1024x768 is good for single-player gaming, where frame rate is usually not as important as visual quality.
- Intel® 82865G, 82852/82855 GM/GME, and 82845G graphics controllers support up to 2048x1536 screen resolution in 3D accelerated games.
- Intel® 82810 and 82815 graphics controllers support up to 1024x768 screen resolution in 3D accelerated games.
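To see why resolution costs frame rate, compare the raw pixel counts involved. As a rough, hypothetical model (real scaling also depends on geometry and CPU load), rendering work grows with the number of pixels filled per frame:

```python
# Approximate relative fill cost of common game resolutions.
# Assumes (simplistically) that rendering work scales with pixel count.

def pixel_count(width, height):
    return width * height

base = pixel_count(640, 480)  # 307,200 pixels
for w, h in [(640, 480), (800, 600), (1024, 768)]:
    ratio = pixel_count(w, h) / base
    print(f"{w}x{h}: {pixel_count(w, h):,} pixels "
          f"({ratio:.2f}x the fill work of 640x480)")
```

By this estimate, 1024x768 fills about 2.56 times as many pixels per frame as 640x480, which is why the lower resolution holds its frame rate better in fast network play.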
Color Depth
Many games give the option to run in either 16-bit or 32-bit color depth. This refers to the amount of video memory required for each screen pixel. 32-bit color depth gives a larger range of colors to use, resulting in higher quality rendering. Due to the increased video memory bandwidth needed, 32-bit color will reduce the frame rate for the game. With some games, this can result in choppier performance.
Some games also allow setting the color depth of textures. 32-bit color can dramatically improve the appearance of textures and reduce artifacts, like dithering and banding. The improvement is especially visible when three or more textures are applied to a polygon. A small performance decrease may be seen with 32-bit color textures.
- Intel 82865G, 82852/82855 GM/GME, and 82845G graphics controllers support 16 and 32-bit color depths in 3D accelerated games
- Intel 82810 and 82815 graphics controllers support 16-bit color depth
- Intel 82865G, 82852/82855 GM/GME, and 82845G graphics controllers support up to 32-bit color textures
- Intel 82810 and 82815 graphics controllers support up to 16-bit color textures
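The bandwidth cost of 32-bit color follows directly from the per-pixel memory requirement. A quick calculation (simplified to a single color buffer; real configurations also have back and depth buffers) shows the doubling:

```python
# Frame buffer memory at 16-bit vs 32-bit color depth.
# Simplified: one color buffer only.

def framebuffer_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

for bpp in (16, 32):
    mb = framebuffer_bytes(1024, 768, bpp) / (1024 * 1024)
    print(f"1024x768 at {bpp}-bit: {mb:.2f} MB per buffer")
```

Every frame rendered must write (and often re-read) this memory, so doubling bytes per pixel roughly doubles the memory traffic per frame.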
Texture Detail Level
This usually refers to how large or how many textures are used in the game. Large textures can take up a lot of video memory, but this can be alleviated by using texture compression, if supported by the game.
- Intel 82865G, 82852/82855 GM/GME, and 82845G graphics controllers support textures up to 2048x2048, with up to four textures per polygon applied in a single rendering pass
- Intel 82810 and 82815 graphics controllers support texture sizes up to 2048x2048 and up to two textures per polygon
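To get a feel for why large textures strain video memory, here is the uncompressed size of a single square texture at a few sizes (a sketch, ignoring mipmaps and padding):

```python
# Video memory used by one uncompressed texture, no mipmaps.

def texture_bytes(size, bits_per_texel):
    return size * size * bits_per_texel // 8

for size in (256, 1024, 2048):
    mb = texture_bytes(size, 32) / (1024 * 1024)
    print(f"{size}x{size} at 32-bit: {mb:g} MB")
```

A single 2048x2048 texture at 32-bit color occupies 16 MB, a large fraction of the video memory available to integrated graphics of this era, which is why texture compression (below) matters.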
Texture Filtering
Mipmapping is a method of improving graphics quality and performance by using different mipmap levels, or texture sizes, depending on how far a surface is from the viewer. Trilinear mipmapping further improves quality by smoothing the transition between mipmap levels. Anisotropic filtering further improves graphics quality by increasing the amount of detail visible when textures are viewed at oblique angles.
- Intel 82865G, 82852/82855 GM/GME, and 82845G graphics controllers support anisotropic filtering and trilinear mipmapping
- Intel 82810 and 82815 graphics controllers support anisotropic filtering
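A mipmap chain is simply the base texture plus successively halved copies down to 1x1. The chain can be enumerated with a few lines, showing that the extra memory cost of mipmapping is only about one third of the base texture:

```python
# Mipmap chain for a square texture: each level halves the
# dimensions until reaching 1x1. The full chain adds roughly
# 1/3 extra memory on top of the base level.

def mip_chain(size):
    levels = []
    while size >= 1:
        levels.append(size)
        size //= 2
    return levels

chain = mip_chain(256)
base = 256 * 256
total = sum(s * s for s in chain)
print(chain)                                # [256, 128, 64, 32, 16, 8, 4, 2, 1]
print(f"memory overhead: {total / base - 1:.3f}")   # ~0.333
```

The distant-pixel savings come from sampling the small levels instead of the full texture, which cuts memory traffic in exactly the cases where detail would be wasted anyway.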
Depth Buffer
The depth buffer (Z-buffer or W-buffer) is used in 3D games to determine whether the pixels on one polygon are in front of the pixels on another polygon. A higher precision depth buffer, such as 24-bit, prevents pixels from showing up in front of pixels that they should be behind. A 16-bit depth buffer gives higher performance due to a large reduction in video memory bandwidth.
- Intel 82865G, 82852/82855 GM/GME, and 82845G graphics controllers support a 24-bit or 16-bit W- or Z-buffer
- Intel 82810 and 82815 graphics controllers support a 16-bit Z-buffer
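The precision-versus-bandwidth tradeoff can be made concrete with a quick calculation (a sketch; the 1024x768 figure is just an example resolution):

```python
# Distinct depth values and buffer size at each supported precision.

def depth_values(bits):
    return 2 ** bits

def depth_buffer_mb(width, height, bits):
    return width * height * bits / 8 / (1024 * 1024)

for bits in (16, 24):
    print(f"{bits}-bit: {depth_values(bits):,} distinct depth values, "
          f"{depth_buffer_mb(1024, 768, bits):.2f} MB at 1024x768")
```

A 24-bit buffer distinguishes 256 times as many depth values as a 16-bit one, which is what prevents nearby surfaces from fighting over the same depth slot, at the cost of 50% more depth-buffer traffic per frame.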
Texture Compression
Texture compression is a method of reducing the amount of memory and memory bandwidth required for textures, with a small reduction in visual quality. In certain games where a low-resolution texture is used for a large surface, like a sky image, significant color banding can be seen if texture compression is enabled. Enabling texture compression together with a high texture detail level can provide a good balance of quality and performance in many games.
- Intel 82865G, 82852/82855 GM/GME, and 82845G graphics controllers support texture compression
- Intel 82810 and 82815 graphics controllers do not support texture compression
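The savings from texture compression are substantial. As an illustration, assume a DXT1-style block format (64 bits per 4x4 texel block; the article does not name a specific format, so this choice is an assumption) against an uncompressed 32-bit texture:

```python
# Compressed vs uncompressed texture size, assuming a DXT1-style
# format: 8 bytes per 4x4 texel block, i.e. 4 bits per texel.
# (The specific format is an illustrative assumption.)

def uncompressed_bytes(size, bits_per_texel=32):
    return size * size * bits_per_texel // 8

def dxt1_bytes(size):
    blocks = (size // 4) ** 2
    return blocks * 8  # 8 bytes per 4x4 block

size = 1024
u, c = uncompressed_bytes(size), dxt1_bytes(size)
print(f"{size}x{size}: {u // 1024} KB -> {c // 1024} KB ({u // c}:1)")
```

An 8:1 reduction under these assumptions is why compression plus high texture detail often beats uncompressed textures at a lower detail level.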
Lighting
Common lighting models for games include lightmap and vertex lighting. Vertex lighting gives a fixed brightness for each corner of a polygon. The lightmap model adds an extra texture, called a lightmap, on top of each polygon, which gives the appearance of varying light and dark levels across the polygon. Due to the extra texture pass required, the lightmap model usually gives a lower frame rate than vertex lighting, but gives a much richer look to games that use it. Both types of lighting are supported by all Intel chipsets with integrated graphics.
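The difference between the two models can be sketched on a single quad. Vertex lighting can only interpolate smoothly between the four corner values, while a lightmap can place sharp light/dark detail anywhere inside the polygon (all brightness values here are hypothetical; this is a minimal sketch, not any particular game's implementation):

```python
# Sketch of the two lighting models on one quad with texture
# coordinates (u, v) in [0, 1]. Brightness values are hypothetical.

# Vertex lighting: one brightness per corner, bilinearly
# interpolated across the polygon -- always a smooth gradient.
def vertex_light(u, v, corners):
    top = corners[0] * (1 - u) + corners[1] * u
    bottom = corners[2] * (1 - u) + corners[3] * u
    return top * (1 - v) + bottom * v

# Lightmap lighting: a texel grid sampled per pixel, so brightness
# can vary sharply anywhere inside the polygon.
def lightmap_light(u, v, lightmap):
    rows, cols = len(lightmap), len(lightmap[0])
    return lightmap[min(int(v * rows), rows - 1)][min(int(u * cols), cols - 1)]

corners = [1.0, 0.2, 0.2, 1.0]           # bright/dark opposite corners
lm = [[1.0, 0.1], [0.1, 1.0]]            # tiny 2x2 lightmap
print(vertex_light(0.5, 0.5, corners))   # center averages out: 0.6
print(lightmap_light(0.25, 0.25, lm))    # sharp bright texel: 1.0
```

In a renderer, the lightmap value multiplies the base texture color per pixel, which is the "extra texture pass" that costs frame rate.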
Anti-Aliasing
Anti-aliasing is used to reduce stair-step patterns on the edges of polygons in games. It gives a smoother, slightly blurred look to the edges. Full scene anti-aliasing accomplishes this by rendering each frame at a larger resolution, then scaling it down to fit the actual screen resolution. This can lower the frame rate by a large amount while increasing quality by only a small amount; usually, increasing the screen resolution is a better tradeoff than turning on anti-aliasing. Anti-aliasing is only useful when a lot of extra graphics performance is available. Intel chipsets with integrated graphics do not support full scene anti-aliasing. Anti-aliased lines are supported in OpenGL* applications.
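The cost of full scene anti-aliasing follows from how supersampling works: the frame is rendered at a multiple of the screen resolution before being scaled down. Assuming a 2x2 supersampling factor (an illustrative assumption):

```python
# Pixel cost of full scene anti-aliasing via 2x2 supersampling:
# the frame is rendered at twice the width and height, then
# scaled down to the screen resolution.

def ssaa_pixels(width, height, factor=2):
    return (width * factor) * (height * factor)

w, h = 800, 600
print(f"no AA:  {w * h:,} pixels rendered")
print(f"2x2 AA: {ssaa_pixels(w, h):,} pixels rendered (4x the work)")
```

Rendering four pixels for every one displayed explains why simply raising the screen resolution, which spends the same work on pixels the player actually sees, is usually the better tradeoff.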