NVIDIA's GeForce 8600GTS and 8600GT are G84-based GPUs, while the GeForce 8500GT is G86-based. The budget-priced trio features full support for DirectX 10, including Shader Model 4.0 pixel and vertex shaders, as well as NVIDIA SLI and PureVideo technologies. NVIDIA touts three dedicated video engines on the G84- and G86-based graphics cards, providing high-definition MPEG-2 and WMV HD video playback at resolutions up to 1080p. G84 and G86 support hardware-accelerated decoding of H.264 video as well (NVIDIA makes no mention of VC-1 decoding). G84 and G86 also feature advanced video post-processing algorithms, including spatial-temporal de-interlacing, inverse 2:2 and 3:2 pull-down, and 4-tap horizontal / 5-tap vertical video scaling.
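To give a rough sense of what "5-tap vertical video scaling" means, here is a minimal illustrative sketch, not NVIDIA's actual hardware filter: each output row is computed as a weighted sum of the five nearest input rows. The kernel weights and the clamping behavior at image edges are assumptions for demonstration; real scalers select different (polyphase) weights per output position.

```python
def scale_vertical_5tap(image, out_height):
    """Vertically resample an image using a 5-tap filter.

    image: list of rows, each row a list of pixel values.
    This is an illustrative sketch; the fixed kernel below is
    hypothetical, not the weights used by G84/G86 hardware.
    """
    in_height = len(image)
    width = len(image[0])
    # Example symmetric kernel; weights sum to 1.0 to preserve brightness.
    taps = [-0.1, 0.3, 0.6, 0.3, -0.1]
    out = []
    for oy in range(out_height):
        # Map the output row back to a source position and center
        # the 5-tap window on the nearest input row.
        center = int(oy * in_height / out_height)
        row = []
        for x in range(width):
            acc = 0.0
            for t, w in enumerate(taps):
                # Clamp sample coordinates at the top/bottom edges.
                y = min(max(center + t - 2, 0), in_height - 1)
                acc += w * image[y][x]
            row.append(acc)
        out.append(row)
    return out
```

A 4-tap horizontal scaler works the same way along each row; more taps allow a sharper reconstruction filter than simple bilinear (2-tap) scaling.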
GeForce 8600GTS PCIe x16
- Eight-layer PCB, measures 7.2" x 4.376"
- 675 MHz GPU and a 128-bit bus
- 256MB of GDDR3 memory clocked at 1000 MHz
- Requires external PCIe power; estimated total board power consumption: ~71 watts
- Video outputs: dual dual-link DVI, VGA, SDTV and HDTV; also includes analog video inputs
GeForce 8600GT PCIe x16
- Six-layer PCB, measures 6.9" x 4.376"
- 540 MHz GPU and a 128-bit bus
- 128MB or 256MB of GDDR3 memory clocked at 700 MHz
- External PCIe power not required; maximum board power consumption: ~43 watts
- Supports the same video outputs as the 8600GTS, but lacks the video input features