DirectX 10: why it's exclusive to Vista

When Microsoft officially announced that DirectX 10 (DX10) would only be available for Windows Vista, many gaming fans yearning to be on the bleeding edge were upset. In order to get the most from their video cards, users would have to upgrade their operating systems to Vista. Some have attributed Microsoft's decision purely to marketing, but that's not entirely the case. What other factors were in play?

According to Microsoft DirectX guru Phil Taylor, DX10's development wasn't complete until late in Windows XP's lifecycle, and during that development it became clear that DX10 simply would not fit into XP:

"Given XP shipped in 2001 and it was late 2003 when the DX10 design solidified - it should be obvious that 'what the OS was' was well beyond XP before serious DX10 work commenced. Heck, the Longhorn reset was in 2004 and DX10 wasn't done until later. The build that was demo'ed [sic] at WinHEC 2004 with the texture memory management was a very fresh build and wasn't feature complete - and that was April or May 2004. The 1st DX SDK supporting DX10 didn't appear until Dec 2005."
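
For developers, the practical upshot was that a DX10 render path could never be assumed: the Direct3D 10 runtime ships only with Vista, so games of the era carried dual render paths and picked one at startup. Below is a minimal sketch of such a check; detecting DX10 by probing for its DLL is an illustration of one plausible approach, not a method prescribed by Microsoft or taken from this article.

```cpp
// Minimal sketch: probe for the Direct3D 10 runtime at startup and fall
// back to a Direct3D 9 path when it is absent. d3d10.dll ships only with
// Windows Vista and later, so this load fails on Windows XP.
#include <windows.h>
#include <cstdio>

int main() {
    HMODULE d3d10 = LoadLibraryW(L"d3d10.dll");
    if (d3d10 != nullptr) {
        std::printf("Direct3D 10 runtime present: selecting the DX10 path.\n");
        FreeLibrary(d3d10);
    } else {
        std::printf("No Direct3D 10 runtime: falling back to the DX9 path.\n");
    }
    return 0;
}
```

Probing for the DLL itself, rather than sniffing the Windows version number, ties the decision to the actual capability the engine needs.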
