
Double The Power, Double The Fun: A Brief History of Dual Graphics

by: Brian Healy ; edited by: Michael Hartman ; updated: 4/17/2012

Playing a game is one of the most demanding tasks you can ask of your PC, especially if it's a game that relies heavily on graphics, such as a first-person shooter. As graphics cards age, and new games take advantage of new technologies, users have increasingly embraced dual graphics card technologies.

Playing a game that relies on good visuals – such as a first-person shooter – is perhaps one of the most arduous tasks facing the modern PC. Recent advancements in games programming mean that while many new games will give your CPU and memory a good workout, it is inevitably the graphics card that suffers, being pushed to its limit.

Older graphics cards may not be able to keep pace with the latest titles without detail settings and screen resolutions being turned down to their bare minimums, and even then it often won't be enough to make for an enjoyable gaming experience. For many people in this boat, the only recourse is to splash out on a new graphics card; depending on the type of card, that could also mean a new motherboard and memory to go with it.

However, there is another option, providing your motherboard supports it: dual graphics. When browsing new graphics cards or motherboards, you might have come across the terms 'SLI' or 'Crossfire' mentioned in relation to the component. Both SLI and Crossfire allow a second graphics card to be placed alongside your existing graphics card, effectively doubling the graphics output of your system. If your motherboard supports SLI or Crossfire, this can be a cheaper alternative to upgrading other components, and it could provide the extra graphics muscle you need.

SLI has been around a while – since 1998 and the days of 3dfx and their Voodoo2 3D graphics accelerator – and originally stood for 'Scan-Line Interleave'. This technology allowed two Voodoo2 accelerator boards to run in tandem with the PC's 2D graphics chip. In essence, each Voodoo2 rendered alternate lines on the screen, making the pair faster than a single 2D/3D graphics card. Each Voodoo card required a separate PCI slot, and the two had to be connected via a ribbon connector. Despite this advancement, subsequent 3dfx products never incorporated a dual graphics system, and in the face of stern competition from major market rivals ATI and Nvidia, 3dfx collapsed in 2000. Nvidia subsequently acquired most of its assets, but did not release a dual graphics product of its own for several years.

In 1999, ATI produced the Rage Fury MAXX – a single graphics card, but one featuring two separate graphics chips, each capable of both 2D and 3D rendering. The ATI card worked differently from the Voodoo2. Where the dual 3dfx cards would render alternate lines on the screen, ATI's chips would each render a complete frame: one chip rendered odd-numbered frames, the other even-numbered frames. This method is known as alternate frame rendering (AFR) and is still used in today's dual-card setups. The Fury MAXX was short-lived and was soon replaced by the Radeon, a return to single-chip graphics processing.
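The AFR scheme described above can be sketched in a few lines of Python. The function below is purely illustrative – it models only the frame-to-GPU assignment pattern, not any real driver API.

```python
# Conceptual sketch of alternate frame rendering (AFR), the scheme
# ATI introduced with the Fury MAXX: whole frames are dealt out to
# the GPUs in strict alternation. The function name is illustrative.

def assign_frames_afr(frame_count, gpu_count=2):
    """Return a list mapping each frame number to the GPU that renders it."""
    return [frame % gpu_count for frame in range(frame_count)]

# With two GPUs, even-numbered frames go to GPU 0 and odd-numbered
# frames to GPU 1, so each chip renders only every other frame.
schedule = assign_frames_afr(6)
print(schedule)  # [0, 1, 0, 1, 0, 1]
```

The same round-robin pattern, applied to scan lines within a single frame rather than to whole frames, describes the original 3dfx scan-line interleave approach.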

Single-chip graphics processing remained the norm until the introduction of the PCI Express interface, at which point Nvidia launched a new dual-card technology to take advantage of the new architecture. Nvidia retained the term 'SLI' but changed the acronym's meaning: it now stands for 'Scalable Link Interface' and doesn't use the line-by-line rendering method of the 3dfx product, instead using AFR.

Nvidia's SLI system uses two graphics cards, each housed in a separate PCI Express x16 slot; to use SLI, your motherboard must support the technology. SLI requires that the two graphics cards use the same graphics processor, but the cards don't need to come from the same manufacturer. For example, you couldn't pair a 7800GS with a 6800GT in SLI, but you could run a 7800GS made by Asus alongside a 7800GS made by Inno3D. What's more, the clock speeds of the cards don't need to match either: the faster card scales down to accommodate the speed of the slower card, and if one card has more onboard memory, part of that memory is disabled so the two are equal.

ATI have also developed a dual graphics card technology, called 'Crossfire'. Like Nvidia's SLI, it uses two graphics cards in PCI Express x16 slots. Initially, Crossfire was quite cumbersome to employ: it required a special master card to link the two cards together, along with a motherboard using a compatible ATI chipset and a regular graphics card to sit alongside the master card. Two master cards were initially released, the X800 and X850 Crossfire editions, and users could pair them with cards belonging to the same series as the master card. For example, a system using an X800 master card could use an X800XT graphics card as its second card. The master card would reconfigure itself to match the specification of the secondary card.

Unlike Nvidia's SLI, which uses a bridge connector to link the cards together inside the system, Crossfire instead used a pass-through connector to link the DVI output on the secondary graphics card to an input on the master card. A chip on the master card then processed the DVI signal, merged it with the master card's own output, and sent the final image to the monitor. However, ATI followers were blighted by the fact that Crossfire edition master cards were in chronically short supply, prompting ATI to revise the technology.

In both cases – SLI for Nvidia and Crossfire for ATI – the technology is being reworked to allow greater flexibility for users, and despite the added demands placed on the power supply, for those with compatible hardware it is a step worth considering.