Basic information regarding graphic cards for newbies


ico

Super Moderator
Staff member
What does a graphic card look like?
A graphic card looks like this:
*i.imgur.com/OtALg.gif

or might look different as well.

Where does a graphic card fit in?
All modern graphic cards (since around 2004) utilize the PCI Express or PCIe interface and fit in a PCIe X16 slot. (not to be confused with PCI)

What is a PCIe slot?
PCIe is a computer expansion card standard. You get a PCIe equipment (sound card, graphic card etc.) and you fit it in a PCIe slot.

Here's what PCIe slots look like:

*i.imgur.com/IWiJT.jpg

On top, you have a PCIe X4 slot. Then PCIe X16. Then PCIe X1. Then PCIe X16 again.

X1, X4, X16 etc. denote the number of lanes. A PCIe X1 card can physically fit and will work fine in a PCIe X1, X4 or X16 slot.

Further information:
One lane of PCIe 1.0 provides 250 MB/s of bandwidth. Total bandwidth available for a X16 PCIe 1.0 slot would be 16 * 250 MB/s = 4000 MB/s.
One lane of PCIe 2.0 provides 500 MB/s of bandwidth. Total bandwidth available for a X16 PCIe 2.0 slot would be 16 * 500 MB/s = 8000 MB/s. For X8 PCIe 2.0 slot, it would be 8 * 500 MB/s = 4000 MB/s. That is, same as X16 PCIe 1.0 slot.
You can roughly extrapolate the same for the PCIe 3.0 and 4.0 standards as well: per-lane bandwidth approximately doubles with each revision. (PCIe 3.0 actually provides about 985 MB/s per lane, slightly under double, because of its more efficient 128b/130b encoding.)

Also, since PCIe is a bidirectional (full-duplex) technology, the theoretical bandwidth doubles if you count both directions.
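To make the arithmetic above concrete, here is a small Python sketch. The table and function names are mine, not from any spec; the 3.0/4.0 figures follow the simple "doubles each revision" approximation used above (the real PCIe 3.0 figure is ~985 MB/s per lane):

```python
# Theoretical PCIe bandwidth per direction, using the per-lane figures above.
PER_LANE_MB_S = {
    "1.0": 250,   # MB/s per lane, one direction
    "2.0": 500,
    "3.0": 1000,  # approximation; actual is ~985 MB/s (128b/130b encoding)
    "4.0": 2000,  # approximation
}

def pcie_bandwidth(revision: str, lanes: int, bidirectional: bool = False) -> int:
    """Theoretical bandwidth in MB/s for a given PCIe revision and lane count."""
    bw = PER_LANE_MB_S[revision] * lanes
    # PCIe is full-duplex, so counting both directions doubles the figure.
    return bw * 2 if bidirectional else bw

print(pcie_bandwidth("1.0", 16))        # 4000 MB/s: X16 PCIe 1.0
print(pcie_bandwidth("2.0", 8))         # 4000 MB/s: X8 PCIe 2.0 -- the same
print(pcie_bandwidth("2.0", 16, True))  # 16000 MB/s counting both directions
```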

Physically, PCIe slots have remained the same across all revisions to date.

Currently, no graphic card is fast enough to utilize X16 PCIe 3.0 fully. There is hardly a 1-2% gaming performance drop when you use X8 PCIe 3.0. Since X8 PCIe 3.0 = X16 PCIe 2.0, this means the fastest single-GPU card available today will see at most a 2-3% gaming performance drop even in a PCIe 2.0 slot, which hardly matters. There may be a performance drop for programmers who use GPUs for deep learning applications, but gamers need not worry.

How do I know that my motherboard has a PCIe slot?
Google your motherboard's model number. Check the specification page on the manufacturer's website. This is something which you can do yourself. Even easier, open up the cabinet and take a look.

What if my motherboard doesn't have a PCIe slot?
If your motherboard doesn't have a PCIe slot and only has PCI or AGP slots, forget about adding a graphic card. Buy a new system first.

In 2020, your system will almost always have a PCIe slot.

Will a PCIe 2.0 or 2.1 card work fine in PCIe 1.0 slot on my motherboard?
Yes, it will work fine. PCIe is an evolving standard, and all PCIe revisions are backward and forward compatible. Here, 2.0, 2.1 and 1.0 are PCIe revision numbers (not to be confused with lane counts).

Will a PCIe 3.0 card work fine in PCIe 2.x or 1.0 slot on my motherboard?
Yes. You can put any PCIe card in any PCIe slot and it will work; the link simply runs at the highest revision and lane count that both the card and the slot support. The version number is of no concern for compatibility.
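As a rough sketch (the function name and the numeric encoding of revisions are mine, not from any spec), the negotiated link can be thought of as the minimum of what the card and the slot each support:

```python
# Hedged illustration of PCIe link negotiation: the link runs at the
# lower of the two revisions and the smaller of the two lane counts.
def negotiated_link(card_rev: float, card_lanes: int,
                    slot_rev: float, slot_lanes: int) -> tuple:
    """Return (revision, lanes) the card/slot pair will actually run at."""
    return (min(card_rev, slot_rev), min(card_lanes, slot_lanes))

# A PCIe 3.0 X16 card in a PCIe 1.0 X16 slot runs as 1.0 x16:
print(negotiated_link(3.0, 16, 1.0, 16))  # (1.0, 16)
# A PCIe 2.0 X1 sound card in a PCIe 3.0 X16 slot runs as 2.0 x1:
print(negotiated_link(2.0, 1, 3.0, 16))   # (2.0, 1)
```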

Which slots did the earlier graphic cards use?
Prehistoric graphic cards used the PCI interface. Then the world moved on to the AGP interface for graphic cards. And now we use the PCIe interface. Physically and electrically, these are all completely different interfaces and offer no interoperability with each other.

Which companies make GPUs?
ATi and nVidia.

ATi was acquired by AMD in 2007.

So now, AMD and nVidia.

Fine... but when I go out to buy a graphic card, dealers offer me cards from various companies like MSI, eVGA, XFX, Sapphire etc.?

These are the board/card manufacturers. The chips (GPUs) you'll find on these cards are made by AMD or nVidia.

Who will provide service for the graphic card?
The board makers will provide the service. Visit their Indian websites to know more. AMD and nVidia don't provide service; they only provide the drivers.

Will an AMD/ATi card work on an Intel CPU platform? Will an nVidia card work on an AMD CPU platform?
Like I wrote above, the only requirements for a graphic card are a PCIe slot (and a good power supply). The processor brand has nothing to do with it.

My motherboard has DDR2 RAM. Will a graphic card with GDDR3 or GDDR5 memory work?
Yes. The RAM on your motherboard has nothing to do with the type of memory on the graphic card. All a graphic card needs is a PCIe slot and it will work.

How do I determine the performance of a graphic card? Video memory (VRAM)? Memory width? Core frequency?
None of these. The foolproof way to determine the performance of a graphic card is by looking at the frames-per-second delivered by it in games. So, check out reviews on websites like HardOCP, HardwareCanucks, Anandtech, Tomshardware et cetera.

Will an nVidia 7300 GT 1 GB be faster than an AMD HD 2600 XT 256 MB or an nVidia 7600 GT 256 MB?
No. The 7300 GT is simply not fast enough to make use of that 1 GB of video memory. The 7600 GT and HD 2600 XT are much faster despite having only 256 MB of video memory. Like I told you above, the only way to judge the performance of a graphic card is by looking at the FPS numbers it delivers in games. For this, you need to read reviews and research yourself.
(Note: The cards mentioned in this question are end-of-life and very old. Don't think these are good cards in the year 2020.)

Fun fact about laptop GPUs: don't be too pleased just because your Rs. 50,000 laptop has 1 GB or 2 GB of video memory. The AMD HD 6770M, the fastest mobile graphic card in laptops costing around Rs. 50,000 in the Indian market, performs only around the level of a desktop HD 5670, and is hardly helped by video memory above 512 MB. What about Intel HD Graphics? Intel GPUs suck as far as gaming is concerned. I hope you got your answer.
(Note: This example was true for the year 2011, but the same logic applies in 2020.)

Will AMD HD 5670 1GB GDDR3 be faster than AMD HD 5670 512 MB GDDR5?
GDDR3 and GDDR5 are memory types, and GDDR5 is considerably faster than GDDR3. So it is better to get the AMD HD 5670 512 MB GDDR5; the 5670 isn't fast enough to utilize 1 GB of memory anyway.

These days all high-end graphic cards use GDDR5. GDDR3 is mainly found in low end graphic cards. Many years ago, the low-end graphic cards used DDR2 memory.
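To illustrate why memory type matters more than memory amount, here is a rough calculation sketch. The clock figures below are typical for HD 5670 variants and are assumed for illustration only; the formula is memory bandwidth = effective data rate x bus width / 8:

```python
# Peak memory bandwidth from effective data rate and bus width.
def mem_bandwidth_gb_s(effective_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s (decimal units)."""
    return effective_mhz * 1e6 * bus_width_bits / 8 / 1e9

# HD 5670 has a 128-bit memory bus. Typical effective data rates:
# GDDR5 at ~4000 MHz effective vs GDDR3 at ~1600 MHz effective.
print(mem_bandwidth_gb_s(4000, 128))  # 64.0 GB/s
print(mem_bandwidth_gb_s(1600, 128))  # 25.6 GB/s
```

The GDDR5 card moves data roughly 2.5x as fast, which helps far more than an extra 512 MB of slow memory the GPU can't keep busy.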

Where do graphic cards get their power from?
Low-end graphic cards get all the power they need from the PCIe slot itself. The fastest card which can run entirely off slot power is the AMD HD 7750, which costs 7.2k at the moment [true for year 2011, not true right now]. But as the performance of cards increases, power consumption increases too. High-end cards get additional power from 6-pin or 8-pin PCIe power connectors coming from your power supply. Here's what one looks like:

*i.imgur.com/FjXcp.jpg (The two extra pins on this PCIe connector are detachable, which lets it fit both 6-pin and 8-pin sockets; this is a so-called 6+2 pin connector)

Now, cards like the GTS 450 and HD 6850 require only one 6-pin PCIe connector. As you go higher, cards might require two 6-pin PCIe connectors, then one 8-pin and one 6-pin, and finally two 8-pin PCIe connectors.

So, make sure your power supply has enough of these. Otherwise you'll have to use a 4-pin MOLEX to 6-pin PCIe converter, which looks like this:

*i.imgur.com/XPI7l.jpg
Two MOLEX give you a single 6-pin PCIe connector.
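As a hypothetical helper (the function name and logic are mine, not a real tool), you can sanity-check whether a PSU can feed a given card, counting two spare MOLEX plugs as one improvised 6-pin:

```python
# Check whether a PSU can power a card's PCIe connector requirements.
def can_power(need_6pin: int, need_8pin: int,
              have_6pin: int, have_8pin: int, molex: int) -> bool:
    """True if the PSU's connectors (plus MOLEX adapters) suffice."""
    if have_8pin < need_8pin:
        return False  # 8-pin sockets can only be fed by native 8-pin plugs here
    # A spare 6+2 (8-pin) plug also fits a 6-pin socket,
    # and two 4-pin MOLEX plugs make one adapted 6-pin.
    spare_8pin = have_8pin - need_8pin
    six_pin_available = have_6pin + spare_8pin + molex // 2
    return six_pin_available >= need_6pin

print(can_power(1, 0, 0, 0, 2))  # True: two MOLEX make the one needed 6-pin
print(can_power(2, 0, 1, 0, 1))  # False: one MOLEX can't make a second 6-pin
```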

To know about various power supply cables/connectors and other stuff, please visit this link: All about the various PC power supply cables and connectors

How important is the power supply (PSU)?

It is the heart of your system and the most important part of your computer. Never ever use a local/generic PSU if you want to use a graphic card in your computer. Kindly go through this thread: *www.thinkdigit.com/forum/power-sup...89-power-supply-blacklist-thread-newbies.html

What are reference and non-reference cards?

AMD and nVidia usually decide the default specifications and design the default cooler for their cards. This reference information is then passed on to board makers like MSI, Sapphire, XFX, eVGA and the likes. Board makers can either use this reference design as-is, or sell cards with their own custom-designed coolers for better cooling and overclocking headroom.

A small example: AMD fixed the core and memory clocks of the HD 6950 at 800 MHz and 1250 MHz respectively. MSI released a reference model. Then MSI also chose to release a factory-overclocked HD 6950 sporting a (better) custom-designed cooler, with a core clock of 850 MHz and a memory clock of 1300 MHz.

Obviously, this non-reference card from MSI is going to be priced slightly above a reference HD 6950.

A "non-reference" card can sport one or many of the following things:
1) A non-reference cooler which will generally be better.
2) Non-reference clocks which will generally be a bit higher. Such cards can be called "factory over-clocked" cards. Usually this will also be accompanied by a better non-reference cooler.
3) A non-reference PCB design sporting beefy VRMs and other changes.

Here are a few pictures:

AMD Radeon HD 6950 2GB - the card which AMD sent to reviewers on Day #1 of the launch. You will _not_ find this in the market. Getting cards into the market is the job of the board makers like MSI, XFX etc. This is the reference design which AMD decided upon: 800 MHz core clock and 1250 MHz memory clock.
*i.imgur.com/U1FrL.png

XFX HD 6950 2GB - XFX's card, identical to AMD's reference design. This is a reference card.
*i.imgur.com/qa4X9.png

XFX HD 6950 2GB CNDC "XXX Edition" - this card sports a non-reference cooler which is noticeably better than the reference cooler. It is mildly factory overclocked to 830 MHz core and 1300 MHz memory.
*i.imgur.com/CIYrd.png

MSI R6950 2GB Twin Frozr II/OC - a non-reference dual-fan cooler, better than those of the cards mentioned above. This card is factory overclocked to 850 MHz core and 1300 MHz memory.
*i.imgur.com/Pilpo.png

If you'll be overclocking your card, it's better to get a card with a better non-reference cooler.
 
1. Which Power Supply do you have? (Be EXACT while naming the company and model)
Ans:

2. What is your budget?
Ans:

3. Which resolution will you game at?
Ans:

4. What are your current computer specifications?
Ans:
 