AMD FX 8350 discussion

vickybat

I am the night...I am...
@cyberkid,
HD4000 is equal to nothing in gaming performance. You may think all the time that you have a backup, but when the time comes that you have to drop from a discrete card to the HD4000, you will feel its wrath. BTW, if I remember correctly, those Delta peripherals prices are excluding taxes; I may be wrong.
BTW, AMD could have continued giving those comparatively good onboard graphics chips on their mobos rather than throwing them out of the window.

Power consumption only matters to people who run their computer 24x7 under full load, but if you are running your computer at full load for hardly 2-3 hrs a day, then you will save a negligible amount of power. If the world is really going the "go green" way, then every house out there should have CFLs, none of the shops should sell 0,100 watt bulbs, street lights shouldn't glow even in the morning, we shouldn't use electric geysers, etc. We waste a hell of a lot of power every day on useless things, and then we try to save 0.1 or even 0.01% of that in computers. What's the use of it?

Do you even know what the HD 4000 is? Do you know how much it differs from the HD 3000?

Performance of the HD 4000 sometimes touches the Llano A8. It supports DirectCompute 5.0 and gives playable framerates in all titles.
As for the part you put in bold, care to check the following:

AnandTech - The Intel Ivy Bridge (Core i7 3770K) Review

Are those unplayable framerates? Besides, those crappy onboard GPUs that AMD used to give on its motherboards are no match for the HD 4000. Again, misleading info.
The next iteration of Intel HD that's gonna come with Haswell will have DX11.1, full compute performance with DirectCompute (the HD 4000 has this), OpenCL 1.2, OpenGL 4.0 (gaming) and a newer and more powerful version of Quick Sync.

This is gonna also assist the CPU in OpenCL-accelerated apps without the need for a discrete GPU. It also borrows a few elements from Intel's Larrabee architecture (Broadwell will be a full redesign and might have a dedicated vector processor like Larrabee for compute). You know how Xeon Phi performs, right? It has already been implemented in a supercomputer and will be commercially available very soon.

Mate, don't post stuff without giving it a second thought.

Looking at your mockery of the environment, I really feel sad. Although it has no bearing on this thread, you don't understand the significance.
In my house we use CFLs only, and I guess lots of members here do the same. The electric geysers you talk about now come with 5-star ratings that save a considerable amount of power over previous iterations. In short, each and every industry is doing its part, and so is the CPU industry. If power consumption were negligible, why the heck would reviewers even include it in their tests? Think about this in a broader way and don't take it in a personal context.
 

Skud

Super Moderator
Staff member
Do you even know what the HD 4000 is? Do you know how much it differs from the HD 3000?

Performance of the HD 4000 sometimes touches the Llano A8. It supports DirectCompute 5.0 and gives playable framerates in all non-DX11 titles coz it does not support DX11.

As for the part you put in bold, care to check the following:

AnandTech - The Intel Ivy Bridge (Core i7 3770K) Review

Are those unplayable framerates? Besides, those crappy onboard GPUs that AMD used to give on its motherboards are no match for the HD 4000. Again, misleading info.

Performance of the FX 8350 also sometimes touches the i7 3770K, and even goes higher, while being much cheaper, which the HD 4000 is not (I am taking the IGP in isolation) compared to the Llano IGP. By that logic, there's absolutely nothing wrong with AMD's CPU performance. And in the links you have given, as far as I can see, in none of the games can the HD 4000 touch the Llano IGP, which is more than a year old. Even in an anemic game like Minecraft, the HD 4000 was more than 3 times slower. Intel's graphics progression is worse than AMD's CPU progression. Even the compute performance of the HD 4000 is highly inconsistent.

And the HD 4000 does support DX11 & OpenCL; wrong info. Check the very AnandTech review you have posted.


The next iteration of Intel HD that's gonna come with Haswell will have DX11.1, full compute performance with DirectCompute (the HD 4000 has this), OpenCL 1.2 and OpenGL 4.0 (gaming).

This is gonna also assist the CPU in OpenCL-accelerated apps without the need for a discrete GPU. It also borrows a few elements from Intel's Larrabee architecture (Broadwell will be a full redesign and might have a dedicated vector processor like Larrabee for compute). You know how Xeon Phi performs, right? It has already been implemented in a supercomputer and will be commercially available very soon.


Even Llano APUs have been used in HPC solutions; you are talking about specialized solutions here, which don't count in general users' context. And regarding OpenCL, well, even AMD has been doing it for quite some time without a discrete GPU. Nothing new.


Mate, don't post stuff without giving it a second thought.

Looking at your mockery of the environment, I really feel sad. Although it has no bearing on this thread, you don't understand the significance.
In my house we use CFLs only, and I guess lots of members here do the same. The electric geysers you talk about now come with 5-star ratings that save a considerable amount of power over previous iterations. In short, each and every industry is doing its part, and so is the CPU industry. If power consumption were negligible, why the heck would reviewers even include it in their tests? Think about this in a broader way and don't take it in a personal context.


There's nothing wrong with going green; the point here is that AMD is not "red" the way it has been painted. Reviewers generally give the worst-case scenario of power consumption; they never compare power consumption simulating the application loads that the majority of users will run on their PCs, say a browser, word processor, games, photo editing software etc. You know what, most of us don't really use our PCs to run FurMark or Prime95.
 

vickybat

I am the night...I am...
Performance of the FX 8350 also sometimes touches the i7 3770K, and even goes higher, while being much cheaper, which the HD 4000 is not (I am taking the IGP in isolation) compared to the Llano IGP. By that logic, there's absolutely nothing wrong with AMD's CPU performance. And in the links you have given, as far as I can see, in none of the games can the HD 4000 touch the Llano IGP, which is more than a year old. Even in an anemic game like Minecraft, the HD 4000 was more than 3 times slower. Intel's graphics progression is worse than AMD's CPU progression. Even the compute performance of the HD 4000 is highly inconsistent.

And the HD 4000 does support DX11 & OpenCL; wrong info. Check the very AnandTech review you have posted.

Yup, got the DX11 part wrong. Post edited.

Now coming to the point. How can you compare a CPU with a GPU in non-compute tasks like gaming? There we see the difference in frames, and anything above 30 fps for a high-end title at low settings is playable. The results contradict the person's statement in bold which I had quoted. Notice the word unplayable?

Besides, rendering frames per second and doing compute tasks are completely different aspects. A slow CPU is slow with respect to work done in unit time. Rendering a few frames less makes a GPU relatively slower, but until and unless it delivers an unplayable or subpar gameplay experience, it cannot be ruled out. I don't see the 3870K IGP excelling brilliantly in gaming, but it does so in OpenCL tasks owing to its VLIW5 architecture.

I won't comment on Haswell's IGP yet; whether or not it's a significant step up, only time will tell. But talking about HPC, Intel's Larrabee elements are massively significant, and if implemented in the IGP (Haswell has some, but Broadwell will use them at full scale), they have the potential to challenge AMD's GCN in HPC apps, which is more vital here than gaming.


There's nothing wrong with going green; the point here is that AMD is not "red" the way it has been painted. And reviewers never compare power consumption simulating the application loads that the majority of users will run on their PCs, say a browser, word processor, games, photo editing software etc. You know what, most of us don't really use our PCs to run FurMark or Prime95.

Nobody's painting AMD red. We are not talking about Prime95 or FurMark here either. And no, the majority of users buying a 3770K, 3570K or 8350 don't just use the occasional browser or word processor. At least not me. Even while we play games like Crysis or Battlefield, we stress the CPU to load levels, and power consumption is not negligible there. I play for hours at a stretch.

While encoding large 6 GB+ rips in HandBrake, my CPU takes more than half an hour (sometimes even more) with all cores operating at 100% at max temperatures. If you have a database client like Oracle running in the background, it uses CPU resources most of the time. I'm not talking about people who do the occasional browser and word-processor stuff. Watching a 1080p movie with post-processing filters too puts a bit of load on the CPU, which directly affects power consumption. There are so many other aspects to think of. The power consumption tests used by reviewers use general productivity, content creation and gaming apps (real world, not synthetic) to test load power. FurMark or Prime95 don't even come into the picture.
 

Skud

Super Moderator
Staff member
Yup, got the DX11 part wrong. Post edited.

Now coming to the point. How can you compare a CPU with a GPU in non-compute tasks like gaming? There we see the difference in frames, and anything above 30 fps for a high-end title at low settings is playable. The results contradict the person's statement in bold which I had quoted. Notice the word unplayable?

Besides, rendering frames per second and doing compute tasks are completely different aspects. A slow CPU is slow with respect to work done in unit time. Rendering a few frames less makes a GPU relatively slower, but until and unless it delivers an unplayable or subpar gameplay experience, it cannot be ruled out. I don't see the 3870K IGP excelling brilliantly in gaming, but it does so in OpenCL tasks owing to its VLIW5 architecture.

I won't comment on Haswell's IGP yet; whether or not it's a significant step up, only time will tell. But talking about HPC, Intel's Larrabee elements are massively significant, and if implemented in the IGP (Haswell has some, but Broadwell will use them at full scale), they have the potential to challenge AMD's GCN in HPC apps, which is more vital here than gaming.


I had gamed at 13x7 with the 880G IGP (yeah, even Crysis) and hadn't found it as crappy as you have posted earlier. If playability is all you are concerned about, then even such an older-gen IGP is sufficient at low res and low settings. The HD 4000 is nothing groundbreaking (I am not even talking about how costly it is) and pales in comparison to AMD's APUs (even the older Llano). Same goes for performance in OpenCL apps. Trinity is already ahead, and given the compute performance of current GCN-based cards, next-gen APUs can only improve.


Nobody's painting AMD red. We are not talking about Prime95 or FurMark here either. And no, the majority of users buying a 3770K, 3570K or 8350 don't just use the occasional browser or word processor. At least not me. Even while we play games like Crysis or Battlefield, we stress the CPU to load levels, and power consumption is not negligible there. I play for hours at a stretch.

While encoding large 6 GB+ rips in HandBrake, my CPU takes more than half an hour (sometimes even more) with all cores operating at 100% at max temperatures. If you have a database client like Oracle running in the background, it uses CPU resources most of the time. I'm not talking about people who do the occasional browser and word-processor stuff. Watching a 1080p movie with post-processing filters too puts a bit of load on the CPU, which directly affects power consumption. There are so many other aspects to think of. The power consumption tests used by reviewers use general productivity, content creation and gaming apps (real world, not synthetic) to test load power. FurMark or Prime95 don't even come into the picture.


No single power consumption test gives results over a significant period of time; as I have already posted, they are mostly worst-case scenarios. The best I have seen is Tom's reviews, where they measure power consumption across their whole benchmark run, instead of a single run of a particular application where the goal is to stress the CPU/GPU to its fullest. And unlike you, I have rarely seen games utilizing 100% load across all the cores of the CPU throughout their runtime, yeah, even with Crysis. Video encoding/3D rendering can stress your CPU to the fullest, but then again, as the FX 8350 will complete the task before an i5 3570K, the higher power consumption will be negated to some extent.
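The energy-for-a-fixed-task argument in that last sentence is easy to put numbers on. A minimal sketch, where the wattages and completion times are made-up assumptions for illustration, not measured figures:

```python
# Energy consumed for a fixed task = average load power (W) x time to finish (h).
# All numbers below are illustrative assumptions, not benchmark results.

def task_energy_wh(load_watts: float, minutes: float) -> float:
    """Watt-hours consumed to complete one task at a given load power."""
    return load_watts * (minutes / 60.0)

# Suppose the higher-draw chip finishes the encode sooner:
fx_wh = task_energy_wh(195, 30)   # higher draw, shorter run
i5_wh = task_energy_wh(140, 36)   # lower draw, longer run

print(f"FX 8350: {fx_wh:.1f} Wh, i5 3570K: {i5_wh:.1f} Wh")
```

With these hypothetical figures, the per-task energy gap (roughly 13.5 Wh) is much smaller than the raw wattage gap suggests, which is exactly the "negated to some extent" point.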

Point is, money matters. Unless you are running applications which can stress all the CPU's cores at full load for a significant portion of 24 hours (taking it for granted you are running it 24x7), you will take a long time to actually recover the additional cost. And what that additional amount can fetch you during that time in terms of investment, savings etc. is a different topic altogether. If the companies were so serious about environmental issues, they shouldn't have charged a premium for energy-efficient components.
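The "long time to recover the additional cost" claim can be sketched the same way; the price premium, wattage gap, daily load hours and tariff below are all hypothetical placeholders, so plug in your own numbers:

```python
# Years needed for a lower-power chip to pay back its price premium
# through electricity savings alone. All inputs are assumptions.

def breakeven_years(premium_rs: float, watt_gap: float,
                    load_hours_per_day: float, rs_per_kwh: float) -> float:
    """Time until the energy savings equal the extra purchase price."""
    kwh_saved_per_day = (watt_gap / 1000.0) * load_hours_per_day
    rs_saved_per_year = kwh_saved_per_day * rs_per_kwh * 365
    return premium_rs / rs_saved_per_year

# e.g. a Rs 2500 premium, 60 W load gap, 3 h of full load a day, Rs 5/kWh:
print(round(breakeven_years(2500, 60, 3, 5), 1))  # several years to break even
```

At light daily loads the payback period stretches to many years, which is the argument being made; at 24/7 full load it shrinks dramatically.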
 

vickybat

I am the night...I am...
I had gamed at 13x7 with the 880G IGP (yeah, even Crysis) and hadn't found it as crappy as you have posted earlier. If playability is all you are concerned about, then even such an older-gen IGP is sufficient at low res and low settings. The HD 4000 is nothing groundbreaking (I am not even talking about how costly it is) and pales in comparison to AMD's APUs (even the older Llano). Same goes for performance in OpenCL apps. Trinity is already ahead, and given the compute performance of current GCN-based cards, next-gen APUs can only improve.

I can say the same thing about the HD 4000. Crysis Warhead gives 30+ fps at 1366x768 medium settings. Is that unplayable? Anyway, before posting generalized comments, I want you to just go through Xeon Phi's architecture and what dedicated vector processors can do for OpenCL acceleration. I would say it's a direct threat to AMD's GCN implementation in its FirePro GPUs. Just saying "Trinity is ahead, GCN will be ahead" won't contribute anything to this discussion.

Anyway, this discussion is for another topic which will soon be put up in a different thread, so let's not crap up this one.


No single power consumption test gives results over a significant period of time; as I have already posted, they are mostly worst-case scenarios. The best I have seen is Tom's reviews, where they measure power consumption across their whole benchmark run, instead of a single run of a particular application where the goal is to stress the CPU/GPU to its fullest. And unlike you, I have rarely seen games utilizing 100% load across all the cores of the CPU throughout their runtime, yeah, even with Crysis. Video encoding/3D rendering can stress your CPU to the fullest, but then again, as the FX 8350 will complete the task before an i5 3570K, the higher power consumption will be negated to some extent.

That bold part is highly debatable. I think you did not get what the following pic shows:

*i.imgur.com/Up6Hz.png

Are you saying the 3570K is weak in 3D rendering? Show me a real-world 3D application like Maya where the 8350 is beating a 3570K considerably.
How about overclocked performance? Don't bring the price factor in here. Do you see the performance gains the 3570K has in the above slide?

What about the power consumption at those clocks? If the 8350 can't negate the performance deficit, how do you expect it to negate the power consumption?

Do you know how much power the 8350 consumes at 4.5 GHz?

Point is, money matters. Unless you are running applications which can stress all the CPU's cores at full load, and that too for a significant portion of 24 hours (taking it for granted you are running it 24x7), you will take a long time to actually recover the additional cost. And what that additional amount can fetch you during that time in terms of investment, savings etc. is a different topic altogether. If the companies were so serious about environmental issues, they shouldn't have charged a premium for energy-efficient components.

Sure, money matters, but not by sacrificing other factors. I want a CPU in my system which does everything, gives blistering performance when overclocked, and does so while consuming a third of the power of its competition, which does not excel in any of these. In short, this is performance per watt, and it has significant importance in every industry, including CPU and GPU. In the case of Kepler vs GCN, I can say GCN comes very close in power consumption and delivers better performance to negate that.

But that's not the case with piledriver.
 

Skud

Super Moderator
Staff member
Sure, money matters, but not by sacrificing other factors. I want a CPU in my system which does everything, gives blistering performance when overclocked, and does so while consuming a third of the power of its competition, which does not excel in any of these. In short, this is performance per watt, and it has significant importance in every industry, including CPU and GPU. In the case of Kepler vs GCN, I can say GCN comes very close in power consumption and delivers better performance to negate that.

But that's not the case with piledriver.


And there's performance per rupee too; not everybody requires blistering performance, and not everybody can afford that blistering performance. No point arguing over that: for you power matters, for others it's money.

And just like you say 30 fps at low/medium settings is acceptable (I know numerous guys at this very forum who will disagree with that), waiting an extra couple of minutes to get the job done (or less work done in unit time, as said by you) is also acceptable to many.
 

Cilus

laborare est orare
What is the point of discussing the Larrabee and Xeon Phi architecture here? Xeon Phi might be superior or equal to GCN in compute tasks, but it is not made for any kind of gaming task. So what is the point? And regarding IGP progress, I don't expect anything groundbreaking from Haswell's IGP, looking at the progress of their IGP performance so far. Whether games are playable or not, I will go for the one which offers me better performance at a lower price. Even the Llano IGP is far faster than the HD 4000, and the Trinity IGP simply crushes it. Now for getting the HD 4000, you need to spend around 11K minimum, whereas for Llano or Trinity, I think by paying 7K you will get a quad-core CPU equivalent to the Intel 3rd Gen and 2nd Gen i3s and a GPU like the HD 6570.
Similarly, if you buy an FX 6300 for around 8K and a 4K graphics card like the HD 6670 DDR3, the total package will deliver far better gaming performance than an i5 3470 with its HD 4000 at the same price point.
 

rijinpk1

Aspiring Novelist
Now for getting the HD 4000, you need to spend around 11K minimum, whereas for Llano or Trinity, I think by paying 7K you will get a quad-core CPU equivalent to the Intel 3rd Gen and 2nd Gen i3s and a GPU like the HD 6570.
Similarly, if you buy an FX 6300 for around 8K and a 4K graphics card like the HD 6670 DDR3, the total package will deliver far better gaming performance than an i5 3470 with its HD 4000 at the same price point.

The i5 3470 has only HD 2500 graphics. The only i5 processor with the HD 4000 is the i5 3570K as of now.

 

vickybat

I am the night...I am...
And there's performance per rupee too; not everybody requires blistering performance, and not everybody can afford that blistering performance. No point arguing over that: for you power matters, for others it's money.

And just like you say 30 fps at low/medium settings is acceptable (I know numerous guys at this very forum who will disagree with that), waiting an extra couple of minutes to get the job done (or less work done in unit time, as said by you) is also acceptable to many.

BTW, the 8350 doesn't have any IGP, so technically it will give zero fps. What's the point of arguing over this? And Trinity does not even fit here coz of its abysmal CPU performance. We are talking about pure computing performance here, and besides, the 3570K has OpenCL acceleration support out of the box without the need for a discrete GPU, which the 8350 completely lacks. You did not comment on the performance chart, which I was waiting for. The discussion ends here because the overclocking performance (not values) of the 3570K is superior, as shown above. Not to forget, much lower power consumption.

Besides, I won't suggest anyone a Piledriver-based 8350 with a 5-6K motherboard and tell them to overclock, considering the notoriety of 125 W CPUs. People who do this save money foolishly (no personal offense to anyone, as it's a generalized comment). So for a balanced config, you have to invest in at least a 9-10K motherboard with the 8350 to better harness the CPU.

But then, you have better options. The guy quoted below is not dumb to make such a comment:

On average, the FX-8350 and Core i5-3570K do pretty well at their stock settings, the Intel-based box about 10% quicker. This will likely change as we fold more heavily-threaded tests into the Marathon, starting this quarter. Naturally, you'll want to look closest at the benchmarks that matter to you specifically when you evaluate performance, since each architecture excels in a different way.

When it comes to overclocking though, Intel extends its lead with significantly lower power consumption and much better performance. If we were measuring efficiency, that'd be a home run. Yes, Xigmatek's Loki is insufficient for overclocking the 125 W FX-8350. But let's be realistic. If we wanted to squeeze better performance out of AMD's chip, we'd need to spend more money on cooling, and power consumption would rise even faster as higher voltages paved the way for more aggressive clock rates. It'd be a great experiment, and we might even play around with it in the future, but it's clear that Intel's Core i5-3570K remains the better choice for overclockers in this price range.

The bold parts say, in short, everything that one needs to know and comprehend. Nothing more.

One thing that goes in Piledriver's favour is its support for newer instruction sets that Ivy Bridge lacks. That's one advantage I cannot deny, and future apps with FMA support will benefit the 8350 more than the 3570K. Until Haswell, Intel is at a disadvantage here, but not an immediate one, coz those apps are quite a while away.
 

Skud

Super Moderator
Staff member
And surely you have missed this, from that same review:-

Since we ordered the parts for this build, some prices are up and others are down. Fortunately, the current $1,009 price tag is very close to the $1,000 target. Keep in mind that the previous configuration went $57 over budget, and the graphics card alone is down $50 since last quarter.


That extra $48 will actually fetch the Gigabyte 7970 OC over the 670, and those graphs will look more equal in length.
 

vickybat

I am the night...I am...
And surely you have missed this, from that same review:-

That extra $48 will actually fetch the Gigabyte 7970 OC over the 670, and those graphs will look more equal in length.

A person with a $1000 budget won't mind spending $48 extra for a 7970 OC, which I agree is arguably better than a 680 too. It won't break his/her bank.

Perhaps you did not look at those graphs properly. There are two bars: the green one depicts application performance while the blue one is gaming performance. Swapping the 670 for a 7970 will change the blue bar but not the green one. So your point does not hold true here.
 

Skud

Super Moderator
Staff member
I guess right from the beginning the bone of contention here is how the savings on the CPU can help you get more elsewhere. Tom's have kept the bars separate; mix them up and we get the overall performance of the systems, rather than separate games/apps performance. And that's how people use their systems, as a complete package, instead of as individual components. If you throw in some compute tests, the Radeon 7970's lead over the 670 will ensure more or less two equally capable systems, but quite different in their strengths and weaknesses.
 

vickybat

I am the night...I am...
I guess right from the beginning the bone of contention here is how the savings on the CPU can help you get more elsewhere. Tom's have kept the bars separate; mix them up and we get the overall performance of the systems, rather than separate games/apps performance. And that's how people use their systems, as a complete package, instead of as individual components. If you throw in some compute tests, the Radeon 7970's lead over the 670 will ensure more or less two equally capable systems, but quite different in their strengths and weaknesses.

I don't agree. The bars are separate. Why the heck do we need to mix them up? :confused:
Gaming performance is a separate entity coz it mixes in a GPU.
OpenCL performance is not even in question here because this is a CPU test. Dragging in OpenCL will make it GPU-dependent. Why would anybody forcefully want to pair the 670 with the 3570K and the 7970 with the 8350 to test compute strength and make the 3570K look inferior? Doesn't make any sense.

Anybody who wants compute will pair the 7970/7950/7870/7850/7770/7750 with a 3570K rather than an 8350. And you know why.

The bottom line about the choice of CPU is clear. There is no other bone of contention.
And I guess this is also getting reflected in the respective sales figures.
 

Skud

Super Moderator
Staff member
I don't agree. The bars are separate. Why the heck do we need to mix them up? :confused:
Gaming performance is a separate entity coz it mixes in a GPU.
OpenCL performance is not even in question here because this is a CPU test. Dragging in OpenCL will make it GPU-dependent. Why would anybody forcefully want to pair the 670 with the 3570K and the 7970 with the 8350 to test compute strength and make the 3570K look inferior? Doesn't make any sense.

Anybody who wants compute will pair the 7970/7950/7870/7850/7770/7750 with a 3570K rather than an 8350. And you know why.

The bottom line about the choice of CPU is clear. There is no other bone of contention.
And I guess this is also getting reflected in the respective sales figures.


They are talking about $1000 PCs and comparing their overall performance, and it seems like a CPU test to you. :shock: Pairing the 670 with the 3570K and the 7970 with the 8350 is to keep the budget equal for both systems, and that absolutely makes sense. Doesn't really matter if you think otherwise, as you are always on an unlimited budget. IIRC, the OP has a budget of around 13K, and that 3570K doesn't even fit the bill.

Going by sales figures, I remember the initial months of the 680, when the 7970 far outpaced it in sales while being generally slower. Lower sales didn't necessarily make the 680 a less capable solution.
 

sukesh1090

Adam young
@vickybat,
I don't know if commenting on the HD 4000 makes any sense now, but yes, I have read and seen HD 4000 benchmarks, because I am not insane: I won't spend 16K on a processor to play games at low settings, that too only selective games giving 30 fps at 1080p. I'd rather spend 8K on Trinity and get 2x the FPS. That makes more sense if your intention is gaming. Most of the people out there who use desktops do so for gaming, or at least gaming comes in as one of the top priorities. If you doubt that, then please do me a favour and check the PC buying thread, and you will see most of them listing gaming as one of the top priorities. So we mostly spend $1000s on a PC either for gaming or for video editing, animation or Photoshop stuff, which mostly works better if you give it more graphical horsepower than processing power. So it boils down to the GPU, so the HD 4000 doesn't make any sense at that high a price, and yeah, even Trinity doesn't make any sense there, and it doesn't need to, because its price justifies it. So if you want to do those mostly GPU-taxing works, then save 4K and get a better gfx card and enjoy better performance, or if you don't have that much money, then again the 8350 looks the better choice, because you have less money and you save 4K.
About using CFL lamps: come on bro, it's not personal, and you know India, and every point I made there holds good. Now take the geyser coming with 5 stars; still, tell me, can it beat "50-star" solar energy? Yet people prefer the geyser over solar. Why?
You are talking about saving 100 W with IB, and there the geyser takes 2 kW. So think about it and tell me how we are going the go-green way.

@cyberkid,
Buddy, if you don't want to game, then you don't need an i5 or 8350; you can do the job with a low-end Pentium dual core, i3 or Trinity. As you yourself said, you are doing your job with that 5-year-old computer. Now tell me, if you think of upgrading, what will be the reason at the top of the list?
Gaming, isn't it? Then why bother to argue that we don't spend that much money for gaming?
 

Skud

Super Moderator
Staff member
@niz04:

I am locking the thread before things turn ugly. If you are going to purchase a complete system, feel free to make a separate thread quoting your budget and using the template in the proper section. If CPU/Mobo & GPU are your only concern, you have already got enough feedback to make a decision.
 

quan chi

mortal kombat
Sorry to bump an old thread, but guys, can anybody using the 8350 tell me what the impact on the electricity bill is? I mean, in India, how much extra do we have to pay if we are using it instead of Intel?
 

sumonpathak

knocking on heavens door
I jumped from an E7500 to the FX 8350 for my oldest rig, and I use it for 24/7 folding. The bill difference is not astronomical, or even high.
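For anyone who wants to estimate this for their own setup, a minimal sketch; the 60 W load gap and Rs 5/kWh tariff here are assumptions, so substitute your own measured draw and your state's slab rate:

```python
# Rough extra monthly running cost of a higher-draw CPU kept under load.
# The wattage gap and tariff below are assumptions, not measurements.

def extra_monthly_cost_rs(extra_watts: float, rs_per_kwh: float,
                          hours_per_day: float = 24.0) -> float:
    """Extra rupees per 30-day month for the given additional load power."""
    return (extra_watts / 1000.0) * hours_per_day * 30 * rs_per_kwh

# Say the 8350 box draws ~60 W more at load, running 24/7 for folding:
print(round(extra_monthly_cost_rs(60, 5)))  # a few hundred rupees a month
```

Even under a constant 24/7 load the difference works out to a few hundred rupees a month at these assumed figures, which matches the "not astronomical" experience above; at a few hours a day it shrinks to pocket change.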
 