GPU NEWS Channel

sukesh1090

Adam young
^^
lol yes I did, but did you bother reading my comment? :wink:
I told you FreeSync is not available for desktops yet and only supports laptops for now, and your article says the same thing. So in the future we may see monitors supporting it; you just have to wait.
But then if people have money and don't find anything else worth spending it on, they can go ahead and buy it. It's that simple.
If you read that article from AnandTech, he clearly states that G-Sync is for people who have high-end cards and eyesight sharp enough to notice the stuttering when V-Sync is enabled, but the truth is most people won't notice it or have gotten used to it. So basically G-Sync is an extra layer of comfort, like PhysX, which most people don't need.
I don't know why NVIDIA always spends time and resources developing these sorts of proprietary (useless) things. Instead they could have gone ahead and ported Mantle, since it's an open API, and now we can see that it really helps. But they won't, because they can't loot people with Mantle when AMD is already giving it away for free.
 

sukesh1090

Adam young
^^
Oh buddy, I can't keep saying the same thing again and again :(
Most new notebooks and netbooks will support FreeSync because they already have all the required features. Laptop screens use the variable VBLANK feature to reduce power consumption, so most new laptops will have it, and you can enable FreeSync with just a driver update; AMD says the latest Catalyst driver already supports it. They actually just bought random new Toshiba laptops and showed it working flawlessly. Desktop monitors, on the other hand, won't work for now, because variable VBLANK is a standard feature of DP 1.3 and we are currently on DP 1.2. Still, monitor manufacturers may implement it in new monitors even though it isn't a standard feature of DP 1.2.

Why does NVIDIA want to develop G-Sync when it's not that important?
Right, isn't it obvious? They get money from it; they can loot people in the name of G-Sync. [There is an ASUS monitor with G-Sync and a sibling of that monitor without it, and the price difference between them is more than $100. Hope it's now clear why they want to develop G-Sync.]
OK, found the monitor. It's the ASUS VG248QE (1080p monitor):
without G-Sync the price is $250 (street price)
with G-Sync the announced price is $400 (street price will probably be even higher)
Now it's left to the buyer whether he wants to spend an extra $150 to get G-Sync. If you ask me, a $50 difference would be fine; anything more than that, "better luck next time, NVIDIA".
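Just to spell out the math on that VG248QE comparison (both prices are the figures quoted above, one street and one announced, so treat this as a rough sketch):

```python
# Rough price math for the ASUS VG248QE example quoted above.
price_without_gsync = 250   # street price, USD (as quoted)
price_with_gsync = 400      # announced price, USD (as quoted)

premium = price_with_gsync - price_without_gsync
markup_pct = premium / price_without_gsync * 100

print(f"G-Sync premium: ${premium} ({markup_pct:.0f}% over the base monitor)")
# → G-Sync premium: $150 (60% over the base monitor)
```

A 60% markup on the same panel is the whole argument here: the module itself can't plausibly cost that much to make.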

In my personal experience I have never noticed stuttering with V-Sync, or maybe I'm just used to it, and if we started a poll here most people would say the same. Let me ask you: are you annoyed by stuttering when you play games with V-Sync enabled?
 
Last edited:

sukesh1090

Adam young
^^
I am not at all talking about people who own quad SLI or CrossFire setups of high-end cards. As I said before, if a person wants to spend $150 and that $150 doesn't matter to him, then nobody is stopping him from getting it.
As I said, laptops already have that tech, and the latest Catalyst driver supports it and works the same as G-Sync. Desktops aren't supported yet, but we may see it in the future. And in some games the lag has other causes; for example, Rivals and Batman AO are both poorly optimised, so we face lag there. G-Sync will not iron out every lag and hiccup; it corrects only one cause, the improper sync between monitor and GPU, where the monitor refreshes the frame even before the GPU has finished it, leading to stuttering. G-Sync will only fix that issue.
At the end of the day the conclusion is simple: for most people it's not as important as NVIDIA wants them to think, and if you have the money, nobody's stopping you from buying it. ;)

Ah... and about AMD launching its own G-Sync alternative: it's not happening, because they think it's not important. And their question is why NVIDIA wants to charge people so much when there is a cheap alternative. That's what they demonstrated with those two Toshiba laptops.
 

sukesh1090

Adam young
@sam_738844,
Now that you want to go harsh, so be it. You say games are well optimised? Just go and read the GameSpot review of why they gave Batman AO a 6.5 score; do me a favour and read it. Do you know why people are angry at DICE? Maybe you don't: it's because BF4 is very, very poorly optimised. And you say games nowadays are perfectly optimised. BTW, if you have observed tearing then you should enable V-Sync, lol; that's why we have V-Sync.
The best examples of well-optimised games, I think, are Codemasters games. DiRT 3, GRID 2: play them and you will know how well optimised they are.
You don't have to own high-end gear to understand who is looting and who is not. You just have to be able to read English and have an internet connection. That's all you need.
 

sukesh1090

Adam young
What benefit AMD gets from the consoles is yet to be seen. In the last 3 months alone, the Xbox One and PS4 combined sold 7.2 million units, and the AMD chip inside costs $100, so in 3 months AMD got $720 million. That's a lot of money in 3 months, even though the profit from it may be less.
He says five years; I think that much time is more than enough for AMD, Microsoft, and Sony to make money out of the present-gen consoles. Let's see. AMD says it will get around $3 billion from console sales in 3 to 4 years. Fingers crossed.
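A quick sanity check of those numbers (both inputs are the figures quoted in the post, not official AMD data):

```python
# Back-of-the-envelope check of the console figures quoted above.
units_sold = 7_200_000   # Xbox One + PS4 combined over ~3 months (as quoted)
chip_price = 100         # assumed price AMD gets per console APU, in USD (as quoted)

revenue = units_sold * chip_price
print(f"~3-month revenue: ${revenue:,}")
# → ~3-month revenue: $720,000,000
```

Note this is revenue, not profit; as the post itself says, the margin on these semi-custom chips may be thin.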
 

topgear

Super Moderator
Staff member
Technically FreeSync sounds more appealing, but NVIDIA has its point on this, so we need to wait for FS vs. GS. Anyway, I'm more interested in the 14.1 beta driver performance and, more importantly, a side-by-side image comparison test, just to be sure there's no driver-level image quality compromise to improve performance, or whether AMD made something more like PhysX :D
 

topgear

Super Moderator
Staff member
Seriously, the first news item does not belong in the GPU news section; there should be a thread on it in the technology section ;)
 

ASHISH65

Technomancer
Report: Nvidia Prepping Maxwell-based 750 Ti for February

nvidia-kepler-2012-maxwell-2014,4-1-299521-22.png



Maxwell Nvidia GTX 750 Ti is Apparently Slower than the GTX 660 - Alleged Leaked Benchmarks

nvidia_gtx750ti_sp.jpg
 

vickybat

I am the night...I am...
A quote from Tom's article:

The latest scuttlebutt says we could see the first Maxwell-based GPU as early as next month. SweClockers says it will be a TSMC-made GeForce GTX 750 Ti manufactured on the 28 nanometer process, and it will replace the GTX 650 Ti Boost.

It's actually a replacement for the 650 Ti Boost, and currently in a 28nm avatar.
It might be a testing ground for Maxwell before it moves to the 20nm node for the GeForce 8xx series.
 

vickybat

I am the night...I am...
The 780 GHz is faster than the R9 290 in all departments. It's even faster than an overclocked R9 290.

Considering that they are both priced similarly, the 780 GHz wins brownie points for being way cooler (65°C at full load as opposed to 95°C on the R9 290), far quieter, and faster by a margin, out of the box.
The 780 GHz also allowed higher in-game settings than the 290.

This is like the 780 reborn. :)
 