“Crysis makes me sick”


amitava82

MMO Addict
I’m guessing this title will attract quite a few gamers.

Around 3 years ago Far Cry was launched by the same developers behind Crysis: Germany-based Crytek. The game was an average FPS that didn't bring anything new to the genre, but it was still a pleasant game, playable at least once. Back then, just like now, graphics were at the center of things.

Far Cry had beautiful environments that few computers at the time could handle. Hardware websites and enthusiasts quickly made Far Cry a benchmark standard for all types of hardware. A little later, however, something happened that became the beginning of this whole story: AMD launched their 64-bit Athlon 64 processor and went hunting for selling points.

Because AMD was the first to bring 64-bit processors to regular home PCs, there was almost no software that supported the new technology. This made it difficult for AMD to convince consumers of the advantages of more "bits in the processor". AMD was simply forced to persuade developers to use the new technology, and one of the goals of this campaign was a 64-bit version of Far Cry.

Apparently AMD managed to "convince" Crytek. Around the same time that Microsoft released their 64-bit version of Windows XP, a patch popped up on AMD's website promising gold to those with the courage to buy a new processor and upgrade their operating system. The advantages of "more bits in the processor" were demonstrated with screenshots showing more badass explosions and more detailed textures. Isn't 64-bit wonderful?

For those of us with our feet on the ground, these arguments were not as convincing. 64-bit in fact has nothing to do with bigger textures. Being able to address more memory and having access to wider registers can make it easier to handle large amounts of data, but at the time no personal computer was even close to breaking the 32-bit barrier. In short, this PR scam had nothing to do with "more bits in the processor".
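To put that in concrete terms, here is a minimal C++ sketch of my own (not from the original article) showing the only thing "more bits" actually guarantees: wider pointers, and with them a larger addressable range. Build the same file as 32-bit and as 64-bit and compare the output; nothing about texture quality changes.

#include <cstdio>
#include <cstdint>

int main() {
    // Pointer width is what "32-bit" vs "64-bit" really refers to here.
    std::printf("pointer width: %zu bits\n", sizeof(void*) * 8);
    // SIZE_MAX is the largest byte count/offset the build can express:
    // roughly 4 GiB on a 32-bit build, astronomically more on a 64-bit build.
    std::printf("max addressable bytes: %zu\n", SIZE_MAX);
    return 0;
}

Bigger textures are a content and video-memory decision; the CPU's register width does not create them.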

Back to the present day and the launch of the Crysis demo. Just like last time, an enormous amount of hype was built up, largely around the astounding graphics. By using Microsoft's latest graphics standard DirectX 10, which is only available in Windows Vista, the developers have supposedly been able to push the boundaries of what is possible with today's hardware. That is the official version, at least.

The truth is that the real purpose of DirectX 10 is to make development easier by cleaning up the API and supplying new useful functions. This, however, is nothing the consumer notices, and therefore Microsoft must point out DirectX 10's "graphical improvements" in order to convince gamers to upgrade to Windows Vista. In reality DX10 does not mean drastically improved visual effects, at least not with today's graphics cards. There is a certain repetition of history to be seen here, right?

And then, a few days after the Crysis demo launched, the bad news was announced: under DirectX 9 you can't run the game at the "Very High" settings, which drastically improve the visual experience compared to the lower settings. A member at Crysis-Online poked around in the demo files and found a way to get almost exactly the same visual quality under DirectX 9. This means the developers (Crytek) had purposefully held back the DirectX 9 settings to make Microsoft's new technology appear superior. Apparently Crytek doesn't mind lying to their customers.
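For what it's worth, the tweak worked roughly like this (a sketch from memory, not taken from the article; the console-variable names below are illustrative examples, and the actual values come from the "Very High" config files shipped inside the demo itself):

; autoexec.cfg placed in the Crysis demo folder -- illustrative sketch only
; Copying the demo's own "Very High" console-variable values into this file
; makes the DX9 renderer use them as well.
r_sunshafts = 1          ; example: volumetric light shafts
e_water_ocean_fft = 1    ; example: animated ocean surface
r_UsePOM = 1             ; example: parallax occlusion mapping

In other words, the effects were already in the DX9 code path; they were simply switched off by default.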

This is not all. Crytek CEO Cevat Yerli was interviewed a while back by Shacknews and talked about how beneficial multi-core processors would be for the game. Finally, those who had spent big bucks on quad cores would see their increased performance.

Quad core was the advice Crytek had for hopeful gamers saving money for upgrades. What was the reality again? The reality is that four cores give zero, I repeat, ZERO performance increase in Crysis. And that's not all, because once again the 64-bit question has to be addressed: Cevat Yerli was also interviewed by GameSpot, among others, praising "more bits in the processor".

Better performance at higher graphics settings? Not in reality. The truth is that 64-bit improves NOTHING in Crysis!

This is of course the demo version we are talking about, but everything points toward the full version of the game behaving the same way. Is this the kind of behaviour we enthusiasts and gamers will have to live with in the future? Game developers becoming part of the marketing of new technology and hardware, no longer concentrating on delivering the best possible product but on convincing consumers to open their wallets and needlessly upgrade their systems? I assume money has changed hands more than once behind the scenes, and the suspects need not even be named. As a true gamer and hardware enthusiast, I declare that Crysis makes me sick.

Ctrl+C Ctrl+V: Here

Well, the article does have some valid points and gets them across.
 

naveen_reloaded

!! RecuZant By Birth !!
Demo versions are sure to be compressed and will miss out on a few features to keep the download small.
We should take the final version and discuss that.
And the title really is attractive..
But most of the things said don't have proof... or I think you weren't that elaborate..
I don't know how a quad core won't deliver extra performance...
I don't know..
DX10 is surely an advancement in the graphics area..
I was simply amazed by the effects it produced in BioShock...
Without it, how could those effects be produced? Look at the better side..
Even a tiny improvement towards better graphics is good and should be welcomed.
We have to wait and see how devs take full advantage of DX10 in the future, and also its upcoming update with Vista SP1..
Then you said that Far Cry didn't bring anything new... I won't accept that. When we played it at the time, its AI was one of the best...
And it gave players the ability to roam all through the island. For me, that's new, considering we were playing Castle Wolfenstein... and Sam.
Anyway, it's your view.
This is mine..
 

bikdel

Alpha Geek Banned
Tell me why SLI doesn't give much of a performance boost in Crysis even though multi-GPU technology has been around for, say, 3 years???

lies are all we get from m$ and those affiliated :(
 