Intel's 'Larrabee' on Par With GeForce GTX 285

Status
Not open for further replies.

topgear

Super Moderator
Staff member
Traveling to Taiwan for Computex usually yields a lot of components, future launches, and lots of random hardware.

This time, however, we wanted to find out a little bit more about what Intel had up its sleeve for Larrabee--the company's next-generation graphics solution that's supposed to blow everything on the market out of the water.

According to one close Intel partner that wished not to be named, this isn't the case. We were told that Larrabee is currently only capable of performance levels similar to Nvidia's GeForce GTX 285. While this isn't a bad thing by any measure, it doesn't quite line up with a lot of the hype we've been hearing.

The partner said that while things may change with later Larrabee silicon, it did not expect Intel's graphics solution to take the high end of the market. By the time of Larrabee's release, both AMD/ATI and Nvidia will have introduced newer and faster versions of their GPUs. Despite this, it's still important to keep in mind that Intel has always been an enabler of technology, pushing other industry leaders to adopt new technology. This was the case with Intel's infamous i740.

Intel told us several weeks ago that Larrabee would be taking the same approach as Intel's SSDs. Silent. No frills. But market-dominating when released.

At this point, we still think it's a bit too early to draw solid conclusions, but this is what we were told.

Source : *www.tomshardware.com/news/intel-larrabee-nvidia-geforce,7944.html

Gamers rejoice :p

I didn't know which news section I should put this in, ATI/Nvidia or Intel/AMD,
so I thought this news was worth a new thread :p
 

topgear

Super Moderator
Staff member
^^ I think it will come with Core i5 :p and Core i5 will be cheaper than Core i7 (though I'm not sure).
 

infra_red_dude

Wire muncher!
The current Larrabee is just a bunch of old Pentium cores stuck together for graphics processing, combined with a really intelligent ring-bus architecture for communication. Not sure what their next revision will be, but it had better be good to stand up to the competition.
 

infra_red_dude

Wire muncher!
 
Isn't the ring bus ATi patented?

I'm not sure if the patent is granted, but a lot of processing elements (like STI's Cell processor and other massively parallel PEs) use the ring bus. That's the most viable solution as the number of cores increases. In Larrabee, the ring bus is used to interconnect the L2 caches of all the cores.

But if you look at the overall design, it's just old Pentiums manufactured on a smaller process, slapped onto a ring, with SIMD support (SSE instructions) and x86 compatibility, tweaked for performance. Intel sure knows how to squeeze the most out of their procs!
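To picture how a ring bus works: each core talks only to its two neighbours, and a message hops around the ring in whichever direction is shorter. A toy Python sketch (the 16-core count is only an assumption for illustration, and a real design like Larrabee's may route each ring in one direction only):

```python
# Toy model of a ring interconnect: N nodes in a circle, each linked
# only to its two neighbours. A message between two nodes travels the
# shorter way around, so the worst-case distance is N // 2 hops.
def ring_hops(src: int, dst: int, n: int) -> int:
    forward = (dst - src) % n          # hops going one way around
    return min(forward, n - forward)   # take the shorter direction

N = 16  # assumed core count, purely for illustration
print(ring_hops(0, 1, N))   # adjacent cores: 1 hop
print(ring_hops(0, 8, N))   # opposite side: 8 hops (worst case)
print(max(ring_hops(0, d, N) for d in range(N)))  # worst case = N // 2
```

The appeal over a crossbar is that each node needs only two links regardless of core count, at the cost of latency growing with ring size.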

The ring bus could be ATI-patented, but as far as IBM's Cell is concerned, AMD already has tie-ups with IBM for newer manufacturing processes and some other things. If ATI did indeed patent the ring bus, it could have been cross-licensed with IBM.

And Pentiums on Larrabee? That's news to me. AMD needs to learn something for sure here.
 

comp@ddict

EXIT: DATA Junkyard
1. Performance equal to the GTX 285 will be offered by the HD 5870 too.
2. I would still go for the HD 5870, because it won't take just a day or a month to optimise the drivers for Larrabee.
3. Any idea on the price of this piece yet?
 

dOm1naTOr

Wise Old Owl
If it uses older Pentium cores, then doesn't it have any separate pipelines for shaders? And how could it match a GTX 285, which has about 240 processing elements? And how could it be made to access its VRAM at high rates, when Pentiums and most desktop processors access RAM over a 128-bit bus at limited frequency? For a GPU it should be at least double that, 256 or 512 bit, with GDDR3/4/5. Does just adding another memory controller do the job that easily?
All this mess is hard to conceive.
If this is true, think what they could make if they did something similar on an i7. It could probably replace mainstream graphics cards used for movie FX.
 

infra_red_dude

Wire muncher!
The ring bus could be ATI-patented, but as far as IBM's Cell is concerned, AMD already has tie-ups with IBM for newer manufacturing processes and some other things. If ATI did indeed patent the ring bus, it could have been cross-licensed with IBM.

And Pentiums on Larrabee? That's news to me. AMD needs to learn something for sure here.
The ring bus is not ATi-patented, afaik. Manufacturing-process tie-ups have nothing to do with licensing technology.

If it uses older Pentium cores, then doesn't it have any separate pipelines for shaders? And how could it match a GTX 285, which has about 240 processing elements? And how could it be made to access its VRAM at high rates, when Pentiums and most desktop processors access RAM over a 128-bit bus at limited frequency? For a GPU it should be at least double that, 256 or 512 bit, with GDDR3/4/5. Does just adding another memory controller do the job that easily?
All this mess is hard to conceive.
If this is true, think what they could make if they did something similar on an i7. It could probably replace mainstream graphics cards used for movie FX.
The Pentium cores are the shader cores themselves. It has a 512-bit-wide bus. The cores, being more like desktop processors, have much better per-unit throughput than each processing element (or compute unit) of a modern-day GeForce or Radeon core. You will be surprised when you learn about the Larrabee architecture. It's nothing revolutionary, but the optimizations Intel has done to extract performance are intelligent and commendable.
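On the bandwidth question: a card's memory bandwidth is just bus width times effective data rate. A quick Python sketch (the GTX 285 figures of a 512-bit bus and roughly 2.48 GT/s effective GDDR3 are from public spec sheets and approximate; nothing here is a confirmed Larrabee number):

```python
# Memory bandwidth = (bus width in bytes) x (effective transfer rate).
def bandwidth_gbps(bus_bits: int, gtransfers_per_sec: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and data rate."""
    return (bus_bits / 8) * gtransfers_per_sec

# GTX 285: 512-bit bus, GDDR3 at ~2.484 GT/s effective (approximate)
print(round(bandwidth_gbps(512, 2.484), 1))  # ~159.0 GB/s
```

So a wide bus paired with fast GDDR, rather than a desktop-style 128-bit interface, is what closes the gap dOm1naTOr is asking about.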

I dunno how much you guys will understand from this technical paper (straight from Intel, the horse's mouth themselves!), but have a look at it if you have time. It explains the Larrabee architecture in detail: *software.intel.com/file/18198/

Larrabee's architecture = 10 to 16 Pentium cores (which don't even support out-of-order execution, just plain in-order) + a 512-bit ring bus + hyperthreading (4 threads per core) + wide SIMD instructions. That's Larrabee!
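To see why that recipe can still add up to GPU-class numbers: peak throughput is just cores x SIMD lanes x 2 (a multiply-add counts as two FLOPs) x clock. A rough Python sketch (the 16-wide vector unit comes from the Intel paper; the 16-core count and 2 GHz clock are purely illustrative assumptions, not announced specs):

```python
# Peak single-precision throughput for a Larrabee-style design:
# cores x SIMD lanes x 2 (multiply-add = 2 FLOPs) x clock in GHz.
def peak_gflops(cores: int, simd_lanes: int, clock_ghz: float) -> float:
    return cores * simd_lanes * 2 * clock_ghz

# 16 cores x 16-wide vectors x 2 x 2.0 GHz (assumed figures)
print(peak_gflops(16, 16, 2.0))  # 1024.0 GFLOPS
```

The point is that the wide vector unit, not the Pentium scalar pipeline, is where almost all of the graphics horsepower comes from.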

I've studied that paper in detail, as I had a presentation on it, and know the ins and outs of it. The architecture is really nothing new. Intel knows how to milk money and extract the very last drop of performance from its processors. But they are intelligent and know how to tweak for performance as well as survive in this industry. Of course, Larrabee will be tweaked further in the future for sure.
 

topgear

Super Moderator
Staff member
Intel's 'Larrabee' to Be "Huge"

Earlier in the week, we posted about Intel's Larrabee GPU and its future-looking performance.

This information comparing Larrabee to Nvidia's GTX 285 was preliminary, and was given to us by a company close to both Intel and Nvidia. After posting, we received more information on what Larrabee could shape up to be from one of Intel's very close and large partners. The following information should be taken as "current-known" information, and may very well change by the time Intel ships Larrabee.

According to currently known information, our source indicated that Larrabee may end up being quite a big chip--literally. In fact, we were informed that Larrabee may have a die close to 650mm², produced at 45nm. "If those measurements are normalized to match Nvidia's GT200 core, then Larrabee would be roughly 971mm²," said our source--hefty indeed. This is, of course, assuming that Intel will be producing Larrabee on a 45nm process.
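As a side note, that "normalized" figure checks out if the comparison is against the 55nm GT200b (the chip in the GTX 285): die area scales roughly with the square of the linear feature size. A quick Python sketch of the arithmetic (the 55nm target node is our assumption about what the source normalized to):

```python
# Die area scales roughly with the square of the linear feature size,
# so a layout drawn at 45nm grows by (55/45)^2 if redrawn at 55nm.
def normalized_area(area_mm2: float, from_nm: float, to_nm: float) -> float:
    return area_mm2 * (to_nm / from_nm) ** 2

# 650 mm^2 at 45nm, rescaled to a 55nm process (assumed GT200b node)
print(round(normalized_area(650, 45, 55)))  # ~971 mm^2
```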

Our source also indicated that Intel is looking to ship Larrabee two years from now, putting us in the summer of 2011. Of course, by that time we will have GPUs from both AMD/ATI and Nvidia that are 2 to 4 times faster than current ones. However, Larrabee may not be what it is today either.

One critical point we were told is that 1st and 2nd generation Larrabee GPUs will not be compatible with 3rd generation Larrabee. This is, of course, highly speculative and very far out. According to the data, Intel's 3rd generation part will have an emulation mode for backwards compatibility. If this is true, then developers could have a hard time programming for Larrabee.

We contacted Intel for comment in regards to the above information. Intel denied that any of the above is true.

Despite the above red flag, there's an assumption that Larrabee will have to be compliant with Microsoft's DirectX, which would make it compatible with existing technology at the application level. Games and applications would be programmed against DirectX and not coded at the GPU level. However, in a recent Intel Larrabee slide, Larrabee's rendering architecture was suggested to be a successor to DirectX, possibly replacing the DirectX standard.

Source : *www.tomshardware.com/news/intel-larrabee-gpu-graphics,8019.html

So Intel will be behind gfx makers like Nvidia and ATI. In 2011 we will see far better gfx chips than Larrabee. So in 2011, Larrabee against Nvidia and ATI GPUs may look like the Intel G45 against the GTX 295 does now (in terms of the performance difference) :p
 

comp@ddict

EXIT: DATA Junkyard
Well, the TDP is reported to be 300W. LOL, now who would be buying Larrabee? Oops, I mean Pentium procs put together!
 

dOm1naTOr

Wise Old Owl
A 650mm² die with lots of Pentium cores put together, eating 300W, releasing in 2011. Is that what Larrabee is as of now?
For reference, the HD 4770 is a 75W GPU with a 137mm² die.
 

j1n M@tt

Cyborg Agent
Intel sure seems funny with this Larrabee and its Pentium shaders... but it will be nice to see a new competitor alongside Nvidia and ATI. The start may not be so impressive, but Intel will surely learn over the following years.
 