
·
Banned
Joined
·
23,263 Posts
Discussion Starter · #1 ·
So these are the final specs for the G300 series, then.

 

·
The one and only
Joined
·
3,660 Posts
*droool* that 380 destroys my 260
 

·
Meow Meow Meow
Joined
·
1,401 Posts
Doesn't the notion of 2 billion transistors sitting in that chip just boggle your mind? :???:
 

·
┐( ̄ー ̄)┌
Joined
·
2,058 Posts
Here's another one:
Transistor count of over 3 billion
Built on TSMC's 40 nm process
512 shader processors (which NVIDIA may refer to as "CUDA cores")
32 cores per core cluster
384-bit GDDR5 memory interface
1 MB of L1 cache, 768 KB of unified L2 cache
Up to 6 GB of total memory, with 1.5 GB expected for the consumer graphics variant
Half-speed IEEE 754 double-precision floating point
Native support for execution of C (CUDA), C++, and Fortran; support for DirectCompute 11, DirectX 11, OpenGL 3.1, and OpenCL 1.1
Source: nVidia GT300's Fermi architecture unveiled: 512 cores, up to 6GB GDDR5 - Bright Side Of News*
Crazy! :drool:
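For anyone wondering what "native C++" would actually buy you: presumably kernels get real templates and you write them like ordinary C++ with the usual launch syntax. A minimal sketch along those lines (the kernel name and launch parameters are made up for illustration, and this assumes the final toolkit behaves like current CUDA C):

Code:
// Hypothetical sketch: a templated kernel, the kind of thing "native
// C++" support should allow. Not from the whitepaper -- illustration only.
#include <cuda_runtime.h>

template <typename T>
__global__ void saxpy(int n, T a, const T* x, T* y)
{
    // One thread per element.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    float *x, *y;
    cudaMalloc(&x, n * sizeof(float));   // device buffers (left uninitialized in this toy)
    cudaMalloc(&y, n * sizeof(float));
    // The template is instantiated at compile time like any other C++.
    saxpy<float><<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();
    cudaFree(x);
    cudaFree(y);
    return 0;
}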
 

·
The Hunter
Joined
·
15,879 Posts
ATi has lower power consumption.
Irrelevant for this market, unless you count the fact that they'll need a jet engine to cool such a beast.

Anyway, it's all speculation for now, even though there are rumors of working models being shown behind closed doors. If these are indeed the specs, you'll know it's something you can only dream of, as it'll be the first GPU in a while to retail for €600,-
 

·
Curiously Cheddar
Joined
·
2,077 Posts
Honestly... I expected more.

This thing's been getting hyped since before the GT200 series came out. The spec sheet is impressive, but not jaw-dropping.
 

·
The Hunter
Joined
·
15,879 Posts
There are more reasons not to trust nVidia right now:

The desktop 8800 series cards are failing en masse, right after their two-year warranties expired. And the laptop cards all look set to suffer the same fate. The build quality of ATi's cards from these past few generations is apparently much higher.

@Cheesus: These are just numbers. nVidia will come out with something like MIMD (as opposed to ATi's SIMD), but how useful that will be in games of course remains to be seen; I'm sure Squall can explain more about this. Also, it wouldn't surprise me if they lowered the bus width to 384-bit to keep costs in check. An overkill of memory bandwidth is not a wise idea; you only need enough to prevent a bottleneck.
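To make the SIMD vs. MIMD point concrete: on today's hardware the 32 threads of a warp share one instruction stream, so a data-dependent branch runs both sides back to back with lanes masked off, whereas a truly MIMD design could let the paths run independently. A tiny sketch (hypothetical kernel, purely to illustrate the divergence cost):

Code:
// Warp divergence on SIMT hardware: both branches execute serially,
// with part of the warp's lanes masked off each time.
__global__ void divergent(const int* in, int* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    if (in[i] & 1)            // odd values take this path...
        out[i] = in[i] * 3;
    else                      // ...even values take this one,
        out[i] = in[i] / 2;   // and the warp pays for both.
}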
 

·
From Love and Limerence
Joined
·
6,574 Posts
Wait, desktop 8800 cards are "falling apart"? Do you have any sources that explain this? I've long known about the 8600 laptop GPUs, but this is the first I've heard of this.
 

·
Registered
Joined
·
1,620 Posts
I've known people (more than 20) who've had an 8800 for more than 4 years, and it's still working... make sure it's not an issue with bad drivers. Yeah, my friends got blue screens of death, but that was due to bad drivers from nVidia.
 

·
Scourge of the Seven Seas
Joined
·
2,412 Posts
Looks to be like the current series: 4870 < GTX285 < 4870X2 --- 5870 < GT380 < 5870X2, and all by pretty small margins in the long run. With aggressive pricing and good Crossfire support, ATi will sell tons of cards despite not being the "undisputed champ" (and nVidia will sell tons too despite not being "undisputed" either).

...anybody want a waterblock'd 4870x2? :heh: (pretty serious about it...). Dolphin and PCSX2 run awesome.
 

·
No sir, I don't like it.
Joined
·
5,570 Posts
Personally, I think the native execution of C++ and Fortran code is the most interesting aspect. But why native Fortran execution? Isn't Fortran mostly the territory of Intel's CPU compilers? Is nVidia teaming up with Intel (unlikely, I think), or is this a response to Larrabee?

I think it's a response to Larrabee. Seems like nVidia is kicking Intel in the balls with a faster, better, and, more importantly, actually purchasable product that will be in production within a reasonable timeframe, rather than one that just shows up in a few crappy tech demos (Larrabee).
 

·
The Hunter
Joined
·
15,879 Posts
Wait, desktop 8800 cards are "falling apart"? Do you have any sources that explain this? I've long known about the 8600 laptop GPUs, but this is the first I've heard of this.
Other forums I frequent have some reports of 8800 desktop cards developing problems over time, just a bit more than the average card. I don't think you need to worry about it too much, but it seems to have become something of a trend. From what I could gather, it was mostly the first-generation GTS cards (320/640 MB) and the GTX.
 

·
Premium Member
Joined
·
8,586 Posts
There are more reasons not to trust nVidia right now:

The desktop 8800 series cards are failing en masse, right after their two-year warranties expired.
I've had my 8800GTX running for 3 years now. My EVGA warranty is still valid to this day, and if the card does fail, I get whatever card is currently out that costs the same as mine did when I bought it (if they don't have a replacement). :lol:

512 Shader cores, supports ECC memory, native C++.
http://www.nvidia.com/object/fermi_architecture.html

There's a PDF whitepaper at the link above. With so much programmability, could an emulator run entirely on the GPU itself?
 

·
Registered
Joined
·
47 Posts
Fermi isn't MIMD, however, but rather has two 16-wide groups of CUDA cores per SM with two warp schedulers per SM. Basically, Fermi's shader core is an evolution (a significant one) of nVidia's architecture rather than a reinvention.
 

·
Level 9998
Joined
·
9,384 Posts
Emulators would benefit from fast memory access (a big CPU cache...), and CUDA has memory access problems (PCI-E to RAM), so yeah... not an ideal candidate for emulation... unless there's something else I've missed all this time. :innocent: It would probably be a good candidate for, say, a software filter accelerator (HQ2x?), but for the most part you ain't gonna get miracles.
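Just to show why a per-pixel filter is the good fit here while a whole emulator core isn't: every output pixel below is independent, so the frame crosses PCI-E once, gets filtered, and comes back; there's no fine-grained CPU-GPU ping-pong. This is a toy 2x nearest-neighbour upscale standing in for HQ2x (made-up kernel, illustration only):

Code:
#include <stdint.h>

// Toy 2x upscale: each 2x2 block of output pixels copies one source
// pixel. Real HQ2x would inspect neighbours and blend, but the access
// pattern (read a few nearby pixels, write one) is the same.
__global__ void upscale2x(const uint32_t* src, uint32_t* dst, int w, int h)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;   // output x, 0..2w-1
    int y = blockIdx.y * blockDim.y + threadIdx.y;   // output y, 0..2h-1
    if (x >= 2 * w || y >= 2 * h) return;
    dst[y * (2 * w) + x] = src[(y / 2) * w + (x / 2)];
}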

So I'd say native support for C++ and Fortran will... depend on what exactly is supported. Otherwise it's just your general stuff, which is why, even with all this talk about how awesome CUDA is, you can only "fold" or encode videos on it, to name a few tasks. :innocent:

That's one thing. The other is... while the specs look impressive, the important thing is performance, and I don't have to remind anyone that this time around SPs ain't SPs anymore (unless nVidia is a good liar), and the GPU itself will be geared more towards extreme computing than extreme graphics. :innocent:
 