
Don't put a chip out just because its hot-lot yields are small.

Can it be that bad? Sure, it can always be zero.

Let's just assume ALL of Charlie's numbers and sources are 100% correct... it's four wafers.

Getting low yields on four wafers is not exactly uncommon, and it comes as no surprise for a hot lot in particular: hot lots typically have nearly all of the inline inspection and metrology steps skipped in order to cut the cycle time down even further.

Those inline inspections are present in the flow for standard-priority WIP for a reason, related both to yield (reworks and clean-ups) and to cost reduction (eliminating known-dead WIP earlier in the flow).

I really pity anyone who is wasting their time attempting to extrapolate the future health of an entire product lineup from tentative results on four hot-lotted wafers. That's not a put-down of anyone who is actually doing just that, Charlie included; it's an honest, empathetic response, because they really are wasting their time chasing something with error bars so wide they can't see the ends of the whiskers from where they stand at the moment.

Now if we were talking about results averaged from, say, 6-8 lots and a minimum of 100-200 wafers run through the fab at standard priority (i.e. with all the standard yield-enhancement options in play), then I'd be more inclined to start divining something from the remnants of the tea leaves here.

But just four wafers? Much ado about nothing at the moment, even IF all of the claimed details themselves are true.

This would have been far more interesting had the yields on those four wafers come back at 60% or 80%. Again, not that such yield numbers could say anything about the mean or the standard deviation of the yield distribution, but they would speak to process capability, and where there is proven capability there is an established pathway to moving the mean of the distribution into that yield territory.

But getting zero, or near-zero, yield is the so-called trivial result; four wafers at zero yield say almost nothing about process yield. All it takes is one poorly performing machine at one process step and you get four wafers with yield-killing particles spewed all over them.
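
To put rough numbers on just how wide those error bars are, here's a quick Monte Carlo sketch. Every figure in it (baseline wafer yield, excursion rate, wafer counts) is a made-up illustrative assumption on my part, not anything from Charlie's sources or from TSMC:

Code:
# Monte Carlo sketch of the error-bar argument above. All the numbers here
# (baseline wafer yield, excursion rate, wafer counts) are invented purely
# for illustration -- none of them are real GT300 or TSMC data.
import random

random.seed(1)

def wafer_yield(baseline=0.60, excursion_rate=0.15):
    """One wafer: noisy baseline yield, unless an excursion wipes it out."""
    if random.random() < excursion_rate:
        return 0.0                       # one bad tool at one step zeroes the wafer
    return min(1.0, max(0.0, random.gauss(baseline, 0.10)))

def run_average(n_wafers):
    """Average yield across a run of n_wafers wafers."""
    return sum(wafer_yield() for _ in range(n_wafers)) / n_wafers

for n in (4, 150):
    averages = sorted(run_average(n) for _ in range(10_000))
    lo, hi = averages[250], averages[9749]   # middle 95% of the simulated runs
    print(f"{n:3d} wafers: 95% of run averages land between {lo:.0%} and {hi:.0%}")

Run it and the four-wafer averages swing across a huge range (a single excursion drags a small run toward zero), while the 150-wafer averages cluster tightly around the underlying mean. Same process, same tools; the only difference is the sample size.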
Discuss.
Even if it could be true, would they come out and say "we have problems and our product sucks"? Did 3dfx tell everyone they were having problems before being purchased by nVidia?
They can always try to use them for lower-segment graphics cards, like AMD and Intel do with their processors.
I assume this has everything to do with GT300's rumored low chip yields?

Considering how monstrously complex GT300 is compared to previous-generation nVidia chips, I'd be more worried if yields were high. Why? Because high yields would suggest that the new chip isn't very complex and is perhaps more comparable to the previous generation, meaning that all of the talk about how awesome GT300 is going to be is just that: talk.

The only thing that I am worried about is price. With yields as low as they are rumored to be for the flagship product (~1.5%), how much will the GT300 cost the consumer?
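
Just to put that worry into numbers, here's back-of-the-envelope arithmetic with a completely made-up wafer price and gross die count (illustrative assumptions only, not real TSMC 40nm figures):

Code:
# Back-of-the-envelope cost per good die. Wafer price and gross die count
# are assumed values for illustration only, not real TSMC 40nm figures.
WAFER_COST = 5000     # USD per wafer, assumed
GROSS_DIE  = 100      # candidate die per wafer for a very large chip, assumed

for frac in (0.015, 0.20, 0.60):
    good_die = GROSS_DIE * frac
    print(f"yield {frac:5.1%}: roughly ${WAFER_COST / good_die:,.0f} per good die")

At anything like ~1.5% the silicon alone would run into the thousands of dollars per good die, so either that number is a hot-lot artifact or the consumer price question answers itself.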
It was a high-risk wafer deal; this doesn't reflect the main production wafers but is instead a batch of rough output to rush samples out before Christmas.

The GTX380 will be monstrously powerful but that's not really worth anything until more demanding games come out.

ATI have consistently been first to the table with all the new GPU trends.
Power isn't as important as providing newer technology nowadays.
Probably the same as always: $600.
Squall, why don't you ever post links to the original articles?
I do. This one I just don't have the link to, as it was quoted elsewhere. If you can sift through AnandTech to find it... well, all the power to you.
It's all stuff that doesn't matter to the end user; after all, nVidia won't sell the bad chips anyway, and they'll base their pricing on the speed of the chip and the prices of the competition, not on the yields, since market share is an important factor. They have the funds to lose a bit on their GPUs like last round anyway.

Having these low yields now is just a bad sign for their general production, as it suggests that mass production is still too risky, meaning we won't see the card soon. Still, I wish nVidia all the luck. Right now I see two scenarios, given their history:

1. They're staying silent because they have hardly anything to show: their monstrous-GPU strategy is biting them in the rear and they can't give a real indication of either its power or its release date.

2. They're going the ATi approach, putting up smokescreens and trying to catch them off guard with a card that even the 5870X2 can't compete with. Staying low now might make ATi feel comfortable.

I'm leaning more towards the 1st scenario though. nVidia loves to boast, especially to take the wind out of ATi's sails. Right now the only info we have is that it will be a huge and powerful chip that "should" beat ATi's series. That's all.
Having these low yields now is just a bad sign for their general production, as it suggests that mass production is still too risky, meaning we won't see the card soon. Still, I wish nVidia all the luck. Right now I see two scenarios, given their history:
Nope, these hot lots aren't chips that will end up in end users' hands; the lucky reviewers might get one, then again they might not. A hot lot has all the typical WIP processing removed, so there's no EC/QC in the production line. 7 chips from 4 wafers might sound bad, but considering the lack of QC it's not that bad actually. These hot lots allow them to adjust the manufacturing machinery prior to ramping up full production, which does have the QC/EC in place to make sure that all the bad parts are weeded out before reaching the end of the process... it's the parts that reach the end of the process that are generally counted towards good or bad yield.

1. They're staying silent because they have hardly anything to show: their monstrous-GPU strategy is biting them in the rear and they can't give a real indication of either its power or its release date.

2. They're going the ATi approach, putting up smokescreens and trying to catch them off guard with a card that even the 5870X2 can't compete with. Staying low now might make ATi feel comfortable.

I'm leaning more towards the 1st scenario though. nVidia loves to boast, especially to take the wind out of ATi's sails. Right now the only info we have is that it will be a huge and powerful chip that "should" beat ATi's series. That's all.
G300 is bigger than GT200b but smaller than GT200; it's not that big at all. The last time nVidia was this quiet, we got the 8800 series. The last time nVidia was loose with details, we got the FX series.
That "reply" is not worth crap. Everyone knows yields get better with time, and the first yields will always be crappy. Especially when TSMC's 40nm manufacturing has been known to suck proven already by ATi with it's 4770. The newer 4770 don't have the PCI-E 6-pin that was found in the earlier batches because they somewhat fixed the leakage issues with the first few batches.

What should be understood from what Charlie says is that nVidia:
1) Will be late to the DX11 party, and not by a short while.
2) Will most likely not launch the GT300 cheap.
3) Might **** its customers over -AGAIN- with faulty chips just to get its chips out ASAP.

If nVidia being "quiet" is good news, then why were they downplaying DX11, and earlier DX10.1, every chance they got? It's because they don't have a product coming any time soon. Why did nVidia STFU about DX10.1? Because they edited their ****ty architecture to comply, and are now selling DX10.1 products. How about fixing the rest of the GTX2XX series now? Oh, that's right, that's not how nVidia rolls. They always tell the customer to **** off when they **** up. (G84 and G86)
lol, Sagi, you're just an nVidia hater, pure and simple.

I've got well-founded reasons for not liking ATI (driver and design politics), just like others have for not liking nVidia.

Regardless, ATI's hot lots were low yield too.

PS.
Both nVidia's and ATI's quality depends on the brand selling them, except where ATI's reference design is flawed... At least nVidia has never used driver profiling to inhibit performance on their cards >.>
lol, Sagi, you're just an nVidia hater, pure and simple.

I've got well-founded reasons for not liking ATI (driver and design politics), just like others have for not liking nVidia.

Regardless, ATI's hot lots were low yield too.

PS.
Both nVidia's and ATI's quality depends on the brand selling them, except where ATI's reference design is flawed... At least nVidia has never used driver profiling to inhibit performance on their cards >.>
The boards? Yes. The processing units? NO. That's nVidia's fault. If they weren't so arrogant and I hadn't lost 230 bucks because of it, maybe I wouldn't dislike them as much. It's going to be a very long while before I ever consider buying an nVidia card, maybe long enough that they're out of business, which shouldn't be that long anyway.

nVidia's crappy drivers are not even close to being better; they have as many issues as ATi's drivers. One of the main reasons Vista, which happens to be the OS I favor more than any other, is hated so much is nVidia's ****ty drivers.
What did you lose 230 bucks on?

A pre-overclocked board which didn't adhere to nVidia's design?

What nVidia's board partners do with the cards is on them; nVidia just makes the chips, and the vendors tend to screw up and not implement the chip timings properly.
What did you lose 230 bucks on?

A pre-overclocked board which didn't adhere to nVidia's design?

What nVidia's board partners do with the cards is on them; nVidia just makes the chips, and the vendors tend to screw up and not implement the chip timings properly.
...

Not OC'd. It was a reference 8600GT with a GPU that wasn't made properly by nVidia. Nice job there trying to spin away nVidia's guilt. I just happen to not live in the US (disregard my profile; you can check my IP for where I really live), which is why I had to pay more for less of a GPU.
Oh, you mean those cards (the ones I don't care about because they are 128-bit value cards).
Fact: Nvidia didn't sell you a card. XFX, Leadtek, Asus, PowerColour, etc. did. It is up to them to replace the card.

Yeah, they had the same teething issues with the lead-free designs as Microsoft did with the Xbox 360, yet... people still buy those.

The biggest issue today is vendors pre-overclocking the cards where nVidia has explicitly told them not to.
The main reason is that the drivers set a bunch of additional timings for the GPU and memory, and the only vendors known to adjust those timings properly for the new clocks are XFX and EVGA. (Contact mikeyakame on Guru3D; he's been decompiling the nVidia drivers/BIOSes to learn what's in them.)
Oh, you mean those cards (the ones I don't care about because they are 128-bit value cards).
Fact: Nvidia didn't sell you a card. XFX, Leadtek, Asus, PowerColour, etc. did. It is up to them to replace the card.
When will you stop crying for nVidia? Yes, it's their job to replace it. But it isn't their FAULT. They are NOT THE COMPANY TO BE BLAMED.
Actually, IT'S NOT.
It's XFX's job to replace it, it's Dell's job to replace it, it's Asus's job to replace it.

The vendors failed to proceed with a product recall, and nVidia would have had to comply with replacement chips had they done so.

nVidia didn't fail anyone here. The media has failed at placing the blame on the proper parties, though.
Actually, IT'S NOT.
It's XFX's job to replace it, it's Dell's job to replace it, it's Asus's job to replace it.

The vendors failed to proceed with a product recall, and nVidia would have had to comply with replacement chips had they done so.

nVidia didn't fail anyone here. The media has failed at placing the blame on the proper parties, though.
Prove that a recall was issued by nVidia or you're just fudding.
Actually, IT'S NOT.
It's XFX's job to replace it, it's Dell's job to replace it, it's Asus's job to replace it.

The vendors failed to proceed with a product recall, and nVidia would have had to comply with replacement chips had they done so.


nVidia didn't fail anyone here. The media has failed at placing the blame on the proper parties, though.
Quoting myself to indicate your failure to read.

FACT: IBM doesn't declare a product recall when the tri-core CPU in the Xbox 360 fails.
In fact... there's not even any proof the fault was in the GPU's construction. RROD issue, anyone?