
Fermi is Nvidia's DirectX 11

Of course they know; both GT400 and GT500 are already at some stage of development, as is the RV970 and so on.
 
Next generation Nvidia 40nm chip yields are fine
Written by Fuad Abazovic
Friday, 25 September 2009 08:05



Senior manager confirmed


After a lot of rumours about bad yields with GT300, Nvidia has decided to talk to us and give us a simple quote about the state of its next generation 40nm product line.

A senior manager from Nvidia has told Fudzilla that "Our (Nvidia's) 40nm yields are fine. The rumors you are hearing are baseless."

This comes after some stories that GT300 yields are really bad and that Nvidia's next generation is in really bad shape. According to AMD's competitive analysis team, only nine chips per wafer should work, but they arrived at this number through a bad translation, and we can definitely dismiss it as completely inaccurate information.

As we've said before, the chip we call GT300 is a completely new design and it simply could not come out earlier. Nvidia certainly doesn't like the fact that ATI is first with new technology again, but it takes comfort in the belief that it can make its DirectX 11 part faster.

You can expect a lot of architectural changes - the chip is completely redesigned and much faster than the GT200 generation. We would not be surprised if Nvidia starts talking about the chip ahead of the official announcement, as it is currently hurting from the positive Radeon 5870 reviews around the globe.

So we're waiting for GT300 ;)
 
Even Fudo doesn't know something I know - namely that GT300 arrives around November 5th ;)
 
Is that verified information?
 
They should push it to November 7th or 8th, to symbolically mark the third anniversary of the G80 launch. Maybe they've once again prepared a monster GPU that's 2-3 times faster than the previous-generation GPU ;)
 
"prema AMD-ovom analitickom timu"

kakva budalastina! AMD ne komentarise tudje proizvode, narocito ako oni nisu ni predstavljeni!
 
"prema AMD-ovom analitickom timu"

kakva budalastina! AMD ne komentarise tudje proizvode, narocito ako oni nisu ni predstavljeni!

Nije mislio na to, vec je "AMD’s competitive analysis team" nazvao Carlija:D
 
Well, the nVidia senior manager didn't say a single word about GT300. And as far as I know, the only nVidia 40nm chips currently in production are the GT21x parts (as opposed to GT300), and I believe their yields are fine (not great, but fine).

 
OFFICIAL: NVIDIA says GT300 on schedule for Q4 2009, yields are fine.

http://www.brightsideofnews.com/new...n-schedule-for-q4-20092c-yields-are-fine.aspx

November is looking to be an exciting month as we hear rumblings of a launch of AMD's Radeon HD 5850 X2 and HD 5870 X2. In fact almost all of the news lately has been about AMD's new DX11 40nm GPU. The question on everyone's minds was the same - Where is NVIDIA?

We are happy to report that we were correct in our earlier predictions that the GT300 is farther along than many other sites were predicting. We hear that NVIDIA has made an official statement on the state of the GT300 yields.

It seems the GT300 was taped out long ago and that yields are fine. This statement comes from a senior product manager for the GT300, but we also have the same from our own sources [who have been saying this all along].

So where did the rumors of sub-2% yields come from? Well, according to the unnamed source, it looks like AMD's Competitive Analysis team mistranslated some information stating that only 9 chips per wafer worked. This information quickly spread around the internet and became what we like to call "the repeated truth". Bear in mind that this was the same team in charge of spreading rumors that Larrabee is now in its third or fourth generation of silicon, which is flatly incorrect. But we'll address Larrabee and the turmoil there in a future article.

Our sources [and the senior managers] are still saying that the GT300 is slated for a Q4 2009 launch. This means you should expect the new GT300 to show up around the end of November, most likely as a follow-up to an Editor's Day that should happen at the beginning of November. NVIDIA is most likely pitching the GT300 chip exactly three years after its revolutionary G80 [also known as NV50] architecture, right around Thanksgiving; however, from what we are hearing it is looking more like the very end of November or early December.

Still, this news does take a little notch out of the AMD announcement. We are sure that NVIDIA will be making more information public as momentum grows around AMD's new 5xxx series GPUs. After all, nothing works better to kill potential sales than leaked information about a possibly better product. Since we already disclosed some GT300 performance figures, expect an interesting war of words.

We will all have to wait to see whether the GT300 can outperform the HD 5870, HD 5850 X2 and HD 5870 X2, but as the current HD 5870's margin is not that big, we might see an interesting upset. No matter the outcome, the GPU race is certainly going to be very interesting in the coming months.
 
I don't think anyone doubts that GT300 will be a very fast GPU, likely faster than Cypress, when it appears in Q4 - whether it will be a paper launch remains to be seen. I also think it won't be faster than the Hemlock X2 card, but of course that's not certain, just a personal impression after what we've seen over the past few days.

The problem for them isn't that the high-end GT300 will trail the competition by 2 or 3 months; the real question is what's happening with the derivatives for the lower segments (mainstream, but also performance). Over the next month or two AMD will cover every market segment except the lowest with DX11 cards (just in time for the Christmas shopping season), while NVIDIA will realistically be at least 3 months late, maybe more, in exactly the segments that bring in the most money. The derivatives have been announced only once (that there will be five chips in total based on this architecture), but nothing more specific has been published.

And if Juniper's specifications turn out to be accurate, NVIDIA currently has no answer to it, because it's far faster than G92 and more cost-effective than GT200.
 
Excellent observation.
I'm more interested in the GT300 derivatives than in the high end, because when buying a new graphics card I go by price/performance, along with a firm conviction that I'll never pay more than 200 euros for a graphics card.
 
OK, but the most important thing is that the high end is a killer, because then the derivatives will be excellent too (whenever they show up), and then, who knows, maybe a new 6600GT :)

By the way:

http://vr-zone.com/articles/amd-vs-nvidia-dx11-gpu-war-heats-up-this-christmas/7740.html?doc=7740

"GT300 card will be fitted with GDDR5 memories featuring a 384-bit memory interface."

The good old 384-bit memory interface :)

They may well be excellent, but the real question is when they'll appear ;) You'll agree their impact won't be the same if they show up in December versus April 2010.

As for the 384-bit interface, that was rumored earlier too, and it really makes sense. Keeping a 512-bit bus only complicates the chip and makes the card more expensive to build, and combined with GDDR5 it delivers an insane amount of bandwidth that brings little real benefit.
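To put rough numbers on that, here's a quick back-of-the-envelope sketch (purely illustrative) of what different bus widths give you at a GDDR5 data rate of 4.8 Gbps - an assumed figure borrowed from the HD 5870, since GT300's actual memory clocks weren't public at the time:

```python
# Back-of-the-envelope peak bandwidth: bus width (bits) / 8 * effective data rate (Gbps).
# ASSUMPTION: 4.8 Gbps is the effective GDDR5 rate of the Radeon HD 5870;
# GT300's real memory clock is not known, so these are only ballpark figures.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

if __name__ == "__main__":
    for bus in (256, 384, 512):
        print(f"{bus}-bit bus @ 4.8 Gbps: {bandwidth_gb_s(bus, 4.8):.1f} GB/s")
    # Prints:
    #   256-bit bus @ 4.8 Gbps: 153.6 GB/s   <- HD 5870
    #   384-bit bus @ 4.8 Gbps: 230.4 GB/s   <- rumored GT300 configuration
    #   512-bit bus @ 4.8 Gbps: 307.2 GB/s   <- the "overkill" 512-bit option
```

At that data rate a 512-bit bus would land around 300 GB/s, roughly double the HD 5870, which is exactly the kind of bandwidth the chip would struggle to put to use.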
 
OK, but the most important thing is that the high end is a killer, because then the derivatives will be excellent too

What derivatives?
I mean... a year and a half after GT200, that orphan of a chip, the GT218, shows up. That's it.
I wouldn't count on the derivatives any time soon. For now there's no mention of them at all... neither publicly nor behind the scenes.

It's going to be a massacre for nVidia in the lower market segments over the next year.
 
What derivatives?
I mean... a year and a half after GT200, that orphan of a chip, the GT218, shows up. That's it.
I wouldn't count on the derivatives any time soon. For now there's no mention of them at all... neither publicly nor behind the scenes.

It's going to be a massacre for nVidia in the lower market segments over the next year.

Do you really think they're that stupid? :D

The comparison with GT200 doesn't hold. There were no GT200 derivatives; they covered those segments with rebranded G92 cards (possibly 55nm), which had more or less the same capabilities (DX10 and so on, and a broadly similar architecture). GT300 will be DX11 with a new architecture, so a rebrand won't do the job, and I think they themselves are well aware of what sells the most. ;)
 
There were no GT200 derivatives; they covered those segments with rebranded G92 cards (possibly 55nm)
Oh, we agree completely.

I'm just saying there's no way the GT300 derivatives appear any time soon, since they'll show up a few months after the flagship chip. And we know that even the date of the ACTUAL start of GT300 sales is a big question mark.
 
OFFICIAL: NVIDIA says GT300 on schedule for Q4
"This means that you should expect the new GT300 to show up around the end of November."
http://www.brightsideofnews.com/new...n-schedule-for-q4-20092c-yields-are-fine.aspx

Next generation Nvidia 40nm chip yields are fine
http://www.fudzilla.com/content/view/15689/1/

GT300 shown to select people - On the market this year
"September is the month of presenting the new generation of the 40nm architecture we call GT300. We can confirm that this has already happened and that some VIP analysts have already had the opportunity to see this new DirectX 11 card."
http://www.itx.ba/index.php?option=com_content&task=view&id=11186&Itemid=1

Next generation Nvidia 40nm chip yields are fine (hmm, doesn't it strike you that this is a word-for-word translation of the piece above?)
"A senior Nvidia manager: 'Our (Nvidia's) 40nm yields are fine. The rumors you are hearing are baseless..... According to AMD's competitive analysis team, only nine chips should work per wafer, but that is just a translation error and we can dismiss this claim as completely untrue.... You can expect a lot of architectural changes - the chip is completely redesigned and much faster than the GT200 generation.'"
http://www.itx.ba/index.php?option=com_content&task=view&id=11187&Itemid=1
 
fudzilla = itx.ba
I think it's silly to post news from both sources since they're identical - one is just in English and the other in Bosnian...
 
I think the more fitting word is "redundant"; then again, it may suit some people who don't cope as well with English text - there are probably users like that too.
 
GT300 is codenamed Fermi

Named after the father of the nuclear reactor


The chip we ended up calling GT300 has the internal codename Fermi. The name might suit it well, as Enrico Fermi was the chap who came up with the first nuclear reactor.

The new Nvidia chip has taped out and is up and running, and we know that Nvidia has shown it to some important people. The chip should be ready for a very late 2009 launch. This GPU will also concentrate heavily on parallel computing, and it will have elements on the chip tailored for this task. Nvidia plans to earn a lot of money from that.

The chip supports GDDR5 memory, has billions of transistors, and should be bigger and faster than the Radeon HD 5870. A dual-GPU GX2 version of the card is also in the pipeline. It's well worth noting that this is the biggest architectural change since G80, as Nvidia wanted to add a lot of instructions for better parallel computing.

The gaming part is also going to be fast, but we won't know who ends up faster in DirectX 11 games until we see the games. The clocks, for both the GPU and the memory, will be similar or very close to the ones we've seen on ATI's DirectX 11 card, but we still don't know enough about the shader count and internal structure to draw any performance conclusions.


Of course, the chip supports DirectX 11 and OpenGL 3.1.

Source: Fudzilla
 
Which is entirely expected, considering the background of nVidia's chief scientist.
 
Who, by the way, also sits on the OpenCL board.
They're smart - they realized that endlessly piling on transistors for 10-20-30% better gaming performance couldn't go on forever, and they reoriented toward GPGPU in time, which now also gets official support through DX11 and DirectCompute.
 
Yes, just as CPU manufacturers hit a wall with the single-core MHz race, GPUs are now striving to become more general-purpose and pull themselves out of the fixed-function "mud".
 
Yes, just as CPU manufacturers hit a wall with the single-core MHz race, GPUs are now striving to become more general-purpose and pull themselves out of the fixed-function "mud".

And Intel is attempting the same thing with Larrabee :) a pile of x86 cores...
 
... plus dedicated texturing units on the inter-core ring, simply because texturing isn't fast enough to implement in software, even on the mighty Cell. :p

The reference for just about everything Larrabee-related - all the sites have been pulling from it:
http://software.intel.com/en-us/articles/rasterization-on-larrabee/

Intel wants LRB in both the GPU and the HPC market, and that's why the architecture is the way it is.
[Image: Slide_convergence.jpg]



How well they'll succeed at that, we'll see. LRB's biggest advantage is that it's "fully programmable".

Tom Forsyth wrote:
Because the whole chip is programmable, we can effectively bring more square millimeters to bear on any specific task as needed - up to and including the whole chip; in other words, the pipeline can dynamically reconfigure its processing resources as the rendering workload changes. If we get a heavy rasterization load, we can have all the cores working on it, if necessary; it wouldn't be the most efficient rasterizer per square millimeter, but it would be one heck of a lot of square millimeters of rasterizer, all doing what was most important at that moment, in contrast to a traditional graphics chip with a hardware rasterizer, where most of the circuitry would be idle when there was a heavy rasterization load.

Tom Forsyth wrote:
The SuperSecretProject is of course Larrabee, and while it's been amusing seeing people on the intertubes discuss how sucky we'll be at conventional rendering, I'm happy to report that this is not even remotely accurate. Also inaccurate is the perception that the "big boys are finally here" - they've been here all along, just keeping quiet and taking care of business.
 
According to some information, the thread title could be renamed to:

GF100 is Nvidia's DirectX 11
 