
Is nVidia going back down the 3dFX road?!

^^Nobody even said Nedjo was writing anything about that? But whenever 3dFX gets mentioned, somehow the word 'downfall' comes up all on its own :D
And Kyle did an excellent review of the box; I hope he keeps up that practice.


Dusan K wrote:
I don't see how you reason that this isn't competition for the AIB partners - when a customer walks into a store and buys this card, one of them misses out on the profit.

So aren't you reasoning the same way? You said yourself that geeks will keep buying eVGA and the other brands online like they have so far, not at the Buy More... :)D)...

Dusan K wrote:
As I said, this whole story only makes sense to me in one case - they have surplus chips that none of the AIB partners want to buy, probably because the profit margins on GF10x chips are small.

Right... it always has to be doom and gloom... nobody wants to buy the GF104 from them, so they have to do this... makes sense, sure...
 
...nobody wants to buy the GF104 from them, so they have to do this... makes sense, sure...

...I don't know if anything else even sells lately besides those cards, in the 150-450 euro range.
...maybe the biggest downside of that card is that you can't run 3 of them in SLI...
 
So aren't you reasoning the same way? You said yourself that geeks will keep buying eVGA and the other brands online like they have so far, not at the Buy More... :)D)...

Hm, but isn't your reasoning that through BuyMore (respect :D) they'll replace XFX... that's not going to happen, it's simply not the same audience, but what about the other manufacturers? As I understood it, there were some retail cards there too, and these NV-branded ones are direct competition for them.

Right... it always has to be doom and gloom... nobody wants to buy the GF104 from them, so they have to do this... makes sense, sure...

As I said, I don't see another logical explanation for this move, which doesn't mean there isn't one ;) We can't know what the cost structure of a GF104 card is and whether it pays off for an AIB partner to sell it for ~$140, regardless of how big the demand for it is. Maybe it doesn't pay off for NV to sell that chip to AIB partners at the price they would have to in order for the card to cost what it does.

Honestly, that last one might even be the most likely of all ;)
 
Hm, but isn't your reasoning that through BuyMore (respect :D) they'll replace XFX... that's not going to happen, it's simply not the same audience, but what about the other manufacturers? As I understood it, there were some retail cards there too, and these NV-branded ones are direct competition for them.

Well ok, I was only thinking of filling the gap this way, in a manner that won't offend the partner gentlemen; I'm not claiming it'll be terribly successful... And of course, there will be competition to some extent, but I see no reason for drama...

As I said, I don't see another logical explanation for this move, which doesn't mean there isn't one ;) We can't know what the cost structure of a GF104 card is and whether it pays off for an AIB partner to sell it for ~$140, regardless of how big the demand for it is. Maybe it doesn't pay off for NV to sell that chip to AIB partners at the price they would have to in order for the card to cost what it does.

Honestly, that last one might even be the most likely of all ;)

Maybe... or maybe they think that market (i.e. that way of buying) isn't sufficiently "covered" or exploited in the USA, so they want to change something... :)
 
Maybe... or maybe they think that market (i.e. that way of buying) isn't sufficiently "covered" or exploited in the USA, so they want to change something... :)

Maybe, but from everything I know, the retail market for tech goods in the USA is on its last legs, even where the big chains are concerned. People go to the store, look at the product, maybe try it out, make up their minds, and then go home and order it online... because it's cheaper ;) I doubt NV can change anything there, even if that were their intention (and I don't think it is :))
 
Let's see what Charlie from SemiAccurate has to say on the subject:

WHY IS NVIDIA screwing its partners by cutting them out and directly competing against them? That one is easy, they are circling the wagons and grabbing a bigger share of a shrinking pie because plans A and B didn't work out.

If you haven't been paying attention lately, HardOCP noticed, then confirmed, that Nvidia is putting their own branded cards out. If you are an Nvidia partner, this is basically your death knell, but according to Kyle, Nvidia claims it is only a test. With Newegg and Best Buy on board, the largest etail and retail channels in the US respectively, that is one heck of a 'test'.

Why would they do such a thing? That one is easy, their income is dropping like a rock, and will crater completely in a year or two. The chipset business that made up between 1/4 and 1/3 of their income is in the process of going away, and that puts them from a $1 Billion/Q company to a sub $1B/Q company. That revenue isn't coming back, and something needs to replace it.

Normally, they would make this up in other areas, that is what companies tend to do. The other options are GPUs, GPU compute/GPGPU and widgets. None of those three options are panning out for the company, and this latest power grab will only hasten the death of their mainstream GPU business just like it did for 3DFx.

GPUs are dying as a market, or rather all three of its market segments, high, mid, and low, are dying. On top you have the 'high end' GPUs, generally segmented as the $200+ category. These are large die size chips that sell for very high margins, but only make up 2-5% of the unit volume. If a GPU maker can break even on dev costs here, they are usually pretty ecstatic. This generation, Nvidia is not going to break even, but that is somewhat tangential to the point.

On the other end of the spectrum the 'low end' GPUs are classified as <$75 boards and generally are made by taking 1/4 of the high end card, sometimes less, and calling it adequate. They rarely are, but Moore's law has made them 'suck less' with each passing year. In any case, they are very cheap to develop.

Low end GPUs have razor thin margins but huge volumes, between 60-75% of the unit volume, sometimes more. Even with that, the margins mean that a GPU maker makes little if anything in this segment. A $29 card doesn't leave much room for silicon profits, much less anything else, but this is what HP and Dell sell by the millions.

It does drive volume, and that can drive down costs, amortize a lot of fixed costs, and generally spread a lot of the things that accountants don't like to talk about. They are a necessity even if the bottom line isn't directly helped much by their existence.

The last segment is the 'mid-range' cards, basically the $75-200 category. These are generally made by cutting the high end parts in half, and they are the meat of the market. Mid-range GPUs perform adequately and consumers like them, so this segment sells fairly well, and makes decent margins. GPU makers need this segment if they are going to make a profit, the other two are not going to do it.

The low end GPU is going to evaporate in a few months. Later in Q4, AMD's Ontario/Zacate parts arrive with 80 shaders on the die, and in Q1 Intel's Sandy Bridge arrives with about the same power, possibly a bit more. Q2 sees AMD's Llano hit the streets with 400 shaders for the kill. From there, things only get faster and more powerful. Remember, this functionality is included on the CPU for zero additional cost; you won't be able to buy a consumer CPU without it.

That means the low end GPU market goes *poof* because even if you could make a faster GPU, it wouldn't be economically sane, much less sound, to sell it at the 'low end' price point. Llano will not only eat that market but threaten the low end of the mid-range GPU segment. Game over, and it won't come back. That is the majority of Nvidia's unit sales in the GPU market, gone.

On the high end, Nvidia got a bit of a reprieve when Intel shelved their Larrabee GPU, but those in the know will tell you it isn't permanently dead. In any case, the high end was spared immediate death, but it is running into another wall, that of 'good enough'. A single high end GPU will power almost any game in existence on a 30" 2560 * 1600 monitor, and a mid-range one will do the same for a 1920 * 1200/1080p screen.

Making the case for needing a faster card on the high end is a progressively more tenuous argument. Multiple screens, 3D and all the other things that are touted as 'killer apps' for this category are simply not turning into an economically viable customer base. Banging the drum may make some analysts change their ratings, but it still doesn't sell cards.

With the high end dying out, the mid-range will have to shoulder more and more of the development costs, meaning those profits get whittled away. With the low end gone, the fixed costs are amortized over a smaller base too. This death spiral quickly leads to a mid-range that is unprofitable as well. That time is coming quicker than Nvidia wants the financial community to believe. 60% of Nvidia's business is hanging by a thread right now.

The net result is that GPUs are not worth doing for anything but the 'professional'/ultra-high end/compute markets. Those markets are insanely profitable, selling a chip that normally goes into a $500 card for $4000 is not a bad trick, but it depends on two things, GPUs and heavy investments.

Nvidia's professional line can only exist because they subsidize it with the mainstream GPU market. In essence, they have the chips given to them for free, and all the division has to do is write software and drivers. That is a big expense, but an order of magnitude less than the cost of developing the underlying chip.

With the low end gone, that means the professional card line will have to shoulder more of the burden too. When the high end goes, the line will have to stand on its own as well as supporting the mid-range GPUs. History shows that there have been literally dozens of companies making GPGPU-like chips for compute acceleration. Every single one failed because the economics did not work out. Let's repeat that very important point. Every single one failed because the economics did not work out.

Nvidia can only make money at this because they don't have to develop chips on the same books as the GPGPU division. When that 'mana from gamers' goes away, so will all the profitability of the GPGPU segment. That wipes out one of the two remaining 'exit strategies' for Nvidia.

The other exit strategy is 'widgets', aka the Tegra line. All we really need to say here is that the second generation Tegra 2 is 20+% over the promised power budget, and is so buggy that its adoption is basically zero. As they said last generation, next generation will be better. Or they will say it again.

In the meantime, the Tegra line does not appear to be self-sustaining. That is the second door slammed shut, and unless it is flung wide open very soon, it won't ever make money. Considering that projections for Tegra 2 sales were cut in half earlier this year, then vastly reduced again a few months later, it doesn't look good.

The end result is that things look bleak for the boys in green, but this article is about the short term, and circling the wagons. If you recall, the chipset business is gone, and that is a huge chunk of Nvidia's income wiped out. The GPU business is about to take a sharp drop, and head down with a rapidity that few understand.

Nvidia does see the upcoming drop, and to brace for it, they are grabbing what revenue they can. That means people upstream of them, namely their AIB 'partners' are expendable. Nvidia has been cutting back partners, they just shut down BFG by not supplying them 400 series parts, and are trying to do the same to XFX now. Ironically, BFG's big strength was at Best Buy, I wonder why they had to die?

By cutting out the AIBs, Nvidia can sell $200 boards instead of $50 chips. That is good, right? Well, not as good as it seems; margins on GPUs are in the 50% range while boards are lucky to get 1/3 of that. The first thing selling boards will do is crater their margins, but that is expected.

Once the retail and etail returns start pouring in, those margins will decrease even more, and support costs have a long tail. Retail is horribly tough, and few can master it. Those that can don't tend to make huge profits, and step over a lot of burnt out husks when they do. Nvidia is jumping into the deep end more out of fear than planning.

What the cards will do is boost the revenue coming in the door by a large margin. That will give the appearance of 'things going well' so insiders can sell more stock. Nvidia has one thing mastered, snowing gullible financial analysts. The card sales are another attempt at doing that, and it will likely succeed.

Success on this scale is measured in months though. In the best case it will only cost Nvidia a few 'partners'. Worst case, they will all die or leave. If Nvidia succeeds, their partners are dead, it is just a matter of 'when' not 'if' they go. When the AIBs go, Nvidia loses a lot of marketing presence, channel expertise, and sales outlets.

There is no way that the branded Nvidia cards can make up the revenue lost from the chipset and low end of the GPU business. There is no way that the branded Nvidia cards can make up for the lost marketing from their ex-'partners'. There is no way that GPUs will be sustainable as a standalone business in a few years.

All the fuss about making their own cards comes down to one thing, propping up a failing business. The core is rotting out, the exit strategies aren't panning out, so Nvidia is eating their progeny with a 'better them than us' smile on their faces. The problem is that it is a shortsighted strategy, and will only hasten the end. By then, stock will have been sold, and those that know will have moved on. S|A
 
He's right about at least one thing:
What the cards will do is boost the revenue coming in the door by a large margin. That will give the appearance of 'things going well' so insiders can sell more stock. Nvidia has one thing mastered, snowing gullible financial analysts. The card sales are another attempt at doing that, and it will likely succeed.
 
Charlie, Nvidia's biggest hater...
 
That Charlie definitely works for AMD; only somebody like that could write a text like this, truth mixed with guesswork to such a degree that it looks like a piece built from facts, with conclusions that clearly lean against NV.
 
To me the most interesting thing in the whole text is one claim (leaving aside Charlie's well-known "love" for NV)... is he right that from the 10th (when retail sales through BestBuy were also planned to start) sales on Newegg will start as well? That would significantly change the situation.
 
Charlie posts on Bench under the nick Nedjo. :p
 
Uh, the AICs (in NV terminology) won't like this, since it implicitly suggests that the other manufacturers' cards are below "highest quality":

by the way, an excellent article over at Kyle's:

http://www.hardocp.com/article/2010/10/05/nvidia_enters_retail_direct_sales_at_best_buy/1

I don't believe many will interpret it that way. HardOCP approached the topic a bit sensationally (which is their right and perfectly fine). But the interesting question for me is - why is NVIDIA doing this? It's not for the profit, but why, then? If their costs are $8M, the net profit through BestBuy won't be significantly higher than that, roughly up to around $15M (at least if the stories about margins and ordered quantities are accurate)... So, is this a one-time thing, or is BestBuy just the first step in the story...
 
Well now... why does Google sell a phone under its own brand, or why does Intel sell motherboards under its own brand? It's just that the name on this box really does poke the partners in the eye :d.
 
Charlie's text is spot on. The real state of things. He just recapped, i.e. summarized, what has happened and what is happening ATM.
 
Charlie's text is spot on. The real state of things. He just recapped, i.e. summarized, what has happened and what is happening ATM.
and Jensen is doing his best to help him, continuing to confirm the factual state of things:

During a press conference in Taiwan, Huang also conceded that Nvidia lost market share due to AMD’s early lead in the DirectX 11 market. He noted that Nvidia only managed to launch two high end Fermi parts in Q2, but that new products should help fill the gap

http://www.fudzilla.com/graphics/item/20413-huang-denies-globalfoundries-deal
 
That text isn't slanted against nV, but against anyone whose only business is GPUs.
Is there any chance the low end disappears within the next half year, as Charlie predicts? Intel and AMD are certainly working on grabbing it for themselves.
And don't forget that the GPU market came into being thanks to the PC gaming market, which is in an ever deeper hole and will never again be what it was.

If some of those revenue sources really do disappear, there could be big problems, and then comes a big, bombastic (though not surprising) piece of news directly tied to that first sentence -> Intel acquires nV.
 
Incredible how a delay of a couple of months gets blown out of proportion...
 
Well, if it were only the delay, it wouldn't get rehashed over and over again...
 
Draw an analogy between dedicated sound cards and graphics cards. Ten years ago every PC had a separate sound card, and only the occasional one an integrated chip. Today 95% of forum members run integrated audio. Prices fell, performance rose, and the need for dedicated solutions disappeared.
 
The problem isn't just the delay of a couple of months; it's that they lost a step because of it. Those couple of months may mean they now won't have any answer to the 6000 series at all (excluding a full GF104 and dual-GPU cards), and then it's nine months at the very least.
 
Draw an analogy between dedicated sound cards and graphics cards. Ten years ago every PC had a separate sound card, and only the occasional one an integrated chip. Today 95% of forum members run integrated audio. Prices fell, performance rose, and the need for dedicated solutions disappeared.

Those two can't really be compared anyway. People, you're being ridiculous: those for whom integrated graphics are enough have been using integrated boards for 10 YEARS; even I, back when I had no cash, had an Intel 810 and S3 graphics on the board!
Integrated graphics becoming a bit more usable radically changes nothing on the larger scale... whoever wants to play games will still have to set cash aside for a GPU, and that works in favor of both AMD and NV, since AMD doesn't make any premium profit on integrated graphics either, just as Intel gets nothing special out of its i3/i5 processors with an integrated GPU (they sell them at the same price as a regular i5 without the GPU).
 
The man said it plainly: the first integrated sound chips weren't any good either, and now most people use them. Likewise, these integrated graphics are no good for now, but with time that will probably change too. But when somebody puts on those horse blinders and sees only what he wants to see, there's no helping it.
 
The man said it plainly: the first integrated sound chips weren't any good either, and now most people use them. Likewise, these integrated graphics are no good for now, but with time that will probably change too. But when somebody puts on those horse blinders and sees only what he wants to see, there's no helping it.

When that happens, the whole GPU segment will die out, once we can play Crysis 8 and COD 21 on an integrated board... and that definitely won't be Fusion, or anything else, in the next 7-8 years.
 
You're losing sight of the overall arc of technological progress.
Even if integrated graphics do reach some satisfactory level of GPU performance, maybe on the level of today's cards, the technology will keep advancing all the same, and the gap between an integrated GPU and a separate external GPU, as a whole dedicated interface, will always remain very large.
Secondly, 3D gaming won't stand still either; all kinds of innovations get implemented there, and engines keep improving.
The integrated GPU will always serve for simpler operations, while for real gaming there will always be real graphics cards, serving exclusively that purpose.
 
As far as I know, one of the hallmarks of progress in the electronics industry is integration, and the ultimate goal is having the entire computer on a single chip. I'm not saying it will happen tomorrow, but it will happen.
We're going off-topic.
 
@Zeljko, you're missing the point. There will always be people willing to spend larger sums of money on graphics. But a much larger number of them (humans) will buy cheap integrated boards, because those get a good chunk of the job done too. Of the 4 computers I regularly deal with, 3 have integrated graphics (SiS, VIA and AMD), and the fourth would too if its integrated chip could drive two digital monitors. Few people buy separate low- and mid-range graphics cards (compared to the number buying integrated); once integrated graphics advance enough to run over mid-range dedicated cards, the situation will be the same as with today's sound cards: only those who can "hear" the barely perceptible difference will have them.

@PCBeast
Are you forgetting the "console" cycles, and the fact that PC gaming is slowly but surely dying off? The number of games worth "buying", let alone worth playing, keeps shrinking. The situation can only get worse over the next couple of years.
 