
GeForce GTX 680 Thread

Because the card treats it as a "power virus" and doesn't let it use all the resources... as soon as the hardware recognizes it, the protection kicks in and drops the clocks. Forget FurMark; it lost its purpose back when AMD introduced protection against it, and then nVidia followed :)
Btw, congrats on the purchase :)
 
They've always known how to do a review masterfully. Excellent choice of games.

To sum up:

1920 x 1080
[chart: average results of the tested games at 1920 x 1080]


At stock (default) clocks at 1920 x 1080, the HD 7970 is on average 8.8% slower than the GTX 680 across the tested games.
When both cards are overclocked, the GTX 680 holds a 1% lead over the HD 7970.

2560 x 1440
[chart: average results of the tested games at 2560 x 1440]


At stock (default) clocks at 2560 x 1440, the HD 7970 is on average 5.8% slower than the GTX 680 across the tested games.
When both cards are overclocked, the GTX 680 loses to the HD 7970 by about 1%.

There's no doubt the GTX 680's lead over the HD 7970 shrinks both with higher resolution and with overclocking (truth be told, the HD 7970 has more headroom for OC).
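
For the curious, here is a minimal sketch of how an average gap like the 8.8% above is typically computed. The game names and FPS values below are invented placeholders, not numbers from this review; only the method (mean of per-game performance ratios) is the point.

```python
# Hypothetical per-game FPS for two cards; values are made up for illustration.
hd7970 = {"Game A": 55.0, "Game B": 72.0, "Game C": 41.0}
gtx680 = {"Game A": 61.0, "Game B": 78.0, "Game C": 44.0}

# Per-game deficit of the HD 7970 relative to the GTX 680, then averaged.
deficits = [1 - hd7970[g] / gtx680[g] for g in hd7970]
print(f"HD 7970 slower on average by {100 * sum(deficits) / len(deficits):.1f}%")
```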
 
I've gotten a bit fed up with green screens on Flash video clips... The card is powerful, but it seems to clash with something in my PC. Either way, it's going on the auction block...

@Ljanmi :p
 
re: ASUS GeForce GTX 680 DirectCU II TOP

1367 MHz, WTF?
A little more and it's 1400 MHz, cool ;)

VR-Zone has tested this card; it can reach a core clock of 1.3 GHz without voltage mods!

The temperatures are truly ridiculous...
27°C idle, while at full load it goes up to 67°C!




Battlefield 3 is another example: with all the eye candy turned up in-game, again at 4xAA, the card still pushes 61 FPS at 19x12, which makes it the fastest single-GPU graphics card on the market.

Insane, and it's not even overclocked yet...
Now that's what I call a beast :eyebrows:
 
Inno3D GTX 680 iChill Keeps 41°C Under Full Load

The Inno3D GTX 680 iChill Edition is a hybrid liquid- and air-cooled card. It comes with an additional water block that is mounted separately in the PC case. The card is equipped with an iChill Accelero Hybrid cooler with one fan.
This card requires two different power connectors: 6-pin and 8-pin.

Tests showed that the iChill cooler provides a substantial temperature decrease compared to the stock cooler. In open space at room temperature, the card idles at 25°C, and at 29°C in a closed case.
Full load raises the temperature to only 41°C.

The source did not say where the pictures were taken, but it looks like some Inno3D lab.
Other Inno3D cards are pictured as well. We can see one that has already been announced, the Inno3D GTX 680 TwinFan Edition, a card with removable cooler shrouds.
You can also notice a triple-fan cooler which, according to the translation, could be called the Inno3D GTX 680 Ice Dragon Black Gold Edition (as if the name weren't long enough already).

[images: Inno3D GTX 680 iChill and other Inno3D cards]
 
re: GIGABYTE WindForce 5X

We're going back to old habits
[image: 3dfx Voodoo5 5500/6000]


I definitely have a soft spot for "big" hardware, especially graphics cards; they look powerful, but this is too much. Eventually they'll start building bricks into computers.
That Gigabyte doesn't even look like a card anymore; they've gone overboard!
 
I don't know what you're all complaining about. Sure, the noise would be a bit high, but this Gigabyte card is downright gorgeous to me, and on top of that it's made for SLI.
 
I definitely have a soft spot for "big" hardware, especially graphics cards; they look powerful, but this is too much. Eventually they'll start building bricks into computers.
That Gigabyte doesn't even look like a card anymore; they've gone overboard!

Not to mention the farce of the labels on the fans being mounted upside down (that is, the fans themselves) :D
 
They're going with the "bigger is better" philosophy. I guess a card the size of an oar is supposed to impress :)
 
GTX 680 GPU Boost: double variability

During our testing of the GeForce GTX 680 we expressed some reservations about the behavior of GPU Boost, Nvidia's turbo. It is non-deterministic in the sense that, to decide whether the GPU may raise its frequency, it relies on actual power consumption rather than on an estimate that would be identical for all GPUs. This approach has the advantage of maximizing the performance of samples that consume the least, for example because they suffer less from leakage currents, but in return it introduces a dose of variability into the performance of any two samples. Nvidia is tight-lipped on the subject, even though we pressed at length for details on this range of variability. The manufacturer is happy to announce a GPU Boost frequency, which is the guaranteed minimum frequency the GPU can reach; its engineers said they were amazed to watch the GPU go higher, and in many cases refused to say more. In reality this GPU Boost frequency is a facade specification: its presence in the BIOS or drivers has absolutely no effect, except to let monitoring tools report it.
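
To make the non-deterministic part concrete, here is a minimal sketch of a consumption-driven turbo loop of the kind the article describes. The 1006 MHz base clock and 13 MHz bin size come from the article itself; the power limit, function name, and update rule are my assumptions, since Nvidia's real controller is not public.

```python
# A consumption-driven turbo loop in the spirit of GPU Boost. Everything here
# is an illustrative assumption except the base clock and bin size.

BASE_MHZ = 1006      # GTX 680 base clock
BIN_MHZ = 13         # one boost bin ("notch")
POWER_LIMIT_W = 195  # hypothetical board power target

def next_clock(current_mhz: int, measured_power_w: float, max_boost_mhz: int) -> int:
    """Step up one bin while measured power is under the limit, step down
    one bin when it is over, never exceeding the per-sample cap."""
    if measured_power_w < POWER_LIMIT_W and current_mhz + BIN_MHZ <= max_boost_mhz:
        return current_mhz + BIN_MHZ
    if measured_power_w > POWER_LIMIT_W and current_mhz - BIN_MHZ >= BASE_MHZ:
        return current_mhz - BIN_MHZ
    return current_mhz

# Because the input is *measured* power, two chips with different leakage
# settle at different clocks under the same load -- the first source of
# variability the article describes.
clock = next_clock(BASE_MHZ, measured_power_w=150.0, max_boost_mhz=1110)  # -> 1019
```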


So what is the true maximum GPU Boost frequency? By observing the behavior of some retail cards, we began to understand why Nvidia is embarrassed by questions on this point. Not all GK104-400 chips (the version of the GPU used on the GeForce GTX 680) are qualified to the same maximum frequency! For example, the sample Nvidia provided us for the review is qualified to 1110 MHz, while a retail Gigabyte card we obtained carries a GPU qualified to "only" 1084 MHz. Others sit at 1071 MHz, others at 1097 MHz, and so on. This means that, on top of GPU Boost's non-deterministic operation, the frequency increase is capped differently depending on the sample, regardless of whether the GPU's temperature and power consumption are well under their limits. Confronted with these findings, Nvidia still refuses to answer, arguing, after our insistence, that it wants to keep its qualification procedure secret so the competition cannot draw inspiration from it. In general, and simplifying, the first batches of GPUs are tested and common specifications are defined that allow a certain production volume with a certain performance level and a certain thermal envelope. In the case of the GK104-400, probably because the GeForce GTX 680 is neck and neck with the Radeon HD 7970, Nvidia seems keen to grab every last point of performance available... even if reaching it is incompatible with a sufficient production volume. In other words, Nvidia wants both the performance of a GPU at 1110 MHz and the production volume of a GPU at 1058 MHz. How does it work in practice? We can assume that each GPU stores the gap between the base frequency and the maximum turbo frequency as a number of 13 MHz bins (notches), just as each GPU stores a voltage all its own. Our press sample is thus a GK104-400 qualified at 1006 MHz + 8 bins (1110 MHz), while the Gigabyte sample is merely a GK104-400 qualified at 1006 MHz + 6 bins (1084 MHz). In practice, we observed that the performance variation between the two cards stems more from GPUs qualified at different maximum frequencies than from GPU Boost's non-deterministic operation, since most games keep the card well below its power limit. So what is the performance gap between these two samples?
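
As a sanity check, the bin arithmetic above can be reproduced in a couple of lines (the function name is mine; the clocks and bin counts are the article's):

```python
# Per-sample boost cap as described above: base clock plus a per-chip
# number of 13 MHz bins.
BASE_MHZ, BIN_MHZ = 1006, 13

def qualified_max(bins: int) -> int:
    return BASE_MHZ + bins * BIN_MHZ

print(qualified_max(8))  # press sample:    1110 MHz
print(qualified_max(6))  # Gigabyte retail: 1084 MHz
# (1110 - 1084) / 1084 ~= 2.4%, the roughly 2% spec gap quantified below
```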

This roughly 2% difference in the actual specifications of these two GeForce GTX 680s results in a practical difference of 1.5%, which can rise to 5% in the case of Anno 2070, a very demanding title that is also influenced by GPU power consumption, obviously lower on the press sample.
All that for this?
Why worry about such a detail? After all, a 2% difference or less in games does not make a big difference, as Nvidia does not fail to point out... while nevertheless not doing without it! We assume, in fact, that the GPU designer took this approach in order to glean a single point of performance against the competition. Moreover, this is only one example we happened upon, without looking for the largest possible gap. Some GK104-400s may be validated at turbo frequencies higher or lower than those we observed, producing larger performance gaps. Since Nvidia categorically refuses to communicate on these differences, probably because it would be embarrassing to admit to a spread in the specifications, we cannot know the range of variation. This also raises more fundamental questions about component specifications, which we are used to being fixed per product model. Is it acceptable for them not to be fixed? Is less than 1% acceptable? 2%? 3%? 5%? 10%? At what point would it become abuse? Could you imagine an Intel Turbo that could randomly be 3.9 GHz or 4 GHz depending on the sample? A 128 GB SSD, or 130 GB? It will be interesting to see whether this margin grows or shrinks on future Nvidia products, although that is generally difficult to determine, especially at launch, when we have to settle for a press sample that in all likelihood was not chosen at random. How can we tell whether the performance of the GeForce GTX 690 we will review tomorrow will be fully representative of retail cards?
 
Exaggeration, and an unfounded comparison of different technologies.
The patent may be "the same", but the working principle certainly is not: Intel Turbo vs. nVidia Boost.
Second, there's the push to shift the focus onto a supposedly weak potential of the chip itself, which again isn't true, since reviewers proved long ago that the GK104 has plenty of headroom for OC.
Insufficient frequencies for nVidia to have a bigger lead, or any lead at all?! WTF?
As if GPU Boost just gets designed and implemented on a whim... "Let's see how far the chip can go; if there's no basis for higher frequencies, we'll throw in some 'GPU Boost' to nudge things along and pull the wool over everyone's eyes!"
Hehe.
We all knew how GPU Boost works even before this, and that almost every card is different, i.e. a sample unto itself...
They're reinventing the wheel.

A typical smear campaign... I don't know who could swallow this so easily?!
 
Well, every card IS a sample unto itself!!
The 680 as well as the 7970; even the ASIC quality itself, which determines the factory VID, tells us that!
And what's even new here? This has applied all along to both CPUs and GPUs; imagine a production run where every sample hits the same OC at the same voltage? That's why high-binned parts exist for everything, memory and NAND included...

There's no smear campaign here; they only pointed out what has long been known (with or without Boost). Maybe they're just in a slightly hateful mood or whatever, and Nvidia admittedly didn't specify the boost clocks, but it does stand that Boost pushes the cards ahead...
 
The text itself has that connotation, especially since it emphasizes, even claims, that nVidia is hiding something... In other words, there's something fishy!
Texts like these are usually backed by the other side; it rarely happens that a site opens fire on its own initiative...
 
Ah, those are the French; they like doing that, I know them, and I've read the original text; I doubt they're "hired guns". In essence, they're complaining that Nvidia is pushing Boost to overtake the Radeons (they're somewhat right about that), and that not every card will be able to boost that much, which is the point of their complaint; that is, they wouldn't even complain if the boost were the same on all cards, and that's what they're accusing them of (there they're a bit more right).
 
Exaggeration, and an unfounded comparison of different technologies.
The patent may be "the same", but the working principle certainly is not: Intel Turbo vs. nVidia Boost.
Second, there's the push to shift the focus onto a supposedly weak potential of the chip itself, which again isn't true, since reviewers proved long ago that the GK104 has plenty of headroom for OC.
Insufficient frequencies for nVidia to have a bigger lead, or any lead at all?! WTF?
As if GPU Boost just gets designed and implemented on a whim... "Let's see how far the chip can go; if there's no basis for higher frequencies, we'll throw in some 'GPU Boost' to nudge things along and pull the wool over everyone's eyes!"
Hehe.
We all knew how GPU Boost works even before this, and that almost every card is different, i.e. a sample unto itself...
They're reinventing the wheel.

A typical smear campaign... I don't know who could swallow this so easily?!

Well, it is lame that one sample of a card at default runs +2%/-2% faster/slower than other samples; in other words, not every card will boost the same... True, the difference is so small it goes unnoticed, as they themselves say in the test, but the difference could perhaps turn out even bigger, which is why they brought up Intel Turbo and asked what it would be like if one processor ran at 3.9 GHz and another at 4 GHz... Or if one unit of the same SSD had 128 GB and another 130 GB... That's the point of the piece, not some smearing of nVidia or the like...

@mdm I wouldn't say Boost is pushed so it could overtake the Radeons; the GTX 680 is generally a faster card than the HD 7970 even without that boost, but Boost helps widen the gap further, making the lead more convincing.
 
Well, it isn't faster when everything is on high settings, especially at 2560x1440 and above, IF they're at the same clock (1100-1200), where the Radeon can even be faster.
The point is that their maximum clock reach is similar, and there they're practically identical on average, or the Radeon even has a small lead in some game (1-2%); it's not big, but it's there. The Radeon only got shafted by its factory clock (just 925 vs. 1006 + boost), which was set so low even though every sample can go over 1000 (and 1100 with no trouble) at stock voltage, and they didn't think of a boost or whatever; whose fault is that but theirs!
 