
Radeon 8.5k and Doom3...

byMaX
Notable
VIP member
Joined: 10.01.2001
Posts: 12,148
Points: 1,019
Want to know some of what John Carmack has been doing with the Radeon 8500 in regards to Doom 3? Here is a blurb:


"A test of the non-textured stencil shadow speed showed a GF3 about 20% faster
than the 8500. I believe that Nvidia has a slightly higher performance memory
architecture.
A test of light interaction speed initially had the 8500 significantly slower
than the GF3, which was shocking due to the difference in pass count. ATI
identified some driver issues, and the speed came around so that the 8500 was
faster in all combinations of texture attributes, in some cases 30+% more.
This was about what I expected, given the large savings in memory traffic by
doing everything in a single pass."

So it looks like the talk that ATI's Developer Relations have improved greatly is once again in evidence here. It seems that Radeon 8500+ owners will not have much to worry about with games based on future id graphics engines or Doom 3 itself.

Which drivers......?????

Oh, come on :)
 
And yet...

"A high polygon count scene that was more representative of real game graphics
under heavy load gave a surprising result. I was expecting ATI to clobber
Nvidia here due to the much lower triangle count and MUCH lower state change
functional overhead from the single pass interaction rendering, but they came
out slower. ATI has identified an issue that is likely causing the unexpected
performance, but it may not be something that can be worked around on current
hardware.


I can set up scenes and parameters where either card can win, but I think that
current Nvidia cards are still a somewhat safer bet for consistent performance
and quality.

On the topic of current Nvidia cards:

Do not buy a GeForce4-MX for Doom."

--
Though the man himself says it plainly:
"As usual, there will be better cards available from both Nvidia and ATI by the
time we ship the game"

JC Rulezz!boom
 
Rulezz my a**... He could actually do some work on that engine instead of updating his plans and emails every day... I get the feeling he works half an hour a day and then chills... not to say what else... Like, "I'm a genius, and it would take others 10 hours to do what I did in half an hour"... Come on... That game has been in the making for years...
 
Here is more of that info; pay attention to the following sentence: "Nvidia's OpenGL drivers are my "gold standard"....



----------------------------------

February 11, 2002
-----------------

Last month I wrote the Radeon 8500 support for Doom. The bottom line is that
it will be a fine card for the game, but the details are sort of interesting.

I had a pre-production board before Siggraph last year, and we were discussing
the possibility of letting ATI show a Doom demo behind closed doors on it. We
were all very busy at the time, but I took a shot at bringing up support over
a weekend. I hadn't coded any of the support for the custom ATI extensions
yet, but I ran the game using only standard OpenGL calls (this is not a
supported path, because without bump mapping everything looks horrible) to see
how it would do. It didn't even draw the console correctly, because they had
driver bugs with texGen. I thought the odds were very long against having all
the new, untested extensions working properly, so I pushed off working on it
until they had revved the drivers a few more times.

My judgment was colored by the experience of bringing up Doom on the original
Radeon card a year earlier, which involved chasing a lot of driver bugs. Note
that ATI was very responsive, working closely with me on it, and we were able
to get everything resolved, but I still had no expectation that things would
work correctly the first time.

Nvidia's OpenGL drivers are my "gold standard", and it has been quite a while
since I have had to report a problem to them, and even their brand new
extensions work as documented the first time I try them. When I have a
problem on an Nvidia, I assume that it is my fault. With anyone else's
drivers, I assume it is their fault. This has turned out correct almost all
the time. I have heard more anecdotal reports of instability on some systems
with Nvidia drivers recently, but I track stability separately from
correctness, because it can be influenced by so many outside factors.

ATI had been patiently pestering me about support for a few months, so last
month I finally took another stab at it. The standard OpenGL path worked
flawlessly, so I set about taking advantage of all the 8500 specific features.
As expected, I did run into more driver bugs, but ATI got me fixes rapidly,
and we soon had everything working properly. It is interesting to contrast
the Nvidia and ATI functionality:

The vertex program extensions provide almost the same functionality. The ATI
hardware is a little bit more capable, but not in any way that I care about.
The ATI extension interface is massively more painful to use than the text
parsing interface from nvidia. On the plus side, the ATI vertex programs are
invariant with the normal OpenGL vertex processing, which allowed me to reuse
a bunch of code. The Nvidia vertex programs can't be used in multipass
algorithms with standard OpenGL passes, because they generate tiny differences
in depth values, forcing you to implement EVERYTHING with vertex programs.
Nvidia is planning on making this optional in the future, at a slight speed
cost.
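
For readers who have not used these extensions, here is a minimal sketch (assembled from the published NV_vertex_program spec, not taken from Doom) of the "text parsing interface" he means: the whole vertex program is a single string handed to the driver, whereas the ATI interface requires building a program instruction by instruction through separate function calls.

```c
/* Illustrative only (not id's code): a trivial NV_vertex_program that
 * transforms the vertex and passes the colour through. The entire
 * program is one text string parsed by the driver; the glXxxNV entry
 * points are fetched at run time (e.g. via wglGetProcAddress). */
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>

static const GLubyte test_vp[] =
    "!!VP1.0\n"
    "DP4 o[HPOS].x, c[0], v[OPOS];\n"   /* clip position = tracked MVP rows */
    "DP4 o[HPOS].y, c[1], v[OPOS];\n"
    "DP4 o[HPOS].z, c[2], v[OPOS];\n"
    "DP4 o[HPOS].w, c[3], v[OPOS];\n"
    "MOV o[COL0], v[COL0];\n"           /* pass the vertex colour through */
    "END\n";

void load_test_vertex_program(void)
{
    /* Make c[0]..c[3] track the concatenated modelview-projection matrix. */
    glTrackMatrixNV(GL_VERTEX_PROGRAM_NV, 0,
                    GL_MODELVIEW_PROJECTION_NV, GL_IDENTITY_NV);
    glBindProgramNV(GL_VERTEX_PROGRAM_NV, 1);
    glLoadProgramNV(GL_VERTEX_PROGRAM_NV, 1,
                    (GLsizei)strlen((const char *)test_vp), test_vp);
    glEnable(GL_VERTEX_PROGRAM_NV);
}
```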

I have mixed feelings about the vertex object / vertex array range extensions.
ATI's extension seems more "right" in that it automatically handles
synchronization by default, and could be implemented as a wire protocol, but
there are advantages to the VAR extension being simply a hint. It is easy to
have a VAR program just fall back to normal virtual memory by not setting the
hint and using malloc, but ATI's extension requires different function calls
for using vertex objects and normal vertex arrays.
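
A rough sketch of the fallback he is describing, based on the published NV_vertex_array_range spec (the helper function and its parameters are assumptions, not id's code): because VAR is only a hint, the allocation can silently drop back to malloc and the rest of the renderer keeps using the same glVertexPointer/glDrawElements path, whereas ATI's vertex objects would require switching to the separate glNewObjectBufferATI/glArrayObjectATI entry points.

```c
/* Illustrative fallback (names assumed from the extension specs, not
 * taken from Doom): try fast VAR memory, otherwise plain malloc. The
 * caller keeps using glVertexPointer/glDrawElements either way.
 * wglAllocateMemoryNV is Windows-only and, like glVertexArrayRangeNV,
 * is resolved at run time via wglGetProcAddress. */
#include <stdlib.h>
#include <GL/gl.h>
#include <GL/glext.h>

void *alloc_vertex_memory(size_t bytes, int have_var)
{
    if (have_var) {
        /* read/write frequency 0, priority 1.0 requests video/AGP memory */
        void *fast = wglAllocateMemoryNV((GLsizei)bytes, 0.0f, 0.0f, 1.0f);
        if (fast) {
            glVertexArrayRangeNV((GLsizei)bytes, fast);
            glEnableClientState(GL_VERTEX_ARRAY_RANGE_NV);
            return fast;
        }
    }
    return malloc(bytes);   /* ordinary virtual memory, same code path */
}
```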

The fragment level processing is clearly way better on the 8500 than on the
Nvidia products, including the latest GF4. You have six individual textures,
but you can access the textures twice, giving up to eleven possible texture
accesses in a single pass, and the dependent texture operation is much more
sensible. This wound up being a perfect fit for Doom, because the standard
path could be implemented with six unique textures, but required one texture
(a normalization cube map) to be accessed twice. The vast majority of Doom
light / surface interaction rendering will be a single pass on the 8500, in
contrast to two or three passes, depending on the number of color components
in a light, for GF3/GF4 (*note GF4 bitching later on).
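
To illustrate what "six textures, accessed twice" means in practice, here is a minimal ATI_fragment_shader setup assembled from the extension spec. It is only an assumed sketch of the mechanism (the sample destination register also selects the texture unit, and a second sampling phase lets more maps be fetched after phase-one arithmetic), not Doom's actual light interaction shader.

```c
/* Assembled from the ATI_fragment_shader spec as an illustration only;
 * NOT Doom's interaction shader. Assumes <GL/glext.h> and entry points
 * resolved at run time. */
GLuint build_example_shader(void)
{
    GLuint id = glGenFragmentShadersATI(1);
    glBindFragmentShaderATI(id);
    glBeginFragmentShaderATI();

    /* Phase 1: sample the normalization cube map (unit 0, light vector
     * in texcoord set 0) and the normal map (unit 1). */
    glSampleMapATI(GL_REG_0_ATI, GL_TEXTURE0_ARB, GL_SWIZZLE_STR_ATI);
    glSampleMapATI(GL_REG_1_ATI, GL_TEXTURE1_ARB, GL_SWIZZLE_STR_ATI);

    /* N.L, with both vectors expanded from [0,1] to [-1,1]. */
    glColorFragmentOp2ATI(GL_DOT3_ATI,
        GL_REG_5_ATI, GL_NONE, GL_NONE,
        GL_REG_0_ATI, GL_NONE, GL_2X_BIT_ATI | GL_BIAS_BIT_ATI,
        GL_REG_1_ATI, GL_NONE, GL_2X_BIT_ATI | GL_BIAS_BIT_ATI);

    /* Phase 2: sample the diffuse map (unit 2) and modulate it by the
     * phase-1 dot product; the final colour must end up in REG_0. */
    glSampleMapATI(GL_REG_2_ATI, GL_TEXTURE2_ARB, GL_SWIZZLE_STR_ATI);
    glColorFragmentOp2ATI(GL_MUL_ATI,
        GL_REG_0_ATI, GL_NONE, GL_NONE,
        GL_REG_5_ATI, GL_NONE, GL_NONE,
        GL_REG_2_ATI, GL_NONE, GL_NONE);

    glEndFragmentShaderATI();
    return id;
}
```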

Initial performance testing was interesting. I set up three extreme cases to
exercise different characteristics:

A test of the non-textured stencil shadow speed showed a GF3 about 20% faster
than the 8500. I believe that Nvidia has a slightly higher performance memory
architecture.

A test of light interaction speed initially had the 8500 significantly slower
than the GF3, which was shocking due to the difference in pass count. ATI
identified some driver issues, and the speed came around so that the 8500 was
faster in all combinations of texture attributes, in some cases 30+% more.
This was about what I expected, given the large savings in memory traffic by
doing everything in a single pass.

A high polygon count scene that was more representative of real game graphics
under heavy load gave a surprising result. I was expecting ATI to clobber
Nvidia here due to the much lower triangle count and MUCH lower state change
functional overhead from the single pass interaction rendering, but they came
out slower. ATI has identified an issue that is likely causing the unexpected
performance, but it may not be something that can be worked around on current
hardware.

I can set up scenes and parameters where either card can win, but I think that
current Nvidia cards are still a somewhat safer bet for consistent performance
and quality.

On the topic of current Nvidia cards:

Do not buy a GeForce4-MX for Doom.

Nvidia has really made a mess of the naming conventions here. I always
thought it was bad enough that GF2 was just a speed bumped GF1, while GF3 had
significant architectural improvements over GF2. I expected GF4 to be the
speed bumped GF3, but calling the NV17 GF4-MX really sucks.

GF4-MX will still run Doom properly, but it will be using the NV10 codepath
with only two texture units and no vertex shaders. A GF3 or 8500 will be
much better performers. The GF4-MX may still be the card of choice for many
people depending on pricing, especially considering that many games won't use
four textures and vertex programs, but damn, I wish they had named it
something else.
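
The practical point is that a renderer has to pick its codepath from what the driver actually reports rather than from the product name; a GF4-MX advertises only two texture units and no vertex programs. A hypothetical selection routine (names and thresholds are illustrative, not id's) might look like this:

```c
/* Illustrative back-end selection based only on standard queries; the
 * path names are made up, not Doom's. A GF4-MX reports two texture
 * units and no NV_vertex_program, so it lands in the oldest path
 * despite the "GF4" name. */
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>

typedef enum { PATH_BASIC, PATH_NV10, PATH_NV20, PATH_R200 } render_path_t;

render_path_t choose_render_path(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    GLint units = 1;

    glGetIntegerv(GL_MAX_TEXTURE_UNITS_ARB, &units);

    if (units >= 6 && strstr(ext, "GL_ATI_fragment_shader"))
        return PATH_R200;                      /* Radeon 8500: single-pass path */
    if (units >= 4 && strstr(ext, "GL_NV_register_combiners")
                   && strstr(ext, "GL_NV_vertex_program"))
        return PATH_NV20;                      /* GF3 / GF4 Ti                  */
    if (units >= 2 && strstr(ext, "GL_NV_register_combiners"))
        return PATH_NV10;                      /* GF1/GF2/GF4-MX: two units     */
    return PATH_BASIC;                         /* plain multitexture fallback   */
}
```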

As usual, there will be better cards available from both Nvidia and ATI by the
time we ship the game.
 
And a few comments from a foreign forum.... it looks like ATI will never make a proper card for gamers after all; 3DMark and Quake 3 mean nothing, that's not even 1% of what needs to be done right...

-----------------------------------------




John Reynolds
Member

Joined: Feb 07, 2002
Posts: 9 Posted: 2002-02-11 23:47
--------------------------------------------------------------------------------
That high poly bug I keep seeing 8500 users complain about online is scarily starting to sound like a hardware, and not driver, level problem.

--------------------------------------------------------------------------------



DaveBaumann
Member


Joined: Jan 29, 2002
Posts: 98
From: Bedforshire, UK
Posted: 2002-02-12 01:10
--------------------------------------------------------------------------------
John - if it's the same 'high poly bug' that's been talked about, then it seems that Croteam have run into it as well; they have a workaround and ATi say they will be updating it in the next driver.
_________________
'Wavey' Dave
Beyond3D

--------------------------------------------------------------------------------

Johnny Rotten
Member

Joined: Feb 06, 2002
Posts: 9
From: Edmonton, Alberta
Posted: 2002-02-12 01:23
--------------------------------------------------------------------------------
I believe we're talking about 2 distinct things here. The SS:SE slowdown bug was a driver problem with regard to particular texture uploads. This is apparently a different issue from the VulpineGL problem and the (apparently) Doom3 issue.

At this point though only ATI knows for sure.

--------------------------------------------------------------------------------

DaveBaumann
Member


Joined: Jan 29, 2002
Posts: 98
From: Bedforshire, UK
Posted: 2002-02-12 01:30
--------------------------------------------------------------------------------
Yeah - just re-read that one and it seems to be a texture thrashing issue rather than anything else.
_________________
'Wavey' Dave
Beyond3D

--------------------------------------------------------------------------------

Doomtrooper
Member


Joined: Feb 06, 2002
Posts: 38 Posted: 2002-02-12 02:46
--------------------------------------------------------------------------------
This poly issue concerns me, although all my games play excellently... I still can't fathom how the 8500 could lose to my old Radeon Vivo in test 6 in Glexcess.
I plan on getting a 128 MB 8500 as soon as they're available; the first thing I'm gonna do is load up the leaked FireGL drivers that detect the chip revision... if it shows a different chip revision (mine is currently A13) I will get suspicious.
The thing that concerns me the most is that Croteam and the designer of Glexcess contacted ATI (the Glexcess coder over a month ago) about this issue, and even with the many dev leaks the problem is still there. The FireGL drivers show better results than the 8500's, but not near what it should be putting out.

This may also be the reason why the 128 MB Radeon 8500 cards are showing a 1000-point 3DMark increase and 20 fps more in RTCW...
---------------------------------------------------------------------------





Has anyone else noticed anything like this (byMax)?
 
byMaX wrote:
Rulezz my a**... He could actually do some work on that engine instead of updating his plans and emails every day... I get the feeling he works half an hour a day and then chills... not to say what else... Like, "I'm a genius, and it would take others 10 hours to do what I did in half an hour"... Come on... That game has been in the making for years...

Heh, Max. I forgive you for those words because you don't know better.
Here are a few links for you and the others:
http://www.redherring.com/insider/2002/0201/1289.html
http://boards.fool.com/Message.asp?mid=16695587
Cheers,
Nedjo :wave;
And while we're on the subject of engines, here's what one in development looks like (one of the people working on it also worked on UT):
http://www.3dengine.ca/prx/gallery.htm
http://www.3dengine.ca/prx/movies/prx1.wmv
BTW
Max, happy birthday! (D)
 
Thanks, guys, for all this reading material...

Nedjo, thank you :)... If there were more people like you... we would have... a Serbian S Sam today... :D :D
 
byMaX wrote:
Thanks, guys, for all this reading material...

Nedjo, thank you :)... If there were more people like you... we would have... a Serbian S Sam today... :D :D

You mean Susam (sesame)? ;) :D
 
I've read the texts... That Joca (Carmack) really does work... I really like him, the guy is the exact opposite of me :)... Anyway... I never asked you what your avatar actually is... a laughing Predator...? :confused:
 
This deserves special attention (and anyone thinking about an 8500 should draw the conclusion in time)... plus all those complaints on the forums... nobody is sure whether the 8500 core is actually 100% bug free. :confused:

Still, we know who the daddy is. :)

Then again, it's easy for Nvidia: they have a latest-generation simulator (maybe ATI has one too, but doesn't know how to use it ;) :D )...


ATI has identified an issue that is likely causing the unexpected
performance, but it may not be something that can be worked around on current
hardware.
 
The latest news is that ATi has shipped the newest version of the "Developer" drivers, in which the bug in HiPoly scenes that Joca complained about has been fixed.
nVidia has also fixed, in their latest "Developer" drivers, the "bug" that Joca criticized.
So you can see how much influence the man has on the industry.

Anyway, for anyone curious about what some of these guys are working on, here are the links:
http://hem.passagen.se/emiper/3d.html
A 3D engine in development (the work of one of the hardcore guys from the Beyond3D forum, my favorite one).
http://jpct.de/
A 3D engine written in Java! Be sure to check it out and play around with it a bit. Be patient while it loads.
Cheers,
Nedjo
Max, honestly, I'm not sure who the guy in the avatar is, but he looks really cool. Probably some superhero villain. :wave;
 