rec.autos.simulators

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

chris

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by chris » Thu, 24 Jan 2002 06:15:09


>You are assuming you actually have any valid points...hehe

My point is that once again, you're a proven idiot and newsgroup
clown, Zod.  You said, and I quote, "there has been such a flip-flop
in the 3dfx-nVidia comparisons".  No such "flip-flop" occurred.  From
day one, the stance has been the same: 2D quality should not be
sacrificed for 3D quality or speed.

So get it through your head, stop being such a dumbass, and stop lying
to make your favorite brand of video hardware look better.  hehe.

chris

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by chris » Thu, 24 Jan 2002 06:16:23


>> What??  You really think so?  I was hoping that a GF3 or Radeon 8500
>> would suffice!
>> When's Doom3 due out, anyway?

>You 3dfx sheep are such idiots....hehe

Shut up, you twit.  Neither the GF3 nor the Radeon 8500 is a 3dfx card.  hehe.
chris

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by chris » Thu, 24 Jan 2002 06:19:17



>> >Doom3 hits the stands my GF3 will be obsolete
>> What??  You really think so?

>Carmack has mentioned he expects a GF3 to run Doom3 at 30fps-ish I believe,
>and while it may be playable it's certainly not pretty nor comfortable...

Gads.

I thought I heard once that it would be this Spring.  But that was a
long time ago, and schedules always slip...

ZOD

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by ZOD » Thu, 24 Jan 2002 07:24:55

SO...you are saying that 3dfx sheep have never put speed over quality?
[Oh...please say yes.....hehe]

Hans Bergengre

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Hans Bergengre » Thu, 24 Jan 2002 09:56:34


> >Carmack has mentioned he expects a GF3 to run Doom3 at 30fps-ish I believe,
> >and while it may be playable it's certainly not pretty nor comfortable...
> Gads.

Well, I don't think it's due to code-bloat... :-) No prerendered lightmaps;
everything gets its own proper realtime shadows and highlights, and it really
taxes the vidcard.

id hasn't set a date for the game; as with all their previous work, it's
"when it's done".

Well, if you don't have a (public) schedule to refer to, how would you know
it's slipping? ;-) Except if the title is DNF, of course, coming from Mr.
Big-Mouth Broussard, who slammed a certain Romero over the time his
Daikatana title spent in development limbo... Heh, look
who's talking... ;-)

 Bye!
/HB.

Mark Nusbaum

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Mark Nusbaum » Thu, 24 Jan 2002 13:32:59



> > That 3DFX optimized its
> > hardware through the use of a proprietary API and made 3D a reality that
> > started a revolution should hardly be criticized.

> Why not?

> They did fracture the market and attempted to monopolize it. They also held
> back that very revolution you claim they started by releasing competent but
> technically outdated products time and time again.

Are you saying the Voodoo2 was a technically outdated product at its
release? Or the original Voodoo board? I think they were pretty clearly the
whole ballgame at that time. If you're talking about the V3 and V4/5 stuff, there
was very little about the V3 that held anything back. The (PR, at least)
position that they took with the V3 was that there were no games rendered in
32-bit color and that 32-bit was really too slow to be useful anyway, so why
pay more for the extra, faster memory? They didn't support larger textures
(more than 256x256) when they weren't used in games anyway, and no AGP
texturing when that was too, too slow. But you paid notably less for a card
that performed well with what was out there to play. The last round was in
large part effected by their impending failure, I think, at least the delay
in getting it out. But what were they failing to include? Hardware T&L,
something that was hardly used by game developers at the time of the V5
release (and not at all if the card had gone to market on schedule) and even
remains so today? AGP texturing, something that brought any GF256 with only
32mb of memory to a crawl? The reviewers preached "future protection" on all
this stuff, since it all had little or no current value, but did there end
up being much validity in that? I don't think so.
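The 32-bit memory argument is easy to make concrete. A rough sketch, with illustrative arithmetic only (exact buffer layouts varied by card and driver):

```python
# Rough framebuffer budget for 32-bit rendering at 1024x768, circa 1999.
# Illustrative arithmetic only; exact layouts varied by card and driver.

def framebuffer_bytes(width, height, bytes_per_pixel, buffers):
    """Total memory for the given number of colour/depth buffers."""
    return width * height * bytes_per_pixel * buffers

w, h = 1024, 768
# Double-buffered 32-bit colour (front + back) plus a 32-bit depth buffer.
fb_32 = framebuffer_bytes(w, h, 4, 3)
# The 16-bit setup the Voodoo3 targeted (2 bytes per pixel per buffer).
fb_16 = framebuffer_bytes(w, h, 2, 3)

print(f"32-bit buffers: {fb_32 / 2**20:.1f} MB")  # 9.0 MB
print(f"16-bit buffers: {fb_16 / 2**20:.1f} MB")  # 4.5 MB
```

Whatever is left after the buffers (say, 32 MB minus 9 MB) is all a card has for local textures, which is why 32-bit colour pushed boards toward 32 MB of onboard RAM.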

Yes? What were they? Just name a handful on the market by the time of the V3
release, 4/99.

3dfx wasn't so profitable that you could say their money wasn't their
customer's money. The money they spent for R&D and manufacturing was their
customer's money in the end.

Some T-buffer features weren't worth much, but their FSAA implementation
certainly was. And FXTC, an open standard that they hoped MS would
incorporate into DirectX, would have been very worthwhile if it was as superior
to S3's TC as advertised. Texture compression has made a huge difference in
optimizing memory use on current boards, and it's a shame that nVidia has
chosen to mothball FXTC, I think.
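For scale, both S3's TC (DXT1) and 3dfx's FXT1 target roughly 4 bits per texel for opaque textures, versus 32 bits uncompressed; a quick, back-of-the-envelope sketch of the savings:

```python
# Back-of-the-envelope savings from 4-bits-per-texel texture compression
# (both S3TC/DXT1 and 3dfx's FXT1 target roughly this rate for opaque
# textures). Illustrative arithmetic only.

def texture_bytes(size, bits_per_texel):
    """Storage for a square size x size texture at the given bit depth."""
    return size * size * bits_per_texel // 8

uncompressed = texture_bytes(256, 32)   # 32-bit RGBA, 256x256
compressed   = texture_bytes(256, 4)    # ~4 bpp compressed

print(uncompressed // 1024, "KB vs", compressed // 1024, "KB")  # 256 KB vs 32 KB
print("ratio:", uncompressed // compressed)                     # 8:1
```

An 8:1 saving on texture storage is exactly the kind of "huge difference in optimizing memory use" being described.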

The two biggest things associated with nVidia are hardware T&L and the
programmable shaders stuff in the GF3s. Hardware T&L didn't turn out to be
much of anything in the end, and the other is just now coming on line, so we
don't know where that will go. But beyond that, look back to when nVidia
made its bones in this business, when they went head-to-head with 3dfx
during the V3/TNT2 generation in '99. That was when nVidia made its
reputation and became viewed as the leading manufacturer of 3D hardware. And
what made that reputation? The ravings of the hardware press, extolling the
virtues of the stuff the TNT had and the V3 didn't - 32-bit color, 32mb of
memory, large texture support, AGP texturing functionality, an OpenGL ICD.
While 3dfx sold shitloads of V3s anyway, this stuff probably meant they lost
ground with OEMs and new or more casual gamers, and that was likely a
critical issue in the longer run.

A perfect example of this is an article I came across recently on Tom's
Hardware, a gamer's guide to video hardware posted in April '99, right after
the release of the ATi Rage Fury, the Voodoo3 2k and 3k, and the TNT2 and
TNT2 Ultra, and just before the V3 3500 and Matrox G400s -
http://www.racesimcentral.net/ . The
requirements Mr. Pabst cited were:

1) An OpenGL ICD and Direct3D support. He cites the importance of OGL,
naming games run with this API, including, if you can believe it, Unreal and
UT, both native Glide games. There is no mention of Glide at all, and
remember that this was April '99.
2) Fill rates in excess of 300megapixels/texels. States "The color depth,
Z-buffer and particularly rendering quality varies significantly in between
the vendors. A good example for this is TNT2, which has a lower theoretical
fill rate than Voodoo3, but it scores higher frame rates in complex 3D
scenes."
3) Up to 32mb of onboard memory. Accurately but vaguely states that this is
really a 32-bit color issue.
4) 32-bit rendering. He seems to overstate the existence of 32-bit games at
that time, although he at least states that the current hardware was only
"starting to be capable of" handling that color depth.
5) RAMDAC of at least 250mhz. This covers all new boards, but fails to
mention that the V3's RAMDAC is faster than the TNT's.
6) State-of-the-art AGP support. States "the 3D-chip should also be able to
do AGP-texturing, which is one of the things that Voodoo3 is not able to."
7) Supports textures of at least 2048x2048. Rather vague on why (future
requirement), but points out that the V3 only supports 256x256.
He also cites multitexturing, anisotropic filtering and hardware bump
mapping, which were either provided across the board or not at all in the
boards of that time. In any case, this could have been an advertisement
written by nVidia. If you took this guide literally, the Rage Fury would
pass with flying colors but the Voodoo3 would not. Every single mention of
the V3 was negative, and every mention of the TNT2 positive.
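Pabst's fill-rate point (item 2) is simple arithmetic: theoretical fill rate is core clock times pipelines (times texture units, for texel rate). The clock and pipeline figures below are the commonly quoted ones for these parts; treat them as illustrative:

```python
# Theoretical fill rate = clock x pixel pipelines (x TMUs per pipe for
# texel rate). Figures are the commonly quoted ones for these parts,
# used here purely for illustration.

def fill_rate_mpixels(clock_mhz, pixel_pipes):
    return clock_mhz * pixel_pipes

def fill_rate_mtexels(clock_mhz, pixel_pipes, tmus_per_pipe):
    return clock_mhz * pixel_pipes * tmus_per_pipe

v3   = fill_rate_mtexels(166, 1, 2)  # Voodoo3 3000: 1 pipe, 2 TMUs
tnt2 = fill_rate_mtexels(125, 2, 1)  # TNT2: 2 pipes, 1 TMU each

print("Voodoo3:", v3, "Mtexels/s")   # 332 Mtexels/s
print("TNT2:   ", tnt2, "Mtexels/s") # 250 Mtexels/s
```

Which matches the quote: the V3's theoretical number is higher, yet real frame rates also depend on memory bandwidth, overdraw and driver quality, so the TNT2 could still win in complex scenes.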

The direct reviews were much the same. I don't know if these guys were
really taken in by this stuff - dazzled by the technology, desiring to
support the company that seemed to be moving 3D gaming forward, damning 3dfx
for their disappointment, or just paid off by nVidia. That the V3 3k was
generally as fast as a TNT2 Ultra in 16-bit, that TNT2 32-bit performance
dropped off considerably, that the V3's 2D and 16-bit 3D image quality was
better, that Glide still mattered to some, that the V3s were notably cheaper
than the TNT2s was hardly mentioned.

Pretty clearly an agenda here, and that went on with the GF256's hardware
T&L vs. the V5, then the GF3's programmable shaders vs. anyone else left
(ATi and Kryo). At least by the time the GF3 came out they were getting a
clue and talking about how it didn't make all that much sense to spend all
that money on a card that wasn't really faster than the GF2 Ultra, that had
as its distinction a feature that wasn't usable at the time, and that might
be a tad slow when it eventually was. But 3dfx is gone now, so it's all
about how crappy ATi's Radeon drivers are.

So what has nVidia brought to the table that has made a big difference in 3D
gaming or anything else? ATi put out the first semi-viable 32-bit card, AGP
texturing has never really worked, T&L came to almost nothing after all the
talk of revolutionizing 3D and making CPU speed almost a non-issue, they
killed FXTC, their FSAA is inferior to what 3dfx did and even what ATi is
doing, they copied ATi's HyperZ, the DX8 stuff hasn't really happened yet,
they've done little to enhance 2D or 3D image quality. But after the TNTs,
they did lead the way in producing higher and higher priced cards, topping
out with the $500 GF2 Ultras. Sure, they've made some chips with very nice
architecture that were very fast, but that's not much of a legacy
considering their exalted position in the minds of many and the price of
those cards.

Pierre Legrand

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Pierre Legrand » Thu, 24 Jan 2002 15:25:12

Just wanted to say that I'm sorry to anyone who has tried to get to my
site. In 7 days I had a total of 16,000 visitors to my video card
review, V5 vs GF3, and it caused me to exceed my transfer limits.

Wifey doesn't think it's a good idea to pay $.04 per MB, so the site will
remain down till the end of the month.

THANK YOU FOR VISITING...it's extremely gratifying to know that you all
found it useful. It's the only reward for the work of the review and
it's great to know it was appreciated.

Thanks again....see you when we open up again.

Pierre PAPA DOC Legrand

Pierre PAPA DOC Legrand
Never Forget Never Forgive
September 11, 2001
We Will Find You
www.papadoc.net

Tony Hil

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Tony Hil » Thu, 24 Jan 2002 15:25:33



>>Uh, dude... I'd have agreed with you on that had you limited yourself to
>>D3D. According to those in the know, all previous D3D versions to DX6 or
>>something like that seemed to have been put together by a team of drunken
>>monkeys. However, OGL is by far the more competent and powerful API of the
>>two, and trying to pass off BS that Glide is/was superior is just plain
>>bullshit.

>Glide WAS superior in the games that supported it.
>Too bad you don't agree.

Glide was superior mainly due to the fact that the implementation in
the games was better, not so much that the API was better than OpenGL.
In fact, the whole idea of Glide was just to make an API similar to
OpenGL but much simpler and aimed specifically at games.  It was easier to
implement and faster, but only because it was really limited to games,
while OpenGL was designed as a much more general-purpose 3D API.
D3D was a very late-comer to the game, and took quite a few revisions
before it was really playing a role in this area.

Uhh, that was a completely meaningless parallel.  The lack of a proper
OpenGL implementation kept 3dfx's cards to a fairly small section of
the market, ie gamers, and it didn't even work with all games.  There
WERE games that came out that used OpenGL which did not work (or
worked poorly) with the miniGL driver that 3dfx provided.

Still, all that being said, it has virtually nothing to do with the
reason why 3dfx went out of business.  The reason that 3dfx went under
is simply that nVidia just blew them away in their core market and
3dfx didn't have a niche to drop back on.  S3 also got wiped out and
was sold off.  Matrox got blown away but was able to fall back on
their niche of high-end 2D graphics.  ATI was the only company that
kept within shooting range, though for a while they were only holding
on due to their strong OEM ties and low cost.  nVidia simply set a
development pace that no one else could match.  Everyone else kept
planning on building a "TNT killer" or "GeForce killer", but by the
time they got to market, nVidia had already moved on by two
generations.  What's more, nVidia's Unified Driver approach gained them
some popularity with OEMs and also meant that they were able to
quickly develop really good drivers for ALL their operating systems
(nVidia to this date remains the only company with really good 3D
drivers for Linux, being more than twice as fast as their next
competitors on just about everything).  ATI was especially bad for
this, where their drivers might have worked ok for Win98, but simply
sucked for Win2K for quite some time (and under Linux a TNT2 m64 will
beat out a Radeon 8500 any day).

In short, I don't think that many people can really fault 3dfx that
much for their demise, it was really just that nVidia did a LOT of
things right and, perhaps more importantly, they did them right now,
not two years from now.  Even Intel, who many feared would totally
dominate the graphics industry in short order, was completely blown
away by nVidia in the discrete graphics chip market and was
relegated to the integrated graphics market (where nVidia is already
making a push as it is).

Hans Bergengre

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Hans Bergengre » Thu, 24 Jan 2002 15:51:40

"Mark Nusbaum" <mark.nusb...@worldnet.att.net> wrote:
> Are you saying the Voodoo2 was a technically outdated product at its
> release? Or the original Voodoo board?

Naturally, you would pick the two groundbreaking products, avoiding all the
turkeys that came later... To quote Zod the idiot child: "hehe". :-)

Rush was maybe not outdated, but at least...hm...poorly thought out.
Banshee, outdated. V3, outdated. V4/5, outdated. They either did not advance
3D graphics at all, or at best caught up with where the competition had stood
two product generations earlier - with the notable exception of FSAA, which,
even though exceptional to this day quality-wise (if not speed-wise), does not
change the overall picture.

> there was very little about the V3 that held anything back.

Of course no, since they were still the market leaders, and developers aimed
at voodoo-level performance. That meant 16-bit everything and blurry 256*256
textures. That's exactly what I meant by holding back the revolution, since
3dfx refused to change with the times feature-wise apart from a few
insignificant details from the first voodoo graphics in 96, all the way up
until the V5 in spring 99. John Carmack publicly lamented the fact in his
.plan that features he discussed with the company back when the V2 was in
development had YET TO SHOW UP in a 3dfx chipset when the company went bust
in 2000! I think that pretty much proves my point.

> The (PR, at least)
> position that they took with the V3 was that there were no games rendered in
> 32-bit color and that 32-bit was really too slow to be useful anyway, so why
> pay more for the extra, faster memory?

The chicken-and-egg syndrome. 3dfx was not evolving, hence software did
not evolve, hence no need for 3dfx - in their mind - to evolve... Stamp
"Voodoo" on it and people will automatically buy it. 3dfx vastly
overestimated the strength of their brand name, and when this vicious circle
finally was cracked by the rise of Nvidia, it brought down the entire
company. And for that I actually think we should all be thankful; we don't
need lazy bums clinging to past fame in the 3D chip biz...

> But what were they failing to include?

Like ANY of the big DX7 features (and some from DX6 too), including dot3 and
e.m. bumpmapping, cube mapping, pixel shaders (yes, these existed back in
DX7, if in a more primitive form than GF3, Radeon 8500), hardware T&L, just
to mention the most prominent. I might have overlooked something or other.

> AGP texturing, something that brought any GF256 with only
> 32mb of memory to a crawl?

How about even fundamental AGP support? 3dfx treats the AGP slot much like a
66MHz PCI interface, thus missing out not only on DiME (which nobody likes),
but also AGP2x/4x transfer speeds, sidebanding, fastwrites etc.

> The reviewers preached "future protection" on all
> this stuff, since it all had little or no current value, but did there end
> up being much validity in that? I don't think so.

People still buy lots of GF2s of various speed grades, and they run today's
software well. You don't think that is a sign of looking ahead,
future-proofing oneself? These things have basically been around since
autumn 98, that's not too shabby eh?

Of course, clock speed has increased and GF2 does a few things better than
the original GF256, so it's not EXACTLY the same chip.

> Yes? What were they? Just name a handful on the market by the time of the V3
> release, 4/99.

You and I were talking about the market as it exists TODAY. My memory
doesn't stretch back to mid-ish 1999, I'm sorry to say. The original
Homeworld might be one of them, I'm not sure which API it runs in. Other
than that, you have to research this yourself. :-)

I'm not contending that there weren't many OGL games back then, what I'm
saying is that 3dfx contributed to this situation by trying to push Glide on
developers and dragging their heels at developing a full OGL implementation.
It doesn't exactly take a rocket scientist to see there is a connection
here...

> 3dfx wasn't so profitable that you could say their money wasn't their
> customer's money. The money they spent for R&D and manufacturing was their
> customer's money in the end.

No, it was THEIR money. Once the money stops being yours (when you buy
something), it automatically becomes theirs. Besides, 3dfx lived on borrowed
money for like six consecutive quarters (they posted big losses for a long
time before biting the dust), so in a way you're right - it WASN'T their
money after all. :-) But it wasn't their customers' cash, it was their
investors'...

> Some T-buffer features weren't worth much, but their FSAA implementation
> certainly was.

3dfx's FSAA could well have been implemented in another way, instead of
wasting cash on a feature they must have known would have limited impact.
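For reference, the core idea behind supersampled FSAA of the kind the V5 offered is simple: render several samples per output pixel and average them down. A toy ordered-grid sketch (3dfx's T-buffer actually used jittered/rotated sample positions, which this omits):

```python
# Minimal sketch of 4x supersampled anti-aliasing: render at 2x resolution
# per axis, then average each 2x2 block down to one output pixel.
# Toy ordered-grid version; real hardware jittered the sample positions.

def downsample_2x(img):
    """Average 2x2 blocks of a 2H x 2W grid of intensities into H x W."""
    h, w = len(img) // 2, len(img[0]) // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = (img[2*y][2*x] + img[2*y][2*x+1] +
                 img[2*y+1][2*x] + img[2*y+1][2*x+1])
            out[y][x] = s / 4.0
    return out

# A hard black/white edge at double resolution...
hi_res = [[0, 0, 1, 1],
          [0, 0, 1, 1],
          [0, 1, 1, 1],
          [0, 1, 1, 1]]
# ...comes out with intermediate grays along the edge: [[0.0, 1.0], [0.5, 1.0]]
print(downsample_2x(hi_res))
```

The averaging is what turns jagged edges into smooth gradients; the cost is rendering 4x the pixels, which is why speed suffered.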

> And FXTC, an open standard that they hoped MS would
> incorporate into DirectX, would have been very worthwhile if it was as
> superior
> to S3's TC as advertised.

However, it still is universally unsupported many years later, and the user
installed base was ZERO when it was released, so one must wonder whether that
really was the smartest way of spending money.

> > Would you mind very much listing the things Nvidia has developed that has
> > little or no practical application?
> The two biggest things associated with nVidia are hardware T&L and the
> programmable shaders stuff in the GF3s.

EEEEEE!
I believe you confuse "limited software support" with "little or no
practical application". Besides, T&L is well supported these days; there is
a page on Nvidia's site that lists titles, and it is quite extensive.
Virtually all new software uses the DX or OGL transformation pipelines, which
means automatic support of T&L.
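For the record, the per-vertex work that hardware T&L offloads from the CPU is just a matrix transform plus a lighting dot product. A toy sketch of that stage, no real API assumed:

```python
# The per-vertex work of a "T&L" stage: a 4x4 matrix transform and a
# Lambertian lighting term. Hardware T&L moves exactly this math off the
# CPU. Toy sketch only; no real graphics API is used here.

def transform(m, v):
    """Apply a 4x4 row-major matrix to a homogeneous vertex [x, y, z, w]."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def diffuse(normal, light_dir):
    """Lambertian diffuse term: clamp(N . L, 0)."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(d, 0.0)

translate_x = [[1, 0, 0, 2],
               [0, 1, 0, 0],
               [0, 0, 1, 0],
               [0, 0, 0, 1]]

print(transform(translate_x, [1, 0, 0, 1]))  # [3, 0, 0, 1]
print(diffuse([0, 0, 1], [0, 0, 1]))         # 1.0 (light head-on)
```

When a game uses the DX or OGL transformation pipeline, this work lands on the T&L unit automatically; a game doing its own vertex math keeps it on the CPU, which is the Unreal-engine exception being discussed.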

The only widespread gaming engine NOT supporting either is the Unreal
(Tournament) engine, but I think all games licensed for that engine have been
released now, so we're finally through with that crap. One exception: Duke
Nukem Forever (Waiting :-) ), and that title was retrofitted with T&L and
possibly pixel shader support according to the devs.

So I don't agree with your assessment at all.

And besides, one simply HAS to release a product with support for feature X
in order to generate software support for it. Doing it 3dfx-loser-style
claiming there is no need because no games use it is completely backward.
Thinking like that will make sure it NEVER happens! Or you go bankrupt
because customers flock to some other manufacturer which is far less
backward and inept than you are...

> While 3dfx sold shitloads of V3s anyway

But those "shitloads" didn't help them in the end. The company was in the
red the entire time the V3 was on the market; they were constantly BLEEDING
MONEY! And then they went the way of the dodo, so clinging to the idea it
was a smart move not to move with the times - lack of software support
for new ooh-aah crowd-pleasing features or not - is hardly the smartest thing
anyone could do, I would think!

> 5) RAMDAC of at least 250mhz. This covers all new boards, but fails to
> mention that the V3's RAMDAC is faster than the TNT's.

He also failed to mention that monitors supporting such a pixel clock cost
as much as a good computer system back then, and virtually nobody owned such
a beast outside a professional CAD studio or such... No point in owning a
Ferrari that does 330kph if the speed limit is only a third of that. ;-)
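His point is checkable: the pixel clock a RAMDAC must sustain is roughly width x height x refresh rate x a blanking-overhead factor. The ~1.35 overhead used below is a typical CRT timing figure, for illustration only:

```python
# Why a 250+ MHz RAMDAC only matters at high resolution and refresh:
# required pixel clock ~= width * height * refresh * blanking overhead.
# The 1.35 overhead factor is a typical CRT timing figure (illustrative).

def pixel_clock_mhz(width, height, refresh_hz, overhead=1.35):
    return width * height * refresh_hz * overhead / 1e6

print(f"{pixel_clock_mhz(1024, 768, 85):.0f} MHz")   # ~90 MHz: trivial in 1999
print(f"{pixel_clock_mhz(1600, 1200, 85):.0f} MHz")  # ~220 MHz: needs a fast RAMDAC
```

At the 1024x768-ish resolutions most 1999 monitors ran, any of these RAMDACs was more than enough, which is exactly the Ferrari-and-speed-limit point.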

> In any case, this could have been an advertisement
> written by nVidia.

Well, who won in the end?

And I don't just mean sales- and money-wise. Do we still play titles in
all-16-bit with small textures, no bumpmapping etc, or...? It was an
intellectual victory as much as anything else, showing that progress is
indeed the way to go.

> Every single mention of
> the V3 was negative, and every mention of the TNT2 positive.

You don't ACTUALLY mean to imply it's better to own a videocard that does
not support feature X, even if software support still is in its infancy for
said feature? It almost sounds like that's what you mean.

> or just paid off by nVidia.

"Ooh, Nvidia killed 3dfx!"

"You bastards!"

Drop the conspiracy-theory hysteria, please. It's utterly preposterous
to suggest something like that, especially with zero evidence to back it
up...

> how it didn't make all that much sense to spend all
> that money on a card that wasn't really faster than the GF2 Ultra

But that changed later with new drivers. Yes sirree bob, that it did. And
new stuff is always more expensive in the beginning, you know that. Even if
Tom Pabst tells you to go out and get a GF19 RIGHT NOW even though it costs
$1200, you wouldn't actually do it would you? You do have a brain of your
own don't you, capable of independent thought?

> So what has nVidia brought to the table that has made a big difference in 3D
> gaming or anything else?

An extremely successful series of 3D accelerators. Isn't that enough,
really?

> they copied ATi's HyperZ

No they didn't! Ask me why and I'll tell you.

> the DX8 stuff hasn't really happened yet,
> they've done little to enhance 2D or 3D image quality.

I could write a coupla paragraphs on this subject, but I'll pass. Everything
that needs saying has already been said on this subject, and repeating it
again would just be tedious and annoying.

> But after the TNTs,
> they did lead the way in producing higher and higher priced cards, topping
> out with the $500 GF2 Ultras. Sure, they've made some chips with very nice
> architecture that were very fast, but that's not much of a legacy
> considering their exalted position in the minds of many and the price of
> those cards.

I can't take that very seriously at all, you simply come across as a sore
loser refusing to give credit where it's due when you say things like that.
Geez, it's just a goddamn videocard alright! "Legacy", my butt! What good is
3dfx's legacy now, with nothing left of the company? Just managing to stay
in the top spot for ...


Mark Nusbaum

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Mark Nusbaum » Thu, 24 Jan 2002 17:09:35

I think you're mostly right about this, but your characterization of what
happened with 3dfx is a little too simplistic. They were a small specialty
company that built add-on PCI 3D-only hardware for gaming and built and
maintained an API to help make this happen. It was apparent that they'd have
to make a move to AGP-based 2D/3D cards in order to grow beyond this, or
even to maintain their position - the future of high-end graphics was on the
AGP bus, which meant incorporating 2D, and other companies were beginning to
make inroads into 3D from that position. nVidia was the principal
competition, and the Riva TNT established them as a high-end OEM presence as
well as a company that could make competent 3D hardware. So 3dfx put out the
Banshee, which was a middling effort and really just a PCI card using the
AGP bus, but not a bad start. Then they followed it with the similar but
faster Voodoo3, a great real-world, bang-for-the-buck card - $130-180 SRP
for V2 SLI-level 3D performance and great 2D. But at the same time they were
dealing with moving their boardmaking in house, were trying to develop an
OEM base, Glide was dying, probably erred in putting together their 3500
flagship card, and weren't focusing enough on their next generation, which
had to incorporate all those existing features found in competing cards and
add new ones and even more speed.

Meanwhile nVidia was putting out the TNT2, which really wasn't anything very
new either, but it was much more at home in the AGP slot and the press
jumped all over 3dfx once it became apparent that it could keep pace with
the V3 in 16-bit benchmarks and could provide playable 32-bit. Then 3dfx
wasn't doing anything right in their eyes, as I said in another post in this
thread. The coup de grace was the GeForce, which was rushed out to follow up
on the TNT2 momentum. The press acted like its hardware T&L was the biggest
thing since the invention of 3D, and who could tell since there wasn't
anything to actually run on it? Anyway, the actual hardware trickled out,
starting with SDR, then DDR, then 64mb of it months later. It almost wasn't
so much a piece of hardware as a symbol, and the press said if you didn't
have T&L you might as well box up your PC and start watching TV in six
months. The opposition didn't really help themselves at that point - the
Rage Fury Maxx was too slow and flawed, the Savage 2000 sounded good but
didn't deliver and then S3 was gone, Matrox was silent, and the announced
Voodoo5 didn't show up at all.

By the time the V5 did show, nVidia was ready for the GF2, which was the
real deal. It was faster, it had enhanced features, the drivers had matured,
and you could actually buy one. The too-little, too-late V4/5 didn't stand
much of a chance, and it hardly was noticed that T&L games hadn't really
showed yet. 3dfx was done already, and the 6000 never made it out of the
lab. But the death knell had rung the generation before, when 3dfx couldn't
make the transition from a boutique to a mainstream operation. That the GF2
GTS was very expensive didn't matter much - nVidia had fully established
their name, so followed the aging TNT2 with the GF2 MX for mid-level buyers
and the OEMs to spread their base.

Since then nVidia has backed off their pace. The GF2 Ultra was just a faster
and more expensive card ($500!) and anyone could have pulled that off, the
GF3 came out a full year after the GF2 and didn't represent a quantum
performance leap, and the Ti thing is just a juggling of speeds and prices
to make it look like they've done something. In comparison, ATi increased
the speed of their flagship Radeon64 Vivo last year and didn't even mention
it - with nVidia that would have been a product cycle! The reality is that
nVidia hasn't made massive progress in the last two years, much less than
the two years prior to that. The competition hasn't done that well because
it's mostly gone - 3dfx and S3 out of business, and Matrox quietly having
withdrawn from the high-end sector after dabbling with one card. But ATi has
come farther in those two years than nVidia, having moved from the Rage
128-based RF Maxx to the Radeon 8500, every bit the card that the GF3 Ti 500
is and perhaps more. But they haven't gotten the driver and customer
relations stuff together the way they should have for this market, and have
lost a lot of ground to nVidia in the OEM, mobile and Mac sectors.

So nVidia made their name by defeating 3dfx, a victory largely based on
flashy features that wowed the press, and on ground that belonged to nVidia
more than 3dfx. They have parlayed that to a broad base that now goes beyond
PC video cards. But the image of them as this technological steamroller that
cranked out new and better cards every six months is a little inaccurate, I
think.


chris

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by chris » Thu, 24 Jan 2002 23:43:35


>> in the 3dfx-nVidia comparisons".  No such "flip-flop" occurred.  All

>SO...you are saying that 3dfx sheep have never put speed over quality?
>[Oh...please say yes.....hehe]

3dfx's 3D "quality" has always been "fine" compared to the
competition.  PLENTY good for gaming.   What IS VERY important is for
a video card to produce quality 2D, something that the nVidia crowd
likes to forget.

2D.  Get it, moron?  2D.

chris

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by chris » Thu, 24 Jan 2002 23:45:16



>>> They weren't converted at the time that Glide was in development.

>>...and your point would be...?

>Why did you snip my entire post, loser?

Because he's an idiot, and proud of it.  He doesn't seem to mind that
everyone knows this, so that everyone gives exactly zero weight to any
of his opinions on the issues.
Gonz

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Gonz » Fri, 25 Jan 2002 01:31:59

All this talk about 3Dfx "holding back" the 3D market/development is a joke
IMO.  Where exactly is the advancement since 3Dfx is now gone?  Let's see,
the industry now wants you to pay $300+ to get decent frame rates in newer
and still in development games with all the eye candy turned up.  IMO we
would have better and faster advancement had 3dfx stayed in business.

Where's the advancement?...

I don't see it yet.


rec.autos.simulators is a usenet newsgroup formed in December, 1993. As this group was always unmoderated there may be some spam or off topic articles included. Some links do point back to racesimcentral.net as we could not validate the original address. Please report any pages that you believe warrant deletion from this archive (include the link in your email). RaceSimCentral.net is in no way responsible and does not endorse any of the content herein.