"Mark Nusbaum" <mark.nusb...@worldnet.att.net> wrote:
> Are you saying the Voodoo2 was a technically outdated product at its
> release? Or the original Voodoo board?
Naturally, you would pick the two groundbreaking products, avoiding all the
turkeys that came later... To quote Zod the idiot child: "hehe". :-)
Rush was maybe not outdated, but at least...hm...poorly thought out.
Banshee, outdated. V3, outdated. V4/5, outdated. They either did not advance
3D graphics at all, or at best caught up with the competition as it existed two
product generations earlier - with the notable exception of FSAA, which, even
though it is exceptional even today quality-wise (if not speed-wise), does not
change the overall picture.
> there was very little about the V3 that held anything back.
Of course not, since they were still the market leaders, and developers aimed
at Voodoo-level performance. That meant 16-bit everything and blurry 256*256
textures. That's exactly what I meant by holding back the revolution, since
3dfx refused to change with the times feature-wise - apart from a few
insignificant details - from the first Voodoo Graphics in 96, all the way up
until the V5 in 2000. John Carmack publicly lamented in his .plan that
features he discussed with the company back when the V2 was in development had
YET TO SHOW UP in a 3dfx chipset when the company went bust in 2000! I think
that pretty much proves my point.
> The (PR, at least)
> position that they took with the V3 was that there were no games rendered in
> 32-bit color and that 32-bit was really too slow to be useful anyway, so why
> pay more for the extra, faster memory?
The chicken-and-egg syndrome. 3dfx was not evolving, hence software did
not evolve, hence no need for 3dfx - in their mind - to evolve... Stamp
"Voodoo" on it and people will automatically buy it. 3dfx vastly
overestimated the strength of their brand name, and when this vicious circle
was finally broken by the rise of Nvidia, it brought down the entire
company. And for that I actually think we should all be thankful; we don't
need no lazy bums clinging to past fame in the 3D chip biz...
> But what were they failing to include?
Like ANY of the big DX7 features (and some from DX6 too): dot3 and
environment-mapped bump mapping, cube mapping, pixel-shader-like texture
combining (yes, a primitive precursor of pixel shaders existed back in DX7,
far more limited than on the GF3 or Radeon 8500), and hardware T&L, just
to mention the most prominent. I might have overlooked something or other.
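Just to illustrate, here's roughly what a dot3 bump mapping setup looks like
with DX7 texture stage states. This is a sketch from memory, not from any
particular game; the function name and parameters are mine, only the API calls
are from the DX7 SDK. A chip without dot3 in hardware simply can't run a path
like this:

#include <d3d.h>

// Hypothetical sketch of a DX7 dot3 bump mapping setup (IDirect3DDevice7).
// Stage 0 computes N.L per pixel from a normal map and a light vector packed
// into the vertex diffuse color; stage 1 modulates the result with the base
// texture. Cards lacking dot3 support fall back to flat, unbumped lighting.
void SetupDot3BumpMapping(IDirect3DDevice7* device,
                          IDirectDrawSurface7* normalMap,
                          IDirectDrawSurface7* baseTexture)
{
    // Stage 0: per-pixel dot product between the normal map texel and the
    // light vector stored in the diffuse color.
    device->SetTexture(0, normalMap);
    device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
    device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);

    // Stage 1: modulate the lighting term with the base texture.
    device->SetTexture(1, baseTexture);
    device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    device->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);
}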
> AGP texturing, something that brought any GF256 with only
> 32mb of memory to a crawl?
How about even fundamental AGP support? 3dfx treated the AGP slot much like a
66MHz PCI interface, thus missing out not only on DiME (which nobody likes
anyway), but also on AGP 2x/4x transfer rates, sideband addressing, fast
writes, etc.
> The reviewers preached "future protection" on all
> this stuff, since it all had little or no current value, but did there end
> up being much validity in that? I don't think so.
People still buy lots of GF2s of various speed grades, and they run today's
software well. Don't you think that is a sign of looking ahead, of
future-proofing oneself? These things have basically been around since the
GF256 in autumn 99 - not too shabby, eh?
Of course, clock speed has increased and GF2 does a few things better than
the original GF256, so it's not EXACTLY the same chip.
> Yes? What were they? Just name a handful on the market by the time of the V3
> release, 4/99.
You and I were talking about the market as it exists TODAY. My memory
doesn't stretch back to mid-ish 1999, I'm sorry to say. The original
Homeworld might be one of them; I'm not sure which API it uses. Other
than that, you'll have to research this yourself. :-)
I'm not disputing that there were few OGL games back then; what I'm
saying is that 3dfx contributed to that situation by trying to push Glide on
developers and dragging their heels on a full OGL implementation.
It doesn't exactly take a rocket scientist to see the connection
here...
> 3dfx wasn't so profitable that you could say their money wasn't their
> customer's money. The money they spent for R&D and manufacturing was their
> customer's money in the end.
No, it was THEIR money. Once the money stops being yours (when you buy
something), it automatically becomes theirs. Besides, 3dfx lived on borrowed
money for like six consecutive quarters (they posted big losses for a long
time before biting the dust), so in a way you're right - it WASN'T their
money after all. :-) But it wasn't their customers' cash, it was their
investors'...
> Some T-buffer features weren't worth much, but their FSAA implementation
> certainly was.
3dfx's FSAA could well have been implemented in another way, instead of
wasting cash on a feature they must have known would have limited impact.
> And FXTC, an open standard that they hoped MS would
> incorporate into DirectX, would have been very worthwhile if it was as
> superior
> to S3's TC as advertised.
However, it is still universally unsupported many years later, and the user
installed base was ZERO when it was released, so one has to wonder whether
that really was the smartest way of spending money.
> > Would you mind very much listing the things Nvidia has developed that has
> > little or no practical application?
> The two biggest things associated with nVidia are hardware T&L and the
> programmable shaders stuff in the GF3s.
EEEEEE!
I believe you confuse "limited software support" with "little or no
practical application". Besides, T&L is well supported these days; there is
a page on Nvidia's site that lists titles, and it is quite extensive.
Virtually all new software uses the DX or OGL transformation pipelines, which
means T&L support comes automatically.
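To illustrate, here's a minimal OpenGL sketch (not from any particular game,
all the numbers are arbitrary): any title that hands its matrices and vertex
data to the API like this, instead of transforming vertices on the CPU, gets
hardware T&L for free the moment the driver and chip support it - the code is
identical either way.

#include <GL/gl.h>

// Minimal fixed-function transform-and-lighting setup. Whether the transform
// and lighting run on the CPU (V3-class hardware) or on the GPU (GF256 and
// later) is entirely up to the driver; the application code does not change.
void DrawMesh(const float* vertices, const float* normals,
              const unsigned short* indices, int indexCount)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glFrustum(-1.0, 1.0, -0.75, 0.75, 1.0, 1000.0);   // arbitrary frustum

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(0.0f, 0.0f, -10.0f);                 // arbitrary camera offset

    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);                              // fixed-function lighting

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, vertices);
    glNormalPointer(GL_FLOAT, 0, normals);
    glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_SHORT, indices);
}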
The only widespread gaming engine NOT supporting either is the Unreal
(Tournament) engine, but I think all games licensed on that engine have been
released now, so we're finally through with that crap. One exception: Duke
Nukem Forever (waiting :-) ), and that title has been retrofitted with T&L and
possibly pixel shader support, according to the devs.
So I don't agree with your assessment at all.
And besides, one simply HAS to release a product with support for feature X
in order to generate software support for it. Doing it 3dfx-loser-style,
claiming there is no need because no games use it, is completely backward.
Thinking like that will make sure it NEVER happens! Or you go bankrupt
because customers flock to some other manufacturer which is far less
backward and inept than you are...
> While 3dfx sold shitloads of V3s anyway
But those "shitloads" didn't help them in the end. The company was in the
red the entire time the V3 was on the market; they were constantly BLEEDING
MONEY! And then they went the way of the dodo, so clinging to the idea that it
was a smart move not to move along with the times - lack of software support
for new ooh-aah crowd-pleasing features or not - is hardly the smartest thing
anyone could do, I would think!
> 5) RAMDAC of at least 250mhz. This covers all new boards, but fails to
> mention that the V3's RAMDAC is faster than the TNT's.
He also failed to mention that monitors supporting such a pixel clock cost
as much as a good computer system back then, and virtually nobody owned such
a beast outside a professional CAD studio or the like... No point in owning a
Ferrari that does 330 km/h if the speed limit is only a third of that. ;-)
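For the curious, here's a back-of-the-envelope sketch of why (assuming roughly
30% blanking overhead, which is typical of standard CRT timings; the mode is
just an example I picked):

#include <cstdio>

// Rough pixel clock needed for a given display mode, assuming ~30% blanking
// overhead on top of the visible pixels. Illustrative numbers only.
int main()
{
    const double blankingOverhead = 1.30;
    const int width = 1600, height = 1200, refreshHz = 85;

    double pixelClockMHz = width * height * refreshHz * blankingOverhead / 1e6;
    std::printf("%dx%d @ %d Hz needs roughly %.0f MHz of pixel clock\n",
                width, height, refreshHz, pixelClockMHz);
    // -> about 212 MHz, i.e. you only get anywhere near a 250+ MHz RAMDAC's
    //    limits on a big, expensive monitor running very high res/refresh.
    return 0;
}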
> In any case, this could have been an advertisement
> written by nVidia.
Well, who won in the end?
And I don't just mean sales- and money-wise. Do we still play titles in
all-16-bit with small textures, no bumpmapping etc, or...? It was an
intellectual victory as much as anything else, showing that progress is
indeed the way to go.
> Every single mention of
> the V3 was negative, and every mention of the TNT2 positive.
You don't ACTUALLY mean to imply it's better to own a videocard that does
not support feature X, just because software support for said feature is
still in its infancy? It almost sounds like that's what you mean.
> or just paid off by nVidia.
"Ooh, Nvidia killed 3dfx!"
"You bastards!"
Drop the conspiracy-theory hysteria, please. It's just utterly preposterous
to suggest something like that, especially with zero evidence to back it
up...
> how it didn't make all that much sense to spend all
> that money on a card that wasn't really faster than the GF2 Ultra
But that changed later with new drivers. Yes sirree bob, that it did. And
new stuff is always more expensive in the beginning, you know that. Even if
Tom Pabst tells you to go out and get a GF19 RIGHT NOW even though it costs
$1200, you wouldn't actually do it, would you? You do have a brain of your
own, don't you, capable of independent thought?
> So what has nVidia brought to the table that has made a big difference in 3D
> gaming or anything else?
An extremely successful series of 3D accelerators. Isn't that enough,
really?
> they copied ATi's HyperZ
No they didn't! Ask me why and I'll tell you.
> the DX8 stuff hasn't really happened yet,
> they've done little to enhance 2D or 3D image quality.
I could write a coupla paragraphs on this subject, but I'll pass. Everything
that needs to be said on it has already been said, and repeating it all again
would just be tedious and annoying.
> But after the TNTs,
> they did lead the way in producing higher and higher priced cards, topping
> out with the $500 GF2 Ultras. Sure, they've made some chips with very nice
> architecture that were very fast, but that's not much of a legacy
> considering their exalted position in the minds of many and the price of
> those cards.
I can't take that very seriously at all; you simply come across as a sore
loser refusing to give credit where it's due when you say things like that.
Geez, it's just a goddamn videocard, alright! "Legacy", my butt! What good is
3dfx's legacy now, with nothing left of the company? Just managing to stay
in the top spot for
...