Neither here nor there anyway. 3DFX went down because they ***ed their
entire distribution channel. Anyone could have seen that one coming a mile
away. The market doesn't respond well to a good ***ing.
Go figure,
C//
You think drivers can't impact 2D quality?
Hehe....
Uh..I've been busy...hehe
1024 not big enough on a 17"? Are you ok?
Seek professional help, ok...hehe
Yeah.....I have seen no such documentation.
Please share...hehe
Gee.....wonder why you are having issues with drivers...hehe
Maybe you just don't know how to install them properly...hehe
Yeah.....that's why they are #1
Wow....hehe
Yeah...and that's why 3dfx is the company it is today....doh
Hehe...
No.....that's dot pitch...I meant DPI.....
Yada..yada...hehe
GF3 Ti 500.....22" Viewsonic...I would call that decent, wouldn't you?
See what happens when you assume...hehe
Of course you must be used to assuming the position...hehe
Yawn...hehe
I take it you've been there a few times?
Hehe....
Yeah..I see your facts...
Green giant?
Hehe.....
Actually, the V5 didn't come out until spring '00. I don't think 3dfx was
really holding much of anything back in early '99 with the V3. It was
essentially a refresh of the Banshee, but with somewhat better features and
much more speed. What it gave people was very good performance by the
standards of the time - 16-bit color, smaller textures, etc. The hope back
then, for the games people could actually play, was the Quake2-at-60-80-fps-
at-1024x768-in-16-bit kind of thing, which really wasn't close to possible
before the V3 unless you had a $600 V2 SLI setup - a Banshee did 30, and a
Rage Fury or TNT did 40 in 16-bit. This demand for features not yet
practical or usable was something new. Was it really worth paying $50-70
more for rather slow 32-bit performance in 16-bit games, AGP texturing at 10
fps, and some vague hope of "future protection"? The TNT2 Ultra could match
a V3 3k in 16-bit, but 32-bit put it back at that 40 fps level, and I'd bet
the V3's "22-bit" looked better than the TNT2's 16-bit while costing very
little performance. Anyway, a very real question, I think.
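(Aside, since "22-bit" gets thrown around a lot: the figure came from 3dfx
post-filtering its dithered 16-bit output. Here's a rough C sketch - NOT
3dfx's actual filter, just an illustration of unpacking RGB565 pixels and
box-averaging neighbors to recover the shades the dither pattern encoded:)

    #include <stdint.h>

    /* Unpack a 16-bit RGB565 pixel into 8-bit-per-channel values. */
    static void unpack565(uint16_t p, int *r, int *g, int *b)
    {
        *r = (p >> 11) & 0x1F;        /* 5 bits of red   */
        *g = (p >> 5)  & 0x3F;        /* 6 bits of green */
        *b =  p        & 0x1F;        /* 5 bits of blue  */
        *r = (*r << 3) | (*r >> 2);   /* expand each channel to 8 bits */
        *g = (*g << 2) | (*g >> 4);
        *b = (*b << 3) | (*b >> 2);
    }

    /* Average a 2x2 block of dithered 16-bit pixels. The blend recovers
     * intermediate shades the dither encoded, which is the gist of how
     * a post-filter can look better than raw 16-bit. */
    static void filter2x2(const uint16_t px[4], int *r, int *g, int *b)
    {
        int rs = 0, gs = 0, bs = 0;
        for (int i = 0; i < 4; i++) {
            int ri, gi, bi;
            unpack565(px[i], &ri, &gi, &bi);
            rs += ri; gs += gi; bs += bi;
        }
        *r = rs / 4; *g = gs / 4; *b = bs / 4;
    }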
I doubt that 3dfx's omission of some of these theoretically necessary future
features did much to hold software back. The hardware of late '98 and early
'99 couldn't support the games already out there - for instance, you almost
had to have an SLI rig to do Unreal justice at its release, and even that
just meant 16-bit at 10x7. Some of this advanced stuff would have brought
any pre-GF generation card to a crawl, so raw speed was what was necessary.
And nVidia played the speed game as well as anyone - that's why they still
work so hard to maximize 16-bit framerates, something ATi doesn't do.
There's no reason that hardware and software can't be developed in concert -
you don't have to have the hardware out on the market first. Games were
clearly going to move to higher color depths, larger textures and greater
poly counts, as they already had been, and the V3s delivered on that level -
they made the stuff already out there playable, so the game designers could
keep moving forward. The fact that a TNT2 or a dog-slow Rage 128 could
technically support 32-bit color and 2048x2048 textures didn't signal game
designers that they could start using those features - they would do so
anyway, as soon as hardware was fast enough to make it worthwhile, and 3dfx
was making plenty of noise on that front.
And we aren't better off with 3dfx gone, just as we're worse off with
Matrox out of the 3D game, or S3 gone. The more competition and the more
heads doing the R&D, the more advancement gets made. Only an nVidia
stockholder could see it otherwise.
Game design has stagnated, as witnessed by the continued use of Q3 - a
late-'99 release - as a benchmark. The GF2 was a mid-'00 release, so it
isn't that old, and that it runs today's games just fine in comparison to
the GF2 Tis says something about the slowed development of nVidia's hardware
as well. Funny that game development has stagnated just when, according to
you, all of nVidia's advancements should have been pushing it forward.
Hmm...
I read somewhere recently, in an article about DX8 programmable shaders,
that hardware T&L hadn't gone over very big with game designers because of
its limitations and inflexibility. I don't know which engines use it and
which don't, or more importantly to what extent, but I certainly don't see
a revolution of anywhere near the magnitude the hardware press and nVidia
predicted at the release of the GF256. I think it's mostly the changes to
the basic hardware design and clock speeds in the GFs and Radeons that
account for their greater capability compared to the '99 card class - it's
the increases in fill rate and memory bandwidth that matter, just like in
the old days. "You've got a Pentium3 600? You're good for the next five
years" - that was the claim. Well, reviewers have all but stopped doing CPU
scaling benchmarks, but not because it no longer matters.
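(To put numbers on the fill-rate point, here's a back-of-envelope C snippet.
The fill rate and overdraw figures are made-up-but-plausible assumptions,
not specs for any particular card; the point is just that resolution times
depth complexity divided into fill rate sets the ceiling:)

    #include <stdio.h>

    int main(void)
    {
        double fill_mpix = 333.0;   /* hypothetical fill rate, Mpixels/s */
        double width = 1024.0, height = 768.0;
        double overdraw = 3.0;      /* assumed average depth complexity  */

        /* Purely fill-rate-limited ceiling; real fps comes out lower
         * once memory bandwidth and the CPU get their say. */
        double pixels_per_frame = width * height * overdraw;
        double fps = (fill_mpix * 1e6) / pixels_per_frame;
        printf("fill-rate-limited ceiling: %.0f fps\n", fps);  /* ~141 */
        return 0;
    }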
Yes I do, and that's why I returned my 32MB, 32-bit, AGP-capable,
large-texture-supporting, OpenGL-ICD $200 Rage Fury (we all make mistakes :)
in early '99 and replaced it with an old-school $180 Voodoo3 3000 instead of
a $250 TNT2 Ultra crammed with future protection, and then sat down and
played Unreal and Grand Prix Legends to my heart's content. And then I
waited until I could spend the same $180 on a Radeon64 DDR Vivo early last
year to abandon that archaic, development-stifling card. Well, not entirely
- I still rev up my old P2 400 rig (now mostly my Linux box) now and again
and slap in GPL for a few quick laps...
I wonder what I can get next for that $180? Probably a Radeon 8500, since
nVidia isn't likely to be selling their top stuff for that kind of price
anytime soon. But I have to hand it to nVidia for reversing their
traditional pricing pattern with the Ti's, and hope this is a sign of things
to come. If they can bring ATi to their knees next, that certainly won't
continue, though. Think about it.
> > Too bad 3dfx focused on making fast cards with 3D
> > features that only worked on games one could actually play - and ALL
> > games one could actually play...
> Here's another guy that believes NOT having features is better than having
> them.
> What's with these people anyway, are they on *** or something? :-) The day
> a piece of software comes along that interests you and uses cube environment
> maps or dot3 bumpmapping for example, wouldn't it be great if your videocard
> actually was equipped to handle it - unlike the V5 and previous
> incarnations??? Geez!
Dunno about lemming, but if the guy wants to post stuff and not expect to
get responses, then he shouldn't post. But as I already said, he knew
exactly what sort of response he was going to get - hence the "I don't
care what anybody says" line in his post.
Yeah, right, "real world". If you think all this BS and
my-***'s-better-than-your-disk BS is the real world, then boy do you need
help.
Personally, I think you should buy whatever makes YOU happy and take
absolutely NO notice of whatever anybody else says or does. It's your $$$ -
spend them how you like. Do your own research; if you can, do your own
side-by-side comparisons and come to your own conclusions. If you take all
"reviews" with a pinch of salt, then you won't go far wrong.
My opinion of the comparison done on said webpage is this: the author very
obviously cherry-picked the numbers from his tests to give the answers he
needed to put forward his own personal tastes/opinions. Anybody could quite
easily do other tests that would make it seem like NOT upgrading from a V5
to a GF3 is the equivalent of suicide. I wouldn't question the numbers he
obtained, as I haven't done the same tests, but the slant on the review is
so blatant that a 5-year-old could see it coming.
Again I say: anybody who buys anything relying on somebody else's opinion,
without doing some legwork of their own, is a fool and deserves to end up
with a pile of junk.
I don't think he cares. If he didn't want a "flame war", he wouldn't have
posted in the first place. Egos just love attention, and they really
don't care about the source.
When I got my Radeon, I was shocked to see how bad straight 16-bit
color looked. GROSSLY inferior to the 22/16-bit of my V3...
Ironically, even though the Voodoo had better looking graphics in
practice (because 32-bit was just too slow), it still took knocks from
the nVidia children, just graduating from their Super Nintendos, just
because Voodoo didn't have 32-bit color in the check-off box.
Exactly. A stupid, immature market was there for the FUDing.
>>> And I AM running at 1280x1024 on a 19" monitor.
>>At what DPI?
>.26, I think. KDS VS19SN.
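(For what it's worth, .26 is the dot pitch in millimeters, not DPI - which
was the point earlier. DPI follows from the resolution and the viewable
diagonal. A quick C sketch; the 18" viewable figure is my assumption for a
19" CRT, not a looked-up KDS VS19SN spec:)

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double w = 1280.0, h = 1024.0;         /* desktop resolution        */
        double viewable_in = 18.0;             /* assumed viewable diagonal */
        double diag_px = sqrt(w * w + h * h);  /* screen diagonal in pixels */
        printf("approx. %.0f DPI\n", diag_px / viewable_in);  /* ~91 DPI */
        return 0;
    }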
>What's with these people anyway, are they on *** or something? :-) The day
>a piece of software comes along that interests you and uses cube environment
>maps or dot3 bumpmapping for example, wouldn't it be great if your videocard
>actually was equipped to handle it - unlike the V5 and previous
>incarnations??? Geez!
> >Third, The V3 had 22bit color not 16 and the V5 had 32bit color before it
> >went under so big deal.
> Yes, and 3dfx's 22/16 bit color was virtually indistinguishable from
> 32-bit. Certainly VASTLY better looking (and just as fast) than the
> straight 16-color that the nVidia crowd was forced to use (for
> performance reasons) in that time frame.
> When I got my Radeon, I was shocked to see how bad straight 16-bit
> color looked. GROSSLY inferior to the 22/16-bit of my V3...
> Ironically, even though the Voodoo had better looking graphics in
> practice (because 32-bit was just too slow), it still took knocks from
> the nVidia children, just graduating from their Super Nintendos, just
> because Voodoo didn't have 32-bit color in the check-off box.
I think it's ironic that you can still read postings from GF3 owners having
frame rate problems in MOHAA and RTCW. Makes you wonder if our OLD cards are
really the problem at all.
If people are having framerate problems with RTCW...maybe they should look
to see what their system is lacking instead of blaming the GF3......
A computer system needs balanced components, and throwing a GF3 into a
system that can't really handle it is folly....but you would know
about that.....(Kyro...lol)...hehe