And you're certainly not the first to find that.
To reply by email, remove the XYZ.
Lumber Cartel (tinlc) #2063. Spam this account at your own risk.
It's your SIG, say what you want to say....
> > > You're right, 3Dfx didn't hold back the market...because nVidia stepped in
> > > and took over. If you still don't see what nVidia has brought, i suggest
> > > you read Hans' post again.
> > Who?
> > nVidia must not have brought much since you do not seem to even want to
> > comment on it. Sometimes silence can be more revealing than words huh?
> Silence? Hardly...it's called, "why should i have to type it all out when
> someone else already has". But...since you've failed to read previous posts
> in this thread, i'll copy|paste a little portion of what was said...
> "Of course not, since they [3Dfx] were still the market leaders, and
> developers aimed at voodoo-level performance. That meant 16-bit everything
> and blurry 256*256 textures. That's exactly what I meant by holding back
> the revolution, since 3dfx refused to change with the times feature-wise
> apart from a few insignificant details from the first voodoo graphics in
> 96, all the way up until the V5 in spring 99. John Carmack publicly
> lamented the fact in his .plan that features he discussed with the company
> back when the V2 was in development had YET TO SHOW UP in a 3dfx chipset
> when the company went bust in 2000! I think that pretty much proves my
> point."
> Enter nVidia, who introduces 32 bit color, larger texture support and
> more...suddenly games start coming out with those features and are now
> mainstream. It is unfortunate that 3Dfx seemed to take the stance that they
> were invincible and refused to make much progress with their cards, even
> though they had some very good hardware. I think something even as simple
> as 32 bit color and larger textures would have helped them out tremendously.
> At this point, this thread has wandered so far off topic, it's pretty funny.
> lol. The point Pierre was making, i believe, still stands. For most people
> who have the V5, an upgrade isn't totally essential. It runs a decent
> portion of games adequately still. However, IMO, people coming from any
> other card would be foolish to buy a V5 over the current crop of cards.
First of all, ATI is the one that made the first real 32-bit color card with
the Rage 128, not Nvidia, and S3 had the first texture compression AFAIK, not
Nvidia.
Second of all, if you had bothered to actually read my posting you would
have understood that my question was in regards to the post-3Dfx era, and big
textures and 32-bit color are OLD news in that regard.
Third, the V3 had 22-bit color, not 16, and the V5 had 32-bit color before it
went under, so big deal.
So please try to tell me again what wonderful advancements Nvidia has given us
in this post-3dfx era besides $300+ price tags. I'll assume that you can do
that without quoting somebody else. I'll be waiting...
Hehe.....It could have been worse, I suppose (a V5....lol)
Sorry.....I have been around too long for that *** to wash...hehe
I really hated all those ATIs....all the different driver versions for the
different cards (pre-Rage Pro).
Dude......ok..I won't make any cracks about your dad's vision or your
mother's looks...hehe
Please...spare me the BS.......that is NOT normal......1152 on a 17"? Give
me a break...
Just like with the 23.11's....you're just plain wrong again....hehe
(hey...maybe clock yer GF3 down a little..hehe)
Oh really? Better check that again...hehe
Pseudo-22-bit.......dumbass....
Why don't you get your facts straight and try again, loser...hehe
.26, I think. KDS VS19SN.
To reply by email, remove the XYZ.
Lumber Cartel (tinlc) #2063. Spam this account at your own risk.
It's your SIG, say what you want to say....
Ok...
Perhaps not, but I do. History shows that the mass market
follows the leader. More is better.
So you buy crappy tools. Buy a decent graphics card and monitor!
Don't assume the rest of us have no class!
Work! Hey goober, I run 3200x1200 at work. The primary display
is my laptop's 15" and the secondary is a 20" CRT. It works
*very* well. Just because you haven't a clue, don't project your
ignorance on others.
I think you're ready for the funny-farm, where life is beautiful,
hehe-hoho...
No, a statement of fact.
Hoho.....
----
Keith
Perhaps DeanK can help you out? You won't have to worry about
your transfers and might make a small buck.
----
Keith
By the way, never did see you comment on that post I made in the iNHELL
newsgroup about the P4 Xeon being totally crushed, mauled and mangled by the
Athlon MP... I expected at least a "hehe" out of you, but not a peep. Seems
I shut you up GOOD that time, LOL!
You better not, I tell ya...! ;-)
What would you run on a 17" then? 1440 is rarely available and would be way
too small. 1280 is okay-ish, but unless it's a sharp tube things will get
fuzzy. 1024 isn't big enough. 1152 is a good compromise, I think.
Excuse me, plain wrong? The problems with the 23.11's are well documented on
the web, so shuddap. Just shuddap. Okay?
It runs steady as a rock at 255/515 now... Dunno what happened, maybe I had
sidebanding active last time I tried to find the sweet-spot and that tricked
me into believing the thing wouldn't o/c well... This generic GF3 rocks the
house, who needs to pay premium prices for brand-name *** when plain green
cards rule this much? :-D With better cooling I bet this could be increased
even more, maybe an orange orb and some Thermaltake ramsinks, and a slot
cooler... Mmm, that is certainly food for thought!
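In case anyone's curious what 255/515 actually works out to, a quick Python
calc. The 200MHz core / 460MHz memory stock clocks are my assumption
(typical for a vanilla GF3), not something stated in the post:

    # Percent overclock implied by 255/515 over assumed stock 200/460.
    stock_core, stock_mem = 200, 460
    oc_core, oc_mem = 255, 515
    print(f"core: +{(oc_core / stock_core - 1) * 100:.1f}%")  # +27.5%
    print(f"mem:  +{(oc_mem / stock_mem - 1) * 100:.1f}%")    # +12.0%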
Bye!
/HB.
Is there something you'd like to add to this group? Please do!
----
Keith
Well, I didn't say no progress, that would be a ridiculous claim. At its
release the GF3 was the best commercial card ever, but marginally so -
considering features, framerate, drivers, games that utilize its
capabilities, it wasn't that much better than a GF2 Ultra. But looking over
the two periods I was speaking of, nVidia went from the GF256 DDR to the GF3
Ti 500, but prior to that went from the Riva128 to the GF256 DDR. I think
that earlier period represented a much greater advancement than the last two
years. Just think about playing UT or Q3A on a Riva 128, then think about
playing RTCW or Max Payne on a GF256 - which is more viable? A GF3 Ti 500
running Quake3 can do 165 fps at 1024x768 and 32-bit on a 1.33GHz Athlon, a
GF256 in the same game, color depth and resolution could do 70 fps on a P3
550 - and a Riva 128 running Quake2 could do 20 fps at 640x480 and 16-bit on
a P2 300 (benchmarks courtesy of TH).
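Just to put numbers on that claim, here's a quick Python calc using those
TH figures (mind that the games and host CPUs differ per era, so the ratios
are rough at best):

    # Ratios of the fps figures quoted above - rough indicators only,
    # since each generation is measured in a different game on a
    # different CPU.
    riva128 = 20     # Quake2, 640x480, 16-bit, P2 300
    gf256 = 70       # Quake3, 1024x768, 32-bit, P3 550
    gf3_ti500 = 165  # Quake3, 1024x768, 32-bit, Athlon 1.33GHz

    print(f"Riva 128 -> GF256 DDR:   {gf256 / riva128:.1f}x")    # 3.5x
    print(f"GF256 DDR -> GF3 Ti 500: {gf3_ti500 / gf256:.1f}x")  # 2.4x

A 3.5x jump in the earlier period against 2.4x in the later one, which is
the point about the earlier period being the greater advancement.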
Translation: they did their homework, isolated themselves from their
customers' complaints by using 3rd-party board makers, built good, fast
chips that sold more on "gotta have, but can't really use" 3D features
touted by the press than their unquestionable *** capability, and their
higher prices, less impressive 2D, image quality and feature breadth didn't
matter much in the end. Too bad 3dfx focused on making fast cards with 3D
features that only worked on games one could actually play - and ALL games
one could actually play...
Wistful wishing. 3DFX screwed an entirely different pooch. They
wet their own bed, and then slept in it good and proper.
C//
What's with these people anyway, are they on *** or something? :-) The day
a piece of software comes along that interests you and uses cube environment
maps or dot3 bumpmapping for example, wouldn't it be great if your videocard
actually was equipped to handle it - unlike the V5 and previous
incarnations??? Geez!
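Since dot3 bumpmapping keeps coming up: the math behind it is just a
clamped per-texel dot product between a range-compressed normal map and the
light vector. A minimal Python sketch of the idea (my own illustration of
the technique, not anyone's actual driver code):

    # Dot3 bump mapping in a nutshell: per-texel N.L diffuse lighting,
    # with both vectors range-compressed into RGB the way DX7-class
    # hardware (GF256 and up - not the V5) expects them.
    def decode(rgb):
        # Map each 0..255 channel back to a -1..1 vector component.
        return tuple(c / 127.5 - 1.0 for c in rgb)

    def dot3(normal_rgb, light_rgb):
        n, l = decode(normal_rgb), decode(light_rgb)
        # Negative results are clamped: texels facing away go black.
        return max(0.0, sum(a * b for a, b in zip(n, l)))

    # Flat texel (normal straight out) lit head-on: fully bright, ~1.0
    print(dot3((128, 128, 255), (128, 128, 255)))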
No, these men are not from Mars, they must be from outside this solar system
entirely! ;-)
Bye!
/HB.