rec.autos.simulators

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

Hans Bergengren

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Hans Bergengre » Sat, 26 Jan 2002 03:36:58


> Well, that just depends, doesn't it?  If the features are FREE, in
> both dollars and time-to market, sure - bring them on!  But they're
> not free, are they.

Chris, how much is a GF3 Ti 200 in dollars and cents today? It's not THAT
much, is it?

If you find another video card that in your eyes presents a better value per
$, by all means go ahead and buy it. A quick search on Pricewatch shows
numerous entries around the $150 mark. Maybe you think that represents poor
value?

That is certainly a factor that should be considered when thinking of buying
a new video card. Otoh, even IF a certain feature will be under-utilized in
the lifespan of the proposed video card, your new video card would still run
your existing (and upcoming) library of titles faster and better than your
current one. With the exception of mebbe Deus Ex and some ancient Glide-only
titles which precious few people still play... ;->

 Bye!
/HB.

Hans Bergengren

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Hans Bergengre » Sat, 26 Jan 2002 03:43:57


> Good post.  Nice to read someone with a brain.

Subtle as always, ChrisV...

You go ahead and have a nice day too. :-/

 Bye!
/HB.

Hans Bergengren

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Hans Bergengre » Sat, 26 Jan 2002 03:29:36


> Yeah, like the guys who bought TNTs in late '98 and then got their copy of
> Quake3 a year later, slapped it in and cranked up the quality to utilize
> those detailed textures and 32-bit color, and then fragged away at 5 frames
> per second.

Uhhh... Not exactly, no.

Tell me which feature will make today's video cards frag away at 5 fps,
please. You aren't really painting a true picture of the situation here,
dude. Things have happened since the TNT was released; mebbe it's time for
die-hard 3dfx fans to realize that?

Here we see it again, sore loser syndrome. ;-)

<sarcasm>Otoh, 3dfx was a fair company, THEY DECIDED FOR YOU when a
particular feature was "needed" and released a card supporting that feature
years and years after the competition did. Yeah, that's a much better
approach than that Evil Nasty Nvidia. If only all companies competed fairly
by making sure they lagged behind 3dfx, I'm sure the state of the gaming
market would be MUCH BETTER today...!</sarcasm>

I have no idea what people you're referring to, but one has to remember that
everyone is responsible for making their own decisions (and doing proper
research in order to make the right one for them).

Not sure what you're talking about. I got my GF3 in November, it wasn't
particularly new by then, it did not cost $400 either, that's just wishful
thinking on your behalf. Nor have I preached the virtues of getting $400
video cards, that's just another example of wishful thinking on your behalf.

Seems the only defense left for the aging 3dfx fanboys is accusing others
of mindlessly and needlessly buying $3-400 (and sometimes more) video cards,
that's just SO pathetic.

Get real, will ya!

 Bye!
/HB.

Never anonymous Bu

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Never anonymous Bu » Sat, 26 Jan 2002 05:41:10



Not even close.

I had a choice of about 10 units in my price range, the KDS looked best
and cost almost $90 more than the lowest priced monitor I saw.

I could have spent twice as much, and gotten a noticeably better unit, but WHY??

This is a home system, not something I'm doing CAD or photoshop on all day long.

To reply by email, remove the XYZ.

Lumber Cartel (tinlc) #2063. Spam this account at your own risk.

It's your SIG, say what you want to say....

Tim Mise

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Tim Mise » Sat, 26 Jan 2002 08:38:26

Wow, that's a great post.

-Tim


> I think you're mostly right about this, but your characterization of what
> happened with 3dfx is a little too simplistic. They were a small specialty
> company that built add-on PCI 3D-only hardware for *** and built and
> maintained an API to help make this happen. It was apparent that they'd have
> to make a move to AGP-based 2D/3D cards in order to grow beyond this, or
> even to maintain their position - the future of high-end graphics was on the
> AGP bus, which meant incorporating 2D, and other companies were beginning to
> make inroads into 3D from that position. nVidia was the principal
> competition, and the Riva TNT established them as a high-end OEM presence as
> well as a company that could make competent 3D hardware. So 3dfx put out the
> Banshee, which was a middling effort and really just a PCI card using the
> AGP bus, but not a bad start. Then they followed it with the similar but
> faster Voodoo3, a great real-world, bang-for-the-buck card - $130-180 srp
> for V2 SLI-level 3D performance and great 2D. But at the same time they were
> dealing with moving their boardmaking in house, were trying to develop an
> OEM base, Glide was dying, probably erred in putting together their 3500
> flagship card, and weren't focusing enough on their next generation, which
> had to incorporate all those existing features found in competing cards and
> add new ones and even more speed.

> Meanwhile nVidia was putting out the TNT2, which really wasn't anything very
> new either, but it was much more at home in the AGP slot and the press
> jumped all over 3dfx once it became apparent that it could keep pace with
> the V3 in 16-bit benchmarks and could provide playable 32-bit. Then 3dfx
> wasn't doing anything right in their eyes, as I said in another post in this
> thread. The coup de grace was the GeForce, which was rushed out to follow up
> on the TNT2 momentum. The press acted like its hardware T&L was the biggest
> thing since the invention of 3D, and who could tell since there wasn't
> anything to actually run on it? Anyway, the actual hardware trickled out,
> starting with SDR, then DDR, then 64mb of it months later. It almost wasn't
> so much a piece of hardware as a symbol, and the press said if you didn't
> have T&L you might as well box up your PC and start watching TV in six
> months. The opposition didn't really help themselves at that point - the
> Rage Fury Maxx was too slow and flawed, the Savage 2000 sounded good but
> didn't deliver and then S3 was gone, Matrox was silent, and the announced
> Voodoo5 didn't show up at all.

> By the time the V5 did show, nVidia was ready for the GF2, which was the
> real deal. It was faster, it had enhanced features, the drivers had matured,
> and you could actually buy one. The too-little, too-late V4/5 didn't stand
> much of a chance, and it hardly was noticed that T&L games hadn't really
> showed yet. 3dfx was done already, and the 6000 never made it out of the
> lab. But the death knell had rung the generation before, when 3dfx couldn't
> make the transition from a boutique to a mainstream operation. That the GF2
> GTS was very expensive didn't matter much - nVidia had fully established
> their name, so followed the aging TNT2 with the GF2 MX for mid-level buyers
> and the OEMs to spread their base.

> Since then nVidia has backed off their pace. The GF2 Ultra was just a faster
> and more expensive card ($500!) and anyone could have pulled that off, the
> GF3 came out a full year after the GF2 and didn't represent a quantum
> performance leap, and the Ti thing is just a juggling of speeds and prices
> to make it look like they've done something. In comparison, ATi increased
> the speed of their flagship Radeon64 Vivo last year and didn't even mention
> it - with nVidia that would have been a product cycle! The reality is that
> nVidia hasn't made massive progress in the last two years, much less than
> the two years prior to that. The competition hasn't done that well because
> it's mostly gone - 3dfx and S3 out of business, and Matrox quietly having
> withdrawn from the high-end sector after dabbling with one card. But ATi has
> come farther in those two years than nVidia, having moved from the Rage
> 128-based RF Maxx to the Radeon 8500, every bit the card that the GF3 Ti 500
> is and perhaps more. But they haven't gotten the driver and customer
> relations stuff together the way they should have for this market, and have
> lost a lot of ground to nVidia in the OEM, mobile and Mac sectors.

> So nVidia made their name by defeating 3dfx, a victory largely based on
> flashy features that wowed the press, and on ground that belonged to nVidia
> more than 3dfx. They have parlayed that to a broad base that now goes beyond
> PC video cards. But the image of them as this technological steamroller that
> cranked out new and better cards every six months is a little inaccurate, I
> think.



> > Still, all that being said, it has virtually nothing to do with the
> > reason why 3dfx went out of business.  The reason that 3dfx went under
> > is simply that nVidia just blew them away in their core market and
> > 3dfx didn't have a niche to drop back on.  S3 also got wiped out and
> > was sold off.  Matrox got blown away but was able to fall back on
> > their niche of high-end 2D graphics.  ATI was the only company that
> > kept within shooting range, though for a while they were only holding
> > on due to their strong OEM ties and low cost.  nVidia simply set a
> > development pace that no one else could match.  Everyone else kept
> > planning on building a "TNT killer" or "GeForce killer", but by the
> > time they got to market, nVidia had already moved on by two
> > generations.  What's more, nVidia's Unified Driver approach gained them
> > some popularity with OEMs and also meant that they were able to
> > quickly develop really good drivers for ALL their operating systems
> > (nVidia to this date remains the only company with really good 3D
> > drivers for Linux, being more than twice as fast as their next
> > competitors on just about everything).  ATI was especially bad for
> > this, where their drivers might have worked ok for Win98, but simply
> > sucked for Win2K for quite some time (and under Linux a TNT2 m64 will
> > beat out a Radeon 8500 any day).

> > In short, I don't think that many people can really fault 3dfx that
> > much for their demise, it was really just that nVidia did a LOT of
> > things right and, perhaps more importantly, they did them right now,
> > not two years from now.  Even Intel, who many feared would totally
> > dominate the graphics industry in short order was completely blown
> > over by nVidia in the separate graphics chipset market and was
> > relegated to the integrated graphics market (where nVidia is already
> > making a push as it is).

Tim Mise

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Tim Mise » Sat, 26 Jan 2002 08:45:11

You mean Dots Per Inch?  That is not a monitor measurement, that's a
printer/scanner measurement.  Do you mean to say color bit depth?

-Tim


Tony Hil

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Tony Hil » Sat, 26 Jan 2002 09:33:47

On Wed, 23 Jan 2002 08:09:35 GMT, "Mark Nusbaum" wrote:


>I think you're mostly right about this, but your characterization of what
>happened with 3dfx is a little too simplistic. They were a small specialty
>company that built add-on PCI 3D-only hardware for *** and built and
>maintained an API to help make this happen. It was apparent that they'd have
>to make a move to AGP-based 2D/3D cards in order to grow beyond this, or
>even to maintain their position - the future of high-end graphics was on the
>AGP bus, which meant incorporating 2D, and other companies were beginning to
>make inroads into 3D from that position. nVidia was the principal
>competition, and the Riva TNT established them as a high-end OEM presence as
>well as a company that could make competent 3D hardware. So 3dfx put out the
>Banshee, which was a middling effort and really just a PCI card using the
>AGP bus, but not a bad start.

Their first effort was actually the Voodoo Rush, which was***poor
and probably best forgotten.  Banshee was their second attempt at
things, and was a marked improvement over the Rush, but wasn't up to
par with its Voodoo2 counterpart.

Yup, they had to do all the above and they had to do it FAST.  They
couldn't do it fast enough though.

nVidia was hardly the first company to do hardware Transform and
Lighting, this was a mainstay of the high-end OpenGL graphics cards
for many years prior to the release of the GeForce.  What nVidia did
was bring this high-end feature out in cards that cost $200-$300,
while the professional OpenGL cards were costing $3000-$5000.
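
For anyone wondering what "hardware T&L" actually computes: it boils down to
per-vertex math like the following, done on the card instead of the CPU. This
is a bare-bones Python sketch of the idea only, not any particular card's
pipeline.

def transform(m, v):
    # Multiply a 3D position by a 4x4 row-major matrix (w assumed to be 1).
    x, y, z = v
    return [row[0]*x + row[1]*y + row[2]*z + row[3] for row in m[:3]]

def diffuse(normal, light_dir):
    # Lambertian lighting term: clamp(N . L, 0, 1) -- one light, per vertex.
    d = sum(n*l for n, l in zip(normal, light_dir))
    return max(0.0, min(1.0, d))

# Translate a vertex 5 units along z, then light it head-on.
translate_z5 = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 5], [0, 0, 0, 1]]
print(transform(translate_z5, (1.0, 2.0, 3.0)))   # [1.0, 2.0, 8.0]
print(diffuse((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))  # 1.0

Multiply that by thousands of vertices every frame and you can see why moving
it off the CPU was a big selling point.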

Yup, that's about it.  nVidia delivered, and they delivered MUCH
faster than anyone else was able to.  Sure, they had their weak
points, but they had a lot of strong points too, and each new revision
of nVidia's cards fixed some of the previous version's weaknesses.

The drivers are a major issue, and I think a large part of the reason
why so many OEMs moved to nVidia in the first place.  Whether or not
they deserve it, ATI has managed to get a reputation for rather poor
drivers.  In fact, they earned this reputation quite a ways back with
the Mach32 (quite possibly the absolute WORST drivers for ANY PC
component ever), and it hasn't helped that with every new chipset that
they come out with, it's taken them at least 6 months before their
drivers are up to par, particularly for anything other than Win9x.
The Rage 128 based cards took forever before they had halfway decent
drivers for Win2K.

For a while, nVidia was a technological steamroller.  Their cards
might not have been a huge step forward, but they did bring out new
models every 6-8 months, while everyone else was trying for huge leaps
forward every 2-3 years.  This meant that even if another company
managed to catch up to nVidia with the release of a new product, 6
months later nVidia had something that was better by just enough to
make the competition look like outdated technology.  What's more,
nVidia quickly expanded to fill all the segments of the PC market,
while most others were concentrating on their niches.  3dfx was trying
to milk the gamers for all it was worth, ATI was going after the OEMs,
Matrox was trying for the 2D business users.  nVidia went after them
all, from the low-cost OEM cards like the TNT2 m64 (still a mainstay
in OEM machines), through the mid-range *** cards like the GeForce2
MX, up to the high-end *** cards they're best known for, and
they've recently managed to take over a HUGE chunk of the high-end 3D
workstation cards with their Quadro line.  They had very ambitious
plans that were executed well enough that they beat out just about
everyone else.

I am, however, very glad that ATI is getting back into the swing of
things, doing much the same as nVidia was doing in trying to bring
out a lot of new products at a lot of different price points.  On the
lower end of things, ATI has some tremendous value in their older Rage
128 cards (good 2D business and OEM type cards), as well as their
low-end Radeon products for the gamers on a budget.  On the high-end,
the Radeon 8500 has a lot of promise, though it needs better driver
support IMO.  They're even a really big name in the high-end 3D
workstation market with the FireGL line that they recently acquired.
Combine that with the fact that they've started selling their chips to
third parties, and I think we're starting to see a real resurgence of
competition on the graphics card front.  This, I think, is a very good
thing, because as you mentioned, nVidia hasn't been moving forward
nearly as fast as they used to (they simply haven't needed to).
Mind you, the GeForce 4 is just around the corner.

ZOD

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by ZOD » Sat, 26 Jan 2002 11:33:47

What I was getting at is there are many people upping the screen size and
then changing the fonts.....96dpi and 120dpi are common.
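
For what it's worth, that 96/120 number is just the logical font DPI Windows
uses for scaling; the physical pixels-per-inch of the tube is a separate
figure you can work out from resolution and viewable size. Rough Python
sketch, assuming roughly an 18" viewable diagonal on a 19" CRT:

import math

def physical_ppi(width_px, height_px, viewable_diag_in):
    # Pixels per inch along the diagonal of the visible screen area.
    return math.hypot(width_px, height_px) / viewable_diag_in

# 1280x1024 on a 19" CRT with ~18" of viewable diagonal:
print(round(physical_ppi(1280, 1024, 18.0)))  # ~91

So with those assumptions the tube shows roughly 91 pixels per inch while
Windows lays fonts out at a nominal 96 or 120, which is why bumping the font
DPI setting makes text bigger without touching the monitor at all.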

fly13

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by fly13 » Sat, 26 Jan 2002 12:21:50





> >>> And I AM running at 1280x1024 on a 19" monitor.

> >>At what DPI?

> >.26, I think.  KDS VS19SN.

> Cheapest monitor you can buy.

And the best deal too.  I have two of them and am very impressed.  I run a
crappy Trinitron tube (you know, the one with the mask wires).  Dot pitch
doesn't mean doodly anymore.  It's the bandwidth of the electronics and the
KDS has a 220MHz bandwidth, which is pretty good (if you can believe printed
specs).  Just checked the web site.  Looks like they stopped printing the
bandwidth figure.
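
That 220MHz figure is easy to sanity-check, by the way: the pixel clock a
mode needs is roughly width x height x refresh rate, plus blanking overhead.
A rough Python sketch (the ~30% blanking factor is a generic approximation,
not a KDS spec):

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.30):
    # Approximate pixel clock for a CRT mode, counting sync/blanking time.
    return width * height * refresh_hz * (1.0 + blanking_overhead) / 1e6

print(round(pixel_clock_mhz(1280, 1024, 85)))  # ~145 MHz
print(round(pixel_clock_mhz(1600, 1200, 75)))  # ~187 MHz

Both modes sit comfortably under 220MHz, so on paper the video amp shouldn't
be the weak link at 1280x1024.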

Keith R. William

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Keith R. William » Sat, 26 Jan 2002 12:10:04


Evidently not, if it's fuzzy at 1280x1024, or less.

You're real funny, nit-wit.  If you have anything to say (no
evidence of any lights on upstairs), I'm sure there is someone
left to humor you.

*PLONK*

----
  Keith

Keith R. William

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Keith R. William » Sat, 26 Jan 2002 12:05:03



"ZOD" doesn't know what he means.  Give it up, there is no one
home there.

----
  Keith

=================  

> -Tim



> > > .26, I think.  KDS VS19SN.

> > No.....that's dot pitch...I meant DPI.....

Mark Nusbaum

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Mark Nusbau » Sat, 26 Jan 2002 13:33:19

"Hans Bergengren" <s...@spam.com> wrote in message

news:koY38.17723$l93.3551138@newsb.telia.net...

> "Mark Nusbaum" <mark.nusb...@worldnet.att.net> wrote:

> > Yeah, like the guys who bought TNTs in late '98 and then got their copy of
> > Quake3 a year later, slapped it in and cranked up the quality to utilize
> > those detailed textures and 32-bit color, and then fragged away at 5 frames
> > per second.
> Uhhh... Not exactly, no.

> Tell me which feature will make today's video cards frag away at 5 fps,
> please. You aren't really painting a true picture of the situation here,
> dude. Things have happened since the TNT was released, mebbe it's time to
> realize that for die-hard 3dfx fans?

Not following you, Hans. I'm not talking about today, I'm talking about late
'99. I think it's true that hardware has caught up to the games today as
compared to then, so it takes longer for this effect to take place. But it
still happens - look at the article at AnandTech today on cards running the
beta Unreal2/UT2 engine. There are development issues that will make the
cards run better when the games do come out, but you can see the framerates
dropping. If you look at 12x10 32-bit, no current card even hits 35fps, GF2
Pro/Ti's do 16, my Radeon 14 (sob!).

> > Just because you have certain capabilities in hardware doesn't
> > mean you'll be able to use them when the software eventually comes out. But
> > people like you who buy into this crap, primarily shoveled by the hardware
> > press, are the reason nVidia rules the business now.

> Here we see it again, sore loser syndrome. ;-)

> <sarcasm>Otoh, 3dfx was a fair company, THEY DECIDED FOR YOU when a
> particular feature was "needed" and released a card supporting that feature
> years and years after the competition did. Yeah, that's a much better
> approach than that Evil Nasty Nvidia. If only all companies competed fairly
> by making sure they lagged behind 3dfx, I'm sure the state of the gaming
> market would be MUCH BETTER today...!</sarcasm>

Regarding 3dfx, if they hadn't been in the financial condition they were in
'99 and had stayed viable, I have no doubt that the state of gaming hardware
would be better than it is now. I think their approach was somewhat
different from nVidia's, more so than ATi's, and a third competitor would have
done a lot to spur development. Without their financial difficulty, 3dfx
would have gotten their V5 generation out earlier, including the 6000. Since
they were still on a .25mu process, their next, die-shrunk generation would
have been much faster. Continuing the multichip solution would have changed
what the card could do, and things like FSAA at very high resolutions would
have been possible very soon, and other effects as well. There are those who say (and
I'm not talking about 3dfx diehards) that the only very good thing to come
out of the featureset of the last few generations of cards is FSAA, and 3dfx
and ATi have contributed as much as nVidia in this area.

As 3dfx faded away nVidia's development slowed and the prices went up - the
$200-250 TNTs were followed by the GF256 32mb SDR at $250-300, the GF 32mb
DDR at $300-350, the $350-400 GF 64mb DDR, the GF2 GTS at about $350, then
the 64mb version at $400-450, and finally the GF2 Ultra at $500 after 3dfx
was long gone. They've backed down to the $400 level with the GF3 and Ti 500,
probably realizing what the limit was that even gaming fanatics would pay,
no longer having to pay so much for 64mb of fast DDR memory, and realizing
that they still have a competitor in ATi. But have the advancements in
performance over those last two years been worth the bucks? Would the
picture have been the same if 3dfx was pushing them all the time?

Regarding 3dfx' product decisions. It's not like I bought the card a year or
two before its release and then got stuck with their development decisions.
I made that decision based on what it could do, my assessment of what it
couldn't, and the price. Anyone was free to do that, and it would have been
nice if the press conveyed that message. But they didn't, rather they conveyed
the type of message that you do, that 3dfx was almost evil, that they were
truly bad for gaming, that one would really regret getting one of their
cards only six months or so after buying it. I think that kind of bias hurt
3dfx in very real ways, and ultimately led many people to...

> > like the guys who were buying all those TNT2 M64s when the GeForce first
> > came out, because nVidia was best then as well.

> I have no idea what people you're referring to, but one has to remember that
> everyone is responsible for making their own decisions (and doing proper
> research in order to make the right one for them).

...buy cards that didn't perform very well at all but had that TNT2 name
that the press touted so much. It's that low-end base in OEM and retail
boards that has benefitted nVidia greatly in terms of profits, name
recognition, wider OEM penetration. That's a big part of the reason ATi
decided to redouble their efforts in the high-end gaming arena, I think, to
help protect their low-end and OEM base.


> > I'd think that people who
> > populate the hardware newsgroups would know better. But I guess I can't
> > complain if you guys spend $400 on a card that perhaps helps in the
> > development of better software that I can actually run when released a year
> > or two later on my new, faster (but no longer cutting edge) $200 card...

> Not sure what you're talking about. I got my GF3 in november, it wasn't
> particularly new by then, it did not cost $400 either, that's just wishful
> thinking on your behalf. Nor have I preached the virtues of getting $400
> video cards, that's just another example of wishful thinking on your behalf.

> Seems the only defense left for the aging 3dfx fanboys are accusing others
> of mindlessly and needlessly buying $3-400 (and sometimes more) video cards,
> that's just SO pathetic.

The prices I mention are suggested retail at release, since that's about the
best relative measure I can think of. Your GF3 purchase came after the
release of the Ti "generation", so should have been less, and you may not
have bought it off the shelf in a typical retail outlet. No matter - I paid
$180 for my Radeon64 DDR Vivo early last year, but I would refer to it as a
$400 card since that's what SRP was at release. I don't really care what you
paid for your card, I'm just indicating prices to show relative
measurements, between nVidia cards and others, between generations of nVidia
cards.

And I'm not some bitter 3dfx fanboy living in the past. I think they made
some very good cards, did a lot for 3D gaming in the early days, and I do
wish they were still around for everyone's sake. I could easily say that you and
the other "nVidiots" love to crow over your victory over 3dfx that was
measured by a company going out of business, love to cite benchmarks and
wiz-bang features, but ignore the price of admission to the nVidia club, the
lack of real, measurable benefit from those features you tout, and the cost
of the loss of competition to everyone. But I won't.

Finally, I'll leave you with a few words from John Carmack early last year
on your precious GF3's programmable shaders:

"Now we come to the pixel shaders, where I have the most serious issues.
I can just ignore this most of the time, but the way the pixel shader
functionality turned out is painfully limited, and not what it should have
been.

"DX8 tries to pretend that pixel shaders live on hardware that is a lot
more general than the reality.

"Nvidia's OpenGL extensions expose things much more the way they
actually are: the existing register combiners functionality extended to
eight stages with a couple tweaks, and the texture lookup engine is
configurable to interact between textures in a list of specific ways.

"I'm sure it started out as a better design, but it apparently got cut and
cut until it really looks like the old BumpEnvMap feature writ large: it
does a few specific special effects that were deemed important, at the
expense of a properly general solution.

"Yes, it does full bumpy cubic environment mapping, but you still can't
just do some math ops and look the result up in a texture. I was
disappointed on this count with the [original] Radeon as well, which was
just slightly too hardwired to the DX BumpEnvMap capabilities to allow more
general dependent texture use.

"Enshrining the capabilities of this mess in DX8 sucks. Other companies
had potentially better approaches, but they are now forced to dumb them
down to the level of the GF3 for the sake of compatibility. Hopefully
we can still see some of the extra flexibility in OpenGL extensions."

Kinda looks like the same ol' story...
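
For the curious, the "do some math ops and look the result up in a texture"
Carmack mentions is a dependent texture read. A conceptual Python sketch of
the general case he's asking for (illustrative only, not any real shader API):

def sample(texture, u, v):
    # Nearest-neighbour lookup into a 2D list of values, coords wrap around.
    h, w = len(texture), len(texture[0])
    return texture[int(v * h) % h][int(u * w) % w]

def general_dependent_read(u, v, tex_a, tex_b):
    # The general case: fetch a value, do arbitrary math on it, then use the
    # result as the coordinate for a second lookup. Per the quote above, the
    # GF3 only offers a fixed list of ways for the first result to feed the
    # second lookup, not arbitrary math like this.
    s = sample(tex_a, u, v)
    s = (s * s + 0.25) % 1.0        # any math you like
    return sample(tex_b, s, s)

tex_a = [[0.1, 0.9], [0.5, 0.3]]
tex_b = [[10, 20], [30, 40]]
print(general_dependent_read(0.0, 0.0, tex_a, tex_b))  # 10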

Never anonymous Bu

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by Never anonymous Bu » Sat, 26 Jan 2002 13:58:22


I'm using the default 96dpi font.

Most of my friends think it's WAY too small, but it works for me.

To reply by email, remove the XYZ.

Lumber Cartel (tinlc) #2063. Spam this account at your own risk.

It's your SIG, say what you want to say....

ZOD

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by ZOD » Sat, 26 Jan 2002 15:28:31

Yada...yada....go crawl back under that rock, wormboy..hehe

ZOD

Voodoo 5 vs Geforce 3 using Ghost Recon, Nascar 4, Quake3, F/A-18, Flanker Benchmarks Galore....downloads for those who care

by ZOD » Sat, 26 Jan 2002 15:31:11

I never said it was fuzzy...just too small for my liking.
Pay attention...hehe

And you are a total moron.......hehe

Yeah...I got yer 3200x2000 right here!
Hehe.....

