Well, that's all fine and good but, again, first impressions are everything so *I* won't ever be buying AMD/ATI EVER again. Games will ALWAYS DEFINITELY work fine on the industry-standard hardware (which is, hate to burst yo' bubble, still Intel and nVidia). Everything else is secondary...
> > Yea, well, until I have a reason to go with AMD or ATI <shudder> I won't
> be jumping through hoops just because they're slightly cheaper--especially
> when you KNOW game developers extensively test and use industry-standard
> hardware (Intel and nVidia currently)
> How are AMD and ATI perfectly good hardware vendors? Let me count the
> ways...
> A lot of game developers use AMD and ATI cards.
> AMD's architecture is better for ***- benchmarks prove this. To top it
> off, it's much cheaper. The only cases where Intel is better are in
> specialized productivity applications which have been optimized for Intel
> processors. Intel chips are fast, both in clock rate and performance, but
> only when the software is optimized for them. AMD chips have plenty of raw
> muscle to handle anything.
> ATI currently has the fastest consumer-level graphics card. The Radeon 8500
> doesn't have the performance of the GeForce 3 Ti 500, but it was also a lot
> cheaper and really offers more bang for the buck than the Ti 200. The
> occlusion culling architecture was better, and it offers better antialiasing
> image quality and anisotropic filtering. GeForce 3 and 4 just blur textures
> like crazy with antialiasing. And the Radeon 8500 also supports hardware
> n-patch technology, something you won't see a GeForce card do, and this
> technology has appeared in several games, despite ATI's smaller market share
> in high-end graphics. The original Radeon offered occlusion culling
> technology to save bandwidth (Hierarchical Z), while the GeForce 2
> lacked this important feature. So ATI cards are not *** after all, Eep.
> > FIRST and then, if they have time, bother with the more rinky-dink stuff
> (AMD, ATI, Matrox, etc currently).
> > THAT, if anything, is the way of engineering--and the industry.
> > > On Fri, 13 Sep 2002 11:25:23 GMT, Eep²
> > > >First impressions are everything. I had an AMD K5-PR133 (Heh, remember
> > > >"Pentium ratings"? How lame!) AND an ATI 3DXpression+PC2TV (Rage II+
> > > >chipset) and BOTH sucked petunias. Needless to say, since then I've
> > > >only bought Intel and nVidia and haven't had ANY problems since.
> > > >I will NEVER buy AMD or ATI EVER again--EVER! Just like I won't ever
> > > >buy an IBM hard drive ever again because I had one that completely
> > > >failed for no reason (no warning or bad sectors or anything!).
> > > That's one approach - safety first. Another reason some people would
> > > have is the brand loyalty (see, I didn't use the f word).
> > > Doing this, however, restricts you from getting an optimal solution at
> > > any time. Again some people are willing to live with a non-optimal
> > > path for a) peace of mind, and b) their own definitions of 'optimal'.
> > > For example, price plays an important role in what I consider to be
> > > optimum. Buying a $300-400 video card isn't practical for me, no
> > > matter what it does, unless it comes packaged with Jennifer Connelly
> > > (and if my wife is reading this, I'll donate her to charity... by 'her'
> > > I mean Ms. Connelly)
> > > I suspect that many people, including myself, look for the best
> > > possible solution at any given time, and their decision doesn't depend
> > > too much on what a certain brand's products used to be a few years
> > > ago. If, let's say, I got burned by the ATI 3DXpression, my 'first
> > > impressions' would last just for that generation of 3DXpression
> > > products, and not for new graphic chips (with substantially different
> > > functionality) which come out years later.
> > > More importantly, since I always buy optimally, I wouldn't have
> > > bought the 3DXpression in the first place :)
> > > 3DFX Voodoo1 was ignored by some Rendition fans, the nVidia TNT came under
> > > fire from 3DFX fans, and ATI's Radeon line of cards found life tough as
> > > well. They were all ridiculed for past performances: nVidia for its
> > > abysmal NV1 chip (in the Diamond Edge 3D), which I am glad, Eep, you didn't
> > > buy, because otherwise you'd be boycotting all nVidia products as well; ATI for
> > > its indifferent driver support for many years; AMD for its forgettable
> > > K5 and K6.
> > > Some of us have an expiry date on our 'first impressions',
> > > especially when the product they are based on has expired as well,
> > > which gives us a better chance of picking the right CPU, sound or
> > > graphics chip when the time comes.
> > > That's the way of the engineer!