Through its trials and tribulations, it fought against other long-standing companies as well as hot new upstarts.
We are, of course, referring to ATI Technologies.
At Lau’s urging, they joined him in transforming Comway into a new venture.

In less than 12 months, it had its first product on the market.
Over the years, multiple versions of these products were released.
However, this was a highly competitive market, and ATI wasn’t the only company producing such cards.
What was needed was something special, something that would make them stand out from the crowd.
All of the computations involved in generating the displayed graphics and colors were handled entirely by the CPU.
However, the time was ripe for a dedicated co-processor to take over some of these tasks.
In the case of ATI, this took the shape of the ATI 38800 chip, better known as the Mach8.
One major drawback of the design, however, was its lack of VGA capabilities.
Packed full of improvements, the ATI68800 (Mach32) reached shelves in late 1992.
However, it was not without issues, much like its competitors.
ATI offered a more economical Plus version with 1 MB of DRAM and no memory aperture support.
This model was less problematic and more popular due to its $500 price tag.
ATI soon faced its first setback, posting a net loss of just under $2 million.
It didn’t help that other companies had more cost-effective chips.
Almost two years would pass before it had an appropriate response, in the form of the Mach64.
This new graphics adapter went on to inspire a multitude of product lines, particularly in the OEM sector.
Xpression, WinTurbo, Pro Turbo, and Charger all became familiar names in both household and office PCs.
Despite the delay in launching the Mach64, ATI wasn’t idle in other areas.
The founders were consistently observing the competition and foresaw a shift in the graphics market.
The 3D Rage was developed with Microsoft’s new Direct3D in mind.
With bilinear filtering or PCTM applied, the fill rate would halve, at the very least.
An updated version of the Mach64 GT (aka the 3D Rage II) appeared later the same year.
Released in October 1996, the 3dfx Voodoo Graphics was 3D-only, but its rendering chops made the Rage II look decidedly second-class.
None of ATI’s products came close to matching it.
ATI was more invested in the broader PC market.
This trend continued with the next update in 1997, the 3D Rage Pro, despite its enhanced rendering capabilities.
Nvidia’s new Riva 128, when paired with a competent CPU, significantly outperformed ATI’s offering.
However, cards such as the Xpert@Play appealed to budget-conscious buyers with their comprehensive feature sets.
The All-In-Wonder model exemplified this (shown below).
It was a standard Rage Pro graphics card equipped with a dedicated TV tuner and video capture hardware.
ATI’s designs just couldn’t compete at the top end.
However, these markets also had slim profit margins.
They simplified the old architecture codename (Rage 6c became R100) and introduced a new product line: the Radeon.
However, the changes were more than just skin-deep.
This engine could sort and cull polygons based on their visibility in a scene.
In turn, this fed into two pixel pipelines, each housing three texture mapping units.
Unfortunately, its advanced features proved a double-edged sword.
Only through OpenGL and ATI’s extensions could its capabilities be fully realized.
The first Radeon graphics card hit the market on April 1st, 2000.
Despite potential jokes about the date, the product was no laughing matter.
Over the next year, ATI released multiple variants of the R100 Radeon.
The RV100 chip was essentially a defective R100, with many parts disabled to prevent operational bugs.
However, ATI still dominated the OEM market, and its coffers were full.
Before the Radeon hit the market, ATI set out once again to acquire promising prospects.
They purchased the recently formed graphics chip firm ArtX for a significant sum of $400 million in stock options.
This might seem unusual for such a young business, but ArtX stood out.
On paper, the new architecture offered more than any other product on the market.
It was the only product that supported pixel shader v1.4, raising expectations.
This reality began to show between 1999 and 2001.
Despite accumulating $3.5 billion in sales, ATI recorded a net loss of nearly $17 million.
But ATI couldn’t simply spend its way to the top.
It needed more than just a wealth of high-revenue, low-margin contracts.
It needed a graphics card superior in every possible way to everything else on the market. Again.
Naturally, most people dismissed these claims as mere hyperbole.
After all, ATI had garnered a reputation for over-promising and under-delivering.
While not all rumors were entirely accurate, they all underestimated the card’s power.
Every aspect of the processor’s design had been meticulously fine-tuned.
Even the drivers, a long-standing weak point for ATI, were stable and rich in features.
The Radeon 9700 Pro and the slower, third-party-only Radeon 9700 were instant hits, bolstering revenues in 2003.
For the first time in three years, the net income was also positive.
The NV30, like the R300, purportedly had eight pixel pipelines.
Except, that wasn’t entirely true.
That figure accounted for specific operations, primarily related to z-buffer and stencil buffer read/writes.
The FP32 claim was a similar story; for color operations, the drivers would often force FP16 instead.
The solution to this thorny problem came as an almighty surprise.
Ho had already stepped down as CEO by then.
David Orton, former CEO of ArtX, took over in 2004.
James Fleck, an already wealthy and successful businessman, replaced Ho as Chairman.
For the first time in twenty years, ATI had a leadership structure without any connections to its origins.
In 2005, the company posted its highest-ever revenues: slightly over $2.2 billion.
In July 2006, almost completely out of the blue, AMD announced its intention to acquire ATI.
AMD had initially approached Nvidia but rejected the proposed terms of the deal.
The acquisition was a colossal gamble for AMD, but ATI was elated, and for a good reason.
This was an opportunity to push Nvidia out of the motherboard chipset market.
AMD was evidently willing to spend a lot of money and had the financial backing to do so.
Best of all, the deal would also mean that ATI’s name would remain in use.
The acquisition was finalized by October 2006, and ATI officially became part of AMD’s Graphics Product Group.
Naturally, there was some reshuffling in the management structure, but it was otherwise business as usual.
Boasting 710 million transistors and measuring 420 mm², the R600 was twice the size of the R520.
Nvidia then followed up with the higher-clocked Ultra variant in May 2007.
The Radeon HD 2900 XT’s saving grace was its price.
For the money, the performance was superb.
Why wasn’t it better?
Games at the time were still heavily dependent on texturing and fill rate (yes, even Crysis).
This direction was also followed with the 2008 iteration of TeraScale GPUs, the RV770.
The use of faster GDDR5 memory allowed the memory bus to be halved in size.
Coupled with TSMC’s 55nm process node, the entire die was almost 40% smaller than the R600.
All this resulted in the card being retailed at $300.
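As a rough illustration of why the narrower bus was workable (the figures here are widely reported retail specifications, not numbers from this article): peak memory bandwidth is approximately the bus width in bytes multiplied by the effective data rate, so the RV770’s 256-bit GDDR5 interface at about 3.6 GT/s delivers around 115 GB/s, slightly more than the roughly 106 GB/s of the R600’s 512-bit GDDR3 configuration.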
Once again, it played second fiddle to Nvidia’s latest GeForce cards in most games.
In the late summer of 2009, a revised TeraScale architecture introduced a new round of GPUs.
The gaming performance was excellent, and the small increase in MSRP and power consumption was acceptable.
Goodbye to ATI?
Its professional FireGL range was thriving, and OEM and console GPU contracts were robust.
The less successful aspects had already been trimmed.
However, this was only for the brand.
The actual company stayed active, albeit as part of AMD.
At its peak, ATI had over 3500 employees and several global offices.
This group continues to contribute to the research and development of graphics technology, as evidenced by patent listings and job vacancies.
But why didn’t AMD just completely absorb ATI, like Nvidia did with 3dfx, for instance?
The terms of the 2006 merger meant that ATI would become a subsidiary company of AMD.
Gone but not forgotten?
No, not gone and not forgotten.
We have covered the most prominent parts of its history: the innovations, successes, and controversies.