Graphics Cards

ATI Technologies Inc: history, models and development


ATI Technologies was an essential company in the history of graphics cards. Here, we tell you its whole story. Do you want to know it?

The history of personal computers is marked by companies like ATI Technologies, which drove forward what computers were capable of. ATI was dedicated to manufacturing graphics cards and to the world of 3D graphics, in a landscape shaped by other manufacturers such as 3dfx Interactive and Nvidia. It ultimately ended up absorbed by AMD, and in 2010 its name became AMD Radeon.

Next, you have at your disposal the story of one of the best graphics card manufacturers in history: ATI Technologies.


1985, the year of its foundation

ATI was founded in Canada by Lee Ka Lau, Kwok Yuen Ho, Francis Lau and Benny Lau. At the time it was called Array Technology Inc and was a "simple" equipment manufacturer, specifically of integrated graphics chips. IBM and Commodore were working on the same thing; in fact, IBM was one of the most powerful manufacturers in the world.

In October of that same year, ATI used ASIC technology to develop its first graphics controller, which would become its first integrated graphics chip.

It all started with "Small Wonder".

1987, EGA Wonder

As you read, you will see that the graphics cards will be marked by the technology of the monitors. Furthermore, it could be said that the development of graphics cards and monitors go hand in hand.

At this point, ATI released its first graphics card: the EGA Wonder. It was named that way because monitors of the late 1980s incorporated EGA graphics. This card worked with any graphics interface, operating system or monitor.

This manufacturer was interested in personal computers, so it tried to provide them with faster graphics.

1988, VGA Wonder

At ATI they were still "wondering" what the world of graphics cards would look like, this time with the VGA Wonder. It came in two versions: one with 256 KB of DRAM and the other with 512 KB. It was a 16-bit ISA card that could also run in an 8-bit ISA slot.

It had a chip called ATI 18800 and it supported SVGA graphics mode. In addition, it automatically synchronized with the monitor.

This company was beginning to make a name for itself in a tricky, highly competitive sector. However, it did not yet know that the war over 3D graphics was looming in the 90s.

1990, ATI Mach 8

This card, released at the start of the 90s, marked the beginning of 2D graphics accelerators on personal computers. The Mach8 was an extension of the Wonder line and carried a clone of the IBM 8514/A chip with extensions. It was one of the first 2D graphics acceleration chips on the market.

However, a separate VGA add-on card was required to use the Mach8, which made enjoying 2D graphics very expensive: you had to buy both a VGA card and the Mach8. This card plugged into an ISA or MCA slot, displayed 8-bit color, and came in two versions:

  • 512 KB
  • 1 MB

The Mach8 chip would be reused in later models, and it was capable of processing graphics without involving the CPU.

1992, Mach32

With the Mach32, things started to get serious, because more than decent cards were appearing on the market. Back then MS-DOS ruled, so the Mach32 incorporated a 32-bit GUI accelerator for that OS. The memory interface was 64-bit, and there were two models: one with 1 MB and the other with 2 MB.

The Mach32 integrated a VGA processor, so it was sufficient on its own. It still used ISA and MCA, but also PCI… a slot that would become very famous in the 90s. It was also compatible with new color modes: 15 bpp, 16 bpp and 24 bpp, as with the IBM 8514/A chip.

In 1993, ATI Technologies Inc. would be listed on the Toronto and NASDAQ exchanges. As a curiosity, its trading symbol would not be ATI, but ATY.

1994, Mach 64

That year, ATI launched one of the last graphics cards in the "Mach" family. We were entering the prelude to 3D graphics. Meanwhile, ATI did its own thing:

  • 64-bit GUI accelerator
  • 1 MB up to 8 MB of video memory: DRAM, VRAM or SGRAM, a memory for graphics adapters with extra functions
  • 64-bit memory interface
  • ISA, VLB and PCI ports

The Mach64 chip was completely new and powered one of the first graphics cards to accelerate motion video. This chip would be the protagonist of the 3D Rage, which we will see later.

That same year, ATI launched multilanguage software supporting 13 different languages. This chip would also bring the Graphics Xpression and Graphics Pro Turbo to life.

1996, 3D Rage I, II

3D graphics came to personal computers from 3dfx Interactive, nVidia and ATI. Specifically, the 3D Rage combined 3D acceleration, video acceleration, and 2D acceleration. In addition to being the successor to the Mach series, it used the same chip as the Mach 64.

The 3D Rage was released in April 1996 and shipped as the 3D Xpression, being compatible with MPEG-1. It was one of the first 3D chips in the history of graphics cards, and more than 1 million units were sold.

However, this had only just begun.

Rage II

The 3D Rage II nearly doubled its predecessor's 3D performance. It incorporated a mach64 chip restructured for better 2D performance, and it was backward compatible with the Rage I.

This card improved on the previous generation in almost everything:

  • PCI compatible
  • 20% increase in 2D performance
  • MPEG-2 support
  • Drivers for Direct3D, Criterion RenderWare, Reality Lab and QuickDraw, among others
  • Support for professional solutions, such as AutoCAD
  • Compatible with Windows 95, Mac OS, Windows NT, OS/2 and Linux
  • DirectX 5.0 support

In addition, its core ran at 60 MHz, its SGRAM memory at up to 83 MHz, and it had a bandwidth of 480 MB/s. It was a great product and was present in many personal computers, such as the Macintosh G3.

1997, Rage Pro and the beginning of the war

The war was on: 1997 was a very competitive year for 3D graphics, with the new nVidia RIVA 128 and 3dfx's Voodoo. The Rage Pro was ATI's answer to both, but ATI made several mistakes: it neither outperformed its rivals nor supported OpenGL.

At the time, OpenGL was essential for enjoying the best video games. ATI tried again with the Rage Pro Turbo, but failed once more against its rivals: performance was not as advertised, and the Turbo did not improve much on the Pro.

Even so, ATI remained a strong graphics card manufacturer, having closed contracts with companies like Apple to integrate its graphics into Macs. It also worked on bringing graphics chips to televisions, and it introduced the first 3D chip for laptops, named the Rage LT.

However, the competition in 3D graphics would be tougher than expected: despite the Rage having a Direct3D 6 accelerator with DVD acceleration, it never seemed to take flight against its rivals.

1999, Rage 128 and Rage 128 PRO

The Rage 128 was compatible with Direct3D 6 and OpenGL 1.2. It brought the novelty of hardware IDCT and was ATI's first dual-texture renderer. Its processor could render 32-bit color as well as 16-bit, although it was inexplicably weaker in 16-bit mode.

This graphics card was intended to compete against the RIVA TNT and the Voodoo 3, a card in decline. Matrox also got in the way with its G200 and G400. In the end, the Rage 128 competed well against them, helped by the fact that the Voodoo 3 did not support 32-bit color.

To round out the description, it incorporated a 250 MHz RAMDAC and AGP 2x support.

Rage 128 PRO

Launched in August 1999, it integrated 250 nm chips and supported DirectX 6.0. It was the successor to the Rage 128, and its chip brought improvements in DirectX texture compression, better texture filtering, DVI support, and faster AGP modes. This card emerged to compete against the Voodoo 3 2000, the TNT2 and the Matrox G400.

The truth is that it could not compete head-to-head with the models mentioned, because its clock was slower. Admittedly, ATI posted strong benchmark numbers, but when it came down to it, its video game performance disappointed.

The next series would be called Rage 6, but it would change its name to Radeon, one of the most remembered names on graphics cards.

2000, ATI Radeon DDR

From Rage we moved to Radeon, a line that would give people plenty to talk about throughout the 21st century. The new series arrived in April 2000, when the Radeon DDR was introduced. After repeated failures against nVidia, ATI did not want to stumble again, so it took the DDR very seriously.

To set the context: this was the era of the Pentium 4 and the AMD Athlon, and 3D graphics were defined by games like Quake.

With the series introduced, let's get fully into the DDR. It was built on 180 nm chips manufactured by TSMC. It supported DirectX 7.0 and featured 2 pixel shaders and 1 vertex shader, along with 6 TMUs and 2 ROPs. It was also compatible with OpenGL 1.3. Lastly, it brought HyperZ technology and incorporated the first chip in the series: the R100.

There were two models: 32 MB and 64 MB. The latter had a faster clock (183 MHz) and VIVO (video-in/video-out) capability. Its main rival was nVidia's GeForce series; ATI did not drive the world crazy with its Radeons, but it was a product that worked very well.

2001, Radeon 8500

In 2001, ATI took the price-performance route, since it had failed to unseat nVidia. On that ground, ATI was hard to beat, and it made a great business move.

The ATI Radeon 8500 was released on August 14, 2001. It used 150 nm chips and was aimed at gamers. Within this series there would be 3 cards:

  • 8500, the high-end model: AGP 4x, 64 MB or 128 MB, 275 MHz and 8.8 GB/s of bandwidth.
  • 8500 LE, the mid-range model: AGP 4x, 64 MB or 128 MB, 250 MHz and 8 GB/s of bandwidth.
  • 8500 XT, the enthusiast model. It never launched, but it would have carried AGP 4x, 128 MB, 300 MHz and 9.6 GB/s of bandwidth.

The XT was canceled because, at only 300 MHz, it would have been crushed by the GeForce4 Ti 4600. So ATI focused on the mid and low end.

2002, Radeon 9000

That same year ATI released the 8500 All-In-Wonder, which was a success because it incorporated an analog TV tuner and FM radio. In the multimedia sector, it dominated without rival.

Entering fully into 3D graphics, the Radeon 9000 belonged to the R300 series and launched in August 2002. It was a GPU with Direct3D 9.0 and OpenGL 2.0 support, compatible with Windows 98, 98 SE, Me, 2000 and XP.

It had 64 MB or 128 MB of DDR, a 200 MHz core and 500 MHz memory, with a bandwidth of 8 GB/s. It was a somewhat lightweight card, since its specifications were below the 8500's.

2003, Radeon 9600 Pro

Things got serious with the release of the Radeon 9600 Pro on October 1, 2003. With 130 nm chips and 128 MB of memory, it delivered a bandwidth of 9.6 GB/s. It supported OpenGL 2.0 and DirectX 9.0, offered DVI, VGA and S-Video outputs, and mounted the RV360 chip.

It is true that the 9600 Pro was meant to replace the 9500 Pro, and while it never quite did, it was much cheaper. Its 600 MHz memory clock served a mid-range card that delivered spectacular performance.

The only downside was that it did not support Shader Model 3.0. On the other hand, we must highlight the advance that Mobility Radeon 9600 represented in graphics for laptops.

9800 XT

However, ATI surprised everyone with the 9800 XT that same year, a card that frightened the competition. It had 256 MB of memory, a bandwidth of 23.36 GB/s, a 412 MHz GPU clock and a 365 MHz memory clock (730 MHz effective).

ATI was back in the ring, competing head-to-head with Nvidia. Of course, at a very high price.

2004, Radeon X700

The new Radeon R420 series would be defined by the X700 and X800 cards. The X700 did not get off to a good start because it lacked Pixel Shader 3.0 support, despite its very affordable price. The GeForce 6600 and 6800 came out on top. PCI Express began to be used.

ATI continued to release high, medium and low-end models. In this case, the X700 SE, X700 LE, X700, X700 Pro and the one that never came out: the X700XT. This series did not stop, so we would have to wait another year.


That year we saw Nvidia's SLI platform (an acquisition from 3dfx), but ATI responded with CrossFire, a very similar platform that combined the power of 2, 3 or 4 of its graphics cards, depending on how many were installed.

2005, Radeon X850 XT

Nvidia had released the GeForce 6800 Ultra, so ATI competed against it with its new model: the X850 XT.

This card was really fast, but the Shader Model 3.0 situation made things worse, because many games would not run on ATI hardware. So there was a spectacular card that depended on a community of users "hacking" the game in question to make it playable.

The X850 XT had 256 MB of GDDR3, a bandwidth of 34.56 GB/s, a 520 MHz clock and memory capable of reaching 1,080 MHz. Its big drawback: it was limited to Shader Model 2.0b.

It must be said that ATI beat Nvidia in other ranges, as happened against the GeForce 6800 GT.

In 2005, ATI tried again with the X1300 Pro. Despite its Shader Model 3.0 support… it was defeated once again by Nvidia.

2006, X1650 Pro

We are now in the R520 series, developed by ATI and produced by TSMC. The whole series offered Direct3D 9.0c, Shader Model 3.0 and OpenGL 2.0. That would not be enough, although it could compete against Nvidia.

The X1650 Pro used an RV535 core, making it cooler and more efficient. It featured 256 MB of DDR2 at an 800 MHz frequency and 12.8 GB/s of bandwidth. It competed well in the mid-range, but nothing more.

X1950 XTX, a knock on the table

The name alone was intimidating, and with reason: the X1950 XTX beat the GeForce 7, toppling Nvidia from its throne. It was very expensive and ran hot, but it was terribly fast. It was presented on August 23, 2006.

It supported GDDR4 memory running at 1 GHz, delivering 64 GB/s of bandwidth. On top of that, it had a 650 MHz GPU core and 512 MB of video memory.
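As a quick sanity check on figures like these, a card's theoretical peak bandwidth follows from the memory clock, the transfers per cycle (2 for DDR-style memory) and the bus width. A minimal Python sketch, assuming the X1950 XTX's commonly cited 256-bit bus (the bus width is not stated above):

```python
def memory_bandwidth_gb_s(clock_mhz: float, bus_width_bits: int,
                          transfers_per_cycle: int = 2) -> float:
    """Theoretical peak memory bandwidth in GB/s (1 GB = 10^9 bytes)."""
    bytes_per_second = clock_mhz * 1e6 * transfers_per_cycle * bus_width_bits / 8
    return bytes_per_second / 1e9

# GDDR4 at 1 GHz (double data rate) on an assumed 256-bit bus:
print(memory_bandwidth_gb_s(1000, 256))  # 64.0 GB/s, matching the figure above
```

The same arithmetic reproduces the Rage II's 480 MB/s quoted earlier, if one assumes a 64-bit bus at the 60 MHz core clock with a single transfer per cycle.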

2007, HD 2900 XT

We move on to the R600 series, released on June 28, 2007. These were the HD 2000 graphics cards, which would compete with the GeForce 8. As the standout model, the HD 2900 XT was extreme even in its looks, with silver flames on a red casing.

Its performance was brutal, and driver updates improved it further. It incorporated a fan that was noisier than a war and still insufficient to keep the card cool. Although it was rumored to come with 1 GB of memory, it ultimately shipped with 512 MB.

HD 3850

ATI had to refocus on other niches because the new GeForce had it in check. Out of this came the HD 3850, a card that worked very well and became popular for its mid-range performance.

2008, the return to the throne

Nvidia dominated the high end, and ATI looked like a dazed Rocky Balboa, landing accurate blows in the late rounds. The two brands were hitting each other mercilessly.

HD 3870 X2

You should never give up. ATI fought every 8 months to release something that could compete with Nvidia. This time it went for brute force: 2 graphics cards in 1.

Not surprisingly, the power requirements were brutal, and you had to budget for the power supply (and the electricity bill). Even so, this card regained the throne thanks to its great performance.

HD 4670

ATI knew its greatest asset was the mid-range, where it could sell very good graphics cards at a moderate price. This was an era when HD reigned and 720p was everywhere. This 512 MB GDDR3 card worked very well and was compatible with DirectX 10.1, OpenGL 3.3 and Shader Model 4.1.

That compact card was perfect for multimedia machines and small computers (HTPCs). Well played, ATI!

HD 4870, the most popular of all

It quickly became popular because it was the best value-for-money graphics card on the market. ATI toppled Nvidia once again with a well-rounded card in every respect. What attracted buyers most was its low price: $299. It was released on June 25, 2008.

Its specifications were magnificent:

  • PCIe 2.0
  • 1 GB or 512 MB of GDDR5
  • GPU clock: 750 MHz
  • Memory clock: 3,600 MHz at maximum performance
  • Required a 350 W power supply

ATI released an X2 version for the most demanding users. However, it seems the drivers never took full advantage of its performance.

2009, HD 4890, HD 5770 and HD 5970


This was ATI's last year as an independent graphics card company. Hardly anyone knew it at the time, and even today the change does not feel drastic.

Starting with the HD 4890: it was a rehash of the HD 4870 and did not achieve the success ATI expected. It had good specifications and good Full HD performance, but Nvidia was once again dominating strongly with its GTX models.

For this reason, the same year the HD 5770 would be presented, a mid-range card that conquered many homes. Its main reasons:

  • Low consumption
  • Very affordable price
  • Very good performance

The company had been relegated to the lower-middle range, so nobody expected another slam on the table like the HD 4870 or the X1950 XTX. By this point, AMD was already present in the company, and the acquisition was an open secret.

2009 ended for ATI with two final releases: the HD 5870 and HD 5970. These cards were high-end in price, but not in performance. Don't get me wrong, they performed well, but they did not match Nvidia's best, whose models cost a lot more money.

The HD 5970 was an X2 card, like the ones we had grown used to seeing. ATI answered Nvidia once more with this great graphics card, but its power consumption scared off many buyers.

2010, the end of ATI

The end of ATI came hand in hand with its last release: the HD 5670. This card opened a new possibility for the mid-range market, bringing very good specifications at a competitive price. Although AMD had bought the company in 2006, the ATI brand did not disappear until 2010. The first AMD-branded graphics card would be the HD 6850, a fairly good, evolutionary card.

This is how sad stories end. ATI had very good times in the 90s, but the 21st century was an uphill climb. Nvidia deserves a lot of credit: it did not need monstrous cards like ATI's to deliver the best gaming performance. Its R&D was better than ATI's, without question.


What did you think of the history of ATI? Share your impressions!
