The graphics card, also called a video card, display adapter, or simply GPU (a name inherited from its graphics processing unit), is an expansion card or integrated circuit responsible for processing the data sent to it by the computer's processor and transforming it into visual information the user can understand, shown on the output device: the monitor.
Currently, we can find two types of graphics cards: integrated and dedicated.
Integrated Graphics
In the past, many motherboards integrated a GPU among their circuits. Today this is no longer done; it has given way to integrated graphics that live inside the processor itself (motherboards, however, continue to provide the video outputs). These integrated graphics logically have relatively limited power, and they also need to reserve part of the system RAM for themselves. Currently, both AMD and Intel include integrated graphics in their processors. The main reason is that graphics processors are very good at parallel processing, and placing the GPU alongside the CPU allows the integrated graphics to be used for many operations beyond graphics rendering itself (see heterogeneous system architecture).
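To make the parallel-processing point concrete, here is a minimal sketch (not from the original article) of how work can be offloaded to a GPU with CUDA: one thread is launched per array element instead of looping on the CPU. The kernel name, array size, and use of managed memory are illustrative choices and assume the CUDA toolkit is installed.

```cpp
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread handles exactly one element, so the whole
// array is processed in parallel instead of in a CPU loop.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                 // 1M elements (arbitrary example size)
    size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    // Managed (unified) memory keeps the example short; on an
    // integrated GPU these buffers live in the same system RAM the CPU uses.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);           // expect 3.0

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

On an integrated GPU the managed buffers stay in the shared system RAM mentioned above; on a dedicated card the runtime migrates them to the card's own graphics memory.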
Dedicated Graphics
Dedicated graphics cards are the ones we all know: video cards with their own GPU, graphics memory, video outputs, and cooling system, which connect to the motherboard through a PCI-Express slot. They provide a much higher level of performance than integrated graphics, but they are also much more expensive, although they make it much easier to upgrade or replace the component.
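As a rough way to see this difference from software, the sketch below queries the CUDA runtime's device properties, which expose an integrated flag and the total amount of on-board memory. The output formatting is only an example, and the program naturally lists only NVIDIA GPUs visible to CUDA.

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);

    for (int d = 0; d < count; ++d) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);

        // prop.integrated is nonzero when the GPU shares system RAM with
        // the CPU; a dedicated card reports its own global (graphics) memory.
        printf("Device %d: %s\n", d, prop.name);
        printf("  Type          : %s\n",
               prop.integrated ? "integrated (shares system RAM)"
                               : "dedicated (own graphics memory)");
        printf("  Global memory : %.1f GiB\n",
               prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}
```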
Video Outputs
Video outputs are the connection ports used to connect the graphics card to the monitor. There are many different types of video output, but only the following are currently in use:
- VGA (practically deprecated): it stands for Video Graphics Array and was the analog video standard of the 90s. It was designed for CRT monitors and suffers from a great deal of electrical noise and distortion in the analog-to-digital conversion, which is why VGA cables usually include a line filter in the cable itself. Its connector is called D-sub and has 15 pins.
- DVI: stands for Digital Visual Interface and replaced the previous one. It is entirely digital, so there is no conversion, which eliminates much of the electrical noise and distortion. It offers higher quality and higher possible resolutions.
- HDMI: stands for High-Definition Multimedia Interface and is currently the most widely used. It carries uncompressed digital video with content encryption and can transmit audio alongside the video, offering even higher resolutions.
- DisplayPort: it is currently HDMI's “rival”. It is a technology from VESA that also transmits audio and video, but at a higher resolution and refresh rate than HDMI (see the rough bandwidth sketch after this list). It has the advantage of being royalty-free, which makes it easier for its use to spread (currently almost all graphics cards include DisplayPort). There is also a smaller version called Mini DisplayPort with the same characteristics, which allows many more ports to be included on the same graphics card.
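To give a feel for why the newer digital outputs were needed, here is a small back-of-the-envelope sketch (plain C++ host code, no GPU required) that computes the raw, uncompressed pixel data rate for a few common display modes. The modes are illustrative examples rather than figures from the article, and real links also need extra bandwidth for blanking intervals and encoding overhead.

```cpp
#include <cstdio>

// Raw (uncompressed) pixel data rate: width * height * refresh rate * bits per pixel.
// Real cables also carry blanking intervals and link-encoding overhead,
// so the true required bandwidth is noticeably higher than this estimate.
static double rawGbps(int width, int height, int hz, int bitsPerPixel) {
    return (double)width * height * hz * bitsPerPixel / 1e9;
}

int main() {
    struct Mode { const char *label; int w, h, hz, bpp; };
    // Example modes only; they are not taken from the article.
    const Mode modes[] = {
        {"VGA-era 640x480 @ 60 Hz, 24-bit",   640,  480,  60, 24},
        {"Full HD 1920x1080 @ 60 Hz, 24-bit", 1920, 1080, 60, 24},
        {"4K 3840x2160 @ 60 Hz, 24-bit",      3840, 2160, 60, 24},
        {"4K 3840x2160 @ 120 Hz, 30-bit",     3840, 2160, 120, 30},
    };

    for (const Mode &m : modes) {
        printf("%-36s ~%5.2f Gbit/s of raw pixel data\n",
               m.label, rawGbps(m.w, m.h, m.hz, m.bpp));
    }
    return 0;
}
```

For reference, HDMI 2.0 offers roughly 18 Gbit/s of link bandwidth and DisplayPort 1.4 roughly 32.4 Gbit/s, which is why DisplayPort can drive higher resolutions and refresh rates.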
A Little History, Why Were Graphics Cards Invented?
The history of graphics cards begins in the late 1960s, when printers stopped being the primary display element and monitors came into play. The first graphics cards could only display 40 x 25 monochrome characters, until the first dedicated graphics chips, such as the Motorola 6845, appeared.
Later, the first video game consoles began to appear, and the PC (Personal Computer, the home computer) boomed, which significantly reduced production costs since graphics cards were now made in bulk.
In the beginning, the graphics cards on sale were 80-column cards, which added a text mode of up to 80 x 25 characters (not pixels), mainly for CP/M software. Then came the famous IBM PCs, which popularized the “interchangeable” design of graphics cards.
The most widespread was the MDA (Monochrome Display Adapter), which IBM created in 1981. This card had 4 KB of video memory and could display up to 25 lines of text of 80 characters each on the monitor.
IBM MDA
From there, a cycle of competition began in the PC market, with names that will sound familiar: Commodore, the Amiga 2000, and the Apple Macintosh. These machines used proprietary technology for the graphics card, integrating the GPU into the motherboard. This situation lasted until the appearance of another name that will also sound familiar: the PCI port.
PCI removed the bottleneck of the previous interface (ISA), and the first 3D adapters (such as the S3 ViRGE) began to be developed. Mass manufacturing of graphics cards for PCI slots began. From there, the industry evolved into what we know today, with the following highlights:
In 1995, the first 2D/3D graphics cards appeared, manufactured by Matrox, ATI, S3, and Creative, among others. They complied with the SVGA standard but added 3D functions.
In 1997, 3DFX launched what is possibly the most famous graphics chip of all time: the Voodoo. It had vast computing power for the time and added various 3D effects such as mip mapping, Z-buffering, and antialiasing. Several well-known models followed, such as 3DFX's Voodoo2 and NVIDIA's Riva TNT and, later, its Riva TNT2.
Such was the power of these graphics cards that the PCI port fell short, so Intel developed the AGP (Accelerated Graphics Port), which solved the bottlenecks that were already appearing back then.
Between 1999 and 2002, NVIDIA took control of the graphics card market (among other things, it bought 3DFX) with its GeForce family. In that period, there were many graphics improvements in the 3D arena. The top graphics cards of the time had up to 128 MB of DDR memory.
Most consoles of that era used graphics chips based on 3D accelerator cards, and Apple Macintosh computers used NVIDIA and ATI chips.
In 2006, AMD bought ATI and became a direct rival of NVIDIA in the graphics card market. Since then, the two have shared the lead with their respective GeForce and Radeon families, and they continue to this day.
The Best Graphics Cards
Every year, or at most every two years, new, more powerful graphics cards and new technologies (such as the recent ray tracing for video games) come to market, offering a much more realistic gaming experience.
Today, the most powerful gaming graphics cards we can find come from NVIDIA, among which the 10 series (Pascal) and 20 series (Turing) models stand out. Within NVIDIA's 10 series we can find the GTX 1060 (although this model is considered more of a mid-range card), the GTX 1070, GTX 1070 Ti, GTX 1080, and GTX 1080 Ti. Within the 20 series, also known as RTX, we can opt for the RTX 2060 (more powerful than the GTX 1070), the RTX 2070, RTX 2080, and the most advanced and powerful model of all, the RTX 2080 Ti.