GHz: what it is and what a gigahertz means in computing
Table of contents:
- What is a GHz or Gigahertz
- GHz in computing
- The CPU only understands electrical signals
- The evolution of GHz
- The IPC of a processor
- Conclusion and more interesting links
If you are entering the world of computing and looking at processors to buy, you will have come across GHz, Gigahertz or Gigahercio many times. They all mean exactly the same thing, and no, it is not a food seasoning: it is a measure used very often in computing and engineering.
So the least we can do at this point is explain what this measure measures and why it is used so much today. Perhaps after this, you will be clearer about many things that you encounter every day in the world of electronics.
What is a GHz or Gigahertz
GHz is the abbreviation for Gigahertz (Gigahercio in Spanish), a unit used in electronics. It is not really a base unit but a multiple of the Hertz: specifically, 10⁹ Hertz, that is, one billion Hertz.
So what we really have to define is the Hertz, the base unit from which the Kilohertz (kHz), Megahertz (MHz) and Gigahertz (GHz) derive. The unit is named after Heinrich Rudolf Hertz, a German physicist who demonstrated how electromagnetic waves propagate through space. So this measurement really comes from the world of waves, not purely from computing.
A Hertz represents one cycle per second; in fact, the unit was commonly called "cycles per second" before the name Hertz became standard. In case you do not know, a cycle is simply the repetition of an event per unit of time, which in this case is the movement of a wave. A Hertz therefore measures the number of times a wave, whether sound or electromagnetic, repeats in time. This also extends to the vibrations of solids or even to sea waves.
If we blow across a sheet of paper parallel to its surface, we will notice that it begins to undulate, repeating the pattern every so often: in seconds, or in thousandths of a second if we blow hard. The same happens with waves. At this point we call the magnitude frequency (f), and it is the inverse of the period, which is measured in seconds (s). Putting it all together, we can define the Hertz as the frequency of oscillation of a particle (of a wave, paper, water) over a period of one second.
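The relationship above, frequency as the inverse of the period, can be checked with a couple of lines. A minimal sketch (the helper name is my own, for illustration):

```python
# Frequency (f) is the inverse of the period (T), which is measured in seconds.
def frequency_hz(period_s: float) -> float:
    """Return the frequency in Hertz of a wave with the given period."""
    return 1.0 / period_s

# A wave that repeats every 0.2 seconds oscillates 5 times per second:
print(frequency_hz(0.2))  # → 5.0
```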
Picture the shape of a wave and how it repeats over a period. In the first case we have a measurement of 1 Hz, because in one second the wave has completed only one oscillation. In the second, it has oscillated 5 complete times in a single second. Imagine, then, how much 5 GHz would be.
| Name | Symbol | Value (Hz) |
|---|---|---|
| Microhertz | µHz | 0.000001 |
| Millihertz | mHz | 0.001 |
| … | … | … |
| Hertz | Hz | 1 |
| Decahertz | daHz | 10 |
| Hectohertz | hHz | 100 |
| Kilohertz | kHz | 1,000 |
| Megahertz | MHz | 1,000,000 |
| Gigahertz | GHz | 1,000,000,000 |
| … | … | … |
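The prefixes in the table are plain powers of ten, so converting between them is just multiplication. A small sketch (the table and helper below are my own, for illustration):

```python
# SI multiples of the Hertz, as in the table above.
UNITS = {
    "Hz": 1,
    "kHz": 1_000,
    "MHz": 1_000_000,
    "GHz": 1_000_000_000,
}

def to_hz(value: float, unit: str) -> float:
    """Convert a value expressed in the given unit to plain Hertz."""
    return value * UNITS[unit]

print(to_hz(5, "GHz"))  # → 5000000000
```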
GHz in computing
Now that we know what a Hertz really is and where it comes from, it's time to apply it to computing.
The Hertz measures the frequency of an electronic chip; for us, the best known is the processor. Transferring the definition to it, one Hertz corresponds to one clock cycle per second, and thus roughly to the number of basic operations a processor can carry out in a period of one second. This is how the speed of a processor is measured.
The processor of a computer (and other electronic components) is a device that is responsible for carrying out certain operations that are sent from the main memory in the form of instructions that are generated by the programs. Then each program is subdivided into tasks or processes, and in turn into instructions, which will be executed one by one by the processor.
The more hertz a processor has, the more operations or instructions it can carry out in a second. Colloquially, this frequency is also called "clock speed", since the entire system is synchronized by a clock signal so that each cycle lasts the same time and information transfers stay in step.
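Seen the other way around, the higher the clock frequency, the less time each cycle lasts. A quick sketch of the arithmetic:

```python
# At f GHz, one clock cycle lasts 1/f nanoseconds,
# since 1 GHz = 10^9 cycles per second and 1 ns = 10^-9 seconds.
def cycle_time_ns(frequency_ghz: float) -> float:
    """Duration of one clock cycle, in nanoseconds."""
    return 1.0 / frequency_ghz

print(cycle_time_ns(5.0))  # a 5 GHz clock ticks every 0.2 ns
```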
The CPU only understands electrical signals
As you will understand, an electronic component only understands voltages and currents, signal / no signal, so all instructions must be translated into zeros and ones. Each zero or one is a bit, representing the absence or presence of a voltage signal, and current processors are capable of working with strings of up to 64 bits at a time.
The CPU only receives a succession of signals, which it interprets with its structure of internal logic gates, themselves built from transistors that either pass or block electrical signals. In this way it is possible to give a "comprehensible meaning" to the human being, in the form of mathematical and logical operations: addition, subtraction, multiplication, division, AND, OR, NOT, NOR, XOR. These and a few more are the operations the CPU performs, and which we see on our PC in the form of games, programs, images, etc. Curious, right?
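Those same logical operations surface directly in programming languages as bitwise operators. A small sketch (the 4-bit values are arbitrary examples):

```python
# Two 4-bit strings, written in binary notation.
a, b = 0b1100, 0b1010

print(bin(a & b))        # AND → 0b1000
print(bin(a | b))        # OR  → 0b1110
print(bin(a ^ b))        # XOR → 0b110
print(bin(~a & 0b1111))  # NOT, masked to 4 bits → 0b11
```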
The evolution of GHz
We haven't always had Gigahertz in the soup, in fact, almost 50 years ago, engineers just dreamed of ever naming the frequency of their processors this way.
The beginning was not bad either: the first microprocessor implemented on a single chip was the Intel 4004, a small cockroach-shaped chip launched in 1971 that revolutionized the market after those huge vacuum-tube computers that did not even have RGB lighting. Exactly, there was a time when RGB did not exist, imagine. This chip was capable of processing 4-bit strings at a frequency of 740 kHz, not bad, by the way.
Seven years later, in 1978, and after a few intermediate models, the Intel 8086 arrived: a processor of no less than 16 bits that ran at 5 to 10 MHz, and still shaped like a cockroach. It was the first processor to implement the x86 architecture, which we still have on our processors today, incredible. This architecture was so good at handling instructions that it marked a before and after in computing. Other architectures exist, like IBM's POWER9 for servers, but the vast majority of personal computers continue to use x86.
The DEC Alpha, a RISC design, is often credited with breaking the 1 GHz barrier first in laboratory prototypes; in the consumer market, it was in March 2000 that AMD's Athlon and Intel's Pentium III both reached that frequency.
The IPC of a processor
In the current era we have processors capable of reaching up to 5 GHz (5,000,000,000 cycles per second), and on top of that they have not one but up to 32 cores on a single chip. Each core can perform one or more operations per cycle, so the capacity multiplies.
The number of instructions completed per cycle is called the IPC (instructions per cycle, not to be confused with the consumer price index; its inverse, cycles per instruction, is the CPI). IPC is an indicator of a processor's performance, and measuring it is currently very fashionable, since it largely determines how good a processor is.
Let me explain: two basic parameters of a CPU are its core count and its frequency, but more cores do not always mean more performance. With a low enough IPC, a 6-core CPU can be less powerful than a 4-core CPU.
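That comparison can be sketched with a back-of-the-envelope throughput formula. The core counts, frequencies and IPC figures below are made up purely for illustration:

```python
# Rough throughput estimate: instructions per second ≈ cores × frequency × IPC.
# These are illustrative numbers, not real benchmark results.
def throughput(cores: int, freq_ghz: float, ipc: float) -> float:
    return cores * freq_ghz * 1e9 * ipc

six_core  = throughput(cores=6, freq_ghz=3.0, ipc=0.8)  # 14.4e9 instr/s
four_core = throughput(cores=4, freq_ghz=3.5, ipc=1.5)  # 21.0e9 instr/s

print(four_core > six_core)  # True: fewer cores, but higher IPC wins here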
The instructions of a program are divided into stages and fed into the processor so that, ideally, one complete instruction finishes in each clock cycle: that would be IPC = 1. In each cycle, one complete instruction would enter and another would leave. But not everything is so ideal, since the achievable IPC depends largely on how the program is built and the type of operations performed. Adding is not the same as multiplying, nor is a program with multiple threads the same as one with only a single thread.
There are benchmark programs that measure the IPC of a processor under conditions that are as comparable as possible. They obtain an average value by measuring the time it takes the processor to run a fixed workload.
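As a toy illustration of that timing approach (real tools count retired instructions with hardware performance counters and use carefully chosen workloads; this is only a sketch):

```python
import time

def ops_per_second(n: int = 10_000_000) -> float:
    """Time a fixed number of simple operations and estimate throughput."""
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i  # one simple operation per iteration
    elapsed = time.perf_counter() - start
    return n / elapsed

# Dividing this estimate by the clock frequency would give a (very rough)
# IPC figure for this particular workload.
print(f"{ops_per_second():.2e} ops/s")
```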
Conclusion and more interesting links
This topic of Hertz, and how the speed of a processor is measured, really is very interesting. It offers plenty more to talk about, but we cannot make an article as long as a novel either.
At least we hope that the meaning of Hertz, frequency, cycles per second and IPC has been well explained. Now we leave you with some interesting articles related to the topic.
If you have any questions about the topic, or want to point something out, leave us a comment in the box.
- NVIDIA announces DGX and HGX computing stations based on Volta GPUs: the DGX-1 and HGX-1 are two new computing machines built on NVIDIA Volta graphics cards, using Tesla V100 cards.
- Amazon technology offers, March 19: discounts on computing and storage, covering products such as laptops, external hard drives and keyboards.
- Black Friday on Amazon, November 25 and 26: computing and electronics offers for the weekend, including printers, monitors, Razer peripherals and Crucial SSDs.