The gigahertz, abbreviated GHz, is a unit of alternating current (AC) or electromagnetic (EM) wave frequency equal to one billion hertz (1,000,000,000 Hz). The gigahertz is used to express the frequency of ultra-high-frequency (UHF) and microwave EM signals and, in computers, microprocessor clock speed.
An EM signal with a frequency of 1 GHz has a wavelength of 300 millimeters, a little less than a foot. An EM signal of 100 GHz has a wavelength of 3 millimeters, roughly 1/8 of an inch. Some radio transmissions are made at frequencies up to hundreds of gigahertz. Personal computer clock speeds have climbed steadily as the technology advances; they reached the 1 GHz mark in March 2000 with a processor from AMD (the Athlon), closely followed by a 1 GHz Pentium III from Intel.
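Those wavelength figures come straight from lambda = c / f, where c is the speed of light (about 300,000,000 meters per second). A quick sketch in Python, where wavelength_mm is just a name I made up for illustration:

# Wavelength of an EM wave from its frequency: lambda = c / f.
C = 3.0e8  # speed of light in meters per second (approximate)

def wavelength_mm(freq_hz):
    """Return the wavelength in millimeters for a frequency in hertz."""
    return C / freq_hz * 1000.0  # meters -> millimeters

print(wavelength_mm(1e9))    # 1 GHz   -> 300.0 mm, a little under a foot
print(wavelength_mm(100e9))  # 100 GHz -> 3.0 mm, roughly 1/8 inch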
Other commonly used units of frequency are the kilohertz (kHz), equal to 1,000 Hz or 0.000001 GHz, and the megahertz (MHz), equal to 1,000,000 Hz or 0.001 GHz.
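Since all of these units are just powers of ten apart, converting between them is only a matter of moving the decimal point. A minimal sketch in Python (the UNITS table is my own naming, not anything standard):

# Frequency units expressed as multiples of one hertz.
UNITS = {"Hz": 1.0, "kHz": 1e3, "MHz": 1e6, "GHz": 1e9}

def convert(value, from_unit, to_unit):
    """Convert a frequency between any two of the units above."""
    return value * UNITS[from_unit] / UNITS[to_unit]

print(convert(1, "kHz", "GHz"))     # 1e-06, i.e. 0.000001 GHz
print(convert(2800, "MHz", "GHz"))  # 2.8 -- a 2800 MHz chip is a 2.8 GHz chip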
lol does that explain it?
Or, more to the point: generally speaking, the higher the GHz, the faster the clock speed, but that does not mean the processor is faster overall. An example would be a 64-bit AMD chip (an Athlon 64, say) running at 2.0 GHz compared to an Intel Celeron running at 2.8 GHz... the AMD would stomp the Celeron, as the sketch below shows.
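The reason a lower-clocked chip can win is that the useful work a processor does per second is roughly its clock speed times the instructions it completes per clock cycle (IPC), and chips differ a lot in IPC. A back-of-the-envelope sketch in Python, where the IPC numbers are invented purely for illustration, not measured values:

# Rough throughput model: instructions per second = clock (Hz) * IPC.
# The IPC figures below are hypothetical, chosen only to illustrate the point.
amd_athlon64  = 2.0e9 * 1.5  # 2.0 GHz at a hypothetical 1.5 IPC -> 3.0e9 instr/s
intel_celeron = 2.8e9 * 0.9  # 2.8 GHz at a hypothetical 0.9 IPC -> ~2.5e9 instr/s

print(amd_athlon64 > intel_celeron)  # True: the slower-clocked chip does more work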