Gigabyte Definition

A gigabyte is equal to approximately one billion bytes or a thousand megabytes.

A byte (represented by the upper-case letter B) is a contiguous sequence of eight bits that is used as a unit of memory, storage and instruction execution in computers. A bit (represented by a lower-case b) is the most basic unit of information in computing and communications. Every bit has a value of either zero or one. A megabyte, abbreviated MB or M, is equal to approximately one million bytes, or exactly 1,048,576 (i.e., two to the 20th power) bytes.

A gigabyte is equal to exactly 1,073,741,824 (i.e., two to the 30th power) bytes and to 1,024 megabytes. However, the differences from the approximate values are small and often unimportant.
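The binary unit arithmetic above can be checked with a short Python snippet (a minimal illustration, not part of the original article):

```python
# Binary (power-of-two) unit definitions, as used in this article
BITS_PER_BYTE = 8
KILOBYTE = 2 ** 10          # 1,024 bytes
MEGABYTE = 2 ** 20          # 1,048,576 bytes
GIGABYTE = 2 ** 30          # 1,073,741,824 bytes

# One gigabyte is exactly 1,024 megabytes
assert GIGABYTE == 1024 * MEGABYTE

# and roughly one billion bytes
print(GIGABYTE)             # 1073741824
```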

Gigabytes are commonly abbreviated GB or G in writing (not to be confused with Gb, which is used for gigabits), and in English speech they are often called gigs.
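Because a byte is eight bits, converting between gigabytes (GB) and gigabits (Gb) is a simple factor of eight. A hypothetical helper function (the name is illustrative, not from the article):

```python
def gigabytes_to_gigabits(gigabytes: float) -> float:
    """Convert gigabytes (GB) to gigabits (Gb); one byte is eight bits."""
    return gigabytes * 8

# A 4 GB file amounts to 32 Gb when expressed in bits
print(gigabytes_to_gigabits(4))  # 32
```

The same factor applies in reverse: dividing a figure in gigabits by eight yields gigabytes, which is why network speeds quoted in Gb per second look eight times larger than the equivalent GB figure.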

The capacity of hard disk drives (HDDs), CDs and DVDs is today virtually always measured in gigabytes. One gigabyte is sufficient to store the text of roughly one thousand average novels, 800 typical digital photos, 300 album tracks in MP3 format or an hour of television programming. USB flash drives are likewise measured in megabytes or gigabytes, depending on their capacity.
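The capacity estimates above can be reproduced from rough average file sizes. The per-item sizes below are illustrative assumptions chosen to match the article's round figures, not data from the article itself:

```python
MEGABYTE = 2 ** 20          # 1,048,576 bytes
GIGABYTE = 2 ** 30          # 1,073,741,824 bytes

# Assumed average item sizes (illustrative only)
novel_text = 1 * MEGABYTE     # ~1 MB of plain text per average novel
photo      = 1.3 * MEGABYTE   # ~1.3 MB per typical digital photo
mp3_track  = 3.5 * MEGABYTE   # ~3.5 MB per album track in MP3 format

print(GIGABYTE // novel_text)     # 1024 (~one thousand novels)
print(int(GIGABYTE / photo))      # 787  (~800 photos)
print(int(GIGABYTE / mp3_track))  # 292  (~300 MP3 tracks)
```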

As of 2005, even low-priced personal computers come with 80 GB HDDs, and commercial HDDs with capacities in excess of 400 GB are available at modest cost. 80 GB is far more space than most consumer applications require, but it is probably insufficient for recording television programs or for professional video editing.

The prefix giga comes from the Greek word gigas, which means giant.

Created August 11, 2005.
Copyright © 2005 The Linux Information Project. All Rights Reserved.