byte definition

The byte is a unit of digital information, usually made up of eight bits (binary digits: zeros and ones).

Depending on how those eight bits are combined, the resulting byte represents a particular character of text. In other words, the byte is the unit that most computers today use to represent a character: a letter, a digit, or another symbol.

Strictly speaking, the size of the byte depends on the computer's hardware, but historically the most popular size has been 8 bits (also called an octet). These 8 bits allow 256 combinations and can therefore represent 256 different characters (see below for why there are 256 combinations).

The byte is the smallest addressable unit of memory in most computer architectures. In other words, for typical computers it is the smallest meaningful unit of data.

Multiples of the byte form the kilobyte, megabyte, gigabyte, terabyte, and so on.

Why do 8 bits allow 256 combinations?

A binary bit can be 0 or 1. If a byte is made up of 8 bits, we can form 2^8 = 256 combinations, for example (see the sketch after the list):

1) 00000000
2) 00000001
3) 00000010
4) 00000011
5) 00000100
…
255) 11111110
256) 11111111
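
This enumeration is easy to reproduce in code. Here is a minimal Python sketch (the variable name is illustrative) that generates all 256 patterns:

```python
# Generate every possible 8-bit pattern: 2^8 = 256 combinations.
patterns = [format(n, "08b") for n in range(2 ** 8)]

print(patterns[0])    # 00000000
print(patterns[1])    # 00000001
print(patterns[254])  # 11111110
print(patterns[255])  # 11111111
print(len(patterns))  # 256
```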

Each combination is associated with a character (letters, numbers, symbols…), but 256 are not enough to represent the characters of every language or every use. Character encoding standards were therefore created, such as UTF-16, which uses two bytes (that is, 16 bits) to represent characters from multiple languages. In that case, 16 bits allow 2^16 = 65,536 characters to be represented.
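
To see the difference between a one-byte and a two-byte encoding in practice, here is a minimal Python sketch (the sample characters are arbitrary):

```python
# Number of bytes each character occupies in UTF-8 vs UTF-16.
# Note: UTF-16 actually uses four bytes for characters outside the
# Basic Multilingual Plane (surrogate pairs); two bytes is the common case.
for char in ("A", "ñ", "中"):
    utf8_len = len(char.encode("utf-8"))
    utf16_len = len(char.encode("utf-16-be"))
    print(char, utf8_len, utf16_len)

# Output:
# A 1 2
# ñ 2 2
# 中 3 2

print(2 ** 16)  # 65536 characters addressable with 16 bits
```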

The byte as a unit

Usually the byte is abbreviated with an uppercase «B», while the bit uses a lowercase «b».

The byte and its multiples are used as data storage units.

Historically, the byte has been combined with the prefixes kilo, mega, giga, etc. to form larger units. The classical units in computing are the following:

1024 bytes (B) = 1 kilobyte (KB)

1,048,576 bytes = 1024 kilobytes (KB) = 1 megabyte (MB)

1,073,741,824 bytes = 1024 megabytes (MB) = 1 gigabyte (GB)

1,099,511,627,776 bytes = 1024 gigabytes (GB) = 1 terabyte (TB)

The problem is that the prefix kilo means 1000, mega means 1,000,000, and giga means 1,000,000,000, whereas the byte units above are based on powers of 2 (2^n, where n is 10, 20, or 30 for KB, MB, and GB respectively), giving 1024, 1,048,576, 1,073,741,824, and so on.

It is true that the numbers are quite close to each other, which is why those prefixes were chosen to name the byte units, but it can still lead to confusion.
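
The gap between the two conventions grows with each prefix, as this minimal Python sketch shows:

```python
# Binary (1024-based) versus metric (1000-based) meaning of each prefix.
for n, prefix in ((10, "kilo"), (20, "mega"), (30, "giga"), (40, "tera")):
    binary = 2 ** n
    metric = 10 ** (3 * n // 10)
    print(f"1 {prefix}byte: {binary:,} vs {metric:,} "
          f"({binary / metric - 1:.1%} larger)")

# 1 kilobyte: 1,024 vs 1,000 (2.4% larger)
# 1 megabyte: 1,048,576 vs 1,000,000 (4.9% larger)
# 1 gigabyte: 1,073,741,824 vs 1,000,000,000 (7.4% larger)
# 1 terabyte: 1,099,511,627,776 vs 1,000,000,000,000 (10.0% larger)
```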

I suggest reading our article Byte Unit Conversions to better understand the topic with examples.

Summary of traditional storage units and units according to the international system.

In some computers, four bytes make up a word, the unit that a processor can handle most efficiently as it reads and processes each instruction. Other processors can handle two-byte and even one-byte instructions.
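
You can query the native word size of the machine a script is running on; here is a minimal Python sketch, assuming the pointer size reflects the native word size:

```python
import ctypes

# Size of a native pointer, a common proxy for the machine word size.
word_bytes = ctypes.sizeof(ctypes.c_void_p)
print(word_bytes)      # typically 8 on a 64-bit system, 4 on 32-bit
print(word_bytes * 8)  # the same size expressed in bits
```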

For more information read: word.

byte units

byte
kilobyte
megabyte
gigabyte
terabyte
petabyte
exabyte
zettabyte
yottabyte
brontobyte

byte history

The term «byte» was coined by computer scientist and IBM employee Werner Buchholz in July 1956, during the development of the IBM Stretch computer. It is a deliberate respelling of the word «bite», chosen to avoid confusion with the word bit.

However, in 1960, IBM's UK Education Department taught that bit came from «Binary digIT» and byte from «BinarY TuplE».

Early computers used a variety of 4-bit codes (binary-coded decimal, BCD) and 6-bit codes (such as Fieldata, used by the US military).

In 1963 these were expanded to the 7-bit ASCII (American Standard Code for Information Interchange), which became the standard and replaced the codes used in teletypes by the US government and universities, many of which were incompatible with one another.

ASCII included lowercase and uppercase English letters, digits, certain symbols, and control characters that facilitated the transmission of written information, as well as functions for printing and for flow control in data transmission.

In the early 1960s, IBM also introduced the 8-bit EBCDIC standard for its IBM System/360 family of computers, an expansion of the 6-bit BCDIC code.

The widespread success of the IBM System/360 established 8 bits as the de facto standard size of the byte.

In the 1970s, 8-bit microprocessors became popular, which consolidated this byte size even further.

The Intel 8008 microprocessor (the direct predecessor of the popular 8080 and 8086), used in early microcomputers, could also perform some operations on 4-bit halves of a byte (a 4-bit quantity is usually called a nibble).

The term «octet» is used as a synonym for an 8-bit byte.

The byte to measure the size of a file

A bit is the storage space needed to hold one binary digit: a 0 or a 1. A nibble is four bits, and a byte is eight bits. One byte is enough storage to hold one of the characters normally found on a keyboard (plus a few special characters, according to the ASCII character encoding scheme).
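
Because a byte is just eight bits, its two nibbles can be picked apart with bitwise operators. A minimal Python sketch:

```python
# One byte = 8 bits = 2 nibbles of 4 bits each.
byte = 0b01000001            # 65, the ASCII code for "A"

high_nibble = byte >> 4      # top 4 bits:    0b0100 -> 4
low_nibble = byte & 0x0F     # bottom 4 bits: 0b0001 -> 1

print(chr(byte))                # A
print(high_nibble, low_nibble)  # 4 1
```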

We can measure the size of files or hard drives by the number of bytes they store. However, those numbers quickly become unwieldy because they are so large. Therefore, taking the metric system as a reference, larger units were created by attaching prefixes to the word «byte»: kilobyte, megabyte, gigabyte, and so on. A few years ago we described hard drive space in megabytes; today gigabytes is the most common term, and terabytes are beginning to appear. But how much storage does each of them describe? This is where it gets interesting, because there is more than one accepted definition for each term.

In the metric system, moving up the unit scale is straightforward: a kilometer is 1000 meters, a megameter is 1000 kilometers, a gigameter is 1000 megameters, and so on. If you are a hard drive manufacturer, this is the standard to follow, so a megabyte turns out to be 1,000,000 bytes.

However, when the term megabyte is used for real and virtual storage, operating systems and other programs often follow the more natural binary convention, in which a megabyte is 2^20 = 1,048,576 bytes. This means that when you buy an 80-gigabyte hard drive you get a total of 80,000,000,000 bytes of storage. But since MS Windows uses the 1,048,576-byte rule, the drive properties in Windows will report an 80-gigabyte drive as roughly 74.5 gigabytes, and a 250-gigabyte drive as only about 232 gigabytes of available storage space.

As one moves from the number of bytes in a kilobyte, to the number in a megabyte, and on to the gigabyte, terabyte, petabyte, and beyond, we can use a multiplier of 1,000 or a multiplier of 1,024. Either option can be considered «correct»; traditionally, the standard adopted depends on the type of storage in question.
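
The «missing» capacity is just the two multipliers at work; here is a minimal Python sketch of the 80-gigabyte example above:

```python
# Manufacturer: 1 GB = 10^9 bytes. Windows: 1 GB = 2^30 bytes.
advertised_gb = 80
total_bytes = advertised_gb * 10 ** 9  # 80,000,000,000 bytes on the label

reported_gb = total_bytes / 2 ** 30    # what the OS reports
print(f"{reported_gb:.1f} GB")         # 74.5 GB
```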

The byte is used to measure the size of a file.

Read the full article: file size.

How much information does the byte represent?

Here are some rough examples of how much information can be stored in different numbers of bytes. Some examples are taken from Wikipedia and elsewhere on the internet.

1 byte: Can store a character (a letter, a number…) in ASCII.

10 bytes: Can store one or two words of English or Spanish (10 ASCII characters).

100 bytes: One or two sentences of a text.

1000 bytes (1 KB): Half a page of a text-only document.

10,000 bytes: A page from a novel.

100,000 bytes: A small compressed digital photograph.

1,000,000 bytes (1 MB): One minute of compressed MP3 music or a complete novel.

10,000,000 bytes: Two copies of the complete works of Shakespeare.

100,000,000 bytes (100 MB): A 1 meter shelf of books.

1,000,000,000 bytes (1 GB): A truckload of pages of text.

1,000,000,000,000 bytes (1 TB): 2,000 hours of CD-quality audio.

10,000,000,000,000 bytes: The print collection of the US Library of Congress.

1,000,000,000,000,000 bytes (1 PB): Two thousand years of average quality MP3 music.

1,000,000,000,000,000,000 bytes (1 EB): The monthly traffic of the entire Internet in 2004 (by 2010 it was estimated to be 21 times higher, with growth of up to 50% per year).

1,000,000,000,000,000,000,000 bytes (1 ZB): By 2013, the entire WWW was estimated at four times this capacity (4 ZB).

1,000,000,000,000,000,000,000,000 bytes (1 YB).
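
Converting a raw byte count into the most readable unit is a common task; here is a minimal Python sketch using the traditional 1,024 multiplier (the function name is illustrative):

```python
def format_size(num_bytes: int) -> str:
    """Render a byte count using the traditional 1,024-based units."""
    units = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]
    size = float(num_bytes)
    for unit in units:
        if size < 1024 or unit == units[-1]:
            return f"{size:.1f} {unit}"
        size /= 1024

print(format_size(1_500_000))     # 1.4 MB
print(format_size(80 * 10 ** 9))  # 74.5 GB
```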

Name        Symbol   Binary   Number of bytes                      Equivalent
kilobyte    KB       2^10     1,024                                1,024 B
megabyte    MB       2^20     1,048,576                            1,024 KB
gigabyte    GB       2^30     1,073,741,824                        1,024 MB
terabyte    TB       2^40     1,099,511,627,776                    1,024 GB
petabyte    PB       2^50     1,125,899,906,842,624                1,024 TB
exabyte     EB       2^60     1,152,921,504,606,846,976            1,024 PB
zettabyte   ZB       2^70     1,180,591,620,717,411,303,424        1,024 EB
yottabyte   YB       2^80     1,208,925,819,614,629,174,706,176    1,024 ZB

To understand the difference with kibibyte, mebibyte, gibibyte… read: Units of bytes.

Related:

Nibble, which is equal to half a byte.

digital information

DBCS, a character set whose characters need two bytes each.

Related article:

Byte unit conversions.

Quote the definition:
Alegsa.com.ar (2019). byte definition – ALEGSA 2019-05-06 url: https:///Dec/byte.php
