A byte is a unit of data made up of a contiguous sequence of bits. Early on, the term byte referred to groups of bits addressed by 4-bit instruction fields, which allowed between one and sixteen bits per byte; later, the production design reduced this to 3-bit fields, allowing between one and eight bits per byte. Eventually the size of the byte was fixed at eight bits and declared a standard.
The byte has several decimal multiples: the kilobyte (1,000 bytes), the megabyte (1,000,000 bytes), the gigabyte (1,000,000,000 bytes), and the terabyte (1,000,000,000,000 bytes).
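The multiples above can be computed as successive powers of 1,000. A minimal sketch in Python (the prefix list is just for illustration):

```python
# Decimal (SI) multiples of the byte: each step is a factor of 1,000.
SI_MULTIPLES = ["kilobyte", "megabyte", "gigabyte", "terabyte"]

for exponent, name in enumerate(SI_MULTIPLES, start=1):
    size_in_bytes = 1000 ** exponent
    print(f"1 {name} = {size_in_bytes:,} bytes")
```

The last line printed confirms that one terabyte is 1,000,000,000,000 bytes.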
The terabyte is a unit of information storage whose symbol is TB and which equals 10^12 bytes. The prefix tera comes from the Greek word for monster or beast.
In the early days of computing, these units were treated as multiples of 1,024 because computers work in binary, but naming the quantities caused confusion, since the prefixes of the International System of Units are decimal. To resolve this ambiguity between decimal and binary prefixes, the IEC in 1998 defined new prefixes by combining the International System prefixes with the word binary; the word terabyte was thus reserved for the amount of 10^12 bytes.
For quantities in base two, by contrast, it would be incorrect to use the prefix tera, so the prefix tebi was created instead, giving rise to the tebibyte, which corresponds to 2^40 bytes.
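The gap between the two prefixes is easy to check directly: a tebibyte (2^40 bytes) is nearly 10% larger than a terabyte (10^12 bytes). A short Python sketch:

```python
terabyte = 10 ** 12   # SI prefix "tera": decimal
tebibyte = 2 ** 40    # IEC prefix "tebi": binary

print(f"1 TB  = {terabyte:,} bytes")
print(f"1 TiB = {tebibyte:,} bytes")
print(f"1 TiB exceeds 1 TB by {(tebibyte - terabyte) / terabyte:.1%}")
```

This difference is why a disk sold as "1 TB" shows up as roughly 0.91 TiB when an operating system reports sizes using binary units.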