We're all familiar with bytes. Our phones, laptops, and SD cards can hold billions of them. In fact, that's what a gigabyte is — one billion bytes. And each byte is made up of 8 bits. A bit is the lowest common denominator. It's the atom to the molecule. The photon to the sunbeam.

When we talk storage capacity, we do so in terms of bytes. A 16 GB hard drive, for instance, can store 16 billion bytes. When we measure rates of data transfer, though, we use bits. That's why you hear about and see Gigabit Ethernet, not "Gigabyte Ethernet." Data travels over a wire one bit at a time, so the bit is the more natural, straightforward unit for speed: trying to quote rates in byte terms would be like trying to pay for gas in 8-cent coins!
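If you ever need to translate between the two units, the arithmetic is simple: divide bits by 8 to get bytes. Here's a quick sketch (the variable names are just for illustration) showing what Gigabit Ethernet's rated speed looks like in byte terms:

```python
# A byte is always 8 bits, so converting a transfer rate
# from bits to bytes is just a division by 8.
BITS_PER_BYTE = 8

gigabits_per_second = 1  # Gigabit Ethernet's rated speed
gigabytes_per_second = gigabits_per_second / BITS_PER_BYTE

print(gigabytes_per_second)  # 0.125 -- an eighth of a gigabyte each second
```

So a "1 gigabit" connection moves at most 0.125 gigabytes (125 megabytes) of data per second — a handy thing to remember the next time a download seems 8× slower than the advertised speed.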

You might have noticed that the abbreviations for gigabits and gigabytes vary ever so slightly, too: for bits, the 'b' stays lowercase (Gb), and for bytes, those packets of bits, we capitalize it (GB). If you can get that down, you might be ready to tackle some of the headier concepts we'll be addressing in future articles. (And if you want to impress your tech-minded friends, bring up "nibbles" the next time you're in polite conversation — a nibble is half a byte, or four bits.) All of this might seem arbitrary, and it's worth remembering that it really is. Computing notation is a language, and like all languages, it was built by people.
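Putting the whole family of units in one place, the relationships above can be sketched in a few lines (the constant names here are our own, just for readability):

```python
# The units from smallest to largest, all counted in bits.
BITS_PER_NIBBLE = 4                 # a nibble is half a byte
BITS_PER_BYTE = 2 * BITS_PER_NIBBLE
BYTES_PER_GIGABYTE = 1_000_000_000  # "giga" means one billion

bits_in_a_gigabyte = BYTES_PER_GIGABYTE * BITS_PER_BYTE
print(bits_in_a_gigabyte)  # 8000000000 -- 8 billion bits in one GB
```

In other words: 4 bits to a nibble, 2 nibbles to a byte, and a billion bytes to a gigabyte.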

Stick with us to learn the ins and outs of the processes that connect us to the world. And if you have any questions, you know who to call.