Q & A: Decimal and Binary, Measure vs. Measure

Question

I have read that some computer memory sizes are in decimal (base 10) rather than normal binary (base 2). Is this true?

Answer

Both decimal and binary systems are used to express capacity in computer systems. The binary system measures a kilobyte as 1,024 bytes, whereas the decimal system measures a kilobyte as an even 1,000 bytes. The gap between the two becomes even more noticeable when you move up to gigabytes: the binary value comes out to 1,073,741,824 bytes, compared with 1,000,000,000 bytes in decimal.
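
The gap widens because each binary step multiplies by 1,024 while each decimal step multiplies by only 1,000. Here is a short, purely illustrative Python sketch that prints the two measurements side by side for each prefix:

    # Compare binary (powers of 1,024) and decimal (powers of 1,000) unit sizes.
    prefixes = ["kilo", "mega", "giga", "tera"]

    for power, name in enumerate(prefixes, start=1):
        binary_bytes = 1024 ** power    # what the computer counts
        decimal_bytes = 1000 ** power   # what the product label counts
        gap = (binary_bytes - decimal_bytes) / binary_bytes * 100
        print(f"1 {name}byte: {binary_bytes:,} bytes (binary) vs. "
              f"{decimal_bytes:,} bytes (decimal) -- {gap:.1f}% apart")

Run as written, it shows the difference growing from about 2 percent at the kilobyte level to about 9 percent at the terabyte level.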

Companies making data-storage products tend to use the decimal system to indicate capacity and often put small disclaimers on product pages that say so, like “One gigabyte (GB) = one billion bytes. One terabyte (TB) = one trillion bytes. Total accessible capacity varies depending on operating environment.”

Computers, being computers, traditionally measure things in binary. The smallest unit of measurement on a computer, after all, is a bit, short for binary digit. Eight bits make a byte, and so on.

The use of these two different measuring systems has led to discrepancies, as when the computer reports a smaller amount of available space than you thought you were getting on that new external hard drive you just bought. Seagate, a company that makes computer storage products, has a knowledge-base article that explains a bit of the history of how the two systems came to be used and why the operating system reports only 465.76 gigabytes for your new 500-gigabyte external hard drive.
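
The arithmetic behind that smaller number is just division by powers of 1,024 instead of powers of 1,000. A minimal Python sketch, assuming a raw capacity of 500,107,862,016 bytes (a common byte count for drives sold as 500 GB; the exact figure varies by model):

    # Convert a drive's raw byte count into the decimal gigabytes on the box
    # and the binary gigabytes (GiB) that most operating systems report.
    raw_bytes = 500_107_862_016          # assumed raw capacity of a "500 GB" drive

    decimal_gb = raw_bytes / 1000 ** 3   # how the label measures it
    binary_gb = raw_bytes / 1024 ** 3    # how the operating system measures it

    print(f"On the box:  {decimal_gb:.2f} GB")    # about 500.11 GB
    print(f"As reported: {binary_gb:.2f} GiB")    # about 465.76 GiB

Nothing is missing from the drive; the same number of bytes is simply being divided by a bigger unit.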

The National Institute of Standards and Technology has a reference page on prefixes for binary multiples. NIST uses notation like 1 GiB (gibibyte) to refer to a binary gigabyte, while 1 GB (gigabyte) is used for the decimal measurement.
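
For reference, each of the familiar decimal (SI) prefixes has a binary (IEC) counterpart with its own name. A small Python listing of the first four pairs (the names follow the NIST page; the layout here is just for illustration):

    # SI (decimal) prefixes alongside their IEC (binary) counterparts.
    units = [
        ("kB", 1000 ** 1, "KiB", 1024 ** 1),   # kilobyte vs. kibibyte
        ("MB", 1000 ** 2, "MiB", 1024 ** 2),   # megabyte vs. mebibyte
        ("GB", 1000 ** 3, "GiB", 1024 ** 3),   # gigabyte vs. gibibyte
        ("TB", 1000 ** 4, "TiB", 1024 ** 4),   # terabyte vs. tebibyte
    ]

    for si_name, si_bytes, iec_name, iec_bytes in units:
        print(f"1 {si_name} = {si_bytes:,} bytes   1 {iec_name} = {iec_bytes:,} bytes")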

Comments

When I pay my bills next time, I’m going to write a little asterisk footnote saying “* Note that $1 = 75c” and get a 25% discount.

In the real world, these things are called lies. They redefine what a gigabyte means for their own base marketing purposes. There is no difference between the decimal system and the binary system; they both refer to the same set of numbers. By telling these lies, people who sell bytes, like hard disk manufacturers, sell you about 10% less than what you think you’re getting, in the case of a 1TB drive.

You’re just perpetuating a fiction foisted on consumers.

If nutjob gets a load of the vagaries of floating point, he’ll really blow a resistor.

Just another example of the basic dishonesty of much of American business. Try to fool the customer, make something sound more than it is, better than it is, cheaper than it is. Enough already.

Nutjob is pretty hilarious. He’s engaging in a bit of performance art to see if people will bite his bait. For some reason, this kind of trolling never gets old on the Web.

He does sign off with “Nutjob” though. So that’s some concession to indicate that people really shouldn’t take his jokes too seriously.

Nutjob, it’s an approximation, in the same way that a liter is approximately a quart, and all your sound and fury is approximately nothing. Lighten up.