There are two ways to define a gigabyte: the vendor's decimal (powers of ten) definition and the computer's binary (powers of two) definition.
When you buy a "500 Gigabyte" hard drive, the vendor defines it using the decimal powers-of-ten definition of the "Giga" prefix:
500 * 10^9 bytes = 500,000,000,000 bytes = 500 Gigabytes (GB)

But the computer's operating system reports the size of the drive using the binary powers-of-two definition of the "Giga" prefix:
465 * 2^30 bytes = 499,289,948,160 bytes = 465 Gibibytes (GiB), which the operating system labels "465 Gigabytes"

If you're wondering where 35 Gigabytes of your 500 Gigabyte drive just disappeared to, you're not alone. It's an old trick by hard drive makers -- they intentionally use the official SI (decimal) definition of the "Giga" prefix so they can inflate the sizes of their hard drives, at least on paper. Ideally, we should use binary prefixes when calculating the sizes of storage devices, as this makes more sense. The following tables help you understand the difference more clearly.

Decimal (SI) prefixes, used by drive vendors:
  Kilobyte (KB) = 10^3  = 1,000 bytes
  Megabyte (MB) = 10^6  = 1,000,000 bytes
  Gigabyte (GB) = 10^9  = 1,000,000,000 bytes
  Terabyte (TB) = 10^12 = 1,000,000,000,000 bytes

Binary (IEC) prefixes, used by operating systems:
  Kibibyte (KiB) = 2^10 = 1,024 bytes
  Mebibyte (MiB) = 2^20 = 1,048,576 bytes
  Gibibyte (GiB) = 2^30 = 1,073,741,824 bytes
  Tebibyte (TiB) = 2^40 = 1,099,511,627,776 bytes
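To make the conversion concrete, here is a minimal Python sketch of the same arithmetic -- turning a vendor's advertised decimal capacity into the binary size an operating system would report. The constant names and the advertised_to_reported function are illustrative choices of my own, not part of any real tool.

    # Convert an advertised (decimal) drive capacity into the binary
    # "Gibibyte" size an operating system reports.

    GIGABYTE = 10**9   # decimal "Giga" prefix used by drive vendors
    GIBIBYTE = 2**30   # binary "Giga"/"Gibi" prefix used by operating systems

    def advertised_to_reported(advertised_gb: float) -> float:
        """Convert an advertised capacity in GB to the size shown in GiB."""
        total_bytes = advertised_gb * GIGABYTE
        return total_bytes / GIBIBYTE

    if __name__ == "__main__":
        for gb in (250, 500, 1000, 2000):
            print(f"{gb:>5} GB advertised -> {advertised_to_reported(gb):8.2f} GiB reported")

Running this prints, for example, that a 500 GB drive works out to roughly 465.66 GiB, which matches the figure above.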