There are two ways to define a gigabyte: the vendor's decimal (powers of ten) definition and the computer's binary (powers of two) definition. When you buy a "500 Gigabyte" hard drive, the vendor sizes it using the decimal meaning of the "Giga" prefix:

500 × 10^9 bytes = 500,000,000,000 bytes = 500 gigabytes (GB)

But the computer's operating system measures the drive using the binary meaning of the prefix, where one unit is 2^30 bytes:

500,000,000,000 bytes ÷ 2^30 ≈ 465 gibibytes (GiB), since 465 × 2^30 bytes = 499,289,948,160 bytes

If you're wondering where 35 gigabytes of your 500 gigabyte drive just disappeared to, you're not alone. It's an old trick by hard drive makers -- they intentionally use the official SI definition of the Giga prefix so they can inflate the sizes of their hard drives, at least on paper. Ideally, we should refer to the binary prefixes when calculating the sizes of storage devices, as this makes more sense.
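To make the arithmetic concrete, here is a minimal Python sketch (not from the original text) that converts an advertised decimal size into the binary units an operating system typically reports; the function name advertised_to_reported is just an illustrative placeholder.

```python
# Minimal sketch: convert a drive's advertised (decimal) size into the
# binary units most operating systems report.

GB = 10**9    # decimal "gigabyte" used by drive vendors (SI prefix)
GiB = 2**30   # binary "gibibyte" used by most operating systems

def advertised_to_reported(size_gb: float) -> float:
    """Return the size in GiB that the OS will show for a drive
    advertised as `size_gb` decimal gigabytes."""
    return size_gb * GB / GiB

if __name__ == "__main__":
    advertised = 500
    reported = advertised_to_reported(advertised)
    print(f"{advertised} GB advertised -> {reported:.2f} GiB reported")
    # prints: 500 GB advertised -> 465.66 GiB reported
```

Running it for a "500 GB" drive shows roughly 465.66 GiB, which most operating systems round down and display as 465 gigabytes.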