LT said:

**Why is it that way? Why isn't 1 gig an even 1000 MB?**

Why not make it simple like metric?

That's simple to explain... it all comes down to the way that computers

**store** information: bits!

Then we get to the de facto standard size for a single character: the byte, which as we all know is 8 bits!

To

**store** a single byte we need 2^3 = 8 bits... and there isn't any whole number n that will make 10^n = 8. So for

**storage** computer engineers decided to use powers of 2 instead of 10.

So 1 KB = 1 KiB = 2^10 = 1024 bytes, 1 MB = 1 MiB = 2^20 bytes, and so on....
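A quick sketch (my addition, not LT's) that prints those binary multiples, so you can see how fast they drift away from the round decimal numbers:

```python
# Binary "storage" prefixes: each step multiplies by 2^10 = 1024.
for exp, prefix in [(10, "KiB"), (20, "MiB"), (30, "GiB")]:
    print(f"1 {prefix} = 2^{exp} = {2**exp:,} bytes")
# 1 KiB = 2^10 = 1,024 bytes
# 1 MiB = 2^20 = 1,048,576 bytes
# 1 GiB = 2^30 = 1,073,741,824 bytes
```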

I emphasize storage, because in communications the International System (SI) is used. SI prefixes use powers of 10, in multiples of 3.

So 1 K = 10^3 = 1000, 1 M = 10^6, and so on!

So a network connection of 10 Mb/s (10 megabits per second) means 10 × 10^6 = 10,000,000 bits per second passing through the cable!
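This gap between the two systems is exactly why your "1 TB" drive looks smaller in the OS. A small sketch (mine, not LT's) of the mismatch at the giga level:

```python
GB = 10**9    # decimal gigabyte (SI: drive makers, network speeds)
GiB = 2**30   # binary gibibyte (what most OSes count in for storage)

print(f"Difference: {GiB - GB:,} bytes")   # Difference: 73,741,824 bytes
print(f"Ratio: {GiB / GB:.4f}")            # Ratio: 1.0737 -> ~7.4% gap
```

At the tera level (2^40 vs. 10^12) the gap grows to roughly 10%, which is why a "1 TB" disk shows up as about 931 GiB.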

One final question: what's a billion??

**NOTE:** By ^ I mean "to the power of"!