Megabytes - Bits Converter
Convert between megabytes (MB) and bits (b). This converter is part of the full data storage converter tool. Simply choose which unit you want to convert from and to, enter a value and click the 'convert' button. A reference chart can be found further down the page.
Whilst every effort has been made in building this megabytes - bits converter, we are not to be held liable for any special, incidental, indirect or consequential damages or monetary losses of any kind arising out of or in connection with the use of the converter tools and information derived from the web site. This megabytes - bits converter is provided purely as a service to you; please use it at your own risk. Do not use these calculations for anything where loss of life, money, property, etc. could result from inaccurate conversions.
Please see the full disclaimer for more information.
Converting bits and megabytes
Some commonly asked questions are included below, as well as a reference chart. When making manual conversions, you can use the converter at the top of this page to check your answer.
How many megabytes are there in 1 bit?
There are 0.000000125 megabytes (1.25 × 10⁻⁷ MB) in 1 bit. To convert from bits to megabytes, divide your figure by 8,000,000.
How many bits are there in 1 megabyte?
There are 8,000,000 bits in 1 megabyte. To convert from megabytes to bits, multiply your figure by 8,000,000.
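If you prefer to see the two rules above in code, here is a minimal Python sketch (the function names are our own, and it assumes the SI definition of the megabyte used throughout this page):

```python
BITS_PER_MEGABYTE = 8_000_000  # 1 MB = 1,000,000 bytes x 8 bits per byte

def bits_to_megabytes(bits):
    # Divide by 8,000,000 to convert bits to megabytes.
    return bits / BITS_PER_MEGABYTE

def megabytes_to_bits(megabytes):
    # Multiply by 8,000,000 to convert megabytes to bits.
    return megabytes * BITS_PER_MEGABYTE

print(bits_to_megabytes(1))   # 1.25e-07
print(megabytes_to_bits(1))   # 8000000
```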
In the chart below, figures are rounded to a maximum of 3 decimal places (6 with smaller numbers) to give approximations.
What is a bit?
You might not be aware that the term "bit" is actually a contraction of the phrase "binary digit". A bit is the simplest unit of data storage, and its value can represent either a 0 or a 1 at any given time. Every other unit of computing measurement is based on the humble (and yet very important) bit.
What is a megabyte?
Most computer users are very familiar with the megabyte. As you might have already guessed, a megabyte is the equivalent of 1,000,000 bytes (or 10^6 bytes). We should point out, though, that there is more here than meets the eye. While the SI definition is one million bytes, computing professionals have historically also used the megabyte to mean 1,048,576 bytes (1,024^2 bytes), a quantity now formally known as the mebibyte (MiB). This binary interpretation arises because computer memory is organised in powers of two; under either definition, each byte contains eight bits.
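To see the difference between the two definitions in concrete numbers, here is a short Python sketch (the variable names are our own):

```python
SI_MEGABYTE = 10**6   # 1,000,000 bytes (decimal, SI definition)
MEBIBYTE = 1024**2    # 1,048,576 bytes (binary interpretation, MiB)

print(MEBIBYTE - SI_MEGABYTE)  # 48576: the binary unit is 48,576 bytes larger
```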