Decimals, like fractions, are a way of describing points on the number line that fall between the whole numbers. If you were using fractions, you’d say that the point halfway between 1 and 2 is 1 1/2. In decimals, you’d call that same point 1.5.
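If you want to check that with a computer, here’s a minimal sketch in Python (my choice of language; the article itself doesn’t assume one), using Python’s built-in fractions module:

from fractions import Fraction

one_and_a_half = 1 + Fraction(1, 2)   # the mixed number 1 1/2
print(one_and_a_half)                 # prints 3/2
print(float(one_and_a_half) == 1.5)   # True: the same point, written two ways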
Each place after the decimal point is worth 1/10 of the place before it. So 1.55 means 1 and 5/10 and 5/100, which is the same as 1 and 1/2 and 5/100. One common use of decimals is in money, where $1.55 means one dollar and 55 cents. Another common use of decimals is in percents, where 50% of 200 means 0.5 times 200, or 100.
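Here’s the same idea in another short Python sketch (using the same built-in fractions module as above), adding up the place values of 1.55 and then working out 50% of 200:

from fractions import Fraction

# 1.55 is 1 + 5/10 + 5/100; each place is worth 1/10 of the place before it
value = Fraction(1) + Fraction(5, 10) + Fraction(5, 100)
print(value)          # prints 31/20
print(float(value))   # prints 1.55

# 50% means 50/100, or 0.5, so 50% of 200 is 0.5 times 200
print(0.5 * 200)      # prints 100.0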
Early measuring systems and mathematics didn’t use decimals, because people didn’t yet have a good way to write numbers down. After Indian mathematicians invented the numerals we use today, adding the idea of zero about 500 AD, African mathematicians used those numerals to do more work with fractions around 1300 AD.
But decimals only became really common with the French Revolution, about 1800 AD. French mathematicians encouraged people to use decimals after the Revolution, because they saw decimals as a new, more scientific, easier-to-understand way of doing math. They thought that with decimals, math would be more accessible to ordinary people, so that everybody could be an educated citizen.