That’s because human perception exists on a logarithmic scale! It’s called the Weber-Fechner law, and it was one of the first studied psychological phenomena, before psychology as a field was even defined.
Interestingly, our sense of the “bigness” of numbers is also logarithmic. This is why there have to be explicit explanations of the massive difference between a million and a billion - our brains instinctively and erroneously think “eh, it’s like double.”
“What’s the difference between a million dollars and a billion dollars? About a billion dollars.”
A million is 0.1% of a billion; that's a rounding error.
My favourite way to comprehend it is by time:
A million seconds is about 12 days. A billion seconds is about 31 years. A trillion seconds is about 31,688 years.
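For anyone who wants to check the arithmetic, here's a quick sketch (assuming 365.25-day years; the rounding choices are mine):

```python
# Sanity-check the million/billion/trillion seconds comparisons above.
SECONDS_PER_DAY = 24 * 60 * 60               # 86,400
SECONDS_PER_YEAR = SECONDS_PER_DAY * 365.25  # 31,557,600

million_in_days = 1e6 / SECONDS_PER_DAY      # ≈ 11.6 days (~12)
billion_in_years = 1e9 / SECONDS_PER_YEAR    # ≈ 31.7 years
trillion_in_years = 1e12 / SECONDS_PER_YEAR  # ≈ 31,688 years

print(f"A million seconds  ≈ {million_in_days:.1f} days")
print(f"A billion seconds  ≈ {billion_in_years:.1f} years")
print(f"A trillion seconds ≈ {trillion_in_years:,.0f} years")
```

It also makes the logarithmic point nicely: each jump is the same factor of 1,000, but days → years → millennia doesn't *feel* like the same step.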
If it's 45°C, that's hot.
Parts of the US can reach 100 during the summer.
That sounded weird, so I had to look it up. The hottest temperature ever recorded in America is 134°F (56.7°C), measured in Death Valley, California, on July 10, 1913.
I think you could be mixing up °C and °F.
It is a joke about Celsius vs Fahrenheit
Just don’t try visiting a place that is 100 Kelvin
Sorry, that went way over my head. 🤦‍♂️