
People who complain about Fahrenheit vs. Celsius are correct to the degree (sorry) that the Celsius degree unit of difference is the standard in a lot of engineering calculations. But Celsius as a temperature scale is no more logical than Fahrenheit, which is arguably more practical for day to day use--and Kelvin is more likely required for a lot of engineering and chemical calculations anyway.


> But Celsius as a temperature scale is no more logical than Fahrenheit

Celsius is more logical:

(1) the endpoints of Celsius are the melting and boiling points of water (at standard atmospheric pressure). The lower endpoint of Fahrenheit is the lowest temperature Fahrenheit could achieve using a mixture of water, ice, and ammonium chloride. Using the freezing point of pure water is more logical than using the freezing point of an ammonium chloride brine: water is fundamental to all known life, while ammonium chloride solutions have no such significance. (And why ammonium chloride instead of sodium chloride or potassium chloride? Of the salts readily available to Fahrenheit, the ammonium chloride solution had the lowest freezing point.)

(2) Fahrenheit initially put 90 degrees between his two “endpoints” (ammonium chloride solution freezing point and human body temperature), then he increased it to 96. Celsius having 100 degrees between its endpoints is more logical than 90 or 96

(3) for both Celsius and Fahrenheit there is error in the definition of their endpoints (the nominal values differ from the real values, because our ability to measure these things accurately was less developed when each scale was originally devised, and some unintentional error crept in), but the magnitude of that error is smaller for Celsius than for Fahrenheit

(4) nowadays, all temperature units are officially defined in terms of Kelvin - and Celsius has a simpler relation to Kelvin than Fahrenheit (purely additive versus requiring both addition and multiplication)

(5) Celsius is the global standard for everyday (non-scientific) applications, not Fahrenheit, and it is more logical to use the global standard than a rarely used alternative whose advantages are highly debatable at best
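Point (4) can be sketched concretely. This is a minimal illustration (function names are mine, not from any standard library) of why the Celsius relation is called "purely additive" while the Fahrenheit one needs both multiplication and addition:

```python
def celsius_to_kelvin(c: float) -> float:
    """Purely additive: a fixed offset of 273.15."""
    return c + 273.15

def fahrenheit_to_kelvin(f: float) -> float:
    """Affine: requires both an offset and a scale factor of 5/9."""
    return (f + 459.67) * 5 / 9

print(celsius_to_kelvin(0))        # 273.15
print(fahrenheit_to_kelvin(32))    # 273.15 (same physical temperature)
```

Both calls land on the same kelvin value for the freezing point of water, but only the Celsius path gets there with a single addition.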


> which is arguably more practical for day to day use

No, it is not. Americans say this only because they're used to it. The common arguments are that it is more precise, and that 'you see temperatures from zero to one hundred degrees Fahrenheit throughout the year'.

Firstly, the problem with Fahrenheit is that its realisation is inaccurate by modern standards—which is why every single non-SI unit is now just an exact multiple of a corresponding SI unit with the same dimensions; the mediaeval and early modern definitions having been entirely forgotten. A bowl of salted ice and his own armpit? Truly an 18th-century invention.

Next, the extra precision that a difference of one degree Fahrenheit gives you is frankly useless. Within a single room one can experience a difference of five degrees Celsius or more, depending on what's in the room—a radiator or air conditioner running, or the gas stove in a kitchen, or a high-end PC. Forget rooms. On the human body itself there can be a two to three degree Celsius difference between the extremities and the thorax/torso/head. Any field that requires extreme precision will naturally end up using SI units, so kelvin (or some related scientific unit). (Excluding the absolutely crazy bunch of American machinists who like using thousandths and ten-thousandths of an inch—at this point the joke writes itself).

As for climates, there are places that see very little difference in temperature, and definitely not the 'nice 0 – 100' range that Americans claim. Even in the US there are places like southern Louisiana and Florida that have borderline tropical climates, and don't really go below ~15 °C or above 35 °C.

None of this is really logical either; it all ends up being a manifestation of familiarity.


I'm not generally a Fahrenheit defender, but I think it's silly to deny the user-friendliness of having more integer values in the day-to-day temperature range, without going too far out of the two-digit measurement range. It lets you have a little more precision without being much effort to do casual math on. Milli-kelvin are far too small, a single kelvin is too big a perceptual range, and decimals are too annoying when we just want to talk about the weather.


I’m legitimately surprised by this idea - surely people in countries that use Fahrenheit don’t go around saying things like “Oh I thought it was going to be 54 degrees but it’s actually 55, so much different!”

I’ve grown up with Celsius and never felt the need to use decimals in day to day weather discussion… Many air conditioners let you go up by half a degree C and that’s more than enough precision, more than I’ve ever felt was necessary in everyday conversation.


1 degree F is right on the borderline of detectability, and in my opinion that's just right. FWIW I do make single degree changes to the AC reasonably often and notice the difference. That might be due in part to convection nonsense in my house, but that's arguably another argument in favor of that level of precision being useful in practice. I can believe centigrade-and-a-half works fine too, but that doesn't itself undermine the idea that F is kind of handy, with that tiny bit less bother.

Again, not saying F is great, just that that one minor advantage is in fact real.


Oddly, the same system that's enamoured with measurements like "sixty-fourths of an inch" can't countenance the sort of witchcraft entailed in "half a degree Celsius".


Eighth of an inch is about the limit of precision in most circumstances.

Countries that use Imperial for day to day use still do tend to use metric for a lot of purposes, including many engineering tasks. (Imperial is terrible for certain types of engineering calculations--especially involving pounds, and let's not get into slugs--but it's fine for cooking, even if I tend to use grams for weight on my scale when baking.)

"Imperial countries" are actually pretty hybrid for the most part to greater or lesser degrees. UK is more SI than the US but they still commonly use a lot of Imperial units including some "odd" stuff like stones.


I presume if you think eighths of an inch is good enough then you mostly work with large objects. Luckily, outside of the USA, Liberia, and Myanmar, inch-based scales are rare nowadays, but I still come across specifications in small fractions of inches for things such as manufacturing tolerances, drill bits, coating thicknesses, and cable diameters.

Informal measurement of humans and beer is a nice quirk of the UK system, but in my experience it's not commonplace to use imperial units for anything where accuracy is important (except maybe jewellers, and arguably road speeds/distances).

See also, the monstrosity that is AWG: https://en.wikipedia.org/wiki/American_wire_gauge
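For the curious, the linked article gives a closed-form formula for AWG diameters. A minimal sketch (the function name is mine; the constants come from AWG 36 being defined as 0.005 in, with 39 gauge steps spanning a factor of 92 in diameter):

```python
def awg_diameter_inches(gauge: int) -> float:
    """Diameter in inches for a given AWG gauge number.

    Each step of ~6 gauge numbers roughly halves the diameter;
    AWG 36 is defined as exactly 0.005 in.
    """
    return 0.005 * 92 ** ((36 - gauge) / 39)

print(round(awg_diameter_inches(12), 4))  # 0.0808 (common household wiring)
```

Part of what makes AWG a monstrosity is visible right there: the gauge number goes up as the wire gets thinner, and the relation is exponential rather than linear.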


No one here is defending inches. In particular I don't think anyone is "enamored" with 64ths. We all agree they're a pain.


On the edges, absolutely. 66F is comfortable working temperature for me. 65F is too cold and I will start to shiver.


So why do those air conditioners go up and down by half a degree C rather than a full degree if people don't care about precision? Never seen an F degree thermostat do stuff in half degrees.


> As for climates, there are places that see very little difference in temperature, and definitely not the 'nice 0 – 100' range that Americans claim.

That's not really the point.

0° F: It's cold outside. 100° F: It's hot outside.

0° C: It's cold outside but not really that cold. 100° C: Dead.

0 K: Dead. 100 K: Dead.

These things are the case for humans regardless of whether you live in a place that actually gets cold or hot outside.


It literally doesn’t matter at all what’s 0 and 100 though. Honestly if you’re familiar with one system, the scale feels intuitive to you, and if you’re familiar with the other then the other one does.

Like people tell me that the US customary system is “more human scale and intuitive” but I literally cannot picture, say, 15 inches or ten feet - it just means nothing to me unless I mentally convert to centimetres or meters.

So much of these arguments boil down to “I grew up with this system so I can intuitively use it, so it must be superior”


> I grew up with this system so I can intuitively use it, so it must be superior

This is essentially every American argument for USC or Imperial units. In fact, there are actually legitimate reasons why some legacy units are superior—for instance the duodecimal or sexagesimal systems which have many more factors than the decimal. But every other argument is a variation of 'it's better because I know it better'.


0 °C tells you the very practical information that it's freezing outside and that you must be careful, or that you can expect snow. For °F you have to know the value by heart.


Zero is just as arbitrary as 32 but ever so slightly easier to remember. Anyone for whom that slight difference in remembering is actually a real impediment likely has other issues that require them to be under the care of others so it ceases to be relevant to them. At the other end, 100 vs 212 is of no practical use for anyone. Outside of the laboratory and manufacturing, people just boil water without thinking about the number associated with it. And in those settings, many other variables are tracked so the difference is of near zero consequence.


Please sir/madam, if you can, spare a thought for us lost souls who spend most of their finite lives in the laboratory and/or manufacturing.


So to put it short: if I know what I'm talking about I'll use K, and if I have no idea it doesn't matter anyway.


Even that isn’t quite true, because you can get ice on the road or the sidewalk when the air at 2 m is ~3 °C.


250K: quite cold.

300K: beach weather.

350K: you're distilling your own moonshine, right?

(worksforme)


In case any of you are distilling your own moonshine, please be aware that at 350 kelvins (about 77 °C) you're likely to be getting methanol[0]. You should have the antidote (ethyl alcohol, like vodka) on hand just in case.

Home distilling is great fun, and sometimes it's even legal, but please have an accurate thermometer and try not to poison yourself and others unless absolutely necessary.

(These temperature levels are also humidity- and altitude/pressure-dependent; if your still is in the high Appalachians then just listen to your heart.)

[0] https://diydistilling.com/distillation-temperatures/
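A quick arithmetic check on that warning. 350 K converts to about 77 °C, which sits between the standard-pressure boiling points of methanol (about 64.7 °C) and ethanol (about 78.4 °C), so at that still temperature methanol is boiling off while ethanol is not yet at its boiling point (a rough sketch; real still behavior depends on the mixture and pressure):

```python
METHANOL_BP_C = 64.7  # boiling point at standard pressure, degrees Celsius
ETHANOL_BP_C = 78.4

def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

still_temp_c = kelvin_to_celsius(350)
print(round(still_temp_c, 2))                          # 76.85
print(METHANOL_BP_C < still_temp_c < ETHANOL_BP_C)     # True
```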


Amen to that. I wasn't trying to encourage anyone to try their hand at distilling without learning more about it.

Only highlighting that one can "humanly relate" to Kelvin-based temperatures, if one so wishes. And that the "reference points" needn't be any more "arbitrary" than for °F/°C.


> 0° C: It's cold outside but not really that cold.

> These things are the case for humans

Who says so?

0 °C is very cold by many people's standards. About half the human population lives within the tropics. In fact I'd like to see Americans walk around in the UK wearing just a T-shirt and bermudas when it's barely above freezing, and insist 'it's not really that cold, it's only 32 °F'.


Maybe this is the perspective of someone who has spent a lot of time in the northern US, but it's cold, just really not that cold. I might even run outside briefly in bare feet in the snow. But, yes, get to 0 degrees F and you're in bitter cold territory, although, of course, it gets colder in the northern Midwest/mountains/etc. And my contractor has been in shorts and a T in 40s-ish weather this week.

The point is that F degrees seem a pretty human scale that doesn't usually need a lot of decimals or minus signs for routine purposes. That it doesn't correspond to a couple of water phase transitions at standard pressure is sort of irrelevant. Of course, I and many other people are perfectly happy with using Celsius/Kelvin degrees for various purposes.


The minus sign is quite practical though.

When winter is coming, if it's 3 deg C outside, I typically don't need to worry about ice. If it's -3 deg C outside, I need to worry about ice.

When winter is waning, if it's been icy and it's -3 deg C outside, I typically don't need to worry about water on top of the ice. If it's 3 deg C outside, I typically do need to worry about water on top of the ice making it super slippery.


Again, it's not about where you live. 0°F is approximately the temperature at which the risk of frostbite becomes significant. 100°F is approximately the temperature at which the risk of heat-related illnesses becomes significant. Temperatures in between are far less hazardous to humans, even if the numbers near the edges are starting to get uncomfortable.
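For readers who think in Celsius, those claimed endpoints work out to roughly -18 °C and 38 °C. A quick sanity check (the function name is illustrative):

```python
def f_to_c(f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

print(round(f_to_c(0), 1))    # -17.8
print(round(f_to_c(100), 1))  # 37.8
```

So the 0-100 °F band corresponds to roughly the -18 to +38 °C band, whether or not one finds those particular endpoints physiologically meaningful.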



