ColdMac – a 2009 iMac self-serving graphs of its temps in a cold outdoor shed (macfixer.com)
42 points by blakespot on Feb 4, 2021 | 17 comments



Wondering what kind of database/schema is being used here for the time-series data. I have a Raspberry Pi tracking the soil moisture of one of my plants, and I know I could be doing a lot better than my naive SQLite setup for historical data...
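For what it's worth, plain SQLite can go a long way for this if you key readings on (sensor, timestamp) and roll old rows up into averages. A minimal sketch of that idea (table and column names are mine, not from the article):

    import sqlite3
    import time

    conn = sqlite3.connect("sensors.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS readings (
            ts     INTEGER NOT NULL,  -- unix epoch seconds
            sensor TEXT    NOT NULL,  -- e.g. 'soil_moisture'
            value  REAL    NOT NULL,
            PRIMARY KEY (sensor, ts)
        ) WITHOUT ROWID
    """)

    def record(sensor, value):
        conn.execute("INSERT OR REPLACE INTO readings VALUES (?, ?, ?)",
                     (int(time.time()), sensor, value))
        conn.commit()

    # Hourly averages; for long retention you can store these rolled-up
    # rows and delete the raw samples they summarize.
    def hourly_averages(sensor, since_ts):
        return conn.execute("""
            SELECT (ts / 3600) * 3600 AS hour, AVG(value)
            FROM readings
            WHERE sensor = ? AND ts >= ?
            GROUP BY hour ORDER BY hour
        """, (sensor, since_ts)).fetchall()

The composite primary key keeps lookups by sensor and time range fast without a separate index.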


Oh, the 21st century: people store Macs in sheds, cars drive themselves, AI-powered robots vacuum floors, and I still have no freaking idea how to read Fahrenheit. It makes absolutely no sense; move to Celsius, please :(


I recently set my indoor/outdoor thermometer to Celsius, and I now have an argument for Fahrenheit that I haven't heard before. It may not make sense to anyone else...

The minimal temperature difference that "makes a difference" to me in Fahrenheit is about two degrees. When I'm comfortable at 70, I feel a little cold at 68. But in Celsius, that's about 21 vs 20. Having one degree make a difference vaguely feels wrong to me.

The scale should have slightly more precision than a person can easily feel. It's sort of loosely associated in my mind with how you need an audio sampling rate of twice your top frequency (the Nyquist rate).

I thought of a modest proposal. There are 360 degrees in a circle, so there should be 360 degrees between absolute zero and the melting point of water. That would also split the difference between the size of the deg. C and deg. F.

Or, if you don't like that, then how about a scale that goes from 0 at water freezing to 200 at boiling?
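To put rough numbers on both proposals (my own back-of-the-envelope arithmetic, nothing official):

    # 360 new degrees from absolute zero (0 K) to water's melting
    # point (273.15 K):
    deg_new_c = 273.15 / 360        # ~0.759 degC per new degree
    deg_new_f = deg_new_c * 9 / 5   # ~1.366 degF per new degree

    # For comparison: 1 degF = 5/9 ~= 0.556 degC and 1 degC = 1.8 degF,
    # so the new degree really does land between the two.

    # The 0-to-200 freezing/boiling variant: each degree is exactly
    # half a degree Celsius, i.e. 0.9 degF.
    deg_200_c = 100 / 200           # 0.5 degC
    deg_200_f = deg_200_c * 9 / 5   # 0.9 degF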


> But in Celsius, that's about 21 vs 20. Having one degree make a difference vaguely feels wrong to me.

See, it's funny, because I've always felt exactly the opposite way.

A difference that I can't feel is a difference that doesn't matter. So it makes utter aesthetic sense to me that one unit should equal the minimum difference I can feel.

(And then all the fancy science stuff that needs more precision can use decimals.)


I think I had an intuitive sense of the issue, but after thinking about it, I can formulate the exact problem from my point of view.

Suppose that the thermometer reads 21. That means in reality it's somewhere between 20.5 and 21.5, assuming perfect measurement and rounding.

And if it's 20, then it must be between 19.5 and 20.5.

So when the reading moves by one degree, the actual change could be anywhere from zero to two degrees. But whether or not it moved one degree (C) is exactly what I want to know, so a precision equal to the smallest meaningful amount is inadequate.
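To make that concrete with a toy example (an idealized thermometer of my own invention, not any real device):

    def reading(true_temp):
        """An ideal thermometer that rounds to the nearest whole degree."""
        return round(true_temp)

    # The display can move a full degree on a tiny real change...
    print(reading(20.4), reading(20.6))  # 20 then 21: actual change 0.2 degC

    # ...or stay put across a much larger one.
    print(reading(20.6), reading(21.4))  # 21 then 21: actual change 0.8 degC

    # So a one-degree change in the display pins the actual change only
    # to somewhere between just over 0 and just under 2 degrees.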

I don't consider this fancy science stuff, but something that bothered me at a gut level before figuring out why.


> Suppose that the thermometer reads 21. That means in reality it's somewhere between 20.5 and 21.5.

Isn't that completely dependent on the thermostat? The tolerance could be lower, say between 20.9 and 21.1, or higher, such as between 20 and 22, or even something terrible like 18 and 24 (come to New York City some time and experience the wonder of our ancient heating systems).


Sorry, I wasn't talking about the precision that the temperature is maintained at, but just the precision that the temperature is measured with. Nothing to do with my thermostat.


Fahrenheit's the only part of the imperial system that makes sense, given we primarily use it to convey how temperatures feel to humans.

0 is really cold, 100 is really hot, 1/3 of the way from really cold to really hot is when water starts freezing.

Meanwhile, Celsius... too much variation per unit, as you mention. And communicating temperatures to humans on a scale based around the properties of water is a bit odd.


When my pot is boiling it's 100 degrees. When my ice tray starts freezing it's zero degrees. I can look outside in winter and if it's raining I know it's not less than zero. How is that not incredibly useful?


Because your pot boiling isn't really a useful measure for weather on Planet Earth just yet?

Pretty much all of the scale from "hot day" to your pot boiling is wasted in reference to weather.


You can look outside in winter and know that if it is raining the air temperature is above freezing without a temperature scale. It's pretty apparent from the fact that the water is not freezing.

And how often does knowing that your pot of boiling water is 100 degrees or your ice tray is 0 help you?

I have never been making pasta and thought to myself “wow, sure is nice I know the exact temperature of this water right now”


Fahrenheit is also nice in that increments of 10 are equivalently significant, and increments of 15~20 correspond fairly well to jacket levels.


AFAIK we use the Celsius decimal point.

37.4 degrees is different to 37.9 degrees.

37.4 is a bit warm but normal human temperature. 37.9 is a fever.

I see no need or desire to ever use Fahrenheit in my life.


What do you mean "we use the Celsius decimal point"? Most if not all devices I own switch between Fahrenheit and Celsius, but they don't provide a decimal point on one if it isn't on the other. And that wouldn't really equate, since the difference is more like a factor of 2 than 10.


You can see the CPU temp increase when this hit the front page. Kinda neat.


This is such a silly but entertaining project. I love that it's the server for the website as well.


Nice one. It would be good to have an option to display the temperature in C and not just in F.



