When I was a CS prof, many, many years ago, our undergraduate lab had Macs with floppy disks. I asked the university to pay for installing 10 MB hard drives in the Macs, and was asked to present my case to the deans' council. At the meeting, I explained that the students used the floppy to load their development environment: with a hard drive it took 10 seconds to load and be ready; with the floppy it took 30 seconds. Then I said, "That does not sound like much difference, but this is 10 seconds ...." I paused and looked at my watch for 10 seconds. Then I said, "And this is 30 seconds," and again looked at my watch. At the 20-second mark, the VP Academic (chair of the meeting) said, "You have your money."
I've used this trick many times when people underestimate how long a few minutes can be in certain contexts. It works astonishingly well. Happy to see I'm not the only one!
There's an anecdote [1], similar to this story, about Steve Jobs and the original Mac's boot time.
One of the things that bothered Steve Jobs the most was the time that it took to boot when the Mac was first powered on. It could take a couple of minutes, or even more, to test memory, initialize the operating system, and load the Finder. One afternoon, Steve came up with an original way to motivate us to make it faster.
[...]
"Well, let's say you can shave 10 seconds off of the boot time. Multiply that by five million users and that's 50 million seconds, every single day. Over a year, that's probably dozens of lifetimes. So if you make it boot ten seconds faster, you've saved a dozen lives. That's really worth it, don't you think?"
If the request you're talking about only gets executed once (maybe it gets cached or something), then you shouldn't be pitching the time difference anyway.
If it gets executed hundreds of times a day per person, you can say: 50 ms * 100 = 5 s, versus 200 ms * 100 = 20 s. And that's just per user.
Yeah, actually doing this math tends to surprise people. We had an automated background process that took ~2 s to run, which doesn't sound bad at all considering it includes an API call over the internet. But multiplied out across the number of backlog items we actually had at the time, the resulting 30-40 hours doesn't sound so good.
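For what it's worth, the multiplication in these two comments is easy to sketch out. The numbers below are illustrative only; in particular, the backlog size is a guess picked so the total lands in the 30-40 hour range mentioned above.

    # Back-of-the-envelope: how a "small" per-request cost multiplies out.
    # All numbers are illustrative, not from any real system.
    def total_seconds(per_item_s: float, count: int) -> float:
        """Total wall-clock cost in seconds for a given per-item latency."""
        return per_item_s * count

    # Per-user view: a request that runs ~100 times a day per person.
    print(total_seconds(0.050, 100))   # 5.0  seconds per user per day at 50 ms
    print(total_seconds(0.200, 100))   # 20.0 seconds per user per day at 200 ms

    # Backlog view: a ~2 s background job multiplied across a big queue.
    backlog_items = 60_000                            # hypothetical backlog size
    print(total_seconds(2.0, backlog_items) / 3600)   # ~33 hours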
Fair enough. Unless you're talking about lag in automated trading systems/algos, where even that single "I really can't measure this" fraction of a second counts.
I mean, sure, the precision matters for HFT, but at that scale the point would be moot since the time is so minuscule. Unless you hyperscale it: "on 1,000,000 trades the 50 ms difference becomes very pronounced and could cost us $z", or something of the sort. But I still think it loses "the spirit" of the method; that's the best way I can phrase it.
Interesting. By coincidence, last night (while playing with a new streaming stick) I did a quick little calculation. If I watched, on average, only 18 minutes of advertising per day, and did that for 40 years, that adds up to 6 months of my life.
What else might I have done with that 6 months? I will never know. But I know this: I won't be doing that any more.
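That figure does check out. A quick sanity check of the arithmetic, assuming 365-day years and 30-day months:

    # 18 minutes of ads per day, every day, for 40 years.
    minutes = 18 * 365 * 40        # 262,800 minutes
    days = minutes / 60 / 24       # 182.5 days
    print(days, days / 30)         # ~182.5 days, i.e. roughly 6 months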
I left university and joined a startup, then switched to a different company, which was later acquired by Apple. I worked there for about 12 years; I'm now retired, but still work on an app.