
> get decent performance

The issue is that in computer science, "real-time" doesn't just mean "pretty fast"; it's a very specific definition of performance[0]. "Real-time" computing is generally considered hard even for problems that are themselves not too challenging, and it involves potentially severe consequences for missing a computational deadline.

This leads to both confusion and a bit of frustration when subfields of CS throw the term around as if it just means "we don't have to wait a long time for it to render" or "you can watch it happen".

[0] https://en.wikipedia.org/wiki/Real-time_computing
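
To make the "computational deadline" part concrete, here's a minimal sketch of a periodic hard-deadline loop, assuming a POSIX system; the 10 ms period and do_work() are made up for illustration:

    #include <stdio.h>
    #include <time.h>

    #define PERIOD_NS 10000000L  /* 10 ms, i.e., a 100 Hz task (made-up number) */

    static void do_work(void) { /* the actual computation goes here */ }

    int main(void) {
        struct timespec deadline;
        clock_gettime(CLOCK_MONOTONIC, &deadline);
        for (;;) {
            /* advance the absolute deadline by one period */
            deadline.tv_nsec += PERIOD_NS;
            if (deadline.tv_nsec >= 1000000000L) {
                deadline.tv_nsec -= 1000000000L;
                deadline.tv_sec += 1;
            }
            do_work();
            struct timespec now;
            clock_gettime(CLOCK_MONOTONIC, &now);
            /* the defining property: finishing late is a fault,
               not just a degraded experience */
            if (now.tv_sec > deadline.tv_sec ||
                (now.tv_sec == deadline.tv_sec && now.tv_nsec > deadline.tv_nsec)) {
                fprintf(stderr, "deadline miss\n");
                return 1;
            }
            /* sleep until the next absolute deadline */
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &deadline, NULL);
        }
    }

A real system would also need a real-time scheduling class, locked memory, and bounded-latency code paths; the sketch only shows the contract itself.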



That link defines it in terms of simulation as well: "The term 'real-time' is also used in simulation to mean that the simulation's clock runs at the same speed as a real clock," and it even states that this was the original usage of the term.

I think that pretty much meets the definition of "you can watch it happen".

Essentially, there are real-time systems and real-time simulations. So it seems that they are using the term correctly in the context of simulation.
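
To contrast with the hard-deadline sense upthread, here's a sketch of "real-time" in the simulation sense: a fixed simulated timestep kept in lockstep with the wall clock, so you can literally watch it happen. step() and the 60 Hz timestep are hypothetical, and POSIX is assumed:

    #include <time.h>

    #define DT_NS 16666667L  /* ~16.7 ms: 60 simulated steps per second */

    static void step(double dt) { (void)dt; /* advance the simulation by dt seconds */ }

    int main(void) {
        struct timespec next;
        clock_gettime(CLOCK_MONOTONIC, &next);
        for (;;) {
            step(DT_NS / 1e9);   /* one fixed step of simulated time... */
            next.tv_nsec += DT_NS;
            if (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec += 1;
            }
            /* ...then wait until the same amount of wall time has elapsed,
               so simulated time tracks real time */
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        }
    }

Note that if step() overruns its slot, the simulation simply drifts behind the wall clock; nothing faults, which is exactly how this differs from hard real-time.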


I don't think it's reasonable to expect the larger community to not use "real time" to mean things other than "hard real time as understood by a hardware engineer building a system that needs guaranteed interrupt latencies".


I think it’s reasonable to assume that it means what you described on this site.


Of course. I'm the "Reality is just 100M lit, shaded, textured polygons per second" kind of guy: realtime is about 65 FPS with no jank.
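
For scale: 65 FPS means a frame budget of 1000/65 ≈ 15.4 ms, so 100M polygons per second works out to roughly 100M/65 ≈ 1.5M polygons per frame.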



