Hacker News

> While this isn’t a big deal in nearly all production environments (you want to know when your database gets slow, so you can optimize queries or add indexing), it matters a lot to us because a slow query can affect other customers' queries.

One more reason to be leery of single-threaded eventing systems. You'd never run into this issue with a threaded web app, and it would perform just as well provided you kept your data structures as independent as they are in your current eventing setup.

Comparing an eventing system to a threading system, the eventing system is inherently less shared-nothing: it shares everything the threading system shares, plus the event thread itself.
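That sharing is easy to demonstrate. A minimal sketch using Python's asyncio as a stand-in for the eventing system under discussion (the handler names are hypothetical): one synchronous call anywhere in the loop stalls every other context, because they all share the single event thread.

```python
import asyncio
import time

async def blocking_handler():
    # A synchronous call (stand-in for a blocking MySQL query)
    # holds the event thread, so nothing else can run meanwhile.
    time.sleep(0.2)

async def quick_handler(finish_times):
    # Should be nearly instant -- but it can't start until the
    # event loop is free again.
    finish_times.append(time.monotonic())

async def main():
    finish_times = []
    start = time.monotonic()
    await asyncio.gather(blocking_handler(), quick_handler(finish_times))
    return finish_times[0] - start

delay = asyncio.run(main())
# The quick handler was delayed by roughly the full blocking call.
print(f"quick handler delayed by {delay:.2f}s")
```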




You appear only to have read the first graf of this post, as the whole point of eventing the database access layer is to eliminate the "slow query blocks" problem.

But that's not why I'm commenting. Rather:

You're able to make that last assertion only by shifting the meaning of the word "shared" and denuding it of all its concurrency implications. Yes, event systems "share" the event loop, in all the glory of the word "shared". However, no two contexts in an evented system ever step on each other for access to a shared resource.


> You appear only to have read the first graf of this post

I read a little more than that, but I commented on what was interesting to me. If I'm reading the rest of it right, it's basically a tutorial on making a Python extension wrapping two specific MySQL API functions. That's fine, but it's not that interesting (to me).

> However, no two contexts in an evented system ever step on each other for access to a shared resource.

Isn't that exactly what happens when other requests are blocked by a blocking MySQL call? They are stepping on each other for access to the shared event thread, a resource they all need concurrent access to. Is this not the case? Please help me understand if I am misreading you.
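For contrast, a minimal threaded sketch (stdlib Python again as a stand-in; the worker names are hypothetical): a blocking call parks only the thread that made it, so other requests never wait on a shared event thread.

```python
import threading
import time

finish = {}
start = time.monotonic()

def blocking_worker():
    # Blocking call -- but only this thread waits on it.
    time.sleep(0.2)
    finish["blocking"] = time.monotonic()

def quick_worker():
    # Runs immediately, unaffected by the blocked sibling thread.
    finish["quick"] = time.monotonic()

threads = [threading.Thread(target=blocking_worker),
           threading.Thread(target=quick_worker)]
for t in threads:
    t.start()
for t in threads:
    t.join()

quick_delay = finish["quick"] - start
```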


Also, making everything non-blocking will not necessarily make the user experience better. It will just allow you to have more users concurrently, all waiting for your slow queries. In other words, user A no longer has to wait on users B, C, and D, but still has to wait on his/her own slow query. Depending on your application usage patterns, this could be enough to go from unusable to awesome. Or it could be that you just need to add a ton of indices.
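That trade-off can be sketched with asyncio (the slow queries here are simulated with asyncio.sleep, not a real database): four users wait concurrently rather than serially, so wall time is about one query's worth, but each user still pays their own query's full latency.

```python
import asyncio
import time

async def slow_query(user):
    # Stand-in for a non-blocking slow query: ~0.2 s for every user.
    await asyncio.sleep(0.2)
    return time.monotonic()

async def main():
    start = time.monotonic()
    done = await asyncio.gather(*(slow_query(u) for u in "ABCD"))
    per_user = [t - start for t in done]
    total = time.monotonic() - start
    return per_user, total

per_user, total = asyncio.run(main())
# Wall time is about one query (not four), yet every user still
# waited the full duration of their own slow query.
```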



