I think that most Oracle haters hate them for their obscenely expensive and often idiosyncratic database product, as well as for their legal division being larger than their engineering division.
And speaking of databases, the idea of running custom Java code inside the database seems like an abuse of the database. How many people actually use this feature, and how many of them like it (as opposed to having made a bad technical decision 20 years ago)?
Plenty. This is no different from running C and Perl, which Oracle supported before, from what PostgreSQL allows (its stable Rust support was being cheered here just last week), or from running .NET on SQL Server.
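For anyone who hasn't seen in-database custom code, here is a minimal analogous sketch using Python's stdlib `sqlite3` and its `create_function` hook; the table, data and `vat` function are invented for illustration, standing in for what Java in Oracle or PL/Python in PostgreSQL do at much larger scale:

```python
import sqlite3

# Register a custom function so the engine itself can call it per row
# inside SQL -- the same idea as Java stored code in Oracle or
# PL/Python in PostgreSQL, just in miniature.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 19.99), (2, 5.00), (3, 40.00)])

def vat(amount):
    """Custom logic the database calls for each row (made-up rate)."""
    return round(amount * 1.21, 2)

conn.create_function("vat", 1, vat)  # name, argument count, callable

rows = conn.execute("SELECT id, vat(amount) FROM orders ORDER BY id").fetchall()
print(rows)  # [(1, 24.19), (2, 6.05), (3, 48.4)]
```

The point is that the query planner, not the client, drives the custom code, which is exactly what makes the approach attractive for data-heavy logic.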
When performance matters, stored procedures are the way to go, rather than wasting network traffic and CPU cycles on the client. As it happens, these additional runtimes are a great and safer way to extend PL/SQL, PL/pgSQL, T-SQL, ... than writing C extensions.
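The network-traffic argument can be made concrete with a small sketch; the `sales` table and its contents are made up, and `sqlite3` stands in for a networked RDBMS:

```python
import sqlite3

# Aggregating in the database ships one row back to the client
# instead of every row -- the round-trip saving stored procedures
# exploit at scale.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("eu", i) for i in range(10_000)])

# Client-side: 10,000 rows cross the wire, then we burn client CPU.
rows = conn.execute("SELECT amount FROM sales").fetchall()
client_total = sum(amount for (amount,) in rows)

# In-database: a single aggregated row crosses the wire.
(db_total,) = conn.execute("SELECT SUM(amount) FROM sales").fetchone()

assert client_total == db_total
print(len(rows), db_total)  # 10000 rows fetched vs one value
```

With a real network in between, the first variant pays latency and serialization for every row; the second pays it once.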
> When performance matters, stored procedures are the way to go, rather than wasting network traffic and CPU cycles on the client. As it happens, these additional runtimes are a great and safer way to extend PL/SQL, PL/pgSQL, T-SQL, ... than writing C extensions.
While I agree in principle, in practice to me it feels like the tooling just isn't there.
Testing and debugging are a bit more difficult than with other languages (e.g. even using breakpoints and stepping through code), things like logging typically don't have pleasant implementations (e.g. log shipping), and the discoverability and even version control of the code also tend to be worse, among other things. That's even before you get into building around the particular runtime that you're provided with, trying to get a grip on dependency/package management, automated CI deploys, rollbacks, monitoring/health checks, local development environments and so on.
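To be fair, a basic test harness for in-database logic is achievable with stdlib tools alone; here is a minimal sketch where the schema, names and query are invented stand-ins for an actual stored routine:

```python
import sqlite3
import unittest

# The "stored logic" under test -- a plain query here, standing in
# for a stored procedure, since everything is a made-up example.
ACTIVE_COUNT = "SELECT COUNT(*) FROM users WHERE active = 1"

class ActiveUsersTest(unittest.TestCase):
    def setUp(self):
        # Fresh throwaway database per test: isolated, fast, repeatable.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE users (id INTEGER PRIMARY KEY, active INTEGER)")

    def test_counts_only_active(self):
        self.conn.executemany("INSERT INTO users VALUES (?, ?)",
                              [(1, 1), (2, 0), (3, 1)])
        (count,) = self.conn.execute(ACTIVE_COUNT).fetchone()
        self.assertEqual(count, 2)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(ActiveUsersTest))
```

The pain the comment describes is real, though: against Oracle or SQL Server the "fresh throwaway database" step is exactly the part that takes effort to automate.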
My experiences might be the opposite of some folks, but I recall working on a system where most of the logic was implemented in the database packages and something like Java was used just as a glorified templating solution to serve a webapp. The performance was great, but actually working with the codebase was an utter nightmare, so it's not worth it in my opinion. That's like choosing to write a webapp in Assembly just because it's faster.
Do you debug stored procedures?

- 47% Never
- 44% Rarely
- 9% Frequently

Do you have tests in your database?

- 70% No
- 15% I don't know
- 14% Yes

Do you keep your database scripts in a version control system?

- 54% Yes
- 37% No
- 9% I don't know

Do you write comments for the database objects?

- 49% No
- 27% Yes, for many types of objects
- 24% Yes, only for tables
If about half the people never debug their code, 70% don't test it, almost half don't keep it in version control and about half don't bother writing comments of any sort, that's the kind of code that I don't want to be working with, and I would advise others against going for that approach. While we can talk about the fact that these things can be done, the fact that they're not is evidence enough that the community just isn't there yet.
Use databases for what they're good at (including some in-database processing, like reporting), but don't try to do everything in them.
Let's not blame databases for lack of skills or interest in how to use them properly.
Oracle, Microsoft and IBM provide the same kind of IDEs, graphical debuggers and source control integration as any other programming language.
It is this lack of skills that gives rise to fads like NoSQL.
By the way, there are similar results related to debugging in other languages, where most can't do better than printf debugging and don't know unit tests, profilers or static analysers.
That is the outcome of bootcamps or CS degrees without sound engineering practices, while people label themselves "engineers".
> Let's not blame databases for lack of skills or interest in how to use them properly.
We can (and should) explore the causes for it, sure, but that doesn't change the reality that if you join a project that uses a certain approach, it isn't guaranteed to be using the best possible practices, but rather whatever is popular and easy to do in the industry, unless you're very selective about where you work.
> By the way, there are similar results related to debugging in other languages, where most can't do better than printf debugging, don't know unit tests, profilers or static analysers.
I'd say that this is true to some degree and is also a reflection of either poor tooling or lack of interest. For example, command line debuggers with arcane keybinds will be harder for the average developer to learn and use effectively than just clicking on the line they want to stop at in a JetBrains (or similar) IDE and clicking a custom run button that will launch their entire project in debug mode. The same goes for being able to run either your entire test suite or a particular test by just clicking a button in the source file, helpfully shown by a good IDE.
Things get worse when you want to test the integration with an actual data source (like an external API or a database). In some cases you'll have to mock so much of it that you won't be testing anything remotely close to the real thing, or you'll have to deal with bootstrapping an instance of the API (if you can even self-host the full thing) or a real database, which is really good for how truthful your tests are against real-world use cases, but needs certain configuration and resources to set it all up. Sometimes you can get away with something close enough, like using an in-memory database behind an ORM, but some of those abstractions end up leaky. It's even worse if you want to test your integration against cloud services.
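One concrete example of the "close enough" stand-in leaking, using stdlib `sqlite3` with an invented schema: SQLite's type affinity happily stores a string in an INTEGER column, while stricter databases such as PostgreSQL or Oracle would reject the insert, so a test that passes against the in-memory stand-in can still fail in production:

```python
import sqlite3

# SQLite (without STRICT tables) does not enforce column types,
# so this insert succeeds even though created_at is declared INTEGER.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, created_at INTEGER)")
conn.execute("INSERT INTO events VALUES (1, 'not-a-number')")  # accepted!

value = conn.execute("SELECT created_at FROM events").fetchone()[0]
print(repr(value))  # 'not-a-number' -- the bad data survived the "test"
```

A test suite built on this stand-in would never see the constraint violation the real database would raise.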
Static analysis tools are not without their issues either: something like SonarQube is good in theory, but will have you struggling to set up the actual scanner (including mundane stuff like source file encoding) on your CI server, to maintain separate configurations if you ever want to run it against your local codebase, and to configure what should or shouldn't actually trip up the analysis and throw warnings at you, because not all of the recommendations will even be viable for your framework and how it expects code to be written.
> It is this lack of skills that gives rise to fads like NoSQL. ... That is the outcome of bootcamps or CS degrees without sound engineering practices, while people label themselves "engineers".
Does it mean that we shouldn't do these things? No, but it definitely means that we shouldn't just wave our hands around and suggest that it's just an issue of education, when actually trying to use the current technologies is often like banging your head against a wall. Use what works well and causes the least headaches, be open to eventually trying new things as the ecosystem and tooling improve, but don't stray too far from what others do successfully for now either.
From what I've seen, the things that have absolutely improved are schema versioning solutions (and thus, versioned DB migrations) and the ability to run database instances locally for development (in throwaway containers), so that you can test breaking migrations with believable seeded data before they ever need to run on a shared environment. Codegen could still be better (e.g. generating Java/.NET/... entity code for an ORM in a schema-first approach), but some forward/reverse engineering has been around for a decent amount of time at least, when dealing with models (for example, in MySQL Workbench, though pgAdmin is still lagging behind there). There are even tools for easier development of APIs, like Hasura, PostGraphile, PostgREST and so on, though the adoption there varies.
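The core of those versioned-migration tools is small enough to sketch; this is a toy illustration with invented table names and stdlib `sqlite3` standing in for a throwaway container instance, while real tools (Flyway, Liquibase, Alembic, ...) do far more:

```python
import sqlite3

# Numbered scripts applied exactly once each, with the applied
# versions tracked inside the database itself.
MIGRATIONS = [
    (1, "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE customers ADD COLUMN email TEXT"),
]

def migrate(conn):
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    current = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()[0] or 0
    for version, sql in MIGRATIONS:
        if version > current:  # skip anything already applied
            conn.execute(sql)
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)
migrate(conn)  # re-running is a no-op: applied versions are skipped
cols = [row[1] for row in conn.execute("PRAGMA table_info(customers)")]
print(cols)  # ['id', 'name', 'email']
```

Running the same migration set against a locally seeded instance is exactly how you catch a breaking `ALTER TABLE` before it hits a shared environment.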
Edit: oh, another thing that was really good was recent versions of Oracle letting you automatically generate indices for your schema based on how it's actually queried, in case the queries evolve with time but nobody reviews the indices. Except that the automatically generated ones couldn't be removed manually, which felt like bad design. Despite that, more RDBMSes should have that sort of functionality, or the equivalent of SQL Tuning Advisor, that gives you actionable advice. Oracle was a mess to work with for other reasons, though.
For doing in-database processing, even when you use good solutions like DataGrip, things still don't feel as good as what you can knock together using your typical Java + Spring + Hibernate setup, C# + ASP.NET + EF, or other equivalents. I'd personally use DB views for complex queries, or dynamically generate SQL (like in MyBatis XML) so as not to get too caught up in ORM idiosyncrasies, but would still implement lots of logic in the apps.
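The "views for complex queries" split can be sketched as follows; the tables, view and data are invented, and stdlib `sqlite3` stands in for a real RDBMS:

```python
import sqlite3

# The complex join/aggregate lives in the database as a view,
# versioned with the schema; the application code stays trivial.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, name TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (10, 1, 5.0), (11, 1, 7.5), (12, 2, 3.0);

    -- The query complexity is hidden behind a simple, stable name.
    CREATE VIEW customer_totals AS
        SELECT c.name, SUM(o.total) AS total
        FROM customers c JOIN orders o ON o.customer_id = c.id
        GROUP BY c.name;
""")

# The app (or ORM mapping) only ever sees the flat view.
totals = dict(conn.execute("SELECT name, total FROM customer_totals"))
print(totals)  # {'Ada': 12.5, 'Grace': 3.0}
```

This keeps the heavy lifting near the data without committing to full stored-procedure development, which is roughly the middle ground the comment is advocating.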