Cloudflare already knows how to make the web hell for people they don't like.
I read the robots.txt entries as the AI bots that will not be marked as "malicious" and that websites will have the option to allow. The rest will be given the Cloudflare special.
Sam Altman literally casts himself as a God, apparently, and that's somehow to be taken as an indictment of his rivals. Maybe it's my GenX speaking, but that's CEO bubblespeak for "OpenAI is fucked, abandon ship".
It came up at the same time as the UK prince, in a "wealthy folks just do these things, it's funny and not an endorsement, you plebs don't understand" sort of way.
For what it's worth, a UK prince is one of the few people, or groups of people, that I would assume were likely not hiding Nazi sympathies. Their entire country, and specifically their recent royal ancestors, were subject to Nazi aggression and responsible for countering it. There's a long history of dressing up as those you want to lampoon, especially in British media.
Well, there is Edward VIII, whom, as an American who doesn't follow UK monarchy drama, I was flabbergasted and in disbelief to learn about, and who triggered a Wikipedia rabbit hole after I saw, in particular, the closing credits of The Crown S2E6.
I haven't watched it, and skimming the wikipedia page isn't making it obvious what you might be alluding to, other than maybe him touring Germany prior to the war.
I'm not entirely sure where the downvotes came from, since nobody but you bothered to respond. Maybe it's based on people's belief that Nazis are some special and unique threat, or that the only things worth thinking about when they're mentioned are the concentration camps and their treatment of the Jewish people. But they did so much more than just that, even if that was uniquely horrifying for the period. The Jews weren't the only people sent to concentration camps, and overall WWII cost 70-85 million people their lives. Nazi Germany affected so many people that nobody gets to claim them as their own unique boogeymen to the exclusion of all others.
Hmm, from some light reading I'm unsure how active he was as "part of the plot". In any case, that sort of supports my original claim: the British royals not only have a history of Nazis attacking their country, but also of Nazis trying to meddle with their royal line and succession, which also wouldn't endear the royals to them.
Honestly, when watching, I had figured The Crown was playing it up for drama or had invented it whole cloth, because the whole thing was so foreign compared to what I was taught about the WW2 British monarchy. And that's why the way they calmly deliver receipts (artifacts from the Windsor File/Marburg Files) in the credits triggered a rabbit hole of surreal bewilderment. In fact, you will find historians of the British royalty who say The Crown was very lenient in its handling. But specifically about S2E6, there's a section here that summarizes what was in the episode.
If it's an honest question and not just the beginning of a hate-fest, let's think...
Both Java the language and OpenJDK, the main runtime and development kit, have had much more money and manpower poured into them under Oracle than they ever had previously. Both continue to advance rapidly after almost dying before the Oracle acquisition.
MySQL 8 (released in 2018) was a massive release that brought many long-awaited features (like CTEs) to the database, although MySQL's development has stalled during the past few years.
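For anyone who hasn't used them, here's a minimal sketch of the kind of recursive CTE that 8.0 finally enabled. I'm using Python's stdlib sqlite3 just so it's runnable without a MySQL server, but the same WITH RECURSIVE syntax works on MySQL 8+ (the emp table and names are made up for illustration):

    import sqlite3

    # In-memory toy database; on MySQL 8+ the same query works unchanged.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE emp (id INTEGER PRIMARY KEY, name TEXT, boss INTEGER)")
    con.executemany("INSERT INTO emp VALUES (?, ?, ?)",
                    [(1, "root", None), (2, "a", 1), (3, "b", 2)])

    # Recursive CTE: walk the management chain upward from employee 3.
    rows = con.execute("""
        WITH RECURSIVE chain(id, name, boss) AS (
            SELECT id, name, boss FROM emp WHERE id = 3
            UNION ALL
            SELECT e.id, e.name, e.boss FROM emp e JOIN chain c ON e.id = c.boss
        )
        SELECT name FROM chain
    """).fetchall()
    print(rows)  # [('b',), ('a',), ('root',)]

Before 8.0 you'd have needed a stored procedure or repeated self-joins to walk a hierarchy like that.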
Oracle employs several Linux kernel developers and is one of the major contributors (especially to XFS and btrfs): https://lwn.net/Articles/1022414
Not top 3 or even top 10, but better than most companies out there.
That's all I can remember.
edit: after thinking about it for a couple more minutes, they're also the main developer of GraalVM, the only high-quality FOSS AOT compiler for Java (also mentioned by a sibling comment), and they're writing one of the major relatively lightweight modern alternatives to Spring (the other two being Micronaut and Quarkus): https://helidon.io
There's also VirtualBox (inherited from Sun), and probably some other things. Although from what I've heard, you risk being besieged by aggressive Oracle salespeople if they suspect you're using one of the proprietary "extension pack" features. This sort of thing is why I would think very hard before using anything from Oracle.
Made this mistake w VirtualBox once - never again. I ended up somehow running the proprietary version which “phoned home” and some 30 or 60 days later corporate got a call. When it comes to Oracle, I take Nancy Reagan’s advice [0].
> although MySQL's development has stalled during the past few years.
9.0 has finally been released and we are now at 9.3. Nothing big or exciting in any single release, but development is steady. MySQL 8.0 will reach EOL in April 2026, so everyone should move to 8.4 LTS soon(ish), and 9.7.x should also be LTS by then. I know most of HN is about Postgres, but modern MySQL is decent. I think a lot of people still picture MySQL from the 5.0 era. Which is also somewhat true of Java as well.
I think Oracle does contribute a lot of open source code; they just don't get the credit or brag about it.
Regarding Java, it's also worth noting that, alongside IBM, they had been cooperating with Sun since the very early days of Java, including the Network Computer efforts for Java-based thin clients.
I wonder how much of that was due to cursed distro repositories.
I will probably never forget the rage I felt on discovering that the reason my scripts broke between Debian versions was that "apt-get install mysql" had started silently installing MariaDB instead, which was close but subtly different enough. (I am honestly surprised that, considering an Oracle trademark is involved, no lawsuits were filed over this.)
I think the Oracle hate goes a bit beyond the generic hatred for successful companies (be it Microsoft, FAANG, etc).
People actually used to like the products that made these companies explode. Windows 2000 was cool; Facebook was cool; Google was cool. Whenever they stop being pure unadulterated evil for a few minutes, very quickly a lot of people are willing to forgive them and welcome them back into polite society.
But as long as most geeks can remember, Oracle has never been cool. At its peak, the company enabled a world of snooty BOFH DBAs, selling unreasonably expensive products to well-oiled middle-managers. And then they started "acqui-squeezing" adjacent products, blackmailing their own customers, and suing everyone in sight. They could cure cancer tomorrow, gifting all the related IPs to the world, and most geeks would still see them as scum trying to whitewash their image - and they would probably be right. There are some great engineers in Oracle, but their management is all that is wrong with capitalism.
From 01979 until about 02000, Oracle's RDBMS software was probably the best in the world, and definitely better than the free-software alternatives like Postgres. (Remember that MariaDB, then named MySQL, didn't become free software until 02000 and didn't support transactions until even later.) For several years after that, it was still the best database for some purposes. And that's true even though the Bastard Database Operators From Hell were present from the very beginning.
SQL is kind of shitty, but the relational database model is so great that it makes up for it. And until the rise of entity component systems, SQLite, PostgreSQL, and MariaDB, all the decent implementations were proprietary software. Separately, although SQL's implementation of transactions is even more broken than its implementation of the relational model, transactional concurrency is also great enough to make up for SQL, and, again, all the usable implementations used to be proprietary.
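To make the "transactional concurrency" point concrete, here's a minimal sketch of the atomicity guarantee, again using Python's stdlib sqlite3 (the account table and the simulated crash are invented for illustration):

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE acct (name TEXT PRIMARY KEY, balance INTEGER)")
    con.executemany("INSERT INTO acct VALUES (?, ?)", [("alice", 100), ("bob", 0)])
    con.commit()

    # Transfer money atomically: either both sides land, or neither does.
    try:
        with con:  # commits on success, rolls back on any exception
            con.execute("UPDATE acct SET balance = balance - 150 WHERE name = 'alice'")
            # ...imagine a crash before the matching credit to bob runs:
            raise RuntimeError("simulated crash mid-transfer")
    except RuntimeError:
        pass

    print(con.execute("SELECT * FROM acct").fetchall())
    # [('alice', 100), ('bob', 0)] -- the half-done transfer was rolled back

Getting that guarantee (plus durability and sane concurrent readers) used to mean buying a proprietary database; that's the thing that made up for SQL's warts.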
> until about 02000, Oracle's RDBMS software was probably the best in the world
I don't disagree, but that's not what I said. Yes, it was a good product; but everything else that came with it was already shit. It empowered DBAs, not developers, and enterprise people, not hobbyists; for these reasons (and others), it was just not something that most people liked.
Popularity is not simply a function of technical value.
> From 01979 until about 02000, Oracle's RDBMS software was probably the best in the world, and definitely better than the free-software alternatives like Postgres. [...] For several years after that, it was still the best database for some purposes.
Which RDBMS software has/have become the best in the world after that?
I don't know. Oracle was written to run on a VAXCluster with a shared disk with a seek time in the tens of milliseconds, and things like Postgres are kind of architected for that world. The world has changed a lot. Anything you could fit on disk in 02000 fits in RAM today, most of our programmable computing power is in GPUs instead of CPUs, website workloads are insanely read-heavy (favoring materialized views and readslave farms), SSDs can commit transactions durably in 0.1ms and support random reads while imposing heavy penalties on nonsequential writes, and spinning up ten thousand AWS Lambda functions to process a query in parallel is now a reasonable thing to do.
I think you could make reasonable arguments for SQLite, Postgres, MariaDB, Impala, Hive, HSQLDB, Spark, Drill, or even NumPy, TensorFlow, or Unity's ECS, though those last few lack the "internal representation independence" ("data independence") so central to Codd's conception.
If you want high availability and scalability (and damn the expense), then Oracle is probably still number one, especially for write-heavy workloads. But not everyone can afford to burn money.
As far as I am aware (I may be wrong, or things may have changed in the last few years), Postgres is still worse at horizontal scaling than Oracle and Microsoft SQL Server (and likely also DB2).
As one rough estimate, 2.25GHz with AVX512 (at 1 IPC) means you can do 36 billion column-oriented 32-bit integer operations per core per second, which with 384 cores means about 13 trillion 32-bit integer operations per second. On one server. So if you have a query that needs to do a linear scan of a column in a 13-million-row table, the query might take 300μs on one core, but you should be able to do a million such queries per second across the machine. But normally you index your tables so that most queries don't need to do such inefficient things, so you should be able to handle many more queries per second than that!
(Each socket has 12 DDR5 channels, totaling 576 gigabytes per second to DRAM per socket or 1.13 terabytes per second across the two of them, so you'll get worse performance if you're out of cache. And apparently you can use 512GiB DIMMs and get 6 tebibytes of RAM!)
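If it helps, here's that arithmetic as a quick script; every constant is an assumption carried over from the estimate above, not a measurement:

    # Back-of-the-envelope check of the numbers above.
    GHZ = 2.25e9          # assumed clock, cycles/second
    LANES = 512 // 32     # 32-bit lanes per AVX-512 vector = 16
    CORES = 384           # assumed two-socket core count

    ops_per_core = GHZ * LANES            # 36e9 ops/core/s at 1 IPC
    ops_total = ops_per_core * CORES      # ~13.8e12 ops/s for the whole box

    ROWS = 13e6                           # 13-million-row column scan
    scan_seconds = ROWS / ops_per_core    # ~361 microseconds on one core
    scans_per_second = ops_total / ROWS   # ~1.06 million scans/s across cores

    print(f"{ops_total:.2e} ops/s, {scan_seconds * 1e6:.0f} us/scan, "
          f"{scans_per_second:.2e} scans/s")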
So, if you need more than one server for your database, it's probably because it's tens or hundreds of terabytes, or because you're handling tens of millions of queries per second, or because your database software is designed for spinning rust. Spinning rust is still the best way to store large databases, but now the cutoff for "large" is approaching the petabyte scale.
I think the space of databases that are under ten terabytes and under ten million queries per second is large enough to cover almost everything that most people think of when they think of "databases".
> So, if you need more than one server for your database
Servers fail; any serious business obviously needs more than one server. And many need more than one data centre, just for redundancy. The more modern Oracle clusters act in a similar manner to RAID arrays, with virtual databases replicated across clusters of physical servers in such a way that the loss of a physical server doesn't impact the virtual databases at all.
Most databases aren't running a business. It isn't 01973 anymore. My web browser on my phone has a bunch of SQL databases.
I was talking about horizontal scalability, not failover. You don't need horizontal scalability, or for that matter any special software features, for failover. (Though PITR is nice, especially if your database is running a business.) With cloud computing vendors, you may not even need a second server to fail over to; you can bring it up when the failure happens and pay for it by the hour.
The features you're talking about made a lot of sense 30 years ago, maybe even 20 years ago. They still make a lot of sense if you need to handle hundreds of millions of queries per second or if you have hundreds of terabytes of data. But, for everything else, you can get by with vertical scaling, which is a lot less work. Unlike backups or rearchitecting your database, it's literally a product you can just buy.
(A lot of the open-source relational databases I listed also support clustering for both HA and scalability; I just think those features are a lot less important when you can buy an off-the-shelf server with tebibytes of RAM.)
The kinds of databases where your business stops working if the database does are OLTP databases (often, databases of transactions in the financial sense), and they aren't hundreds of terabytes or tens of millions of transactions per second. Oracle itself doesn't scale to databases so big or with such high transaction rates that they can't run on one server in 02025. (It does scale to databases so big or with such high transaction rates that Oracle can't run them on one server in 02025, but that's just because Oracle's VAXCluster-optimized design is inefficient on current hardware.)
People run Oracle because it's too late for them to switch.
> When I reached out to Microsoft about Nixon’s comments, the company didn’t dismiss them at all. “Recent comments at Ignite about Windows 10 are reflective of the way Windows will be delivered as a service bringing new innovations and updates in an ongoing manner, with continuous value for our consumer and business customers,” says a Microsoft spokesperson in a statement to The Verge. “We aren’t speaking to future branding at this time, but customers can be confident Windows 10 will remain up-to-date and power a variety of devices from PCs to phones to Surface Hub to HoloLens and Xbox. We look forward to a long future of Windows innovations.”
The purpose of weasel words is to sell falsehoods; it's premeditated gaslighting. The deception was intentional, and the press and public believed it as intended. I wasn't suggesting that Microsoft is legally liable for anything, only that many people believed their lie, as they intended.
If Microsoft encourages people to believe "the world is flat" and deploys deceptions and weasel words when directly questioned for confirmation there are two conclusions: (1) Microsoft knows the world is in fact not flat and (2) Microsoft wants everyone to believe the world is flat.
And ultimately when people do believe the world is flat it is in fact a belief Microsoft has encouraged.
/tmp is not specified to be a RAM disk by POSIX, just a place whose contents are not expected to persist after a program stops (with implications for backups and disaster recovery). Sure, RAM disks work if the amount of /tmp space you need is less than your free physical RAM, but sometimes that's not the case, either.
Back in the day you might place /tmp in a good spot for random access of small files on a disk platter. /var is vaguely similar but intended for things that need to be persistent.
Anyway it's not uncommon for systems to persist /tmp and clean it periodically from cron using various retention heuristics.
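Something like this minimal Python sketch of a retention sweep is what you might run from cron; the 7-day cutoff and file-only scope are arbitrary assumptions, and real cleaners like tmpreaper or systemd-tmpfiles handle directories, open files, and race conditions far more carefully:

    import os, time

    # Delete regular files under /tmp untouched for more than MAX_AGE seconds.
    MAX_AGE = 7 * 24 * 3600   # arbitrary 7-day retention cutoff
    now = time.time()
    for dirpath, dirnames, filenames in os.walk("/tmp"):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.lstat(path)
            except OSError:
                continue  # file vanished between walk and stat
            if not os.path.islink(path) and st.st_mtime < now - MAX_AGE:
                try:
                    os.unlink(path)
                except OSError:
                    pass  # permissions, sticky-bit ownership, etc.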
Ultimately POSIX concepts of mountpoints are strongly tied to optimizing spinning rust performance and maintenance and not necessarily relevant for SSD/NVME.