Perl being so old means it's extremely fast at what it was designed to do: processing streams and pipes. In a few tasks it's faster than C, but the bigger win is how much faster it is to get a useful script or program written. With all the implicit syntactic sugar and the language's flexibility, you can just do things the one way you know how, and that's usually good enough.
Python is pretty good for this too, and because modern computers are so fast it usually doesn't matter that it's much slower than perl. But if you're doing something like processing terabytes of files, it's probably worth your time to find or vibe-code a perl one-liner and torture it into working for your task.
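A purely hypothetical sketch of what that can look like: this one-liner sums the third tab-separated column of a large log file (huge.log is a made-up name), streaming line by line so memory stays flat no matter how big the input gets.

    # -n loops over input lines, -a autosplits each line into @F,
    # -F sets the split pattern, -l handles the newlines
    perl -F'\t' -lane '$sum += $F[2]; END { print $sum }' huge.log

The END block runs once after the whole stream has been consumed, which is the usual pattern for aggregating over files too big to hold in memory.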
Perl is amazing when it comes to regular expressions too; it's one of the reasons perl is way more fun to write than Python. I still use perl for regex-heavy tasks, and I wish Python had integrated regex into the language the same way.
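To make the contrast concrete, here's a small sketch (with a made-up log line) of what "regex in the language" means: =~ binds a string to a pattern, captures land straight in $1 and $2, and s/// is an operator rather than a method call, with no import or explicit compile step.

    my $line = "2024-05-01 ERROR disk full";
    if ($line =~ /^(\d{4}-\d{2}-\d{2})\s+(\w+)/) {
        print "date=$1 level=$2\n";   # captures are just variables
    }
    $line =~ s/ERROR/WARN/;           # substitution edits the string in place

The Python equivalent goes through the re module, re.search(), and a match object, which is exactly the ceremony this wish is about.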
The complexity gap between the modern web and a PDF is marginal, and PDFs do get printed for menus. Editing a PDF, uploading it to the site, and keeping prices in sync across the site, online ordering, and the PDF menu is just part of the business. There are lots of platforms that help with this, such as Slice.
Of course, by materializing her memories, they are re-establishing or strengthening neural pathways that would otherwise wither away with time. The memories she "revisits" of her loved ones aren't necessarily grief-dependent, but over time the illusion becomes a weird trap that she would grow more aware of, creating an uncanny-valley-like situation.
So not necessarily a "hell" but more like unneeded and distracting kitsch cluttering the shelves; turning your memories into cheap trinkets.
LLMs are rather easy to convince. There’s no formal logic embedded in them that provably restricts outputs.
The less believable part for me is that people would persist long enough, and invest enough resources in prompting, to do something with an automated agent that doesn't have the potential to massively backfire.
Secondly, they claimed the attackers used Anthropic's own infrastructure, which is silly; there is no doubt some capacity in China to do this. I would also expect incident response teams, threat detection teams, and other experts to report this to Anthropic if Anthropic doesn't detect it themselves first.
It sure makes for good marketing to go out and claim such a thing, though. This is exactly the kind of FOMO- and panic-inducing headline that is driving the financing of the whole LLM revolution.
there are llms which are modified to not reject anything at all, afaik this is possible with all llms. no need to convince.
(granted you have to have direct access to the llm, unlike claude where you just have the frontend, but the point stands. no need to convince whatsoever.)
It's C++ programs in a userscript-style format, compiled with a bundled instance of Clang. Windhawk shows diffs between versions, and most programs aren't much longer than a couple dozen lines, so they're pretty easy to verify visually.
We can’t. We don’t know how to do it.