Yes it is! Internet safety 101. Don't post stuff in public that you don't want to be public. There's always going to be creepy people doing whatever they want with all your public information. No AI company can stop that - the cat's out of the bag once you publish it to the whole world.
Town, outside, and dungeon represent decreasing levels of safety. In most games, players want a clear indication of how much danger they are in just walking around. Some games, like Dark Souls, do blur these lines. I think it would be easy to go overboard.
This strikes me as one of those things that sounds better on paper than in practice.
I think Dark Souls is not a fluke; it shows that when executed well (which may very well be hard), it is additive. It makes things feel more organic.
From the article:
"Maybe one cave system has a place where it connects to a dungeon, which connects also to a basement in some guy’s house in the middle of nowhere."
This just sounds better than having the black and white delineations between spaces.
Yes!
> Maybe one cave system has a place where it connects to a dungeon, which connects also to a basement in some guy’s house in the middle of nowhere
To an extent, Tears of the Kingdom really does do this in a few places, but not enough. It really is fun finding new holes into the underworld from a cave, and using the caves to get into the shed in that one village, or to the tower, etc.
Something I've occasionally wished for is a classic-style Zelda game[1] where partway through the adventure you discover that all the dungeons are actually adjacent to each other, and you can open up passages connecting them turning it all into one big Metroidvania experience.
[1]: i.e. one with 4-8 dungeons and new navigation/combat tools in each, not a sandbox like BotW
I was surprised that ToTK was so focused on the underworld. The sky islands are much nicer.
That said, I was also surprised ToTK had the same plot as BotW. Like, Ganon takes over the castle and then they defeat him and then they go into the basement and he's just there and he takes over the castle again?
Safety can come from control over the world though. Consider Minecraft and Terraria (especially older MC), where monsters can spawn in most areas outside some minimum radius from the player. Neither is particularly "scary" because they give the player straightforward ways to control the situation. In fact, monster spawning leads to a lot of emergent gameplay in them.
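To make the "control over spawning" point concrete, here is a minimal sketch of a distance-gated spawn rule in the spirit of what the comment describes. The radii and light threshold are made-up illustrative values, not Minecraft's or Terraria's actual numbers.

```python
import math

# Hypothetical spawn rule: hostile mobs may only appear in a ring
# around the player, and only where it is dark enough. Lighting up
# (or building over) that ring is how the player takes control.
MIN_SPAWN_RADIUS = 24.0   # too close to the player: no spawns (made-up)
MAX_SPAWN_RADIUS = 128.0  # too far: area is inactive (made-up)

def can_spawn_at(player_pos, spawn_pos, light_level, max_light=7):
    """Return True if a hostile mob may spawn at spawn_pos."""
    dx = spawn_pos[0] - player_pos[0]
    dy = spawn_pos[1] - player_pos[1]
    dist = math.hypot(dx, dy)
    in_ring = MIN_SPAWN_RADIUS <= dist <= MAX_SPAWN_RADIUS
    dark_enough = light_level <= max_light
    return in_ring and dark_enough

print(can_spawn_at((0, 0), (50, 0), light_level=0))   # in ring, dark -> True
print(can_spawn_at((0, 0), (10, 0), light_level=0))   # too close -> False
print(can_spawn_at((0, 0), (50, 0), light_level=12))  # too bright -> False
```

The emergent gameplay falls out of rules like this: because every condition is legible and player-manipulable, "safety" becomes something you engineer rather than something the map hands you.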
Expression is a bit distracting though, especially if the ultimate goal is to publish text. Doodles and character-size changes probably won't make it into a printed manuscript. I'd argue that's why tools like Microsoft Word are bad for writing; such software puts unimportant things front and center, like changing colors and fonts, when ultimately we just want to mark up the semantics of the text, like conveying a quote or code or basic emphasis.
There are many stages: creative, editing, and publishing (the above can all be broken down even more, and this is often useful, but I had to stop someplace). Creative is the first stage, just getting everything down. Editing is the details of making it correct (both fact-checking and grammar). Publishing is making it all look nice. They are 3 separate steps that demand 3 different skill sets. You need to keep them separate.
Unfortunately the above is easy to say, but hard to enforce. If you are creating something you should stop if the facts are wrong (no point in continuing when you suddenly realize your argument depends on something that might be false), even though fact-checking is an editing process. You cannot refer to data in a chart until you create the chart. For many people a misspelling is something their brain will not ignore, even though they know the word they mean and fixing it belongs to editing - your flow is already interrupted either way, and not fixing it means the flow stays interrupted.
I don't use anything like a "writer deck" but for me pen and paper is a non-starter due to hand fatigue. I can type for much, much longer periods than I could ever hope to write by hand.
It also introduces significantly more lag, at least for me, between the thinking and actual writing down of the words.
Sometimes slowing down the process like this is helpful; in other cases it's better to make the emission of the words onto the page as immediate as possible. It depends on the piece.
Costs about £5, lasts many months of heavy use -- much more than any cartridge pen as the whole barrel is full of ink -- and I have never ever had one leak in my bag or pocket.
Not the person you were replying to, but “oh, gee, I wish I’d thought of that! /s”.
I spent my first few decades trying to train myself not to write in a way that causes physical pain. The closest I got was when I discovered Lamy Safari pens, which won’t let me hold them the “wrong” way. That only makes it a little less horrid.
It's the same kind of "workflow optimization" that Notion and Obsidian users suffer from. You spend so much time making your tools more productive but don't get any actual work done.
I just use Obsidian out of the box. No extensions. I don't use tags. I don't use any fancy features. It's a markdown editor with a file tree to me. It's great.
Government agencies like to take standards off the shelf whenever they can. Citing something overseen by an apolitical, non-profit organization avoids conflicts of interest (relative to the alternatives).
That’s the definition of throwing the baby out with the bath water.
Is ISO as an organisation sometimes imperfect (as in the docs case)? Sure, it's composed of humans, who are generally flawed creatures. Is it generally a good solution despite that? Also sure.
They’ve published tens of thousands of standards over 70-plus years that are deeply important to multiple industries, so disregarding them because Microsoft co-opted them once 20-odd years ago seems unreasonable to me.
Office Open XML, the standard behind .docx and other zipped-XML formats, was fast-tracked into an international standard without many rounds of review (by the same JTC 1!).
> “International standards have a special status,” says Phil Wennblom, Chair of ISO/IEC JTC 1. “Even though RISC-V is already globally recognized, once something becomes an ISO/IEC standard, it’s even more widely accepted. Countries around the world place strong emphasis on international standards as the basis for their national standards. It’s a significant tailwind when it comes to market access.”
Says that, but I don't agree with it. If anything, it would have been less successful at being picked up in discount markets if the specs weren't free for download, and I don't know what fringes they're trying to break into, but probably none of them care whether the spec is ISO.
That can depend on how the spec gets made into an ISO standard. There is a process called "harvesting" that can allow the original author to continue to distribute an existing specification independently of ISO.
I'm guessing in those kinds of situations it doesn't matter about the arch given x86 and ARM also aren't ISO standards. The manufacturers however should comply with relevant quality standards.
It doesn't matter when there is no ISO standard for a given tech. But as soon as there is one, then you have to provide arguments as to "why didn't you use the standard".
Usual lies. There are a plethora of largely ignored international standards. Making something an international standard is just one of many ways to pursue wide worldwide acceptance, and it still has a high failure rate.
My take is that it could help rein in fragmentation. RISC-V has different profiles defining which instructions are included for different use cases, like a general-purpose OS, and enshrining them as an ISO standard would give the entire industry a rallying point.
Without these profiles, we are stuck memorizing what a word soup like RV64GCBV_Zicntr_Zihpm_etc means.
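To illustrate the word soup: a RISC-V ISA string is, by convention, "rv" plus the register width, then a run of single-letter extensions, then underscore-separated multi-letter extensions. Here is a rough sketch that splits one apart (not a validating parser, and it only covers the simple cases):

```python
# Rough sketch: split a RISC-V ISA string into its base ISA, its
# single-letter extensions, and its multi-letter (e.g. Z*) extensions.
# Convention: "rv" + XLEN + single letters, then "_"-separated
# multi-letter extension names.
def split_isa_string(isa: str):
    parts = isa.lower().split("_")
    head, multi = parts[0], parts[1:]
    assert head.startswith("rv"), "expected an rv-prefixed ISA string"
    xlen = "".join(ch for ch in head[2:] if ch.isdigit())
    singles = head[2 + len(xlen):]  # e.g. "gcbv"
    return f"rv{xlen}", list(singles), multi

base, singles, multi = split_isa_string("RV64GCBV_Zicntr_Zihpm")
print(base)     # rv64
print(singles)  # ['g', 'c', 'b', 'v']
print(multi)    # ['zicntr', 'zihpm']
```

A profile name like RVA23 is essentially a short alias for one agreed-upon bundle of these extensions, which is exactly the fragmentation-taming role the comment describes.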
"Fossilised" is often desirable or requested in some industries. Developing for the embedded market myself, we often have to stick to C99 to ensure compatibility with whatever ancient compiler a customer or even chipset vendor may still be running.
I wouldn't say it never had a problem, but the profiles are definitely a reasonable solution.
However even with profiles there are optional extensions and a lot of undefined behaviour (sometimes deliberately, sometimes because the spec is just not especially well written).
The FUD keeps being brought up, but the solution here was in place before the potential issue could manifest.
It started with G, later retroactively named RVA20 (with a minor extra extension that nobody ever skipped implementing), then RVA22 and now RVA23. All application processor implementations out there conform to a profile, and so do the relevant Linux distributions.
Of course, in embedded systems where the vendor controls the full stack, the freedom of micromanaging which extensions to implement as well as the freedom to add custom extensions is actual value.
The original architects of the ISA knew what they were doing.
Sometimes it helps, sometimes it doesn't. Like when Sun Microsystems submitted ODF for standardization to ISO, it was so successful that Microsoft had to do it too for OOXML. In fact MS pushed so hard that it left a huge trail of destruction in the standards committees.
Other times, like with the "ISO power plug", the result was ISO/IEC 60906-1 which nobody uses. Swiss plugs (IEC Type J), which this plug is based on, use a slightly different distance for the ground pin, so it is incompatible. Brazil adopted it (IEC Type N) but made changes to pin diameter and current rating.
Governments seem to care about "self-sufficiency" a lot more these days, especially after what's happening in both China and the US right now.
If the choice is between an architecture owned, patented and managed by a single company domiciled in a foreign country, versus one which is an international standard and has multiple competing vendors, the latter suddenly seems a lot more attractive.
Price and performance don't matter that much. Governments are a lot less price-sensitive than consumers (and even businesses), they're willing to spend money to achieve their goals.
This is exactly what makes this such an interesting development. Standardization is part of the process of the CPU industry becoming a mature industry not dependent on the whims of individual companies. Boring, yes, but also stable.
Yes, and they're both massively debated and criticised, to the point that the industry developed RISC-V in the first place. Not to mention the rug-pull licensing ARM pulled a few years back.
>On May 5, 1993, Sun Microsystems announced Windows Application Binary Interface (WABI), a product to run Windows software on Unix, and the Public Windows Interface (PWI) initiative, an effort to standardize a subset of the popular 16-bit Windows APIs.
>In February 1994, the PWI Specification Committee sent a draft specification to X/Open—who rejected it in March, after being threatened by Microsoft's assertion of intellectual property rights (IPR) over the Windows APIs
It ticks a checkbox. That's it. Some organizations and/or governments might have rules that emphasize using international standards, and this might help with it.
I just hope it's going to be a "throw it over the fence and standardize" type of a deal, where the actual standardization process will still be outside of ISO (the ISO process is not very good - not my words, just ask the members of the C++ committee) and the text of the standard will be freely licensed and available to everyone (ISO paywalls its standards).
One of the demos was printing a thing out, but the processor was hopelessly too slow to perform the actual print job. So they hand unrolled all the code to get it down from something like a 30 minute print job to a 30 second print job.
I think at this point it should be expected that every publicly facing demo (and most internal ones) is staged.
So Jupiter is 317.8 M⊕, this thing is around 80-150, but ... Saturn is right there at ~95 ... so it's unlikely to have a solid surface, but it likely has a rocky core, and wild winds at this temperature. (Saturn's average temp is -178C, -138C "surface", and this candidate seems to be at -48C.)
It seems that all of this is based on 2 data points, and they only provide some examples that are consistent with that, but the models are also very low-confidence (as we don't have a lot of data about cold and small orbiting things - as they are hard to detect).
Offtopic, but such an interesting civilization where the keepers of knowledge seem to relate to this statement so much, innit?
Very Zen or is it just the overwork? Maybe it's a thing installed in our childhoods so that we would not struggle for power. (I certainly remember acquiring this manner of speaking based on fundamental self-deprecation around 5th grade, some other kids not acquiring it, and then 10y later we'd have mutually incomprehensible life scenarios.)
While kinds of dark humor other than "the falsity and futility of my own existence, amirite?" don't quite resonate with people as much, for whatever reason.
I propose main character syndrome as explanation. Reading too many blogs, thinking we are one of the cognoscenti, projecting ourselves a bit too close to the big polymath plasma screen in the sky, and eventually just ending up as ash in the divertor at the bottom of the big social tokamak. We think we know better, because we likely do, but what good does that do us?
It takes a very thoughtful approach to introducing an increasingly complex Scheme implementation. It doesn't shy away from the complexity that many Lisp implementation tutorials sweep under the rug.
Posting a photo of yourself online is not an invitation for AI generated nudes.