Hacker News | axblount's comments

Can you not see a difference between someone's imagination and a publicly posted image?

Posting a photo of yourself online is not an invitation for AI generated nudes.


Yes it is! Internet safety 101. Don't post stuff in public that you don't want to be public. There's always going to be creepy people doing whatever they want with all your public information. No AI company can stop that - the cat's out of the bag once you publish it to the whole world.


Really? Is that what everyone thought on Instagram? I shouldn't post these selfies, one day someone will turn them into nudes, without my permission?


No. By the time Instagram became popular, people had disregarded those early internet safety lessons.


Town, outside, and dungeon represent decreasing levels of safety. In most games, players want a clear indication of how much danger they are in just walking around. Some games, like Dark Souls, do blur these lines. I think it would be easy to go overboard.

This strikes me as one of those things that sounds better on paper than in practice.


I don't think Dark Souls is a fluke; it shows that when executed well (which may very well be hard), this approach is additive. It makes things feel more organic.

From the article: "Maybe one cave system has a place where it connects to a dungeon, which connects also to a basement in some guy’s house in the middle of nowhere."

This just sounds better than having black-and-white delineations between spaces. Yes!


> Maybe one cave system has a place where it connects to a dungeon, which connects also to a basement in some guy’s house in the middle of nowhere

To an extent, Tears of the Kingdom really does do this in a few places, but not enough. It really is fun finding new holes into the underworld from a cave, and using the caves to get into the shed in that one village, or to reach the tower, etc.


Something I've occasionally wished for is a classic-style Zelda game[1] where partway through the adventure you discover that all the dungeons are actually adjacent to each other, and you can open up passages connecting them turning it all into one big Metroidvania experience.

[1]: i.e. one with 4-8 dungeons and new navigation/combat tools in each, not a sandbox like BotW


I was surprised that ToTK was so focused on the underworld. The sky islands are much nicer.

That said, I was also surprised ToTK had the same plot as BotW. Like, Ganon takes over the castle and then they defeat him and then they go into the basement and he's just there and he takes over the castle again?


Safety can come from control over the world though. Consider Minecraft and Terraria (especially older MC), where monsters can spawn in most areas outside some minimum radius from the player. Neither is particularly "scary" because they give the player straightforward ways to control the situation. In fact, monster spawning leads to a lot of emergent gameplay in them.


It's hard for me to see the advantage of using one of these over pen and paper:

- distraction free (except doodling)

- lower power consumption

- expressive in a way that typing can never be

- tends to discourage editing as you write

edit: and less eye strain


Expression is a bit distracting though, especially if the ultimate goal is to publish text. Doodles and character-size changes probably won't make it into a printed manuscript. I'd argue that's why tools like Microsoft Word are bad for writing; such software puts unimportant things front and center, like changing colors and fonts, when ultimately we just want to mark up the semantics of the text: a quote, code, basic emphasis.


There are many stages: creative, editing, and publishing (each of these can be broken down even further, and that is often useful, but I had to stop someplace). Creative is the first stage, just getting everything down. Editing is the details of making it correct (both fact-checking and grammar). Publishing is making it all look nice. They are three separate steps that demand three different skill sets. You need to keep them separate.

Unfortunately the above is easy to say but hard to enforce. If you are creating something, you should stop if the facts are wrong (there's no point in continuing when you suddenly realize your argument depends on something that might be false), even though fact-checking is an editing process. You cannot refer to data in a chart until you create the chart. For many people, a misspelling is something their brain will not ignore, even though they know the word they meant and fixing it belongs to editing; the flow is already interrupted either way, and not fixing it means it stays interrupted.


I don't use anything like a "writer deck" but for me pen and paper is a non-starter due to hand fatigue. I can type for much, much longer periods than I could ever hope to write by hand.


It also introduces significantly more lag, at least for me, between the thinking and actual writing down of the words.

Sometimes slowing the process down like this is helpful; in other cases it's better to make the emission of words onto the page as immediate as possible. It depends on the piece.


> hand fatigue

Try a fountain pen. Seriously. Many people press far too hard with ballpoints; with a fountain pen, you can't -- you'll bend the nib and smudge.

Some adults won't have written with one since school; younger ones, never. But they exist for good, solid ergonomic reasons.

After you get used to them you write better, too.

There are good disposable fountain pens now, and I've been using them for a decade and a half. I like the Pilot V-Pen.

https://macchiatoman.com/blog/2020/1/22/pen-review-pilot-v-p...

https://www.amazon.co.uk/Pilot-Pen-Disposable-Fountain-Black...

Costs about £5, lasts many months of heavy use -- much more than any cartridge pen as the whole barrel is full of ink -- and I have never ever had one leak in my bag or pocket.

Highly recommended.


> but for me pen and paper is a non-starter due to hand fatigue

You may want to look into writing with your arm instead of your hand


Or just using a different type of pen, rather than trying to re-learn to write.


Not the person you were replying to, but “oh, gee, I wish I’d thought of that! /s”.

I spent my first few decades trying to train myself not to write in a way that causes physical pain. The closest I got was when I discovered Lamy Safari pens, which won't let me hold them the "wrong" way. That only makes it a little less horrid.


I can type at the speed of thought without discomfort for hours. If I write at that speed for that long, I get messy handwriting and pain.


It's the same kind of "workflow optimization" that Notion and Obsidian users suffer from. You spend so much time making your tools more productive but don't get any actual work done.


I just use Obsidian out of the box. No extensions. I don't use tags. I don't use any fancy features. It's a markdown editor with a file tree to me. It's great.


I can type about 10x faster than I can write sustained. And handwritten drafts will need to be typed anyway.


Ergonomics. I can type for far longer than I can write by hand.


As I said in https://news.ycombinator.com/item?id=45932398 -- try a fountain pen.


I have, same problem.


What's the advantage of standardizing through ISO/IEC? Better adoption in industry?

Seems like this would take away a lot of power from RISC-V International. But I don't know much about this process.


Government agencies like to take standards off the shelf whenever they can. Citing something overseen by an apolitical, non-profit organization avoids conflicts of interest (relative to the alternatives).

Random example I found at a glance: NIST recommending use of a specific ISO standard in domains not formally covered by a regulatory body: https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.S...


It's impossible to take ISO seriously after the .docx fiasco.


That’s the definition of throwing the baby out with the bath water.

Is ISO as an organisation sometimes imperfect (as in the .docx case)? Sure; it's composed of humans, who are generally flawed creatures. Is it generally a good solution despite that? Also sure.

They’ve published tens of thousands of standards over 70-plus years that are deeply important to multiple industries, so disregarding them because Microsoft co-opted them once, 20-odd years ago, seems unreasonable to me.


What .docx fiasco?


Office Open XML, the standard behind .docx and the other zipped XML formats, was fast-tracked into an international standard without many rounds of review (by the same JTC 1!).


> Citing something overseen by an apolitical, non-profit organization avoids conflicts of interest (relative to the alternatives).

Of course this is a lie. But yes, governments like to claim that.


As the article says:

> “International standards have a special status,” says Phil Wennblom, Chair of ISO/IEC JTC 1. “Even though RISC-V is already globally recognized, once something becomes an ISO/IEC standard, it’s even more widely accepted. Countries around the world place strong emphasis on international standards as the basis for their national standards. It’s a significant tailwind when it comes to market access.”


Says that, but I don't agree with it. If anything, it would have been less successful at being picked up in discount markets if the specs weren't free to download, and I don't know what fringes they're trying to break into, but probably none of them care whether the spec is ISO.


That can depend on how the spec gets made into an ISO standard. There is a process called "harvesting" that can allow the original author to continue to distribute an existing specification independently of ISO.


> Says that, but I don't agree with that

I guess you just never had to fill in a grant application where you have to justify that you are using official standards so that you can get money


I'm guessing in those kinds of situations the architecture doesn't matter, given that x86 and ARM aren't ISO standards either. The manufacturers, however, should comply with the relevant quality standards.


It doesn't matter when there is no ISO standard for a given technology. But as soon as there is one, you have to provide arguments as to "why didn't you use the standard".


Usual lies. There are a plethora of largely ignored international standards. Making something an international standard is just one of many ways to pursue wide worldwide acceptance, and it still has a high failure rate.


My take is that it could help rein in fragmentation. RISC-V has different profiles defining which instructions come along for different use cases, like running a general-purpose OS, and enshrining them as an ISO standard would give the entire industry a rallying point.

Without these profiles, we are stuck memorizing what a word soup like RV64GCBV_Zicntr_Zihpm_etc all means.
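To show what that word soup actually encodes, here's a toy splitter for ISA strings like the one above. This is a sketch only; the real naming rules (version numbers, the expansion of "g" into IMAFD_Zicsr_Zifencei, ordering constraints) are more involved, and `parse_isa_string` is a hypothetical helper, not anything from the spec.

```python
def parse_isa_string(isa: str):
    """Split a RISC-V ISA string into its base and a list of extensions.

    Single-letter extensions follow the base directly (e.g. "gcbv");
    multi-letter ones (Z*, S*, X*) are separated by underscores.
    """
    isa = isa.lower()
    head, *multi = isa.split("_")
    base, letters = head[:4], head[4:]  # e.g. "rv64" and "gcbv"
    return base, list(letters) + multi

base, exts = parse_isa_string("RV64GCBV_Zicntr_Zihpm")
# base is "rv64"; exts is ["g", "c", "b", "v", "zicntr", "zihpm"]
```

A profile name like RVA23 collapses one of these lists into a single agreed-upon label, which is the whole point.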


RISC-V was already gaining a profile mechanism outside of ISO; for example, 'RVA23' is a known set of extensions.


Hardly, see programming languages standards and compiler specific extensions.


Languages are more fluid than processor architectures. I don't think they can be compared.


One would think, yet welcome to enterprise consulting, especially customers whose main business is not selling software.

You will find fossilized languages all over the place.


Fossilised is often desirable or even requested in some industries. Developing for the embedded market myself, we often have to stick to C99 to ensure compatibility with whatever ancient compiler a customer or even chipset vendor may still be running.


RISC-V never had a fragmentation problem, thanks to the profiles.


I wouldn't say it never had a problem, but the profiles are definitely a reasonable solution.

However even with profiles there are optional extensions and a lot of undefined behaviour (sometimes deliberately, sometimes because the spec is just not especially well written).


The FUD keeps being brought up, but the solution here was in place before the potential issue could manifest.

It started with G, later retroactively named RVA20 (with a minor extra extension that nobody ever skipped implementing), then RVA22 and now RVA23. All application processor implementations out there conform to a profile, and so do the relevant Linux distributions.

Of course, in embedded systems where the vendor controls the full stack, the freedom of micromanaging which extensions to implement as well as the freedom to add custom extensions is actual value.

The original architects of the ISA knew what they were doing.


Maybe it helps get government contracts

“We’re standards compliant”


Sometimes it helps, sometimes it doesn't. Like when Sun Microsystems submitted ODF for standardization to ISO, it was so successful that Microsoft had to do it too for OOXML. In fact MS pushed so hard that it left a huge trail of destruction in the standards committees.

Other times, like with the "ISO power plug", the result was ISO/IEC 60906-1 which nobody uses. Swiss plugs (IEC Type J), which this plug is based on, use a slightly different distance for the ground pin, so it is incompatible. Brazil adopted it (IEC Type N) but made changes to pin diameter and current rating.


It's not like ARM and x86 are standardised by ISO either.


Governments seem to care about "self-sufficiency" a lot more these days, especially after what's happening in both China and the US right now.

If the choice is between an architecture owned, patented and managed by a single company domiciled in a foreign country, versus one which is an international standard and has multiple competing vendors, the latter suddenly seems a lot more attractive.

Price and performance don't matter that much. Governments are a lot less price-sensitive than consumers (and even businesses), they're willing to spend money to achieve their goals.


This is exactly what makes this such an interesting development. Standardization is part of the process of the CPU industry becoming a mature industry not dependent on the whims of individual companies. Boring, yes, but also stable.


Yes, and they're both massively debated and criticised, to the point that the industry developed RISC-V in the first place. Not to mention the rug-pull licensing ARM pulled a few years back.


Yes, but if 30 years ago ARM had an ISO standard they could point to, that would have probably helped with government adoption?

(It's still a trade-off, because standards also cost community time and effort.)


Relatedly, 30 years ago someone attempted to turn the Windows 3.1 API into an ISO standard:

https://en.wikipedia.org/wiki/Application_Programming_Interf...

It didn't become one, but it did become standardised as ECMA-234:

https://ecma-international.org/publications-and-standards/st...


Well, Wine shows that Win32 is the only stable ABI, even on Linux.


>On May 5, 1993, Sun Microsystems announced Windows Application Binary Interface (WABI), a product to run Windows software on Unix, and the Public Windows Interface (PWI) initiative, an effort to standardize a subset of the popular 16-bit Windows APIs.

>In February 1994, the PWI Specification Committee sent a draft specification to X/Open—who rejected it in March, after being threatened by Microsoft's assertion of intellectual property rights (IPR) over the Windows APIs

Looks like that's what it was.


They are de facto standards…


It ticks a checkbox. That's it. Some organizations and/or governments might have rules that emphasize using international standards, and this might help with it.

I just hope it's going to be a "throw it over the fence and standardize" type of a deal, where the actual standardization process will still be outside of ISO (the ISO process is not very good - not my words, just ask the members of the C++ committee) and the text of the standard will be freely licensed and available to everyone (ISO paywalls its standards).


> the ISO process is not very good - not my words, just ask the members of the C++ committee

Casual reminder that they ousted one of the founders of MPEG for daring to question the patent mess around H.265 (paraphrasing, a lot, of course)


This allows RISC-V International to propose their standards as ISO/IEC standards.


Live demos are especially hard when you're selling snake oil.


Ironically the original snake oil salesman's pitch involved slitting open a live rattlesnake and boiling it in front of a crowd.

https://www.npr.org/sections/codeswitch/2013/08/26/215761377...


Jesus dude


Yeah. Everyone wants to be like Steve but forgets that he usually had something amazing to show off.


Didn't Steve flip through 3 iPhones and hardcode the network UI to look like they had good signal?


One of the demos was printing a thing out, but the processor was hopelessly too slow to perform the actual print job. So they hand unrolled all the code to get it down from something like a 30 minute print job to a 30 second print job.

I think at this point it should be expected that every publicly facing demo (and most internal ones) are staged.


He faked shit all the time. He just faked it well and actually delivered later.


Every demo of not yet launched product will have something faked.


Does some interesting things if you up the ball speed to 20. The boundary breaks down.

  data.blackBall.v = data.whiteBall.v = createVector(0, 20);


Mine broke even without speeding up things, the black ball is now working together with the white ball.


"When in doubt, use brute force." --Ken Thompson


I was curious about the acceleration due to gravity at the surface:

    G * (120 Earth masses) / (radius of Jupiter ^ 2)
Comes out to 9.7 m/s. Not bad!
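For the curious, that back-of-envelope figure checks out in a couple of lines (assuming Jupiter's mean radius of ~69,911 km; the exact answer shifts slightly if you use the equatorial radius instead):

```python
# Surface gravity of a body with 120 Earth masses and Jupiter's radius.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24     # Earth mass, kg
R_JUPITER = 6.9911e7   # Jupiter's mean radius, m

g = G * (120 * M_EARTH) / R_JUPITER**2
print(round(g, 1))     # prints 9.8 -- close to Earth's 9.81 m/s^2
```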


nitpick: 9.7 m/s^2 ;)


minus one mark


> gravity at the surface

Unfortunately that "surface" is gaseous…


That's what I (as a layperson) would think. The size of Jupiter, with half to a third of the mass, would make it even more gas-gianty than Jupiter.

… or it has a massive shell that is hollow inside /s.

Do any of the other measurements suggest anything about the nature of the surface?


> a massive shell that is hollow inside /s

aren't we all? /?

> nature of the surface?

so Jupiter is 317.8 M⊕, this thing is around 80-150, but ... Saturn is right there at 80 ... so unlikely to have a solid surface, but likely has a rocky core, and wild winds at this temperature. (Saturn's average temp is -178C, -138C "surface", and this candidate seems to have -48C.)

https://arxiv.org/html/2508.03814v1/MR_relation.jpg

It seems that all of this is based on 2 data points, and they only provide some examples that are consistent with that, but the models are also very low-confidence (as we don't have a lot of data about cold and small orbiting things - as they are hard to detect).

see section 5.2 https://arxiv.org/html/2508.03814v1#S5

but also consistent with the data is that it has ring(s):

> Alternate explanations for the F1550C brightness include (1) a knot of exozodiacal emission; or (2) a smaller planet with a circumplanetary ring.


>hollow inside

>aren't we all? /?

Offtopic, but such an interesting civilization where the keepers of knowledges seem to relate to this statement so much, innit?

Very Zen or is it just the overwork? Maybe it's a thing installed in our childhoods so that we would not struggle for power. (I certainly remember acquiring this manner of speaking based on fundamental self-deprecation around 5th grade, some other kids not acquiring it, and then 10y later we'd have mutually incomprehensible life scenarios.)

While kinds of dark humor other than "the falsity and futility of my own existence, amirite?" don't quite resonate with people as much, for whatever reason.

Pz https://www.youtube.com/watch?v=n-TkVH2j7gU


I propose main character syndrome as explanation. Reading too many blogs, thinking we are one of the cognoscenti, projecting ourselves a bit too close to the big polymath plasma screen in the sky, and eventually just ending up as ash in the divertor at the bottom of the big social tokamak. We think we know better, because we likely do, but what good does that do us?

https://www.youtube.com/watch?v=GLkweSiuG2E


It makes us do things like take Zizek seriously. Been there.


Lisp In Small Pieces by Christian Queinnec

It takes a very thoughtful approach to introducing an increasingly complex Scheme implementation. It doesn't shy away from the complexity that many Lisp implementation tutorials try to sweep under the rug.


They're in the write-ins for their respective categories. Emacs with 0.1%. Clojure with an impressive 0%!


But they mention it! (:


"there are dozens of us! Dozens!"

