csande17's comments | Hacker News

These days, you'll have to keep the plastic in some kind of metallic sleeve. The RFID chip in US Green Cards, and I imagine passport cards as well, is designed to be readable from across a room.

But not from a phone, if you put it the wrong way around.

An iPhone can read the RFID chip, but you have to know that it is NOT where the icon is -- it's actually on the back side of the hard, thick page (or something, I forget).


> I imagine passport cards as well

As someone who has a passport card, I can confirm it definitely has an RFID chip in it. Ironically, it comes in a protective sleeve.


Did you ever try scanning it with one of those passport checker apps? I tried this morning after reading this thread and couldn’t get it to work.

Who do you think is designing the UX for all the new AI products and services?


The only explanation I can think of is that GP is, somewhat tautologically, defining contrast as "the value returned by WCAG 2's formula for computing contrast" (and, probably, assuming that WCAG 2's "science" has more basis in reality than it actually does).

I can't speak to Material You, but I've seen this sort of thinking at companies that are more concerned with legal compliance with the strict wording of WCAG 2 than with on-the-ground user experience. People can even learn to ignore their lying eyes and fairly accurately guess what the WCAG 2 "contrast" metric for a given pair of colors will be, independently of how easy or hard the colors are to distinguish from one another.
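
For reference, here's a minimal sketch of the WCAG 2 formula in question (TypeScript; the constants come from the spec's definition of relative luminance, but treat the snippet as illustrative):

    // WCAG 2 relative luminance: linearize each sRGB channel, then weight.
    function luminance(hex: string): number {
      const [r, g, b] = [0, 2, 4].map((i) => {
        const c = parseInt(hex.slice(i, i + 2), 16) / 255;
        return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
      });
      return 0.2126 * r + 0.7152 * g + 0.0722 * b;
    }

    // WCAG 2 "contrast ratio": (lighter + 0.05) / (darker + 0.05).
    // Note it's symmetric -- swapping text and background never changes
    // the result, which is one of the things APCA fixes.
    function wcag2Contrast(fg: string, bg: string): number {
      const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
      return (hi + 0.05) / (lo + 0.05);
    }

    console.log(wcag2Contrast("FFFFFF", "000000")); // 21, the maximum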

Hopefully WCAG 3 will incorporate better color guidance from places like APCA, and at the very least these companies will stop producing unreadable black-foreground buttons and badges all the time.


> black has more contrast by either measure

No it doesn't? The screenshot of the calculator in the blog post very clearly shows that white has a greater contrast according to APCA. (If the negative numbers are confusing, you can also put the colors into a BridgePCA calculator like https://www.color-contrast.dev/?txtColor=FFFFFF&bgColor=317C... to see WCAG-2-style "contrast ratio" metrics computed using APCA.)

The point of APCA is to make the contrast calculation more perceptually accurate, not just lower the threshold.
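
If you're curious what "perceptually accurate" means in practice, here's a rough sketch of the APCA math from memory of the APCA-W3 0.1.x reference (https://github.com/Myndex/apca-w3) -- the exact exponents and offsets change between releases, so treat this as illustrative rather than normative:

    // APCA-style luminance: plain 2.4-exponent transfer, no piecewise segment.
    function apcaY(hex: string): number {
      const [r, g, b] = [0, 2, 4].map(
        (i) => (parseInt(hex.slice(i, i + 2), 16) / 255) ** 2.4
      );
      let y = 0.2126729 * r + 0.7151522 * g + 0.0721750 * b;
      // Soft clamp near black to model screen flare / ambient light.
      if (y < 0.022) y += (0.022 - y) ** 1.414;
      return y;
    }

    // Lc ("lightness contrast"): the exponents differ by polarity, so
    // light-on-dark and dark-on-light score differently -- that's why
    // white text can legitimately beat black text on the same background.
    function apcaLc(txt: string, bg: string): number {
      const [yTxt, yBg] = [apcaY(txt), apcaY(bg)];
      const sapc =
        yBg > yTxt
          ? (yBg ** 0.56 - yTxt ** 0.57) * 1.14 // dark text on light bg
          : (yBg ** 0.65 - yTxt ** 0.62) * 1.14; // light text on dark bg
      if (Math.abs(sapc) < 0.1) return 0;
      return (sapc - Math.sign(sapc) * 0.027) * 100;
    }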


> The point of APCA is to make the contrast calculation more perceptually accurate, not just lower the threshold.

If you're unlucky enough to be familiar with the math, it is trivial to show that it lowers the threshold at every luminance, no matter which luminance calculation you use (the widely accepted Y or L*, or APCA's own).

Re: the idea that I think the point of APCA is to lower the contrast threshold - you're entirely correct that this is a side effect of the main goal, which is to model contrast more accurately.

There is no such thing as non-perceptual contrast; "perceptual contrast" is a tell that stuff is being regurgitated.

We're a bit far afield when we're doing APCA => BridgePCA => a claim that the actual contrast ratio is 2.4 => looking at the actual contrast ratio, and it's... 5.39.

2.4 is supposed to be unreadable, which clearly isn't the case here.

So what's going on?

The wrong text size is being used on the BridgePCA site: the contrast is being calculated as if the text were 12 pt, when it's 36 pt in the article.


I might be totally missing something here, but does Mycoria attempt to prevent network participants from learning the public-Internet IP address corresponding to a Mycoria router ID?

The "iana" field in the configuration kind of suggests that this is not a goal, and this system is basically Tailscale but with IPv6 and a global namespace. But if this is the case, I don't really understand the emphasis on "routing", since pretty much every Internet host can reach pretty much every other Internet host directly using NAT traversal techniques (like BitTorrent does).

If you are trying to hide public-Internet IP addresses (like Tor hidden services do), the routing scheme still doesn't make a ton of sense to me, because presumably you wouldn't want to leak data by picking routes with a deterministic or latency-dependent strategy.


Mycoria is built for resilience - that is, for not relying on the current Internet backbone working flawlessly.

Even if the current Internet works "normally", many users of similar networks have reported better connectivity with overlay network routing - e.g. because routing on the IANA Internet is heavily influenced by peering policies rather than by connection quality.

If you let Mycoria generate a config for you, it will include the device's current public interfaces, so this will only be true for servers. Mycoria does not rely on IANA addresses, but uses them to improve the network structure automatically: finding better routes between routers over the IANA Internet.


Why not both?

Chrome consistently pushes to make it easier for websites to track you -- by being the slowest browser to incorporate privacy protections like third-party cookie isolation, by eliminating extension APIs used by ad/tracker blockers, and by adding new features which expose more fingerprinting surface to websites. This disproportionately benefits Google because Google runs some of the largest web tracking networks (reCaptcha, Google Analytics, AdSense, etc). Even if Chrome was separate from Google, Google (along with other ad companies) could probably keep paying them to sabotage users' privacy.

Chrome also directly uploads a lot of data to Google. It's technically possible to use Chrome without syncing your browser history to your Google account, but a surprising number of people I know mysteriously managed to turn on sync without knowing it. Other Chrome data-collection initiatives, like Core Web Vitals, also provide a lot of value to Google's other businesses. Those are other products that Google could pay for directly.


Not necessarily -- a lot of external hobbyist work has gone into reverse-engineering Sonic Generations, which has an official PC port and is based on the same engine as Unleashed.

Funnily enough, one of the most famous Generations mods is a project that ports over a bunch of levels from Unleashed. IIRC they changed the graphics pipeline to look and work more like the Unleashed one, too.


It's pretty standard for middle schools to hold assemblies discussing sexual harassment and healthy relationships, but they don't always do a great job communicating those concepts.

Back when I was in middle school about a decade ago, the principal got up on stage with a police officer and explained that sexual harassment is when you talk to a girl and she feels uncomfortable. He then went on to assert that the school had zero tolerance for sexual harassment, describe various authorities to whom victims could report instances of sexual harassment, and implore students not to risk their future by engaging in sexual harassment.

If you weren't super confident in your ability to predict or control other people's feelings, your takeaway from that assembly was probably that talking to girls was a risky thing to do.


"Don't make people uncomfortable" and your takeaway is you shouldn't talk to them at all. I don't think the problem there lies with the sexual harassment narrative.


Many young people are vulnerable.

I was bullied in elementary school and graduated the same way Ender Wiggin did.

I was out two years and skipped three, started in the middle of freshman year.

I had no idea how I was going to find a mate. The world my parents grew up in, where my mom was introduced to my dad by his sister, was long gone. I knew I couldn't trust anything I saw on TV or in the movies. Adults, including my parents, were completely dismissive of my concerns. Might have made a difference if I had a sister, but she was born premature and I never saw her before she died.

I sat next to a beautiful girl in English who left me feeling entirely outclassed. [1] I came home crying from school about this every day for most of a year, until I met the new physics teacher who let me hang out in the lab during study breaks, which gave me some meaning in my life and led me to get a PhD in the field. I was still afraid I'd wind up alone forever and went to a "tech" school which had an unfavorable gender ratio; I did find a girlfriend in my senior year, then was lonely and miserable in grad school. Eventually I found someone who was a friend of a friend, after a love triangle which the other guy lost out in, and I've been with her ever since. My partner is a 100% reliable person from the same culturally Catholic background as myself (my parents did not involve me with the Church; she did all the things and has a positive orientation towards religion, but doesn't take communion because she doesn't believe it literally).

Boys today don't have it any easier. My immediate reaction is to be sympathetic towards "incels", but as an organized group they teach boys self-loathing, which underlies the 50% attraction / 50% hate that they express towards women (hmmm... something a lot of more or less healthy people feel towards their parents, because of the conflicts that come out of being dependent on someone).

[1] She was traumatized by her parents going through a nasty divorce. She teaches the Quechua language in Hawaii now. There's a photo of her next to a huge dog, no sign of any human relationships. I probably did better at love than she did in the end.


Of course, a lot of activities carry risk; that doesn't mean teenagers will completely abstain from them.

The missing piece here might be that, as a teenager, it's pretty easy to convince yourself that the main way girls will reject you is by expressing that they're uncomfortable. (I believe this is called "getting the ick" in modern slang; the old movies your parents like call it "get lost, creep".) So if you're afraid of rejection, it's plausible that you'd be afraid of the legal consequences of making someone uncomfortable more than of the personal embarrassment or emotional pain of the actual rejection.


> Indeed, I don’t see the typescript community really complaining that there isn’t another implementation of typescript.

There are actually numerous third-party implementations that can take TypeScript code and compile it to JavaScript. Babel can do it in pure JS/TS, SWC can do it in Rust, and ESBuild can do it in Golang.

The catch is that AFAIK none of these implementations enforce the actual type checks. This is usually called "type stripping" within the JavaScript community, but it's basically equivalent to the "Rust without the borrow checker" implementation that gccrs is working on.
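
To make "type stripping" concrete, here's a small sketch using esbuild's transform API (the API call is real; the snippet itself is just illustrative):

    import { transformSync } from "esbuild";

    // This input has a type error (a string assigned to a number), which
    // tsc would reject with TS2322 -- but esbuild never looks at the types:
    const input = `const n: number = "not a number";`;

    const { code } = transformSync(input, { loader: "ts" });
    console.log(code); // => const n = "not a number";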

Well, except that the Rust community is extremely ideologically opposed to "Rust compiler without the borrow checker" existing as a viable product, so you end up with the situation described in the article where that implementation is just used to compile a hybrid compiler with all the hard-to-implement type-checker stuff copied verbatim from rustc. That erases most of the spec-robustness benefits of doing a full third-party implementation, but there are still some practical benefits, like being able to bootstrap the compiler more easily.

And who knows -- if you hack the build to re-disable the borrow checker, the resulting compiler might run faster. (That's the main benefit of the third-party TypeScript implementations in practice.)


TypeScript was written very intentionally to be really easy to transpile to JS, which is why you see so many solutions. The type checking piece remains unreimplemented despite high-profile third-party attempts to rewrite it in a faster language, and the type checking piece is exactly the thing that makes TypeScript TypeScript. The reason the external-frontend attempts failed is that they can't keep up with mainline development, which is also true for CPython and its offshoots. In fact, TS is so easy to strip that the syntax is being standardized within official ECMAScript, so that type annotations could be layered into JS code and engines would natively strip them without needing transpilation in the first place.

As for "rust without the borrow checker", it's very much not the same thing as TypeScript transpilation, for many reasons - among them that many other language rules would still have to be enforced. It's akin to making a TS type checker that didn't validate one of the language rules but enforced the others.

I’m not opposed to a compilation mode that didn’t enforce the borrow checker but repeated tests of compiling various software packages reveal that the borrow checker is within the noise performance wise. So you’d need a more compelling reason than “maybe it runs faster”.


> repeated tests of compiling various software packages reveal that the borrow checker is within the noise performance wise.

Out of curiosity, do you have a source for this? The stuff I remember reading on the topic is from back in 2018, when they managed to get the NLL borrow checker to be not much slower than the old borrow checker (https://blog.mozilla.org/nnethercote/2018/11/06/how-to-speed...) -- but that took a concerted effort, and the borrow checker was overall a large enough contributor to compilation time that it was worth spending a bunch of effort optimizing it.


WebPKI still has the "devices stop working if certs are rotated" problem, just on a longer timescale. If the user takes an old IoT device out of the box, and the auto-update server is using a newer root cert that wasn't trusted when the device first shipped, they'll have a bad time.

Any solution to this requires something that basically looks like certificate pinning. You've gotta rely on your server provider to use a dwindling set of "trusted by the original device firmware and also still trusted now" root certs forever (which gets fun when all of those certs are mandated to expire), or you've gotta create some other mechanism to establish trust on software updates (delivering them via unencrypted HTTP and then verifying an OpenPGP signature is a common choice).
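
(A rough sketch of that last pattern, using the openpgp npm package -- the URLs are placeholders and BAKED_IN_PUBKEY stands for whatever key the vendor ships; the point is that trust lives in a key baked into the firmware, not in the TLS layer:)

    import * as openpgp from "openpgp";

    // Hypothetical: the armored public key shipped in the device firmware.
    declare const BAKED_IN_PUBKEY: string;

    // The verification key is fixed at manufacture time, so updates keep
    // verifying no matter what happens to the WebPKI afterwards.
    const firmwareKey = await openpgp.readKey({ armoredKey: BAKED_IN_PUBKEY });

    // Fetch the update and its detached signature over plain HTTP.
    const update = new Uint8Array(
      await (await fetch("http://updates.example.com/fw.bin")).arrayBuffer()
    );
    const sig = await (await fetch("http://updates.example.com/fw.bin.asc")).text();

    const result = await openpgp.verify({
      message: await openpgp.createMessage({ binary: update }),
      signature: await openpgp.readSignature({ armoredSignature: sig }),
      verificationKeys: firmwareKey,
    });
    await result.signatures[0].verified; // throws if the signature is bad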

