
I have never seen an IR-based one in any store myself. Bluetooth, and possibly some proprietary RF setup, seems popular.

This is, in my opinion, attempting to say the right thing with entirely the wrong perspective:

The people you say are getting "shafted" always got shafted. Their works are the inspiration for all artists and people who lay their eyes on them - maybe they got paid when they made the work, maybe they managed to sell it, but probably not. And still, other artists (and machines) will remember and be inspired by it, sometimes to the point of verbatim copy (which is extremely common for human artists as well, with verbatim copy and replication being a sought-after skill).

(Those about to shout "LICENSING", that's a very new invention and we're terrible at it. What are you going to do, cut out the part of your brain that formed new connections while touching GPL code?)

The person (singular) that is actually getting "shafted" at each use is the artist you didn't hire to do the job of making your new work, because it is their skill that got replaced. A skill built from a lifetime of studying other art and practicing themselves, replaced with a skill built from a machine studying other art and, by virtue of some closed loops, likely also "practicing" itself.

Still, there is shafting at large, but the obsession with training data is misplaced in that it entirely ignores how society and art worked beforehand.

At the same time, for most of the things you're likely using the tool for, there would probably never have been an artist in the first place. For example, if you're just making your PowerPoint prettier, or if your commission is ridiculous, as it often is, while only offering a single-digit dollar sum per work, which no artist should take (RIP the poor souls who take such work anyway).


You're ignoring the biggest problem here: the concentration and extraction of wealth. The sum total of human artists was previously getting those billions of dollars, and now it's OpenAI (and Anthropic, and Google, and Microsoft, and maybe a handful of other players) getting it. Now, maybe it actually used to be hundreds of millions of dollars, and they've grown it to billions, and maybe they deserve some of that - but they're getting all of it. This is the huge issue with this technology: not so much the fact that it exists, but that it is being sold by a tiny, tiny number of players.

I wonder what happened to actual artists though - they seem to be doing fine. I'm sure many people as consumers dabbled in AI art, and after hours of trying reached the conclusion that what they made never looked quite right.

Then they found they could commission an actual artist to draw what they wanted for tens or hundreds of dollars, which is a very good price for getting exactly what you want without having to waste your time playing the token slot machine.


How'd you conclude that artists are doing fine? That doesn't match my experience or observations at all.

I know some pro artists (ppl doing work for big name companies, games, US film studios), either on a contract or employed basis.

They've always told me the same thing - the job is to hit the minimum acceptable level of quality (which to my untrained eye often looks high, but they reassure me, their work is in fact sloppy garbage), using whatever means necessary, even if that means AI.

They don't even hate AI, mostly, the way art Twitter does; they hate it because it gives unrealistic expectations about what costs how much, and it's often not really possible to get useful results - at least that was the case a couple years ago, things might have evolved.

If AI were good enough, they would certainly use it.

As for Twitter people doing commissions, I don't have firsthand experience, but imo their biggest issue is that there are tons of artists from places like LatAm or the Philippines who do high quality work and charge very little, and the people who commission don't care - this was the case well before AI.


That's also the wrong framing.

AI Labs are getting a tiny cut of the hundreds saved by not hiring an artist.

So regular people save hundreds, the labs get a few dollars, and the artists get nothing.

The artists are still losing, but it's regular people, especially the least able, who are winning.

The coffee shop isn't cutting OAI a $300 check for doing their spring menu. They are pocketing $295 and paying OAI $5.


No. The coffee shop that isn't paying an artist $300 is gonna get negative reviews and lose customers and money from their bad business decision[1]. I know I would think twice about ordering at a café which uses AI in their marketing, and I am not the only one.

The coffee shop that cannot afford the $300 for an artist and homebrews its design in Microsoft Word is still doing just as before; the coffee shop that can afford it and still pays an artist is still doing fine. The coffee shop that is paying OpenAI $5 for stolen art gets to look as cheap as it is.

1: https://www.sfgate.com/food/article/santa-cruz-restaurant-ai...


So to save the idea of $300 (logo design with "local" talent is never $300, it is only that cheap if you offshore it), they tried to ruin a business that presumably employs multiple LOCAL people full time (way more than $300) with 1-star reviews to "punish it".

This is an internet mob at its worst. Not an example of anything to emulate, in my opinion.


People hate AI, and this is one of very few ways people have to punish AI. It is bound to happen.

And in either case, this example destroys the framing that coffee shop owners are the ones who benefit from the systemic art theft employed by AI companies.


Sure, just like every software company using AI is going to go under and every video game using AI will fail?

I am not sure what you mean. The AI backlash is real, and it has real and obvious effects in the real world, with written articles to prove it.

If you are attempting here to shift the focus away from coffee shops (may I remind you, you were the one who brought that as an example) and into video games or software companies, I simply reject that attempt.

That there exists a software company which uses AI in their product and is not failing has no bearing on the framing of how a coffee shop which is too cheap to pay an artist for their logo does indeed look cheap to its customers, who will be inclined to give that café a negative review or otherwise avoid it.


I'm shifting the focus to the reality that exists outside of internet mobs.

99% of people don't recognize AI generated content, and don't particularly care enough to pixel scan every image they see.

You can death-grip articles of AI art backlash, but they are all hyper-narrow one-off events. The reality is the general population doesn't really see it or care.[1]

1: https://www.forbes.com/sites/conormurray/2026/04/17/the-no-1...


That's an entirely different problem to artists getting "shafted". Not saying it's not a worthwhile discussion, but it is a separate concern.

Having everyone pay phone/internet, office, streaming, music, etc., subscriptions to large tech companies that are effectively monopolies does the same thing. It's a bigger, pre-existing issue.


Yes, look at how many historical inventors (like the inventor of the blue LED, or the guys struggling to convince Gates and Ballmer to make the Xbox) get/got nothing for their efforts compared to the huge sums raked in by the very people actively trying to prevent them from building the idea that made all the money.

AI is hugely beneficial to our species. Our tribalism and "yeah well they earned it!" response to capitalism's rampant production of billionaires is the real problem, not technology.

Why are footballers and movie celebrities paid $50m a year? There's the answer.


1) Is there a moat? Is there no moat? Are open models as good as the closed ones? I keep getting confused.

2) As one of these artists, I am entirely fine with my entire body of work being used for the purposes of model building. The tech is astonishing and fantastic, and I sincerely hope we will be better through it. As the parent suggested: The idea that people in general previously gave a fuck about compensating artists is hilarious. MS builds models with my work, random people bought, idk, another vacation in Thailand or a fourth pair of shoes with the money that they never spent on art. I know which one I would prefer.

But I do find it particularly juicy that people, who, on the whole, never thought too much about paying artists (which I am also fine with btw!), all of a sudden can't stop wringing their hands about the injustice of it all.


What art have you produced? I did a little googling, and I can't find anything of note in public.

The same issue applies to fastfood, coffee chains and taxi services. Capitalism.

Correct. The way it's being built is exactly all that the US mentality warns about socialism/communism (that giving away your hard work "for the greater good" is a lie and is actually a power grab).

Turns out, if it's American oligarchs profiting from everyone's work, they love the idea!


Children can draw without ever having been to an art gallery. The IP laundromats need the entire stolen corpus of human labor. The latter is clearly an infringing derivative work.

It will be true no matter how many bribes those who have never created anything pay to Marsha Blackburn (who miraculously reversed her AI skepticism).

I wonder how many threats of being primaried have been issued by the uncreative technocrat thieves.


No they can’t just draw by themselves. It’s extremely bad and random.

Their teachers teach them from a very early age how to hold a crayon, and how to draw.

Maybe some miraculous humans growing up by themselves in the jungle will reinvent drawing on their own; most people will not.

Source: I have kids.


> The person (singular) that is actually getting "shafted" at each use is the artist you didn't hire to do the job of making your new work, because it is their skill that got replaced.

1% Yes, and 99% No.

Over 99% of uses would not have resulted in hiring someone to do the work had these models not existed as you yourself acknowledge.


Yes, but this is a bit of an oversimplification. The "99%" tends to be either: 1. Pointless throwaway content which we can just ignore as a new source of noise, 2. Something that could have ended up being a $5 commission[^1] to a kid somewhere out there but now never will be.

Those numbers are also a bit too aggressive - it's easy to miss what kind of gig work exists out there. PowerPoint as a service is a thing on Fiverr, for example. A horrible, horrible thing, but a thing nonetheless.

^1: not at all what art costs, but someone trying to get started might do quick sketches at those prices


> The "99%" tends to be either: 1. Pointless throwaway content which we can just ignore as a new source of noise, 2. Something that could have ended up being a $5 commission[^1] to a kid somewhere out there but now never will be.

Or 3. Something I made and I actually use, but I would never have paid a kid $5 to do.

Yes, I know of Fiverr and similar sites. Even planned on using it once. Even know someone in another country who made side money from it. And yes, it does suck for them. But none of that changes the fact that well over 99% of uses are not depriving them of any money.


I disagree, because when someone can just get those simple works made on a subscription they already pay for, the $5 commission goes from something someone might end up doing if the idea is stuck in their head long enough (or they find the idea amusing enough) to something that can never become a commission.

Not pointing fingers or saying that you must pay kids to draw things for you, but it most definitely does take work away by replacing an entire class of commissions. Not sure what to do with that fact.

(I'd put things that would never, ever be worth a $5 commission into the throwaway noise category, even if you do use the outcome.)


I have seen arguments that a lot of your nr. 3 is basically just addiction. You are making the AI slot machine generate stuff for you and you get to have the sense of accomplishment that comes with thinking you created something without putting in any of the work of actually creating something. To the rest of the world this is indistinguishable from your parent’s nr. 1.

Fair point. It's just that his number 1 was "Pointless throwaway content", and I was saying "Well, actually, it's not thrown away but actually used".

You may look at the output and say "Crap!", but the reality is the person using it found value in it.

(To be honest, I used to think "Crap!" to stock photos long before LLMs came on to the scene, so I have little sympathy with stock photo photographers going out of business - those photos exist primarily to attract readers and do not provide any value to the content - they're just like ads in that regard).


Interceptors are just wrappers in disguise.

    const myfetch = async (req, options = {}) => {
        // `token` is captured from the enclosing scope
        options.headers = options.headers || {};
        options.headers['Authorization'] = token;

        const res = await fetch(new Request(req, options));
        if (res.status === 401) {
            // do your thing
            throw new Error("oh no");
        }
        return res;
    };
Convenience is a thing, but it doesn't require a massive library.
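
If you do want interceptor-style layering, the same idea composes as plain higher-order functions. A hypothetical sketch - the names (withAuth, withErrorCheck, fake) are illustrative, not from any library:

```javascript
// Interceptor-style layering as plain function wrappers around any
// fetch-compatible function.
const withAuth = (fetchFn, token) => (req, options = {}) => {
  // Merge the auth header in without mutating the caller's options.
  const headers = { ...(options.headers || {}), Authorization: token };
  return fetchFn(req, { ...options, headers });
};

const withErrorCheck = (fetchFn) => async (req, options) => {
  const res = await fetchFn(req, options);
  if (res.status === 401) throw new Error("oh no");
  return res;
};

// Compose them like a chain of interceptors:
// const myfetch = withErrorCheck(withAuth(fetch, token));
```

Each layer is just a function that takes a fetch-like function and returns another one, so ordering and reuse work exactly like a library's interceptor chain.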


That fetch requires so many users to rewrite the same code - code that was already handled well by every existing node HTTP client - says something about the standards process.


It could also be trivially written for XMLHttpRequest or any node client if needed. Would be nice if they had always been the same, but oh well - having a server and client version isn't that bad.

Because it is so few lines it is much more sensible to have everyone duplicate that little snippet manually than import a library and write interceptors for that...

(Not only because the integration with the library would likely be more lines of code, but also because a library is a significantly liability on several levels that must be justified by significant, not minor, recurring savings.)


> Because it is so few lines it is much more sensible to have everyone duplicate that little snippet manually

Mine's about 100 LOC. There's a lot you can get wrong. Having a way to use a known working version and update that rather than adding a hundred potentially unnecessary lines of code is a good thing. https://github.com/mikemaccana/fetch-unfucked/blob/master/sr...

> import a library and write interceptors for that...

What are you suggesting people would have to intercept? Just import a library you trust and use it.


Your wrapper does do a bunch of extra things that aren't necessary, but pulling in a library here is a far greater maintenance and security liability than writing those 100 lines of trivial code for the umpteenth time.

So yes you should just write and keep those lines. The fact that you haven't touched that file in 3 years is a great anecdotal indicator of how little maintenance such a wrapper requires, and so the primary reason for using a library is non-existent. Not like the fetch API changes in any notable way, nor does the needs of the app making API calls, and as long as the wrapper is slim it won't get in the way of an app changing its demands of fetch.

Now, if we were dealing with constantly changing lines, several hundred or even thousand lines, etc., then it would be a different story.


But you said so yourself they are necessary… otherwise you would just use fetch. This reasoning is going around in circles.


Why the 'but'? Where is the circular reasoning? What are you suggesting we have to intercept?

- Don't waste time rewriting and maintaining code unnecessarily. Install a package and use it.

- Have a minimum release age.

I do not know what the issue is.


but it does for massive DDoS :p


Could also be looping videos - some browsers had bugs whereby looping videos would continuously redownload.

I recall some years back having corporate IT ask me why I was downloading terabytes off this weird website called "imgur" that they didn't know about. Realized I had a tab open with a stupid jackie chan mp4 a few seconds long on some background workspace, and that had just kept downloading over and over and over and over...


First thing would be that a small geofence (i.e., a narrow slice of the available data) is entirely orthogonal to having high precision, high quality location data available.

I won't claim with certainty that this is the case, but it seems likely that Factual was overselling their capabilities. That, or they relied specifically on having users grant high precision location data access and had nothing otherwise.

Apps that already need location data are probably the most likely sources of collecting such data - food apps, dating apps, chat apps you have sent your location in, ...


"Apps that already need location data are probably the most likely sources of collecting such data"

Yes, and many companies have access to both feeds.....


And yet, who would you trust more - a CEO that raised 100M on their "vision" or someone who got slapped in the face?


We also shouldn't call it "vegan leather" when it is in fact just plastic.

Naming departs from technical accuracy when adopted by the masses, as they retrofit their common understanding. Wouldn't be too surprised if "vaccine" ends up covering other strong defense-boosters.


> "vegan leather" when it is in fact just plastic.

https://knowingfabric.com/mushroom-leather-mycelium-sustaina...

is pretty neat


Mycelium is neat, but last time I heard of it the problem was far, far too low manufacturing throughput.

I don't think anyone would even consider marketing that as "vegan leather", as doing so would mean putting you in the same bucket as cheap-as-dirt polyurethane (which is what regular "vegan leather" is), at an astronomically higher price. You'd pick a new term to differentiate.

I vote for "shroomskin".


excellent name!


Interesting topic, offensive website. Back to the story …


I found it funny because, in the opposite direction, people accused Tesla of misleadingly naming "autopilot", because it gave them the impression of fully unattended self-driving.

In aviation, autopilot features were until recently (and still for GA pilots) essentially just cruise control: maintain this speed and heading, maintain this climb rate and heading, maintain this bank angle, etc.


Because Tesla was claiming in 2016 that "next year" it would be able to drive across the United States without any inputs.


Well, okay, but that’s like 95% of flying.


It’s the other 5% that takes 90% of effort :)


Though it's flown by the 0.1% who are highly qualified and extensively trained, so the chance of misunderstanding by a pilot is like 0.00001% or less.



Yes, but in this case the name is likely to actually reduce adoption, not increase it.


Wouldn't be too surprised, either - but I still think there's merit in using words in a more precise manner than the marketing department would like to do.


Mushroom leather says hello


A good example for the discussion: leather is animal skin, which obviously cannot come from a mushroom.

Assuming you were countering my vegan leather claim: products marketed as "vegan leather" are polyurethane or similar, and for marketing reasons you would use a different term if you did something fancier, to differentiate. My gut feeling is that a mycelium-based product would be far more expensive than simple polyurethane, and quite an upsell.


I mean the word “vaccine” literally specifically references cow pox, so it’s already broadened. No reason not to go up another level.


Their subscriptions aren't cheap, and it has nothing really to do with them controlling the system.

It's just price differentiation - they know consumers are price sensitive, and that companies wanting to use their APIs to slap AI on their portfolio (and get access to AI-related investor money) can be milked. On the consumer-facing front, they live off branding: if you're not using Claude Code, you might not associate the tool with Anthropic, which means losing publicity that drives API sales.


Well, not really. It means you have a renderer that is closer to being portable to web, not an editor that will run in web "with some additional work". The renderer was already modular before this PR.


With the disclaimer that I am comparing to the memory of some entry-level cameras, I would still say that it's way too noisy.

Even on old, entry-level APS-C cameras, ISO1600 is normally very usable. What is rendered here at ISO1600 feels more like the "get the picture at any cost" levels of ISO, which on those limited cameras would be something like ISO6400+.

Heck, the original pictures (there is one for each aperture setting) are taken at ISO640 (Canon EOS 5D MarkII at 67mm)!

(Granted, many are too allergic to noise and end up missing a picture instead of just taking the noisy one which is a shame, but that's another story entirely.)


Noise depends a lot on the actual amount of light hitting the sensor per unit of time, which is not really a part of the simulation here. ISO 1600 has been quite usable in daylight for a very long time; at night it's a somewhat different story.

The amount and appearance of noise also heavily depends on whether you're looking at a RAW image before noise processing or a cooked JPEG. Noise reduction is really good these days but you might be surprised by what files from even a modern camera look like before any processing.

That said, I do think the simulation here exaggerates the effect of noise for clarity. (It also appears to be about six years old.)


The kind of noise also makes a huge difference. Chroma noise looks like ugly splotches of colour, whereas luma noise can add positively to the character of the image. Fortunately humans are less sensitive to chroma resolution so denoising can be done more aggressively in the ab channels of Lab space.

Yes, this simulation exaggerates a lot. Either that, or contains a tiny crop of a larger image.
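
As a toy illustration of that asymmetry, here's a hypothetical sketch that blurs only the chroma channels of a row of RGB pixels, leaving luma untouched (using YCbCr rather than Lab for simplicity; real denoisers use far smarter filters than a box blur):

```javascript
// BT.601-style RGB <-> YCbCr conversion (values in 0..255, chroma centered
// on 128).
const rgbToYcbcr = ([r, g, b]) => [
  0.299 * r + 0.587 * g + 0.114 * b,
  128 - 0.168736 * r - 0.331264 * g + 0.5 * b,
  128 + 0.5 * r - 0.418688 * g - 0.081312 * b,
];

const ycbcrToRgb = ([y, cb, cr]) => [
  y + 1.402 * (cr - 128),
  y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128),
  y + 1.772 * (cb - 128),
];

// 3-tap box blur along a 1D row of values (edges clamped).
const blurRow = (vals) =>
  vals.map((v, i) => {
    const prev = vals[Math.max(i - 1, 0)];
    const next = vals[Math.min(i + 1, vals.length - 1)];
    return (prev + v + next) / 3;
  });

// Denoise a 1D row of RGB pixels: blur only Cb/Cr, keep Y exactly.
const denoiseChroma = (pixels) => {
  const ycc = pixels.map(rgbToYcbcr);
  const cb = blurRow(ycc.map((p) => p[1]));
  const cr = blurRow(ycc.map((p) => p[2]));
  return ycc.map((p, i) => ycbcrToRgb([p[0], cb[i], cr[i]]));
};
```

Because luma passes through unchanged, the splotchy colour noise gets smeared away while the grain that carries perceived detail survives - which is why pipelines can afford to be so much more aggressive on the chroma channels.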


Yeah, I don't think that it's easy to reproduce noise (if it was, noise reduction would be even better). Also, bokeh/depth of field. That's not so easy to reproduce (although AI may change that).

