I would love something like this paired with augmented reality to show/hide/highlight only the products that meet my search criteria. Try going to a big supermarket to look for whole-milk yogurt, for example. It's a pain trying to filter them out of the rest of the 100 varieties.
Another thing I would like to do is compare products by price/quantity. Maybe I'm trying to get the most protein/$, or oz/$, etc. You could also hook it into some database to only show you ethically caught fish, organic products, fair trade products, local products, etc.
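Something like this, say (a minimal sketch with made-up product data; in practice the fields would come from labels or a product DB):

```python
# Hypothetical product records: (name, price_usd, protein_grams, ounces).
products = [
    ("Greek yogurt A", 5.99, 68.0, 32.0),
    ("Greek yogurt B", 4.49, 40.0, 24.0),
    ("Whole-milk yogurt C", 3.99, 30.0, 32.0),
]

# Rank by protein per dollar; swap the key for oz/$ or any other ratio.
for name, price, protein, oz in sorted(
    products, key=lambda p: p[2] / p[1], reverse=True
):
    print(f"{name}: {protein / price:.1f} g protein/$, {oz / price:.1f} oz/$")
```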
I've thought about this kind of technology a lot. Cool to see that it's actually becoming possible.
> It's a pain trying to filter them out of the rest of the 100 varieties.
Arguably, shops specifically design themselves to do things like this. When I was growing up, it used to drive my dad mad that the local supermarket would move almost literally everything around, seemingly at random, "to prevent shoplifting". He was a pilot and usually had little time -- he liked to go into the supermarket, grab x, pay, and leave. Suddenly that stopped working, as finding x became increasingly difficult. It really annoyed him. For a long time the floor manager said that they moved their stock around to prevent shoplifting -- until one day the store manager, when asked the same question, said it was "to promote footfall". Suddenly the real reason became clear.
> Another thing I would like to do is compare products by price/quantity. Maybe I'm trying to get the most protein/$, or oz/$, etc. You could also hook it into some database to only show you ethically caught fish, organic products, fair trade products, local products, etc.
In the UK at least, all of this information is required to be printed on the label: product per unit money on the shop label and misery per unit product on the product's label.
Are you asking for a filter / search engine on top of that? Either way, I don't think your average supermarket would let that stay unadvertised for too long. Products at eye level pay a premium to be there.
I don't know if it's legally required, but that's also the standard here in the US. One problem though is the units may not always be equal. I was buying canned coconut milk this weekend and among all the different choices were $/pound, $/gram, and $/liter. It's easier to do the math from scratch yourself at that point.
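The arithmetic itself is easy to automate, though; a quick sketch with made-up shelf prices, assuming coconut milk is about 1 kg/L to bridge the volume/mass gap:

```python
GRAMS_PER_POUND = 453.592

def price_per_kg(price: float, unit: str, kg_per_liter: float = 1.0) -> float:
    """Convert a shelf unit price ($/pound, $/gram, $/liter) to $/kg."""
    if unit == "pound":
        return price * 1000 / GRAMS_PER_POUND
    if unit == "gram":
        return price * 1000
    if unit == "liter":
        return price / kg_per_liter  # needs a density assumption
    raise ValueError(f"unknown unit: {unit}")

# Made-up labels for three cans of coconut milk.
labels = [("Brand A", 3.49, "pound"), ("Brand B", 0.008, "gram"), ("Brand C", 7.25, "liter")]
for name, price, unit in labels:
    print(f"{name}: ${price_per_kg(price, unit):.2f}/kg")
```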
> In the UK at least, all of this information is required to be printed on the label: product per unit money on the shop label and misery per unit product on the product's label.
Can't find anything about misery per unit, can you link to more information please? Interested to know how this is calculated.
That's a bit strange. At least in the US manufacturers will bid for shelf space, to get closer to eye level, the end of the aisle, or what have you. Store brands get a privileged position as well. So at least here the placements do change, but usually because a different person has won/lost the bid war.
This could have accessibility applications too. I have a hard time reading labels for some products or finding certain ingredients. Having a camera I could wave around when looking for something, with a vibration to follow when it detects a matching product, would be huge for me.
This is only the beginning. At the moment, mobile devices must run sub-SOTA models for visual recognition, such as MobileNetV3, to provide an acceptable user experience. As mobile devices continue to get more powerful and start incorporating dedicated AI hardware for multiply-accumulate ops, I expect to see devices that can run today's SOTA models locally within a couple of years. Exciting stuff!
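For a sense of what "mobile-class" means in practice, here's a minimal sketch running torchvision's MobileNetV3-Small (illustrative only; the file name is a placeholder, and this is the model family, not the app's actual pipeline):

```python
import torch
from torchvision.io import read_image
from torchvision.models import mobilenet_v3_small, MobileNet_V3_Small_Weights

weights = MobileNet_V3_Small_Weights.DEFAULT
model = mobilenet_v3_small(weights=weights).eval()
preprocess = weights.transforms()  # the resize/crop/normalize the model expects

img = read_image("shelf_photo.jpg")   # placeholder image path
batch = preprocess(img).unsqueeze(0)  # shape: (1, 3, 224, 224)

with torch.no_grad():
    logits = model(batch)
print(weights.meta["categories"][int(logits.argmax())])
```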
Interesting that one of the uses of NNs here is to shrink the amount of storage required per product in the product recognition DB -- compression is a neat application of NNs that generally falls out of recognition / synthesis tricks, and it will matter a lot on mobile.
I'm wondering if 'on device' is an exaggeration. The post is saying 64 bytes per product, and a DB of millions of products, but the video is showing product details, prices, and a box image. I wonder if once it identifies the product it then loads metadata from ze cloud?
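The 64 bytes is at least plausible for pure on-device matching -- it's what you'd get from, say, a 64-dim embedding at one signed byte per dimension. A rough sketch with random stand-in data (the post doesn't say how the codes are really built):

```python
import numpy as np

D = 64  # one int8 per dimension -> exactly 64 bytes per product

def quantize(v: np.ndarray) -> np.ndarray:
    """L2-normalize a float embedding and quantize it to an int8 code."""
    v = v / np.linalg.norm(v)
    return np.clip(np.round(v * 127), -127, 127).astype(np.int8)

# Stand-in DB of 100k product codes: ~6.4 MB at 64 bytes each, so even
# millions of products stay comfortably on-device.
rng = np.random.default_rng(0)
db = rng.standard_normal((100_000, D)).astype(np.float32)
db /= np.linalg.norm(db, axis=1, keepdims=True)
db = np.clip(np.round(db * 127), -127, 127).astype(np.int8)

query = quantize(rng.standard_normal(D))
scores = db.astype(np.int32) @ query.astype(np.int32)  # dot-product similarity
print("best match index:", int(scores.argmax()))
```

Names, prices, and box art still have to come from somewhere, though, which is exactly the cloud question.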
It's a spectrum, right? "On-device" is a big claim with different implications depending on whether you're thinking about this as an NN problem or a privacy problem. Loading an image from the cloud = sending telemetry on your supermarket shopping to a giant ads company.
This is interesting - I couldn’t determine how they train the system to recognise all the products though.
Flavour variants, special promotional packaging, etc. make training hard due to the sheer number of images that need to be in a training dataset to yield accurate results.
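One common mitigation -- an assumption on my part, not something the post confirms -- is heavy augmentation, so a few packshots per SKU stand in for many shelf photos:

```python
from torchvision import transforms

# Each training pass sees a differently distorted view of the same packshot,
# simulating shelf lighting, angle, and crop instead of collecting more photos.
augment = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.6, 1.0)),
    transforms.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.4),
    transforms.RandomRotation(degrees=15),
    transforms.RandomPerspective(distortion_scale=0.3),
    transforms.ToTensor(),
])
```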
I am interested in why they haven't released this feature in their main Google Lens app. My hunch is they released a separate app for experimental and beta testing. Once these features get robust and reliable, they will probably move them to the main Google Lens app.
This is one of those obvious killer apps that, with a moment's thought, was clearly computationally prohibitive or much too complex for a sole developer. If this is in the pipeline, then we can soon expect a bunch of semantic apps as well.
Didn’t Amazon do this with the Fire Phone and Firefly in 2014? What I need is to comparison-shop this item at 5 other grocery stores, plus FreshDirect and Amazon Fresh, and have it tell me if I’m getting a good deal.
An RFID scanner will do a lot of this, surely?
The issue is presumably access to the RFID number database, but for me the big open question is whether, as we progress to an RFID-chip-in-everything world, that access becomes a public good - a complete supply-chain record will be the next "food labelling" fight.
People who can't recognize products on a shelf often can't use a touchscreen either. I live with a Boomer and he can't even use a Chromecast because he can't see his screen, let alone use screen readers or voice commands.
You know... Every time some new product comes out, I can't help but think, "How are they going to use this to track my habits?" And how can you blame anybody for that? Think about everything you own that is already tracking you: your TV, your phone, your smart speaker... all the apps you use... Can you really blame somebody for being pessimistic?
You know, this is a useful tool for visually impaired people, but I can't help but think that nobody should have to give up their privacy in order to have some level of equality. That is total horseshit IMO.