The examples you listed work because they increase trust to the point where people feel safe enough to stop second-guessing everything. I disagree that AI in its current form can be trusted. Food safety is enforced by law; the correctness of Google search results isn't enforced at all. In fact, Google is incentivized to decrease quality to reduce running costs.
So yes, convenience and progress are strongly correlated, but they're not the same.
Ironically, just yesterday I ran into a situation that changed my mind on some of this; it convinced me that the world isn't ready for "AI results" in search, at least in their current form.
Imagine: an impulsive person, suddenly facing the need to change the ownership structure of their mortgage, worried they'll have to pay a lot for it. A person who doesn't know the first thing about the subject. They type a query into Google; because of their lack of domain knowledge, the query is missing one word. Without that word, the query matches two related but distinct kinds of ownership structures. Results for both come back, and the AI summary happily blends them together. The person sees the explanation, panics, and talks my ear off about the bad situation some third party has put them in. I'm confused (I know a bit more about this, but not much); they show me their phone.
I'm staring dumbfounded at a seemingly nonsensical AI summary. But it's not the text that makes me pause in shock - it's the fact that the other person took it as gospel and yelled it at me.
Garbage in, garbage out, as they say. The way I see it now, the biggest problem isn't the garbage that sometimes comes out of LLMs. The problem is the "garbage in" - specifically, the garbage that passes for thinking and communication among much of the human population. An LLM may reason 100% correctly; that won't help when the user feeds it the wrong figures without realizing it, and then acts on the answer without thinking.
The world is not ready for systems that indulge people in their bullshit.