I've answered maybe 10 Be My Eyes calls over the last couple of years. I can see some value in AI describing labels or food items; however, most of the calls I've answered are more nuanced. Unusually, I've answered two calls this week. The first was looking at photos of a hotel to help decide whether it met the person's requirements. The second was helping someone perform a blood sugar test: I had to tell them when I thought the drop of blood on the tip of their finger was large enough to test, and then read the result off the tester. Neither of these is a candidate for AI, but let the users be the ultimate judges. I am continually impressed by the ingenuity and resourcefulness of the people I have interacted with.
This isn't trying to replace human volunteers, but to complement them.
I know blind people who are making just as many human Be My Eyes calls as they were before, but they're using Be My AI even more, for things they wouldn't have bothered to use the service for at all before, because the AI is so fast and convenient.
I am surprised by your assessment that these are not tasks for the AI. Well, the first one is troublesome, but judging the size of a drop of liquid according to a well-established procedure sounds quite on par with its capabilities.
Maybe, but it was on the tip of a finger in a moving image, changing size as the person squeezed their finger. There's also something I had never previously considered: blind people tend to have little or no artificial light on when alone. Luckily, the app allows the person providing assistance to turn on the flash on the other person's phone.