I was at an online fashion marketplace company that wanted to test 60-minute delivery within the same city.
We bought a few bikes and had employees ride out to customers, just to check whether there was actual demand. It turned out there wasn't, and the idea was dropped at low cost.
Yes, that is really a shame, although it's been fully open sourced.
Yandex is aware of how the geopolitical situation is hurting it, and is therefore building a new company, Double.Cloud, based in Europe, to work around the negative public opinion of Yandex and keep selling ClickHouse cloud services.
You should really consider getting a proper router, such as a UniFi.
It's a one-time cost, and it will save you from these issues no matter what ISP-supplied crap you end up getting.
Just plug whatever router the ISP gives you directly into your own, more capable router, and your home network will look identical no matter where you move or how many times you change ISPs.
That said, running Pi-hole on a Raspberry Pi is a treat!
The idea is not just to generate any random string that matches the grammar. The idea is that if your request is "What are the first 10 digits of pi?" and you restrict the response to the regex "[0-9]+\.[0-9]+", then you actually receive the correct answer "3.1415926535" and not just some random string such as "1.2346789" that also happens to match the pattern.
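To make that concrete, here's a rough, self-contained sketch of the mechanics of constrained decoding. The toy scoring function stands in for real model logits and the prefix check is hand-rolled for this one pattern; none of it is any particular library's API, just an illustration of masking out candidates that would break the constraint.

    import re

    PATTERN = re.compile(r"[0-9]+\.[0-9]+")
    VOCAB = list("0123456789.") + ["<eos>"]

    def could_still_match(prefix: str) -> bool:
        # Crude viability check for this one pattern: digits, then at most one
        # dot (never first), then more digits. Good enough for the demo.
        return bool(re.fullmatch(r"[0-9]+\.?[0-9]*", prefix))

    def toy_scores(prefix: str) -> dict:
        # Stand-in for real model logits: mildly prefers continuing "3.1415926535".
        target = "3.1415926535"
        scores = {c: 0.1 for c in VOCAB}
        if len(prefix) < len(target):
            scores[target[len(prefix)]] = 1.0
        else:
            scores["<eos>"] = 1.0
        return scores

    def constrained_decode(max_len: int = 16) -> str:
        out = ""
        for _ in range(max_len):
            scores = toy_scores(out)
            # Mask: keep only characters that leave the output a viable prefix,
            # and only allow <eos> once the full pattern is already satisfied.
            allowed = {
                c: s for c, s in scores.items()
                if (c == "<eos>" and PATTERN.fullmatch(out))
                or (c != "<eos>" and could_still_match(out + c))
            }
            best = max(allowed, key=allowed.get)
            if best == "<eos>":
                break
            out += best
        return out

    print(constrained_decode())  # -> 3.1415926535

Swap toy_scores for real model logits and this is essentially what grammar-constrained sampling does at every step.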
That will only work up to the point when the LLM can't generate a correct answer, whether conforming to a grammar or not. After that point, you'll just get grammatically correct bullshit.
Also, as noted in my reply to a sibling comment, grammars do not generate "any random string". That's the whole point of a grammar: the generation is not random. For example, it is perfectly feasible to write a grammar that completes a sentence with missing words, or continues some text, etc.
And to be clear, it is entirely feasible to write a grammar that takes a string as input and produces as output a transformation of that string satisfying some constraint. This kind of grammar is known as a transducer.
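As a toy illustration of that point (the states and the specific transformation are invented for the example, not taken from any standard library): a finite-state transducer consumes input symbols one at a time and emits output symbols, with its current state deciding what gets emitted.

    def digits_to_decimal(digits: str) -> str:
        # A tiny finite-state transducer: read digits, emit the same digits
        # with a '.' inserted after the first one. The state records whether
        # the dot has been emitted yet.
        state = "BEFORE_DOT"
        out = []
        for ch in digits:
            if not ch.isdigit():
                raise ValueError(f"unexpected input symbol: {ch!r}")
            out.append(ch)
            if state == "BEFORE_DOT":
                out.append(".")
                state = "AFTER_DOT"
        return "".join(out)

    print(digits_to_decimal("31415926535"))  # -> 3.1415926535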
None of this should come as a surprise. Statistical language models are simply an alternative to knowledge-engineered grammars, used to do the same things that one can do with a grammar (except for the determinism). In a broad sense, a statistical language model is a kind of grammar, or perhaps it makes more sense to say that a grammar is a deterministic language model.
Why does everyone seem to think that management does not have genuinely good intentions when measuring NPS as a KPI? The companies I've worked with have all genuinely wanted to create the best possible experience for the user, because that is what ultimately wins in the end. If someone scores 1-6, you ask them to provide additional feedback and learn from it; problem solved, and everyone has all the info they need to go about their work.
There seems to be this idea that KPIs are evil and interpreted in a vacuum. It's rarely like that.
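For anyone unfamiliar with how the number itself is derived, the usual convention is that 9-10 count as promoters, 7-8 as passives, and 0-6 as detractors (the 1-6 follow-up above is aimed at the detractor band), and the score is the percentage of promoters minus the percentage of detractors. A quick sketch:

    def net_promoter_score(scores: list[int]) -> float:
        # NPS = % promoters (9-10) minus % detractors (0-6) on a 0-10 survey.
        promoters = sum(1 for s in scores if s >= 9)
        detractors = sum(1 for s in scores if s <= 6)
        return 100.0 * (promoters - detractors) / len(scores)

    print(net_promoter_score([10, 9, 8, 7, 6, 3, 10, 9, 2, 10]))  # -> 20.0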
20% of companies are knocking it out of the park, and 50% are nowhere near the leaderboard. I expect those folks are correct in their observation, but work for the 50%. Perhaps you work for one that is actually earning promoter scores.
My advice to friends looking for a job: ask the company what its NPS is. The leaders in NPS are leaders in employee engagement. If they don't know, or it's bad and your role isn't to make it better, look elsewhere.