This reads like FUD; there are reasons for and against Telegram, and I won't get into them here. But truth is often stranger than fiction[0], and there are actors with something to gain from making Telegram look bad.
To your last bullet point, these seem like good reasons to me. This does not seem to have happened with the iOS version.
>However, the sheer volume of legacy issues, combined with submissions that completely ignore the provided issue template, has created a situation where there are currently more than 1,100 open tickets. Most of these issues have been inactive for several years and they reference versions of Signal that are no longer available. Many open issues are essentially ad-hoc discussion threads or feature requests that are a better fit for the community forum. Perhaps most importantly, many of these issues lack debug logs, descriptions, and the steps that would be necessary to understand and reproduce the reported behavior.
What do tests and CI have to do with the usability and usefulness of a program? I agree that the other issues could be worrying, but those first two throw me for a loop.
Tests and CI help you make sure that functionality actually works, that buttons you press have the desired effect, and that if I as a contributor ship you a pull request for a UX/feature improvement, both you and I can check that it doesn't break anything before merging (a toy sketch of what such a check might look like is at the end of this comment).
The grandparent poster also wrote about security/privacy, for which I think tests are pretty important.
Finally, having CI would allow the developers/contributors with limited time to spend more time on usability and usefulness instead of on issues like "master doesn't build, anybody else have this problem?" (when I wanted to PR something to Signal, master did not build).
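To make this concrete, here is a toy example - hypothetical, and not taken from Signal's codebase - of the kind of regression test a contributor could ship alongside a UX tweak. A CI job would simply run the test suite on every pull request, so neither side has to verify the behaviour by hand:

    # test_timestamps.py - runnable with pytest; everything here is made up for illustration
    from datetime import datetime

    def render_timestamp(ts, now):
        """Render a message timestamp the way a chat list might:
        time of day for today's messages, full date otherwise."""
        if ts.date() == now.date():
            return ts.strftime("%H:%M")
        return ts.strftime("%Y-%m-%d")

    def test_todays_message_shows_time():
        now = datetime(2018, 5, 1, 12, 0)
        assert render_timestamp(datetime(2018, 5, 1, 9, 30), now) == "09:30"

    def test_older_message_shows_date():
        now = datetime(2018, 5, 1, 12, 0)
        assert render_timestamp(datetime(2018, 4, 30, 9, 30), now) == "2018-04-30"

If that runs green in CI, "does this PR break the conversation list?" stops being something anyone has to answer manually.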
This is still looking at it from a developer's or contributor's point of view. Developer velocity has no practical impact on the user experience. In a way, the constant addition of features or tweaks to the user experience is a net negative, since the user has to re-learn how to do things in the app.
This also assumes that because there are few automated tests, no testing is done on a build at all. I don't think that is a valid assumption. Most security vulnerabilities are not found by unit or functional tests, but by linting and code reviews.
> Developer velocity has no practical impact on the user experience.
This may be true in a megacorp. I doubt it's true in a project like Signal where resources are limited and a few developers are the single bottleneck for everything, UX or otherwise.
All those issues were closed, as explained in your link, because they lacked debug logs, steps to reproduce the problem, etc. It's not feasible to dedicate enormous amounts of resources to trying to address them. You're welcome to re-open them, properly this time, if you want.
That is wrong; they closed issues indiscriminately.
Oh, and they will keep doing so: "Legacy issues that are more than one month old will be closed by an automated process as part of this cleanup effort."
That is wrong. If you read the thread, you would see it immediately: I give examples there of issues that were closed while users were in the middle of testing them, and that the developers had acknowledged as real issues only days before.
I'm running this on my own server using ejabberd [1] (Prosody, which I used previously, still has problems with some of the required XEPs [2]) and connecting with several different clients (Conversations [3] on Android, Pidgin on Linux, etc.).
The one weak link in OMEMO is the difficulty of verifying the authenticity of the device key signatures used by communication counterparts. These signatures are 64-character hexadecimal strings which the user is supposed to verify as authentic before initiating encrypted communications. Humans are not very good at this task; computers can help, but they can also be tampered with. There are ways around this problem, e.g. meeting up in person before initiating communications (viz. PGP key signing parties), exchanging QR codes off-line, or using algorithms to turn those numbers into things humans are better at recognising (e.g. the coloured emoji used by Telegram, etc.).

Encrypted group chat only works if all participants have all other participants in their address lists. Then again, it does work, and it is fully standardised with several compatible implementations.
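The mechanical part of that verification is easy to sketch; the hard part is exchanging the fingerprint over a channel you actually trust. A rough illustration (not any client's real code) of normalising, displaying and comparing fingerprints, so the human only has to handle the out-of-band exchange:

    import hmac

    def normalize(fp):
        """Strip whitespace/grouping and lowercase a 64-character hex fingerprint."""
        return "".join(fp.split()).lower()

    def pretty(fp):
        """Group the fingerprint into 8-character blocks for easier eyeballing."""
        n = normalize(fp)
        return " ".join(n[i:i + 8] for i in range(0, len(n), 8))

    def fingerprints_match(exchanged, advertised):
        """Constant-time comparison of the fingerprint exchanged out-of-band
        with the one the server/client advertises."""
        return hmac.compare_digest(normalize(exchanged).encode(), normalize(advertised).encode())

    a = "0f" * 32              # 64 hex characters
    b = "0f" * 31 + "0e"       # one hex digit differs
    print(pretty(a))
    print(fingerprints_match(a, b))   # False

Even with that, a human still has to compare (or scan) the blocks character by character, which is exactly the step people tend to skip.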
That would work for graphical clients but not so much for textual versions. Some type of ASCII art might serve the purpose, but whatever solution is chosen, it will be hard to visualise the entropy of a 64-digit hex string in such a way as to distinguish it from the same string with 1 bit flipped.
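One way around that (just a toy sketch in the spirit of OpenSSH's randomart, not something I know any XMPP client to ship) is to hash the fingerprint first and only then draw it, so that flipping even a single bit scrambles the whole picture rather than changing one cell:

    import hashlib

    PALETTE = " .o+=*BOX@%&#"   # arbitrary characters, picked only for visual contrast

    def ascii_art(fingerprint_hex, width=16, height=4):
        """Render a 64-character hex fingerprint as a small ASCII block.
        Hashing first means any change to the input changes the entire image."""
        digest = hashlib.shake_256(bytes.fromhex(fingerprint_hex)).digest(width * height)
        rows = []
        for r in range(height):
            rows.append("".join(PALETTE[digest[r * width + c] % len(PALETTE)]
                                for c in range(width)))
        return "\n".join(rows)

    # Two fingerprints differing in a single bit produce visibly unrelated pictures.
    print(ascii_art("00" * 32))
    print()
    print(ascii_art("01" + "00" * 31))

Whether people would actually notice a wrong picture is another question, but at least it degrades gracefully to a text terminal.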
(not just individual people obsessed with privacy, but also the large number of people who can see the advantage of privacy yet may not invest a lot of effort to achieve it)
Yes, it is obfuscated: pairing a random ID with a real person would be much more work than pairing a phone number with a real person, and you can throw IDs away and generate as many as you want, whenever you want - unlike your phone number.
Still beats using your phone number for chats, or your SSN in place of a bitcoin wallet.
Honestly, I'd say it's much worse than using your phone number or SSN. Sure, an SSN makes it easier to tie an identity to a person, but it doesn't instill a false sense of anonymity in the user.
Again, for a practical example: people bought and sold bitcoins under the broad assumption that it was anonymous. That's a widely understood falsehood, yet new people almost universally believed it was anonymous and safe from law enforcement/the tax man.
Pretending that a random yet static ID adds any level of anonymity is simply dangerous. At best, it keeps honest people honest - like locking your door but not the bay window next to it.
[0]: The story of Karl Koch, for example (https://en.wikipedia.org/wiki/Karl_Koch_(hacker)), whose death was ruled a suicide and whose history is... "colourful".