I was an amateur digital artist in the 90s, and I can tell you that although we did work with what we had, no-one liked CRT artifacts. We were always trying to get a sharper, clearer, better picture. The first LCD monitors were so stunningly sharp and clear that it was obvious we would never go back.
At least, that's what I thought. It's funny now to see people glorifying CRTs as some kind of "ideal" display device for low-res art. Past-me would have laughed.
I suppose that depends on whether you were fighting CRT artifacts to produce a realistic image or using them towards an artistic goal. Scroll through that Twitter account and you'll see many examples of artists who were clearly depending on CRT artifacts to create their end product.
I mean, there are usually two orthogonal aspects to "CRT emulation" in general, although for the Amiga only the first was really relevant, except at the low end of the market.
The first is the display characteristic of the CRT itself. The analog nature of video means there are no discrete pixels within a scanline, only smooth variations in voltage. Phosphors don't map one-to-one to pixels, and the electrons that represent a given pixel can strike more than one phosphor of the desired primary color. Then there's the persistence of the phosphors as they fade. All of that is relevant to any CRT simulation, regardless of the actual properties of the input signal (composite, s-video, RGB, component).
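To make that first aspect concrete, here's a rough numpy sketch of both effects (the beam spot smearing a scanline, and phosphor persistence across frames); the spot width and persistence values are made-up placeholders, not measurements of any real tube:

    import numpy as np

    def crt_scanline(pixels, samples_per_pixel=8, beam_sigma=1.2):
        # Treat the pixel values (0..1) as a continuous voltage: hold each
        # value, then smear it with a Gaussian "spot" so energy from one
        # pixel lands on neighbouring phosphor positions too.
        trace = np.repeat(np.asarray(pixels, dtype=float), samples_per_pixel)
        radius = int(3 * beam_sigma * samples_per_pixel)
        xs = np.arange(-radius, radius + 1)
        spot = np.exp(-0.5 * (xs / (beam_sigma * samples_per_pixel)) ** 2)
        spot /= spot.sum()
        return np.convolve(trace, spot, mode="same")

    def phosphor_decay(prev_frame, new_frame, persistence=0.35):
        # Phosphors keep glowing after the beam passes: blend the fading
        # light of the previous frame with the freshly excited value.
        return np.maximum(new_frame, prev_frame * persistence)

    # A hard one-pixel-wide white dot comes out as a soft blob.
    line = [0, 0, 0, 1, 0, 0, 0]
    print(np.round(crt_scanline(line), 3))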
The second is the artifacts that occur with some video signals, such as the luma/chroma crosstalk inherent in properly modulated composite video. Since the Amiga was an RGB-first system, these artifacts aren't generally relevant unless you're specifically targeting the look of Amiga graphics displayed via composite, which was somewhat common (the A1000 and A1200 had color composite video output built in, and the A520 video modulator connected to the RGB video port and thus worked with pretty much all Amigas). Various platforms had various levels of adherence to the actual NTSC spec (a few, like the Apple II, even intentionally used a slight mis-timing of the signal to leverage NTSC signal characteristics and provide color output with a minimal amount of hardware). The Amiga did, however, output an extremely spec-compliant signal by default, especially when switched into interlace mode. This made the Amiga wonderful for video production work.
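The second aspect can be sketched the same way: a toy composite encoder/decoder where luma and chroma share the subcarrier band, so a naive decoder turns fine luma detail into spurious color (the Apple II artifact-color trick) and leaves chroma residue in the luma (dot crawl). The sample rate, filter, and decoder here are simplifications for illustration, not how any particular chipset or TV actually does it:

    import numpy as np

    FSC = 3.579545e6          # NTSC colour subcarrier, Hz
    RATE = 4 * FSC            # sample at 4x the subcarrier

    def encode_composite(y, i, q):
        # Luma plus quadrature-modulated chroma on one wire: from here on
        # the two share the same slice of spectrum.
        phase = 2 * np.pi * FSC * np.arange(len(y)) / RATE
        return y + i * np.cos(phase) + q * np.sin(phase)

    def decode_naive(composite):
        # Crude receiver: low-pass for luma, demodulate the remainder as
        # chroma. Luma detail near 3.58 MHz becomes bogus colour, and any
        # chroma left in the low band becomes dot crawl in the luma.
        phase = 2 * np.pi * FSC * np.arange(len(composite)) / RATE
        lp = np.ones(4) / 4   # box filter stands in for the luma low-pass
        y_out = np.convolve(composite, lp, mode="same")
        chroma = composite - y_out
        i_out = 2 * np.convolve(chroma * np.cos(phase), lp, mode="same")
        q_out = 2 * np.convolve(chroma * np.sin(phase), lp, mode="same")
        return y_out, i_out, q_out

    # A pure-luma stripe pattern at the subcarrier frequency: no colour was
    # encoded, but the decoder "finds" some anyway (artifact colour).
    y = np.tile([0.0, 0.0, 1.0, 1.0], 16)
    _, i_out, q_out = decode_naive(
        encode_composite(y, np.zeros_like(y), np.zeros_like(y)))
    print(np.abs(i_out).max(), np.abs(q_out).max())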
This is speculation and opinion, but I feel like the majority of complaints about CRT video quality are really about the latter. While CRTs aren't perfect, when displaying a high-resolution RGB or component signal on a high-resolution tube, they can offer an image quality that is very hard to match (modern OLED-based sets are getting pretty good at this, and plasmas were also pretty good in this regard). Some of the deepest black levels you'll ever see, with none of the sample-and-hold artifacting you get on modern LCD panels. Virtually no display lag: the image appears on screen pretty much as soon as it reaches the set (HDTV CRTs with image processing notwithstanding; SD CRTs and PC monitors are often the best in this regard).
Of course, much of this is subjective preference, outside of the actual physical characteristics. Some people are bothered by the 15kHz flyback whine (which I can still hear, albeit at a reduced volume, at 43 years old). Some people find the interlacing used with SD video signals annoying. But there is also a segment of people who grew up with CRTs everywhere and find all of that stuff nostalgic. Some of us even intentionally use NTSC filtering in emulators where possible to really tickle that nostalgia.