The problem with judging a language by how it was a decade ago is that you're going to be left claiming absurd, outdated things. Imagine if I claimed I didn't like Java because it lacks generics, or because it lacks lambdas, or because I really, really dislike working with EJBs, or because working with Java means "working with a lot of XML" (this would flunk you in an interview, by the way).
Imagine if I complained about Linux and all I used as an argument was the Unix Haters Handbook.
Imagine if I complained about Windows and the most recent version I had used was Windows 95.
That depends on what you want from the judgement, doesn't it?
If you want to be able to know the theoretical things you could do with the language, I think your definition is right.
If you want to know the most likely experience you would have using the language, I think mine is right.
I think it also depends on the circumstances in which you're using the language. Spinning up a new project probably lends itself more to your definition. If you're looking to join an existing project, mine is probably more useful.
I think the plurality of situations the language is actually used in today would be a better standard than the one you're advocating for.
This has the advantage that the merits people judge a language on and the experience they will most likely have using it line up.
The downside is that the merits used may differ a bit between markets or industries, but I'm personally OK with that.