
I've been pasting comments into ChatGPT and asking what they mean by certain terms or phrases, and it works extremely well, giving an answer I can independently verify to be true. Far more useful than getting a simple dictionary lookup.

An example response I got recently:

>In the context of this comment, "dovish" means that the speaker perceives Powell's statement to be more accommodative towards economic growth and less concerned about inflation than Lowe's statement. This suggests that Powell may be more inclined to keep interest rates low or lower them further to stimulate economic growth, rather than raise them to combat inflation. The term "dove" is often used to describe policymakers who prioritize economic growth over inflation concerns. In contrast, "hawkish" refers to policymakers who prioritize fighting inflation over economic growth.

Meanwhile, Google gives me this response for a definition:

>Definitions of dovish. adjective. opposed to war. synonyms: pacifist, pacifistic peaceable, peaceful. not disturbed by strife or turmoil or war.




> Far more useful than getting a simple dictionary lookup.

Yes.

Long-time dictionary-editor and translator here. What blew me away most when I first tried ChatGPT, and what continues to amaze me, is its ability to distinguish and explain the meanings of individual words relative to their context. I have spent much of my career doing just that, and I always felt that it relied on my knowledge and intuition as a human and would be a really hard problem for AI. Conventional corpus linguistics had not even come close to cracking the problem.

I’ve done tests in which I gave ChatGPT texts that contained very rare words, new words, or nonce words that would not be defined in any dictionaries, and some of which were unattested on the web. When given enough context, it was able to deduce the intended meanings.

In contrast, its ability to explain grammar is terrible. It misidentifies parts of speech, singulars and plurals, the subjects and objects of verbs, even the number of letters in words. I hope that people develop plugins for it to fix its language-explaining ability as, reportedly, other plugins are able to fix its natively terrible mathematical ability.


> even the number of letters in words.

This one isn't surprising at all: it doesn't deal with letters, but symbols. It only indirectly knows about spelling.
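A toy sketch of why this happens. The vocabulary and greedy longest-match tokenizer below are invented for illustration (real models learn a BPE vocabulary with tools like tiktoken), but the point carries over: the model receives "strawberry" as two opaque symbols, so the ten letters inside them are never directly visible to it.

```python
# Toy stand-in for subword (BPE-style) tokenization. The vocabulary is
# made up for demonstration; real models use learned merge tables.
VOCAB = {"straw", "berry", "s", "t", "r", "a", "w", "b", "e", "y"}

def tokenize(word: str) -> list[str]:
    """Greedy longest-match tokenization over the toy vocabulary."""
    tokens = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):  # try the longest piece first
            if word[i:j] in VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            raise ValueError(f"cannot tokenize {word!r}")
    return tokens

print(tokenize("strawberry"))  # ['straw', 'berry'] -- 2 symbols, not 10 letters
```

From the model's point of view there are two items in that sequence, so "how many letters are in this word?" requires knowledge about the spelling of each token rather than simple counting.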

> In contrast, its ability to explain grammar is terrible. It misidentifies parts of speech, singulars and plurals, the subjects and objects of verbs,

I wonder if this is an area where having to generate an immediate final answer word-by-word is killing it, and whether it could be induced to "diagram" a sentence first and then give a correct answer. Can you give me an example of the query you're asking?


> This one isn't surprising at all: it doesn't deal with letters, but symbols.

It’s not surprising to us now. It was very surprising to me when I first noticed it, as it contrasted sharply with ChatGPT’s ability to explain aspects of language that seem to us humans as being much more difficult, particularly word meaning.

> Can you give me an example of the query you're asking?

I have to admit that all of my testing of its grammar-explaining ability was done last December. I have repeated a few of those tests now with GPT-4, and it did fine [1].

[1] https://www.gally.net/temp/202212chatgpt/explaininggrammar.h...


>I have to admit that all of my testing of its grammar-explaining ability was done last December. I have repeated a few of those tests now with GPT-4, and it did fine [1].

A tale as old as time in this space! I appreciate you checking it again. They are improving so fast.


Here is an example of the results I got while playing with this in ChatGPT-4. Note that it doesn't understand the syntactic ambiguity of the sentence:

The sentence can be broken down as follows:

Main Clause: - The burglar threatened the student with the knife.

Subject: - The burglar

Predicate: - threatened the student with the knife.

Object: - the student

Prepositional Phrase: - with the knife

The prepositional phrase "with the knife" modifies the verb "threatened." It tells us what the burglar used to threaten the student.

Now, let's identify the parts of speech of each word in the sentence and whether the nouns are singular or plural:

- The: article, singular
- burglar: noun, singular
- threatened: verb, past tense
- the: article, singular
- student: noun, singular
- with: preposition
- the: article, singular
- knife: noun, singular

All the nouns in this sentence are singular. There is no ambiguity in meaning in this sentence.


I tried that sentence and three similar ones, and this time GPT-4 caught the ambiguity. The results are here:

https://www.gally.net/temp/202212chatgpt/20230508-syntactic-...

I am especially impressed by its comment about one interpretation of “The woman cleaned her teeth with a knife”: “The woman cleaned her teeth that had a knife in them (highly unlikely, but still a possible interpretation).”

As I noted in another comment, I might need to backtrack on my assessment of its grammatical-explanation ability, as until today I had only done tests using the initial release of ChatGPT, not GPT-4.


Anthropic's Claude[1] and Claude+[2] both failed some of these.

Question:

I am interested in syntactic ambiguity in English. Can any of the following sentences be interpreted in more than one way? Explain why or why not.

The man opened the package with a knife.

The burglar threatened the student with a knife.

The woman cleaned her teeth with a knife.

The child pointed at the teacher with a knife.

Claude's answer:

These sentences can be interpreted in multiple ways:

1. The man opened the package with a knife. Ambiguous: The knife could have been used as a tool to cut tape and open the package, or it could imply the man threatened someone with a knife while opening the package.

2. The burglar threatened the student with a knife. Unambiguous: The implication is clearly that the burglar threatened the student using a knife as a weapon.

3. The woman cleaned her teeth with a knife. Ambiguous: The knife could refer to a toothpick or dental tool, or it could imply an unusual way of brushing teeth.

4. The child pointed at the teacher with a knife. Ambiguous: The knife could have been used to gesture and point, or it could imply the child threatened the teacher with a knife.

In summary, sentences that suggest using a knife as a tool tend to be ambiguous between a literal and figurative interpretation. Sentences that clearly describe a knife being used as a weapon are generally unambiguous. The ambiguity arises from the multiple meanings and uses of the word "with."

Does this explanation make sense? Let me know if you have any other questions!

Claude+'s answer:

Here are my analyses of potential syntactic ambiguity in those sentences:

The man opened the package with a knife. - This sentence is ambiguous. It could mean either that the man used a knife to open the package, or that the package contained a knife inside. The ambiguity arises because "with a knife" could modify either "opened" or "package".

The burglar threatened the student with a knife. - This sentence is unambiguous. The only coherent interpretation is that the burglar used a knife to threaten the student. "With a knife" can only modify "threatened" in this sentence.

The woman cleaned her teeth with a knife. - This sentence is ambiguous. It's unclear whether it means the woman used a knife to clean her teeth, or her teeth contained a knife. Again, "with a knife" could modify either "cleaned" or "teeth".

The child pointed at the teacher with a knife. - This sentence is unambiguous. The only meaning that makes sense is that the child pointed at the teacher while holding a knife. "With a knife" can only modify "pointed at" in this example.

In summary, ambiguity can arise when a prepositional phrase like "with a knife" can logically modify more than one part of the sentence. Context often helps resolve such ambiguities, but in isolation some sentences remain open to multiple interpretations. Let me know if you have any other questions!

[1] - https://poe.com/Claude-instant

[2] - https://poe.com/Claude%2B


> In contrast, its ability to explain grammar is terrible.

To be fair, grammar is a bit of an advanced subject, even if it is ostensibly easy (which it isn't). Isn't that the same problem for every other subject? Other than encyclopedic knowledge, it struggles with many subjects at higher levels (e.g. programming beyond a few basics, math, and so on).

I think this is partially an issue of availability of data.


Something else to consider is that languages differ as to the complexity of their grammar.

Much research and training of LLMs has been done in English, but if these models were trained on as much data in other languages as in English, I wonder:

- Would LLMs do better or worse on intelligence and other tests if they were tested in other languages?

- Could conversing with LLMs be easier or harder in other languages?

Some languages, like Loglan or Lojban, might be especially suited to this sort of testing and interaction, as they were designed to be easy for computers to interpret.


Grammar models already exist and have for years. I think giving GPT access to one would be the right approach there.


Yes. And that gets into the awkward bit. LLMs are incidentally good as databases/search engines for information that is prevalent on the internet and not disputed. Ideally we could separate the LLM from the information. So when asking it for a definition, you'd provide your query and the source comment, and then the computer would add in the definition and some uses in context.

Right now we've tangled an understanding of language and a corpus of information together in a way that causes distrust in AI. If the AI gets some fact wrong (like Bard did when demo'd earlier this year) people laugh and think LLMs are a failure. They should not be used for information recall. That is a task that a vector database would be good at.
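A minimal sketch of that separation, with everything invented for illustration (the definition store, the bag-of-words "embedding", and the prompt template; a real system would use learned embeddings and an actual vector database): retrieve the best-matching definition first, then hand it to the LLM as context instead of trusting the model's recall.

```python
# Retrieval step sketched with a toy vector store: bag-of-words vectors
# plus cosine similarity stand in for learned embeddings.
import math
from collections import Counter

DEFINITIONS = {
    "dovish": "favoring low interest rates and accommodative monetary policy",
    "hawkish": "favoring higher interest rates to fight inflation",
    "pacifist": "opposed to war or violence",
}

def embed(text: str) -> Counter:
    """Bag-of-words 'embedding' -- a stand-in for a learned vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str) -> tuple[str, str]:
    """Return the (term, definition) pair most similar to the query."""
    q = embed(query)
    return max(DEFINITIONS.items(),
               key=lambda kv: cosine(q, embed(kv[0] + " " + kv[1])))

term, definition = retrieve("what does dovish mean for interest rates")
prompt = (f"Using this definition -- {term}: {definition} -- "
          f"explain the term in the context of the comment below.")
```

The LLM then only has to do what it is genuinely good at (explaining language in context), while the facts come from a store that can be audited and updated.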


Yes, I think this explains a lot of the skepticism. People look at ChatGPT as an information repository, and so it is impressive but flawed, while looking right past its incredible facility with language, which is positively superhuman. Some humans can exceed its language skills on some details, but no one can speak so many languages (both natural and programming languages) with the ease and facility of ChatGPT.


I just checked with Google, providing minimal context via the query "dovish economy":

> What is does Dovish Mean? Dove refers to an economic policy adviser who advocates for monetary policies involving low-interest rates. The doves argue that inflation isnt bad and that it is bound to have few negative effects on the economy.

Linked to a site that includes all those typos. Followed by other links on doves and hawks.



