Hacker News

"Where are we going to find the equivalent data set that will train this other kind of thinking?"

Just a personal opinion, but in my admittedly derivative, unpublished fiction pastiche of When H.A.R.L.I.E. Was One (and others) — really a ripoff — I had the nascent AI stumble upon Cyc as its base for the world and for "thinking about how to think."

I never thought that Cyc was enough, but I do think that something Cyc-like is necessary as a component, a seed for growth, until the AI begins to make the transition from the formally defined, vastly interrelated frames and facts in Cyc to being able to grow further and understand the much less formal knowledge base you might find in, say, Wikipedia.
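For what it's worth, the kind of seed I mean can be caricatured in a few lines: a toy store of explicit, interrelated assertions plus one trivial inference rule. Everything here (the predicate names, the tuple encoding) is my own invention, not Cyc's actual CycL representation, which is vastly richer.

```python
# Toy sketch of a Cyc-like seed: explicit assertions as triples,
# plus transitive closure over "isa" as a stand-in for inference.
# Purely illustrative; names and structure are made up.

facts = {
    ("isa", "Pond", "BodyOfWater"),
    ("isa", "BodyOfWater", "Location"),
    ("drinksAt", "Macaque", "Pond"),
}

def isa_closure(facts):
    """Derive transitive 'isa' facts until nothing new appears."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        new = {("isa", a, c)
               for (p1, a, b) in derived if p1 == "isa"
               for (p2, b2, c) in derived if p2 == "isa" and b2 == b}
        if not new <= derived:
            derived |= new
            changed = True
    return derived

all_facts = isa_closure(facts)
print(("isa", "Pond", "Location") in all_facts)  # True
```

The point of the formality is exactly this: the derived fact follows mechanically from stable rules, which is the "reason rather than statistically infer" distinction I'm gesturing at.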

Full agreement; your animal model is the only sensible one. If you think about macaques, they have a limited range of vocalizations once they hit adulthood. Note that the mothers almost never make a noise at their babies. Lacking language, when a mother wants to train an infant, she hurts it. (Shades of Blindsight there.) She picks up the infant, grasps it firmly, and nips at it. The baby tries to get away, but the mother holds it and keeps at it. Their communication is pain. Many animals do this. But they also learn threat displays, the promise of pain, which goes beyond mere carrot and stick.

The more sophisticated multicellular animals (let us say birds, reptiles, mammals) have to learn to model the behavior of other animals in their environment: to prey on them, to avoid being prey. A pond is here. Other animals will also come to drink. I could attack them and eat them. And with the macaques: "I must scare the baby and hurt it a bit, because I no longer want to breastfeed it."

Somewhere along the line, modeling other animals (in-species or out-species) hits some sort of self-reflection and the recursion begins. That, I think, is a crucial loop to create the kind of intelligence we seek. Here I nod to Egan's Diaspora.

Looping back to your original point about the training data, I don't think that loop is sufficient for an AGI to do anything but think about itself, and that's where something like Cyc would serve as a framework for it to enter into the knowledge that it isn't merely cogito ergo summing in a void, but that it is part of a world with rules stable enough that it might reason, rather than "merely" statistically infer. And as part of the world (or your simulated environment), it can engage in new loops, feedback between its actions and results.
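That last feedback loop, between actions and results, can be caricatured as the standard agent/environment cycle. A minimal sketch, with an entirely made-up world rule and a trivial fixed policy; the only point is that the agent's model grows out of observed consequences of its own actions:

```python
# Minimal agent/environment feedback loop: the agent acts, the world
# responds by a stable rule, and the agent records the result.
# Entirely illustrative; names and dynamics are my own invention.

def environment(state, action):
    """Stable world rule: moving right increases position by one."""
    return state + (1 if action == "right" else -1)

def run(steps=5):
    state, model = 0, {}
    for _ in range(steps):
        action = "right"                    # trivially fixed policy
        new_state = environment(state, action)
        model[(state, action)] = new_state  # learn from the result
        state = new_state
    return state, model

final_state, model = run()
print(final_state)  # 5
```

A world with rules stable enough to support this kind of loop is what lets the model's entries stay true tomorrow, which is the prerequisite for reasoning over them at all.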



> A pond is here. Other animals will also come to drink. I could attack them and eat them.

Is that the dominant chain, or is the simpler “I’ve seen animals here before that I have eaten” or “I’ve seen animals I have eaten in a place that smelled/looked/sounded/felt like this” sufficient to explain the behavior?


Could be! But then there are ambushes, driving prey into the claws of hidden allies, and so forth. In many instances, modeling the behavior of other animals has to occur independently of place.


I like your premise! And will check out Harlie!



