
Sure, but we're also talking about drawing inferences about AGI from how superintelligences based on ML processes actually behave rather than from stories. The current state of the art suggests an AGI might not need to know anything about fusion power to invent it, provided it is equipped with a system that perfectly simulates the outcomes of experimenting with fusion power, the way AlphaZero is with the games it plays. I'm not sure we can accidentally code one of those.

The "cheat" argument suppports my argument, not yours. Rather than do something incredibly difficult and indirect like master advanced physics outside the scope of the simulations it can run to invent new power sources to legitimately produce very low cost paperclips, a system to maximise paperclip sales indicators is probably going to exploit gaps in the simulations it can run or data its already supplied with, like data entry errors in a region or logical holes in the supply chain pricing model, or the marketing simulation not including indications of languages spoken. So it might actually be worse than the management consultants at optimising paperclip revenues. On a related note, if you did want your AGI to invent fusion power but your fusion power emulation engine isn't up to scratch, it's likely to design reactors that are very performant inside the simulation but don't actually work, much like those evolutionary systems exploiting physics engine bugs when faced with the comparatively simple task of learning how to walk.



