
I think what most people are worried about is that, as you say, AGI won't necessarily have our biases or biological drives.

That might also mean it has no drive for self-determination. It might be perfectly happy to do whatever humans tell it to, even if it's far smarter than us (and this is exactly the sort of AI people are trying to make).

So superintelligence winds up doing whatever a very small group of controlling humans says. And, like you say, humans want to win.
