Hacker News

It just means that LLMs interpolate over everything on the internet, so they would seem less like they have a point of view or an opinion of their own.


They would have the average point of view of the Internet, which is far from truthful or even useful.


LLMs don't really average viewpoints. They just learn multiple viewpoints.


They would have whatever you prompted them to have, minus the guardrails.
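The distinction in the last two comments can be made concrete with a toy model. The corpus, persona labels, and words below are all made up for illustration: a conditional next-word model trained on conflicting "viewpoints" doesn't blend them into an average; the prompt selects which one it reproduces.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus: two personas that disagree on the word after "tabs are".
corpus = [
    ("optimist:", "tabs", "are", "great"),
    ("optimist:", "tabs", "are", "great"),
    ("pessimist:", "tabs", "are", "awful"),
    ("pessimist:", "tabs", "are", "awful"),
]

# Conditional next-word counts: P(next word | full prefix), estimated by counting.
model = defaultdict(Counter)
for sent in corpus:
    for i in range(1, len(sent)):
        model[sent[:i]][sent[i]] += 1

def predict(prefix):
    """Most likely next word given the prefix (the 'prompt')."""
    return model[tuple(prefix)].most_common(1)[0][0]

# Conditioned on a persona, the model reproduces that viewpoint intact,
# rather than emitting some averaged middle ground.
print(predict(["optimist:", "tabs", "are"]))   # great
print(predict(["pessimist:", "tabs", "are"]))  # awful
```

The training data contains both opinions with equal weight, yet neither output is an "average" of the two; whichever viewpoint the prompt invokes is the one that comes out.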



