Hacker News

I wonder if we could have much smaller models if they were trained on fewer languages? i.e. Python + YAML + JSON only, or even a single language each, with a cluster of models loaded into memory dynamically...?
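A minimal sketch of the "cluster of models loaded dynamically" idea, assuming a hypothetical `ModelRouter` that keeps at most N per-language models resident and evicts the least recently used one (the loader functions and string "models" are stand-ins; real ones would load weights onto a GPU):

```python
# Hypothetical sketch: one small model per language, lazily loaded,
# with an LRU cache bounding how many are in memory at once.
from collections import OrderedDict

class ModelRouter:
    def __init__(self, loaders, max_loaded=2):
        self.loaders = loaders        # language -> function that loads its model
        self.max_loaded = max_loaded  # how many models fit in memory at once
        self.loaded = OrderedDict()   # language -> loaded model, in LRU order

    def get(self, language):
        if language in self.loaded:
            self.loaded.move_to_end(language)    # mark as recently used
        else:
            if len(self.loaded) >= self.max_loaded:
                self.loaded.popitem(last=False)  # evict least recently used
            self.loaded[language] = self.loaders[language]()
        return self.loaded[language]

# Stand-in "models" are just labels here.
loaders = {lang: (lambda l=lang: f"{l}-model") for lang in ("python", "yaml", "json")}
router = ModelRouter(loaders, max_loaded=2)
router.get("python")
router.get("yaml")
router.get("json")   # evicts the python model
```

Whether the per-language models would actually be small enough to make the swapping worthwhile is the open question.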


