Hacker News

Isn't "give it a big chunk" just the same problem at a higher level? How do you handle, say, a book?


You don't need to handle the whole book at once; the goal is to break the book into chunks of the right size, each smaller than the context window of the model you're using to chunk it semantically. When ingesting data, you fill up the chunker model's context, and it breaks that text into smaller, self-contained chunks plus a remainder. You then start from the remainder, slurp up as much additional text as fits in the context, and repeat the process.
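The loop described above can be sketched roughly like this. Note that `semantic_chunk` is a hypothetical stand-in for the LLM call that does the actual semantic splitting; here it's a naive stub that breaks on paragraph boundaries, just to make the fill/chunk/remainder cycle concrete.

```python
CONTEXT_CHARS = 2000  # stand-in for the chunker model's context size


def semantic_chunk(window: str) -> tuple[list[str], str]:
    """Split `window` into self-contained chunks plus a remainder.

    A real implementation would ask an LLM to find semantic
    boundaries; this stub just splits on blank lines and treats
    the final paragraph as the remainder.
    """
    paragraphs = [p for p in window.split("\n\n") if p.strip()]
    if len(paragraphs) <= 1:
        return [], window  # nothing to split off yet
    return paragraphs[:-1], paragraphs[-1]


def chunk_document(text: str) -> list[str]:
    chunks: list[str] = []
    remainder = ""
    pos = 0
    while pos < len(text) or remainder:
        # Slurp up enough new text to fill the chunker's context,
        # starting from the remainder of the previous pass.
        take = max(1, CONTEXT_CHARS - len(remainder))
        window = remainder + text[pos:pos + take]
        pos += take
        new_chunks, remainder = semantic_chunk(window)
        chunks.extend(new_chunks)
        if pos >= len(text):
            # End of input: flush whatever is left in the window.
            if remainder:
                chunks.append(remainder)
            remainder = ""
    return chunks
```

This is only the control flow; in practice the interesting work is the prompt that asks the model to return coherent chunks and to mark where the "incomplete tail" begins.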




