Hacker News | rustastra's comments

Same with symbolic systems!


As I understand it, a natural-language question, e.g. "What is the refund policy?", gets matched against the formalized contracts, and the relevant bit of the contract is then translated back into natural language deterministically. At least that's how I'd do it; not sure how it actually works.
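A minimal sketch of that matching step, using TF-IDF similarity to pick the most relevant clause and a fixed template for the deterministic translation back to natural language (the clause texts and function names here are hypothetical, purely for illustration):

```python
# Match a natural-language question to the most similar contract clause,
# then render that clause through a fixed template (deterministic output).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical formalized clauses, keyed by topic.
clauses = {
    "refund": "Buyer may return goods within 30 days for a full refund.",
    "liability": "Seller liability is capped at the purchase price.",
    "termination": "Either party may terminate with 60 days written notice.",
}

def answer(question: str) -> str:
    topics = list(clauses)
    texts = [clauses[t] for t in topics]
    # Fit on clauses + question so both share one vocabulary.
    vec = TfidfVectorizer(stop_words="english").fit(texts + [question])
    sims = cosine_similarity(vec.transform([question]), vec.transform(texts))[0]
    best = topics[sims.argmax()]
    # Deterministic translation: the matched clause goes into a template.
    return f"Per the {best} clause: {clauses[best]}"

print(answer("What is the refund policy?"))
```

A real system would match against a formal representation of the contract rather than raw clause text, but the retrieve-then-render shape would be similar.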


I couldn't agree more with this take:

> Our demand for stimulating content is being overtaken by supply. Analogously, with AI, we might be in a world where scientific progress is accelerated beyond our wildest dreams, where we have more answers than questions, and where we cannot even process the set of answers available to us.


Curious to learn more about the prompt engineering takeaways here. Was feeding in more context (chapters of textbooks, bits of papers, documentation) helpful? It does seem like layering information and being very precise helps a lot. Eerily like working with interns.


As a user, am I expected to write the labeling algorithms myself or do you offer some in-built ones?


You can do either! We offer a bunch of automation features directly through the web app, but people have also used the SDK to write their own algorithms. We've seen a lot of different annotation processes by now, so we can often point people to the best flow for automating their labeling.


I am very curious to know which pre-trained models work better for this task, and whether it's possible to do at all without a neural net...


It really just depends on the task. For this particular case I used a Faster R-CNN model with weights pretrained on the COCO dataset.


Could feature extraction be helpful here?


Feature extraction from a pre-trained model? Sure; again, it depends how you use it. We have successfully used feature extraction plus clustering for some of these labeling tasks in the past.

