This is really cool! Both of my parents are cell biologists, and I've spent some time in labs myself, so there's a lot of paper hunting and reading in the family. Review articles are a good index, but something more on-demand makes a lot of sense; I can definitely see this being extremely useful.
You've probably seen those animated captions underneath videos all over the place. This is a super minimal demo of that, also with chapter summaries. Just one file hosted on val.town! https://www.val.town/v/substrate/subaudio
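The caption side of a demo like this boils down to turning timed transcript segments into a track the browser can display. Here's a minimal sketch that emits WebVTT (the standard format for `<video>` caption tracks); the segment data is made up for illustration:

```python
# Sketch: turn transcript segments (start/end in seconds, text) into a
# WebVTT caption track. The segments below are invented example data.

def vtt_timestamp(seconds: float) -> str:
    # WebVTT timestamps look like HH:MM:SS.mmm
    ms = round(seconds * 1000)
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d}.{ms:03d}"

def to_webvtt(segments) -> str:
    cues = ["WEBVTT", ""]  # required header line, then a blank line
    for start, end, text in segments:
        cues.append(f"{vtt_timestamp(start)} --> {vtt_timestamp(end)}")
        cues.append(text)
        cues.append("")  # blank line terminates each cue
    return "\n".join(cues)

track = to_webvtt([
    (0.0, 2.5, "Hi, welcome back."),
    (2.5, 6.0, "Today: a tiny captions demo."),
])
print(track)
```

Serve the result as a `.vtt` file and point a `<track kind="captions">` element at it; the "animated" part is just the browser swapping cues as playback crosses each timestamp.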
Hi! I put together this little app using Substrate – https://substrate.run – and Val.Town to help me brush up on my Spanish vocab. It's pretty neat to be able to generate text, images, and audio, and then ask an LLM to generate a website.
Fun alternative to building out the UI yourself – ask an LLM to do it! I do this all the time now, asking the LLM to stream its output as Markdown or HTML.
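The trick is just accumulating the streamed chunks and re-rendering the Markdown as a growing HTML page on each one. A rough sketch of the loop, with a generator standing in for the real streaming LLM response (the converter below is a toy covering only headings, bullets, and bold; a real app would use a Markdown library):

```python
# Sketch: render an LLM's streamed Markdown output as HTML.
# fake_llm_stream() stands in for chunks from a streaming completion API.
import html
import re

def fake_llm_stream():
    yield "## Results\n"
    yield "- **Spanish**: hola\n"
    yield "- **French**: bonjour\n"

def markdown_to_html(md: str) -> str:
    # Tiny converter for just the constructs used above.
    out, in_list = [], False
    for line in md.splitlines():
        line = html.escape(line)
        line = re.sub(r"\*\*(.+?)\*\*", r"<strong>\1</strong>", line)
        if line.startswith("- "):
            if not in_list:
                out.append("<ul>")
                in_list = True
            out.append(f"<li>{line[2:]}</li>")
            continue
        if in_list:
            out.append("</ul>")
            in_list = False
        if line.startswith("## "):
            out.append(f"<h2>{line[3:]}</h2>")
        elif line:
            out.append(f"<p>{line}</p>")
    if in_list:
        out.append("</ul>")
    return "\n".join(out)

buffer = ""
for chunk in fake_llm_stream():
    buffer += chunk
    page = markdown_to_html(buffer)  # re-render the page on every chunk
print(page)
```

Re-rendering the whole buffer each time is wasteful but simple, and it means partially streamed Markdown always produces valid HTML.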
Little demo that searches Hacker News comments for a topic (using https://hn.algolia.com/api), extracts sentiment and other metadata, then generates a research summary.
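The search step is the documented HN Algolia endpoint; `tags=comment` restricts results to comments. A sketch of that piece (the sentiment extraction and summary would feed these hits into an LLM downstream):

```python
# Sketch: pull comments on a topic from the HN Algolia API
# (https://hn.algolia.com/api).
import json
import urllib.request
from urllib.parse import urlencode

API = "https://hn.algolia.com/api/v1/search"

def comment_search_url(topic: str, page: int = 0) -> str:
    # tags=comment limits hits to comments (vs. stories, polls, etc.)
    return f"{API}?{urlencode({'query': topic, 'tags': 'comment', 'page': page})}"

def fetch_comments(topic: str) -> list[str]:
    with urllib.request.urlopen(comment_search_url(topic)) as resp:
        hits = json.load(resp)["hits"]
    return [h.get("comment_text") or "" for h in hits]

url = comment_search_url("local-first software")
```

From there it's one LLM call per comment for sentiment/metadata, then one more call over the extracted results for the summary.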
Really proud of the API we've built at https://substrate.run – you don't have to think about graphs, but you implicitly create a DAG by relating tasks to each other. Because you submit the entire workflow to our inference service, you get automatic parallelization of dozens of LLM calls for free, no intermediate roundtrips, and much faster execution of multi-step workflows (often running on the same machine).
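The core idea can be sketched generically: passing one task as an input to another is enough to define the graph, and any tasks with no pending dependencies can run in parallel. This is illustrative Python (the `Task` names are made up, not the Substrate SDK), using the stdlib `graphlib` to derive the parallel stages:

```python
# Sketch: relating tasks to each other implicitly forms a DAG;
# independent tasks can then execute in parallel. Task names are invented.
from graphlib import TopologicalSorter

class Task:
    def __init__(self, name, *deps):
        self.name, self.deps = name, deps

# No explicit graph API -- dependencies come from how tasks reference
# each other, e.g. both essays consume the topic task's output.
topic   = Task("find_topics")
essay_a = Task("write_essay_a", topic)
essay_b = Task("write_essay_b", topic)
summary = Task("summarize", essay_a, essay_b)

graph = {t: set(t.deps) for t in (topic, essay_a, essay_b, summary)}
ts = TopologicalSorter(graph)
ts.prepare()

stages = []
while ts.is_active():
    ready = list(ts.get_ready())  # everything here can run concurrently
    stages.append(sorted(t.name for t in ready))
    for t in ready:
        ts.done(t)

print(stages)
# The two essays land in the same stage, so they run in parallel.
```

Submitting the whole graph up front is what makes this possible: the scheduler sees every edge at once, instead of discovering the next call only after a client roundtrip.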