Anki and GPT-3 (andrewjudson.com)
209 points by Tomte on Feb 2, 2023 | 77 comments



As a daily Anki user, I am highly skeptical of this idea. The careful creation of new cards is one of the key parts of learning the material. It promotes a critical-thinking step where you have to distill the material into small but important pieces and (if you want to do it well) forces you to evaluate whether you have actually understood what you wrote down.

I suspect that forgoing this step will not lead to understanding, but to rote memorization of trivia. Which is fine if that's your goal, but it leaves most of the potential of spaced repetition on the table.


Anki's key benefits stem from two well-supported and extensively researched pedagogical strategies: (1) retrieval practice and (2) spaced repetition. In contrast, note-taking, which essentially amounts to summarizing information in one's own words, has been found to have 'low utility' in the academic literature. [0]

[0]: https://sci-hub.ru/https://pubmed.ncbi.nlm.nih.gov/26173288/


Depends on the note-taking strategy. From the same paper you cited, section 8.4 Practice testing - Issues for implementation, pages 34-35:

> Another merit of practice testing is that it can be implemented with minimal training. Students can engage in recall-based self-testing in a relatively straightforward fashion. For example, students can self-test via cued recall by creating flashcards (free and low-cost flashcard software is also readily available) or by using the Cornell note-taking system (which involves leaving a blank column when taking notes in class and entering key terms or questions in it shortly after taking notes to use for self-testing when reviewing notes at a later time; for more details, see Pauk & Ross, 2010).

Further explanation: https://lsc.cornell.edu/notes.html#post-1037

The Cornell note-taking system combines moderate and high learning techniques: elaborative interrogation, self-explanation, practice testing, and distributed practice.

If you think about it, creating Anki cards is note-taking in a specific format. Of course you can skip it and use a set prepared by someone else. It would be interesting to test: is creating your own flashcards better for studying than using ready-made ones?


Founder of StudyWand.com here; I received a 15k grant to develop an AI flashcard-generating app in 2020, after an earlier prototype.

We've found students more consistently study ready-made cards that are at a desirable difficulty (they get about 80% correct) and that are segmented by topic (e.g. semantic grouping of flashcards to tackle "one lesson at a time", like Duolingo). Students prefer pre-made flashcards from other students in their class first, then AI flashcards, and only then creating and using their own.

There is limited evidence from Roediger and Karpicke, the forefathers of retrieval practice, that creating cards is also important. Frank Leeming's 2002 "exam-a-day" study also showed that motivation while studying peaks when you ask just a few questions a day, but every working day.

Now, one of the vital benefits of retrieval practice with AI over creating your own cards is foresight bias, not yet mentioned in this thread: particularly in subjects like physics, students don't know what they don't know (watch this amazing Veritasium video, which also explains why misconceptions are so handy for learning physics): https://www.youtube.com/watch?v=eVtCO84MDj8 - basically, if you use AI quizzes (or any prepared, subject-specific right/wrong system), you learn quickly where your knowledge sits and what to focus on, and reduce your exam stress. If you just sit there making quizzes, you first make questions on things you already know, you overestimate how much you can learn, and you consolidate your existing strengths while avoiding your own knowledge gaps until later, which is less effective.

--

To quote from my dissertation experiment on background reading for retrieval practice (the end touches on foresight bias):

Retrieval practice – typically, quizzing – is an exceedingly effective studying mechanism (Roediger & Karpicke, 2006; Roediger & Butler, 2011; Bae, Therriault & Redifer, 2017; see Binks, 2018 for a review), although underutilized relative to its recorded merit, with students vastly preferring to read content (Karpicke & Butler, 2009; Toppino & Cohen, 2009). Notably, mature students do engage in practice quizzes more than younger students (Tullis & Maddox, 2020). Undertaking a quiz (retrieval practice) can enhance test scores significantly, including web-based quizzes (Daniel & Broida, 2017). Roediger & Karpicke (2006) analysed whether students who solely read content would score differently from students who took a practice quiz, one week after a 5-minute learning session. Students retained information to a higher level in memory after a week with the quiz (56% retained) versus without (42%), despite having read the content fewer times (3.4 on average) than the read-only control group (14.2 times). Participants subjectively report a preference for regular quizzing (Leeming, 2002) over final exams, when assessed with the quiz results, with 81% and 83% of participants in two intervention classes recommending Leeming's "exam-a-day" procedure for the next semester, which runs against the intuition that students might be biased against more exams/quizzes (due to test anxiety). Retrieval practice may increase performance via increased cognitive load, which is generally correlated with score outcomes in (multimedia) learning (Muller et al., 2008). Without adequate alternative stimuli, volume of content could influence results, so differentiated conditions to control for this possible confound are required when exploring retrieval-practice effects (as seen in Renkl, 2010 and implemented in Methods).

Retrieval practice in middle- and high-school students can reduce test anxiety, when operationalised as "nervousness" (Agarwal et al., 2014), though presently no research appears to have analysed the influence of retrieval practice on university students' test anxiety. Quizzing can alleviate foresight bias – overestimation of required studying time – in that students assign a greater, more realistic study-time plan (Soderstrom & Bjork, 2014). Despite the underutilization noted by Karpicke and Butler (2009), quizzing is becoming more common in burgeoning eLearning courses, supported by the research (e.g. Johnson & Johnson, 2006; Leeming, 2002; Glass et al., 2008) demonstrating efficacy in real exam performance.


I'm a long-term (10+ year) user of Supermemo (and a general fan of SRS stuff) and finally got around to checking out StudyWand today. This is the best experience I've had making flashcards and general study material, ever. Hands down, nothing I've seen comes close.

It's wild because StudyWand took my sample notes and did everything that I would have done with them if I was going to use them to get a good grade in school. I was expecting some semi-decent cloze generated cards but got much more.

Literally, when I was in college I would take notes in class, and then spend about 20-40 minutes post class doing almost exactly what StudyWand does. The classes that I bothered doing that for, I always got a good grade in, nearly effortlessly. The hardest part was making the notes.

The part of this that I'm actually excited about is that this tool also works with any sort of documentation. For example, I can clean up any reference page from MDN as a PDF and get a usable (like actually well-made flashcards) set of 15-20 flashcards for it. Oh, you also get summaries and multiple essay questions too. The only way this would be better is if it gave you cloze deletions that were actual sample code to fill in the blank with.

I didn't really like your intro, so I took a few days longer than I normally would have to look at your software (I normally check out every SRS software I see on HN). This software is insane. The value is so, so, so ridiculous. I half-heartedly uploaded one poorly made PDF of a webpage and got flashcards comparable to what I would make as a 10+ year SRS user. I almost stopped my initial reviews halfway through to look for a way to pay for this.

Outside of Supermemo, this is the only other SRS software that I've seen that's worth my money. The hardest part is going to be convincing all my younger family members to actually use this. I've tried so many times to get people to use Anki (Supermemo won't happen), and they just don't get it. I think StudyWand might be able to bridge that gap. I'm going to try and see.


Whoa, awesome to see a fellow SuperMemo user here! I've been going for 17 years, I absolutely can't imagine life without SuperMemo and I've stopped trying to sell my friends on it. Have you tried integrating Study Wand with SuperMemo at all? I'm always on the lookout for ways to maximize my information intake (Aside from SuperMemo, which already does a nearly perfect job at it).


Nothing automated yet, but I may plan to. I don't regularly add that many new cards to my collection to where it'd be an immediate benefit for me. I may add more with StudyWand since it seems to do a good enough job of creating cards. Incremental reading is the primary way I add new material to my collection, but I don't really fully process articles that much into items these days. Most of my SuperMemo use is using incremental reading to ensure I always have something interesting to read as well as tasklists for planning/ideas. I have a lot of stuff that I like to revisit or review, which I find using SuperMemo great for too.


That's an excellent response. Thank you. I didn't think of the foresight bias and now that I do, it makes a lot of sense.

> Undertaking a Quiz (Retrieval practice) can enhance test scores significantly, including web-based quizzes

The way I understand it, retrieval practice increases test scores where you test information retrieval (quizzes, multiple choice, etc.). Which makes sense because you're practicing a skill that the test evaluates.

The follow-up question: does retrieval practice increase scores when you evaluate understanding of a subject, such as open tests or essays?


Thank you.

I'm afraid I don't know and couldn't easily find any research. I did find this post, but it looks a little like SEO spam and doesn't cite the essay claim: https://www.bookishelf.com/the-importance-of-information-ret...

I'll ask Prof Roediger as we occasionally communicate and will aim to get back to you. However, I wouldn't be surprised if any correlation was not statistically significant.


One more thing. When you mentioned Test Anxiety it reminded me of frequent comments about anxiety during job interviews. I'm wondering if flashcards (or other retrieval practice) could help there too. Perhaps you can spin it into a product for professionals.


What kind of flashcards would help there for you?

[edit: removed a point about variance which wasn't supported by more recent literature]

High stress (or anxiety) happens only periodically... like those job interviews you mention. It varies massively between individual subjects. There are some ways to reduce anxiety, but I wouldn't say anything has been conclusive yet. Here is an excerpt from my dissertation about test anxiety... I found practice quizzes had a non-significant overall effect on test anxiety and sadly reduced only the "Tenseness" subscale of a relatively old measure. The CTAS scale, referenced below, would probably be most useful for the job-interview case.

"Test Anxiety is the additional stress felt when you must provide answers. It is prevalent, affecting students from preschool (McDonald, 2001) to taught university level, and across demographics (Beidel et al., 1994; Beer, 1991; Mwamwenda, 1993). Prevalence is estimated at 20% (Ergene, 2003) to 25% (Thomas et al., 2018) in current students. Test Anxiety often involves pressure, highlighting a social, non-genetic nature, and high variance is recorded between subjects (e.g. Keith et al.'s 2010 longitudinal study showed low individual variation but high between-subjects variation). Overwhelming Test Anxiety significantly impairs wellbeing (Steinmayr et al., 2016). Interventions and online course design choices can markedly reduce Test Anxiety (Abdous, 2019). Unfortunately, few online learning studies record Test Anxiety, instead recording grade/score differences. Yet, Test Anxiety interventions are efficacious – e.g. "test-wiseness training" (Kalechstein et al., 1988), group counselling (Alkhawaja, 2013) and mindfulness training (Seidi & Ahmad, 2018). Research often classifies students into High and Low groups. There are disproportionately negative outcomes for High Test Anxiety students, who more recently have been enrolled automatically in targeted interventions (e.g. psychoeducation; Bedel et al., 2020). Test Anxiety is related to Social Anxiety (Rothman, 2004; Sarason and Sarason, 1990). This manifests in a social-evaluative component, which appears in most scales, such as the Test Anxiety Inventory (TAI; Spielberger, 1980). TAI was designed for assessing children (Ludlow & Guida, 1991). Recent scales, including the Cognitive Test Anxiety Scale (CTAS; Cassady & Johnson, 2002), are adapted to university participants. An example CTAS item is "I feel under a lot of pressure to get good grades on tests.".

CTAS has similar correlations with academic performance (Chapell et al., 2005) as those utilizing TAI (such as Hunsley, 1985), without the concerns about TAI applicability discussed in Szafranski (2012). Test Anxiety is typically construed as a trait under latent state-trait theory, despite short-term experimental intervention differences being observed (e.g. "looking ahead" immediately prior to testing in Mavilidi et al., 2014; or taking practice quizzes in Agarwal et al., 2014). Test Anxiety should not be confused with other forms of anxiety, such as General or Social Anxiety Disorder. Prevalence of anxiety is rising in college populations (Zagorski, 2018), as is loneliness, with 1 in 5 young people reporting having no close friends and feeling alone (n=2,522; Mental Health Foundation, 2019). Usage of social media is also associated with greater loneliness (Wohn & LaRose, 2014). However, artificially induced status updates have been shown to reduce it (Deters & Mehl, 2012), which aligns with status updates requesting academic support reducing Test Anxiety as found in Deloatch et al. (2017) and findings linking higher wellbeing with greater academic outcomes (Public Health England, 2014). Research on social anxiety and social media is less clear, although recently Erliksson et al. (2020) correlated greater internet usage and activity with increased social anxiety. Despite wide research on the negative aspects of social media usage (e.g. for university students, see Odacı & Kalkan, 2010) - little attention has been paid to online helping, which is fundamentally social."


According to some people (Justin Sung on YouTube is where I heard it, probably, so no idea if credible), you can influence the forgetting curve and make Anki more effective by adding more context, putting knowledge into relationships with other knowledge, etc. It's a multiplier on raw/rote spaced repetition.


I have found that by adding simple etymologies I can improve my recall of Latin names for plants. The etymology connects an otherwise totally new word to existing word-concepts.


Really depends on how you're defining context. To me there are concepts like memory chains/ladders, which simply fit sequential memories into a visualization, and then there are application contexts, which actually mirror the reality of future use.

A great example of this is memory sports [1], which do use techniques that work, but are likely less useful for actually remembering real-world complex information.

[1] https://wikipedia.org/wiki/Memory_sport


I think the benefits are multiplicative here.

As another experienced Anki user, I have found I get much more leverage out of spaced repetition (both in terms of efficiency of memorization, as well as in how useful the information is) when I've first made the knowledge my own and structured it in a way that makes sense to me before creating the cards, rather than just dumping a bunch of pure entropy into the Anki database.

That's not really note-taking per se, but it is ensuring that the stuff I'm trying to memorize isn't pure entropy (which is always more of a challenge to memorize in any event), but rather is part of a larger sense-making structure. The purpose of spaced repetition is to help prevent that structure from decaying; it's not a substitute for having it in the first place.


As others already pointed out, the linked article seems to say the reverse of how you interpreted it (though I would agree the terms are a bit ambiguous).

Note-taking as parroting is distinct from note-taking as distillation. The latter has a much better chance of producing results with proper spaced repetition, since it helps establish a concrete foundation on which to build more knowledge in an easily retrievable manner. The former may or may not benefit from spaced repetition at all.

I would say if GPT distills successfully, then while this isn't as good as self-distillation, it may still be useful.

If not, or worse, actively bullshits, it probably won't be much help.


Note taking with SRS may be greater than the sum of the parts.


Yeah. But hey! You can only recall what you learned in the first place, right?


Could you elaborate on the definition of 'low utility'? Is it only in the context of memorization (retention), or in the context of general understanding?

I can't see how summarizing isn't a preprocessing step for learning in general. Spaced repetition is a natural next step. I don't see how they compete.


It's a classic beginner mistake to load up a 20,000-word deck and start learning it when you have no relationship or connection to the cards. Create a card when you first see the word in real life, with that example sentence on the back, and it's so much easier to remember. At least for me, anyway.


Those large word decks can be useful though. Download the deck, immediately suspend all cards, and then start un-suspending cards as you learn the words. I also add notes to the cards as I go.

I have a similar strategy for the Ultimate Geography deck. I want to know where every country is on a map, as well as their capital and flag. So I suspended all cards, and then periodically enable 20 new ones, learning as I go. That often means googling the location or pronunciation the first time I encounter the card.

(I use the low-key Anki configuration with Pass/Fail buttons plugin)
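For anyone who wants to automate the same workflow, here's a minimal sketch using the AnkiConnect add-on's HTTP API (deck name and batch size are placeholders; assumes the add-on is listening on its default port 8765):

```python
# Sketch: keep a big shared deck fully suspended and un-suspend a small
# batch of new cards at a time via the AnkiConnect add-on.
import json
import urllib.request

def build_request(action, **params):
    # AnkiConnect's wire format: action name, protocol version, params
    return {"action": action, "version": 6, "params": params}

def invoke(action, **params):
    data = json.dumps(build_request(action, **params)).encode("utf-8")
    req = urllib.request.Request("http://localhost:8765", data)
    reply = json.load(urllib.request.urlopen(req))
    if reply.get("error"):
        raise RuntimeError(reply["error"])
    return reply["result"]

def unsuspend_next_batch(deck="Ultimate Geography", batch_size=20):
    # "is:suspended" is standard Anki search syntax
    ids = invoke("findCards", query=f'deck:"{deck}" is:suspended')
    batch = ids[:batch_size]
    invoke("unsuspend", cards=batch)
    return batch
```

Run `unsuspend_next_batch()` whenever you're ready for the next 20 cards; the rest of the deck stays dormant.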


Everyone's different, but the act of creating the card itself helps motivate me to learn it. I really enjoy having a hand-curated deck. Though I can see larger decks being useful if the card is annoying to make (like your geography example).


The opposite theory is that getting _some_ exposure to the information, even if you can't actively recall it, is still valuable. When you encounter that information in the wild, your brain then has an existing pathway to strengthen.


How do you find the time/energy to get the example sentence on the back of each card? Do you have a process/method to make it easier?

When I encounter new words in real life, it tends to be all at once (e.g. watching a show while making dinner). I scribble the word quickly on a nearby notepad and don't have time to look at it again until I'm bulk-uploading new words later. It bothers me that I'm maybe not getting the full benefit of Anki, because I'm missing the context and/or example sentences.

I would love to hear ideas about how to resolve that timing issue.

I seem to retain more from those 20k word decks you mentioned than my personal decks, likely because they come with example sentences and context that my personal decks are missing.


Totally agree. But I'm tempted to solve this by adding MORE AI. Have the tool prompt GPT to quiz you about the new card, or force you to use a new vocabulary word a bunch of times. Maybe even have it argue against you adding the card, so you have to convince it the information is worthwhile.

USER: Add a new Anki card for, "It is important to set a high standard from day one."

GPT: That seems like generic advice. Of course high standards are better than low standards in a vacuum, but everything has a cost. Being too rigid in setting high standards may limit the business's flexibility to adapt to changing market conditions, customer needs, and technological advancements. Are you sure high standards are that important?

(That criticism was actually generated with GPT, using the example the tool's author tweeted.)
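That devil's-advocate flow could be prototyped with a thin wrapper that embeds the card text in a challenge prompt before it goes to a completions endpoint; the template wording and function names here are hypothetical, not from the author's tool:

```python
# Hypothetical prompt-builder for the "argue against adding this card" idea.
# Only the prompt assembly is shown; sending it to a model is left out.
CHALLENGE_TEMPLATE = (
    "A student wants to add this flashcard to their spaced-repetition deck:\n"
    "  {card}\n"
    "Play devil's advocate: briefly argue why this might NOT be worth "
    "memorizing, then ask the student to confirm or refine the card."
)

def build_challenge_prompt(card_text):
    return CHALLENGE_TEMPLATE.format(card=card_text)

prompt = build_challenge_prompt(
    "It is important to set a high standard from day one.")
```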


Anyone who has heavily used Anki will know that loading up a deck of complex content you haven't learned beforehand won't lead to satisfying results, because cards in spaced repetition lack (1) a meaningful pedagogic order for learning the content and (2) meaningful context for the big picture. Anki is meant for not forgetting, not much more.


Creating cards is great, but for those seriously deep in the anki game (eg 20,000 active cards or 20 new cards per day for years straight), there isn't enough time to create all the cards you need to learn and pre-made decks are king.

I imagine if you need to learn that much material but the pre-made deck isn't there, this gpt integration might be very helpful.


I agree that generating your own content is better for learning material. But I disagree that the effect is so large it nullifies any benefit of the training.

If "not drilling the material at all" sits at one end of a spectrum and "writing your own questions" at the other, I think drilling GPT-generated questions lands closer to the latter than the former.


Yep, a large aspect of flashcards when studying, or of the occasions in college when you're allowed to bring notes to exams, is personally curating the material and thinking about it during the process.

Perhaps GPT can find truly useful information, but if you don't know why it's useful, you're just filling up your memory with material you don't know how to utilize well.


What you say is the commonly accepted advice, and the ideal workflow. But the truth is that I have been using a huge premade deck to learn Chinese for the last 3 years, and it works.

Would I learn more if I created my own cards? Sure, but studying the premade deck takes me 30 minutes per day, which I can afford. I cannot afford the time to make and update a custom deck.


Yeah, I had some success with generating conjugation tables for irregular verbs in Portuguese, but that's just rote memorisation. I still had to clean them up, but the entire process was quicker than typing manually or web scraping.


Not even factual trivia


Very cool! However, I often feel like the process of generating the question/answer for the Anki card is an important part of the learning process, because it forces you to think deeply about the material and reflect on it. So I think there is a trade-off involved.


Absolutely, you lose a lot of effectiveness when you automate card creation. But it's still better than thinking, "Oh, I should make a card about this so I don't forget this important thing..." and never doing it.

For me the biggest risk is being tempted to make way too many cards (because it's so fast and easy now!), ending up with way too many reviews, and declaring Anki bankruptcy and uninstalling the app after a few months. I may have done this more than once...


imo if you don't make the card then either: the word isn't important, or it comes up again, and you finally do make the card. It's not a big deal.

I'm about 10 years on my Anki deck and have reached the same conclusion as you about keeping deck size down. A single card has a huge time investment if you add up the reviews. Even more if it's a bad card and you fail it a lot. As the words you learn get more and more niche, it's important to weigh up whether a card is worth making or keeping. I actively delete cards that make me feel 'meh' when I see them, or that I fail a lot, so I don't lose motivation.
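As a rough sanity check on that time investment, here's a sketch assuming SM-2-style scheduling (the interval multiplied by ~2.5 per successful review) and about 10 seconds per review; both numbers are illustrative assumptions, and every lapse resets the interval, so "meh" cards you keep failing cost far more than this best case:

```python
# Back-of-envelope: how many reviews does one card cost over a decade,
# assuming every review succeeds and the interval grows ~2.5x each time?
def reviews_within(days, first_interval=1.0, ease=2.5):
    """Count successful reviews needed to carry a card through `days`."""
    elapsed, interval, count = 0.0, first_interval, 0
    while elapsed < days:
        elapsed += interval
        interval *= ease
        count += 1
    return count

decade = reviews_within(3650)   # ~10 reviews in the never-fail best case
minutes = decade * 10 / 60      # a couple of minutes at 10 s per review
```

The gap between this ideal and the real cost of a frequently-failed card is exactly why pruning bad cards pays off.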


These days I have a policy to just suspend once the interval hits six months, so that deck size has a max cap and can eventually go to zero if adds stop for a long enough period. Long enough to bootstrap niche words and hopefully maintain them through reading.


Just a side note on Anki - it sounds cool, but how is it possible that there are zero screenshots on the product page? Furthermore, it has a super busy (/intimidating-to-non-techies) download section, and the only real descriptive text is on the linked docs/help site, which is so dense that it is completely overwhelming to anybody who is still in "convince me" mode.

The project would probably have 10x the number of users in 6 months if they learned how to do product marketing.

https://apps.ankiweb.net/


This is one of the reasons I made Mochi [0]. I couldn't understand how this was the best recommendation for new people. Anki can be very useful and powerful, but it is so user-unfriendly as to be downright hostile.

(Plus Mochi has GPT-3 built in [1])

[0] https://mochi.cards/

[1] https://twitter.com/MochiCardsApp/status/1603569008493711361


This looks excellent. My one hesitation with moving over is whether or not there is a way to export back to Anki should your tool ever cease to exist.

SRS is a life long game, so I don’t want to lose everything if Mochi shuts down in 5, 10, 20 years (no business lasts forever). Whereas with Anki I know I’ll be able to use it forever.


I don't have an export to Anki format specifically, but you can export to JSON (the JSON format is documented on the website, unlike Anki's, which had to be reverse engineered) or markdown, plus all of the data is stored locally on your computer in a SQLite database.


I've been using Mochi for a while now; I tried some of the others out there and settled on Mochi as the best.


Your site makes my CPU sweat with Firefox 108 on Linux. The problem seems to be the animations on the page; with them disabled there's no issue.

The product does seem good. I may give it a try.


Looks cool! Will it still work when your company is gone?


Anki has been around for a decade, almost two.

Several commercial competitors came and went over the years.

I suppose it tells you something about the learning tools market.


I initially discovered it ~10 yrs ago as an alternative for Memrise back when all of Memrise courses were still free. UI/UX wise it seemed outdated-ish but overall very powerful. Glad to see this still going strong as FOSS.


Noo don't you'll flood the sync servers and then they'll have to put in ads and tracking and annoying animated birds to keep it all running


the thing is, the difficulty associated with actually using the app on a daily basis is very high. you can try to optimize for top of funnel but then you end up with a toy app like Duolingo that can acquire a bunch of users but doesn't actually fulfill the learning goals very well.


Wait until you have tried SuperMemo, the OG SRS


This is a great idea.

This is what note taking is lacking!

When you pile up 200 notes, or 2,000... you can push them into GPT to generate Anki cards so that you can remember and make sense of it all.


As the founder, I would love to hear your thoughts on StudyWand.com, which does this for PDFs/YouTube videos and has greater factual accuracy than base GPT models by quoting the source documents (thanks to a university grant): https://studywand.com/images/splash/getStudyingGIF.gif


This is really cool.

My first thoughts without thinking are:

1. How easy it was to set up on Ubuntu Linux using Firefox, lol. This is more about Anki being somewhat frustrating sometimes.

2. I have tried using notes that I didn't make myself, and there were pros and cons, but bigger cons.

3. The generated note didn't abide by the 20 rules, which I find to be a pretty good guideline. That seems pretty fixable though!


Video of it in action (link from his site) https://twitter.com/i/status/1621025729013239813


Very cool! I've experimented with a similar thing for trivia questions based on Wikipedia articles.

I'm also trying to reference the part of the text where the question was generated from, so the user can verify if it was correct or learn more about it in context.

https://trivai.app/

I currently require a sign up to generate questions, but the latest questions can be seen at

https://trivai.app/latest-questions


> What is the belief in the existence of an afterlife called?

Answer: ancestor worship.

Then the explanation talks about the other three options, but does not reference afterlife.

I wonder what went wrong?


Me too! I’ve been experimenting with the prompt to try to make it provide better references and questions.

Logged in users can upvote and downvote questions so the next step will be to try to fine tune the model on good questions.


This looks awesome, a great example of how to use GPT-3 for fun and learning.


Really cool to see! I totally agree that GPT models will revolutionise education, and a few thousand students are using a tool I've launched in that space that runs on PDFs and Youtube videos.

It specifically optimises LLMs to align with the Anki Minimum Information Principle; here's a 15-second GIF of StudyWand.com - https://studywand.com/images/splash/getStudyingGIF.gif

... showed it on r/GetStudying a few days back mentioning Anki MIP!: https://www.reddit.com/r/GetStudying/comments/10lg4ci/automa...

... it has >98% recall on psychology (compared to 70-95% recall, depending on the subject, for the SOTA AIs). We achieve this through quoted filtering: 80%+ of questions are quoted to specific slides or timestamps in YouTube videos.

Would love to chat with you Tomte.


From my experience, automatically generated flashcards are often disappointing. Recently I've been experimenting with GitHub Copilot to accelerate card creation, and I can wholeheartedly recommend this approach; I've noticed at least a 2x speed-up in the process without sacrificing card quality.


Can you explain your process in more depth?


Wow this has given me so many ideas. I never even thought to have GPT make flash cards.


Why would you use an LLM to create Anki cards when you can use said LLM to obtain the answer in the first place, eliminating the need to recall the information you are trying to remember with Anki?


lol came here to say this. this defeats the purpose of anki


why learn anything at all if you can just ask the LLM?


That’s the next logical step in the same argument.


Given the timing (ChatGPT out for about a month, GPT-3 with close-to-identical capabilities in a different UI for 2.5 years), this announcement seems to add another data point that ChatGPT was a genius marketing campaign that pushed GPT adoption into the mainstream.


This is pretty cool.

One of the things I think is hugely important about GPT3 is that it's paid for by the token. So for small users, it's pretty cheap, and it's fully workable for testing and experimentation.


This is accurate for testing and experimentation purposes. Additionally, the $18 starting credit provided by OpenAI is a nice bonus.

I created a small Anki plugin that tags cards based on their content using the text-davinci-003 model. However, the cost quickly becomes prohibitive when a large number of tokens/Anki cards need to be processed. To be fair, that plugin doesn't work great, since tags should be reviewed, merged, and reused at a global scope.


At 2 cents per 1000 tokens (~750 words), unless you're doing it a lot that should be quite manageable? That's ~$13 for 1000 cards of 500 words (prompt + card + tags). The smaller models are cheaper still and might perform well enough.
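Spelling out that arithmetic (the $0.02-per-1k-token price and the ~750-words-per-1000-tokens ratio are the figures from this thread, not guaranteed to be current):

```python
# Cost estimate for processing cards with a per-token-priced model.
# Assumes $0.02 per 1000 tokens and ~750 words per 1000 tokens,
# both taken from the discussion above.
PRICE_PER_1K_TOKENS = 0.02
WORDS_PER_1K_TOKENS = 750

def batch_cost(n_cards, words_per_card):
    """Dollar cost of processing n_cards, counting prompt + card + tags."""
    tokens = n_cards * words_per_card * 1000 / WORDS_PER_1K_TOKENS
    return tokens / 1000 * PRICE_PER_1K_TOKENS

# 1000 cards at 500 words each works out to roughly $13
cost = batch_cost(1000, 500)
```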


I guess you are generally right. My prompts are lengthy, since I let GPT-3 handle the full HTML syntax of both sides, MathJax and so forth. I found the ada model insufficient, but I have not tried the other two models; I will try those! There is room for improvement on my side.

From a budget-conscious student's perspective, this amounts to roughly $20 per semester in my case, just for the tags. Despite this, I am still fascinated by the automation possibilities offered by GPT-3.


That's totally fair; I should be more conscious of different positions (and not get stuck in "as someone with a stable income..." thinking), and you're right that these things add up. Curie should cost 1/10th of Davinci. Also, the recent models are all better, I think, so if you've not tried them since the GPT-3.5 release it might be worth trying the smaller ones again. If you could get that to $2/semester it's a lot more palatable, as long as the quality isn't too bad. Maybe there's a flow where you can identify low-performing cards/subsets so you can do 90% in Curie and 10% in Davinci.

I think my expectation of this kind of service (on demand huge network) is fixed monthly fees, call us for pricing, etc. It's cool that it's approachable as a student. It'll be very interesting as costs for these things drop and/or performance improves at the same price points.


Silly question, but how does one get an Anki API Key? I've installed anki-connect and can't quite figure out where to get the key...


You can set one up in the anki-connect configuration, or just put in a dummy value; the key is ignored if you don't have one set.

If you still get a 403 error, change the allowed origins to "*" in anki-connect's configuration.
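Putting that together, here's a minimal sketch of where the key goes in an anki-connect request (the key value is a placeholder; the top-level "key" field is only enforced if an API key is set in the add-on's configuration):

```python
# Minimal AnkiConnect call showing the optional top-level "key" field.
import json
import urllib.request

def anki_request(action, params=None, key=None):
    body = {"action": action, "version": 6, "params": params or {}}
    if key is not None:
        body["key"] = key   # omit entirely, or use a dummy, if no key is set
    return body

def call_anki(action, params=None, key=None):
    data = json.dumps(anki_request(action, params, key)).encode("utf-8")
    with urllib.request.urlopen(
            urllib.request.Request("http://localhost:8765", data)) as resp:
        reply = json.load(resp)
    if reply.get("error"):
        raise RuntimeError(reply["error"])  # auth failures are reported here
    return reply["result"]

# call_anki("deckNames", key="my-secret") would list your decks
```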


i want this same thing for notion


Try saving your Notion page as a PDF and uploading it to StudyWand.com. I'd be super keen to hear your thoughts on the flashcards, which use a grounded LLM approach (founder here).


It already exists natively https://www.notion.so/product/ai

Use the "summarise" option.



can you elaborate more?



