That library doesn’t even appear to have a stable release yet, and was at v0.0.x as of a year or so ago… you also may be using ChatGPT 3.5, which may predate this library. As a dev with 15 years of experience I haven’t even switched over from Jest (but plan to)… all this to say, maybe we can give the bot some slack here. It should be possible to include Vitest docs and examples in your prompts to teach it in context; did you try that?
Sure, I realize it's unsuccessful at using vitest because it's (relatively) new.
I'm just saying, this was a really telling example of how to use it for prompting.
A very large chunk of the tools I use in JavaScript-land are "too new" for ChatGPT to work with properly.
Giving context unfortunately doesn't really work, because ChatGPT usually prioritizes what it has absorbed from its training corpus over anything you tell it in the prompt.
To be clear, it does fine with new information when what you ask for doesn't closely match token sequences it was already trained on. If you give it a fictional library and ask it to perform a task that doesn't look too much like what it would do with an existing library that has a similar purpose and a similar API, it will actually use the custom code more successfully.
But for Vitest, it can't accept enough of the docs you might provide for that to be useful (though admittedly, it will sometimes show how to do something in Jest that at least makes finding the equivalent in Vitest easier).
By the way, if you are planning to switch over in the future, the migration path appears to be well documented by Vitest and looks pretty straightforward, though I haven't used Jest enough to compare.
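For what it's worth, the API surface is close enough that a typical Jest test body often reads the same; here's a minimal sketch of what a Vitest test looks like (the main visible difference being the explicit imports, and the `sum` function is just a stand-in for illustration):

```ts
// sum.test.ts -- a minimal Vitest test; the body reads just like Jest.
import { describe, it, expect, vi } from "vitest";

// Stand-in function purely for the example.
function sum(a: number, b: number): number {
  return a + b;
}

describe("sum", () => {
  it("adds two numbers", () => {
    expect(sum(1, 2)).toBe(3);
  });

  it("supports mocks much like jest.fn()", () => {
    const spy = vi.fn(sum);
    spy(2, 3);
    expect(spy).toHaveBeenCalledWith(2, 3);
  });
});
```

(If you enable globals in the Vitest config, you can even drop the import line and keep Jest-style implicit `describe`/`it`/`expect`.)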
edit: to be clear, I'm very impressed with ChatGPT's capabilities, and I think there are good examples of prompting where it does meaningful work in tandem with the human driver exercising their own judgment.
This was an example of a person asking it for things without pointing out its limitations, which downplays the extent to which one needs to exercise one's own judgment when using it. If they failed to point out the things ChatGPT got wrong that I do know about, why would I trust that the things I can't check myself are accurate?