My experience with taking notes in meetings is definitely that the brain is the bottleneck, not the fingers. There are times when I literally just type what the person is saying, dictation style (i.e., recording a client's exact words, which is often helpful for reference, even later in the same meeting). I can usually keep up. But if I'm trying to formulate original thoughts, synthesise what I've heard, or find a way to summarise what they've been saying - that's where I fall behind, even though the total number of words I need to write is actually much smaller.
So this definitely wouldn't help me here. Realistically, though, there ought to be better solutions, like something that just listens to the meeting and automatically takes notes.
Not when I want my notes to contain my own thoughts or reminders to myself. Those exist only in my head, and today I either have to miss what's being said next while I type them out, speak up in the moment (even if it's not really on topic), or lose the thought entirely.
I haven't found that to be very accurate. I suspect the internal idiosyncrasies of a company are an issue, as the AI doesn't have the necessary context.
Seems like it would be much easier to solve that problem than to cross the brain barrier and start interfacing with our thoughts, no? Just provide some context on the company, etc.
“Sounds like it would”, yes, but in practice no off-the-shelf solution works remotely well enough.
> Just provide some context on the company etc
The necessary “context” includes at least the correct spelling and pronunciation of the name of every employee with a non-English first name, so it's far from trivial.
I can break 100wpm, especially if I accept typos. It's still much, much slower to type than I can think.