I haven't used Thunderbird in a long time, but I regularly used Outlook with multi-gigabyte .pst files. Surely SQLite on an SSD would be up to the task of handling at least a million emails of average size.
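To make the claim concrete, here's a minimal sketch of the idea, not how Thunderbird or Outlook actually store mail: an SQLite FTS5 table holding messages, which handles full-text search over large stores comfortably. The file name, table, and column names are made up for illustration, and it assumes your SQLite build includes FTS5 (most do).

```python
import sqlite3

conn = sqlite3.connect("mailstore.db")
# Full-text-indexed table for message text; schema is illustrative only.
conn.execute("""
    CREATE VIRTUAL TABLE IF NOT EXISTS messages USING fts5(
        subject, sender, body
    )
""")

# Insert a message (in practice you'd bulk-load many inside one transaction).
conn.execute(
    "INSERT INTO messages (subject, sender, body) VALUES (?, ?, ?)",
    ("Quarterly report", "alice@example.com", "Attached is the Q3 summary..."),
)
conn.commit()

# MATCH uses the inverted index, so searches stay fast even with millions of rows.
for row in conn.execute(
    "SELECT subject, sender FROM messages WHERE messages MATCH ?", ("quarterly",)
):
    print(row)
```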
What has been your experience? Mine, in trying to use and support it, is that Outlook is an Exchange client; PSTs are hacks to meet demand, though they work well enough in limited circumstances. PSTs accessed over a LAN connection, in particular, are a disaster.
The Exchange server hardware was so underpowered (or the software so ill-designed for large mailboxes) that Exchange-powered searches would fail, but ones run on the local PST would complete successfully (if slowly). This was on an HDD; an SSD would be much faster.
OT, but is that right? SSDs have many advantages, but sequential read isn't necessarily one of them. SSD seeks are much faster, but this is ~one file. Throughput can be much faster due to the better interfaces, but is throughput the bottleneck for this kind of search?
SSDs are usually better at sequential reads as well as seeks. Depending on how Outlook organizes the file (and how it gets laid out in the file system), there's probably a mix of seeking and sequential reads anyway.
Good question. Proper benchmarking would be required to know for sure. It's probably rare that a multi-gigabyte file would be contiguous on disk, so lots of seeking would probably be required anyway.
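If anyone wants to check on their own machine, a rough benchmark sketch like this compares a sequential pass against random seeks on a big file (say, a copy of a .pst). The path, block size, and iteration count are placeholders; you'd want to drop the OS page cache between runs (or use a file much larger than RAM) for honest numbers.

```python
import os
import random
import time

PATH = "bigfile.pst"   # placeholder: any multi-gigabyte file
BLOCK = 64 * 1024      # 64 KiB per read
N_RANDOM = 10_000      # number of random reads to time

size = os.path.getsize(PATH)

# Sequential pass over the whole file.
start = time.perf_counter()
with open(PATH, "rb") as f:
    while f.read(BLOCK):
        pass
seq = time.perf_counter() - start

# Random reads at arbitrary offsets.
start = time.perf_counter()
with open(PATH, "rb") as f:
    for _ in range(N_RANDOM):
        f.seek(random.randrange(0, max(1, size - BLOCK)))
        f.read(BLOCK)
rnd = time.perf_counter() - start

print(f"sequential: {size / seq / 1e6:.1f} MB/s")
print(f"random:     {N_RANDOM * BLOCK / rnd / 1e6:.1f} MB/s over {N_RANDOM} reads")
```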
I did an internship in IT 20 years ago where we were building/maintaining desktops and doing other general helpdesk type stuff. I'm pretty sure I remember us having a handful of users with multi-gig PST files, running on 2005 hardware.
I'm primarily a Linux user, but Mail.app is probably the best graphical email client I've ever had the pleasure of using (you can pry mutt from my cold, dead hands).