Years ago I worked for a company that claimed to have the largest FileMaker Pro database in the southern hemisphere. One day there was some sort of disk issue and FileMaker silently failed; for months afterwards it kept pretending everything was fine, it just wasn't writing to disk. This lasted until a power outage, when the company discovered it had lost months of customer data. No backups, of course: the backup system was working fine, but there was no new information being written.
Data loss isn't forgivable, especially in the non-technical user space that FileMaker is aiming for.
I'd put that on the DBAs. Once a week they should have been pulling down backups and testing them for problems. That would also have surfaced the issue with the backups themselves.
90+% of the good use cases for FileMaker are at places with no DBAs; it's meant to be a database tool for non-technical people and small businesses. If you had DBAs, they'd just want to lock it down and eliminate the remaining 10% anyway.
If a company is claiming to have the largest database in the Southern Hemisphere, it would have in-house developers (who double as FileMaker DBAs).
I'd be interested to know who the company was, and the metric they used to judge their database size. I worked on one that was 4GB of data and >40GB of binary files (photos, documents). Interestingly, when all the data was removed and the file was optimised, it came to less than 2MB.