Actually, we’re mostly targeting small companies (10–50 people) that need guidance to avoid big fines but can’t afford the bigger, full-featured compliance tools.
Do you think there’s really no room for something like this in the market without having all the compliance certifications first?
There might be. You need to talk to your market and find out. I work at larger companies, so I can’t speak to startup culture right now. There’s no way I would personally sign off on giving access to all of our company data to a small company with no certifications, especially in an AI world where you might leak all of our data into public training models if it’s done wrong.
You’re right. Right now we’re only testing with fake/synthetic data, so no real info is ever scanned. We’re already using local processing, encryption, and access controls to keep everything compliant.
Yes, you can test with real docs. They get processed locally; nothing gets saved on our servers except the scan results, which are encrypted.
We’ve been testing it ourselves by connecting our own Dropbox/Google accounts with fake docs that simulate GDPR issues.
What do you mean? Your demo video clearly shows the document contents in the dashboard. From everything I could see, the document contents would be processed by a cloud LLM.
Everything I see reads like you have a strange understanding of "local" and shouldn't be trusted with building such software.
Yes, the document content is visible in the dashboard when you’re logged in, but it’s fetched at runtime from whichever integration you’re using (Dropbox, Google, etc.) and never stored on our servers.
The cloud LLM just processes the document on the fly to spot potential issues.
And the data you see in the demo is all fake.
> The cloud LLM just processes the document on the fly
That... doesn't sound local, dude. "Locally" would mean that the LLM is actively running in my browser, and in my browser only, which is not what you're describing.
I understand that you're claiming that the documents aren't being stored permanently, but they're still being transferred to your servers, and their full contents are being read there by something.
Yeah, you’re both right: it’s not “local” in the strict sense of running everything, including the LLM, in your browser.
What I meant is that the docs are fetched at runtime and never stored on our servers.
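For what it’s worth, here’s a minimal sketch of the flow I’m describing. All the names and the toy cipher are hypothetical stand-ins, not our actual code, and the comments mark where document content does leave the user’s machine, which I think is the point you’re both making:

```python
import json

def fetch_document(integration: str, doc_id: str) -> bytes:
    """Hypothetical stand-in for pulling raw bytes from Dropbox/Google at runtime."""
    return b"employee passport number: X1234567"

def scan_with_llm(doc: bytes) -> dict:
    """Hypothetical stand-in for the cloud LLM call. Note the raw content
    does leave the user's machine at this step, so it's not 'local'."""
    return {"doc_id": "offer_letter.pdf", "findings": ["possible passport number"]}

def xor_encrypt(blob: bytes, key: bytes) -> bytes:
    """Toy XOR cipher purely for illustration; a real system would use an
    authenticated scheme such as AES-GCM."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(blob))

def run_scan(integration: str, doc_id: str, key: bytes) -> bytes:
    doc = fetch_document(integration, doc_id)  # held in memory, never written to disk
    results = scan_with_llm(doc)               # content transits the vendor/LLM here
    return xor_encrypt(json.dumps(results).encode(), key)  # only this is persisted
```

Since XOR is symmetric, applying `xor_encrypt` again with the same key recovers the stored findings; the raw document itself is never persisted anywhere in this flow, only transmitted for scanning.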
I’m totally open to ideas on how to make the setup better, even if it means tweaking the business model a bit.