I don't fully disagree, but the reason I would care less in this case is that I assume most or all of what you would be feeding it is non-sensitive documentation.
I don't think that matters? Sensitive could just be discussing designs, outages, hiring, interview feedback?
There's a lot of stuff in the average Slack account that people don't want on the internet, let alone in an LLM that could potentially expose it to the entire world.
Maybe companies like Slack will release integrations natively so it won't matter so much.
If you send all of your Slack communications to IngestAI, that would possibly include channels where you discuss interview feedback. That's what the parent poster is saying.
And I am saying that was never the intended purpose of this product, from what I read. That was my whole point in my OP. I agree that there are unique issues with products like these, but it's not alone; at the end of the day, you should not be feeding sensitive data to third-party applications like this.
Edit: This whole thread is goofy. It's the equivalent of asking what would happen if you published all of your internal emails online.
Hey, we are not learning anything from your Slack history or channels.
The way IngestAI works is that it takes your knowledge base as the input (Markdown, docs, etc.) and answers the queries users ask against that knowledge base. Our primary use case has been to simply learn from companies' public documentation and help answer queries within their Slack/Discord communities.
Thanks for your comment, Infecto. Totally agree with you. And again, for those who want to use it with sensitive data, there's an option to use their own AWS S3 cloud storage, or, for an enterprise client, to deploy our app on-premises.
Agree?