"Price on Application
(also available after subscribed to the Trial for self-service customers)"
Please don't do this... I even clicked on the follow-through link, then you want me to SSO... still no price. Save us all some time, it is in short supply.
It's a fair comment. I used to have prices on the website, and then actual customers, while they were on their way to becoming customers, told us to take the prices offline.
Their procurement teams really didn't like prices on the website.
Doomed if you do, doomed if you don't.
The trial is completely free and we don't ask for a credit card. Only once you decide you want to become a customer can you actually see the self-service prices in your ARGOS dashboard.
Those prices are also available via the Azure/AWS marketplaces.
If your entire target market is enterprise then I suppose this route works well, but if you want to reach smaller organisations it deviates a lot from what we're used to. GitHub, Slack, and GSuite all publish prices and are successfully sold to both small organisations and enterprises.
As a non-enterprise, there's no point in me signing up for a trial of a tool that may be orders of magnitude outside my budget. A lot of my decision is based on the value proposition, and I can't understand the value I might get if I don't know the price. If it's $10 a month and does what it says, I probably won't think twice about it; if it's $1,000 a month, that's quite a different decision process. I don't want to bother with a free trial if I can't even see the price. Am I just booking a test drive in a car I'll never be able to afford?
Again, fair comment.
Frankly, I'm still figuring out what this would be worth to smaller organisations.
ARGOS really scales by AWS Account / Azure Subscription / GCP Project right now and I'm currently charging "for each...".
I know what Enterprises and some larger SMBs are paying for ARGOS, but I'm not clear on what value smaller orgs or even startups assign to this problem.
Keep in mind, AWS best practices (and IAM limitations, for those without the resources/time to finely craft the boundaries) encourage account sprawl...
Not all customers spread out like we do... but I manage over 50 AWS accounts, and our DevOps team is 7 FTE... Our application/situation is admittedly unusual, but I could easily see a small business using 10 accounts to manage their org and a single "product"
I used to be a consultant building exactly those designs: limit the blast radius, have separate accounts, etc.
"In the earlier days" of ARGOS that's exactly why I didn't charge per Account, but different ways (tried # of resources, then % of spend) and people were always confused.
Similar to the "take the price off the website" feedback, charging per Account is also what customers asked me to do.
What do you think would be a good unit for a product like this?
I'm happy to try anything really, as long as it helps companies be more secure.
Bots are going to be the way we interact with the web (and really all systems) going forward; this idea of "real people" at just "browsers" is quite a misunderstanding of what a "user-agent" really means in this day and age.
If I launch a new tab in the background and tell it to go establish some set of factors for me, or locate price points and details for me, or buy something for me (and right now, as me), or just have it let me browse and interactively direct it while it blocks ads as I go, that's all still a user-agent acting on my behalf.
I know the law, and lawmakers, are looking at this from a fraudulent-content perspective, but they are going to be hard pressed to do anything in the long run to quell this.
Deletion (or confirmed re-deletion) of the data is irrelevant at this point; it is the models created from that data, and their use, that will now persist in usefulness to Analytica. Armed with these models, and future refined/iterated versions, they will likely capture the data more directly from users in the future. Once the genie is out, it doesn't readily go back in.
I haven't seen this reasonably addressed in any of the discussions or org-based presentations thus far. GDPR compliance itself basically ensures you cannot collect enough information to even defend against this type of attack vector.
This is mentioned in the recitals: you can request additional identification, and in fact you should if you can't identify the subject [1]; and if you can demonstrate that you can't identify the data subject (with reasonable effort), you don't have to comply with the request. [2]
If you are utilizing JSON-RPC anywhere in your stack, you should be authenticating every request via your transport(s), or the payload itself with a JWT (or the like). Not doing this is trusting the world.
This is true over HTTP and browsers, as well as internal servers, sockets, and cross-frame communication. There is no such thing as a trusted internal service, only services that have not yet been breached (looking at you, hardware vendors).
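For illustration, here's a minimal sketch of payload-level authentication, assuming a shared secret and nothing beyond Python's standard library; the "sig" field and the getPrices method are made up for the example, not part of the JSON-RPC spec:

    import hmac, hashlib, json

    SECRET = b"shared-secret"  # hypothetical; real deployments would use per-client keys

    def sign_request(rpc: dict) -> dict:
        # Canonicalise the payload, then attach an HMAC-SHA256 signature.
        body = json.dumps(rpc, sort_keys=True, separators=(",", ":")).encode()
        rpc["sig"] = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
        return rpc

    def verify_request(rpc: dict) -> bool:
        # Recompute the signature over the payload minus "sig"; compare in constant time.
        sig = rpc.pop("sig", "")
        body = json.dumps(rpc, sort_keys=True, separators=(",", ":")).encode()
        expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
        return hmac.compare_digest(sig, expected)

    req = sign_request({"jsonrpc": "2.0", "method": "getPrices", "params": {}, "id": 1})
    assert verify_request(req)  # a tampered or unsigned payload would fail here

In practice you'd use per-client keys and something like a JWT so the token can also carry expiry and claims, but the principle is the same: verify before you dispatch.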
See the details in my comment: the same way you would require authentication and/or signing on any request, on any modern platform. Not doing this is poor form.