
That's solvable with modern cloud offerings - Provision spot instances for a few minutes and shut them down afterwards. Let the cloud provider deal with demand balancing.

I think the real issue is that developers waiting for PRs to go green are taking a coffee break between tasks, not sitting idly getting annoyed. If that's the case you're cutting into rest time and won't get much value out of optimizing this.



Both companies I've worked in recently have been too paranoid about IP to use the cloud for CI.

Anyway I don't see how that solves any of the issues except maybe cost to some degree (but maybe not; cloud is expensive).


Sorta. For CI/CD you can use spot instances and spin them down outside of business hours, so they can end up being cheaper than buying many really beefy machines and amortizing them over the standard depreciation schedule.
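That trade-off is easy to sanity-check with a back-of-envelope calculation. A minimal sketch (all prices here are made-up placeholders, not any provider's actual rate card):

```python
# Back-of-envelope: spot instances billed only during business hours
# vs. a beefy owned machine amortized over a depreciation schedule.
# Every number below is an illustrative assumption.

def spot_annual_cost(hourly_rate, hours_per_day, workdays_per_year=250):
    """Spot capacity billed only while CI actually runs."""
    return hourly_rate * hours_per_day * workdays_per_year

def owned_annual_cost(purchase_price, depreciation_years, power_and_hosting_per_year):
    """Owned hardware: amortized purchase plus running costs."""
    return purchase_price / depreciation_years + power_and_hosting_per_year

spot = spot_annual_cost(hourly_rate=1.50, hours_per_day=10)
owned = owned_annual_cost(purchase_price=15_000,
                          depreciation_years=3,
                          power_and_hosting_per_year=1_200)

print(f"spot: ${spot:,.0f}/yr, owned: ${owned:,.0f}/yr")  # spot: $3,750/yr, owned: $6,200/yr
```

The crossover obviously depends on utilisation: run the fleet around the clock (as the sibling comment about overnight silicon verification does) and owned hardware wins back the advantage.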


Yeah though not for silicon verification because we always run more tests overnight. They get basically 100% utilisation.


Were they running CI on their own physical servers under a desk or in a basement somewhere, or renting their own racks in a data center just for CI?


There are non-IP reasons to go outside the big clouds for CI. Most places I worked over the years had dedicated hardware for at least some CI jobs because otherwise it's too hard to get repeatable performance numbers. At some point you have an outage in production caused by a new build passing tests but having much lower performance, or performance is a feature of the software being sold, and so people decide they need to track perf with repeatable load tests.
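The perf-tracking gate described above can be as simple as comparing a benchmark run on the dedicated box against a stored baseline with a tolerance. A minimal sketch (the threshold and baseline handling are assumptions, not any particular shop's setup):

```python
# Minimal perf-regression gate for CI: fail the build if a benchmark
# slows down by more than a tolerance relative to a stored baseline.
# Dedicated hardware matters because noisy shared VMs make this
# comparison meaningless. Numbers below are illustrative.

def check_regression(baseline_ms, current_ms, tolerance=0.10):
    """Return (passed, relative_change); positive change = slower."""
    change = (current_ms - baseline_ms) / baseline_ms
    return change <= tolerance, change

passed, change = check_regression(baseline_ms=120.0, current_ms=150.0)
print(f"passed={passed}, change={change:+.0%}")  # a 25% slowdown fails a 10% gate
```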


Data center.


That’s paranoid to the point of lunacy.

Azure for example has “confidential compute” that encrypts even the memory contents of the VM such that even their own engineers can’t access the contents.

As long as you don’t back up the disks and use HTTPS for pulls, I don’t see a realistic business risk.

If a cloud like Azure or AWS got caught stealing competitor code they’d be sued and immediately lose a huge chunk of their customers.

It makes zero business sense to do so.

PS: Microsoft employees have made public comments saying that they refuse to even look at some open source repository to avoid any risk of accidentally “contaminating” their own code with something that has an incompatible license.


The way Azure implements CC unfortunately weakens a lot of the confidentiality. It's not exactly their fault; it's more a common side effect of trying to make CC easy to use. You can certainly use their CC to do secure builds, but getting it right would require an absolute expert in confidential computing and remote attestation. I've done design reviews of such proposals before and there are a lot of subtle details.


I don't know about Azure's implementation of confidential compute, but GCP's version essentially relies on AMD SEV-SNP. Historically there have been vulnerabilities that undermine its confidentiality guarantee.


Mandatory XKCD: https://xkcd.com/538/

Nobody's code is that secret, especially not from a vendor like Microsoft.

Unless all development is done with air-gapped machines, realistic development environments are simultaneously exposed to all of the following "leakage risks" because they're using third-party software, almost certainly including a wide range of software from Microsoft:

- Package managers, including compromised or malicious packages.

    Microsoft owns both NuGet and NPM!
- IDEs and their plugins, the latter especially can be a security risk.

    What developer doesn't use Microsoft VS Code these days?
- CLI and local build tools.

- SCM tools such as GitHub Enterprise (Microsoft again!)

- The CI/CD tooling including third-party tools.

- The operating system itself. Microsoft Windows is still a very popular platform, especially in enterprise environments.

- The OS management tools, anti-virus, monitoring, etc...

And on and on.

Unless you live in a total bubble world with USB sticks used to ferry your dependencies into your windowless facility underground, your code is "exposed" to third parties all of the time.

Worrying about possible vulnerabilities in encrypted VMs in a secure cloud facility is missing the real problem that your developers are probably using their home gaming PC for work because it's 10x faster than the garbage you gave them.

Yes, this happens. All the time. You just don't know because you made the perfect the enemy of the good.


> ...your developers are probably using their home gaming PC for work because it's 10x faster than the garbage you gave them...

I went from a waiter to startup owner and then acquirer, then working for Google. No formal education, no "real job" till Google, really. I'm not sure even when I was a waiter I had this...laissez-faire? naive?...sense of how corporate computing worked.

That aside, the whole argument stands on "well, other bad things can happen more easily!", which we agree is true, but also, it isn't an argument against it.

From a Chesterton's Fence view, one man's numbskull insistence on not using AWS that must only be due to pointy-haired-boss syndrome is another's valiant self-hosting that saved seven figures. Hard to say from the bleachers, especially with OP making neither claim.


As a 35 year old waiter with no formal education, who has also spent the majority of his free time the last 25 years either coding or self-studying to further my coding, I am super interested in your life story. While struggling to scrape by has been "awesome", I'm hoping to one day succeed at making tech my livelihood. Do you have a blog or something? lol

But to go back to the topic: are companies that have such a high level of OpSec actually outfitting devs with garbage, enterprise lease, mid-to-low tier laptops? I only have knowledge from a few friends' experiences, but even guys doing relatively non-hardware intensive workloads are given a Dell XPS or MacBook Pro. I would imagine a fintech would know better AND have the funds to allocate for either of those options

Maybe an in-house SWE at a major bank would end up with that level of OpSec on a mediocre fleet laptop, although I'd hope they'd have managers willing to go to bat for them and an IT department that can accommodate provisioning multiple SKUs depending on an employee's actual computational needs.... perhaps I too have a skewed/naive sense of how the corporate computing world works haha


> missing the real problem that your developers are probably using their home gaming PC for work because it's 10x faster than the garbage you gave them.

> Yes, this happens. All the time. You just don't know because you made the perfect the enemy of the good.

That only happens in cowboy coding startups.

In places where security matters (e.g. fintech jobs), they just lock down your PC (no admin rights), encrypt the storage and part of your VPN credentials will be on a part of your storage that you can't access.


In my experience, fintech companies (including ones that either belong to or own a bank) follow one of two playbooks:

- Issue high-powered laptops that the developers work on directly, then install so many security suites that Visual Studio takes three minutes to launch. The tech stack is too crusty and convoluted to move to anything else like developer VMs without major breakage.

- Rely 100% on Entra ID to protect a tech stack that's either 100% Azure or 99% Azure with the remaining 1% being Citrix. You can dial in with anything that can run a Citrix client or a browser modern enough to run the AVD web client. If they could somehow move the client hardware to the Azure cloud, they would.

I don't really associate fintech with a modern, well-implemented tech stack. Well, I suppose moving everything to the cloud is modern but that doesn't mean it's particularly well done.


Microsoft, Google, or Amazon don't care about your fintech code. Other fintechs do.

The threat isn't your cloud provider stealing your code, it's your own staff walking out the door with it and either starting their own firm or giving it to a competitor in exchange for a "job" at 2x their previous salary.

I've seen very high security fintech setups first-hand and I've got friends in the industry, including a friend that simply memorised the core algorithms, walked out, rewrote it from scratch in a few years and is making bank right now.

PS: The TV show Severance is the wet dream of many fintech managers.


Between Github and Copilot, MS has a copy of all of your code.


Yeah I actually 100% agree. I think even more important is that the IP isn't even that valuable to competitors. Nobody outside China would touch it for legal reasons, and even in China it's just not that useful without the people that wrote it. Especially given how badly most of my colleagues document their code!



