I wish the US would implement a similar system, but I wonder how it's going to work when housing prices are astronomical, especially in the Bay Area
Getting paid 250k/yr with a 20% down payment isn't enough to afford a house with 2 kids, so providing "free" or "affordable" housing to those who aren't currently employed is only going to upset those who are working hard
IMO the government needs to relax regulation to build more houses and drive the cost down
I completely agree with Finland's approach, though. Permanent housing is the minimum requirement for reducing homelessness. Without a place to stay, a mailing address, and security, it's difficult to get out of homelessness
A key to this strategy is building a sufficient number of housing units. If you split these between units offered on the market (prevention) and units dedicated to permanently housing the currently unhoused (cure), you bring down costs for people with income seeking housing on the market while providing immediate assistance to those who, even with greater supply, are not immediately able to pay market rents. (Immediate as the units become ready, that is; obviously there is a lag after adopting the approach as policy unless you have vacant capacity that can be instantly repurposed.)
You can't execute a Housing First strategy effectively without adequate housing supply, which is the most fundamental problem in a number of locales, including the Bay Area. But additional market supply alone is not sufficient to address the urgent homelessness problem.
> IMO the government needs to relax regulation to build more houses and drive the cost down
That absolutely needs to happen, and that helps with prevention, but except for the fairly-well-employed homeless (a group that actually exists and is often ignored, but isn't a big part of the homeless problem), adding new market rate supply alone does not provide significant assistance to the currently homeless.
Could someone clarify how it can achieve exactly-once processing with an idempotency key?
Using the example they provided:
1. Validate payment
2. Check inventory
3. Ship order
4. Notify customer
I'm curious about the case where one of the operations times out. The workflow engine then needs to either time out as well, or it may even crash before receiving a response.
In this scenario, the only option for the workflow is to retry with the same idempotency key it used, but this may re-execute the failed operation, which may actually have succeeded in the prior run even though the workflow never received the response. The operations that succeeded would be skipped because the workflow has a completion record for the same idempotency key. Is that correct?
> The operations that succeeded would be skipped because the workflow has a completion record for the same idempotency key. Is that correct?
This sounds about right. But you need to make sure the service being called in that step is indeed idempotent, and will return the same response it earlier couldn't return in time.
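The retry behavior described in this exchange can be sketched roughly as follows. This is a hypothetical illustration, not the API of any specific workflow engine: the workflow records a completion for each (idempotency key, step) pair, so on a retry after a timeout or crash, already-completed steps return their recorded result and only the unconfirmed step actually re-executes.

```python
# Illustrative sketch: exactly-once *step* execution via completion records.
# All names (completions, run_step, run_workflow) are made up for this example.

completions = {}  # (idempotency_key, step_name) -> recorded result


def run_step(idempotency_key, step_name, fn):
    key = (idempotency_key, step_name)
    if key in completions:
        # Step already completed in a prior run: skip, return recorded result.
        return completions[key]
    result = fn()  # the downstream service must itself be idempotent,
                   # since a crash right here means fn() runs again on retry
    completions[key] = result
    return result


def run_workflow(idempotency_key):
    # The four steps from the example above, in order.
    run_step(idempotency_key, "validate_payment", lambda: "payment-ok")
    run_step(idempotency_key, "check_inventory", lambda: "in-stock")
    run_step(idempotency_key, "ship_order", lambda: "shipped")
    return run_step(idempotency_key, "notify_customer", lambda: "notified")
```

Note the gap the parent comments point at: if the process dies between `fn()` succeeding and the completion record being written, the retry re-invokes `fn()`, which is why the called service must tolerate the duplicate and return the same response.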
> I think a lot of programmer arguments bottom out in a cultural clash between different kinds of engineers
True. Also, it should be management's/team leads' role to act as mediators. It's a waste of time to constantly argue over what is "right" when all proposed solutions are functional.
Should we spend more time fleshing out the unknowns? Should we launch ASAP and iterate fast? Should we automate the process? Do we have data to back our assumptions?
The "culture fit" is not superficial and I've always been advocate of fire fast if one is culturally unfit because it slows down everyone
You're talking about a third dimension that I thought of immediately upon reading this: risk aversion, mentioned long ago in https://gist.github.com/cornchz/3313150 (the original is no longer accessible, thanks Google+)
I'm just trying to imagine the different traits for liberal vs conservative grifters (thought leader vomit vs machiavellian connivance), believers (thought leader vomit vs constant market research), grinders (ship lots of code vs write lots of tests) and coasters (shitpost all day on random slack channels vs do the bare minimum to appear to be working)
1. Clients will do whatever they need to do to get their job done, even if it's not the publisher's intended way
2. Clients don't read documentation
3. Bugs will become part of the API once enough clients rely on their behavior
4. The number of API calls does not necessarily equate to importance.
---
As such, I aim for the following when developing an API:
1. Ship the beta API early and see how clients use it, to minimize surprises. (This may not always be possible.)
2. In most cases, bump the major version while supporting the previous version. This means you'll need to define an SLA for your API.
3. Most clients are OK with breakages as long as they're given enough time to migrate, or the API provider gives them a tool to auto-migrate their code (if that's possible in your product).
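Point 2 above can be sketched as follows. This is a minimal hypothetical example (the handler names, response shapes, and sunset date are all invented): the old major version stays callable through its deprecation window, and responses for the deprecated version advertise the SLA deadline, in the spirit of the HTTP `Sunset` header.

```python
# Illustrative sketch: serving two major versions side by side with an
# advertised end-of-support date for the old one. Not a real API.
from datetime import date

SUNSET = {"v1": date(2025, 6, 1)}  # example SLA: v1 supported until this date


def get_user_v1(user_id):
    # Old response shape: a single combined name field.
    return {"id": user_id, "name": "Ada Lovelace"}


def get_user_v2(user_id):
    # New (breaking) response shape: name split into two fields.
    return {"id": user_id, "first_name": "Ada", "last_name": "Lovelace"}


HANDLERS = {"v1": get_user_v1, "v2": get_user_v2}


def get_user(version, user_id):
    response = HANDLERS[version](user_id)
    meta = {}
    if version in SUNSET:
        # Deprecated version: tell the client when it goes away.
        meta["sunset"] = SUNSET[version].isoformat()
    return response, meta
```

The point is that the breaking change lives only in the new handler; old clients keep working unchanged until the advertised date, which is what buys them the migration time mentioned in point 3.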
Yes, as I've said again and again in different Hacker News comments, Netflix went overboard with their microservices and tried to position itself as a technology company when it's not. It has made everything more complex, and that's why any Netflix tech blog is useless: it is not the way to build things correctly.
To understand how to do things correctly, look at something like Pornhub, who handle more scale than Netflix without crying about it.
The other day I was having this discussion with somebody who was saying distributed counter logic is hard, and I was telling them that you don't even need it if Netflix hadn't gone completely mental on microservices and complexity.
Fastly says they do 6M concurrent viewers (ccv) for the Super Bowl (I'm actually surprised they're allowed to do the entire thing without mixing in different CDNs), and I'm not sure they do encoding and manifest serving; they might just cache/deliver chunks. Do you really think Tyson vs. the other guy was only 600k ccv? I'd be shocked if Netflix can't handle this.
You would think, but technology always finds a way to screw things up. Cox Communications has had ongoing issues with their video for weeks because of Juniper router upgrades and even the vendor can't fix it. They found this out AFTER they put it in production. Shit happens.
(Although, even these statistics are not as simple as they seem! E.g., when an H-1B status holder changes employer this counts as a new receipt even though the number of H-1B workers hasn't changed. In periods of time when there is lots of churn in the labor market, like in 2022, you would see higher receipt numbers just from the churn. It's complicated!)
> Meta introduced a feature known as Tag Suggestions to make it easier for users to tag people in their photos. According to Paxton’s office, the feature was turned on by default and ran facial recognition on users’ photos, automatically capturing data protected by the 2009 law.
Meta should also be forbidden from using any features derived from this data, or from open-sourcing any model trained on it. A $1.4B settlement is too small compared to the long-term gains for the company