It's a lot harder to make such drastic changes to something that's already serving production traffic at scale. It's okay to reinvent solved problems sometimes, because the alternative would be a multi-year project with unclear returns.
Also, the 7 day retention is a feature - it helps us recover data quickly in case of bugs in our storage code.
I had written a novel of counterarguments, then realized that anyone who tries to tell someone that their bug is a feature isn't going to see the bug for what it is.
Best of luck, and God help the poor souls who have to maintain this when the original authors have left.
Correct, but the number of shares outstanding isn't infinite: it's always a fixed number, just like there's a fixed number of square miles of land. Except that you can make and remove land - lots of dredging along coastlines is done to create land. It isn't a great analogy, and investment decisions shouldn't be based on it.
The thing that makes the Red Book special, in my opinion, is that the editors have been able to apply their research to solving actual problems for paying customers! You don't get to see enough of that in academia.
It is up to date. Things haven't changed substantially, and they probably won't change soon either. There's nothing in the book that you'll have to unlearn or avoid applying.
It's an interesting book in that 2015 was the middle of the NoSQL hype. Since then, people have started looking for results and being more critical.
There's a gazillion technologies that we could list that are newer, and claims that any of them are the next big thing and will fundamentally change everything are, obviously, exaggerated.
You might look at the concept-oriented model [1], which is a major alternative to set-oriented approaches (including RM and MapReduce). In short, instead of viewing data processing as a graph of set operations, this approach treats it as a graph of operations on functions, which makes many data modeling/processing tasks simpler and more natural in comparison to the conventional purely set-oriented approach.
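A loose toy sketch of that distinction as I read it (my own illustration, not code from [1]; all names are made up): a set-oriented pipeline materializes a new set at each step, while a function-oriented view defines derived attributes as functions over existing elements.

```python
orders = [
    {"id": 1, "price": 10.0, "qty": 3},
    {"id": 2, "price": 4.0, "qty": 5},
]

# Set-oriented: each step produces a new set from the previous one.
totals_set = [{"id": o["id"], "total": o["price"] * o["qty"]} for o in orders]

# Function-oriented: 'total' is defined as a function over existing elements;
# nothing new is materialized until you actually evaluate it.
def total(order):
    return order["price"] * order["qty"]

print(totals_set)
print([total(o) for o in orders])
```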
Most companies in the valley have a terminal level (a level at which you're not expected to progress to the next one) that's close to level 3 in the chart here. In my experience, a lot of people don't progress beyond that. For the ones that do, the number of years of experience plays a smaller role than the quality of the experience itself.
Salary negotiation is pretty one-sided today, with companies holding all the cards. Better visibility into the numbers will definitely help level the playing field.
Collecting offer letters is an interesting way to build trust in data. What's the incentive for someone to upload their offer letter though?
It's not all the cards, but companies (recruiters) usually have an information advantage over their candidates. Sites like levels.fyi are a good remedy; with the right info and counteroffers you can negotiate salaries/sign-on bonuses up by pretty large amounts.
I tried this in West Norway recently, being as firm in the negotiation tactics as possible. Didn't yield higher offers at all! So that tells me the market here is actually not nearly as competitive as all the consulting companies try to make people believe.
In other words, the success of this strategy is market-dependent, but it certainly seems to have a huge effect in SV.
Look up the BATNA (Best Alternative To a Negotiated Agreement). For the employer, that means hiring the next best candidate, or not hiring anyone for that position. For you, it means taking the next best offer, staying at your old place, etc.
The concept is closely linked to opportunity costs.
In general, you can negotiate an agreement only between your BATNA and their BATNA.
(Part of interviewing in batches is that you can credibly present that your BATNA is very high. So you can be tough in negotiations.)
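To make the "between your BATNA and their BATNA" point concrete, here's a toy illustration with entirely made-up numbers:

```python
# Purely illustrative: a deal is only possible inside the zone bounded by
# the two parties' BATNAs.
my_batna = 180_000     # value of my best competing offer
their_batna = 210_000  # employer's cost of the next best candidate / vacancy

def deal_possible(offer):
    # Below my BATNA I walk away; above theirs, they walk away.
    return my_batna <= offer <= their_batna

print(deal_possible(195_000))  # True: inside the zone of possible agreement
print(deal_possible(170_000))  # False: I'd take my other offer instead
```

Interviewing in batches raises my_batna, which shifts the whole zone upward.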
Yep, I'm aware of this and use it. I guess in these terms, my claim is that the BATNA of most companies here is actually that they already have all the engineers they need at low cost - while publicly presenting a shortage and encouraging more people to become engineers, so that they can hire them even cheaper.
It might be competitive in the other direction: too many engineers, not enough opportunities. Then it becomes a bidding war to see who will accept the least pay.
Just a few years ago many large silicon valley employers (including some companies on the above list) were part of an illegal anti-poaching agreement and were sued and settled with the DOJ for it. Just because engineers get paid a lot doesn't mean their employers aren't scummy.
If you're interested in buying one of these, Unicomp (https://www.pckeyboard.com) purchased the rights to continue making Model M style keyboards once Lexmark removed them from their line of products.
Also, the Model F (https://en.wikipedia.org/wiki/Model_F_keyboard) is considered by many to be superior to the Model M. IBM made far fewer Model Fs compared to the Ms, so if you find one of these in the wild, it'll be really expensive. https://www.modelfkeyboards.com is trying to re-create the original Model F.
> If you're interested in buying one of these, Unicomp ...
Or get a used Model M at a flea market, on eBay, or the like. They are relatively indestructible, and generally still work fine 30+ years after production.
> if you find [a Model F] in the wild, it'll be really expensive
This is a recent phenomenon. The price on eBay, for example, has gone way up in the past 10 years. You used to be able to buy them quite cheap. It’s absurd that people will spend >$100 for an XT keyboard with its overlong spacebar, terrible layout, and obsolete communications protocol.
There were hundreds of thousands if not millions of Model F keyboards produced (both XT and AT varieties). The keyboard being recreated at https://www.modelfkeyboards.com is something different: a very obscure banking keyboard, cf. http://kishy.ca/?p=648
I can second that. I have the pleasure of having my grandfather's Model M ('87) and I use it nearly every day. I had to clean it extensively when I first got it, since both grandparents smoked in their house. Cleaned up nicely, good as new. To think, grandma wanted to throw it out!
I've flown with it several times... gotten a few smiles from older TSA agents. Haven't mailed it, but it's gone both under the seat and in checked baggage and survived each time. (It fits snugly in my messenger bag.)
I use it a lot more than my laptop's keyboard and I've had to replace the laptop's up arrow a couple of times...
Oh, and it works wonders with my father's text editor too (vi, though I prefer vim).
Over 800 dishwasher cycles later and the keyboard still functions as if it were brand-new. All I need to do is replace the PS/2 cord which has worn out at the strain relief.
Reading your comment, I thought either "nobody is going to believe this guy" or "people are going to think he's nuts". Actually, I had that thought even before reading it, because I almost shared a story about how I used to clean my Northgate Omnikey and decided not to because I didn't want to be judged. :) And yet here you are, with an equally insane, similar, but much better approach.
My method for cleaning my Omnikey was to fill the kitchen sink with water and clean it like one cleans a dish. After a quick towel-dry, I'd toss it in the oven for 12-or-so hours at the lowest temperature (I think about 150-200 degrees American). I did this at least 10 times during the decade and a half that I used this board (and it was sold in perfect, working condition).
I'm sure I did some damage somewhere. There have got to be parts in that keyboard that corrode, but it always came out "like new". I haven't owned a keyboard like it since, and I expect the one I use day-to-day wouldn't survive a strong gust of wind. I'm not completely nuts, though. The first time I performed this all-day, insane cleaning routine was after I had lost all hope of resurrecting the keyboard due to an unfortunate encounter with a full glass of Mountain Dew.
At no point did I ever consider putting it in the dishwasher. That would have saved me hours!
Sent my Apple Pro keyboard through my dishwasher last week! Hadn’t been cleaned in probably 8 years so in addition to looking almost brand new, it now types significantly better.
When I worked for a CS department long ago, I took all the Sun Type 4 keyboards home in batches of 6 to run through my dishwasher.
I would never do it to a board unless I was really sure it was completely unpowered when disconnected, and waited days for it to dry.
There are lots of other caveats depending on the exact keyboard and exact dishwasher model.
I wouldn't dream of using my current dishwasher. The heating element cannot be turned off and it outright melts some less expensive (barely dishwasher safe) plastic-ware.
Probably good for cooking crab legs, but would likely warp the plastic on a typical keyboard.
I've put model M keycaps and the top shell (no decals there) in the dishwasher. Keycaps came out great. The shell warped slightly from the heat but it was not enough to ruin the keyboard.
I’ve cleaned keycaps with dish soap in a plastic container with a snug lid. A psyllium husk container worked really well - you can go nuts with agitation, then leave them on a towel to dry. This was for a white Mac keyboard, though, so I had to clean the body with Q-tips and paper towels. It's kind of funny how odd cleaning a keyboard can feel. They’re not designed for it, but they should be!
I guess I shouldn't be too surprised[0]: we used to clean our sneakers in the dishwasher.
And about a decade ago I had a crab dinner served at a friend's house. The guy knew how to cook -- easily the best crab legs I've had (helped by the fact that he knew where to get good product). After we finished the meal, he explained that he cooked them in the dishwasher. I think part of me was grossed out, but it lasted only a few seconds ... they were extraordinary.
Is this the capacitive type or the membrane type? My understanding is that liquids kill the membrane-type model Ms very quickly, which is why they sprouted drainage holes when IBM switched from the capacitive buckling springs to the membrane buckling springs.
(Liquid does not permanently kill the membrane, it's just that liquid gets stuck between the two membranes and can't get out. You can disassemble everything, dry it, and get it working again. Probably. I haven't tried it.)
I have spilled on my Topre keyboards before and there is no fix but to take everything apart and dry the individual pieces. The keyboards are quite well made so can take many cycles of this careful cleaning. To me, that's the best you can hope for; water is not good for keyboards. Though you can probably use Hall effect keyboards under water, if the driving electronics are conformally coated.
The capacitive keyboard was the Model F. Instead of the plastic barrel frame, it uses separate plastic barrels held between two metal sheets, and the little conductive plastic flippers trigger capacitive pads on a PCB. https://deskthority.net/wiki/IBM_Model_F
“This source here” is wrong (as was pointed out in the comments below in 2014). It is just misquoting Wikipedia, which it links to: “In a Model M, the electrical contact is a membrane sheet similar to that of a modern dome switch keyboard. On the older Model F design, a capacitive contact was used instead.”
Go ahead and take a Model M apart. I guarantee you’ll find a membrane sheet.
I will clean my keyboard in the dishwasher every time after I've descaled/cleaned the dishwasher and run a rinse cycle to clear everything else out, so maybe every other month. But this keyboard came from my elementary school, where they'd clean them weekly, because we all know how nasty kids can be. The vast majority of these cleaning cycles came from that school.
While it's true that Unicomp purchased all the rights and equipment to make Model M keyboards, the quality doesn't seem to hold a candle to the original. I managed to break two in the span of two years, at which point I gave up on them and simply built my own. This could just have been catastrophically bad luck. Or, like Linus says in the video, maybe it has something to do with their version being based on the newer, cheaper Lexmark design and not the IBM original.
Model F was a very interesting keyboard. I was rather fond of the 24 function keys on mine. And I now regret not having taken mine with me when I moved out at age 18.
The model M got progressively "cheaper" throughout its lifetime. Mostly, metal was replaced with plastic. The keyswitches and keycaps are still top quality.
What Unicomp makes today is a mid-90s version of the model M. It's not the best ever made, but it's still pretty good. However, there are other keyboards around that are also pretty good.
I have a Unicomp, and my roommate spilled water on it once, which caused a bunch of the keys to not register and required a rebuild (which cost almost as much as a new keyboard).
I use both (Model M at home, Unicomp at work), and while I've not had any failures, it's clear that the Unicomp is made of cheaper materials. One example is the case itself - the Model M's case is made of thick ABS, while the Unicomp's is thin polystyrene. As a result, the two keyboards feel different - there's a resonance from keypresses on the Unicomp that's almost entirely dampened on the M.
Like every piece of anecdata, it varies. I have a Unicomp with a PS/2 adapter from... at least a decade ago, before they even made one with "Windows" keys. Maybe longer. It works as well today as it did then.
What I've heard is that Unicomp owns the actual original molds and such, and keeps using them, even though they're deteriorating. So I guess it makes sense that older Unicomp hardware is better. (I'm not sure how accurate that is, but it's what I've heard.)
I've seen clickykeyboards.com before; it seems like a great place to get a refurbished Model M. They seem to take great care in making sure everything is in good working order.
Considered picking one up when I got into mechanical keyboards last year, but I went with a WASD with Cherry MX ‘Clear’ switches. Really happy with it!
Most people who joined in 2015 were getting RSUs and not options. Also, the strike price on options is dependent on the 409A valuation and not the valuation at which they raised money.
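A toy illustration with made-up numbers of why that distinction matters to an option holder:

```python
# Hypothetical numbers only. Investors pay the preferred price in a round,
# but options on common stock are struck at (or above) the 409A FMV.
preferred_price = 10.00  # per-share price investors paid in the round
fmv_409a = 2.50          # independent 409A valuation of the common stock
strike = fmv_409a        # the strike tracks the 409A, not the round price

exit_price = 12.00
print(exit_price - strike)           # 9.50 of spread per share
print(exit_price - preferred_price)  # only 2.00 if struck at the round price
```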
1. The US issues approximately 1M green cards every year - employment-based petitions are around 10% of that number
2. The bill does a gradual rollout of eliminating the cap - so most people who have already applied for a green card and are in the US should not be affected
3. Employment-based green cards are allocated based on skills - there's a specific allocation of 50K green cards just for diversity every year. That's around 35% of the employment-based allocation.
I believe the problem with the "gradual rollout" is that it doesn't consider people from the countries that already exceed the per-country limit; those people will wait a long time before they can get a green card.
Kafka can probably guarantee exactly-once semantics on publishing (conditions apply). It definitely cannot guarantee exactly-once semantics on the consumer and processing side. Imagine a scenario where you receive a message from Kafka and process it, but the processor crashes or hits a network partition right after. There's no way for the message to be acknowledged, so you either have to design your system to be idempotent or handle exactly-once semantics further down the stack.
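To make the idempotence option concrete, here's a minimal sketch assuming kafka-python and SQLite (the topic, group id, and apply_side_effect are hypothetical): the trick is recording the message ID and applying the side effect in the same transaction, so a redelivery after a crash becomes a no-op.

```python
import sqlite3
from kafka import KafkaConsumer  # assumes the kafka-python package

db = sqlite3.connect("state.db")
db.execute("CREATE TABLE IF NOT EXISTS processed (msg_id TEXT PRIMARY KEY)")

def apply_side_effect(payload):
    ...  # hypothetical business logic, e.g. updating an account balance

consumer = KafkaConsumer(
    "payments",                 # hypothetical topic
    group_id="billing",         # hypothetical consumer group
    bootstrap_servers="localhost:9092",
    enable_auto_commit=False,   # only commit offsets after our own commit
)

for msg in consumer:
    msg_id = f"{msg.topic}:{msg.partition}:{msg.offset}"
    try:
        # One transaction: either both the dedup record and the side effect
        # land, or neither does. If we crash after this commit but before the
        # offset commit, the redelivered message hits the PRIMARY KEY below.
        with db:
            db.execute("INSERT INTO processed (msg_id) VALUES (?)", (msg_id,))
            apply_side_effect(msg.value)
    except sqlite3.IntegrityError:
        pass  # already processed: this is the redelivery case
    consumer.commit()
```

This is exactly the "handle it further down the stack" option: Kafka still delivers at-least-once, and the database's uniqueness guarantee turns duplicates into no-ops.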
Databases have been handling exactly-once semantics for decades now. What Kafka is doing is not new and actually gives you a false sense of security when it comes to these kinds of things.
Disclaimer: Not the author, but I was on the team that migrated our infrastructure to GCP.
As a startup with limited resources, it's important for us to invest all our engineering strength into the things that create direct value for our business. We'd rather pay Google to manage machines and run services like Kubernetes, Spanner, Pub/Sub and others and free up the engineers to work on our core analytics platform.
I think the parent may have meant the opposite of how you interpreted it... that writing a TSDB from scratch doesn't match what you just stated about investing engineering time where it makes the most sense.
We don't run a TSDB. A TSDB doesn't work for the kinds of queries we run - specifically, TSDBs don't work if you want to

* analyze every datapoint you receive

* handle high dimensional cardinality

* analyze behaviors over time (e.g., where the output depends on the order in which events occurred, like a funnel report - see the sketch after this list)
There's no off-the-shelf solution that does this at the scale at which we operate - hence the need to write our own custom solution.
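To illustrate the order-dependence point, here's a minimal sketch (schema and event names are made up, not our actual code) of a funnel computed over raw, per-user event sequences - the access pattern a pre-aggregating TSDB can't serve:

```python
from collections import defaultdict

FUNNEL = ["signup", "create_project", "invite_teammate"]  # hypothetical steps

def funnel_counts(events):
    """events: iterable of (user_id, ts, name) tuples over raw datapoints."""
    by_user = defaultdict(list)
    for user_id, ts, name in events:
        by_user[user_id].append((ts, name))

    reached = [0] * len(FUNNEL)
    for user_events in by_user.values():
        step = 0
        for _, name in sorted(user_events):  # order matters: replay by time
            if step < len(FUNNEL) and name == FUNNEL[step]:
                step += 1
        for i in range(step):
            reached[i] += 1
    return dict(zip(FUNNEL, reached))
```

The same events in a different order produce different counts, which is why per-metric rollups aren't enough.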
Also, the 7 day retention is a feature - it helps us recover data quickly in case of bugs in our storage code.