Because the meat you buy in the grocery store goes bad after a few days, and (here is where I think parent is being slightly intellectually dishonest) while it is true that the fast food companies source their meat from authentic, standards-holding institutions, that's only the beginning. That's to say nothing of the rounds of processing and preservation that occur afterwards. You can't just buy a McDonald's chicken nugget off the shelf. Sure, the inputs are the same, but the outputs are vastly different, and that's where the perceived difference in quality comes from. Fast food optimizes for longevity, for ease of cooking so they can reduce the labor costs associated with preparing that meat, and for eliminating sanitation issues so they can reduce their total liability and losses around food-borne illnesses. That is why McDonald's undergoes the painstaking process of sanitizing their meat with an ammonia wash (prompted by the 90's nationwide beef E. coli outbreaks that resulted in serious litigation against popular burger joints), then adding artificial flavorings back in to make it taste like a burger again (this is also how McDonald's achieves that "miraculous" feat, often described here when this topic comes up, of having their burgers taste the same and be a consistent product everywhere for over 20 years).
Tl;dr: yes, it is true, the inputs are the same as what you get in the store, but the outputs are vastly different. You have to factor in the additives and the preservation process. There really is a simple "sniff" test/heuristic I have developed after my decades in food service, and it's not really a secret, but it seems some aren't in on it. It goes like this: if it goes bad, it's good; if it doesn't go bad, it isn't good. (Good is obviously subjective here, so my criterion is "real" food in the sense that it is minimally processed and preserved, with minimal chemical additives.)
> That is why McDonalds undergoes the painstaking process of sanitizing their meat with an ammonia wash (prompted by the 90's nationwide beef e-coli outbreaks that resulted in serious litigation against popular burger joints), then adding in artificial flavorings back in to make it taste like a burger again
For what it is worth, McDonald's says on their website that this is not true. So either you are mistaken or McDonald's is committing fraud. Do you have a source for your claim?
"Every one of our burgers is made with 100% pure beef and cooked and prepared with salt, pepper and nothing else—no fillers, no additives, no preservatives."
"Do you use so-called 'pink slime' in your burgers or beef treated with ammonia?"
"Nope. Our beef patties are made from 100% pure beef. Nothing else is added. No fillers, no additives and no preservatives.
"Some consumers may be familiar with the practice of using lean, finely textured beef sometimes treated with ammonia, which is referred to by some as 'pink slime.' We do not use this."
Full disclaimer: this info comes from a variety of sources: some documentaries, which I know have their own inaccuracies; some my own personal experience in food service, specifically working at McDonald's; and some from an ex-girlfriend of 3 years who was the district manager of another large international fast food burger chain, which by chance gave me a lot of insight into the inner workings of fast food burger chains (she was previously a DM of several Golden Arches locations).
I haven't read this page recently (it has certainly changed much), but on skimming through, none of the terminology McDonald's uses on their website is meaningful, in the sense that there is no official definition behind these standards; their claims could not be distinguished from similar competitors making similar claims besides simply what they say, and, more importantly, there is no regulated term for what constitutes beef being "pure" or "not pure" or otherwise. It's just some "thing" they say about their beef, and we have to take them at their word. Now if they said "we use USDA organic ground beef," that might have some teeth. It's the same as a bag of candy telling you they no longer use artificial flavors and now use "natural" flavors.
Now, to the question of whether I believe they have changed: possibly. Do the burgers taste different? No, so common sense tells me you don't drastically change your process like this and still get the same-tasting burger from 20 years ago.
But I will actually do something uncommon here and admit I could be wrong, and have an outdated understanding of their process.
"100% Beef Patty. Ingredients: 100% Pure USDA Inspected Beef; No Fillers, No Extenders. Prepared with Grill Seasoning (Salt, Black Pepper)."
That seems pretty clear to me. USDA defines beef as flesh of cattle. If there is anything but "flesh of cattle" in the patty, McDonald's is committing fraud.
The USDA has a pretty wide definition of what constitutes ground beef[1]:
> After a months-long evaluation, the United States Department of Agriculture’s Food Safety and Inspection Service (FSIS) determined in December that BPI’s signature product—the offering famously called “pink slime” in an ABC News exposé that got the network in a lot of trouble—can be labeled “ground beef.” Legally speaking, it’s now no different from ordinary hamburger, and could even be sold directly to the public.
McDonalds does in fact use 100% USDA-inspected beef, with "no fillers, additives, or preservatives", and explicitly does not use mechanically separated meat. It's not "organic", but then, most beef isn't.
Then I am happy to say I welcome their recent changes; this certainly wasn't the case a few years ago when I wrote them off, and even less the case 10 years ago when I was still in the industry. I think it should be promoted as an example that consumer pressure and expectations can cause companies to change for the better.
I will say I remain skeptical about exactly how they can go from the ammonia wash + chemical food flavoring process to using fresh beef without any discernible difference in taste or food safety, but kudos to them.
I'm not saying you shouldn't write them off. McDonald's fries are solid, but everything else there is terrible. And fast food is in general bad; I don't want to come off like I'm saying people should eat more of it.
The ammonia (when it was present) was present in such small amounts that it couldn't conceivably have altered the flavor, and they were never using "chemical food flavoring." The only change they've made recently is using fresh instead of frozen for a few products. It has always been just beef.
Only the Quarter Pounder is fresh beef. The rest is frozen. If you look at places that review fast food, they unanimously applauded the fresh beef Quarter Pounder: it did have a discernible difference in quality.
The ammonia is a bad scene, but the idea of using TG (transglutaminase, "meat glue") to repurpose "trimmings" isn't something we should be demonizing; if you're going to kill animals to feed people, you should be maximizing the yield (of muscle protein, that is). This is just an extension of the idea that if you're going to eat pork chops, you shouldn't be grossed out by the idea of eating offal; however ecologically irresponsible it is to eat meat at all, it must be more irresponsible to waste it because it squicks you out to eat anything but a loin chop.
(TG'd meat was a faddish fine dining trend a few years back, and it's pretty neat; for instance, you can make a solid, ribeye-like slab of skirt steak by "gluing" layers of skirt together, which is pretty delicious. It's also a technique that's been used in sausagemaking for a long time.)
I appreciate you have some first hand experience here, but this is a very HN-style axiomatic argument --- "restaurants must have issues with meat going bad that ordinary people don't, ergo their meat must somehow be mummified with preservatives". Isn't it in fact the case that fast food restaurants have, relative to supermarket consumers as an entire cohort, extremely high and predictable turnover?
A neighborhood sushi place has an even bigger problem with spoilage than a fried chicken shack, but, for pretty intuitive reasons (I think?), I'd trust any well-established sushi place with ahi and salmon more than I would my own fridge.
Further: while fast food input costs are the highest single line item in their cost breakdowns, they don't come close to dominating, and labor plus rent dwarfs inputs even before you factor in franchise fees, marketing, and other expenses.
> I'd trust any well established sushi place with ahi and salmon more than I would my own fridge.
FYI, in the US it is common to mummify fish with carbon monoxide. It preserves the fish's color despite age. The best way to stop oxidation is to add a reducing agent!
I don't think CO is a problem, but it is a hack restaurants use that consumers are unlikely to be aware of which masks the visual indications of freshness.
AFAIK this is specific to tuna and done by places like fish markets, that publicly display it. A restaurant would most likely be getting it frozen and vac packed.
My favorite thing was when Wendy's did its "never frozen" campaign. As if freezing was the problem. It's not going to magically last just as long if you don't freeze it; you have to close that gap with more preservatives.
If you're trying to minimize cost, you're better off buying the right quantity of meat to begin with rather than buying preservatives.
Freezing is hugely deleterious to ground beef quality. It changes the texture irrevocably. The two major differences between fast food burgers and fast casual burgers are frozen/fresh and cooked in advance/cooked to order. All the more expensive, more lauded burgers like Five Guys, Shake Shack, In 'n' Out, etc are fresh beef. McDonald's launched a fresh beef quarter pounder to universally positive reviews.
If I'm not mistaken, the original inspiration for this research was precisely that farmers noticed their cattle (or perhaps it was sheep) were happier when allowed to graze in fields that happened to be along the shore, letting them naturally eat quantities of seaweed.
Careful having extreme stances on things. Packaged water keeps many people alive during times of natural disasters or crises. Volunteers flock in and one of the first items brought in is bottled water. It was a lifeline for many people during the Flint water crisis. Finally it serves an overall utility of giving people another choice at the vending machine. It sounds silly to you and me as well, but it has a real impact on public health.
It’s a fine choice if your local water source is truly not potable, but for the vast majority of the western world that’s not the case. On the contrary, the majority of tap water that people can drink for effectively free is of higher quality and testing standards than most bottled water.
> Each time EPA establishes a standard for a contaminant, FDA either adopts it for bottled water or finds that the standard isn’t necessary for bottled water.
> In some cases, standards for bottled water and tap water differ. For example, because lead can leach from pipes as water travels from water utilities to home faucets, EPA has set its limit for lead in tap water at 15 parts per billion (ppb). For bottled water, for which lead pipes aren’t used, the lead limit is set at 5 ppb.
So it could go in either direction, though it leans toward tight coordination. Enlightening to know there is more scrutiny there.
That aside, the horrible taste alone should be enough to dissuade people with a known safe alternative from drinking bottled water.
So why Rust, then? Rust does not have a specification, and it is currently still under heavy development, with many changes to the language expected in the years to come. As many have put it before me, all of Rust is technically undefined behavior. Change my mind: I like the language, but I prefer those that have specifications and multiple implementations so that the behavior is verified.
I'm pretty up to date on the science myself, and I still consider myself anti-GMO for a host of reasons, little to do with the science. My objections are more philosophical.
Two primary reasons: hubris and greed.
The hubris to think we know which varietals are the best and will continue to be the best. We may go all in on one species or variant, and then it turns out an unknown bacterium we previously had no clue about wipes out all of them. You never know; you need variety. Bacteria outnumber us all.
Number two: greed. You're worried about tech being consolidated into the big 5? How about scientific research? Do you want our food, something we actually depend on, to be consolidated into 2-3 chemical companies? I don't.
Smaller reasons include: the power of still being able to survive on pure nature's means, and the freedom to do so. We don't realize it, but the things we do out here in the more advanced nations greatly impact the developing world, where a large portion of the world's population lives.
> The hubris to think we know which varietals are the best and will continue to be the best.
If they don't grow GMO Cavendish, they will pick some other variety and grow that everywhere, like they did after the Gros Michel went under. GMO doesn't seem to be a prerequisite for monoculture at all.
> the power of being able to still survive on pure nature's means
We haven't been able to do that since the dawn of agriculture. If modern agriculture disappeared tomorrow, the Cavendish plantations would not resemble Cavendish plantations very long, Panama disease or not.
The only problem I have with a "GMO Cavendish" is that it would likely be patented, which could create a company with some extraordinarily powerful IP, which is very dangerous. But that's a problem with the law, not GMOs per se.
Many of us who are considered anti-GMO are more anti-patents-on-food-genes than anti-GMO. The developing world mostly has an agricultural economy, and now Western companies are trying to horn in on that as well. A few years ago an American company tried to stop Pakistan and India from selling a rice variety called Basmati by modifying its genes and trying to patent it. When it comes to GMO, the trust deficit in the rest of the world is not just about the science but about the Western companies and their patents.
Literally all of the problems you mentioned existed before GMO was a thing. Companies have been patenting varietals since the 1930s. Patent trolling in agriculture is also quite old, e.g. the yellow bean patent debacle:
I choose to avoid GMO products (it's my choice, right?). To do that, I have to be able to distinguish them from non-GMO products. That means labeling; and if GMO promoters are lobbying against labeling (and they are), then I'm against the GMO promoters. Unless GMO products are clearly labeled, I favour a ban on importing them at all.
As far as genetic modification in the lab being indistinguishable from genetic modification the way farmers have always done it (cross-breeding), here's the difference, in a nutshell: farmers have no method for cross-breeding a potato with a jellyfish (or adding genes from bacteria, or whatever). That is, the lab technique permits technicians to effectively create new species.
Now I'm OK (in principle) with new species appearing on the shelves; but I don't want to eat them myself, until they have been tested with the same rigour as if they were novel medicines. My choice, you see. If the GMO products are smuggled onto the shelves in disguise, then what happened to my choice?
Or, to put it another way, you can be for (or against) the broader concept of social media, with (or without) being against Facebook, by name, but in a broader discussion, this subtle distinction is easily lost. I'm for GMOs but against Monsanto and their seed DRM.
My biggest concern is similar, that a GMO banana is intellectual property and I detest the thought of large companies preventing people from growing their own food.
The most likely way large companies would prevent people from growing bananas would be by not making GMO bananas, and just letting disease ruin all the other varieties (if some varieties were resistant, then how could the existence of GMO versions prevent people from growing those?)
Have you ever worked in a large engineering organization full of engineers with varying degrees of experience all trying to accomplish the same goal?
I can't imagine anyone has ever tried to do engineering at scale (people-wise) and not found the value in static typing.
It's why startups eventually moved off RoR once they started scaling. It's why there is such a large push to type JavaScript (have you seen the Rollbar article about the top 10 errors in JavaScript? All but one have to do with types: https://rollbar.com/blog/top-10-javascript-errors/). It's why Facebook created Hack, and, outside of parenthesis repulsion, it's probably why so few large projects have been written in a Lisp or Lisp descendant.
Python is great for small: small teams, small organizations, small projects with a few dedicated tasks, small scripting tasks. Most people aren't trying to take anything away from python here in the comments save a few irrational responses.
*again want to stress in my comment when I speak of scale I mean scaling people wise: more organizational structures in your company, more engineers, more collaboration between teams.
>I can't imagine anyone has ever tried to do engineering at scale (people wise) and did not find the value in static typing.
>why so few large projects have been written in a LISP or LISP descendant
The major dialect of Lisp, Common Lisp, is strongly typed, and many large projects have been written in it, for CAD/CAM, controlling a NASA spaceship, complete operating systems (Open Genera), the Mirai 3D graphics suite used for creating Gollum in "The Lord of the Rings", etc.
> 1. Uncaught TypeError: Cannot read property
If you’re a JavaScript developer, you’ve probably seen this error more than you care to admit. This one occurs in Chrome when you read a property or call a method on an undefined object.
Does typing stop null object errors in JS, Java, or C for that matter? No. You need to continually check for null objects in all the languages I use, including Python. It seems most of the bugs on that page are of a similar vein.
Null reference errors in Java and C are due to The Billion Dollar Mistake, which is a specific deliberate weakening of a static type system. Statically-typed languages that do not commit The Billion Dollar Mistake do not have null reference errors.
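To make that distinction concrete, here is a minimal TypeScript sketch (the `User` type and `greet` function are hypothetical examples, not from any library): with `strictNullChecks` enabled, the compiler refuses to dereference a possibly-undefined value until the undefined case has been handled, which is exactly the weakening that Java-style null references allow.

```typescript
// A minimal sketch of null safety under TypeScript's strictNullChecks.
// The User type and greet function are made-up examples.
type User = { name: string };

function greet(user: User | undefined): string {
  // Writing `user.name` here directly would be a compile error:
  // "Object is possibly 'undefined'". The type system forces us
  // to narrow the type before dereferencing.
  if (user === undefined) {
    return "hello, stranger";
  }
  return `hello, ${user.name}`;
}

console.log(greet(undefined));       // hello, stranger
console.log(greet({ name: "Ada" })); // hello, Ada
```

The undefined case is dealt with at compile time rather than surfacing at runtime as "Cannot read property of undefined."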
My spitball opinion on that: we had diversity in America at the beginning of the industrial age that we heavily relied on for its host of benefits, innovation primarily, but it didn't create the conflict (in conflict-theory terms) we see today, because you were expected to integrate (conform) to be "American," which was a shared set of values and beliefs. America is still one of the few geopolitical distinctions today that is demarcated solely by subscribing to a fundamental set of beliefs, rather than by blood, as you have in most other countries in the world, especially in Europe. America is chiefly an idea, and by believing in that ideal, you gain the entire inheritance.
This is something that is continually being eroded, unfortunately, especially as we descend into more and more hyphenated subgroups of Americans rather than focusing on what makes us simply Americans, but it is something we should still strive for.
(And I know you will get downvoted for posting about Robert Putnam research here because it appears to come off as a knock against diversity, even though Putnam himself said his research actually affirms the benefits of diversity)
Except your opinion does not actually follow from the facts.
African Americans weren’t just “hyphenated subgroups” earlier. They weren’t even allowed to be in the same places as white male Americans. Similarly, Chinese Americans for the most part lived a separate life, and Japanese Americans were considered so different they were placed in internment camps during WW2.
Even the Irish and Italians were treated differently when they arrived, and had entire sub cultures.
The idea that the US is more hyphenated today just doesn’t seem to follow from what was actually happening in the 20th century.
Early America was so hyphenated that people only occupied regions with people of the same original nationality. We still see places with lots of French street names, or a place with lots of German street names. To some degree this might have been self-selecting, but there are documented cases of people being denied mortgages in particular areas based on their ethnicity into the twentieth century.
We are in a place in American history where ethnicity has the least amount of impact on someone's life. It's illegal to discriminate on job applications, mortgage applications, or school applications based on ethnicity.
We have a lot further to go, and sometimes we take a step backwards. I think we'll continue to see an increase in diversity in all public spaces.
It's quite possible that the US is at an inflection point. Now those hyphenated groups are spread so thin that they lost all their meaning. Now you can't find comfort in sameness, because yes, you share some common history, but you are also very different in other ways.
And thus people can't seem to find their place, can't seem to find friends, and the default fallback of going back to your people is no more.
" In the short term, he writes, there are clearly challenges, but over the long haul, he argues that diversity has a range of benefits for a society, and that the fragmentation and distrust can be overcome. It’s not an easy process, but in the end it’s “well worth the effort.” Putnam cites the integration of institutions like the U.S. Army as proof that diversity can work."
The point about the army is a particularly interesting one because they certainly are experts in integration and conformity, which hits back to my point in my previous comment as well.
Army life is not a good example; it is an intense, catalytic experience that accelerates and exaggerates processes that may not even appear in real life. For many people their army buddies are the people they trust because they trusted their lives to them; nothing like this happens for regular people.
It's not just the military in wartime though. My father was in the US Army in the late 1950s after the Korean ceasefire was signed but when they were still drafting soldiers to be deployed there. It was the first time he had any major interaction with African-Americans as they were rare in the small Northern town he grew up in. Serving alongside them and realizing that they weren't much different from himself was a significant experience for him.
"A utility-first CSS framework for rapidly building custom designs."
It's an excellent description in its own right, but if I were to attempt to put it in other terms: it allows you to compose CSS components and classes out of smaller, essentially one-liner CSS classes, while maintaining consistent styling and spacing across the different utilities, which you define up front using the config, though it also comes with good defaults out of the box.
It is excellent and now my preferred method for hacking up a new site. It also helps you think about your design system overall, so now I find myself paying attention up front to things I normally wouldn't, and it ends up saving me time in the long run. It also pairs really well with the excellent Refactoring UI book.
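As a rough illustration of the "define it up front in the config" part, here is a minimal `tailwind.config.ts` sketch. The scale values below are made-up for illustration, not Tailwind's real defaults; each theme entry fans out into a family of one-liner utility classes.

```typescript
// tailwind.config.ts -- a minimal sketch; the values below are illustrative,
// not Tailwind's actual defaults. Each theme scale generates families of
// one-liner utilities: the spacing scale yields p-1/p-2/p-4, m-1/m-2/m-4,
// and so on, while colors yields text-brand, bg-brand, etc., so every
// component composed from them shares the same spacing rhythm and palette.
import type { Config } from "tailwindcss";

const config: Config = {
  content: ["./src/**/*.html"],
  theme: {
    spacing: { 1: "0.25rem", 2: "0.5rem", 4: "1rem" },
    colors: { brand: "#1a73e8", gray: "#6b7280" },
  },
};

export default config;
```

Markup then composes those utilities directly, e.g. `<button class="p-2 bg-brand">`, instead of defining a bespoke `.button` class, which is what keeps spacing and color consistent across a whole site.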
How hard is it to say "these are the types of things we want to do; anything must look like these types of things in terms of (insert x, y, and z criteria: size, scale, opportunity, profitability, growth, etc.)"?
spoiler alert: it isn't hard and this is what we do at my startup that is currently experiencing hyper growth and is on multiple "unicorn" lists
I can't even tell if it's satire. But if it's not: I haven't seen it done easily with several hundred engineers. If you have that kind of charisma, hats off to you!
Not to heap it on...but ok who am I kidding I am heaping it on: I had a terrible interview experience with them.
Basically there was a period on HN where every other post was about stupid hiring practices and how absurd some code interviews were. Well Gitlab embodied all of them.
They quizzed me on several "gotcha" questions during the phone interview, and finally I was asked to describe prototypal inheritance (JS), which I was knowledgeable about; I gave a very detailed and technical answer in my own words (so not some memorized Wikipedia answer), and what I got back from the recruiter was, in essence, "Nope, that's not what this piece of paper I have in front of me says the answer is": although I had answered the question technically correctly, it wasn't the type of answer they were looking for.
Anyways, not salty about it anymore because it seems like I dodged a real bullet.
Yuck, the phone interview sounds like exactly why I didn't make it to the phone interview. I answered their 2 pre-screening questions via web form with very detailed answers, although the questions were annoyingly vague about how much detail to include. I was definitely qualified, or even overqualified, for the role and got a "not moving forward" with no explanation. I kindly asked for an explanation so I could improve and got no response. I'm leaning towards ageism because I have nearly 20 years' experience.
Hi Liam, I think the comment captures the essence of what feedback I would have, and I am not interested in providing any more feedback (no offense, I am busy), but I appreciate your willingness to learn and grow. It sounds like the company could have used a few more people like you at the time I interviewed (which was 2 years ago now).