When I was an exchange student at RIT, having arrived from France only a month before, one of the admin staff invited me and a friend in the same situation over for Thanksgiving because she didn't want to leave us by ourselves for a major holiday.
I have fond memories of that kindness.
For things like this that are really math heavy, I tend to find it's usually better to create a DSL (or easily readable function calls, etc.) that you can write yourself, instead of relying on AI to understand math-heavy rules.
Bonus points: if the rules are in an easily editable format, you can change them easily when they need to change. It seems that was the path the author took...
And yes, this kind of use case is exactly where unit tests shine...
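To make that concrete, here's a minimal sketch of what I mean (hypothetical Python, not the author's actual code): the rules live in one easily edited list of plain data, the math is hand-written, and the whole thing is trivially unit-testable.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Rule:
        name: str
        applies: Callable[[dict], bool]   # predicate: does this rule fire?
        compute: Callable[[dict], float]  # the hand-written math

    # First matching rule wins; changing the rules means editing this list.
    RULES = [
        Rule("half_price_bulk",
             lambda o: o["qty"] >= 100,
             lambda o: o["price"] * o["qty"] * 0.5),
        Rule("standard",
             lambda o: True,
             lambda o: o["price"] * o["qty"]),
    ]

    def total(order: dict) -> float:
        return next(r.compute(order) for r in RULES if r.applies(order))

    # The unit tests where this approach shines:
    assert total({"qty": 100, "price": 2.0}) == 100.0
    assert total({"qty": 1, "price": 2.0}) == 2.0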
> create a DSL (or create easily readable function calls, etc)
These aren't really that different. Consider the history of the earliest (non-assembly) programming languages, particularly https://en.wikipedia.org/wiki/Speedcoding , as well as the ideas expressed by Lisp.
Oh yeah, that's why I added the parenthetical. I consider Lisp macros to be a DSL, and that's exactly what I tend to like using. Similarly with Ruby and some metaprogramming tricks.
I do the opposite: I set up everything myself in terms of the architecture/design of the software, so the AI can do the boring boilerplate like "math heavy rules". Always interesting to see how differently we all use LLMs.
I've usually not been impressed by AI's implementations of math-heavy rules, so I wouldn't trust it much; I tend to find it easier to write them myself and then verify :) Yup, it's always interesting to see the different usages.
That's what the French government paid per year per student at my engineering school in the early 2000s. Tuition fees paid by the student were 540 euros a year, but the cost to the government was quite high.
France is the same: the better universities are all public. But I know the government spent an average of 35,000 euros per student at top public engineering schools in the early 2000s (not sure about nowadays), so they do have funds; it's just that the way money comes in depends on actually being great academically.
This is why, when my son is old enough to choose a university, I'd probably advise him against doing his undergrad at a UK or US university if he's studying STEM. Based on interviewing CS graduates, the level doesn't seem that high at most UK/US universities compared to other countries (excluding the very top, of course), and that seems partly due to a culture of pushing for profit over education and making it very hard to fail.
When I did CS at a UK university in the 1980s it was brutal: I was an idiot in my first year and had to retake maths three times, meaning I slipped a year behind. However, I eventually learned my lesson(s) and did increasingly well over my second, third and fourth years, ending up with a 1st and being particularly fond (ironically) of the mathematical parts of the course.
We had quite a lot of foreign students on the course and they were all, without exception, completely awesome and great people to do a course with. Mind you, Norwegian moonshine is horrific...
The second paragraph rings true of my experience 10 or so years ago, but it may depend on the university, and the current funding environment is a bit different.
Exactly. The same push for profit has also made cohort sizes for many courses extremely large, simultaneously making the more interesting/required (or, if you're a slacker, easy) classes extremely difficult to get into (first-come-first-served: you need to be first in the online queue to get a seat), while class sizes are too large for learning to be interactive.
Funnily enough, as a full-fee-paying international student, I had an easier time learning in India than in the US a decade or so ago; the only things that made my master's education worthwhile were the research opportunities, the general quality of the students, and an easier job market (at the time). Given that all three are in decline right now, I would not advise anyone to pursue a master's abroad.
Without knowing anything about your situation, this sounds like a bad idea. I think roughly you want a university that is well-regarded[1] and hard to get into so that one’s attendance carries some signal.
[1] by well-regarded, I mean well-regarded by eg people at competently run well-paying firms who do hiring, rather than eg people who are really into politics and who have idiosyncratic opinions about particular universities
Oh, I mean the alternative would be a well-regarded university/school in Germany or France... I'm French, but we live in HK, and most kids here (even the ones who go to the French International School or the German Swiss School) end up trying to get into UK or US universities. French and German schools tend not to be that well ranked in the best-known rankings despite being very good technically (which is annoying when trying to get a visa to certain countries).
Part of my bias is that I was an exchange student at RIT, and while I appreciated the experience, I was not impressed by the CS courses or by the level of maths of the students there.
I've been using it since I got it. It's been working great, with one small issue that I haven't been able to solve: for some reason, when I use Plasma on Arch Linux (but not Ubuntu), the display outputs garbage. I'm guessing it's not detecting the EDID correctly and is setting a weird resolution or refresh rate. It's not a major issue, since other desktops work fine, so I haven't spent much time looking into it.
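If it is an EDID problem, one workaround I'd try (an assumption on my part, not a verified fix) is forcing a known-good mode or EDID via kernel parameters, e.g.:

    video=DP-1:1920x1080@60
    drm.edid_firmware=DP-1:edid/my-monitor.bin

where DP-1, the mode, and the EDID filename are placeholders for your actual connector and a dump of your monitor's EDID.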
Ever since I started using Wayland I’ve had this problem, even when I’m not using KVMs. I have to do a power reset of my monitor to get it to negotiate the correct resolution and refresh rate. Meanwhile Windows and macOS use the monitor without issue. I suspect the issue isn’t solely with the KVM.
I get that too on a machine running Mint Cinnamon. It also happens from the BIOS screen, so I don't think it's a Linux issue. A re-plug fixes it, but that's not great for a remote access device.
Most people can buy an Opinel and be happy for decades. You don't need anything fancy for a general-purpose knife. $50 max, and that's if you feel like getting something special.
Expensive steels are, by and large, incremental progress over cheaper knife steels, provided the cheaper steel got an appropriate heat treatment and the knife has good edge geometry. In almost no application will an end consumer notice the difference.
I've been using the same thrift store knife I picked up 15 years ago. It gets sharpened maybe once a year, honed every so often. It was like $20, I think? Most chefs I know have a similar story about their knife/knives: something cheap that does the job.
Spending more on knives is just status symbol nonsense, which unfortunately has infected absolutely everything. It's like spending $300 on a spanner wrench. Who in the hell spends that much on a wrench? Why would you spend that much on a knife? lol. It's what you do with it that matters.
I remember seeing a comment by a local "celebrity chef" where he said he never sharpens cleavers - he just buys a specific inexpensive brand 5 at a time for $8 each and throws them away when they become dull.
While I don't agree with externalising the manufacture/disposal costs with that sort of disposable consumption, I do see the economically-rational decision making behind it.
If you're running a restaurant in Australia, your lowest-paid kitchen staff get $24 an hour during weekdays, $30-35 an hour on weekends, and as much as $55 an hour on public holidays. And if they work more than 8 hours in a day, it's 1.5 times those rates for the first 2 hours of overtime and double those rates for anything beyond that. https://www.fairwork.gov.au/find-help-for/fast-food-restaura...
While spending 15 or 20 seconds honing the edge with a sharpening steel during use makes sense (and I'll bet he does that just out of reflex), once the edge gets damaged enough to need more than what a steel can fix and you start needing a whetstone, it's probably not cost-effective to have kitchen staff spend time doing that.
If a dull knife takes a whole 5 minutes to sharpen, that's 12 knives an hour; at $8 per knife, $96 per hour. Not worth it during deep overtime on a public holiday, otherwise...
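Spelling that out with the numbers above:

    value recovered by sharpening: 12 knives/hr x $8/knife = $96/hr
    deep overtime on a public holiday: 2 x $55 = $110/hr of labor

so tossing the knife only beats sharpening it at the very top of the pay scale.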
I suppose someone less handsomely paid collects these disposed knives, sharpens them, and resells them on the side.
I'd take issue with your price point but agree with the sentiment
I've seen Victorinox Fibrox knives in Michelin-star kitchens; they get the job done and are very reasonably priced ($60 for a chef's knife).
Admittedly, the knives I have at home are significantly more expensive, largely because they're on display, so I want something that looks good, and I actually enjoy using them.
On one level it's a little silly, but on another level people spend thousands on art/sculptures which have no useful purpose.
There are lots of great knife makers; it depends on what you want. Past a certain price point, knives quickly become about aesthetics and feel, not cutting functionality (ease of cutting, whatever).
Victorinox knives rank very well in just about any real-use ranking I’ve ever seen and are extremely affordable. If you just want good knives that will serve you well, won’t break the bank, and you won’t feel bad using them, that’s what I would do. There are other good recommendations in the thread as well.
As for custom steels: outside of currently very expensive processes (powder metallurgy, etc.), it is basically "maximum sharpness", "edge retention", "ease of sharpening": pick maybe two. Edge retention here is shorthand for both brittleness (chipping) and abrasion resistance (regular wear), even though the two differ for some materials.
High-grade carbide, for example, is extremely hard and resists edge abrasion. But because of the large grain size, it is ~impossible to get it as sharp as carbon steel by hand. Additionally, that same abrasion resistance means you need something even harder to sharpen it with.
If you remember those little scratch kits you may have played with in science class as a child, where you tried to see which rocks scratch other rocks, this is the practical application of that (essentially the Mohs hardness scale: only a harder material can scratch, and thus sharpen, a softer one).
Even in metalworking, people will often make or use HSS cutters when they need something really, really sharp or custom. Or just cheap. And they use carbide ones when they don't, because you really can't get carbide as sharp as HSS, and sometimes it matters. I can also easily make a really good HSS cutter, but making a really good carbide one would take significantly more expensive tooling and time.
This is one example.
Ceramic knives[1] tend to have very high edge retention, but are very brittle and fracture easily. So it's very easy to nick them. This makes them last forever if you are slicing but not if you are chopping. They are also ~impossible to sharpen without diamonds.
In the end, we can construct steels and other materials with very nice properties at high cost, and it's cool and fun to explore the limits there, but it's not going to make you a better chef, or make your prep 10x faster or whatever. This isn't to say it's completely impossible to make something that is awesome at everything, but we use what we use because we can make it without nudging atoms into a matrix one by one :)
So while it's possible to get 5x the edge life out of an impossible-to-sharpen knife (for example), for most people it's not worth it. They don't even notice once the novelty wears off.
[1] Tungsten carbide is really a ceramic, but people often mistake it for a metal/steel, when in reality it's often just bonded/glued to metals. Assume I'm not talking about pure tungsten here.
To be fair, tipping the cook makes more sense to me than the waiter. I come to a restaurant for the food, I don't particularly care about the service beyond a certain baseline. It never makes sense to me that waiters can earn more with tips than kitchen staff.
Yes, it seems totally arbitrary. When I first visited the US I paid for our group and didn't tip the waiter because he got our order wrong, and I was met with aghast faces. I didn't realise you're supposed to tip EVEN when the service is bad!
The employer is shifting the responsibility for wages to the customer (you). It is customary in business to pay a wage even if an employee makes a mistake. Also, the tipout structure of most restaurants, where the server tips out the kitchen and support staff, is based on a percentage of all *sales*, so not tipping means the server pays your tipout from their own tips, just for the privilege of serving you; hence the aghast faces.
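For a concrete (hypothetical) illustration: with a 3% tipout on sales, a $100 check that leaves nothing means the server hands $3 of tips earned from other tables to the kitchen and support staff for your meal. The exact percentage varies by restaurant, but the mechanics are the same.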
Substituting tips for pay should be illegal. Majority-tipped restaurants are almost always predatory, taking advantage of both customers and employees to further enrich the owners.
I don't think that's true in the vast majority of establishments? Tip pooling usually means that the front of the house staff pool their tips. Not that they share with the entire restaurant.
Yeah, I don't know anything about the majority of establishments.
My single reference is the Norwegian upscale restaurant Theatercafeen, which introduced tip pooling across waiters and kitchen. It was highly contentious: the waiters took the case to the courts, and it went all the way to the Supreme Court of Norway [1], which decided that the employer could set the rules for tip sharing.
That’s a very engineering viewpoint. But much of the world values the whole package, including clean and neatly set tables and place settings, advice on the menu, timing of courses, QA of prep and fixing issues without customer intervention, help with any mishaps like spilled drinks or dropped silverware, boxing of food to go, etc.
A utilitarian only interested in pure food quality is much better off cooking at home. You can do better at a quarter the price.
Food/software is only about 25% of the cost and value in these businesses, though perceptions on value differ of course.
I end up using Perplexity a lot too, especially when I'm doing something unfamiliar. It's also a good way to quickly find the best practices for a given framework/language I'm not that familiar with (I usually ask it to link to examples in the wild, and it finds open-source projects illustrating those points).
> The notch has been around for 4 years now, and Apple still hasn't provided a solution for the problem they introduced.
As a lot of people have told you, you can just disable it. I've been doing that for 4 years: just set your resolution to a 16:10 ratio and you're good to go. The resolution is exactly the same as it was before they introduced the notch.
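(For instance, and these numbers are from memory so they may vary by model: on a 14" MacBook Pro that means picking something like 1512x945 instead of the default 1512x982. Any 16:10 option letterboxes out the strip containing the notch, and the menu bar renders below it.)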
Personally, I like the fact that Apple gives us the choice. I dislike the notch and prefer my menu bar below it, because I use apps like IntelliJ. My wife likes the notch and keeps it. So both of us can have what we want.
Maybe Apple could have made it slightly easier to disable by offering an explicit option instead of requiring a 16:10 resolution, but, to be honest, most of the people who dislike it tend to be power users who can figure it out.
What I'm learning from this entire line of discussion is that there's a subset of personalities who will find any reason to hate $COMPANY, in this case Apple. No amount of logical explanation will change their minds; there just isn't anything $COMPANY can do right, no design decision sensible enough.