I have been using Wayland for 5 years and have never looked back at X11. I think it is the right thing to do, and the right time, to remove the old, insecure X11 backend. GNOME should not be bloated with legacy stuff.
Your experience is not universal. On an Intel CPU/GPU laptop, I have zero issues.
But on an AMD/Nvidia desktop it's unusable, because it's buggy as all hell. There are endless glitches in dozens of applications. At first it appears fine, and then you get subtle stuff like letters not appearing in VS Code when you type, OBS not recording, etc.
I have experience with all three manufacturers. We deploy them at work, and the integrated AMD GPUs work just as well as the Intel systems. However, I can't say much about the discrete AMD GPUs or older hardware.
Just yesterday I switched one Nvidia system to the proprietary driver under Wayland and started GNOME with a three-monitor setup. Works like a charm.
Frontend Developer for Desktop Applications (Munich / Germany) - AI Vision Platform
Are you interested in working autonomously and finding creative solutions on your own?
Then join our Wahtari team as a Frontend Developer (m/f/d) and take an active role in shaping the future of our hardware and software platform for machine vision tasks.
You will work in a highly focused, independent and enthusiastic team, where you will get to put your dev skills to use in a highly motivated environment. Your responsibilities include the whole lifecycle of software products: design, development, testing, deployment, maintenance and improvement. You will also utilize your expertise to solve scalability issues and to expand Wahtari's product portfolio.
We have a similar high-performance AI stack written in Go, capable of loading many different models from different frameworks. This is the work of several years. I just saw your comment and it reminded me of our internal company discussion about releasing everything under an open source license. Thanks for the reminder :)
What are your use-cases?
Wow, make it open source quickly!!! :hype:. It's a classic Python REST API for model serving, but we have very low latency constraints. As such, rewriting in a higher-performance backend language, e.g. Go or Rust, would substantially reduce resource usage (by reducing the need for horizontal scaling). Pre-baked model serving frameworks, e.g. Nvidia's Triton, aren't an option, since we have to query a feature store and do some input feature tracking in between. Go seemed like an efficient, developer-friendly choice, but there aren't any well-maintained model inference libraries in Go to this day...
We used Triton Inference Server (with a Golang sidecar to translate requests) for model serving, and a separate Go app that handled receiving the request, fetching features, sending to Triton, doing other stuff with the response, and serving the result. This scaled to 100k QPS with pretty good performance, but it does require some hops.
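For anyone curious what those hops look like, here is a minimal Go sketch of that pattern (not the commenter's actual code; the feature-store lookup, model name, and Triton URL are placeholder assumptions, based on Triton's standard KServe v2 HTTP infer endpoint):

```go
// Sketch: receive request -> fetch features -> forward to Triton -> relay result.
package main

import (
	"bytes"
	"encoding/json"
	"io"
	"net/http"
)

// fetchFeatures stands in for a feature-store lookup (hypothetical).
func fetchFeatures(userID string) ([]float32, error) {
	return []float32{0.1, 0.2, 0.3}, nil // placeholder values
}

func handlePredict(w http.ResponseWriter, r *http.Request) {
	userID := r.URL.Query().Get("user_id")

	feats, err := fetchFeatures(userID)
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}

	// Forward to a Triton-style HTTP inference endpoint (URL/model name assumed).
	body, _ := json.Marshal(map[string]any{
		"inputs": []map[string]any{{
			"name": "input__0", "datatype": "FP32",
			"shape": []int{1, len(feats)}, "data": feats,
		}},
	})
	resp, err := http.Post("http://triton:8000/v2/models/my_model/infer",
		"application/json", bytes.NewReader(body))
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadGateway)
		return
	}
	defer resp.Body.Close()

	// "Doing other stuff with the response" would happen here; we just relay it.
	out, _ := io.ReadAll(resp.Body)
	w.Header().Set("Content-Type", "application/json")
	w.Write(out)
}

func main() {
	http.HandleFunc("/predict", handlePredict)
	http.ListenAndServe(":8080", nil)
}
```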
In general writing pure Go inference libraries sucks. Not easy to do array/vector manipulation, not easy to do SIMD/CUDA acceleration, cgo is not go, etc. I wrote a fast XGBoost library at least (https://github.com/stillmatic/arboreal) - it's on par with C implementations, but doing anything more complex is going to be tricky.
We are working with a huge Go and Python codebase, and Python is just a pain in terms of using all system resources. We moved many parts to C++, which are called and managed from goroutines. The outcome was a big success.
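As an illustration of that pattern (a minimal sketch, not our actual code: the native routine here is a trivial C stand-in, and real code would wrap the C++ behind an extern "C" interface), calling native code concurrently from goroutines via cgo looks roughly like this:

```go
// Sketch: heavy work implemented natively, called in parallel from goroutines.
package main

/*
static double heavy_compute(double x) {
    // stand-in for an expensive native routine
    double acc = 0;
    for (int i = 0; i < 1000000; i++) {
        acc += x * i;
    }
    return acc;
}
*/
import "C"

import (
	"fmt"
	"sync"
)

func main() {
	var wg sync.WaitGroup
	results := make([]float64, 8)

	for i := 0; i < 8; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			// Each call runs on its own OS thread; no global interpreter lock.
			results[i] = float64(C.heavy_compute(C.double(i)))
		}(i)
	}

	wg.Wait()
	fmt.Println(results)
}
```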
This proposal/change is a big step forward, especially for the deep learning community.
Quote: "In PyTorch, Python is commonly used to orchestrate ~8 GPUs and ~64 CPU threads, growing to 4k GPUs and 32k CPU threads for big models. While the heavy lifting is done outside of Python, the speed of GPUs makes even just the orchestration in Python not scalable. We often end up with 72 processes in place of one because of the GIL. Logging, debugging, and performance tuning are orders-of-magnitude more difficult in this regime, continuously causing lower developer productivity."
Quote: "We frequently battle issues with the Python GIL at DeepMind. In many of our applications, we would like to run on the order of 50-100 threads per process. However, we often see that even with fewer than 10 threads the GIL becomes the bottleneck. To work around this problem, we sometimes use subprocesses, but in many cases the inter-process communication becomes too big of an overhead. To deal with the GIL, we usually end up translating large parts of our Python codebase into C++. This is undesirable because it makes the code less accessible to researchers."
This requirement could have been well served by a GIL per thread and an arena-based (shared) object allocation model. Every other use case would have been unaffected.
Now we change the world for everyone and put most library developers through a valley of despair for 5+ years, just so that a very few narrow use cases get the benefits they want.
Good point. Did the Meta and DeepMind devs really miss this?
I try to avoid Python as much as possible, because I mainly work with Go & C++, and multi-threading with those languages is just better (imho). Bringing Python a step forward and making it future-proof might be a good thing... even if this means breaking some things? Not sure if removing the GIL is the right step, but there is a big performance gap to close. Or maybe the AI community should move to a better-suited language? Having Python code in production just feels so wrong, especially if a rewrite in another language shows the performance gap.
The PEP notes subinterpreters as an alternative and says they can be considered a valid approach to achieving parallelism. However, it does not discuss why nogil was given preference. I guess that's OK because the PEP is about nogil.
I'm not sure whether the SC has considered alternative approaches, but it would be surprising if it hadn't.
The use cases of the ML and AI world are very important though, as they massively contribute to Python's popularity. Thanks to Python, researchers and developers don't have to use different languages and library ecosystems for developing and scaling models.
That said, subinterpreters sound like they could be a feasible solution for many use cases as well.
There may be a misunderstanding about terminology here. COVID-19 is the clinical disease caused by the SARS-CoV-2 virus. If you are infected by the virus but asymptomatic then you don't "have Covid".
Can't say too much about it. I didn't really follow the whole Covid discussion and just continued my lifestyle: eating healthy (fresh home-made food), doing sports, and looking after my mental state. My family and my circle did the same. Friends who were vaccinated got Covid several times, but are also fine now... My grandmom (89 years old), also unvaccinated, didn't get Covid. Her sister (vaccinated) got it. Both healthy now... Just let everybody do their own thing... The whole hate in the communities was unnecessary.
Attitudes like yours are why it kept spreading instead of petering out. I'm not saying that people need to get vaccinated, but I'm god damn sick of people not caring about spreading disease, whether vaccinated, or not. Humanity, as a whole, is in a war with disease. We don't need collaborators. All it takes for evil to triumph is for good people to do nothing.
> Healthy, young people who were intentionally exposed to the coronavirus SARS-CoV-2 developed mild symptoms — if any — in a first-of-its-kind COVID-19 human-challenge study.
That doesn't mean they weren't contagious.
> The first participants received a very low dose — roughly equivalent to the amount of virus in a single droplet of nasal fluid — of a virus strain that circulated in the United Kingdom in early 2020. Researchers anticipated that a higher dose would be needed to infect a majority of participants, says Andrew Catchpole, chief scientific officer of hVIVO. But the starting dose successfully infected more than half of the participants.
> The virus replicated incredibly rapidly in those who became infected. On average, people developed their first symptoms and tested positive, using sensitive PCR tests, less than two days after exposure. That contrasts with the roughly five-day ‘incubation period’ that real-world epidemiological studies have documented between a probable exposure and symptoms. High viral levels persisted for an average of 9 days, and up to 12 days.
> Attitudes like yours are why it kept spreading instead of petering out.
Defining “why” can be a complex exercise, but let’s take a very simple approach: if there were not attitudes like the GP and everyone who could got vaccinated, would COVID have petered out? I don’t think so.
It’s plausible that, if enough production capacity had existed to rapidly vaccinate, say, 85% of the world population, evenly distributed, it would have worked. But getting a uniform 85% was never in the cards, and, starting some time in 2021, the vaccine was nowhere near effective enough for a two-dose series to suppress transmission even with 100% coverage.
Sorry, but the idea of eliminating Covid with the vaccines we have was a nice fantasy, but it was not going to happen.
(If the vaccine were much better and had good worldwide coverage, then maybe. The smallpox vaccine was good enough. The measles and chickenpox vaccines are plausibly good enough. The oral polio vaccine might be good enough, but I have serious doubts that the strategy with which it’s used is actually appropriate. Somehow there does not appear to be community transmission of polio in New York right now, and I’m a bit surprised.)
(People under about 23 years old in the US have generally received the injectable polio vaccine, not the oral vaccine. The injectable vaccine seems to be generally considered inadequate to prevent transmission. Maybe the under 23 year old NY population coupled with modern hygiene is not actually able to sustain an outbreak?)
My opinion doesn't lead to harm to other people, so you'll understand why I don't respect yours. Your right to swing your infected spittle ends where other people's mouths and noses begin.
> My opinion doesn't lead to harm to other people, so you'll understand why I don't respect yours. Your right to swing your infected spittle ends where other people's mouths and noses begin.
Don't you have the ability to stay home and avoid breathing near other humans if you're so concerned? I'm confused by that statement. How is demanding reduced freedom for him more just than simply exercising your own?
To be fair, Covid was never all that dangerous (in a statistical sense) for relatively young, relatively healthy people.
Of course, in the beginning that wasn't clear. And you might still want to get vaccinated, to decrease the likelihood of you passing the virus to your older relatives.
See my other comment. My circle consists of people up to 89 years. Thanks for the hint, but I am not convinced of the vaccine. I'll continue doing my stuff and it's my own responsibility.
I was very interested in the results of my spouse's antibody test! It was negative, and we thought for sure she had antibodies from infection. I have no scientific evidence, but she has genetic abnormalities in certain blood proteins, and I wonder if that helps her resist the infection!
Antibodies are only measurable for a short time; the long-term ability to defend is "learned" by the immune system but not measurable in any ordinary way. Here in the coastal California area there is a lot of social pressure about vaccination. Random people still insist that vaccination is important for healthy adults and sometimes for those under 18.
Just go out, do some sports and enjoy life :) I stopped spending too much time in front of the computer and started doing more outdoor activities. Best decision ever.
“Go touch grass” is used derisively, but it’s something I tell myself more and more. We overvalue the online world and all its drama. Go outside, meet people, make your own organic, locally grown drama.
These days I schedule my work around the weather. Few things bring me as much happiness as a day in the sun. I know it has been a good day when I have not touched my laptop once.
I recently made a small webapp to make me "touch grass". The idea behind it is that you enter some activities (or keep the random defaults), and when you are bored or doomscrolling, it can tell you what to do.
It's a bit silly, and still very bare bones, but I just like the phrase "touch grass", and this is my effort to reclaim it from the depths of derisiveness.
> As with many things in life, "go touch grass" isn't actually about touching grass.
As with many things in life, though "go touch grass" isn't actually about touching grass, touching grass really is a good thing. (Well, except that it makes me itch all over the area of contact. Still worth it.)
For me it is complementary. While I consciously touch grass, I ground what I am doing in reality.
Is the problem I am stuck coding on really that important? Is there a simpler solution, or is something else more important right now?
At least, that's what works for me; sometimes metaphors should also be taken literally.
I don't like sports, and I live in an endless suburban wasteland where there's nothing to do but go to bars, restaurants or the mall.
I can't afford to go to restaurants all the time, and I don't like bars or the mall.
I'm shy and I don't do well around strangers, and even when I do meet new people 99% of the time we don't have much in common, so it feels like a waste of time.
I'd much rather surf the web... at least there I'm learning stuff, and I can communicate with people who I actually have something in common with.
The curse of a high IQ is that statistically you won't find a lot in common with the average Joe. Too bad; just deal with it.
Get married and raise children.
Go to church.
Spend time in your local library.
Volunteer at a local CSA (community supported agriculture).
Take long walks.
Go hiking.
Ride a mountain bike in the woods.
Go to the gym and lift weights.
But don't spend your life online, staring at computer screens.
I don't see a problem with spending time in front of a screen, and would much rather do that than do pretty much anything you mentioned.
Some things, like spending time in nature are nice once in a while, but there's no nature near where I live, and even if there was I'm often not in the mood to go.
Personally, I wouldn't survive in a suburban environment without nature nearby.
Going to restaurants, bars or meeting strangers isn't what I meant. It's still the artificial human made world. Spend time outside the city, hiking, boarding, climbing, running or just enjoying nature... That's it. Finding like-minded people will come by itself.
Oh, and being active is just awesome. Pushing the body to certain limits is so important for my mental health. I am a completely different person if I don't do sports for a while.
I do think that trusting third-party hardware is always a bad idea. There will never be a safe execution model where you don't own the hardware. Just my opinion...
The point of zk proofs is that you don't have to trust, but you can verify, without having to rerun the computation yourself: you simply verify the proof, which is much cheaper computationally.
Paranoia about untrusted hardware is absolutely warranted but just to try and convince you of what this is trying to do. Imagine you have a file that you have never shown anyone, and some untrusted host wants to convince you they also have that file. They can prove this to you without you revealing the file to them (or them to you) by having them send you the hash of the file that you can compare to your own. If it matches there is an overwhelming probability they also have the file even though you completely distrust them or their execution environment. In other words, you are able to verify the computation (via the hash) was executed even though you don't trust them. Cairo is using similar techniques that let you verify other forms of computation than just hashes using more advanced primitives.
The obvious caveat, just as with hashes, is that you trust that the underlying cryptography is secure.
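To make the hash analogy concrete, here is a toy Go sketch (my own illustration, unrelated to Cairo's actual proof system; the file name and the claimed digest are made up):

```go
// Sketch: an untrusted host sends only a digest, and we verify it against our
// own copy of the file without revealing the file to them.
package main

import (
	"crypto/sha256"
	"crypto/subtle"
	"encoding/hex"
	"fmt"
	"os"
)

func main() {
	// Digest claimed by the untrusted host (hypothetical value).
	claimed := "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"

	data, err := os.ReadFile("secret.txt")
	if err != nil {
		panic(err)
	}

	sum := sha256.Sum256(data)
	ours := hex.EncodeToString(sum[:])

	// Constant-time compare; a match convinces us they hold the same bytes.
	if subtle.ConstantTimeCompare([]byte(ours), []byte(claimed)) == 1 {
		fmt.Println("host very likely has the same file")
	} else {
		fmt.Println("digests differ")
	}
}
```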
Thanks for the explanation. That sounds like a really smart idea. I first thought that this was based on Intel SGX, which seems to have some security problems (words of a friend working with that technology).
I'll have to dig deeper into this topic. Are there any limitations?