> Also 60fps is pretty low, certainly isn't "high fps" anyway
Uhhhhhmmmmmm....what are you smoking?
Almost no one is playing competitive shooters and such at 4K. For those games you play at 1080p and turn off lots of eye candy so you can get super high frame rates, because that actually gives you an edge.
People playing at 4K are doing immersive story-driven games, and consistent 60fps is perfectly fine for that; you don't really get a huge benefit going higher.
People that want to split the difference are going 1440p.
Anyone playing games would benefit from a higher frame rate, no matter their use case. Of course it's most critical for competitive gamers, but someone playing a story-driven FPS at 4K would still benefit a lot from frame rates higher than 60.
For me, I'd rather play a story based shooter at 1440p @ 144Hz than 4k @ 60Hz.
You seem to be assuming that the only two buckets are "story-driven single player" and "PvP multiplayer", but online co-op is also pretty big these days. FWIW I play online co-op shooters at 4K 60fps myself, but I can see why people might prefer higher frame rates.
Games other than esports shooters and slow-paced story games exist, you know. In fact, most games are in this category you completely ignored for some reason.
Also, nobody is buying a 4090/5090 for a "fine" experience. Yes, 60fps is fine. But better than that is expected/desired at this price point.
> These are the companies you want to be at IMHO. Provided the compensation is adequate, slow and stable > fast and pivot-y.
Absolutely...not.
Slow does not mean stable. Slow means the floor is rotting out from under you constantly.
Being prudent about when and where to upgrade is a very active, intentional process that the typical company simply doesn't have the stomach or skill for.
Yeah, eventually you will have to upgrade and deal with all of the accumulated debt. You don’t have to be on the bleeding edge but you should still be updating regularly.
If you just want to move some files around and do basic text substitution, turning to Python or another "full-fledged programming language" is a mistake. There is so much boilerplate involved just to do something simple like renaming a file.
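For example, a rename-plus-substitution job stays short in the shell. A sketch (the file names and pattern are made up, and `sed -i` here is the GNU form):

    for f in *.txt; do
        sed -i 's/old/new/g' "$f"     # in-place text substitution
        mv -- "$f" "${f%.txt}.md"     # rename .txt -> .md
    done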
I have a lot of scripts that started as me automating/documenting a manual process I would have executed interactively. The script format is more amenable to putting up guardrails. A few even did get complex enough that I either rewrote them from the ground up or translated them to a different language.
For me, the "line in the sand" is not so much whether something is "safer" in a different language. I often find this to be a bit of a straw man that stands in for skill issues - though I won't argue that shell does have a deceptively higher barrier to entry. For me, it is whether or not I find myself wanting to write a more robust test suite, since that might be easier to accomplish with Ginkgo or pytest or `#include <yourFavoriteTestLibrary.h>`.
Is it really so bad? A bit more verbose but also more readable; it can be plenty short and sweet for me. I probably wouldn't even choose Python here myself, and it's the kind of thing shell scripting is tailor-made for, but I'd at least be more comfortable maintaining or extending this version over that:
    from subprocess import PIPE, Popen

    CMD = ("printf", "x:hello:67:ugly!\nyy$:bye:5:ugly.\n")
    OUT = "something.report"
    ERR = "err.log"

    def beautify(str_bytes):
        return str_bytes.decode().replace("ugly", "beautiful")

    def filter(str, *index):
        parts = str.split(":")
        return " ".join([parts[i - 1] for i in index])

    with open(OUT, "w") as out, open(ERR, "w") as err:
        proc = Popen(CMD, stdout=PIPE, stderr=err)
        for line_bytes in proc.stdout:
            out.write(filter(beautify(line_bytes), 2, 4))
I would agree though if this is a one-off need where you have a specific dataset to chop up and aren't concerned with recreating or tweaking the process bash can likely get it done faster.
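For reference, the shell take on the same report would be something like this - not the exact pipeline from upthread, just a sketch of the equivalent:

    printf 'x:hello:67:ugly!\nyy$:bye:5:ugly.\n' 2> err.log \
        | sed 's/ugly/beautiful/' \
        | awk -F: '{ print $2, $4 }' > something.report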
Edit: this is proving very difficult to format on mobile, sorry if it's not perfect.
That way, if something is easier in Ruby you do it in Ruby; if something is easier in shell, you can just pull its output into a variable. I avoid 99% of shell scripting this way.
But if all I need to do is generate the report I proposed...why would I embed that in a Ruby script (or a Python script, or a Perl script, etc.) when I could just use a bash script?
Bash scripts tend to grow to check on file presence, conditionally run commands based on the results of other commands, or loop through arrays. When it is a nice pipelined command, yes, bash is simpler, but once the script grows to have conditions, loops, and non-string data types, bash drifts into unreadability.
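A sketch of that drift (every name here is hypothetical):

    report=something.report
    hosts=( web01 web02 db01 )               # non-string data wants arrays
    if [[ ! -f $report ]]; then              # file-presence checks creep in
        echo "missing $report" >&2
        exit 1
    fi
    for h in "${hosts[@]}"; do
        if ssh "$h" true 2>/dev/null; then   # conditionally run commands
            scp "$report" "$h:/tmp/" || echo "copy to $h failed" >&2
        fi
    done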
I don’t think it’s fair to compare a workflow that is designed for sed/awk. It’s about 10 lines of Python to run my command and capture stdout/stderr - the benefit of which is that I can actually read it. What happens if you want to retry a line if it fails?
> I don’t think it’s fair to compare a workflow that is designed for sed/awk.
If your position is that we should not be writing bash but instead Python, then yes, it is absolutely fair.
> the benefit of which is that I can actually read it.
And you couldn't read the command pipeline I put together?
> What happens if you want to retry a line if it fails?
Put the thing you want to do in a function, execute it on a line, if the sub-shell returns a failure status, execute it again. It isn't like bash does not have if-statements or while-loops.
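Something like this sketch, say (the sed call is just a stand-in for the real per-line work, and input.txt is made up):

    process_line() {
        printf '%s\n' "$1" | sed 's/ugly/beautiful/'
    }

    retry_line() {
        local line=$1 attempt
        for attempt in 1 2 3; do
            process_line "$line" && return 0
        done
        echo "giving up on: $line" >&2
        return 1
    }

    while IFS= read -r line; do
        retry_line "$line"
    done < input.txt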
My point is that if you take a snippet designed to be terse in bash, it’s an unfair advantage to bash. There are countless examples in Python which will show the opposite.
> And you couldn't read the command pipeline I put together?
It took me multiple goes, but the equivalent in Python I can understand in one go.
> Put the thing you want to do in a function, execute it on a line, if the sub-shell returns a failure status, execute it again. It isn't like bash does not have if-statements or while-loops.
But when you do that, it all of a sudden looks a lot more like the Python code.
I have not really been a fan of ChatGPT's quality. But even if that were not an issue, it is kinda hard to ask ChatGPT to write a script and a test suite for something that falls under export control and/or ITAR, or even just plain old commercial restrictions.
I stopped caring about POSIX shell when I ported the last bit of software off HP-UX, SunOS, and AIX at work. All compute nodes have been running Linux for a good long while now.
What good is trading away the benefits of bash extensions when the script only has to run on a homogeneous cluster anyways?
The only remotely relevant alternative operating systems all have the ability to install a modern distribution of bash. Leave POSIX shell in the 1980s where it belongs.
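To make that concrete, a few of the extensions in question (none of this is POSIX sh; the paths are examples):

    files=( /var/log/*.log )              # arrays
    if [[ $HOSTNAME == node-* ]]; then    # [[ ]] with glob matching
        echo "${#files[@]} logs on $HOSTNAME"
    fi
    diff <(sort a.txt) <(sort b.txt)      # process substitution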
> Strategies for attracting and retaining tech talent in a non-tech industry
As someone who is a tech lead at a non-tech company that does software in-house for a niche product in a niche market...
You can either treat the software as a product in and of itself (as opposed to bolting it onto other projects for your core business), or you should expect to include some hazard pay to compensate for the mental and spiritual trauma of pretending you can treat the software like a line item in non-software projects that "just" need software support, ending up with 4-5 project managers trying to oversee/status a single software release.
We do the latter and if some of the other benefits were not as good as they are, it would absolutely be a deal breaker.
I suspect most other non-tech companies also operate in a similarly sub-optimal way (the blind leading the blind), but why would I put up with that level of ass-hattery for anything less than what the market can bear?
Not sure. But I have recently discovered that some NVMe drives do support self-tests, and I was able to start and monitor the test from smartmontools. It may be worth looking into an extended self-test.
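Roughly like this, assuming smartmontools 7.x and a drive that implements the optional NVMe self-test feature (/dev/nvme0 is an example device):

    sudo smartctl -a /dev/nvme0           # look for Self_Test under "Optional Admin Commands"
    sudo smartctl -t long /dev/nvme0      # kick off the extended self-test
    sudo smartctl -l selftest /dev/nvme0  # check progress and results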
TVB (Thermal Velocity Boost) specifically is exclusive to the i9 product line, but K SKUs in the i7 line have also experienced a heightened occurrence of stability issues.
So you kind of have to go all in on Turbo Boost Max 3.0 as the issue and then explain why the non-K variants of the i7 and i9 don't seem to have the same frequency of occurrence.
You may want to take a look at some of the sleuthing Wendell over on Level1Techs did.
He was able to identify a large group of CPUs used in W680 chipset motherboards (i.e. using stock Intel power curves, never seeing a lick of overclocking/overvolting) which exhibited issues.
It would seem enthusiast ricing is, at best, an aggravating factor to some other underlying cause.
It isn't "overclocking" if you conform to the Intel specification.
What you are getting at is Turbo Boost, which is not a new thing. We can split hairs over the incremental enhancements to Turbo Boost over the years, of course.
That depends on how you authenticate. If you pass your SSH password on the command line then anyone on your machine doing a `ps` at the right time could see your password.
I find using an ssh-agent to load password-protected SSH keys works best.
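The usual workflow is just a couple of commands (the key path is an example):

    eval "$(ssh-agent -s)"        # start an agent for this session
    ssh-add ~/.ssh/id_ed25519     # prompts for the passphrase once
    ssh user@host                 # later logins reuse the unlocked key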
With Xpra you have a headless X session on the remote machine and you "detach" and "attach" to it from the client. So whatever X applications you leave running will be right where you left them.
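As a sketch, assuming a reasonably recent xpra (:100 is an arbitrary display number and user@host is a placeholder):

    xpra start ssh://user@host/100 --start=xterm   # start the headless session
    xpra detach                                    # or just close the client
    xpra attach ssh://user@host/100                # everything is where you left it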