> Cheat Engine doesn’t modify the binary. Ghidra can.
To clarify for other people who may not be familiar (though I'm far from an expert on it myself): you can inject/modify the asm of a running binary with CE. I'm not sure if there's a way to bake the changes into the exe permanently.
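For the curious, here's a minimal sketch (Windows-only, Python/ctypes; the PID, address, and bytes are placeholders, not anything CE-specific) of the kind of in-memory patching CE does: open the running process and overwrite instructions in its memory.

```python
import ctypes
from ctypes import wintypes

kernel32 = ctypes.windll.kernel32

# Explicit signatures so 64-bit handles/addresses aren't truncated.
kernel32.OpenProcess.restype = wintypes.HANDLE
kernel32.OpenProcess.argtypes = [wintypes.DWORD, wintypes.BOOL, wintypes.DWORD]
kernel32.WriteProcessMemory.restype = wintypes.BOOL
kernel32.WriteProcessMemory.argtypes = [
    wintypes.HANDLE, ctypes.c_void_p, ctypes.c_void_p,
    ctypes.c_size_t, ctypes.POINTER(ctypes.c_size_t),
]
kernel32.CloseHandle.argtypes = [wintypes.HANDLE]

PROCESS_VM_WRITE = 0x0020
PROCESS_VM_OPERATION = 0x0008

def patch_running_process(pid: int, address: int, patch: bytes) -> None:
    """Overwrite the bytes at `address` inside the live process `pid`."""
    handle = kernel32.OpenProcess(PROCESS_VM_WRITE | PROCESS_VM_OPERATION,
                                  False, pid)
    if not handle:
        raise OSError("could not open process (insufficient privileges?)")
    try:
        written = ctypes.c_size_t(0)
        ok = kernel32.WriteProcessMemory(handle, address, patch,
                                         len(patch), ctypes.byref(written))
        if not ok or written.value != len(patch):
            raise OSError("WriteProcessMemory failed")
    finally:
        kernel32.CloseHandle(handle)

# Hypothetical usage: NOP out a 2-byte conditional jump found in a debugger.
# patch_running_process(pid=1234, address=0x00401A2B, patch=b"\x90\x90")
```

Note the patch only exists in the process's memory; the exe on disk is untouched, which is why it doesn't persist across restarts unless you also patch the file itself (which is where something like Ghidra or a hex editor comes in).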
>”Free to ingest and make someones crimes a permanent part of AI datasets resulting in forever-convictions? No thanks.”
1000x this. It’s one thing to have a felony for manslaughter. It’s another to have a felony for drug possession. In either case, if enough time has passed and they have shown that they are reformed (long employment, life events, etc.), then I think it should be removed from consideration. Not expunged or removed from record, just removed from any decision making. The timeline for this can be based on severity with things like rape and murder never expiring from consideration.
There needs to be a statute of limitations just like there is for reporting the crimes.
What I’m saying is, if you were stupid after your 18th birthday and caught a charge peeing on a cop car while publicly intoxicated, I don’t think that should be a factor when you’re 45 applying for a job after going to college, having a family, having a 20-year career, etc.
Also, courts record charges which are dismissed due to having no evidential basis whatsoever and statements which are deemed to be unreliable or even withdrawn. AI systems, particularly language models aggregating vast corpuses of data, are not always good at making these distinctions.
That is a critical point that AI companies want to remove: _they_ want to be the system of record. Except they _can't_ be. Which makes me think LLMs are just really bad cache layers on the world.
> I think it should be removed from consideration. Not expunged or removed from record, just removed from any decision making. The timeline for this can be based on severity with things like rape and murder never expiring from consideration.
That's up to the person hiring for the particular role. Imagine hiring a nanny and some bureaucrat telling you what prior arrest is "relevant". No thanks. I'll make that call myself.
Many countries have solved this with a special background check. In Canada we call this a "vulnerable sector check," [1] and it's usually required for roles such as childcare, education, healthcare, etc. Unlike standard background checks, which do not turn up convictions which have received record suspensions (equivalent to a pardon), these ones do flag cases such as sex offenses, even if a record suspension was issued.
They are only available for vulnerable sectors; you can't ask for one as a convenience store owner vetting a cashier. But if you are employing child care workers in a daycare, you can get them.
This approach balances the need for public safety against the ex-con's need to integrate back into society.
That's the reality in my country, and I think in most European countries. And I'm very glad it is. The alternative is high recidivism rates, because criminals who have served their time are unable to access the basic resources they need (a job, housing) to live a normal life.
Then before I give you my business or hire you, I also want to know that you are the kind of person who thinks they have a right to any other person's entire life, so I can hold it against you and prevent you from benefitting from all your other possible virtues and efforts.
So likewise, I require knowing everything about you, including things that are none of my business but that I just think are my business, and that's what matters. I'll make that call myself.
No one is forcing you to hire formerly incarcerated nannies but you also aren’t entitled to everyone’s life story. I also don’t think this is the issue you’re making it out to be. Anyone who has “gotten in trouble” with kids is on a registry. Violent offenders don’t have their records so easily expunged. I’m curious what this group is (and how big they are) that you’re afraid of.
I also think someone who has suffered a false accusation of that magnitude and fought to be exonerated shouldn’t be forced to suffer further.
>That's up to the person for the particular role. Imagine hiring a nanny and some bureaucrat telling you what prior arrest is "relevant". No thanks. I'll make that call myself.
Thanks, but I don't want to have violent people working as taxi drivers, pdf files in childcare and fraudsters in the banking system. Especially if somebody decided to not take this information into account.
Good conduct certificates are there for a reason -- you ask the faceless bureaucrat to give you one for the narrow purpose and it's a binary result that you bring back to the employer.
If someone is charged with and found innocent of a crime, you can't just remove that record. If someone else later finds an account of them being accused, they need a way to credibly assert that they were found innocent. Alternately if they are convicted and served their sentence, they might need to prove that in the future.
Sometimes people are unfairly ostracized for their past, but I think a policy of deleting records will do more harm than good.
Or in the case of repeating an offense down the road. The judge sees you had an issue in the past, were good for a while, then repeated, suggesting something has happened or that the individual has lost their motivation to stay reformed. Sentence them to time for the crime, but then also be able to assist the individual in finding help to get them back on track. We have the systems in place to do this; we just don’t.
Also, when applying for a loan, being a sex offender shouldn’t matter. When applying for a mortgage across the street from an elementary school, it should.
The only way to have a system like that is to keep records permanently but limit how they factor into decision making.
> Also, when applying for a loan, being a sex offender shouldn’t matter. When applying for a mortgage across the street from an elementary school, it should.
Should it though? You can buy a piece of real estate without living there, e.g. because it's a rental property, or maybe the school is announced to be shutting down even though it hasn't yet. And in general this should have nothing to do with the bank; why should they care that somebody wants to buy a house they're not allowed to be in?
Stop trying to get corporations to be the police. They're stupendously bad at it and it deprives people of the recourse they would have if the government was making the same mistake directly.
>Also, when applying for a loan, being a sex offender shouldn’t matter. When applying for a mortgage across the street from an elementary school, it should.
> If someone is charged with and found innocent of a crime, you can't just remove that record. If someone else later finds an account of them being accused, they need a way to credibly assert that they were found innocent.
Couldn't they just point to the court system's records showing zero convictions? If that system records guilty verdicts, then its showing none is already proof there are none.
That seems compatible with OP's suggestion, just with X being a large value like 100 years, so sensitive information is only published about dead people.
At some point, personal information becomes history, and we stop caring about protecting the owner's privacy. The only thing we can disagree on is how long that takes.
Right, except there are some cases where that information should be disclosed prior to their death: sensitive positions, dealing with child care, etc. But those are specific circumstances that can go through a specific channel, like we did with background checks. Now AI is in charge, and ANY record in ANY system is flagged, whether it’s for a rental application, or a job, or a credit card.
The AI should decide if it's still relevant or not. People should fully understand that their actions reflect their character and this should influence them to always do the right thing.
I find this a weird take. Are you saying you _want_ unaccountable and profit driven third party companies to become quasi-judicial arbiters of justice?
> What I’m saying is, if you were stupid after your 18th birthday and caught a charge peeing on a cop car while publicly intoxicated, I don’t think that should be a factor when your 45 applying for a job after going to college, having a family, having a 20 year career, etc.
I'd go further and say a lot of charges and convictions shouldn't be a matter of public record that everyone can look up in the first place, at least not with a trivial index. File the court judgement and other documentation under a case number, and ban reindexing by third parties (AI scrapers, "background check" services) entirely. That way, anyone interested can still go and review court judgements for glaring issues, but a "pissed on a patrol car" conviction won't hinder that person's employment prospects forever.
In Germany for example, we have something called the Führungszeugnis - a certificate by the government showing that you haven't been convicted of a crime that warranted more than three months of imprisonment or the equivalent in monthly earnings as a financial fine. Most employers don't even request that; only employers in security-sensitive environments, public service, or anything to do with children do (the latter get a certificate that also includes a bunch of sex pest crimes in the query).
France has a similar system to the German Führungszeugnis. Our criminal record (casier judiciaire) has 3 tiers: B1 (full record, only accessible by judges), B2 (accessible by some employers like government or childcare), and B3 (only serious convictions, the only one you can request yourself). Most employers never see anything. It works fine, recidivism stays manageable, and people actually get second chances. The US system of making everything googleable forever is just setting people up to fail.
The UK has common law: the outcomes of previous court cases and the arguments therein determine what the law is. It’s important that court records be public then, because otherwise there’s no way to tell what the law is.
It is the outcomes of appellate court cases and the arguments therein that determine law in common law jurisdictions, not the output of trial courts. Determining what the law is in a common law system would not be affected if trial court records were unavailable to the public. You only actually need appellate court records publicly available for determining the law.
The appellate court records would contain information from the trial court records, but most of the identifying information of the parties could be redacted.
> It’s important that court records be public then, because otherwise there’s no way to tell what the law is.
So anyone interested in determining whether a specific behavior runs afoul of the law not only has to read through the law itself (which, "thanks" to being a centuries-old tradition, is very hard to read) but also has to wade through court cases from, in the worst case (very old laws dating to before the founding of the US), two countries.
Frankly, that system is braindead. It worked back when it was designed, as the body of law was very small, but today it's infeasible for any single human without the aid of sophisticated research tools.
You are correct, which is why I recently built such a tool. Well, an evidence management tool.
The premise here is that during an investigation, a suspect might have priors, might have digital evidence, might have edge connections to the case. Use the platform and AI to find them, if they exist.
What it doesn’t do: “Check this video and see if this person is breaking the law”.
What it does do: “Analyze this person’s photos and track their movements, see if they intersect with Suspect B, or if Suspect B shows up in any photos or video.”
It does a lot more than that but you get the idea…
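To make that concrete, here's a toy sketch of one of the simpler cross-reference checks involved (nothing like the real pipeline; it assumes JPEGs that actually carry EXIF GPS and timestamp metadata, and the paths, thresholds, and helper names are all illustrative):

```python
from datetime import datetime
from math import asin, cos, radians, sin, sqrt
from PIL import Image

GPSINFO_TAG = 34853   # EXIF GPSInfo IFD
DATETIME_TAG = 36867  # EXIF DateTimeOriginal

def _to_degrees(dms, ref):
    """Convert EXIF (deg, min, sec) rationals plus N/S/E/W ref to signed degrees."""
    deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
    return -deg if ref in ("S", "W") else deg

def photo_fix(path):
    """Return (lat, lon, timestamp) for a photo, or None if EXIF is missing."""
    exif = Image.open(path)._getexif() or {}
    gps = exif.get(GPSINFO_TAG)
    stamp = exif.get(DATETIME_TAG)
    if not gps or not stamp:
        return None
    lat = _to_degrees(gps[2], gps[1])
    lon = _to_degrees(gps[4], gps[3])
    return lat, lon, datetime.strptime(stamp, "%Y:%m:%d %H:%M:%S")

def km_between(a, b):
    """Haversine distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (a[0], a[1], b[0], b[1]))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def intersections(photos_a, photos_b, max_km=0.5, max_hours=2):
    """Yield photo pairs from the two sets taken close together in space and time."""
    fixes_a = [(p, photo_fix(p)) for p in photos_a]
    fixes_b = [(p, photo_fix(p)) for p in photos_b]
    for pa, fa in fixes_a:
        for pb, fb in fixes_b:
            if not fa or not fb:
                continue
            close = km_between(fa[:2], fb[:2]) <= max_km
            near = abs((fa[2] - fb[2]).total_seconds()) <= max_hours * 3600
            if close and near:
                yield pa, pb
```

In practice the signals come from far more sources than EXIF, but the "do these two people intersect in space and time" question reduces to checks of this shape.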
The interpretation of the law is up to the courts. The enforcement of it is up to the executive. The concept of the law is up to Congress. That’s how this is supposed to work.
One could argue that's a cynically accurate definition of most iterative development anyway.
But I don't know that I accept the core assertion. If the engineer is screening the output and using the LLM to generate tests, chances are pretty good it's not going to be worse than human-generated tech debt. If there's more accumulated, it's because there's more output in general.
Only if you accept the premise that the code generated by LLMs is identical to the developer's output in quality, just higher in volume. In my lived professional experience, that's not the case.
It seems to me that prompting agents and reviewing the output just doesn't... trigger the same neural pathways for people? I constantly see people submit agent-generated code with mistakes they would never have made themselves when "handwriting" code.
Until now, the average PR had one author and a couple of reviewers. From now on, most PRs will have no authors and only reviewers. We simply have no data about how this will impact both code quality AND people's cognitive abilities over time. If my intuition is correct, it will affect both negatively over time. It remains to be seen. It's definitely not something that the AI hyperenthusiasts think about at all.
You absolutely can copy that, it’s called voice cloning and you can do it on as little as a few seconds of audio. Once cloned, you can generate audio with that voice, saying whatever you want it to.
To be clear, I mean someone can’t file a lawsuit against someone else for sounding like them.
Of course you can have an AI target someone else’s voice. My point is that unless there is evidence it was intentional, it’s silly to claim that just because it sounds similar to a human’s voice, that means it must’ve been intentional.
It can be done, relatively easily.
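For a sense of how low the bar is, here's roughly what it looks like with one open-source option (Coqui TTS's XTTS model; the model name and API are taken from its documented usage and may change over time):

```python
# `reference.wav` is a short clip (a few seconds) of the voice to clone.
from TTS.api import TTS

tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")
tts.tts_to_file(
    text="Any sentence you want, in the cloned voice.",
    speaker_wav="reference.wav",   # short sample of the target speaker
    language="en",
    file_path="cloned_output.wav",
)
```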