Posted by harel 9 hours ago
Or it should be sealed for X years and then public record. Where X might be 1 in cases where you don't want to hurt an ongoing investigation, or 100 if it's someone's private affairs.
Nothing that goes through the courts should be sealed forever.
We should give up on the idea of databases which are 'open' to the public but where you have to pay for access, reproduction isn't allowed, records cost pounds per page, and bulk scraping is denied. That isn't open.
Free to ingest and make someone's crimes a permanent part of AI datasets, resulting in forever-convictions? No thanks.
AI firms have shown themselves to be playing fast and loose with copyrighted works, a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
1000x this. It’s one thing to have a felony for manslaughter. It’s another to have a felony for drug possession. In either case, if enough time has passed, and they have shown that they are reformed (long employment, life events, etc) then I think it should be removed from consideration. Not expunged or removed from record, just removed from any decision making. The timeline for this can be based on severity with things like rape and murder never expiring from consideration.
There needs to be a statute of limitations just like there is for reporting the crimes.
What I’m saying is, if you were stupid after your 18th birthday and caught a charge for peeing on a cop car while publicly intoxicated, I don’t think that should be a factor when you’re 45 and applying for a job after going to college, having a family, having a 20-year career, etc.
That's up to the person for the particular role. Imagine hiring a nanny and some bureaucrat telling you what prior arrest is "relevant". No thanks. I'll make that call myself.
Well, no; that's not up to you. While you may be interested in this information, the government also has a responsibility to protect the subject of that information.
The tradeoff was maintained by making the information available, but not without friction. That tradeoff is being shattered by third parties changing the amount of friction required to get the information. Logically, the government reacts by removing the information. It's not as good as it used to be, but it's better than the alternative.
They are only available for vulnerable sectors; you can't ask for one as a convenience store owner vetting a cashier. But if you are employing child care workers in a daycare, you can get them.
This approach balances the need for public safety against the ex-con's need to integrate back into society.
[1] https://rcmp.ca/en/criminal-records/criminal-record-checks/v...
You're over-thinking it, trying to solve for a problem that doesn't exist. No one has a "right" to work for me. There's plenty of roles that accept ex-cons and orgs that actively hire them.
So likewise, I require knowing everything about you, including things that are none of my business but that I just think are my business, and that's what matters. I'll make that call myself.
Note you are free to advertise hiring prior offenders.
You can also look up business ownership details and see if they have criminal records as well.
This is why this needs to be regulated.
I also think someone who has suffered a false accusation of that magnitude and fought to be exonerated shouldn’t be forced to suffer further.
Thanks, but I don't want to have violent people working as taxi drivers, pdf files in childcare and fraudsters in the banking system. Especially if somebody decided to not take this information into account.
Good conduct certificates are there for a reason -- you ask the faceless bureaucrat to give you one for the narrow purpose and it's a binary result that you bring back to the employer.
Please don't unnecessarily censor yourself for the benefit of large social media companies.
We can say pedophile here. We should be able to say pedophile anywhere. Pre-compliance to censorship is far worse than speaking plainly about these things, especially if you are using a homophone to mean the same thing.
This made me pause. It seems to me that if something is not meant to inform decision making, then why does a record of it need to persist?
Sometimes people are unfairly ostracized for their past, but I think a policy of deleting records will do more harm than good.
Also, when applying for a loan, being a sex offender shouldn’t matter. When applying for a mortgage across the street from an elementary school, it should.
The only way to have a system like that is to keep records, permanently, but decision making is limited.
Should it though? You can buy a piece of real estate without living there, e.g. because it's a rental property, or maybe the school is announced to be shutting down even though it hasn't yet. And in general this should have nothing to do with the bank; why should they care that somebody wants to buy a house they're not allowed to be in?
Stop trying to get corporations to be the police. They're stupendously bad at it and it deprives people of the recourse they would have if the government was making the same mistake directly.
To me any other viewpoint inevitably leads to abuse of one group or class or subset of society or another. If they are legally allowed to discriminate in some ways, they will seek to discriminate in others, both in trying to influence law changes to their benefit and in skirting the law when it is convenient and profitable.
I'm not sure we can write that much more COBOL.
At the heart of Western criminal law is the principle: You are presumed innocent unless proven guilty.
Western systems do not formally declare someone "innocent".
A trial can result in two outcomes: Guilty or Not Guilty (acquittal). Note that the latter does not mean the person was proven innocent.
Couldn't they just point to the court system's computer showing zero convictions? If guilty verdicts show up there, then showing none is already proof there are none.
You are found Guilty or confirmed you continue to be Not Guilty.
In Scotland there was also the verdict "not proven", but that's no longer the case for new trials.
At some point, personal information becomes history, and we stop caring about protecting the owner's privacy. The only thing we can disagree on is how long that takes.
The UK does not have a statute of limitations
https://en.wikipedia.org/wiki/Limitation_Act_1980
Applies to England and Wales; I believe there are similar ones for Scotland and NI.
As if "character" was some kind of immutable attribute you are born with.
If you can know the character of individual people, you have less reason to discriminate against those from statistically higher criminal communities.
That is a great recipe for systematic discrimination.
Indeed. And as far as I know, "courts" is not an alternative spelling of "AI".
I'd go further and say a lot of charges and convictions shouldn't be a matter of public record that everyone can look up in the first place, at least not with a trivial index. File the court judgement and other documentation under a case number, ban reindexing by third parties (AI scrapers, "background check" services) entirely. That way, anyone interested can still go and review court judgements for glaring issues, but a "pissed on a patrol car" conviction won't hinder that person's employment perspectives forever.
In Germany, for example, we have something called the Führungszeugnis - a certificate from the government showing that you haven't been convicted of a crime that warranted more than three months of imprisonment or the equivalent financial fine (calculated from one's earnings). Most employers don't even request it; only employers in security-sensitive environments, public service, or anything to do with children do (the latter get a certificate that also includes a bunch of sex offenses in the query).
The appellate court records would contain information from the trial court records, but most of the identifying information of the parties could be redacted.
So anyone who is interested in determining whether a specific behavior runs afoul of the law has to not only read through the law itself (which is, "thanks" to being a centuries-old tradition, very hard to read) but also wade through court cases from, in the worst case (very old laws dating to before the founding of the US), two countries.
Frankly, that system is braindead. It worked back when it was designed, when the body of law was very small, but today it's infeasible for any single human without the aid of sophisticated research tools.
The premise here is, during an investigation, a suspect might have priors, might have digital evidence, might have edge connections to the case. Use the platform and AI to find them, if they exist.
What it doesn’t do: “Check this video and see if this person is breaking the law”.
What it does do: “Analyze this person’s photos and track their movements, see if they intersect with Suspect B, or if Suspect B shows up in any photos or video.”
It does a lot more than that but you get the idea…
The interpretation of the law is up to the courts. The enforcement of it is up to the executive. The concept of the law is up to Congress. That’s how this is supposed to work.
You're conflating two distinct issues - access to information, and making bad decisions based on that information. Blocking access to the information is the wrong way to deal with this problem.
> a teenager shouldn't have their permanent AI profile become "shoplifter" because they did a crime at 15 yo that would otherwise have been expunged after a few years.
This would be a perfect example of something which should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
Blocking (or more accurately: restricting) access works pretty well for many other things that we know will be used in harmful ways. Historically, just having to go in person to a courthouse and request to view records was enough to keep most people from abusing the public information they had. It's perfectly valid to say that we want information accessible, but not accessible over the internet or in AI datasets. What do you think the "right way" to deal with the problem is? Because we already know that "hope that people choose to be better/smarter/more respectful" isn't going to work.
If all you care about is preventing the information from being abused, preventing it from being used is a great option. This has significant negative side effects though. For court cases it means a lack of accountability for the justice system, excessive speculation in the court of public opinion, social stigma and innuendo, and the use of inappropriate proxies in lieu of good data.
The fact that the access speedbump which supposedly worked in the past is no longer good enough is proof that an access speedbump is not a good way to do it. Let's say we block internet access but keep in-person records access in place. What's to stop Google or anyone else from hiring a person to visit the brick-and-mortar repositories to get the data, exactly the same way they sent cars to map all the streets? Anything that makes it hard for giant companies is going to make it hard for the common person. And why are we making the assumption that AI training on this data is a net social ill? While we can certainly imagine abuses, it's not hard to imagine real benefits today, let alone unforeseen benefits someone more clever than us will come up with in the future.
> What do you think the "right way" to deal with the problem is? Because we already know that "hope that people choose to be better/smarter/more respectful" isn't going to work.
We've been dealing with people making bad decisions from data forever. As an example, there was redlining, where institutions would refuse to sell homes or guarantee loans for minorities. Sometimes they would use computer models which didn't track skin color but had some proxy for it. At the end of the day you can't stop this problem by trying to hide what race people are. You need to explicitly ban that behavior. And we did. Institutions that attempt it are vulnerable to both investigation by government agencies and liability in civil suits from their victims. It's not perfect, there are still abuses, but it's so much better than if we all just closed our eyes and pretended that if the data were harder to get the discrimination wouldn't happen.
If you don't want algorithms to come to spurious and discriminatory conclusions, you must make algorithms auditable, and give the public reasonable access to interrogate these algorithms that impact them. If an AI rejects my loan application, you better be able to prove that the AI isn't doing so based on my skin color. If you can do that, you should also be able to prove it's not doing so based off an expunged record. If evidence comes out that the AI has been using such data to come to such decisions, those who made it and those who employ it should be liable for damages and, depending on factors like intent, adherence to best practices, and severity, potentially face criminal prosecution. Basically AI should be treated exactly the same as a human using the same data to come to the same conclusion.
It worked well enough for a pretty long time. No solution can be expected to work forever, we just need to modify the restrictions on criminal histories to keep up with the times. It's perfectly normal to have to reassess and make adjustments to access controls over time, not only because of technology changes, but also to take into account new problems with the use/misuse of the data being restricted and our changing values and expectations for how that data should be used and accessed.
> If you don't want algorithms to come to spurious and discriminatory conclusions, you must make algorithms auditable, and give the public reasonable access to interrogate these algorithms that impact them.
I think we'd have much better success restricting access to the data than handing it out freely and trying to regulate what everyone everywhere does with that data after they already have it. AI in particular will be very hard to regulate (as much as I agree that transparent/auditable systems are what we want), and I don't expect we'd have much success regulating what companies do behind closed doors or forcing them to be transparent about their use of AI.
We both agree that companies should be held liable for the discriminatory outcomes of their hiring practices no matter if they use AI or not. The responsibility should always fall on the company and humans running the show no matter what their tools/processes are since they decide which to use and how to use them.
We also agree that discrimination itself should be outlawed, but that remains an unsolved problem since detection and enforcement are extremely difficult. It's easier to limit the opportunity to discriminate than try to catch companies in the act. You mention that hiding people's race doesn't work, but that's actually being explored as a means to avoid bias in hiring. For example, stripping names and addresses (which can hint at race) before passing resumes to algorithms seems like it could help reduce unintentional discrimination.
Ultimately, there'll always be opportunities for a bigot to discriminate in the hiring process but I think we can use a multifaceted approach to limit those opportunities and hopefully force them to act more explicitly making deliberate discrimination a little easier to catch.
That's an assertion, but what's your reasoning?
> This would be a perfect example of something which should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
All across the EU, that information would be available immediately to journalists under exemptions for the Processing of Personal Data Solely for Journalistic Purposes, but would be simultaneously unlawful for any AI company to process for any other purposes (unless they had another legal basis like a Government contract).
Instead, we should make it illegal to discriminate based on criminal conviction history. Just like it is currently illegal to discriminate based on race or religion. That data should not be illegal to know, but illegal to use to make most decisions relating to that person.
Then there's cases like Japan, where not only companies, but also landlords, will make people answer a question like: "have you ever been part of an anti-social organization or committed a crime?" If you don't answer truthfully, that is a legal reason to reject you. If you answer truthfully, then you will never get a job (or housing) again.
Of course, there is a whole world outside of the United States and Japan. But these are the two countries I have experience dealing with.
The question a people must ask themselves: we are a nation of laws, but are we a nation of justice?
Absolutely not. I'm not saying every crime should disqualify you from every job, but convictions are really a government-officialized account of your behavior. Knowing a person has trouble controlling their impulses, leading to aggravated assault or something, very much tells you they won't be good for certain roles. As a business you are liable for what your employees do; it's in both your interest and your customers' interest not to create dangerous situations.
However, there's also jobs which legally require enhanced vetting checks.
I think the solution there is to restrict access and limit application to only what's relevant to the job. If someone wants to be a daycare worker, the employer should be able to submit a background check to the justice system who could decide that the drug possession arrest 20 years ago shouldn't reasonably have an impact on the candidate's ability to perform the job, while a history of child sex offenses would. Employers would only get a pass/fail back.
Welcome to the world of certificates of the good conduct and criminal record extracts:
Or, you might just be doing the meme: https://x.com/MillennialWoes/status/1893134391322308918?s=20
Exactly. One option is for the person themselves to be able to ask for a LIMITED copy of their criminal history, which is otherwise kept private, but no one else.
This way it remains private, the HR cannot force the applicant to provide a detailed copy of their criminal history and discriminate based on it, they can only get a generic document from the court via Mr Doe that says, "Mr Doe is currently eligible to be employed as a financial advisor" or "Mr Doe is currently ineligible to be employed as a school teacher".
Ideally it should also be encrypted by the company's public key and then digitally signed by the court. This way, if it gets leaked, there's no way to prove its authenticity to a third party without at least outing the company as the source.
Coincidentally these same countries tend to have a much much lower recidivism rate than other countries.
I'm an employer and I want to make sure you haven't committed any serious crimes, so I ask for a certificate saying you haven't committed violent crimes. I get a certificate saying you have. It was a fistfight from a couple of decades ago when you were 20, but I don't know if it's that or if you tortured someone to death. Gotta take a pass on hiring you, sorry.
Seems like the people this benefits relative to a system in which a company can find out the specific charges you were convicted of would be the people who have committed the most heinous crimes in a given category.
And only released if it's in the public interest. I'd be very very strict here.
I'm a bit weird here though. I basically think the criminal justice system is very harsh.
Except when it comes to driving. With driving, at least in America, our laws are a joke. You can have multiple at fault accidents and keep your license.
DUI, keep your license.
Run into someone because watching Football is more important than operating a giant vehicle, whatever you might get a ticket.
I'd be quick to strip licenses over accidents and if you drive without a license and hit someone it's mandatory jail time. No exceptions.
By far the most dangerous thing in most American cities is driving. One clown on fan duel while he should be focusing on driving can instantly ruin dozens of lives.
But we treat driving as this sacred right. Why are car immobilizers even a thing?
No, you can not safely operate a vehicle. Go buy a bike.
But the Internet's memory means that something being public at time t1 means it will also be public at all times after t1.
You can do something very simple like having a system that just lists if a person is - at that moment - in government custody. After release, there need not be an open record since the need to show if that person is currently in custody is over.
As an aside, the past few months have proven that the US government very much does not respect that reasoning. There are countless stories of people being taken and driven around for hours and questioned with no public paper trail at all.
Democrats love it too.
They call em Jump Outs. Historically the so-called constitution has been worth less than craft paper. From FDR's Executive Order 9066 to today, you have no rights.
Is the position that everyone who experienced that coverage, wrote about it in any forum, or attended, must wipe all trace of it clean, for “reasons”? The defendant has sole ownership of public facts? Really!? Would the ends of justice have been better served by sealed records and a closed courtroom? Would have been a very different event.
Courts are accustomed to balancing interests, but since the public usually is not a direct participant they get short shrift. Judges may find it inconvenient to be scrutinized, but that’s the ultimate and only true source of their legitimacy in a democratic system.
To me this brings in another question, when the discussion should be focused on the extent to which general records should be open.
Now for a serious answer: what happens in practice in Europe is not secret trials, because trials are very much public. Since there are only so many billionaires, their nephews, actual mafiosi, and politically exposed people facing prosecution, journalists monitor those cases closely, but will not be at a hearing about your co-worker's (alleged) wife-beating activities.
It's all reported, surname redacted (or not, it depends), but we all know who it's about anyway. "Court records say that a head of department at a government institution REDACTED1 was detained Monday; according to public information, the arrests happened at the Fiscal Service, and the position of department head is occupied by Evhen Sraka".
What matters when this happens is not the exact PII of the person anyway. I don't care which exact nephew of which billionaire managed to bribe the cops in the end, but whether it happened or not.
Rank-and-file cops aren't that interesting, by the way, unless it's a systemic issue, because the violence threshold is tuned down anyway -- nobody does a routine traffic stop geared up for occupying-army activities.
Like everything, privacy is not an absolute right and is balanced against all other rights and what you describe fits the definition of a legitimate public interest, which reduces the privacy of certain people (due to their position) by default and can be applied ad-hoc as well.
If a conviction is something minor enough that might be expungable, it should be private until that time comes. If the convicted person hasn't met the conditions for expungement, make it part of the public record, otherwise delete all history of it.
Sometimes you can't prove B was more qualified, but you can always claim some BS like "B was a better fit for our company culture".
Do you regard the justice system as a method of rehabilitating offenders and returning them to try to be productive members of society, or do you consider it to be a system for punishment? If the latter, is it Just for society to punish somebody for the rest of their life for a crime, even if the criminal justice system considers them safe to release into society?
Is there anything but a negative consequence to allowing a spent conviction to limit people's ability to work, or to own/rent a home? We have carve-outs for sensitive positions (e.g. working with children/vulnerable adults).
Consider what you would do in that position if you had genuinely turned a corner but were denied access to jobs you're qualified for.
Sure, there is still some leeway between only letting a judge decide the punishment and full-on mob rule, but it's not a slippery slope fallacy when the slope is actually slippery.
It's fairly easy to abuse the leeway to discriminate to exclude political dissidents for instance.
Good luck proving it when it happens. We haven't even managed to stop discrimination based on race and religion, and that problem has only gotten worse as HR departments started using AI which conveniently acts as a shield to protect them.
wouldn't making it illegal to discriminate based on criminal records prevent an elementary school from refusing to employ a candidate who is "fit for the job" (graduated from a good university, has years of experience in the field, etc) but just happens to have a child rape conviction, on the basis that he has a child rape conviction? doesn't 1 + 1 equal 2?
Alternatively, consider that if you support this you are assuming the worst behavior from the public and the best behavior from the government, and it should be obvious what a dangerous position that creates.
I robbed a drug dealer some 15-odd years ago while strung out. No excuses, but I paid my debt (4-11 years in state max, did the min), yet I still feel like I have this weight I can’t shake.
I have worked for almost the whole time, am no longer on parole or probation. Paid all fines. I honestly felt terrible for what I did.
At the time I had a promising career and a secret clearance. I still work in tech as a 1099 making way less than I should. But that is life.
What does a background check matter when the first 20 links on Google are about me committing a robbery with a gun?
Edit: mine is an extreme and violent case. But I humbly believe, to my benefit surely, that once I paid all debts it should be done. That is what the 8+ years of parole/probation/counseling was for.
But the courts are allowed to do it conditionally, so a common condition, if you ask for a lot of cases, is that you redact any PII before making the data searchable. This has the effect that people who actually care and know what to look for can find the information, but you can't just randomly search for someone and see what you get.
There is also a second registry, separate from the courts, that is used to keep track of people who have been convicted during the last n years, for background checks etc.
But why shouldn't a 19 year old shoplifter have that on their public record? Would you prevent newspapers from reporting on it, or stop users posting about it on public forums?
great moral system you have there
In one comment you managed to violate a whole bunch of the HN commenting guidelines.
please, show me your good faith interpretation and i will take back my comment
[1] https://en.wikipedia.org/wiki/Disclosure_and_Barring_Service
Yes
Why are we protecting criminals, just because they are minors? Protect victims, not criminals.
Unfortunately reputational damage is part of the punishment (I have a criminal record), but maybe it's moronic to create a class of people who can avoid meaningful punishment for crimes?
This - nearly all drug deliveries in my town are done by 15-year-olds on overpowered electric bikes. Same with most shoplifting. The real criminals just recruit schoolchildren to do the work because they know schoolchildren rarely get punished.
At a certain point, we say someone is an adult and fully responsible for their actions, because “that’s who they are”.
It’s not entirely nuanced—and in the US, at least, we charge children as adults all the time—but it’s understandable.
At a certain point, poorly thought out "protections" turn into a system that protects organized crime, because criminals aren't as stupid as lawmakers, and they exploit the system.
There is a big difference between making a mistake as a kid that lands you in trouble, and working as an underling for organized crime committing robberies, drug deals, and violent crime, and never having to face responsibility for your actions.
The legal system has so many loopholes for youth, for certain groups, that the law is no longer fair, and that is its own problem, contributing to the decline of public trust.
Have you ever considered that these children are victims of organized crime? That they aren't capable of understanding the consequences of what they're doing and that they're being manipulated by adult criminals?
The problem here isn't the lack of long term consequences for kids.
12 year olds know it’s not right to sell crack.
The problem is the gap between lack of legal opportunities for youth and the allure of easy money, status and power from being a criminal. Doesn’t help that the media makes it look so fun and cool to be a gangster.
Just because exceptions are exploitable, doesn't mean we should just scrap all the exceptions. Why not improve the wording and try to work around the exceptions?
The problem happening in most Western countries is that criminal organizations take advantage of the fact that minors get reduced sentences and that their criminal records are usually kept sealed (unless they are tried as adults). Whether it be having them steal cars, partake in organized shoplifting operations, muggings, gang activity, drug dealing, etc...
Your reasoning for why this information shouldn't be public record seems to boil down to the fact that you don't agree with other people's judgment of someone's past crimes. You'd like to see more forgiveness, and you don't think others will show the same forgiveness, so you want to take away all access to the information because of that. To me that seems like a view from a point of moral superiority.
I'd rather people get access to this information and be able to use their own brains to determine whether they want that person working there. If you were involved in shoplifting at 17 years old, and turn 18, I think it would be very fair for a store owner to be able to use that information to judge you when making a hiring decision. To me it doesn't make sense that you turn a magical age of 18 and suddenly your past poor decisions vanish into a void.
Protect victims and criminals. Protect victims from the harm done to them by criminals, but also protect criminals from excessive, or, as one might say, cruel and unusual punishment. Just because someone has a criminal record doesn't mean that anything that is done to them is fair game. Society can, and should, decide on an appropriate extent of punishment, and not exceed that.
If it's not supposed to be public then don't publish it. If it's supposed to be public then stop trying to restrict it.
If something is expungable it probably shouldn't be public record. Otherwise it should be open and scrapable and ingested by both search engines and AI.
No, I don't think if you shoplift as a teenager and get caught, charged, and convicted that automatically makes you a shoplifter for the rest of your life, but you also don't just get to wave a magic wand and make everyone forget you did what you did. You need to demonstrate you've changed and rebuild trust through your actions, and it's up to each individual person to decide whether they're convinced you're trustworthy, not some government official with a delete button.
Additionally, do you want a special class of privileged people, like a priestly class, who can interpret the data/bible for the peasantry? That mentality seems the same as that behind the old Latin bibles and Latin mass that people were abused to attend, even though they had no idea what was being said.
So who would you grant the privilege of doing “research”? Only the true believers who believe what you believe, so you wouldn't have to be confronted with contradictions?
And how would you prevent data exfiltration? Would you have your authorized “researchers” go to a building, we can call it the Ministry of Truth, where they would have access to the information through telescreen terminals, like how the DOJ is controlling the Epstein Files and then monitoring what the Congressmen were searching for? Do you think we would have discovered all we have discovered if only the Epstein people that rule the country had access to the Epstein files?
Yes, convictions are permanent records of one’s offenses against society, especially the egregious offenses we call felonies in the USA.
Should I, as someone looking for a CFO or just an accountant, not have the right to know that someone was convicted of financial crimes, which are usually long preceded by other transgressions and things like “mistakes” everyone knows weren’t mistakes? How would any professional association limit certification if that information is not accessible? So Madoff should have been able to get out and continue being involved in finance and investments?
Please explain
Is this the UK thing where PII is part of the released dataset? I know that Ukrainian rulings are all public, but the PII is redacted, so you can train your AI on largely anonymized rulings.
I think it should also be against GDPR to process sensitive PII like health records and criminal convictions without consent, but once it hits the public record, it's free to use.
On the other hand, perpetrating crime is a GREAT predictor of perpetrating more crime -- in general most crime is perpetrated by past perps. Why should this info not be available to help others avoid troublemakers?
https://bjs.ojp.gov/library/publications/returning-prison-0
https://www.prisonpolicy.org/graphs/sex_offense_recidivism_2...
https://usafacts.org/articles/how-common-is-it-for-released-...
https://pmc.ncbi.nlm.nih.gov/articles/PMC3969807/
https://ciceroinstitute.org/research/the-case-for-incarcerat...
I think this is wrong; it should be reported in full for at least 5 years after the fact.
That is the entire point of having courts, since the time of Hammurabi. Otherwise it's back to the clan system, where justice is made by avenging blood.
Making and using any "profiles" of people is an entirely different thing than having court rulings accessible to the public.
The idea that society is required to forget crime is pretty toxic honestly.
Though if the details of the case were not public or hard to access they might assume it was worse than it was. (Realistically no child would get prosecuted for stealing a candy bar one time, but I'll grant maybe there are other convictions that sound worse without context.) Maybe the problem is actually that the data is not accessible enough, rather than too accessible?
We can allow access to private persons while disallowing commercial usage and forbid data processing of private information (outside of law enforcement access).
Kinda like it was in pre-digital days. No, we can't go back, but we can at least _try_ to safeguard PII.
Most EU countries have digital ID's, restricting and logging (for a limited time) all access to records to prevent mis-use. Anyone caught trying to scrape can be restricted without limiting people from accessing or searching for specific _records or persons of interest_ (seriously, would anyone have time to read more than a couple of records each day?).
> We should give up with the idea of databases which are 'open' to the public, but you have to pay to access, reproduction isn't allowed, records cost pounds per page, and bulk scraping is denied. That isn't open.
I really don't see why. Adding friction to how available information is may be a way to preserve the ability for the public to access information, while also avoiding the pitfalls of unrestricted information access and processing.
It's not about any post-case information.
Any tool like this that can help important stories be told, by improving journalist access to data and making the process more efficient, must be a good thing.
How about rate limited?
If load on the server is a concern, make the whole database available as a torrent. People who run scrapers tend to prefer that anyway.
This isn't someone's hobby project run from a $5 VPS - they can afford to serve 10k qps of readonly data if needed, and it would cost far less than the salary of 1 staff member.
I’d then ask OpenAI to be open too since open is open.
I could have my mind changed if the public policy is that any public data ingested into an AI system makes that AI system permanently free to use at any degree of load. If a company thinks that they should be able to put any load they want on public services for free, they should be willing to provide public services at any load for free.
It's like being made to search through sand: it's bad enough when you can use a sieve, but then they tell you that you can only use your bare hands, and your search efforts become useless.
This is not a new tactic btw and pretty relevant to recent events...
OR humans should be allowed to access the public record, while scrapers are charged fees.
England has a genuinely independent judiciary. Judges and court staff do not usually attempt to hide from journalists stuff that journalists ought to be investigating. On the other hand, if it's something like an inquest into the death of a well-known person which would only attract the worst kind of journalist they sometimes do quite a good job of scheduling the "public" hearing in such a way that only family members find out about it in time.
A world government could perhaps make lots of legal records public while making it illegal for journalists to use that material for entertainment purposes, but we don't have a world government: if the authorities in one country were to provide easy access to all the details of every rape and murder in that country, then so-called "tech" companies in another country would use that data for entertainment purposes. I'm not sure what to do about that, apart, obviously, from establishing a world government (which arguably we need anyway in order to handle pollution and other "tragedy of the commons" problems, but I don't see it happening any time soon).
Eg if you create a business then that email address/phone number is going to get phished and spammed to hell and back again. It's all because the government makes that info freely accessible online. You could be a one man self-employed business and the moment you register you get inundated with spam.
I don't think all information should be easily accessible.
Some information should be in libraries, held for the public to access, but have that access recorded.
If a group of people (citizens of a country) have data stored, they ought to be able to access it, but others maybe should pay a fee.
There is data in "public records" that should be very hard to access, such as evidence of a court case involving the abuse of minors that really shouldn't be public, but we also need to ensure that secrets are not kept to protect wrongdoing by those in government or in power.
Yes, license plates are public, and yes, a police officer could have watched to see whether or not a suspect vehicle went past. No, that does not mean that it's the same thing to put up ALPRs and monitor the travel activity of every car in the country. Yes, court records should be public, no, that doesn't mean an automatic process is the same as a human process.
I don't want to just default to the idea that the way society was organized when I was a young person is the way it should be organized forever, but the capacity for access and analysis when various laws were passed and rights were agreed upon are completely different from the capacity for access and analysis with a computer.
They have the ability to seal documents until set dates and to deal with digital archival and retrieval.
I suspect some of this is it's a complete shit show and they want to bury it quickly or avoid having to pay up for an expensive vendor migration.
I think it's right to prevent random drive-by scraping by bots/AI/scammers. But it shouldn't inhibit consumers who want to use it to do their civic duties.
One individual could spend their entire life going through one by one recording cases and never get through the whole dataset. A bot farm could sift through it in an hour. They are not the same thing.
Why? They generate massive traffic, why should they get access for free?
What about family law?
Family law is just the most obvious and unarguable example.
The government can decide to stop paying for the infra, but the only way to delete something that was once public record should be for all interested parties to also stop their nodes.
The consensus mechanism would be: block is good if it has a judge's signature (or some other combination of signatures from other elected officials, depending on how the laws work where you are).
Or are you proposing that somebody out there is prepared to subvert the signatures by computing a hash collision or some other herculean task?
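For what it's worth, the scheme described above is simple to sketch. Here's a minimal, hypothetical stdlib-only Python illustration of hash-chained record blocks validated by an official's signature; HMAC with a placeholder key stands in for the real public-key signature (e.g. Ed25519) a judge would actually use, purely to keep the example self-contained:

```python
import hashlib
import hmac
import json

# Placeholder only: a real system would use asymmetric keys so that
# anyone can verify a block without being able to forge one.
JUDGE_KEY = b"judge-secret-key"


def make_block(prev_hash: str, records: list) -> dict:
    """Create a block linking to the previous block and sign its contents."""
    payload = json.dumps({"prev": prev_hash, "records": records}, sort_keys=True)
    sig = hmac.new(JUDGE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"prev": prev_hash, "records": records, "sig": sig}


def block_hash(block: dict) -> str:
    """Hash of the whole block, used as the `prev` pointer of the next one."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


def is_valid(block: dict) -> bool:
    """A block is good iff its signature matches its contents."""
    payload = json.dumps({"prev": block["prev"], "records": block["records"]},
                         sort_keys=True)
    expected = hmac.new(JUDGE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, block["sig"])


genesis = make_block("0" * 64, [{"case": "R v Example", "outcome": "acquitted"}])
block2 = make_block(block_hash(genesis), [{"case": "R v Other", "outcome": "pending"}])
assert is_valid(genesis) and is_valid(block2)

# Altering a record after the fact invalidates the signature, so deletion
# or tampering is detectable by anyone holding a copy of the chain.
tampered = dict(genesis, records=[{"case": "R v Example", "outcome": "convicted"}])
assert not is_valid(tampered)
```

The point is that subverting an already-published record requires forging the signing key or finding a hash collision, not merely switching off a server.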
It's not easy to see who to believe. The MP introducing it claims there is a "cover-up", but that is just what MPs do. Of course it makes him look bad: a service he oversaw the introduction of is being withdrawn. The rebuttal by the company essentially denies everything. At the same time, it's important to notice the government are working on a replacement system of their own.
I think this is a non-event. If you really want to read the court lists you already can, and without paying a company for the privilege. It sounds like HMCTS want to internalise this behaviour by providing a better centralised service themselves, and meanwhile all the fuss appears to be from a company, operated by an ex-newspaper editor, that just had its only income stream, built around preferential access to court data, cut off.
As for the openness of court data itself, wake me in another 800 years when present day norms have permeated the courts. Complaining about this aspect just shows a misunderstanding of the (arguably necessary) realities of the legal system.
An analogy would be Hansard and theyworkforyou.com. The government always made Hansard (record of parliamentary debates) available. But theyworkforyou cleaned the data, and made it searchable with useful APIs so you could find how your MP voted. This work was very important for making parliament accessible; IIRC, the guys behind it were impressive enough that they eventually were brought in to improve gov.uk.
> “We are also working on providing a new licensing arrangement which will allow third parties to apply to use our data. We will provide more information on this in the coming weeks.
https://www.tremark.co.uk/moj-orders-deletion-of-courtsdesk-...
They raise the interesting point that "publicly available" doesn't necessarily mean it's free to store/process etc.:
> One important distinction is that “publicly available” does not automatically mean “free to collect, combine, republish and retain indefinitely” in a searchable archive. Court lists and registers can include personal data, and compliance concerns often turn on how that information is processed at scale: who can access it, how long it is kept, whether it is shared onward, and what safeguards exist to reduce the risk of harm, especially in sensitive matters.
Absolutely fucking crazy that you typed this out as a legitimate defense of allowing extremely sensitive personal information to be scraped.
> only system that could tell journalists what was actually happening in the criminal courts
Who cares? Journalism is a dead profession and the people who have inherited the title only care about how they can mislead the public in order to maximize profit to themselves. Famously, "journalists" drove a world-renowned musician to his death by overdose with their self-interest-motivated coverage of his trial[1]. It seems to me that cutting the media circus out of criminal trials would actually be massively beneficial to society, not detrimental.
[1] https://www.huffpost.com/entry/one-of-the-most-shameful_b_61...
If it is public, it will be scraped, AI companies are irrelevant here.
If information is truly sensitive, do not make it public, and that's completely fine. This might have been the case here.
>Oh no, some musician died, PASS THE NATIONAL SECURITY ACT, LOCK DOWN ALL INFORMATION ABOUT CRIMINALS, JAIL JOURNALISTS!!!!
"Free speech" is some kind of terminal brain worm that begs itself to be invoked to browbeat most anybody into submission, I suppose. Now we're apparently extending "free speech" to mean "the government must publish sensitive private information about every citizen in an easily-scraped manner". Well, I don't buy your cheap rhetorical trick. I support freedom of speech in exactly the scope it was originally intended, that is, the freedom to express ideas without facing government censorship or retaliation. Trying to associate your completely unrelated argument with something that everybody is expected to agree with by default is weak.
Nor is it "an attack on journalism". If a real journalist still exists in the current year, they can do investigative work to obtain information that is relevant to the public's interests and then publish it freely. Nobody is stopping them.
> LOCK DOWN ALL INFORMATION ABOUT CRIMINALS
Notably, people who aren't criminals may find themselves in court to determine whether or not they are. Unfortunately, people like yourself and the entirety of the broadcast and print media then go on to presume every person who goes to court is a criminal and do everything in their power to ruin their lives. Far from just "some musician", most people who are arrested on serious charges get a black mark on their record that effectively destroys their careers and denies them the ability to rent property outside of a ghetto, as employers and landlords discriminate against them baselessly even after they are acquitted of all charges against them.
it's a clear cut free speech issue, you just don't want to admit it since certain data being available and spread to other people doesn't suit your political ideology
That said I don't know why the hell the service concerned isn't provided by the government itself.
"We hired a specialist firm to build, in a secure sandbox, a safety tool for journalists. They are experts in building privacy-preserving AI solutions - for people like law firms or anyone deeply concerned with how data is held, processed, and protected. That’s why we chose them. Their founders are not only respected academics in addition to being professionals, they have passed government security clearance and DBS checks in the past, and have worked on data systems for the National Archives, the Treasury, and other public agencies. They’ve published academic papers on data compliance for machine learning.
"The Minister says we ‘shared data with an AI company”... as if we were pouring this critically sensitive information into OpenAI or some evil aggregator of data. This is simply ridiculous when you look at what we do and how we did it.
"We didn’t “share” data with them. We hired them as our technical contractor to build a secure sandbox to test an idea, like any company using a cloud provider or an email service. They worked under a formal sub-processor agreement, which means under data protection law they’re not even classified as a “third party.” That’s not our interpretation. It’s the legal definition in the UK GDPR itself. ... "And “for commercial purposes”? The opposite is true. We paid them £45,000 a year. They didn’t pay us a penny. The money flowed from us to them. They were prohibited, in writing, from selling, sharing, licensing, or doing anything at all with the data other than providing the service we hired them for... and they operated under our supervision at all times. They didn’t care what was in the data - we reviewed, with journalists, the outputs to make sure it worked."
If this is true, it does seem that the government has mischaracterized what happened.
> Our understanding is that some 700 individual cases, at least, were shared with the AI company. We have sought to understand what more may have been shared and who else may have been put at risk, but the mere fact that the agreement was breached in that way is incredibly serious.
> ... the original agreement that was reached between Courtsdesk and the previous Government made it clear that there should not be further sharing of the data with additional parties. It is one thing to share the data with accredited journalists who are subject to their own codes and who are expected to adhere to reporting restrictions, but Courtsdesk breached that agreement by sharing the information with an AI company.
(from https://hansard.parliament.uk/Commons/2026-02-10/debates/037...)
But when the conspiracy involves lack of prosecution or inconsistent sentencing at scale and then the Ministry of Justice issues a blanket order to delete one of the best resources to look into those claims...? Significantly increases the legitimacy of the claims.
I assumed it was the usual conspiracy stuff up until this order.
The counter claim by the government is that this isn't "the source of truth" being deleted but rather a subset presented more accessibly by a third party (CourtsDesk) which has allegedly breached privacy rules and the service agreement by passing sensitive info to an AI service.
Coverage of the "urgent question" in parliament on the subject here:
House of Commons, Courtsdesk Data Platform Urgent Question
> ... the agreement restricts the supply of court data to news agencies and journalists only.
> However, a cursory review of the Courtsdesk website indicates that this same data is also being supplied to other third parties — including members of @InvestigatorsUK — who pay a fee for access.
> Those users could, in turn, deploy the information in live or prospective legal proceedings, something the agreement expressly prohibits.
https://hansard.parliament.uk/Commons/2026-02-10/debates/037...
Relatedly, there's an extremely good online archive of important cases in the past, but because they disallow crawlers in robots.txt: https://www.bailii.org/robots.txt not many people know about it. Personally I would prefer if all reporting on legal cases linked to the official transcript, but seemingly none of the parties involved finds it in their interest to make that work.
https://x.com/CPhilpOfficial/status/2021295301017923762
https://xcancel.com/CPhilpOfficial/status/202129530101792376...
This kind of logic does more disservice than people realize. You can combat bigotry towards immigrants (issue #1), without covering up for criminal immigrants (issue #2) in fear of increase of issue #1 among the natives. It only brings up more resentment and bigotry.
That’s why the government should be transparent.
Prosecution of sexual assault is often handled extremely badly. It needs to be done better, without fear or favor, including for people who are friends with the police or in positions of power, as we're seeing with the fallout of the Epstein files.
Great. How does it change the substance of my comment?
Perhaps, instead of arguing about whether “immigrants” is always a group as a collective, or a certain number of individuals acting together, you would focus on the high level implications of government’s action or inaction?
What do Epstein files have to do with anything right now? Stop shifting the goal posts.
- Policing language to distract from the topic.
- Trying to claim things are just a series of isolated incidents with absolutely nothing in common
- Claiming there are wider problems (that should be addressed in a manner that would take years and isn't even defined well enough to claim measure as being "better")
In the local data that the audit examined from three police forces, they identified clear evidence of “over-representation among suspects of Asian and Pakistani-heritage men”.
It’s unfortunate to watch people and entire countries twist themselves into logic pretzels to avoid ever suggesting that immigration has any ills, and we’re just being polite about it here.
https://www.aljazeera.com/news/2025/6/17/what-is-the-casey-r...
Whilst we're on Rotherham:
"...by men predominantly of Pakistani heritage" [0]
https://www.bbc.com/news/uk-england-south-yorkshire-61868863
Their parents or grandparents were immigrants...
So it would seem that you're the one straying from the topic.
This principle applies directly to search engines and data processors. In ordering the Courtsdesk data deleted before it reached AI companies, the government recognized the constraint: AI systems can't apply this "completeness" standard. They preserve raw archives without contextual updates. They can't distinguish between "investigato" (under investigation) and "assolto" (acquitted). They can't enforce the court's requirement that information must be "correttamente aggiornata con la notizia dell'esito favorevole" (properly updated to reflect the favorable resolution of the proceedings).
The UK government prevented the structural violation the Cassazione just identified: permanent archival of incomplete truth. This isn't about suppressing information, it's about refusing to embed factually false contexts into systems designed for eternity.
Then they start jailing people for posts.
Then they get rid of juries.
Then they get rid of public records.
What are they trying to hide?
In other countries, interference with the right to a fair trial would have led to widespread protest. We don't hold our government to account, and we reap the consequences of that.
Though I'm not sure stopping this service achieves that.
Also, even in the case that somebody is found guilty, there is a fundamental principle that such convictions have a lifetime, after which they stop showing up on police searches etc.
If some third party (not applicable in this case) holds all court cases forever in a searchable format, it fundamentally breaches this right to be forgotten.
Obviously the government Ministry of Justice cannot make other parts of government more popular in a way that appeases political opponents, so the logical solution is to clamp down on open justice.