Calculated cruelty: When the algorithm makes decisions instead of public bodies

The accelerated adoption of automated decision-making, in both the private and the public sector, produces absurd situations and cruel outcomes. The bulk of the price is paid, unsurprisingly, by poor communities.

In his songwriting, Meir Ariel liked to mix the mundane with the prophetic, and "We Passed Pharaoh" is a perfect example. The song, now 30 years old, deals in part with the triviality of everyday life and the ordinary citizen's struggle against the faceless bureaucratic violence of the big establishment. But the second stanza, as if taken from the world of science fiction, is in essence a futuristic account of that violence becoming mechanized and automated.

The algorithm (embodied in the song as the electronic secretary and the automatic judge) makes decisions based on opaque and unchallengeable considerations; it is the machine that racks up debts or erases money in mysterious ways. The individual is no longer at the mercy of a flesh-and-blood official, but under the jurisdiction of a machine based on artificial intelligence, devoid of personality and feeling, which will decide his fate for better or for worse.

"A programming error cost me a million,

An ATM deceived me about my account balance,

An electronic secretary refused me an interview,

My license was suspended by an automatic judge,

I lost a token to the mechanical operator …"

- Meir Ariel, "We Passed Pharaoh," 1990

When Ariel wrote the song, artificial intelligence and machine learning were confined mainly to research laboratories in academic institutions. But his prophecy, the dystopian vision of a world in which every facet of our lives is governed by them, hit the mark. And not only when it comes to the algorithms of the technology giants, which shape every part of our online lives and can endanger us all, as I wrote here last week. In even broader and more substantive respects, the words written in the last century match today's reality.

The use of algorithms and artificial intelligence software to make decisions about people's status, eligibility and rights has become so widespread that a group of U.S. lawyers has seen how such systems can deprive vulnerable residents of access to subsidized housing, jobs and basic care, and even keep people living in poverty behind bars with no recourse.

Credit scoring systems, which collect data on citizens' financial habits and use it to assign them a score that determines how risky they are considered as borrowers, are one of the main culprits in this respect.

A high score will earn you a loan on favorable terms and interest rates; a lower score means usurious interest; and a low score will not get you loan approval at all. In Israel, the credit rating system has been operating under the Bank of Israel for only about a year and a half, and it is still too early to determine its effect on Israelis at the bottom of the socioeconomic ladder. But in the US, the credit score already affects citizens' ability to buy a vehicle and even to secure a full-time job, depending on the decisions made on its basis.
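To make the mechanism concrete, here is a minimal sketch in Python of the kind of threshold logic such a score ultimately feeds. The cut-offs and interest rates are hypothetical, invented purely for illustration, and do not reflect any real lender's policy.

```python
# Illustrative only: hypothetical cut-offs, not any real lender's policy.
def loan_decision(credit_score: int) -> str:
    """Map a credit score to a (made-up) lending outcome."""
    if credit_score >= 750:   # high score: loan on favorable terms
        return "approved at 3% interest"
    if credit_score >= 600:   # middling score: usurious terms
        return "approved at 18% interest"
    return "denied"           # low score: no loan at all

for score in (780, 640, 520):
    print(score, "->", loan_decision(score))
```

The point of the sketch is how little room such a rule leaves for context: everything about the applicant is collapsed into a single number and a hard threshold.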

But the credit scoring algorithm is only one example of how artificial intelligence systems influence people's finances and everyday lives. According to an article in MIT Technology Review, algorithms make additional decisions, such as which children will be sent to foster care, which patients will receive medical treatment, or which families will receive subsidized or affordable housing. For low-income earners, the article suggests, the accelerated introduction of automated decision-making systems has created a hidden web of interlocking traps.

In one case described to the publication by Prof. Michelle Gilman, a lawyer who runs the University of Baltimore's legal clinic, a member of a family the clinic serves lost his job, and a computer error immediately denied him eligibility for unemployment insurance.

As a result of the error the family fell behind on rent, and the landlord filed a lawsuit to evict them. Because of the coronavirus pandemic the family was ultimately not evicted from their home, but the eviction lawsuit still appears in public records.

That record will be fed into tenant-screening software, making it hard for the family to find a new apartment in the future. Failure to pay rent and other bills also damages the credit score, leading to further repercussions. "It will affect the ability to sign up with a cellular provider, take out a loan, buy a car or get a job (the electronic secretary who refused an interview, and so on). These are effects that cascade down the chain."

"And this is not an exceptional case. It happens to all of our clients, across the board," she added. "They are caught in so many algorithms that deny them essential resources, and because some of these mechanisms are opaque, they are often not even aware of it."

Gilman distinguishes between two kinds of algorithms whose combination creates a dangerous spiral. The first is the credit scoring systems, which in the US are privately operated. These are built on a wealth of data that is now easier than ever to collect and distribute, including public records, social network information, web browsing, banking activity and app usage, and they serve not only lenders but also landlords, employers and even schools.

The second category is algorithms adopted by government agencies to determine people's eligibility for health insurance, unemployment benefits, various grants and more. These are usually implemented with the goal of streamlining procedures, as part of digitization and modernization efforts. A worthy goal in itself, but the problem is that the process by which the systems are procured is opaque, and no one takes responsibility when something goes wrong.

The lack of transparency often compounds these systems' propensity to err. In 2013, Michigan launched a computerized system to determine eligibility for unemployment insurance, but the algorithm wrongly determined that 34,000 people had defrauded the system, and the error was discovered only about two years later. "It led to a massive loss of eligibility," Philadelphia Legal Assistance attorney Julia Simon-Mishel told the magazine. "There were bankruptcies, suicides and a serious mess."
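A minimal sketch of how such an error scales: suppose, hypothetically, that the system flags any claim where the claimant's reported income does not exactly match the employer's report, with no tolerance and no human review. A single data-entry discrepancy is then enough to brand an honest claimant a fraudster, and the same rule applied to every record flags thousands of people at once. The field names and the rule below are invented for illustration; they are not the actual logic of the Michigan system.

```python
# Illustrative sketch of an over-aggressive, fully automated fraud rule.
# Field names and the rule itself are hypothetical, not the real system.
claims = [
    {"id": 1, "claimant_income": 2400, "employer_income": 2400},
    {"id": 2, "claimant_income": 2400, "employer_income": 2395},  # rounding/typo
    {"id": 3, "claimant_income": 0,    "employer_income": 1200},  # late paperwork
]

def looks_like_fraud(claim: dict) -> bool:
    # Any mismatch at all is treated as fraud: no threshold, no appeal step.
    return claim["claimant_income"] != claim["employer_income"]

flagged = [c["id"] for c in claims if looks_like_fraud(c)]
print("claims flagged as fraud:", flagged)  # claims 2 and 3 are false positives
```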

Last September, Gilman published a report detailing the various algorithms that lawyers in the field may encounter: automated hiring and screening systems for workers, automated debt collection systems, an algorithm that assesses the risk of recidivism among offenders in order to assist judges in sentencing (the automatic judge who suspended a license) and in decisions on whether to expunge criminal records (several studies have already shown that these tend to classify minorities as higher risk), as well as systems that help businesses cut workers' hours and wages. Gilman also describes in the report how to deal with various situations. For a client unable to rent an apartment because of a low credit score, for example, the report suggests that the lawyer representing them first verify that the information fed into the credit scoring system is correct.

Awareness of the problem in the United States is still in its infancy, and many lawyers serving underprivileged populations do not realize how much algorithms affect their clients' lives. Even those who are aware find themselves groping in the dark. "We need more training, more knowledge, not only in law but in these systems," says Gilman. "Eventually, every case will become a case about an algorithm."