Posted April 15, 2021, 06:28

Modified December 24, 2022, 22:36

Without knowledge or control: the risks of a new type of social governance

Algorithms are deciding people's fates without their knowledge and without any oversight.

According to representatives of the Commission for Legal Support of the Digital Economy of the Moscow branch of the Russian Lawyers Association, the transition to a new type of social governance carries risks of discrimination against the rights and legitimate interests of citizens.

The lawyers shared their opinion with Novye Izvestia.

Today, many decisions with legal consequences for citizens are made by automated decision-making (APR) systems without human involvement.

These decisions can draw on a variety of inputs collected from diverse sources, including social media, telecom operators, government databases, and even so-called “inferred data”. Such algorithms are often built on artificial intelligence technologies, including self-learning ones.

APR systems are actively used in banking, sports, insurance, recruitment and government. For example, according to a report by JP Morgan, algorithmic systems account for almost 90% of share-trading volume on US stock exchanges. In a number of European countries, a growing share of companies use similar algorithms when screening job candidates, filtering out more than 70% of incoming resumes before they reach HR managers.

In the Russian Federation, APR systems are already actively used in the banking sector. For example, VTB and Credit Bank of Moscow use similar algorithms when making decisions on loans.

In addition, in public administration, APR systems are used to impose administrative liability on the basis of video and photographic evidence, as well as other technical data, such as data from mobile network operators. Such systems were used in Moscow in the spring of last year to prosecute violations of the "self-isolation" and "quarantine" regimes.

The media reported many absurd cases of erroneous prosecution, for example of paralyzed citizens, or of citizens whose apartment sat on the border between Moscow and the Moscow region. According to experts, these cases involved low-quality data and faulty algorithms.

According to experts of the Commission for Legal Support of the Digital Economy of the Moscow branch of the Association of Lawyers of Russia, as society digitalizes at an accelerating pace, a growing share of management decisions will be made by APR systems, giving rise to a new type of management: algorithmic governance.

According to the chairman of the Commission, Alexander Zhuravlev, APR systems help to reduce the time and money spent on decision-making, and to reduce subjectivity in the decisions themselves. At the same time, the use of such algorithms carries risks of discrimination against the rights and legitimate interests of the citizens these systems affect.

“Within APR systems, discrimination can arise from imperfections in the algorithms embedded in them or from poor-quality data used to train the system,” explains Zhuravlev.

In the Commission's view, it is therefore already important for Russia to develop adequate legal regulation of APR systems.

The main goal of such regulation is to make the functioning of these systems more transparent for citizens and for society as a whole.

The Commission proposes the following set of instruments as possible mechanisms for ensuring such transparency.

One of the most important measures for ensuring transparency in the operation of APR systems is to guarantee the auditability of such a system's source code. It is also advisable, following the example of Canada, to consider introducing independent peer-review mechanisms at the stage of procuring and deploying such systems in the public sector, and raising the level at which decisions to deploy them are made.

It is also extremely important to create effective mechanisms for appealing against decisions of state bodies adopted in an automated manner.

Such mechanisms should rest on qualitatively different principles than appeals against traditional “analogue” decisions. According to the experts, appeal mechanisms for automated decisions of state bodies should be as “friendly” as possible for citizens and organizations, prompt, and should ensure that complaints are considered by competent persons.

The experts also note that mechanisms are needed to minimize incentives for the automatic “rubber-stamping” of algorithmic decisions, and to provide prompt, high-quality feedback on the outcome of complaints.

At the same time, Alexander Savelyev, Deputy Chairman of the Commission for Legal Support of the Digital Economy of the Moscow branch of the Russian Lawyers Association, notes:

“Recently, organizations and the state have increasingly been using automated decision-making tools in relation to citizens, tools that directly affect their rights. Whether people can get a loan or a job, whether they will be held administratively liable, and many other aspects of their daily life depend on such decisions. This makes it quite urgent to develop adequate legal regulation of how such systems are built and used, in order to ensure the transparency of their functioning and their accountability to society. In my view, the state should start with itself and at the initial stage extend such regulation to its own information systems, both because they make the decisions most sensitive for citizens and because of the general demand for greater transparency in the activities of state bodies. Proven approaches can then be extended to the private sector. Until then, commercial operators could use such systems within the framework of self-regulation and existing legal norms, adapted through clarifications from the regulators.”

At the same time, in the Commission's view, APR systems can be introduced into legal proceedings only after they have been tested and refined in "digital sandboxes" under experimental legal regimes, and only for certain categories of disputes: excessive "algorithmization" of the Russian judicial system could undermine the principle of equality of arms in proceedings and give unjustified advantages to the individual companies that develop and apply the corresponding technological solutions.

The Commission for Legal Support of the Digital Economy of the Moscow branch of the Association of Lawyers of Russia sent Novye Izvestia a list of problems that automated decision-making systems can provoke in various fields.

Banking sector:

In the banking sector, an APR system may deny a citizen a loan because the system's algorithms classify him as unreliable: his social-network friends include people who have previously defaulted on loans, or linguistic analysis of his published posts or photos indicates, in the view of the algorithm's developers, the person's low social status. There is no external control over the validity of such conclusions; they are at the exclusive discretion of the developer.

The problem is exacerbated by the fact that the input data used by the system may be incomplete, inaccurate or “fake”. The opacity of the algorithms and data sources behind the decisions, and the lack of any real way for a citizen to influence APR-based decision-making, combined with the large-scale use of such systems in banking, can mean that a person cannot obtain a mortgage or a consumer loan and cannot meet many immediate needs.
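The kind of social-graph filter described above can be illustrated with a deliberately naive sketch. Every name, data point and threshold here is hypothetical, invented for illustration; no real bank's logic is reproduced:

```python
# A deliberately naive credit filter of the kind described in the text.
# All usernames, data, and the 0.3 threshold are hypothetical examples.

def social_graph_risk(applicant_friends, known_defaulters, threshold=0.3):
    """Flag an applicant as 'unreliable' if too large a share of their
    social-network friends have previously defaulted on loans."""
    if not applicant_friends:
        return False  # no data, no flag
    defaulted = sum(1 for f in applicant_friends if f in known_defaulters)
    return defaulted / len(applicant_friends) > threshold

defaulters = {"user17", "user42"}
friends = ["user17", "user42", "user99", "user100"]
# 2 of 4 friends defaulted -> share 0.5 exceeds 0.3, loan denied
print(social_graph_risk(friends, defaulters))  # True
```

The point of the sketch is exactly what the experts criticize: the choice of signal (friends' defaults) and the threshold are set by the developer alone, with no external review of whether either is justified.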

Insurance sector:

In insurance, APR systems can be used to analyze the insurance risks of a particular policyholder, whether a citizen or an organization. Driving style, hobbies and interests, medical data and other facts are weighed by an algorithm to set an individualized insurance premium that many cannot afford to pay. As a result, insurance may become available only where, from the insurance company's point of view, the risks are minimal, which distorts the very idea of insurance.
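How individualized pricing can quickly become unaffordable is easy to show with a toy model. The base premium, the attributes and the multipliers below are all invented for illustration, not drawn from any real insurer:

```python
# Toy illustration of individualized premium-setting: each inferred
# personal attribute multiplies the base premium. All figures are
# hypothetical examples, not real insurance tariffs.

BASE_PREMIUM = 10_000  # rubles, hypothetical

RISK_FACTORS = {
    "aggressive_driving": 1.8,
    "extreme_sports": 1.5,
    "chronic_illness": 1.4,
}

def individualized_premium(profile):
    """Multiply the base premium by a factor for every risk attribute
    the algorithm has inferred about the policyholder."""
    premium = BASE_PREMIUM
    for attribute in profile:
        premium *= RISK_FACTORS.get(attribute, 1.0)
    return round(premium)

print(individualized_premium(["aggressive_driving", "extreme_sports"]))  # 27000
print(individualized_premium([]))  # 10000
```

Even with only two inferred attributes the premium nearly triples, which is the distortion the Commission warns about: cover stays affordable only for the lowest-risk profiles.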

Employment sector:

In employment, the APR systems used by recruiters perform primary filtering of resumes based on opaque criteria that the system's developer has built into its algorithms. As a result, HR staff never even see many resumes, which inevitably affects the practical realization of citizens' right to employment.
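The opaque primary filter can be sketched as a simple keyword gate; the keyword list here is a hypothetical stand-in for whatever criteria a real vendor builds in:

```python
# Sketch of opaque primary resume filtering: resumes missing any
# required keyword never reach a human. The keyword set is a
# hypothetical example of criteria "laid down by the developer".

REQUIRED_KEYWORDS = {"python", "sql"}  # hidden from applicants

def passes_primary_filter(resume_text):
    """Return True only if the resume mentions every required keyword."""
    words = set(resume_text.lower().split())
    return REQUIRED_KEYWORDS <= words

resumes = [
    "Senior developer: Python SQL Linux",
    "Experienced analyst: Excel reporting",
]
visible_to_hr = [r for r in resumes if passes_primary_filter(r)]
print(len(visible_to_hr))  # 1
```

The second candidate is filtered out before any human judgment is applied, and neither applicant is told which words the gate was looking for.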

Public administration:

In public administration, APR systems are used to impose administrative liability on the basis of video and photographic evidence, as well as other technical data, such as data from mobile network operators. Such systems were used in Moscow in the spring of 2020 to prosecute violations of the “self-isolation” and “quarantine” regimes. The media reported many absurd cases of erroneous prosecution, for example of paralyzed citizens, or of citizens whose apartment sat on the border between Moscow and the Moscow region. Evidently, these cases involved low-quality data and faulty algorithms.

In Russia, there is still little public information about the use of such systems by specific organizations or authorities.

Still, stories of APR-system blunders do reach the media. Many will remember the case of a bedridden Muscovite, Irina Karabulatova, a professor at the Peoples' Friendship University of Russia with a first-group disability, who received a 4,000-ruble fine for "violating" the self-isolation regime.

Part of the reason there is so little information about APR use is that government agencies and companies themselves do not yet fully realize they are using such systems, perceiving them as just one of many information systems or computer programs.

On the other hand, they do not want to advertise the use of such systems, so as not to provoke unnecessary questions from regulators and citizens. But even if their use in Russia is currently less widespread than abroad, this does not diminish the urgency of adequate legal regulation: their widespread use in Russia is only a matter of time, since it is a matter of technology and economics, which are universal for most countries regardless of cultural and national particularities.

What are the key proposals of the Commission for the Legal Support of the Digital Economy of the Moscow Branch of the Association of Lawyers of Russia?

  1. It is necessary to abandon a purely technocratic approach to the development and use of such systems, rooted in the view that the law should not interfere with innovation. Since these innovations directly affect citizens' rights, their use must take place within the legal field.
  2. One of the main directions for developing legislation in this area is to increase the transparency of their use, both for citizens and for control and supervisory authorities.
  3. At the same time, introducing large-scale, complex regulation of such systems now would be premature. Regulation should begin gradually, extending proven solutions to related areas. It is advisable to start with the state bodies' own use of APR, testing on them a set of measures to increase transparency and the technical means for appealing and effectively correcting unfair and discriminatory decisions. These mechanisms, with some adaptation, can later be extended to the commercial sector. Thus, one of the key proposals is to turn government agencies into a "sandbox" in which they test on themselves the measures they later want to extend to the private sector.