Artificial Intelligence Processing and Risks of Discrimination
- Authors: Talapina E.
Institutions:
- Institute of State and Law, Russian Academy of Sciences
- Issue: Vol. 15, No. 1 (2022)
- Pages: 4-27
- Section: Legal Thought: History and Modernity
- URL: https://journals.rcsi.science/2072-8166/article/view/318155
- DOI: https://doi.org/10.17323/2072-8166.2022.1.4.27
- ID: 318155
Abstract
Discrimination poses a threat to equality as a basic concept of the rule of law. In the digital age, the use of artificial intelligence to make important legal decisions has added a new dimension to the problem: artificial intelligence is capable of making faulty decisions that discriminate against individuals. The aim of the article is to examine the risks of discrimination so that they can be accounted for and avoided in future legal regulation. The research is based on an analysis of doctrinal and regulatory sources from various countries and on existing experience with the use of artificial intelligence. A specific data-mining technique is profiling, which leaves little room for individual autonomy and self-determination. In this context, it is suggested that the theory of informational self-determination be reassessed, exploiting its potential to divide responsibility between the data owner and the processor. Because of the clear discriminatory risks of profiling, some operations are already banned (e.g., redlining in the USA and genetic profiling in insurance and employment in several countries). The undeniable predictive potential of data deserves careful consideration, especially in personalization, where the predictive abilities of artificial intelligence are used to legally assess an individual's behavior. Experience with algorithmic prediction of human behavior in the US criminal justice system shows the probabilistic nature of such assessments, which can infringe the rights to a fair trial and to individualization of punishment if an algorithmic assessment becomes the sole basis for adjudication. In general, applications that solve routine legal problems by producing results based on past judicial decisions are particularly relevant in common law countries, where case law prevails.
Given that Russia belongs to the continental law system and that case law, even on a single type of dispute, is often contradictory and inconsistent across the country, the prospects for applying the American experience are doubtful. Consideration of the specific deficiencies that can lead to discriminatory data processing (incorrect data collection, aggregation of erroneous data, and the insensitivity of artificial intelligence to regulatory settings) makes it possible to outline the contours of future legislation on the activities of artificial intelligence that takes all the analyzed risks of discrimination into account.
About the authors
Elvira Talapina
Institute of State and Law, Russian Academy of Sciences
Corresponding author.
Email: talapina@mail.ru
ORCID iD: 0000-0003-3395-3126
Doctor of Science (Law), Doctor of Law (France), Chief Researcher
References
- Barocas S., Selbst A.D. (2016) Big Data's disparate impact. California Law Review. Vol. 104, pp. 671-732.
- DOI: https://doi.org/10.2139/ssrn.2477899
- Bartenev D.G. (2019) Forbidding discrimination: the review of the approaches of the ECHR. Mezhdunarodnoe pravosudie=International Justice, no 1, pp. 43-66. (in Russ.).
- Bosco F., Creemers N., Ferraris V. et al. (2014) Profiling Technologies and Fundamental Rights and Values: Regulatory Challenges and Perspectives from European Data Protection Authorities. S. Gutwirth, R. Leenes, P. de Hert (eds.). Reforming European Data Protection Law. Berlin: Springer, pp. 3-33.
- DOI: https://doi.org/10.1007/978-94-017-9385-8_1
- Bygrave L.A. (2002) Data protection law: Approaching its rationale, logic and limits. The Hague: Kluwer Law International, 456 p.
- Feldman E.A., Quick E. (2020) Genetic Discrimination in the United States: What State and National Government Are Doing to Protect Personal Information. Khoury L., Blackett A., Vanhonnaeker L. (eds.). Genetic Testing and the Governance of Risk in the Contemporary Economy. Berlin: Springer, pp. 331-354.
- DOI: https://doi.org/10.1007/978-3-030-43699-5_15
- G'sell F. (2020) Les progrès à petits pas de la «justice prédictive» en France. ERA Forum. Vol. 21, pp. 299-310.
- DOI: https://doi.org/10.1007/s12027-020-00618-6
- Hildebrandt M. (2009) Profiling and AmI. In: K. Rannenberg, D. Royer, A. Deuker (eds.). The Future of Identity in the Information Society. Challenges and Opportunities. Heidelberg: Springer, pp. 273-310.
- Khoury L., Blackett A., Vanhonnaeker L. (2020) Legal Aspects of Genetic Testing Regarding Insurance and Employment. General Report. Khoury L. et al (eds.), Genetic Testing and the Governance of Risk in the Contemporary Economy, Ius Comparatum — Global Studies in Comparative Law 34. Berlin: Springer, pp. 3-67.
- DOI: https://doi.org/10.1007/978-3-030-43699-5_1
- Lapaeva V.V. (2008) The Principle of Formal Equality. Zhurnal rossijskogo prava=Russian Law Journal. no 2, pp. 67-80 (in Russ.)
- Miné M. (2003) Les concepts de discrimination directe et indirecte. ERA Forum. Vol. 4, pp. 30-44.
- DOI: https://doi.org/10.1007/s12027-003-0015-0
- Nersesyants V.S. (2009) Towards law. About the origin of equality (from the unpublished). Istoriya gosudarstva i prava=History of State and Law, no 17, pp. 2-7 (in Russ.).
- Nömper A. (2001) Geenitestide õiguslikust regulatsioonist. Juridica II, pp. 113-123.
- Pormeister K. (2020) The Prohibitions Against Genetic Discrimination in Estonia. Khoury L., Blackett A., Vanhonnaeker L. (eds.). Genetic Testing and the Governance of Risk in the Contemporary Economy. Berlin: Springer, pp. 179-191.
- DOI: https://doi.org/10.1007/978-3-030-43699-5_7
- Regan J. (2016) New Zealand passport robot tells applicant of Asian descent to open eyes. Available at: https://www.reuters.com/article/us-newzealand-passport-error/new-zealand-passport-robot-tells-applicant-of-asian-descent-to-open-eyes-idUSKBN13W0RL/ (accessed: 17 February 2022)
- Romanovskiy V.G. (2020) Profiling Terrorists and Human Rights Constitutional Protection. Konstitutsionnoe i municipal'noe pravo=Constitutional and Municipal Law, no 10, pp. 46-50. (in Russ.)
- DOI: https://doi.org/10.18572/1812-3767-2020-10-46-50
- Rouvroy A., Poullet Y. (2009) The right to informational self-determination and the value of self-development. Reassessing the importance of privacy for democracy. S. Gutwirth et al. (eds.). Reinventing Data Protection? Dordrecht: Springer, pp. 45-76.
- DOI: https://doi.org/10.1007/978-1-4020-9498-9_2
- Tischbirek A. (2020) Artificial Intelligence and Discrimination: Discriminating Against Discriminatory Systems. Th. Wischmeyer, T. Rademacher (eds.). Regulating Artificial Intelligence. Berlin: Springer, pp. 103-121.
- DOI: https://doi.org/10.1007/978-3-030-32361-5_5
- Zuiderveen Borgesius F. (2018) Discrimination, intelligence artificielle et décisions algorithmiques. Etude à l'intention du Service anti-discrimination du Conseil de l'Europe. Strasbourg: Conseil de l'Europe. 51 p.