Denaturalizing Technologies – Jandson Bernardo Soares’s (UFRN) review of “Algorithmic Racism: Artificial Intelligence and Discrimination on Social Networks”, by Tarcízio Silva

Tarcízio Silva | Image: Mundo Negro

Abstract: Racismo algorítmico: inteligência artificial e discriminação nas redes sociais, by Tarcízio Silva, addresses structural racism in AI technologies. Silva, who holds a master’s degree and a doctorate in the humanities, examines how algorithms reproduce racist standards. Although repetitive at points, the book stands out for refusing a fatalistic tone and for proposing strategies to combat racial prejudice.

Keywords: racism, artificial intelligence, technologies, social networks.

 

Brazil is experiencing a moment of conflict over the internet, its use and its governance, whose focal point is Bill 2630/20, known as the Fake News Bill. On one side are those who advocate accountability, as well as transparency and moderation of the content conveyed through this medium, demanding a new posture from Big Tech [1]. On the other are those who claim, falsely, to defend the right to unrestricted expression and respect for the terms of service agreed between social media platforms and their users, even when those terms disrespect constitutional values dear to democracy. It is within this debate that Racismo algorítmico: inteligência artificial e discriminação nas redes sociais (Algorithmic Racism: Artificial Intelligence and Discrimination on Social Networks), by Tarcízio Silva, is situated, with the objective of “unveiling the mechanisms” of the greedy “Big Tech” industry (p. 16).

The author holds a master’s degree in Social Communication and Contemporary Culture from the Federal University of Bahia and a doctorate in Human and Social Sciences from the Federal University of ABC. The book is part of a career-long deepening of his research into the relations between race and the internet, first evident in 2011, during his master’s studies. Since then, Silva has organized several publications connecting the social sciences to the field of technology, and has engaged in political and social action through the creation of research institutions and participation in projects that promote digital rights and public awareness of algorithmic harm.

Divided into six chapters, the book problematizes algorithmic racism, drawing on the writings of Maria Aparecida Bento and Alana Sambo Machado to articulate the concept of whiteness, and on Ruha Benjamin, Achille Mbembe, Langdon Winner and Meredith Broussard to outline the political character of technologies, treating these devices as instruments for the reproduction of biopower. The structure of the work can be understood through three tendencies. The first is epistemic and materializes in a historiographical examination of scholarship on structural racism, indicating that the theme has been addressed predominantly through the fields of politics, knowledge and economics. When focused on the internet, these studies have concentrated on the most visible dynamics, such as its use to circulate racist content, or to aggregate supranational groups around shared organizational interests and demands, for example, supremacist ones.

Silva goes further by investigating the infrastructural, less visible layers of the networks: the algorithms. Defined as artificial intelligences capable of systematizing procedures logically and performing tasks in computational space, these technologies order, classify, include and exclude, constituting hierarchies of value between objects and capital. Their parameters are the socially constituted standards of whiteness carried by their creators, thus configuring another layer of structural racism.

Through these ideas, racial issues come to cut across the computational environment in practices such as content recommendation and moderation, facial recognition and image processing, which promote invisibility, in the case of Afro-diasporic culture and achievements, and hypervisibility, in representations and constructions of a racial character.

For the author, these algorithms and their databases are configured as technical instruments that favor the reproduction of racism. They operate with impunity: when they establish such practices, they benefit from the opacity and neutrality commonly attributed to them, concealing in their archaeology the economic, cultural and political interests capable of modeling the public sphere at their pleasure.

The second tendency manifests itself in chapters 3 and 4, where the work takes on a more essayistic tone and the author demonstrates, through a series of studies, examples of how algorithmic racism manifests itself. He begins with computer vision and its role in invisibility and hypervisibility. In the first case, he shows how white phenotypic traits are privileged while the features of Afro-diasporic people are disregarded, a bias that appears both in image applications, with their filters, and in search engines. The latter not only value standards of whiteness but also devalue black groups, rendering them more visible through association with violence and pornography. Silva thus indicates that technological development overlays a racialist dimension, reproducing hegemonic patterns present in society.

Regarding image processing by computer vision, the author emphasizes how these hegemonic models produce recurring errors that vary according to the physical characteristics of the people present in the analyzed frame, as well as the places and objects related to it. Thus, a black man with something in his hand can be read as armed and dangerous, while a white man in the same conditions is not.

For Silva, the problem of these technologies lies in the constitution of the very databases that feed the algorithms: although marketed under the idea of neutrality, they are marked by racialist dynamics present since their genesis, making their diversification necessary. According to him, the companies responsible for these services do not take this option, choosing instead to tweak the algorithms, or tend to evade responsibility for their creations.
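The mechanism Silva describes — a skewed database quietly producing unequal outcomes while the algorithm itself appears neutral — can be illustrated with a deliberately simplified sketch. The code below is not from the book; the groups, feature values and single-threshold “classifier” are all hypothetical. It shows only that a decision rule tuned on a training pool dominated by one group performs measurably worse on an under-represented group whose data is distributed differently, even though the tuning procedure treats every example identically.

```python
import random

random.seed(0)  # deterministic run for reproducibility

def make_group(n, mean_pos, mean_neg):
    """Build n labeled examples: half positives, half negatives,
    each with a single gaussian-distributed feature value."""
    data = []
    for _ in range(n // 2):
        data.append((random.gauss(mean_pos, 1.0), 1))
        data.append((random.gauss(mean_neg, 1.0), 0))
    return data

# Hypothetical skewed training pool: group A is heavily
# over-represented, and group B's feature values are shifted.
group_a = make_group(200, mean_pos=3.0, mean_neg=0.0)
group_b = make_group(10, mean_pos=1.5, mean_neg=-1.5)
train = group_a + group_b

def error(threshold, data):
    """Fraction of examples misclassified by 'feature >= threshold'."""
    return sum((x >= threshold) != (y == 1) for x, y in data) / len(data)

# "Learn" the threshold that minimizes overall training error.
# Because the pool is dominated by group A, the learned threshold
# is effectively calibrated to group A's distribution.
threshold = min((x for x, _ in train), key=lambda t: error(t, train))

# Evaluate on fresh samples from each group's true distribution.
err_a = error(threshold, make_group(1000, 3.0, 0.0))
err_b = error(threshold, make_group(1000, 1.5, -1.5))
print(f"error rate, majority group:  {err_a:.2f}")
print(f"error rate, minority group:  {err_b:.2f}")
```

The procedure never references group membership, yet the minority group's error rate comes out higher: the "neutrality" is in the code, while the disparity is inherited from the composition of the data, which is precisely the layer Silva argues remains opaque to users.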

When addressing surveillance technologies, the author emphasizes that they tend to hypervalue the logic of white culture, marking black bodies as killable or as targets of control. Thus, despite their recognized errors, they have been used to promote imprisonment, especially of black, Asian and Indigenous groups, criminalizing their practices and social spaces, which, stigmatized as sites of danger and risk, shape forms of approach, identification, typification, condemnation and segregation.

Silva also points out how these standards have entered fields such as health care, determining, through discretionary racial criteria, whether individuals will have access to treatments that are more costly to the public coffers, and whether their material lives will allow them to carry out those treatments. For him, both in surveillance and in medicine, it is a matter of delimiting who may live and who may die, who may be killed and who may not.

For the author, this process of determination through technology unfolds from colonization, whose purpose was to transform the other into a commodity through a process of dehumanization. He traces the very development of this conceptual field as part of these technologies, which tend to materialize in devices for the production of institutionalized violence. In the case of algorithms, this is updated through the very skewing of the databases.

Chapters 5 and 6 show how technologies in general incorporate, in their genesis, relations of power and domination, and how, from the recognition of this logic, it becomes possible to take a combative stance toward these dynamics.

The author thus chooses to present technologies produced in the nineteenth and twentieth centuries that incorporated, in their production processes, hierarchies that strengthened the logic of whiteness. The examples cited are: a) urban planning; b) the medical sciences; c) epistemological currents of interpretation of reality; d) photography; e) cinema; and f) bibliographic cataloging systems. This choice allows the reader to observe that these issues are not unique to algorithms but unfold as a historical trend.

Cida Bento explains the pact of whiteness | Image: Roda Viva/TV Cultura

Finally, the author advocates, as a form of combative action, the development of integrative, intersectoral actions by the black movement, as well as the constitution of diasporic solidarity capable of producing information that emphatically denounces the manifestations of algorithmic racism. This would enable social pressure capable of influencing both the regulation of computational space and the correction of these devices.

For him, studies of this type also corroborate the harmful aspect of these systems, as well as the measurable technical fragility of these codes, which say much more about old problems of society than about themselves. Their confrontation, therefore, is not limited to writing new code and programming, much less to simple auditing, but also requires broad mobilization campaigns against racism in its structural dynamics.

Finally, he also signals reinvention as a form of resistance, capable of transforming the racist character of technologies into the valorization of blackness and, by extension, of other minorities. This kind of enterprise implies struggling to occupy the spaces where technological materiality is developed, generating diversity in those places by working as computer scientists, programmers, engineers and developers. It is, ultimately, a matter of reinterpreting technologies, directing them toward specific goals situated in local and social issues, and of constituting a racial literacy of these devices capable of breaking with the idea of naturalness they carry.

As a negative point, I note that the work repeats its explanations of the manifestations of algorithmic racism and adopts an essayistic tone, at times resting on the author’s own research. It is noteworthy, however, that this same synthesizing tone makes the work rich in examples that give materiality to its arguments, allowing the most technical and specific aspects of this type of technology to be clear to any audience, specialized or not.

I consider it positive that the work does not adopt a fatalistic tone toward technology but instead points out strategies and forms of combat, involving reappropriations, social diversification of production environments, and the use of the epistemology proposed by the author in the very writing of the text. I also value the fact that these technologies are not treated as independent of the power networks that structure society; they are placed in relation to earlier movements of technological innovation, demonstrating the issues they share, related to colonialism and a culture of whiteness.

The book fulfills its central objective well, and is thus relevant reading for those interested in the debate and in the production of specialized knowledge, whether related to racism and its manifestations, technologies, postcolonial themes or the regulation of computational space. I also advocate its usefulness for those who need to position themselves in the face of the violence of these devices, whether individual agents, victims of these microaggressions, or collectives, such as organized social movements, which need to vocalize the demands of their members in the face of these old, repackaged challenges.

Summary of Racismo algorítmico: inteligência artificial e discriminação nas redes sociais

  • 1. Discursos racistas na web e nas mídias sociais
  • 2. O que as máquinas aprendem?
  • 3. Visibilidades algorítmicas diferenciais
  • 4. Necropolítica algorítmica
  • 5. Tecnologias são políticas. E racializadas
  • 6. Reações, remediações e invenções
  • Referências
  • Sobre o autor
  • Créditos

 

Reviewer

Jandson Bernardo Soares holds a PhD in History from the Universidade Federal do Rio Grande do Norte. He has published, among other works, A institucionalização do livro didático no Brasil (2021), “História e Espaços do Ensino: historiografia, PNLD e a busca por um livro didático ideal”, and “Produzindo livros didáticos de História: prescrições e práticas – notas de uma pesquisa em andamento”. ID LATTES: 9151962206801002; ID ORCID: orcid.org/0000-0001-8195-5113; E-mail: [email protected].

 

To cite this review

SILVA, Tarcízio. Racismo algorítmico: inteligência artificial e discriminação nas redes digitais. São Paulo: Edições Sesc São Paulo, 2022. 216p. Review by: SOARES, Jandson Bernardo. Denaturalizing technologies. Crítica Historiográfica. Natal, v. 3, n. 13, Sep./Oct. 2023. Available at <https://www.cricohistoriografica.com.br/Desnaturalizing-technologies-resenha-da-jandson-bernardoores-ufrn-sobre -The-Livro-Racism-Algorithm-Inteligence-Artificial-and-Discrimination-NOs-Social Reds/>.

 

© – Authors who publish in Crítica Historiográfica agree to the distribution, remixing, adaptation and creation of derivative works from their texts, even for commercial purposes, provided due credit is given to the original creations (CC BY-SA).

 

Crítica Historiográfica. Natal, v.3, n. 13, Sep/Oct, 2023 | ISSN 2764-2666

