Algorithmic bias: leaving behind the biased world of yesterday and building a fairer tomorrow

11/12/2018
José Carlos Baquero, Director of Artificial Intelligence and Big Data at GMV’s Secure e-Solutions, analyses the thorny issue of algorithmic bias

For decades we have witnessed the great benefits of algorithms in decision-making. In the real world, their applications range from medical diagnosis and court judgments to professional recruitment and the detection of criminals. However, as technological advances have extended their uses, many have demanded greater responsibility in their implementation, with particular concern about the transparency and fairness of machine learning. Specifically, this uncertainty arises from the ability of algorithmic bias to recreate historical prejudices, normalising and deepening social inequality. This subject was analysed by José Carlos Baquero, Director of Artificial Intelligence and Big Data at GMV’s Secure e-Solutions, in a talk that made those attending Codemotion Madrid stop and think.

Advances in machine learning have led companies and society to trust data, on the basis that its correct analysis yields more efficient and impartial decisions than those taken by humans. Yet “despite the fact that a decision taken by an algorithm is arrived at on the basis of objective criteria, the result may be unintentional discrimination. Machines learn from our prejudices and stereotypes, and if the algorithms they use are becoming a key part of our daily activities, we urgently need to understand their impact on society,” argues Baquero. This is why we must insist on systematic analysis of algorithmic processes and on the creation of new conceptual, legal and regulatory frameworks to guarantee human rights and fairness in a hyperconnected, globalised society. This task must clearly be undertaken jointly by organisations and governments.

During his presentation, José Carlos Baquero described some recent cases of this problem, such as Amazon’s AI hiring tool, which systematically discriminated against women: the program concluded that men were better candidates and tended to give them higher scores when reviewing CVs. This is just one example of the growing concern about the loss of transparency, accountability and fairness of algorithms, driven by the complexity, opaqueness, ubiquity and exclusivity of the environment.

In search of fair forecasting models

Regardless of how an algorithm is tuned, all algorithms have biases: ultimately, forecasts are based on general statistics, not on somebody’s individual situation. Even so, we can use them to make wiser and fairer decisions than individual humans do. To get there, we urgently need new ways to mitigate the discrimination being found in these models, and we must make sure that their predictions do not unfairly prejudice groups with certain sensitive characteristics (gender, ethnicity, etc.).
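The requirement that predictions not prejudice a sensitive group can be made concrete with a simple group metric. A minimal sketch, assuming binary hire/no-hire predictions and a binary sensitive attribute; the function names, data and the four-fifths threshold mentioned in the comments are illustrative, not taken from the talk:

```python
# Sketch: checking demographic parity on binary predictions.
# All names and data here are illustrative.

def selection_rate(preds, group, value):
    """Fraction of positive predictions within one sensitive group."""
    members = [p for p, g in zip(preds, group) if g == value]
    return sum(members) / len(members)

def disparate_impact(preds, group):
    """Ratio of the lowest group selection rate to the highest one.
    Ratios below ~0.8 are often flagged (the 'four-fifths rule')."""
    rates = [selection_rate(preds, group, v) for v in set(group)]
    return min(rates) / max(rates)

# Example: hiring predictions (1 = hire) for two groups 'a' and 'b'.
preds = [1, 0, 1, 1, 0, 0, 1, 0]
group = ['a', 'a', 'a', 'a', 'b', 'b', 'b', 'b']
print(disparate_impact(preds, group))  # a: 3/4, b: 1/4, so the ratio is 1/3
```

A ratio of 1.0 would mean both groups are selected at the same rate; the further below 1.0 it falls, the more one group is favoured.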

Among other things, José Carlos Baquero stressed the need to focus on interpretability and transparency, making it possible to interrogate complex models, and on making models more robust and fairer in their predictions by modifying the optimisation objective and adding constraints.
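One way to read “modifying the optimisation objective and adding constraints” is as a penalised training loss: alongside the usual error term, penalise the gap between the average scores the model gives each group. A hedged sketch, assuming logistic regression and a squared demographic-parity penalty; the penalty form, weight `lam` and the toy data are assumptions for illustration, not GMV’s method:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def penalised_loss(w, X, y, s, lam):
    """Logistic loss plus a demographic-parity penalty: the squared gap
    between the mean predicted score of group s==1 and group s==0."""
    p = sigmoid(X @ w)
    log_loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
    gap = p[s == 1].mean() - p[s == 0].mean()
    return log_loss + lam * gap ** 2

# Tiny synthetic example: one feature strongly correlated with the group.
X = np.array([[1.0], [0.9], [0.1], [0.0]])
y = np.array([1, 1, 0, 0])
s = np.array([1, 1, 0, 0])
w = np.array([2.0])

print(penalised_loss(w, X, y, s, lam=0.0))   # plain logistic loss
print(penalised_loss(w, X, y, s, lam=10.0))  # loss plus fairness penalty
```

With `lam = 0` this is ordinary training; raising `lam` trades some predictive performance for a smaller score gap between groups, which is exactly the trade-off Baquero describes below.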

In short, “building impartial forecasting models is not simply a question of removing certain sensitive attributes from the data set. We clearly need ingenious techniques to correct the profound bias in the data and force models to make more impartial predictions. All of this involves a reduction in the performance of our models, but this is a small price to pay to leave behind the biased world of yesterday and build a fairer tomorrow,” concluded Baquero.
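Baquero’s point that removing sensitive attributes is not enough can be shown directly: a correlated “proxy” feature lets a model reconstruct the removed attribute almost exactly. A toy sketch on synthetic data; the 90% correlation, the postcode framing and all names are assumptions for illustration:

```python
# Sketch: why dropping the sensitive column is not enough. A correlated
# proxy feature can reconstruct the sensitive attribute on its own.
# The data below is synthetic and illustrative.

import random
random.seed(0)

# Sensitive attribute (0/1) and a proxy, e.g. a postcode flag that
# matches the group 90% of the time.
group = [random.randint(0, 1) for _ in range(1000)]
proxy = [g if random.random() < 0.9 else 1 - g for g in group]

# A model trained without `group` can still use `proxy` as a stand-in:
accuracy = sum(p == g for p, g in zip(proxy, group)) / len(group)
print(f"proxy recovers group with accuracy {accuracy:.2f}")
```

Any model given such a proxy effectively keeps the sensitive attribute in its inputs, which is why the deeper correction techniques described above are needed.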
 
