Introduction
Gemma is changing the way technology is created through her innovative algorithm audit methodology, which ensures that technology does not replicate the biases and injustices present in real-life decision-making. Gemma works to shift the mindset of governments, companies, and public opinion to ensure that the hard-won rights and regulations that govern the offline world are translated into the design and development of tech tools.
The New Idea
In a world where artificial intelligence increasingly perpetuates injustice and discrimination, Gemma is using her innovative, tailor-made Algorithm Audit as a tool to increase critical thinking among key players in the technology industry, with the ultimate goal of ensuring that ethical standards, social justice, equity, and transparency are built into the conceptualization and implementation of technological advances.
The rapid proliferation of algorithms requires urgent attention, as more and more vital decisions are being automated within the public, private, and charitable sectors. Most algorithms are conceived and designed by male engineers with a "build it now, fix it later" mentality, and little attention is paid to the inherent bias, insecurity, and discrimination that these artificial intelligence tools can introduce. Much has been written and discussed about ethics in the digital world, but few have transformed the academic debate into a working solution. Gemma has succeeded in translating the concept of digital ethics into a tangible process: an Algorithm Audit methodology for companies and governments that delivers results, analysis, and a new mindset about how technology is made and the impact it has.
Automated audits are available from entities such as Facebook, Google, and Microsoft. The problem is that they do not capture true bias; they tend to reinforce the same "bad" variables without questioning the original data used to train the algorithm. Manual audits, on the other hand, are neither efficient nor cost-effective. Gemma has built a hybrid methodology that counters the weaknesses of automated auditing by including manual intervention at key steps. Her goal is to create an accepted industry standard, integral to corporate and government governance and universally adopted, just like the financial audit.
Her overarching objective is to revolutionize the industry and introduce a new mindset, whereby technology must work for society and not the other way around. To serve this purpose, Gemma uses the Algorithm Audit as a path to opening conversations with key players in the industry, getting their feedback, input, and knowledge to help define her ongoing strategy. She devotes significant resources to building a community of academics, politicians, tech companies, and industry bodies to raise awareness, deepen the debate, and ultimately influence the press and policymakers.
The Problem
The increasing use of algorithms to make decisions that affect people's lives requires that we ensure such decisions respect people's rights, and that the bias and discrimination operating in the offline world are not translated into these data systems. We lack a decision-making framework for governments, companies, and individuals to make the right choices, ensuring the good of all in the design of algorithms and tech.
For example, if the historical data used to train a job platform's algorithm show that women occupy fewer leadership positions, the algorithm will adopt that pattern as a criterion, connecting women less and less to leadership jobs. Or, for an insurance company, if the data the algorithm uses to determine who should receive urgent medical attention reflect previous expenditure, the patient who cost the company more will receive urgent medical care, not the one who needs it most.
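To make the mechanism concrete, here is a minimal, hypothetical sketch in Python (the data are synthetic and invented purely for illustration): women and men are given the same skill distribution, but the historical "leader" label penalizes women, and a model trained on those labels inherits the penalty.

```python
# Minimal, hypothetical sketch (synthetic data, invented for illustration):
# a model trained on historically skewed labels reproduces the skew.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Candidates: gender (1 = woman) and a skill score drawn from the
# SAME distribution for both groups.
gender = rng.integers(0, 2, n)
skill = rng.normal(0, 1, n)

# Historical labels: women were promoted less often at equal skill --
# this is the bias an audit should expose, not something to learn.
past_leader = (skill + rng.normal(0, 1, n) - 0.8 * gender) > 0.5

model = LogisticRegression().fit(np.column_stack([gender, skill]), past_leader)

# The trained model now scores women lower for leadership roles,
# even though both groups have identical skill distributions.
scores = model.predict_proba(np.column_stack([gender, skill]))[:, 1]
print("mean leadership score, men:  ", round(scores[gender == 0].mean(), 3))
print("mean leadership score, women:", round(scores[gender == 1].mean(), 3))
```

Nothing in the code "discriminates" on purpose; the skew enters entirely through the training labels, which is why auditing the original data matters as much as auditing the model.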
There are several reasons why data systems end up skewed. The main one, however, is that the way we "do" technology has been shaped by teams of (largely male) engineers who are not always trained to understand the world they are trying to encode in their systems, resulting in technologies that often worsen the very problems they address. The community and social impacts are substantial, and fundamental rights are seriously infringed when such systems are deployed to decide a person's chances of getting medical attention, landing a job, receiving benefits, or going to jail.
Algorithmic bias is much talked about, but while transparency and accountability are standard responsible-data principles, it remains unclear how those principles translate into technical and engineering practices and standards.
In the public and charitable sectors, budgets are tighter and in many cases in-depth understanding of the nuts and bolts of tech is lacking, resulting in superficially vetted algorithms. This leaves many public decisions open to even greater inherent bias or discrimination.
As AI takes up more and more space and responsibility in our daily lives, we must ensure that the technology of the future can understand and translate the hard-won rights and regulations that govern the offline world into transparent technical terms. Data ecosystems are a chance to build decision-making processes in which people are treated with dignity, respect, and fairness. This opportunity will only be seized if, on the one hand, key players are willing to participate in the debate, made aware of the potential issues, and motivated to design self-governance, and if, on the other, the general public is made aware of the potential dangers that technology poses.
The Strategy
Gemma's work advances the entire system's understanding of the social, ethical, and legal implications of algorithms, artificial intelligence, and technology. Gemma and her team actively research and provide practical guidance and technology decision-making frameworks on issues related to migration, labor and trade unions, gender, and governance.
Unique Audit Methodology for Companies and Governments
Gemma has researched the unethical aspects of technology for many years. That body of research revealed that one of the most damaging and fastest-growing practices was the use of biased algorithms. In 2018, Gemma developed an Algorithm Audit methodology to change this reality. To date she has conducted seven Algorithm Audits with private, public, and social organizations (Telefónica, the Inter-American Development Bank, and the City Council of Barcelona, among others) to gain hands-on experience with real-life cases. This has been the basis for her innovative methodology and the source of the credibility that will allow Eticas to scale and become a reference in the field.
Eticas' methodology is a hybrid model that harnesses the efficiency of automated audits combined with highly targeted manual intervention. The manual work is especially important in the initial analysis of the data used to train the algorithm (the definition of vulnerable populations, for instance, varies across sectors and cultures) and again at the end of the process, when the human beings who will use the algorithm's output must be trained to avoid bias as well.
Eticas' team has pinpointed three key areas that other audit services miss:
- The planning stage: For Eticas it is essential to build the right multidisciplinary team to define the data sources, analysis, and labelling, as well as to establish the model and the potential areas for bias.
- The development stage: Once the model begins to function and produces sufficient output to analyze, Gemma and the team identify and rectify any bias seen in outcomes by revisiting both the original data and the workings of the model. Protected groups are identified and disparate impacts are calculated using algorithmic audit tools (see the sketch after this list) to examine how the model defines normality, who is left out, and whether those exclusions are justified.
- The implementation stage: Eticas provides KPIs to measure the output, which feed into a continuous learning and monitoring loop that checks whether bias reappears or other discriminatory elements emerge.
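To illustrate the kind of disparate-impact calculation described in the development stage above (Eticas' own tooling is not public, so this is only a generic sketch), a common check is the US EEOC "four-fifths rule", which compares favorable-outcome rates between a protected group and everyone else:

```python
# Generic illustration of a disparate-impact check; not Eticas' tooling.
import numpy as np

def disparate_impact(outcomes: np.ndarray, protected: np.ndarray) -> float:
    """Ratio of favorable-outcome rates: protected group vs. everyone else.

    The "four-fifths rule" flags ratios below 0.8 as potential adverse impact.
    """
    return outcomes[protected].mean() / outcomes[~protected].mean()

# Toy data: binary decisions (1 = favorable) for 8 applicants.
decisions = np.array([1, 0, 0, 1, 1, 0, 1, 1])
is_protected = np.array([True, True, True,
                         False, False, False, False, False])

ratio = disparate_impact(decisions, is_protected)
print(f"disparate impact ratio: {ratio:.2f}")  # ~0.42: below 0.8, flag for review
```

In a real audit the same comparison would be repeated for each protected group and each decision type, and re-run over time so the results can feed the implementation-stage KPIs.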
The first Algorithm Audit of an institution is eye-opening. Organizations trust that the output of their AI is not biased, but after their first audit they realize they were automating inequality and bias, making flawed data-driven decisions, or failing to comply with the GDPR, among other issues. In its auditing work, Gemma's team has already identified gender and ethnic discrimination in algorithms used in both public and private settings.
Creating Community, Forging a New Mindset, Changing the Law
Eticas focuses on community building as a strategic tool to raise awareness among private companies and public policymakers and to foster an informed public debate around these issues.
Gemma helps society understand the tangible effects of algorithms, including their potential dangers. The first step towards dismantling those dangers is comprehensive knowledge-building on how algorithms work: not just in their conventionally understood capacity as mathematical instruments, but as socio-technical systems with far-reaching societal consequences.
Such a complex issue needs a multifaceted, multi-audience approach, and Eticas has designed a series of initiatives to tackle it from different angles. On one hand, Eticas uses knowledge dissemination and open methodologies to incentivize and facilitate a more responsible use of technology, and of algorithms in particular. On the other, it uses legal action to curb misuse and abuse.
Through OASI (Observatory of Algorithms with Social Impact), Eticas works to advance public understanding of algorithms by building an accessible, regularly updated database that gathers real stories of algorithms and their human impact from around the world, classified and searchable by impact, sector, and location. Eticas also creates and disseminates open methodologies so that anyone with knowledge of data science can audit algorithms and identify potential risks and negative impacts. For example, Gemma has designed an Algorithm Audit Guide in collaboration with the Spanish Agency for Data Protection, with the aim of generating a certification in the near future. This certification, inspired by the B Corp model, is now a tangible work in progress and will be key to creating visibility and a standard in the market.
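Purely as a hypothetical illustration of what "searchable by impact, sector, and location" can mean in practice (OASI's real schema and field names are not public), a record-and-filter sketch might look like this:

```python
# Hypothetical sketch of an OASI-style record and search filter;
# the observatory's actual schema is not public.
from dataclasses import dataclass

@dataclass
class AlgorithmRecord:
    name: str
    sector: str    # e.g. "employment", "health", "justice"
    location: str  # where the system is deployed
    impact: str    # short description of the observed human impact

RECORDS = [
    AlgorithmRecord("CV-screening tool", "employment", "UK",
                    "down-ranked applications from women candidates"),
    AlgorithmRecord("care-prioritization model", "health", "US",
                    "used past cost as a proxy for medical need"),
]

def search(records, sector=None, location=None):
    """Return records matching any combination of sector and location."""
    return [r for r in records
            if (sector is None or r.sector == sector)
            and (location is None or r.location == location)]

for record in search(RECORDS, sector="health"):
    print(record.name, "-", record.impact)
```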
More recently, the Foundation has taken up litigation as a strategy to hold institutions accountable and to drive institutional change. Courts and litigation can play an important role in producing change if the laws that protect citizens are leveraged. Gemma uses the GDPR as a gateway for litigation. For instance, Article 22 gives data subjects the right "not to be subject to a decision based solely on automated processing, including profiling" where it produces legal or similarly significant effects on them. Article 5(1)(c), on data minimisation, limits the scope of data that AI systems may process. And Article 5(1)(d), on accuracy, enables individuals to challenge systems that draw inaccurate inferences about identifiable individuals from seemingly unrelated data.
Gemma is convinced that Eticas will be key to changing the current way we "do" technology and that many of today's unethical practices will become a thing of the past. Her aspiration is that the Silicon Valley slogan "move fast and break things" will be looked back upon as a terrible example of an age when tech's priorities were upside down. She strives for a world where technologies protect and enhance rights instead of limiting or infringing upon them.
Structure
To access key players, including public administrations, Eticas needed to create a business arm: many potential partners were not comfortable contracting these kinds of services from a charity. Nonetheless, 100% of the consultancy's profits are channeled into the non-profit foundation by agreement of the board of directors. The Eticas Foundation also receives funds from other foundations and governments for specific research and projects.
Eticas' HQ is in Barcelona, but the staff is global, currently spread across three continents and growing. Some examples of Gemma and her team's work in policy and institutional influence:
- In Latin America, she works closely with the Inter-American Development Bank (BID), advising on the ethics of the projects it finances and implementing algorithm audits. In Chile she has worked with the government on an algorithm audit for a project involving children at risk.
- In Spain, she works with local administrations to develop data strategy, ethical tech procurement, digital transformation, and algorithm audits.
As an example of her lobbying work, in interviews with both the BBC and The Guardian, Gemma offered her views and recommendations on the UK A-level scandal, in which a grading algorithm gave privileged students from elite schools and wealthy geographical areas an academic advantage.
The Person
As in many Spanish families, the Civil War played a part in Gemma's upbringing: her grandparents were Republicans, on the losing side, and had to go into exile. From that side of the family, Gemma was brought up on the values of social justice and equality that were to shape her passions.
Gemma's mother was only 14 when she was born, and that too was an important factor in her development. Gemma admires her mother very much: her convictions and her strength in raising and educating her daughter despite the conservative views of the society around her. Gemma believes she was born against all odds, and beating the odds has become a constant in her life.
Gemma has always involved herself in social battles. She organized fundraisers for local NGOs in secondary school, started a student newspaper in high school, and later became involved in student organizing around social causes. She learned to organize debates, launch campaigns, and speak in public through her experience in various social movements in Barcelona and the United States.
She spent over ten years working as an activist with the Transnational Institute. Her work took her to the Philippines, Bolivia, and Venezuela, first protesting the privatization of natural resources and later working with governments on development plans.
When she returned to Spain, she was offered a pre-doctoral grant and decided to study surveillance and urban planning; this is where she became passionate about the impact of technology on everyday life, particularly on the most vulnerable. Financing was readily available for this kind of study, and Gemma continued to develop her research into the intersection of human rights and tech, finally concentrating on algorithmic audits as a key, practical route to putting humanity into technology. In 2014, Gemma was invited by the Podemos political party to join its team as Secretary for Technology, Privacy and Security, a post she held for two years as a vehicle to lobby for policy change in the tech world. She also became well known as an outspoken participant in the main morning radio talk show in Catalonia. These experiences showed her that she wanted to forge her way forward without any political or media restraints.
Gemma is a highly appreciated player in the tech transformation world and a key figure at the European Commission. In her role as senior advisor, she evaluates projects and helps SMEs ensure their AI systems are not biased. She is also a key collaborator with the EU Fundamental Rights Agency, researching the social impact of facial recognition, biometrics, and border technologies.