Jobs lottery for the youth

Published Mar 10, 2023

BLESSING MBALAKA

The Expanded Public Works Programme (EPWP) was implemented to increase employment opportunities for the many unemployed South Africans living below the poverty line. However, the prevalence of nepotism has undermined the fairness of the process.

The EPWP lottery recruitment drive was developed by the City of Tshwane Metropolitan Municipality to resolve this dilemma of nepotism. The system works by randomly selecting aspiring job seekers from the municipality’s database.

The programme sought to reduce human interference by automating the recruitment procedure and improving access to opportunities. I first came across the initiative when I saw an SABC video titled “Tshwane Municipality uses artificial intelligence to fight nepotism”.

The video did not engage with the AI aspect, and I have struggled to find a sufficient explanation of this AI component. Perhaps the video title was misleading, but it nonetheless inspired this opinion piece.

Scholars such as Gebru and Buolamwini have argued that AI encompasses biases. These biases are said to be the culmination of biased training data, which is usually curated by groups that are not sufficiently diverse. The world inherently encompasses biases, and the mere act of choosing is itself a bias.

This could be the conscious or subconscious choice between preferences, the villainous act of being prejudiced, or unintentional bias due to a lack of exposure to global diversity.

AI is not exempt from our villainous racial histories, the present pseudo-conjecture of the rainbow nation, or the entrenched inequalities which affect numerous facets of our lives. The data which informs an AI algorithm is influenced by these worldly biases.

Some of these biases could be the culmination of insufficient evidence and exposure informing the AI. A lack of exposure to diverse data sets may mean that the lived experiences of the marginalised are not included in the training data. This lack of diversity can lead to the dissemination of AI which fails to cater sufficiently for diverse applications, or which risks culturally offending people because of gaps in its training.

Digital colonialism is another argument which can be used to explain why AI can be biased. AI researchers need to be cognizant of the systemic historical divides which shape our reality. This can be illustrated by the question of ownership: ownership of the data and of the imported algorithms could be problematic, one example being the implementation of Eurocentric criteria in the data sets.

Hypothetically, if the EPWP hiring lottery system were modelled after the controversial Amazon hiring algorithm, which was found to penalise female applicants for technical roles, one can readily see how that would become problematic for society. What I am arguing here is that AI, unless sufficiently assessed and regulated, could lead to the proliferation of systems which encompass harmful biases.

Employment, in the general sense, should be based on merit, but if patriarchal selection patterns inform the AI, this could yield highly problematic results for a society which seeks to move on from prejudice. AI thus should not reinforce the very problems which society wishes to circumvent.

Ideally, one might think that reducing human involvement would mitigate greed-induced corruption and nepotism, but it is also important to ensure that the AI is monitored and measured on its inclusivity, so that the algorithm’s data set is not as prejudiced as the Amazon hiring system’s.

The City of Tshwane Metropolitan Municipality’s “AI” acts as a random lottery system, which works differently from the Amazon hiring algorithm. However, whether this alternative lottery approach is better still needs to be critically examined. The lottery system randomly awards temporary employment opportunities to qualifying candidates.
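To make the idea concrete, a lottery of this kind can be sketched in a few lines of code. The sketch below is purely illustrative and assumes a simple candidate list with a single qualifying flag; it is not the municipality’s actual implementation.

import random

# Illustrative lottery selection: every qualifying registrant has an
# equal chance of being drawn, and no official picks names by hand.
def run_lottery(registrants, vacancies, seed=None):
    qualifying = [r for r in registrants if r["qualifies"]]
    rng = random.Random(seed)  # a published seed would aid transparency
    return rng.sample(qualifying, min(vacancies, len(qualifying)))

applicants = [
    {"name": "Applicant A", "qualifies": True},
    {"name": "Applicant B", "qualifies": True},
    {"name": "Applicant C", "qualifies": False},
]
print(run_lottery(applicants, vacancies=1, seed=2023))

The point of such a design is that the only human judgement lies in setting the qualifying criteria; the draw itself leaves no room for favouritism.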

However, the lottery system isn’t an AI, and the video’s title is highly misleading. The dilemma of AI hiring nonetheless remains topical, especially considering the encroachment of AI. Perhaps AI hiring technology is merely regurgitating the pre-existing biases in the hiring process. Selection is a bias, and the process of choosing is based on criteria including work experience, age, region, qualifications and other essential skills. Separating bias from recruitment is therefore not possible.

Merit is usually the common ground that justifies selection, but the legacies of Apartheid have made selection more complicated. It is thus important to ensure that the adopted algorithm is well attuned to the state’s redistributive prerogative of affirmative action.

It is imperative to be cognizant of the limitations of AI and to ensure that algorithms do not entirely replace human-based selection, so that human resources professionals can identify biased judgements that run contrary to the state’s affirmative action agenda.

Careful policy measures are imperative to help address these concerns. Perhaps it is time to develop an AI algorithm framework to facilitate AI adoption in South Africa.

This framework can be informed by the Department of Telecommunications and Postal Services’ Presidential Commission on the 4th Industrial Revolution, given the commission’s complementary agenda of ensuring that 4IR technology and AI are integrated into society through rigorous planning and discussion by academics, civil society and other stakeholders.

It is important that this presidential commission is mindful of issues such as circumventing algorithmic bias, along with the many other issues which could arise from the integration of 4IR technologies into society.

Smart technology is fast becoming an apparatus that can be used to improve processes, but technological adoption tends to involve trade-offs which may lead to social problems. It is important that the presidential commission on the 4th Industrial Revolution explores these potential negative social impacts so that technology does not fix one problem while creating the next.

The European Union (EU), for its part, has proposed an AI law to show that it is aware of the problems of AI. Singapore took a similar approach toward AI regulation as early as 2017, when AI ethics became a topical area of concern.

In the space of AI research, numerous algorithms have been deemed highly problematic across an array of applications, from AI judges to self-driving cars which, in some instances, failed to identify pedestrians. The problems with AI are well documented, and government intervention is imperative to ensure that the encroachment of AI does not create new dilemmas because of insufficient regulation.

Therefore, what I would like to raise is that algorithmic transparency and auditing are imperative to ensuring that the AI adopted encompasses programmed guidelines which mitigate biases.
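As a rough illustration of what such an audit could check, the sketch below compares selection rates across groups and flags large gaps. The groups, the figures and the 80% threshold (the common “four-fifths” rule of thumb) are illustrative assumptions, not a prescribed South African standard.

# Illustrative audit check: compare selection rates across groups and
# flag any group selected far less often than the best-performing group.
def selection_rates(outcomes):
    return {group: selected / applied for group, (selected, applied) in outcomes.items()}

def disparate_impact_flags(rates, threshold=0.8):
    best = max(rates.values())
    # Flag groups whose selection rate falls below 80% of the best rate.
    return {group: rate / best < threshold for group, rate in rates.items()}

outcomes = {"group_a": (120, 400), "group_b": (60, 400)}  # invented figures
rates = selection_rates(outcomes)
print(rates)                          # {'group_a': 0.3, 'group_b': 0.15}
print(disparate_impact_flags(rates))  # {'group_a': False, 'group_b': True}

Publishing such audit results alongside the algorithm would give the public a way to verify that the system is behaving as claimed.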

For this to work, the state needs to ensure that the AI algorithms adopted are sufficiently vetted for social applications. It is, however, difficult to critique AI adequately if the engine under the hood is not visible for the public to scrutinise. I am thus proposing that AI is vetted, prior to its launch, for its performance across an array of social functions.

For this to happen, multiple stakeholders are needed to help shape an inclusive framework which deals with the limitations of AI. These requirements should be universalised across numerous sectors to ensure that AI adoption is met with the necessary scrutiny.

Blessing Mbalaka is a Junior Researcher, Digital Africa Unit, Institute for Pan-African Thought and Conversation at the University of Johannesburg.

Daily News