Unethical AI poses risk to mining companies
The issue was highlighted at the recent annual Wits Mining Institute (WMI) Sibanye Stillwater DigiMine hybrid seminar roundtable discussion, where the question of counterproductive technology was brought to the fore.
“In many instances, the creators of AI are unregulated, there is a question of ethics and there is no registration required,” says Professor Cawood.
Data a challenge
Jean-Jacques Verhaeghe, programme manager: Real Time Information and Management Systems at the Mandela Mining Precinct, notes that gathering data is a challenge.
“You can simulate data, or download datasets off the Internet, but receiving data from mining companies proves challenging, as they often keep their data closely guarded,” he says.
“Finding those mining companies and original equipment manufacturers that want to collaborate on data collection is critical,” he adds.
A miner’s perspective
From a miner’s perspective, Sibanye-Stillwater Group head of Innovation Alex Fenn says that companies are reluctant to share datasets because the end use and application of that data are not always disclosed.
“We have to protect our business, understand what we are handing over and ensure that we are not compromising governance frameworks by handing over that data to third parties.”
Fenn highlighted that AI can become problematic when the values of its creators do not reflect company values, and when it incorporates bias.
“Hypothetically, if you have a programme that’s responsible for making a determination based on diversity and inclusivity, you’ll likely use historical datasets to train the model. Because you are trying to change what has occurred in history, that dataset is going to introduce a bias to the model, which will then inform the inferences the model makes,” he explains.
It is a matter of being wary of what you use to train your AI, to ensure that the outcome is not biased in the direction you are trying to move away from.
“This is massively complex, because in essence the person programming the AI has inherent biases themselves,” Fenn highlights.
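As a hypothetical illustration of the mechanism Fenn describes, the short Python sketch below trains a simple classifier on synthetic “historical” hiring decisions that favoured one group. The scenario, the synthetic data and the use of scikit-learn’s LogisticRegression are assumptions for illustration only, not drawn from the seminar.

```python
# Hypothetical sketch (not from the seminar): a model trained on
# historically biased decisions reproduces that bias in its predictions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)    # 0 = group A, 1 = group B
skill = rng.normal(0.0, 1.0, n)  # skill is identically distributed in both groups

# Historical labels: past decisions favoured group A regardless of skill.
hired = (skill + 1.0 * (group == 0) + rng.normal(0.0, 0.5, n)) > 0.5

X = np.column_stack([group, skill])
model = LogisticRegression().fit(X, hired)

# Two equally skilled candidates who differ only in group membership:
p_a = model.predict_proba([[0, 0.0]])[0, 1]  # group A
p_b = model.predict_proba([[1, 0.0]])[0, 1]  # group B
print(f"P(hired | group A) = {p_a:.2f}, P(hired | group B) = {p_b:.2f}")
```

In this toy setup, two equally skilled candidates receive different scores purely because the model has learned the historical preference, which is exactly the bias Fenn warns would carry through into the model’s inferences.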
Regulation of AI
Where the regulation of AI is concerned, the European Union released draft regulations on the governance of AI in April.
“In simple language, the regulations distinguish between good AI and bad AI,” says Cawood.
AI systems considered a clear threat to the safety, livelihoods and rights of people will be banned.
This includes AI systems or applications that manipulate human behaviour to circumvent users’ free will, and systems that allow ‘social scoring’ by governments.
“This also includes AI where the consequences are not fully understood, have not been properly tested and end up harming a community, a business process or a person,” says Cawood.
World first AI system patent
In August this year, South Africa became the first country in the world to grant a patent naming an artificial intelligence system as the inventor.
Granting AI legal rights, Fenn notes, is difficult to comprehend, and it is extremely important that the implications are well understood before setting precedents that could upset the balance of innovation in general.
“Going forward, this will be hotly debated, as it can bring complications in the future and will need to be understood in order to advise policymakers to make the right decisions.
“Professional bodies and governing bodies are going to be instrumental in the success or extreme failure of various aspects of AI in industry,” he says.
Not all doom and gloom
Cawood reiterated the importance of test mines, such as those at the WMI, paired with the DigiMine facilities, to enable the further development of AI and other technologies in a controlled environment.
Sibanye-Stillwater shift boss Katekani Maswanganyi noted that in future, regulation would possibly look to hold both the manufacturer and the programmer legally liable for any problems that may arise.
However, for Maswanganyi, the advent of AI was not all doom and gloom. “It will also allow for women in science, technology, engineering and mathematics to become more involved in industry,” she says.