Deloitte launches certified quality mark for AI and Robotics

31 October 2018

As society prepares for a future where artificial intelligence plays an ever more prominent role, the ethical implications of the emergent technology cannot be ignored. Big Four firm Deloitte has partnered with the Foundation for Responsible Robotics to offer a new certification system aimed at ensuring that AI does not simply end up reinforcing the same old prejudices.

While it is often easy and even tempting to discuss artificial intelligence as if Skynet were about to overthrow humanity and take control of the world, the truth of the matter is that AI is only as competent and ethical as its creators. It is a far greater possibility that those who program it might pre-package their own ideological predispositions inside the new technology, or that the information the AI 'learns' from may likewise contain certain social norms and biases.

A well-publicised illustration of this is the automated recruitment programme Amazon reportedly pulled the plug on after its technicians could not find a way to stop it discriminating against female job applicants. Machine learning, one of the core techniques in the field of artificial intelligence, involves teaching automated systems to devise new ways of doing things by feeding them reams of data about the subject at hand; any biases in that data can therefore simply be reproduced and reinforced by the resulting AI systems.
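The mechanism can be seen in a toy example. The sketch below uses entirely hypothetical data and a deliberately naive nearest-neighbour "learner" (not Amazon's actual system) to show how a model fitted to biased historical hiring decisions reproduces that bias for new applicants:

```python
# Hypothetical historical hiring decisions:
# each record is (years_of_experience, is_female, was_hired).
# The past labels penalise female applicants regardless of experience.
history = [
    (5, 0, 1), (3, 0, 1), (2, 0, 1),
    (5, 1, 0), (6, 1, 0), (4, 1, 0),
]

def predict(years, is_female):
    """Naive nearest-neighbour 'learner': copy the decision made
    for the most similar past applicant."""
    nearest = min(history,
                  key=lambda h: abs(h[0] - years) + abs(h[1] - is_female))
    return nearest[2]

# Two candidates with identical experience get different outcomes,
# because the training data encoded the bias.
print(predict(5, 0))  # male candidate   -> 1 (hired)
print(predict(5, 1))  # female candidate -> 0 (rejected)
```

Nothing in the "algorithm" is explicitly sexist; the discrimination lives entirely in the training data, which is precisely why it is hard to detect and remove after the fact.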


At the same time, since AI is largely developed in the interests of corporate entities or governmental security services, ethical accountability in the matter is key. Without it, those in charge of the future of AI could employ it to consolidate existing power structures and social disparities – with the trials of the Chinese Government's heavily criticised 'Social Credit' system currently showing how this could take shape. In some areas of the steadfastly authoritarian superpower, citizens can now be tracked by facial recognition technology, and have certain rights and privileges removed if they are judged to be threatening the stability of the current economic or social order.

AI in control

With this in mind, Netherlands-based organisation the Foundation for Responsible Robotics has teamed up with Deloitte to launch a certified quality mark for AI and Robotics. According to Aimee van Wynsberghe, co-director of the Foundation for Responsible Robotics, the goal is to shape "a culture of responsible development of AI and Robotics to promote public good and a better life for us and generations to come."

In terms of what the project offers consumers, the hope is that whether they are a concerned parent, an NGO worker, a farmer or just someone looking for a job, a quality mark handed to responsible AI developers will foster transparency and responsibility to protect societal values. Meanwhile, for producers, the aim is to help foster responsible design and innovation. The programme will also work with businesses to assess their products and build a pathway for responsible innovation in their company.

Former Robot Wars judge and co-director of the Foundation for Responsible Robotics, Professor Noel Sharkey, said, “Society could reap enormous benefits from AI and Robotics, but only if we get it right. We need to counter the scare stories, and the hype, or risk a public backlash. We must offer the public a mark of quality that helps them to make informed decisions.”

The partnership was initiated by the Dutch arm of Deloitte, but the broader idea is to operate as a pan-European certification. If all goes well, Deloitte plans to leverage the knowledge gained for a global roll-out.

Marc Verdonk, a Partner in Deloitte's Emerging Technology practice within Risk Advisory, added, “By contributing our knowledge and experience in auditing, innovation and AI we can help to create trust in the chain from producer to consumer.”
