AI should enhance, not replace, humans, say CEOs of IBM and Microsoft

Artificial intelligence should enhance human workers, not replace them, at least according to the CEOs of IBM and Microsoft.

Ginni Rometty and Satya Nadella made clear their view of the role of AI in a panel discussion at the World Economic Forum in Davos, Switzerland, on Tuesday, just a few hours after Rometty circulated IBM's three guiding principles for the development of cognitive technologies to company staff.

Though less dramatic and snappily expressed than Isaac Asimov's three laws of robotics, IBM's three principles are nevertheless intended to limit the harm caused by the introduction of AI technologies.

The first thing to understand is the purpose of these technologies. For IBM, Rometty said, "it will not be man or machine: Our purpose is to augment and be in service of what humans do."

Next on her list is transparency: "If someone is using a system, tell them it is artificial intelligence. Tell them how it got trained. Was it trained by experts? What data was used to train it? The human needs to remain in control of these systems," she said.

The third principle is to ensure humans have the skills to work with new cognitive technologies.

"The odds are there are some jobs that will be replaced, but most of us will be working with these systems," she said. Companies like IBM need to ensure not just that AIs are trained, but that people are trained, too.

"The skills needed to succeed in this world are not all high-degree skills," Rometty said, while encouraging businesses to work with schools. "Give them a curriculum that's relevant, give them mentorship, and be sure they're teaching what you're hiring for," she said.

Nadella echoed IBM's purpose: "It's our responsibility to have AI augment human ingenuity and opportunity," he said.

How far human responsibility goes in this industry is an open question, though. "This is one of the harder challenges," he said. "How do you take accountability for the decisions algorithms are making in a world where the algorithms are not being written by you, but are being learned?" 

He agreed with the need for transparency. Otherwise, he said, "whose black box do you trust? What is the framework of law and ethics that is ... able to govern the black box? Who is in charge of that?"

That's something the IT industry needs to work on, said Rometty.

"It's our responsibility, as leaders that are putting these technologies out, to guide them in their entry to the world in a safe way," she said. She pointed to the Partnership on AI as an example of the way in which industry is taking a lead.
