Every company in the world – and its wife – is leaping on the AI bandwagon. There is a mad scramble to find the most worthy (and lucrative) use cases, make them work, and then integrate them with existing software frameworks. So it is little surprise that Microsoft dedicated a keynote at today’s Future Decoded event in London to its strategy in this area.
This includes a rather impressive live demonstration of its new multilingual audio translation service, due to ship at the end of the year. It also features feedback from UK predictive keyboard startup SwiftKey, which Microsoft acquired earlier this year; the app now incorporates simple multi-language features and makes it easier for users to include context-relevant emojis.
Chris Bishop, Laboratory Director at Microsoft Research Cambridge, talks broadly about a future in which humans work even more closely alongside machines. Naturally he is keen to refute any charges of artificial beings stealing all our jobs, and instead talks about machines taking the donkey work away from mankind.
Interestingly, Stephen Hawking was down to speak at the event but cancelled yesterday. So we’ll never know whether he planned to give his standard ‘AI is evil’ speech or had a slightly different message prepared.
Bishop specifically describes AI “democratising” across four key areas: agents, applications, services and infrastructure. None of this seems especially new or revolutionary, and all are areas being covered extensively by other companies. Yet this will always be an issue for large firms like Microsoft, and it doesn’t mean the company can’t make extensive headway through concerted R&D and its wide network of existing partnerships.
Bishop talks about AIs which go far beyond chatbots, escalating customer service queries to a human agent when the machine can’t answer. The solution he describes is currently being tested with 2,000 customer service agents in the US, and sounds a long way behind the work of bespoke providers like IPSoft.
Bishop highlights a specific application in healthcare, where radiologists can cut out the manual labour associated with ‘drawing around’ cancer tumours which need zapping, and healthy organs which don’t, on a 2D x-ray. Instead, two hours of human work can be delegated to an AI which can perform the task more precisely in a fraction of the time. “To be able to do this automatically opens up a whole sphere of personalised healthcare,” he says.
Bishop points to the IK Prize for art, run by the Tate in conjunction with Microsoft. This year it was won by Fabrica for its exhibit, Recognition, which took faces pulled from Reuters news images and matched them against a vast archive of digital artworks. The Tate described the result on its website as “a time capsule of the world represented in diverse types of images, past and present.”
“Machine learning is based on big data, algorithms and compute,” says Bishop, who is keen to highlight Microsoft’s background strength in computing power. Microsoft datacentres have always included CPUs for computing and GPUs for graphics, and have more recently added FPGAs, he says. This development has helped Microsoft launch its first exascale AI supercomputer, deployed across 15 countries. To add some context, Bishop explains that this increased processing power allows the entire English-language Wikipedia to be translated into Russian in under a tenth of a second.