Could tech detect workplace liars?

“It’s so easy to lie,” said University of Massachusetts psychologist Robert S. Feldman in 2002. Feldman had just authored a report which found that 60% of people lied at least once during a 10-minute conversation, telling an average of two to three lies. Last month, Dr Peter Chadha, CEO of business and IT consultants Dr Pete, posed the question of whether this inherent lying could actually be detected using new technology and asked what benefit this would have for business.

It’s an intriguing question which he claims was in part prompted by global pharmaceutical firm Sanofi’s ex-CEO Christopher Viehbacher hitting the headlines following a series of boardroom bust-ups. Viehbacher was accused of being economical with details about company deals, including plans to sell off an $8bn portfolio of off-patent medicine, reports said.

“Would he have revealed his plans to the board if he knew Big Brother was watching his every movement?” asks Dr Chadha.

Probably not. Dr Chadha argues that this is “not the domain of the CIA”: anyone could, in theory, get access to the latest technology and develop their own Big Brother system.

“Advances in cloud computing and an amplification algorithm combined with physiology and psychology means a simple smartphone recording, video conferencing footage, or Google Glass real-time video can be used with cloud computing to identify the physiological state of the CEO or IT director you've just had a business meeting with,” continues Dr Chadha. “In effect, without any ECG probes, these algorithms can detect any micro-signalling with a high degree of certainty in real time to know instantly if what they've told you is in fact true.”

It seems excessive, but when billions of dollars are at stake you never quite know how large corporations will play. The benefits to decision-makers could override any legal or moral case against its deployment.

Dr Chadha suggests a possible scenario.

“Imagine conducting a video conference with a potential business partner; they are telling you all about how great their new product is. Then, BUZZ… your smartwatch vibrates letting you know that there are signs that they are perhaps not being totally honest, just before you place an order for 10,000 units. Would you still place the order?”

Visual magnification

So is this all possible? In 2014 Michael Rubinstein, a computer scientist working out of a Google lab in Cambridge, Massachusetts, delivered a TED talk unveiling a visual magnification technology that can determine changes in people’s heart rates by amplifying microscopic movements and colour changes in the skin. It is still impressive viewing, and the technology is creeping forward steadily.
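The core idea behind this kind of video magnification is simple in principle: the average brightness of a patch of skin fluctuates very slightly with each heartbeat, so filtering a pixel’s intensity over time to the plausible heart-rate band (roughly 0.7–4 Hz, i.e. 42–240 bpm) and amplifying that band makes the pulse measurable. A minimal sketch of the principle, using a synthetic one-pixel brightness trace rather than real video (the function names and parameters here are illustrative, not from Rubinstein’s actual system):

```python
import numpy as np

def estimate_pulse_hz(signal, fps, lo=0.7, hi=4.0):
    """Estimate the dominant frequency (Hz) of a skin-brightness time
    series within the typical human heart-rate band (lo..hi Hz)."""
    signal = signal - signal.mean()              # remove the DC (steady brightness) term
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= lo) & (freqs <= hi)         # restrict to plausible pulse rates
    return freqs[band][np.argmax(spectrum[band])]

def amplify_band(signal, fps, lo=0.7, hi=4.0, alpha=50.0):
    """Eulerian-style magnification: boost only the pulse-band
    frequencies so an invisible fluctuation becomes visible."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= lo) & (freqs <= hi)
    spectrum[band] *= alpha                      # amplify the pulse band only
    return np.fft.irfft(spectrum, n=len(signal))

if __name__ == "__main__":
    fps, seconds = 30, 10
    t = np.arange(fps * seconds) / fps
    # Synthetic "average skin pixel": steady brightness plus a tiny
    # 1.2 Hz (72 bpm) fluctuation buried in sensor noise.
    rng = np.random.default_rng(0)
    pixel = (0.6 + 0.001 * np.sin(2 * np.pi * 1.2 * t)
             + 0.0005 * rng.standard_normal(t.size))
    bpm = 60 * estimate_pulse_hz(pixel, fps)
    print(f"Estimated heart rate: {bpm:.0f} bpm")
```

A real system works on spatially smoothed video frames rather than a single trace, but the band-pass-and-amplify step above is the essence of what makes a heartbeat recoverable from ordinary footage.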

Rubinstein admits that surveillance is one of the first applications that springs to mind, especially when it comes to being able to determine sounds and conversations by visually recording the tiny vibrations in nearby plants or movable objects and inverting the process. It’s very clever and although the quality is not great you can see where it is going.

But is it right? Business decision-makers might argue that it is justified if it stops the business being duped, saves costs or secures better deals. But equally this could be deemed a slippery slope: even if a business uses the technology discreetly, for its own ends, where will it stop?

Chris Dyson, partner at law firm Ashfords, suggests that any commercial advantage would be countered by greater negative impacts, especially if the process was revealed in any way. And it would be an almost impossible secret to keep.

So could it be used above board? In the US it is generally illegal for private employers to subject employees to a polygraph. In other parts of the world it is not necessarily illegal, but whether it would have any value, particularly in terms of further legal action, is debatable.

“It is unlikely that polygraph evidence would be considered admissible in court or at an employment tribunal, though it would be a matter for the court to decide in each case,” says Dyson.

There is a UK precedent.

In Stephen Robert Allen v FCA (6 August 2014) (FS/2012/0019), a recent decision of the Upper Tribunal (Tax and Chancery Chamber), the individual who was the subject of proceedings by the FCA to disqualify him from practising a regulated activity was not permitted to rely on a polygraph test to demonstrate his truthfulness.

“Any application of a polygraph would have to be with the subject’s consent,” adds Dyson. “There is no scientific consensus as to the accuracy of polygraphs. The main problem is false positives. However, several police forces have recently started to use polygraphs to monitor convicted sex offenders but anyone considering using a polygraph would need to consider compliance with any applicable general law, such as the Data Protection Act.”

For businesses this is surely a case not of ‘when’ but ‘if’, and that ‘if’ hinges on the moral obligation to employees and customers. If that can be upheld then perhaps there is scope, but that would mean transparency of use, which would undermine the technology’s usefulness anyway and probably lead to reputational damage. Few businesses can afford that. And that’s the truth.

Marc Ambasna-Jones

Marc Ambasna-Jones is a UK-based freelance writer and media consultant and has been writing about business and technology since 1989.
