Collaborative Working

Was IT better before the 90s?

Aren’t you sick to death of all this open systems, collaboration and multitasking culture? Is The Cloud just a polite way of saying we’re ‘all over the place’? Were we better off in the days of information rationing, when control was centralised and ‘The Computer Department’ operated under a code of omertà?

These days, interviews with developers are depressing. It’s not the friendly helpfulness that bothers me so much as the open-mindedness. How do they ever concentrate under these conditions?

In the spirit of openness, it’s worth examining some of the cultural changes in our relationship with technology since the beginning of the 1990s.

 

Proprietary culture versus sharing

It wasn’t just the systems that were proprietary back then. We IT staff networked like an IBM System 360, which was hardly at all. Even our help messages were indecipherable. Taking our lead from Big Blue, we saw the best way of increasing our value was to hoard information. These days it’d probably be called something like Scrooging-as-a-Service.

If you were the only person who’d seen a telecoms engineer reset a PBX, that gave you a competitive knowledge advantage that could make you the most valuable drone in the call centre – and I speak from experience here. Why dilute your value by sharing the secrets of pulling a circuit board in and out? When an IBM engineer came to visit, if he (it was always a he) spotted a rival computer in the office, he’d get on the phone to report he was on an “infected site”.

People were nasty, single-minded and shallow – but we were happier because we knew our place. Today’s open systems culture is baffling. The other day I interviewed three Ruby on Rails developers at digital agency dxw, all of whom shared everything. If Duncan was away with the users (another odd, modern concept) then Tom or Michael could cover, because they’d spent so much time on Slack, sharing information and being a team. They weren’t even all that fussed about their job titles. Is there a danger these young people are ‘slack walking’ their way into a world where status doesn’t matter?

Millennials view collaboration and the sharing of information as normal, and they’re in the majority now, warns David McLeman, CEO of service provider Ancoris.

It’s the social networks, you see. These poor kids can’t remember a time without the internet, mobile phones and web applications. “Social media is part of the way they work, socialise and communicate,” says McLeman.

What I think he’s trying to say is that their concentration spans never stood a chance.

 

The non-doc society versus non-stop blogs

Back in the heady days of the 90s, documentation was for wimps. Nobody left explanatory notes about anything. In fact, being told how things work was regarded as unmanly, like asking for directions when you’re lost. Everyone had to pretend that they learned about computers the hard way, by taking them to bits and somehow working out for themselves the complex relationships between tapes, RAM, disks, machine code and operating systems. Oddly, taking driving lessons was regarded as OK. But learning how to drive and maintain a computer was for losers.

As a consequence of this ‘trial and error’ learning system, London’s traffic was brought grinding to a halt one day in 1990. Thousands sweated in their stationary cars for hours, with no idea why. It was because the AS/400 that controlled the removal of road blockages was running slow, as someone in IT (for the London Metropolitan authority of the day) chose a disastrous menu option out of a quest for self-taught knowledge. I know this from personal experience too. Sorry to anyone who suffered. Anyone whose car wasn’t unclamped within four hours was legally due a refund. But that was another piece of information my employer hoarded.

Many a company was held to ransom by a contract programmer who’d created all kinds of mysterious loops and smokescreens then left. The company had no choice but to rehire him when more work needed doing. Nobody ever left explanatory notes because, as IT suppliers would say, “there’s margin in mystery”.

These days, I’m told, millennials love sharing their knowledge. In theory there’s so much work available that nobody needs to be greedy. The downside of this is that for every hour they spend coding, today’s digirati spend four hours blogging about it, making “How to” videos for YouTube and optimising it for some search algorithm.

Now that software defines everything, the old two-year release cycle has been replaced by a regular stream of changes, says Sacha Labourey, CEO of CloudBees.

This shift forces companies to move from the typical ‘silo’ approach to production teamwork. What I think he’s trying to say is: isn’t it sad we are losing our individuality?

 

Focus versus multi-tasking

Just as machine refused to speak unto machine, tribalism was rife among the staff. We had more internal divisions than a VMware operator could spin up in a week. In IT, operators and programmers would routinely blame problems on each other, like warring neighbours lobbing dog poo over the garden fence. When the phones failed, you would be batted between the internal and external engineering depots of the telecoms provider until you eventually gave up.

These days, everyone wants to handle voice calls. They’re all converged multitaskers and get involved in all kinds of jobs, forming a fluid team of interchangeable versatile units. It’s like IT’s answer to the Dutch concept of total football.

Is the modern obsession with multi-tasking based on a productivity myth? Andrew Filev, CEO of Californian “work management” startup Wrike, says there’s evidence that doing several jobs at once is brainless.

“People who enthuse about multitasking have no idea about cognitive functions,” he wrote in a recent blog. “Your brain doesn’t work by flitting from one subject to another. You need time to concentrate on a single subject.” 

Studies by the American Psychological Association and the University of Utah’s Department of Psychology suggest that modern multitaskers are at least 40 per cent less efficient, the lost productivity coming from the mental adjustment required every time you switch jobs. Besides, only two per cent of the population can multitask effectively anyway.

 

Conclusion

We technology dinosaurs once dismissed open source software as a teenage phase, but entire companies are now based upon it, according to Tarkan Maner, CEO of Nexenta, one of the companies that aims to break up the proprietary storage racket. Technologies created on the basis of sharing information can instil a transparent way of working that encourages a joint effort towards the bigger picture, argues Maner.

“When companies like Tesla open up their patents to drive the development of electric cars, it’s obvious that we’re in a culture shift and well on the road to an open source-driven world,” says Maner.

Maybe so. But I still don’t like multitasking. It’s more than my job’s worth.

Am I alone in this?

Nick Booth

Nick Booth worked in IT in the UK’s National Health Service, financial services and The Met Police, witnessing at first hand the disruptive effects of new technology. As a journalist and analyst, his mission is to stop history repeating itself.

