IBM Watson's next act: The fashion designer

IBM’s cognitive computing system, Watson, has come a long way since winning the American quiz show Jeopardy! in 2011. One of its most prominent applications has been in healthcare. Physicians don’t have time to read all of the medical literature being generated, so Watson helps out by examining “all available data sources to form hypotheses and test them”.

This is what Watson excels at: the legal, retail and healthcare industries have all been using its natural language abilities to understand complex information and make use of its recommendations.

But testing computational creativity is where it gets most interesting. Watson’s most recent spark of creativity was unveiled yesterday at New York’s Met Gala at the Metropolitan Museum of Art. A model wore Watson’s “cognitive dress”, covered in roses with embedded LEDs that change colour in real time depending on the nature of the tweets about the Met Gala.

In designing the dress, Watson collaborated with the fashion house Marchesa. An IBM blog post about the project said:

“The dress’ cognitive creation relies on a mix of Watson APIs, cognitive tools from IBM Research, solutions from Watson developer partner Inno360 and the creative vision from the Marchesa design team.”

Does this show AI creativity?

Experts are divided on what constitutes “creativity”, and some may not be convinced by this particular project, seeing it as just another gimmick in the world of fashion as designers experiment with different ways to incorporate tech into their designs.

But delve deeper into the creative process and a number of sophisticated tools were at work. Watson used the Tone Analyzer to understand the “psychological effects of colours, the interrelationships between emotions, and image aesthetics”, and it also suggested “colour palettes in line with Marchesa’s brand”.
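To make the idea concrete, here is a minimal sketch of how tweet emotion could drive the dress’s LED colours. The emotion labels mirror the categories IBM’s Tone Analyzer reported (joy, anger, fear, sadness, disgust), but the colour palette and function names are invented for illustration; this is not IBM’s or Marchesa’s actual pipeline.

```python
# Hypothetical mapping from the dominant emotion in tweets to an LED colour.
# The palette below is an illustrative assumption, not Marchesa's real one.
EMOTION_TO_RGB = {
    "joy": (255, 200, 0),      # warm gold
    "anger": (255, 0, 0),      # red
    "fear": (128, 0, 128),     # purple
    "sadness": (0, 0, 255),    # blue
    "disgust": (0, 128, 0),    # green
}

def dominant_emotion(scores):
    """Pick the highest-scoring emotion from a {label: score} dict."""
    return max(scores, key=scores.get)

def led_colour(scores, default=(255, 255, 255)):
    """Return an RGB triple for the dominant emotion; white if unknown."""
    return EMOTION_TO_RGB.get(dominant_emotion(scores), default)

# Example: tweets scored mostly joyful, so the roses would glow gold.
print(led_colour({"joy": 0.8, "anger": 0.1, "sadness": 0.1}))  # (255, 200, 0)
```

In the real installation the scores would come from analysing live tweets; here they are hard-coded to keep the sketch self-contained.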

While all this is impressive, in the world of songwriting and musicals computers face a more daunting challenge. Last year, Watson attempted to analyse famous musician Bob Dylan’s lyrics, reading 800 million pages per second, and concluded that Dylan’s major themes were “time passes and love fades”.

That’s impressive, but what about the subtleties in lyrics? As our Editorial Director Martin Veitch observed: can Watson “spot the ironies in words” or “detect sarcasm in the voice”?

Computers also faced challenges with songwriting earlier this year when they attempted to help create the world’s first computer-generated musical. The composers who worked with the computers admitted to almost tearing their hair out, and even worked some of their frustrations into the songs that appeared in the musical. In the end, around 25% of the computer-generated lyrics made it into the finished piece. But the computer was still very much an integral part of the musical’s creation.

All this shows that AI is not one hundred percent there yet. But the creative sparks are, and while computers still need humans to ignite them, maybe one day they won’t.


Also read:

Computer-musical hits the right ‘notes’ but still needs humans to deliver

IBM Watson, Bob Dylan and the limits of machine intelligence

AI in Art: Can a computer be an art critic?

Ayesha Salim

Ayesha Salim is Staff Writer at IDG Connect

