TED Watson
Cognitive computing meets the world's most powerful ideas
Searching Video with Natural Language
In 2015, IBM Watson partnered with TED to create something unprecedented: a cognitive computing system that enables natural-language search across the entire TED Talks archive. For the first time, users could pose questions in everyday language and receive curated video clips from multiple talks that address the topic.
The Challenge
More than 95% of the world's digital material is video, yet it remains difficult to search with traditional methods. Text-based search engines can't understand what's happening inside a video. We needed Watson's cognitive capabilities to unlock the knowledge hidden within TED's vast archive.
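Watson's actual retrieval system is not described here, but the core idea of searching video through its transcripts can be illustrated with a minimal TF-IDF sketch. Everything below is an assumption for illustration: the three transcript excerpts are invented toy data, and the scoring is a textbook TF-IDF ranking, not Watson's method.

```python
import math
import re
from collections import Counter

# Toy "archive": talk title -> transcript excerpt. Illustrative data only,
# standing in for the real TED transcript corpus.
TRANSCRIPTS = {
    "How music heals": "music rhythm brain healing melody music",
    "The science of time": "time perception clock memory physics",
    "Inside the mind": "mind consciousness brain attention memory",
}

def tokenize(text):
    """Lowercase and split into alphabetic tokens."""
    return re.findall(r"[a-z]+", text.lower())

def search(query, docs, top_k=2):
    """Rank transcripts by the summed TF-IDF weight of the query terms."""
    n_docs = len(docs)
    tokenized = {title: Counter(tokenize(text)) for title, text in docs.items()}
    # Document frequency: how many transcripts contain each term.
    df = Counter()
    for counts in tokenized.values():
        df.update(counts.keys())
    scores = {}
    for title, counts in tokenized.items():
        score = 0.0
        for term in tokenize(query):
            if term in counts:
                # Smoothed inverse document frequency.
                idf = math.log((1 + n_docs) / (1 + df[term])) + 1
                score += counts[term] * idf
        scores[title] = score
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [title for title, s in ranked[:top_k] if s > 0]

print(search("how does music affect the brain", TRANSCRIPTS))
```

Because transcripts carry the talk's words into plain text, even this simple ranking surfaces the music talk first for a music question; a system at Watson's scale would layer concept understanding on top of such lexical signals.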
A Universe of Ideas
Watson indexed approximately 1,900 TED videos and transcripts, creating a conceptual universe where ideas cluster into "neighborhoods." The visualization revealed unexpected connections: talks about Music sit close to discussions of Time and Mind, offering new insight into how ideas relate across disciplines.
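The "neighborhoods" idea can be sketched as talks embedded in a shared concept space, where proximity is measured by cosine similarity. The vectors below are hand-made stand-ins chosen for illustration (they are not Watson's learned representations), and the axes and numbers are assumptions.

```python
import math

# Hand-made 3-D "concept" vectors -- purely illustrative stand-ins for a
# learned concept space, not Watson's actual data.
TALK_VECTORS = {
    "Music": (0.9, 0.3, 0.3),
    "Time": (0.3, 0.9, 0.2),
    "Mind": (0.4, 0.2, 0.9),
    "Economics": (0.0, 0.1, 0.1),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def neighborhood(topic, vectors, k=2):
    """Return the k talks whose vectors lie closest to `topic`."""
    ref = vectors[topic]
    others = [(t, cosine(ref, v)) for t, v in vectors.items() if t != topic]
    others.sort(key=lambda kv: kv[1], reverse=True)
    return [t for t, _ in others[:k]]

print(neighborhood("Music", TALK_VECTORS))
```

With these toy coordinates, Music's nearest neighbors come out as Mind and Time rather than Economics, mirroring the kind of cross-disciplinary adjacency the visualization exposed.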
Below each video clip, a timeline displays related concepts, inviting users to wander serendipitously through contextually linked topics. This transforms passive video watching into an active exploration of interconnected knowledge.
Technology Stack
Concept Insights
Maps relationships between ideas and concepts across talks
Personality Insights
Analyzes speaker characteristics and communication styles
AlchemyAPI
Concept tagging and entity identification
NLP Engine
Natural language processing for contextual understanding
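One way to picture how services like those above fit together is as stages in an indexing pipeline, each annotating a talk record in turn. The stage functions below are invented stand-ins for illustration only: the real Watson services were hosted cloud APIs, and this keyword-matching logic is not how they worked.

```python
# Hypothetical pipeline sketch: each stage enriches a talk record.
# Stage names and logic are illustrative assumptions, not Watson APIs.

def tag_concepts(record):
    """Stand-in for concept tagging: naive keyword matching."""
    keywords = {"music", "time", "mind"}
    found = sorted(keywords & set(record["transcript"].lower().split()))
    return {**record, "concepts": found}

def link_concepts(record):
    """Stand-in for concept mapping: pair up co-occurring concepts."""
    c = record["concepts"]
    pairs = [(a, b) for i, a in enumerate(c) for b in c[i + 1:]]
    return {**record, "links": pairs}

def index_talk(record):
    """Run a talk record through each pipeline stage in order."""
    for stage in (tag_concepts, link_concepts):
        record = stage(record)
    return record

talk = {"title": "Demo", "transcript": "music shapes how the mind keeps time"}
print(index_talk(talk))
```

The pipeline shape, where independent analyses accumulate annotations on a shared record, is a common way to compose multiple NLP services over one corpus.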
Impact
The demonstration at World of Watson in Brooklyn showcased how cognitive computing could transform video from a passive medium into an interactive knowledge resource. By making the invisible connections between ideas visible, we helped audiences discover unexpected pathways through the world's most powerful ideas.