---
title: "Revealing Data Possibilities Through Touch"
slug: "revealing-data-possibilities-touch"
description: "Tactile data interfaces and the evolution of data interaction from touch to voice and AI."
datePublished: "2012-09-15"
dateModified: "2026-03-15"
category: "Data Strategy"
tags: ["UX", "touch", "interaction", "visualization"]
tier: 3
originalUrl: "http://www.applieddatalabs.com/content/revealing-data-possibilities-touch"
waybackUrl: "https://web.archive.org/web/20120915093640/http://www.applieddatalabs.com:80/content/revealing-data-possibilities-touch"
---
Revealing Data Possibilities Through Touch
In 2012, I wrote an unusual piece inspired by Fox's TV show "Touch," the Kiefer Sutherland series about a boy who could see the hidden patterns connecting all human lives. I compared Jake's ability to see invisible connections to what big data and analytics could theoretically do: reveal the hidden patterns and trends in our interconnected world.
It was a fun analogy. It was also accidentally prophetic about how we'd end up interacting with data.
The 2012 Vision
The original article used "Touch" as a lens to explore a deeper idea: everything is data, everything has patterns, and the right analytical tools could make those patterns visible. Jake was "the world's greatest analytics program crammed in the body of a small boy," I wrote. The show demonstrated how tiny, peripheral changes could ripple into outsized effects, the butterfly effect applied to human relationships.
I argued that this level of understanding was "theoretically possible" with enough data and sophisticated enough analysis. That the invisible connections between events, people, and trends could be surfaced by software.
In 2012, this was aspirational. In 2026, large language models do something uncomfortably close to it every day. I used a TV show about a boy who sees patterns everywhere as a metaphor for analytics; fourteen years later, we'd built AI systems that actually find those patterns, and we access them by talking.
From Touch Screens to Natural Language
The title "Revealing Data Possibilities Through Touch" had a double meaning. It referenced the show, but it also pointed at touch-screen data interaction, which was still novel in 2012. The iPad had been out for two years. The idea of pinching, swiping, and tapping to explore data felt exciting and new.
Look at the progression since then:
Touch (2010-2015): Tablets made data exploration physical. Roambi let executives swipe through visualizations on iPads. It felt like the future.
Voice (2014-2020): Alexa and Siri brought voice commands to data queries. Tableau added natural language input. It worked, kind of. Not good enough for complex questions.
Conversational AI (2023-present): ChatGPT changed everything. "Analyze this spreadsheet and tell me which categories are declining" now produces coherent answers with charts. The interface for data analysis became a conversation.
"Show Me" Actually Works Now
I spent years in the data industry watching business users struggle with BI tools. Tableau was supposed to be easy, and it was easier than writing SQL, but it still required learning an interface and understanding data modeling concepts. Most business users never got past the basics.
Now someone can upload a CSV to ChatGPT and say "what's interesting in this data?" and get back a thoughtful analysis with visualizations. They can follow up with "break that down by quarter" or "compare the top 5 customers" using plain English. No training required. No interface to learn.
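Under the hood, a follow-up like "break that down by quarter" still compiles to ordinary aggregation code; the AI just writes it for you. A minimal sketch of the equivalent operation, using a hypothetical sales table with `date`, `category`, and `revenue` columns (illustrative data, not from the original article):

```python
import pandas as pd

# Hypothetical sales data standing in for an uploaded CSV.
sales = pd.DataFrame({
    "date": pd.to_datetime(["2025-01-15", "2025-04-02", "2025-07-20", "2025-10-05"]),
    "category": ["Hardware", "Hardware", "Software", "Software"],
    "revenue": [1200.0, 900.0, 1500.0, 1800.0],
})

# "Break that down by quarter": bucket dates into quarters, then aggregate.
by_quarter = (
    sales.assign(quarter=sales["date"].dt.to_period("Q"))
         .groupby(["quarter", "category"])["revenue"]
         .sum()
)
print(by_quarter)
```

The point of the conversational interface is that the business user never sees this code; the "plain English" request and the `groupby` are the same question in two languages.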
This is what the original article was really about. Not touch screens or TV shows, but the aspiration to make data accessible to anyone. To reveal the patterns without requiring the viewer to be a data scientist. Natural language AI achieved that in a way that touch screens, dashboards, and voice assistants never quite did.
The Hidden Patterns Are Real
AI actually does find patterns that humans miss. Fraud detection identifies transaction networks no analyst would connect. Supply chain AI spots correlations between weather, shipping routes, and demand. The patterns were always there. The journey from touch screens to AI-powered natural language interfaces was really about making those invisible patterns visible to everyone.
What This Means for Organizations
The practical implication is that the barrier to extracting value from data dropped dramatically. When a sales manager can ask an AI "which of my deals are most at risk this quarter and why?" and get a data-backed answer in seconds, the entire relationship between business users and data changes.
But there's a catch. Easy access to data analysis doesn't mean the analysis is always correct. AI can find patterns that are spurious. It can hallucinate insights. It can present correlation as causation with the same confident tone it uses for genuine findings. Human judgment about what to trust and what to question matters more than ever, precisely because the interface makes everything feel authoritative.
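The spurious-pattern problem is easy to demonstrate: scan enough unrelated series and strong correlations appear by chance alone. A quick simulation (my illustration, not from the original article):

```python
import numpy as np

rng = np.random.default_rng(0)

# 200 completely independent random series, 20 observations each.
series = rng.normal(size=(200, 20))

# Pairwise correlations between every pair of series (rows are variables).
corr = np.corrcoef(series)

# Zero out self-correlations on the diagonal, then find the strongest pair.
np.fill_diagonal(corr, 0.0)
strongest = np.abs(corr).max()

# With no real relationship anywhere in the data, the best-looking
# "pattern" among ~20,000 pairs is still strikingly strong.
print(f"Strongest correlation among unrelated series: {strongest:.2f}")
```

An AI that surfaces the top correlation in a dataset will happily report this pair as an insight, which is exactly why the human has to ask whether a pattern survives scrutiny.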
Jake from "Touch" always knew which patterns mattered. AI doesn't have that judgment. That's still on us.