---
title: "How Facebook's Graph Search Will Affect Google, Technology, and Privacy"
slug: "how-facebooks-graph-search-will-affect-google-technology-and-privacy"
description: "The evolution of social search — from Facebook Graph Search to AI-powered discovery."
datePublished: "2013-01-19"
dateModified: "2026-03-15"
category: "AI & Privacy"
tags: ["Facebook", "search", "privacy", "social data"]
tier: 1
originalUrl: "http://www.applieddatalabs.com/content/how-facebooks-graph-search-will-affect-google-technology-and-privacy"
waybackUrl: "https://web.archive.org/web/20130119073105/http://www.applieddatalabs.com:80/content/how-facebooks-graph-search-will-affect-google-technology-and-privacy"
---
How Facebook's Graph Search Will Affect Google, Technology, and Privacy
In January 2013, Jeremy wrote about Facebook's shiny new product: Graph Search. Mark Zuckerberg personally introduced it. Lars Rasmussen, the engineer who co-created Google Maps, had been poached to lead the effort. The idea was that you could type natural language questions like "What restaurants do my friends like in Boston?" or "Who do I know in Chicago?" and Facebook would search its social graph to give you answers Google couldn't. Tom Stocky, the product director (also a Google defector), said it would "make the world feel a bit smaller." We thought Graph Search could come to dominate social search. Instead, Facebook killed it entirely within six years. The whole product vanished.
But the underlying technologies we wrote about -- natural language search, personalized results, direct answers instead of links -- ended up everywhere. Just not at Facebook.
What We Got Right and Wrong in 2013
Jeremy's original take was measured for the time. He didn't think Graph Search would threaten Google's core business. "Google isn't sweating this one, trust me," he wrote. He argued that Facebook would own social search -- finding things through your friends' data -- but Google would remain the destination for "substance, for academic papers, current news, business, forums, government." That turned out to be roughly correct, except that social search itself turned out to be a product nobody actually wanted enough to sustain.
We were more interested in the technology trends Graph Search represented. Natural language processing was still novel in 2013. Siri had launched in 2011, and most people were still typing keyword queries into search boxes. Facebook's bet was that people would learn to ask questions in full sentences. Jeremy noted the irony: "Natural language will make things more user friendly, eventually, but as silly as it sounds this user friendly feature will be a hurdle for some users." People had trained themselves to drop articles and prepositions from searches. "What now I leave the 'the,' 'are,' 'to,' and 'for' in there?"
The privacy angle was the most prescient part. We noted that Facebook stored 84 categories of data about each user, "not just what you share, but even things like what friend requests you denied and what tags you removed." Graph Search was going to expose a sliver of that data to other users. We predicted that the tool would "make people more aware of the information they share."
Facebook killed Graph Search in 2019. The social search revolution never happened. But the technologies behind it -- natural language queries, direct answers, AI-powered knowledge retrieval -- became the entire future of search.
The Slow Death of Social Search
Graph Search never gained traction. Facebook scaled it back in 2014, limited it further in 2017, and removed it entirely in 2019. On the technical side, result quality depended on users filling out structured profile data, and people were doing less of that as usage shifted toward passive feed scrolling. Then the Cambridge Analytica scandal broke in 2018, revealing that the political firm had harvested data from 87 million users, and suddenly nobody wanted their social data to be more searchable.
Zuckerberg published a 3,200-word manifesto in March 2019 promising to make Facebook more private, encrypted, and ephemeral. The company that built Graph Search to open up its social graph was now promising to hide it.
Google+, the other social search contender we discussed in the original article, died even faster. Google shut down the consumer version in April 2019 after disclosing a bug that exposed 500,000 users' personal data. The dream of social graph-powered search died twice in the same year.
AI Search Ate Everything
The real successor to Graph Search's vision arrived a decade later, and it looked nothing like what Facebook built. ChatGPT launched in November 2022, reached 100 million users in two months, and by early 2024 had integrated web search directly. It started eating into Google's core business in a way Facebook never could.
Perplexity AI, founded in 2022 by former Google research scientist Aravind Srinivas, built a search engine that does exactly what Graph Search promised: you ask a question in natural language and get a direct answer instead of a list of links. Perplexity hit 100 million monthly queries by early 2024 and was valued at $9 billion by late 2024. It doesn't search your friends' likes. It searches the entire web and synthesizes answers with citations.
Google responded with AI Overviews, rolling out AI-generated answer summaries to all U.S. users in May 2024. The company Jeremy said "isn't sweating" is now completely restructuring its core product around AI-generated answers.
Facebook tried to build natural language search on a social graph in 2013 and failed. A decade later, AI companies built natural language search on the entire internet and it worked immediately. The social graph wasn't the right foundation. Language models were.
Enterprise Knowledge Search Is the Real Story
Here's where this history becomes directly relevant to businesses. The same shift from keyword search to AI-powered question answering is happening inside organizations, and it's happening fast.
Enterprise search has been terrible for decades. McKinsey estimated that employees spend 20% of their time just looking for internal information. SharePoint search and Confluence search use keyword matching and return pages of irrelevant results.
RAG (Retrieval-Augmented Generation) systems fix this. An employee asks "What's our policy on customer refunds for defective products?" and the system searches internal knowledge bases, generates a specific answer, and cites its sources. Microsoft's Copilot does this inside Office 365. Google's Gemini does it in Workspace. Glean, valued at $4.6 billion in 2024, does it as a standalone product.
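The retrieval half of that flow can be sketched in a few lines. This is a toy illustration, not any vendor's implementation: the document store, the `answer_with_citations` helper, and the term-overlap scoring are all stand-ins (production systems use embedding-based retrieval, and the assembled context would be handed to a language model to generate the final answer).

```python
from collections import Counter

# Toy internal knowledge base: (doc_id, text). Hypothetical content.
DOCS = [
    ("policy-104", "Customer refunds for defective products are issued within 30 days of purchase."),
    ("policy-211", "Travel expenses require manager approval before booking."),
    ("faq-007", "Defective product returns ship free; refunds post to the original payment method."),
]

def tokenize(text):
    return [w.strip(".,?").lower() for w in text.split()]

def retrieve(query, docs, k=2):
    """Rank documents by naive term overlap with the query.
    Real RAG systems use vector embeddings; this stands in for that step."""
    q = Counter(tokenize(query))
    scored = []
    for doc_id, text in docs:
        overlap = sum((q & Counter(tokenize(text))).values())
        scored.append((overlap, doc_id, text))
    scored.sort(reverse=True)
    return [(doc_id, text) for overlap, doc_id, text in scored[:k] if overlap > 0]

def answer_with_citations(query, docs):
    """Assemble retrieved passages into a cited context block.
    In a real system this context is passed to an LLM for generation."""
    hits = retrieve(query, docs)
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in hits)
    return context, [doc_id for doc_id, _ in hits]

context, sources = answer_with_citations(
    "What's our policy on customer refunds for defective products?", DOCS
)
print(sources)  # the doc IDs the generated answer would cite
```

The key property is the last step: the answer carries the IDs of the documents it drew from, which is what separates a citable RAG answer from an unverifiable one.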
This is exactly what operational knowledge management looks like in practice. The technology that Facebook tried to build for social connections in 2013, where you'd ask a question and get a real answer from your network, actually works when you apply it to organizational knowledge using modern AI.
But deploying these systems raises the same privacy questions we flagged in 2013. An AI search system that can answer "Who in our company has worked on pharmaceutical accounts?" is powerful and useful. It's also potentially dangerous if access controls aren't right, if it surfaces confidential HR data, or if the AI hallucinates an answer that sounds authoritative but is wrong. AI governance matters here because bad answers from an enterprise AI system don't just waste time. They can drive bad decisions.
From 84 Categories to Every Document You Own
Jeremy's warning about Facebook's 84 categories of user data was about personal privacy. The enterprise version of this problem is worse. A large company's knowledge base contains tens of millions of documents: legal agreements, personnel records, financial data, customer information, strategic plans. Point an AI search system at all of that without proper access controls and you've got a liability engine.
The companies getting this right treat AI-powered search as an Operational AI challenge. They map data sensitivity, enforce role-based access at the document level, audit what the AI retrieves, and monitor for hallucinations. The rest are hoping nothing goes wrong. Graph Search failed because people didn't want their social data searchable. Enterprise AI search will fail for the same reason if governance isn't there from the start.
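One way to sketch document-level access control in such a pipeline (the role names and `acl` field are illustrative assumptions, not any product's schema): filter the candidate set by the requester's roles before retrieval ever sees it, so a confidential document can never leak into an answer, and log every retrieval for audit.

```python
# Hypothetical document records with per-document access-control lists.
DOCS = [
    {"id": "hr-042", "text": "Performance review notes for Q3.", "acl": {"hr"}},
    {"id": "sales-9", "text": "Pharmaceutical account history, 2022-2024.", "acl": {"sales", "exec"}},
    {"id": "wiki-1", "text": "Office wifi setup guide.", "acl": {"all"}},
]

def visible_docs(docs, user_roles):
    """Return only the documents the user is cleared to see.
    Enforcing this *before* retrieval means the AI can never quote
    a document the requester couldn't open directly."""
    roles = set(user_roles) | {"all"}
    return [d for d in docs if d["acl"] & roles]

def audit_log(user, query, retrieved):
    """Record what was retrieved for whom -- the audit trail
    that governance requires."""
    return {"user": user, "query": query, "doc_ids": [d["id"] for d in retrieved]}

candidates = visible_docs(DOCS, {"sales"})
print([d["id"] for d in candidates])  # sales-visible and public docs only
```

Filtering before retrieval (rather than redacting after generation) is the design choice that matters: once a confidential passage reaches the model's context, there is no reliable way to keep it out of the answer.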