Can MIT’s Tim Berners-Lee Save the Web?

Thirty years ago, MIT professor Tim Berners-Lee invented the World Wide Web and altered the course of human history. Now, in the face of misinformation, malicious behavior, and the exploitation of personal data online, he’s determined to slay the beast it has become.


Photo by Simon Dawson/Reuters

It’s a muggy July morning, and Tim Berners-Lee—the man who once upon a time invented the World Wide Web and now wants to reinvent it—is giving me hope about my marathon time.

Berners-Lee—Tim to friends, Sir Tim to the rest of us commoners—is laying out his vision of the new, more powerful, more righteous Web he’s been working on for the past several years, a Web that would, among other things, undercut a chunk of the power that Facebook, Amazon, Google, and the like have over our lives. The example he’s landed on to help me understand it? Someone who’s preparing for a marathon and downloads a training app. Given that I am, completely coincidentally, training for a marathon this fall and have, IRL, just downloaded an app, I’m all ears.

“It’s going to be much more informed than any app you can run right now,” says Berners-Lee, 66, a native Brit who’s worked out of MIT since the mid-’90s and who’s dressed this morning in a blue golf shirt that goes quite nicely with the blue frames of his glasses. He’s in the study of his home in the U.K. I’m in the South End. We’re talking via Zoom.

“Right now, you can tell an app where you go to run or people you run with,” he continues, “but it only looks at running. It can’t talk to you about nutrition.” It can’t even see what trainers you ordered.

This new type of app, on the other hand—built on a platform called Solid, which Berners-Lee and his MIT colleagues have been developing for the past several years and which lets users control the mega-amounts of digital data that exist about them—would be able to do all of those things…and lots more. It would, if I so chose, be able to look at my blood chemistry and make training recommendations based on what it sees. It would be able to monitor me for signs of depression (given my sometimes-glacial running pace, a real possibility). It would be able to look at my diet, my past race times, my friends’ past race times, the weather at my past races, the weather at my friends’ past races. The list goes on.

“When you go to a restaurant, it may say, ‘May I suggest this instead of that?’” Berners-Lee tells me. “Or, ‘Can I suggest you run with this guy? You should run with Big Bob, because of all the people you run with, he runs a bit faster than you.’ It will be looking at your whole life, coaching you.” I nod quietly, suddenly feeling emboldened: Perhaps slow and pathetic is not my destiny.

Now, I think we can all agree that an app designed to help middle-aged weekend warriors is not necessarily the most consequential technological advancement of our time. But Berners-Lee’s example actually underplays—by a lot—how revolutionary Solid could be. The real point, he says, is that for the first time ever, we users—not big tech companies—will be in control of our data, which means that websites and apps will be built to benefit us and not them. That, in turn, could mean revolutions in things that really are consequential, from healthcare and education to finance and the World Wide Web itself. As Berners-Lee said when I first asked him to sketch out his vision for this revamped Web, “Well, that requires us to imagine a world that’s very different.”

If all of this sounds a bit grand, it’s worth remembering one thing: Tim Berners-Lee has transformed the world once already. Is there any reason to think he can’t do it again?

Three decades ago, Berners-Lee dreamt up, wrote the code for, and then launched the World Wide Web, putting in place the digital architecture the Web still uses to this day. At the time, the Internet—the electronic communications network that allows computers in distant places to connect to one another—had been around for a couple of decades, but hardly anyone used it, mostly because it was so hard to use. Berners-Lee’s invention made the Internet practical and the dream of a connected planet real—and turbocharged the digital age we’re still in the midst of. Indeed, is there any part of our lives that hasn’t been affected by the Web? It’s transformed commerce and entertainment; politics and finance; dating and sex; and sports and transportation. It’s overhauled what we know and how we know it. It’s completely shaken up how we connect to other human beings. And it’s done all of this on a massive scale. At last count, more than 4.6 billion people—nearly 60 percent of Earth’s human inhabitants—had regular access to the World Wide Web. And the number grows significantly every year.

And yet, despite the gargantuan impact of his creation, Berners-Lee is dissatisfied—distraught, even—at the Web’s current state. He believes, first of all, that the platform is now controlled by too few entities that are far too powerful—Google, Facebook, Amazon, you know the names. Their domination, in fact, sits in direct opposition to the Web he says he was trying to invent some 30 years ago: a tool that democratizes access to information; that elevates humanity; that exists, as he’s said often, “for everyone.”

At the same time, Berners-Lee has come to believe that, from a technological perspective, there’s far, far more that the World Wide Web can do to really improve the lives of all those everyones.

Which is where Solid comes in, along with the Boston-based company Berners-Lee cofounded to help it go mainstream. That company is called Inrupt, launched with Boston tech CEO John Bruce. Their goal? To have Solid take over the Web’s ecosystem, clipping the wings of tech’s titans while lifting everyone else.

Whether they can actually do that is an open question, particularly given the forces arrayed against them (Amazon and Facebook being examples 1 and 1A). But the battle has been joined. Which puts Berners-Lee—MIT professor, Knight Commander of the Order of the British Empire—into a somewhat unexpected scenario, one that feels a little like a Hollywood thriller: Can a brainy computer nerd who’s seen his invention fall into the wrong hands win it back before it’s too late?

Tim Berners-Lee and John Bruce founded the company Inrupt with the goal of taking the Web back from the likes of Facebook, Amazon, and Google. / Photo by Simon Dawson/Reuters

John Bruce. / Photo by Sue Bruce Photography

Berners-Lee is commonly referred to as the inventor of the World Wide Web, which is correct, but doesn’t really go far enough. Over the past three decades he’s more accurately been the father of the Web—the man who not only conceived the platform, but raised it, developed it, guided it, and has tried to ensure that it’s a moral, upstanding citizen. In 1994, for instance, Berners-Lee relocated from Switzerland, where he dreamt up the Web, to Boston to create the World Wide Web Consortium (W3C), an organization, hosted by MIT, whose main function is to ensure a common set of technical guidelines for the platform (without which it would devolve into a thousand different mini systems, none of which talk to each other). In 2009, after the Internet had been adopted by more than 20 percent of the world’s population, he and his now-wife, Rosemary Leith, cofounded the World Wide Web Foundation, whose mission remains “to advance the open Web as a public good and a basic right.” Both outfits ladder back to Sir Tim’s broader vision: that the Web be a single, unified thing that connects all of humanity.

It’s curious that, though our digital age has spawned plenty of celebrities—Zuckerberg, Bezos, Jobs, Musk—Berners-Lee remains all but unknown outside the tech world, even here in Boston. Though he was knighted by Queen Elizabeth II in 2004, took center stage at the opening ceremonies of the London Olympics in 2012, and was presented with the 2016 ACM A.M. Turing Award (computing’s Nobel), he could probably sit through nine innings of a Red Sox game at Fenway without having to pose for a single selfie. (When I mentioned to friends that I was working on a story about the inventor of the Web, at least four of them said to me, only half-jokingly, “Al Gore?”)

On the day we chatted, Berners-Lee was riding out the pandemic in the U.K. In more normal times he’s bi-continental, maintaining a home in the suburbs here but jetting back to London a couple of times per month for, among other things, academic duties at Oxford, his alma mater. In conversation he is, in a way, exactly what you’d imagine the inventor of the Web to be, possessing a certain brilliant-British-professor quality. He speaks quickly but haltingly, with an occasional stammer and sentences that sometimes start down one path before reversing course and heading down another. Meanwhile, a current of energy seems to run through him, producing various head tilts and body movements.

As a collaborator, Sir Tim’s colleagues say he has an appealing mix of creativity, focus, and down-to-earth decency. “He’s a really, really, really good guy. He’s one of the good people,” says Bruce, Berners-Lee’s cofounder at Inrupt. “No dilettante. No ‘I invented the Web, why are you even bothering to talk?’ None of that.”

If a software engineer were to write code designed to spit out a tech genius, the output might look a lot like Berners-Lee’s childhood. He grew up in London in the late 1950s and ’60s, the eldest child of not one but two mathematicians. (His mother and father both worked on the first commercially available computer, the Ferranti Mark I.) Math, he recalls, was a constant in the house, as was the opportunity to use his imagination and tinker. In an oral history Berners-Lee recorded for MIT’s 150th anniversary in 2010, he recalled a scrap-materials drawer his parents set up inside their house, which he and his three siblings were encouraged to rifle through. “My mother would secretly throw things in,” he said. “As something like a dishwashing liquid bottle became empty, she’d put it in there.” The various parts could be used for axles and funnels and such. His tinkering habit kicked up a notch at Oxford, where—true to tech-genius form—he built a computer out of an old TV and other discarded parts.

To hear Berners-Lee tell it, the invention of the Web was, at least initially, simply an attempt to solve a problem that was nagging him. Following graduation from Oxford in 1976 and a series of tech-oriented jobs, he was, by the mid-’80s, working at CERN, the European particle-physics laboratory in Geneva. He enjoyed his work, but was frustrated at his inability to see up close what his colleagues were up to. Yes, you could ask people what they were doing—the coffee break room was a lively and important place to exchange information—but accessing whatever data a colleague might have on his computer was a challenge. While computer scientists Vinton Cerf and Robert Kahn had developed the technology that made the Internet possible in the early ’70s, using it was cumbersome at best. “With enough research and installing enough bits of programming,” Berners-Lee recalled in that MIT conversation, you could log onto another computer remotely to get files that way. “But that wasn’t really a great way to get information. On the other hand, there was such potential.”

He began thinking about the problem, and in March 1989, at the age of 33, wrote a memo to his superiors at CERN, outlining a new way to link computers together—one using, in part, an existing technology called hypertext. The proposal was promptly…ignored. It wasn’t until the following year, after Berners-Lee resubmitted the memo, that his boss—who had scrawled “vague but exciting” on the document—told Berners-Lee to give it a go. That fall he got to work, and in a matter of weeks created a new language (hypertext markup language, or HTML) for building what would be called Web pages; a new protocol (hypertext transfer protocol, or HTTP) for specifying how computers could request and receive those HTML pages; and the first-ever browser to let users view Web pages. More important, somewhere along the line, the project morphed from being a way to link CERN’s computers together to being a way to link all computers together—a fact evident in the name Berners-Lee eventually chose for his new creation: the World Wide Web. He made the first Web page available to the public in August 1991; its subject, fittingly, was the Web project itself, giving visitors an overview of the concept along with the code to build their own Web pages.
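For the technically curious, the bones of that invention are still visible today. Here is a minimal sketch (mine, not his, and assuming a Node.js runtime for the TypeScript) that speaks a slightly later dialect of his original one-line protocol to info.cern.ch, where CERN keeps a restored copy of that first page:

```typescript
// A minimal sketch of the request-response cycle Berners-Lee defined.
// HTTP/1.0 is a near descendant of his original one-line "GET /page"
// protocol. Assumes Node.js, and that CERN still answers plain HTTP.
import * as net from "net";

const socket = net.connect(80, "info.cern.ch", () => {
  socket.write(
    "GET /hypertext/WWW/TheProject.html HTTP/1.0\r\n" +
      "Host: info.cern.ch\r\n\r\n"
  );
});

// The reply is a few headers, then the raw HTML of the 1991 page.
socket.on("data", (chunk) => process.stdout.write(chunk.toString()));
```

Everything that came later, from the browser wars to the dot-com boom to the social era, sits atop that humble exchange: ask for a page, get back text with links to more pages.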

Almost immediately, the Web’s father and his colleagues at CERN faced a dilemma: Should they charge people to use the software he’d written, as Bill Gates and other computer pioneers did with their software? Or should they make it free? Understanding that his vision—a sprawling network of computers linked together and sharing information—could only really come true if people adopted the technology, Berners-Lee pushed for the latter. In 1993, CERN put the World Wide Web software in the public domain; its inventor would be content not to see a dime from it.

The Web developed both slowly and quickly, if such a thing can be true. It was slow in the sense that the number of early users—most of them connected to universities—was puny, particularly compared to the billions of people roaming the planet. But what was exciting was that the number grew exponentially every year. Just after launch in the summer of 1991, Berners-Lee’s original server was getting between 10 and 100 visits a day. The next year it was 1,000 visits a day. By 1993 it was 10,000 visits a day.

In 1994, recognizing the need for an organization to unify standards and protocols for the technical part of the Web, Berners-Lee founded W3C, choosing to bring it to MIT after being recruited by several veteran computer scientists at the university.

What’s fascinating, at least in retrospect, is how relatively low Berners-Lee’s profile was for the next decade or more, even as millions of people began discovering his invention and making it part of their daily lives. Partly this was his choice—he and his first wife, Nancy Carlson, had two young kids, and they wanted their childhood to be as normal as possible (the couple divorced in 2011; he married Leith in 2014). But it was also a reflection of who and what drew the attention of the media. Time might have named Berners-Lee one of the 100 most important people of the 20th century in 1999, but for the most part coverage of the Web was focused on all of the different things, good and bad, that Berners-Lee’s fellow human beings were doing with it: online stock trading; the dot-com boom and bust; the rise and fall of Napster; Internet porn; the launch of an online-only bookstore called Amazon.com. None of those things would have happened without Berners-Lee’s invention, but when it came to public recognition, the Pets.com sock puppet was far more famous.

Photo by Catrina Genovese/WireImage

In November 2019, just a few months before COVID shut down the world, Berners-Lee stood before a select audience at the Design Museum in London to deliver the Richard Dimbleby Lecture, an annual talk broadcast by the BBC and presented by a distinguished speaker. In recent years the once-reticent Berners-Lee has become more of a public figure, doing more speeches and interviews, and on this evening, he spoke about the state of the Web and why he saw a need for what he called “a midcourse correction.” When the speech was over, the audience was invited to ask questions.

A man took hold of the mike and, in a tone that was more observational than critical, addressed Berners-Lee. “You created machinery to help the exchange of scientific information, but yet it’s brilliant at spreading disinformation. You gave it away for free, and yet it creates billionaires and monopolies. You created a communications tool that allows people to create echo chambers around themselves.” He paused. “I suppose the question is, when did you realize you’d built something so ironic?”

After a short pause, the crowd burst into laughter. Sir Tim laughed, too, looked away briefly, then looked back at the man. “The Web grew steadily, exponentially,” he said, suddenly waving his arm for an added bit of drama, “and so did the irony. There was no one particular point, so now the sense of irony is quite high.”

The man followed up: Was there a moment Berners-Lee realized the Web wasn’t what he wanted?

Sir Tim responded that, for a long time, he counseled people who complained about all of the bad stuff on the Web to just ignore it and only consume the good stuff. “That worked for everybody I knew—they engaged with the Web because they just visited the places they liked,” he said. “Then, in 2016, we realized there was a whole bunch of other people, not connected with the people I knew at all, who were doing the same thing—going to websites they liked. And they were very different websites, and so they have a very different filter bubble. And the problem was…they vote. And so even if it’s fine for me to live in a filter bubble, actually it’s not okay to have filter bubbles. So with the election of 2016, I think a lot of people did a double-take. I think certainly at the Web Foundation, we blogged that it’s time to turn left. It’s not, we realized, just about keeping the Web open and free. It’s about what people do with it.”

For the past decade or so, and certainly over the past half dozen years, Berners-Lee’s focus has been on two broad areas: trying to nudge the Web back closer to the humanity-serving entity he initially intended it to be, and building technology that will let the future Web be more useful and powerful than any of us ever imagined it could be. Increasingly, those two missions seem to be converging.

For the former, Berners-Lee has largely relied on his status as the founder—no, father—of the Web, which gives him a certain moral authority on the matter no one else really seems to have. In recent years, he and Leith have used their perch atop the World Wide Web Foundation to highlight various issues, including creating a “Contract for the Web”—which spells out the rights and responsibilities that users, governments, and corporations have—as well as publishing open letters on topics such as the treatment of women online and the moral need to increase access to the Web in developing countries. Meanwhile, Berners-Lee has become increasingly outspoken in public forums such as the Dimbleby Lecture. As he said that evening after bemoaning what dark forces on the Web were doing to science and democracy: “The Web does not have to stay the way it is now. It can be changed. It should be changed. It needs to be changed.”

When he and I talk, Berners-Lee goes back to what people are doing with his invention. For many years, he says, he and other technologists saw the Web as simply a neutral infrastructure, a place capable of reflecting both the good and bad parts of human nature. If people chose to use that infrastructure for negative things, well, that wasn’t necessarily his and his colleagues’ responsibility. But what’s changed his mind, he tells me, is the ever-growing—and increasingly divisive—impact of social media.

“With social networks, it’s different,” he says. “It has a direct effect—the protocols, the workflows you go through, what color the buttons are—all that can have a psychological effect on whether people are going to be more constructive or more argumentative. And so anybody who’s doing something with social networks on top of the Web has a serious ethical obligation to look at it and make sure that it does serve humanity well.”

Mark Zuckerberg’s name is more or less hanging in the air, and so I ask Berners-Lee if he has a relationship with him or any of the other tech titans. He doesn’t answer directly—he’s met a lot of people through the World Wide Web Foundation, he says—but he does tell me that Facebook doesn’t have to operate the way it does.

“I blogged at one point that Mark Zuckerberg should think about, could you change Facebook so that instead of polarizing people, it tends to anti-polarize them? When you go onto Facebook, it says, ‘You should be friends with Tom, because Tom is friends with Julie. Tom is a good bet for a friend.’ Which means I end up with a lot of people I agree with. It means when I go to a party with all these people, it’s kind of boring.”

A better idea, he suggests, would be for Facebook to change its algorithm so that it also regularly offers a suggestion for what Berners-Lee calls a “stretch friend”—something akin to a “stretch school” when you’re applying to college. “A stretch college is going to be about effort,” he says. “A stretch friend is going to be about effort. Suppose this week—this is somebody who’s just like you, but they’re living in Saudi Arabia.

“Maybe people end up finding out it becomes the richest part of the whole experience,” he continues. “Maybe they find the network of stretch friends is much more valuable than the networks of [regular] friends.”

I think we can all agree this is a fascinating idea. I think we can also agree the odds of Zuckerberg doing it are exactly zero.
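Zero or not, as a thought experiment the idea reduces to a single change in a recommender’s scoring function. The toy sketch below is entirely invented for illustration; the Profile shape, the crude “viewpoint” number, and the scoring bear no relation to anything Facebook actually runs.

```typescript
// A toy model of the "stretch friend" idea. Everything here is invented
// for illustration; it is not Facebook's algorithm, or anyone's.
interface Profile {
  id: string;
  interests: Set<string>; // e.g., "running", "chess"
  viewpoint: number;      // crude 0..1 stand-in for where someone's bubble sits
}

// Jaccard similarity over shared interests.
function interestOverlap(a: Profile, b: Profile): number {
  const shared = [...a.interests].filter((i) => b.interests.has(i)).length;
  const union = new Set([...a.interests, ...b.interests]).size;
  return union === 0 ? 0 : shared / union;
}

// Conventional pick: the candidate most like you. Assumes a non-empty pool.
function suggestFriend(me: Profile, pool: Profile[]): Profile {
  return pool.reduce((best, p) =>
    interestOverlap(me, p) > interestOverlap(me, best) ? p : best
  );
}

// Stretch pick: shared interests still count, but distance in viewpoint
// now works for a candidate instead of against them.
function suggestStretchFriend(me: Profile, pool: Profile[]): Profile {
  const score = (p: Profile) =>
    interestOverlap(me, p) * Math.abs(me.viewpoint - p.viewpoint);
  return pool.reduce((best, p) => (score(p) > score(best) ? p : best));
}
```

The only difference between the two functions is one multiplier, which is rather the point: the runner in Saudi Arabia who trains just like you is one line of code away.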

Berners-Lee was among the recipients of the inaugural Queen Elizabeth Prize for Engineering in 2013, presented by Queen Elizabeth II herself, for his work creating the World Wide Web. / Photo by Lewis Whyld/WPA Pool/Getty Images

If Sir Tim has become more comfortable in the bully pulpit, at heart he remains a computer scientist, one focused on remaking the architecture of the Web as a way of remaking the Web itself. Which brings us back to Solid and Inrupt.

Berners-Lee and his colleagues have been working on Solid for the better part of a decade, helped along by a $1 million grant from Mastercard in 2015. Some have said Solid aims to turn the Web upside down, but Sir Tim frames it the opposite way: He says it aims to turn the Web right side up.

By that he means that Solid seeks to correct what has turned out to be a flaw, an imbalance, in the Web’s original design: where data resides and who controls it. Under the Web’s current architecture, if you visit Facebook or Amazon or Google or just about any site, those entities capture and control whatever data is generated—what you search for, what you buy, what you say, who you interact with—in order to create their own profiles of you. Sir Tim now recognizes several problems with this arrangement, starting with what happens when an unwanted third party gets hold of that data and does something nefarious with it. The most famous instance is Cambridge Analytica, which used Facebook data to psychologically target and manipulate tens of millions of potential Trump voters in 2016. “Cambridge Analytica opened people’s eyes to how complicated the system was and how detrimental,” he says. “It’s not really about the privacy thing; it’s about my data being abused.”

That should never happen to anyone, he says, and we have to take steps to protect against it. But equally problematic, at least in terms of what the Web can actually do for you, is that your data sits in thousands of separate silos. Facebook has some. Google has some. Amazon has some. Uber has some. Zappos has some. But nobody really has a full picture of you, which means computer applications are limited in how much they can help you.

The solution to both problems? Let users, not sites, collect and control their own data, then let users decide which websites or apps are allowed to access that data. With Solid technology, this is accomplished by giving users the ability to set up their own “pods,” digital storage lockers where all of their data resides. And in theory, all really could mean all—not just your search history and how many episodes of Bridgerton you watched, but your medical records, tax returns, bank accounts, Instacart purchases, and more. The advantage of this is clear: When you download an app to, say, help you train for a marathon, you can let that app have access to all of that data. Developers can therefore create software that’s far more powerful—and potentially far more useful—than what we currently have.
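In code, the inversion Solid proposes might look something like the following. To be clear, this is a hypothetical sketch of the pod-and-consent idea, written to match the description above; it is not Inrupt’s actual API, and every name in it is invented.

```typescript
// A hypothetical sketch of the pod-and-consent model described above.
// The names and types are invented for illustration; this is not
// Inrupt's actual Solid API.
type Scope = "runs" | "nutrition" | "medical" | "purchases";

interface Pod {
  owner: string;
  add(scope: Scope, item: unknown): void;     // the user's data, in one place
  grant(app: string, scopes: Scope[]): void;  // the user decides who sees what
  read(app: string, scope: Scope): unknown[]; // apps must ask; denial throws
}

class InMemoryPod implements Pod {
  private grants = new Map<string, Set<Scope>>();
  private data = new Map<Scope, unknown[]>();
  constructor(public owner: string) {}

  add(scope: Scope, item: unknown): void {
    this.data.set(scope, [...(this.data.get(scope) ?? []), item]);
  }

  grant(app: string, scopes: Scope[]): void {
    this.grants.set(app, new Set(scopes));
  }

  read(app: string, scope: Scope): unknown[] {
    if (!this.grants.get(app)?.has(scope)) {
      throw new Error(`${app} has no access to "${scope}"`);
    }
    return this.data.get(scope) ?? [];
  }
}

// The marathon app from earlier gets runs and nutrition, nothing else.
const pod = new InMemoryPod("alice");
pod.add("runs", { miles: 6, pace: "9:30" });
pod.grant("marathon-coach", ["runs", "nutrition"]);

pod.read("marathon-coach", "runs");       // OK: [{ miles: 6, pace: "9:30" }]
// pod.read("marathon-coach", "medical"); // throws, until Alice says otherwise
```

What matters in the sketch is the direction of the arrows: the app petitions the pod, and revoking access is a single call on the user’s side rather than a plea to a platform.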

Berners-Lee believes this is a better arrangement not only for users (who’ll finally control what rightfully belongs to them: the details of their lives) and developers (who’ll be able to build some truly killer apps), but for all parts of the digital ecosystem. Most online retailers, he argues, don’t really want to be collecting and storing and having to protect your data, particularly now that privacy and security are becoming hot-button issues. “They don’t get revenue from abusing private data,” Sir Tim says. “They just make shoes. They want to be able to access your shoe size.”

After several years of development, the challenge for the Solid team isn’t making the technology work—it does—but getting an already-developed Web ecosystem to adopt the new platform. That’s no small feat, and that’s where the new company, Inrupt, comes in.

The case for a commercial venture—with plenty of talent and venture capital behind it—to help push Solid technology out into the world became clear to Berners-Lee a few years ago. Yes, it was ironic: The man who’d made the original Web software free in order to get people to use it was now contemplating starting a company for the exact same reason—to encourage usage. But this is three decades later, and different times and different situations call for different tactics. In 2017, Berners-Lee met with Rudina Seseri, cofounder of the Newbury Street–headquartered VC firm Glasswing Ventures, to discuss starting a company.

“Tim came in and had on his computer this screenshot of what the Web and all the various parts of the ecosystem, from consumers to enterprises to different applications, would look like,” she says. “I took a look at it and said, ‘Tim, this is not a business, this is a paradigm.’” She laughs. “He said, ‘Yes.’”

Inspired by the concept nonetheless, Seseri set about helping Berners-Lee find a business mind to partner with. She’d known Bruce—a U.K. native who’s launched and sold a number of tech startups—for several years, and asked if he’d take a meeting with the man who invented the Web. The two men eventually got together at 80 Thoreau in Concord.

“I found a chap who was close in age to me,” says the 65-year-old Bruce, who has a full head of white hair, an affable manner, and a British accent that hasn’t faded after 25 years in the States. “He, too, was raised in the U.K., and so we could remember things that happened as we were growing up. But it was an intriguing dinner for me because his vision for the Web, as he described it to me, was one of those mind-expanding kinds of experiences. You know, so broad in its scope, yet so simple in how one might do it.”

After several more meetings, the pair came together to form Inrupt, which at least initially is focusing on selling Solid technology to large enterprises such as governments and corporations. The company was officially announced in 2018, but Seseri, whose firm invested in the venture after Berners-Lee and Bruce launched it, says the initial meeting between her two British friends was crucial. “They met in Concord, and to this day I give them both grief over it. The American Revolution started in Concord, and I joke that this is a second attempt to take over.”

The martial metaphor no doubt works for Sir Tim. As he said at the end of his Dimbleby Lecture in 2019, “We have to all fight now for the Web we want.”

Can the revolution be successful?

Last November, Inrupt announced the first product it was bringing to market—a server built with Solid technology—along with its first four clients: the BBC, the National Health Service in the U.K., NatWest Bank in the U.K., and the government of Flanders, a region of Belgium. Each organization announced a different pilot project, but all share a goal of providing services that are easier to use, more personalized, and more dynamic. The BBC, for example, is using the technology to power a more sophisticated recommendation engine for its streaming services—an engine that relies not only on which BBC shows you’ve watched, but also on what you consume on Netflix and Spotify. The government of Flanders, meanwhile, piloted a Solid-backed service called My Citizen Profile, in which residents store their data in a pod, then give access to the government as necessary for services.

Bruce says that so far, the results are encouraging for all of the clients. In May, he and Berners-Lee were on hand in Belgium as Flemish authorities committed to making My Citizen Profile available to all residents, in essence making it standard operating procedure for how the government interacts with citizens.

For Bruce, the strategy of working with governments and large companies has a dual benefit. It not only exposes Solid-backed services to tens of millions of Web users, but once users have set up pods, they’re teed up for other Solid-backed apps that could come down the pike. It’s a quasi–Trojan Horse strategy he compares to the early days of the light bulb. In order to get his new electric lamps to work, Thomas Edison first had to outfit homes for electricity. But once the wiring was in place, turning on a light bulb was just the beginning, as the homes were poised for countless other electrical gadgets yet to come.

There are, of course, numerous challenges facing Solid in general and Inrupt in particular. Indeed, one could argue that starting the Web might actually have been easier than changing it will be. In the days when he was counting visits to his original server, Berners-Lee’s biggest task was convincing people how cool his new technology was. Now he has to battle an entrenched digital status quo that would just as soon leave things as they are, thank you very much. At least at the moment, for instance, it’s hard to see how powerful tech companies like Facebook, Amazon, and Google—whose business models depend on hoovering up as much data as possible—will benefit from giving control of data back to users, then having to ask their permission for access to it. As Upton Sinclair once said, “It is difficult to get a man to understand something when his salary depends upon his not understanding it.” (Facebook and Amazon didn’t respond to requests for comment about Berners-Lee, Inrupt, or Upton Sinclair.)

That’s one reason that Lalana Kagal, Sir Tim’s colleague at MIT, is skeptical that the adoption of Solid will be a bottom-up phenomenon driven by users. She’s bullish on the technology long-term, but believes a more likely scenario for adoption is that governments get increasingly strict about giving users control of their own data—and a platform like Solid becomes the default way that happens.

The other big issue is whether any of this will truly make the Web any better—that is, any closer to Sir Tim’s original vision of a platform that exists to share information and lift humanity, not further fill Bezos’s and Zuckerberg’s already bulging wallets. Sinan Aral, a professor at MIT Sloan School of Management and author of The Hype Machine, a book about the nefarious impact of social media, says changing the code that powers the Web is necessary, but insufficient. Solid, he says, “is a technical solution to enable the private control of data. But I think there are a lot of social, economic, and legal changes that have to accompany any technical solution.” By social changes, Aral mostly means the way we users behave. Studies suggest people say they care about privacy and data but, at the end of the day, rarely do much to back it up. In such a world, it’s easy to see billions of people giving Facebook as much data as it wants, the consequences be damned.

Sir Tim says focusing too much on the privacy aspect of Solid misses the point a bit. Yes, controlling our own data will make atrocities like Cambridge Analytica less likely to happen. But the real magic is in what else that data can do. “If you give your data to people doing research so they can discover new drugs, they can use it and you can use it. You can use it to run your life; they can use it to discover new drugs. And so being in control of how your data is used is a massively positive thing.”

If the odds against Solid, Inrupt, and Sir Tim are steep, it still somehow feels like a mistake to bet against them. What were the odds, after all, that a young computer scientist sitting in an office in Switzerland could, in a matter of weeks, create something that has completely transformed civilization? It’s worth noting, too, that for all of its demons, the World Wide Web remains, quite possibly, the greatest example of cooperation in human history. The Web, remember, was not born of government fiat, and it isn’t run by a giant corporation; it works because thousands of programmers—and billions of users—have all simply agreed on a method for linking themselves together. We might be at each other’s throats in other ways, but at least in this one respect we’ve found a way to unite.

As our conversation comes to an end, I ask Berners-Lee if he ever takes a moment to marvel at what he’s created—the good, the bad, the ever-circulating memes. His answer doesn’t surprise me. “I very rarely go, ‘Oh, wow.’ I guess, at the point when 20 percent of the world was on the Web, there was a bit of an ‘oh, wow’ moment. And that’s when we started, Rosemary and I cofounded, the Web Foundation.

“But there’s so much stuff to do,” he continues. “There’s so much work to do now trying to clean the Web up. For Solid, there’s a serious amount of work. There’s never very much time to sit back and go, ‘Oh, wow.’”

The day before I talked to Sir Tim, I’d gone over to the building that houses the computer science department at MIT, just to get a feel for where he has spent a big chunk of the past 25 years—years when billions of people found the information they needed and millions of people were misled and hundreds of unicorns were created and who-knows-how-many souls reconnected with loved ones or were radicalized by terrorists or laughed at cat videos or just passed the hours playing Words with Friends. I didn’t get to see Sir Tim’s office, but I did see a new generation of students hard at work, as well as something else that caught my eye. Attached to a wall was a parody of a yellow street sign, with a silhouetted figure wearing a backpack and big letters that read NERD XING.

“The 1987 Nerd Crossing sign is, in many ways, the gold standard of sign hacks,” a description mounted next to the sign said. “The bright yellow crossing sign, posted at the busy crosswalk at 77 Massachusetts Avenue, cleverly morphed a textbook nerd into the regulation graphics of a municipal crossing sign.”

The image made me laugh, but the longer I looked at it the more I realized it offered a deeper message, too. After all, in a world increasingly built by nerds, we really do need to pump our brakes for the ones who are trying to do the right thing.