The Smart City isn’t enough. How might real-time data awaken the humanity in our buildings and public spaces?
The interface between humans and their environments—the very definition of architecture—is rapidly changing. Advances in digital, communication, and even biological technologies aren’t just transforming the way we interact with each other—they are also fundamentally shaping how we experience the city around us. For many, these developments encapsulate the idea of the Smart City, where the convergence of digital and physical is ostensibly seamless.
For us at the SENSEable City Laboratory, the Smart City presents an incomplete picture. In this scenario, the city becomes filled with technology, virtually augmented with information flows often running in only one direction. It fails to highlight the most important aspect of urban life—the human side of the city. In place of this model, then, we advocate for a “senseable” or responsive city, where the totality of the urban environment—from its buildings and bridges to its public spaces and plant life—can engage in a dynamic interaction with people.
This is an old dream of architecture that goes all the way back to Michelangelo, who, it’s said, struck his finished sculpture of Moses with a chisel, shouting, “Perché non parli?” (“Why don’t you speak?”) That feeling really expresses our desire as architects to create living, responsive systems that exist outside and apart from ourselves. Where the French Enlightenment automaton was an “enfleshed” machine designed, like Jacques de Vaucanson’s duck, to merely simulate life processes, today we have smartphones that literally talk to us.
This dialogue reaches far beyond Siri, as Paola Antonelli demonstrated with her 2011 exhibition at MoMA, Talk to Me. Real-time feedback loops extend that information exchange across all scales, “from the spoon to the city,” as the old Bauhaus slogan went. Nearly a century later, when we produce more data every 48 hours than we did in all previous millennia combined, our capabilities have caught up with this ambition. Technologies now allow architects to think of design from the detail of the computer chip to the scale of the whole planet.
At the lab, and at my design practice, we try to express this systemic thinking in our work. Trash Track (2009) used micro-sensors to tap into New York and Seattle’s waste-removal infrastructure. We are now expanding the project to encompass illegal waste trafficking from the United States to Asia, giving the research a much wider scope that will help expose the global flows of garbage. In doing so, we hope to develop bottom-up tactics for managing resources and to promote behavioral change toward a circular economy.
Similarly, Underworlds (2014) mines real-time information from the sewage pipes buried beneath our streets. By analyzing collected sewage, we can create computational models to monitor microbes at the level of the city, something we could call the “urban microbiome.” Using the data extracted from this “smart sewage,” health officials might be able to prevent epidemics before they happen. That would mean detecting the spread of viruses before the symptoms of, say, SARS or influenza become manifest in the population.
All these responsive technologies “blanket” every surface of the urban fabric. Still, the future “sensing” city will not change significantly in appearance, much in the same way that the Roman urbs is not all that different from the city as we know it today. We will always need horizontal floors for living, vertical walls to separate spaces—sorry, Zaha—and exterior enclosures to protect us from the outside. The key elements of architecture will still be there, and our models of urban planning will remain quite similar to what we know today.
Yet, while the urbs might not look very different from the outside, the life of the city will change dramatically. Buildings will start talking to us. Our waterways will send us updates from the deep. Cars will drive themselves. Already, real-time information has made urban interactions much more fluid, from Airbnb to Uber. People are now sharing their bedrooms online—what else will they be prepared to share tomorrow?
This article originally appeared in Metropolis Magazine.