A few months ago, I stopped by a Vancouver-area urbanist meetup focused on how planning professionals communicate their plans to the public. Sitting among a group of smart young urbanists, we considered a central question: “How would Steve Jobs communicate a plan?”
It’s an obvious answer, but also obviously true: he’d use innovative and elegant technology to give people a totally new experience. The real question is what technology will drive the next big approach to visualizing urban futures. We speculated: crowdsourced design? Easy 3D modeling? Augmented reality apps? I walked away from the exercise with a head full of exciting ideas, but also with the lingering impression that the technology to accomplish this would remain out of a cash-strapped city’s reach for many years to come.
So I was pleasantly surprised to learn about two innovative pieces of hardware announced this year–the Oculus Rift and the Structure Sensor–that suggest the capacity to upgrade how we communicate design could arrive much sooner. Together, they offer affordable tools to sense and record the existing environment in three dimensions, model proposals in CAD software and insert them into the scanned surroundings, and let people explore their city or neighborhood in immersive virtual reality. If this isn’t the holy grail of urban design visualization, it’s at least getting close.
The first step to illustrating how a development proposal will impact an urban space is to understand its context–the topography of the land, the buildings and vegetation that frame the parcel, the public and private spaces around it, the views that it impacts, and the shadows it casts. Tools like crowdsourced building models embedded in Google Earth provide a start at a 3D spatial representation of this context, but lack the fine detail and accuracy to provide a ground-level experience of the built environment. This is where 3D scanning technology like the Structure Sensor comes in: attach the device to an iPad, and you have a tool that can create accurate digital models of the spaces around it. Cheap 3D scanning will allow planners, designers, and architects to create and maintain databases of the physical structure of a city’s streets, buildings, and public spaces.
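To make the scanning step a little more concrete, here is a minimal sketch of the core math a depth sensor applies to turn a depth image into a point cloud of the surrounding space. It assumes a simple pinhole camera model; the intrinsics below are illustrative placeholders, not the Structure Sensor’s real calibration.

```python
# Hypothetical pinhole-camera intrinsics for a depth sensor
# (illustrative values only, not a real device calibration).
FX, FY = 570.0, 570.0   # focal lengths, in pixels
CX, CY = 320.0, 240.0   # principal point (image center)

def depth_to_point(u, v, depth_m):
    """Back-project one depth pixel (u, v) into a 3D point in meters."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)

def depth_image_to_cloud(depth_image):
    """Convert a 2D grid of depth readings (meters) to a list of 3D points.

    A reading of zero means the sensor got no return for that pixel,
    so those pixels are skipped.
    """
    cloud = []
    for v, row in enumerate(depth_image):
        for u, depth_m in enumerate(row):
            if depth_m > 0:
                cloud.append(depth_to_point(u, v, depth_m))
    return cloud
```

Real scanning pipelines add calibration, noise filtering, and frame-to-frame alignment on top of this, but every point cloud a scanner produces starts with a back-projection like the one above.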
With that data on hand, planners and stakeholders will gain the ability to model proposed changes in density or compare the results of different form-based codes in the context of existing built form, without big budgets for design and visualization. This is a big step in and of itself–with these tools on hand, planners could show the public different planning scenarios from a much more fine-grained and true-to-life perspective than simply showing different massing diagrams from a bird’s eye view.
What could take these tools further–delivering an unprecedentedly immersive and realistic experience of urban design visioning–is the arrival of practical, affordable virtual reality hardware like the Oculus Rift. Originally targeted at game developers, the Rift has already been adopted by architectural technology firm Arch Virtual to provide architectural clients with a virtual rendering of a proposed building that can be explored in first-person 3D. They’ve even rendered portions of downtown Dubuque, Iowa as a demonstration of how planners might use the Oculus Rift to engage with urban space.
If the capabilities this technology presents sound like science fiction to you, you’re not alone–but 3D scanning, modeling, and virtual reality visualization may in fact offer the most natural and intuitive way for everyone from designers to the public to experience and understand changes in the fabric of our cities.
Given how much potential this approach has to enhance the dialogue we can have about the future of urban development, the costs are almost unbelievably low. The Oculus Rift is slated to be released for $300, while Occipital is offering its Structure Sensor to Kickstarter backers for as little as $359. This means that this technology isn’t just for deep-pocketed developers to show off penthouse condos–as the tools to smoothly translate sensed data into navigable worlds improve, planners and advocates will be able to put them to use in recording, exploring, and rebuilding virtual simulations of the spaces they plan.
We’ll undoubtedly see these technologies put to work in other creative ways in the urban domain. We’ll likely see the proliferation of services like CityScan that use 3D scanning technologies to assist with functions like code enforcement. Spatial databases of cities around the world could allow urban designers to tap into an unprecedented treasure trove of precedents for laying out streetscapes and public spaces. And imagine touring a house or apartment through an Oculus Rift: just scan the space, post the data online, and interested buyers or renters could put on their headsets and get a complete tour.