Smart planning - how digital technology can inform city shaping - NLA Think Tank

Monday 3 July 2017

How can digital technology help to inform city shaping? Is it the future for the London Plan and smart cities? And how do we make progress on a 3D model? A Think Tank organized by NLA at CBRE’s City offices in May sought to find out.

Everyone developing digital models, said Millerhare’s managing director John Hare, will be familiar with how futile it would be to think of a single digital model of London. Model contents and structure are highly tuned to specific applications, each requiring different attributes, accuracy and levels of detail. Equally, cities are constantly changing, and each application needs to track these changes to a greater or lesser extent. ‘However’, said Hare, ‘if we can agree some basic ground rules, we can start to exchange data in meaningful ways’. We do, however, need to explain the purposes for which we have assembled information, and hence what level of detail and accuracy can be relied upon. And we must be careful to distinguish applications that require auditable accuracy from those that are simple and easy to use. For the most “democratic” access, we have to consider a user with an iPad at home or a mobile phone on the top of a bus, where there are real processing power and bandwidth issues. For these sorts of applications we should accept that international players such as Google have made incredible investments in data capture and maintenance, as well as enormous strides in terms of ease of use. The issue for public consultation should perhaps be to work out how to take model-supplied data and translate it so it ‘lands’ correctly in Google Earth or Street View.
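
Hare’s suggestion of translating model-supplied data so it ‘lands’ correctly in Google Earth can be illustrated with a small sketch. The Python snippet below is purely illustrative (the footprint coordinates, building height and output filename are invented for the example): it writes a single extruded massing block to a KML file that Google Earth can open. A real exchange workflow would also need agreement on coordinate systems, level of detail and accuracy metadata.

```python
# Minimal sketch: export one massing block to KML so it can be opened in
# Google Earth. The footprint (WGS84 lon/lat) and 30 m height are invented
# for illustration only.
KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <Polygon>
      <extrude>1</extrude>
      <altitudeMode>relativeToGround</altitudeMode>
      <outerBoundaryIs>
        <LinearRing>
          <coordinates>{coords}</coordinates>
        </LinearRing>
      </outerBoundaryIs>
    </Polygon>
  </Placemark>
</kml>"""

def massing_to_kml(name, footprint, height_m):
    # Each vertex becomes "lon,lat,height"; KML expects the ring to be
    # closed by repeating the first vertex.
    ring = footprint + [footprint[0]]
    coords = " ".join(f"{lon},{lat},{height_m}" for lon, lat in ring)
    return KML_TEMPLATE.format(name=name, coords=coords)

if __name__ == "__main__":
    footprint = [(-0.0835, 51.5145), (-0.0830, 51.5145),
                 (-0.0830, 51.5148), (-0.0835, 51.5148)]  # placeholder site
    with open("massing.kml", "w") as f:
        f.write(massing_to_kml("Proposed block (illustrative)", footprint, 30))
```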

For Space Syntax managing director Tim Stonor, there are drivers now that are making so much more possible – such as the prevalence of data and the capability of computing to process it. ‘We have got powerful engines and plenty of fuel, and this is driving knowledge and understanding, which is the really important part.’ This is then driving demand from the client side, which is in turn driving a need for simplicity and thus the integration of different models, rather than just one single model. Working in partnership with the Future Cities Catapult, Space Syntax is creating 'Tombolo', a way of writing code to knit models together, said Stonor, an approach he backs over the creation of any single ‘all-singing-and-dancing’ model.

Daisy Estrada, senior planning officer at the City of London Corporation, agreed, but said there was a lack of expertise inside authorities like hers at the City, alongside the problem of how models could be updated. Cobus Bothma, senior associate principal at KPF, said the ideal would be a controlled model, but there are considerable questions to consider: when do you make models available, who owns them and who has access to them? The issue of who owns the data, and of licensing agreements, is one of the critical issues highlighted by Euan Mills of the Future Cities Catapult. Some 3D model providers have half of the boroughs signed to five-year licensing agreements under which the boroughs provide all the schemes, and once the license is up, all the data they have provided is lost. ‘There needs to be a role for public authorities to actually own something’, said Mills. All of this 3D massing and energy data, for example, is very valuable, and there was a big risk of authorities being tied to licensing agreements, with providers ‘harvesting’ some of the data, or ‘strange monopolies’ emerging, such as the Planning Portal, that, again, authorities cannot move away from. Is there also a role for someone like the DCLG to think about data standards? In other industries, we are starting to see software offered as a service rather than as a product that you buy. ‘Be wary’, Mills advised local authorities considering this area.

What local authorities can own, said Hare, is the definition of the data they would like to receive. But in order to do that, terminology around accuracy and complexity must be agreed upon. ‘I would hope that if public authorities can lead this debate a very active community effort can be built around it’, said Hare, citing the quality of apps now available to help navigate the public transport system since TfL agreed to publish its own internal real-time movement data.
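
The TfL open data Hare cites is a useful illustration of what publishing internal data can enable. As a hedged sketch only, the snippet below queries TfL’s public Unified API for live arrival predictions at a stop; the stop ID shown is a placeholder, and in practice TfL asks developers to register for an application key for sustained use.

```python
# Illustrative only: fetch live arrival predictions from TfL's open
# Unified API (https://api.tfl.gov.uk). The StopPoint id below is a
# placeholder; real ids can be looked up via the API's StopPoint search.
import requests

def arrivals(stop_point_id):
    url = f"https://api.tfl.gov.uk/StopPoint/{stop_point_id}/Arrivals"
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    # Sort predictions by time to station (in seconds) before returning.
    return sorted(response.json(), key=lambda a: a.get("timeToStation", 0))

if __name__ == "__main__":
    for prediction in arrivals("490008660N")[:5]:  # placeholder stop id
        print(prediction.get("lineName"),
              prediction.get("destinationName"),
              prediction.get("timeToStation"), "s")
```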

For Commonplace CEO Mike Saunders, speaking from the public perspective, it was important to be clear about the problem the technology aims to solve, rather than being ‘dazzled’ by the technology itself. In essence the issue is engagement in the planning process before the final design is published, and when it comes to gaining the understanding of the public, it is all about simplicity. ‘The average person on the street does not understand a 3D model’, he said. They actually don’t understand a map most of the time. ‘And so we’re constantly having to pare back in order to get something which is super simple’. Useful ventures include creating a before-and-after view from particular points, rather than a walk-around, Saunders added, although getting to that point is itself complex. Some will want to use VR, but at the moment it does not work well on phones, and there will always be some who want a simplified version that does.

Stonor said there was no doubt that some want the simplified before-and-after shot. ‘But there is this view that coming to work is a real technological disappointment because you’ve left back at home all the really interesting kit.’ Many coming to community events have just finished playing Minecraft or Call of Duty, so what will come in the next five years in a 3D world, albeit from a games culture rather than a BIM model? The real immediate need, though, will be for those who have a small amount of ‘dead’ time on the bus, say, and getting them to interact will be the main thing, said Saunders. And yet Mills said his organisation is working with an SME on a prototype of how you can visualize buildings through a tablet. There are challenges around mobile processing power and obstructions, but Mills suggested that within two to three years, people will be able to experience and ‘walk around’ buildings using their tablet.

Capita’s head of architecture Peter Trebilcock said we haven’t yet grasped the full commercial value of data. Just as Google ‘sucks up’ data about our preferences to advertise directly to us, we haven’t tapped that potential with 3D mapping data. But the GLA London Plan Team’s Jonathan Brooker said it was perhaps this value of data that actually restricts the ability to share it. Deciding to sell the London Development Database backfired somewhat because the value it brought in wasn’t worth it, and instead acted as a barrier to using the data properly and sensibly, he explained.

And yet part of the problem is that City Hall often doesn’t actually have the data – certainly this was true when the NLA launched its tall buildings research a few years ago, said Mills. ‘We had to trawl through hundreds of PDFs, count them, and create a spreadsheet of the data.’ Local authorities collecting the data was thus a small step in the process. The next generation of PDFs, however, will be interesting – Mills said they can contain linked data, hyperlinked to a bigger database, an ‘incredibly simple system’ that allows each end to be updated simultaneously.

TfL has a 3D model of London that it started working on for the Olympics and for transport modelling, and the GLA is exploring the use of this model, said Elliot Kemp, principal strategic planner, London Plan, at the GLA. ‘With boroughs going down particular routes with particular providers, collectively we should make a decision on how to go forward, or at least make recommendations.’ It also needs to be decided how the issue of using digital planning data is addressed in the next London Plan.

The City works with 3D model provider GMJ, paying them to put planning applications into the model, but does not have any in-house expertise. Perhaps, said Harry Knibb, Principal Sustainable Development Consultant at WSP, this is the area where the government could contribute by providing an open platform for all this data.

But the City has also been using 3D modelling to look at the eastern cluster and where development might go in line with constraints such as view corridors. Perhaps the view management framework might be better operated in 3D, suggested Think Tank chair Peter Murray. That comes down to accuracy, said Kemp: if the model is not accurate and up to date it won’t be useful for assessing the view. Millerhare puts a lot of resources each year into maintaining the accuracy of its model. An additional complication Hare raised is that a 3D model does not take into account the curvature of the earth, or the refraction or ‘bend’ of light, both of which must be calculated to properly assess the protected views.
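
Hare’s point about curvature and refraction is easy to quantify. A minimal sketch, using a mean Earth radius of about 6,371 km and a conventional surveyor’s refraction coefficient of roughly 0.13 (standard textbook values rather than figures given at the event): the apparent drop of a distant point below a level sight line is approximately (1 − k) × d²/2R, which works out at around 6.8m over a 10km sightline, easily enough to matter when assessing a protected view.

```python
# Minimal sketch of the corrections Hare mentions: how far a distant point
# appears to drop below a level sight line once Earth curvature and standard
# atmospheric refraction are accounted for. Constants are conventional
# surveying values, not figures from the event.
EARTH_RADIUS_M = 6_371_000        # mean Earth radius
REFRACTION_COEFFICIENT = 0.13     # typical value; varies with conditions

def apparent_drop(distance_m, k=REFRACTION_COEFFICIENT, radius_m=EARTH_RADIUS_M):
    """Approximate drop (metres) below a horizontal sight line at a given
    distance, reduced because refraction bends light slightly towards the
    Earth's surface."""
    geometric_drop = distance_m ** 2 / (2 * radius_m)
    return (1 - k) * geometric_drop

if __name__ == "__main__":
    for km in (1, 5, 10, 16):
        print(f"{km:>2} km: {apparent_drop(km * 1000):6.2f} m")
```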

'Impact modelling' – how projects work – has to have the same weighting, said Stonor, as 'visual modelling' – what they might look like. Standardization of underlying data, such as land use or value, was also important in trying to simulate how things work, said Woods Bagot senior associate Lucy Helme, and is a ‘tough ask’. What the public realm might look like around schemes such as the eastern cluster is another element to consider, said Estrada, so the City is trying to be more ‘proactive’ in this area, developing ‘comfort criteria’ looking at issues like wind and temperature on the street.

A key step forward might be if local authorities requested models of a certain specification along with planning applications. ‘I think customers are king, and if the planning authority wants to start to establish a set of data, then they could say ‘give me the analytics’’, said Trebilcock.

This is where we’re looking for leadership, said Stephen McDonald, LB Barnet’s director of place. But then again it could equally be the right time to call upon developers to ‘just do it’ and ‘wow’ the planners and a skeptical public with their 3D modelling. One problem with this scenario, though, is how far behind local authorities are with their technical capabilities and systems. Even so, some very basic model files can be described in under a megabyte, said Hare. Another problem is the changes made to buildings further down the planning process – the Section 73 additions that put, say, an extra 1.5m onto a building. ‘So all the data the committee saw is out of date.’ It was crucial, said the GLA’s Jonathan Brooker, to ensure that the data flows through the whole process.

Planning has been seen as a brake and a burden, too, on the whole development process so it has been politically unpalatable to add anything more to the process, said CBRE director of planning Richard Lemon. But communities and their needs are important to add to the data set, said Saunders. ‘It’s very easy to fixate on the technology of presenting beautiful, 3D, completed designs, and actually we should be looking as much at how we use technology to engage people in the design process.’

Mills said there may be a role for central government – and his organisation is working with DCLG – to build a platform that can be given to authorities, on which they can start to build a database.

Knibb felt it was incumbent on developers to submit 3D plans, to the extent perhaps that they could be incentivized to do so, towards a ‘Utopia’ of a SimCity-like scenario that reacts to different scales and typologies. This could even be linked into some of a borough’s targets, and perhaps a Section 106 agreement produced accordingly. Other places around the world, such as Singapore and Dubai, have incredibly advanced planning tools, though these tend to be authoritarian regimes, said Mills. These operate at the CIM – City Information Modelling – level rather than BIM. But there will likely be a ‘big shift’ everywhere in how inexpensive 3D data is to collect as drones become more widespread, again highlighting issues over whether it is right for local authorities to be tied into data providers on long contracts.

So, what happens next? The GLA is keen to set the lead now at officer level, said Kemp, looking to design the best route forward, perhaps by asking for data standards or a platform, though not one intended for the public to use on the web. ‘But I don’t think it’s going to be that we buy this massive model from one provider, layer in all this GIS data and then manage it, because I don’t think that is realistic.’

Perhaps we could charge more for planning fees, suggested McDonald, using the uplift that brings to invest in the area. It could certainly help communities envisage what change may entail, and McDonald said he can’t understand why developers don’t just get on with it. ‘Why is it that we miss a trick that we can’t just show people, in really simple terms: “this is what it is going to look like?” It can’t be beyond the wit of man.’

Hare thinks it is certainly a solvable problem. ‘We have been having this debate now for 25-30 years through various forms of computerization. For almost any planning application someone will have built some sort of 3D model of it, and if not, the average housebuilder can go to a bureau and pay £150. So I don’t think it is at all unachievable that all planning applications get accompanied by some simple 3D shape.’

Despite the enthusiasm from the DCLG and City Hall, it will most likely be a ‘gradual move’ forward, said Mills. ‘Between us and the catapult we’ve got all the tech guys and data scientists, but it’s not going to happen one day to another. It will be about building very simple platforms to begin with, that local authorities can run on their worst laptops.'

David Taylor

Editor, New London Quarterly 
@davidntaylor

This event, which took place on 24 May 2017, was part of NLA’s year-long Planning programme.  
