Putting data into the hands of communities
Matt Leach, our chief executive, argues that now place-based funding is back at the centre of policy, it's time to improve the quality of data available to local communities
Some six years ago, along with Tom Smith, now MD of the ONS Data Science Campus, I was part of a small team that created Community Insight – a fairly simple, socially-focused, web-based open data browser. Within a year we’d sold over £1m of subscriptions, largely to social housing providers and local authorities wanting to understand more about the communities they work in.
It’s still very good business for HACT and OCSI, the organisations we used to run. Over that time, thanks to a fantastic developer team, it has got a lot slicker to use than the early buggy prototype that first gained attention. And it’s picked up a growing user base – not least among the 150 Big Local areas that Local Trust works with nationwide. But, in functional terms, it is still much the same tool as was originally built in 2012. That is largely because – despite nearly a decade of hype about open data, big data, localism and public service modernisation – the sorts of official data available to those looking to map, analyse and understand local communities haven’t obviously got any better.
This is becoming a bit of an issue for those of us interested in understanding more about neighbourhoods, communities and places. Over the last five years we have seen an increasing focus on place-based social action across funders, government and the civil society sector – something I blogged about last week in Taking back control. But the data, evidence base and tools needed to inform decision making – both hyper-local and national – don’t feel like they have kept pace.
The UK Shared Prosperity Fund, the much-anticipated post-Brexit social/structural funding, may bring some £2.4bn a year of funding to our most disadvantaged places. But as government gears up for the launch of its consultation on UKSPF, it feels like the evidence base to inform its delivery remains inadequate. There is a pressing need to build the tools required to better understand complex, localised economic change, cutting through the noise and enabling informed decisions on community social and economic development priorities. It is not a new ambition – there have been some heroic attempts in the past, dating back to the time of the New Deal for Communities (NDC) and Regional Development Agencies (RDAs). But surely, a decade on from the last big area-based regeneration programmes, we can do better?
Many are now expecting a new, wider statement of policy on communities next spring, setting out the government’s stall for the forthcoming spending review. It is even possible we might see within it early signs of a new wave of area-based investment. But if we are about to usher in a new era of place-based change, the data available to inform it has in many ways not moved on from what was around at the time of the last big place-based programmes.
While a lack of high-quality hyper-local data might not have mattered for the Neighbourhood Renewal Fund (NRF) and NDC – both of which were, whatever the rhetoric, largely top-down programmes that often failed to successfully engage with local communities – it is not clear it will be sustainable if there is any ambition to move towards more localised, community-focused, resident-led place-based programmes. If we are to properly devolve responsibility for commissioning and delivery to local people, we will need to give those people access to more granular, more immediate, more dynamic information about the places they are looking to transform.
It’s a challenge that crosses political divides. If James Brokenshire is serious when he talks about “neo-localism” and “double devolution”, we’ll need to move on from data models designed for top-down statutory agencies and put better quality data in the hands of local communities. And when John McDonnell talks about local economic systems sitting at the heart of Labour’s plans for a new economic settlement based around “community wealth”, he describes a world that can only be achieved if local people have significantly better information on how their economies are functioning and changing at a hyper-local level; ideally, in real time.
Local Trust is working with OCSI on adding new layers of data to the Index of Multiple Deprivation (IMD) to understand and map the places that are disconnected from both civil society and wider economies. But in the end, what we are working with is still the same static, limited, Lower-layer Super Output Area (LSOA) based IMD, with some neat transport, building-use and services data overlaid. What we really need is systems-based data, ranging across the civil society and statutory sectors: immediate, predictive, dynamic and model-based, not just historic.
While we’re at it, we need some thought as to how to make this data easily accessible to local people and straightforward to interrogate. A few years back there was much talk about 'citizen data scientists' who – with access to newly open data – would transform everything from public services to business accountability. I’m not sure many have yet been seen in the wild. The truth is that many of the people using Community Insight in local communities have no interest in becoming data scientists; they just want great, accessible data available to inform their decisions on improving their areas. It is time to commit to putting it in their hands.