Simon Hewison on the OpenStreetMap talk list reports that he’s the first ever mapper of Chaville St, N3 London, and speculates on ways to receive notifications and stay on top of geographic changes.
Back in June, I looked up the development plans in San Jose and had the pleasure of being the first ever mapper of an ex-apple-orchard suburban development.
“We make an average of 5,000 changes to our large-scale data every day,” said an OS spokesperson. “It’s not about mapping the Isle of Wight once – it’s about continuing to map it … It is expensive to collect detailed, accurate information on the ever-changing world to the level of detail our customers require.”
Yet it’s exactly the opposite: OpenStreetMap can be more accurate and current due to its openness and trust, while commercial and government entities typically take one to two years to distribute a change — from the actual change on the ground, to notification, to ground truthing, to incorporation in their database, and finally to digital and printed distribution to customers.
Yet the OS spokesperson is very right: it is going to require deliberate effort to keep the maps up to date after the initial OSM map is complete. Most of that ever-changing world goes through planning permission with some authority, usually local. And the authorities have a duty to make that information available to the public, which they do to varying degrees. In The Hitchhiker’s Guide to the Galaxy, Arthur Dent found the plans to demolish his home on display “in the bottom of a locked filing cabinet, stuck in a disused lavatory with a sign on the door saying ‘Beware of the Leopard’.” Nowadays planning permissions are hopefully published on the web, but stuck in PDFs or Word DOCs, in a labyrinth of directories and search pages: a technical barrier for users and developers, not as dangerous as a leopard, but just as ridiculous.
There are no set formats for local authorities to publish real-world geographic change, nor set procedures for notification and distribution. Truly this information is in the public domain, but those producing it lack the resources to make it useful. Hence private companies step in to aggregate it. Around the world, there are companies which simply monitor changes with local authorities to produce geographic change notification reports. Navteq and TeleAtlas have made arrangements with local authorities to receive changes, and also employ screen scraping to gather data.
There’s nothing stopping us from doing this in an open way.
Jo Walsh did a bit of work a couple of years ago, screen scraping planning applications in Tower Hamlets, East London. I used that data for a visualization of planning application acceptances and rejections.
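A scraper in that spirit can be very small. This is only a sketch: the URL and the HTML structure are invented for illustration, and a real scraper has to be tuned to each authority’s site.

```python
# Sketch of a planning-application scraper: pull a council's weekly-list
# HTML page and extract reference/address pairs.
# The URL and markup are placeholders, not a real council site.
import re
import urllib.request

LIST_URL = "http://example.gov.uk/planning/weekly-list"  # hypothetical

def scrape_applications(html):
    # Assume each application appears as a table row like:
    #   <tr><td>APP/2007/0123</td><td>12 Example St</td></tr>
    pattern = re.compile(r"<tr><td>(APP/\d{4}/\d+)</td><td>([^<]+)</td></tr>")
    return pattern.findall(html)

def fetch_and_scrape(url=LIST_URL):
    with urllib.request.urlopen(url) as resp:
        return scrape_applications(resp.read().decode("utf-8", "replace"))

sample = "<tr><td>APP/2007/0123</td><td>12 Example St</td></tr>"
print(scrape_applications(sample))  # -> [('APP/2007/0123', '12 Example St')]
```

Each authority publishes differently, which is exactly why a shared, wiki-maintained collection of per-site scrapers makes sense.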
It doesn’t necessarily require precise georeferencing of the changes to make them useful. A general idea of the area a particular change comes from is enough to notify interested humans for further investigation.
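That fuzzy georeferencing could be as simple as matching place names in a notice against a rough gazetteer. The place names and coordinates below are illustrative placeholders, not real data:

```python
# Sketch: fuzzy georeferencing of a planning notice to an approximate
# area -- enough to decide who to notify, not to map the change itself.

# Hypothetical gazetteer: place name -> (lat, lon, radius_km) of rough centre
GAZETTEER = {
    "bethnal green": (51.527, -0.055, 1.5),
    "bow": (51.527, -0.021, 1.5),
}

def fuzzy_georeference(notice_text):
    """Return the first matching (lat, lon, radius_km), or None."""
    text = notice_text.lower()
    for place, area in GAZETTEER.items():
        if place in text:
            return area
    return None

notice = "Proposed demolition of garage, Roman Road, Bow, E3"
print(fuzzy_georeference(notice))  # -> (51.527, -0.021, 1.5)
```

A human subscriber within that radius then does the actual ground truthing.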
I could see this as a distinct project, useful for OpenStreetMap, but even more widely useful for anyone interested in what’s happening in their community. Web sources of change can be identified, and scrapers configured, within a wiki, similar to Katrina People Finder. Scrapers populate a database with notifications, with fuzzy georeferencing. GeoRSS feeds are available for subscribing to change in a particular area.
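The GeoRSS end of that architecture might look like the following sketch: filter stored notifications by a subscriber’s bounding box and emit them as an RSS feed with GeoRSS-Simple points. The notices and the bounding box are made-up examples:

```python
# Sketch: serve subscribers a GeoRSS feed of change notifications
# within their area of interest. Records here are invented examples;
# georss:point is the real GeoRSS-Simple element ("lat lon").
import xml.etree.ElementTree as ET

NOTICES = [
    {"title": "Planning app 2007/123: rear extension", "lat": 51.52, "lon": -0.03},
    {"title": "Planning app 2007/124: new access road", "lat": 53.40, "lon": -2.98},
]

def in_bbox(n, south, west, north, east):
    return south <= n["lat"] <= north and west <= n["lon"] <= east

def georss_feed(notices, bbox):
    rss = ET.Element("rss", attrib={"xmlns:georss": "http://www.georss.org/georss"},
                     version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Geographic change notifications"
    for n in notices:
        if not in_bbox(n, *bbox):
            continue
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = n["title"]
        ET.SubElement(item, "georss:point").text = f"{n['lat']} {n['lon']}"
    return ET.tostring(rss, encoding="unicode")

# Illustrative East London bounding box: (south, west, north, east)
print(georss_feed(NOTICES, (51.4, -0.1, 51.6, 0.1)))
```

Only the first notice falls inside the example box, so a London subscriber gets it without ever seeing the Liverpool-area one.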
Such a useful thing, I think, and so clearly in the public interest, that I could see finding support for it. Perhaps I will float the idea with mySociety. Or is there something like a federation of local authorities?