
Find here a new update of the OSMlanduse.org map. By injecting known tags provided by OpenStreetMap (OSM) into a remote sensing feature space using deep learning, tags were predicted where they were absent, thus creating a contiguous map, initially for the member states of the EU. By design, our method can be applied whenever and wherever OSM and Copernicus data are available. We now aim at full continental coverage, other continents, and land use evolution. Improvements addressing initial processing errors will be deployed soon. Insights will be provided in an upcoming publication authored by researchers Michael Schultz, Hao Li, Zhaoyan Wu, Daniel Wiell, Michael Auer and Alexander Zipf.

Among others, in collaboration with the Joint Research Centre (JRC) in Ispra and the International Institute for Applied Systems Analysis (IIASA), the map is subjected to an online validation campaign launched during the EU Regions Week. The validation event will be opened on Tuesday, 14 October 2020, at 9:30 by Michael Schultz of the GIScience Research Group, Heidelberg University, and Ana-Maria Raimond of IGN France.

Our map is the first successful large-area fusion of OSM and Copernicus data at 10 m spatial resolution or higher. Our approach accounts for varying label noise, feature space quality and scales, and makes effective use of artificial intelligence and computing. It relies solely on openly available data streams and does not depend on additional expert knowledge.

Brief method outline:

  1. OSM key-value pairs (tags) were translated into Coordination of Information on the Environment (CORINE) land cover (CLC) land use (LU) classes and used as training labels
  2. Preprocessed Sentinel-2 RGB 10 m data for the EU was provided by the Food and Agriculture Organization (FAO) and used as the feature space
  3. 1) and 2) were combined to produce a CLC classification of the EU using deep learning
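The first step above can be sketched as a simple lookup from OSM tags to CLC training labels. This is a hypothetical, heavily simplified excerpt, not the project's actual translation table:

```python
# Illustrative sketch of step 1: translating OSM key-value pairs into CORINE
# land cover (CLC) classes to use as training labels. The mapping below is a
# hypothetical excerpt, not the project's actual lookup table.
OSM_TO_CLC = {
    ("landuse", "residential"): "1.1.1 Continuous urban fabric",
    ("landuse", "industrial"): "1.2.1 Industrial or commercial units",
    ("landuse", "farmland"): "2.1.1 Non-irrigated arable land",
    ("landuse", "forest"): "3.1 Forests",
    ("natural", "water"): "5.1.2 Water bodies",
}

def osm_tags_to_clc(tags):
    """Return the CLC class for the first matching tag, or None if the
    feature carries no tag usable as a training label."""
    for key, value in tags.items():
        label = OSM_TO_CLC.get((key, value))
        if label is not None:
            return label
    return None
```

Features that yield no label are simply not used for training; their class is later predicted from the feature space.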

Examples of the novel GIScience OSMlanduse product, from left to right: land use of Europe, Heidelberg, and a rural area in Utrecht

The map is developed, deployed and hosted with support from HeiGIT (Heidelberg Institute for Geoinformation Technology).

Related work:

With the release of openrouteservice v6.3.0 at HeiGIT, we are presenting a new and improved speed-up technique that makes it possible to calculate much larger isochrones than with the existing algorithm. An isochrone is an area on the map that can be reached from a starting point within a time or distance limit specified by the user.
How does the improved algorithm work?
By utilizing the topological properties of our road network, we can identify cells of the network that are connected to other cells via only very few roads. Through this process, called partitioning, it is possible to drastically reduce the computation time needed to find all roads that can be reached within a certain time limit.
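For illustration, the underlying reachability problem can be sketched as a Dijkstra search bounded by the time limit; this is only the baseline that the partitioning technique accelerates, not the new algorithm itself:

```python
import heapq

def isochrone_nodes(graph, source, limit):
    """Return all nodes reachable from `source` within `limit` seconds,
    using a Dijkstra search bounded by the limit. The partitioning-based
    speed-up additionally prunes whole cells whose few boundary roads are
    already out of reach; this baseline still expands every edge it meets."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for neighbor, travel_time in graph.get(node, []):
            nd = d + travel_time
            if nd <= limit and nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return set(dist)

# Toy road network: node -> [(neighbor, travel time in seconds)]
toy = {"A": [("B", 10), ("C", 40)], "B": [("C", 10)], "C": [("D", 100)]}
```

With a 30-second limit, `isochrone_nodes(toy, "A", 30)` reaches C via B in 20 seconds even though the direct edge (40 s) exceeds the budget.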

Baden-Württemberg bike graph partitioned into cells


A 5h isochrone around the town of Tübingen is calculated as an example. With the old isochrones algorithm, this request takes more than 35s to process. The new, partitioning-based implementation returns the same result within just 3.5s. More than a 90% decrease in processing time!

Bike isochrone of 5h around the town of Tübingen


The ORS API restrictions imposed by openrouteservice’s old algorithm, a maximum of one hour of driving time for cars or trucks and five hours for walking and biking, are therefore no longer necessary. It is now possible to calculate isochrones for hours of travel time in just seconds.
The new algorithm is currently only available on self-hosted instances of openrouteservice, but will be enabled on our live API soon.
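For reference, such a 5 h bike isochrone request against the ORS v2 API (live instance or self-hosted) could be assembled as follows; the Tübingen coordinates are approximate and a personal API key is needed to actually send it:

```python
import json

# Building the request for a 5 h bike isochrone via the ORS v2 isochrones
# endpoint. Coordinates are approximate; "YOUR_API_KEY" is a placeholder.
url = "https://api.openrouteservice.org/v2/isochrones/cycling-regular"
payload = {
    "locations": [[9.05, 48.52]],  # lon, lat near Tübingen (approximate)
    "range": [5 * 3600],           # 5 hours, expressed in seconds
    "range_type": "time",
}
headers = {"Authorization": "YOUR_API_KEY", "Content-Type": "application/json"}
body = json.dumps(payload)
# send with e.g.: requests.post(url, data=body, headers=headers)
```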

The algorithm is adapted from “Fast Computation of Isochrones in Road Networks” by Valentin Buchhold.

Further details on the implementation can be found in the openrouteservice GitHub repository, while more news concerning openrouteservice can be found under http://giscienceblog.uni-hd.de/tag/openrouteservice/.

Over the last few years, the OpenStreetMap community in Ulm has organized an increasing number of mapathons and mapping events. On Thursday, 1 October 2020, starting at 19:00, another mapathon of the Ulm OSM meetup is planned, with support from Melanie Eckle-Elze (disastermappers heidelberg and Geoinformation for Humanitarian Aid, HeiGIT) and Katharina Lorenz (German Red Cross).

During the mapathon, we want to work on mapping tasks that support the fight against COVID-19 and international projects of the German Red Cross (DRK).
Katharina Lorenz will therefore support the event and present the background and goals of the DRK project. Katharina is a geoinformatics officer at the DRK General Secretariat in Berlin and, among other things, the Missing Maps coordinator of the DRK. Together with HeiGIT, she is currently organizing the 25 Mapathons series for national DRK societies. Further information on the 25 Mapathons and on participation can be found here.

Experienced OSM mappers, OSM beginners, and anyone not yet familiar with OpenStreetMap are all welcome at the mapathon. An introduction to mapping in OpenStreetMap will be given, so no prior knowledge is necessary.

The mapathon will take place both at the Verschwörhaus (Weinhof 9, 89073 Ulm, see map) and virtually at https://bbb.ulm.dev/b/covid19_mapathon.

Further information about the event and registration can be found on the wiki page.

As part of the LOKI project (Luftgestützte Observation Kritischer Infrastrukturen, airborne observation of critical infrastructure), a first test run of a newly launched OpenStreetMap (OSM) completeness mapping project in MapSwipe was conducted yesterday.

The aim of the LOKI project is to develop an interdisciplinary system that enables fast and reliable airborne situation assessments following an earthquake. The system will serve to reduce longer-term damage after an earthquake by recording information on the current situation as efficiently as possible, thereby enabling remediation actions to be undertaken within appropriate timescales.

In case of an earthquake, complete OSM building footprint data is critical input for exposure modelling, UAV mission planning, and the assessment of building-specific damage from UAV point clouds and images. In the first test run of the completeness mapping project, colleagues from the GIScience and 3DGeo research groups and from HeiGIT evaluated the completeness of OSM building footprints in MapSwipe for different geographic regions prone to earthquakes.

The design and testing of the completeness mapping was set up together with the GFZ German Research Centre for Geosciences, a project partner in the LOKI project.

Big thanks to all participants!

As part of the Missing Maps project, MapSwipe is a mobile app that was created to crowdsource map data from a network of global volunteers - just one swipe at a time. Individuals, volunteers from communities all over the world, can swipe through the app and tap areas where they find missing crucial infrastructure such as buildings and roads. In case of an earthquake, incomplete areas of the affected region will be enriched during mapathons with volunteers mapping missing building footprints.

LOKI is a collaboration project that combines existing expertise in earthquake research with a variety of new technologies and concepts, including machine learning, crowdsourcing, exposure modelling, 3D monitoring and the deployment of civil drones.

Find more details on the project website and latest project updates on Twitter or follow LOKI on ResearchGate.

The LOKI project runs from 2020 to 2022 and is funded by the BMBF (funding code: 03G0890).

Recently the first minor release, version 1.1, of the ohsome API was published, bringing several new features and upgrades. In the following we want to present the most important ones.

Prior to 1.1, when you requested data through one of our data extraction endpoints, you would always get the geometries clipped to the given spatial boundary (or boundaries). With the addition of the parameter clipGeometry, you can now specify yourself whether you want them clipped (clipGeometry=true) or unclipped (clipGeometry=false). By default, clipping of the features is enabled, to ensure backwards compatibility. To demonstrate a use case, we have prepared an example request looking at the buildings within a 2 km catchment area of the football stadium of TSG Hoffenheim in Sinsheim. The graphic shows the buildings with an unclipped geometry in purple, overlaid with the clipped features in orange.

You can clearly see the cut of the boundary through the buildings. Using clipped geometries makes sense when computing aggregations of the features in the given region, e.g. the length of the road network. When extracting and visualizing data, as in this example, it is better to disable the clipping. Here are the example URLs of clipped vs. unclipped.
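The two extraction requests compared above differ only in the clipGeometry parameter and could be assembled like this; the circle coordinates around the stadium are approximate and only meant for illustration:

```python
from urllib.parse import urlencode

# Two ohsome API data extraction requests, identical except for clipGeometry.
# The stadium coordinates are approximate, for illustration only.
base = "https://api.ohsome.org/v1/elements/geometry"
params = {
    "bcircles": "8.89,49.24,2000",  # lon,lat,radius: ~2 km around the stadium
    "time": "2020-01-01",
    "filter": "building=* and type:way",
}
clipped = base + "?" + urlencode({**params, "clipGeometry": "true"})
unclipped = base + "?" + urlencode({**params, "clipGeometry": "false"})
```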

The second new feature comes with an upgrade of the ohsome filter library to version 1.2, bringing the following two new filter functionalities:

  • having a “tag key in value-list” operator to select a specific set of tags having the same key
    • filter example: filter=highway in (residential, living_street)
    • request example: give the length of residential and living streets (URL)
  • selecting features by their OSM ID, either through giving one particular ID, a list of IDs or a range of IDs; it should always be combined with an OSM type
    • filter examples: filter=type:way and id:12345   filter=type:relation and id:(1, 2, 3)   filter=type:way and id:(1 .. 123)
    • request example: select one specific OSM feature (URL)

Of course these new filters are also documented within the dedicated filter documentation page.
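Put together, the two new filter features could be used in requests like the following; the bounding box is illustrative:

```python
from urllib.parse import urlencode

# Hypothetical ohsome API requests using the two new filter features.
base = "https://api.ohsome.org/v1"
common = {"bboxes": "8.65,49.38,8.72,49.42", "time": "2020-01-01"}  # illustrative bbox

# "tag key in value-list": total length of residential and living streets
in_list_url = base + "/elements/length?" + urlencode(
    {**common, "filter": "highway in (residential, living_street)"})

# selecting features by OSM id, always combined with an OSM type
by_id_url = base + "/elements/geometry?" + urlencode(
    {**common, "filter": "type:way and id:(1 .. 123)"})
```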

Apart from these changes, there have been a few updates to the GitHub readme, as well as changes that only affect the code itself, such as raising the Java compatibility to version 11 and various smaller bug fixes and code refactorings. The full list of changes can be found in the changelog of the ohsome API GitHub repository. Have fun using the new features and remember: stay ohsome!

Context: the aim of the ohsome OpenStreetMap History Data Analytics Platform is to make OpenStreetMap’s full-history data more easily accessible for various kinds of OSM data analytics tasks, such as data quality analysis, on a global scale. The ohsome API is one of its components, providing free and easy access to some of the functionalities of the ohsome platform via HTTP requests. Some intro can be found here:

Do you want to find out about the potential of simulating LiDAR data over synthetic forest stands and the steps to get there? Get to know the SYSSIFOSS research project in the video below!

Link to video

In SYSSIFOSS we are using 3D LiDAR forest data to create a database of diverse model trees (different species and characteristics). Using tree positions and parameters provided by a forest growth simulator, 3D forest scenes will be assembled from these model trees. They serve as input for the Heidelberg LiDAR Operations Simulator (HELIOS). Based on the resulting simulations, we will conduct a sensitivity analysis to identify the most important factors influencing LiDAR based forest inventories. Furthermore, we will investigate the potential of synthetic data to minimize the amount of field data collection.

SYSSIFOSS is a joint project between the Institute of Geography and Geoecology (IFGG) of the Karlsruhe Institute of Technology (KIT) and the 3DGeo Research Group of Heidelberg University. It is furthermore closely linked to the HELIOS project.

Find further details about the SYSSIFOSS project on the project website, in recent blogposts, or on Twitter (#SYSSIFOSS).

SYSSIFOSS is funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) - project number: 411263134.

The open, flexible and collaborative nature as well as the benefits and advantages of OpenStreetMap (OSM) have led to the creation of a whole new ecosystem evolving around the project. It ranges from local and global communities of data and software developers to a large number of tools and services for disaster response, routing, art etc. This ecosystem has one thing in common: each member needs to interact with the data from the central OSM database.

Over the years a large number of tools has emerged to accomplish this task, each with its own strengths and weaknesses. Some are limited to certain use cases or data sizes, some are highly complex and for specialists only, and some were abandoned and are no longer maintained. Finding out which tool is suitable in which situation can be a time-consuming task.

In 2017 HeiGIT created yet another set of tools for working with OSM data, filling a gap especially in the big data domain. Since then the ohsome framework has matured and is now well established within the OSM ecosystem, being used by professionals, researchers and the general public. For example, at the State of the Map 2020 conference (the global OSM conference), a considerable number of presentations in the academic track directly made use of these tools to create new knowledge about the project as a whole, and the services welcome more and more users every month.

If you are now thinking of switching to this new and innovative set of applications, which are easy to use and actively developed, but are not quite sure whether they suit your use case, we have created a simple decision tree to support you! If you are new to the world of OSM data, this decision tree might help you get started quickly and try out the right tools by answering a few simple questions about your use case.

Of course, choosing the right tools also depends highly on personal preferences. Yet the following diagram indicates where we see our tools best suited, listing alternatives where they are not (yet) fully suitable. But no matter your OSM knowledge, your data usage frequency or your programming skills: one ohsome tool will be suited to your needs!


Decision tree to match different OSM data processing use cases with OSM tools

Please find here a pdf version of the chart including direct links to the mentioned ohsome tools.

Further readings:

We are happy to share that our paper “A Comprehensive Framework for Intrinsic OpenStreetMap Quality Analysis” (Barron, Neis, Zipf 2013) belongs to the top 5 most cited papers of the international journal “Transactions in GIS” (TGIS). We only recently became aware of this. Thank you for considering this work.
Certainly this paper also influenced our subsequent work, leading to further publications related to VGI and, especially, OSM quality assessment methods, reviews and case studies.
Very importantly, it was also the motivation for developing the OpenStreetMap history analytics platform ohsome.org, based on our OpenStreetMap history database (OSHDB). This platform aims at scalable, efficient and reproducible intrinsic OSM quality analytics for the whole globe and makes it possible to investigate the mapping history of OSM. Here we shortly introduce the general architecture. You can already access the current version through the OSHDB, dashboards, especially the OSM history explorer ohsomeHEX, and the open ohsome API (see the interactive API documentation). Maybe you want to contribute your own analytics functionality to the active open source project? Learn more about using the API through the blog series “How to become ohsome” or in general at http://giscienceblog.uni-heidelberg.de/tag/ohsome/

Barron, C., Neis, P. & Zipf, A. (2013): A Comprehensive Framework for Intrinsic OpenStreetMap Quality Analysis. Transactions in GIS, 18(6), 877-895. DOI: 10.1111/tgis.12073.

List of Best Paper in TGIS: https://onlinelibrary.wiley.com/journal/14679671#pane-01cbe741-499a-4611-874e-1061f1f4679e11?tabActivePane=

Selected related work:

The Karlsruhe software company Disy Informationssysteme GmbH presents the new version of its data analytics, reporting and GIS platform “Cadenza”. Two of the major new features are extended analytics functionalities: an integrated routing function and a POI search.

Routing in Cadenza was realized by creating a connector to openrouteservice (ORS), developed by HeiGIT GmbH, the Heidelberg Institute for Geoinformation Technology, for the route calculations. The data basis for openrouteservice is the road network of OpenStreetMap (OSM), along with further optional data such as country borders. Openrouteservice is freely available as an open-source product and is used, among others, by the German Federal Agency for Cartography and Geodesy (BKG). The connection to the routing service is configured via a config file.

The route resulting from the calculation can be saved as a map theme. It is then available for further geo-operations, such as creating a 5 km buffer around the route. The range of applications for routing calculations is correspondingly large, for example for analyses in the service of environmental protection: “Which groundwater protection areas lie within 500 meters of the route of a dangerous goods transport from A to B?” “Which air quality measuring points are located along this route?”
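As a sketch, such a route could be fetched as GeoJSON from the ORS v2 directions API and its geometry then buffered (e.g. by 5 km) in a GIS. The coordinates below are illustrative, and an API key is required to actually send the request:

```python
import json

# Building a route request against the ORS v2 directions endpoint (GeoJSON
# output). Coordinates are illustrative; "YOUR_API_KEY" is a placeholder.
url = "https://api.openrouteservice.org/v2/directions/driving-car/geojson"
payload = {"coordinates": [[8.68, 49.41], [8.40, 49.01]]}  # approx. Heidelberg -> Karlsruhe
headers = {"Authorization": "YOUR_API_KEY", "Content-Type": "application/json"}
body = json.dumps(payload)
# send with e.g.: requests.post(url, data=body, headers=headers)
```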

As always, the same route can be generated with the ORS map client, via the ORS API, and in the new map client currently under development at HeiGIT. Information on parameters can be found in the interactive ORS documentation, and you can sign up for a free API key in the ORS developer dashboard.

Openrouteservice has been serving the OSM community and the general public since 2008.
Have fun routing!

More news about openrouteservice: http://giscienceblog.uni-hd.de/tag/openrouteservice/



Related Research Publications:

A recently published paper provides a bibliometric review of the integration of authoritative data and volunteered geographic information (VGI) for the specific purpose of cartographic updating of urban mappings. The methodology was a bibliometric survey of the literature indexed by Web of Science and Science Direct. The period evaluated was 2005 to 2020, and the keywords used were: integration of authoritative data, volunteered geographic information, VGI, large scale topographic mapping, and authoritative urban mapping. The number of publications found on this topic was small: only 14 articles at Web of Science and 23 at Science Direct. 38% of them were published in the ISPRS International Journal of Geo-Information and 16% in the International Journal of Geographical Information Science; 5% each were published in Cartography and Geographic Information Science and in Computers & Geosciences. The remaining 36% appeared in several other journals, approximately 3% each. Regarding the origin of the publications, 25% are from Germany (Heidelberg University), 14% from the UK (Newcastle University), 13% from China (Wuhan University), and 11% from Canada (University of Calgary); other countries show percentages between 3% and 5%. The research areas involved include physical geography, remote sensing, computer science, information science, engineering, and public administration. Among the potentials addressed in the articles are models with which institutions can manage collaboratively received information, several methodologies for quality control of this information so that it can be integrated into authoritative data (so-called data conflation), methodologies for handling big data and semantic interoperability, and the automation of processes.
This data potential lies not only in platforms such as OpenStreetMap, but also in data collected by scraping social networks such as Twitter, websites, and other sources.

Among the challenges, there are still some things to investigate regarding temporal, historic, political, and economic aspects, as well as legal aspects. The integration of volunteered geographic information is necessary mainly in cities that face economic and cultural difficulties in keeping their mapping up to date, and that have difficulty accessing authoritative data.

Fernandes, V. O., Elias, E. N., and Zipf, A.: Integration of Authoritative and Volunteered Geographic Information for Updating Urban Mapping: Challenges and Potentials, Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLIII-B4-2020, 261–268, https://doi.org/10.5194/isprs-archives-XLIII-B4-2020-261-2020, 2020.

Selected overview publications (with focus on VGI/OSM quality):
