
This week the GIScience research group Heidelberg and HeiGIT visited the exhibition “MatheLiebe” in Heidelberg. The exhibition demonstrates:
• that mathematics is understandable not only for scientists, but for everyone,
• that mathematics is interesting, useful and full of exciting surprises,
• and that mathematics is important for technological progress and for our everyday lives.
We very much enjoyed the guided tour led by Reinhold Weinmann and colleagues.
The traveling exhibition from Liechtenstein is presented in the MAINS, the mathematics and computer science station of the Heidelberg Laureate Forum Foundation (HLFF).

“Experiencing Mathematics” is an international exhibition, initiated and supported by UNESCO. Since opening in 2004, it has been shown more than 150 times in over 30 countries. With the exhibition in Heidelberg, “Experiencing Mathematics” can be seen in Germany for the first time.



The Heidelberg Laureate Forum Foundation (HLFF) annually organizes the Heidelberg Laureate Forum (HLF), a networking event for mathematicians and computer scientists from all over the world. The HLF was initiated by the German foundation Klaus Tschira Stiftung (KTS), which promotes natural sciences, mathematics and computer science, and the Heidelberg Institute for Theoretical Studies (HITS). The KTS and the HITS were founded by physicist and SAP co-founder Klaus Tschira (1940 – 2015). The Forum is organized by the HLFF in cooperation with KTS and HITS as well as the Association for Computing Machinery (ACM), the International Mathematical Union (IMU), and the Norwegian Academy of Science and Letters (DNVA).

Admittedly, the OpenRouteService Matrix API is anything but new. Implemented back in mid-2017, it has been in high demand by numerous clients across the globe ever since. Now we finally want to give it the credit it deserves.

The ORS Matrix API is a building block for solving important problems like the traveling salesman problem and even more complex use cases that logistics companies face every day. While we’re not exactly at the point of solving all their multi-faceted problems yet, you might still be interested in this tool, which can calculate up to 5.25 million routes per day. Yes, that’s right: 5.25 MILLION PER DAY!

And it happens incredibly fast, too. Within a single request, you can fire off a matrix of 50×50 locations, and it won’t take noticeably longer than calculating a single route from Milan to Berlin.

There’s gotta be a catch, you think? Well, yes, there is: you will not see any geometry of your routes. You will only get back what matters most: distance and/or duration. Who has time to examine 5 million routes every day anyway…

If we sparked your interest, take a look at the documentation, open your IDE and start stressing our servers!
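As a rough illustration, a many-to-many matrix request can be sketched like this. The field names, coordinates and helper below are assumptions for illustration only; check the official documentation for the exact API.

```python
# Sketch of preparing an ORS Matrix API request body (field names are
# assumptions to be verified against the documentation).

def build_matrix_body(locations, metrics=("distance", "duration")):
    """Return a JSON-ready body for an N x N many-to-many matrix request.

    locations -- list of [longitude, latitude] pairs
    metrics   -- values computed for every origin/destination pair
    """
    return {"locations": locations, "metrics": list(metrics)}

# Three example locations (lon, lat): Milan, Berlin, Heidelberg.
body = build_matrix_body([[9.19, 45.46], [13.40, 52.52], [8.67, 49.41]])

# An N x N request resolves N * N origin/destination pairs in one call.
n = len(body["locations"])
print(n * n)  # 9 pairs for this 3x3 example
```

You would POST such a body to the Matrix endpoint along with your API key. At 2,500 pairs per 50×50 request, the daily quota of 5.25 million routes corresponds to 2,100 such requests.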


A recently published paper presents an approach for classifying urban blocks according to their built-up structure based on high-resolution spaceborne InSAR images. Most attributes considered in the classification describe the geometric structure and spatial disposition of the polygon and line features extracted from each block. The feature extraction is carried out on two intensity images acquired at the satellite’s ascending and descending orbits. The strategy used for extracting polygon features is described in detail. We also present a Markov random field model used to perform context-based classification of built-up structures. The model establishes a probabilistic dependency between the class labels of two neighbouring blocks, thereby taking advantage of the fact that blocks with the same structure frequently cluster. In total, 1695 urban blocks were classified into five general built-up types. It is shown that the context-based classification is up to 6% more accurate than the standard classification on which it is based. We hence provide evidence (1) that urban block-based classifications can potentially be improved if context is considered and (2) that general built-up structures can be distinguished to a good extent using available high-resolution spaceborne radar images.
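To make the idea of context-based relabelling concrete, here is a schematic Potts-style sketch using iterated conditional modes (ICM). The energy form and all numbers are illustrative assumptions, not the paper's exact model.

```python
# Each block has per-class costs from a standard (context-free)
# classifier; a pairwise term penalizes neighbouring blocks that
# disagree, exploiting the clustering of similar built-up structures.

def icm(scores, neighbours, beta=0.5, max_iter=20):
    """scores: {block: [cost per class]}, neighbours: {block: [blocks]}.

    Returns {block: class index} after ICM refinement.
    """
    # Initialize with the context-free labels (lowest unary cost).
    labels = {b: min(range(len(c)), key=c.__getitem__) for b, c in scores.items()}
    for _ in range(max_iter):
        changed = False
        for b, costs in scores.items():
            def local_energy(k):
                # unary cost + penalty per neighbour with a different label
                return costs[k] + beta * sum(labels[n] != k for n in neighbours[b])
            best = min(range(len(costs)), key=local_energy)
            if best != labels[b]:
                labels[b], changed = best, True
        if not changed:
            break
    return labels

# Three blocks, two classes; block "c" weakly prefers class 1, but both
# of its neighbours prefer class 0, so context flips it to class 0.
scores = {"a": [0.1, 0.9], "b": [0.2, 0.8], "c": [0.55, 0.45]}
neighbours = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b"]}
print(icm(scores, neighbours))
```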


Novack, T. and Stilla, U. (2018): Context-Based Classification of Urban Blocks According to Their Built-up Structure. PFG – Journal of Photogrammetry, Remote Sensing and Geoinformation Science, Vol. 1 (5), pp. 1-12.


We cordially invite everybody interested to our next open GIScience colloquium talk!

Prof. Dr. Peter Baumann / Computer Science, Jacobs University, Bremen

Friday, January 19, 2018, 10:15 am, INF 348, Room 013, Heidelberg University, Institute for Geography

Datacubes form an enabling paradigm for serving massive spatio-temporal Earth data in an analysis-ready way by combining individual files into single, homogenized objects for easy access, extraction, analysis, and fusion - “one cube says more than a million images”. In common terms, the goal is to allow users to “ask any question, any time, on any size”, thereby enabling them to “build their own product on the go”.
Today, large-scale datacubes are becoming reality: For server-side evaluation of datacube requests, a bundle of enabling techniques is known which can massively speed up response times, including adaptive partitioning, parallel and distributed processing, dynamic orchestration of mixed hardware, and even federations of data centers. Known datacube services exceed 600 TB, and datacube analytics queries have been split across 1,000+ cloud nodes. Intercontinental datacube fusion has been accomplished between ECMWF/UK and NCI Australia, as well as between ESA and NASA.
From a standards perspective, as per ISO and OGC, datacubes belong to the family of coverages, aka “spatio-temporally varying objects”. The coverage data model is represented by the OGC Coverage Implementation Schema (CIS) standard, the service model by the OGC Web Coverage Service (WCS) together with its OGC Web Coverage Processing Service (WCPS), OGC’s geo datacube query language. Additionally, ISO is finalizing application-independent query support for massive multi-dimensional arrays in SQL.
In our talk we present the concept of datacubes, the standards that play a role, as well as existing interoperability successes and issues, based on our work on the OGC Reference Implementation, rasdaman.
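The core convenience of the datacube view can be shown with a toy array: a stack of monthly rasters becomes one multi-dimensional object, so extraction and aggregation are single expressions instead of per-file loops. The shapes and values below are made up for illustration; real services such as rasdaman evaluate such operations server-side via WCPS queries rather than in client memory.

```python
import numpy as np

# Twelve monthly rasters of 4 x 5 pixels, homogenized into one
# 3-D cube with axes (time, lat, lon).
months, rows, cols = 12, 4, 5
cube = np.arange(months * rows * cols, dtype=float).reshape(months, rows, cols)

pixel_series = cube[:, 2, 3]      # full time series of one pixel
summer = cube[5:8]                # one temporal slice (three months)
annual_mean = cube.mean(axis=0)   # per-pixel aggregation over time

print(pixel_series.shape, summer.shape, annual_mean.shape)
```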


We are looking forward to a large attendance!

Dear students of all semesters and degree programs,

I would like to cordially invite you to two special events in Geography, held as part of the lecture “Geodatenerfassung” (geodata acquisition), which are sure to interest you:

  • Monday, 22.01.2018, 9:15 am, gHS (INF 230, COS): special lecture “Official Geodata at the City of Heidelberg: Acquisition, Use and Maintenance”. Hubert Zimmerer, head of the City of Heidelberg’s GIS, reports in an exciting talk from the practice of geodata acquisition in “our” city. He shows the latest examples and what the near future holds for geodata in Heidelberg.
  • Monday, 29.01.2018, 9:15 am, gHS (INF 230, COS): special lecture “Game-Based Approaches to Geodata Acquisition”. Heinrich Lorei introduces you to the world of game-based geodata acquisition and lets you actively take part in it as part of his research work.

H. Lorei: “Points, levels and high scores: you have surely heard of these terms before. They are elementary components of games and enjoy growing popularity. But what makes games so interesting and exciting? The lecture explains this using examples such as Tetris, World of Warcraft and Grand Theft Auto. Furthermore, gamification denotes the use of game elements in non-game contexts. It is used more and more in everyday life, e.g. while jogging or doing housework, to turn “boring” activities into exciting challenges and to overcome one’s weaker self. This raises the question: can game elements also be used to bring more variety into the acquisition of geodata? In this way, geodata can be updated more frequently and missing attributes can be entered by the players. Apps like KORT and StreetComplete have already realized this using the example of OpenStreetMap, but do not exhaust their potential. The last part of the lecture therefore covers possible extensions to increase their replay value.”

Disaster events damage human infrastructure and its surroundings within seconds. To support humanitarian logistics, the Disaster OpenRouteService needs the latest, most accurate data available. While crowd-sourced OSM updates during disasters have proved very successful, there has not yet been a convenient way of automatically accessing up-to-date OSM data for specific regions of interest. Addressing this need, HeiGIT @ GIScience Heidelberg developed a server that provides up-to-date OSM extracts via an easy-to-use web interface.

real-time OSM manages the creation and execution of update tasks, each responsible for extracting, updating and serving OSM data of a user-defined region of the earth. Tasks can be added, modified and deleted easily via an API or a convenient web interface.
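For illustration, a task definition might look roughly like this. The field names and the helper below are hypothetical, invented for this sketch; consult the real-time OSM documentation on GitHub for the actual API.

```python
import json

# Hypothetical sketch of an update-task definition for such a service:
# a named region of interest (bounding box) plus an update interval.

def make_task(name, bbox, interval_minutes=5):
    """bbox: (min_lon, min_lat, max_lon, max_lat) in WGS84 degrees."""
    min_lon, min_lat, max_lon, max_lat = bbox
    assert min_lon < max_lon and min_lat < max_lat, "degenerate bounding box"
    return {
        "name": name,
        "bbox": [min_lon, min_lat, max_lon, max_lat],
        "interval": interval_minutes * 60,  # seconds between OSM updates
    }

# A small region around Heidelberg, refreshed every five minutes.
task = make_task("heidelberg", (8.57, 49.35, 8.77, 49.46))
print(json.dumps(task))
```

Such a body would then be sent to the server's task-management API, which keeps the extract for that region continuously updated.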

We are now working towards deploying this server as a web service hosted by HeiGIT. This will provide you with easy access to the most current data as needed by the disasterORS. Stay tuned for further improvements and updates!

The tool was developed by Stefan Eberlein. The work at HeiGIT is supported by the Klaus Tschira Foundation, Heidelberg.

For more information, visit the real-time OSM Github page.

P.S.: Also check out our open position: Software Developer: OSM Routing Services

Recently, deep learning has been widely applied to pattern recognition with satellite images. Deep learning techniques like Convolutional Neural Networks and Deep Belief Networks have shown outstanding performance in detecting ground objects like buildings and roads, and the learnt deep features are further applied in prediction tasks like poverty and population mapping. On the other hand, such deep learning techniques usually rely on a large set of labeled training samples (i.e., human knowledge) for supervision. Volunteered Geographic Information (VGI) like OpenStreetMap provides a way to easily obtain such a large training set. Meanwhile, utilizing VGI for deep learning brings new technical challenges, like
1) how to deal with the noise in VGI data which are usually contributed by common people instead of experts, and
2) how to transfer learnt models from area to area and from time to time, as there is usually a gap between the volunteer labeled targets and the unknown targets waiting for prediction.
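To make challenge (1) concrete: one generic way to tame label noise in volunteered data is redundancy, i.e., letting several volunteers label the same image tile and keeping the majority answer along with an agreement score. This is an illustrative sketch of that general idea, not the specific method used in the chapter.

```python
from collections import Counter

# Aggregate redundant volunteer labels per tile (MapSwipe-style):
# keep the majority label and record how contested each tile was.

def aggregate(votes_per_tile):
    """votes_per_tile: {tile_id: [label, ...]} -> {tile_id: (label, agreement)}"""
    result = {}
    for tile, votes in votes_per_tile.items():
        counts = Counter(votes)
        label, n = counts.most_common(1)[0]
        result[tile] = (label, n / len(votes))  # agreement ratio in (0, 1]
    return result

votes = {
    "t1": ["building", "building", "no_building"],  # contested tile
    "t2": ["building", "building", "building"],     # unanimous tile
}
print(aggregate(votes))
```

Low-agreement tiles can then be down-weighted or dropped before they are used to supervise a network.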
A chapter in a recently published book by Karimi and Karimi (2017) introduces the current work in this field, including satellite image classification with deep learning; challenges and solutions in utilizing VGI data, especially OpenStreetMap (OSM) but also MapSwipe data; domain adaptation and feature transfer; and applications.

First the typical deep learning studies in satellite image classification as well as some classic benchmarks are analyzed, and then the chapter focuses on the problem of automatically extracting big sample sets from VGI data for the supervision of training deep networks. Two main technical challenges about sample noise and domain adaptation as well as their solutions in VGI data quality research and machine learning research are introduced. Finally, several applications where the above techniques and data can be applied are presented.

The chapter builds upon work done in the deepVGI project at HeiGIT (Heidelberg Institute for Geoinformation Technology) at Heidelberg University.

Chen, J., Zipf, A. (2017): Deep Learning with Satellite Images and Volunteered Geographic Information (VGI). In: Karimi, H. A. and Karimi, B. (eds.): Geospatial Data Science Techniques and Applications. Chapter 3, pp. 63-78. CRC Press, Taylor & Francis.


Software Engineer OSM Routing Services, Backend & Algorithms
Heidelberg Institute for Geoinformation Technology (HeiGIT)

Do you genuinely enjoy developing open source geoinformation services used by thousands on a daily basis? Are you a highly motivated Java backend developer? And do you love using and enhancing OpenStreetMap for high-performance services with global coverage? Then we might have a suitable and interesting job for you.

In order to promote technology transfer and applied research in the area of Geoinformatics, the Heidelberg Institute for Geoinformation Technology (HeiGIT, http://www.heigit.org) is currently being established with the support of the Klaus Tschira Foundation. This endeavour is to be continued in the future as an independent institute. To this end, we are looking for a motivated software developer in the field of routing services (Java backend). Depending on your experience, the tasks are related to at least one of the following areas:

  • Route Planning, Smart Mobility and Navigation Intelligence
  • Development and design of innovative routing services, algorithms and extensions for the well-known open source project http://OpenRouteService.org (Java)
  • Extension of the services infrastructure of various location-based services (LBS) using OpenStreetMap
  • Development of highly performant algorithms, methods and GI web services, especially for the analysis and data enrichment of heterogeneous global geodata sets (e.g. from the Social Web or OpenStreetMap), in the domain of routing, traffic and navigation

We offer an attractive position in an interdisciplinary team in a highly dynamic and growing market. HeiGIT is and will remain closely related to the Department of Geoinformatics, which is a member of the Interdisciplinary Center for Scientific Computing (IWR) and the Heidelberg Center for the Environment (HCE). We offer a stimulating interdisciplinary research environment with many personal development opportunities.

We expect a highly motivated backend developer with solid experience in Java or C++ and an above-average university degree in a subject such as Geoinformatics, Computer Science, Geography, Mathematics or similar.

The position is to be filled as soon as possible and, for administrative reasons, is initially limited to two years, with the option of extension. Please send your application documents (CV, certificates, references, etc.) as soon as possible to zipf@uni-heidelberg.de




15th International Conference on Information Systems for Crisis Response and Management (ISCRAM 2018)
May 20-23, 2018, Rochester, NY, USA

Track: Geospatial Technologies and Geographic Information Science for Crisis Management (GIS)

Deadline for paper submissions: December 10, 2017

Track Description

With disasters and disaster management being an “inherently spatial” problem, geospatial information and technologies have been widely employed to support disaster and crisis management. This includes SDSS and GIS architectures, VGI, spatial databases, spatio-temporal methods, as well as geovisual analytics technologies, which have great potential for building risk maps, estimating damaged areas, defining evacuation routes, and planning resource distribution. Collaborative platforms like OSM have also been employed to support disaster management (e.g., near real-time mapping). Nevertheless, all these geospatial big data pose new challenges not only for geospatial data visualization, but also for data modeling and analysis; existing technologies, methodologies, and approaches now have to deal with data shared in various formats, at different velocities, and with uncertainties. Furthermore, new issues have also been emerging in urban computing and smart cities for making communities more resilient against disasters. In line with this year’s conference theme, the GIS Track particularly welcomes submissions addressing aspects of geovisualization in disaster risk and crisis research. This includes SDSS, near real-time mapping, situational awareness, VGI, spatio-temporal modeling, urban computing, and other related aspects. We seek conceptual, theoretical, technological, methodological, and empirical contributions, as well as research papers employing different methodologies, e.g., design-oriented research, case studies, and action research. Solid student contributions are welcome.

Track topics therefore include, but are not limited to, the following:

Geospatial data analytics for crisis management
Location-based services for crisis management
Location-based technologies for crisis management
Geospatial ontology for crisis management
Geospatial big data in the context of disaster and crisis management
Geospatial linked data for crisis management
Urban computing and geospatial aspects of smart cities for crisis management
Spatial Decision Support Systems for crisis management
Remote sensing for crisis management
Geospatial intelligence for crisis management
Spatial data management for crisis management
Spatial data infrastructure for crisis management
Geovisual analytics for crisis management
Spatial-temporal modeling in disaster and crisis context
Crisis mapping and geovisualization
Crowdsourcing and VGI in the context of disaster and crisis management
Spatial analysis of OpenStreetMap (OSM) data for crisis management
Spatial analysis of social media messages in the context of crisis management
Interoperability aspects regarding disaster-related geodata

Important Dates

Submission of Completed Research papers (CoRe): December 10, 2017
Acceptances or otherwise of Completed Research papers (CoRe): January 7, 2018

Submission of Work in Progress (WiPe) papers, demos and posters: January 21, 2018.
Acceptances or otherwise of Work in Progress (WiPe) papers, demos and posters: February 18, 2018.
Camera Ready Paper: March 4, 2018.

ISCRAM 2018
Paper submission guidelines


Track Chairs

Dr João Porto de Albuquerque (primary contact)
University of Warwick, United Kingdom

Prof. Dr. Alexander Zipf
University of Heidelberg, Germany

Dr Flávio E. A. Horita
University of São Paulo, Brazil

Over the last weeks the recently introduced Climate Protection Map (Klimaschutzkarte.de) has been extended to cover the whole world and now also features some new layers. The Global Climate Protection Map is based on user-contributed data from OpenStreetMap. It allows interested citizens to share information and find out about topics related to climate protection and sustainability, such as sustainable energy supply, forms of mobility, and consumption / nutrition. Click on one of the categories to see the related information layers on the map.

Heidelberg University recently also issued a press release about the project (in German): http://www.uni-heidelberg.de/presse/meldungen/2017/m20171222_klimaschutzkarte-fuer-mehr-nachhaltigkeit-im-alltag.html

From citizens to citizens: Share your local knowledge about the relevant topics and inform yourself about possibilities for more sustainable living!


