One of the contributions showcased by the HeiGIT/GIScience team at last year’s Free and Open Source Software for Geospatial (FOSS4G) 2022 conference in Florence (Italy) was Moritz Schott, Sven Lautenbach, Leonie Großchen, and Alexander Zipf’s novel paper “OpenStreetMap Element Vectorisation: A Tool for High Resolution Data Insights and its Usability in the Land-use and Land-cover Domain.”

The contribution presents a tool to address the much-discussed issue of fitness for purpose. As researchers and users take advantage of OpenStreetMap in spatial analysis and location-based service applications, they often lack information on data quality, and the information that is available tends to be limited to certain regions or data aspects. The new tool combines a large number of such information sources in a single place and at the highest possible resolution: single OSM elements.

The new tool, OpenStreetMap Element Vectorisation (repository), currently provides access to 32 attributes or indicators at the level of single OpenStreetMap objects. These indicators cover aspects concerning the element itself, surrounding objects, and the editors of the object. A graphic workflow illustrates the steps required to use the tool, from file configuration and database setup to result output. In addition, the tool provides a Docker workflow that reduces the process to a few simple commands. Alternatively, the website and the API provide a starting point to test a few precomputed examples.
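As a rough illustration of how the per-element output could be consumed downstream, the following sketch loads a hypothetical CSV export of the 32 indicators into pandas and inspects a few rows; the file name and column names are assumptions for the example, not the tool’s actual schema.

```python
import pandas as pd

# Hypothetical export of the OEV indicators, one row per OSM element
# (file name and column names are placeholders, not the tool's schema).
indicators = pd.read_csv("oev_indicators.csv")

print(indicators.shape)       # number of elements and indicator columns
print(indicators.describe())  # quick distribution overview per indicator

# Example: inspect a single element via an assumed "osm_id" column.
element = indicators.loc[indicators["osm_id"] == 123456789]
print(element.T)
```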

A visual representation of the OEV workflow stages

Testing Grounds

The paper not only introduced the tool but also tested OpenStreetMap Element Vectorisation’s usability for the use case of land-use and land-cover (LULC) polygons. The data and all related figures of the analysis are openly available in the respective repository.

The researchers found that OpenStreetMap objects in more densely populated areas tended to be smaller, while the age and size of the objects differed across continents. In Europe and North America, the researchers detected older and smaller objects. These findings were backed up by statistical tests. Yet it became clear that such trends are hardly visible from a global perspective, and more localised analyses are required.

In addition, a k-means cluster analysis was used to identify groups in the data in search of global data practices. One finding of the method was a cluster highly influenced by North American lakes that originate from imports.
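For readers who want to try this kind of grouping on their own indicator tables, a minimal k-means sketch with scikit-learn could look as follows; the input matrix and the number of clusters are placeholders, not the parameters used in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Placeholder indicator matrix: one row per OSM element, one column per indicator.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 32))

# Standardise so indicators on different scales contribute equally.
X_scaled = StandardScaler().fit_transform(X)

# The cluster count is a placeholder; the paper's choice may differ.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=42).fit(X_scaled)

# Cluster labels per element and the resulting cluster sizes.
labels = kmeans.labels_
print(np.bincount(labels))
```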

Eye to the Future

The tool offers ample opportunities for future research, supports the OpenStreetMap community in making informed planning decisions for future activities, and enables data consumers to make informed decisions on data usage. While it was developed with land-use and land-cover information in mind, the tool can be seamlessly applied to any polygonal OpenStreetMap data and also supports linear and point data. In addition, it is not restricted to quality analyses but rather provides a universal collection of data attributes that are relevant in different applications.

For the full paper and all references:

Schott, M., Lautenbach, S., Großchen, L., & Zipf, A. (2022). OpenStreetMap Element Vectorisation - A Tool for High Resolution Data Insights and its Usability in the Land-use and Land-cover Domain. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XLVIII-4/W1-2022. https://doi.org/10.5194/isprs-archives-XLVIII-4-W1-2022-395-2022

Related work:

  • Bruckner, J., Schott, M., Zipf, A., Lautenbach, S., 2021. Assessing shop completeness in OpenStreetMap for two federal states in Germany. AGILE: GIScience Series, 2, 20.
  • Herfort, B., Lautenbach, S., de Albuquerque, J. P., Anderson, J., Zipf, A., 2021. The evolution of humanitarian mapping within the OpenStreetMap community. Scientific Reports, 11(1), 1–15.
  • Jokar Arsanjani, J., Mooney, P., Zipf, A., Schauss, A., 2015. Quality assessment of the contributed land use information from OpenStreetMap versus authoritative datasets. In: J. Jokar Arsanjani, A. Zipf, P. Mooney, M. Helbich (eds), OpenStreetMap in GIScience: Experiences, Research, and Applications, Springer International Publishing, Cham, 37–58.
  • Neis, P., Zielstra, D., Zipf, A., 2013. Comparison of Volunteered Geographic Information Data Contributions and Community Development for Selected World Regions. Future Internet, 5(2), 282–300.
  • Neis, P., Zipf, A., 2012. Analyzing the Contributor Activity of a Volunteered Geographic Information Project — The Case of OpenStreetMap. ISPRS International Journal of Geo-Information, 1(2), 146–165.
  • Raifer, M., Troilo, R., Kowatsch, F., Auer, M., Loos, L., Marx, S., Przybill, K., Fendrich, S., Mocnik, F.-B., Zipf, A., 2019. OSHDB: a framework for spatio-temporal analysis of OpenStreetMap history data. Open Geospatial Data, Software and Standards, 4(1), 1–12.
  • Schott, M., Grinberger, A. Y., Lautenbach, S., Zipf, A., 2021. The Impact of Community Happenings in OpenStreetMap—Establishing a Framework for Online Community Member Activity Analyses. ISPRS International Journal of Geo-Information, 10(3), 164.
  • Schott, M., Zell, A., Lautenbach, S., Zipf, A., Demir, B., 2022. LULC multi-tags based on OSM, Version 0.1. https://gitlab.gistools.geog.uni-heidelberg.de/giscience/idealvgi/osm-multitag.
  • Schultz, M., Voss, J., Auer, M., Carter, S., Zipf, A., 2017. Open land cover from OpenStreetMap and remote sensing. International Journal of Applied Earth Observation and Geoinformation, 63, 206-213.
  • Zielstra, D., Zipf, A., 2010. A comparative study of proprietary geodata and volunteered geographic information for Germany. 13th AGILE international conference on geographic information science, 2010, 1–15.

A new dataset of UAV and terrestrial laser scanning point clouds of snow-on and snow-off conditions at a Black Forest site (Hundseck, 48.643°N, 8.228°E) was published open access:

Winiwarter, L., Anders, K., Battuvshin, G., Menzel, L. & Höfle, B. (2023): UAV laser scanning and terrestrial laser scanning point clouds of snow-on and snow-off conditions of a forest plot in the black forest at Hundseck, Baden-Württemberg, Germany [data]. heiDATA. DOI:10.11588/data/UCPTP1.

The data was acquired in a joint field campaign by the research groups Hydrology and Climatology (Prof. Menzel) and 3DGeo (Prof. Höfle). The plot was captured under snow-on conditions in January 2021, and the acquisition was repeated under snow-off conditions in February 2021. Terrestrial laser scanning (TLS) point clouds were acquired as a reference for UAV laser scanning, which provides a fast way to capture the forest scene at high spatial detail and accuracy. Check out the point clouds in our web viewer: https://3dweb.geog.uni-heidelberg.de/potree/scenes/hundseck.html

The data can be used to assess snow depths in the observation period, as well as the topography and vegetation structure. The thesis of Thomas Scharffenberger used the data to investigate influences of tree structure and topography on sub-canopy snow depth. This research demonstrates a strong link between snow depth and the distance to the tree stem. Overall, snow depth in the forest is linked to a broad range of variables, among them vegetation structure, forest stand density, topographic conditions, and the potential solar radiation reaching the forest floor: a highly complex situation that calls for further investigation with such valuable data.
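As a rough sketch of how snow depths could be derived from the published point clouds, the code below grids the snow-on and snow-off clouds into coarse minimum-z surfaces and differences them. The file names, cell size, and use of laspy are assumptions for the illustration, not the method used in the thesis.

```python
import numpy as np
import laspy  # reading .laz additionally requires the lazrs or laszip backend

def min_z_grid(path, cell=1.0, extent=None):
    """Grid a point cloud to a minimum-z surface (a very coarse surface proxy)."""
    las = laspy.read(path)
    x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)
    if extent is None:
        extent = (x.min(), y.min(), x.max(), y.max())
    xmin, ymin, xmax, ymax = extent
    nx = int(np.ceil((xmax - xmin) / cell))
    ny = int(np.ceil((ymax - ymin) / cell))
    grid = np.full((ny, nx), np.nan)
    ix = np.clip(((x - xmin) / cell).astype(int), 0, nx - 1)
    iy = np.clip(((y - ymin) / cell).astype(int), 0, ny - 1)
    for i, j, v in zip(iy, ix, z):
        if np.isnan(grid[i, j]) or v < grid[i, j]:
            grid[i, j] = v
    return grid, extent

# File names are placeholders for the published snow-on / snow-off clouds.
snow_on, extent = min_z_grid("hundseck_snow_on.laz", cell=1.0)
snow_off, _ = min_z_grid("hundseck_snow_off.laz", cell=1.0, extent=extent)

snow_depth = snow_on - snow_off  # per-cell snow depth estimate in metres
print(np.nanmean(snow_depth), np.nanmax(snow_depth))
```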

Are you a highly motivated individual who loves designing and developing machine learning and deep learning systems? Do you want to use your machine learning expertise for the benefit of society and the environment? Do you want to improve the availability and quality of geospatial data and further develop geoinformatics methods used for open, non-profit applications in the field of sustainability, mobility and humanitarian aid? That’s our mission, too! We might have just the job for you.

Deep Learning Engineer (m/f/d, up to 100%)

HeiGIT gGmbH is a non-profit start-up aiming at technology transfer and applied research in the area of geoinformatics, especially in areas such as humanitarian aid or other non-profit goals for the benefit of society and the environment.

To this end, we are looking for a motivated and experienced Deep Learning Engineer to help create AI products for smart mobility, humanitarian aid, disaster management & other geospatial applications to support sustainability and climate action.

Your tasks are related to at least one of the following areas:

  • Designing and developing machine learning and deep learning systems
  • Exploring state-of-the-art model families and machine learning algorithms
  • Deep feature extraction using multi-modal data including satellite imagery, OSM data etc.
  • Innovative use of advanced machine learning methods
  • Data fusion of heterogeneous data sources for geoinformation

Close collaboration and communication with team members & stakeholders is your favorite way of working. Your in-depth knowledge of your favorite language and tools will be valued, but you’ll also be expected to help with whatever the team needs to work towards its goal.

Your Expertise

  • Experience with computer vision and machine learning for object detection and instance segmentation tasks. Familiarity with the latest deep learning models, such as Transformers and generative models, is beneficial.
  • Strong foundations in probability, linear algebra and optimization
  • Hands-on experience in evaluating and developing new approaches from the literature and actively implementing new concepts
  • Hands-on working experience with any of the major deep learning frameworks (TensorFlow, PyTorch, etc.)
  • Quick prototyping and coding skills in Python
  • Good communication and collaboration skills
  • Interest in working at the interface between science and practical requirements with implementations for different user groups (e.g. requirements of aid organizations, environmental organizations, government agencies, citizens etc.).
  • Previous experience in working with spatial data is beneficial.
  • An above-average university degree in a subject such as Geoinformatics, Computer Science, Geography, Mathematics, Physics, or a related field.
  • German language skills are preferred, but English only is also possible.

We offer an attractive position in an interdisciplinary team in a highly dynamic and growing area. HeiGIT gGmbH cooperates closely with the GIScience Research Group at Heidelberg University, which is a member of the Interdisciplinary Center for Scientific Computing (IWR) and the Heidelberg Center for the Environment (HCE) at Heidelberg University. We offer a stimulating interdisciplinary research environment with many personal development opportunities. All HeiGIT employees can, in agreement with the rest of the team, work from home or remotely for up to half of their time.

HeiGIT receives core funding from the Klaus Tschira Foundation (KTS).

The position is to be filled as soon as possible and, for administrative reasons, is initially limited to 2 years, with the option of extension. Please send your application, including CV, certificates, references, etc., in digital form as a PDF as soon as possible to maria.martin@heigit.org.

Please note our data protection information for applicants.

If you’re on the job market or know someone who is, check out this exciting new opening from our partners at GIScience. The offer is included as text below.

Are you interested in enhancing methods for analyzing & improving OpenStreetMap data? Are you an experienced Spatial Data Scientist innovating geoinformatics methods & workflows? Do you have profound hands-on experience in contributing to & working with OpenStreetMap (OSM) data?

We are looking, as soon as possible, for a senior researcher / postdoc (m/f/d) for a project aimed at developing, evaluating & advancing methods & technologies for OSM data quality analysis. The application domain will be routing & navigation. The goal is to develop robust & meaningful approaches for evaluating OSM data quality for transportation (fitness for purpose) that can actually be applied in practice. Ensuring transferability of the methods to usage around the globe is a key focus.

You will be working closely with the teams at both the GIScience Research Group at Heidelberg University and the associated Heidelberg Institute for Geoinformation Technology (HeiGIT.org). At HeiGIT, a team is already working on a comprehensive open-source framework for OSM data analysis (ohsome.org). This can serve as a base for further developments in the analysis of OSM data quality targeted at routing & navigation. In addition, GIScience & HeiGIT jointly develop the open-source OSM routing platform openrouteservice.org, which allows detailed insights into & discussions about realizing OSM-based mobility services.

We offer an attractive position (up to 100%, part time possible if preferred) in an interdisciplinary dynamic team and in a cutting-edge research field. The department is, among others, a member of the University’s Interdisciplinary Center for Scientific Computing (IWR) and a founding member of the Heidelberg Center for the Environment (HCE). The affiliated institute HeiGIT gGmbH translates the research into practical applications. The Heidelberg University of Excellence (top 3 national university ranking) offers a particularly stimulating interdisciplinary research environment in one of Germany’s most attractive cities, with many attractive personal development & education opportunities.

We expect a PhD and an above-average university degree, preferably in a subject such as geoinformatics, computer science, data science, geography or a similar discipline. In addition to a strong team spirit & high motivation, excellent & demonstrated broad methodological skills & research experience in GIScience are required, e.g. in topics such as Spatial Data Science & Machine Learning. We also expect a deep understanding of OSM, programming skills (e.g. Python, Java or R), the ability to work effectively & efficiently in a team, as well as excellent communication & presentation skills. The position is to be filled as soon as possible and is initially limited to 3 years, with potential extension to 6 years. The remuneration is according to TV-L E13. Please send your application documents (CV, certificates, references, etc.) as soon as possible in digital form to zipf@uni-heidelberg.de.

We are looking forward to your application!

Featured Image: Road network analysis for the driving profiles. (A) Normal conditions before the flood event. (B) Evolution of scores after the floods induced by Cyclone Idai. The lower row shows a close-up of the area surrounding the city of Dondo.

The ability of disaster response, preparedness, and mitigation efforts to assess the loss of physical access to health facilities and to identify impacted populations is key to reducing the humanitarian consequences of disasters. Recent studies use either network- or raster-based approaches to measure accessibility with respect to travel time. Our analysis compares a raster- and a network-based approach, both built on open data, with respect to their ability to assess the loss of accessibility due to a severe flood event. As our analysis uses open-access data, the approach should be transferable to other flood-prone sites to support decision-makers in the preparation of disaster mitigation and preparedness plans. This research was led by our intern Sami Petricola, with support and supervision from HeiGIT and the GIScience research group. The results are now openly available in the International Journal of Health Geographics.

Petricola, S., Reinmuth, M., Lautenbach, S. et al. Assessing road criticality and loss of healthcare accessibility during floods: the case of Cyclone Idai, Mozambique 2019. Int J Health Geogr 21, 14 (2022). https://doi.org/10.1186/s12942-022-00315-2

Image: History analysis of OSM contributions: number of objects and active users over time. Flooded regions represent the study area; other regions represent the rest of the country. The dashed line marks the impact date of the cyclone in March 2019.

The paper identifies a difference of 300,000 residents who lose access to health care facilities depending on the accessibility method used. This discrepancy was related to the incomplete mapping of road networks and affected the network-based approach to a higher degree. The novel modified centrality indicator we presented allowed us to identify road segments most likely to be affected by flooding and to highlight potential alternative roads in disaster situations.

The different results obtained between the raster- and network-based methods indicate the importance of data quality assessments in addition to accessibility assessments as well as the significance of fostering mapping campaigns in large parts of the Global South. Data quality is therefore a key parameter when deciding which method is best suited to local conditions. Another noteworthy aspect is the required spatial resolution of the results. Identification of critical segments of the road network provides essential information to prepare for potential disasters.
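The exact modified centrality indicator is described in the paper; as a generic illustration of the underlying idea, the sketch below computes travel-time-weighted edge betweenness centrality on a toy road graph with NetworkX and compares it before and after removing a flooded segment. The graph, weights, and flooded edge are invented for the example.

```python
import networkx as nx

# Toy road graph: nodes are junctions, edge weights are travel times in minutes
# (topology and weights are invented for this example).
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 4), ("B", "C", 3), ("C", "D", 5),
    ("A", "E", 6), ("E", "D", 4), ("B", "E", 2),
])

# Baseline criticality: edges that carry many shortest paths.
baseline = nx.edge_betweenness_centrality(G, weight="weight")

# Simulate the flood by removing an affected segment and recompute.
flooded = G.copy()
flooded.remove_edge("B", "E")
after = nx.edge_betweenness_centrality(flooded, weight="weight")

# Edges whose criticality increases are candidates for alternative routes.
for edge, value in sorted(after.items(), key=lambda kv: -kv[1]):
    before = baseline.get(edge, baseline.get(edge[::-1], 0.0))
    print(edge, round(before, 3), "->", round(value, 3))
```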

Previous related work:

An approach for the automatic characterization of surface activities from large 4D point clouds is presented in a new paper by Daan Hulskemper et al., a collaboration between the 3DGeo research group and the departments of Geoscience and Remote Sensing and Coastal Engineering at TU Delft.

Hulskemper, D., Anders, K., Antolínez, J. A. Á., Kuschnerus, M., Höfle, B., & Lindenbergh, R. (2022). Characterization of Morphological Surface Activities derived from Near-Continuous Terrestrial LiDAR Time Series. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLVIII-2/W2-2022, pp. 53-60. doi: 10.5194/isprs-archives-XLVIII-2-W2-2022-53-2022.

Abstract:

The Earth’s landscapes are shaped by processes eroding, transporting and depositing material over various timespans and spatial scales. To understand these surface activities and mitigate potential hazards they inflict (e.g., the landward movement of a shoreline), knowledge is needed on the occurrences and impact of these activities. Near-continuous terrestrial laser scanning enables the acquisition of large datasets of surface morphology, represented as three-dimensional point cloud time series. Exploiting the full potential of this large amount of data, by extracting and characterizing different types of surface activities, is challenging. In this research we use a time series of 2,942 point clouds obtained over a sandy beach in The Netherlands (Vos et al., 2022). We investigate automated methods to extract individual surface activities present in this dataset and cluster them into groups to characterize different types of surface activities (e.g., intertidal sandbar deposition, anthropogenic beach nourishments). We show that, first extracting 2,021 spatiotemporal segments of surface activity as 4D objects-by-change (4D-OBCs; Anders et al., 2021), and second, clustering these segments with a Self-organizing Map (SOM) in combination with hierarchical clustering, allows for the unsupervised identification and characterization of different types of surface activities present on a sandy beach. The SOM enables us to find events displaying certain type of surface activity, while it also enables the identification of subtle differences between different events belonging to one specific surface activity. Hierarchical clustering then allows us to find and characterize broader groups of surface activity, even if the same type of activity occurs at different points in space or time.
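For orientation, a minimal sketch of the general SOM-plus-hierarchical-clustering idea is shown below, using the MiniSom library and SciPy on invented segment feature vectors; it is not the authors’ pipeline, and the map size, feature set, and cluster count are assumptions.

```python
import numpy as np
from minisom import MiniSom                      # pip install minisom
from scipy.cluster.hierarchy import linkage, fcluster

# Placeholder feature vectors describing extracted surface-activity segments
# (e.g. duration, magnitude, area); the values are invented for the example.
rng = np.random.default_rng(0)
features = rng.normal(size=(2021, 6))

# Train a small self-organizing map on the segment features.
som = MiniSom(8, 8, features.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(features, 5000)

# Represent each segment by the weight vector of its best-matching unit.
bmu_weights = np.array([som.get_weights()[som.winner(f)] for f in features])

# Hierarchical clustering of the SOM prototypes groups broader activity types.
Z = linkage(bmu_weights, method="ward")
clusters = fcluster(Z, t=6, criterion="maxclust")
print(np.bincount(clusters))
```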

The work will be presented at the Optical 3D Metrology (O3DM) Workshop in Würzburg on 15 December 2022. We are looking forward to meeting some of you there!

If you are interested in related work on monitoring of surface activity on a sandy beach, check out:

Anders, K., Winiwarter, L., Mara, H., Lindenbergh, R., Vos, S. E., & Höfle, B. (2021). Fully automatic spatiotemporal segmentation of 3D LiDAR time series for the extraction of natural surface changes. ISPRS Journal of Photogrammetry and Remote Sensing, 173, pp. 297-308. doi: 10.1016/j.isprsjprs.2021.01.015.

Vos, S., Anders, K., Kuschnerus, M., Lindenbergh, R., Höfle, B., Aarninkhof, S., & de Vries, S. (2022). A high-resolution 4D terrestrial laser scan dataset of the Kijkduin beach-dune system, The Netherlands. Scientific Data, 9 (1), pp. 191. doi: 10.1038/s41597-022-01291-9.

More information on the method of 4D objects-by-change can be found on this website: https://www.uni-heidelberg.de/4dobc

Find out more about the CoastScan project here: https://coastscan.citg.tudelft.nl/

Featured Photo: Ohsome dashboard interface for Heidelberg, Germany.

In the words of Confucius, “The man who moves a mountain begins by carrying away small stones.” As we release OSHDB (OpenStreetMap History Database) Version 1.0, we look back at versions 0.5, 0.6, 0.7, and all the other small improvements to our historical OpenStreetMap database as the small stones allowing us to move the mountain.

The mountain itself was identified by our Scientific and Managing Director Prof. Dr. Alexander Zipf back in 2010. According to Benjamin Herfort, HeiGIT’s current product owner for the ohsome and big data team, researchers at the time faced significant hurdles if they intended to study OSM data over time. While it was fairly straightforward to analyze contemporary developments in OSM, creating a framework for looking at the evolution of the data proved complicated. “Every researcher had to find their own setup and their own way of crunching OSM data. That’s what we wanted to change.”

Prof. Zipf led the team in setting up a server so that researchers could probe historic OSM data without worrying about any setup and processes besides their study focus. With the release of OSHDB, users of historic OSM data were no longer required to be computer scientists, database engineers, or familiar with running a cluster. OSHDB would do that job for them through an intuitive API.

Since then, the changes to OSHDB have been small and meaningful, focusing on creating proper software through improved internal documentation and testing. Instead of adding any crazy features, releases have concentrated on well-functioning software to enable accessible analysis. Now, after five years of development and testing, the team is releasing OSHDB Version 1.0.

This version continues to allow users to visualize and explore the amount of data and contributions to OSM over time starting from the beginning of OSM itself. Not only are the features in the OSM History Data interesting in and of themselves, ranging from country borders to buildings and turn restrictions, but they also facilitate the investigation of data quality, regional quality comparisons, and allow for the computation of aggregated data statistics.

One of the most important aspects of the database is its usability. Two clicks on the ohsome dashboard allow any researcher, journalist, or citizen scientist to view the evolution of OSM data over time for any region. This temporal change can inform us about data quality as we evaluate saturation and check for currentness. In the case of Heidelberg, for example, the number of buildings has not significantly increased since 2012, at which point we may say that Heidelberg became saturated and data in the area is likely of a high quality. To draw such conclusions, we would also like to check for currentness, looking at when entities were changed or added. For more information on this application, make sure to read our blog post on the topic.
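A minimal sketch of such a saturation check against the public ohsome API (which is backed by the OSHDB) might look like this; the request follows the documented /elements/count resource, while the bounding box for Heidelberg is only approximate.

```python
import requests

# Yearly snapshots of the building count for an approximate Heidelberg extent.
response = requests.post(
    "https://api.ohsome.org/v1/elements/count",
    data={
        "bboxes": "8.62,49.37,8.76,49.45",      # rough bounding box of Heidelberg
        "time": "2008-01-01/2023-01-01/P1Y",    # one snapshot per year
        "filter": "building=* and geometry:polygon",
    },
    timeout=60,
)
response.raise_for_status()

# A flattening curve of these counts over time hints at saturation.
for snapshot in response.json()["result"]:
    print(snapshot["timestamp"], int(snapshot["value"]))
```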

Graphs: Saturation Indicators for Heidelberg, Germany over time using OSHDB.

With our simple API, constructed with a wide range of analysis queries in mind, users can work with a lossless dataset that includes deletions of past objects as well as erroneous and partially incomplete data. Data can be viewed as snapshots at specific points in time or as the full history of a region, tag, or entity type, allowing for endless use cases and meeting a wide range of user needs.

The full list of changes is available on GitHub, with most improvements occurring in the “boring bits” that users will only notice through a smoother analysis process. We would like to highlight one change, however, that will prove useful to many researchers: new OSHDB filters allow practitioners to filter entities by the shape of their geometry. This feature targets a common discussion in the OSM community: how to deal with imperfectly mapped objects.

When beginners add objects to OSM, they may make minor mistakes, for example by not properly aligning the edges of rectangular buildings. These small errors feed into data quality considerations, a major application of OSHDB. With Version 1.0, users can now filter for rectangular or not-perfectly-rectangular objects and can thus identify distorted shapes in OSM. Filter methods like this allow researchers to comment on the quality of OSM data for a region over time.
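The exact OSHDB filter syntax is documented in the release notes; as a generic illustration of the idea of flagging not-quite-rectangular footprints, the following shapely sketch compares a building polygon to its minimum rotated rectangle. The coordinates and threshold are invented and do not reflect the filter’s definition.

```python
from shapely.geometry import Polygon

def rectangularity(polygon: Polygon) -> float:
    """Ratio of the polygon's area to the area of its minimum rotated rectangle."""
    return polygon.area / polygon.minimum_rotated_rectangle.area

# A slightly distorted "rectangular" building footprint (coordinates invented).
building = Polygon([(0, 0), (10, 0.4), (10.2, 6), (0.1, 5.8)])

score = rectangularity(building)
print(round(score, 3))
if score < 0.95:  # arbitrary threshold, for illustration only
    print("Footprint deviates noticeably from a rectangle.")
```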

This filter method along with the many behind-the-scenes changes in this version and over the evolution of OSHDB contribute to the accessible database that allows researchers to easily interpret data in a way that was not possible in 2010, when OSHDB was only a vision in the minds of our team members. With the release of Version 1.0, we’re proud to offer a simple tool for the important task of data quality analysis. We look forward to carrying away many more small stones in the months and years to come.

Video: Evolution of OSM road network mapping in Heidelberg, Germany since 2007.

In the next few days, we will offer a range of content to celebrate and expand upon this release including demos, features, use cases, and insights into the development process. Keep up with us through our blogs and social media channels!


Climate change as well as border and water conflicts pose enormous challenges for current and future generations. In all of these areas and beyond, geography offers unique opportunities to develop targeted solutions.

Together with the Geography student council (Fachschaft Geographie), HeiGIT (Heidelberg Institute for Geoinformation Technology), GeoDACH e.V., and other international partner organisations, the Institute of Geography at Heidelberg University wants to draw attention to the discipline of geography and its potential.

Geography Awareness Week, proclaimed by National Geographic, takes place in the week of 14-19 November 2022. Events such as mapathons, keynote lectures, and gatherings for social exchange or career insights highlight the interdisciplinary significance of the discipline and the opportunities it offers. People outside everyday university life are explicitly invited as well.

The broad application of geographical knowledge in everyday life plays a decisive role in the events. Compared to other disciplines, geography, with its wide range of topics from geomorphology through economic and political questions to computer science, offers the opportunity to build bridges between disciplines. It therefore plays a central role in jointly tackling pressing questions of the future.

Programme:

  • Tue.-Thu., all day, INF 348: Plant exchange. Plants and plant seeds are collected in the foyer of the Institute of Geography at Im Neuenheimer Feld 348. Anyone who brings a plant may take one that is already there.

  • Tue., 11 am, STAR at Berliner Straße 48: Q&A “Volunteering and non-profit activities in a geographical context”. GeoDACH e.V., the representation of German-speaking geography students, introduces itself and offers a forum for informal exchange for people who are already engaged or want to get involved. Together with HeiGIT, non-profit activities in geography will be discussed.
  • Tue., 7 pm, INF 227: For a small donation to the Heidelberg Geographical Society (HGG), those interested can attend a lecture by Frank Keppler, Professor of Biogeochemistry at the Institute of Earth Sciences at Heidelberg University. The lecture, titled “Methane: energy carrier, climate gas and bioactive substance”, is part of the HGG lecture series “Hothouse Earth”.
  • Wed., 2 pm, lecture hall at Berliner Straße 48: Interactive lecture “Experiments in economic geography” by Johannes Glückler, Professor of Economic and Social Geography at the Institute of Geography at Heidelberg University. Besides students, this interactive event is aimed in particular at prospective students visiting Heidelberg University during the study information day.
  • Thu., 11 am, seminar room 015, INF 348: Interactive lecture “The relevance of space: geographical analyses in environment, development and health” by apl. Prof. Dr. Sven Lautenbach. The lecture will highlight transdisciplinarity as a strength of geography and present concrete application areas of geographical methods.
  • Thu., 5 pm, lecture hall at Berliner Straße 48: Video lecture “Klimawissen Student LAB” with a subsequent Q&A with students and lecturers. Geography Awareness Week will conclude with a short video lecture followed by an informal get-together with food and drinks. Everyone interested is invited.

More information.

Organisations involved in Geography Awareness Week in Heidelberg:

Poster #EUGAW2022 in full resolution.

The municipality of Sandhausen (Baden-Württemberg) owes its name to the inland dune located in the area of the village. In 2021 and 2022, the 3DGeo group of Heidelberg University conducted UAV-based and ground-based surveys of three areas of the inland dune of Sandhausen to acquire 3D point clouds and orthophotos. The dataset is freely and openly accessible in the PANGAEA data repository:

Weiser, Hannah; Winiwarter, Lukas; Zahs, Vivien; Weiser, Peter; Anders, Katharina; Höfle, Bernhard (2022): UAV-Photogrammetry, UAV laser scanning and terrestrial laser scanning point clouds of the inland dune in Sandhausen, Baden-Württemberg, Germany. PANGAEA, https://doi.org/10.1594/PANGAEA.949228


View of the point cloud of Zugmantel-Bandholz in the potree renderer

The inland dune formed during the last glacial period (Würm) from sands drifting out of the Rhine Valley. The age of the dune is estimated at around 10,000 to 15,000 years. After the glaciation, the dunes were forested and experienced little change for a long time. Only in the High Middle Ages were the dunes partially deforested for agriculture, and intensive use as pasture forest led to partial destruction. In 1950, the areas were placed under nature protection. In the protected areas, the steppe vegetation could be preserved or re-established; today it is considered a botanical peculiarity and floristic rarity. The fauna of the inland dune is also remarkable and worthy of protection. Particularly among the insects, there are a number of specialists that otherwise occur only very rarely.

Various nature protection measures are undertaken, such as species registration and monitoring, mowing, grazing, and installing fences to limit disturbances. In particular, climate change, with increasing heat and dry periods, influences the character of the dunes and leads to a shift of Mediterranean species towards Sandhausen, but also to the loss of other species.

Our dataset captures the current state of the inland dune in 2021 and 2022, in particular the topography and vegetation cover in different seasons of the year. This supports the monitoring of the area and the improvement of protection measures. The products can also be helpful in public outreach and environmental education.

We surveyed three dune areas in Sandhausen:

Our dataset encompasses:

  • UAV-based photogrammetric point clouds
  • UAV-based photogrammetric orthophotos
  • UAV images which were used to create the point clouds and orthophotos
  • Metadata on the images and the photogrammetric processing, including locations of ground control points
  • Terrestrial laser scanning point clouds
  • UAV-borne laser scanning point clouds
Check out the point clouds of Zugmantel-Bandholz in our potree viewer!
German website of the inland dune: http://duene-sandhausen.de/
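If you prefer to work with the downloaded files rather than the web viewer, a minimal sketch for inspecting one of the point cloud products with laspy could look like this; the file name is a placeholder for whichever product you download from PANGAEA.

```python
import numpy as np
import laspy  # .laz files additionally require the lazrs or laszip backend

# Placeholder file name for one of the downloaded point cloud products.
las = laspy.read("sandhausen_zugmantel_bandholz_uls.laz")

points = np.column_stack([las.x, las.y, las.z])
print(points.shape)                            # number of points
print(points.min(axis=0), points.max(axis=0))  # bounding box of the scan
```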
