
Maintain a PostgreSQL database using osm2pgsql


Question

Hi there, good day dear friends,

I am trying to maintain a PostgreSQL database using osm2pgsql.

The idea: I want to track the changes made by the daily replication files. Do I need to create triggers that fire after DELETE, UPDATE and INSERT, or should I run a daily update process instead?
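For the trigger approach, a minimal audit-trigger sketch in PL/pgSQL could look like this. This is only a sketch: the osm_changes table and the function name are my own invention, it assumes PostgreSQL 11+ (for EXECUTE FUNCTION) and the default osm2pgsql table planet_osm_point. Also keep in mind that osm2pgsql in --append mode applies diffs as deletes plus re-inserts, so such triggers will fire for every touched object during an update.

```sql
-- Hypothetical audit table (name is my own choice)
CREATE TABLE osm_changes (
    change_time timestamptz NOT NULL DEFAULT now(),
    operation   text        NOT NULL,
    osm_id      bigint
);

CREATE OR REPLACE FUNCTION log_osm_change() RETURNS trigger AS $$
BEGIN
    IF TG_OP = 'DELETE' THEN
        INSERT INTO osm_changes (operation, osm_id) VALUES (TG_OP, OLD.osm_id);
        RETURN OLD;
    END IF;
    INSERT INTO osm_changes (operation, osm_id) VALUES (TG_OP, NEW.osm_id);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER planet_osm_point_audit
AFTER INSERT OR UPDATE OR DELETE ON planet_osm_point
FOR EACH ROW EXECUTE FUNCTION log_osm_change();
```

The same trigger would have to be attached to planet_osm_line and planet_osm_polygon as well if you want to track all geometry types.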

The setup: I have an OSM dataset in PostgreSQL via osm2pgsql. Now I am trying to query all public buildings in the UK, such as:

 

- hospitals,
- schools,
- fire stations,
- churches,
- gasoline stations, and so on and so forth.

 

Well, in order to do that, I use something like this query:

 

SELECT *
FROM planet_osm_point pop
-- note: gasoline/petrol stations are tagged amenity=fuel in OSM
WHERE pop.amenity IN ('hospital', 'fire_station', 'place_of_worship', 'school', 'fuel');

 

Well, what do you think about this approach? Is it a good method to do so?

 

Looking at the results, it only extracts some of the existing hospitals. I am pretty sure that there are many more. I can see the red plus logo for hospitals that I know exist there, but they do not show up in the query results.

 

Question: How can I include all of these buildings?

 

Assumption: I guess that I am missing those hospitals mapped as areas. We have to run the same query on planet_osm_polygon as well, or we could construct a "union" query. That would come with a real benefit for the request: it gives us the centre points of the polygons in addition to the points we already have.

I guess we need to select amenity = 'hospital' from the polygon table and UNION ALL it with the points; that would give us a few more hospitals. The next step would be to find the node, way or relation ID of a hospital we are still missing and query our database for it.
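The UNION idea could be sketched like this (an untested sketch; it assumes the default osm2pgsql schema, where both planet_osm_point and planet_osm_polygon have osm_id, name and a geometry column called way, and it uses PostGIS' ST_Centroid for the polygon centre points):

```sql
-- Hospitals mapped as points
SELECT osm_id, name, way AS geom
FROM planet_osm_point
WHERE amenity = 'hospital'
UNION ALL
-- Hospitals mapped as areas: take the centroid of the polygon
SELECT osm_id, name, ST_Centroid(way) AS geom
FROM planet_osm_polygon
WHERE amenity = 'hospital';
```

The same pattern would work for the other amenity values from the query above.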


By the way: can I do this with Python, working against the Overpass API?

What about querying OSM data with the Overpass API – but how can we use this data then? One idea and method to download the data is to use the command line tools curl or wget. In order to do this we need to access one of the Overpass API endpoints; the one we will use goes by the format http://overpass-api.de/api/interpreter?data=query. Using curl we can download the OSM XML of our query by running the command

 

curl --globoff -o output.xml "http://overpass-api.de/api/interpreter?data=node(1);out;"

 

Well, the previously crafted query comes after data= and needs to be urlencoded. The --globoff flag is important so that square and curly brackets are not interpreted by curl (the quotes keep the shell from interpreting the semicolons). This query returns the following XML result:
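Regarding the Python question above: the same request can be built with the standard library alone. Here is a small sketch (overpass_url is my own helper name; urllib.parse.quote does the urlencoding that curl otherwise leaves to us):

```python
from urllib.parse import quote

# The Overpass endpoint from above
OVERPASS_ENDPOINT = "http://overpass-api.de/api/interpreter"

def overpass_url(query):
    """Build a full Overpass API URL with the query urlencoded."""
    return OVERPASS_ENDPOINT + "?data=" + quote(query, safe="")

url = overpass_url("node(1);out;")
print(url)  # brackets and semicolons come out percent-encoded
# To actually download: urllib.request.urlopen(url).read()
```

From there the XML (or JSON, see below) can be fed into any parser you like.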

 

<?xml version="1.0" encoding="UTF-8"?>
<osm version="0.6" generator="Overpass API 0.7.54.13 ff15392f">
<note>The data included in this document is from www.openstreetmap.org.
      The data is made available under ODbL.</note>
<meta osm_base="2018-02-24T21:09:02Z"/>
  <node id="1" lat="61.4779481" lon="-0.0014863">
    <tag k="historic" v="memorial"/>
    <tag k="memorial" v="stone"/>
    <tag k="name" v="gasoline-station at the corner"/>
  </node>
</osm>

 

Regarding the methods to get the data: which method is better and more appropriate?

 

Regarding the formats, we have to say: there are various output formats to choose from in the documentation. In order to download the query result as JSON we need to add [out:json]; to the beginning of our query, as in the command:

 

curl --globoff -o output.json "http://overpass-api.de/api/interpreter?data=[out:json];node(1);out;"

...giving us the previous XML result in JSON format. You can also test the query in the browser by accessing

http://overpass-api.de/api/interpreter?data=[out:json];node(1);out;.
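To continue the Python angle with the JSON output: the response is a dict with an "elements" list, where each element carries a "tags" dict. Here is a small parsing sketch on a hand-made sample that mirrors the node above (element_names is my own helper name):

```python
import json

# Hand-made sample shaped like an Overpass [out:json] response,
# mirroring the node(1) example above.
sample = json.loads("""
{
  "version": 0.6,
  "elements": [
    {
      "type": "node",
      "id": 1,
      "lat": 61.4779481,
      "lon": -0.0014863,
      "tags": {
        "historic": "memorial",
        "memorial": "stone",
        "name": "gasoline-station at the corner"
      }
    }
  ]
}
""")

def element_names(overpass_json):
    """Collect the name tag of every element that has one."""
    return [el["tags"]["name"]
            for el in overpass_json.get("elements", [])
            if "name" in el.get("tags", {})]

print(element_names(sample))
```

With a real response you would replace the sample with json.loads() over the downloaded bytes.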

which way would you go?

 

I look forward to hearing from you.

Edited by tarifa

1 answer to this question



Update: I have found a good and solid manual that describes how to get started, "OSM to PostGIS – The Basics":

 

https://www.cybertec-postgresql.com/en/open-street-map-to-postgis-the-basics/

  Quote

OSM to PostGIS – The Basics
Ever wondered how to import OSM (OpenStreetMap) data into PostGIS [1] for the purpose of visualization and further analytics? Here are the basic steps to do so. There are a bunch of tools on the market – osm2pgsql, imposm, ogr2ogr, just to mention some of them. In this article I will focus on osm2pgsql [2].

Let’s start with the software prerequisites. PostGIS comes as a PostgreSQL database extension, which must be installed in addition to the core database. Up till now, the latest PostGIS version is 3, which was released some days ago. For the current tasks I utilized PostGIS 2.5 on top of PostgreSQL 11.
This brings me to the basic requirements for the import – PostgreSQL >= 9.4 and PostGIS 2.2 are required, even though I recommend installing PostGIS >=2.5 on your database;  it’s supported from 9.4 upwards. Please consult PostGIS’ overall compatibility and support matrix [3] to find a matching pair of components.

Osm2pgsql Setup
Let’s start by setting up osm2pgsql on the OS of your choice – I stick to Ubuntu 18.04.4 Bionic Beaver and compiled osm2pgsql from source to get the latest updates.

Install required libraries

sudo apt-get install make cmake g++ libboost-dev libboost-system-dev \
libboost-filesystem-dev libexpat1-dev zlib1g-dev \
libbz2-dev libpq-dev libproj-dev lua5.2 liblua5.2-dev
Grab the repo

git clone https://github.com/openstreetmap/osm2pgsql.git
Compile

mkdir build && cd build
cmake ..
make
sudo make install
If everything went fine, I suggest checking the resulting binary and its release by executing

./osm2pgsql --version
osm2pgsql version 1.0.0 (64 bit id space)

 

Data acquisition

In the world of OSM, data acquisition is a topic of its own, and worth writing a separate post discussing different acquisition strategies depending on business needs, spatial extent and update frequency. I won’t go into details here; instead, I’ll just grab the osm data for my preferred area directly from Geofabrik, a company offering data extracts and related daily updates for various regions of the world. This can be very handy when you are just interested in a subregion and therefore don’t want to deal with splitting the whole planet osm depending on your area of interest – even though osm2pgsql offers the possibility to hand over a bounding box as a spatial mask. As a side note – osm data’s features are delivered as lon/lat by default.

So let’s get our hands dirty and fetch a pbf of your preferred area from Geofabrik’s download servers [4] [5]. For a quick start, I recommend downloading a dataset covering a small area:

wget https://download.geofabrik.de/europe/iceland-latest.osm.pbf


 

Here is another guide – Part 1: Loading OpenStreetMap data into PostGIS: An Almost Idiot's Guide
https://www.bostongis.com/PrinterFriendly.aspx?content_name=loading_osm_postgis
