Re: [mkgmap-dev] DEM-file generation from geo-tiff sources ?

In my opinion your data sources are very interesting. Here is a short look at the Alps, tile E40N20 from https://land.copernicus.eu/. There seems to be a small problem near some coastlines, but on the other hand we get a high resolution of 25m. I believe a bigger problem is the coordinate reference system: the Copernicus data uses EPSG:3035, while Garmin uses EPSG:4326.

It would be nice if mkgmap could read GeoTIFFs and handle different coordinate reference systems. That would perhaps combine the interpolation for the CRS change with the interpolation for the Garmin map into one step. On the other hand, this is a lot of mathematical work. I'm not a Java hacker and I don't know whether there are usable geographic libraries for this.

At the moment we have to reproject the TIFFs from EPSG:3035 to EPSG:4326, then segment them into 1°x1° TIFFs and finally convert them to HGT files. This should not be a big problem with the GDAL tools, but it is additional work.

Frank
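As a sketch of the pipeline Frank describes, using the gdalwarp/gdal_translate calls that appear later in this thread (file names are examples):

# 1. reproject one Copernicus tile from EPSG:3035 to EPSG:4326
gdalwarp -s_srs EPSG:3035 -t_srs EPSG:4326 eu_dem_v11_E40N20.TIF E40N20_wgs84.tif
# 2. cut out one 1°x1° square (-projwin takes ulx uly lrx lry)
gdal_translate -projwin 10 48 11 47 E40N20_wgs84.tif N47E010_cut.tif
# 3. convert the square to the HGT format (SRTM1 tiles are 3601x3601 pixels)
gdal_translate -of SRTMHGT -outsize 3601 3601 N47E010_cut.tif N47E010.hgt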

Frank Stinner wrote
At the moment we have to reproject the TIFFs from EPSG:3035 to EPSG:4326, then segment them into 1°x1° TIFFs and finally convert them to HGT files. This should not be a big problem with the GDAL tools, but it is additional work.
Hi Frank,

thanks for the hint. It was a long process (some hours), but in the end I succeeded in converting 4 of the giant Copernicus TIFF tiles into 627 small (1°x1°) HGT files.

1. I joined the 4 tiles into one monster file using gdal_merge.
2. I converted the coordinate system with "gdalwarp -s_srs EPSG:3035 -t_srs EPSG:4326".
3. With a small shell script the file was cut into 1°x1° tiles using gdal_translate, e.g. "gdal_translate -strict -q -eco -projwin 10 48 11 47" for the N47E010 tile.
4. With another script the tiles were converted from TIF to HGT format, also using gdal_translate: "gdal_translate -strict -q -eco -of SRTMHGT -outsize 3601 3601".

The scripts I used are only quick'n'dirty hacks; see the sketch below. A few notes on problems I had to work around:

- The coordinate system used by Copernicus does not follow latitude/longitude degrees. Corner coordinates of eu_dem_v11_E30N30.TIF:

  Upper Left  ( 3000000.000, 4000000.000) ( 12°15'58.58"W, 57°13'49.19"N)
  Lower Left  ( 3000000.000, 3000000.000) (  8° 7'39.52"W, 48°38'23.47"N)
  Upper Right ( 4000000.000, 4000000.000) (  4°25'17.11"E, 58°59'15.42"N)
  Lower Right ( 4000000.000, 3000000.000) (  5°31' 4.07"E, 50° 1'26.83"N)

  In EPSG:3035 coordinates it is a perfect square, but in EPSG:4326 it is a wedge. I decided to join the Copernicus tiles before converting the coordinate system in order not to run into any problems at the original tile borders.
- The HGT files must start precisely at the degree borders, so the tiles need to be cut at the precise "projwin" coordinates.
- Finally, the HGT tiles must be 3601x3601 pixels.

Ciao, Franco
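Franco's cutting scripts are not shown; a minimal sketch of what such a quick'n'dirty script could look like, assuming the merged, reprojected file is called merged_wgs84.tif and covers the chosen ranges (file names and ranges are examples):

#!/bin/sh
# cut 1°x1° squares out of the merged EPSG:4326 file and convert each to HGT
for lat in $(seq 45 48) ; do
  for lon in $(seq 8 12) ; do
    name=$(printf 'N%02dE%03d' $lat $lon)
    gdal_translate -strict -q -eco -projwin $lon $((lat+1)) $((lon+1)) $lat \
        merged_wgs84.tif $name.tif
    gdal_translate -strict -q -eco -of SRTMHGT -outsize 3601 3601 \
        $name.tif $name.hgt
  done
done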

Hi Franco,

I have just downloaded all Copernicus TIFFs. The pure TIFF data is roughly 37GB. My short test was:

gdalbuildvrt all.vrt *.tif
gdalwarp -t_srs EPSG:4326 -r lanczos -ot Int16 -dstnodata -32768 -of GTiff -co "COMPRESS=LZW" -co "TILED=YES" -co "PREDICTOR=2" -ts 3601 3601 -te 11 50 12 51 all.vrt N50E011.tif
gdal4hgt --raster=N50E011.tif --dstpath=.

The first command creates a (virtual) VRT file. This is only an XML file with links to the real TIFs; it runs in a few seconds. The second command cuts the area with longitude 11..12 and latitude 50..51 (in EPSG:4326) out of all.vrt and writes it as an LZW-compressed GeoTIFF with 3601x3601 pixels; each pixel is saved as an Int16 and the resampling method is Lanczos. It also runs in a few seconds. The last command is my own program; it converts the TIF to HGT in a few seconds. The result seems to be OK.

If gdalwarp is run in a loop, it creates a lot of new TIFs. From the new TIFs we can create a new VRT, and gdal4hgt should then convert this new VRT to all non-empty HGTs in one run.

Frank
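A hypothetical bash version of the loop Frank mentions (tile ranges and naming are examples; the gdalwarp options are the ones from his test above):

#!/bin/sh
OPT='-t_srs EPSG:4326 -r lanczos -ot Int16 -dstnodata -32768 -of GTiff -co COMPRESS=LZW -co TILED=YES -co PREDICTOR=2 -ts 3601 3601'
for lat in $(seq 50 52) ; do
  for lon in $(seq 11 13) ; do
    # cut one 1°x1° tile per iteration from the virtual mosaic
    gdalwarp $OPT -te $lon $lat $((lon+1)) $((lat+1)) \
        all.vrt $(printf 'N%02dE%03d' $lat $lon).tif
  done
done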

Hi Frank,

thanks for the new hints, this sounds much simpler than my approach. Especially the "virtual" TIF file should save a lot of time and disk space. But there will still be more than 1000 tiles for all the Copernicus data, and 1000 times a few seconds is still over one hour. Then again, converting 37GB of data simply takes its time.

For the last step you use your own program. Is it faster than gdal_translate? "gdal_translate -strict -q -eco -of SRTMHGT -outsize 3601 3601" should basically do the same job, right?

Ciao, Franco
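For reference, the gdal_translate equivalent Franco refers to, applied to a single tile (assuming it produces the same output as gdal4hgt, which the thread does not verify):

gdal_translate -strict -q -eco -of SRTMHGT -outsize 3601 3601 N50E011.tif N50E011.hgt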

Hi Franco,

I have "played" a little bit with the Copernicus files. I built a Windows command file with this Perl script:

use strict;

my $lonleft   = -24;
my $lonright  = 50;
my $latbottom = 27;
my $lattop    = 64;
my $srcdir    = "../org";
my $delta     = 1 / 3600 / 2;   # half pixel

print "set GDAL_TRANSLATE_OPT=-multi -wo NUM_THREADS=ALL_CPUS -t_srs EPSG:4326 -r near -ot Int16 -dstnodata -32768 -of GTiff -co \"COMPRESS=LZW\" -co \"TILED=YES\" -co \"PREDICTOR=2\" -ts 3601 3601\n";
for (my $lat = $latbottom; $lat != $lattop; $lat++) {
   for (my $lon = $lonleft; $lon != $lonright; $lon++) {
      printf "gdalwarp %%GDAL_TRANSLATE_OPT%% -te %.7f %.7f %.7f %.7f $srcdir/all.vrt %s%02d%s%03d.tif\n",
             $lon - $delta, $lat - $delta, $lon + 1 + $delta, $lat + 1 + $delta,
             'N', $lat >= 0 ? $lat : -$lat,
             $lon >= 0 ? 'E' : 'W', $lon >= 0 ? $lon : -$lon;
   }
}

Yes, I was very lazy with the huge area; that is 2740 1°x1° tiles. But my computer is very diligent ;) . Keep the $delta in mind: HGT files are not exactly 1° wide/high. They are one pixel larger, which means half a pixel on each side. I have tested different interpolation methods and believe "near" is the best (and fastest) one. My computer created all the TIFs in 65 min; I'm sure that strongly depends on the hardware. Surprisingly the new TIFs take only about 4 GB.

Then I create the zipped HGTs with:

gdalbuildvrt all.vrt *.tif
gdal4hgt --raster=all.vrt --dstpath=..\hgt -O

That is very easy but very slow (160 min). By the way, gdal4hgt only creates HGTs with at least one non-nodata value. The resulting 1001 zipped HGTs need 5.2 GB!

You wrote: "gdal_translate -strict -q -eco -of SRTMHGT -outsize 3601 3601" should basically do the same job ... The format SRTMHGT was new to me, but I think you are right. I have not tested the speed.

Frank
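A worked example of the half-pixel margin: $delta = 1/3600/2 ≈ 0.0001389°, so for the south-west-most tile N27W024 the script emits the following call (these are exactly the numbers that show up in Franco's warnings further down the thread):

gdalwarp %GDAL_TRANSLATE_OPT% -te -24.0001389 26.9998611 -22.9998611 28.0001389 ../org/all.vrt N27W024.tif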

Hi Frank,

I tried to adapt your Perl script to my Ubuntu system. I got the following warnings. Here's my Ubuntu script.

Frank Stinner-2 wrote
Hi Franco,
I have "played" a little bit with the Copernicus files.
I built a Windows command file with this Perl script:
-- cut ---
Yes, I was very lazy with the huge area; that is 2740 1°x1° tiles. But my computer is very diligent ;) . Keep the $delta in mind: HGT files are not exactly 1° wide/high. They are one pixel larger, which means half a pixel on each side. I have tested different interpolation methods and believe "near" is the best (and fastest) one. My computer created all the TIFs in 65 min; I'm sure that strongly depends on the hardware.
Let's see how fast my machine will be -- still running
Surprisingly the new TIFs take only about 4 GB.
Then I create the zipped HGTs with:
gdalbuildvrt all.vrt *.tif
gdal4hgt --raster=all.vrt --dstpath=..\hgt -O
That is very easy but very slow (160 min). By the way, gdal4hgt only creates HGTs with at least one non-nodata value. The resulting 1001 zipped HGTs need 5.2 GB!
You wrote: "gdal_translate -strict -q -eco -of SRTMHGT -outsize 3601 3601" should basically do the same job ...
The format SRTMHGT was new to me, but I think you are right. I have not tested the speed.
Frank
Now I will have to find a way to identify the empty TIF files. I think gdal_translate will also create empty tiles, so I will have to remove the empty TIFs before converting them to HGT.

Ciao, Franco
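One possible way to spot all-nodata TIFs before the conversion step (an assumption, not from the thread: gdalinfo -stats emits no STATISTICS_MINIMUM entry when a band contains no valid pixels):

for f in N*.tif ; do
  # if statistics cannot be computed, treat the tile as empty
  if ! gdalinfo -stats "$f" 2>/dev/null | grep -q STATISTICS_MINIMUM ; then
    echo "$f looks empty (all nodata)"
  fi
done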

I don't know why my "raw-text" blocks got eaten by the nabble.com web interface.

franco_bez wrote
I got the following warnings
gdalwarp $GDAL_TRANSLATE_OPT -te -24.0001389 26.9998611 -22.9998611 28.0001389 ../all.vrt N27W024.tif
Creating output file that is 3601P x 3601L.
Warning 6: Driver GTiff does not support "COMPRESS creation option
Warning 6: Driver GTiff does not support "TILED creation option
Warning 6: Driver GTiff does not support "PREDICTOR creation option
Here's my Ubuntu script:
use strict;

my $lonleft   = -24;
my $lonright  = 50;
my $latbottom = 27;
my $lattop    = 64;
my $srcdir    = "..";
my $delta     = 1 / 3600 / 2;   # half pixel

print "#!/bin/sh\n";
#print "GDAL_TRANSLATE_OPT='-multi -wo NUM_THREADS=ALL_CPUS -t_srs EPSG:4326 -r near -ot Int16 -dstnodata -32768 -of GTiff -co \"COMPRESS=LZW\" -co \"TILED=YES\" -co \"PREDICTOR=2\" -ts 3601 3601'\n";
print "GDAL_TRANSLATE_OPT='-multi -wo NUM_THREADS=ALL_CPUS -t_srs EPSG:4326 -r near -ot Int16 -dstnodata -32768 -of GTiff -ts 3601 3601'\n";
for (my $lat = $latbottom; $lat != $lattop; $lat++) {
   for (my $lon = $lonleft; $lon != $lonright; $lon++) {
      printf "gdalwarp \$GDAL_TRANSLATE_OPT -te %.7f %.7f %.7f %.7f $srcdir/all.vrt %s%02d%s%03d.tif\n",
             $lon - $delta, $lat - $delta, $lon + 1 + $delta, $lat + 1 + $delta,
             'N', $lat >= 0 ? $lat : -$lat,
             $lon >= 0 ? 'E' : 'W', $lon >= 0 ? $lon : -$lon;
   }
}

Hi Frank,

so these are the steps I used on my Ubuntu machine:

1. Download the Copernicus data and unzip it.
2. Create the virtual TIF image:

   gdalbuildvrt all.vrt *.TIF

3. Cut the 1°x1° tiles with the Perl script. This results in almost 65GB of uncompressed TIF tiles.
4. Convert every TIF tile to HGT using the following loop:

   for i in N*.tif ; do gdal_translate -strict -q -eco -of SRTMHGT $i hgt/`basename $i tif`hgt ; done
   rm -f *.hgt.aux.xml

5. I found out that the file N27W000.hgt is completely empty, so all tiles with identical content are empty as well. With the following commands I removed the empty tiles:

   mv N27W000.hgt EMPTY.hgt
   for i in N*.hgt ; do if diff EMPTY.hgt $i ; then rm -f $i ; fi ; done
   rm -f EMPTY.hgt

Now I got 25GB in 1001 HGT tiles.

Hi Franco,

I have found a silly error in my area: Europe is a little bit bigger ;) Now I have "$lattop = 70". The split creates 3608 TIFs with 4.5 GB in 85 min. gdal4hgt creates 1171 zipped HGTs with 5.8 GB in 3 h :(( And then I rebuilt the TIFs from the HGTs in 20 min; that is 4 GB.

Perhaps you have a too old GDAL version without LZW compression? Or perhaps

-co \"COMPRESS=LZW\" -co \"TILED=YES\" -co \"PREDICTOR=2\"

would be better written as

-co COMPRESS=LZW -co TILED=YES -co PREDICTOR=2

I have found HGTs with a few deep "holes" (< 0 m), but they seem to be OK (surface mining).

Frank

Frank Stinner-2 wrote
Hi Franco,
I have found a silly error in my area: Europe is a little bit bigger ;) Now I have "$lattop = 70". The split creates 3608 TIFs with 4.5 GB in 85 min. gdal4hgt creates 1171 zipped HGTs with 5.8 GB in 3 h :(( And then I rebuilt the TIFs from the HGTs in 20 min; that is 4 GB.
OK
Perhaps you have a too old GDAL version without LZW compression? Or perhaps
It's the current version of gdal
-co \"COMPRESS=LZW\" -co \"TILED=YES\" -co \"PREDICTOR=2\"
would be better written as
-co COMPRESS=LZW -co TILED=YES -co PREDICTOR=2
That did the trick :-) Thanks a lot.

Now this is the new Perl script for Ubuntu:

---
use strict;

my $lonleft   = -24;
my $lonright  = 50;
my $latbottom = 27;
my $lattop    = 70;
my $srcdir    = "..";
my $delta     = 1 / 3600 / 2;   # half pixel

print "#!/bin/sh\n";
print "GDAL_TRANSLATE_OPT='-multi -wo NUM_THREADS=ALL_CPUS -t_srs EPSG:4326 -r near -ot Int16 -dstnodata -32768 -of GTiff -co COMPRESS=LZW -co TILED=YES -co PREDICTOR=2 -ts 3601 3601'\n";
for (my $lat = $latbottom; $lat != $lattop; $lat++) {
   for (my $lon = $lonleft; $lon != $lonright; $lon++) {
      printf "gdalwarp \$GDAL_TRANSLATE_OPT -te %.7f %.7f %.7f %.7f $srcdir/all.vrt %s%02d%s%03d.tif\n",
             $lon - $delta, $lat - $delta, $lon + 1 + $delta, $lat + 1 + $delta,
             'N', $lat >= 0 ? $lat : -$lat,
             $lon >= 0 ? 'E' : 'W', $lon >= 0 ? $lon : -$lon;
   }
}
---

Ciao, Franco
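A hypothetical usage note (the invocation is never shown in the thread): the Perl script only prints a shell script, so under that assumption it would be run in two steps; the file name is made up here.

perl make_tiles.pl > warp_all.sh
sh warp_all.sh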

Sorry for bringing this back up. I'm having problems converting TIFF to .hgt: how can I run a script that converts all GeoTIFFs (not 25m resolution but 30m) in the folder /home/example/ to .hgt?

Basically I want to fill the voids in SRTM1v30, and for that I need to convert the .hgt files to TIFF for GDAL and then get them back to .hgt. So first I simply run:

for hgtfile in *.hgt ; do gdal_fillnodata.py -md 500 $hgtfile $hgtfile.tif ; done

which void-fills all .hgt files. But GDAL creates rubbish (at least if I feed it to phyghtmap) if I try to void-fill an HGT and write the output directly as HGT, so I need to convert the results back to HGT before I can hand them over to phyghtmap. It should work on a folder covering the whole world between 60°N and 60°S. Best in bash if possible, but Perl is also fine if I understand the steps. (I downloaded the SRTM1 data from EarthExplorer as .hgt - or rather, I will have it in a couple of days; the download is pretty slow even though it is only 85GB zipped...) See the sketch below the quoted mail.

On 7 January 2018 at 13:25, franco_bez <franco.bez@web.de> wrote:
Frank Stinner-2 wrote
Hi Franco,
I have found a silly error in my area: Europe is a little bit bigger ;) Now I have "$lattop = 70". The split creates 3608 TIFs with 4.5 GB in 85 min. gdal4hgt creates 1171 zipped HGTs with 5.8 GB in 3 h :(( And then I rebuilt the TIFs from the HGTs in 20 min; that is 4 GB.
OK
Perhaps you have a too old GDAL version without LZW compression? Or perhaps
It's the current version of gdal
-co \"COMPRESS=LZW\" -co \"TILED=YES\" -co \"PREDICTOR=2\"
would be better written as
-co COMPRESS=LZW -co TILED=YES -co PREDICTOR=2
That did the trick :-)
Thanks a lot.
Now this is the new Perl script for Ubuntu:
---
use strict;
my $lonleft   = -24;
my $lonright  = 50;
my $latbottom = 27;
my $lattop    = 70;
my $srcdir = "..";
my $delta = 1 / 3600 / 2; # half pixel
print "#!/bin/sh\n"; print "GDAL_TRANSLATE_OPT='-multi -wo NUM_THREADS=ALL_CPUS -t_srs EPSG:4326 -r near -ot Int16 -dstnodata -32768 -of GTiff -co COMPRESS=LZW -co TILED=YES -co PREDICTOR=2 -ts 3601 3601'\n"; for (my $lat = $latbottom; $lat != $lattop; $lat++) { for (my $lon = $lonleft; $lon != $lonright; $lon++) { printf "gdalwarp \$GDAL_TRANSLATE_OPT -te %.7f %.7f %.7f %.7f $srcdir/all.vrt %s%02d%s%03d.tif\n", $lon - $delta, $lat - $delta, $lon + 1 + $delta, $lat + 1 + $delta, 'N', $lat >= 0 ? $lat : -$lat, $lon >= 0 ? 'E' : 'W', $lon >= 0 ? $lon : -$lon; } }
---
Ciao, Franco
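A sketch of the round trip Felix describes, under assumptions not confirmed in the thread (that gdal_fillnodata.py writes a GeoTIFF when given a .tif target, as in Felix's own one-liner, and that converting back with -of SRTMHGT yields files phyghtmap accepts; folder and file names are examples):

#!/bin/sh
cd /home/example
mkdir -p filled
for hgtfile in *.hgt ; do
  # 1. void-fill into a temporary GeoTIFF (interpolate across gaps up to 500 px)
  gdal_fillnodata.py -md 500 "$hgtfile" "tmp_$hgtfile.tif"
  # 2. convert the filled raster back to the SRTM1 HGT format (3601x3601);
  #    the SRTMHGT driver needs the NxxEyyy-style file name kept intact
  gdal_translate -strict -q -eco -of SRTMHGT -outsize 3601 3601 \
      "tmp_$hgtfile.tif" "filled/$hgtfile"
  rm -f "tmp_$hgtfile.tif"
done
rm -f filled/*.hgt.aux.xml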
-- Felix Hartman - Openmtbmap.org & VeloMap.org Schusterbergweg 32/8 6020 Innsbruck Austria - Österreich

Hi Frank,

just my opinion: I would keep this separate from mkgmap. I mean, you only need to convert the data once and that's it, and the tools all exist. So I think it's more useful to write documentation on how to convert the Copernicus files into HGT.

Btw, are these files so much better compared to the viewfinder HGT files?

Henning

On 02.01.2018 22:58, Frank Stinner wrote:
In my opinion your data sources are very interesting. Here is a short look at the Alps, tile E40N20 from https://land.copernicus.eu/.
There seems to be a small problem near some coastlines, but on the other hand we get a high resolution of 25m. I believe a bigger problem is the coordinate reference system: the Copernicus data uses EPSG:3035, while Garmin uses EPSG:4326.
It would be nice if mkgmap could read GeoTIFFs and handle different coordinate reference systems. That would perhaps combine the interpolation for the CRS change with the interpolation for the Garmin map into one step. On the other hand, this is a lot of mathematical work. I'm not a Java hacker and I don't know whether there are usable geographic libraries for this.
At the moment we have to reproject the TIFFs from EPSG:3035 to EPSG:4326, then segment them into 1°x1° TIFFs and finally convert them to HGT files. This should not be a big problem with the GDAL tools, but it is additional work.
Frank

Hi Henning,

yes, we have specialized tools for this work, they do a good job, and we should use them. One small but interesting idea: if we could additionally use TIFs as a "container" for DEM data, we could save space on the disk. My little test with the Copernicus files showed 5.2GB for the zipped HGTs but only 3.54GB for the TIFs with LZW compression (roughly 68%). "Container" means: only 16-bit grayscale values, no metadata, no coordinate reference system or whatever. An on-the-fly conversion would be too expensive.

You ask: are the Copernicus files so much better compared to the viewfinder HGT files? How can we decide this? I believe there is no way. I have not found any information about the error margins of the SRTM or viewfinder data. I only know that the Copernicus files have the 25m resolution from Tangier up to the North Cape in Norway.

Frank
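A hedged sketch of the "container" idea (assumption: the GTiff creation option PROFILE=BASELINE, which omits the GeoTIFF georeferencing keys, is close to what Frank means by "no metadata"; file names are examples):

gdal_translate -ot Int16 -of GTiff -co PROFILE=BASELINE -co COMPRESS=LZW \
    N50E011.tif N50E011_bare.tif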

Henning Scholland wrote
Btw, are these files so much better compared to the viewfinder HGT files?
https://www.openstreetmap.org/?mlat=47.8569&mlon=10.6028#map=15/47.8569/10.6...

In the viewfinder_1 data there is a "ghost hill" in my area; the hill does not exist in real life. Oregon screenshot with the ghost hill (VIEW_1): <http://gis.19327.n8.nabble.com/file/t339543/VIEW_1_ein_Huegel.bmp>

Even in the low-resolution viewfinder_3 data there is no hill - correct. Oregon screenshot (VIEW_3): <http://gis.19327.n8.nabble.com/file/t339543/VIEW_3_kein_Huegel.bmp>

The Copernicus data also does not have the ghost hill, but it does have the high resolution. So IMHO the Copernicus data has better quality than the viewfinder data.
participants (5)
- Felix Hartmann
- franco_bez
- Frank Stinner
- Frank Stinner
- Henning Scholland