
CSV import and ESRI-ShapeFile conversion #6

Open
TimotheeMichon opened this issue Mar 13, 2017 · 4 comments

Comments

@TimotheeMichon

Hello everybody,

I'd like to share an alternative solution I found to import and display CSV files that have latitude and longitude attributes (or equivalent) as coordinates.

Note that a tool already exists to import such data: the Converter's vector data sources can be used to build a table in PostgreSQL. You just have to check the SQL option and type the following request:
select *, ST_SetSRID(ST_Point(COL_X, COL_Y), 4326) as wkb_geometry from table_name
where table_name is the CSV file name, COL_X and COL_Y are the column names to use for building the geometry, and 4326 is the SRID matching your coordinates (e.g. 4326 for WGS84).
The CSV file is then imported into PostgreSQL.
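Before running the request above, it can help to check that the coordinate columns actually parse as numbers, otherwise the geometry building will fail. A minimal Python sketch (the helper name and the sample file are mine, not part of MapMint or PostGIS):

```python
import csv

def check_coordinate_columns(path, x_col, y_col):
    """Return the CSV line numbers whose coordinate columns do not parse as floats."""
    bad = []
    with open(path, newline="") as f:
        # start=2 because line 1 of the file is the header row
        for lineno, row in enumerate(csv.DictReader(f), start=2):
            try:
                float(row[x_col])
                float(row[y_col])
            except (KeyError, TypeError, ValueError):
                bad.append(lineno)
    return bad
```

An empty result means every row is safe to feed to ST_Point.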

Here I wanted to convert my CSV file to an ESRI Shapefile. The solution I found also makes use of ogr2ogr.
Assume you have a comma-separated CSV file data.csv with two columns containing coordinates in a given SRS (e.g. WGS84, IGNF:LAMB93, etc.).

According to the ogr2ogr csv documentation, you need to specify which fields contain the geometry in a VRT file:

<OGRVRTDataSource>
    <OGRVRTLayer name="data">
        <SrcDataSource>data.csv</SrcDataSource>
        <GeometryType>wkbPoint</GeometryType>
        <LayerSRS>IGNF:LAMB93</LayerSRS>
        <GeometryField encoding="PointFromColumns" x="COL_X" y="COL_Y"/>
    </OGRVRTLayer>
</OGRVRTDataSource>

where data.csv is the CSV file name, IGNF:LAMB93 the SRS, and COL_X and COL_Y the column names to use for building the geometry.

Save this as a file with the .vrt extension (e.g. data.vrt) and use it as the source:
ogr2ogr -f "ESRI Shapefile" data_shapeFile.shp data.vrt
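If you do this for many CSV files, the VRT wrapper can also be written programmatically. A small Python sketch (the function name and its default SRS are mine, not part of GDAL; it just emits the same XML as above):

```python
def write_vrt(csv_path, x_col, y_col, srs="IGNF:LAMB93"):
    """Write a GDAL VRT next to csv_path exposing x_col/y_col as a point geometry."""
    # layer name = file name without directory and without the .csv extension
    layer = csv_path.rsplit("/", 1)[-1][:-4]
    vrt = f"""<OGRVRTDataSource>
    <OGRVRTLayer name="{layer}">
        <SrcDataSource>{csv_path}</SrcDataSource>
        <GeometryType>wkbPoint</GeometryType>
        <LayerSRS>{srs}</LayerSRS>
        <GeometryField encoding="PointFromColumns" x="{x_col}" y="{y_col}"/>
    </OGRVRTLayer>
</OGRVRTDataSource>
"""
    out = csv_path[:-4] + ".vrt"
    with open(out, "w") as f:
        f.write(vrt)
    return out
```

The returned path can then be passed straight to the ogr2ogr command above.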

The ESRI Shapefile can then be imported like any other .shp in the Distiller.

I hope this comment will help!

Bye

@thareUSGS

thareUSGS commented Mar 13, 2017

We regularly use this simple method too. The only thing I would change is to also create a *.csvt file, which lets you specify whether the incoming fields are strings or numeric. Without it, most extra fields will be mapped to strings.

So say you have a CSV with:
COL_X, COL_Y, elevation, year, comment
34.343, 45.343, 102.23, 2017, GroundPoint

your *.csvt would simply have:
Real, Real, Real, Integer, String

Now upon conversion to Shapefile, applications can use the fields as intended.

There is a good description and example here: http://giswiki.hsr.ch/GeoCSV#CSVT_file_format_specification
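Since the .csvt must sit next to the .csv with the same base name and exactly one type per column, writing both files together avoids them drifting apart. A small illustrative sketch (the helper is mine; the quoted type names follow the GeoCSV examples linked above):

```python
import csv

def write_csv_with_types(base, header, types, rows):
    """Write base.csv with the given header/rows and a matching base.csvt."""
    with open(base + ".csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(header)
        w.writerows(rows)
    with open(base + ".csvt", "w") as f:
        # single line, one quoted type per column, e.g. "Real","Integer","String"
        f.write(",".join(f'"{t}"' for t in types) + "\n")
```

ogr2ogr will then pick up the .csvt automatically when converting base.csv.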

@gfenoy
Member

gfenoy commented Mar 16, 2017

Dear Timothée,
many thanks for your explanation. Nevertheless, since you are using MapMint as the platform to manage your GIS data, I personally see a better way to do the exact same thing.

The first step would be to activate the Importer module by setting importers=true in the [mm] section of your main.cfg file. This module can handle simple CSV files but also more complex Excel files, where the data can be laid out as lines or in more complex arrangements; see the screenshot below for a more complex use case. Note that this MapMint module also supports MDB databases and helps you extract the data you need from a big .mdb file.

screenshot
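For reference, the setting in question lives in the [mm] section of main.cfg (other keys in that section depend on your installation and are omitted here):

```ini
[mm]
importers=true
```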

In your case you may use the "Use Column Definitions" button. Indeed, you should see the correct field names in your case, rather than the FieldX placeholders I have in mine. Once you have pressed this button, the table panel should display all your field names, and you can choose the type of each field. Once this has been done, you can press the "Import" button and then access your table like any other in your main PostgreSQL database. Go back to your Distiller to consult your main database. At this step your new table won't appear as a geographic table, as it does not contain any geographic field yet. So from here, press the db button to access your standard db. There you can add a geographic field and decide which fields should be used to create it.

From here you can access the HTML form that lets you invoke the Ogr2Ogr WPS service from your Distiller to create a shapefile or any other format supported by GDAL.

I hope my explanation is clear and helps you handle your work directly from MapMint.

Note that this module was not designed for simple files like CSV, but I tried it with your CSV file and it worked perfectly fine. I simply mention it here because there can be more complex cases you may be forced to deal with, and this path may then be the right one to follow.

Best regards.

@TimotheeMichon
Author

Dear Gerald,

Your solution seems to be perfect for me.

The modification of the file /home/src/mapmint/mapmint-services/main.cfg works properly. It now gives me access to the Importer module.

However, I have some issues with the management of CSV files.

As shown in the following picture, the file upload is correct.
screenshot

I can also press the "Use Column Definitions" button, but nothing seems to happen... Furthermore, I am not able to choose the type of each field.
When I try to save with the Save button, the following error message appears:

Unable to run the Service. The message returned back by the Service was the following:

An error occured when processing your request: relation "mm_tables.pages" does not exist
LINE 1: INSERT INTO mm_tables.pages (name,tablename,type,ofield,otyp...
                    ^
Traceback (most recent call last):
  File "/home/src/mapmint/mapmint-services/np/service.py", line 3085, in insert
    cur.execute(req)
ProgrammingError: relation "mm_tables.pages" does not exist
LINE 1: INSERT INTO mm_tables.pages (name,tablename,type,ofield,otyp...
                    ^

Might you know where this is coming from?

Many thanks

@gfenoy
Member

gfenoy commented Mar 17, 2017

Dear Timothée,
many thanks for following up on this.

First of all, you should import the tables.sql file into your PostgreSQL database. This file can be found in the template/sql directory. It is required by the Importer module.

Also, in your case you should choose Line in the "Choose a type" select list, since your data is inline rather than spread across the spreadsheet.

So here is what you should see from the Importer module.

screenshot

The "Use Column Definitions" button should be pressed only once you have selected the Line type.

I hope this can solve your issue.

Best regards.
