Posted by tosh 1 day ago

GeoJSON(geojson.org)
159 points | 74 comments
Stratoscope 23 hours ago|
One task where GeoJSON falls down is simplification of a group of polygons with common boundaries, e.g. the 48 conterminous US states. If you start with a highly detailed set of polygons, you need to simplify them for practical display in an online map.

GeoJSON doesn't encode the fact that the boundary points are common between adjacent polygons. When you simplify those polygons, each one is handled separately, and you end up with "slivers" where the boundaries are misaligned:

https://www.bing.com/images/search?q=map+slivers+betwen+poly...

TopoJSON solves this by encoding each such boundary only once. So when you simplify the polygons, they are all done together, and the same simplification applies to adjacent polygons. No more slivers!

https://github.com/topojson/topojson

https://github.com/topojson/topojson-simplify
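The idea can be sketched by hand. Below is a minimal TopoJSON-style Topology with hypothetical coordinates, not produced by any tool: the shared border lives in a single arc, both polygons reference it by index (a reversed traversal is encoded as the ones' complement `~i`), and simplifying that one arc updates both polygons at once:

```typescript
// Minimal TopoJSON-style Topology, written by hand for illustration.
// The shared border appears exactly once, in arcs[0].
const topology = {
  type: "Topology",
  arcs: [
    [[0, 0], [1, 0.1], [2, 0]],         // arc 0: the shared border
    [[2, 0], [2, 1], [0, 1], [0, 0]],   // arc 1: rest of "north"
    [[0, 0], [0, -1], [2, -1], [2, 0]], // arc 2: rest of "south"
  ],
  objects: {
    north: { type: "Polygon", arcs: [[0, 1]] },  // border forward, then arc 1
    south: { type: "Polygon", arcs: [[~0, 2]] }, // border reversed (~0 === -1)
  },
};

// Stitch a ring back together from arc indexes (negative = reversed),
// de-duplicating the join points where arcs meet.
function stitch(arcIndexes: number[]): number[][] {
  const ring: number[][] = [];
  for (const i of arcIndexes) {
    const arc = i >= 0 ? topology.arcs[i] : topology.arcs[~i].slice().reverse();
    for (const p of arc) {
      const last = ring[ring.length - 1];
      if (!last || last[0] !== p[0] || last[1] !== p[1]) ring.push(p);
    }
  }
  return ring;
}

// Simplify the shared border ONCE (drop its interior vertex)...
topology.arcs[0] = [[0, 0], [2, 0]];

// ...and both polygons pick up the identical simplified edge. No sliver.
const north = stitch(topology.objects.north.arcs[0]);
const south = stitch(topology.objects.south.arcs[0]);
```

The real topojson-server, topojson-simplify, and topojson-client libraries do this arc extraction, simplification, and stitching for you.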

Demiurge 22 hours ago||
Is this actually GeoJSON falling down, or decades of convention extended to JSON? Topology is great, but it is sidestepped by Shapefile/WKT/WKB/etc, in favor of independent primitives like POINT, LINE, POLYGON. If GeoJSON did not exist as a new JSON GIS data format encoding these primitives, TopoJSON would not have "replaced" it, due to the added mis-match with other non-topological formats.

From what I can tell, the top criticisms of GeoJSON are the under-enforced winding-order specification and the handling of geometries that cross the antimeridian.

jvanderbot 22 hours ago||
Right. Encoding a union algorithm into the data structure just introduces the reverse problem: Selecting a subset now requires extra logic beyond jq.
Stratoscope 18 hours ago||
Similarly, typical map APIs like the Google Maps API accept GeoJSON and not TopoJSON. I was not suggesting TopoJSON as a replacement for GeoJSON, but as a complement to it. With the tools on the TopoJSON GitHub, you can have GeoJSON input and output, but convert to TopoJSON for the simplification step to avoid the "slivers" problem.
pramsey 21 hours ago|||
GeoJSON is not TopoJSON. Saying that is "falling down" is like criticizing a zebra for not being a giraffe. GeoJSON is a mapping of the (non-topological) "simple features" model into JSON, full stop. It does that fine.
Stratoscope 18 hours ago||
Yes, the same "slivers" problem occurs when you try to simplify features in any format that uses individual polygons, such as shapefiles or whatnot. That's the only case I was referring to.

I don't think I would trust a zebra or a giraffe for this task either.

echoangle 22 hours ago|||
How is that a geojson problem? If your dataset is correct, adjacent borders will just use the same points and will match exactly.
sdenton4 21 hours ago||
The problem is simplification. Suppose two regions share a border with some nonlinear points a, b, c, d. Simplifying the polygon for the first region might yield a, b, d while the second yields a, c, d. This creates gaps or overlaps between the two regions.
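That divergence can be made concrete with a toy example. The simplifier below is deliberately crude (keep every other vertex), purely to make the mismatch visible; real tools use Douglas-Peucker or Visvalingam, but they process each ring independently in just the same way:

```typescript
type Position = [number, number];

// Deliberately crude simplifier: keep every other vertex, plus the
// closing vertex so the ring stays closed. The key point is that each
// ring is processed entirely on its own.
function crudeSimplify(ring: Position[]): Position[] {
  const kept = ring.filter((_, i) => i % 2 === 0);
  const last = ring[ring.length - 1];
  const tail = kept[kept.length - 1];
  if (tail[0] !== last[0] || tail[1] !== last[1]) kept.push(last);
  return kept;
}

// Two adjacent regions; the border a-b-c-d is stored twice in GeoJSON,
// once per polygon, traversed in opposite directions.
const a: Position = [0, 0], b: Position = [1, 0.1],
      c: Position = [2, -0.1], d: Position = [3, 0];
const north: Position[] = [a, b, c, d, [3, 2], [0, 2], a];
const south: Position[] = [d, c, b, a, [0, -2], [3, -2], d];

const key = (p: Position) => p.join(",");
const border = new Set([a, b, c, d].map(key));
const onBorder = (ring: Position[]) =>
  crudeSimplify(ring).map(key).filter((k) => border.has(k));

// North keeps a and c on the border; south keeps d and b: the two
// copies of the border no longer match, leaving gaps and overlaps.
const keptNorth = onBorder(north);
const keptSouth = onBorder(south);
```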
qurren 20 hours ago|||
But what is the border? Set the border to what it actually is, not a simplification of it. The state of Colorado is formally a 697-sided polygon; don't simplify it to a rectangle.
tomrod 20 hours ago|||
This is not what OP is describing. It is very common to simplify geometries to shrink boundary data by orders of magnitude. GeoJSON loses the correlation between shared boundaries when you do that: simplifying country polygons from a GeoJSON source can leave gaps between adjacent borders. So you either accept a poor representation or build a longer pipeline to convert the data into a more amenable form. It also breaks idempotency in some respects.
echoangle 20 hours ago||
To do the simplification, you detect shared borders, simplify, and generate polygons again. That doesn't make topojson inherently superior. You can convert back and forth, and for many applications geojson is easier to process.
Stratoscope 17 hours ago||
Yes, you could write code to do that. Or use the utilities provided in the TopoJSON GitHub and let them do it for you: convert to TopoJSON, simplify, convert back to GeoJSON. They have already written all the code for you.
echoangle 16 hours ago||
Yeah, or you could use Geojson and use https://mapshaper.org/
Stratoscope 17 hours ago||||
It depends on what purpose you are using the polygons for. In an online map you need to simplify way down. Consider these Colorado maps at two different zoom levels:

https://maps.app.goo.gl/JH93ko96QcoLXuBJ9

https://maps.app.goo.gl/au53iTnsmNdFuEZV8

Even the one zoomed in on the state appears to use maybe 15-20 vertices max.

In the second one, if I squint real hard I can just barely make out one slight dogleg on the western border and one on the south. And that is partly because I knew to look for them in the zoomed-in map.

If we use, say, the Census TIGER/Line boundary definitions for the states, we are probably talking about hundreds of thousands of vertices, perhaps millions. You won't be using those in an online map without simplifying.

AlotOfReading 19 hours ago|||
The Texas border with Mexico is formally down the centerline of the Rio Grande, even as the river moves (ignoring fiddly complications). Even if you could somehow take a perfect snapshot of it at a given time, you'd run into the coastline paradox when sampling it.
echoangle 20 hours ago|||
So don’t simplify the shapes on their own. Geojson is a storage and exchange format, you can still convert it to other formats if you want to modify it.
rented_mule 14 hours ago||
I think what the original comment is pointing out is that GeoJSON lacks a concept of a shared boundary. Shared boundaries expressed in GeoJSON get around that by duplicating data. Whenever data is duplicated, there's a risk that the copies will not be exactly the same. That makes the task of modification more challenging given that the real world is full of messy data, like duplicates not matching.

20-25 years ago I worked a lot with map data from otherwise high quality, and sometimes authoritative, sources like the USGS and NOAA that had this non-identical shared boundaries problem (in formats other than GeoJSON). If the format doesn't allow such mistakes to be expressed, then they have to fix their data to publish it in said format.

echoangle 14 hours ago||
Sure, but not every format is useful for everything. Geojson is great if you want a simple way to express a shape to show on a map. It’s like criticizing CSV because people put strings in choice value fields instead of doing a foreign key to another table. That’s just not what the format is used for.
rented_mule 11 hours ago||
I'd take your point further... No format is useful for everything. But we have to be aware of the trade-offs of each format (or language or tool or ...) in order to make the right choice of what to use for a given use case. We do that by sharing knowledge of where a given tool succeeds and where it falls down. Pointing out something a format doesn't handle well is not condemning that format for all use cases (I happily choose GeoJSON over other formats for many things).
NelsonMinar 19 hours ago||
I like TopoJSON and have used it in projects. But it's weird to set it up as opposition to GeoJSON. It's a complement. GeoJSON is a general data format meant to replace uses of ESRI Shapefiles and other complex formats. TopoJSON is more of a solution for a particular application need.

Is there much work developing or using TopoJSON these days? I haven't seen much about it in a few years.

Stratoscope 18 hours ago||
To be clear, I'm not suggesting TopoJSON as an alternative to GeoJSON. I like GeoJSON and was loosely involved with the working group that created and updated its spec.

I'm just saying that for the specific task I mentioned, GeoJSON, or any format that stores polygons individually (such as shapefiles), naturally leads to the "sliver" problem.

A nice processing pipeline is:

1. Convert GeoJSON to TopoJSON.

2. Run the simplification on the TopoJSON.

3. Convert the resulting TopoJSON back to GeoJSON.

The TopoJSON GitHub has tools for each of these steps.

Waterluvian 1 day ago||
I’ve applied GeoJSON (among many other GIS tech) for mapping and monitoring tens of thousands of warehouse robots. It works great as long as you squint just a bit, ignoring that it generally calls for long,lat and is designed with the assumption of a world CRS.

The dangerous part is that some tools fully assume this and will completely screw with calculations if you’re assuming a flatland CRS. So you’ve got to be careful in checking and setting those parameters.

One nice thing is that the structure of GeoJSON works incredibly well in typescript. It has discriminated unions built in so you can walk entire geodatasets in a pretty comfortable way.
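For example, a trimmed-down sketch (only three of GeoJSON's seven geometry types, typed by hand rather than taken from any library):

```typescript
// Trimmed-down GeoJSON geometry types. The "type" field is the
// discriminant, so TypeScript narrows automatically in each branch.
type Point = { type: "Point"; coordinates: [number, number] };
type LineString = { type: "LineString"; coordinates: [number, number][] };
type Polygon = { type: "Polygon"; coordinates: [number, number][][] };
type Geometry = Point | LineString | Polygon;

// Count vertices without any casts: in each case, `coordinates`
// already has the right shape.
function vertexCount(g: Geometry): number {
  switch (g.type) {
    case "Point":
      return 1;
    case "LineString":
      return g.coordinates.length;
    case "Polygon":
      return g.coordinates.reduce((n, ring) => n + ring.length, 0);
  }
}

const square: Geometry = {
  type: "Polygon",
  coordinates: [[[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]]],
};
const n = vertexCount(square); // 5 vertices in the single ring
```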

papercrane 1 day ago||
> It works great as long as you squint just a bit, ignoring that it generally calls for long,lat and is designed with the assumption of a world CRS.

I thought the spec allowed you to specify the CRS, but I just checked the RFC: they removed that in the 2016 specification, and WGS84 is now mandated. It does allow for alternative CRS with prior arrangement, but like you said, that requires a lot of care.

CornCobs 12 hours ago|||
Yes, they deprecated the CRS field and the current state of geojson handling libraries is pretty messy as a result since geojson does not have versioning!

If you have old geojson in a different projection, will your library respect the crs field or will it simply misinterpret your data?

Wondering if anyone could shed light on the decision to remove it as a standard when projection seems to be a critical part of GIS.

Waterluvian 12 hours ago||
Coordinate systems and projections are one of those deeply complicated truths that makes such a headache in GIS. I still shudder at all the pain in school and at previous jobs dealing with inconsistent datasets.

It seems like they decided to just opt out of trying (see the yellow box in section 4): https://stevage.github.io/geojson-spec/#section-4

I think they should have completely backed off from touching on projections and datums in the format altogether, i.e. something like: "coordinates are 2- or 3-tuples whose values correspond, in order, to easting/longitude, northing/latitude, and elevation/altitude. See metadata for agreed-upon units and CRS/projection semantics. It is strongly encouraged to standardize on WGS84 when encoding data with an earth-resolvable datum."

Because GeoJSON otherwise works fine for indoor spaces, video game spaces, fictional lands, other celestial bodies, etc. You just have to educate on the idea that there’s more to data compatibility than it being GeoJSON.

drewda 19 hours ago|||
Yup, technically speaking if the coordinates aren't in WGS84, it isn't GeoJSON
matt-p 21 hours ago|||
OK, I had not considered just using GeoJSON for my flatland CRS (indoor routing). Quite obvious in hindsight, thank you.
sam_lowry_ 1 day ago||
> tens of thousands of warehouse robots

Sounds like Amazon

Waterluvian 1 day ago||
Definitely not Amazon. Yuck.
DarkNova6 23 hours ago||
I’ve had nothing but problems using GeoJson. The specification has limitations everywhere and doesn’t even support z + m values at the same time.

But thankfully there is also the SQLite-backed GeoPackage, which is not only more flexible but also much smaller. It takes some extra steps to get testing teams working due to its binary nature, but other than that it is the best format in geospatial data analysis.

Long live SQLite!

sureglymop 5 hours ago|
I'm glad there are SQLite-backed file formats in that space. That said, they're not always the ideal choice.

For example, MBTiles (SQLite) files can be used for map tiles. In many applications, though, PMTiles files are better because they can be served with HTTP range requests.

cr125rider 22 hours ago||
Made by Sean Gillies and a few others. Back when mapbox was doing all sorts of great open source stuff. Legends

https://github.com/sgillies

jackconsidine 1 day ago||
GeoJSON is super useful. At Getcho (delivery, logistics) we use zip code GeoJSON encodings to draw polygons on zone maps and quickly generate rates. This has been a persistently annoying thing to do until we discovered this format. If you're curious, someone made a repo with all the 2010 census zips a while back [0].

[0] https://github.com/OpenDataDE/State-zip-code-GeoJSON/blob/ma... although you can generate newer versions from the last census.
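A minimal sketch of the zone-lookup idea, with made-up zips, rings, and rates (real zip polygons are far more detailed, and a production version would index them spatially and handle holes):

```typescript
// Ray-casting point-in-polygon over a GeoJSON-style ring ([lng, lat]).
// Ignores holes and points exactly on the boundary; a sketch, not a library.
type Position = [number, number];

function pointInRing([x, y]: Position, ring: Position[]): boolean {
  let inside = false;
  for (let i = 0, j = ring.length - 1; i < ring.length; j = i++) {
    const [xi, yi] = ring[i];
    const [xj, yj] = ring[j];
    if (yi > y !== yj > y && x < ((xj - xi) * (y - yi)) / (yj - yi) + xi) {
      inside = !inside;
    }
  }
  return inside;
}

// Hypothetical zone table: zip -> outer ring (as it might be pulled
// from a zip-code FeatureCollection) plus a rate.
const zones: { zip: string; ring: Position[]; rate: number }[] = [
  { zip: "19103", rate: 7.5,
    ring: [[-75.2, 39.9], [-75.1, 39.9], [-75.1, 40.0], [-75.2, 40.0], [-75.2, 39.9]] },
  { zip: "19104", rate: 9.0,
    ring: [[-75.3, 39.9], [-75.2, 39.9], [-75.2, 40.0], [-75.3, 40.0], [-75.3, 39.9]] },
];

function rateFor(p: Position): number | undefined {
  return zones.find((z) => pointInRing(p, z.ring))?.rate;
}

const quote = rateFor([-75.15, 39.95]); // falls in the "19103" box
```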

korkoros 1 day ago||
About 25% of ZIP codes don't have a corresponding Census Bureau ZCTA, for example 10118. Do you end up needing special handling for those cases? Or has it not yet come up in practice?
jackconsidine 1 day ago||
Excellent question; it certainly does come up. Practically speaking, the more populous zip codes are all accounted for, and that's where the vast majority of deliveries go. For example, I took the census zip code data 150 miles (as the crow flies) out from Philly and found virtually 100% coverage.

For missing ones you have to fall back to distance-based estimates, and in my business that means your quote may be off and you're exposed.
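The crow-flies fallback is just the haversine formula; a sketch, assuming the usual mean Earth radius of 6371 km (the coordinates below are an arbitrary Philly-to-Manhattan example):

```typescript
// Great-circle ("crow flies") distance via the haversine formula,
// in kilometers. Good enough for a distance-based rate fallback.
const R = 6371; // mean Earth radius, km
const rad = (deg: number) => (deg * Math.PI) / 180;

function haversineKm(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const dLat = rad(lat2 - lat1);
  const dLon = rad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(rad(lat1)) * Math.cos(rad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Philadelphia to midtown Manhattan, roughly 130 km straight-line.
const km = haversineKm(39.9526, -75.1652, 40.7128, -74.0060);
```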

ryandrake 22 hours ago||
No shade whatsoever at you or your business: I'll say upfront that you certainly made the right practical decision for the goal of running a business.

That said, this is a textbook example of what I have always found so infuriating, personally, about working on commercial software, and one of the many reasons I ultimately moved into a non-software-writing role. The (very sensible and practical) shortcuts and tradeoffs that are commonly made due to time and cost constraints. The attitude of "well the vast majority of our use cases work, so we're done." I've always thought edge cases must be addressed. Something in my brain hurts when I knowingly release something where only 99% of cases work.

I can imagine this is probably the same thing some artists feel when they are commissioned to produce (in their view rushed, flawed, or incomplete) artwork for business purposes.

I only write software at home, as a hobby now, and this gives me the outlet to follow my heart around edge cases!

heed 22 hours ago||
imo it's not a great solution. the problem is there is no standard or source of truth for zip code boundaries because they are a usps concept used for mail logistics. zip codes change all the time, are approximations of an area, and generally shouldn't be used for something that requires precision like calculating rates. may be ok to use as a fallback though.

also i hear your point on swe roles and don't disagree

wodenokoto 5 hours ago||
How did geojson make this easier than .wkt or .csv with a geometry field?
nobleach 23 hours ago||
We used this extensively when I worked in this space (2010 - 2014). My favorite addition was using https://github.com/topojson/topojson to add arcs. That cut down on quite a bit of points to represent curves.
jtbaker 23 hours ago|
Dang, fun memories of when I was first getting into geo/data stuff and doing a lot of web mapping with D3, Leaflet and friends. It seems like tools such as vector tiles/PMTiles have supplanted topojson for a lot of visualization-oriented use cases.
nobleach 21 hours ago||
I'm gonna have to dive into a rabbit-hole! I was working on an ESRI Shapefile to GeoJson converter back in those days. But D3 and Leaflet were such cool tech! MapBox too. Linking SagaGIS with PostGIS to do pre/post wildfire analysis was my jam.
ragebol 1 day ago||
Have been using GeoJSON, very handy and human-readable, but we recently switched to GeoPackage files, as it allows for different layers, each with a different schema for additional data.

GeoPackages also allow to set a proper CRS, which is not as easy in GeoJSON IIRC.

Getting your CRSes wrong is fun...

michaeljhg 1 day ago||
Also https://postgis.net/
thibautg 1 day ago||
And with PostgREST [0], you can automatically convert any PostGIS table (with geometry or geography column) to GeoJSON by using an "Accept: application/geo+json" header in the request.

[0] https://docs.postgrest.org/en/v14/how-tos/working-with-postg...

pramsey 21 hours ago||
At the SQL level, the ST_AsGeoJSON(record) variant will convert a tuple that includes a geometry and any combination of other columns into a GeoJSON output.
steve-chavez 19 hours ago||
Many thanks for your work pramsey. We use that exact function [1], do you have any plans for a similar function for TopoJSON? One that also has a record parameter? [2].

[1]: https://github.com/PostgREST/postgrest/blob/f1d0e8ea2266077d...

[2]: PostGIS has https://postgis.net/docs/AsTopoJSON.html but it doesn't take a record.

Zambyte 23 hours ago||
Also https://github.com/timescale/timescaledb

I've found it very useful for storing geospatial data over time.

pramsey 21 hours ago||
MobilityDB might also be of interest, for people handling trajectories.
layer8 20 hours ago||
Somewhat related: Falsehoods Programmers Believe About Map Coordinates: https://news.ycombinator.com/item?id=24659039
cogman10 21 hours ago|
Interesting but, IMO, probably one of the worst uses of JSON. The data you would want to consume is already not "human readable" so it instead introduces a lot of bloat for really no benefit.

If you have a non-insignificant amount of data points to track this is going to eat just a ton of memory while also being pretty slow to encode/decode.

Imagine, for example, if we encoded this as a binary format: first 2 bytes for the feature type, next 2 bytes for the geometry type, 3 bytes for a fixed-point x, 3 bytes for a fixed-point y, with the properties optionally provided as a JSON blob in a trailing string. That's 10 bytes for all the coordinate stuff, fewer bytes than the `"type": "Feature"` string alone takes now.
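That layout can be sketched with a `DataView`. The fixed-point scheme below (shift into positive range, scale by 1e4, about 11 m of precision, which fits in 24 bits) is my own assumption; the comment doesn't pin one down:

```typescript
// Pack the proposed 10-byte record: 2 bytes feature type, 2 bytes
// geometry type, then 3-byte fixed-point x (lon) and y (lat).
// Fixed-point scheme is assumed, not from the comment.

function writeUint24(view: DataView, offset: number, v: number): void {
  view.setUint8(offset, (v >> 16) & 0xff);
  view.setUint16(offset + 1, v & 0xffff);
}

function readUint24(view: DataView, offset: number): number {
  return (view.getUint8(offset) << 16) | view.getUint16(offset + 1);
}

function encode(featureType: number, geomType: number, lon: number, lat: number): ArrayBuffer {
  const buf = new ArrayBuffer(10);
  const view = new DataView(buf);
  view.setUint16(0, featureType);
  view.setUint16(2, geomType);
  writeUint24(view, 4, Math.round((lon + 180) * 1e4)); // max 3,600,000 < 2^24
  writeUint24(view, 7, Math.round((lat + 90) * 1e4));
  return buf;
}

function decode(buf: ArrayBuffer): { lon: number; lat: number } {
  const view = new DataView(buf);
  return {
    lon: readUint24(view, 4) / 1e4 - 180,
    lat: readUint24(view, 7) / 1e4 - 90,
  };
}

const rec = encode(1, 1, -122.4194, 37.7749); // 10 bytes total
```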

doginasuit 21 hours ago||
Do you mean geocoordinates when you say not human readable? Those are obviously at the heart of geospatial information, but there is quite a bit more to the spec that does benefit from being human readable, and I'd include longitude/latitude among them. There are also solutions like CBOR which allow the same structures to be encoded/decoded and transferred as binary. For performance-critical data you can also use something like protobuf, but it would be a huge pain to handle everything that way. JSON is a great choice as a general spec.
morganherlocker 17 hours ago|||
> If you have a non-insignificant amount of data points to track this is going to eat just a ton of memory while also being pretty slow to encode/decode.

This is a fair critique. However, for any large GeoJSON, the coordinate arrays will dominate the size, and I think it's also safe to assume the data will be gzipped at rest and over the wire, which eliminates most of the "header" metadata size you mention. As you point out, a binary format would be much more efficient, and there are good examples, ~2-3x smaller in benchmarks:

https://flatgeobuf.org/

https://github.com/mapbox/geobuf

That said, I think GeoJSON should be compared against other human readable formats like KML, which has a lot of wasted space as well, while being more difficult to read/write.

dinkumthinkum 7 hours ago||
This is just pretty wrong. Sure, GeoJSON can be bloated, but it is not for "no benefit." It is a very popular format and it is easy to encode and decode, even if it is slow for large data. It is more for sharing than long-term storage. Take a site like the one below; it is very convenient to render JSON this way.

https://geojson.io
