Under the Influence of Metcalfe’s Law

It’s not every day that someone takes the time to write me an open letter – I have to say it’s kind of fun. Brian added some additional thoughts to our ongoing conversation about GML. In truth, this is where blogging breaks down a bit: it would be much easier to sit down in a room for an hour and have a great in-depth technical discussion (of course, then our discussion wouldn’t be available for the whole world to see, which is a significant downside).

Since it’s a bit hard to sift through where things stand in a long discussion, let me recap the points I think we agree on:

  • GML is a toolkit that provides rules for translating your proprietary data model into XML
  • Having translated your data model into GML/XML, it is then necessary to code both clients and servers to understand it (see the sketch just after this list)
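
To make that concrete, here is a rough sketch of what a GML 2-style feature can look like on the wire – the roads namespace, its URI, and the element names are invented purely for illustration:

<roads:Road xmlns:roads="http://example.org/roads"
            xmlns:gml="http://www.opengis.net/gml">
  <roads:classification>motorway</roads:classification>
  <roads:number>11</roads:number>
  <roads:linearGeometry>
    <gml:LineString srsName="EPSG:4326">
      <gml:coordinates>0.0,100.0 100.0,0.0</gml:coordinates>
    </gml:LineString>
  </roads:linearGeometry>
</roads:Road>

Everything outside the gml namespace is defined by the publisher’s own application schema, which is exactly why a client that has never seen that schema needs custom code before it can do anything useful with the feature.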

Where we disagree is whether this is a good idea or not.

I see at least three very different use cases here:

  • I want to share within my own organization
  • I want to share with a preselected set of outside organizations
  • I want to share with the world

I’ll agree with Brian that for the first two use cases, GML 2 (and 3) provides a workable solution (although I think GML 1 was a better solution and that the overhead of GML 2 is prohibitive).

It’s item #3, though, that really matters. One of the things that makes the Web different is that Metcalfe’s Law (and Reed’s Law) becomes predominant – the value of a network grows roughly with the square of the number of people connected to it (or, under Reed’s Law, with the number of possible subgroups), so the more people use something, the more valuable it becomes. Which leads me to the conclusion that everyone has to agree on a shared data model and format. Otherwise you end up with thousands of one-off data integrations, which does nothing to solve the general problem.

There are obvious downsides to agreeing on a general data model – it will always be a lowest common denominator, and it won’t work for many complex integrations that live in the realm of the first two use cases. But there is an obvious upside – it is the only thing that has any chance of working out on the web. If you don’t agree, then please show me a real-life example that disproves it.

So where does that leave us? I believe that GML as it is formulated has no chance of success out on the Web, because it’s simply not designed for it. The obvious consequence is the emergence of the Atom / GeoRSS combination and KML. And truth be told, those standards solve the problem of rendering maps made up of multiple geographic data sources well enough.

What they don’t solve is exchanging attribute data between systems. And this leads right into the hornet’s nest of the Semantic Web and data modeling – no one has ever come up with a solution to this problem, and I doubt anyone ever will.

So, faced with that daunting task, why not try the simplest thing that could possibly work – which, ironically, was more or less GML 1:

<Feature typeName="Road">
  <description>M11</description>
  <property typeName="classification">motorway</property>
  <property typeName="number" type="integer">11</property>
  <geometricProperty typeName="linearGeometry">
    <LineString srsName="EPSG:4326">
      <coordinates>
        0.0,100.0 100.0,0.0
      </coordinates>
    </LineString>
  </geometricProperty>
</Feature>

In today’s world, I’d modify this a bit and start with Atom, add in GeoRSS, and then add in a new namespace that encodes properties like the ones above. And I’d stick the same stuff in the KML metadata tag.
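
Roughly, I mean something like the following – the ex namespace, its URI, and the entry’s id and timestamp are made up for the sake of the sketch, and the property elements deliberately mirror the GML 1 example above:

<entry xmlns="http://www.w3.org/2005/Atom"
       xmlns:georss="http://www.georss.org/georss"
       xmlns:ex="http://example.org/properties">
  <title>M11</title>
  <id>urn:example:road:m11</id>
  <updated>2007-01-01T00:00:00Z</updated>
  <!-- GeoRSS Simple geometry: whitespace-separated coordinate pairs -->
  <georss:line>0.0 100.0 100.0 0.0</georss:line>
  <!-- the same attributes as the GML 1 example, in a separate namespace -->
  <ex:property typeName="classification">motorway</ex:property>
  <ex:property typeName="number" type="integer">11</ex:property>
</entry>

On the KML side, the same ex:property elements would go inside the metadata element of a Placemark, where clients that don’t understand them can simply ignore them.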

Now, I don’t expect this to do diddly-squat for machine-to-machine integration. What I do expect it to do is make it easy for clients to show a nice property browser to users when they mouse over a feature on a map. And for the web, that’s good enough, since it all comes down to humans in the end anyway.
