My Experiences with OGC

Some of my recent posts could be interpreted as veiled criticism of the Open Geospatial Consortium (OGC). But in truth, I’ve been very impressed by how OGC has reinvented itself over the last six or seven years. So I thought I’d post about my experiences with OGC – and of course give you my spin!

Back when I was with Smallworld and leading the development of the Smallworld Internet Application Server, I was Smallworld’s, and then General Electric’s, representative to the OGC. It was hardly a plum assignment – I was the only one who wanted it – and it took plenty of cajoling to get reluctant management buy-in.

Now to be clear, I was hardly a mover and shaker in OGC. I attended the meetings, spoke up every so often, drove Smallworld’s participation in testbeds (see below) and helped write a discussion paper. But it sure was an interesting experience, and I quickly figured out that making standards is really hard.

The Early Years

Smallworld was an early participant in OGC, but eventually gave up on the process as OGC developed a series of standards that went nowhere – like Simple Features for SQL (the most successful of them), CORBA, and COM. These standards never made sense, for two main reasons. First, they required a fundamental rewrite to support, and no vendor had the stomach for that. Second, they were based on the faulty assumption that distributed object protocols actually work.

The combination of suffering through the inevitable and interminable squabbles about technical minutiae, with the full realization that it was all for naught, was too much for the Smallworld representatives before me, and they eventually stopped going.

The Web Testbed Years

I got involved with OGC right after the first Web Mapping Testbed, so around 2000/2001. The Web Mapping Testbed was a brilliant idea – sitting around in a room writing standards wasn’t working out. So the OGC decided to create six-month testbeds, with each testbed focused on solving some problem that a large OGC member had (with the member funding some of the cost of the testbed). By the end of the six months you needed a rough spec and, much more importantly, a working implementation. That set off a torrent of innovation – and gave birth to all of the important OGC specs used today, including WMS, GML, and WFS.

But back in 2000, none of that existed. As we wrote SIAS, we sure wanted some standard, any standard, to follow. And thus I pushed Smallworld and GE to get back into the OGC process. For our part, we implemented full support for WMS (including SVG output in a later release!) and were one of the first companies to support GML.
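
If you’ve never used WMS, the heart of the spec is refreshingly simple: a GetMap request is just an HTTP GET with a handful of well-known parameters. Here’s a minimal sketch in Python – the endpoint and layer name are hypothetical, but the query parameters are standard WMS 1.1.1, and FORMAT is where an SVG-capable server like ours would come into play:

    # Minimal WMS 1.1.1 GetMap request. The server URL and layer name are
    # hypothetical; the query parameters are the standard ones from the spec.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    params = urlencode({
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": "roads",                # hypothetical layer name
        "STYLES": "",                     # empty string = server default styling
        "SRS": "EPSG:4326",               # plain lat/lon coordinates
        "BBOX": "-77.2,38.8,-76.9,39.0",  # minx,miny,maxx,maxy
        "WIDTH": "512",
        "HEIGHT": "512",
        "FORMAT": "image/png",            # or image/svg+xml on an SVG-capable server
    })

    # The response body is the rendered map image itself.
    with urlopen("https://example.com/wms?" + params) as response:
        image_bytes = response.read()

That URL-with-parameters design is a big part of why WMS caught on where the CORBA and COM bindings didn’t – any client that can fetch a URL can consume a map.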

But it still took lots of cajoling to convince anyone it was a good idea. I used to give talks about OGC at our user conferences, and I remember sitting on panels at GITA with titles such as “Why Use OGC Standards.”

More interesting was the reaction of our customers. Smallworld/GE dominated the utility and telecoms business at the time. Utilities and telecoms are some of the most conservative organizations around – telling them they could now share their data on the Web was enough to send them into epileptic shock.

US customers were particularly uninterested. But European customers were different, particularly German customers. Many of our German customers were actually small organizations, with a service area limited to a city or two. They were huge supporters of OGC standards, and that’s where we made most of our progress.

And What About the Web?

A quick caveat before continuing – since leaving GE for Ubisense and later MapBuzz, I haven’t been involved in OGC. So beware – some of these thoughts may be wrong.

As I’ve talked about in a previous post, I don’t think the OGC standards have succeeded on the Web. I find nothing surprising about that – there are a couple of good reasons for it.

First, the OGC testbeds are designed to solve hard problems for large organizations. For example, the last one I took part in modeled a disaster response to a series of tornadoes that touched down in the Washington DC metro area. The goal was for the federal, state, and scores of local governments to effectively share information in real time to manage the emergency response. Thus the demos were all about combining data from multiple sources – the latest satellite imagery, reconnaissance missions flown by drones in real time, street vector data, parcel data, etc. If something 1/100th as effective as these demos had been used in New Orleans after Hurricane Katrina, a lot of people would have been spared a lot of misery.

To get this to work you need complex standards – things like GML, SLD, WCS, and WFS. But this was certainly not the Web. Sure, the “Web Services” moniker was thrown around all the time, but these were SOAP services combined with sophisticated Java clients, and very few browsers in sight.

Second, the OGC membership is a combination of academics, leading companies in the industry, and large organizations like Lockheed Martin and the Federal Government. Thus, the organization is geared towards writing standards that play in that world.

The Role of Google

It seems like a great coup to me that Google submitted KML to the OGC for standardization.

The big question in my mind is: how will Google change OGC? The Web is in Google’s DNA. Will it be able to use its knowledge, and its power in web mapping, to nudge the OGC to a more web-centric view? Time will tell, obviously, but it sure seems like a great time to be part of OGC and watch the technical minutiae fly.
