CarbonArc revisited

Jeff Harrison’s mass email caught my eye today with the free trial of CarbonArc. I had the pleasure of taking an early release of CarbonArc for a spin last year, and while it was a good start, there were a few key things missing from my point of view. Having pushed and prodded Mapinfo 9.5 support for use with our SDI, why not run the new CarbonArc 1.6 through its paces …

From the press release,

CarbonArc PRO 1.6 eliminates barriers to SDI usability through advanced SOA-based discovery, analysis, exploitation, transaction management and security tools for OGC SDI – directly from the ESRI ArcGIS desktop.

Instead of just regurgitating the release, the key question that always gets asked is … doesn’t ESRI ArcGIS already do all this?

It’s a tricky question to answer, because while on paper I could say, “Yes it does!”, in reality there are just so many annoying quirks and missing features that it quickly becomes a nightmare integrating SDI features into your normal ArcGIS workflow (which is what this is all about, right?).  So with fingers crossed, let’s dive right in and see what I can find in 5 minutes flat …

The Good

  1. Well laid out and robust tool set which delivers fully functional WMS/WFS/WFS-T/WCS capabilities … with no mucking about required! The thing just works.
  2. Very easy export to Shapefile tool
  3. Full-featured Filter Encoding support … no black magic; the user has full control
  4. Full access to the query string for all service types
  5. Caching of features / images when saving to project files
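To give a feel for what that Filter Encoding control amounts to, here’s a minimal sketch of building a PropertyIsEqualTo filter and tacking it onto a WFS GetFeature query string by hand. The endpoint, layer name (`topp:states`) and property (`STATE_NAME`) are just illustrative (the layer is the usual GeoServer demo data), not anything CarbonArc-specific:

```python
import urllib.parse
import xml.etree.ElementTree as ET

OGC = "http://www.opengis.net/ogc"
ET.register_namespace("ogc", OGC)

def property_equals(prop, value):
    """Build a minimal OGC Filter Encoding PropertyIsEqualTo filter."""
    flt = ET.Element(f"{{{OGC}}}Filter")
    eq = ET.SubElement(flt, f"{{{OGC}}}PropertyIsEqualTo")
    ET.SubElement(eq, f"{{{OGC}}}PropertyName").text = prop
    ET.SubElement(eq, f"{{{OGC}}}Literal").text = value
    return ET.tostring(flt, encoding="unicode")

def getfeature_url(base, typename, fes):
    """Attach the filter to a WFS 1.0 GetFeature query string."""
    params = {
        "service": "WFS",
        "version": "1.0.0",
        "request": "GetFeature",
        "typeName": typename,
        "filter": fes,
    }
    return base + "?" + urllib.parse.urlencode(params)

fes = property_equals("STATE_NAME", "Utah")
url = getfeature_url("http://example.com/wfs", "topp:states", fes)
```

Nothing clever going on, but having the full query string exposed (point 4 above) means you can see exactly this kind of request going over the wire and tweak it.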

The Bad

  1. GML integration will always be difficult given it’s a completely new ArcGIS feature source, but it’s still not up to the average user’s expectations, I don’t think. There is no attribute table, and the features are independent of pretty much all other ArcGIS functionality, so you are really left with GAIA functionality squished into an ArcGIS window.
  2. Web service requests are devoid of any Accept-Encoding headers. I’m still pondering the choice of Expect: 100-continue requests … but I’m sure there’s a good reason in the depths of HTTP …
  3. The exclusive tool for the CubeWerx “Identity Management Service” seems a bit strange, as I’m not aware of any services that use this technology. Shoot me a link if I missed something!
  4. Lack of support for any WMS LegendGraphics. Users can choose from multiple Styles, but there’s no way to actually view legend information, which is a bit disappointing.
  5. In the full minute spent trying, I wasn’t able to get the WFS-T support working … I could get as far as the schema, but as soon as I’d try to insert a feature the thing would just bail out.
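On the Accept-Encoding point: GML is extremely compressible, so asking for gzip on the wire is close to free. A sketch of what a client could do, using only the Python standard library (the URL is hypothetical):

```python
import gzip
import urllib.request

def maybe_decompress(body, content_encoding):
    """Undo gzip transfer compression when the server applied it."""
    if content_encoding == "gzip":
        return gzip.decompress(body)
    return body

def fetch(url):
    """GET with an Accept-Encoding: gzip header, so a capable server
    can compress the (very compressible) GML payload in transit."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        return maybe_decompress(resp.read(),
                                resp.headers.get("Content-Encoding"))

# fetch("http://example.com/wfs?service=WFS&request=GetCapabilities")
```

Servers that don’t support compression simply ignore the header, so there’s no downside to sending it.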

I’m beginning to think the best suggestion would be to somehow merge CarbonArc WFS functionality with an automated export to shape on each filter response. This would allow excellent SDI WFS support while still giving full ESRI ArcGIS functionality to users, without having to reinvent the wheel (for things like the edit system). With the WMS/WCS support pretty solid, I think getting the balance of WFS right could mean the difference between a very promising product and one I would recommend to everyone using our platform.


WFS-T adventures with Mapinfo 9.5

So I’ve been a bit late taking a look at the new Mapinfo Professional v9.5. Given the consistent disappointment with how commercial apps consume OGC standards, I wasn’t holding my breath … but wait a sec, it did work and it worked damn well. I mean, it worked flawlessly: updates, inserts, deletes, lock support, and it also comes complete with a semi-intelligent conflict manager.
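For anyone who hasn’t peeked under the hood, those inserts/updates/deletes all travel as WFS-T Transaction documents POSTed to the server. A rough sketch of the kind of Insert payload involved — the feature type (`poi`), its properties, and the application namespace are entirely made up for illustration:

```python
import xml.etree.ElementTree as ET

WFS = "http://www.opengis.net/wfs"
GML = "http://www.opengis.net/gml"
APP = "http://example.com/myns"  # hypothetical application namespace

for prefix, uri in (("wfs", WFS), ("gml", GML), ("myns", APP)):
    ET.register_namespace(prefix, uri)

def insert_point(feature_type, name, x, y):
    """Build a minimal WFS-T 1.0 Transaction inserting one point feature."""
    txn = ET.Element(f"{{{WFS}}}Transaction",
                     {"service": "WFS", "version": "1.0.0"})
    ins = ET.SubElement(txn, f"{{{WFS}}}Insert")
    feat = ET.SubElement(ins, f"{{{APP}}}{feature_type}")
    ET.SubElement(feat, f"{{{APP}}}name").text = name
    geom = ET.SubElement(feat, f"{{{APP}}}the_geom")
    pt = ET.SubElement(geom, f"{{{GML}}}Point")
    ET.SubElement(pt, f"{{{GML}}}coordinates").text = f"{x},{y}"
    return ET.tostring(txn, encoding="unicode")

body = insert_point("poi", "test point", 151.21, -33.87)
# POST body to the WFS endpoint (Content-Type: text/xml); the server
# replies with a transaction response reporting success or failure
```

Updates and deletes follow the same shape with `wfs:Update`/`wfs:Delete` elements plus a filter, which is presumably what the conflict manager is mediating.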


A few more suggestions to improve things further (for anyone listening) …

  1. Add HTTP compression handling. There are huge performance gains to be had transferring features, and it’s really a no-brainer to enable in any HTTP library.
  2. I am by no means a “Mapinfo Master” ™, but it would be great to enable an automated WFS table refresh, especially if you are retrieving features based on CURRENT_MAPPER.  I guess the CTRL+F5 shortcut makes it easy-ish … but I certainly found myself wondering whether I had retrieved the features or not and ended up just sending unnecessary requests.
  3. If a transaction is successful, give me some kind of alert. Alerting only on failure does not instill much confidence about whether my long edit session went through or not (even after refreshing).
  4. It would be fantastic to add helpful warning messages when performance drops. I’d imagine most users would skim over the maxfeatures and column/row filters and just add the layer.
  5. If the first request took 5 minutes and Mapinfo told me I had just retrieved 4000 poly features totalling 10 MB, and kindly directed me to the WFS how-to, I’d be more inclined to see what the filtering options were all about :)
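That size warning wouldn’t even need a full download to drive it: WFS 1.1 offers `resultType=hits`, which returns an empty FeatureCollection carrying only the match count (so it wouldn’t help a 1.0-only client yet, but it’s exactly the cheap probe the warning could be built on). A sketch against a made-up endpoint, with a trimmed-down sample of the kind of response a server sends back:

```python
import urllib.parse
import xml.etree.ElementTree as ET

def hits_url(base, typename):
    """GetFeature with resultType=hits asks for a count only --
    no features come back, just the numberOfFeatures attribute."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": typename,
        "resultType": "hits",
    }
    return base + "?" + urllib.parse.urlencode(params)

def feature_count(response_xml):
    """Read numberOfFeatures off the (empty) FeatureCollection."""
    return int(ET.fromstring(response_xml).attrib["numberOfFeatures"])

# trimmed-down example of what a server might return
sample = ('<wfs:FeatureCollection xmlns:wfs="http://www.opengis.net/wfs" '
          'numberOfFeatures="4000" timeStamp="2008-07-01T00:00:00"/>')

count = feature_count(sample)
if count > 1000:
    print(f"{count} features match -- consider maxfeatures or a filter")
```

One tiny round trip before the real GetFeature, and the client knows whether to nudge the user toward a filter.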

So there’s no WFS 1.1 support … but I’m still trying to get my head around handling the axis order issue and am more than happy to let sleeping dogs lie … at least for the moment. I only had time to test against our Geoserver installs, but it certainly seems tested against many other apps, including Cadcorp, Ionic & Mapinfo. Geoserver-specific here, but the advanced security in 1.6.x works very well with the bundled support for basic authentication.


WFS Feature paging … yes please

Sean posted his thoughts in response to Chris’ and all I can say is, yes please!

My random thoughts,

  • Why this functionality was never embedded into WFS I will never know. After playing with CSW for the last 6 months, where similar “pagination” is available … it just makes sense. How the average Joe Bloggs will ever understand what maxFeatures should be set to is irrelevant if the user cannot even determine how many *total* features are available for his query. OGC CS-W handles this quite nicely, almost identical to how Chris H. described it. If I search for “hydro”, it will give me numberOfRecordsMatched="340" but then also tell me that I’m just viewing the first 10 records.
  • Paging has been linked to server performance, particularly caching a set number of features. This, IMO, would only hold true if the given features are retrieved in the same manner. How this would handle filters I’m a little unsure of (beyond the simple bbox). Just because a search engine indexes a result set doesn’t mean the same features will appear on the same page 10 days later, for example. Checksum? HTTP Last-Modified? *shrug*
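The CS-W behaviour described above really boils down to a couple of attributes on the SearchResults element, from which a client can derive every page window it needs. A sketch (the sample response is made up, record children omitted):

```python
import xml.etree.ElementTree as ET

CSW = "http://www.opengis.net/cat/csw/2.0.2"

# trimmed csw:SearchResults element from a GetRecords response
sample = (f'<csw:SearchResults xmlns:csw="{CSW}" '
          'numberOfRecordsMatched="340" numberOfRecordsReturned="10" '
          'nextRecord="11"/>')

results = ET.fromstring(sample)
matched = int(results.attrib["numberOfRecordsMatched"])
next_record = int(results.attrib["nextRecord"])

def page_windows(total, page_size):
    """Yield (startPosition, maxRecords) pairs covering the whole
    result set; CS-W startPosition is 1-based."""
    start = 1
    while start <= total:
        yield start, min(page_size, total - start + 1)
        start += page_size

pages = list(page_windows(matched, 10))
```

That’s the whole trick: tell me the total up front, tell me where the next page starts, and everything else is arithmetic. It’s hard to see why WFS couldn’t do the same.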

Looks like I need to pay more attention to the OGC boards :)