OSGeo Tiling Spec

If you haven’t already seen the backchatter from the FOSS4G conference, there is currently a big push to implement a common spatial tiling scheme.

I’m not going to bother rehashing, so swing by Distributed Tile Caching or the live docs at Tile Map Service Specification to get up to date quickly.
The distributed caching article is certainly an interesting read. Something that caught my eye today was a similar discussion on #worldwind about the futility of trying to use torrents or other technology such as Coral to help distribute these caches of tiling goodness.

My question: what about Amazon S3? I have been using this service since its release for offsite backups of 60GB+ and cannot praise the system enough for its performance and reliability. WebDAV, torrents, SOAP … the list goes on.

Sure, it’s not free, but it sure as hell is cheap … Based on Schuyler’s calculation from the wiki article:

Taking 15m pan-sharpened Landsat-7 composites as an example, at a tile size of 512 x 512 pixels, each tile would be about 7,680 meters on a side, or about .0625 degrees across. Plugging in the other values, we get a maximum of 22,118,400 tiles in the layer.

Assuming the optimum size of 64KB per tile, we’re looking at roughly 1,415.58GB of physical storage. Let’s take a wild guess of 50GB of transfer per month, with the actual tiles only being updated annually, and we have the following:

$0.20/GB * 1415GB = $283 to initially upload the cache

$0.15/GB/month * 1415GB * 12 months = $2547 for a year’s worth of storage

$0.20/GB * 50GB * 12 months = $120 of user transfers (i.e. downloads)

Total: $2950
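
The back-of-envelope sums above can be sketched in a few lines of Python. The tile count and per-tile size are the wiki’s figures, and the rates are the S3 prices quoted above; everything else is just arithmetic, so treat it as a sanity check rather than a billing calculator:

```python
# Back-of-envelope S3 cost estimate for hosting a global tile cache.
# Figures from the wiki article: 22,118,400 tiles at ~64 KB each.
TILES = 22_118_400
TILE_KB = 64
storage_gb = TILES * TILE_KB / 1_000_000  # ~1415.58 GB

# S3 rates as quoted above (USD).
UPLOAD_PER_GB = 0.20          # one-off transfer in
STORAGE_PER_GB_MONTH = 0.15   # monthly storage
DOWNLOAD_PER_GB = 0.20        # transfer out
monthly_download_gb = 50      # the "wild guess" of user traffic

upload = UPLOAD_PER_GB * storage_gb
storage_year = STORAGE_PER_GB_MONTH * storage_gb * 12
downloads_year = DOWNLOAD_PER_GB * monthly_download_gb * 12
total_first_year = upload + storage_year + downloads_year

print(f"storage:        {storage_gb:8.2f} GB")
print(f"initial upload: ${upload:8.2f}")
print(f"storage/year:   ${storage_year:8.2f}")
print(f"downloads/year: ${downloads_year:8.2f}")
print(f"total, year 1:  ${total_first_year:8.2f}")
```

The unrounded figures come out a dollar or so above the $2950 total, since the post rounds 1415.58GB down to 1415GB before multiplying.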

Cheap, no? Try it out, you won’t be disappointed … I know of at least one other mapping outfit that has enquired into S3 for similar tile storage for their application. I’d be keen to see a proof of concept to see whether this is doable or not *looks at Refractions*.

Certainly, if my figures are correct, the costs are almost negligible even for you poor open source developers :)

Mapguide 1.0.2 released

The release notice just hit my inbox: MapGuide Open Source has hit v1.0.2, which includes the following items of interest to me:

  1. Update configure.in to use FDO 3.1.0
  2. Performance Improvement: Add Feature Service caching
  3. Performance Improvement: Add FDO connection pooling
  4. Update GetSpatialContext API to set OGC WKT for FDO WFS/WMS provider spatial contexts

Even though it was released as a minor version, those four points alone make it a worthwhile upgrade. This also overcomes my bugbear, mentioned on Jason Birch’s blog, about incompatibilities between current FDO project versions and the one bundled with 1.0.1 … speaking of which, Jason, I’m still waiting for that “easy” MapGuide/FDO compile guide :-)

It will be interesting to see the improvements to the FDO providers in particular … *goes off to install*

Quick weekly wrapup

Five quickies …

  1. OpenLayers gets a nice plug on the popular Ajaxian blog. Unfortunately it’s not exactly well written, and it left even me questioning the point of the post. Ah well, publicity is publicity, I guess.
  2. UMN MapServer 4.10 was released with a few new goodies such as curved labelling, improved MapScript service support and various other bug fixes. Surprised no one else picked up on this … I’m sure MS4W builds will be updated in the coming week.
  3. For those not attending FOSS4G who missed out on the pgRouting sessions, Lorenzo has pleaded with me to give the project a plug. A bit late, I know, but check out the awesome prototype whipped up with ka-Map here. Select the tool that looks like a pin to define the start/end points on the map, then wait for the route to be updated.
  4. GeoServer is at 1.4M2 … and even though it’s beta, it now has KML reprojection support (yay!). The guys can’t make it much easier for the average Joe to serve their data into GE now; winner.
  5. What_nick has been working on the improved WorldWind WMS/WFS plugins and has assured me they will make it into the next release (1.4?). Other good news is the smaller footprint Bull mentioned in his post, thanks to the move to using WFS for placenames rather than bundling the data with the download.

1 down, how many to go?

With the dust settling on A9’s decision to can its mapping operations … who’s next?

In my opinion, A9 are actually the smart ones … they tried, failed to see a benefit, and are now moving on. Unfortunately, of Google, Yahoo, Microsoft, Ask and whoever else is hopping on the bandwagon, not one has actually proven a business model for sustaining its current “Gimme all the data you have and here’s $1m to not give it to the others” mentality.

Despite the obvious monetary black hole these operations must be, I am going to stick my neck out and predict an enormous problem that will unfold in the next few months: licensing.

In the past few days, as blogged by Bull, Rich of mappinghacks fame got another takedown notice from Google for a small Perl script that simply stitched together GMap tiles into a usable, downloadable image.

It sounds harmless, and an almost obvious tool to write. So why all the fuss? It broke Google’s licensing restriction limiting use strictly to its mapping service. From mappinghacks:

Every single use of Google Maps involves downloading their ‘copyrighted images and data.’ I don’t have to get into the whole ‘how the web works’ screed. Google’s _whole business model_ is based on downloading other people’s copyrighted images and data, and doing things with that data that the original creators did not intend.
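
For what it’s worth, the pasting step itself is trivial, which is exactly why tools like Rich’s keep appearing. A minimal sketch of the general technique with PIL/Pillow, assembling an already-downloaded grid of equally sized tiles into one mosaic (the function name, grid layout and test colours are mine, not Rich’s, and it deliberately doesn’t fetch anything from Google):

```python
from PIL import Image

TILE_SIZE = 256  # typical web-map tile edge, in pixels

def stitch(tiles):
    """Paste a 2-D grid of equally sized PIL tiles into one mosaic.

    `tiles` is a list of rows, each row a list of Image objects,
    ordered top-left to bottom-right.
    """
    rows, cols = len(tiles), len(tiles[0])
    mosaic = Image.new("RGB", (cols * TILE_SIZE, rows * TILE_SIZE))
    for r, row in enumerate(tiles):
        for c, tile in enumerate(row):
            # Offset each tile by its grid position.
            mosaic.paste(tile, (c * TILE_SIZE, r * TILE_SIZE))
    return mosaic
```

The pasting is the boring half; it’s the fetching of the tiles over HTTP that trips the licence.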

Where am I going with all this? Case in point: Google Earth. The day someone “opens” the format of Google Earth’s cache, the application will literally fall apart. I’m sure many spatial professionals will be licking their lips at the thought of using the imagery outside of GE, but for the data providers nothing could be more damaging.
Will it happen? Yes. When? I’d say sometime soon.

*Watch this space*

I’m starting to sound like one of those 9/11 conspiracy theorists … :)