ERDAS Apollo results updated

With Apollo 10.1 about to go out the door I have updated the WMS image serving benchmark results. I still haven't updated the product-by-format graphs, as I will be rolling out a more dynamic and easier-to-maintain page shortly. One main addition was extending the ECW test plan from 150 to 300 users. I received a lot of requests from people wondering what the peak throughput was; it turns out to be not much higher, at around 120 maps per second (but still crazy quick at a ~2 second average response).
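As a quick sanity check on those figures, Little's Law says the average number of in-flight requests equals throughput times response time. A minimal sketch (the helper name is mine; the 300-user, ~120 maps/sec and ~2 s numbers are the ones quoted above):

```python
def littles_law_concurrency(throughput_per_sec, avg_response_sec):
    """Little's Law: average in-flight requests = throughput x response time."""
    return throughput_per_sec * avg_response_sec

# ~120 maps/sec at ~2 s average response implies roughly 240 requests
# in flight at any instant, out of the 300 simulated users.
in_flight = littles_law_concurrency(120, 2.0)
print(in_flight)
```

Which suggests the server is close to saturation: most of the 300 users are waiting on a response at any given moment, so adding users past 150 mostly just grows the queue rather than the throughput.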

ERDAS has also just registered for the benchmarking event in Barcelona, bringing the tally to 11 products, which is great to see *cue herding-cats picture*. So everyone, please stop asking me :-)

ERDAS Apollo vs ESRI ArcGIS Server

Let's face it: whatever benchmark results a vendor (*gasp*) publishes will always draw a certain amount of suspicion. Luckily, however, T-Mapy (Czech Republic) have just made available a detailed, independent 20-page report on ERDAS Apollo vs ESRI ArcGIS Server.

T-Mapy have a long history with ESRI and now also ERDAS technology, so they offer great perspective and expertise on both products. Michal Šeliga has done a wonderful job analysing performance and other metrics for serving a very large (290 GB) 10 cm aerial photo via WMS. Word of the day goes to “eyemetricaly worse” on page 13 :)

Image serving updates

I had hoped to post a lot more WMS image serving challenge results by now, but to date only Robert Parker at LizardTech has taken me up on the offer, with Express Server 6.1. Apologies to Rob for taking so long to publicize the results; he was very eager to send them through and I've been sitting on them for well over a month now. Gold star to LizardTech.

ESRI? Autodesk? Deegree? MapServer? GeoServer? Oracle? Manifold? MapInfo? … Show me your muscles (in my best Arnie voice). Don't forget that results from real-world users, not just developers, are just as valuable.

On the benchmark side of things, I have also updated the MapServer results with the 5.6.1 build. Sheesh, talk about being spammed by a very vocal crowd :) ECW support was dropped, and unfortunately I was unable to get the Kakadu JP2 driver working. I'll update the individual graphs when I get some time, but here is the formats-by-product result. The solid, bold line represents 5.6.1; the dotted stroke, the original 5.4 results. Yes, something crazy happened on the TIFF External test, but I reproduced the result over the typical three test runs … I'll revisit that one later.

Prohibitively difficult vs Protecting IP

Various discussions have been appearing postulating that ERDAS is making it “prohibitively difficult” to download the ECW SDK from our website. I'd like to make clear that this is absolutely not the case. From the website:

Request a Download
The ECW SDK 3.3 and the ECW SDK 3.3 Source Code are made available for download on an as-needed basis, after consultation with the product manager, Mr. Paul Beaty. To request a download, please contact Mr. Paul Beaty by e-mail, with your name, organization, full address including country, telephone, and email, and a description of your intended use. Use of the SDK requires advance acknowledgement of a EULA.

We have been forced to remove the direct download link due to numerous and frequent disregard for the attached SDK license terms, and therefore for ERDAS' intellectual property. This post is not aimed at anyone in particular, but I implore any potential users to email Paul, and he will be happy to provide you the SDK. I just got off the train with the guy, and he is eager to talk to anyone about our core technology, including a lot of new functionality available in the upcoming SDK v4 series, which contains some pretty exciting stuff. v3.3 is over 3 years old, guys!

We understand previous license terms have been somewhat ambiguous for some users; emailing Paul will also ensure your intended use (and thus your organisation) complies with the terms, and allow ERDAS to better track usage throughout the community.

I am sure Paul will update his blog with more information very soon *hint hint* …

So I’ve been thinking …

The raster benchmarks have been an outstanding success, with 3,000+ page views over the last 40-odd days. But what I've been struggling with is how to expand to more software or more platforms. Clearly I am not the master of web GIS applications, because for the life of me I still can't get MapGuide configured, so I've thrown up my hands and will claim DLL hell. The Deegree guys are keen, but their preferred storage mechanism is tiles. I've gotten lots of hits from ESRI, LizardTech, Cadcorp, Autodesk, Caris, Rolta, Intergraph and MapInfo (to name a few), so I'm sure they're keen, aren't you guys? *nods*. Everyone wants stats on different hardware configurations. Everyone keeps emailing me.

So here's my thought. It might fail miserably; I might get no one submitting any responses, but here goes. If it fails, then there's always Barcelona, I guess, and the list.

I'd like to propose the following raster challenge to whoever is reading this (i.e. you). You do not need to be the software developer on the project; in fact, it will be more interesting if there's both developer and real-user feedback!

  1. Download the BlueMarble world-topo-bathy-200406-3×86400×43200 (2.2 GB torrent) worldwide series
  2. Convert, tile, compress, pyramid, overview or palette the original dataset into whatever format or file composition you’d like. Configure your server accordingly to read the dataset and serve it out as an OGC WMS
  3. Document the steps used to configure the dataset (hint: reproducible). Include details on the final disk storage and/or the number and size of files
  4. Document your server hardware configuration, with particular emphasis on OS, CPU, memory and disk configuration
  5. Document your software configuration. This time you’re not bound to prior documentation, so developers, go for your life … users, do your best
  6. Download the attached JMX plan and reconfigure the server details. Do not modify any other part of the plan apart from Lines 482 to 550
  7. Install JMeter if you haven’t already and execute the plan
    1. jmeter -n -t myserver-bluemarble.xml -l myserver-bluemarble.xml.logs
  8. Run the OSGeo benchmarking script
    1. python myserver-bluemarble.xml.logs > myserver-bluemarble.xml.sum
  9. Zip the documentation, along with myserver-bluemarble.xml.logs and myserver-bluemarble.xml.sum
  10. Email the zip to me at and I’ll update the benchmarking page as soon as they come in
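If you want a rough, eyeball check of your numbers before emailing them through, here is a minimal sketch of a results-log summarizer. This is not the OSGeo script from step 8; it assumes your JMeter log was written in the default CSV JTL layout with at least `timeStamp` (epoch milliseconds) and `elapsed` (milliseconds) columns, and the function name is mine:

```python
import csv

def summarize_jtl(path):
    """Rough throughput/latency summary of a JMeter CSV results log.

    Assumes the default CSV JTL layout with 'timeStamp' (ms since epoch)
    and 'elapsed' (ms) columns. A sketch only, not the OSGeo summary script.
    """
    stamps, elapsed = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            stamps.append(int(row["timeStamp"]))
            elapsed.append(int(row["elapsed"]))
    if not stamps:
        return {"requests": 0, "throughput": 0.0, "avg_ms": 0.0}
    # Wall-clock span of the run, in seconds (floor of 1 ms avoids /0).
    duration_s = max(1, max(stamps) - min(stamps)) / 1000.0
    return {
        "requests": len(stamps),
        "throughput": len(stamps) / duration_s,  # requests per second
        "avg_ms": sum(elapsed) / len(elapsed),   # mean response time
    }
```

If the throughput and average response it prints are wildly different from what the OSGeo script reports, something is probably off in your log configuration.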

The idea behind this is to remove any ambiguity from a single person configuring all the apps, see if they scale across different deployments, allow applications to use their “preferred” format and, most importantly, see whether users can reproduce the results!

This is clearly not going to be a comparative exercise. Even if you don't have a crazy 8/16-core server machine, I'd still urge you to submit your results. The point here is to get as many applications as possible documenting how to squeeze the highest peak performance out of each. Results will not be compared, as the platforms will never be the same by design … Apple, meet Orange.


Let the games begin!