Here are the documents from the meeting, including the eGIS Repository data list that I passed out (which was printed too small to read easily).
And last but not least – the draft of the Los Angeles 2nd Geospatial Summit (April 30th). Hope folks can attend that.
[Figure: cache tile grid, levels 0-19]
I got an email with a link to some cache validation tools, which was really nice. The original link about previewing caches wasn't that useful, but the additional resources on cache validation contained some very useful tools.
If you are doing map caching (which we do a lot of), one thing you might look into is breaking your cache area into tiles and caching it tile by tile. This breaks the job into smaller pieces, so if something fails you don't need to start from scratch. The benefit of this tool is that it breaks your caching down into the same tiling blocks that ESRI uses to store its caches, so your caching areas match the blocks they get stored in. This is especially useful if you have more than one machine to cache with.
So I have updated our cache tiling grid to match the blocks, but clipped the tile grid to the County boundary (so we only cache our County). We now have two cache grids: one for cache levels 0-19, and another for cache level 20 (in order to save disk space, I am not caching level 20 in areas deep inside the forest).
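To illustrate the idea, here is a minimal sketch of breaking a cache extent into block-sized jobs. This is not ESRI's actual tiling code; the tile origin, tile size, resolution, and tiles-per-block values below are assumed example numbers (a real cache reads these from its tiling scheme file), but the block arithmetic is the same: figure out which storage blocks intersect your area, then cache block by block so a failure only costs you one block.

```python
# Assumed example values -- a real cache reads its tile origin,
# tile size, and per-level resolutions from the tiling scheme.
TILE_ORIGIN_X, TILE_ORIGIN_Y = -20037508.34, 20037508.34  # upper-left origin
TILE_PIXELS = 256          # pixels per tile (assumed)
RESOLUTION = 0.298         # map units per pixel at this level (assumed)

def blocks_for_extent(xmin, ymin, xmax, ymax, resolution,
                      tiles_per_block=128):
    """Return the (block_row, block_col) pairs whose tiles intersect
    the given extent, so each block can be cached -- and re-run on
    failure -- as an independent job."""
    tile_span = TILE_PIXELS * resolution       # ground size of one tile
    block_span = tile_span * tiles_per_block   # ground size of one block
    col_min = int((xmin - TILE_ORIGIN_X) // block_span)
    col_max = int((xmax - TILE_ORIGIN_X) // block_span)
    row_min = int((TILE_ORIGIN_Y - ymax) // block_span)  # rows count downward
    row_max = int((TILE_ORIGIN_Y - ymin) // block_span)
    return [(r, c)
            for r in range(row_min, row_max + 1)
            for c in range(col_min, col_max + 1)]

# Example: a 100 km x 100 km extent near the projection origin
jobs = blocks_for_extent(0, 0, 100000, 100000, RESOLUTION)
print(len(jobs), "blocks to cache")
```

Each (row, col) pair can then be handed to a separate caching job, which is what makes this work well across more than one caching machine.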
[Figure: cache tile grid, level 20]
For anyone looking at installing ArcGIS Server and looking for a capacity planning guide, the attached documents are very good references. I still have a number of questions (like how the benchmark map service draws maps in 0.46 seconds when the end result seems more like one second), but these are questions we can all answer.
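As a back-of-envelope version of the kind of arithmetic those guides walk through: the per-request service time sets the maximum throughput of one service instance, and a target peak load then tells you how many busy instances you need. The 0.46-second draw time is the benchmark figure mentioned above; the 50,000 requests/hour peak load is an assumed example, not a number from the guides.

```python
# Capacity-planning arithmetic: service time -> throughput -> instances.
draw_time_s = 0.46                 # avg map draw (service) time, from benchmark
target_peak_rate = 50000 / 3600.0  # assumed example peak: 50,000 requests/hour

throughput_per_instance = 1.0 / draw_time_s      # requests/sec, one instance
instances_needed = target_peak_rate / throughput_per_instance

print(f"one instance handles ~{throughput_per_instance:.2f} req/s")
print(f"need ~{instances_needed:.1f} busy instances at peak")
```

This also hints at the 0.46-second vs. one-second puzzle: what a user observes is service time plus queue wait plus network transfer, and capacity models typically add those on top of the raw draw time.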
I was on the closing panel at the end of the recent ESRI Regional User Group in Redlands, and had a chance afterward to meet with Jack Dangermond and some of the product managers about the next releases of their client and server software. It was an extremely informative discussion, and I came away impressed with the scope of what ESRI is trying to do – I thought I would pass my thoughts along.
A couple of major changes are underway that I think will impact GIS managers and practitioners, and may reduce overall cost structures for GIS.
- ESRI is planning a cloud computing platform for its server capabilities, scheduled for June/July release this year. The key is that it may provide additional cost savings for GIS, since the requirements for acquisition, maintenance, disaster recovery, and systems administration of a GIS environment within an organization may be reduced. Specifically, the proposed build-out of a GIS infrastructure may not require as much hardware acquisition as it previously did.
- The next release (version 10) of the GIS thick client software (ArcGIS Desktop) is also slated for release in summer 2010. It has been designed to move many of the functions of the current client (GIS data creation, editing, etc.) from the client to a distributed web-based framework. This has the potential to reduce ongoing licensing costs, but just as critical, it may make a distributed editing environment possible, so that framework GIS base layers (Streets, Addresses, Facilities, etc.) can be edited by non-GIS experts in different locations inside and outside of the County. For example, a city could edit address data through the web, eliminating the need for the County to duplicate the effort and reducing the maintenance costs for data layers that cut across multiple jurisdictions.
My primary caution is that ESRI will not want to cannibalize its existing revenue structures, so cost savings may not be dramatic. As well, ESRI has not had the best track record with initial releases; it generally takes an update cycle or two before their software becomes stable and bug-free. However, the possible cost savings and reduced need for in-house and contract maintenance support offered by these two changes are something that will definitely be worth watching.