Archive for the 'Scale Issues' Category

Apr 05 2011

Cache on the Barrel – Community Maps Work

The CPU cycles are burning as I set up some test cache building for our local edition of ESRI Community Maps – Large-Scale Topographic Base Map.  It’s rather a treat to see a bunch of execution threads running, and I’m trying my best to keep them all busy!
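For anyone curious what the kickoff looks like in script form, here is a rough sketch of firing off tile creation from Python. The service path and scale list are hypothetical, and the caching tool’s parameter list has shifted between ArcGIS releases, so treat this as an outline rather than a recipe.

```python
# Rough outline of kicking off tile creation for a cached map service.
# Service path and scale values are hypothetical; the tool's parameter list
# differs between ArcGIS releases, so check the local help before running.
import arcpy

service = "GIS Servers/arcgis on gisserver (admin)/CommunityBasemap.MapServer"  # hypothetical
scales = [9027.977411, 4513.988705, 2256.994353, 1128.497176]  # cache levels 16-19

# Recreate tiles at the listed scales, using several caching instances
# to keep all those execution threads busy.
arcpy.ManageMapServerCacheTiles_server(service, scales, "RECREATE_ALL_TILES", 4)
```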

As I work through glitches here and there, I’m catching up on some of the good information that is online to help folks in my situation, like Advanced Map Caching from the 2010 ESRI Developer Conference.

 

No responses yet

Jan 07 2011

Terrain progress linked to Community Basemap Large-Scale Topo

The ESRI Community Basemap program’s new ArcGIS 10 template for Large-Scale Topographic Mapping is somewhat streamlined and cleaner when compared to its ArcGIS 9.3.1 predecessor.  But the template map document still has the ability to bring my workstation to its knees when doing things like printing.  I find myself unable to print 11×17 or export a B-size PDF without getting incomplete results.  Mercifully, I can print and export at A-size, so people are starting to get a taste of what is to come when we start building cache tiles.  Everyone who has seen the base maps was quite pleased, and some were almost gushing over them.

Preparing input features to pour into the template geodatabase that are good enough to be worthy of 1:1200-scale mapping is every bit the challenge it might seem.  Each feature class making its way into the template seems to require a unique bit of spatial analysis to get pulled together, and usually this involves combining our best available input data in ways we haven’t tried before.  This week’s challenge has been the water polygon and water line feature classes.  For utility, we’re choosing to split out the water polygon from its water line.  For lakes and ponds, they will coincide, but for our tidal coast, we’re conveying extra information by setting the polygon to represent Mean Low Water and the water line to represent Mean High Water.  This helps to convey the widely varying slopes that Marin County has at its tidal shores, and provides some hint as to the width of the public access way below the high tide line.

These tidal shores have been compiled from the 25cm-interval topo contours that were generated for the Community Basemap; in NAVD88 we used the nominal 50cm contour for Mean Low Water and the 175cm contour for Mean High Water.  Between these elevations one often finds various “water” polygons depending on the application; in our base map we’ll have a purposeful gap.
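As a concrete illustration, pulling those two tidal-datum proxies out of the contour feature class is essentially one selection per datum. The feature class and field names below are a hypothetical sketch, not our actual schema.

```python
# Minimal sketch: extract the MLW and MHW proxy contours from the 25cm set.
# Feature class and field names are hypothetical.
import arcpy

contours = "Topo_Contours_25cm"

# Mean Low Water proxy: the nominal 0.50 m (NAVD88) contour
arcpy.Select_analysis(contours, "Shoreline_MLW_li", "ELEVATION = 0.5")

# Mean High Water proxy: the 1.75 m (NAVD88) contour
arcpy.Select_analysis(contours, "Shoreline_MHW_li", "ELEVATION = 1.75")
```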

Along the Sonoma borderlands I struggled for a few hours hand-tracing ponds using the NAIP 2009 near-infrared grayscale—the one where standing water is basically black.  I was a bit sloppy and found myself tracing at 1:4000 on screen, but it was a fairly thorough job through the adjacent shared watersheds along Estero Americano, Stemple Creek, San Antonio Creek, and the Petaluma River.  I was just about fed up with drawing lines around standing water when I checked in with MarinMap member Bill Voigt of San Rafael for other reasons.  He provided an inspired suggestion: use the water lines from our 2004 photogrammetry.  Of course!  I had used them in the terrain, but they were not hard constraints, and the sparse contouring in the west county lands tended to wash over the stock ponds.

Empowered by Voigt’s suggestion, I found that the 2004 photogrammetric water line work, which was only available within Marin County proper, had very high fidelity with the 2009 NAIP summer ponds.  It was vastly easier to select the pond-circumscribing lines, sometimes 300 segments at a time, copy and paste them into the Inland_Waters_li features, Merge them all at once into a single compound line, then Explode the multi-parts into discrete ponds.  With that approach I was able to harvest hundreds of Marin stock ponds, reservoirs, and vernal pools, and have geometry that appears smooth and accurate at better than 1:2400.  There are also quite a few meandering creek features in the west county that are well represented horizontally in the water break lines, but my impression is that they took a lot of upgrading to serve as 3D guides in the terrain, and construction rules broke the pond lines wherever a drainage reached a pond’s edge.  So there were a great many segments that needed to be merged to create closed loops that could build pond polygons.
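For the record, the interactive copy/Merge/Explode sequence has a geoprocessing equivalent that looks roughly like the sketch below. The dataset names are hypothetical, and the real work happened in an edit session rather than in a script.

```python
# Rough geoprocessing equivalent of the pond-harvesting workflow:
# append the selected break lines, merge touching segments into closed loops,
# explode the multiparts, then build pond polygons. Names are hypothetical.
import arcpy

arcpy.Append_management("Water_Breaklines_2004_selection", "Inland_Waters_li", "NO_TEST")

# Merge connected segments into continuous lines (the interactive "Merge")
arcpy.Dissolve_management("Inland_Waters_li", "ponds_merged", "", "",
                          "MULTI_PART", "UNSPLIT_LINES")

# Split the compound result into discrete ponds (the interactive "Explode")
arcpy.MultipartToSinglepart_management("ponds_merged", "ponds_single")

# Closed loops become water polygons
arcpy.FeatureToPolygon_management("ponds_single", "Inland_Waters_py")
```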

By the end of this week, we had over 1400 water bodies identified in the watersheds that touch Marin County, and this exercise, initially motivated by improved cartography, may also augment our set of candidate sites for wetland inventory.  These water features are set into a regional water layer from The National Map that is valid at 1:24k and includes a refined coastline from Big Sur to north of Point Mendocino, and inland to the Sierra crest, from San Luis Reservoir up to Mendocino Lake.

We refined Marin and adjacent shorelines (as both MHW and MLW boundaries) from Sears Point, up to and down from Petaluma, along San Pablo, San Rafael, Richardson and San Francisco Bays, the Golden Gate, Gulf of the Farallones, Drakes Bay, outer Point Reyes, Tomales and Bodega Bays, and Bodega Head, using 25cm-interval topographic contours.  All Farallon Islands were traced near 1:2400 scale as visible in the NAIP 2009 near-infrared band.

Next week should bring the last couple of administrative layers (Parks and Open Space) and refinements to road centerlines to reflect docks and proper arterial status, and then it will be time to start making test tiles.  With any luck, we’ll borrow some CPU cycles from the production server to give us the best shot at getting good tiles fast.  With 120 layers, several of which use representations, and many of which use complex Maplex rules for labeling, the Community Basemap Large-Scale Topographic template for ArcGIS 10 is by far the most demanding map document I’ve ever manipulated.  Everyone is looking forward to getting the tiles generated for testing, sending them off to ESRI for review, and passing samples along to MarinMap members for their comments as well.

No responses yet

Aug 27 2010

Marin County Topo-Bathy Surface “tbsm45cm” Released as 2010.08

Blemishes notwithstanding, nearly six months of back-burner work has reached a threshold of readiness and is outward bound to some engineering firms, flood mappers at FEMA, and interested parties within the county. A handful of known issues remain unresolved. The proper name is “tbsm45cm_20100823”; the proper edition is “2010.08”.

This is the third version of the terrain. The second version was “2010.01”; it included multiple LiDAR data sets, though fewer than presently used, and was a topographic model only. The first version was “2009.09”; it was mainly photogrammetry and FEMA LiDAR, and was the last version to be developed in California Coordinates. Once the massive NCALM LiDAR data sets were processed, it became easier to move everything into a WGS84 UTM zone 10 north (meters) projection, with NAVD88 (CONUS Geoid 2003) vertical positions.
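The horizontal part of that move is a straightforward re-projection (the vertical part is where VDatum, below, comes in). A minimal sketch, with hypothetical dataset names:

```python
# Minimal sketch of re-projecting a dataset into the terrain's horizontal
# coordinate system (WGS84 / UTM zone 10 North, meters). Names are hypothetical;
# the vertical datum work is handled separately with VDatum.
import arcpy

utm10n = arcpy.SpatialReference("WGS 1984 UTM Zone 10N")
arcpy.Project_management("breaklines_calcoord", "breaklines_utm10n", utm10n)
```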

The NOAA utility program VDatum, a brilliant Java-based application able to stream-process data sets of near-infinite size, brought the NCALM data to heel, and opened up decades of NOAA depth surveys to our use in integrated topographic-bathymetric surface modeling.

First-return NOAA ALACE LiDAR swaths were fused along the outer coast, as bare-earth filtered versions were not produced in 1997–2002; the benefits of LiDAR detail along the rocky coast do seem to outweigh the distracting appearance of structures near Rodeo Lagoon, Stinson Beach, and outboard Bolinas.

When ArcGIS 9.4 beta 2 reached its limit in ability to render the terrain dataset into 45cm grid over the full extent, the clipping quadrants created to resolve this problem ended up chopping a very small portion of Sonoma county that drains into Estero Americano; the full watershed remains intact in the 1-meter version of the terrain grid under analysis for county-wide hydrology. Likewise, the tighter clipping quadrants lost a few hundred meters of San Pablo Bay bathymetry just west of where Marin, Sonoma, Contra Costa, and Solano counties meet. Also, tighter clipping quadrants snipped a portion of the San Francisco Bar southerly of San Francisco’s Seal Rock that was intended to be part of the model. All of these areas exist in the 100cm grid, and will be part of drainage analysis.

Happily, we have updated the workstation to ArcGIS 10, and have been enjoying such great speed gains with Spatial Analyst that our ERDAS use has been noticeably reduced. Finally, Spatial Analyst is often showing performance nearly on par with ERDAS. Thank goodness the Raster Calculator survived the transition to ArcGIS 10!
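For anyone who hasn’t made the jump yet, map algebra in version 10 is exposed through arcpy.sa as well as the Raster Calculator tool. A trivial example, with hypothetical raster names:

```python
# Trivial arcpy.sa map-algebra example (raster names are hypothetical):
# convert a feet-based elevation grid to meters.
import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")
dem_meters = Raster("dem_feet") * 0.3048
dem_meters.save("dem_meters")
```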

Painfully, the existence of unutilized bathymetric data sets for the upper broad channel of Corte Madera Creek and for Bolinas Lagoon was revealed this week. Hey, there’s already something to look forward to for the next build!

The new terrain is getting some immediate use in support of an effort to participate in ESRI’s Community Maps Program for large-scale topographic mapping. The Program provides a template geodatabase with 36 vector feature classes and two rasters, into which local agencies may pour their data. Once local data are tucked into the conforming schema, a template multi-scale map document with 120 layers comes into play: 30 layers at each of four large scales that correspond to the Google Maps and Bing Maps projection and cache tiling scheme. The difference is that the template document makes use of ESRI tools to pack much more local detail into a map designed with notably more sophisticated cartography than either Google or Bing maps now have. The Community Maps Program concept is that local agencies may publish their local detailed content in a fairly uniform style, while retaining a world-wide seamless context for their surrounding area.

Qualitatively, the effect is that, when viewing the ArcGIS.com topo map alongside either Google or Bing maps (on two monitors, with comparison made at the same scale), the ArcGIS.com map looks to be a larger scale. It isn’t, and I’ve measured the size of features to convince myself, but my mind insists that I’m zoomed farther in on the ArcGIS.com map for some reason. My guess is that it is a perceptual effect of the much greater amount of information that is cleanly displayed in the ArcGIS.com map versus the much sparser Google and Bing content at these large zoom levels. Try it out—it’s like a carto version of an optical illusion!

The 120 layers in the template large-scale topographic base map from the ESRI Community Maps Program are arranged to provide four precise cartographic designs for Google/Bing map cache levels 16 through 19, which correspond to these display scales:
1:15000–1:6001 (level 16, a.k.a. ~9k)
1:6000–1:3501 (level 17, a.k.a. ~4.5k)
1:3000–1:1501 (level 18, a.k.a. ~2k)
1:1500–1:501 (level 19, a.k.a. ~1k)
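Those scale bands follow directly from the standard 96-dpi Web Mercator tiling scheme, where each level halves the reference scale of the one before it; a quick check:

```python
# Quick check of the nominal display scale at each cache level (96 dpi Web Mercator).
LEVEL0_SCALE = 591657527.59  # reference scale at level 0

for level in (16, 17, 18, 19):
    print(level, round(LEVEL0_SCALE / 2 ** level, 1))
# prints roughly 9028, 4514, 2257, and 1128.5 -- the ~9k / ~4.5k / ~2k / ~1k above
```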
One of the most attractive areas currently online is Toronto, ON, where at levels 18 and 19 individual building outlines are graced with street addresses.

Anyhow, the new tbsm45cm model will serve County of Marin’s effort at large-scale topographic mapping in several ways. First, it has made possible a very detailed hillshade that helps emphasize the grading around each hillside structure in the county. Second, it helps us to create the required metric topographic contours. These are necessary to meet world-wide mapping standards, and throughout this weekend, contours are being generated from a related (smoothed) version of the terrain at a 50cm vertical interval. Needless to say, most of these won’t get used in the map renderings, but the ESRI cartographers have shared a very clever indexing scheme that will help us use this single set of metric contours to support the requirements for all four of our topographic map scales.
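I can’t share ESRI’s indexing scheme here, but the general shape of the weekend’s job is below. The specific index rule shown (flagging every 2.5 m contour) is purely my own illustration, and the raster, feature class, and field names are hypothetical.

```python
# Sketch of generating 50cm contours from the smoothed terrain raster and
# tagging each with an index field. The index rule is illustrative only;
# raster, feature class, and field names are hypothetical, and the name of
# the elevation field written by the Contour tool can vary by release.
import arcpy
from arcpy.sa import Contour

arcpy.CheckOutExtension("Spatial")
Contour("terrain_smoothed", "contours_50cm", 0.5)   # 50 cm vertical interval

arcpy.AddField_management("contours_50cm", "IDX", "SHORT")
# Example rule: flag contours falling on 2.5 m multiples
arcpy.CalculateField_management(
    "contours_50cm", "IDX",
    "1 if round(!CONTOUR! * 2) % 5 == 0 else 0", "PYTHON")
```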

No responses yet

Jul 06 2009

OpenSim Terrain notes, and Darb has Process Credit history!

I’d read about this, but never before experienced the agony first-hand.  When extracting funds from SL, the wait for the funds to arrive at PayPal was a bit slow.  In fact, in the time it took funds to go from Linden to PayPal, a bamboo shoot in my back yard could have grown taller than me (that’s my RL, not SL, height!), and would have been over 2 meters tall.  Anyway, Process Credits are quite lacking in symmetry with how quickly credit charges can flow into the Linden realm.

During this week of waiting my random prims have been cleared out from Amida and nary a trace of Berkurodam BART Station remains besides a video in Gualala.  The video screen was actually entombed by a neighbor, who may not like it but did not send any message.

Anyway–for me this week is all about generating maps and graphics while keeping up with work.  I’ve generated a 50cm terrain grid for parts of my county where perhaps 150,000 people live.  With computational process improvements, I should be able to make production stable enough to generate a 25cm grid.  The point is to model terrain slope and aspect within urban parcels.  OpenSim can pack 64 terrain megaprim sculpties over each region to refine terrain beyond the built-in 1-meter postings, and display 10cm orthoimagery at full resolution.
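Expressed in arcpy terms (which arrived later than this post, so consider it an after-the-fact sketch), the slope-and-aspect-within-parcels idea boils down to something like this; raster, parcel, and field names are hypothetical.

```python
# Sketch of modeling slope and aspect from the 50cm grid and summarizing them
# by urban parcel. Raster, feature class, and field names are hypothetical.
import arcpy
from arcpy.sa import Slope, Aspect, ZonalStatisticsAsTable

arcpy.CheckOutExtension("Spatial")

slope = Slope("terrain_50cm", "DEGREE")
slope.save("slope_50cm")
Aspect("terrain_50cm").save("aspect_50cm")

# Mean slope per parcel, keyed on the parcel number
ZonalStatisticsAsTable("Parcels", "APN", slope, "slope_by_parcel", "DATA", "MEAN")
```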

Last year, I used first-return LiDAR data of the UC Berkeley campus to generate a 25cm grid for 10cm imagery.  Now, I’m working with bare-earth LiDAR data from FEMA, topographic contours (densified to 1.5m vertex spacing), and most importantly, photogrammetric terrain and water break lines.

All of those data get built into an ESRI Terrain Dataset, from which I generate TIN and GRID models at various resolutions and extents.  The ESRI ArcGIS 3D Analyst Terrain-to-TIN generator breaks down after about 10 mega-faces (so would I…), and the ArcGIS Terrain-to-GRID generator seems to drift into Windows-unconsciousness after about 1.0 giga-cells.  So for the grid, I break it down and do the pieces, then merge the tiles using ERDAS Imagine, because the ESRI ArcGIS raster mosaic function does not produce output grids much over 10 GB.  As annoying as learning these ArcGIS limits can be, it is very satisfying (and instructive) to see huge swaths of seamless terrain with great detail once it all comes together.  Thanks to the break lines, many driveways and most home building site cuts and fills are resolved.  And it will be a lot of terrain by OpenSim standards–enough to calibrate terrain for over 20,000 contiguous regions–not that I ever expect to build it all at 1:1 scale!
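Expressed in later arcpy terms, the piecewise Terrain-to-GRID workaround looks roughly like the sketch below. The tile extents and dataset names are made up, and the final mosaic of the tiles happens in ERDAS Imagine rather than ArcGIS, as noted above.

```python
# Rough sketch of rasterizing the terrain dataset one tile extent at a time.
# Extents and names are hypothetical; the resulting tiles get mosaicked in
# ERDAS Imagine because the ArcGIS mosaic balks at very large output grids.
import arcpy

arcpy.CheckOutExtension("3D")
terrain = r"lidar.gdb\marin\terrain_dataset"     # hypothetical terrain dataset

tiles = {  # "xmin ymin xmax ymax" in UTM 10N meters (illustrative values)
    "nw": "515000 4205000 535000 4225000",
    "ne": "535000 4205000 555000 4225000",
}

for name, extent in tiles.items():
    arcpy.env.extent = extent
    arcpy.TerrainToRaster_3d(terrain, "grid_" + name, "FLOAT",
                             "LINEAR", "CELLSIZE 0.5")
```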

No responses yet

Feb 24 2009

Civic Center terrain version 0.9 – dumped in Stanford region

The terrain forge was fired up tonight, and the virtual dump truck made dozens of runs into immersive reality.  Compared with OpenSim, where terrain megaprims can be fine-tuned to the precise size requirement, in Second Life I must back-calculate the scale from actual terrain sculpties.  Apparently my trusty Gene Replacement 40-meter spheres can be induced into 35.70-meter terrain blocks, using the method I developed for the Level 2 OpenSim build of UC Berkeley.  This version 0.9 of the terrain involved some miscalculations about the ultimate size of the top sections of each sculpted prim.  Look for better scale control with version 0.91 soon!

view Sly in RL, vu Ely in SL, northerly Stanford region

closer look at drive-under location ("first arch")

view Wly in RL, view Sly in SL Stanford region

No responses yet

Feb 16 2009

Terrain on tap – OpenSim on deck

There’s some progress on a couple of project fronts.  I’ve started assembling some USGS terrain for a 1:10 scale Level 1 build that could involve more OpenSim regions than I’ve ever stood up on one machine before.  Snapshot of progress is here, with a goal of 304 OpenSim regions for the model.  I expect that the OpenSim server will be re-imaged and a new build attempted in the next couple of weeks.

Vicinity of Colorado Springs, CO - with a full mile of terrain elevation subtracted

The site design for the Marin Civic Center build in the Second Life Stanford region is also moving along, with its target 1/8 region (two SL acres) based on RL terrain and building, at 1:1 scale, for a Level 3 build.  Progress sketch below:

Context model data of terrain for Civic Center Administration Building

A bit more can be reviewed by looking at the PDF of the same map here: cc_topo_20090211

Finally, a sky tag has been added to the space above the build. It is visible as a streak to anyone who visits SLURL.com and rolls the mouse over the region.  The Stanford region now has a large “MARIN” visible in its upper reach, squarely in the middle of the ancient Second Life Outlands.

Stanford vicinity from SLURL.com on 15 Feb 2009

One response so far

Feb 03 2009

SIMGIS Move: OpenSim for workbench, SL for presentation


Down in the “basement” a server awaits a bit of configuration to become an OpenSim lab to support the Marin Civic Center development.  How nice it would be to load terrain into a portion of a Second Life region, but alas, it seems unavailable to those of us without Estate controls.  The 1:1.00 scale terrain will need to be calculated to back the base ortho-image that has already been tiled over the site, as shown in the last blog posting.  I have verified that one of the 40-meter sphere oversize prims in my inventory can be rebuilt into a massive terrain sculptie.  Placing a 32×32 sampling of 1-meter terrain samples (of which at most 30×30 are usable for terrain) will give me a guide, and then I’ll just need to diddle with the SL client bulldozer tool to approximate the terrain prim, before disposing of it.
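For the curious, the general idea of baking a small elevation sample into a sculpt map looks something like the sketch below; a sculpted prim stores a vertex position in each pixel’s RGB. This is an illustration of the general encoding rather than my exact workflow, and the input file name is hypothetical.

```python
# Rough sketch: encode a 32x32 grid of 1-meter terrain samples as a sculpt map,
# with each pixel's RGB carrying a normalized X, Y, Z vertex position (0-255).
# Input file name is hypothetical.
import numpy as np
from PIL import Image

elev = np.loadtxt("terrain_32x32.txt")            # 32x32 elevations in meters
n = elev.shape[0]
x, y = np.meshgrid(np.arange(n), np.arange(n))

z = (elev - elev.min()) / max(elev.ptp(), 1e-9)   # normalize heights to 0-1
rgb = np.dstack([x / (n - 1) * 255,
                 y / (n - 1) * 255,
                 z * 255]).astype(np.uint8)

Image.fromarray(rgb, "RGB").save("terrain_sculpt.png")
```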

Thanks to the frequent updates of the region map at SLURL.com, I can already see some of the build taking shape.  On the second image below I sketched my virtual moving van’s path from Gualala to Stanford.  To aid overview map navigation and VFR-flying avatars, I have constructed a large readable sky label in crude imitation of the long-standing skywriting by SL resident Web Page in region Da_Boom.

Da Boom — probably named after De Boom

From that origin, the old Gualala locale was at region grid (1008,998), and the new Stanford site is at region grid (1006,1000).

The heart of the old SL Mainland

Oddly enough, Stanford appears to be the fourth-oldest region, according to this relatively ancient map.

Second Life regions 2002 11 21

On the ground, I’m still working through a lot line adjustment I’d like to make before breaking ground. The terrain sculptie method has been proven, although I have yet to grid actual terrain values for the project site. Also, I’m trying to minimize RL dimension measurements if possible, by using the best available historical information on the RL site.

Some things have changed in SL viewers in the last few months. About a year ago, it was possible to take an oversize prim and modify a single dimension, with only that dimension snapping to 10 meters. Currently, any change to a dimension of an oversize prim results in all three dimensions snapping to values less than or equal to 10 meters. It’s a new challenge, but it can be managed. Still, once a builder has tasted the freedom of OpenSim, it is awfully hard not to chafe at those sorts of restrictions.

I’m giving thought to a copy of Second Inventory to facilitate the use of OpenSim for dev and SL for production, but prim-size shrinking will be a big issue for me.

Curiously, I have found that physical prims do not drop to the ground in Stanford.  This has never been the case in Gualala, so I’m intrigued and have opened a ticket with Linden Lab.  I’ll see what they say. Meanwhile, I’ll keep my eye on the sky for the project’s mark (in the old center of OUTLANDS).

No responses yet

Jan 31 2009

Marin Civic Center 1:1 scale texture in Stanford – feels bigger than OpenSim

Only the four-story Administration building (wing), not the two-story Hall of Justice. I’m tired so I’ll let the shot speak for me.

photo from 2009 01 30

To me, it’s mildly amazing to realize that F.Ll.Wright’s design fits so snugly in 1/8 of a Second Life region at 1:1.00 scale.  The Civic Center Administration building is a Real-Life building that can be visited, providing an easy way to get a true RL immersive sense of its scale.  Building at 1:1 scale in Second Life for the first time, I’m now transferring that awareness into the multi-region contiguous space of the very beautiful Second Life.  Sure, I’ve built large areas at 1:1 using draped LiDAR data, but to have a rather large single building (or at least its footprint for now) in context with existing builds that I’ve seen for months, well, at the moment SL seems larger than I’d thought.  That shift in my perception of SL scale may come from the contrast between flying (quite fast, as it turns out) around 40 to 100 OpenSim regions versus walking around the site and knowing how long it takes to traverse the RL building.

Anyway, check out the build’s progress at secondlife://Stanford/100/235/30

No responses yet

Jan 13 2009

Meta-machinima, Berkurodam for sale, OpenSim server offline

After two years, it seems time to work on a new big build.  In the interest of conserving SL resources, I’m looking to get enough from selling the Gualala land and Berkurodam build to purchase adjacent land for a new build at the Stanford site.

I’m interested in selling the Berkeley build to architecturally-minded SL folk, so that with a properly sized and shaped parcel, I can do another RL-scale model build (not based on any location in the East Bay). Work circumstances have changed and I’ll be spending much less RL time in Berkeley, so inquiries are welcome care of darb (at) simgis.com.

Unfortunately, the public-facing OpenSim server that was loaded with the 40-region, 1:1 scale UC Berkeley model has now been taken offline and is in search of a data center slot.

Meanwhile, just to prove that I’m still around the metaverse, I’ve made my first meta-machinima. Using the YouTube MP4 streaming service, which is apparently available for any uploaded video, one can map a texture into a video stream as part of parcel media settings on Linden servers.  This machinima was shot at the Gualala Level 3 Berkurodam build.

For those with the site blocked, the URL is http://www.youtube.com/watch?v=Ntjkj4eyQvM
I’m embedding the video below.

No responses yet

Dec 11 2008

A time for OpenSim reflection – standalone Linden servers on the horizon

Silent though these pages be, much has been thought.  I’ve had some quality time with inquisitive Lindens and learned to expect some sort of standalone Linden server product along about 2009.  For me, that’s a game-changer as it’s hard enough to suggest (at work) creating content without also keeping up with an open source thread to stand that content up upon.

This past week I’ve made a real-life geographic shift for a family event, and learned that I’ve got a relation involved in the study of architecture.  That insight has reinvigorated my interest in Jon Brouchoud and some of his writing here.  The notion of architecture as an academic subject, versus architecture as a current professional practice, and the disruptive possibility of widespread virtual world deployment—this is not so different from geographic science as an academic subject, GIS as a professional practice, and the possibility of immersive 3D disruption of the status quo.

Others in academic circles, including University College London’s Centre for Advanced Spatial Analysis (CASA), published a Working Paper around the time this past summer when I was so focused on my 1:1 immersive build.  It was gratifying to see CASA acknowledge Second Life technology’s place in the world of neogeography and geospatial informatics.

“Sitting side by side yet somehow abstracted from mapping, gaming and digital earths is Second Life and other similar virtual environments. Second Life and their like are easy to dismiss as pure distraction and entertainment. Yet look under the lid of Second Life and it contains one of the most powerful geographical data visualisation kits available”

And the fine writing and attention to detail of Jon / Keystone were spotlighted in the NY Times’ Style magazine this past weekend.  It was a pleasure to share that link with architecture students!

It’s a big world, and if we are to identify applications that consume vast tracts of GIS data, immersive 3D systems must balance quality and performant physics against an economically practical way to serve large land areas to relatively sparse users.  Spanning that scale will require that standalone Linden servers have the ability to shortcut some of Havok’s demands in order to pile many more than four regions onto a physical server.  After all, if I can get 100 regions stood up on a 1 GHz Celeron using OpenSim, then a four-threaded dual 64-bit Xeon server really ought to do the same for standalone Linden regions, right?  I surely hope so.

One response so far
