Archive for the 'Terrain Models' Category

Jul 07 2017

Unity 3D – Camera Build

One of the beauties, and for some of us the terror, of Unity is its nature as a gaming engine.  Basically it is a sort of 4D Integrated Development Environment (IDE) that makes it relatively efficient to design and build prototype or production game applications—including serious games.

The terror is that Unity is basically an extraordinary IDE for what is most likely C#, and the built-in components are very versatile yet very primitive, which can mean starting from very little.  The beauty includes the experience of learning how to create just about anything from those primitives, and realizing with some wonder that it’s possible not only to develop, but to actually become productive inside a week of evenings.  Once I attained a basic understanding of the various (sometimes virtually, literally) moving parts, it was not difficult to identify a feature and add it in one evening, over and over again, to the point where things really accumulate.

Along with the beauty was an occasional burst of joy, as certain coding errors produce psychedelic visual experiences.  It has been very satisfying to reconnect with some old friends, the Quaternions.
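
Those old friends earn their keep because rotations compose by simple multiplication. A minimal sketch of the algebra in plain Python (Unity’s C# Quaternion type wraps the same math):

```python
import math

def quat_from_axis_angle(axis, angle_rad):
    # Unit quaternion (w, x, y, z) for a rotation about a unit axis.
    s = math.sin(angle_rad / 2.0)
    return (math.cos(angle_rad / 2.0), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(a, b):
    # Hamilton product: the rotation "b, then a".
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def quat_rotate(q, v):
    # Rotate vector v by unit quaternion q: q * v * q-conjugate.
    w, x, y, z = quat_mul(quat_mul(q, (0.0,) + tuple(v)),
                          (q[0], -q[1], -q[2], -q[3]))
    return (x, y, z)

# Two successive 90-degree turns about z compose, by one multiplication,
# into a single 180-degree turn.
yaw90 = quat_from_axis_angle((0, 0, 1), math.pi / 2)
q = quat_mul(yaw90, yaw90)
print(quat_rotate(q, (1.0, 0.0, 0.0)))  # the x axis lands near (-1, 0, 0)
```

Get the conjugate or the product order wrong and objects spin in unexpected directions, which is exactly where those psychedelic coding errors tend to come from.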

No responses yet

Feb 13 2011

Not asleep, just spread thinly.

Published under OpenSim, Terrain Models

As models of surface water flow for all of Marin County get ready for review, there’s also a convergence with the large-scale topographic base (LSTB) map effort.

In fact, the hydrologically-enforced flow lines (HEFL) are getting their first chance to fit into the LSTB context.  This has helped immensely to clarify the settings in which most folks will actually encounter the flow lines.

Also, the Spring 2011 class in Spatial Analysis at College of Marin has been underway for three weeks now, and the students have turned out to be the best-qualified of any I’ve ever had the pleasure of teaching.

While cleaning up some web presence matters, I stumbled across this interview that relates to OpenSim and the sorts of themes that got this blog going in the first place.

The Android phone is getting some heavy use with its rooted install of Android 2.2/Froyo, in the form of Chromatic 4.5.  It feels like the G1 hardware is really near its limit, and I’m not anxious to move it too much farther forward.  It really is time to start looking where to move next—and Android will most likely be the foundation, but exactly what mix of GPS, compass, gyro, and connectivity will be there when I upgrade this summer?  It’s a bit exciting to start tracking developments now to help make a better-informed choice when the time comes.

The Cr-48 has been amazingly fun.  It’s been moved to a home-built image of Chromium OS, and back, and forth, and back again.  It’s been made to work on T-Mobile network, although only at EDGE speeds.  It’s doing a very fine job of scrounging print resources with Google Cloud Print.  Hey, it’s even got its own blog going now, to save clutter here.

Keeping up with class work, and preparing for lecture and lab time each week, is taking time from writing, while at the same time providing so much more to write about!  Most likely it will all balance out by June when the semester is finished.

No responses yet

Jan 07 2011

Terrain progress linked to Community Basemap Large-Scale Topo

The ESRI Community Basemap program’s new ArcGIS 10 template for Large-Scale Topographic Mapping is somewhat streamlined and cleaner than its ArcGIS 9.3.1 predecessor.  But the template map document still has the ability to bring my workstation to its knees when doing things like printing.  I find myself unable to print 11×17 or export a B-size PDF without getting incomplete results.  Mercifully, I can print and export at A-size, so people are starting to get a taste of what is to come when we start building cache tiles.  Everyone who has seen the base maps was quite pleased, and some were almost gushing over them.

Preparing input features to pour into the template geodatabase that are good enough to be worthy of 1:1200-scale mapping is every bit the challenge it might seem.  Each feature class making its way into the template seems to require a unique bit of spatial analysis to pull together, usually by combining our best available input data in ways we haven’t tried before.  This week’s challenge has been the water polygon and water line feature classes.  For utility, we’re choosing to split the water polygon from its water line.  For lakes and ponds they will coincide, but along our tidal coast we’re conveying extra information by setting the polygon to represent Mean Low Water and the water line to represent Mean High Water.  This helps to convey the widely varying slopes that Marin County has at the tidal shores, and provides some hint as to the width of the public access way below the high tide line.

These tidal shores have been compiled from the 25cm-interval topo contours that were generated for the Community Basemap; in NAVD88 we used the nominal 50cm contour for Mean Low Water and the 175cm contour for Mean High Water.  Between these elevations one often finds various “water” polygons depending on the application; in our base map we’ll have a purposeful gap.
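
As a toy illustration of that purposeful gap: between the two datums, the intertidal band is just an elevation filter on the surface. The grid and elevations below are invented; the real work traced contour lines rather than masking rasters.

```python
import numpy as np

# Toy elevation grid in meters NAVD88, standing in for the 25cm-contour surface.
dem = np.array([
    [-1.0, 0.2, 0.6, 1.2],
    [ 0.0, 0.5, 1.0, 1.8],
    [ 0.4, 0.9, 1.6, 2.5],
])

MLW, MHW = 0.50, 1.75  # nominal Mean Low Water and Mean High Water, m NAVD88

water_polygon = dem <= MLW               # cells inside the water polygon
intertidal = (dem > MLW) & (dem <= MHW)  # the purposeful gap between the lines

print(intertidal.astype(int))
```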

Along the Sonoma borderlands I struggled for a few hours hand-tracing ponds using NAIP 2009 near-infrared grayscale—the one where standing water is basically black.  I was a bit sloppy and found myself tracing at 1:4000 on screen, but it was a fairly thorough job through the adjacent shared watersheds along Estero Americano, Stemple Creek, San Antonio Creek, and Petaluma River.  I was just about fed up with drawing lines around standing water when I checked in with MarinMap member Bill Voigt of San Rafael for other reasons.  He provided an inspired suggestion that I use the water lines from our 2004 photogrammetry.  Of course!  I had used them in the terrain, but they were not hard constraints, and the sparse contouring in west county lands tended to wash over the stock ponds.

Empowered by Voigt’s suggestion, I found that the 2004 photogrammetric water line work, available only within Marin County proper, had very high fidelity with the 2009 NAIP summer ponds.  It was vastly easier to select the pond-circumscribing lines, sometimes 300 segments at a time, copy and paste them into the Inland_Waters_li features, Merge them all at once into a single compound line, then Explode the multi-parts into discrete ponds.  With that approach I was able to harvest hundreds of Marin stock ponds, reservoirs, and vernal pools, with geometry that appears smooth and accurate at better than 1:2400.  There are also quite a few meandering creek features in the west county that are well represented horizontally in the water break lines, but my impression is that they would take a lot of upgrading to serve as 3D guides in the terrain.  Construction rules broke the pond lines wherever a drainage reached a pond’s edge, so there were a great many segments that needed to be merged to create closed loops that could build pond polygons.
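
Conceptually, that Merge-then-Explode step is re-chaining broken edge segments into closed rings. A rough pure-Python sketch of the idea (not the ArcGIS implementation, and the sample coordinates are invented):

```python
def stitch_rings(segments, tol=1e-6):
    """Chain line segments end-to-end into closed rings.

    Each segment is a list of (x, y) vertices. This mimics, in miniature,
    merging broken pond edges and exploding them into discrete outlines.
    """
    def close_enough(p, q):
        return abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol

    pool = [list(s) for s in segments]
    rings = []
    while pool:
        chain = pool.pop()
        grew = True
        while grew and not close_enough(chain[0], chain[-1]):
            grew = False
            for i, seg in enumerate(pool):
                if close_enough(chain[-1], seg[0]):
                    chain += seg[1:]            # append forward
                elif close_enough(chain[-1], seg[-1]):
                    chain += seg[-2::-1]        # append reversed
                else:
                    continue
                pool.pop(i)
                grew = True
                break
        if close_enough(chain[0], chain[-1]):
            rings.append(chain)
    return rings

# A square pond outline broken into three pieces, in arbitrary order/direction.
segments = [
    [(0, 0), (1, 0), (1, 1)],
    [(0, 1), (1, 1)],          # drawn opposite to the ring direction
    [(0, 1), (0, 0)],
]
rings = stitch_rings(segments)
print(len(rings))  # one closed pond ring
```

Once a ring closes, building the pond polygon from it is the easy part.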

By the end of this week, we had over 1400 water bodies identified in the watersheds that touch Marin County, and this exercise, initially motivated by improved cartography, may augment our set of candidate sites for wetland inventory.  These water features are set into a regional water layer from The National Map that is valid at 1:24k and includes a refined coastline from Big Sur to north of Point Mendocino, and inland to the Sierra crest, from San Luis Reservoir up to Mendocino Lake.

We refined Marin and adjacent shorelines (as both MHW and MLW boundaries) from Sears Point, up to and down from Petaluma, along San Pablo, San Rafael, Richardson and San Francisco Bays, the Golden Gate, Gulf of Farallones, Drakes Bay, outer Point Reyes, Tomales and Bodega Bays, and Bodega Head using 25cm interval topographic contours.  All Farallon Islands were traced near 1:2400 scale as visible in NAIP 2009 near infrared band.

Next week should find the last couple of administrative layers (Parks and Open Space) and refinements to road centerlines to reflect docks and proper arterial status, and then it will be time to start making test tiles.  With any luck, we’ll take some CPU cycles from production server to give us the best shot at getting good tiles fast.  With 120 layers, several of which use representations, and many of which use complex Maplex rules for labeling, the Community Basemap Large-Scale Topographic template for ArcGIS 10 is by far the most demanding map document I’ve ever manipulated.  Everyone is looking forward to getting the tiles generated for testing, and sending them off to ESRI for review while sending samples to MarinMap members for their comments as well.

No responses yet

Dec 16 2010

Many changes in a month – AGU Fall Meeting 2010 and Cr-48

Each year for the past 40 or so years, the American Geophysical Union has met in San Francisco around December for what is now the Fall Meeting.  I’ve been an AGU member for about 28 years, and for a time was attending each and every Fall meeting—but these days it’s about once every three years.

This was one of those years, and it was a great pleasure today running into a good handful of friends and former school colleagues from years past!  Also, it was much fun to present a poster that summarized an analysis of synthetic flow lines built on the integrated topographic-bathymetric surface model.  Basically, with a very detailed 3D surface grid that runs continuously from mountaintop to offshore out to the 3-nautical-mile legal boundary of California counties, it is possible to draw streams as they would have flowed when sea level was lower, like 7000 years ago.

Much of the interesting topography from those streams got clobbered by sea level rise.  As the Ice Age retreated and continental glaciers melted out, the waves from the Pacific Ocean pounded the coast back to where it is today, and planed off much of where the streams once ran.

With ArcHydro-style drainage analysis on our terrain model that has fused detailed multibeam bathymetry from the California Seafloor Mapping Project, it is possible to identify extremely subtle signatures in the portion of the offshore platform that is Santa Cruz mudstone formation, a harder Miocene formation that expresses bedding in its surface.
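
For readers who haven’t met this kind of analysis, its heart is the D8 rule: each cell drains to its steepest-descent neighbor, and accumulating those links makes channels stand out. A toy sketch of the idea (ArcHydro’s production tools are far more elaborate):

```python
import numpy as np

# Toy DEM with a central channel, standing in for the integrated
# topo-bathy surface (the real work uses ArcHydro tooling in ArcGIS).
dem = np.array([
    [5.0, 4.0, 5.0],
    [4.0, 3.0, 4.0],
    [3.0, 2.0, 3.0],
    [2.0, 1.0, 2.0],
])

NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_downstream(dem):
    """Map each cell to its steepest-descent neighbor (None at pits/outlets)."""
    rows, cols = dem.shape
    down = {}
    for r in range(rows):
        for c in range(cols):
            best, best_drop = None, 0.0
            for dr, dc in NEIGHBORS:
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = (dr * dr + dc * dc) ** 0.5
                    drop = (dem[r, c] - dem[rr, cc]) / dist
                    if drop > best_drop:
                        best, best_drop = (rr, cc), drop
            down[(r, c)] = best
    return down

def flow_accumulation(dem):
    """Count of cells draining through each cell (itself included)."""
    down = d8_downstream(dem)
    acc = np.ones(dem.shape)
    # Visit cells from highest to lowest so every upstream count is final
    # before it is handed downstream.
    for cell in sorted(down, key=lambda rc: -dem[rc]):
        if down[cell] is not None:
            acc[down[cell]] += acc[cell]
    return acc

acc = flow_accumulation(dem)
print(acc)  # the center column accumulates flow, hinting at a channel
```

Symbolizing the synthetic flow lines by that accumulation value is what lets the larger catchments, and any lateral offsets in them, pop out visually.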

With the analysis, when synthetic drainage paths are symbolized to emphasize flow lines with greater catchment area one can observe suggestions of right-lateral offset.  In California, this is a signature pattern for tectonic offset of drainages that cross strike-slip faults with right-lateral offset.  Because the formation where the analysis has detected possible offset is older (Miocene is more than 5 million years old, but the offset is perhaps only in the last 1 million years), this result should not cause much excitement with regard to modern seismic hazard.  It could however prove helpful to those who would decode the geologic structure of the Point Reyes peninsula.

(comments on the Cr-48 have been moved to its own blog)

No responses yet

Aug 27 2010

Marin County Topo-Bathy Surface “tbsm45cm” Released as 2010.08

Blemishes notwithstanding, nearly six months of back-burner work has reached a threshold of readiness and is outward bound to some engineering firms, flood mappers at FEMA, and interested parties within the county. A handful of known issues remain unresolved. Proper name is “tbsm45cm_20100823”, proper edition is “2010.08”.

This is the third version of the terrain.  The second version was “2010.01”; it included multiple LiDAR data sets, though fewer than presently used, and was a topographic model only.  The first version was “2009.09”; it was mainly photogrammetry and FEMA LiDAR, and was the last version to be developed in California Coordinates.  Once the massive NCALM LiDAR data sets were processed, it became easier to move everything into the WGS84 UTM Zone 10 North (meters) projection, with NAVD88 (CONUS Geoid 2003) vertical positioning.

The NOAA utility program VDatum, a brilliant Java-based application able to stream-process data sets of near-infinite size, brought the NCALM data to heel, and opened up decades of NOAA depth surveys to our use in integrated topographic-bathymetric surface modeling.

First-return NOAA ALACE LiDAR swaths were fused along the outer coast, as bare-earth filtered versions were not produced in 1997–2002; the benefits of LiDAR detail along the rocky coast do seem to outweigh the distracting appearance of structures near Rodeo Lagoon, Stinson Beach, and outboard Bolinas.

When ArcGIS 9.4 beta 2 reached its limit in ability to render the terrain dataset into 45cm grid over the full extent, the clipping quadrants created to resolve this problem ended up chopping a very small portion of Sonoma county that drains into Estero Americano; the full watershed remains intact in the 1-meter version of the terrain grid under analysis for county-wide hydrology. Likewise, the tighter clipping quadrants lost a few hundred meters of San Pablo Bay bathymetry just west of where Marin, Sonoma, Contra Costa, and Solano counties meet. Also, tighter clipping quadrants snipped a portion of the San Francisco Bar southerly of San Francisco’s Seal Rock that was intended to be part of the model. All of these areas exist in the 100cm grid, and will be part of drainage analysis.

Happily, we have updated the workstation to ArcGIS 10, and have been enjoying such great speed gains with Spatial Analyst that our ERDAS use has been noticeably reduced.  Finally, Spatial Analyst is often showing performance nearly on par with ERDAS.  Thank goodness that the Raster Calculator survived the transition to ArcGIS 10!

Painfully, the existence of unutilized bathymetric data sets for upper broad-channel Corte Madera Creek and Bolinas Lagoon was revealed this week.  Hey, there’s already something to look forward to for the next build!

The new terrain is getting some immediate use in support of an effort to participate in ESRI’s Community Maps Program for large-scale topographic mapping.  The Program provides a template geodatabase with 36 vector feature classes and two rasters, into which local agencies may pour their data.  Once tucked into a conforming schema, a template multi-scale map document is provided with 120 layers—30 at each of four large scales that correspond to the Google Maps and Bing Maps projection and cache tiling schema.  The difference is that the template document makes use of ESRI tools to allow much more local detail to be packed into a map designed with notably more sophisticated cartography than either Google or Bing maps now have.  The Community Maps Program concept is that local agencies may publish their local detailed content in a fairly uniform style, while retaining a world-wide seamless context for their surrounding area.

Qualitatively, the effect is that, when viewing the topo map alongside either Google or Bing maps (on two monitors, with comparison made at the same scale), the map looks to be a larger scale. It isn’t, and I’ve measured the size of features to convince myself, but my mind insists that I’m zoomed farther in on the map for some reason. My guess is that it is a perceptual effect of the much greater amount of information that is cleanly displayed in the map versus the much sparser Google and Bing content at these large zoom levels. Try it out—it’s like a carto version of an optical illusion!

The 120 layers in the template large-scale topographic base map from the ESRI Community Maps Program are arranged to provide four precise cartographic designs for Google/Bing map cache levels 16 through 19, which correspond to these display scales:
1:15000–1:6001 (level 16, a.k.a. ~9k)
1:6000–1:3501 (level 17, a.k.a. ~4.5k)
1:3000–1:1501 (level 18, a.k.a. ~2k)
1:1500–1:501 (level 19, a.k.a. ~1k)
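
Those nominal scales fall out of the Google/Bing tiling scheme, in which each level halves the scale denominator; assuming the usual 96-dpi Web Mercator constant at level 0:

```python
LEVEL0_SCALE = 591657527.591555  # nominal 1:x denominator at level 0, 96 dpi

def nominal_scale(level):
    # Each cache level halves the scale denominator.
    return LEVEL0_SCALE / (2 ** level)

for level in (16, 17, 18, 19):
    print(level, round(nominal_scale(level)))  # ~9k, ~4.5k, ~2k, ~1k
```
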
One of the most attractive areas currently online is Toronto, ON, where at levels 18 and 19 individual building outlines are graced with street addresses.

Anyhow, the new tbsm45cm model will serve County of Marin’s effort at large-scale topographic mapping in several ways.  First, it has made possible a very detailed hillshade that helps emphasize the grading around each hillside structure in the county.  Second, it helps us create the required metric topographic contours.  These are necessary to meet world-wide mapping standards, and throughout this weekend, contours are being generated from a related (smoothed) version of the terrain at a 50cm vertical interval.  Needless to say, most of these won’t get used in the map renderings, but the ESRI cartographers have shared a very clever indexing scheme that will help us use this single set of metric contours to support the requirements for all four of our topographic map scales.
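
ESRI’s actual indexing scheme isn’t reproduced here, but one plausible way such an index works is to tag each contour with the coarsest interval its elevation lies on, then filter the tags by display scale; a purely hypothetical sketch:

```python
def contour_index(elev_cm, intervals_cm=(1000, 500, 250, 100, 50)):
    """Tag a contour with the coarsest interval (cm) its elevation lies on."""
    for interval in intervals_cm:
        if elev_cm % interval == 0:
            return interval
    return None  # not on the 50cm lattice at all

# Filtering by tag then picks contours per display scale: e.g. draw only
# tags >= 250 at the smallest of the four scales, everything at the largest.
print(contour_index(1000), contour_index(750), contour_index(150))
```

One set of 50cm contours, four layer definitions: no regenerating contours per scale.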

No responses yet

Mar 12 2010

Sharing Terrain With the World – Google Earth style

It’s not fully 3D immersive, but hey, 2-1/2D ain’t half bad. The “dsm40cm” model of Marin County has been published as the county’s default terrain on Google Earth. It’s a great pleasure to work with folks who are not troubled by a county representing its surface on a 40cm single-precision float grid that weighs in at 77 GB. In terms of data bulk, that is about the same as the entire 30-meter version of the US National Elevation Dataset.
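
Some rough arithmetic (treating the quoted size as binary gigabytes and ignoring any headers or compression) shows why a 40cm county grid lands in that weight class:

```python
bytes_total = 77 * 1024**3          # quoted size, taken as binary gigabytes
cells = bytes_total // 4            # single-precision float = 4 bytes/cell
area_km2 = cells * 0.4 * 0.4 / 1e6  # 40cm x 40cm cells
print(round(area_km2))              # ~3300 km^2: the land plus bay and ocean
```

That works out to well over twice the county’s land area, which fits a grid extent that also covers adjacent waters.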

What one gets when piling that much detail into a single county of around 520 square miles of land area is every building pad, driveway, and crown of road paving that were resolved. The dsm40cm model was derived from an ESRI Terrain Dataset that incorporates our best available topographic contours (1:4800-scale 10-foot; 1:2400-scale 2-foot), photogrammetric break and water lines, and FEMA and NCALM (GeoEarthScope) LiDAR data sets. The Terrain Dataset currently comprises 40 GB of vector GIS data.

When the finely detailed surface grids were first developed, we broke the county up into 20 work areas to maintain ArcGIS 9.3.1 in a stable and productive state, and 30cm posting interval grids were generated that covered the entire county–at least during development. When necessary, these grid tiles were mosaicked with ERDAS Imagine into a single seamless grid. The 40cm version was produced directly as a single seamless grid using ArcGIS 9.4 beta 1, on a workstation imaged with Windows Server 2003. The WGS84 UTM, NAVD88-Geoid 2003 result was provided to the Google Earth team earlier this year.

As with all GIS data sets, it seems, the more detailed a data set is, the more rapidly it may need updating. In the works for the next year or so are several improvements to the dsm40cm model. First: the photogrammetric break lines will be segregated into steeper sets that tend to run along ridges, and shallower sets that tend to delineate road cuts and building pads. The ridge set will be used as soft constraints to resolve some artifacts where they rise above some contours.
Second: incorporate new LiDAR data as it becomes available. Some data has already been provided for the lowest part of Lagunitas creek, and it appears that Prof. Ellen Hines of San Francisco State University’s Department of Geography and Human Environmental Studies has been funded by USGS to gather LiDAR county-wide this year.

So there will be revisions, but an exciting aspect is to see data flows being brought into existence that support different levels of mirror world development.
Publishing the dsm40cm model in Google Earth is an important (and beautiful) threshold to cross. Making use of the dsm40cm model in county operations such as creek and watershed delineation will be the practical benefit that drives the work in the first place. And before too many more weeks, there may be entirely new approaches to publishing the data in an immersive environment (neither Second Life nor Opensim) to share.

Building pad in Kent Woodlands shows driveway-level detail

Kent Woodlands building pad and driveway, in the shadow of Mt. Tam

No responses yet

Jul 06 2009

OpenSim Terrain notes, and Darb has Process Credit history!

I’d read about this, but never before experienced the agony first-hand.  When extracting funds from SL, the wait for funds to arrive at PayPal was a bit slow.  In fact, in the time it took funds to go from Linden to PayPal, a bamboo shoot in my back yard could have grown taller than me (that’s my RL, not SL, height!), and would have been over 2 meters tall.  Anyway, Process Credits are quite lacking in symmetry with how quickly credit charges can flow into the Linden realm.

During this week of waiting my random prims have been cleared out from Amida and nary a trace of Berkurodam BART Station remains besides a video in Gualala.  The video screen was actually entombed by a neighbor, who may not like it but did not send any message.

Anyway–for me this week is all about generating maps and graphics while keeping up with work.  I’ve generated a 50cm terrain grid for parts of my county where perhaps 150,000 people live.  With computational process improvements I should be able to make production stable enough to generate a 25cm grid.  The point is to model terrain slope and aspect within urban parcels.  OpenSim can pack 64 terrain megaprim sculpties over each region to refine terrain more than the built-in 1-meter postings, and display 10cm orthoimagery at full resolution.
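
That slope-and-aspect modeling can be sketched with simple finite differences; this is a simplified stand-in (production tools typically use Horn’s method), and it assumes grid rows increase northward:

```python
import numpy as np

def slope_aspect(dem, cell_size=0.5):
    """Slope (degrees) and aspect (degrees clockwise from north) from a DEM.

    Uses plain central differences; rows are assumed to increase northward.
    """
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(-dz_dx, -dz_dy)) % 360.0  # downhill azimuth
    return slope, aspect

# A plane rising 1m per meter toward the east: a 45-degree, west-facing slope.
east_ramp = np.tile(np.arange(5) * 0.5, (5, 1))
slope, aspect = slope_aspect(east_ramp)
print(round(float(slope[2, 2])), round(float(aspect[2, 2])))  # 45 270
```

Run per parcel, grids like this give the slope and aspect statistics the urban-parcel modeling calls for.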

Last year, I used first-return LiDAR data of the UC Berkeley campus to generate a 25cm grid for 10cm imagery.  Now, I’m working with bare-earth LiDAR data from FEMA, topographic contours (densified to 1.5m vertex spacing), and most importantly, photogrammetric terrain and water break lines.
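
Densifying a contour to a fixed vertex spacing, like the 1.5m used here, is straightforward linear interpolation along each segment; a small sketch:

```python
import math

def densify(line, spacing=1.5):
    """Insert vertices so no gap along the polyline exceeds `spacing` meters."""
    out = [line[0]]
    for (x0, y0), (x1, y1) in zip(line, line[1:]):
        pieces = max(1, math.ceil(math.hypot(x1 - x0, y1 - y0) / spacing))
        for i in range(1, pieces + 1):
            t = i / pieces
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

# A 6m contour segment picks up interior vertices at 1.5m spacing.
pts = densify([(0.0, 0.0), (6.0, 0.0)])
print(pts)  # [(0.0, 0.0), (1.5, 0.0), (3.0, 0.0), (4.5, 0.0), (6.0, 0.0)]
```

The extra vertices give the terrain builder enough masspoints along each contour to behave like the LiDAR returns next to it.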

All those data get thrown into the mix and built into an ESRI Terrain Dataset, from which I generate TIN and GRID models at various resolutions and extents.  The ESRI ArcGIS 3D Analyst Terrain-to-TIN generator breaks down after about 10 mega-faces (so would I…)  And the ArcGIS Terrain-to-GRID generator seems to drift into Windows-unconsciousness after about 1.0 giga-cells.  So for the grid, I break it down and do the pieces, then merge the tiles using ERDAS Imagine, because the ESRI ArcGIS raster mosaic function does not produce output grids much over 10 GB.  As annoying as learning these ArcGIS limits can be, it is very satisfying (and instructive) to see huge swaths of seamless terrain with great detail once it all comes together.  Thanks to the break lines, many driveways and most home building site cuts and fills are resolved.  And it will be a lot of terrain by OpenSim standards–enough to calibrate terrain for over 20,000 contiguous regions–not that I ever expect to build it all at 1:1 scale!
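
The break-it-down-and-merge workflow starts by carving the full extent into work tiles that stay under the software’s cell budget; a sketch of that bookkeeping, with invented numbers:

```python
import math

def tile_extents(xmin, ymin, xmax, ymax, max_cells, cell_size):
    """Split an extent into an n-by-n set of tiles, none above max_cells."""
    cols = math.ceil((xmax - xmin) / cell_size)
    rows = math.ceil((ymax - ymin) / cell_size)
    n = 1
    while math.ceil(cols / n) * math.ceil(rows / n) > max_cells:
        n += 1
    tile_w = math.ceil(cols / n) * cell_size
    tile_h = math.ceil(rows / n) * cell_size
    return [(xmin + j * tile_w, ymin + i * tile_h,
             min(xmin + (j + 1) * tile_w, xmax),
             min(ymin + (i + 1) * tile_h, ymax))
            for i in range(n) for j in range(n)]

# A hypothetical 60km x 60km extent at 0.4m cells is ~22.5 giga-cells;
# capping each work unit near the ~1 giga-cell limit yields a 5x5 split.
tiles = tile_extents(0, 0, 60000, 60000, max_cells=1_000_000_000, cell_size=0.4)
print(len(tiles))  # 25
```

Each tile then gets gridded on its own before the ERDAS mosaic stitches the seams back together.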

No responses yet

Jun 22 2009

My Second Life tier will soon be history

Sometimes, it just isn’t worth it. Such is my new view of tier, in the context of what matters to me with immersive 3D and GIS. For about six months I’ve continued my hold on some land in the classic Stanford sim of Second Life, without quite being able to work out the boundary changes to just barely squeeze in a 1:1 scale model of a single large building. Even if I had been able to get the parcel into the shape that I needed, I still would not be able to model the structure’s dome with a prim that naturally had the large radius required. Not everyone is trying to model a Frank Lloyd Wright public building; perhaps the land can be better used by someone else with an architectural focus.

I’m scaling back ownership this week to the tier-free 512 square meter level in Second Life. I’m also building up a freshly configured Ubuntu 9.04 Jaunty Jackalope 32-bit server (dual 3.4 GHz Xeon – 4 GB, HP DL360 G4) to do a more serious sort of work with OpenSim. In the past five months I’ve developed some terrain data that can handily provide 1-meter postings over more than 500 square miles. With that much to publish, I really need much, much more than 1/8 of a sim, even a superbly cool sim like Stanford.

View of beautiful Stanford sim with pond features

The orange area is available at L$20/square meter

So if anyone reading this has use for a great 7520-square-meter (< 1/8 sim) mainland location in Second Life with over 40 meters of terrain sculptability, it’s available for L$20/square meter. Discount available for OpenSim community members or known GIS people. With the world’s economy as challenged as it seems to be, I’ve decided that it’s time to focus on where things matter most, and for me now that’s OpenSim more exclusively.

No responses yet

May 24 2009

Some thoughts on geography

updated 2009 05 26

I’ve been waiting for some property boundary issues to resolve in SL, and it’s sort of pitiful to see how long that can take.  It’s with ever more regret that I find myself on the Mainland.  But that hasn’t kept lots of real-world interesting stuff from taking shape.

The following video is not new.  In fact it’s about a year old, but somehow I hadn’t seen it until tonight, and I found it somewhat encouraging. Thanks to O’Reilly and Where 2.0 for bringing these two on stage together!

And the following paean to Google Earth did inspire me, personally. Hey, I was reading road maps at 5, covered my wall completely with National Geographic maps at 10, learned to navigate with nautical charts at 12, and read aeronautical charts and completed an urban planning project at 14. Sometimes, in rare moments, it’s fun when it’s dark and overcast and I’m in an exotic place for the first time and I don’t know the way north; more often, I’ll savor the feeling of knowing which way is north while dreaming.

Meanwhile, back at the lab, the complete set of county terrain is being compiled into an ESRI Terrain Dataset. This will include over 360 million masspoints, merging interpolated 2-foot-interval contour vertices together with FEMA LiDAR mass points, plus break lines and waterlines from photogrammetry. The goal is to use the ESRI Terrain data as a format to stage everything together to produce a 30cm grid interval DEM in the urban areas. With luck, we’ll have that ready about the time that the latest photo mosaic finally gets loaded into ArcSDE successfully. Maybe grids from the Terrain can help create very detailed 3D county models. Hey Wei – we still have inverted terrain in Google Earth at the quarry on San Pedro Point! ;^)

No responses yet

Mar 25 2009

Terrain Tenacity, fresh ortho pixels

Terrain has been in the mix for me quite a bit these past four weeks.  I’ve worked on pushing ESRI ArcGIS 3D Analyst to its limits of masspoint digestibility, trying hard to bring everything into focus at the same time that everything is sinking down to the NAVD88 datum.  An abundant set of waterlines and terrain breaklines has helped make possible some terrain models that appear to be as good as anyone is likely to get from photogrammetric data.  As with LiDAR source, I’m working toward a 30cm gridding interval to sample any reasonable-looking TIN models.

One fascinating aspect of the terrain model is where it ends.  There appears to be a new 1:1200 or 1:4800 shoreline that can be sussed out of some combination of 2.5-foot elevation waterlines, 2-foot elevation contours, and related artifacts.  In fact, it’s a great patchwork of artifacts that must be strung together.  In the tidal flat areas, there is also plenty of need for validation against multiple photos (hopefully shot at times of lower tides).

Adding to the data bulk there’s a new ortho in town, 30cm natural color flown just about two years ago.  There’s hope of extracting it from the grip of California HARN coordinates after it is all mosaicked.

No responses yet