Nov 15 2011

3D Geospatial For Real—not a simulation and Kitely, on-demand Opensim

Thanks to the astronauts aboard the International Space Station for their time-lapse photography at very high ISO, which helps to share some of what their eyes may well see, and of course to Michael König for his care in smoothing the HD video, with a loungy score, too.
Take five (minutes) and watch it on HD in a darkened room. You might find yourself pausing, reviewing, and spending 20 minutes enjoying.

Earth | Time Lapse View from Space, Fly Over | NASA, ISS from Michael König on Vimeo.

I was fascinated by an orange wiggle that turned out to be the astoundingly well-lit India-Pakistan border, around 1000 km long.

Meanwhile, I’m forming some plans for next semester’s course, and have realized that it may well be possible to offer students training in multi-user virtual environments without hacking one of the lab workstations to image it as an Opensim server. Thanks to Maria Korolov’s tireless business reporting over the past few years, I was able to quickly catch up on the new and improved options for cloud hosting of Opensim regions.

Right away it became clear that the business model of Kitely was quite compatible with my modest but area-expansive needs for real-life terrain simulations.  I’ve found it quite easy to get set up with a single region, and that’s a really big start.  I was able to use the latest beta Second Life 3.2 viewer to connect to the latest Opensim 0.7.2 stable release, tweak terrain and set up a few flexi-prims to test the weather.  Nice work technically, and a very nice pricing scheme for my sort of use.  I’m also very sympathetic to Ilan Tochner’s philosophy of “just keep building new regions”—it’s a consistent theme with cloud solutions, and refreshing to see it in connection with Opensim.
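For anyone wondering how a stock Second Life viewer reaches an Opensim grid rather than the main Second Life grid, the key is the login URI. Here’s a minimal sketch; the viewer executable name and grid address are placeholders for illustration, not Kitely’s actual values:

```python
# Build the command line that points a Second Life-compatible viewer at an
# Opensim grid's login service. Viewer name and login URI are placeholders.
def viewer_command(viewer_path, login_uri):
    # --loginuri overrides the default Second Life login endpoint.
    return [viewer_path, "--loginuri", login_uri]

cmd = viewer_command("SecondLife.exe", "http://grid.example.com:8002/")
print(" ".join(cmd))
# A real launch would hand cmd to subprocess.run(cmd).
```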

No responses yet

Nov 01 2011

Google Maps new MapsGL engine – heavy use and lots of love

Published under Google Maps MapsGL

I’ve got a version of streams set up, and we’re poised to update our terrain before running through some cycles of review and revision.

But first, I need to get our community base map updated, and that’s been one detour after another, it seems.  An innocent-sounding reviewer request to straighten out our parks has led to a months-long campaign of updating shoreline, lower-low water inundation, the judicious trimming of private parcels at the high water line, the representation of park lands both named and parceled as well as public access spaces in the intertidal reaches, and more.  That “more” has involved a serious effort to accurately represent our marsh lands in terms of tidal channels, mud flats, and vascular marsh vegetation, all of it based on aerial photography and much of it derived from our latest National Agricultural Imagery Program (NAIP) 1-meter 4-band imagery, which has proven very high quality in both spatial edge content and dynamic range of lighting.  Marsh features both current and fading in less-than-vigorously reclaimed land have been given much attention.  The detail work covers the Petaluma River marsh (downstream of the outflow of San Antonio Creek into the Petaluma River) and the Petaluma River banks.

But that led to the latest detour: those massive steel lattice towers that support electrical transmission lines.  They’re actually very significant landmarks, as well as eminently mappable features.  They were easy, but led to harder stuff: the sub-transmission network.  It’s like you start out with the easy 240 kV lines, and then come back for more.  One day it’s just a few big towers, and then the next thing you know, one’s back for 120 kV, and maybe 64 kV and 32 kV rural lines too.  You know it’s bad when you can’t stop and you just want to locate a few more poles to make it to the next county line…  And then one day you wake up and it’s been 10,000 poles.  ;^)

Anyhow, what began as a mappable affront to wetland areas, tidal marshes in particular, has now turned into a draft electrical transmission and sub-transmission (not single-house distribution) network feature class.  Turns out that the NAIP 2010 imagery, together with 10cm imagery from 2004 in the urban areas and various 30cm sources in other areas and years, has been quite enough to dial in public utility assets, frequently constructed in public rights-of-way, extracted using basic geospatial intelligence techniques applied to publicly available imagery resources.  The catalyst has been the evolved Google Maps MapsGL viewer engine.

I’ve only started to use MapsGL intensively in the past 10 days or so, but it is astoundingly well integrated.  Right now, I have the sense that there is nothing else quite like it out there for public use.  The interface experience was different from the start: the viewer actually suggested that I try it while I was very actively moving between 2D map view and Street View with the mouse wheel.  When I switched over to the new viewer, I was very pleasantly shocked.  The 2D “satellite” view, the “45-degree” views, and Street View were all smoothly mediated by a 3D model textured with imagery from the 45-degree views—using features from the Google Earth plugin.

It was shocking, and something that got me to jump out of my chair to share with a colleague, when I realized that the Earth view was being used to generate a transition between different rotations of the 45-degree view.  Sound obscure?  Consider an oblique 45-degree view looking default north, where you want to look toward the west instead.  Click the compass ring and it will turn, as expected.  What’s not expected (at first) is that the oblique view transition, rather than blanking out and plopping the next view onto the screen, instead puffs out to become a textured 3D sculpty model.  Yes, that means that the buildings, terrain, and trees are shown as they might be when very far zoomed in on Google Earth, and then that view rotates just like it would in a well-handled Second Life viewer, until it settles into the new oblique direction, after which the 3D effect fades and the oblique is presented.

But in one’s mind, crucially, the 3D impression remains and informs the interpretability of the oblique.  The tree that covers the back part of the house now has been ascribed a 3D volume in the user/analyst’s mind, and suddenly makes more sense than the flat 45-degree view would on its own.

Even without the obliques as an intermediary, popping from 2D map view to Street View is mediated by the virtual reality of textured 3D sculpty objects, and this helps make the Street View perspective far more readable in an instant after arrival.

From my perspective, the MapsGL interface engine is a major evolution of 3D GIS, because it uses a simulated 3D textured surface space to mediate among 2D vertical orthoimagery, 2D oblique imagery, and panoramic ground-level imagery.  That it’s public and cost-free makes it compelling to use for meaningful applications.  In the past few days, I’ve been able to follow sub-transmission pole sequences through fairly rugged forested suburban settings, because my moderately detailed GIS imagery allows me to digitize whatever I can see or estimate, while on a full adjacent screen, MapsGL provides sharper orthoimagery, frequent oblique views in urban and adjacent areas, and Street View to tenaciously follow lines as they pass under tree canopy along roadways.


Aug 12 2011

New Flow Lines, and Marin Community Map progress

Published under SL In General

I’m ashamed to see that there have been no posts since May. I have been busy on another site related to Cr-48 Chromebook usage—but that’s not about this stuff.

In the past months, I’ve been grinding on the Marin Community Map, in particular working out the details of how park lands interact with the tidal reaches. This has graded into a representation of tidal lands, a pulling back of water polygons to lower-low water, and the start of harmonization with the San Francisco Estuary Institute’s Bay Area Aquatic Resource Inventory (BAARI).

I’ve spent hours dealing with topographic (elevation)-based definitions of shorelines such as were used in our model of San Francisco Bay Conservation and Development Commission (SF BCDC) jurisdiction. But as it turned out, all of our interesting marshes and tidal lands are tilted down toward the bay—go figure! So using guidelines for delineation that were very aptly documented by SFEI for BAARI, I started returning to the National Agricultural Imagery Program (NAIP) 1-meter, 4-band imagery of tidal lands for photointerpretation. In many cases, the wetlands were more appropriately mapped using the NAIP imagery than they were using terrain-derived contours. Features like tidal channels creep up to higher elevations while maintaining their widths, while contours tend to pinch out at some point and start going back down the other side of the tidal channel.

One of the by-products of all this attention is that I’ve split out the tidal lands around Marin in that span between lower-low water—which will be cartographically filled in with a blue polygon and bathymetry contours—and high water, where the public easement for beaches stops. For consistency, I’ve detailed out every little patch of space between these tidal ranges, all around the county, and only left out places that were plainly in private ownership, like a back yard with a dock. In keeping with BAARI criteria, I’ve used NAIP color infrared imagery to detail out polygon areas for vascular life forms (marshes) and the tidal channels and outboard mud flats around them.
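The band math behind that first-cut separation can be sketched simply: NAIP 4-band imagery carries red and near-infrared bands, and an NDVI threshold splits vigorous vascular vegetation from bare mud. The pixel values and the 0.3 threshold below are illustrative assumptions, not BAARI criteria:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index for one pixel; returns 0.0
    when the denominator would be zero."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

# Tiny synthetic example: two "marsh" pixels, two "mud flat" pixels,
# given as (red, near-infrared) digital numbers.
pixels = [(30, 160), (40, 150), (120, 130), (110, 125)]
values = [ndvi(r, n) for r, n in pixels]
is_marsh = [v > 0.3 for v in values]   # 0.3 is an illustrative threshold
print([round(v, 2) for v in values], is_marsh)
```

A real workflow would run this band math per pixel across the NAIP rasters and then generalize the result into the marsh polygons described above.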

Particularly good views of lower-low water were captured in NAIP 2005 imagery. Fair views of medium tide were found in NAIP 2009 imagery. A nice mix of low tide and improved quality near-infrared band data are in the NAIP 2010 images. In the end, I’m using NAIP 2005 to trace the outer limits of mud flats at lower-low water, and using NAIP 2010 to detail out the extent of marshes, because excellent red contrast makes it easy in that year’s data.

Also, we’ve had significant progress / closure on the ArcHydro generation of flow lines countywide for Marin and associated watersheds. As of now, we have flow models for drainage networks below 1-hectare catchments in all of Lagunitas Creek, and below 1-acre catchments elsewhere. These flow lines have been attributed with catchment area every 10 meters along their length, which has allowed us to provisionally classify them for perennial, intermittent, ephemeral, tidal, or impounded flow. We have also attributed USGS NHD FCode feature codes for every segment, as either a flow-specific creek, various storm drain pipes and ditches, or artificial paths through standing water. All of this is being run on our 45cm topographic-bathymetric surface model, so the ArcHydro flow lines run seamlessly through the tidal reaches and out into deep water. It’s been particularly interesting to see where soft sediments meet granite and other outcrops offshore, as flow lines go from largely parallel sheets to dendritic patterns even when underwater, using this technique.
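As a sketch of how such attributes can drive the classification, consider a rule table keyed on upstream catchment area and elevation on the topo-bathy surface. All numeric thresholds here are placeholders for illustration; the actual criteria used for Marin are not reproduced:

```python
def classify_flow(catchment_ha, elevation_m, impounded=False):
    """Classify one 10 m flow-line segment. The numeric thresholds are
    illustrative placeholders, not the project's actual criteria."""
    if impounded:
        return "impounded"
    if elevation_m <= 0.0:        # below datum: tidal reach or open water
        return "tidal"
    if catchment_ha >= 500:       # large catchment: assume year-round flow
        return "perennial"
    if catchment_ha >= 50:        # mid-size catchment: seasonal flow
        return "intermittent"
    return "ephemeral"

samples = [(1200, 4.0), (80, 12.0), (2, 30.0), (900, -1.5)]
print([classify_flow(a, z) for a, z in samples])
```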

The modeled flow lines can be found at this link.


May 12 2011

My Goodness — it’s full of *stars*… WebGL fun and games

Published under SL In General

I’ve been catching up on this week’s Google I/O 2011 via some videos.  Much of my interest has been on how Chrome is presented, and the video does not disappoint.



The most fascinating insight, in terms of 3D GIS and shared (not yet multi-user) virtual environments, appears to be WebGL.   For the benchmark of performance that leads to hours of time-wasting entertainment, would you care to see Angry Birds?  I’ve only tested this with Chrome 12 on the Cr-48 and Chromium 13 on Ubuntu, but here’s the site

And what’s (much) more, a vision of seamless integration of 2D animation, video, and interactive immersive 3D environments at 25+ FPS — the project at

If you’re like me, and can’t view it in full WebGL glory because you’re on a Cr-48 or some other earlier browser, here’s the trailer to help give you motivation to try out something new in the browser world.



And what really caught my ear, and hasn’t been a top note in yesterday’s blogs, was this announcement: the Chromebook subscription pricing of $20/user/month for education institutions also applies to government institutions.  If this includes the same centralized web-based management of the user cadre, it would seem a very attractive price point.  Right now, a typical well-endowed elementary school might have a cart with 28 MacBooks that gets wheeled around between classrooms.  If 30 Chromebooks weigh in at $600/month, they’ll get replaced under the upgrade program before the cost approaches that of the initial MacBook acquisition, won’t they?  Government offices might stand to get 60%–80% of their users off of Windows desktops and onto something less costly.
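A quick back-of-envelope run of those cart numbers, with the MacBook unit price as an illustrative assumption:

```python
units = 30
sub_per_month = 20     # $/user/month at the education/government rate
macbook_unit = 1000    # assumed $ per MacBook, for illustration only

monthly_cost = units * sub_per_month
months_to_match = units * macbook_unit / monthly_cost
print(monthly_cost, months_to_match)   # 600 50.0
```

So a subscription cart runs $600/month and takes about 50 months to reach the cost of the assumed MacBook cart; if the upgrade program refreshes hardware sooner than that, the subscription comes out ahead.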

The initial pricing of the retail Chromebooks at $500 seems a bit steep, although the Samsung unit will probably have an Atom processor with four threads and some better GPU capability.  It’s the subscription pricing that really seems to be the main kick, because it sounds far more attractive than the retail option.


May 10 2011

Domain Hack Acquired

Published under SL In General

Somehow, it just came in a flash – why not explore a 3D GIS domain hack?  So I did, thanks to the good folks in Iceland.
Most earlier domains still bring one to this blog, but the coolest IMHO is this:



Apr 29 2011

Spring 2011 – Revolution in Interfaces: Narwhal and Windows 7

Published under Cr-48 Notebook

My purpose-built OpenSim workstation is now 40 months old. It was about $900 worth of boxed parts gathered around the Winter solstice in 2007 and built up by January.   With an overclocked Intel Core2 Duo E6550 two-thread processor at 3.4 GHz, 4 GB of memory, and a modestly noisy processor fan, tonight it has been racing its 4-month-old companion machine, built as a learning system for a second grader.  The 40-month-old machine first ran Ubuntu 7.10 x86_64.

The 4-month-old machine was inspired by the Cr-48 laptop design, and features an Intel D510MO system board with a 1.6 GHz Atom four-thread processor, 2 GB memory, and an SSD as its only storage device.  It was gathered as $275 worth of boxed parts around Winter solstice in 2010.  The 4-month-old machine first ran Ubuntu 10.10 x86_64.

Anyway, both machines are running the Ubuntu Narwhal upgrade at the same time.  The older Core2 started first, but the new Atom machine is already at the Cleaning Up state and the Core2, which started its upgrade about 10 minutes earlier, is now five minutes behind the Atom.  While surely some of this is due to the 10x age ratio between the machines, I also take note that it’s a rather sturdy validation of the sort of hardware that sits within the Cr-48.

Atom is restarting to boot into Narwhal at 2305h; Core 2 restarting at 2308h.

Whoa.  There’s a ribbon along the left.  It’s like a color flashback of NeXTSTEP…

You know, the more things change, the more they stay the same.  I’m sort of mildly shocked at how the new Unity interface has changed Ubuntu desktop so much—and yet it’s rather familiar to me and anyone else who used NeXTSTEP about 22 years ago!  If anyone tells you this looks like Apple OS X, pat them on their pointed head and say “It sure does!  But what does OS X look like?”

In the past 10 days I’ve sat in front of the arrival of Windows 7 on my work workstation, and now Ubuntu 11.04 Narwhal here at home.  It is a big change in interface in both cases, and I’m casting about a little bit for some of my familiar points of reference for adjusting system stuff.

In Narwhal, I find that the System Settings are now found at the base of the Off button in the far upper right.  The new Control Center reminds me of some of the CPanel screens I use to configure web domains.

What’s really wild about the Control Center is that it starts to look like the Chrome 12 and up Settings panel.  This is where the experience is a bit discordant for me.  On the desktop, I’m back at NeXTSTEP, but for the system settings, I’m right up here just like ChromeOS.  It was even a touch like that on Windows 7 this week when I installed Internet Explorer 10 preview, and saw interface and settings that looked to me a lot like Google Chrome and Chromium browsers.  Hey, it’s a small world.

The OpenOffice suite has evolved into LibreOffice.  Apps have some shutdown buttons on the upper-left that look like the three jewels in Safari’s interface.  There’s a bit of everything showing up here, and as long as it’s aggregating all the good ideas from the various interfaces, that seems like a good thing.  In a way, that was one of the positive experiences I recall from using MS Windows 95 the first time.  At that juncture, Microsoft seemed willing to incorporate the best keyboard shortcuts, take some ideas from NeXTSTEP, Macintosh, and Windows 3.1, and just include everything together.

That approach suited me far, far better than the Steve Jobs diktat. Remember, he was the one who insisted for almost two decades that the mouse should only have one button, and that keyboards should have clovers imprinted upon them.  That sort of preciousness in design turned me away from Apple, and although it smells sweet once in a while, whenever I drive the interfaces that come out of Cupertino, where others see polish and finely oiled machinery, I see patronizing choices made on behalf of consumers being bled while held in cramped chambers.  But enough of the past!

My last three weeks of systems experience have been revolutionary.  I’ve made my first use of Windows Server 2008 R2 64-bit on a 16-thread machine, and was able to fully tap out its processing resources with various GIS activities.  I’ve started settling in to a new Windows 7 workstation with 12-thread hardware (and struggled to get it more than 25% utilized; more on that soon).  I’ve watched a $275 system configured for a child’s use outrun a 3-year-old overclocked gamer rig.   As I write here on a Cr-48 (with merely two threads), I sense the Chrome interface has either found its way into other interfaces, or Chrome has somehow copied some prescient proto-interface that I never saw.  It’s as if things nifty and new just five months ago with Chrome OS are now getting mainstreamed in Microsoft products (at least as I see IE 10) and in Ubuntu 11.04 (with its System Settings interface update), even as the ChromeOS interface itself evolves toward what’s in Chromium and Google Chrome 13.

It’s all rather good, but this Spring’s revolution in interfaces tells me that we’re on the brink of a post-Microsoft-hegemony consumer and commercial computing world.  Smaller stuff is smarter than one might expect.  Cheaper stuff is more capable than it seemed just a few months ago.

For my work, these new systems capabilities mean that products that were fancy and costly to produce six months ago are now far more attractive because they are both affordable and higher-quality.


Apr 29 2011

ArcGIS Explorer – consuming Large-Scale Topographic Base

Published under SL In General

To help an interested community group,  I’ve today checked out how the draft Marin Community Map service can be draped over terrain with the free ArcGIS Explorer program (for Windows users, at least).

Dillon Beach oblique view

ArcGIS Online and Marin Community Map draped on world topography



Apr 28 2011

Visit to ScienceSim – with new graphics card

Published under SL In General

Testing out the new workstation and its NVidia Quadro 4000 (2GB / 256 CUDA cores), it was a treat to visit ScienceSim again!

ScienceSim - Dakota South region, 2011 04 28

Visit to ScienceSim in Dakota South region


Apr 26 2011

New workstation – New cache

Published under GIS in general, Marin County

The Marin Community Map cache has been draft-built, and can be explored at the highest four zoom settings here

I’m settling in to a new GIS workstation, a custom-configured HP Z400 with twelve execution threads.  Windows 7 reports a Windows Experience Index of 7.3, and running very new browsers is a treat.  The preview version of Internet Explorer 10 checked out thusly with its Windows-loving HTML5 Fish Bowl speed test: two of them running, each with 2000 fish, both hitting 31 FPS.

Dual 1600x1200 screen, two IE 10 browsers, 2000 fish each, 31 FPS

Fast HTML5 Browsing, according to Microsoft

I think that this will meet my needs for now. ;^)




Apr 05 2011

Cache on the Barrel – Community Maps Work

The CPU cycles are burning as I set up some test cache building for our local edition of ESRI Community Maps – Large-Scale Topographic Base Map.  It’s rather a treat to see a bunch of execution threads, and I’m trying my best to keep them all busy!
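The pattern for keeping those threads fed is a plain fan-out of tile jobs over a worker pool. A generic sketch, where render_tile is a stand-in for the actual caching tool call rather than anything from the ESRI toolchain:

```python
from concurrent.futures import ThreadPoolExecutor
import os

def render_tile(tile):
    """Stand-in for real tile-rendering work on one (level, row, col) job."""
    level, row, col = tile
    return (level, row, col, "done")

def build_cache(tiles, workers=None):
    # One worker per hardware thread keeps all of them busy.
    with ThreadPoolExecutor(max_workers=workers or os.cpu_count()) as pool:
        return list(pool.map(render_tile, tiles))

tiles = [(0, r, c) for r in range(2) for c in range(2)]
print(build_cache(tiles))
```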

As I work through glitches here and there, I’m catching up on some of the good information that is online to help folks in my situation, like Advanced Map Caching from the 2010 ESRI Developer Conference.


