Archive for the 'Vision Statement' Category

Jul 23 2016

Esri ArcGIS Pro 1.3

Published under GIS in general, Vision Statement

For this blog I’m starting a bit of a new direction by focusing on Esri tools with available data. Having just completed a draft final version of what will become public-domain building footprints, I’m starting to create 3D City mapping at Level of Detail 1, which I’m inclined to express as version 1.01 of 3D City.

I’m also in Day 11 of quitting ArcMap cold turkey. Or at least well chilled, with three very narrow returns to ArcGIS 10.4.1 for Desktop for specific uses, then back to Esri ArcGIS Pro 1.3 only. A couple of weeks ago we got the last updates to some building footprints. Now, after a couple of days’ work, modeled with bare-earth and first-return LiDAR-derived surfaces, we’ve got extrusions, can run the Procedural textures to give the scene a CityEngine look, and have exported a Level of Detail 1 (call it LoD 1.01) multipatch model.

Gathering statistics from LiDAR was approached in a fairly serious way, with 2 ppsm airborne LiDAR modeled on a 50cm grid with Natural Neighbor interpolation (which uses tessellation) for each of [all ground-classified points] and [first-return points]. With some smoothing to attenuate any scanning marks, a [1st return minus bare-earth] difference surface was also calculated and called [height]. The height grid was carefully smoothed with an unsharp mask just enough to eliminate airborne scanner noise, while preserving as much tree canopy detail as possible.
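For anyone scripting this step, a minimal arcpy sketch might look like the following; the grid names (gnd50cm, ret1_50cm, hgt50cm) and the workspace are placeholders rather than the actual layer names, and a simple 3x3 focal mean stands in here for the unsharp-mask smoothing described above.

import arcpy
from arcpy.sa import Raster, FocalStatistics, NbrRectangle

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = "C:/data/lidar.gdb"  # placeholder workspace

gnd = Raster("gnd50cm")     # 50cm bare-earth surface from ground-classified points
ret1 = Raster("ret1_50cm")  # 50cm first-return surface

# Height above ground, then a light smoothing pass to attenuate scanner noise
hgt = ret1 - gnd
hgt_smooth = FocalStatistics(hgt, NbrRectangle(3, 3, "CELL"), "MEAN", "DATA")
hgt_smooth.save("hgt50cm")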

The new building footprints were cleaned and sorted by Shape_Area descending, and assigned an Area_ID in the range [1–177023]; the Area_ID was then used with the footprint shapes to define a 25cm zonal grid, assigned by maximum cumulative area and snapped to the 50cm LiDAR-derived grids. These 25cm cells were used for Zonal Statistics to Table, after converting the three input grids to integer centimeter precision. Zonal statistics on the original floating-point meter grids would yield neither median nor majority values, which were considered useful for stable roof height detection. So all available zonal statistics were run on integer centimeter grids (50cm sampled) of bare-earth [gnd], first-return [1st], and a blurred difference [hgt], sampled over zones of 25cm cells representing each of the 177,023 building footprints.
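Scripted, that zonal step could be sketched roughly as below, continuing the placeholder names from the sketch above; PolygonToRaster assigns each 25cm cell by maximum combined area, and the surfaces are converted to integer centimeters before Zonal Statistics as Table.

import arcpy
from arcpy.sa import Int, Raster, ZonalStatisticsAsTable

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = "C:/data/lidar.gdb"  # placeholder
arcpy.env.snapRaster = "gnd50cm"           # snap the 25cm zones to the 50cm grids

# 25cm zone raster of Area_ID, each cell assigned by maximum combined area
arcpy.conversion.PolygonToRaster("footprints", "Area_ID", "zones25cm",
                                 "MAXIMUM_COMBINED_AREA", "NONE", 0.25)

# Integer-centimeter copies of each surface so median and majority are available,
# then all zonal statistics per footprint zone
for name in ["gnd50cm", "ret1_50cm", "hgt50cm"]:
    cm = Int(Raster(name) * 100 + 0.5)     # round meters to whole centimeters
    cm.save(name + "_cm")
    ZonalStatisticsAsTable("zones25cm", "Value", name + "_cm",
                           "zstats_" + name, "DATA", "ALL")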

I briefly returned to ArcMap to interactively join the three statistics tables in a more stable way. After that, it was great to get back to Pro 1.3 and edit the schema, renaming and reordering fields, and calculating a few key statistics like minimum ground and median first-return from integer centimeters back into floating-point meters. The minimum ground was used to position each footprint at an absolute elevation based on our bare-earth model “tsm50cm” (topographic surface model at 50cm gridding), and the median first-return was used to position the roof at an absolute elevation.
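A scripted equivalent of that join-and-convert step (done interactively in ArcMap above) might be sketched like this; the table, field, and output names are placeholders, with MIN / MEDIAN following the usual Zonal Statistics as Table output fields.

import arcpy

# Pull minimum ground and median first-return (integer cm) onto the footprints
arcpy.management.JoinField("footprints", "Area_ID", "zstats_gnd50cm", "VALUE", ["MIN"])
arcpy.management.JoinField("footprints", "Area_ID", "zstats_ret1_50cm", "VALUE", ["MEDIAN"])

# Express them back in floating-point meters for the base and roof elevations
arcpy.management.AddField("footprints", "base_elev_m", "DOUBLE")
arcpy.management.AddField("footprints", "roof_elev_m", "DOUBLE")
arcpy.management.CalculateField("footprints", "base_elev_m", "!MIN! / 100.0", "PYTHON3")
arcpy.management.CalculateField("footprints", "roof_elev_m", "!MEDIAN! / 100.0", "PYTHON3")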

On Day 9 of Pro 1.3, I tried to export the shapes with Procedural textures using Layer 3D to Feature Class, and it crashed repeatedly, even with smaller areas of the city. Finally I got down to just Treasure Island, and it worked, with some curious anomalies at edges but with attractive Procedural textures. So I exported to multipatch without color or texture, and got a better-shaped result almost instantly. Then I ran the export for about 1/4 of the city again with Layer 3D to Feature Class, exporting the extruded building footprints to enclosed multipatch 3D boxes, and Pro 1.3 did it in five seconds. Feeling ambitious after that success, I ran the whole city: a set of 177,023 footprints, all Polygon-Z positioned at their lowest NAVD 1988 elevation in meters, with the layer extrusion carried up to the median 1st-return. Not only did that multipatch export complete without crashing, the entire city was done in just 20 seconds.
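For reference, the untextured multipatch export itself is nearly a one-liner in arcpy when run from the Pro Python window with the extruded layer in the active map; the layer and output names below are placeholders, and the extrusion from minimum ground up to median first-return is assumed to already be set in the layer’s extrusion properties.

import arcpy

arcpy.CheckOutExtension("3D")
# Export the extruded footprint layer to an enclosed multipatch feature class
arcpy.ddd.Layer3DToFeatureClass("Buildings_LoD1", "C:/data/city.gdb/buildings_lod1_mp")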

The Procedural textures were very nice to see. They are oh so precise, but for our city the International Building texture is not so accurate. Still, I was able to tune upper-floor height very usefully. And on my little department-issue workstation, I saw all eight CPU threads firing at full while the rendering was taking place. ArcGIS Pro 1.3 runs multi-threaded in just the right way, fully utilizing the workstation while keeping the interface reasonably responsive; it is a very nice balance indeed. I don’t miss ArcMap at all so far!


Feb 09 2010

OpenSim: and now, a word from the Founder [Second Life]

Many thanks to Singularity U, director Matt Rutherford, and to Randall Hand, who brought it to my attention. After chatting at SLCC 2009 this past summer, I appreciate the immediacy of this lecture. OpenSim is discussed around minute 37 (the video is available at 720p HD and is just over 51 minutes long).
Discussion of augmented reality and mirror-world creation in Second Life and virtual world simulators comes just after minute 44.

It’s hard for me to listen to the entire talk just one time and retain the best explanations – but clear and current they are. In a virtual environment, immersed in near-infinite possibilities, Rosedale may no longer be guiding the Second Life ship, but I believe he remains the compass needle.


Jan 13 2009

Meta-machinima, Berkurodam for sale, OpenSim server offline

After two years, it seems time to work on a new big build.  In the interest of conserving SL resources, I’m looking to get enough from selling the Gualala land and Berkurodam build to purchase adjacent land for a new build at the Stanford site.

I’m interested in selling the Berkeley build to architecturally-minded SL folk, so that with a properly sized and shaped parcel, I can do another RL scale model build (not based on any location in the East Bay). Work circumstances have changed and I’ll be spending much less RL time in Berkeley, so inquiries are welcome care of darb (at) simgis.com.

Unfortunately, the public-facing OpenSim server that was loaded with the 40 region 1:1 scale UC Berkeley model has now been taken offline and is in search of a data center slot.

Meanwhile, just to prove that I’m still around the metaverse, I’ve made my first meta-machinima. Using the YouTube MP4 streaming service, which is apparently available for any uploaded video, one can map a texture into a video stream as part of parcel media settings on Linden servers.  This machinima was shot at the Gualala Level 3 Berkurodam build.

For those with the site blocked, the URL is http://www.youtube.com/watch?v=Ntjkj4eyQvM 
I’m embedding the video below.


Jul 11 2008

OpenSim holding the immersive middle ground?

While impatiently waiting for a local build of Mono to complete, I explored the new lively.com from Google Labs (the Mono build left some unused capacity in the XP+IE part of the lab). It was fun to take keyboard knowledge of the SL client and guess the ways to zoom, dolly, pan, orbit, and dive around one’s avatar in Lively—and of course, find everything was there with googlish care.

I read a reminder (from a review of Wagner James Au’s book on the early days of Linden Lab) that an original intent of that Linden crew was to build a representational and immersive model of the real world. And somewhere between rest and awakening grew a fresh recognition about OpenSim-type paraverses. They occupy an application space not quite like any of Linden’s Agni grid, Google’s Lively, or Google Earth, but spanning the gaps in use among them. A paraverse seems such a reasonable effort to pursue, for although it might seem a pedestrian app to describe, once it exists, its fidelity with the real world should allow easier connections to all sorts of business, while offering all the creative possibilities that can derive from human-created worlds, like having both gravity and flying, having weather and having it the way you prefer it, and so on.


Jan 19 2008

Simulator GIS

Published under OpenSim, Vision Statement

Don’t fret about the silence here of the past two months–activity in the lab has been greater than ever before!
The 1 GHz Coppermine PIII with 1.5 GB of memory has had 81 sims squeezed onto it (with mere Basic Physics), and has been tested with three users, loaded with real-life terrain, and offshore areas filled with orthoimage-decked megaprims.

Really – please check out the new screenshots posted on OpenSimulator.org.

More, there’s a new system on shakedown. It’s an ASUS P5KC, with a Core2 Duo E6550 overclocked to 3.4 GHz and 4 GB of DDR2-800 overclocked to 485 / 970 MHz. Ubuntu 7.10 Gutsy x86_64 is getting decked out with x64 VMware Server and a Samba 4 / AD Domain Controller, and soon we’ll check out how far OpenSim can get scaled up from 1:4 closer to 1:1 with these better resources. Oh, and the alpha Second Life client has been working OK on Ubuntu x64 with an NVidia 8600 (x64 driver built and installed with Envy).

Some very fun images of the Berkeley 1:4 sims were prepared for the American Geophysical Union Fall 2007 Meeting in San Francisco, under abstract IN13A-0902 on 20071210. The sim hasn’t changed much since then.

With the new year, and a fresh focus on using OpenSim as the server-side vehicle together with the Second Life client, I’ve felt that my point (the value I see in joining immersive 3D simulators to GIS data for the purpose of building 1:1 maps to work inside) could be made better than by constant reference to Second Life. So the domain stack grows a bit, and will drop off a bit. Please consider hooking to the stacked domains http://blog.simgis.com or simgis.org as well as the original slgis.org and secondlifegis.com if you’ve got an interest in following these developments.

OpenSim 81-region Berkeley, CA


Jun 15 2007

Berkurodam 1.1 has been attained

Nothing like a user conference to motivate poster production! Somehow the chance to share work with perhaps 20,000+ eyeballs at the San Diego Convention Center always adds a bit to the excitement. There are now eight 3-foot by 4-foot color posters that emphasize shots of the progress made on the land surface, buildings, street signs, street lighting, sidewalk lighting, and foliage.

Oh, and there’s one other panel that has SL snapshot images for decoration, but is really a little manifesto of the importance of metaverses (in 2007) to the future of spatial systems.

This is Darb’s manifesto posted for attendees in the Map Gallery of the ESRI International User Conference in the Sail Room of the San Diego Convention Center, 18–22 June 2007.
The attendees are geographic information systems professionals, managers, and supporting industry folk who largely work with maps, map servers, and related technology for a living.

—————————-<>—————————-
YOU WILL SOON WANT A METAVERSE FOR YOUR SPATIAL DATA

Metaverses are immersive 3D computer graphics platforms
– They are not too much like 2-1/2D raised terrain or globes.
– Their objects may not support the vertex model of GIS or CAD,
but use parametric points or U-V maps and raster textures instead
– Through a viewer or other tools, metaverses immerse the user into the 3D model.
– An immersed user is as likely to look up or under as a globe user is to look downward.

If the metaverse holds a model built honoring GIS data, then a metaverse might
– Place the user into the map
– Allow one to stroll through a geodatabase
– Publish spatial data in real-time 3D for very many simultaneous users

Metaverses can allow massively simultaneous at-will rendering in near-real time
– As an example, Second Life is built on grid computing with >5000 processor cores
– Second Life spatial data are integrated parametric point objects and raster textures
(34 Terabytes as of 5 May 2007)
– Second Life supports over 40,000 simultaneous users worldwide with streaming audio and video.
Integrated VOIP is in beta.

Open-source options exist for single regions, and are developing for grids
– Second Life’s producer, Linden Lab, has announced plans to open source their server code
– This would allow cost of hardware / server power / model development to become the
limiting factors for a civic-scale metaverse
– City of Berkeley could stand up a 1:1 scale immersive model on about 512 processor cores,
or 1k cores with a redundant grid

Metaverses typically include a physics engine
– this manages object collisions and optionally provides gravity
– in Second Life, the physics engine in each processor core handles collisions among
up to 15,000 objects in the core’s region.
– the engine does so at 40 Hz (forty cycles per second) to allow rendering throughout
the region as real-time movies for each client.

Metaverses will change your data center expectations
– There will be a desire to build out grid computing
– Performance will be tied to processor cores, while most related resources such as
system memory and disk storage (per core) are not exceptional
– In metaverses, the simulated space expands linearly with the number of regions in your grid.
Second Life has 64K square meters, about 16 acres, at 1:1 scale, for each processor core
– people interested in grid computing are very interested in having processors with more cores
– these people may be equally uninterested in having operating system costs, or even server application costs, scale with the number of cores

SL Darb Dabney, Berkeley, California 20070615


Nov 24 2006

Vision: Second Life Metaversal–Geographic Information Systems Interchange Association

Published under Vision Statement

This is hoped to be the start of a new direction for spatial sloggers: the new world of 1:1 scale immersive mapping. In it, we hope to transcend the scale of mapping that has been a given for centuries, and through the proxy of a user’s avatar, go into and explore the map at full 1:1 scale.

Those who are professionals in the field of geographic information systems, managers of facilities CAD drawings, and government mapping agencies concerned with parcel-scale efforts are welcome to join us and help pioneer the driving of data along the Trans-Metaversal railroad: getting high-quality map data into a persistent 3-d world such as Linden Lab’s Second Life metaverse, so that detailed mapping of streets, sidewalks, curbs, trees, fire hydrants, parcels, buildings and more can be experienced immersively through an easy-to-use application such as the Second Life client.

Compared to the wondrous entertainment and commercial offerings that grew in Second Life (SL) between 2003 and late 2006, putting real cities into SL must at first seem mundane or perhaps even pedestrian. Yet in the early months of 2007 we will quickly be converging toward a world where the best municipal mapping has so many layers with such detail that the most natural way to experience it, provide certain data quality assurance, and even develop a customer service interface will be to load all the physical features up into a single metaversal space such as SL, and then go into that space via avatar and just be in the map.

Much change in the real life (RL) GIS world will come of this, and much good in terms of publishing reference- grade civic models for general use. What it will require in terms of development are tools to translate our 3-d real surface models into SL terrain, translate our parcels into 4-meter grid representations, convert building footprints into low-primitive (prim) approximations for starters, and identify the bottlenecks in how we will be converting our point, line, and polygon features into 3-d shells of cubes, sphere, cylinder and related objects, sections, and twists in some automated ways.
