Archive for the 'GIS in general' Category

Jul 23 2016

Esri ArcGIS Pro 1.3

Published by under GIS in general, Vision Statement

For this blog I’m starting in a bit of a new direction, focusing on Esri tools with available data. Having just completed a draft final version of what will become public-domain building footprints, I’m starting to create 3D city mapping at Level of Detail 1, which I’m inclined to call version 1.01 of 3D City.

I’m also in Day 11 of quitting ArcMap cold turkey. Or at least well chilled—with three very narrow returns to ArcGIS 10.4.1 for Desktop for specific uses, then back to Esri ArcGIS Pro 1.3 only. A couple of weeks ago we got the last updates to some building footprints. Now, after a couple of days’ work modeling with bare-earth and first-return LiDAR-derived surfaces, we’ve got extrusions, can run the Procedural textures to give the scene a CityEngine look, and have exported a Level of Detail 1 (call it LoD 1.01) multipatch model.

I approached gathering statistics from the LiDAR in a fairly serious way, with 2 ppsm airborne LiDAR modeled on a 50cm grid with Natural Neighbors (which uses tessellation) for each of [all ground-classified points] and [first-return points]. With some smoothing to attenuate any scanning marks, a [1st return minus bare-earth] difference surface was also calculated and called [height]. The height grid was carefully smoothed with an unsharp mask, just enough to eliminate airborne scanner noise while preserving as much tree canopy detail as possible.
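The difference-surface step can be sketched in plain Python. The grids and the 3x3 mean filter below are illustrative stand-ins, not the actual 50cm grids or unsharp-mask parameters used:

```python
# Sketch: build a [1st return minus bare-earth] difference surface from two
# aligned grids, then lightly smooth it. The 3x3 mean filter is a stand-in
# for the unsharp-mask smoothing described in the post.

def difference_surface(first_return, bare_earth):
    """Cell-by-cell [1st] - [gnd] height grid."""
    return [[f - g for f, g in zip(frow, grow)]
            for frow, grow in zip(first_return, bare_earth)]

def smooth3x3(grid):
    """Mean of the 3x3 neighborhood (edge cells use the cells available)."""
    rows, cols = len(grid), len(grid[0])
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            vals = [grid[rr][cc]
                    for rr in range(max(0, r - 1), min(rows, r + 2))
                    for cc in range(max(0, c - 1), min(cols, c + 2))]
            row.append(sum(vals) / len(vals))
        out.append(row)
    return out

# Tiny invented example: flat ground at 10 m, a cross-shaped canopy at 16 m
gnd = [[10.0, 10.0, 10.0],
       [10.0, 10.0, 10.0],
       [10.0, 10.0, 10.0]]
first = [[10.0, 16.0, 10.0],
         [16.0, 16.0, 16.0],
         [10.0, 16.0, 10.0]]
hgt = difference_surface(first, gnd)
hgt_smooth = smooth3x3(hgt)
```

The smoothing attenuates single-cell spikes while the broad canopy signal survives, which is the trade-off the unsharp mask was tuned for.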

The new building footprints were cleaned, sorted by Shape_Area descending, and assigned an Area_ID in the range [1–177023]. The Area_ID was then used with the footprint shapes to define a 25cm zonal grid, assigned by maximum combined area and snapped to the 50cm LiDAR-derived grids. These 25cm cells were used for Zonal Statistics as Table, after converting the three input grids to integer centimeter precision. Zonal statistics on the original floating-point meter grids would yield neither median nor majority values, which were considered useful for stable roof height detection. So all available zonal statistics were run on integer centimeter grids (50cm sampled) of bare-earth [gnd], first-return [1st], and a blurred difference [hgt], sampled over zones of 25cm cells representing each of the 177,023 building footprints.

I briefly returned to ArcMap to interactively join the three statistics tables in a more stable way. After that, it was great to get back to Pro 1.3 to edit the schema, renaming and reordering fields, and to convert a few key statistics, like minimum ground and median first return, from integer centimeters back into floating-point meters. The minimum ground was used to position each footprint at an absolute elevation based on our bare-earth model “tsm50cm” (topographic surface model at 50cm gridding), and the median first return was used to position the roof at an absolute elevation.
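The zonal bookkeeping can be sketched in plain Python; the Area_ID and sample values here are made up, and `statistics.median` / `Counter` stand in for the zonal statistics tables:

```python
# Sketch: why the grids were converted to integer centimeters. Median and
# majority are order/count statistics, so they are computed here on integer
# cm samples per zone, and the key results converted back to meters.
from collections import Counter, defaultdict
from statistics import median

def zonal_stats(samples):
    """samples: iterable of (area_id, gnd_cm, first_cm) tuples, one per cell."""
    zones = defaultdict(lambda: {"gnd": [], "first": []})
    for area_id, gnd_cm, first_cm in samples:
        zones[area_id]["gnd"].append(gnd_cm)
        zones[area_id]["first"].append(first_cm)
    out = {}
    for area_id, z in zones.items():
        out[area_id] = {
            "base_m": min(z["gnd"]) / 100.0,        # footprint base elevation
            "roof_m": median(z["first"]) / 100.0,   # stable roof elevation
            "majority_first_cm": Counter(z["first"]).most_common(1)[0][0],
        }
    return out

# Invented cells for one footprint (Area_ID 1): gnd and 1st return in cm.
# The 2406 cm first return is an outlier (say, an overhanging tree) that the
# median shrugs off, which is the point of using median for roof height.
cells = [(1, 1002, 1898), (1, 1000, 1900), (1, 1005, 1900), (1, 1001, 2406)]
stats = zonal_stats(cells)
```

With the outlier present, the mean first return would be pulled upward, while the median stays at roof level, which is why median was preferred for roof detection.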

On Day 9 of Pro 1.3, I tried to export the shapes with Procedural textures using Layer 3D to Feature Class, and crashed repeatedly, even with smaller areas of the city. When I finally got down to just Treasure Island, it worked, with some curious anomalies at edges but with attractive Procedural textures. Exporting to multipatch without color or texture gave a better-shaped result almost instantly. So I ran Layer 3D to Feature Class again for about 1/4 of the city, exporting the extruded building footprints to enclosed multipatch 3D boxes—and Pro 1.3 did it in five seconds. Feeling ambitious after that success, I ran the whole city: a set of 177,023 footprints, all Polygon-Z, each positioned at its lowest NAVD 88 elevation in meters, with the layer capturing the extrusion up to the median first return. Not only did that multipatch export complete without crashing, the entire city was done in just 20 seconds.

The Procedural textures were very nice to see. They are oh so precise, but for our city the International Building texture is not so accurate. Still, I was able to tune upper-floor height very usefully. And on my little department-issue workstation, I saw all eight CPU threads firing at full while the rendering took place. Finding ArcGIS Pro 1.3 running multi-threaded in just the right way—fully utilizing the workstation while keeping the interface reasonably responsive—is a very nice balance indeed. I don’t miss ArcMap at all so far!

No responses yet

Mar 23 2016

Practical application: consensus neighborhoods

Published by under GIS in general

Given: a set of overlapping polygon features in a single Feature Class (FC) and ArcGIS for Desktop

1) Add a field of type Double named “value” to the neighborhood FC, and for all features calculate the value 1.0 into this field.

2) Union the neighborhood polygon FC with itself (a self-union) to a new FC named with “_union” appended.
ArcToolbox > Analysis Tools > Overlay > Union

3) Spatially join the original neighborhood polygon FC to the _union FC, ensuring that each polygon will receive a summary of numeric attributes, and checking the Sum box so those received attributes are summed into a useful value. The desired value should appear in the output FC as attribute “Sum_value”. Finish the output FC name with “_union_join”.


4) For visualization, consider converting the _union_join FC to a raster using
ArcToolbox > Conversion Tools > To Raster > Polygon to Raster

Just choose the Sum_value field as the Value field.
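The same consensus count can be sketched without ArcGIS, using plain Python with axis-aligned rectangles as stand-ins for the neighborhood polygons (the rectangles and grid size here are invented for illustration):

```python
# Sketch: each input polygon carries value = 1.0; rasterizing the summed
# value gives, per cell, how many overlapping "neighborhood" polygons cover
# it -- the consensus count that Sum_value represents in the recipe above.

def consensus_grid(rects, width, height):
    """rects: list of (xmin, ymin, xmax, ymax) in cell units, max-exclusive.
    Returns a grid where each cell holds the number of rects covering it."""
    grid = [[0.0] * width for _ in range(height)]
    for xmin, ymin, xmax, ymax in rects:
        for y in range(max(0, ymin), min(height, ymax)):
            for x in range(max(0, xmin), min(width, xmax)):
                grid[y][x] += 1.0   # the "value" field, summed
    return grid

# Three invented overlapping neighborhoods on a 6x4 grid
neighborhoods = [(0, 0, 4, 3), (2, 1, 6, 4), (1, 0, 3, 2)]
grid = consensus_grid(neighborhoods, 6, 4)
```

Cells where all three rectangles overlap carry the highest count, exactly the “consensus” that Sum_value surfaces in the Polygon to Raster output.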


Jun 25 2014

Open Simulator joins the SGeoS build — a strategy for blogging the builds

This is the first of what should be a set of posts that detail a server build process for the San Francisco Enterprise Geographic Information Systems Program (SFGIS) Standard Geospatial Server (SGeoS).  In fact, the build work has been ongoing for several weeks and is concluding here, with OpenSim.

The motivation for including OpenSim in the platform was a desire to provide support for legacy .NET applications that may exist in various departments. In the interest of creating a Microsoft-neutral build that is framed with Open Source components, it was natural to bundle the Mono framework into the SGeoS design.  And while individual department applications are their own business and not part of the standard build, OpenSim serves as an excellent demonstration of the utility of the Mono framework as included on the server.  That, together with my perspective that immersive 3D clearly should be associated with geospatial servers, is why OpenSim is included in the Standard Geospatial Server.

OpenSim is not trivial by any means, and yet it is not such a resource hog that it would be infeasible to bundle it.  What’s more, it is an opportunity to distribute immersive 3D technology packaged with other geospatial capabilities.

Since the build descriptions are being transcribed from a build document that is approaching 80 pages on Google Docs, it seems prudent to break it up into individual modules.  And since WordPress here is configured to show older posts below newer ones, I’ll start with the final modules and post build descriptions for the earlier modules in the days that follow.

The original notion for SGeoS was to have modular build chapters that could each provide a unit of capability.  That way, only selected modules need be configured.  After discussions with VMware engineers, I became intrigued by the notion of making a single server image that could run everything, all at once, and then disabling unneeded features in an actual deployment.  So the build document was initially structured with module-like chapters, but in fact the server builds them all—so it’s worth viewing the build document in sequence.

The modules will probably end up numbering about 10, including packaging for production and possibly default-disabling of most items.  If one watches too closely, it might seem like I’m making a countdown to completion.  But this will start with a stub for deployment packaging, work back through an OpenSim build, and end with imaging an install of CentOS 6.5 onto a new VM guest system.


Nov 15 2011

3D Geospatial For Real—not a simulation and Kitely, on-demand Opensim

Thanks to the astronauts aboard the International Space Station for their time-lapse photography at very high ISO, which helps to share some of what their eyes may well see, and of course to Michael König for his care in smoothing the HD video, with a loungy score, too.
Take five (minutes) and watch it in HD in a darkened room. You might find yourself pausing, reviewing, and spending 20 minutes enjoying it.

Earth | Time Lapse View from Space, Fly Over | NASA, ISS from Michael König on Vimeo.

I was fascinated by an orange wiggle that turned out to be the astoundingly well-lit India–Pakistan border, around 1000 km long.

Meanwhile, I’m forming some plans for next semester’s course, and have realized that it may well be possible to offer students training in multi-user virtual environments without hacking one of the lab workstations to image it as an Opensim server. Thanks to the incessant business analytics of Maria Korolov over the past few years, it was possible for me to quickly get caught up in the new and improved options for cloud hosting of Opensim regions.

Right away it became clear that the business model of Kitely was quite compatible with my modest but area-expansive needs for real-life terrain simulations.  I’ve found it quite easy to get set up with a single region, and that’s a really big start.  I was able to use the latest beta Second Life 3.2 viewer to connect to the latest Opensim 0.7.2 stable release, tweak terrain and set up a few flexi-prims to test the weather.  Nice work technically, and a very nice pricing scheme for my sort of use.  I’m also very sympathetic to Ilan Tochner’s philosophy of “just keep building new regions”—it’s a consistent theme with cloud solutions, and refreshing to see it in connection with Opensim.


Apr 26 2011

New workstation – New cache

Published by under GIS in general, Marin County

The Marin Community Map cache has been draft-built, and can be explored at the highest four zoom settings here

I’m settling in to a new GIS workstation, a custom-configured HP Z400 with twelve execution threads.  Windows 7 reports a Windows Experience Index of 7.3, and running very new browsers is a treat.  The preview version of Internet Explorer 10 checked out thusly with its Windows-loving HTML5 Fish Bowl speed test: two of them running, each with 2000 fish, both hitting 31 FPS.

Dual 1600x1200 screen, two IE 10 browsers, 2000 fish each, 31 FPS

Fast HTML5 Browsing, according to Microsoft

I think that this will meet my needs for now. ;^)




Apr 05 2011

Cache on the Barrel – Community Maps Work

The CPU cycles are burning as I set up some test cache building for our local edition of ESRI Community Maps – Large-Scale Topographic Base Map.  It’s rather a treat to see a bunch of execution threads, and I’m trying my best to keep them all busy!

As I work through glitches here and there, I’m catching up on some of the good information that is online to help folks in my situation, like Advanced Map Caching from the 2010 ESRI Developer Conference.



Mar 14 2011

Shaken Awake – and new terrain product

The enormously tragic megathrust event east of Honshu island in Japan has been in my thoughts throughout the weekend.   It is natural to focus on the human loss and images of civic destruction; I’ve got little to add to that story.

When I saw the first news items, it was already Friday morning and eight hours past the event.  My concerns were for Hawaii and the arrival of a tsunami.  I was shocked by the energy released, as reported at NEIC:  Mo=3.9×10^22 Nm,  Mw=9.0 — for as many earthquakes as have been recorded and studied in Japan, this was a much larger event.  For disaster planning, that is Not A Good Thing.

Moment Tensor USGS/WPHASE  Mo=3.9x10^22 Nm

The beach ball diagram shows compression back up the thrust. Depth of event is 24km

While there was some concern for Hawaii, there is something to be said for the diffraction caused by the long line of seamounts extending northwesterly from the big island.  While the energy won’t be cancelled out, the height of the crest would be widened some.  By the time the tsunami reached California, it was a real concern and a literal wake-up call for emergency workers.  Our damage here was mostly an economic annoyance, with Del Norte county once again taking the brunt of the damage in Crescent City harbor.

With a check-in call to relatives in Hawaii saying everything OK, their eyes and mine turned towards Japan.  As I watched shocking videos of tsunami damage, I struggled to calibrate what I was seeing with my mental model of how fragile most human habitation structures are.  I’ve thought much about the effects of shaking, liquefaction, and occasionally about wildland fire.  I’ve read about losses in Marin county during 1982 flooding where debris flows destroyed houses or literally rolled them end-over-end down a slope.  I’d seen some videos from near Sumatra of debris flowing up streets after waves climbed up the beach.  Yet images from Japan recorded damage unfolding at an entirely worse scale.

My concern became much more engaged when I saw helicopter video of the south end of nuclear power reactor facility Fukushima Daiichi (No. 1).  You see, in the video clip I recognized not the reactor, but the cut slope behind it.  In a very strange sense of model-based deja vu, my memories were unequivocal: “I know that place!”.

Why?  Because about 17 or 18 years ago, I was fortunate to be part of a site design project for the (still yet to be constructed) Unit 7 and Unit 8 reactors.  Fortunate for me, because it was a first opportunity to learn not just ESRI Arc/INFO, but how to work with Bentley CAD software to create a 3D cut slope design.  My small contribution was to create a realization of the slope that was not simply faceted as a buffer surrounding the building footprints, but instead gave a semblance of the natural cliff slopes adjacent to the plant, while meeting the engineering requirements for not-to-exceed slope steepness and a more natural-looking accumulation of drainage from the slope.

So apparently, after spending a couple of weeks learning to use 3D design software and creating a design that extended the existing cut farther southward, I had retained a lasting image of the plant, its cliffs, and the breakwater that guided cooling water.  After the flash of recognition with the video, I opened up Google Maps and found the site on my first inward zoom.  It was a bit spooky.

So now when I watch coverage of hydrogen venting that leads to building explosions, I feel a curious terrain-based sense of connection with the site.  I wish them well, and the safest possible return to production.  The TEPCO power is needed by many people.

And more ominous for the Pacific Northwest, I can’t help but reflect on what a similar megathrust event would mean for Cascadia.  Both Portland and Seattle would be in some sort of peril, although I don’t have a clear understanding of how tsunamis are modeled for either the lower Columbia River or for Puget Sound.  But the possibility of a 9.0 megathrust event along the Cascadia subduction zone was a whole lot more abstract for me until last Friday.  It may not be the best time to reflect on it now while Japan is suffering—but the risk to the northwest has been a matter of public record for several years now, and it is not the time to forget that, either.  An event of that size and location would have tsunami implications both locally, and perhaps a far greater risk for windward Hawaii.


Closer to home, this past week we’ve created a prototype 3D product that provides a facsimile of our 45cm terrain model—imported to Google SketchUp 8 with a georeferenced orthophoto texture on the terrain.  Things are not fully tuned up yet, but we are able to take TIN and decimated TIN surfaces out of ArcGIS 10, by way of clipping polygons that are interpolated to multipatch (3D multi-polygon features) and then exported to Collada with a KML point for georeference.  The Collada can be imported to SketchUp, at least up to some limit of detail.  Once there, I’m trying now to find the right way to smooth the facets and improve the rendering of the surface without having high-contrast dark facets—because the orthophoto textures are arriving intact and draping over the surface.  At the moment, though, many black facets are covering up almost half of the orthophoto in an aggravating triangular patchwork.

Next step: smoothing within SketchUp.

And inspiration from others who have gotten gridded terrain into SketchUp.


Dec 16 2010

Many changes in a month – AGU Fall Meeting 2010 and Cr-48

Each year for the past 40 or so years, the American Geophysical Union has met in San Francisco around December for what is now the Fall Meeting.  I’ve been an AGU member for about 28 years, and for a time was attending each and every Fall meeting—but these days it’s about once every three years.

This was one of those years, and it was a great pleasure today running into a good handful of friends and former school colleagues from years past!  Also, it was much fun to present a poster that summarized an analysis of synthetic flow lines built on the integrated topographic-bathymetric surface model.  Basically, with a very detailed 3D surface grid that runs continuously from mountaintop to offshore out to the 3-nautical-mile legal boundary of California counties, it is possible to draw streams as they would have flowed when sea level was lower, like 7000 years ago.

Much of the interesting topography from those streams got clobbered by sea level rise.  As the Ice Age retreated and continental glaciers melted out, the waves from the Pacific Ocean pounded the coast back to where it is today, and planed off much of where the streams once ran.

With ArcHydro-style drainage analysis on our terrain model that has fused detailed multibeam bathymetry from the California Seafloor Mapping Project, it is possible to identify extremely subtle signatures in the portion of the offshore platform that is Santa Cruz mudstone formation, a harder Miocene formation that expresses bedding in its surface.
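That ArcHydro-style drainage analysis boils down to flow direction plus flow accumulation. A minimal D8 sketch in plain Python, on an invented elevation grid rather than the fused topo-bathy model:

```python
# Sketch: D8 drainage. Each cell drains to its steepest-descent neighbor;
# catchment area is counted as the number of cells draining through a cell
# (including itself). Elevations here are invented for illustration.

def d8_flow_accumulation(elev):
    rows, cols = len(elev), len(elev[0])
    downhill = {}
    for r in range(rows):
        for c in range(cols):
            best, drop = None, 0.0
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (dr or dc) and 0 <= rr < rows and 0 <= cc < cols:
                        d = elev[r][c] - elev[rr][cc]
                        if d > drop:
                            best, drop = (rr, cc), d
            downhill[(r, c)] = best  # None for pits / outlets
    # Accumulate from highest to lowest cell; flow only goes strictly downhill,
    # so every contributor is finished before its receiver is visited.
    acc = {cell: 1 for cell in downhill}
    for cell in sorted(downhill, key=lambda rc: -elev[rc[0]][rc[1]]):
        if downhill[cell] is not None:
            acc[downhill[cell]] += acc[cell]
    return acc

# A small valley sloping toward the lower-right outlet
elev = [[9.0, 8.0, 7.0],
        [8.0, 5.0, 4.0],
        [7.0, 4.0, 1.0]]
acc = d8_flow_accumulation(elev)
```

Symbolizing cells by accumulation value is what makes the synthetic flow lines pop out of the terrain, including the subtle offsets described above.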

With the analysis, when synthetic drainage paths are symbolized to emphasize flow lines with greater catchment area one can observe suggestions of right-lateral offset.  In California, this is a signature pattern for tectonic offset of drainages that cross strike-slip faults with right-lateral offset.  Because the formation where the analysis has detected possible offset is older (Miocene is more than 5 million years old, but the offset is perhaps only in the last 1 million years), this result should not cause much excitement with regard to modern seismic hazard.  It could however prove helpful to those who would decode the geologic structure of the Point Reyes peninsula.

(comments on the Cr-48 have been moved to its own blog at


Nov 03 2010

Getting OGC spatial tables back into ESRI ArcSDE on SQL Server 2008 R2

Published by under GIS in general, Spatial SQL

Sure, all the OGC-style spatial queries are great stuff, but when an organization already has very many feature classes in an ArcSDE geodatabase, there’s little appeal to making copies of reference data  for the sole sake of spatial queries.  Yet as of ArcSDE 9.3.1, when installed on MS SQL Server 2008 and 2008 R2, the option exists to bypass the classic SDE_BINARY format for geometry features, and instead use SQL Server’s native GEOMETRY format as the default format for new SDE features created in that geodatabase through ArcGIS.

Sounds great!  Less work!  And yet it often doesn’t work, because valid SDE geometry must still be made OGC-valid.  Three days ago I found myself with a working syntax to apply the (.NET CLR invocation of) MakeValid() method to my existing GIS data, and so I had valid OGC spatial tables available.  However, these OGC tables were still missing a couple of things needed to earn their registration back into the SDE fold.  Things needed, or just good to have, when invoking sdelayer to register an OGC table include:

  1. A defined primary key field with its own clustered index
  2. OGC Spatial indexes (up to four levels available in OGC tables, and three in ArcSDE), with a name to pass along to SDE
  3. A numeric expression of the geometry’s extent, as bounding rectangle limits
  4. A pointer to the geometry’s projection “srid”  in the SDE_spatial_references table, linking it to real SRID as “auth_srid”
  5. A friendly 63-character name to describe the spatial features
  6. A known instance of ArcSDE, such as port ‘5151’ or DirectConnect ‘sde:sqlserver:servername’
  7. A resolvable name or IP address for the ArcSDE server
  8. The name of the ArcSDE database on the server
  9. Login credentials for the database under the ArcSDE instance, most useful with CREATE privileges

One last detail: you’ll likely need to have the administrative parts of ArcSDE installed on your workstation.  This means installing ArcSDE 10, but not configuring it to run as a server locally.  Mercifully, this type of install does not require access to an ArcGIS Server license.  Of course, you’ll use it to perform actions on a server instance where that license was required.  But after that install, your workstation should be able to run command-line administrative tasks on a remote SDE server, most often with either the ArcSDE monitor ‘sdemon’ or the feature class utility ‘sdelayer’, each of which has very many options.

1) Update the OGC table to ensure you have the correct SRID on every Geometry feature

UPDATE [SpatialTesting].user.PARCEL_OGC_LI
	SET Shape.STSrid = 2872

2) Define the primary key in your OGC table using SQL Server Management Studio (SSMS); doing so will create a clustered index for you. In SSMS 2008 R2 this could mean a right-click on the OGC table name, selecting the Design view (second choice), then highlighting the OBJECTID attribute row by clicking its box along the left; right-click and choose Set Primary Key.  Save the change, or close the view and accept to save the change.  This should provide the “gold key” icon for OBJECTID as a primary key, add an entry in the table’s Keys folder, and list an index named like “PK_your_table_name (Clustered)” in the Indexes folder.
Please note that you can’t skip this step and succeed in building a spatial index.  The spatial index appears to require a clustered index on a defined primary key.

3) Create the spatial indexes for the OGC table by right-clicking the table’s Indexes folder and choosing “New Index…” option.
With the General page chosen in “Select a page”, give the index a name (e.g. “spx_50”) and choose Index type: Spatial, then click “Add…”
In the “Select Columns from” popup, hopefully you will have only one spatial column’s row to choose from (e.g. Shape).
Check its box and click OK.
Then in “Select a page”, choose Spatial, and here you will enter bounding coordinates for the OGC Geometry extents.
(e.g. 5816131, 2125631, 6028424, 2312384 as X-min, Y-min, X-max, Y-max)
For Geometry, ensure that the “Geometry grid” Tessellation Scheme remains chosen, and 16 Cells Per Object seems OK thus far.
While SQL Server 2008 offers four levels of spatial index grids, it only has qualitative labels for its options; by contrast ESRI file or SDE geodatabases have three grid levels, but one can specify precise index grid scales.
Given the prompts in the interface, I have typically understood Level 1 / top-level grid to be the coarsest, and chosen “Low”, then at Level 2 chosen “Medium”, and for both Level 3 and Level 4, chosen “High”.   Once you’ve chosen, click OK and wait while the index is built.

4) Once the spatial index has been built on the OGC spatial table, it should be ready to register with SDE and return to ArcGIS service. The example below (less credentials) is what worked for us with a line feature class of over 275,000 features.
Just open up a ‘CMD’ command-prompt window  and invoke the ‘sdelayer’ command with options along these lines:

sdelayer -o register -l PARCEL_OGC_LI,SHAPE -e l -t GEOMETRY -C OBJECTID,SDE
 -E 5816131,2125631,6028424,2312384 -R 2 -S "OGC Version of 2010 10 Parcels"
 -i sde:sqlserver:sqlgisdev -s SQLGISDEV -D SpatialTesting -u <user> -p <password>

And that should do it.

Another bit of syntax that I feel more comfortable with is making use of “UNION ALL” to chain queries together, so that they are easier to visualize in SSMS as overlays.

select Shape.STCentroid().STBuffer(10000) from SpatialTesting.bquinn.COUNTYBNDY_PG_OGC
UNION ALL
select Shape.STExteriorRing() from SpatialTesting.bquinn.COUNTYBNDY_PG_OGC

Takes our county boundary polygon, finds its centroid, buffers that centroid point out 10k feet, and then overlays that centroid blob on the outline of the county polygon, garnered from the poly’s exterior ring. Sounds simple enough, and the pasted code worked.  But 10 days ago I would have been very hard pressed to come up with that syntax!


Oct 30 2010

Tied up in knots with Spatial SQL

The rules for geometry validation are just different between the ESRI world and the OGC world as instantiated by Microsoft in SQL Server 2008 R2.
I’ve been pounding my head against the wall over what I thought were the challenges of taking the fine examples from Alastair Aitchison (a.k.a. tanoshimi on the MSDN forums) presented in Beginning Spatial with SQL Server 2008, and trying to make them happen in useful ways on real GIS data.  That is to say, data that were not created by hand entry or even well-known-text (WKT) entry into SSMS, which tend to be simple single features.

I was particularly concerned reading the first paragraph in Chapter 11: “All of the techniques discussed in this chapter apply only to a single item of geography or geometry data, and generally do not require any parameters.”  As I understand now, Mr. Aitchison meant “All of the techniques as discussed…” and all the various Spatio-Temporal methods like STIntersects() work perfectly well on perfectly valid spatial tables.

I thought that these methods just weren’t going to work on my GIS data, typically large tables with lots of objects rather than the scalar objects of the book examples.  But as it turns out, I had a different problem.  My geometry, as transcribed via Shapefiles and the Shape2SQL tool, was arriving in a form that was occasionally invalid for certain features.  One invalid feature in a spatial table makes the entire SQL statement fail, and errors were reported with reasons that did not help me find my actual problem very easily.  Bad geometry meant that I needed to clean it up by running the MakeValid() method.

After a couple of hours goofing around with T-SQL syntax, I’ve made my first baby steps toward cleaning up my stuff.  Here’s how that is happening:

(select ID, PARCEL, Prop_ID, Shape_area, Shape_len,
    (CASE Shape.STIsValid() WHEN 1 THEN Shape ELSE Shape.MakeValid() END) as Shape
 INTO spatial_01.dbo.Parcel2
 FROM spatial_01.dbo.parcel)

As so often with code, it looks stupid-simple to see it written out in the way that actually works; a major “duh” factor.
The pay-off is that with 100% valid geometry, table vs. table spatial queries are now working.  I’ve not yet broken any speed records in query performance vs. ArcGIS 10, and yet I have finally proven to myself that I can use  Spatial SQL to do powerful geoprocessing that, in some cases, will be easier to write and maintain in SQL than in ArcGIS / Model Builder / Python.

For context, I’ve struggled through the past four days trying to figure out where my Spatial SQL syntax was messed up.  As it turned out, exactly ZERO of my uploaded spatial data tables arrived with 100% valid GEOMETRY data.  Anyway, once I made everything OGC-valid, a query like this

SELECT a.NAME, b.PType INTO spatial_01.dbo.School_Geology
FROM spatial_01.dbo.school2 a, spatial_01.dbo.geology2 b
WHERE a.Shape.STIntersects(b.Shape) = 1

is really all it takes to return the mapped geologic unit beneath the point for every school in the county.
For 120 school points, and 3700 geology polygons, the query above returns its result in about 30 seconds, with no tuning of the spatial indices and no SQL hints provided (yet).

On a last note, this week I have validated the experience of using ESRI ArcSDE 9.3.1 settings to configure SDE (when running atop MS SQL Server 2008) to store its geometry in native OGC GEOMETRY binary form rather than the classic SDEBINARY form.  To do this, someone with db owner privileges can update the SDE_dbtune table with these directions to cause the GEOMETRY format to be used by default.  In principle, this should allow one to exploit the live feature classes in SDE for Spatial SQL queries—without the muss and fuss of exporting through Shapefiles and Shape2SQL (which is not intended for production-environment use, anyway), or need to code something in .NET to export a copy.

Wouldn’t it be nice to have ArcSDE running just like it always has, and then also be able to use Spatial SQL queries on the exact same feature classes that are being used throughout the enterprise?  Sounds great, in principle.  After tonight’s insight into the need to MakeValid() some perfectly functional feature classes, I do have some concerns that SDE feature classes could get broken by MakeValid().  On the other hand, it appears that one can take a Spatial SQL GEOMETRY-based table that meets SDE limitations: (1) only one geometry column, (2) all rows of the table must have the same geometry type, (3) all rows of the table must have the same Spatial Reference ID (SRID) defined—and any SDE feature class that was cleaned through MakeValid() should retain those characteristics—and then register the resulting spatial SQL table once again as an SDE feature class.  Hey, it sounds like it’s worth a try!
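Those three SDE limitations are easy to check programmatically before attempting registration. A minimal sketch in Python, with made-up row tuples standing in for what a query of Shape.STGeometryType() and Shape.STSrid over the spatial table would return:

```python
# Sketch: verify the three SDE registration constraints on a spatial table:
# (1) exactly one geometry column, (2) uniform geometry type across rows,
# (3) uniform SRID across rows. Rows here are invented (type, srid) pairs.

def check_sde_ready(rows, geometry_columns):
    """Return a list of problems; an empty list means the table looks registrable."""
    problems = []
    if len(geometry_columns) != 1:
        problems.append("table must have exactly one geometry column")
    geom_types = {gtype for gtype, _ in rows}
    if len(geom_types) > 1:
        problems.append("mixed geometry types: %s" % sorted(geom_types))
    srids = {srid for _, srid in rows}
    if len(srids) > 1:
        problems.append("mixed SRIDs: %s" % sorted(srids))
    return problems

good = [("LineString", 2872), ("LineString", 2872)]
bad = [("LineString", 2872), ("Polygon", 0)]
ok_problems = check_sde_ready(good, ["Shape"])
bad_problems = check_sde_ready(bad, ["Shape"])
```

Running a check like this before sdelayer registration would catch the mixed-type or mixed-SRID rows that MakeValid() could, in principle, introduce.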

More working through Aitchison’s book has yielded many successful uses of OGC methods on our GIS data:

-- SELECT Shape.STGeometryType() FROM [spatial_01].[dbo].[road]
-- SELECT Shape.STGeometryType() FROM [spatial_01].[dbo].[Parcel]
-- SELECT Shape.STGeometryType() FROM [spatial_01].[dbo].[school_pt]
-- SELECT Shape.STGeometryType() FROM [spatial_01].[dbo].[geology]

-- SELECT Shape.STDimension() FROM [spatial_01].[dbo].[geology]
-- SELECT Shape.STDimension() FROM [spatial_01].[dbo].[road]
-- SELECT Shape.STDimension() FROM [spatial_01].[dbo].[school_pt]

-- SELECT Shape.InstanceOf('Surface') FROM [spatial_01].[dbo].[geology]
-- SELECT Shape.InstanceOf('Polygon') FROM [spatial_01].[dbo].[geology]
-- SELECT Shape.InstanceOf('Curve') FROM [spatial_01].[dbo].[road]
-- SELECT Shape.InstanceOf('LineString') FROM [spatial_01].[dbo].[road]
-- SELECT Shape.InstanceOf('Point') FROM [spatial_01].[dbo].[school_pt]

-- SELECT Shape.STIsSimple() FROM [spatial_01].[dbo].[road]
-- SELECT Shape.STIsSimple() FROM [spatial_01].[dbo].[BldgFoot]

-- SELECT Shape.STIsClosed() FROM [spatial_01].[dbo].[BldgFoot]

-- SELECT Shape.STIsRing() FROM [spatial_01].[dbo].[road]

-- SELECT Shape.STNumPoints() FROM [spatial_01].[dbo].[road]
-- SELECT Shape.STNumPoints() FROM [spatial_01].[dbo].[geology]
-- SELECT Shape.STNumPoints() FROM [spatial_01].[dbo].[school_pt]

-- Find Centroids and blob them out big
-- SELECT Shape.STCentroid().STBuffer(3500) FROM [spatial_01].[dbo].[City]

-- Calculate Perimeter Lengths and sort them out
-- SELECT Shape.STLength() as Shape_len FROM [spatial_01].[dbo].[City] ORDER BY Shape_len

-- Calculate City areas and sort them out
-- SELECT NAME, CAST (Shape.STArea()/1000000 AS INTEGER) as Shape_km2
--   FROM [spatial_01].[dbo].[City] ORDER BY Shape_km2

-- Reading the SRID
-- SELECT Shape.STSrid FROM [spatial_01].[dbo].[City]

-- Setting the SRID
-- UPDATE [spatial_01].[dbo].[City] SET Shape.STSrid = 32610
-- SELECT Shape.STSrid FROM [spatial_01].[dbo].[City]

-- oops, that's wrong, let's set it back to 2768
-- UPDATE [spatial_01].[dbo].[City] SET Shape.STSrid = 2768
-- SELECT Shape.STSrid FROM [spatial_01].[dbo].[City]

-- counting elements
-- SELECT SUM(Shape.STNumGeometries()) FROM [spatial_01].[dbo].[City]
-- SELECT SUM(Shape.STNumGeometries()) FROM [spatial_01].[dbo].[school_pt]
-- SELECT SUM(Shape.STNumGeometries()) FROM [spatial_01].[dbo].[geology]
-- SELECT SUM(Shape.STNumGeometries()) FROM [spatial_01].[dbo].[road]
-- SELECT SUM(Shape.STNumGeometries()) FROM [spatial_01].[dbo].[Parcel]
-- SELECT SUM(Shape.STNumGeometries()) FROM [spatial_01].[dbo].[BldgFoot]

-- Geometric Difference between cities and their own 3500-meter centroid blobs
-- SELECT Shape.STDifference(Shape.STCentroid().STBuffer(3500)) FROM [spatial_01].[dbo].[City]

-- Geometric Symmetric Difference between cities and their own 3500-meter centroid blobs
-- SELECT Shape.STSymDifference(Shape.STCentroid().STBuffer(3500)) FROM [spatial_01].[dbo].[City]

-- Creating 50-foot Buffer around roads
-- SELECT Shape.STBuffer(50) FROM [spatial_01].[dbo].[road]

-- Simplified Buffer around roads, tolerance 8 feet
-- SELECT Shape.BufferWithTolerance(50, 8, 'false') FROM [spatial_01].[dbo].[road]

-- Create convex hull around schools
--SELECT Shape.STConvexHull() FROM spatial_01.dbo.school_pt

-- buffer parcels
-- SELECT Shape.STBuffer(300) AS Shape INTO pcl_seed FROM parcel where Prop_ID = '001-001-01'
-- select * from pcl_seed  -- to verify the selection
-- SELECT Prop_ID, a.Shape FROM spatial_01.dbo.parcel a, spatial_01.dbo.pcl_seed b
--   WHERE a.Shape.STIntersects(b.Shape) = 1

