Planet Geospatial

Directions Magazine: Chuck Croner to be Inducted Into URISA’s GIS Hall of Fame

Directions Magazine: Zillow Announces Acquisition of Trulia for $3.5 Billion in Stock

Directions Magazine: New Ordnance Survey mapping service launched for UK resilience professionals

Directions Magazine: Phase One Industrial Announces iX Capture: A New Application for Aerial Photography

Directions Magazine: Getmapping Posts Strong Results

VerySpatial: A VerySpatial Podcast – Episode 471

A VerySpatial Podcast
Shownotes – Episode 471
27 July 2014

Main Topic: Our conversation with Rick and Rusty from Eating West Virginia

  • Click to directly download MP3
  • Click to directly download AAC
  • Click for the detailed shownotes


  • When I am King by Great Big Sea

  • News

  • OSTP releases National Plan for Civil Earth Observation
  • Google Coordinate now part of Maps Engine Pro
  • Google partners with Environmental Defense Fund to map gas leaks
  • MapBox shows Don’t Fly Here

  • Web Corner

  • Foss4g Academy

  • Main topic

  • This week we feature our conversation with Rick Lawson and Rusty Hefner, hosts of the new blog Eating West Virginia, about their new undertaking and food geography.

  • Events Corner

  • 22nd IAHR Symposium on Ice: 11-15 August, Singapore
  • 5th Digital Earth Summit: 9-11 November, Nagoya, Japan – abstracts extended to Aug. 9th
  • Digital Past 2015: 13-14? February, Swansea, Wales, UK
  • International Symposium on Glaciology in High-Mountain Asia: 2-6 March, Kathmandu, Nepal
  • Directions Magazine: Making Sense of the 2014 Esri International User Conference

    Directions Magazine: Finding OGC WMS, WFS, WCS services

    LiDAR News: Automated Vehicles – Autos vs. Computers

    Brad Templeton explains in this thought-provoking article that there are two cultures thinking about automated vehicles. Continue reading →


    LiDAR News: Using LiDAR to Measure Farm Air Quality

    “It lets us see what’s going on from many dimensions.” Continue reading →


    LiDAR News: rFpro Delivers Driving Simulation Solutions

    A quick introduction to rFpro showing highlights of the latest (2014) build of the Nordschleife Continue reading →


    AnyGeo: First Look Video – Cedar Tree CMP1, mini Android smartphone from @cedar_tech

    I’m pleased to share this one with you, shot on location at the 2014 ESRIUC in San Diego… likely your first chance to have a look at the really awesome Cedar Tree CMP1, mini Android smartphone – My first look … Continue reading

    GeoServer Blog: GeoServer Sunday Sprint (FOSS4G)

    Back in March we had a Save the Date post for FOSS4G 2014 in Portland. Check that your travel plans include participating in the Code Sprint at the end of the conference.

    The code sprint is an opportunity for the team to get together and work on “tough problems without funding”. For GeoServer we have two candidates:

    1. Update to a recent version of Wicket to improve browser compatibility
    2. Update CITE Conformance Tests

    GeoServer is extending the code sprint to include:

    • Saturday, September 13th: FOSS4G is providing facilities at WhereCampPDX.
    • Sunday, September 14th: Boundless is arranging facilities at nedspace from 10am-4pm.

    To attend, add your name to the OSGeo wiki page, and we look forward to seeing you in Portland!




    Suite 250, SW 11th Avenue, Portland

    Thanks to Mike Pumphrey for arranging the venue for the Sunday Sprint.




    GIS Lounge: National Transportation Atlas Database

    The Bureau of Transportation Statistics recently released the 2014 National Transportation Atlas Database. From the BTS: The DVD is a set of nationwide geographic databases of transportation facilities, transportation networks, and associated infrastructure. These datasets include spatial information for transportation modal networks and intermodal terminals, as well as the related [...]

    The post National Transportation Atlas Database appeared first on GIS Lounge.

    GIS Lounge: Evaluating Ecosystems from Space

    The impact of humans upon the planet’s ecosystems has long been a concern for scientists, and now the European Space Agency (ESA) is taking their assessment of natural resources to a new level with satellite technology. The emphasis of a new project from the ESA is on promoting sustainability and [...]

    The post Evaluating Ecosystems from Space appeared first on GIS Lounge.

    All Points Blog: Update 2: Esri Announces MOOC: Going Places with Spatial Analysis

    More info via various sources. Suggestion to Esri: It'd sure be nice if all of this information were on the MOOC home page. Start date: Sept 3 (via Learn ArcGIS) MOOC guest lecturers (via @esrimooc): David Gadsden - Esri Nonprofit Program coordinator. Geographer. Humanitarian.... Continue reading

    thinkwhere: North bar stone uncovered

    Via Penny Goodman’s tweet about a Secret Leeds Forum post on the North Bar Stone. Here’s a photo of the Beating the Bounds walk around the stone for Terminalia Festival 2014 and the photo of the stone today.

    AnyGeo: New from Google – Maps Coordinate + Maps Engine Pro

    Google has this week announced that Google Coordinate will now be included with every Maps Engine Pro subscription ($5/user/month). Recall that Google Maps Coordinate is the mobile and web app that lets teams assign jobs and share their locations with each … Continue reading

    LiDAR News: Oregon Cities Partner on LiDAR Data Collection

    DOGAMI uses Quantum Spatial, a geo-mapping company with a Portland office, to gather the data. Continue reading →


    Directions Magazine: URISA Board of Directors Election Results Announced

    All Points Blog: GIS Health News Weekly: Animal and People Pathogens

    Map Pathogens in Man and Animal for Prevention: Researchers at the University of Liverpool's Institute of Infection and Global Health are building the world's most comprehensive database describing human and animal pathogens. Called the Enhanced Infectious Diseases (EID2) database, it... Continue reading

    GeoServer Blog: GeoServer 2.6-beta Released

    The GeoServer team is overjoyed to announce the release of GeoServer 2.6-beta.

    I hope you are enjoying the new website – the download page for 2.6-beta provides links to the expected zip, war, dmg and exe bundles. For this release we are experimenting with providing source downloads directly from the GitHub 2.6-beta tag.

    As a development release, 2.6-beta is considered experimental and is provided for testing purposes. This release is not recommended for production (even if you are excited by the new features).

    This release is made in conjunction with GeoTools 12-beta. Thanks to Kevin for making a beta release of GeoWebCache 1.6.0-beta with relatively little notice.

    What to expect … and what we expect from you

    A complete change log is available from the issue tracker. We will ask you to wait for 2.6.0 before we let Andrea write a pretty blog with pictures illustrating what features have been added. Instead 2.6-beta is my chance to ask you to download GeoServer 2.6-beta for testing.

    Testing is a key part of the open source social contract. The GeoServer team have identified a few areas where we would like to ask for help. This is your best chance to identify issues early, while we still have time to do something about them. For those making use of commercial support, ask your vendor about their plans for 2.6-beta testing. We would like to ensure the functionality you depend on is ready to go for a Q2 release.

    When testing GeoServer 2.6-beta please let us know on the user list (or #GeoServer) how it works for you. We will be sure to thank you in the final release announcement and product presentations.

    Java 7 Testing

    With Oracle retiring Java 6 security updates the time has come to raise the minimum bar to Java 7.

    We know a lot of downstream projects (such as OSGeo Live) have been waiting for GeoServer to support Java 7. Thanks to CSIRO, Boundless, GeoSolutions for providing Java 7 build environments allowing us to make this transition in a responsible fashion.


    • This is a major testing priority on all platforms.
    • Windows 7: The start.bat used by the manual install has trouble running as an administrator. We recommend installing as a service for this release (GEOS-5687)
    • Mac: You will need to install Oracle Java 7 (as OpenJDK 7 is not yet available for OSX). We have not yet figured out how to run GeoServer.App with Java 7 (GEOS-6588) and are open to suggestions.


    WFS Cascade

    This is a really exciting change, swapping out our gt-wfs client code for a new gt-wfs-ng implementation with a new GML parser / encoder. After comparing the quality of the two implementations we decided to go all in with this transition, and thus would really like your help testing.

    We would like to hear back on cascading the following configurations:

    • GeoServer
    • deegree
    • MapServer
    • tinyows - there is a critical fix about axis order in tinyows trunk. It corrects (finally!) the output … but perhaps not yet the filters?
    • ArcGIS
    • Other – any other WFS you are working with!


    • Pay special attention to the flags used for axis order. There are different flags to account for each way a WFS implementation can get confused. You will find some implementations expect the wrong axis order on request, but are capable of producing the correct axis order output.
    • We especially ask our friends in Europe to test WFS services published for INSPIRE compliance

    This was an epic amount of work by Niels and we have a couple of new features waiting in the wings based on the success of this transition.

    Curves support for GML and WMS

    A large amount of work has been put into extending the Geometry implementation used by GeoServer.

    We have experimented with several approaches over the years (including ISO 19107 and a code sprint with the deegree project) and it is great to finally have a solution. As a long time user of the JTS Topology Suite we have been limited to a Point, Line and Polygon model of Geometry. Andrea has very carefully extended these base classes to allow for both GML output and rendering. The trick is using a tolerance to convert the arcs and circles into line work for geometry processing.

    Testing for the 2.6-beta release is limited to those with Oracle Spatial. If you are interested in funding/volunteering support for PostGIS please contact the geoserver-devel email list.


    • Look for “Linearization tolerance” when configuring your layer.

    Advanced projection handling for raster

    We would like to hear feedback on how maps that cross the date line (or are in a polar projection) have improved for you.


    • No special settings needed


    Coverage Views

    We struggled a bit with how to name this great new feature; however, if you work with raster data this is your chance to recombine bands from different sources into a multi-band coverage.


    • Use “Configure new Coverage view” when creating a new layer


    Startup Testing

    Yes this is an ominous item to ask you to test.

    GeoServer 2.6 simplifies where configuration files are stored on disk. Previous versions were willing to spread configuration files between the webapps folder, the data directory and any additional directories on request. For GeoServer 2.6 configuration files are limited to the data directory as a step towards improving clustering support and growing our JDBC Config story.


    • No special settings needed
    • Special request to check files that are edited by hand on disk (such as security settings and FreeMarker templates)


    Pluggable Styles

    For everyone happy with the CSS Style Extension we would like to ask you to test a change to the style edit page (allowing you to create a CSS or SLD style from the start).


    • Install CSS Extension and look for a new option when creating a style


    Wind barbs and WKT Graphics

    I am really happy to see this popular extension folded into the main GeoServer application.


    • Check GeoTools WKT Marks for examples you can use in your SLD file


    New Formats and Functionality

    We have new implementations of a couple of modules:

    • Printing – new implementation from our friends at MapFish
    • Scripting – includes a UI for editing scripts from the Web Administration Application

    A final shout out to ask for help testing new formats:

    • NetCDF
    • GRIB
    • OGR

    About GeoServer 2.6

    Articles and resources for GeoServer 2.6 series:

    Between the Poles: Commercial-scale carbon capture and sequestration project begins construction

    About 37% of U.S. electric power generation is from coal-fired power plants. New EPA regulations scheduled to come into effect July 1, 2015 will restrict CO2 emissions from power plants. Coal-fired power plants will have to implement some form of carbon capture and sequestration (CCS), convert to natural gas or some other cleaner fuel, or shut down. CCS takes two forms, pre-combustion or post-combustion. I blogged about a pre-combustion commercial CCS implementation in Kentucky that began operation earlier this year.

    Construction of the first U.S. commercial-scale post-combustion carbon capture and sequestration (CCS) retrofit has begun. Post-combustion CCS technology will be installed at the coal-fired 240 MW Parish Generating Station in Houston, Texas.

    The Petra Nova project will capture 90% of the plant’s CO2 emissions through an advanced amine-based process, piloted in a three-year project in Alabama. The CO2 capture rate will result in lower greenhouse gas emissions than from a traditional natural gas-fired power plant. The process involves scrubbing the flue gases with an amine solution to form an amine–CO2 complex, which is then decomposed by heat to release high-purity CO2. The regenerated amine is recycled for reuse in the capture process. The CO2 capture and compression system will be powered by a cogeneration plant comprising a combustion turbine and heat recovery boiler. The oil field will be monitored to verify that the CO2 remains underground.

    The captured CO2 will be compressed and transported via an 80-mile pipeline to increase oil output from an oil field with declining production. After separation from the oil, the CO2 will be injected underground for permanent sequestration.

    LiDAR News: LiDAR Used to Quantify Carbon Stored in Hedgerows

    Secondly, it also quantifies one of the key ecosystem services of hedgerows in taking up carbon dioxide and storing it as biomass. Continue reading →



    I’ve been a consultant/programmer/integrator/other for over twenty years now. That’s not quite long enough to say I’ve seen it all but long enough to notice a few patterns. Admittedly, I’ve spent the vast majority of that time working in the defense world so the patterns may be heavily skewed to that but I think not.

    I’ve run across a number of well-entrenched government-developed systems, such as command-and-control systems, with user interfaces and experiences that many professional designers would consider abhorrent. Yet, I have watched smart, motivated men and women in uniform stand in front of working groups and committees dedicated to “improving” systems and workflows and advocate passionately for these seemingly clunky systems.

    Why? Because they know how to use these systems inside and out to meet their missions. User experience is ultimately about comfort and confidence. A user that is comfortable with a system will have a great experience with it regardless of its appearance. DOD tackles this reality through training. For all its faults, there is still no organization better at developing procedures and thoroughly training its people in them. It results in a passionate loyalty for the tools that help them do their jobs and places a very high hurdle in front of any tools that seek to replace current ones.

    This experience has given me a different view of the concept of “lock-in.” Over my career, I have heard this term used in a pejorative sense, usually preceded by the word “vendor.” Although I have used the term myself, I usually hear it levelled by a vendor’s competitors. It is typically meant to refer to practices a vendor uses to establish barriers to exit for its customers, making it harder for them to choose a competing technology. Such practices can include artificial bundling of unrelated tools, license trickery, half-truths in marketing, and many more; all of which do happen.

    Lock-in is a real thing. Lock-in can also be a responsible thing. The organizations I have worked with that make the most effective use of their technology choices are the ones that jump in with both feet and never look back. They develop workflows around their systems; they develop customizations and automation tools to streamline repetitive tasks and embed these in their technology platforms; they send their staff to beginning and advanced training from the vendor; and they document their custom tools well and train their staff on them as well. In short, they lock themselves in.

    This is the right and responsible thing to do. An organization, once it has selected a technology, has a responsibility to master it and use it as effectively as it can. If you start applying numbers to all the activities listed above, you will quickly see that it is an investment that far outstrips the original investment in the technology itself. In fact, the cost of the technology itself is often seen as marginal to the overall lifecycle cost, which makes arguments about removing licensing costs, for example, less effective than they would appear to be.

    This is true regardless of the provenance of the technology. The original technology has to start to become a hindrance before change is seriously considered, which I am seeing in a few cases these days. But, by and large, the very strong pattern I have seen is that the majority of lock-in originates with users. To fail to recognize that and continue to target the vendor is to miss the point and, ultimately, the target.

    Directions Magazine: OGC calls for comment on candidate Moving Feature Encoding standard

    Directions Magazine: TomTom Telematics Tops 400,000 Vehicles Subscribed to Its Software as a Service

    Directions Magazine: Former Super Bowl Champion Running for Congress to Keynote MAPPS Summer Conference

    Directions Magazine: HERE is the Official Map of Red Bull

    All Points Blog: TomTom to Offer Faster Updates of Map Database

    TomTom's Multinet-R platform promises to deliver faster updates to clients using a more narrowly constrained quality assurance process. The QA process will utilize faster validation leveraging crowd-sourced data. The objective is to get new data into the hands of clients with updates... Continue reading

    Directions Magazine: Call for Chapters: STEM and GIS in Higher Education

    thinkwhere: Leeds Creative Labs – Initial steps and ideas around The Hajj

    Cross posted from The Leeds Creative Labs blog.

    I signed up to take part in the Leeds Creative Labs Summer 2014 programme with the hope that it would result in something interesting, something that a techie would never normally get the opportunity to do. It’s certainly exceeded that expectation – it’s been a fascinating, enthralling process so far, and I feel honoured to have been selected to participate.


    I’m the designated “technologist”, in partnership with Dr Seán McLoughlin and Jo Merrygold on this project around The Hajj and British Muslims. Usually I tend to do geospatial collaborative and open data projects, although I’m also a member of the Leeds group of Psychogeographers. Psychogeography is notoriously hard to define, but one definition is that it’s about the feelings and effects of space and place on people. It’s also about a critique of space – a way to see how modern-day consumerism/capitalism is changing how our spaces are, and by extension how we behave in those spaces.

    We had our first meeting last week – it was a “show and tell” by Seán and Jo to share some of the ideas, research, themes and topics that could be of relevance to what we will be doing.

    Show and tell

    Seán, from the School of Philosophy, Religion and The History of Science introduced his research on Islam and Muslim culture, politics and society in contexts of contemporary migration, diaspora and transnationalism. In particular his work has been around and with South Asian heritage British Muslim communities. The current focus of his work, and the primary subject of this project is about researching British Muslim pilgrims’ experiences of the Hajj.

    The main resources are audio interviews, transcripts and on-line questionnaires from a number of different sources: pilgrims of all ages and backgrounds, and other people related to the Hajj “industry” such as tour operators and charities.

    Towards the end of the year fall a few set days for the Hajj – a once-in-a-lifetime pilgrimage to the holy Saudi Arabian city of Mecca. You have probably seen photos such as this, where thousands of pilgrims circle the Kaaba – the sacred cuboid house right in the centre of the most sacred Muslim mosque.

    It’s literally the most sacred point in Islam. It’s the focal point for prayers and thoughts. Muslims orient themselves towards this building when praying. The place is thought about everywhere – for example, people may have paintings with this building in their homes in the UK, and they may bring back souvenirs of their Hajj pilgrimage. You can see that the psychogeography of space and place on the emotions and thoughts of people could be very applicable here!

    And yet the Hajj itself is more than just about the Kaaba – it’s a number of activities around the area. Here’s a map!

    The Hajj

    These activities, each with their own days and particular ways of doing them, are literally meant to be in the footsteps of key religious figures of the past. I will let the interested reader discover these for themselves, but there are a number of fascinating issues surrounding the Hajj for British Muslims, which Seán outlined.

    Here’s a small example of some of these themes:

    • Organising the Hajj (tour operators, travel etc.).
    • What the personal experiences of the pilgrims were.
    • How Mecca has changed, and how the Hajj has changed.
    • The commercial, the profane, the everyday, the transcendent and the sacred.
    • How this particular location and event works over time and space.
    • The differences and similarities of people and cultures, and possible experiences of poverty.
    • “Hajj is not a holiday” and Hajj Ratings.
    • Differences in approach of modern British Muslims to going on the Hajj (compared to, say, their grandparents).
    • Returning home and the meaning and expectations of returnees (called Hajjis).
    What we did and didn’t do

    We didn’t rush to define our project outputs – but we all agreed that we wanted to produce something!

    Echoing Maria’s post earlier, we are trying to leave the options open for what we hope to do, allowing our imaginations to run and to explore options. I think this does justice to the concept of experimentation and collaboration, and should help us be more creative. We can then see which ideas spark our imaginations, which address the issues better, what examples and existing things are out there that can be re-appropriated or borrowed, and which things point us in the right direction.

    What I did after

    So after the show and tell my mind was spinning with new ideas and concepts. It took me a few days to go over the material and do some research of my own, and see what sorts of things I might be able to contribute to. It’s certainly sparked my curiosity!

    I was to prepare a show and tell (an ideas brain-dump) for the next meeting. The examples I prepared ranged from cut-and-paste transcriptions, 3D maps, FourSquare and social media, to story maps, interactive audio presentations and oral history applications. I also gave a few indications as to possible uses of psychogeography with the themes. I hope to use this blog to share some of these ideas in later posts.

    Initially I mentioned the difference between a “hacker” approach and the straight client-and-consultant way of doing development – for example, encouraging collaborative play and exploration rather than hands-off development, and allowing things to remain open. The further steps would be crystallizing some of these ideas – finding better examples and working out what we want to look at or devote more time to. We’d then be able to focus on some aims and requirements for a creative, interesting project.

    All Points Blog: GIS Education News Weekly: Geotech in PhysEd, TSA Geographic Literacy, Map Projections

    Geospatial in Phys Ed? Danielle Grant, a Potsdam, NY elementary school physical education teacher, has been named the 2014 New York State Physical Education Teacher of the Year. She's high tech: Grant uses pedometers, pulse sticks, GPS units, gaming systems, iPads, PowerPoint, the... Continue reading

    All Points Blog: MOOC from Open Online Academy Starts this Fall: Introduction to GIS and Mapping

    Introduction to GIS and Mapping starts this fall from the Open Online Academy. The course page suggests it's either four weeks or eight; it's not clear. Until recently cartography skills were hard to learn. Today, new technologies allow us to download geographical data, use... Continue reading

    Geoinformatics Tutorial: Reading and Map projecting raster data with geolocation arrays using gdal

    The following describes how to geocode a raster band whose geolocation information is stored in one channel containing latitude values for each pixel and another channel containing longitude values for each pixel. Opening the image, you see the unprocessed orbital swath.

    Information on the hdf image can be retrieved with gdalinfo:

    gdalinfo C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5

    The metadata printed out contains the names of the subchannels, as seen here:
    and information about a specific channel can be retrieved by using gdalinfo on exactly this subchannel-name from the metadata:

    gdalinfo HDF5:"C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5"://Brightness_Temperature_(89.0GHz-A,V)

    The tricky part is now, to transfer such raster bands into a map projection.

    gdalwarp can read separate latitude and longitude rasters, which are called geolocation arrays in GDAL. The comment on this page brought me a step further, although I think the vrt code and the commands given there are not completely correct. One has to create a vrt file describing the image raster, referring also to the latitude and longitude bands, and then in addition a vrt file describing the latitude band and a vrt file describing the longitude band.

    So I create a vrt-file (see here for info) named GW1AM2_201301311114_050A_L1SGBTBR_1110110.vrt:



    Then a file named lon.vrt :


    Finally a file named lat.vrt :


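    For reference, a minimal sketch of what these three files can look like follows. Everything in it is illustrative: the raster sizes are placeholders, and the subdataset names must be copied exactly from the gdalinfo metadata listing for your own file.

```xml
<!-- GW1AM2_201301311114_050A_L1SGBTBR_1110110.vrt
     (rasterXSize/rasterYSize are placeholders; use your swath's dimensions) -->
<VRTDataset rasterXSize="486" rasterYSize="2035">
  <!-- The GEOLOCATION metadata domain points gdalwarp -geoloc at the
       per-pixel longitude (X) and latitude (Y) rasters -->
  <Metadata domain="GEOLOCATION">
    <MDI key="X_DATASET">lon.vrt</MDI>
    <MDI key="X_BAND">1</MDI>
    <MDI key="Y_DATASET">lat.vrt</MDI>
    <MDI key="Y_BAND">1</MDI>
    <MDI key="PIXEL_OFFSET">0</MDI>
    <MDI key="LINE_OFFSET">0</MDI>
    <MDI key="PIXEL_STEP">1</MDI>
    <MDI key="LINE_STEP">1</MDI>
  </Metadata>
  <VRTRasterBand dataType="UInt16" band="1">
    <SimpleSource>
      <SourceFilename relativeToVRT="0">HDF5:"C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5"://Brightness_Temperature_(89.0GHz-A,V)</SourceFilename>
      <SourceBand>1</SourceBand>
    </SimpleSource>
  </VRTRasterBand>
</VRTDataset>
```

```xml
<!-- lon.vrt: wraps the longitude subdataset; lat.vrt is identical except for
     the subdataset name. Both names here are placeholders - take the exact
     subdataset names from the gdalinfo output above. -->
<VRTDataset rasterXSize="486" rasterYSize="2035">
  <VRTRasterBand dataType="Float32" band="1">
    <SimpleSource>
      <SourceFilename relativeToVRT="0">HDF5:"C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5"://Longitude_of_Observation_Point</SourceFilename>
      <SourceBand>1</SourceBand>
    </SimpleSource>
  </VRTRasterBand>
</VRTDataset>
```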
    With these files created and placed in the same directory, I can use gdalwarp – but note that you pass the vrt file rather than the hdf file!
    gdalwarp -geoloc -t_srs EPSG:4326 C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.vrt C:\Users\max\Documents\test100.tif

    The result is the following swath:

    In order to get it onto the polar stereographic projection, in my case the NSIDC EPSG:3411 projection, I first subset the image:

    gdal_translate -projwin -94 90 40 35 C:\Users\max\Documents\test100.tif C:\Users\max\Documents\test101.tif

    which results in this image:

    Now I reproject into EPSG:3411:

    gdalwarp -s_srs EPSG:4326 -t_srs EPSG:3411 C:\Users\max\Documents\test101.tif C:\Users\max\Documents\test102.tif

    resulting in this image:

    Geoinformatics Tutorial: Reading AMSR-2 Data into GeoTIFF raster using gdal_grid

    The passive microwave data from the Advanced Microwave Scanning Radiometer (AMSR), available at, is delivered in orbital swaths. The raster bands contain the image data in various frequency ranges, and the geolocation information for 89GHz is stored in one channel containing latitude values for each pixel and another channel containing longitude values for each pixel. This data needs processing before you have a regular geocoded raster.

    The following is not an exhaustive description, but extended notes on how to read AMSR-2 files, which may be of help to others trying to solve a similar task.

    For some, the final commented script at our github page may be enough help; the text below tries to explain the most important steps in this script:

    The various frequency bands of AMSR-2 have different resolutions (see the product specs and the user manual); we choose the L1R dataset, where the data is already processed to match the geolocation stored in the lat/long bands for 89GHz.

    Information on the hdf image can be retrieved with gdalinfo:

    gdalinfo C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5

    The metadata printed out contains the names of the subchannels, as seen here:
    and information about a specific channel can be retrieved by using gdalinfo on exactly this subchannel-name from the metadata:

    gdalinfo HDF5:"C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5"://Brightness_Temperature_(89.0GHz-A,V)

    Opening one of the bands in QGIS, the data looks like this (all images in this post are © JAXA EORC):

    In this example we want to convert the data into the NSIDC sea ice raster (EPSG:3411) covering the Arctic areas, using the gdal_grid utility, which creates a regular grid out of a point collection.

    Important to know: The 89GHz channel is divided into 89A and 89B, and only both together give the full resolution of about 5km. Each 89 GHz channel has an associated latitude and a longitude raster giving the coordinates for each pixel. For the L1R product, the geolocation information of other frequencies with lower resolution can be derived from the 89A longitude and latitude raster by using only their odd columns.
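    The odd-column relationship is easy to express with numpy slicing. Here is a small sketch with made-up toy arrays (not data from the tutorial; the real 89A arrays come from ReadAsArray as shown further on), where `[:, 1::2]` selects the same columns the loop version reaches with `j*2+1`:

```python
import numpy as np

# Toy stand-ins for the 89A geolocation arrays; in the real script these
# come from gdal.Open(...).ReadAsArray() on the HDF5 subdatasets.
lon89A = np.arange(12, dtype=float).reshape(2, 6)  # 2 rows x 6 columns
lat89A = np.arange(12, dtype=float).reshape(2, 6)

# Low-resolution channels have half as many columns as 89A; their pixel
# coordinates are the odd columns (1, 3, 5, ...) of the 89A arrays.
lon_low = lon89A[:, 1::2]
lat_low = lat89A[:, 1::2]

print(lon_low.shape)   # (2, 3)
print(lon_low[0])      # columns 1, 3, 5 of the first row
```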

    In the first step, I unzip the gz zipfiles, then open the hdf file (*.h5). The various frequency bands are stored in subdatasets, so you open the hdf file with gdal.Open, but then use gdal.Open again for a subdataset (for information on the bands you can run gdalinfo on the *.h5 files):

    from osgeo import gdal

    HDFfile = gdal.Open( r'C:\Users\max\Documents\GW1AM2_201301311114_050A_L1SGBTBR_1110110.h5' )

    HDF_bands = HDFfile.GetSubDatasets()

    #HDF Subdatasets are opened just as files are opened:
    HDF_SubDataset = gdal.Open(HDF_bands[channel][0])
    HDF_SubDataset_array = HDF_SubDataset.ReadAsArray()

    #Subdatasets 46 and 48 hold the 89A latitude and longitude rasters:
    HDF_Lat89A = gdal.Open(HDF_bands[46][0])
    HDF_Lat89A_array = HDF_Lat89A.ReadAsArray()
    HDF_Lon89A = gdal.Open(HDF_bands[48][0])
    HDF_Lon89A_array = HDF_Lon89A.ReadAsArray()

    In the next step, I create a comma-separated file containing longitude, latitude and brightness values for each raster point. This comma-separated file is then the input for gdal_grid. I loop through each pixel and write the three values (longitude, latitude, brightness) into a csv file.

    So for the 89GHz channel, I write both 89A and 89B to a csv-file:
    #Add header line to textfile
    textfile = open( AMSRcsv, 'w')
    textfile.write('lon,lat,brightness\n')

    ## Loop through each pixel and write lon/lat/brightness to csv file
    for i in range(rows):
        for j in range(cols):
            lonA = HDF_Lon89A_array[i,j]
            latA = HDF_Lat89A_array[i,j]

            # lon/lat written to file already projected to EPSG:3411
            (lonA_3411, latA_3411) = pyproj.transform(wgs84, EPSG3411, lonA, latA)
            brightnessA = HDF_Br89AH_array[i,j]* 0.01 #APPLYING SCALING FACTOR!

            lonB = HDF_Lon89B_array[i,j]
            latB = HDF_Lat89B_array[i,j]

            # lon/lat written to file already projected to EPSG:3411
            (lonB_3411, latB_3411) = pyproj.transform(wgs84, EPSG3411, lonB, latB)
            brightnessB = HDF_Br89BH_array[i,j]* 0.01 #APPLYING SCALING FACTOR!

            if 35 < latA < 90:
                textfile.write(str(lonA_3411) + ',' + str(latA_3411) + ',' + str(brightnessA) + '\n')

            if 35 < latB < 90:
                textfile.write(str(lonB_3411) + ',' + str(latB_3411) + ',' + str(brightnessB) + '\n')

    textfile.close()
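
    As a side note, the nested per-pixel loop above can be vectorised with NumPy masking; the following sketch (the function name and sample values are illustrative) shows the idea for one channel, with the pyproj reprojection step omitted:

```python
import numpy as np

def swath_to_rows(lon, lat, brightness, scale=0.01):
    """Return an (N, 3) array of lon/lat/brightness for Arctic pixels only."""
    mask = (lat > 35) & (lat < 90)       # same latitude filter as the loop
    return np.column_stack([lon[mask], lat[mask], brightness[mask] * scale])

# Tiny illustrative swath: one pixel (lat 10.0) falls outside the filter
lon = np.array([[10.0, 20.0], [30.0, 40.0]])
lat = np.array([[36.0, 80.0], [10.0, 89.0]])
tb  = np.array([[25000, 26000], [27000, 28000]])  # raw counts before scaling
rows = swath_to_rows(lon, lat, tb)
```

    pyproj.transform also accepts arrays, so the resulting columns can be reprojected in one call instead of once per pixel.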

    For the lower-resolution channels, I use the odd columns of the 89A longitude and latitude rasters:

    #Open the csv file and add the header line
    textfile = open( AMSRcsv, 'w')
    textfile.write('lon,lat,brightness\n')

    ## Loop through each pixel and write lon/lat/brightness to csv file
    for i in range(rows):
        for j in range(cols):

            #For the low-resolution channels, take only the odd columns
            #of the 89A lon/lat arrays
            lonA = HDF_Lon89A_array[i, j*2 + 1]
            latA = HDF_Lat89A_array[i, j*2 + 1]

            # lon/lat written to file already projected to EPSG:3411
            (lonA_3411, latA_3411) = pyproj.transform(wgs84, EPSG3411, lonA, latA)
            brightnessA = HDF_SubDataset_array[i,j]* 0.01 #APPLYING SCALING FACTOR!

            if 35 < latA < 90:
                textfile.write(str(lonA_3411) + ',' + str(latA_3411) + ',' + str(brightnessA) + '\n')

    textfile.close()


    Now I can almost run gdal_grid, but as described in the gdal_grid documentation I first need to create a VRT file describing my comma-separated csv file:

    <OGRVRTDataSource>
        <OGRVRTLayer name="GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H">
            <SrcDataSource>GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H.csv</SrcDataSource>
            <GeometryType>wkbPoint</GeometryType>
            <GeometryField encoding="PointFromColumns" x="lon" y="lat" z="brightness" />
        </OGRVRTLayer>
    </OGRVRTDataSource>

    The XML file above can be created in a Python script in the following manner:

    import xml.etree.ElementTree as ET

    root = ET.Element("OGRVRTDataSource")

    OGRVRTLayer  = ET.SubElement(root, "OGRVRTLayer")
    OGRVRTLayer.set("name", AMSRcsv_shortname)

    SrcDataSource = ET.SubElement(OGRVRTLayer, "SrcDataSource")
    SrcDataSource.text = AMSRcsv

    GeometryType = ET.SubElement(OGRVRTLayer, "GeometryType")
    GeometryType.text = "wkbPoint"

    GeometryField = ET.SubElement(OGRVRTLayer,"GeometryField")
    GeometryField.set("encoding", "PointFromColumns")

    GeometryField.set("x", "lon")
    GeometryField.set("y", "lat")
    GeometryField.set("z", "brightness")

    tree = ET.ElementTree(root)
    tree.write(AMSRcsv_vrt)

    Now we can finally run gdal_grid, either on the command line:

    gdal_grid -a_srs EPSG:3411 -a average:radius1=4000:radius2=4000:min_points=1 -txe -3850000 3750000 -tye -5350000 5850000 -outsize 760 1120 -l GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H.vrt GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H.tif

    or called from a Python script:

    import os

    AMSRcsv_shortname = 'GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H'
    AMSRcsv_vrt = 'GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H.vrt'
    AMSR_tif = 'GW1AM2_201301010834_032D_L1SGRTBR_1110110_channel89H.tif'

    radius1 = 4000  
    radius2 = 4000  
    os.system('gdal_grid -a_srs EPSG:3411 -a average:radius1=' + str(radius1) + ':radius2=' + str(radius2) + ':min_points=1 -txe -3850000 3750000 -tye -5350000 5850000 -outsize 760 1120 -l ' + AMSRcsv_shortname + ' '  + AMSRcsv_vrt + ' ' + AMSR_tif)

    The radius defines the distance around each output raster point within which the algorithm searches for input points. If it is too small, the result will contain empty pixels; if it is too large, the result will be over-smoothed, since many input pixels are averaged into one output pixel.
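
    As a quick sanity check, the -txe/-tye extents and -outsize values used above imply the 10 km cell size of the NSIDC polar stereographic grid:

```python
# Cell size = extent / number of pixels
x_extent = 3750000 - (-3850000)   # 7,600,000 m across 760 columns
y_extent = 5850000 - (-5350000)   # 11,200,000 m across 1120 rows
cell_x = x_extent / 760
cell_y = y_extent / 1120
# both come out to 10,000 m, i.e. a 10 km grid
```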

    The result is, finally, the part of the swath falling into the NSIDC-raster:

    In a final step, I take all such swaths for one day and average them into a full image of that day; see the Average Daily function in the script for details (images © JAXA EORC).

    One issue with gdal_grid is its very slow performance (see this comment): one 89 GHz band takes about 10 minutes and a lower-resolution band about 2 minutes to calculate. That is about 25 minutes for all channels of one HDF file, and since every day has about 20 files, this means 8 hours for one averaged daily raster. gdal_grid may therefore not always be feasible until the speed issue is improved.
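
    For what it's worth, an average-within-radius gridding can also be sketched with SciPy's cKDTree; this is a workaround of my own, not part of the original post, and the function name and parameters are illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

def grid_average(x, y, z, grid_x, grid_y, radius=4000.0):
    """Average all input points within `radius` of each output cell centre."""
    cx, cy = np.meshgrid(grid_x, grid_y)
    centres = np.column_stack([cx.ravel(), cy.ravel()])
    tree = cKDTree(np.column_stack([x, y]))
    out = np.full(len(centres), np.nan)
    # query_ball_point handles all cell centres in one batched call
    for k, idx in enumerate(tree.query_ball_point(centres, r=radius)):
        if idx:                       # cells with no neighbours stay NaN
            out[k] = z[idx].mean()
    return out.reshape(len(grid_y), len(grid_x))

# Tiny example: two coincident points averaged into a single cell
g = grid_average(np.array([0.0, 0.0]), np.array([0.0, 0.0]),
                 np.array([1.0, 3.0]), np.array([0.0]), np.array([0.0]),
                 radius=1.0)
```

    The k-d tree does the neighbour search in compiled code, which is where a naive per-cell scan spends most of its time.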

    My Corner of the WebNaming conventions on Ruby & Ruby on Rails

    Originally posted on Selva's Blog:

    Ruby Naming Conventions

    Local Variables
    A lowercase letter followed by other characters; the naming convention states that it is better to use underscores rather than camelBack for multiple-word names, e.g. mileage, variable_xyz

    Instance Variables
    Instance variables are defined using the single “at” sign (@) followed by a name. It is suggested that a lowercase letter should be used after the @, e.g. @colour 

    Instance Methods
    Method names should start with a lowercase letter, and may be followed by digits, underscores, and letters, e.g. paint, close_the_door

    Class Variables
    Class variable names start with a double “at” sign (@@) and may be followed by digits, underscores, and letters, e.g. @@colour

    Constants
    Constant names start with an uppercase letter followed by other characters. Constant objects are by convention named using all uppercase letters and underscores between words, e.g. THIS_IS_A_CONSTANT

    Class and Module 
    Class and module names start with an uppercase letter; by convention they…

    View original 497 more words

    My Corner of the WebWhy You Should Test

    Originally posted on Six Months of Ruby:

    Test-driven development is both loved and loathed by software engineers everywhere. It’s a complicated relationship. While the benefits of TDD are well established, programmers are human, and the human mind is reluctant to associate non-visible progress with perceived progress.

    However, saying “I’ll get this done faster if I don’t write tests” is like saying “I’ll get to China faster by going in a straight line.” It’s a great sentiment if you’re willing to dig through a lot of rock. The fact is that writing tests might seem like taking a detour, but it also shows you an easier path to where you want to go.

    The Benefits

    The first benefit of TDD is something almost intangible. Writing tests forces you to think about what you want to do. It forces you to plan, to design, and to define in your own mind what needs to be done. This…

    View original 166 more words

    It's All About DataAutomatically Sync Data with Google Maps Engine

    Sean Wohltman of Google has published a series of video tutorials on how to automatically synchronize any data with Google Maps Engine. He uses a simple FME Workspace to check for updates on the data, and automates the workflow with an FME Server instance running on Google Compute Engine.

    At the time of writing, there are three videos, but by the sounds of it, more might be on the way. The demos are clear and easy to follow, and offer a fantastic overview of what’s possible with Google Maps Engine and FME. Check them out below.

    1. Simple translation from a JSON feed to Google Maps Engine

    In the first video, Sean demonstrates how to create a simple workflow that reads data from an ArcGIS Server running on Amazon EC2, and loads it into Google Maps Engine. He then goes over what you can do with the data once it’s in GME—including styling the layer and info window, publishing a map, submitting it to Google Maps Gallery, and contributing to Google Crisis Map.

    (Worth noting: Not shown in the video, the FME Data Inspector does offer background maps, which you can turn on in the preferences.)

    While Sean reads from an Esri JSON feed of CAP images, it’s easy to see how any source data can be transformed and loaded into GME in a few simple steps. Anything goes in the Google Maps Gallery! If you don’t believe me, go look at this map of James Bond movies. (Then please sign my petition to have the next one filmed in Canada…)

    2. Keeping the data synchronized

    Data is ever-changing, so it’s often not enough to load it into the destination system just once. Take this map of coffee shop chains, for example. I’ll eat my limited edition Safe Software hat if that data hasn’t changed since it was loaded in 2013.

    Sean’s second video shows how to keep the above data in sync. Since it’s probable that the ArcGIS Server in this scenario will get frequent updates, it’s important to keep the live data connected to the GME map.

    Sean demonstrates how to create the workspace that checks for and inserts any new records that aren’t currently in the GME table. He then shows how to check for existing fields that have been updated.

    3. Automatically check for changes and update the map

    After the above two videos, we now have an FME Workspace that synchronizes the ArcGIS Server data with the Google Maps Engine table. The third video shows how to check for new or updated data at a set interval, and update the Google Maps Engine table as necessary.

    Sean introduces us to FME Server, which he’s running on a Linux virtual machine inside of Google Compute Engine. He goes over how to publish a workspace to FME Server, and gives an overview of what you can do with the workspace once it’s there. He configures the server to automatically run this ArcGIS-to-GME job every half hour.

    In the end, Sean’s workflow successfully syncs Civil Air Patrol photos from FEMA to Google Maps Engine using FME. What other scenarios can you envision? How about this one, which updates every 5 minutes with earthquakes around the world? (And then there’s this one of seismic hazards, which makes me think I should probably be hiding under my desk right about now.)

    To learn more and see other scenarios like this, check out our many resources, including this webinar recording and blog post. If you’d like to try FME technology and connect your data with Google Maps Engine, download your free trial of FME Desktop. If you need a simple one-off translation to get your data into the Google Maps Gallery, try out the free Data Loader for Google Maps Engine.

    We would love to see what you’re doing with FME and Google Maps Engine. Be sure to share your data challenges with us in the comments!

    The post Automatically Sync Data with Google Maps Engine appeared first on Safe Software Blog.

    GIS LoungeViews From the 2014 Esri International User Conference: GIS In Imagery

    Photographer Kristina Jacob shares moments she captured at the 2014 Esri International User Conference, held in July in San Diego, California. One of the largest GIS conferences in the world, the Esri UC played host to over 15,000 attendees from around the world. The Esri International User [...]

    The post Views From the 2014 Esri International User Conference: GIS In Imagery appeared first on GIS Lounge.

    GeoServer BlogGeoServer 2.5.2 release

    The GeoServer team is happy to announce the release of GeoServer 2.5.2. Download bundles are provided (zip, war, dmg and exe) along with documentation and extensions.

    GeoServer 2.5.2 is the next stable release of GeoServer and is recommended for production deployment. Thanks to everyone taking part, submitting fixes and new functionality:

    • Some fixes in the new GetFeatureInfo engine, for unfilled polygons and dashed lines
    • Solved a configuration issue that might have linked two styles together (edit one, and the other changed as well)
    • Fixed a DBMS connection leak in some WFS GetFeature requests with bounds enabled
    • Have WPS work properly as a workspace-specific service, and report the current process activity during asynchronous requests (for processes that report what they are doing in plain English, besides offering a progress percentage)
    • Added a way to restrict the output formats supported by WMS GetMap and GetFeatureInfo
    • More docs on how to set up JNDI connection pools in GeoServer
    • Thanks to Andrea (GeoSolutions) for publishing this release
    • Check the release notes for more details
    • This release is made in conjunction with GeoTools 11.2

    About GeoServer 2.5

    Articles and resources for GeoServer 2.5 series:


    AnyGeoCool Geo Technology – 10 Awesome Things seen at 2014 ESRIUC

    Indeed there’s no shortage of cool and amazing technology to touch, see, and hear about at ESRIUC; actually, it’s almost overwhelming, so creating a list of just 10 awesome things I saw is no simple task. Hopefully I won’t crush … Continue reading

    LiDAR NewsUSGS to Host Briefing: Safer Communities, Stronger Economies – in 3D

    The USGS is doing an excellent job of selling the benefits and value of LiDAR-derived 3D information. Continue reading →


    All Points BlogShoppers: Don’t Track Me Unless…

    PunchTab's "Mobile Tracking: Are Consumers Ready?" report surveyed 1,153 consumers about sharing their personal information, including locations, in return for deals and services. Fifty percent of participants did not want to be tracked and 27 percent said "maybe" it'd be ok, but only... Continue reading

    All Points BlogGIS Government News Weekly: Saginaw Alerts, Irish Open Data, $83 Deeds

    Saginaw Alerts Landlords of Police/Fire Calls When Saginaw police officers or firefighters respond to a call at one of Saginaw's nearly 5,000 registered rental properties, the landlord will be notified the following Monday. The new feature of the GIS goes live August 4. California... Continue reading