Planet Geospatial

GIS Lounge: Congressman Calls for a Subcommittee with Jurisdiction over Federal Geospatial Activities

Congressman Doug Lamborn recently sent a letter to the House Natural Resources Committee Chairman Doc Hastings asking that a Natural Resources Subcommittee be formed that has primary jurisdiction over federal geospatial activities.  In his letter to Hastings, the Congressman for Colorado’s 5th Congressional District noted that, “Congress does not have a committee [...]

The post Congressman Calls for a Subcommittee with Jurisdiction over Federal Geospatial Activities appeared first on GIS Lounge.

Directions Magazine: ASPRS UAS Reno Data Processing Team Leader Issues Correction to SimActive Press Release

Directions Magazine: UK eLoran now in operation to backup vulnerable GPS

GIS Lounge: GIS in Social Science: Mapping Social Capital Strength to Measure Organizational Efficacy

Turn unmeasurable goals into quantifiable indicators by mapping social capital strength of service oriented nonprofits, NGOs, or government agencies. Whether the need is to prove organizational efficacy, evaluate organizational reach, or demonstrate resource connectivity, social network mapping with GIS tools can bring the often fuzzy field of social science measurement to a new level of clarity by using social capital as a metric.  

The post GIS in Social Science: Mapping Social Capital Strength to Measure Organizational Efficacy appeared first on GIS Lounge.

LiDAR News: Landslides and Lidar

This paper outlines a novel algorithm to automatically and consistently detect landslide deposits on a landscape scale. Continue reading →

Click Title to Continue Reading...

Spatial Law and Policy: UAV Spatial Law and Policy Update (October 29, 2014)

All Points Blog: Haunted Savannah StoryMap JS for Halloween

Kelly Bigley, Sr. Geospatial Developer at GISi, put together a StoryMap for Halloween: the Haunted Savannah story map. There are details of the what and how in these two blog posts (1, 2). I just want folks to know about this open source project. See also: Digital Storytelling Made... Continue reading

All Points Blog: GIS Health News Weekly: Vaccine-Preventable Outbreaks, UASs and Infectious Disease, ACA Mapping

Vaccine-Preventable Outbreaks In advance of World Polio Day on October 24, the Global Health program at the Council on Foreign Relations (CFR) has expanded its "Vaccine-Preventable Outbreaks Map," adding new data showing how a hostile climate for vaccinators thwarts the... Continue reading

It's All About Data: Monster Spatial Horror Theatre Presents: FME Cloud vs. the Zombie Horde

Dear Reader… it’s Halloween!

And so today on the blog, we bring you an extra-special-spooky-scary edition, with much-appreciated story inspiration from Michael Fitzpatrick of 1Spatial Australia.

There are zombies, and some slightly gory bits, because it’s hard to have zombies without the gory bits. So if that’s not your cup of tea, skip this one and rest assured that we’ll be back to our regularly scheduled blog programming next week.

If it is your cup of tea, Dear Reader, sit back and get ready for a really, really scary time.

Welcome to Monster Spatial Horror Theatre with Count Dale – and press PLAY… if you dare.

* * *

FME Cloud vs. the Zombie Horde

Shaun Ofded’s life was pretty ordinary right up until autocorrect interfered on that Monday morning.

Shaun’s job was social media analysis. And since the death of prominent and controversial Senator Walken over the weekend was sparking a deluge of public response (some on the rather lunatic side), the first political sentiment search he entered that morning was “walken dead”. Which autocorrect chose to “fix” for him.

From his office at Integrated Emergency Services in Manhattan, the unintentionally revised search – “walking dead” – made its way into a slick, FME Cloud-supercharged Redbooth task collaboration platform that coordinated the efforts of first responders – everyone from fire and police to military.

And that was when things started to get a bit, shall we say, weird.

FME Cloud took those keywords, hit Twitter for geotagged tweets and comments, grabbed an HTML template from an FME workspace, tossed the works up onto Amazon S3, linked it to Google Maps, and the unintended results showed up on his screen just seconds later. Shaun had only one thought.

“Crikey.”

He rubbed his eyes and double-checked the input. Pockets of people in the northeastern US were tweeting about what appeared to be actual ambulating cadavers and not a TV program. Ridiculous as it seemed, this might actually be a zombie outbreak.

He was going to need more coffee.

And a cronut.

* * *

Sugar-fuelled and amply caffeinated, Shaun brushed the crumbs off of his keyboard and got down to business.

IT had built an FME custom transformer (supposedly for testing) called ZombieSigns that searched social media for keywords like biting, undead, looting – the usual zombie stuff. Shaun pulled up the workspace and started checking the results.

Sure enough, something was definitely going on. And if it hit New York, he really didn’t want to be stuck at the office. So with a few keystrokes and a click, he sent his results under the subject “Potential Zombie Outbreak (Seriously)” up to the cloud-based iPaaS system, FME Cloud handed it off to Redbooth, the message was immediately delivered to all the emergency services under IES’ mantle, and Shaun skedaddled for home.

As he raced for his apartment, the streets seemed normal, but the 911 call center was bedlam. Panicked reports were pouring in – and as they were logged, FME was mapping them and updating the central PostGIS database. The analysts peered in disbelief at the hotspot maps it produced using Google Maps Engine.

“I think it’s time to call Bob,” said one of them.

* * *

Bob was the Chief Inspector. Bob was not the least bit happy about having his smartphone updated with a set of locations to go and site-check for bleeding zombies, of all things.

His attitude was radically adjusted in very short order.

The first location was a roiling mass of screaming, flailing humanity. There were hundreds of them. Those who were still human were falling under the weight of mindless, gibbering hordes of zombies with slobbering jaws gnashing and tearing, only to rise moments later and join in the carnage. It was a scene straight out of a B-movie, but with better effects.

What appeared to be a former businessman, missing a shoe, clothing in tatters, and putrid flesh rapidly decomposing, bounced off of the windshield of his truck and carried on as if nothing had happened.

That was the moment in which Bob decided it was high time to quit his day job. He slammed the truck into reverse, backed over a former Elvis impersonator, winged an arm-chewing ex-bike courier with the side view mirror, and with a squeal of rubber on asphalt he peeled out and headed for anywhere but there.

He did, however, feel obligated to at least confirm the zombie situation before sending in his resignation. He grabbed his smartphone, keyed in “OMG URGENT STAGE 2 Z CRISIS CONFIRMED”. With no idea who to send it to, he just sent it to FME Cloud, which parsed out the keywords, assigned topics, and sent the message to just about everyone back at headquarters.

He sent one more message: “PS I QUIT. GOOD LUCK”.

Then a drooling, grunting, chartreuse-tinted grandmother landed on the roof, smashed through the windshield, and sank her teeth into his arm.

“Her pearls are awfully nice,” he thought, just before the truck smashed into a light standard.

A few minutes later, the only thing on his mind was brains.

* * *

Shaun was holed up in his apartment, eyes glued to his laptop screen. The gravity of the situation was undeniable, and he needed a plan. He had to get out of New York. The maps he was monitoring on the collaboration platform were showing the spread in real-time, and new incidents were popping up way too close for comfort. When the crisis alert was raised to stage 3, he knew he didn’t have long to act.

There were some new data layers available on the FME Cloud data download service – someone had been building predictive models of the spread. A quick glance showed the entire Eastern Seaboard overrun within 48 hours. He backed up the model to the next couple of hours, and added a point layer of convenience stores.


The media were still downplaying the crisis, so he should have a short window of opportunity to stock up on essentials while he figured out his next step. Three lower-risk options popped up, and with a click he sent them to the map app on his phone, grabbed his knapsack, and headed out the door for the nearest one.

A quick check on the collaboration platform showed that the airport hadn’t yet shut down, which seemed like the best bet for the quickest way to get as far away as possible. As he filled his bag with water and snacks with one hand, the other was madly thumbing his phone, as he keyed in “to JFK” to another FME Cloud tool – this time to scrape ride-sharing websites.

Sure enough, a fancy town car was waiting at the curb when he ran out the door. Apparently the news that zombies were chomping their way towards Manhattan hadn’t sunk in yet – but he wasn’t about to belabor the point. They headed full tilt boogie for JFK, as Shaun, still connected to the office system on his phone, watched the flight paths of plane after plane heading for safety – while the clusters of 911 calls and tweets crept closer and closer to the airport.

* * *

The terminal was chaos. Shaun elbowed his way through the crowd, waving his smartphone boarding pass purchased enroute, and raced for his gate. It wasn’t until he was safely settled in his seat that he started thinking about what to do when he arrived. This flight would take him to Hong Kong, and with any luck, he’d connect to Australia from there. A nice, big, isolated island – that was the safest destination he could think of.

He was checking the iPaaS portal to Salesforce for FME users to connect with when he glanced out the window and went cold. The zombies had reached the airport – and they were massing against the chain link fence like fans at a Justin Bieber concert. The fence was bulging dangerously, and as the plane turned onto the runway, engines spinning up, the fence gave way and hordes of the undead spilled onto the tarmac.

The pilots reacted instantly – full throttle and the jet leapt down the runway, smashing and squashing, leaving a long, wet smear of flattened zombie goo in its wake.

The most persistent one, clinging to the nose wheel, met a most unfortunate end when the gear was retracted several hundred feet in the air.


Thump.

* * *

48 hours later, Shaun breathed a sigh of relief. Cold Foster’s in hand, he watched the news coverage of the spread throughout North America, and thanked his lucky stars he’d made it safely to Cairns.

It wasn’t until much, much later that he realized it should have occurred to him that they could walk underwater.


“Crikey,” he thought, “that looks like Bob.”


“If you prepare for the zombie apocalypse, you’ll be prepared for all hazards.” – David Daigle, CDC Spokesman.

* * *

…and now we rejoin our broadcast in progress:


HAPPY HALLOWEEN!

Our thanks to Michael Fitzpatrick of 1Spatial Australia, one of the featured speakers at the 2014 FME International User Conference, who presented what may be one of the most imaginative use cases for FME Cloud (and iPaaS in general) we’ve seen. In addition to showing how a wide range of web services can seamlessly work together while navigating a critical situation, the story was too good not to borrow! His presentation, and all presentations from the UC, are available on our website.

@KrisAtSafe

The post Monster Spatial Horror Theatre Presents: FME Cloud vs. the Zombie Horde appeared first on Safe Software Blog.

From High Above: Integration of 3D streetview earthmine with interactive 3D GIS Skyline Terraexplorer and aero3Dcity city model as a base dataset

Integration test of the 3D street level imagery solution earthmine with Skyline Terraexplorer Pro. The 3D base dataset is a virtual 3D city model of Karratha, WA generated with aero3dpro. Users can navigate in the earthmine window and the corresponding location will be displayed in Terraexplorer in real time. Users can also pick a location on […]

GIS Lounge: GIS Data and the Coastline Paradox

Imagine that you’re taking a GIS class and your instructor tasks everyone with coming up with the answer to, “what is the length of the coastline of Maine?”  Everyone downloads a different GIS data set to calculate the length and everyone comes back with a completely different answer to that [...]
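To see the effect in action, here is a minimal sketch (not from the GIS Lounge post; shapely and the synthetic wiggly line are stand-ins for real coastline datasets of different resolutions) showing how the measured length of the very same line shrinks as the data are generalized:

import math
from shapely.geometry import LineString

# A synthetic "coastline": a wiggly line standing in for a detailed dataset.
coast = LineString([(x * 0.1, math.sin(x) * 0.5 + math.sin(7 * x) * 0.1)
                    for x in range(200)])

# Generalize the line at increasing tolerances, as coarser datasets would.
for tolerance in (0.0, 0.05, 0.2, 0.5):
    generalized = coast.simplify(tolerance)
    print("tolerance=%.2f  vertices=%d  length=%.2f"
          % (tolerance, len(generalized.coords), generalized.length))

# The measured "coastline length" drops as detail is removed; there is no
# single correct answer, which is the coastline paradox in a nutshell.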

The post GIS Data and the Coastline Paradox appeared first on GIS Lounge.

Azavea Atlas: Chart Your Way to Visualization Success

Following up on the themes of Sarah’s earlier blog, “4 Cartography Color Tips Hue Should Know”, here are a few tips I picked up from DataWeek 2014 in San Francisco in September 2014:

Visualizations and infographics are a powerful way to communicate data. However, with great power comes great responsibility, so here are a few ways to make sure they turn out clean, beautiful, and well-suited for their purpose: to be shared with the public.

Use The Cycle of Visual Analysis

Tableau guru Mike Klaczynkski defined the cycle of visual analysis as a six-step process that’s applicable across a broad range of data analysis:

  1. Define the question

  2. Get data to answer the questions

  3. Structure and clean the data

  4. Visualize and explore the data

  5. Develop insights

  6. Share the results

Simple enough, right? The last step, however, is a doozy. If you’ve gone through the trouble of steps 1-5 and then don’t share the results clearly, it could unravel all that hard work. As a data professional, you should provide legible visualizations to share your results with your intended audience.

Check Your Charts Before You Wreck Your Charts

When producing a visualization, do what Dave Fowler from Chart.io recommends and ask yourself: Am I trying to impress people with how cool this looks? Or am I trying to share my results clearly? If you’re more concerned about bells and whistles on your visualizations, you’ll end up with graphics from a 1997 clipart nightmare instead of a powerful way to share your message. Use the eight steps below and chart a voyage away from the rocky shores of bad decisions:

  1. Make your visualization audience-appropriate. You might not use the same chart to explain something to your dad as you would for your fellow data analysts. You might if he were also a data analyst.

  2. Make a graphic appropriate to the data (e.g. don’t make a time series for something with no time component). This site has a great breakdown on what kinds of charts to use for what kinds of data.

  3. Make sure it’s not a pie chart (people can understand square area better than they can circular areas). Read Death to Pie Charts to learn more and also get a bunch of great visualization tips.

  4. If you’re making a map, make sure it’s not just showing population density, as pointed out in this excellent example from webcomic XKCD (which has been linked before in a previous Atlas Blog about bicycle and pedestrian crashes in Philadelphia by Daniel McGlone). Sometimes you can get around it by normalizing the data by population (see the quick sketch after this list).

  5. Avoid skeuomorphism in your charts, or trying to make an object look like the thing it represents. While there’s still some debate about whether websites and apps all need to stop being skeuomorphic, there’s no question that pseudo-3D charts with photos of bananas on them need to go.

  6. Ask yourself if you’re showing the data clearly.

  7. See if someone unfamiliar with the results can interpret it.

  8. Show it off to everyone!
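
As promised in step 4 above, here is a quick sketch of what normalizing by population looks like in practice; the counts and populations are made up purely for illustration. The idea is simply to divide a raw count by population so the result shows a rate rather than an echo of population density:

incidents = {"County A": 1200, "County B": 300, "County C": 90}
population = {"County A": 800000, "County B": 120000, "County C": 25000}

for county, count in incidents.items():
    rate_per_10k = count / population[county] * 10000
    print("%s: %d incidents, %.1f per 10,000 residents"
          % (county, count, rate_per_10k))

# Raw counts would rank County A first simply because it has the most people;
# the normalized rates tell a different, and more useful, story.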

Chart Your Journey to Better Visualization

There are a ton of resources out there to make sure that your visualizations look good and get your message across. Get a head start by checking out the beautiful infographics blog, Information is Beautiful, thumbing through books by legendary visualization experts Edward Tufte or Stephen Few, or trying your hand at a cornucopia of data visualization tools at Datavisualization.ch.

Remember, the point of any visualization, whether it’s a chart, graph, or a map, is to communicate data to an audience in a meaningful format.

Azavea Labs: Running Vagrant with Ansible Provisioning on Windows

At Azavea we use Ansible and custom Ansible roles quite a bit.

We’ve also been using Vagrant for quite some time to create project-specific development environments.  Adding Ansible as a provisioner makes setting up a development environment wonderfully smooth.

Unfortunately, Ansible is not officially supported with Windows as the control machine.

It is possible to get Ansible running in a Cygwin environment.  With a bit of work, you can get it running from Vagrant too!

Installing Cygwin

The first step to getting Ansible running is installing Cygwin.  You can follow the normal installation instructions for Cygwin if you’d like to, or if you already have a Cygwin environment set up that’s great too!

We’re using babun instead of Cygwin’s normal installer for a simpler installation and package management process. If you’re new to using Cygwin or are having trouble with the standard installer, I’d recommend it.

Setting up Ansible

Once you’ve got Cygwin installed, you’ll want to open up a terminal. You’ll need to use a Cygwin terminal, and not cmd.exe, whenever you want to run ansible-playbook or vagrant.

You’ll need to install pip in order to install Ansible. You’ll also need some packages that Ansible needs to run but that can’t be installed by pip. If you’re using the standard Cygwin installer, run it again and make sure python, python-paramiko, python-crypto, gcc-g++, wget, openssh, and python-setuptools are all installed. We need gcc-g++ to compile source code when installing PyYAML from PyPI.

If you’re using babun, this is:

pact install python python-paramiko python-crypto gcc-g++ wget openssh python-setuptools

You might get the following error if you try to run python: ImportError: No module named site.
If you see that error add the following to your ~/.bashrc or ~/.zshrc (in your Cygwin home folder) and source it:

export PYTHONHOME=/usr
export PYTHONPATH=/usr/lib/python2.7

Next, let’s get pip installed, and then install Ansible itself.

python /usr/lib/python2.7/site-packages/easy_install.py pip
pip install ansible

Making Ansible Run From Vagrant

Once that is done, you should be able to run ansible-playbook from bash or zsh.

However, that isn’t enough to use Ansible as a Vagrant provisioner. Even if you call vagrant from bash or zsh, vagrant won’t be able to find ansible-playbook, because it isn’t on the Windows PATH. But even if we put ansible-playbook on the Windows PATH, it won’t run, because it needs to use the Cygwin Python.

To ensure we’re using the Python in our Cygwin environment, we need a way to run ansible-playbook through bash. The solution we came up with was to create a small Windows batch file and place it somewhere on the Windows PATH as ansible-playbook.bat:

@echo off

REM If you used the standard Cygwin installer this will be C:\cygwin
set CYGWIN=%USERPROFILE%\.babun\cygwin

REM You can switch this to work with bash with %CYGWIN%\bin\bash.exe
set SH=%CYGWIN%\bin\zsh.exe

"%SH%" -c "/bin/ansible-playbook %*"

This is enough to let Vagrant find ansible-playbook and run the Ansible provisioner.

You’ll likely run into the following error when you try and provision your first Vagrant VM:

GATHERING FACTS ***************************************************************
fatal: [app] => SSH encountered an unknown error during the connection. We recommend you re-run the command using -vvvv, which will enable SSH debugging output to help diagnose the issue

To get around this, we had to create a ~/.ansible.cfg (this can also go in your project directory as ansible.cfg) changing what the ssh ControlPath was set to:

[ssh_connection]
control_path = /tmp

And with that you should be ready to provision using Ansible!

If you want to run other Cygwin programs from your Vagrantfile, such as ansible-galaxy, you’ll have to make another batch file. For an example of how to easily make a bunch of wrapper batch files, check out this gist.

new jersey geographer: “Doing more with SQL” talk at MAC URISA

Last week, I gave a 30 minute talk on “Doing More with SQL” at MAC URISA 2014. The slides from the talk are available below:

(...)
Read the rest of “Doing more with SQL” talk at MAC URISA on my blog.


© John Reiser for new jersey geographer, 2014.

Directions Magazine: Onix Networking Now a Premier Partner with Centigon Solutions

GIS Cloud: The Future of Your GIS Workflow

Then…

Not so long ago, we were still manually drawing maps, populating survey forms in the field with a pen, and entering the collected data into Excel by hand as part of an extensive paper-based workflow. And what if, heaven forbid, we had to review the data? Off we went back to the field with more paper forms, then back to the office to cross-reference the old and new data, but only after we had entered the new data into Excel. Hmmm… and then we printed out a map with administrative areas and did some massive coloring.

Desktop

Then came desktop GIS, and man did ‘coloring’ get better. Now we could have all the data spatially on a map on a computer, associated with each location we had collected with a GPS device. We could join Excel tables and review data all in one place, overlay multiple layers on top of each other, and geospatial analysis suddenly got faster and better. It’s just… we became impatient waiting for the team to bring back the data from the field with their GPS units, especially with all those meetings we had to attend across the country. Come on now, we want it faster, if not instantly… like in real time, please!

…and now

So now we sit in our minimalistically decorated office, with no tables covered in paper and no big desktop computers that require massive storage units. Even better, we can be in a meeting in Orange County while our crew collects data in Florida and sends it to us. We receive it instantly, the data is updated in real time, and we can automatically show the progress to our colleagues. Quickly filter the data by an attribute (the same city, let’s say), click on locations, and get the info associated with them: photos, latitude, longitude, GPS accuracy, notes from the field… the list continues. All done in your browser… in the cloud.

Future

As GIS practitioners, we can say the progress so far has been fast and mind-blowing, making GIS more integrated into various industries and simplifying workflows more than ever before. So what has changed? At GIS Cloud, we believe that GIS is a concept you customize to yourself and your needs, rather than something you have to adapt yourself to. The future is in:

  • Integration
  • Real time
  • Mobility
  • High accuracy
  • Fast performance
  • User-friendly solutions for non-GIS experts

Simplify your workflow today!


All Points Blog: The Next Wave of Cybersecurity Issues as We Approach the IoT

I attended a presentation today by Michael Lee, Cyber Security Research Manager at CFDRC, who explained that the Internet of Things (IoT) opens a new front in cyber security and exposes vulnerabilities to many types of devices and systems. On the top of his list were supervisory... Continue reading

LiDAR News: Bentley’s Be Inspired Awards 2014

For a lifelong civil engineer this is the top event of the year. Continue reading →

Click Title to Continue Reading...

Between the Poles: New York State moves toward Utility 2.0

The New York Public Service Commission is moving the New York state utility industry toward a radical redefinition of utilities as we have known them over the past hundred years, since Tesla and Edison created the electric power industry. In the future, the New York state utility industry may consist of distributed system platform (DSP) providers, basically providing the grid but not directly selling energy, and an energy market with many energy providers, including bulk power generators and many distributed energy generators: you and me with our rooftop solar panels.

Currently the grid is comprised of large, central power plants interconnected via transmission lines and distribution networks that feed power to customers. But this is beginning to change with the rise of distributed energy resources (DER) such as small natural gas-fueled generators, combined heat and power plants, electricity storage, and especially solar photovoltaics (PV) on rooftops and in larger arrays connected to the distribution system.  DER already has had a significant impact on the operation of the electric power grid and its role will undoubtedly become even more important in the future. 

Solar power is growing rapidly. 4.2 GW of solar power was added in 2013 in the U.S., and it is not unreasonable to expect 20 GW of solar and a million installations in the near future. This proliferation of solar is driven primarily by customers wanting choice and by solar PV becoming very economical. For the first time in 100 years, companies like SolarCity are providing consumers with a competitive alternative to the local power company. This is disruptive for the traditional utility business model. Many utilities are already trying to adapt to this new world, for example through utility-provided solar power at the Wright-Hennepin Cooperative Electric Assn.

In the traditional utility model, if you compare variable and fixed costs with fixed and variable revenue, there is a mismatch. This translates into a revenue shortfall or unrecovered cost when you compare a non-solar customer's annual bill with a solar customer's. The problem is not technology; it is the current utility business model, and in the U.S. utility industry there is an increasing recognition that it has to change.

In Maryland earlier this year, in response to a recommendation from the Governor’s Task Force on Grid Resiliency to “scope out a Utility 2.0 pilot proposal,” the Energy Future Coalition (EFC) prepared recommendations for a smart grid pilot project that would address key business objectives, including a regulatory framework for the smart grid and a new utility business model that would keep utilities financially viable even as they delivered less electric power. EFC's key recommendation is a new utility business model that would decouple utility revenue from selling electric power.

Reforming the Energy Vision in New York State

New York has gone a step further. A Straw Proposal submitted by New York Department of Public Service (DPS) staff in its capacity as adviser to the Public Service Commission (PSC) proposes increasing the use and coordination of distributed energy resources (DER) via markets operated through a distributed system platform (DSP). Fundamentally this means that DSPs are not in the business of directly selling power themselves (unlike today's utilities), but create and maintain the market and infrastructure for the sale and distribution of electricity produced by the wholesale market (bulk generation) and thousands of DER generators. In the Straw Proposal, DER is used to describe a wide variety of distributed energy resources, including end-use energy efficiency, demand response, distributed storage, and distributed generation.

Earlier this year New York’s Governor Andrew Cuomo asked the New York Public Service Commission to fundamentally shift utility regulation to meet the needs of a more distributed, consumer-focused energy system.  In April 2014, the Commission issued an Order Instituting Proceeding for Reforming the Energy Vision (REV) which proposed a platform to transform New York’s electric industry, for both regulated and non-regulated entities, with the objective of creating market-based products and services that would drive an increasingly efficient, clean, reliable, and customer-oriented industry.  Regulatory reform would enable coordination of a wide range of distributed energy resources  to manage load, optimize system operations, and enable clean distributed power generation.  New and existing  markets and tariffs would be developed to empower customers to optimize their energy usage and reduce electric bills, while stimulating innovation and new products. 

The Straw Proposal supports the central vision of REV, that increasing distributed energy resources (DER) via markets operated through a distributed system platform (DSP) is achievable and offers substantial customer benefits. The proposal found there is large potential for the integration of Distributed Energy Resources (DERs) into the New York electricity market, via a Distributed System Platform (DSP) framework. The integration of DER offers customers the opportunity to manage their usage and reduce their bills while at the same time creating important system and societal benefits such as increased system efficiency and reduction of carbon emissions. (The DPS staff was assisted in preparing this proposal by Rocky Mountain Institute, the Regulatory Assistance Project, and the New York State Energy Research and Development Authority.)

Distributed System Platform

The foundation for Utility 2.0 in New York is the distributed system platform.  The definition of DSP used by the DPS staff is an "intelligent network platform that will provide safe, reliable and efficient electric services by integrating diverse resources to meet customers’ and society’s evolving needs. The DSP fosters broad market activity that monetizes system and social values, by enabling active customer and third party engagement that is aligned with the wholesale market and bulk power system."

The DSP provides basic functions including market operations, grid operations, and integrated system planning. 

  • The DSP will enable participation by DER service providers in a transparent, open market.  It will also  create an open platform for new energy products and service delivery by third parties and energy suppliers to consumers.
  • The DSP will need to integrate new market operation functions with both utilities’ existing grid operations and advanced “smart grid” capabilities.  The distributed grid will facilitate deployment of DERs, two-way power flows, advanced communications, distribution system monitoring and management systems, and  automated balancing of energy sources and loads. This is intended to lower peak demand on the bulk power system, increase reliability and manage investment needs of the distribution system.
  • The DSP will require modernization of power distribution systems. In the future, distribution systems will need to operate under conditions very different from those of today. Modernization of distribution systems must meet and balance important policy objectives such as system reliability and resiliency, customer empowerment, consumer protection, system efficiencies, cost-effectiveness, competitive markets, energy efficiency, power quality, fuel diversity, and responsible environmental stewardship.

Implementing the DSP

Currently, power distribution utilities deliver electricity services directly to end-use residential, commercial and industrial customers. The New York Independent System Operator (NYISO) operates the transmission network and manages wholesale electricity markets. Distribution utilities construct, maintain and operate distribution system infrastructure and assets. A growing number of customers are engaged in distributed generation, for example, rooftop solar. There are demand response and energy efficiency programs, sometimes run by the utilities themselves, sometimes by other agencies.

The Straw Proposal recommends that New York's existing distribution utilities, such as Consolidated Edison, National Grid, and New York State Electric and Gas, serve as the DSPs. This would entail New York's utilities focusing on energy distribution (the grid), market-creation and other electric services but getting out of the business of direct energy sales. 

Under the REV vision, the DSPs will be responsible for creating a retail energy market which includes both the wholesale energy market and the growing DER markets - you and me with our rooftop panels.

Utility grid operations divisions will continue to be responsible for distribution system planning across the electric network, including the distribution network and connections to the bulk power system.

The NYISO will continue planning for bulk system upgrades, bulk generation forecasts, and other service needs.

Customers will become participants in the management and optimization of the electric system through wide-scale adoption of DER products.  For larger customers this could involve actively managing energy usage and generation.  For smaller customers automatic technologies and controls could optimize their usage patterns.  DER service providers could become aggregators between customers and the DSP.

The Public Services Commission's role will be to maintain a critical oversight of the market.  This will include guidance and processes for market rule making, approving investment plans and rate designs by regulated utilities, and reviewing the activities of ESCOs (energy suppliers), third-party service providers, and utilities for compliance with market rules.

Information exchange

One of the important features of the Straw Proposal is an information exchange. Since customer electric power usage data can reveal opportunities for DER investment and the development of innovative products and services such as consumer apps, the Straw Proposal proposes a bi-directional electricity data exchange involving customer usage data from smart meters and data from energy-generating installations on both sides of the meter. The data exchange is intended to help with monitoring the distribution system, identifying opportunities for DER products and services tied directly to customer data, and supporting the development of innovative DER products and services. Customers would have the option to opt out of the data exchange program.

Privacy

Customers would also be able to access their own energy usage data in a standard format. In addition, customers would be able to authorize that their energy usage data be provided to third parties such as DER providers, to enable providers to develop and offer products and services that are tailored to the customer’s specific energy patterns and needs. New tools, often apps running on hand-held devices, are being developed to help energy consumers understand the alternatives in purchasing electricity from a third-party provider, installing solar PV on their roof, and other energy-related services.

All Points Blog: GIS Education News Weekly: GIS for Africa, GTCM, UWF Quality of Life Map

An update on GIS for Africa's 2014/15 Plans The session of GIS for Africa's EduCONNECT (2014/2015) started on Oct 13, a bit later than planned due to ebola finding its way to Nigeria. The EduCONNECT 2014/2015 is focusing on the revitalization of geography education. In our... Continue reading

Prioleau Advisors: IFTTT has a better geo strategy than Twitter

I just added a new recipe on IFTTT that sends me a message whenever someone sends a tweet within a specified radius. They bill it as a neighborhood tweet watch, but it could obviously be used to watch for tweets within any area or place. Right now it uses a simple point-and-radius geofence, but presumably that could be extended into any sort of bounding box.

I mentioned that I’d added this and got a number of responses, including a comment from Arjun Ram suggesting that IFTTT’s geo strategy is better than Twitter’s. It raises the question of whether Twitter does in fact have a geo strategy and why they’ve been so slow in doing anything in that area. After an early acquisition of the geo-team at Mixer Labs in 2009, they’ve not really done much. Not only has Twitter been slow to encourage location tagging of tweets that could benefit (something like 2-3% of all tweets have a location tag), they’ve not even deployed simple tools to allow location filtering of those tweets that are geotagged. Now IFTTT has done it.
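
For the curious, the point-and-radius check behind such a geofence boils down to a great-circle distance test. A rough sketch in Python (an illustration only, not IFTTT’s or Twitter’s code; the coordinates are made up):

import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two lat/lon points, in kilometres.
    earth_radius_km = 6371.0
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2 +
         math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) *
         math.sin(dlon / 2) ** 2)
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

def in_geofence(tweet_lat, tweet_lon, center_lat, center_lon, radius_km):
    # True if a geotagged tweet falls inside the point-and-radius fence.
    return haversine_km(tweet_lat, tweet_lon, center_lat, center_lon) <= radius_km

# A hypothetical geotagged tweet a block or so from the fence center, 2 km radius.
print(in_geofence(37.788, -122.407, 37.7879, -122.4075, 2.0))  # True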

Maybe it’s just not a priority. But it seems like there would be some useful applications. For instance:

  • Monitor tweets about a sporting event or concert only from people at the venue.
  • Monitor tweets along a route to detect traffic incidents. With a little creative UX design, you could make a driver-friendly, crowd-sourced traffic and road condition service similar to Waze.
  • Local businesses could monitor for people who were close by and sent out a hashtag about what they wanted (#lunchdeals) and respond with an offer.
  • News, public safety or marketing companies could monitor public places for increases in Twitter activity that might give early warning of a spontaneous event or unusual occurrence happening there. This would take some serious data munging to establish baseline activity from which anomalies could be detected.

Those may not all be good ideas, but they’re just a few I thought of in 10 minutes. The point is that the Twitter Firehose seems to have a vast amount of information, some of which is usefully tied to location. I’m surprised that Twitter has not done more to make something useful out of it.

LiDAR News: FARO Reports Strong 3rd Quarter

In a recent press release FARO announced strong earnings for the third quarter. Sales in the third quarter of 2014 increased 20.6% to $82.2 million from $68.2 million in the third quarter of 2013....

Click Title to Continue Reading...

GIS Lounge: Using Citizens to Map Atmospheric Particulates

Dutch researchers were able to successfully map levels of atmospheric particulates by pulling data from over 8,000 smartphone users.  A team from the University of Leiden in the Netherlands developed a smart-phone adaptor called an iSPEX.  This inexpensive, mass-producible add-on is attached to the front of the phone’s camera that [...]

The post Using Citizens to Map Atmospheric Particulates appeared first on GIS Lounge.

Boundless: Creating a custom build of OpenLayers 3 (Revisited)

Since OpenLayers 3 likely includes more than needed for any single application, we previously described how to generate custom builds with just the relevant code.

As promised back in February, things have changed for the better and it’s now easier than ever to create a custom build of OpenLayers 3 thanks to a task called build.js that uses a JSON configuration file. Documentation for the tool resides here.

Configuration file

We will now use build.js to create a custom build for the GeoServer OpenLayers 3 preview application, the same application that we used in our previous blog post. The full configuration file for our application can be found here.

The exports section of the configuration file defines which parts of the API will be exported in our custom build. By using the name of a class such as ol.Map we export the constructor. By using a syntax such as ol.Map#updateSize we are exporting the updateSize method of ol.Map. The exports section is basically the replacement for the exports file that we used in the older blog post.
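
For orientation, the exports section might look roughly like the fragment below. This is an illustrative sketch, not the project’s actual geoserver.json (linked above); a real file lists many more symbols and also contains the compile and define sections discussed next:

{
  "exports": [
    "ol.Map",
    "ol.Map#updateSize"
  ]
}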

We did not make any changes to the compile section. In the define section we are using the same defines as we were using before with Plovr, but its syntax is a bit different: "ol.ENABLE_DOM=false" versus "ol.ENABLE_DOM": false. By the way, OpenLayers 3 does not use Plovr anymore.

Compiling

To compile our custom build, first make sure you have run npm install in your OpenLayers 3 git clone directory, then use the following command:

node tasks/build.js geoserver.json ol.min.js

The first argument (geoserver.json) is our build configuration file, the second argument is the name of the output file.

That’s it!

The end result should be a much smaller file, tailored to the needs of this specific application.

Interested in using OpenLayers in your enterprise? Boundless provides support, training, and maintenance. Contact us to learn more.

The post Creating a custom build of OpenLayers 3 (Revisited) appeared first on Boundless.

LiDAR News: Magic Leap

"I would definitely say they are doing some sort of laser scanning or LiDAR." Continue reading →

Click Title to Continue Reading...

All Points Blog: Get Paid to Have Your Cell Phone take Video Outside your Window for Geodata Collection

Placemeter is paying people up to $50 a month for the video feed they supply to the company. The raw film is transmitted to a sensor that turns the feed into aggregated and anonymised data for local businesses, urban planners and advertisers to purchase so that they can get a more... Continue reading

All Points Blog: GIS Government Weekly News: PA Gets Coordinating Committee, Open Data Catalog, Vote in Vermont

PA Gets a GIS Coordinating Committee Only three U.S. states (and commonwealths) do not have a geospatial coordinating council. But Pennsylvania soon will should the state bill be signed into law. A local paper has the story. Campbell County's 1% Tax Map Campbell County, Wyoming put... Continue reading

AnyGeo: Antares Rocket Explodes on Lift Off – suffers a catastrophic anomaly!

A tragic and sudden end to the Antares Rocket for NASA today as the launch went sideways, so to speak, and the rocket exploded shortly after launch. This from the CBC… The 14-storey rocket, built and launched by Orbital Sciences … Continue reading

Letters from the SAL: Canopy Height Models - An Object-Based Approach

Canopy Height Models (CHM) derived from airborne LiDAR are nearly as old as LiDAR itself.  CHMs are typically raster representations of the tree canopy, but in some cases people have used the term to describe models that represent all features above ground, whether or not they consist of only canopy.  A true CHM is one in which other above-ground features such as buildings and utility lines are removed.

Even if a CHM is accurate in the sense that it only represents tree canopy LiDAR returns, there are two primary limitations with most CHMs. The first is that the CHM is stored in raster format. Raster cells don't represent actual features, and thus the data are less accessible to decision makers who may have questions such as "Where are the tallest trees in our community located?" and "How many trees over 80 feet do we have in our parks?" The second limitation stems from the fact that LiDAR are often acquired leaf-off, and thus a CHM derived from leaf-off LiDAR does not represent the canopy, but rather the occasional branch and stem that generated a return from the LiDAR signal.

As part of our tree canopy assessment for Boone, Campbell, and Kenton Counties in northern Kentucky, which we carried out in collaboration with Mike Galvin (SavATree) for the Northern Kentucky Urban and Community Forestry Council, we developed an object-based approach to canopy height mapping that overcomes the limitations of traditional CHMs. Our object-based approach to land cover mapping integrates leaf-on imagery (A) and leaf-off LiDAR (B) to map tree canopy (C). This process overcomes the limitations inherent in the imagery (no clear spectral signature for trees) and the LiDAR (leaf-off returns resulting in tree canopy gaps) to create a highly accurate tree canopy map. In this project the accuracy of the tree canopy class was 99%. We then feed the LiDAR (B) and tree canopy (C) into a second object-based system that creates polygons approximating tree crowns and returns the max (D) and average (E) canopy height using only those LiDAR returns that are actually trees. The result is a vector polygon database that can be easily queried and merged with other vector datasets for subsequent analysis.
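
For readers who want to try something similar, the last step (per-crown max and average heights from a height raster and crown polygons) can be approximated with a zonal statistics call. The sketch below is an illustration under assumed file names and units (a height raster in feet), not the production system described above:

from rasterstats import zonal_stats

# Crown polygons from the object-based segmentation plus a LiDAR-derived
# canopy height raster; both file names are placeholders.
stats = zonal_stats("tree_crowns.shp", "canopy_height_ft.tif", stats=["max", "mean"])

# Each record now carries the max and mean height for one crown polygon,
# so questions like "How many trees over 80 feet?" become simple queries.
tall = [s for s in stats if s["max"] is not None and s["max"] > 80]
print(len(tall), "crown polygons taller than 80 ft")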

This project would not have been possible without the LiDAR data distributed through the Kentucky from Above program.  If you would like to reference the graphic we have posted it to FigShare.


All Points Blog: Weather Radar Captures Antares Rocket Explosion

Correction Oct 29: Updated to clarify Satellite vs. Ground Radar. Thanks to reader Robb for pointing out the error. --- Baron Services, a weather technology company, captured the explosion of the Antares launch on weather radar. Baron’s most accurate weather radar technology... Continue reading

James Fee: Cartographer Mapping Shots to Evaluate NBA Players

Cartographer Mapping Shots to Evaluate NBA Players: Unlike the static, state-to-state action in...

knowwhere GIScussions: A #WTF Story Map

Maps tell stories. Thanks to Kris Krug https://www.flickr.com/photos/kk/

The idea that a map can tell a story or illuminate a timeline is not new but it appeals to the geo evangelist in me. Nowadays it is pretty easy to produce a fairly slick map that tells a story.

  • The Knight Foundation have StoryMap JS which has some simple but elegant examples in its gallery like this Yahoo map of the World Cup nations. You could say that these maps are more about the images than the map but it is a neat way to tell the story and it works for me.
  • MapStory is an open source app that enables the user to build and share animated time sequence maps. Have a look at ‘The spread of the killer bees’ for a good example, but remember to switch the legend on or it really doesn’t tell you anything.
  • Esri offer their StoryMap service which fuses multimedia into elegantly curated stories which are a lot more than just another animated map. I like the approach of 10 National Parks threatened by oil trains for example.

Esri’s StoryMap really is sweet, and building a story map using one of their templates looks pretty simple to me. I couldn’t find an animation option (which seems to be MapStory’s key differentiator), but maybe a bit of custom coding would get you there. MapStory feels a bit clunky, and certainly the examples in the gallery don’t have the polished feel of StoryMaps or StoryMap JS (if you aren’t confused by the variants of ‘map’ and ‘story’ yet, you should be).

When one of the Esri Twitter accounts pointed at a StoryMap, I was expecting something pretty neat.

Unfortunately this example shows that just loading points into a story map does not tell any story or add very much to the underlying data. This mass of pins and scrolling call-out boxes with link-outs all over the place is just a mess; I am not sure what it is meant to be telling me or why Esri think this is a great example of their StoryMaps product. For non-Californians who don’t know him, Huell Howser was California’s answer to Michael Palin (but a little more constrained in terms of the places he visited), and his programmes have something of a cult following.

Story Maps or Map Stories can communicate ideas by navigating via the map, by linking a story line and images to the map, or by animating timelines. They can be interactive multimedia spatial infographics (that sounds like I have signed up for the Esri marketing team), but they can also be really naff, crass and cartocrap [(c) Kenneth Field] if we aren’t careful.
