The Only Fantasy World Map You'll Ever Need by Jake Manley isn't the first map of its kind that I've seen (see also the map in Diana Wynne Jones's Tough Guide to Fantasyland); still, it's clear that fantasy maps are a proven vehicle to satirize and critique the genre. (And be satirized and critiqued.) Via @scalzi.
A recent mini-series of GeoCurrents posts by Claire Negiar discussed divided islands that were in the past, or still are, bones of contention between sovereign states. Other islands have become the subject of international disputes in their entirety. One case is Damansky (Zhenbao) Island, which has been disputed by Russia and China, as discussed in an earlier GeoCurrents post. Another example is Snake Island, also known as Serpent Island, or Ostriv Zmiinyi in Ukrainian and Insula șerpilor in Romanian. This tiny islet—with a total area of 0.066 square miles (0.17 square kilometers)—has been at the center of a dispute between Ukraine and Romania. Although Snake Island is now officially recognized as part of Ukraine, with the territorial limits of the continental shelf around the island having been delineated by the International Court of Justice in 2009, some discontent remains on the Romanian side.
This post is from GeoCurrents
Some of GeoDa's features are either (a) not present in ArcGIS and its extensions or (b) only found in ArcGIS Advanced (formerly ArcInfo), namely the creation of spatial weights using polygon contiguity/adjacency. (Note: You can create weights in ArcGIS based on distance, for example.) To help keep all maps uniform, I imported the results into QGIS. Click on any image below to magnify it. You can find definitions for the terms and statistics used here.
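If you prefer scripting the same statistics, the open-source PySAL library (from the same group behind GeoDa) can build contiguity weights and compute Moran's I. A minimal sketch, assuming a hypothetical counties shapefile with a PCT_UNINS column:

import numpy as np
import pysal

# Queen contiguity weights built from polygon adjacency (hypothetical shapefile)
w = pysal.queen_from_shapefile('counties.shp')

# Attribute of interest (hypothetical column name)
db = pysal.open('counties.dbf')
y = np.array(db.by_col('PCT_UNINS'))

# Global Moran's I with a permutation-based pseudo p-value
mi = pysal.Moran(y, w)
print(mi.I, mi.p_sim)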
For the official map, for comparison, visit here.
Global (p=0.02) and local autocorrelation are present.
The Moran scatter plot of percent uninsured vs. lagged/neighboring counties has an r-squared value of 0.74.
No global autocorrelation (p=0.51), but local autocorrelation is present in parts of states or throughout most of particular states: for example, the low percent uninsured (and low percent in poverty) in Massachusetts, which underwent significant healthcare reform in 2006. What do you think about some of the other states?
Those states sprinting ahead with implementation and those sitting it out.
ArcGIS 10.2.2 is now available for download from the Esri Customer Care site. For the most part 10.2.2 is about bug fixes. The complete list of issues addressed can be found here. If you plan on upgrading your 10.2 or 10.2.1 ArcGIS Server site to 10.2.2, please read the ‘What is new in 10.2.2’ section in the help as it includes helpful compatibility tables and other information. In addition to the issues addressed, here are some key new aspects of ArcGIS for Server and Portal for ArcGIS that you should be aware of:
We recorded a 15-minute video tutorial to give you a quick tour of this new functionality. If you would like to build your own applications, use the 10.2.2 release of the ArcGIS Runtime SDKs; they work well with your ArcGIS 10.2.2 Server.
Important note: The installation utility for completely disconnected Portal for ArcGIS environments will be made available for download via the Esri Customer Care site the week of April 21st. Stay tuned!
Enjoy 10.2.2!
The ArcGIS Server Team.
One thing I would ask: if you have changes or want to add things, forward them to me (you aren't required to). My ultimate goal is to move this into GitHub to make it more collaborative. Model Builder has been around long enough that it's worth knowing about. I think this class covers enough of the basics to get you started and happy with the software. I'm hoping that by version 6 I'll have this class in a good spot, at a level where it helps more people. I also want to take "me" out of the class; in other words, you should be able to read through it without me standing up front answering questions.
I'm placing it on the website in PDF and ODP (OpenOffice) formats. Like I said, the ultimate goal is GitHub. I'll be slowly working on this class and keeping it free. Once again, if you want me to come teach it, contact me and we will work something out.
The data! The data comes from a project I did several years ago, the Conasauga River Project. At one time I was using this data for every class I taught. There is more data here than you will need, but I didn't want to break the dataset up. It was a great project GIS-wise, and it's been very useful as training material.
Just to protect NRGS: this class is released as is. There is no warranty on the class or data, and NRGS will not be held responsible if you build a model that destroys you and your computer, or causes natural disasters that end life as we know it.
Go nuts -> http://www.northrivergeographic.com/model-builder
As illustrated in this collection of Landsat 7 images, Landsat 7 provides a worldwide audience with objective views, both current and historical, of events and trends across the global landscape. Landsat data can be used to detect and monitor urban growth, forestry practices, the extent of floods, wildfire burn acreage, major natural or human-caused disasters, and many other important changes in land-surface conditions.
Landsat 7's remarkable longevity has been vital to the majority of Landsat data users who require frequent imaging of specific areas for land and resource management. For example, water resource managers in western U.S. states need Landsat's unique combination of thermal and vegetation condition readings at field scale to estimate water use more efficiently for crop irrigation — typically the major source of water consumption in these arid regions.
Continuous data and more of it
Combined with Landsat 8, Landsat 7 ensures the collection of images across the entire U.S. every eight days (clouds permitting) and enables the collection of critical global imagery sets on a seasonal basis. Working in tandem, Landsat 7 and Landsat 8 together collect nearly 1000 images daily, almost double the amount of data collected when Landsat 5 and 7 were operating together. This increased data collection benefits all Landsat applications, especially in persistently cloudy areas (e.g. humid tropics and high latitudes) where multiple imaging attempts are essential.
A mechanical failure onboard Landsat 7 in 2003 reduced the amount of data in each scene by 22%. However, this loss does not affect the quality or usefulness of the remaining data. Many users simply treat the blank areas of each scene as if they were obscured by clouds. Landsat customers, especially those with agricultural interests, who require 8-day repeat data collection have a strong motivation to use Landsat 7 data even with the 22% loss per scene.
Barring the failure of any key spacecraft component, the remaining fuel on Landsat 7 is expected to permit imaging operations through 2017. NASA and the USGS are working together on a plan to ensure long-term continuity of land imaging operations while also addressing the near-term need to replace Landsat 7.
Seeing the world’s forests and the trees
An outstanding example of the scientific value of Landsat 7 imagery is the research published in 2013 by a team of scientists led by the University of Maryland. This team analyzed data from Landsat 7 to map changes in forests from 2000 to 2012 around the world at local to global scales.
Published in the journal Science, the study comprehensively described changes in the world’s forests from the beginning of this century, tracking forest loss and gain at the spatial granularity of an area covered by a baseball diamond (30-meter resolution). The uniform data obtained from more than 650,000 scenes taken by Landsat 7 ensured a consistent global perspective across time, national boundaries, and regional ecosystems.
A Landsat primer
Landsat images from space are not just pictures. They contain many layers of data collected at different points along the visible and invisible light spectrum. Consequently, Landsat images can show where vegetation is thriving and where it is stressed, where droughts are occurring, and where wildland fire is a danger.
Landsat satellites give us a view as broad as 12,000 square miles per scene while describing land cover in units the size of a baseball diamond. From a distance of more than 400 miles above the Earth's surface, a single Landsat scene can record the condition of hundreds of thousands of acres of grassland, agricultural crops, or forests.
Landsat data have been used to monitor water quality, glacier recession, sea ice movement, invasive species encroachment, coral reef health, land use change, deforestation rates, and population growth.
Free data for innovation
The Department of the Interior and USGS policy of unrestricted access and free distribution of Landsat data encourages researchers everywhere to develop practical applications of the data. Ready access to Landsat images provides a reliable common record of Earth conditions that advances the mutual understanding of environmental challenges worldwide by citizens, researchers, and decision makers.
USGS role in observing Earth
USGS and NASA have distinct roles in the Landsat program. NASA develops remote-sensing instruments and spacecraft, launches satellites, and validates their performance. The USGS then assumes ownership and operation of the satellites, in addition to managing ground-data reception, archiving, product generation, and distribution. USGS has managed daily end-to-end Landsat operations since October of 2000.
Learn more
USGS Landsat (latest satellite status and related information)
What is the Economic Value of Satellite Imagery? (USGS Professional Paper)
Field & Stream called it a "…very cool tool and quite a bit of fun." MinnPost described it as a "…high-tech illustration of Norman Maclean's timeless view that, 'Eventually, all things merge into one, and a river runs through it.'" And Popular Science noted that, "There's something especially satisfying about clicking a stream that…shoots its way across multiple states to empty into the ocean."
These publications are all describing Streamer, the popular on-line mapping program from the U.S. Geological Survey. Streamer is a powerful, yet easy way to explore our major waterways. With a simple map click, anyone can trace rivers and streams from a starting point all the way downstream to where a stream drains. Even more impressive, they can click on a stream and trace all others that drain to that point. Streamer also produces a report that includes a map and information about the people and places encountered along the streams traced.
As good as Streamer was when it launched last summer, it just got better. Four major enhancements and dozens of small improvements have been made. These include:
It’s fascinating to explore the connections among our major streams and rivers using this improved new edition of Streamer. In its first eight months in service, Streamer users traced more than 2.9 billion river miles.
The USGS announced in February that it is ending production of the National Atlas on September 30, 2014 and that some of its products and services would transition to The National Map. With this release, Streamer becomes the first of these National Atlas products and services that The National Map will offer. For cartographers and geospatial information professionals, Streamer’s surface water data is available for download at no cost.
For more information: http://nationalmap.gov/streamer/
ACRP Report 88: Guidebook on Integrating GIS in Emergency Management at Airports consists of a guidebook and a CD with worksheets to help airports identify needs and assess current capabilities with respect to using geographic information systems (GIS) in emergency management (EM). The information collected in the worksheets becomes the backbone of a GIS-EM integration plan. A PowerPoint presentation (available on the TRB website by searching for ACRP Report 88) outlines the benefits of integrating GIS into EM and can be used when presenting those benefits to stakeholders. And the backgrounder:
A geographic information system (GIS) can be a productive tool to enhance EM and significantly reduce the gap in information flow and accuracy. For example, several airports have airport-specific assets mapped in various GIS layers including the following:
- Runways;
- Gates;
- Terminals and buildings;
- Roads and parking;
- Power stations and utility lines;
- Storage facilities;
- Fire suppression and alarm system components;
- IT infrastructure, location of on-site staging areas; and
- Other items, such as lease space/tenant information.
These assets and their associated information can be key components of modern-day EM operations.
If you are curious about GeoPackages, here's what you can do. At 10.2.1 or 10.2.2, ArcGIS Desktop can create an empty GeoPackage and populate it by copying feature data into it. At 10.2.1 we supported the draft version of the specification, and at 10.2.2 the final version of the spec is supported. Currently we support only vector features, but with 10.3 we expect to extend support to raster tiles. One of the primary use cases driving GeoPackage adoption is mobile support. Expect to see support for GeoPackage in the runtime later this year.
So if you are a SQLite database aficionado and would like to test the waters with GeoPackage, here's what you can do today with 10.2.1 or 10.2.2. You can use the included script to create a sample empty GeoPackage and then populate it with vector features. Use this GeoPackage as you would any other dataset. We have noticed that in some cases, when navigating to a directory that contains GeoPackage (.gpkg) data, ArcCatalog/ArcMap does not display the file. Please review this KB article if you run into this issue: http://support.esri.com/en/knowledgebase/techarticles/detail/42348
Lance Shipman on the database team has been actively involved with this effort from the very beginning. Lance and I would welcome your feedback, as we at Esri continue to improve and extend GeoPackage support in 10.3.
Sample Python script to create a GeoPackage:

import arcpy

# Set local variables (a raw string keeps the backslashes from being read as escapes)
sqlite_database_path = r'C:\data\example.gpkg'

# Execute CreateSQLiteDatabase with the GEOPACKAGE spatial type
arcpy.gp.CreateSQLiteDatabase(sqlite_database_path, 'GEOPACKAGE')
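To populate the new GeoPackage, you can copy an existing feature class into it. A minimal sketch, assuming a hypothetical roads shapefile (the name after the .gpkg path becomes the table name inside the package):

import arcpy

# Hypothetical input data; substitute your own feature class
source_features = r'C:\data\roads.shp'
geopackage = r'C:\data\example.gpkg'

# Copy the features into the GeoPackage as a table named 'roads'
arcpy.CopyFeatures_management(source_features, geopackage + r'\roads')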
The following are 10 Geo blogs of interest that find their way into our daily (or almost daily) reading:
Obviously, this list is a little subjective and meant simply as a sampling of suggestions for your Geo reading. We easily could have built a list of 100 favorite blogs, but 10 seemed to be a good place to stop! Do you know about or maintain a GeoTech blog that we should know about? Feel free to share with us on Google+ or hit us up on Twitter @CloverpointVic with a link.
The post 10 Geeky Geotech blogs to add to your favorite Geo news reader appeared first on CloverPoint.
We will show you how you can download maps to your device, collect and update GIS features whilst disconnected, and then synchronize changes back to the office when you are reconnected.
Presentations will happen at 9:00 a.m. PDT, 11:00 a.m. PDT, and 3:00 p.m. PDT. You can find more details on the seminar here. You can download and read more about Collector here.
CanVec is a dataset produced by the federal Department of Natural Resources. It's been made available to use in OpenStreetMap: users have to download the data for a given area and import it into the OSM database.
It's a great resource, but I've been giving CanVec the side eye for years, largely because OSM users had been bungling the imports and not cleaning up the mess they made. To some extent it also encouraged a certain amount of laziness from Canadian OSM users: why go to the trouble of tracing imagery or going out with a GPS if you could just download the data from the Natural Resources FTP server?
That said, most of my complaints were from a few years ago; it's been a while since I've seen a CanVec-induced mess in the database (for example, doubled or even tripled roads imported on top of one another). And between existing imports and the improved Bing aerial and satellite imagery coverage, there weren't many places I was aware of that I could, you know, try a CanVec import for myself.
Except one.
Hartney, a town of a few hundred people in southwestern Manitoba, managed to fall between the cracks of two swaths of aerial and satellite imagery. It was a noticeably empty patch of a map that was starting to fill up.
It was also the town my father grew up in. I spent a lot of time there as a child. I was, suffice to say, familiar with it. It was therefore a natural target for me to map. But with no imagery and no realistic chance of my visiting there in the near future, I was not likely to do so in the usual manner.
So I imported CanVec data.
It turned out to be a lot easier than I expected. For one thing, I didn't have to import the entire tile: I could import only the items I wanted. For another, I didn't have to resort to JOSM or some other application I was unfamiliar with; I could, it turned out, do it in Potlatch, the Flash-based web editor I've always used, by importing the downloaded zip file as a vector layer and alt-clicking each element through into the edit screen.
But easier still wasn't objectively easy. I had to figure out what file to download from the FTP server by looking it up on the Atlas of Canada, and figuring out which of the files to import into Potlatch is a bit of a trial-and-error thing. There's also a bit of a delay before the CanVec layer shows up in your edit window.
In the end, though, I was able to figure it out, with the following results:
I practiced good edit hygiene: I created a separate user account for imports (here) and I cleaned up what I edited: I joined road segments so that a road five blocks long wasn't five separate ways, I straightened a badly garbled stretch of rail line, and I added a couple of points of interest I knew from personal experience.
In the end, I think I've left the map better than I found it. I didn't do everything I could have: CanVec isn't perfect, and I'm not in a position to verify its data on the ground, so I adopted a less-is-more approach rather than simply adding a ton of data for someone else to clean up. Nor did I add so much that it would discourage a local user from adding more, better, and more up-to-date material.
A positive experience overall. I was surprised.
Two books (well, one is sort of book-ish) related to map art and personal cartography to tell you about:
Jill Kelly's previous work, Personal Geographies: Explorations in Mixed-Media Mapmaking, was reviewed here in 2011.
On March 14, 2014, The Washington Post online published an article by Reid Wilson entitled "The United States of smoking: The state with the most tobacco farms smokes most often". Although much in this article is factually accurate and instructive, I object to the title on two grounds.
The first problem concerns the epithet “the United States of smoking”. Even ...
This post is from GeoCurrents
Dear Colleagues,
The United States Geospatial Intelligence Foundation (USGIF) is excited to announce the opening of our 2014 Scholarship Program. USGIF is dedicated to assisting promising students interested in geospatial sciences with scholarship awards to further the advancement of the geospatial intelligence tradecraft.
With your help, we can make our 2014 Program the most successful yet. Please pass along this information to your contacts and/or students and download our 2014 Scholarship Program flyer.
Last year the Foundation awarded $107,000 to 25 recipients and plans to award at least $100,000 for the 2014 program. High school recipients are awarded $2,000 per scholarship and all others are awarded $5,000 each. Since 2004, USGIF has awarded $584,000 in scholarship funding to promising students in the geospatial intelligence field.
Students studying in fields such as geography, political science, physics, computer science, engineering, biology, anthropology, sociology or any field in the natural and social sciences are encouraged to apply. Through the USGIF Scholarship Program, the Foundation strives to communicate to students the breadth and power of GEOINT in serving national, global and human security interests.
The USGIF Scholarship Program endeavors to support students with innovative ideas for problem-solving with Geospatial Science and Technology. Please share information about this scholarship program with your students.
For more information on the USGIF Scholarship Program or to download applications, please visit: http://usgif.org/education/scholarships
Deadline to apply: April 25, 2014
Thank you for your assistance.
Sincerely,
R. Maxwell Baber, Ph.D., FBCart.S
Director of Academic Programs
United States Geospatial Intelligence Foundation
Map Tour is configured using the builder, an online configuration tool that enables you to assemble a tour and customize its look and feel and color scheme.
The default story map, with a dark gray banner, “A story map” linked text, and Esri logo, is shown below:
Changing the banner color
To change the banner color, click Settings.
Then choose the Colors tab to select from one of the preset color combinations, or click the last row to create your own color scheme.
Changing text color
With the color changes shown above applied to the header, the text is difficult to read:
Currently there isn’t a separate color setting for the title and subtitle text, but inline styles can be used to change them. Click the title and subtitle to edit the text, and add the style changes inline using the text editor. Inline styles can be used to change other attributes, like font size, weight, and more.
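For example, a hypothetical inline style wrapped around the title text (the color and size values here are just placeholders) could look like this:

<span style="color:#333333; font-size:26px;">My Map Tour Title</span>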
The result, after applying the style change to both the title and subtitle, is shown below:
Change the header logo and links
In Application settings click the Header tab to change the logo, links, and shortcut and social options, then click Apply.
The changes applied above alter the right side of the application header as shown below:
Remember to save all your changes before you leave builder mode.
Advanced customization
You can download, modify, and host the Map Tour from your own server. With a downloaded and self-hosted template you can make similar style changes (and more) using the techniques outlined in a previous post. The example below adds additional color tweaks, plus a custom header image using style overrides in the source HTML.
For more information see
DigitalGlobe, a Colorado-based Earth-imaging company, trained two of its satellites on the locations where the plane was believed to have gone down, and put out a call to computer users to scour the ocean. Pinpoint images taken from 400 miles above are so vivid that you can see objects measuring a foot square.
Imagery covering thousands of square miles of water is divided into smaller areas that crowdsourcing volunteers can zoom in on and tag if they see anything that may resemble debris. The areas where the most tags are dropped will be sent to authorities as promising areas to search.
At press time a month ago, more than 100,000 people an hour had been analyzing the images and had covered nearly 4,000 square miles of ocean. Every pixel had been viewed by at least 30 pairs of eyes.
According to the report, images from another 5,500 square miles of open water are expected to be posted for examination on the system, known as Tomnod.
Nearly 650,000 tags had been dropped in an area between the Gulf of Thailand and the South China Sea where the Boeing 777 was first believed to have gone down on Saturday.
(Note: This is the second of two articles by Stanford student Claire Negiar that together contrast the situations of two geopolitically divided islands: Saint Martin and Cyprus)
Cyprus and Saint Martin – two very different islands sharing one key property: both are split by their “mother countries,” Greece and Turkey in the case of Cyprus, France and the Netherlands in the ...
This post is from GeoCurrents
Tech Cocktail Sessions focus on bringing industry experts and successful entrepreneurs to share their stories and answer questions. Amber was the co-founder and CEO of Geoloqi, a company focused on mobile location technology that Esri acquired in 2012. Her team in Portland is working on more cutting edge location technology for the company that essentially invented GIS, and whose software she first used at the age of 12. So we and the Tech Cocktail folks think Amber (aka @caseorganic) definitely has some unique insights and expertise to share.
If you haven’t seen Amber talk before, check out her TED Talk about cyborg anthropology or dConstruct talk about ambient location. And if you happen to be in Vegas Friday try to make it to the Tech Cocktail Session.
For example, for the 2001-2002 school year, the cleaning contract for South Hull Elementary School was $40,131 and in 2012-2013 it reached $83,665, but for 2013-2014 the contract dropped to $37,098. For Eardley Elementary, the cleaning contract was $20,713 for the 2001-2002 school year. In 2010-2011 it jumped to $156,563 from $36,432 in 2009-2010. In 2011-2012 it reached a staggering $173,668. For the 2013-2014 school year the school board changed service providers and the price dropped to $23,144.
That's astonishing: cleaning services for just two of the WQSB's five urban elementary schools had ballooned to nearly a quarter million dollars a year. Those same services now cost about $60,000 a year -- a savings of nearly $200,000. How could one contractor justify $173,668 for a job another contractor could do for less than one-seventh of that price? Anyone who recalls the stress and angst over the Board's proposal to close schools (such as Shawville's elementary school) to make up a million-dollar shortfall last year should be shaking with rage right now.
Maybe we'll see you on the road!
The post A few PacNW Geo and Tech events of interest on the horizon appeared first on CloverPoint.
This is a very big year for ArcGIS for the Military. Among our many other projects, we will release several templates for our new Domestic Operations solution. Previously, Esri offered a limited set of templates known as ArcGIS for the National Guard, but as we expand our solutions, we are recognizing more opportunities to develop new templates for our customers.
Domestic Operations, also known as "Defense Support of Civil Authority" in the United States, are the collective methods by which the military is integrated into a comprehensive approach to national security at the federal, state, and local government levels. In the US military, the first element tasked with domestic operations is the National Guard, which usually works in conjunction with each state's Emergency Management Agency. We integrated our existing National Guard and Emergency Management solutions to create the nucleus for Domestic Operations.
In many countries, domestic operations are the military’s primary mission, and our templates will be configured for international customers. Aside from their geography, the main difference between the domestic and international templates will be in their doctrinal terminology and symbology. The US-focused templates will use terms and symbols based on US Department of Defense (DOD) and Federal Emergency Management Agency (FEMA) guidelines. The international templates will use more general terms and symbols based on North Atlantic Treaty Organization (NATO) and United Nations Office for the Coordination of Humanitarian Affairs (UN OCHA) guidelines.
The templates based on Esri technology simplify the creation and delivery of geospatial information and products to decision makers. They help personnel in your organization to conduct analysis, plan disaster response, monitor operations and improve workflows. This gets relevant information to the people that need it quickly.
New Geographies
Many Solutions map and application templates come with sample data to give you a starting point from which to use the template. Anyone familiar with Esri technology probably knows the city of Naperville, Illinois. It’s a great place, but a little landlocked, so we decided to find a location that offered more opportunities to showcase the possibilities of our Solutions. The Tidewater area of Virginia is centrally located on the Eastern Seaboard, and close to many of the federal or military agencies that would respond in the event of a large scale disaster.
To support emergency management and disaster response operations by our international customers, we are also creating templates for the city of Rio de Janeiro, Brazil. Using open-source data from the Urban Observatory (http://www.urbanobservatory.org/), we were able to create an operational environment on which we are building a mock event to showcase the capabilities of Esri templates in the ArcGIS platform.
Customer Interaction
We’ve received data from the Commonwealth of Virginia to use, and are beginning to build the data infrastructure that will provide context for our template development. We are also working with the Commonwealth Advanced Situational Awareness Working Group (CASAWG) to expand this relationship to all of the state agencies involved in the Virginia Emergency Response Team (VERT) and the localities affected by our mock event.
New Products
Several new applications are shaping the way you implement geospatial technology for domestic operations. Ready-to-use apps such as Collector for ArcGIS and Operations Dashboard for ArcGIS; new shared awareness tools such as the Executive Dashboard and Briefing Book; and applications such as the new ArcGIS Web Application Builder (AWAB) and ArcGIS Pro App will be the foundation of our solution in the future. We will transition existing templates to, and build new templates on, the newest versions of our software.
Portal for ArcGIS Template
We plan to package all of these solutions in a model Domestic Operations organization, providing a scripted deployment of Portal for ArcGIS that has groups for each functional area from the Incident Command Structure and Emergency Support Function. You’ll be able to download this package to rapidly deploy a configured domestic operations organization for your site. To ease the transition between browsing on the Solutions pages and Portal for ArcGIS, the organizational groups will mirror each other.
Learn More
To learn more about ArcGIS map and app templates for domestic operations, please contact me at jbayles@esri.com, or the Defense team at defenseinfo@esri.com. US National Guard members can contact the National Guard account manager, Mr. Andrew Smialek, at asmialek@esri.com.
ArcGIS for the Military – Domestic Operations Solutions
ArcGIS Defense & Intelligence Forum
ArcGIS Defense & Intelligence Ideas
By: Joe Bayles, Defense & Emergency Management Solutions Engineer
Big difference, huh? This is a fully configurable template that allows you to create your own unique web mapping applications.
Overall, we’ve made a simpler, more usable mapping application. We have moved the drop down menus into a side panel that can be collapsed to accommodate different screen sizes. You can add a short summary of your map and drive users to areas of interest through map notes and bookmarks. Layers can be turned on or off and social media layers can be configured.
Esri’s Disaster Response Program uses this template to create applications highlighting wildfires, hurricanes, severe weather, flooding, and earthquakes. An example of a customized Public Information template is our Severe Weather map.
Are you as excited as we are with the new look? We would like your feedback on the new template! Please send us your comments.
Happy Customizing!
...a team of researchers, epidemiologists and software developers at Boston Children's Hospital founded in 2006, is an established global leader in utilizing online informal sources for disease outbreak monitoring and real-time surveillance of emerging public health threats. The freely available Web site 'healthmap.org' and mobile app 'Outbreaks Near Me' deliver real-time intelligence on a broad range of emerging infectious diseases for a diverse audience including libraries, local health departments, governments, and international travelers. HealthMap brings together disparate data sources, including online news aggregators, eyewitness reports, expert-curated discussions and validated official reports, to achieve a unified and comprehensive view of the current global state of infectious diseases and their effect on human and animal health. Through an automated process, updating 24/7/365, the system monitors, organizes, integrates, filters, visualizes and disseminates online information about emerging diseases in nine languages, facilitating early detection of global public health threats.
1.) It's a first step toward the delivery of a real-time understanding of the public health situation around the world, and
2.) To the degree that it can, it offers granularity by reporting local developments at the lowest level publicly available.
April showers may bring May flowers, but spring can also bring ice jams to the thawing rivers and streams across the northern United States.
An ice jam, or ice dam, is a buildup of broken ice in a river system. It can cause water to back up over the top of highway bridges and roads, or into cities. At times, ice jams cause flooding. They can be large, backing up water for miles, or small, backing up water only in a limited local area.
An ice jam can damage bridges: the water pushing on the jam from behind can force the ice against the bridge, moving it slightly.
USGS monitors ice jams across the north using cameras as well as by collecting ice thickness information when technicians do regular streamgage work or when measuring discharge on the rivers in the spring.
For example, each year, the Maine Emergency Management Agency and U.S. Coast Guard asks the USGS to measure the ice thickness and provide an ice jam flood potential on the Kennebec River. The U.S. Coast Guard has used their ice breakers to clear the ice in the lower Kennebec River in years when the ice jam flood potential was high.
Greg Stewart, data section chief for the USGS New England Water Science Center, said it's part of the agency's job to monitor river flows throughout the state of Maine and to measure the streamflow underneath the ice.
USGS technicians take ice cores to measure ice thickness at various places on the rivers. Making an ice measurement requires drilling 25 to 30 holes in the ice; thickness is then recorded at several of those holes to help document the measurement conditions.
That information allows the USGS to assess the risk of ice jams, flooding or other problems when the ice begins to melt, Stewart said. According to Stewart, when ice jam flooding starts to happen, there is very little time and very little warning.
The thickness of the ice and how fast a melt occurs affect the ice jam flooding potential. While the weather is cold and the water is freezing, ice accumulates. When the weather warms, the ice begins to melt and break up. A quick warmup while the ice is still strong and in place can cause significant ice jam flooding.
Another sign of spring is the melting snowpack, the result of accumulated layers of snow that are generally deeper at high altitudes. Snowpacks feed rivers and streams, providing aquatic habitat, hydropower, and a possible source of drinking water, but they are also a potential flood hazard.
A quick warmup, with high temperatures over a short period of time, increases the likelihood of flooding from snowpack melt. A gradual increase in spring temperatures, with moderate temperatures during the day and slightly-below-freezing temperatures at night, lets the flooding potential decrease slowly and safely.
According to Stewart, snowmelt-driven runoff historically begins in March, when the snowpack starts to melt, and that is when the USGS looks at the flooding potential.
To learn about water levels at a streamgage near you, sign up for alerts to your email or cell phone here!
Learn More:
Episode 6: Ice jams, flooding likely in Nebraska this spring, transcript: March 4, 2010
In The Geology of Game of Thrones, a group of geologists has created a geologic map of Westeros and Essos, as well as an invented geologic history of the planet on which George R. R. Martin's epic takes place. Via io9.
This isn't the first time a fantasy world has been looked at through a geologic lens. Karen Wynn Fonstad's Atlas of Middle-earth took a reasonably rigorous look at the landforms of Middle-earth. And Antony Swithin -- a geologist in real life under his real name, William Sarjeant -- created a geologic map of his invented island of Rockall (see previous entry).
Previously: Review: The Lands of Ice and Fire.
Basic HTML code sourcing the two frames. The "Chicago" frame was generated from QGIS. The "Legend" frame was a simple piece of code directed to a JPG of the legend.
Screenshot of the final map and legend.
Left: Google Earth. Right: plugin result using files from USGS.
If your depths are positive values, select a negative multiplier in the plugin.
An overhead and underground view of the map. The orange boundary represents three counties of interest.
1) Can you tell me how hybrid cloud location services differ from other location-based services?
"Applications that need to leverage big data residing in both public and private clouds, especially real-time data and analytics, benefit from hybrid cloud location services. For developers, it's a system that significantly reduces the time needed to create an application. For businesses, it's a system that allows line-of-business users to easily create applications without needing a high level of technical know-how.
A hybrid cloud model allows businesses to reduce the total cost of ownership, rapidly adjust to their customers’ needs and leverage the flexibility that cloud applications provide. This allows for instant updates and new features to be deployed automatically, which translates to high responsiveness from the end-user’s perspective.”
2) How are Pitney Bowes’ key solutions exposed?
“Pitney Bowes and IBM are collaborating on IBM’s codename “BlueMix” Platform-as-a-Service to develop new hybrid cloud location services. BlueMix provides DevOps in the cloud – an open, integrated development experience that scales to any level. Pitney Bowes is among the first third-party solutions now available to developers and companies on the new IBM BlueMix Platform-as-a-Service.
For Pitney Bowes, the partnership provides the opportunity to expose key solutions such as location-based services, e-commerce fulfillment, Internet postage and parcel management to an extended ecosystem of innovators and developers through IBM’s API Management. It also increases the availability of new services from Pitney Bowes to vast new markets globally.”
3) Does the hybrid cloud environment integrate with clouds outside the system, and if so, how is that accomplished?
“Yes. IBM’s Hybrid Cloud solution consolidates and simplifies the integration and management of on-premise and cloud computing resources by bringing together critical end-to-end cloud technologies like Tivoli’s enterprise management capabilities and Cast Iron’s integration technology. New cloud platforms deliver agile, rapid time-to-value, ease of assembly, and reduced IT needs that are accelerating adoption of the cloud for application development.”
4) What types of personalized services/experiences do you envision customers providing?
“As the company that defined the term ‘Location Intelligence,’ Pitney Bowes’ location intelligence suite of products offers the most comprehensive capabilities providing businesses with the ability to visualize spatial data and understand relationships between specific locations.
Using advanced, hyper-accurate location data, insurers can improve underwriting decisions, telecommunications providers can better analyze network coverage and retailers can deliver more targeted promotions to consumers based on when and where they are most likely to buy.”
A graphical look at the Pontiac electoral district's voting history since 1981. "Nationalist right" includes the Union Nationale, ADQ and CAQ; "separatist left" includes Québec Solidaire and its antecedents as well as Option Nationale.
Obviously this has been a safe Liberal seat for a very long time; what's interesting is the fluctuation in voter turnout.
We try really hard at Esri to make imagery management as easy as possible. It's often an overlooked aspect of image analysis, but if you're waiting a long time for an image to load, or to put together a band composition, it takes the fun out of it.
There are two ways that you can make a layer stack in ArcGIS. One is to use a raster product. From the Catalog pane, navigate to the folder where you have your imagery, and if you see this icon next to one of the items, you’re in luck.
Here’s what it looks like zoomed in. It’s a satellite over a generic raster.
Now, we don’t support raster products for every single sensor that’s out there. If you want to know if the type of imagery you are working with is supported as a raster product, check out this page.
If your imagery is not supported, you can create a layer stack by selecting all of the bands in the Image Analysis window and then clicking the band composition button. You won't have all of the functionality that you get from a raster product (you won't have a pansharpened image already set up for you, for example), but this will create the layer stack that you're looking for.
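If you'd rather script the band composition, the Composite Bands geoprocessing tool builds the same kind of layer stack. A minimal sketch, assuming hypothetical single-band rasters:

import arcpy

# Hypothetical single-band rasters to stack; substitute your own bands
bands = [r'C:\data\band1.tif', r'C:\data\band2.tif', r'C:\data\band3.tif']

# Composite Bands combines them into one multiband raster (the layer stack)
arcpy.CompositeBands_management(bands, r'C:\data\layer_stack.tif')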
But that's not the whole story. Voter turnout was up substantially over last time: 3,870 more votes were cast in the Pontiac electoral district than were cast in 2012, for a total voter turnout of 68.22 percent. Fortin got 8,666 more votes than L'Écuyer did in 2012, while the CAQ and PQ candidates each lost around 2,000 votes over 2012.
Charmain Levy of Québec Solidaire actually gained a percentage point and 592 votes over last time, coming within 740 votes of the PQ candidate, who in turn came in 129 votes behind the CAQ candidate. (This is the fourth time the PQ has finished third in this district: it's not new.)
Previously: 2014 Quebec Election: Pontiac Candidates.
Other Shortlist apps created by Esri and the user community can be viewed at the Story Maps Gallery.
To make a Shortlist app you must first download the template, author a web map with locations of interest, add the web map ID to the template, and finally host the app from a server. This straightforward process is documented in an online tutorial.
The default Shortlist header is gray, but you can easily add visual spice by changing the color scheme or adding your own custom banner to match your organization’s look and feel. Here are several examples:
Before you begin
You should already have a successfully deployed Shortlist application. If not, follow the instructions in the Shortlist tutorial before moving on to the banner customization steps below. We’ll consider several possibilities – changing the color scheme of the banner, adding a banner graphic, and changing the color scheme of core template elements.
The changes can be made by overriding the default style settings of the application. The style changes can be made by editing the application source directly, by changing styles in the application's existing stylesheets, or by creating a new stylesheet.
In this example a new stylesheet will be created to make the desired changes. This approach has a couple of advantages: the application source code is not altered, making it easy to return to the default, and all of the customizations are in a single location. To set up the custom.css file, follow the steps below:
Step 1: Add a link to custom.css
In the index.html file, add a link to the custom.css stylesheet that will contain the style changes. Open the index.html in any text editor, and locate the links to existing stylesheets near the top of the file. Copy and paste one of the existing links to the end of the list, and rename the file referenced to custom.css, as shown below:
Or copy and paste the line below into index.html after the link to style.css as shown above:
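<link rel="stylesheet" type="text/css" href="css/custom.css" /> <!-- assumes custom.css sits in the template's css folder -->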
Step 2: Create the custom.css file
In the CSS folder of the Shortlist source, create a new empty file named custom.css. Changes described later will be made to this file only. If things go awry, just delete custom.css and start again. If you want to go back to the original template as downloaded, simply remove the reference to custom.css from the index.html.
Step 3: Determine the style element you want to change
Using the inspection tools available in your browser, you can peer at the Document Object Model (DOM) to discover the elements you will need to change. See the specific developer documentation for Chrome, Firefox, or other browsers to learn more.
Once you’ve discovered the elements, add them to the custom.css and make the desired changes. These changes will override the application defaults.
Example – Changing the banner color
The default color scheme for Shortlist is white text on different gray backgrounds, as shown below:
The first change we want to make is to alter the color of the dark gray banner at the top of the application. Using the browser developer tools (in this case Chrome), we can discover that the banner background color is set in the #header element:
To change the banner color, edit custom.css and use the id and attributes of the element to set a new background color. In your custom.css add the id #header and set the background-color attribute to the desired color (in this case the hexadecimal color value). The id and attribute were both obtained from the inspector as shown above.
#header { background-color: #a8a875; }

The above CSS will change the header background to green, as shown below.
To change the font color add the color attribute with the desired color:
#header { background: #a8a875; color: #333; }

The Shortlist application banner now has dark gray text on a green background:
Next, add a background graphic to the header. From the inspector we can discover that the header is 115px high. A custom banner image was made 115px high and 1000px wide, and saved to the images folder of the Shortlist app. To make the image more pleasing, a fade-to-transparent gradient (which lets the background color show through) was applied to the right side of the graphic.
We also wanted the image to display once without repeating, and to be anchored on the left side. The changes made to custom.css to display the background image as described are shown here:
#header {
  background-color: #a8a875;
  background-image: url("../images/austin-ban.png");
  background-repeat: no-repeat;
  background-position: left;
  color: #333;
}

Here's how the banner looks after these changes:
Doing More
By following the steps above in which you identify elements and then add the desired changes to custom.css using the id or class, you can continue to tweak your application’s look and feel. Below is the final Shortlist app, with additional customizations for the tab colors, thumbnail background, and header divider:
The custom.css file used is shown below:
#header {
  background-color: #a8a875;
  background-image: url("../images/austin-ban.png");
  background-repeat: no-repeat;
  background-position: left;
  color: #333;
}
.tab { background-color: #a8a875; }
.tab.tab-selected { background-color: #e2deb6; color: #000; }
#divStrip { background-color: #dcc999; }
#paneLeft { background-color: #e2deb6; }
.tab-selected { background-color: #b9b9b9; color: #FFFFFF; cursor: default; }

The same techniques can be applied to any application template.
For more information
More information about customizing the Shortlist template is available by viewing the Readme file found in the downloaded source.
Thanks to Corey Baker, Esri San Antonio, for contributions to this post.
New CSVLayer class to easily display data from CSV files on a map. See the CSV Layer sample for an example of how to use this new layer. If CSV files are not on the same domain as your website, a CORS enabled server or a proxy is required.
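A rough sketch of the new layer in use with the 3.9 API (the CSV path and column names here are hypothetical, and the file is assumed to be served from the same domain as the page):

require(["esri/map", "esri/layers/CSVLayer", "dojo/domReady!"], function (Map, CSVLayer) {
  var map = new Map("mapDiv", { basemap: "gray", center: [-98, 38], zoom: 4 });

  // Hypothetical CSV with latitude/longitude columns named lat/lon
  var layer = new CSVLayer("data/points.csv", {
    copyright: "Sample data",
    latitudeFieldName: "lat",
    longitudeFieldName: "lon"
  });

  map.addLayer(layer);
});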
New capabilities when querying against a layer in a hosted feature service in ArcGIS Online:
Renderers’ colorInfo property and setColorInfo method now support more than two colors as well as a stops property that allows developers to associate a specific color with a data value. The legend widget also now supports renderers with colorInfo. Two samples have been updated to use this new functionality:
In addition to continuous color ramps, the esri/Color module was added at this release. It is a convenient wrapper around dojo/_base/Color and has all options supported by dojo/_base/Color.
textSymbolEditorHolder option to specify a container for text symbol editing components.
New repository on GitHub with TypeScript definitions for the JS API as well as the jshint options file used by the JS API team.
Version 3.9 of the ArcGIS API for JavaScript uses Dojo 1.9.1 as well as version 0.3.11 of dgrid, 0.3.5 of put-selector and 0.1.3 of xstyle.
Imagine our surprise when we recently read an article about the release of an educational program designed specifically to educate and inform youth (in schools) about privacy. But wait, it gets even better: the program material, created by Fordham Law School's Center for Law and Information Policy, has been released as a set of free, open source documents on the CLIP website for any educators who want to use the instructional materials to address the many privacy issues teens face as their use of technology skyrockets. Kudos to them!
About the lessons… “These lessons center on discussions of what privacy is, how it may be relevant to young people’s lives, and how the technologies they regularly use impact their privacy. Specific topics include managing an online reputation, understanding how technologies like cell phones and facial recognition work, dealing with social media “drama,” and maintaining secure passwords.”
The lessons cover topics such as: 1) privacy basics; 2) how to deal with passwords and behavioral ads; 3) navigating social media and tricky situations; 4) understanding mobile, WiFi and facial recognition; and 5) managing a digital reputation.
The need for this type of education is revealed by recent reports from the Pew Research Center that 93% of teens ages 12 to 17 go online, 53% of teens post their email address online, 20% post their cell phone number and 33% are connected online to people they have never met. View the Pew Research Center report.
Perhaps the developers of the educational material would also be interested in this slideshare, Silence Means Security, which identifies the problems with sharing sometimes confidential and secret information from the workplace. Obviously, social sharing isn't just an issue for school kids, but now is a great time to raise the issue with them!
See Also:
The post Privacy issues in social media – a free, open source Curriculum for Educators appeared first on CloverPoint.
(Note: Today's post is by Claire Negiar, a Stanford senior, soon to graduate. Claire will be writing a few posts over the coming weeks, many of them focused on France and French dependencies.)
Saint Martin. Sint Maarten. A crossroad between North and South, split between France and the Netherlands, Saint Martin has known a different fate in the aftermath of decolonization ...
This post is from GeoCurrents
1156 Data Shadows and Urban Augmented Realities I: Practicing Data Shadows
8:00 AM - 9:40 AM in Grand Salon E, Marriott, Second Floor
1256 Data Shadows and Urban Augmented Realities II: Coding Data Shadows
10:00 AM - 11:40 AM in Grand Salon E, Marriott, Second Floor
1456 Data Shadows and Urban Augmented Realities III: Tracking Data Shadows
12:40 PM - 2:20 PM in Grand Salon E, Marriott, Second Floor

Running concurrently with those sessions will be a couple of other sheep-related sessions that Monica has a hand in. First is a panel on 'tribes' organized by Renee Sieber, in which Monica will be participating. After that is a paper session organized by Monica and Joe Eckert, in which Ate will be presenting a paper.
1122 Battle of the Tribes: geoweb, GIS, GI Science, cyberGIS, neogeography
8:00 AM - 9:40 AM in Room 22, TCC, First Floor
1216 Alternative Computation and Unconventional Spaces
10:00 AM - 11:40 AM in Room 16, TCC, First Floor

Ironsheep 2014 will be held from 5-9pm on Monday evening at the Tampa Bay Wave. Check here for more details.
2154 Visioning GIScience Education
8:00 AM - 9:40 AM in Grand Salon C, Marriott, Second Floor

There's also the alt.conference on Big Data co-organized by friends-of-sheep Joe Eckert, Jim Thatcher and Andy Shears. Various floating sheeple will be participating in these sessions at different times and in different capacities.
2210 alt.conference on Big Data: Opening Panel
10:00 AM - 11:40 AM in Room 10, TCC, First Floor
2410 alt.conference on Big Data: Lightning Panels
12:40 PM - 2:20 PM in Room 10, TCC, First Floor
2510 alt.conference on Big Data: Tech Demos
2:40 PM - 4:20 PM in Room 10, TCC, First Floor
2610 alt.conference on Big Data: Lightning Talk Discussion
4:40 PM - 6:20 PM in Room 10, TCC, First Floor
3130 Thinking the 'smart city': power, politics and networked urbanism I
8:00 AM - 9:40 AM in Room 30A, TCC, Fourth Floor
3230 Thinking the 'smart city': power, politics and networked urbanism II
10:00 AM - 11:40 AM in Room 30A, TCC, Fourth Floor

The Annual Kentucky-Arizona Wildcat Party, where you can often find the floating sheeple, will be held on Wednesday night at the Double Decker (1721 E. 7th St.), starting at 8pm.
4111 J. Warren Nystrom Award Session 1
8:00 AM - 9:40 AM in Room 11, TCC, First Floor

Jen Jack Gieseking and Luke Bergmann have also organized a trio of sessions around digital geographies. While none of the sheeple will be direct participants, some UK Geographers will be participating, just for good measure.
4132 Digital Geographies, Geographies of Digitalia: Interventions into Digital Thought and Practice
8:00 AM - 9:40 AM in Room 32, TCC, Fourth Floor
4232 Digital Geographies, Geographies of Digitalia: The Digital in Place
10:00 AM - 11:40 AM in Room 32, TCC, Fourth Floor
4432 Digital Geographies, Geographies of Digitalia: Discursive Productions of Digital Space
12:40 PM - 2:20 PM in Room 32, TCC, Fourth Floor
I started doing some digging this weekend on the National Map Corps. This is the USGS’s foray into crowdsourcing. I thought it was something new – it’s been going on a while. Since 1994, to be exact. I can’t even remember how long ago I started hearing about The National Map. I’ve got this love-hate relationship with it. It’s sorely needed. I’m just not sure if this is the form it needs to take.
Anyway, one problem that always came up in my past life is “Why aren’t these maps more up to date?”. These maps being various and assorted from the USGS. We are quite lucky that there is a treasure trove of readily available maps from the USGS. I’ve often wondered in the age of shrinking budgets how do you curate and maintain that data. I’ve seen the rumblings for over a year about the National Map Corps. I dug into it today.
For those of you familiar with OpenStreetMap, you’re going to walk into a very comfortable place. For those of you who aren’t – you are going to walk into a very easy to deal with place. They’ve taken Potlatch and customized it. Right now there are 10 things you can edit. I know 10 isn’t a lot and I expect this to grow as the program gets some legs under it.
From those 10 things they only want the basics: category, name, address, and how you determined it is what it is. Did you see it, find it on a website, or know someone who knows? Overall it’s simple. It’s great. It’s a good thing to do currently. So if you’ve never crowdsourced any information – here is your chance. There is currently a peer review process: you will notice your icons change colors as they are reviewed. So they aren’t just turning you loose with no oversight. The oversight appears to be more guided than the Google Maps peer review process….or as I have called it….oh – I can’t use that term up here.
So I immediately started editing. I immediately wondered how I could make this better, and I remembered Fulcrum. We’ve had an account post-USVI and I decided to put it to use. Given the number of places I go, it would be nice to just pull my phone out and record some information about the 10 features the National Map Corps wants you to collect. I’ll most likely pull it into QGIS/ArcGIS and use that to guide some of my edits back into the map. I’ve already updated it 2 or 3 times. I even recorded a cemetery down the street from me as a test.
So – I’m going to make the pitch. If you’ve wanted to edit in OSM and have gotten frustrated, or have just wondered if you are doing any good – well – take a break and go over to the National Map Corps. You don’t need an app – all you need is pen, paper, and the ability to click on a map and fill out some information. There’s no reason you can’t make an account and make one edit. Maybe two.
Do something cool. Help.
Like many other pundits, David Frum fears that Vladimir Putin is plotting to transform Ukraine into a weak federation and then transform some of its federal units into de facto Russian dependencies. As he argues in a recent Atlantic article:
In the weeks since Russian forces seized Crimea, Vladimir Putin’s plan for mainland Ukraine has become increasingly clear: partition. Putin’s ambassadors ...
This post is from GeoCurrents
I was staring at some carpet in my house this morning as I drank my morning tea. I like this particular carpet. However, if you look at it really closely and actually think about that carpet, it wouldn’t be obvious that it would be a nice carpet. Somebody, at some carpet design studio (is there such a thing?) would have had to think, “a light tan with some specks of black will look good,” and then this person or team would have had to present it to the boss.
Can you imagine thinking that a light tan color with specks of black in it would look good as a floor covering? The immediate thought, when in a logical mindset, would be that nobody in their right mind would install that on their floor because it would appear dirty right from the start! But when you do install it, logic is defied: it actually looks very good.
So my take-away is to try to see a design from all angles, be broad minded, test in real life situations, and realize that what might seem perfectly logical might end up perfectly wrong.
So my former life at TVA was a mixed bag. I hate politics. I hate red tape. Those two things alone made me a complete pain when I worked at TVA. There were two GIS shops in TVA – one in Chattanooga and one in Norris, TN. Of course the one in Chattanooga was better – since that was the one I worked in…or at least that’s what I believed. The other shop in Norris was headed up by Charlie Smart. Charlie never was one to get excited about red tape or politics – I would sit through a meeting and would almost immediately go “this is bullsh*t” because – well – see the previous few sentences. The only time both shops were together was when there was news of a re-organization. Charlie would almost always sit there smiling and go “well – that was a good meeting”. Charlie had been doing this a lot longer than I had – and he was used to the ebb and the flow. It was something I could never get used to when I worked there. I guess you could say I’m better at it now. Somewhat.
Charlie was doing GIS at TVA back in the ’70s. Who knows if it was “GIS” at that point – the science was so new that people were most likely going, “This will never work”. When I came along he was probably 30 years into his career. He was also into shooting high-powered rifles at a gun club. People would tell me to speak up when talking to him.
Anyway – I never had a chance to work with the man enough. He retired shortly after I left.
So Joyce, Bruce, Scott… now Charlie… The list goes on.
There are weeks, like the last two, where I think I’ve been doing this way too long.
Date: April 8, 2014
Time: 11:30 a.m. – 1:30 p.m.
Location and Directions: Newton County Fire Service Headquarters
4136A U.S. Highway 278
Covington, GA 30014
Uses of LiDAR in GIS or “Where do Contours Come from Daddy?”
While the use of LiDAR-based derivative products is an everyday occurrence in geospatial technologies, the process of generating those derivatives from the raw data collected by a LiDAR sensor is somewhat mysterious to those of us in more traditional GIS roles. We will look at a LiDAR dataset and the various derivatives that can be produced from it.
We will discuss standards for LiDAR acquisition and the extent of LiDAR data for the state.
Speaker
Ernie Smith
Ernie Smith is the GIS Coordinator for Newton County. He also serves as Chairman of the GISCC.
As always – go to http://www.gaurisa.org and register. This should be a fun event – Ernie is never at a loss for words.
100 Cherokee Blvd, Chattanooga, TN 37405
Synopsis: Learning new software can be intimidating for some, especially if you are used to a certain product that you have been using for years. (Anyone remember switching from ArcView GIS 3.x to ArcGIS Desktop 8.x?) Or perhaps you are new to GIS and looking for an affordable, user-friendly alternative to other popular GIS software. This is an introduction to QGIS that will get you familiar with the interface and with many tools and features that will assist you with viewing, editing, analyzing, creating, managing, and serving GIS data in a variety of formats. Though QGIS is definitely a powerful tool to add to your GIS toolkit, it has some limitations and challenges that will be discussed in this workshop.
It’s $325 Per Seat.
I’m bringing the Mobile Lab, so I have room for 10 people. BUT – if you so desire and you are one of those people – you know, a Mac user – we can accommodate you, since QGIS runs on that platform as well and the class is platform-independent.
I would also like to add that by the 5th of April we should have online payments set up for the first time. So stay tuned for that bit of excitement. If you want, go ahead and email me to reserve a seat – rjhale@northrivergeographic.com
I would also like to add that the next time you see a QGIS class, it will be the data and editing one. Yes – this is going to be a trilogy by the end of summer.
Without getting into too much detail now (I’m running out of time today): you can get Lidar data into CityEngine with a bit of work. Read on…
The workflow
Basically you convert your LAS data to a multipoint file, then convert from multipart to singlepart (in ArcGIS), and import the point data into CityEngine. (Was that the world’s shortest workflow?!)
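For the ArcGIS leg of that workflow, something along these lines should do it – a minimal sketch assuming the 3D Analyst extension is licensed, with hypothetical file names:

import arcpy

arcpy.CheckOutExtension("3D")            # LASToMultipoint needs 3D Analyst
arcpy.env.workspace = r"C:\data\lidar"   # hypothetical workspace

# 1. LAS to multipoint; "1.5" is an assumed average point spacing in
#    dataset units - set it to match your point cloud.
arcpy.ddd.LASToMultipoint("tile.las", "points_multi.shp", "1.5")

# 2. Explode the multipoint into single points for import into CityEngine.
arcpy.management.MultipartToSinglepart("points_multi.shp", "points_single.shp")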
Now CityEngine does support points: it imports them but converts them into little polygon squares. These squares can then have rules run on them. Normally I would insert trees on 2D point data, but that would look odd if I did it on a point cloud.
A word of warning: point clouds turn into massive numbers of polygons, and you need to bear this in mind when working. CityEngine handles it okay, but the export can be very slow…
Want more posts like Lidar & Point Clouds in CityEngine? A quick workflow (sort of)? Then visit GeoPlanIT for more exciting posts (no really).
The World Ocean Council (WOC) is working to help ensure ocean business community participation in the Global Oceans Action Summit for Food Security and Blue Growth (The Hague, 22 – 25 April 2014).
Organized by the Netherlands, UN Food and Agriculture Organization (FAO) and the World Bank, the Global Oceans Action Summit seeks to convene global leaders, ocean practitioners, business, science, civil society and international agencies to share experiences and demonstrate how combined action in partnerships for healthier and productive oceans can drive sustainable growth and shared prosperity.
The organizers have invited WOC to reach out to the global ocean business community and encourage participation in the Global Oceans Action Summit. The event organizers are especially interested in participation from the seafood, fisheries, aquaculture, oil/gas, and shipping sectors, but also from a wide range of ocean industries.
The WOC has been invited to participate in the summit’s high-level session on Thursday 24 April, as part of assuring that the event connects with the diverse ocean business community, and has also been invited to participate in panels on Blue Growth.
The Global Oceans Action Summit will highlight the need to address successful integrated approaches that attract public-private partners, secure financing and catalyze good ocean governance while balancing between (i) growth and conservation, (ii) private sector interests and equitable benefits for communities and (iii) Exclusive Economic Zones (EEZs) and Areas Beyond National Jurisdiction (ABNJ).
For more info on the Global Oceans Action Summit for Food Security and Blue Growth, see http://www.globaloceansactionsummit.com/.
[Source: World Ocean Council press release]
On my mind today were two little nuggets from Inc Magazine’s April 2014 issue that got me thinking.
The first is a question, or actually, part of a question that goes like this, “Which customers can’t participate in our market because they lack skills?” It struck me as both a very obvious question to ask yourself as a business owner and a completely novel concept. It should be obvious to ask this question but it just isn’t asked very often.
I wouldn’t normally even write about this question on a blog about cartography except for one thing: it’s a question that hits home for traditional cartographers. Do we lack a skill that’s necessary for making maps in the modern era? If I’m adept at finding data, analyzing data, using a GIS program, and perhaps even manipulating the GIS output in graphics software, shouldn’t that be enough? Why should I invest my time learning the new, heavily web-focused tools that are being developed? Because if you don’t, you won’t be able to participate in the new cartographic market, that’s why.
Another bit in the magazine espoused the ideals of providing a safe environment for exchanging ideas within your workgroup. Two articles describe how to produce this “safe environment” and, surprise surprise, they contradict one another. One of the articles talks about never knocking down the ideas of others. Another article talks about making it so people know they won’t be taken to task for what they say. If you have a culture of never questioning ideas then you have a culture where nobody knows if something’s actually good or if your peers are simply putting on a polite facade. If nobody’s ever taken to task then things could get ugly.
And what does all that have to do with cartography? It poses the possibility that there are multiple ways to allow critical feedback on a map design, an analysis, data inputs, and the like. As a profession, we are in desperate need of critical feedback. Some of that happens in social media today, such as on Twitter. (If you want to know how people really feel about that map, post it on Twitter, but have a thick skin.) What seems to the designer like a fabulous idea – renaming every U.S. state for a beer brand, let’s say – might be met with derision from the crowd.
Some say that criticism kills innovation. If you have too many people telling you that beer map is terrible then you might never come up with another map idea in your life. But if we never allow criticism in the workplace then we risk putting out a bunch of beer maps. Is there a way to win here?
Upon the discussion paper’s publication, Greg Babinski, GISP, Past President of URISA, noted:
“Recent studies have demonstrated the tremendous return on investment (ROI) from deploying geospatial technology. Of course geospatial technology relies on the availability of high quality, current spatial data to deliver these benefits. But within the United States, we have been behind many other countries in completing development of a single authoritative spatial database.
Many other countries that I have visited, including the UK, UAE, China, Taiwan, and the EU countries, all have a top-down central government funded approach to developing GIS data for use at the national, regional, and local levels. In the US, we have not taken this approach. We have a fractured infrastructure, with local government agencies generally (but not universally) having good data, but little motivation to comply with standards or policies that would facilitate compilation into a comprehensive national database. Local government agencies and many states lack funding to support a comprehensive approach to building a national data sharing infrastructure.
In the discussion paper, Jim Sparks, Philip Worrall, and Kevin Mickey have laid out these issues and make a variety of proposals based on best practices that have worked on smaller scales, rational national coordination, and a proposal for effective funding.
I encourage GIS professionals to read this important paper and provide your comments and further suggestions to the authors. Also consider attending upcoming events where this paper will be the subject of a panel discussion, including the Washington GIS Conference (May 12-14 in Tacoma) and GIS-Pro 2014: URISA’s Annual Conference (September 8-11 in New Orleans).”
Interested authors should send their papers to URISA’s Executive Director (wnelson@urisa.org) for review and consideration by the GIS Management Institute®. For more information about the GIS Management Institute® including the GIS Capability Maturity Model, visit http://www.urisa.org/main/gis-management-institute/.
[Source: URISA press release]
I stumbled on this the other day – Google has a map gallery. I knew somewhere in my head they did – but as you can see, I use it so infrequently I forgot they had one. They’ve teamed up with the USGS to provide a fairly simple way to download the newer ortho quads. To me (and this is just me) the newer maps, while nice, don’t have the same soul as the old ones. They are nice because they are more up to date – but….SOUL……
Anyway I started looking and immediately downloaded the quadsheet in the screenshot – the Wauhatchie quad since that is in my old (or current) stomping grounds. Well – it’s a GeoPDF. I’m sorta torn about that, but after giving it some thought that might be the best way to deliver these maps to the public. Being a GeoPDF and seeing how it has layers – you can turn off the image if you wish.
Without Image and With Boring Tan Contours
With Image and Boring Tan Contours.
Cool right?
BUT – it’s a PDF.
gdal_translate -of GTiff TN_Wauhatchie_20100512_TM_geo.pdf test_ortho.tif --config GDAL_PDF_DPI 300 -co "COMPRESS=JPEG" -co "JPEG_QUALITY=85"
and your favorite desktop GIS software of choice:
So overall I can get the new ortho quads into GIS desktop software. The tan contour lines don’t lend themselves to great visibility. BUT – this might be a decent addition to some of the work I do. So the point of this story – arm yourself with GDAL. It does many wonderful things.
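If you would rather script the conversion, the GDAL PDF driver also lets you switch layers off before translating – for instance dropping the image layer so only the line work comes through. A sketch with the Python bindings; the layer name here is an assumption, so check the gdalinfo output on the PDF for the real names:

from osgeo import gdal

gdal.SetConfigOption("GDAL_PDF_DPI", "300")
gdal.SetConfigOption("GDAL_PDF_LAYERS_OFF", "Map_Frame.Images")  # assumed layer name

src = gdal.Open("TN_Wauhatchie_20100512_TM_geo.pdf")
gdal.GetDriverByName("GTiff").CreateCopy(
    "test_contours.tif", src, options=["COMPRESS=JPEG", "JPEG_QUALITY=85"])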
DATA Inc., an IT software and service provider, and Digital Footsteps, Ltd., a leading tour app provider, won the “Best in the World” category at the 2013 NJTC Mobile Apps Forum and Competition. The award was presented on June 28, 2013 at Fairleigh Dickinson University in Madison, NJ, and was among 10 awards presented to notable New Jersey technology firms.
The app was designed using map data from OpenStreetMap plus DATA Inc.’s own layer data. DATA Inc. has developed a tile server based on a Postgres database that maintains map data at different zoom levels. For a given tour, they download a subset of map data, including the company’s layered data, to the mobile app.
“We are excited to receive this recognition from the NJTC,” said Ashis Bhisey, Vice President of Technology for DATA Inc. “In the rapidly evolving mobile space, we are delighted to deliver cutting-edge value to our clients.”
“The core technology can support multiple industries including universities, community newspapers and publishers, in addition to the hospitality industry,” said Julie Waldman, Founder of Digital Footsteps Ltd. “It can be spun off in many different directions and we’re looking to our partner DATA Inc. to help us achieve our goals.”
Zook, M. and A. Poorthuis. 2014. "Offline Brews and Online Views: Exploring the Geography of Beer Tweets". In The Geography of Beer, eds. M. Patterson and N. Hoalst-Pullen. Springer. pp. 201-209.
Although for many Ukrainians and Russians the heading of this post may sound like the beginning of an anti-Semitic joke, Ukraine could conceivably soon have a Jewish president. At least, it now has a Jewish candidate in the running: businessman and philanthropist Vadim Rabinovich submitted his candidacy for the May 25, 2014 presidential elections. Rabinovich is a self-nominated candidate, not representing any political party.
This post is from GeoCurrents
Scents may be valuable in determining which apartment to rent, or where you might decide to put your next office. I don’t think we need a map to determine which restaurant to eat at, if we get close enough to the location.
My father and I have been debating this back and forth for years. On the face of it, it's intrinsically impossible: if you can't have FTL, the distances and costs involved in travel make trade and communication prohibitive. To accelerate goods and people to relativistic velocities would be insanely expensive, and it would still take decades to get there. Hardly anything would be worth the shipping costs: it would be easier and cheaper to synthesize what you need rather than import it. Transmutation is less expensive than interstellar trade. (No doubt this is why sf focuses on rare goods, from melange to unobtanium.)
Absent that trade, there's no rationale for having an interstellar civilization. Even if you were able or willing to colonize other planets (though again, the cost of sending a colony ship is of a magnitude that many in science fiction fail to grasp), the colonies would be on their own. With no reason to trade, how would the investment in a colony ship be recouped? And what purpose would there be for an interstellar government -- Empire, Federation, whatever -- if there was no trade for it to regulate?
One exception, dealt with in some depth at a panel at the Chicago Worldcon in 2012, is trade in information: planets could beam intellectual property at one another. Inventions and works of art. An interstellar government's role would be to regulate copyright and patent law. (Enforcement would be trickier: at said panel, Charlie Stross suggested the use of a Nicoll-Dyson laser.) But there would be no travel, and no spaceships; everything from trade to diplomacy to war would be conducted remotely. (So much for space opera.)
Thing is, FTL isn't a solution to the problem of interstellar civilization; it's a solution to the limitations of human biology. Both interstellar travel and a galactic civilization become a lot easier to contemplate if you take our limited lifespan, and the need to keep us alive (fed, watered, breathing and sheltered from cosmic rays) for the duration of the voyage, off the table in some fashion. Time dilation takes care of the lifespan of the voyagers (at least if they're travelling at relativistic velocities), but it means that origin, destination and traveller get out of sync.
Fortunately, human immortality is an easier problem to solve than Einsteinian physics. Sf writers have had some luck moving that lever instead. Take, for example, Scott Westerfeld's Succession series -- The Risen Empire and The Killing of Worlds -- which posits a galactic empire where the ruling elite possesses a life-after-death form of immortality: those who are not immortal must deal with relativistic sublight travel. And Charles Stross's Neptune's Brood (UK edition) not only features posthuman protagonists, it builds an entire economic system on the limitations of interstellar travel: Stross's solution for the problem of interstellar trade is banking.
With Lockstep, Karl Schroeder has come up with something quite different. And also quite extraordinary. He's managed to square the circle of space opera and known physics, and arrived at a scenario that is both startlingly original and grounded in what is known and what is possible.
Lockstep's 17-year-old protagonist, Toby McGonigal, emerges from a cryogenic sleep 14,000 years long to discover that a civilization has sprung up among the rogue planets between the Sun and Alpha Centauri. Resources are scarce on these planets, so the human inhabitants survive by use of the locksteps: for every month they spend awake, they all spend thirty years in cold sleep, which allows those resources to replenish themselves. But more importantly, space travel is done during cold sleep: ships use the thirty year gap to move from one world to the other; the passengers awaken as though it was an overnight trip. When they return, a month later, the same amount of time has elapsed back home: by spending only 1/360th of the time awake, Schroeder's civilization has shrunk the virtual distances between the worlds.
The result, Schroeder says,
is a classic space opera universe, with private starships, explorers and despots and rogues, and more accessible worlds than can be explored in one lifetime. There are locksteppers, realtimers preying on them while they sleep, and countermeasures against those, and on and on. In short, it's the kind of setting for a space adventure that we've always dreamt of, and yet, it might all be possible.
Whereas a space opera universe that requires FTL isn't.
Schroeder wraps his cutting-edge setting around what is from all appearances a fairly traditional adventure story, replete with a missing heir and family drama, that would not be out of place in, dare I say it, a Heinlein juvenile. Toby discovers not only that it was his family who created, and controls, the lockstep, but that a cult in his name has arisen in the millennia since his disappearance. I recoil to some extent from stories about young people who discover they're the Most Important Person in the Universe -- oh look, another Chosen One -- but Karl does a reasonable job with it. Lockstep is fast-paced and clever, and makes full use of the implications of the universe he's built.
I mentioned Heinlein juveniles, and Lockstep is being referred to as a young-adult novel (what with its teenage protagonist), but Paul Di Filippo, in his review of Lockstep for Locus Online, argues that it's reductionist to call it that. Rather, he says, it's an example of what others have called "entry-level sf": more accessible to readers who haven't spent the last few decades absorbing sf's advanced reading protocols. In that I think it succeeds admirably. It's certainly an easier read than, say, Neptune's Brood, but the clarity and accessibility of its prose should not mask the importance or significance of what is clearly a major work of science fiction.
Full disclosure: I received an ARC of this book via Goodreads First Reads. The author and I are also socially acquainted.
Lockstep
by Karl Schroeder
Tor Books, March 2014
Buy at Amazon (Kindle) | author's page | publisher's page | Goodreads | LibraryThing
Flash mobs are not a new thing. Flash mobs by orchestras are not a new thing. Flash mobs by orchestras playing Beethoven's Ninth Symphony are not a new thing: you've almost certainly seen this one (previous entry). So you might not appreciate the significance of the Odessa Philharmonic Orchestra playing the Ode to Joy at the Privoz fish market on March 22 (video). Or the orchestral flash mobs playing the Ninth at seven airports across Ukraine last Sunday, marking the end of 40 days of mourning for the protesters killed in demonstrations against the previous Ukrainian regime.
The Ninth has long been used for political purposes. For Ukraine at this moment, the Ninth's use as the European anthem is no doubt as significant as its role as a hymn for peace and humanity. Invoking the Ninth is a powerful statement: it evokes memories of the Berlin Wall and Tiananmen Square. It's a profoundly humane act of defiance.
Project area and files
3D Visualization of Manhattan. Redder buildings are taller.
A closer look: From Central Park down 7th Avenue
Can you see where the ball is dropped on New Year's...One Times Square?
The records, which span from 1880 to 1970, provide information on what areas of the country birds were spotted in and when they arrived or departed during spring and fall. The information is useful for identifying how birds’ ranges and migration patterns have changed over time.
The one-millionth transcription was that of a house wren seen in Tierra Amarilla, New Mexico, on September 11, 1904, and is now part of the USGS North American Bird Phenology Program database.
Phenology is the study of the seasonal timing of natural biological phenomena, such as leafing and flowering of plants, maturation of agricultural crops, emergence of insects, and migration of birds. Many of these events are sensitive to climatic variation and change, and are simple to observe and record.
This 90-year span of archival data provides baseline information about the first arrivals and last departures of North American migratory birds, according to Jessica Zelt, the USGS North American Bird Phenology Program Coordinator. When combined with contemporary data, researchers have the unique opportunity to look at changes in seasonal timing in relation to climate and climate change over a 130-year period, unprecedented in its length of time for recorded migratory data.
Now part of the USA National Phenology Network, also funded and primarily operated by USGS, the Bird Phenology Program of the past was a network of volunteer observers who recorded information on first arrival dates, maximum abundance, and departure dates of migratory birds across North America.
History in the making
Active between 1880 and 1970, the program was coordinated by the Federal government and was the first program to be sponsored by the American Ornithologists’ Union. It exists now as a historic collection of millions of records, illuminating almost a century of migration, distribution, and population status of birds. The records contain many stories, from the emergence of introduced European species like the European starling and house sparrow to the decimation of species like the Carolina parakeet and passenger pigeon. Even historical events such as the Dust Bowl are mentioned in the records.
Today, in an innovative “big data” project to organize, manage, and make the data publicly available, the records are being scanned and placed on a website, where volunteers worldwide transcribe the records and add them to a database for analysis. This citizen science program welcomes participants of all backgrounds from around the world. Volunteers are from locations as varied as Gunma, Japan; Istanbul, Turkey; and Brussels, Belgium; although the majority reside throughout North America.
“Just last month, a participant wrote me to say she had transcribed a card by Tracy Irwin Storer, a name she recognized because he had authored her college biology textbook,” said Zelt. “One of the aspects that is so exciting about this program is that it provides participants with a link to ornithological history.”
Original records were created by many famous ornithologists, biologists, botanists and naturalists, such as Aldo Leopold, author of “A Sand County Almanac;” Roger Tory Peterson, who wrote “A Field Guide to the Birds;” and Clarence Birdseye, the creator of frozen foods.
“We feel that the world is changing and these bird records are providing us with the measuring tape to document that change,” said Sam Droege, a USGS wildlife biologist. “This is something anyone can get involved in exploring since we are making all the records open to the public on our website.”
Records, like this Ruby-throated Hummingbird card, have already been used in research studies to show the change in arrival dates over time.
Jason Courter, Assistant Professor at Malone University, published a paper on Ruby-throated Hummingbirds, documenting the advancement of arrival dates by 11-18 days in North America when comparing the historic data to the more recent data collected by Journey North and other hummingbird observing programs.
Anyone interested in participating in this innovative project can volunteer by registering online to transcribe these records for the database.
A recent article in The Washington Post by Katie Zezima asked whether the country should be referred to as “the Ukraine” or simply “Ukraine”, without the definite article. Recent usage of the article with the country’s name by several American politicians apparently raised some ire on the part of certain Ukrainian pundits. It is time for GeoCurrents to dispel some myths about this issue.
This post is from GeoCurrents
Some of the fun NCAA-related mapping resources that we’ve found include the following:
The post Mapping March Madness Basketball and Favorite NCAA fan maps appeared first on CloverPoint.
A coalition of international partners announced today the launch of the Ecocitizen World Map Project, a powerful online crowd-mapping tool designed to explore, understand, and measure holistic urban health from a citizen’s perspective, at the upcoming 7th World Urban Forum (WUF7) in Medellín, Colombia, April 5-11.
Led by non-profit Ecocity Builders USA in collaboration with the Organization of American States, Esri, the Association of American Geographers, Eye on Earth (a partnership of UNEP + Abu Dhabi Environmental Data Initiative) along with local academic partners, NGOs and community organizations, the public-private partnership was developed to facilitate simple individual snapshots of a community’s social and environmental health as well as more sophisticated local and regional training and geospatial analysis.
“As the global community is becoming more aware of the crucial role cities play in mitigating climate change and leading the way toward sustainable development, the importance of understanding and connecting the diverse layers that comprise urban ecosystems cannot be overstated,” says Kirstin Miller, Executive Director of Ecocity Builders.
The Ecocitizen World Map Project consists of two interwoven elements. One enables and encourages citizens to participate directly by taking a short online survey—powered by crowdsourcing platform Ushahidi—ranking their cities and neighborhoods along fifteen conditions outlined by the International Ecocity Framework and Standards Initiative.
Another provides on-the-ground training in pilot cities to students, citizens, and public officials, using Esri’s mobile GIS technology in combination with online tools and educational materials to assess, measure, and plan for increasing the health and resilience of urban systems and to identify barriers to improving quality of life. Inaugural pilot cities include WUF7 host Medellín, supported by a grant from the OAS’ Sustainable Communities in the Americas Initiative, as well as Cairo and Casablanca, supported by a grant from Eye on Earth.
“In order to make informed decisions that benefit all stakeholders equitably and sustainably we have to delve more deeply into as many social, geographical, and environmental areas as possible,” Miller explains the need for charting the progress of cities’ social and environmental sustainability. “And who better to provide that first-hand knowledge than the inhabitants of those microcosms?”
The project will be presented by Ecocity Builders, AAG, Esri, OAS, AGEDI, and the US Department of State at the “Building Resilience and Equity Through Citizen Participation and Geodesign” session on Thursday April 10th, 11am – 12pm, at the UN Habitat City Changer Room. It will also be showcased throughout the conference at the Esri Geospatial Pavilion. A training event entitled “How to use mobile technology to measure urban equity,” presented by ITC-University of Twente, the Netherlands, Esri, and Ecocity Builders, will be held on Wednesday, April 9th at TE7, Room 20.
More information:
As mentioned in an earlier GeoCurrents post, Zakarpattia Oblast in far western Ukraine is a perfect example of how ethno-linguistic tensions affect geopolitical outcomes. Even deciding on a neutral term for the region can be challenging. The Russian/Ukrainian toponym “Zakarpattia” translates into English as “Transcarpathia”, while from the Hungarian perspective it is called “sub‑Carpathian Ukraine”, the more neutral term being Carpathian Ruthenia. The latter term is related to the ethnonym and language name “Ruthenian”, which, as we shall see below, is itself quite problematic. In the remainder of this post, the local toponym “Zakarpattia” will be used, for lack of a more neutral term. So who are the Rusyn, how many of them are there, and what language do they speak?
This post is from GeoCurrents
Right after the aircraft disappeared, Inmarsat was involved in the search for the plane. Although the main aircraft communications addressing and reporting system (which would usually transmit the plane’s position) was turned off, one of Inmarsat’s satellites continued to pick up a series of automated hourly ‘pings’ from a terminal on the plane, which would normally be used to synchronize timing information.
Inmarsat analyzed these pings and was thereby able to establish that MH370 continued to fly for at least five hours after the aircraft left Malaysian airspace, and that it had flown along one of two ‘corridors’ – one arcing north and the other south. This was shown in various news reports; the information came from the Doppler effect, the change in frequency caused by the movement of the satellite in orbit, which yielded the two predicted paths for the flight – one northerly and one southerly. Inmarsat engineers came up with this kind of prediction for the first time, according to Chris McLaughlin, senior vice president of external affairs at Inmarsat. He said that the technology to track the position and speed of an aircraft can be made available on planes for less than a dollar an hour. The plane was reportedly flying at a cruising height above 30,000 feet.
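To make the Doppler reasoning concrete, here is a toy calculation – mine, not Inmarsat’s actual model, and the carrier frequency is an assumption – showing how a measured frequency shift on a ping constrains the aircraft’s velocity along the line of sight to the satellite:

C = 299792458.0    # speed of light, m/s
F_CARRIER = 1.6e9  # assumed L-band carrier frequency, Hz

def radial_velocity(shift_hz):
    """Line-of-sight velocity (m/s) implied by a measured Doppler shift."""
    return shift_hz * C / F_CARRIER

print(radial_velocity(1000.0))  # a 1 kHz shift implies roughly 187 m/s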
Although this information was given to Malaysian officials by March 12, the Malaysian government did not acknowledge it publicly until March 15, according to the Wall Street Journal. This delay in responding has been sharply criticized in the press and is thought to have contributed to a considerable loss of valuable time in recovering the lost aircraft.
Inmarsat’s engineers continued their analysis of the pings and came up with a much more detailed Doppler-effect model for the northern and southern paths. They compared these models with the trajectories of other aircraft on similar routes and were able to confirm a match between Inmarsat’s predicted southerly path and readings from other planes on that same route.
These pings from the satellite, coupled with assumptions about the plane’s speed, made it possible for Australia and the US National Transportation Safety Board to narrow down the search area to just 3 percent of the southern corridor by March 18.
“We worked out where the last ping was, and we knew that the plane must have run out of fuel before the next automated ping, but we didn’t know what speed the aircraft was flying at – we assumed about 450 knots,” said McLaughlin. “We can’t know when the fuel actually ran out, we can’t know whether the plane plunged or glided, and we can’t know whether the plane at the end of the time in the air was flying more slowly because it was on fumes.”
The analysis was given to the UK Air Accidents Investigation Branch (AAIB) by Inmarsat this week. So far, the cause of the crash remains unknown.
The now-dry Colorado River delta was once a thriving wetland ecosystem, teeming with wildlife. It was a treasured resource shared by both the U.S. and Mexico where water and sediment delivered from the Colorado River reached the Gulf of California. A century ago, the Colorado River delta was even navigable by large boats.
Today, upstream diversions and dams in both countries control the Colorado River’s flow, and little to no water is released into the channel downstream of Morelos Dam in most years.
An Agreement Across Borders
Recognizing the challenges facing the Colorado River Basin, including a 14-year period of historic drought, Minute 319 was executed on Nov. 20, 2012. It provides measures to enhance sharing of water supplies, permit Mexico to defer delivery of some of its allotted water in the United States, facilitate investment in Mexico’s water infrastructure, and measure the ecosystem effects of an experimental environmental pulse flow into the reach below Morelos Dam.
Bi-National Effort to Potentially Restore the Resource
In order to assist and inform future bi-national cooperative efforts as both countries work together to protect resources on both sides of the border, a large pulse of water is being released into the former delta of the Colorado River along the U.S.-Mexico border.
U.S. Geological Survey scientists are studying the effects of this release on the environment as part of a historic, bi-national collaborative effort. This pulse flow and the need to study its effects were agreed to as part of the recently adopted Minute 319 to the 1944 US-Mexico Water Treaty.
This engineered release of water is the culmination of years of negotiations led by the U.S. and Mexican Sections of the International Boundary and Water Commission in partnership with the Department of the Interior, in conjunction with the seven U.S. Colorado River Basin states, Mexican government agencies, and a wide array of municipal agencies, non-governmental organizations, and universities from both the U.S. and Mexico. The release of water began on March 24 and will continue for about eight weeks, with the rate of release peaking on March 27. Over this period, 105,392 acre-feet of water will be released, a volume that would fill about 52,000 Olympic-sized swimming pools.
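The swimming-pool figure is easy to verify with standard conversion factors (the 2,500-cubic-meter pool volume below is the usual nominal value, assumed here):

ACRE_FOOT_M3 = 1233.48  # cubic meters per acre-foot
POOL_M3 = 2500.0        # nominal Olympic swimming pool volume, m3

pools = 105392 * ACRE_FOOT_M3 / POOL_M3
print(round(pools))     # ~52,000 pools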
Studying the Flow
Minute 319 to the 1944 US-Mexico Water Treaty calls for studying the hydrologic and biologic effects of the pulse flow. Scientists from the USGS, the University of Arizona, the Universidad Autónoma de Baja California (UABC), Pronatura Noroeste, the Sonoran Institute, the Bureau of Reclamation and other institutions are conducting water monitoring and research along a 24-mile long river segment of the Colorado River where one bank is in Arizona and the other is in Baja California, Mexico. Experts will monitor where the pulsed water flows, track sediment transport and evaluate plant and wildlife response.
Research will focus on how the water moves through the Colorado River channel, how the pulse changes as it moves downstream and infiltrates through the streambed into the groundwater, water salinity levels, and the patterns of new vegetation establishment.
Will the Vegetation Grow?
The successful establishment of native seedlings is dependent on a number of aspects of streamflow, including: the magnitude and timing of peak flows; the rate at which water levels recede; and the availability of shallow groundwater. Studying all these factors will provide an understanding of why vegetation is able to thrive in some areas and not in others. The spread of seeds and new growth will be measured at 22 study sites along the river. Previous unintentional high flows in the mid-1980s and the 1990s promoted the germination and establishment of cottonwood and willow trees.
Other factors that may affect streamflow patterns and vegetation response include sediment transport and changes to the topography of the landscape. The pulse flow will be introduced into a channel with a sand bed, and an unknown amount of sand and mud will be redistributed within the channel and floodplain by this flow. Scientists will study how much sediment is moved and how much of the channel scours or fills with sediment. This information is essential in developing new tools to predict effects of future pulse flows, should they occur.
Eyes in the Sky
Satellite and aircraft-based imagery will be collected along the length of the Colorado River delta, including areas downstream of the U.S.-Mexico border, and will complement on-the-ground observations. These images will be used to compare the distribution and density of the delta’s vegetation before and after the pulse flow and to document the extent of flow inundation. LiDAR will be used to produce high-resolution digital elevation models that will help quantify changes to channel and floodplain topography.
Hope for the future?
“These results will not only help inform decisions about potential future flows, but will also advance cooperative management efforts to improve the health of the delta region in both the U.S. and Mexico,” said Suzette Kimball, Acting USGS Director.
Energy issues figure prominently in many discussions of the conflict in Ukraine. As is often noted, Europe’s reliance on Russian natural gas gives Moscow significant leverage. Ukraine, moreover, is weakened by its own dependence on Russian energy, and its situation is complicated by the Russian natural gas pipelines that traverse its territory. Less often noted are Ukraine’s own significant gas ...
This post is from GeoCurrents
Anyone may nominate a person or organization for induction to URISA’s GIS Hall of Fame. To make a nomination, submit a written statement to URISA describing:
Hall of Fame laureates are expected to exemplify vision, leadership, perseverance, community-mindedness, professional involvement, and ethical behavior.
The nomination statement may be of any length, but it must be preceded by a one-page stand-alone summary. Nomination statements should be emailed to info@urisa.org by May 1. A committee of past URISA Presidents will review all nominations and make recommendations to the URISA Board of Directors by mid-June. Recipients will be honored during GIS-Pro 2014: URISA’s 52nd Annual Conference in New Orleans taking place September 8-11. This honor may not be given every year, and in some years there may be multiple recipients.
URISA’s Hall of Fame laureates include:
Visit URISA’s GIS Hall of Fame to learn about their path-breaking accomplishments.
[Source: URISA press release]
Did I tell you I like CityEngine today? Okay, you will have noticed that CityEngine gives a designer instant feedback on their rule files. You move a centre line of a road or a node on a building plot, and the model created on top of it dynamically changes as it’s moved. It’s fascinating to watch your simple rule files come alive.
Take this example, my attempt at really simplifying a street rule that includes a bridge element. Above a certain elevation (either 0 or a terrain surface), the street gets bridge supports and simple sides. Also, no trees are planted and no zebra crossings (‘crosswalks’ to our North American friends) are created either.
It works great: raise the road and supports are made and trees are taken out, just as I wanted. But what happens if the road goes down… well, have a look. I think I have some unintended consequences of a decision I made in my rule file…
Want more posts like Unexpected consequences and the power of instant feedback? Then visit GeoPlanIT for more exciting posts (no really).
A large landslide occurred in northwest Washington at about 10:37 am PDT on Saturday, March 22, 2014. Multiple casualties are confirmed as a direct result of the landslide and many people remain missing. Landslide debris covered about 30 houses and 0.8 miles of State Route 530.
The landslide occurred in an area of known landslide activity, but this time the slide was much larger, traveled much further, and had greater destructive force than previously experienced. Precipitation in the area in February and March was 150 to 200% of the long-term average, and likely contributed to landslide initiation.
The slide took place along the edge of a plateau about 600 feet high composed of glacial sediments. The volume of the slide is estimated to be about 10 million cubic yards, and it traveled about 0.7 miles from the toe of the slope. This travel distance is about three times longer than expected based on published information regarding previous slides of this height and volume worldwide. If the landslide had behaved in the expected range, it would have likely blocked the river and possibly destroyed a few houses. Instead it led to tragic loss of life and destruction of property.
The debris flow also dammed and temporarily blocked the upper part of the North Fork Stillaguamish River. A pool of water formed behind the debris dam, which flooded houses and other structures. There were initial fears that the debris dam would create a flood hazard downstream if the dam were breached, but a catastrophic dam breach is now considered unlikely. Currently, the lake level is gradually decreasing as the river cuts a new channel across the top of the debris dam.
USGS scientists are supporting state and county agencies responding to the event. It is a collaborative effort, with many working hard to provide assistance, assess the situation, and alleviate impacts. In particular, scientists are assisting with monitoring the stability of the landslide area and monitoring debris-dam erosion and river and lake conditions.
The USGS operates a long-term streamgage to measure water levels and river discharge about 12 miles downstream from the landslide, on the North Fork Stillaguamish River at Arlington. The river level at the gage dropped suddenly at about 1:30 pm PDT on Saturday, March 22. The drop in water level was about 1.2 feet, which is equivalent to a drop in discharge of about 1,200 cubic feet per second. Go online and see near-real-time data.
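Those near-real-time readings come through the USGS instantaneous-values web service, so they are easy to pull programmatically. A minimal sketch in Python – the site number 12167000 for the Arlington gage is my assumption, so verify it against the NWIS site before relying on it:

import json
import urllib.request

# Parameter code 00060 is discharge in cubic feet per second.
URL = ("https://waterservices.usgs.gov/nwis/iv/"
       "?sites=12167000&parameterCd=00060&format=json")

data = json.load(urllib.request.urlopen(URL))
readings = data["value"]["timeSeries"][0]["values"][0]["value"]
for r in readings[-5:]:  # the five most recent readings
    print(r["dateTime"], r["value"], "cfs")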
Since the landslide occurred, the USGS installed three Rapid Deployment Gages to measure additional water levels and streamflow. These gages are located downstream from the debris dam, in the lake behind the dam, and upstream from the lake. In addition, the USGS installed two buoys to measure lake levels. The newly installed gage downstream from the debris dam also measures water turbidity, and this information is used to monitor the erosion rate of the dam.
The USGS is working with multiple federal, state and local agencies to monitor the status of the river and debris dam and help assess what the downstream impact may be on flooding and other variables under different streamflows and debris-dam erosion rates.
Seismograph readings show no indication that an earthquake triggered the landslide near Oso, Washington, on March 22, 2014. Seismographic recordings of the landslide event show two impulsive wave signals. The first is interpreted to represent the onset of landslide motion at 17:37 UTC, and a second signal is interpreted to represent a successive slide that occurred at 17:41 UTC. The seismic signals are long-period surface waves, with no clear high-frequency P or S phases of the kind that would be expected if a local tectonic earthquake had occurred at the time of the event. The landslides generated elevated levels of local ground shaking for over an hour. Seismic readings are from the University of Washington Pacific Northwest Seismic Network, operated in cooperation with the USGS.
A team of landslide geologists from the county, state, and the USGS are assessing continuing landslide hazard that may pose a threat to search operations. One example is the USGS deployment of three “spiders,” which are portable instrumentation packages that contain high-precision GPS units for detecting landslide movement as well as geophones for detecting small vibrations. The spiders can be emplaced by hovering helicopters. Data from the spider units are transmitted by radio to USGS computers and made available to the monitoring team.
Snohomish County is the lead responding agency and is coordinating closely with local agencies. The Washington State Department of Natural Resources, Washington State Emergency Management Division, and Washington State Department of Transportation are the primary state staff at the site to help assess the flood hazard and evaluate how the river may rework the landslide and debris dam in the next few weeks. Many other organizations are playing a supportive role and providing all hands on deck to assist.
Other large, and perhaps sudden, landslides have occurred in this valley and in the broader region. Large landslides are the norm in many parts of the western foothills of the North Cascades. The recent landslide in Washington, however, exhibited unusual mobility (i.e., runout distance and probable speed) compared to previously studied events in the area. Examination of high-resolution, remotely sensed topography indicates the presence of similar landslides that may have travelled long distances.
This landslide appears to have involved a complex sequence of events. USGS landslide specialists, in collaboration with seismologists and state agencies, are still working to interpret the event.
Much of the landslide deposit has character that resembles that of the great Mount St. Helens rockslide, which was a debris avalanche on May 18, 1980, and the largest landslide in recorded history. The distal part of the landslide deposit, in the region roughly south of the alignment of State Route 530, resembles that of a debris flow. Debris flows are liquefied slurries of rock, water and mud that can travel great distances at high speeds, entraining nearly all objects in their paths. Some of the landslide material remaining north of the Stillaguamish River, closest to the top of the landslide, has the character of landslides that scientists classify as rotational slumps.
Landslides occur in all 50 states and U.S. territories, and cause $1-2 billion in damages and more than 25 fatalities on average each year. Falling rocks, mud, and debris flows are one of the most common and sometimes deadly hazards, yet there is still much to learn about how and why they happen.
USGS science is helping answer questions such as where, when and how often landslides occur, and how fast and far they might move. USGS scientists produce maps of areas susceptible to landslides and identify what sort of rainfall conditions will lead to such events. For more information, watch a video about USGS landslide science, and visit the USGS Landslide Hazards Program website.
Scientists at the USGS are also asking you to help by reporting your landslide experiences and sightings at the new USGS “Did You See It?” website.
Further, the USGS is working with the National Weather Service on a Debris Flow Warning System to help provide forecasts and warnings to inform community and emergency managers about areas at imminent risk.
View photographs related to the landslide and impacts.
Read the newly published report, Preliminary Interpretation of Pre-2014 Landslide Deposits in the Vicinity of Oso, Washington.
Updates and technical data regarding the landslide and flood conditions are posted online as available from the USGS Washington Water Science Center. Visit their website for more information.
Learn more by reading the USGS article, Debris Flows: Behavior and Hazard Assessment.
Read the following reports by the Washington Division of Geology and Earth Resources:
First off, head to foursquare.com and make sure that you’ve logged into your account. Then simply jump to the “Feeds” history page, which can be located at https://foursquare.com/feeds/. You should then see feeds made available in various formats including RSS, KML, ICS and GDAL. With these data you can then proceed to build a map mashup using your favorite tools like Google Maps, ArcGIS Online, CartoDB, or any other WMS that can consume these data formats.
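If you want to script it rather than use a mapping site, the KML feed is easy to parse. A rough sketch in Python – the feed URL below is a placeholder for the personal one shown on your feeds page, and it assumes the feed is standard KML 2.2:

import urllib.request
import xml.etree.ElementTree as ET

KML = "{http://www.opengis.net/kml/2.2}"
FEED_URL = "https://feeds.foursquare.com/history/XXXX.kml"  # placeholder

tree = ET.parse(urllib.request.urlopen(FEED_URL))
for pm in tree.iter(KML + "Placemark"):
    name = pm.findtext(KML + "name")
    coords = pm.findtext(".//" + KML + "coordinates")
    print(name, coords.strip() if coords else "")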
]]>The previous GeoCurrents post considered a number of proposals for various ethnic territories to either leave or join the Russian Federation that have emerged in the wake of the Crimean referendum. The most likely next candidate to join Russia is Transnistria, a narrow strip of land between the River Dniester in the west and the eastern Moldovan border with Ukraine ...
This post is from GeoCurrents
]]>Coverage is typically reported as percentages – percentage of statements, branches, functions, and lines covered. Which is great… except what do these numbers really mean? What numbers should we shoot for, and does 100% statement coverage mean that your code is unbreakable, or that you spent a lot of time writing “low-value” tests? Let’s take a deeper look…
Here is the output from our automated test system (grunt + jshint + jasmine) on our project… Included is a “coverage summary”.
These numbers have been holding steady throughout the development cycle… but what do these numbers mean? Let’s break them down a little.
The first one is “statements”. In terms of code coverage, a “statement” is the executable code between control-flow statements. On its own, it’s not a great metric to focus on because it ignores the control-flow statements themselves (if/else, do/while, etc.). For unit tests to be valuable, we want to execute as many code “paths” as possible, and the statements measure ignores this. But it’s “free” and thrown up in our faces, so it’s good to know what it measures.
Branches refer to the aforementioned code paths and are a more useful metric. By instrumenting the code under test, the tools measure how many of the various logic “branches” have been executed during the test run.
Functions is pretty obvious – how many of the functions in the codebase have been executed during the test run.
Lines are by far the easiest to understand but, similar to statements, high line coverage does not equate to “good coverage”.
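To make the statements-versus-branches distinction concrete, here is a tiny illustration (in Python, and mine rather than from our codebase): one test can execute every statement yet still leave a branch untested.

def apply_discount(total):
    if total > 100:
        total -= 20  # discount path
    return total

# This single test executes every statement and line (100% statement and
# line coverage)...
assert apply_discount(150) == 130
# ...but only this second test also exercises the implicit "no discount"
# branch, bringing branch coverage to 100%.
assert apply_discount(50) == 50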
With that out of the way, let’s look at the numbers…
Since statements is not a very good indicator, let’s skip that metric. We notice it, but it’s not something we strive to push upwards.
On Branches we are at ~42%, which is lower than I’d like, but we’ll get into this in a moment.
Functions are ~45%, but this is a little skewed because in many controllers and models we add extra functions that abstract calls into the core event bus. We could inline these calls, but that makes it much more complex to create spies and mocks. In contrast, putting them into separate functions greatly simplifies the use of spies and mocks, which makes it much easier to write and maintain the tests. So, although creating these “extra” methods adversely impacts this metric, it’s a trade off we are happy with.
Where does this leave us? These numbers don’t look that great, do they? Yet I’m blogging about it… there must be more.
These summary numbers tell very little of the story. They are helpful in that they let us know at a glance whether coverage is heading in the right direction, but as “pragmatic programmers” our goal is to build great software, not maximize a particular set of metrics. So, we dig into the detailed reports to check where we have acceptable coverage.
The report is organized around folders in the code tree, and summarizes the metrics at that level. I’ve sorted the report by “Branches”, and while we can see a fair bit of “red” (0-50% coverage) in that table, the important thing is that we know what has low coverage – as long as we are informed about the coverage, and we decide the numbers are acceptable, the coverage report has done its job.
Diving down to the file level, we can check if we have high-levels of coverage on the parts of the code that really matter. For us, the details controller and view are pretty critical, so we can see that they have high coverage. It should be noted that high coverage numbers don’t always tell the whole tale. For critical parts of our code base, we have many suites of tests that throw all manner of data at the code. We have data fixtures for simulating 404’s from APIs, mangled responses, as well as many “flavors” of good data. By throwing all this different data at our objects we have ‘driven’ much of the code that handles corner cases.
Here is a look at the Models in our application.
We can easily tell that our models have very good coverage – and this recently helped us a ton when we refactored our back-end API json structure. Since the models contain the logic to fetch and parse the json from the server, upgrading the application to use this new API was relatively simple: create new json fixtures from the new API, run the tests against the new fixtures, watch as most of the tests fail, update the parser code until the tests pass and shazam, the vast majority of the app “just worked”. Without these unit tests, it would have taken much longer to make this change.
The system we use allows us to dive down even further – to check the actual line-by-line coverage.
There are a number of different tools that can generate code coverage reports. On our project we are using istanbul, integrated with jasmine using a template mix-in for grunt-contrib-jasmine. Istanbul can also be used with the intern and karma test runners. If you are using Mocha, check out Blanket.js.
If you are just getting into unit testing your javascript code, this is kinda the deep end of the pool – so I’d recommend checking out jasmine or mocha to get the general flow of js unit testing going, then look at automating things with a runner like karma or grunt-contrib-jasmine, and then look at adding coverage reporting.
Hopefully this helps show the benefit of having code coverage as part of your testing infrastructure, and helps you ship great code!
]]>IGEMS incorporates all of the data and functionality contained in NHSS; it also provides added information, including hurricane tracks and current wind conditions. Because IGEMS utilizes the latest software and technology, it provides richer functionality, including a locate capability, and a richer user experience that supports mobile devices such as tablets.
The conference will kick off on Monday with preconference courses and meetings and then feature a full day (Tuesday) of important general sessions and keynote addresses:
The Tuesday afternoon line-up will feature a town hall session on “Geospatial Education, Career Development and Mentoring,” where the conference will discuss some critical issues facing the development of the human element essential to GIS success, with a goal of generating actionable tasks that can be used to support human resources across the state. A brief look back at the 20-year history of CalGIS conferences will precede the Lightning Talks, always entertaining!
Wednesday’s education program will feature twelve breakout sessions on a wide range of topics, from the environment and modeling to data sharing and Federal programs. The conference will conclude with a powerhouse closing session featuring Eric Gundersen of MapBox, who will discuss open source solutions, followed by the closing keynote speaker, Jack Dangermond, President of Esri, discussing “GIS Technology Trends”.
Always an important part of the conference is the opportunity to visit with exhibitors and sponsors and network with the California GIS community during a number of conference events.
In addition, Esri, gold conference sponsor, is hosting a GeoDev MeetUp on Sunday evening and a Story Map Competition on Monday.
Review the entire conference program online at www.calgis.org and register by April 11 in order to save $25.
[Source: URISA press release]
The Pacific Northwest Geodesign Forum brings together faculty, staff, students, and community partners using geospatial information technologies to create, evaluate, and monitor sustainable solutions to complex problems. Many complex problems involve a mix of social, economic, and ecological considerations that require collaborative efforts to address the challenges at hand. Methods for arriving at sustainable solutions to problems are emerging in the form of geodesign frameworks and concepts implemented using geospatial information tools. The goal of the Forum is to provide participants with a level of understanding about geodesign frameworks and concepts plus geospatial information tools that can implement them for addressing sustainable solutions to complex human-natural-built community problems at varying spatial-temporal scales. Geodesign enables us to change our world through design. Forum discussions include Community-University partnering opportunities for exploring solutions to complex problems.
Program
7:45 a.m. to 8:15 a.m. Registration and Light Refreshments
8:15 a.m. to 8:20 a.m. Welcome, Introductions, Program Overview
8:20 a.m. to 8:30 a.m. Who is in attendance? Sectors-Areas-Attendees Participating
8:30 a.m. to 9:00 a.m. Why Geodesign? Its Character and Benefits with Q & A
9:00 a.m. to 10:00 a.m. Challenges for the Geodesign Community
10:00 a.m. to 10:15 a.m. Break
10:15 a.m. to 10:45 a.m. A Tour of Geodesign Tools with Q&A
10:45 a.m. to 12:00 p.m. Quick Cases Using Geodesign Tools
12:00 p.m. to 1:00 p.m. Buffet Lunch with Discussions
1:00 p.m. to 2:00 p.m. Discussion Groups: PNW Geodesign Community of Practice
2:00 p.m. to 3:15 p.m. Report out from CoP discussion groups (3-5 minutes each)
3:15 p.m. to 3:45 p.m. Next Steps Synthesis for PNW Geodesign Forum
3:45 p.m. to 4:00 p.m. Informal networking as we vacate the Lyceum venue – out by 4PM
Pontiac (map) is one of the safest Liberal seats in the province; barring a black swan event, a win by the Liberal candidate is usually a foregone conclusion. With the retirement of incumbent MNA Charlotte L'Écuyer, that Liberal candidate is André Fortin, a government relations director for Telus who lives in Aylmer.
In 2012 the Coalition Avenir Québec candidate came a distant second with 18.1 percent of the vote; with overall CAQ support down this time around, it'll be interesting to see whether the CAQ candidate, Michel Mongeon, a consultant and former teacher, will beat out the Parti Québécois candidate. The PQ is running a very young candidate, almost certainly as a placeholder: Maryse Vallières-Murray, a CEGEP student who graduated from high school in Fort-Coulonge in 2012. The PQ candidate won 16.1 percent of the vote last time.
UQO professor Charmain Lévy is running again for Québec Solidaire. This is her third run for office; she got 5.2 percent of the vote in 2012. Louis Lang also returns as the candidate for the Marxist-Leninists; he got 0.2 percent of the vote in 2012. The Greens and Option Nationale ran candidates in 2012 but are not doing so this time around.
Previously: 2012 Quebec Election: Pontiac Candidates; 2012 Quebec Election: Pontiac Results.
]]>It’s March 27, 1964 in southern Alaska. At 5:36 pm, powerful ground shaking occurs for nearly five minutes from a magnitude 9.2 earthquake directly below your feet. Depending on where you are, you and your loved ones may face devastating tsunamis that wipe out entire villages, or landslides that send neighborhoods from suburban Anchorage into the ocean.
You just experienced the largest U.S. earthquake ever recorded, and the second largest ever recorded worldwide.
At that time, scientists did not yet know exactly how or why the earthquake occurred. Three U.S. Geological Survey (USGS) scientists were immediately sent to Alaska to figure it out. What they found marked a turning point in earthquake research.
To commemorate the 50th anniversary of the 1964 Great Alaska Earthquake and Tsunami, let’s look back on what happened and consider how science and technology have advanced since then. This event helped confirm the theory of plate tectonics and provided firsthand insight on earthquake processes, tsunami generation, and the impacts of these phenomena on communities, both locally and across the Pacific.
Double Whammy: Earthquake and Tsunami
The 1964 earthquake produced strong ground motions and caused more land surface deformation than any previously recorded earthquake. The earthquake was accompanied by massive local tsunamis and a trans-oceanic tsunami that swept across the Pacific. At several places in Port Valdez, Alaska, tsunami run-up was more than 100 feet. This great earthquake and ensuing tsunamis took 131 lives and caused about $2.3 billion in property loss (equivalent to $311 million in 1964).
Building a Plate Tectonics Theory
There were no obvious faults at the surface to explain the earthquake. Even with months of careful observation and field work, the cause of the earthquake remained a mystery until USGS scientist George Plafker set out to investigate the event and interpreted what he saw in the field. This new insight helped confirm the concept of plate tectonics and changed earthquake science forever.
At the time, the idea of plate tectonics was just being developed. No unifying theory existed on what caused these types of great earthquakes. Plate tectonics is a scientific theory that describes the Earth’s outermost layer as comprising about a dozen major tectonic plates that are constantly moving. When plates interact or collide, they can produce mountains, earthquakes, volcanoes, and more.
After detailed work investigating the 1964 event, Plafker concluded that this event was a “megathrust” earthquake, occurring where an oceanic plate descends underneath a continental plate in southern Alaska. Slip between the two tectonic plates along this kind of plate boundary, called a subduction zone, is the cause of the world’s largest earthquakes. This process is currently happening in many parts of the world, but especially around the Pacific Ocean. The Alaska-Aleutian subduction zone is part of what is known as the Pacific “ring of fire.”
Will It Happen Again?
How frequently do giant earthquakes like this occur? Will the next one happen tomorrow or thousands of years from now?
By drilling 50 feet into the earth and taking core samples along the Copper River, Plafker and his current research team discovered evidence of nine megathrust earthquakes that had occurred in south central Alaska over the past 5,500 years. The average time span between these quakes is about 600 years. This statistic provides an idea of earthquake probability. However, it is important to recognize that scientists cannot predict earthquake events.
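As a back-of-the-envelope illustration (mine, not the USGS team’s): if such earthquakes recur on average every ~600 years and we assume a simple Poisson model, the chance of at least one event in a window of t years is 1 - exp(-t/600).

import math

def chance_of_event(window_years, mean_recurrence_years=600):
    # Probability of at least one event in the window under a Poisson model.
    return 1 - math.exp(-window_years / mean_recurrence_years)

print(f"{chance_of_event(50):.1%}")   # next 50 years:  ~8%
print(f"{chance_of_event(100):.1%}")  # next 100 years: ~15%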
Also noteworthy, the 10 largest earthquakes recorded in the United States have also occurred in Alaska. Most of these were megathrust earthquakes along the Alaska-Aleutian subduction zone.
Nearby and Far-Reaching Impacts to Communities
A clearer understanding of how earthquakes and tsunamis impact communities and infrastructure in Alaska was another lesson learned from the 1964 earthquake. Anchorage, about 75 miles northwest of the epicenter, sustained dramatic damage to property. This included damage to schools as well as about 30 blocks of dwellings and commercial buildings in the downtown area. Landslides destroyed homes and disrupted water, gas, sewer, telephone, and electrical systems.
The impacts of the tsunamis were far reaching and eye opening. Tsunamis engulfed towns along the Gulf of Alaska and caused serious damage in Canada, the west coast of the United States, Hawaii, and even Japan. Seiche action (a series of waves) in rivers, lakes, bayous, and protected harbors and waterways along the Gulf Coast of Louisiana and Texas caused minor damage. Tide gages in Cuba and Puerto Rico also recorded effects from the tsunamis.
Great Leaps Since 1964
Significant research progress was achieved in the years following 1964. The disaster led to the establishment of the National Earthquake Information Center (NEIC) within the U.S. Coast and Geodetic Survey in 1966. NEIC was eventually transferred in 1973 to the USGS. In 1977, Congress passed the Earthquake Hazards Reduction Act, which led to the establishment of the National Earthquake Hazards Reduction Program (NEHRP) the following year. USGS earthquake research efforts—which are now primarily coordinated through the USGS Earthquake Hazards Program—were established through NEHRP.
There has been an extensive expansion of monitoring stations in Alaska. Whereas only two seismometers were operational in Alaska in 1964, the Alaska Earthquake Center currently receives data from more than 400 seismic sites. This is part of a bigger effort through the USGS Advanced National Seismic System, aiming to establish a nationwide network of 7100 earthquake sensors across the country.
Another significant advance has been the production of the USGS National Seismic Hazard Maps. These maps present scientists’ best estimates of maximum ground shaking during future earthquakes. These are regularly updated to incorporate new data and analyses.
USGS seismic hazard maps are used in developing building codes, helping ensure that earthquake-resistant buildings are built in areas at risk from earthquakes. The 2011 Tohoku-oki earthquake in Japan had terrible loss of life from tsunamis, but in terms of shaking-related damage it was a success story: With modern building codes applied across Japan, relatively few lives were lost from building collapse.
New Insight on Tsunami Generation
Basic understanding of tectonic tsunami generation was also improved following the 1964 event. The understanding of tectonic sea floor deformation during the earthquake allowed a conceptual framework for tsunami generation, which is used in predicting future events. The 1964 earthquake led directly to establishing the Alaska Tsunami Warning Center. Since then, the development of the NOAA Tsunami Warning Centers has been critical for monitoring tsunami hazards across the world. Progress also includes inundation mapping, the implementation of warning systems for many coastal communities, and the identification of tsunami evacuation routes through the efforts of the federal and state National Tsunami Hazard Mitigation Program partnership.
Continuing to Learn into the Future
One of the key questions scientists are considering is how future earthquakes in Alaska might differ throughout time. Which segments of the Alaska-Aleutian subduction zone will rupture in the future, and what is the potential greatest magnitude of earthquakes these ruptures will produce? There are also still many unknowns regarding tsunami generation in Alaska and across the world, with new research efforts motivated in part by the recent events in Japan and Sumatra.
The USGS Science Application for Risk Reduction (SAFRR) project recently published the SAFRR Tsunami Scenario which depicts a hypothetical tsunami generated by a massive earthquake offshore the Alaska Peninsula and its impacts on the California coast.
The “Alaska Shield” component of FEMA’s National Exercise Program Capstone Exercise 2014 is testing response plans and capabilities in Alaska. The USGS provided simulation of the ground shaking and tsunami effects that occurred in 1964, so that emergency responders can assess modern preparedness levels.
The USGS and its partners are helping to provide critical seconds of notification by developing a prototype Earthquake Early Warning System for the west coast of the United States.
More broadly, a variety of research efforts are underway to better characterize earthquake hazards across the nation. Our understanding of tectonic processes has been furthered greatly since 1964 with the development of GPS and sophisticated imaging tools such as InSAR and LiDAR. Scientists are constantly updating earthquake hazard maps of the entire country, enlarging the scope of earthquake-sensing instruments, and gaining knowledge of regional geology, fault behavior, complex ground motion, and other earthquake processes.
4 Minute Video
Watch a short video on this earthquake, with perspectives from scientists as well as those who personally experienced the event. An expanded version of the video is also available for viewing.
Learn More
Read more about the 1964 Great Alaska Earthquake and Tsunami by visiting a USGS website with resources and information. See a summary of event details in a USGS fact sheet. Learn about earthquake hazards specifically in Alaska, including earthquake history, recent and notable events, tsunami information, maps, and more.
USGS: Start with Science
Science makes us safer. It is essential to start with science, because we can’t plan effectively if we don’t know what we are planning for. The USGS is dedicated to creating and providing information tools to support earthquake loss reduction, including hazard assessments, scenarios, comprehensive real-time earthquake monitoring, publications, and preparedness handbooks.
]]>Well, it finally happened: I conducted my first full CityEngine training session, in conjunction with my good friends at GISTEC in the United Arab Emirates (UAE). I have done workshops and presentations on CityEngine before, and conducted SketchUp training sessions for clients, but nothing like this.
I was staying in Sharjah, a smaller emirate within the UAE, where GISTEC has its offices. My two days consisted of being collected at 5 a.m. and driven from Sharjah across Dubai to Abu Dhabi, which took close to two hours! Fortunately I had the company of two GISTEC colleagues, who provided excellent and interesting conversation.
A large part of business for me is about enjoyment and personal connections, so the two hours felt too short, not too long! My companions on the drive (including our drivers) were of different backgrounds and religions, and we talked at length about everything, it seemed: from work to family, from cricket to football. You name it, we seemed to talk about it.
The GISTEC office in Abu Dhabi, in one of the many new office blocks, was very impressive; it also included a training suite with a great view.
As I arrived I was quite nervous; preparation is key to a successful training session, and I’ve got to be honest, I thought I hadn’t got enough material for two solid days of CityEngine. I needn’t have worried: it turns out I know more than I care to admit and had probably got too much material. As I stood in front of the trainees (9 people), my nervousness disappeared, my breathing calmed down, and we got down to business.
In hindsight, the first day really was a bit cruel for the trainees. You see, after I gave a small presentation I had them hand-code Computer Generated Architecture (CGA) rule files, all day. I had given out a course handout that was basically code, and I talked them through the process.
In my opening presentation I explained that coding in CityEngine is difficult: a successful CGA coder ideally requires knowledge of design, GIS, and coding. It’s rare to find people who are generalists, but I think a lot of GIS professionals may fit that bill. You see, coding in CityEngine is like playing with Lego: you can’t teach someone how to build a specific building, but you can teach them how the building blocks fit together, and you can show them nice techniques. It turns out that this is how I have taught my daughters to play with Lego.
Anyway, on to the training. I was lucky: all the GISTEC equipment worked pretty much flawlessly. We had laptops with discrete graphics cards, and for the work we were doing this was perfectly acceptable. Those of you who have used CityEngine know that to build large, complicated models you need a big gaming PC. Small portions of code were entered until we got ourselves simple building models and streets. Unfortunately I had planned more but ran out of time; this is where being flexible helps. I realised the trainees were getting how difficult CityEngine is, but also how flexible and powerful it can be.
I therefore asked them to write down what they wanted to learn on Day Two. When the list came back (everyone had contributed), I was pleased to see they wanted to know about workflows. I had prepared for this: it’s all very well knowing some CGA coding, but these were GIS professionals who needed to know how to get their data in and use it.
On Day Two I gave the trainees all the CGA rule files for the session, including assets and maps (textures); my main thrust for this session was how to pick apart a rule file someone else wrote and understand it. My secret, you see, is that I don’t know everything about CityEngine, but I do know how rule files work and how to make components fit together. It’s the workflow that’s important. We went through the various rule files I had given them, picking out important aspects like reporting and various random-variation strategies. I also talked them through a rule file to texture rooftops based on a satellite image. We then went into how to import datasets like satellite imagery, File Geodatabases, and OpenStreetMap. I really did run out of time at the end of Day Two; there is so much to cover in CityEngine that you can’t possibly cover it all. I suspect each training session I do will be different, based on the backgrounds and needs of the trainees. This isn’t a ‘one size fits all’ training course.
The feedback I’ve gotten was positive and very complimentary. Clearly there are some areas I need to improve on; perhaps these can largely be fixed by making it a 3-day course, maybe…
This was really enjoyable for me, though; the trainees from different backgrounds really challenged me in a good way, asking questions and interacting with me to maximise the value from the course. I don’t mind being pushed (as long as it’s polite): it shows they got CityEngine, what it’s capable of, and why people are enthusiastic about it.
I’ll finish by saying thank you to the trainees and my UAE partners, GISTEC. I hope to do more training sessions like this in the future.
If you are looking for CityEngine training sessions come talk to me, or if you are located in the Middle East contact my friends at GISTEC directly (tell them I sent you).
I know I keep talking about it, but there really are training texts coming. I can’t give these away for free, but maybe I’ll release extracts for free and sell the complete text at some reasonable fee.
Want more posts like Conducting my first CityEngine training session ? Then visit GeoPlanIT for more exciting posts (no really).
]]>The latest version includes the ExpressZip web application for exporting imagery straight from the web browser, as well as improved upgrade functionality. The upgrade process now migrates all image catalogs automatically, making it easier for users to install their new version of Express Server; they won’t have to manually update all their catalogs during an upgrade. Also, Express Server integrates with third-party applications such as ArcGIS Server to speed up the delivery of raster imagery.
]]>The post Stakeholders to Discuss the Emerging North American Natural Gas (LNG) Market appeared first on CloverPoint.
]]>As part of the consultation process, National Geographic has planned an event/round table discussion – Natural Gas: A Bridge to a Sustainable Energy Future? – and invited a number of stakeholders. The event will take place March 25, 2014 in Vancouver, B.C. We will be monitoring this story, particularly as friend and colleague Chief Karen Ogen has been asked to elevate the Big Energy dialogue by sharing the Wet’suwet’en First Nation story. Ogen will represent the Wet’suwet’en First Nation and participate in round table discussions on the proposed LNG expansion in B.C. as well as the proposed expansion into Wet’suwet’en First Nation traditional territory.
For more on this important topic, see details of The Great Energy Challenge
This via National Geographic – “…They argue this new bounty should be shared—especially with hungry markets in Asia and Europe willing to pay a high price for the fuel. But long-distance transport of natural gas is one of the world’s most expensive engineering feats, and it will require government approvals, community support, and billions of dollars in capital to take North American gas overseas… Even as LNG project sponsors face a broad array of export opponents, and a complicated regulatory and financing process, they are racing each other to begin construction.” (Source: National Geographic blog)
On Twitter:
See Also:
Getting to Yes on Meaningful LNG Consultation
Wet’suwet’en Chief Ogen To Meet With Minister Of Energy About LNG Consultation
]]>“The USGS is actively engaged in many programs to inspire all youth into the wonders of science and to pursue careers in this field,” continued Kimball.
Inspiring Youth into the Future
The Pathways Internship Program is one way the USGS is aiming to inspire and mentor students. USGS engagement with youth and the wide range of research and learning experiences offered to students also directly support the Science, Technology, Engineering, and Mathematics (STEM) Education Coalition. These USGS activities align with the U.S. Department of the Interior STEM Education and Employment Pathways Strategic Plan: Fiscal Years 2013-2018, which focuses on increasing scientific literacy in the general public and attracting and preparing the future STEM workforce.
Women in Science
In this story, we take a moment to shine a light on the accomplishments and stories of a few women scientists at the USGS.
Kati Bednar
Years with USGS: 2008-present
Kati Bednar’s high school chemistry teacher encouraged her to look into the USGS, which has since led to a burgeoning science career as a Student Trainee Hydrologist. She is currently a full-time geology and geography student planning to graduate this year with her B.S. in geology and certificate in GIS. During her breaks from school, she gains a wide range of experience assisting USGS scientists studying groundwater, surface water, soils, gases, biological samples and greenhouse gases, while also utilizing her class training.
Of her experiences with the USGS, Bednar says, “I love traveling to remote places that I probably never would have been to, if not for the opportunity to collect water quality samples and conduct other experiments. With the USGS I hope to fully gain an understanding of the hydrological issues that we are going to face in the near future, particularly those dealing with fracking and water quality.”
Hilary Stockdon
Years with USGS: 1998-present
As a research oceanographer, Hilary Stockdon has recently focused her studies on areas hit by Hurricane Sandy along the Northeast coastline. Using lidar technology and USGS-developed models for coastal erosion, her team was able to predict where the natural protective sand barriers would be worn away and thus where the storm surges and hurricane waves would likely cause the most damage. This information, as well as assessments of coastal erosion hazards during future storms, is vital in helping resource managers and coastal planners identify the most vulnerable areas along the shoreline and address the public safety concerns of their residents.
In her time at USGS, she has also studied the impacts of Hurricanes Isabel, Ivan, Katrina, and Ike on barrier island beaches. Her work on the effects of these storms on the coastal communities of our Nation has raised public awareness about the value of scientific information on coastal vulnerability, helping residents prepare for future storms.
Jayne Belnap
Years with USGS: 1987 – present (DOI, USGS)
Jayne Belnap’s career serves not only our Nation but the world. Belnap is a research ecologist studying how different land uses – recreational, agricultural and industrial – affect the fertility and stability of desert soils.
She then applies that knowledge to understand how and why some desert communities are more vulnerable than others to factors such as climate change, invasive species and dust production. In more than 25 years of service, she has published 245 peer-reviewed articles and books on soil crusts.
Belnap, a leader and expert in her field, conducts training for Federal, State, and private land managers on how best to manage dryland ecosystems. She is world-renowned for her expertise on biological soil crusts and has been invited by many foreign governments to train their scientists in soil crust ecology, including such distant places as South Africa, Zimbabwe, Kenya, Mongolia, China, and Australia.
Lucy Jones
Years with USGS: 1983-present
Lucy Jones, a highly respected researcher, is the go-to “earthquake lady” for many news media outlets in California. She has authored more than 100 papers on research seismology as well as developed algorithms to predict probabilities of aftershocks and foreshocks, to help inform earthquake warnings. In 2007 Jones helped launch the Multi-Hazards Demonstration Project (MHDP), which combines hazards science with economic analysis and emergency response to improve society’s resilience to natural disasters.
One result of this project is the Great ShakeOut, which is now a worldwide campaign that includes an earthquake drill where participants practice the safety procedure, “drop, cover and hold on.” It began in 2008 in southern California with 5 million participants, and it has since expanded to include regional ShakeOut events throughout the Nation and around the world. The last ShakeOut in 2013 saw more than 24 million participants. For 2014, Jones is supporting a special partnership between the USGS and the City of Los Angeles as the Mayor’s Science Advisor for Seismic Safety to help local managers develop approaches to reduce the risk from earthquakes.
Florence Bascom
Years with USGS: 1896-1936
Florence Bascom racked up a lot of “firsts”: the first woman hired by the USGS, the first woman to receive a Ph.D. from Johns Hopkins University (1893), the first woman to present a scientific paper before the Geological Society of Washington, and the first woman officer of the Geological Society of America.
Bascom was an authority on rocks of the Appalachian Piedmont and published many reports and maps. Researchers and scientists still refer to her work today. She also studied the water resources of the Philadelphia region. Along with her service in the USGS, Bascom also taught at Bryn Mawr College beginning in 1895 and was the founder of their geology department. Many American women geologists of the early 20th century owed their professional training to Bascom; several of them followed in her footsteps by joining the USGS themselves.
A Legacy of Excellence
These are just a few of the impressive women throughout the history of the USGS to serve our Nation and the world with relevant, high-quality scientific data and information. With the help of their teams and colleagues, they shape the future of the USGS, building a legacy of excellence and accomplishment in the sciences for young women and future generations.
]]>So through email I got the following from a colleague:
The Census Bureau has released a new geocoding tool that allows users to find the census geographic areas that street addresses or address coordinates are located within. The tool is available as an API and a web form. In addition to a single address look-up, the tool also allows users to submit batches of up to 1,000 addresses at a time. The information in the geocoder comes from the Census Bureau’s MAF/TIGER database, which holds our geographic information used for censuses and surveys. The address ranges used in the geocoder are the same address ranges found in the TIGER/Line Shapefiles, which are derived from the Master Address File (MAF).
Additional information, including documentation, descriptions of the data in the geocoder, and FAQs, is included on our geocoder website.
Contact: geo.tiger@census.gov | (301) 763-1128
So this is kinda nice, I think – a geocoder from the group I would expect a geocoder from. Most of my geocoding was against ArcGIS’s Geocoding Service, and when it disappeared as a free service – well – I haven’t done any geocoding since.
For fun and giggles I did my old address and got this back:
Matched Address: 215 Jarnigan Ave, CHATTANOOGA, TN, 37405
Coordinates: X: -85.30156 Y: 35.062725
Tiger Line Id: 59307418 Side: L
Address Components:
From Address: 203
To Address: 299
PreQualifier:
PreDirection:
PreType:
Street Name: Jarnigan
SuffixType: Ave
SuffixDirection:
SuffixQualifier:
City: CHATTANOOGA
State: TN
Zip: 37405
So – I await the happy hacking of tools and scripting… I’m halfway tempted now to pretend I could write a QGIS plugin or an ArcToolbox model!
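For anyone wanting to start on that, here is a minimal sketch (mine, not the Census Bureau’s) of calling the single-address endpoint from Python; the URL and parameter names follow the service’s public documentation, so verify them against the geocoder website before relying on them:

import json
import urllib.parse
import urllib.request

BASE = "https://geocoding.geo.census.gov/geocoder/locations/onelineaddress"
params = urllib.parse.urlencode({
    "address": "215 Jarnigan Ave, Chattanooga, TN 37405",
    "benchmark": "Public_AR_Current",  # benchmark name as documented
    "format": "json",
})

with urllib.request.urlopen(BASE + "?" + params) as resp:
    result = json.load(resp)

# Each match carries the normalized address and interpolated coordinates.
for match in result["result"]["addressMatches"]:
    c = match["coordinates"]
    print(match["matchedAddress"], c["x"], c["y"])

The batch endpoint works similarly, accepting up to 1,000 addresses per request as noted in the announcement above.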
]]>
QGIS Plugin Builder 2.0.3
Point to the directory where your plugin is located.
Compiling the resource and UI files
The basic plugin window with the name of your plugin
Viewing and editing Python code from the QGIS Plugin Builder
Several sharp improvements characterize this release, as well as stability and performance upgrades.
Landsat images help track urban change, a factor that can impact a community’s flood risk. The Federal Emergency Management Agency, or FEMA, uses these images to help identify where they should launch a new flood study. Flood studies determine how prone different neighborhoods are to floods of a certain intensity or likelihood.
Video of FEMA’s use of Landsat data (3:10)
Successful flood studies require an arsenal of tools, however, including data on river flows and storm tides, hydrological and hydraulic analysis of landscape and river systems, and historic rain data, to name a few.
These studies have added satellite data from Landsat to the toolkit. With its archive of images capturing sprawling cities and new developments, Landsat helps FEMA track how building and construction are impacting an area’s landscape.
Earth-observing Landsat satellites have been capturing images of the planet’s surface since 1972. Landsat 8, the newest satellite in the joint NASA and U.S. Geological Survey program, was launched Feb. 11, 2013, and now collects more than 400 images per day. New and archived Landsat data are available free to the public from USGS. Researchers put the free data to a multitude of uses.
One is called the National Urban Change Indicator, or NUCI, a product developed by MDA, a company that makes geospatial products derived from satellites. Using satellite imagery, NUCI detects whether an area has undergone human-induced change over a 25-year period.
Urbanization can spell trouble for flood risk.
“If you identify areas where urban change is accelerating, there are consequences,” said Zack Roehr, a senior spatial analyst with Dewberry, Fairfax, Va., a FEMA subcontractor. “The ground is no longer able to hold water, which means local flooding sources are going to receive more of that water. The flooding characteristics are going to change.”
Soil typically acts like a sponge, absorbing water from rainfall. With urbanization, often the lots that previously had natural landcover, with its ability to absorb some of the rainfall through infiltration, are now covered with concrete or other impermeable material. This change decreases infiltration of rain water into the ground and increases the amount of water that flows to streams and rivers, thereby increasing flood risk. Additionally, urbanization results in increased connectivity of the drainage system, which makes for faster runoff from rain storms into creeks and rivers.
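A rough way to quantify that effect (my own illustration, not from the article) is the classic rational method, Q = C x i x A, where peak runoff Q grows with a runoff coefficient C that rises as land is paved:

def peak_runoff_cfs(c, intensity_in_per_hr, area_acres):
    # Rational method: Q = C * i * A; with i in inches/hour and A in acres,
    # Q comes out in (approximately) cubic feet per second.
    return c * intensity_in_per_hr * area_acres

i, area = 1.5, 100  # a 1.5 in/hr storm over a 100-acre drainage
print(peak_runoff_cfs(0.2, i, area))  # natural cover (C ~ 0.2): 30 cfs
print(peak_runoff_cfs(0.8, i, area))  # heavily paved (C ~ 0.8): 120 cfs

Quadrupling C quadruples the peak flow the local creeks and rivers must carry.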
Selected Landsat Imagery
Austin, TX – accelerated urban change
Washington, DC – 40 years of change
Colorado flooding – Sept. 2013
Learn more
]]>The RTLT is an online catalog of national resource typing definitions and job titles/position qualifications. Definitions and job titles/position qualifications are easily searchable and discoverable through the RTLT. They can be viewed directly on the web page, downloaded in PDF format, or used directly by third-party software applications via the available web services application programming interface (API).
Nationally typed resources support a common language for the mobilization of resources prior to, during, and after major incidents. Resource users at all levels use these definitions to identify and inventory resources for capability estimation, planning, and for mobilization during mutual aid efforts.
The White House Climate Data Initiative is one of the most important and timely initiatives of our times. In its community outreach, Esri plans to focus its initial efforts on 12 large and small communities, including New Orleans, Louisiana; Wake County, North Carolina; and Tamarac, Florida, to develop practical methods and approaches based on GIS technology that address the most critical requirements of the communities. Esri will continue its plan by publishing a series of maps and apps developed in conjunction with these communities that will be shared openly. Communities around the world can use the solutions to make progress toward becoming more resilient.
Esri’s climate-focused geo-collaboration portal is a place where citizens and professionals can go online to discover, contribute, and share resources critical to confronting the impacts of climate change, according to the press release. This website will offer a starting point for open data and ideas. It will evolve over time and grow as more scientists, government entities, and the public use it.
Both the local government focus group project and the geo-collaboration portal complement the Esri Climate Resilience App Challenge, which launched last week in front of thousands of GIS developers at the Esri Developer Summit. The app challenge inspires developers to use their expertise to make maps and analytical tools that help communities see, understand, and prepare for climate risks. The event is open to everyone – from independent developers to startups, governments, academia, and NGOs. The resulting apps will be openly shared, and Esri will award prizes to the winners. In July, the best of the best will be featured at the Esri International User Conference, where more than 15,000 people gather to learn new practices they can use to make a positive difference in their own work.
To extend the reach of these important efforts, Esri has partnered with a variety of organizations that share a commitment for tackling complex climate challenges. Esri’s partners in building resilient communities include the International City/County Management Association (ICMA), National Association of Counties (NACO), National League of Cities (NLC), Tumml, American Public Works Association (APWA), American Planning Association (APA), Association of State Floodplain Managers (ASFPM), American Water Resources Association (AWRA), International Association of Fire Chiefs (IAFC), Local Government Commission (LGC), National Association of Development Organizations (NADO), National Alliance for Public Safety GIS Foundation (NAPSG Foundation), National Information Sharing Consortium (NISC), National Oceanic and Atmospheric Administration (NOAA), National Association of County and City Health Officials (NACCHO), Trust for Public Lands (TPL), and Public Technology Institute (PTI).
###
]]>The post WOW Technology and Visualizing Situational Awareness with Augmented Reality appeared first on CloverPoint.
]]>Rolls-Royce has worked with technology partners in Finland and professionals in the shipping industry to redesign the bridge on tug boats using some amazing technologies. Imagine the bridge of a ship where the windows are actually augmented reality displays providing real-time AR overlays of weather data and alerts, routing information, and navigational information… oh, and the workstation driving it all is designed to recognize who is at the controls and respond accordingly with pre-defined parameters.
Here at CloverPoint we may not be working on redefining the shipping industry; however, we are using some really cool technology, including some cutting-edge 3D visualization tools, to support our clients. Case in point: our popular video sharing some virtual reality scenarios we designed for clients, which take advantage of some great technology and marry our Insight software with virtual reality, a little 3D, Unity3D, and the Oculus Rift. Check out the video HERE
You can see more about what Rolls Royce and their partners at VTT are doing in this space in this fabulous read from E&T – the supporting video has been shared below… enjoy!
]]>The OSGeo group makes a bootable DVD/ISO image that gives you a chance to explore a lot of Free and Open Source GIS Software. The DVD runs Linux (but don’t be scared of it – it’s quite simple to run and use) plus it’s free. There is quite a bit of software available from desktop to server type applications. It’s a great way to explore and learn – plus – once again – it’s free. You may end up getting so curious about some of this stuff you might just install it and possibly….Use it!
Go forth – learn – do something different.
]]>
Thank you, Michael. I promise not to let Randy use this on the new waffle tacos that will be coming to a Taco Bell near us soon.
]]>
The lead software architect at Esri, Scott Morehouse, has had a profound effect on the field of GIScience by applying his deep knowledge of information systems to the development of Esri software for more than 25 years. Hanan Samet, of the University of Maryland’s Computer Science Department and its Institute of Advanced Computer Studies, is an internationally eminent scholar in the theory and development of spatial data structures. Lastly, Dr. John Wilson, Director of the Spatial Sciences Institute at the University of Southern California, is recognized for both his early research in terrain representation and analysis and his leadership in envisioning new directions for GIScience education and research in the 21st century.
The UCGIS Fellows Program was created in 2010 to celebrate the extraordinary record of achievements of individuals in a variety of spatial disciplines and communities of practice that use spatial information. These new Fellows were selected by a review committee comprised of the current UCGIS Fellows and members of the UCGIS Executive Committee.
For more information, please visit http://ucgis.org/announcements/three-new-fellows-recognized or contact Diana Sinton, Executive Director (dianasinton@ucgis.org).
[Source: UCGIS Announcement]
The biography, drawing on family interviews and personal papers, takes up the first two thirds of the book. It reveals a type of character rather familiar to those of us who muck about with snakes: fearless, reckless (he was bitten numerous times) and just a little feral, absolutely fixated on the subject matter, and dripping, perhaps, with a wee bit too much testosterone. A difficult personality who nonetheless engendered fierce loyalty. But Slowinski was more than just Steve Irwin with a Ph.D.: he was stone-cold brilliant, a major contributor to the field of phylogenetics, and in particular to the systematics of elapid snakes -- a point that James makes clear, if not at length. (Can't say I blame him.) The final third reads like a feature article in Outside (and one was written about the incident, by another author), cataloguing the mishaps and bureaucratic nightmares involved in going deep into a restricted area of a country run by a deeply corrupt and paranoid regime, and the heroic attempts to keep him alive once the krait envenomated him while his support networks stateside were dealing with 9/11.
Where The Snake Charmer shines is in its portrayal of Slowinski himself; for all his reckless behaviour, he was not necessarily much for introspection, so James has had to do his homework. I would very much have liked to see a bibliography, though, as in several places James mentions publications that I wanted to look up for myself. In terms of the herpetology, for someone who is not necessarily well-versed in it James does a creditable job, though it's clear he's drawing on secondary sources for his material on snakes, and he makes a couple of minor errors that a herp-aware copyeditor (hi there) would have caught. But I've seen much worse. All in all, an interesting read.
The Snake Charmer: A Life and Death in Pursuit of Knowledge
by Jamie James
Hyperion, June 2008
Buy at Amazon (Kindle) | author's page | publisher's page | Goodreads | LibraryThing
Just a quick ‘heads-up’ for all you CityEngine fans out there: a new version of CityEngine 2013 has been released which fixes a few, ahem, ‘issues’, the main one being a ‘concurrent licence’ issue. I’m reliably informed there is also a fix in there to solve an issue of data being shifted when exported to a webscene (why they haven’t said that in the release notes I don’t know).
If there are any other fixes I hear of or notice, I’ll let you know. In the meantime you can download it from the Customer Care Portal – oh, and you’ll need to uninstall the previous 2013 version to install this. I hope one day these fixes come as updates rather than total new versions…
Want more posts like CityEngine 2013 Service Release (2013.1 140203R) ? Then visit GeoPlanIT for more exciting posts (no really).
]]>Women are encouraged to contribute their information to the map, which has been created on Esri’s ArcGIS Online platform.
]]>
While reading through your site, I noticed you have a link to the U.S. Fire Administration website on http://www.epcupdates.org/.
I wanted to introduce you to FireScience.org, an organization dedicated to Fire education and information ecosystem. Our mission is to provide fire education, public safety careers information and tools to the public at no cost. A few of these resources can be found here:
Fire Science education and training for current and future students: http://www.firescience.org/firefighter-training-education/
An extensive “how to become” series, which includes careers such as firefighting, EMT, fire inspector, fire marshal, arson investigator and more: http://www.firescience.org/how-to-become/
A resources section with a database of fire department and academies and fire statistics by state: http://www.firescience.org/resources/
Would you mind adding the link to our website on your resource page above (or similar page) to help this information reach those interested in fire education and public safety careers?
Fire Science Careers, Education and Degrees: http://www.firescience.org
Thank you,
Matt Davis
Fire Science Online
916-990-4526
"Most of the 100+ available products are updated within three hours of observation, essentially showing the entire Earth as it looks 'right now'. This supports time-critical application areas such as wildfire management, air quality measurements, and flood monitoring."- NASA Worldview WebsiteOf note, browsing on a tablet or smartphone is supported. There is a layered slider for time, so users can choose the year, month, and day. Selecting dates in some websites can be painful. However, this slider is intuitive, quick, and easy to use--much more so than those pesky calendar pages some users are forced to navigate. Users also have several base maps to choose from in addition to the rest of the layers. There is a great "About" page and brief "Tour" available.
Global Land Surface Temperatures/Day from MODIS, 17 July 2013
PolicyMap is a one-stop-shop for a huge variety of public and commercial data (15,000 datasets), as well as the tools to map this data.
“PolicyMap is an online tool that allows anyone, particularly non-experts, the ability to easily make data rich maps on the web,” said McCullough. “Our customers are not GIS specialists or analysts. They tend to be public policy analysts and the end user who is really looking to understand data in a particular geography for specific purposes. So when we launched in 2007 we learned what people want to do with maps and the kinds of data they want access to. We have grown the business to include a lot of public users. We offer a lot for free, have a lot of government agencies, commercial organizations, a growing number of universities and non-profits.”
McCullough said maps have grown in popularity with the use of open data and the federal government releasing more data to the public.
Data maps are useful because they help people gain insight into a geographic area. The maps help people make informed decisions and help them explain those decisions to others. “When we use the phrase ‘data map’ we’re talking about this kind of map: a status map with shading at the state level, built with the data that comes out of the Centers for Disease Control and Prevention (CDC), for example, every month. This map lets us know that the darkest purple places, like Texas and Louisiana and Alabama, have widespread flu activity as of Dec. 20, 2013. You can see how flu has spread.”
A population data map, for example, shows how the population has changed between 2000 and 2011.
“If you don’t have GIS training, you’ve got to find software and the data and you have to know how to put data on the map, and that can be expensive and require training,” said McCullough. “And the data itself isn’t that simple. Open data sounds like it’s free and easy, but it has to be cleaned, normalized, validated, what it’s for and it’s not all in one place.”
Incoming data is not always in the same format, so PolicyMap readies and cleans the data so that users can go ahead and start analyzing it without having to worry about finding it and cleaning it up just to make a map.
The biggest difference in this new version is that the user interface has changed dramatically, with a full-screen map. Maps can be hard for people to understand. The tool allows for making maps, creating tables and reports, and building 3-layer maps, where you can stack three data layers on top of each other. There is a self-service data loader for customers who want to leverage the data already in PolicyMap but also want to upload their own data and see it there as well.
In the Location bar you can type in an address, city, or county, and you can also go to a geography like a census tract, congressional district, or school district. The search boxes and menus in PolicyMap are patterned after those in Amazon or LinkedIn so that they are easy to manipulate for the non-technical user.
From the press release:
Among the updates in the new PolicyMap are:
The basic level of PolicyMap is free. Give it a try: http://www.policymap.com/maps
]]>By Nicolas Moiroux, Armel Djènontin, Abdul S Bio-Bangana, Fabrice Chandre, Vincent Corbel, and Hélène Guis
A better understanding of the ecology and spatial-temporal distribution of malaria vectors is essential to design more effective and sustainable strategies for malaria control and elimination. In a previous study, we analyzed presence-absence data of An. funestus, An. coluzzii, and An. gambiae s.s. in an area of southern Benin with high coverage of vector control measures. Here, we further extend the work by analysing the positive values of the dataset to assess the determinants of the abundance of these three vectors and to produce predictive maps of vector abundance.
Positive counts of the three vectors were assessed using negative-binomial zero-truncated (NBZT) mixed-effect models according to vector control measures and environmental covariates derived from field and remote sensing data. After 8-fold cross-validation of the models, predictive maps of abundance of the sympatric An. funestus, An. coluzzii, and An. gambiae s.s. were produced.
Cross-validation of the NBZT models showed satisfactory predictive accuracy. Almost all changes in abundance between two surveys in the same village were well predicted by the models, but abundances for An. gambiae s.s. were slightly underestimated. During the dry season, predictive maps showed that abundances greater than 1 bite per person per night occurred only for An. funestus and An. coluzzii. During the rainy season, we observed both increases and decreases in the abundance of An. funestus, depending on the ecological setting. Abundances of both An. coluzzii and An. gambiae s.s. increased during the rainy season, but not in the same areas.
Our models helped characterize the ecological preferences of three major African malaria vectors. This work highlighted the importance of studying the binomial and the zero-truncated count processes independently when evaluating vector control strategies. The study of the bio-ecology of malaria vector species in time and space is critical for the implementation of timely and efficient vector control strategies.”
The U.S. Geological Survey, in collaboration with many partners, recognizes National Groundwater Awareness Week: March 9-15, 2014.
USGS National Groundwater Awareness Week website
What’s so special about groundwater?
Groundwater is one of the Nation’s most valuable natural resources. It supplies the drinking water for nearly half our nation’s population and provides about 40 percent of our irrigation water. It sustains streamflow between precipitation events and during protracted dry periods. And it helps maintain a variety of aquatic ecosystems that are dependent on groundwater discharge to streams, lakes, and wetlands. As the Nation’s principal reserve of freshwater, it represents much of the potential future water supply.
Groundwater: right underneath our feet
Groundwater is an essential part of most of our daily lives. Rural farmers and urbanites, water-supply managers and regulators, researchers and policy-makers — all have a part to play in the current status and future of our groundwater resources. This valuable resource is right underneath our feet.
While groundwater can be found nearly everywhere, its availability varies. Groundwater is stored in aquifers, a resource shared by many users. Aquifers receive water deposits from precipitation and surface water. If there are too many users making too many withdrawals from an aquifer — say, to irrigate farmlands, water lawns, or supply wells — and the deposits don’t keep up with the demand, there may not be enough groundwater to go around.
Even when groundwater is plentiful, it’s not truly available unless the quality is acceptable for the intended use. Both water quantity and quality are essential to maintaining water supply for municipal, domestic, agricultural, and recreational use, and for aquatic ecosystems.
USGS and groundwater
At the USGS we systematically observe and monitor groundwater conditions at locations across the United States. USGS groundwater assessments help inform the public so that citizens across the nation can engage in best practices for management, protection, and conservation. Groundwater conservation is a matter of both conserving the quantity and protecting the quality from contamination.
USGS scientists work constantly to improve our understanding of how groundwater moves through the subsurface and what human and natural factors affect the quantity and quality of that groundwater. Understanding these dynamics helps answer important questions about current groundwater availability and long-term sustainability.
Learn more
USGS Groundwater Watch (active groundwater level network)
Geogit is a tool made just for me. Despite my best intentions, there are times when data file names get the best of me. These times are lamentable because when the inevitable moment arrives and I have to comb through a series of files to find the one that is the most correct or the most current, I’ll find something like this:
Krige_1
Krige_2
Krige_3
Krige_3_final
Krige_3_final_final
Or worse. Maybe later on down the line, during that complex interpolation project last year, I decided that this analytical method was suboptimal and went with natural neighbor instead. So then there’s a series of “natural_neighbor”-prefixed datasets in the same vein to sort through. After a year goes by it’s hard to know which one was the end result. Which one had the fewest errors. Which one is the most authoritative.
And with multi-person teams things get more complex. Maybe the intern added 5,000 septic system points to the wrong database and you don’t have an easy way to undo it.
Enter geogit.
It has more benefits than solving the above problems, but these are the ones that hit home the most at this, the beginning stage of my geogit learning journey.
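For flavor, here is roughly what that versioned workflow looks like on the command line. Treat it as a sketch: the commands are approximated from the GeoGit CLI documentation, and the file and commit names are made up.

geogit init
geogit shp import interpolation_results.shp
geogit add
geogit commit -m "Kriging surface, first pass"
# ...rework the analysis...
geogit add
geogit commit -m "Switched to natural neighbor interpolation"
geogit log    # one authoritative history; no Krige_3_final_final required

The history answers “which one is current” for you, and a bad commit (those 5,000 misplaced septic points) can be rolled back instead of hand-deleted.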
Hey, guess what? I started a new position this week at Boundless. I’m absolutely thrilled to be a part of such a great team. Learning about geogit is one of the first things I get to do. I can’t complain.
]]>
If you have an ArcGIS Online Organizational account, you’re already set. But you don’t need one of those to build a story map. In fact, you can create story maps for free.
So why not start experimenting with story maps yourself and see what you come up with?
—–
You can start here by creating a free account:
http://www.esri.com/software/arcgis/arcgisonline/features/free-personal-account
Click on “Sign Up for a Free Account”, which brings up this screen:
Click on “Create a Public Account”, which steps you through the account creation process.
Once you’ve created the account, click on “Map” in the top navigation. Select which basemap you would like to use, but don’t worry about it too much at this point—you can always change this later. Here I’ve selected the National Geographic basemap.
Now save your webmap. This is the webmap you will use to build your web mapping app, or “story map.”
Now click on “Share”, check the box next to “Everyone (public)”, and then click on “Make a Web Application”.
Now choose a template. To make things as simple as possible for your first experience building a story map, select the “Map Tour” template (it’s the only template with an interactive “builder” mode right now) and then click “Publish”.
After you’ve clicked “Save and Publish”, click on “go to the item now”.
From this details page about your new story map, click on “Configure App” and then click on the button.
Next click on “Start a New Tour”.
And there you have it. Your new (but not yet populated) story map.
Now click the “Add” button, which brings up a dialog to add your first item to the map. The first step is to add your media. The two options are “Picture” and “Video”.
For “Picture”, you simply paste in the URLs of your main image and a thumbnail. Ideally the sizes should be 1000 x 750 for your main image and 200 x 140 for the thumbnail, but almost any size will work and the app will resize it on the fly (but remember that there is some overhead with that, so a large story map with non-standard sized images can be a little slow). Another thing to remember is that the main image and the thumbnail can also be two different images—they don’t have to be the same exact image, just at two different sizes.
For “Video”, you can put in the URL for a video hosted on YouTube or Vimeo, then click the button and the app will automatically create the thumbnail for you. If your video is hosted somewhere else, select “Other” and then put in the URL of the thumbnail. An interesting, if undocumented, feature of “Video” – “Other” is that you can actually put in any URL—not just for a video, but for ANY WEB PAGE. Just be aware that not all web pages will work in this context.
Once you’re done entering information about your media, click on the “Information” tab and enter a name and a caption for your item. You can include HTML in both the Name and the Caption, so that you can bold or italicize text, add links, etc.
When done entering your information, click on the “Location” tab. You can pan/zoom and manually mark the location, or you can type an address, place, or longitude/latitude into the search box.
Once your item has been correctly located on the map, click “Add tour point”. You’ve done it–you’ve added your first item to your story map! And it should look something like this:
Now add the rest of your items to the map the same way. Remember to save often. Once all the points are on your map, you can click on “Organize” and interactively drag and drop items to change the order on the map.
And when you’re all done with your story map and ready for people to see it, make sure to click on “Share”.
—–
There are obviously a lot more things you can do to customize your story map, but this is the most basic way to start. So try it out, push some boundaries, and most of all, have some fun with story maps!
Over the past year the Washington DC and Portland Research & Development Centers have experimented with and developed a number of new web technologies. Next week at the annual Esri DevSummit we are presenting on new core technology (Geotrigger Service) and the use of emergent languages, tools and frameworks such as node.js, Ruby, better unit testing, Esri-Leaflet, css preprocessors, user-experience, Angular, Ember, Backbone and Web Components. Many of these are open-source so feel free to check them out on your own.
If you won’t be in Palm Springs, you can still participate in our 100-lines-or-less-js code challenge or our hackathon. Feel free to reach out to say hi or chat!
Patrick Arlt (@patrickarlt on Twitter) from the Portland R&D Center will be talking about the new developers.arcgis.com site and how you can use the tools there to quickly bootstrap your map application.
Monday 1:00pm – 2:00pm – Demo Theater 1 – Oasis 1
Planner link
Aaron Parecki (@aaronpk) from Portland will be doing a quick session on using OAuth with the ArcGIS platform.
Chris Helm (@cwhelm) from DC will be talking about something awesome
Monday 3:30pm – 4:30pm – Santa Rosa/San Jacinto
Planner link
In this session about designing applications, Patrick Arlt from the Portland R&D Center will analyze and critique some real-world maps and design a sample application with some real-world datasets.
Monday 4:45pm – 5:45pm – Demo Theater 1 – Oasis 1
Planner link
Rounding out a very busy day of speaking, Patrick Arlt will discuss the esri-leaflet plugin and how to combine it with other leaflet plugins to create light-weight web mapping applications.
Monday 6:00pm – 6:30pm Demo Theater 1 – Oasis 1
Planner link
Amber Case (@caseorganic) from the Portland center will be co-presenting with Jeff Archer (@vee_dubb) on how the GeoChase application was built. If you are interested in building mobile applications, or the Geotrigger Service, be sure to check this out.
Monday 6:00pm – 7:00pm – Demo Theater 2 – Oasis 1
Planner link
Patrick Arlt from the Portland center will be dropping the wisdom. AngularJS is a rapidly growing application framework from Google that focuses on extending HTML with a language for creating rich applications. Angular can help you create custom HTML elements and attributes so you can rapidly develop mapping applications with reusable components. Oh, and some esri-leaflet for good measure.
Tuesday 1:00pm – 2:00pm Demo Theater 2 – Oasis 1
Planner link
Mike Juniper (@mjuniper) from the DC center will be joining a panel of presenters in this session covering patterns for integrating JS API with other frameworks. The plan is to talk about Angular, Ember, Backbone/Marionette, and Polymer (aka web components). If you are sick of trying to debug spaghetti code, check out this session and add some structure to your next application!
Tuesday 4:00pm – 5:00pm Smoketree A-E
Planner link
Davy Stevenson (@davystevenson), Nate Goldman (@ungoldman) and Ryan Arana (@aranasaurus), all from Portland, will be talking about Esri’s Geotrigger Service and how to build location-based applications that can receive push notifications when they enter or exit a defined area. What could you do if your app knew where it was?
Tuesday 4:00pm – 5:00pm – Mojave Learning Center
Planner link
Thursday 1:00pm – 2:00pm – Mojave Learning Center
Planner link
Joshua Yaganeh (@hsoj), Courtland Fowler (@FowlerCourt) and Ryan Arana (@aranasaurus) all from Portland will be talking about the differences in location services and programming for iOS and Android. This talk will cover user experience, development and platform differences so that you can get a jump start on adding location aware alerts to your iOS and Android applications with Esri’s Geotrigger Service.
Tuesday 5:30pm – 6:30pm – Mesquite GH
Planner link
Thursday 2:30pm – 3:30pm – Mesquite GH
Planner link
Chris Helm (@cwhelm) of the DC center will be mixin it up with all sorts of great technologies. Check out new ways to interact with various ArcGIS GeoServices APIs in the context of advanced JavaScript libraries such as D3.js and Node.js. The session will present ways to use third-party data and APIs within the ArcGIS platform and will illustrate how such data can be combined with other Esri services to make compelling maps and visualizations from a multitude of services. I suspect Koop will also make an appearance.
Wednesday 11:00am – 12:00pm Demo Theater 2 Oasis 1
Planner link
Patrick Arlt’s final presentation covers the use of CSS preprocessors as a means to handle increasingly complex stylesheets containing hundreds of rules, including styles for different browsers and mobile devices. CSS preprocessors like SASS build on top of CSS and allow for reusable variables, grouped rule sets, and customized helpers. Frameworks like Compass and Bourbon can even help manage and version assets and build cross-browser rule sets with a few lines of code.
Wednesday 10:30am – 11:30am Mojave Learning Center
Planner link
Andrew Turner (@ajturner) and Mark Harrower from the DC center will be talking about adding usability testing into your development workflow to ensure that when you deliver your app to the world, it’s the best that it can be!
Wednesday 10:30am – 11:30am Mesquite GH
Planner link
Dave Bouwman (@dbouwman) from the DC center will be co-presenting this session along with David Spriggs (@davidspriggs) and Tom Wayson. We will be covering how to organize your code so it is testable, and how to write tests. We’ll talk about using various testing frameworks, including the Intern, Karma, and a GruntJS-based system. We know it’s late in the day and we promise to make this interesting!
Wednesday 5:30pm – 6:30pm Pasadena/Ventura/Sierra
Planner link
Thursday 1:00pm – 2:00pm Catalina/Madera
Planner link
Kenichi Nakamura (@kenichi_pdx) from Portland and Jason Wieringa (@jwieringa) from DC will talk about using Ruby with ArcGIS. Both the Portland and DC teams have a number of applications, including the Geotrigger Service and the ArcGIS Open Data application, backed by Ruby web frameworks such as Sinatra and Ruby on Rails. Come learn about some of the tools that Esri has built to integrate the ArcGIS platform with Ruby libraries. Even if you’re not a Ruby programmer, you may learn a few things that entice you to try it out, or just learn about some interesting parts of the ArcGIS APIs.
Wednesday 4:00pm – 5:00pm – Mojave Learning Center
Planner link
Thursday 1:00pm – 2:00pm – Smoketree A-E
Planner link
CloverPoint celebrates 22 years in Geospatial technology and data visualization (PRLog, March 2014)
Happy Birthday to us…
First off, now is a great time to get into javascript! jQuery has leveled the playing field across browsers, and the truly horrible versions of Internet Explorer are nearly behind us. The community is exploding, and it seems every day there is some new exciting project in javascript.
I’ve got some great news: over the last few years javascript has matured as a language and as a community. No longer are javascript applications “spaghetti” code by default, and cross-browser issues are much less common and painful than in the past. Myriad Model-View-Something frameworks exist to provide structure for your code, and if you’re doing anything more complex than “Hello World” I’d strongly recommend investing in learning one (or more).
I was going to list out a bunch of frameworks along with pros and cons, but then I remembered this video by Rob Conery titled “Javascript Inferno”. I really like this talk as it compares four javascript frameworks – Knockout, Backbone, Angular, and Ember. Go ahead and click through and watch it now… I’ll wait here…
Backbone was the first of the client-side MV* frameworks that really took off. It’s also barely a framework – very un-opinionated, thus allowing you to do virtually anything. Marionette is a Backbone extension that helps developers implement additional patterns by adding in formal Modules, Controllers, Layouts, Regions, various types of views, as well as an Application. Leveraging these greatly streamlines development, both by reducing repetitive code and by enforcing a development pattern. Personally I liked this stack because it gives you lots of freedom while still providing pattern guidance. Coming from ASP.NET / C# on the backend, this resonated with me. Last spring I did a six-part series on building a mapping app using Marionette; that would be a good intro if Backbone + Marionette sound appealing.
For what it’s worth, this is what we used to build the ArcGIS for Open Data application, as it gave us the benefit of solid structural and architectural patterns, while still leaving us lots of flexibility to implement the interface behavior we wanted.
Polymer is a Google project that lets you build applications based on Web Components. The tricky bit is that Web Components is an emerging W3C standard, and no browsers support it yet. Undaunted, Polymer provides a set of polyfills (stop-gap code) that let you build and use web components today (IE10+ and evergreen browsers). This project just hit “alpha” in mid-February 2014, so it’s great for experiments, but I’d recommend staying away from it for production. That said, Web Components will be the future of the web, so it is worth getting a general understanding of the concepts. Another side note – both Ember and Angular are aligning themselves to slip-stream their view infrastructure into web components.
Underscore is a utility belt of awesome stuff that should be part of javascript but is not (yet). Lo-dash is the mo-better-faster implementation of the same library (yeah competition!). Get to know one or both of these, as they will save you a ton of time and effort. What is really great is that these libraries are smart enough to use native implementations of functions when they are available in the running browser: the same code falls back to a javascript implementation in down-level browsers, but in newer browsers it uses the underlying C++ implementation.
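Two representative utilities, as a quick taste (Underscore 1.x API; runSearch and the #search input are hypothetical):

// _.pluck: pull one property out of a list of objects
var names = _.pluck([{name: 'parcels'}, {name: 'roads'}], 'name'); // ['parcels', 'roads']

// _.debounce: hold off an expensive handler until typing settles down
var lazySearch = _.debounce(runSearch, 250);
$('#search').on('keyup', lazySearch);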
Bootstrap is a css framework that allows you to create a “reasonable” web app in minutes – no wonder it’s the most popular front-end framework! Sensible defaults on a responsive base mean that you can throw markup into a file and, after only a few minutes reading the documentation, have a site that looks good on a 27-inch iMac and on your phone.
What’s more – Bootstrap is so popular that when you want to level up, there are tutorials on creating themes, or you can skip that and drop two lattes worth of cash on a pre-made theme. Boom. Beautiful, and you don’t have to fight the css. Bootstrap also comes with a bunch of optional javascript helpers. Start by using them, and then as you transition up to using a framework like Angular/Ember/Backbone, you can switch over.
In the famous words of Nike: Just Do It. Start something – anything. Throw it up on github. A great starting point for working with maps is the bootstrap-map-js repo. Slap up something simple. Then build something else.
Will it break? Yes. Will you have problems with Internet Explorer? Yes. Will you scream at your monitor and rue the day you learned how to spell javascript? Likely.
But honestly, that’s learning. You did that when learning Silverlight or Flex. And really, if you are going to work on the web, javascript needs to be your new best friend. Think of it like this – you could switch to native app development, and then you’d have to know Objective C for iOS, Java for Android, and C# for Windows Phone/8.
In comparison, javascript is not so bad.
Check out my previous blog post on Leveling up your Javascript for lots of links.
YouTube has tons of resources for Angular and Ember.
Html5Rocks & Polymer-Project – lots of info on Polymer and Web Components
Using our current favourite Internet of Things service – If This Then That – the front light in the radio can be linked to any number of data feeds (see our post on IFTTT, Netatmo & Philips Hue: Linking Data to Lighting); at the moment it changes colour according to the outside temperature. The movie below shows the link to the Philips Hue and the iPhone BBC Radio App (ignore the cat, it decided to take part in every example I filmed):
The National Oceanic and Atmospheric Administration (NOAA) recently signed an enterprise license agreement with Esri, the world leader in GIS technology.
The agreement enables NOAA to continue building its GIS platform while maintaining data quality in bathymetry, climate and weather data, navigational charting, fisheries protection, natural resource management, marine planning, and other areas of its mission.
“NOAA now has the ability to increase access to Esri software and services that provide additional options for making NOAA data and applications available to all our constituencies and partners,” says Tony LaVoi, NOAA geospatial information officer. “We’re looking forward to the opportunities this presents to continue to grow our geospatial programs in NOAA.”
All NOAA employees now gain unlimited access to select Esri desktop and server products, including the powerful ArcGIS for Desktop, ArcGIS Spatial Analyst and 3D Analyst extensions, and ArcGIS for Maritime. In addition, NOAA staff members gain unlimited access to Esri’s Virtual Campus for online training, discounts on Esri technical support and classroom training, and complimentary passes to annual Esri user and developer conferences.
Another benefit of the agreement is a subscription to Esri’s ArcGIS Online. This benefit allows NOAA to quickly create interactive maps and applications and share these with the rest of the organization and the public.
“The agreement provides a foundation for the development of an enterprise geospatial program for NOAA, which will likely result in increased efficiencies across the organization, enhanced access to NOAA data and services, and a streamlined acquisition process,” states Joe Klimavicz, NOAA’s chief information officer (CIO).
NOAA’s mission is to understand and predict changes in the earth’s environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources.
For more information about enterprise license agreements, visit esri.com/ela.
[Source: Esri press release]
‘More Indigenous territory has been claimed by maps than by guns. This assertion has its corollary: more Indigenous territory can be reclaimed and defended by maps than by guns. Whereas maps like guns must be accurate, they have the additional advantages that they are inexpensive, don’t require a permit, can be openly carried and used, internationally neutralize the invader’s one-sided legalistic claims, and can be duplicated and transmitted electronically which defies all borders, all pretexts, and all occupations.’ (Bernard Q. Nietschmann, 1995)
Maps, not guns – I love it. Through CloverPoint and its numerous mapping escapades, I’ve had the honor of helping several indigenous peoples in their quest to assert their rights and title to the land they call home, no different from you or me defending our own homes and properties with pride. To date we’ve not needed any guns.
A spell back my esteemed colleague Brandon Thompson and I attended an Aboriginal Land Management conference and met Anthony Laforge of the Magnetawan First Nation. Anthony was an amazing individual, filled with passion for his people and culture, and for how their way of life was at risk from the pressures surrounding their natural resources and traditions. A great storyteller, Anthony put the cause of the Ojibwa people, and of most indigenous peoples, in perspective with this simple analogy.
“Imagine someone pulling up to your driveway, opening your white picket fence gate, crossing your grassed yard, opening your front door, walking in straight passed you through your living room and kitchen, then pouring and drinking a glass of water – How would you feel? While we would be happy to share our water, all you have to do is ask.”
Chief Karen Ogen has become a personal friend and inspires me with her every word. She is a courageous woman fighting for the people of the Wet’suwet’en First Nation (http://wetsuwetenfirstnation.ca) near Burns Lake, BC. The LNG industry is poised to push through the Wet’suwet’en’s front yard, and Chief Karen wants meaningful consultation with government and industry. She, like Anthony and the Magnetawan, wants the respect of those who choose to come to her people’s traditional territory: to ask her permission, to ask what her people want, to ask for that glass of water.
Defending one’s rights and titles is a basic principle. Defending that principle is done with the courage of people like Anthony Laforge and Chief Karen Ogen. Defending that principle is done with MAPS, not guns.
Corcoran’s thesis articulates a common bond we at CloverPoint share with trailblazing geographers like Nietschmann and their technical counterparts like Jack Dangermond of Esri or Roger Tomlinson, the founding father of GIS. Corcoran’s olive branch to the mappers of the world:
Author: Jeff Warwick, President & CEO, Cloverpoint – @warwickvic
References:
Corcoran, Paul Andrew (BA, MSc). Benchmarking Spatial Information Endeavours in South Australia: An Aboriginal Context. Thesis submitted for the degree of Doctor of Philosophy (Geoinformatics), University of South Australia, Barbara Hardy Institute, School of Natural and Built Environments, Division of Information Technology, Engineering and the Environment.
A screenshot of an unpacked shapefile in File Manager
Unemployment Rate 2009, by County
Hawaii, cities, volcanoes, and risk of lava flows, transparent, overlaying imagery.
However, I take issue with this little snippet in Sunday’s NY Times from David J. Hand. When speaking about geographic clusters* he wags his finger at us and pontificates, “…if you do see such a cluster, then you should work out the chance that you would see such a cluster purely randomly, purely by chance, and if it’s very low odds, then you should investigate it carefully.” See the short article here.
Granted, he’s probably reacting to the surfeit of maps circulating the internet claiming to prove this, that, or the other, when in fact they are mostly bogus. For example, Kenneth Field tweeted this abomination this morning:
#McCartoCrap ~2500 years of cartography and this RT @Amazing_Maps: what a time to be alive pic.twitter.com/CnzVHLW26w
— Kenneth Field (@kennethfield) February 24, 2014
Jonah Atkins has created a github location called Amazing-Er-Maps for sharing remedies to bad maps like the one above. (The name is itself a reaction to “Amazing Maps,” a twitter account that at times showcases maps of questionable quality.)
Amazing-Er-Maps, as I understand it, is a place for you to upload a folder that contains the link to a bad map and a new map that is similar but does a better job. You include the data and the map, as well as any code that goes with it. It’s a fabulous idea. Don’t just complain about bad maps; seek to make them better in a way the whole community can learn from and be inspired by. Check it out: Jonah’s already got it going with several fun examples. Super warm-fuzzies.
Circling back to Mr. Hand, he has a point: we need to apply sound statistical and mathematical reasoning to our datasets and the maps we make from them. For example, when I was helping the Hood Canal Coordinating Council map septic system points, I didn’t just provide maps for them to visually inspect for clusters of too-old septics, I produced a map of statistically significant clusters of the too-old septics using hierarchical nearest neighbor clustering, which provides a confidence level for the chance that the cluster could be random.
The point is, those of us already following sound data-mapping practices don’t like to be lumped in with the creators of maps that are produced–let’s face it–as sensational products. Our little map community is challenging those bad maps out there, creating great ones for our clients and bosses, and continuing to learn to make them better. Give us a bit more credit here and check out some of the really amazing things we’ve done.
*On an exciting note, “geographic clusters” makes main-stream news media!
For us, the Echosec Social map was a fantastic resource for viewing live tweets and amazing photos being shared by fans in attendance at the game – you can view images from the Olympic village, the closing ceremonies, and from inside the arena where the game took place.
Canada took to Twitter big time and jumped on the #GoCanadaGo hashtag to share their enthusiasm – you can see results from the hashtag on the social map for a fun look at the fans showing their support! For those looking for even more Canada luv from Sochi, our analysis of the hashtag shows that the following related hashtags were also popular for fans: #wearewinter #Olympics #CanadaProud #WJC2014 #TeamCanada #Sochi2014 #Olympics2014
A quick report showing some of the most popular and most influential users of the #GoCanadaGo hashtag:
It’s nice that xkcd provides the occasional popular exposure of cartographic topics, but unfortunate that it makes critics’ jobs easier. The comic linked above has been invoked often since it first appeared, including in response to everyone’s latest favorite map to hate, US GDP Split in Half.
If you’re a map person you’ve already seen this a thousand times, often accompanied by hyperbolic words like “incredible,” and you’ve also seen a thousand complaints about how it’s meaningless and simply a map of population density. (Indeed, I was not about to let an opportunity for snark slip by, despite my stated support for bad maps.) The argument is that most people in the US live in metropolitan areas, so of course that’s where most of the economic activity will come from.
Is that true? There are reasons why the map doesn’t say anything significant about economic activity—one of them being that it’s totally arbitrary and there are infinite ways to divide GDP in half geographically—but “just a population map” is a cheap and thoughtless dismissal. The only thing that is actually a population map is a population map.
In this case, the numbers show that it’s not quite a population map. The 23 metropolitan areas account for half the GDP but only 39% of the population, and, by extension, their per capita gross metropolitan product is 50% higher than the remainder of the country. There’s wide variation among the metro areas, too: San Jose’s per capita GMP is nearly twice that of Phoenix. Here’s a version of the map with a little extra information based on 2012 source data and population estimates (corrected to show metropolitan areas, not urbanized areas).
That’s not to say the original map did a good job of highlighting contrasts between population and economic activity, or really anything at all—it doesn’t expose any population information, and the arbitrary grouping means that these 23 metros are not necessarily more special than any others—but the point is only that the contrasts do exist and the map is not simply a population map.
So, friends, let’s not be hasty to drop xkcd links and the categorical “just a population map” criticism. There’s nuance to every map, even if we have to go looking for it.
Want more posts like Does the CityEngine webscene viewer work on a Nexus 7 (2012)? Then visit GeoPlanIT for more exciting posts (no really).
For this example we are going to use Google Analytics. Simple, Free, and virtually ubiquitous. Of course you could send this information to another service or a custom back-end, but that’s beyond the scope of this discussion.
The three types of telemetry we want to track – page views, user actions and in-browser performance – all map very nicely to three calls in the Google Analytics API:
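Roughly speaking (the category, action, and label values below are hypothetical, but the three analytics.js send types are standard):

ga('send', 'pageview', '/search');                       // page views
ga('send', 'event', 'map', 'basemap-change', 'topo');    // user actions
ga('send', 'timing', 'xhr', 'fetch-features', 1250);     // in-browser performance, in ms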
Although analytics.js gives you a means to send this information to the backend service, it’s not the sort of thing that we want littered all over our code. Thus…
Before we start into the details, let’s talk application design for a moment. While you could simply litter the code with calls to the analytics API, that would make a mess, and be a huge pain should you want to change to use some other tracking system.
Instead, we want to centralize the tracking and storing of telemetry in a central service. The specifics depend on your framework (Backbone/Angular/Ember/Dojo/other) but you will likely have some sort of “global event bus”, or a core “Application” object which all elements of your application can access.
In our application, we are using Backbone and Marionette, and so we have added methods to our “Application” object, as that is passed into all the modules.
Again, your application architecture will dictate where to attach these events, but in frameworks that have the concept of a router, that’s a good place to start. For all “navigation” actions in our application, we follow the same pattern and centralize things onto an application method, specifically Application.navigate(). Super handy, because we can just drop in the page view logging in this one function, as shown below:
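A minimal sketch of the idea (Backbone.history.navigate is standard Backbone; the other names follow this post’s conventions):

Application.navigate = function (route, options) {
  Backbone.history.navigate(route, options || { trigger: true });
  this.logPageView(route);  // every navigation is logged here, exactly once
};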
The actual logPageView function is just a call to the analytics function.
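Something like this (a sketch; the window.ga guard keeps the app working when analytics is blocked or absent):

Application.logPageView = function (page) {
  if (window.ga) {
    ga('send', 'pageview', page);
  }
};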
As we mentioned in the last post, this is simply a means to track what the user has interacted with. So, any place you have DOM event handlers, we want to assign that action a name and make a call to the Application.logUiEvent() method.
Depending on the amount of “magic” your application framework comes with (I’m looking at you Ember and Angular) this may be more or less difficult. Even with Marionette in the mix, our Backbone based app is pretty un-magical. All DOM interaction happens in handlers, defined in Views. So, all we do is drop in calls to logUiEvent in these handlers. While this could be made even more elegant by overriding the backbone and marionette view event binding infrastructure, in the interest of keeping the code easy to understand, we opted to just add these calls. Here is an example from one of our views.
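A sketch of the pattern (the view, selector, and event names here are hypothetical):

var ToolbarView = Backbone.Marionette.ItemView.extend({
  events: { 'click .basemap-toggle': 'onBasemapToggle' },
  onBasemapToggle: function () {
    Application.logUiEvent('map', 'basemap-toggle');
    // ...then actually switch the basemap...
  }
});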
This is the trickiest of the bunch. As we mentioned, we need two calls – one to setup the timer, and a second to indicate the event we were tracking has completed. Or failed. We also need to handle the case where it does not complete.
For this we created a timer object – “Took.js” – which grabs a time stamp when it’s instantiated, and calculates the duration until the stop() method is called. We also have two additional methods – store() and log().
Here is a jsbin that you can play with as well (obviously it won’t report to Google Analytics)
We also expose this via the Application object as two simple methods startTimer(name, category, label, maxDuration) and stopTimer(name).
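A stripped-down sketch of the shape described above; the real Took.js also has store(), and the maxDuration bookkeeping for timers that never complete is omitted here:

function Took(category, label) {
  this.category = category;
  this.label = label;
  this.started = new Date().getTime();
}
Took.prototype.stop = function () {
  this.duration = new Date().getTime() - this.started;
};
Took.prototype.log = function () {
  ga('send', 'timing', this.category, this.label, this.duration);
};

var timers = {};
Application.startTimer = function (name, category, label, maxDuration) {
  timers[name] = new Took(category, label);
};
Application.stopTimer = function (name) {
  var timer = timers[name];
  if (timer) {
    timer.stop();
    timer.log();
    delete timers[name];
  }
};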
Through our code we wrap the various blocks we want timing data on in these two calls. Before we show an example, this brings up another area where we have centralized things – xhr calls. Although Backbone has a dependency on jQuery, and we could use $.ajax or $.getJson anywhere in the application, we have decided to route all requests through a central location – again on our Application object.
Anyhow – here is an example of a call that fetches rows from a remote feature service.
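A sketch of what such a call can look like with the timer wrapped around it (Application.getJson stands in for our central jQuery wrapper, and serviceUrl is a placeholder):

Application.startTimer('fetch-features', 'xhr', 'feature-service');
Application.getJson(serviceUrl + '/query', { where: '1=1', f: 'json' })
  .done(function (response) {
    Application.stopTimer('fetch-features');
    // ...hand the rows off to the collection...
  })
  .fail(function () {
    Application.stopTimer('fetch-features');  // record failures too
  });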
At this point, our project has not gone live, so we just have telemetry from dev and test environments in Google Analytics. That said, having this setup well before launch has already helped us re-arrange some of our naming conventions, and validated some ideas about the types of reports we can get out of the system.
Since we know that javascript performance varies greatly between browsers (orders of magnitude between recent browsers and older versions of Internet Explorer), we really wanted to make sure we could segment our performance data by browser and version.
Turns out that segmenting the data like this is not “built-in”, but it’s not hard to set up. Basically you create new “segments”, and in the UI for that choose “Technology”, then Browser & Browser Version.
With this in place, we can now easily compare the performance of specific actions between different browsers. NOTE: data in this screenshot is from development – the actual performance is *much* better in production, where all the code is combined and minified.
We hope this helps you get started with telemetry in your javascript applications. This is extremely useful information to have, and now more than ever, it’s very easy to get. Happy coding!
The same lidar data classified into different categories using lasclassify – and triangulated.
Contours overlaying a raster DEM using las2dem and las2iso.
The details of the sphinx, including head, front and rear legs, can be easily distinguished. |
I’ve been finding that small increments just don’t work very well (lots of z-fighting). On the plus side, the new tree rendering in CityEngine 2013 webscenes is fantastic.
You can view the webscene below on my company’s ArcGIS Online site.
Want more posts like 3D Flood Mapping Landscapes with CityEngine ? Then visit GeoPlanIT for more exciting posts (no really).
Read the rest of Insignificant Spaces on my blog.
Designing an application is really a set of educated guesses about what features users will actually use. We do interviews, conduct user testing sessions, and use our existing understanding of the “ideal user”, but without data coming in from real users, we just don’t know. In this two part series we will show you how to get this sort of data.
The idea here is simple: track user actions in your application and send that information (“telemetry”) to a service that will aggregate it. Most analytics packages do this, but they are oriented towards so called ‘normal’ web pages, and the information they provide for javascript heavy pages (aka apps) is very limited without doing some additional work.
We are interested in three classes of telemetry data – Page Views, User Actions, and in-browser performance. In this post we will introduce these classes, give some ideas of what to track, and how to organize things. Next post we will review how to actually implement the tracking.
While there are many “page view” tracking techniques, they usually revolve around full-page refreshes. Of course, modern javascript applications eschew full-page refreshes in favor of more immersive applications that manipulate the url via html5 push-state.
Put another way, a user may spend 40 minutes using your web app, but traditional “page view” measurement may only register a single “hit” – when the page first loads.
Thus, when building a rich javascript application, we need to help out and manually collect this information as the user changes pages, or “context”, within the application. While most “single-page applications” don’t have clear boundaries between “pages”, we can usually break things down into reasonable units. Most applications will have some “home” or landing view, a search view, one or more types of search results views, item detail views, etc.
The main thing is to come up with a naming convention for these “pages” that is consistent and makes logical sense.
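For instance, a hypothetical scheme might look like:

/home
/search
/search/results
/dataset/:id/details
/dataset/:id/table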
We will look at the details of how to integrate this sort of tracking in the next post, but if your application has a “router”, that is likely a good place to attach page view logging.
Being able to track user actions is the key to telling which features are actually being used. Since you are instrumenting the code yourself, you can track virtually anything that raises an event.
In our case, we want to measure the percentage of users who change the base map or resize the map… and virtually every other interface interaction.
This really boils down to adding code to every DOM event handler in the app, so how we structure our telemetry helpers will be really important – we don’t want brittle code littered all over the app. Think about having a central “telemetry service” that is available to all views or DOM events in the application.
Having a good naming convention is even more important for this type of tracking, since these calls will likely be added by more than one developer, and a mish-mash of naming will make the telemetry data a mess to work with. On the upside, it’s pretty easy to tweak.
The third type of information we want to track relates to performance. Our team develops on Chrome Canary, on maxed-out Retina MacBook Pros, using high-speed internet. Unfortunately, not all our users will have such an optimized environment. Add in the fact that we are supporting IE8/9/10/11, Chrome, Firefox, Safari, and Opera, and virtually the only way to get realistic performance information for all those platforms is to harvest it from real users.
The end goal, of course, is to help improve the real-world performance of the application. But before we start wildly poking around the code base tweaking things we think may be performance bottlenecks, we want to have the system instrumented so we know where the real bottlenecks are, and so that when we deploy changes, we can verify we really do see improved performance.
So – what do we want to time? Initially, for our project, we want to track basic page load times, network calls (xhr’s), and computationally intensive code blocks (client-side filtering). Some specific timers:
While page view and events are essentially single calls, tracking timing requires two actions – one to start a timer, and a second to stop it and record the duration. Once again, having sensible, consistent naming is really helpful.
In the second part of this post, we will talk about how to integrate telemetry into an application.
Radio Telescope photo modified from Stephen Hanafin‘s Flickr stream. cc by-sa.
Yes, you can do this all in other packages, but in CityEngine I can change the flood height variable and the model changes pretty much instantly (video to come). I’ve obviously done some pre-processing work in ArcGIS to allow it to work in CityEngine.
Data courtesy of CyberCity3D, who I am providing CityEngine consulting for.
Want more posts like 3D Flood Mapping of London using CityEngine ? Then visit GeoPlanIT for more exciting posts (no really).
New Jersey uses “PAMS_PIN” as a key between the GIS data and the tax assessors’ rolls. The key is a simple concatenation: Municipal Code (a four-digit value for each municipality), Block, and Lot, joined by underscores. If the Qualifier Code (“QCODE”) field is populated, then that is also joined, again with an underscore. In Somerset County, many of these PAMS_PIN keys are incorrect, resulting in duplicates.
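In SQL terms, building the key looks roughly like this (a sketch; the table and column names are hypothetical):

SELECT mun_code || '_' || block || '_' || lot ||
       CASE WHEN qcode IS NOT NULL AND qcode <> ''
            THEN '_' || qcode
            ELSE '' END AS pams_pin
FROM parcels;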
Read the rest of Key issue with New Jersey’s parcel data on my blog.
Here are some screen shots from ArcGIS CityEngine and the Webscene viewer.
Data courtesy of CyberCity3D, who I am providing CityEngine consulting for.
Want more posts like 3D London & Terrains ? Then visit GeoPlanIT for more exciting posts (no really).
UPDATE 07/02/2014: Read the associated comment on this post from Phil at the Ordnance Survey, quite possibly the best response I’ve ever had on this blog, thank you.
UPDATE 07/02/2014 2: Old Maps at end of this post courtesy of ‘Phil Allen’ FSE Manager at the Ordnance Survey, thank you!
Working with real 3D models of London, it sometimes makes sense to place them in context on a boundary map, but I’ve run into something that’s given me pause for thought….
The City of London is an odd and special part of London, I think you’ll agree. I’ve always known its administrative boundary to be a little odd (something about bridges…); sure enough, on the City of London website there is the boundary, showing clearly that two bridges are covered in its area.
Now, being a GIS sort of fellow, I want to download this boundary set. So, visiting the OS OpenData site, I see that something’s up: whilst one bridge is clearly there on the left, London Bridge has been excluded (hence my clever title). What does it all mean? Well, I think probably OS OpenData is generalised in some way and this bit got missed… but I don’t really know. Downloading boundary data from the Greater London Authority data site doesn’t fix things either (it is just the data set the OS gives).
So what does this all mean? Well, it means that the OS may well be the ‘authoritative geographic data’ set for the UK, but that doesn’t mean everything you get from it is without ‘issues’. Know your data, know its limitations. Also, did I mention OpenStreetMap seems to get it right? Why am I relying on data from the OS again?
UPDATE 07/02/2014 1: Read the associated comment on this post from Phil at the Ordnance Survey, quite possibly the best response I’ve ever had on this blog, thank you.
UPDATE 07/02/2014 2: Old Maps below courtesy of ‘Phil Allen’ FSE Manager at the Ordnance Survey, thank you!
Want more posts like One of our Bridges is Missing! Mapping Discrepancies (update: no it isn’t) ? Then visit GeoPlanIT for more exciting posts (no really).
Want more posts like New 3D London data arrived today? Then visit GeoPlanIT for more exciting posts (no really).
APPLY NOW FOR SEPTEMBER ENTRY
As Course Director, I am pleased to announce the new MSc and MRes in Smart Cities here at The Bartlett Centre for Advanced Spatial Analysis. Both the MSc and MRes offer an innovative and exciting opportunity to study at UCL – with key training for a new career in the emerging Smart Cities market. Smart cities are focused on how computers, data, and analytics – models and predictions – are being embedded into cities. Cities are currently being extensively wired, generating new kinds of control and new kinds of services and producing massive data streams – ‘big data’. To this end, we need powerful analytics to make sense of this new world, and a new skill set to understand and lead this new field.
CASA
The Bartlett Centre for Advanced Spatial Analysis (CASA) is one of the leading forces in the science of cities, generating new knowledge and insights for use in city planning, policy and design and drawing on the latest geospatial methods and ideas in computer-based visualisation and modelling.
CASA’s focus is to be at the forefront of one of the grand challenges of 21st-century science: to build a science of cities from a multidisciplinary base, drawing on cutting-edge methods and ideas in modelling, complexity, visualisation and computation. Our current mix of architects, geographers, mathematicians, physicists, urban planners and computer scientists makes CASA a unique department within UCL.
Our vision is to be central to this new science, the science of smart cities, and relate it to city planning, policy and architecture in its widest sense.
EDUCATIONAL AIMS OF THE PROGRAMME
The MSc Smart Cities and Urban Analytics comprises 180 credits, which can be taken full-time over 12 months or on a flexible modular basis of up to 5 years’ duration (full details of the MRes structure can be found here). If taken full time over one year, the following structure is followed:
TERM ONE
Smart Systems Theory
The module provides a comprehensive introduction to a theory and science of cities. Many different perspectives developed by urban researchers, systems theorists, complexity theorists, urban planners, geographers and transport engineers will be considered, such as spatial interactions and transport models, urban economic theories, scaling laws and the central place theory for systems of cities, growth, migration, etc., to name a few. The course will also focus on physical planning and urban policy analysis as has been developed in western countries during the last 100 years.
The class runs during term one, for two hours per week
Assessment is by coursework (2,500 – 3,000 words)
Quantitative Methods
This module will empower you with essential mathematical techniques to be able to describe quantitatively many aspects of a city. You will learn various methodologies, from traditional statistical techniques, to more novel approaches, such as complex networks. These techniques will focus on different scales and hierarchies, from the micro-level, e.g. individual interactions, to the macro-level, e.g. regional properties, taking into account both discrete and continuous variables, and using probabilistic and deterministic approaches. All these tools will be developed within the context of real world applications.
The class runs during term one, for three hours per week (one hour lecture followed by two a hour practical)
Assessment is by coursework (2,500 – 3,000 words) and via an exam
This class runs during term two, for one and a half hours per week
Assessment is by coursework (2,500 – 3,000 words)
Spatial Data Theory, Storage and Analysis
This module introduces you to the tools needed to manipulate large datasets derived from Smart Cities data, from sensing, through storage, to approaches to analysis. You will be able to capture and build data structures and perform SQL and basic queries in order to extract key metrics. In addition, you will learn how to use external software tools, such as R, Python, etc., in order to visualise and analyse the data. These database and statistical tools will be complemented by artificial intelligence and pattern detection techniques, in addition to new technologies for big data.
The class runs during term two, for two hours per week
Assessment is by project output (5,000 – 6,000 words)
Urban Simulation
The module provides the key skills required to construct and apply models in order to simulate urban systems. These are central to the development of smart cities technologies. You will learn different approaches, such as land-use transport interaction models, cellular automata, agent-based modelling, etc., and see how these are fashioned into tools that are applicable in planning support systems, and how they are linked to big data and integrated data systems. These models will be considered at different time scales, from short-term modelling, e.g. diurnal patterns in cities, to long-term models for exploring change through strategic planning.
The module runs during term two, for two hours per week
Assessment is by coursework (2,500 – 3,000 words)
TERM THREE
Dissertation
This dissertation marks the culmination of your studies and gives you the opportunity to produce an original piece of research making use of the knowledge gathered in the lectures. You will be guided throughout this challenge by your supervisor and with the support of the Course Director, and together you will decide the subject of research. This enterprise will enable you to create a unique, individual piece of work with an emphasis on data collection; analysis and visualisation linked to policy and social science oriented applications.
Assessment is by a 10,000–12,000-word dissertation.
STAFF
The teaching staff are world leaders in the field, from Professor Mike Batty MBE through to Sir Alan Wilson – you can find full details and staff profiles at our sub-site http://mscsmartcities.org/
ENTRY REQUIREMENTS
Ideally, you will already have a Bachelor’s degree in an appropriate subject such as Geography, GIS, Urban Planning, Architecture, Computer Science, Civil Engineering, Economics or a field related to the Built Environment, though other subjects will be considered, especially if your personal statement demonstrates a keen enough interest to convince us you should be given a place. You’ll need to have obtained a 2.2 (or international equivalent) to join the MSc.
If you do not have a Bachelor’s degree, we can still consider you if you have a professional qualification and at least three years’ relevant experience, so don’t be put off applying if you fall into this category: the greater the mix of students we have on each course, the more interesting the seminars and discussions are going to be.
HOW TO APPLY
You can apply for this course online or find out more at http://www.mscsmartcities.org – if you have any questions, you can contact our Teaching and Learning Administrator, Lisa Cooper. The course brochure can be downloaded in PDF format (5MB).
As readers of this blog are frequently reminded, spatial data is increasingly mobile. To remain highly relevant to users, OGC standards must take into account mobile network and platform/device strengths and limitations. As a result of several overarching OGC activities initiated back in 2011, and the contribution of its members, OGC’s standards are increasingly implemented on and designed to serve mobile platform users. Looking ahead, the OGC’s standards and value proposition are becoming increasingly attractive to mobile-only technology providers and their users.
Before we get to the how, there are a few things you should know about this trick:
Even with the gridded queries that dynamic mode feature layers use, at small scales (zoomed way out) it is common that you will be requesting more than the max record count number of features. When this happens, you get “holes” in the data returned, as shown below.
The default for max record count is usually 1000 features, but I think the latest release bumps this up to 2000. Regardless of the default, this value can be changed as part of the server configuration, so unless you control the server, it’s not something you can change.
Scale ranges are commonly used to avoid sending very detailed or dense data over the wire. So, even if the layer you are working with only has a few dozen features, if they are really detailed geometries, things may get really slow, so perhaps reconsider.
It’s actually really easy. Create a feature layer, then in its “load” event reset the min/max scale properties. Then add it to the map.
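A sketch against the ArcGIS API for JavaScript 3.x (the service url is a placeholder; setScaleRange(0, 0) clears any authored scale limits):

require(['esri/map', 'esri/layers/FeatureLayer'], function (Map, FeatureLayer) {
  var map = new Map('mapDiv', { basemap: 'gray', center: [-98, 39], zoom: 4 });
  var layer = new FeatureLayer(serviceUrl, { mode: FeatureLayer.MODE_ONDEMAND });
  layer.on('load', function () {
    layer.setScaleRange(0, 0);  // 0, 0 removes the min/max scale restrictions
  });
  map.addLayer(layer);
});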
Here is a link to a JSBIN you can play with.
The map below shows a feature layer from a demo server that has a minScale of 100,000 shown on a map that is at 1:36,978,595
And of course you can use the same technique with MapService layers. Shown below is a layer that has a maxScale of 1,000,000 and we are zoomed in well past that.
Once the bulbs are connected to the network there are a variety of third-party services that can be used to control them, either as a group or individually. One of our current favourites here at CASA is If This Then That (IFTTT), a service that allows the creation of simple ‘recipes’ linking popular online systems via a series of rules. Among the services linked on IFTTT are the Netatmo Weather Station and the Philips Hue.
We are using the Netatmo in CASA as part of our forthcoming office data dashboard, it monitors internal temperature, humidity, sound levels and carbon dioxide with an external unit providing temperature and humidity readings.
IFTTT is an emerging service and, as such, it does have a number of limitations. We had to create a recipe for each temperature change, rather than combining it into one logical statement. IFTTT also only checks the readings from external services every 15 minutes, making it unsuitable for rapidly changing data. Finally, there do seem to be a few bugs: the hex colour does not seem to map directly to the Hue light, so a few tweaks are required to get the correct colour output.
Despite a few limitations, it does open up a number of possibilities; the next steps are to look into how to link in other data feeds. This is arguably where the power of IFTTT comes into play: its ability to link a number of services and then control external hardware makes it a perfect opening into linking data to the Internet of Things.
The whole system could arguably be built using an Arduino board / Raspberry Pi linked to LEDs at a much lower cost. The linking of consumer-grade units to data is, however, perhaps a step towards smart objects for everyone. The whole setup took under two hours to get up and running, including setting up the IFTTT recipes and linking to the Philips Hue. Edit (20/2/2014) – the light is now installed inside an old ship’s lamp with the colour changing according to the outside temperature.
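If you do go the DIY route, the bulbs can also be driven directly over the Hue bridge’s local REST API, skipping IFTTT’s 15-minute polling entirely. Here is a minimal Node.js sketch; the bridge address, API username, light id and the temperature-to-colour mapping are all placeholder assumptions for illustration:

var http = require("http");

var BRIDGE_IP = "192.168.1.2";   // placeholder: your bridge's address
var API_USER = "newdeveloper";   // placeholder: a username registered on the bridge
var LIGHT_ID = 1;

function setColourForTemperature(tempC) {
  // Map -5..30 C onto the Hue colour wheel: 46920 (blue) for cold, 0 (red) for hot.
  var clamped = Math.max(-5, Math.min(30, tempC));
  var hue = Math.round(46920 * (1 - (clamped + 5) / 35));

  var body = JSON.stringify({ on: true, hue: hue, sat: 254 });
  var req = http.request({
    host: BRIDGE_IP,
    path: "/api/" + API_USER + "/lights/" + LIGHT_ID + "/state",
    method: "PUT",
    headers: { "Content-Type": "application/json" }
  }, function(res) {
    console.log("Bridge responded: " + res.statusCode);
  });
  req.end(body);
}

setColourForTemperature(12.5); // e.g. a Netatmo outdoor reading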
It is probably a niche market, but the linking of data to lighting is an intriguing development. From making the lights blink when carbon dioxide levels in the office rise through to changing colours according to a twitter or rss feed, we are excited by the possibilities.
The web-based weather dashboard can be viewed at http://www.casa.ucl.ac.uk/weather/colours.html; the data updates every 3 seconds. We will be mounting the ‘weather bulb’ in a small globe and placing it in the corner of the office. The Philips Hue starter pack comes with three bulbs, and up to 50 lights can be linked to each bridge. With the ability to link each bulb, or a group of bulbs, to almost any data feed, the office is about to become a colourful place…
“La Quinta” is Spanish for “next to Denny’s.” – Mitch Hedberg
Thinking about this, I realized we could use GIS to find the number of La Quintas that are next to Denny’s. Last night after the kids were asleep, I sat down with a beer (Sierra Nevada) and figured out how I could get the data and perform the calculation.
First, I visited La Quinta’s website and their interactive map of hotel locations. Using Firebug, I found “hotelMarkers.js” which contains the locations of the chain’s hotels in JSON. Using a regular expression, I converted the hotel data into CSV.
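For the curious, the regex step looks roughly like the Node.js sketch below; the field names in hotelMarkers.js are assumptions for illustration, since the file’s real structure may differ:

var fs = require("fs");

var source = fs.readFileSync("hotelMarkers.js", "utf8");
// Assumed entry shape: {"name":"...","lat":38.91,"lng":-77.04}
var pattern = /"name":"([^"]+)","lat":(-?\d+\.\d+),"lng":(-?\d+\.\d+)/g;

var lines = ["name,latitude,longitude"];
var match;
while ((match = pattern.exec(source)) !== null) {
  lines.push('"' + match[1] + '",' + match[2] + "," + match[3]);
}
fs.writeFileSync("laquinta.csv", lines.join("\n"));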
Next, I went to Denny’s website and their map. Denny’s uses Where2GetIt to provide their restaurant locations. They provide a web service to return an XML document containing the restaurants near the user’s location (via GeoIP) or by a specified address. Their web service also has a hard limit of 1,000 results returned per call. Again, I used Firebug to get the URL of the service, then changed the URL to search for locations near Washington, DC and Salt Lake City, Utah, with the result limit set to 1000 and the radius set to 10000 (miles, presumably). From this, I was able to get an “east” and “west” set of results for the whole country. These results were in XML, so I wrote a quick Python script to convert the XML to CSV.
#!/usr/bin/env python
# Convert the Where2GetIt XML results into CSV: one row per <poi> element.
import sys
from xml.dom.minidom import parse

dom = parse(sys.argv[1])
tags = ("uid", "address1", "address2", "city", "state",
        "country", "latitude", "longitude")
print ",".join(tags)
for poi in dom.getElementsByTagName("poi"):
    values = []
    for tag in tags:
        node = poi.getElementsByTagName(tag)[0]
        # Empty elements have no text child; emit an empty field instead.
        if len(node.childNodes) == 1:
            values.append(node.firstChild.nodeValue)
        else:
            values.append("")
    print ",".join(map(lambda x: '"' + str(x) + '"', values))
I then loaded the CSVs into PostgreSQL. First, I needed to remove the duplicates from my two Denny’s CSVs.
CREATE TABLE dennys2 (LIKE dennys);
INSERT INTO dennys2 SELECT DISTINCT * FROM dennys;
DROP TABLE dennys;
ALTER TABLE dennys2 RENAME TO dennys;
Then I added a Geometry column to both tables.
SELECT AddGeometryColumn('laquinta', 'shape', 4326, 'POINT', 2);
UPDATE laquinta SET shape = ST_SetSRID(ST_MakePoint(longitude, latitude), 4326);

SELECT AddGeometryColumn('dennys', 'shape', 4326, 'POINT', 2);
UPDATE dennys SET shape = ST_SetSRID(ST_MakePoint(longitude, latitude), 4326);
From there, finding all of the La Quinta hotels that live up to their name was quite easy.
SELECT d.city, d.state,
       d.shape <-> l.shape AS distance,
       ST_MakeLine(d.shape, l.shape) AS shape
FROM dennys d, laquinta l
WHERE (d.shape <-> l.shape) < 0.001 -- 'bout 100m
ORDER BY 3;
Here are the 29 cities that are home to La Quinta – Denny’s combos:
You might have noticed El Paso, Texas, has two La Quinta – Denny’s combos. Both happen to also be on the same street, just a few miles apart.
I love that the second one even shares a post for both of their highway-scale signage.
So out of the 833 La Quintas and 1,675 Denny’s, there are 29 that are very close (if not adjacent) to one another. So, only 3.4% of the La Quintas out there live up to Mitch Hedberg’s expectations.
GIS: coming up with solutions to problems no one asked about!
Update: Chris in the comments made a good point about projection. So here’s the data reprojected into US National Atlas Equal Area and then limited to 150 meters distance between points.
SELECT d.city, d.state,
       ST_Transform(d.shape, 2163) <-> ST_Transform(l.shape, 2163) AS distance
FROM dennys d, laquinta l
WHERE (ST_Transform(d.shape, 2163) <-> ST_Transform(l.shape, 2163)) < 150
ORDER BY 3;
This yields 49 pairs (or 5.8% of all La Quintas):
This modification to the query introduced a new oddity: Huntsville, AL has a Denny’s that is between two La Quintas, which is why it’s on the revised list twice.
Update: I uploaded a dump of the data to Github, if you want to explore the data on your own.
Meteorology is coordinated worldwide through a technical commission of the United Nations: the World Meteorological Organisation (WMO), whose roots were established in the 19th century. There is a long tradition of global standards and interoperability. WMO also has responsibility for hydrology and collaborates strongly with UNESCO's International Oceanographic Commission (IOC) over marine matters.
Of course you should be careful with this technique. Many times MapServices are used to display very dense data, so you may end up pulling a lot of data over the wire. But, if you happen to need more interactivity from a layer that’s already published as a MapService, this is a great way to avoid having to publish a FeatureService.
Here is a JSBin with the example running.
[Image: Can you see the Luxor Hotel?]
[Image: A slightly cleaned-up 3-D view of the Luxor Hotel.]
[Image: How tall is the Luxor Hotel? Here, a quick measurement calculates about 105 meters, close to its actual 107m.]
[Image: NEW: LIDAR data with satellite imagery underneath.]
[Image: Google Street View of the Luxor and surrounding area.]
[Image: The NWS Enhanced Data Display – snow predictions.]
[Image: Hour-by-hour predictions…]
Ah 2013, what a great year you were – returning to full-time development has been a fantastic change of pace, and I’m super excited about what the next year will bring. In that spirit, here are a few things that the Esri DC team is looking forward to investigating.
This year our team is looking at adding Yeoman and Bower into the mix. We’ve been using grunt since we started our project, and I can’t imagine doing any front-end development without it. Yeoman and Bower are companion tools that bring more awesome workflow options for front-end developers. Yeoman focuses on scaffolding out applications or components for you (think Rails or ASP.NET MVC scaffolding, but for your javascript). Bower is a dependency management tool which makes it easy to pull down all the front-end libraries you use and easily update them over time. Grunt, of course, is a javascript task runner that can orchestrate things, and you should be using it. Or gulp.js, which is a “streaming build system”.
Although our team and our primary project use Ruby on Rails as a back-end web framework, as javascript developers we can’t ignore all the activity around Express.js & Sails.js.
Personally, in switching over to OSX, I’ve been feeling a little off the back in terms of full-stack chops, so I’ve been hacking around with building simple web apps using Express, which is a back-end framework built on node.js. Express is the most mature of the node.js “frameworks” and thus the documentation is pretty good, and there are good intro and advanced books available. There are also a lot of modules that can be used to extend express.
Taking it up a level, Sails.js is a set of Express middleware that adds “stronger opinions” to Express. It is crazy quick to get started with, and makes building out single-page-style apps and APIs a breeze. It’s similar to Rails in that it comes with scaffolding tools – so it’s a simple command to whip up a model and controller at the same time, and have that auto-magically become a REST and Socket.io based API. Again similar to Rails, it also has a pluggable data store model, allowing you to switch between database technologies as your project progresses. Sweet. That said, development is on-going and documentation is limited, so expect to fall back on some Express chops when you start digging into things. From a team perspective, Chris Helm (@chelm) has built the Koop project on top of Sails, so stay tuned for some interesting things coming on that front.
If you are paying any amount of attention to the front-end web development space, you’ve likely heard people raving about Angular JS. While I am a big fan of Backbone & Marionette, the more I read about Angular, the more interested I am. Backbone is great when you want “extreme control” of the user experience (as is the case in our project), but in other scenarios frameworks like Angular are a big win, as they can off-load a lot of work from the developer. On top of that, Angular seems to be able to play nice with server-rendered markup, which is a big plus for building apps with really fast initial page-load times. Finally, being built by Google, using solid software engineering principles (dependency injection, anyone?), with an eye towards our web-componenty future (see below), I’d expect Angular to be relevant for quite some time.
I also found this very interesting post about Backbone and Angular. Not sure I completely agree with everything in there, but if you use Backbone, it’s worth reading.
Speaking of looking forward over the next few years, the proposed web components standard is extremely interesting, and will be a huge shift in front-end development. At its core are 4 technologies (Templates, Shadow DOM, Custom Elements and Packaging) which, combined, will allow developers to create re-usable “components” for applications. What this boils down to is that we will be able to create new html tags and bake in behavior – just like built-in UI components such as the <select>. Only much bigger and better. What will be particularly interesting to watch is how application frameworks (Backbone, Ember, Angular) adapt to web components.
Thanks to Google, this future is available today. Chrome Canary has support for many of the underlying technologies, and the Polymer Project is a set of polyfills that allow web components to be used on all modern browsers (IE10+). A quick sketch of the idea follows.
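For a taste, here is a tiny custom element; note this uses the customElements API that eventually shipped in browsers, whereas the 2014-era proposal (document.registerElement) looked slightly different:

// Define a new tag with baked-in behavior, like a built-in <select>.
class GeoBadge extends HTMLElement {
  connectedCallback() {
    var place = this.getAttribute("place") || "somewhere";
    this.textContent = "You are near " + place;
  }
}
customElements.define("geo-badge", GeoBadge);

// Usage in markup: <geo-badge place="Washington, DC"></geo-badge>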
To learn more about Web Components…
If nothing else, this year will be a ton of fun as we all try to build the next awesome thing.
Cross posted from blog.davebouwman.com
Our team is building an extension to ArcGIS.com that will better enable organizations to meet their Open Data initiatives. Much more about the specifics of the application functionality will be coming out in the next few months (think Esri Fed UC aka FedGIS) but what is important in the context of this post (and partly why I joined this team @ Esri) is the technology stack.
Although the “back-back-end” is the ArcGIS Online item-store, our app itself has a Rails back-end that acts (more or less) as a fancy caching proxy / search engine. Knowing just enough Ruby to be very dangerous, I’m pretty insulated from the guts of the back-end, but it does impact the front-end code because we use the Rails asset pipeline to concatenate and minify our javascript.
Like many web apps these days, the front-end of our application uses Backbone + Marionette. We chose this framework because it provides us enough structure to ensure consistent separation of concerns across the code-base (read: it’s maintainable and testable), while still affording lots of flexibility in terms of implementation and integration.
With that in mind, here is a list of the resources that I’ve found useful…
Sublime is a great editor on its own, but the plug-in ecosystem is what makes it truly rock. I’ve tried a bunch of other plugins and these are the ones that I actually end up using on a regular basis:
I very quickly learned that using git by yourself, working directly in master, is very different from working on a team that’s got a very fluid “git-flow” going. Leveling up took some time, and I still feel like I’m one bad merge away from disaster.
While I’ve been working with javascript on a regular basis for the last 5+ years, I was also wearing a lot of other hats, so it was time to get into the language at a whole new level.
Again, while I’d built a few backbone applications, there is a difference between using a framework, and knowing it. Reading the backbone source code is a good option, but it can be abstract. Here are some additional resources that are really helpful.
Marionette is a composite application library built on top of Backbone. Essentially, Marionette helps remove the repetitive, boiler-plate code that is common in Backbone apps.
There is a great eco-system of Backbone plugins out there, and here are two that we are using
How anyone can build a non-trivial javascript application without good unit testing is a mystery to me. While it takes some effort to get into the swing of writing tests, the value you get back out of it is totally worthwhile. I can’t even begin to explain how much having tests has helped our project – and we have ~50% coverage. There are lots of testing tools out there, and this is what we are using; a tiny example spec follows.
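For anyone who has never seen one, a unit test is just a small assertion about a small piece of code. A Jasmine-style spec (Jasmine being one popular choice, not necessarily our exact stack) looks like this:

describe("slugify", function() {
  // A hypothetical helper under test.
  function slugify(title) {
    return title.toLowerCase().replace(/[^a-z0-9]+/g, "-");
  }

  it("lowercases and hyphenates a dataset title", function() {
    expect(slugify("Parcel Data 2013")).toEqual("parcel-data-2013");
  });
});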
Just because we are working in simple text editors does not mean we don’t want automation, and that is where GruntJs comes in. Written in node, it can be used to automate a lot of the “behind the scenes” work that heavier IDEs perform – i.e. syntax checking (aka linting), running unit tests, and code coverage tools. Grunt’s plugin ecosystem is huge, and although we use just a few packages, the breadth of what you can automate is amazing. Install them by adding entries to package.json and running npm install; a minimal Gruntfile is sketched below.
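A minimal Gruntfile.js might look like the sketch below; grunt-contrib-jshint and grunt-contrib-jasmine are common plugin choices here, not necessarily the exact set we run:

module.exports = function(grunt) {
  grunt.initConfig({
    // Lint all source and spec files.
    jshint: {
      all: ["src/**/*.js", "spec/**/*.js"]
    },
    // Run the Jasmine specs against the source.
    jasmine: {
      all: {
        src: "src/**/*.js",
        options: { specs: "spec/**/*Spec.js" }
      }
    }
  });

  grunt.loadNpmTasks("grunt-contrib-jshint");
  grunt.loadNpmTasks("grunt-contrib-jasmine");

  // Plain "grunt" lints and then runs the test suite.
  grunt.registerTask("default", ["jshint", "jasmine"]);
};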
As I mentioned in the intro, we use Rails asset pipeline to combine + minify our javascript, and to compile the sass to css. But of course there are grunt plugins for that stuff as well – just poke around at GruntJs.com
If only I wrote perfect code all the time! Alas, a lot of time is spent debugging things, and so it’s good to get familiar with the tools of the trade. In the past I used FireBug quite a lot, but over the last year I’ve switched to Chrome Dev Tools and never looked back. At this point the Dev Tools are practically an IDE, allowing for direct editing of source javascript and css. Support for javascript source maps and css source maps is also amazingly helpful. But that’s just the tip of the iceberg: performance and memory profiling, mobile emulation and a console that is crazy powerful take it to the next level.
That’s about it – thanks to all the people who contribute their time and energy to building this amazing eco-system that helps push the web forward.
Photo Credit: Some rights reserved by Toms Bauģis
Announcement: If you are interested in Android app development, check out the Coursera course "Programming Mobile Applications for Android Operating Systems" from the University of Maryland. I will be taking it and hope to see you there! The course is free. However, Coursera offers a Verified Certificate for $49, which may be worthwhile for professional development. The course begins on January 21st and lasts 8 weeks, so if you are interested, get registered now! Check out the video below!
Of note in the literature is Mike Batty’s new book on The Science of Cities along with new papers in the CASA Working Paper Series – now up to number 194 and all available to download. We have launched a new MSc in Smart Cities and Urban Analytics as well as an MRes in Smart Cities to run alongside the current MRes in Advanced Spatial Analysis and Visualisation, opening up a number of new routes to gain a Masters degree at CASA. As a note to the season, our Christmas movie is below:
The project is building a series of architectural interfaces in East London and Nottingham neighbourhoods, which use broadcast media and interactive technologies to enhance real world connections, add value to the daily experience of the urban environment, foster community participation and ownership of the urban space. In this project we are seeking to involve local organisations and residents as key partners in the creative development process.
It is one of our favourite projects out of the current Research in the Wild themes, The Screens in the Wild team includes: makers, architects, HCI designers, computer scientists, anthropologists, developers, artists and curators.
Find out more at http://www.screensinthewild.org/
[Image: Available from Packt Publishing and Amazon/Amazon Kindle.]
As part of a wider segment on the BBC’s One Show, with a 4-minute documentary about pigeons by Mike Dilger and Adam Rogers, the Pigeon Sim was wired up for a fly-through by comedian Jack Dee. The clip below shows the system in action and our attempt at making sense of the whole thing on live TV:
The One Show was hosted by Chris Evans and Alex Jones and attracts audience ratings in excess of 4 million. The joys of live TV and technology should not be underestimated – it worked, but with a full studio, getting the Kinect to recognise a comedian in a pigeon outfit is challenging to say the least…..
You can fly Pigeon Sim yourself at the Almost Lost exhibition in Wellington Arch, London, from December 4th.
Jerry Johnston is Geospatial Information Officer at the US Department of the Interior. In this role, Dr. Johnston leads DOI’s efforts to coordinate and implement geospatial technology across the Department to meet a wide range of mission goals. This includes providing a vision for geospatial interoperability throughout the enterprise, as well as guidance and perspective on opportunities for adopting place-based approaches more broadly across Departmental lines of business.
JOSM is my preferred editor for OpenStreetMap. I find it to be incredibly robust and powerful. I also love that it’s extensible; there are plenty of great plugins and it integrates with web services well. One thing that JOSM supports is WMS and TMS background imagery. I often use the freely available 2012 imagery for New Jersey as a base for my edits. We’re also using the 2012 WMS for our NJ MAP “Growth” crowdsourcing web app to help identify areas of recent development in NJ. If you haven’t seen the app, check it out. It’s an easy-to-use app where 2012 imagery is presented along with a black mask derived from the 2007 urban lands in the Land Use/Land Cover data. Simply put, if you see a building on the aerials, click on it and tell us what it is.
Because those clusters of single family housing built post-2007 are not likely reflected in OpenStreetMap, I wanted to see if I could provide some base roadways for the new subdivisions as well as clean up the land use imported into OSM, which was from 2002.
I have a WMS service of the points in the Growth app, served up by GeoServer. GeoServer is also capable of providing the same data via TMS. While I was going to simply add a link to the WMS in JOSM, so that I could see the project’s contributions alongside the aerial photo and the OSM data, I realized that it wouldn’t be as useful in the main map interface, because I’d only see the points after I downloaded some OSM data.
To help you find your area of interest, JOSM includes several OSM-derived map services through the Download Data window. I did not realize that the list of layers included any TMS layers you added to JOSM. So instead of adding the new urban points from Growth as a WMS, I added them as a TMS. Now, instead of browsing for locations using Mapnik tiles, I can look for clusters of development points.
Granted, it’s not too meaningful – it’s just the points without any other background information – but it’s easy to find areas that likely need attention.
This area was forest in 2002, but in 2012 is a new housing development. I was able to add the roads of the new development, as well as clean up the surrounding land use.
Now, while I’ll still need to refer to other available sources (such as NJ’s road network or our parcel data – both freely available under OSM-friendly licenses) for things like road names, being able to focus in on areas that need updating in OSM will help us all improve the New Jersey portion of the world’s best free map.
If you want to help update OSM using NJ MAP: Growth as a guide, add the following TMS to JOSM:
tms:http://gis.rowan.edu:8080/geoserver/gwc/service/tms/1.0.0/njmap%3Acrowdsource_points@EPSG%3A900913@png/{zoom}/{x}/{-y}.png
[Image: Week 1 map – early in the epidemic…]
[Image: Hong Kong – generated/fake “cases” appear to be located in the water, through error or perhaps how they are being generated.]
The UK’s Future Cities movement is set to receive a major boost, thanks to a new initiative by the Future Cities Catapult and Level39, Europe’s largest accelerator space for financial, retail and future city technologies, based at Canary Wharf.
Launched in March this year, the Future Cities Catapult is an independent innovation centre set up by the Technology Strategy Board (TSB). It aims to help UK businesses develop cutting-edge, high value urban solutions and then sell them to the world. The Catapult will unite business, city governments, innovators and academia to run a series of large-scale pilot projects that address how cities can become more economically active, while lowering their environmental footprint and moving towards a low-carbon economy.
The Future Cities Catapult is prioritising the ‘Future Cities Financing’ aspect of their work programme to address the need for large-scale investment into the sector. Growing cities represent a great opportunity to develop innovative technologies that have the potential to provide a major boost to the UK economy. More than £6.5tn will be invested globally in city infrastructure over the next 10 to 15 years. Of this, the accessible market for companies developing future city systems is estimated to be worth £200bn a year by 2030. Critical to the flourishing of future cities is greater access to finance for integrated city systems (not simply the traditional model of investment in specific projects in areas such as transport, energy and health).
The Future Cities Catapult will focus on developing and leveraging finance for cohesive city systems as a key area of activity in future. This work has begun by tasking Level39 with pulling together experts from within its extensive finance and investor networks to raise awareness of investment opportunities and draw upon their knowledge to debate the merits of the current or potential new funding models.
Peter Madden, Chief Executive, Future Cities Catapult, said: “In the future, the majority of people will live in cities. There are huge benefits to people and opportunities for business in making cities better places to live, but at present we lack the financial mechanisms to unlock these opportunities.
“Future Cities Financing will explore the challenges, opportunities and risk profiles for different kinds of investment. By engaging with the right financial stakeholders, we can also amplify the message that investment in smart cities is a new and highly valuable asset class.”
Eric Van Der Kleij, Head of Level39 at Canary Wharf Group plc says:
“We are pleased to be supporting the Future Cities Catapult by creating the right blend of industry experts to debate the crucial issue of future cities financing. The built environment at Canary Wharf is one of the most technologically advanced in London and we will continue this journey at Level39 by facilitating new innovation in future cities technology. Our location alongside the capital’s leading financial district, and extensive contacts, which include finance specialists, fund managers and investors, will enable us to help unlock vital sources of funding.”
The clip below provides a good insight into Level39 – having been there a couple of times we can safely say its an exciting place:
Finally, details on what the Future Cities Catapult is are highlighted in the movie below:
You can find out more about the Future Cities Catapult at https://futurecities.catapult.org.uk and Level39 at http://level39.co/level39/introduction/
[Image: The new sleek-looking QGIS website.]
[Image: A heat/density map of healthy food stores in Philadelphia.]
In 2008 we launched the GeoCommons open geodata sharing community. To date the community has shared over 200,000 datasets, including a few hundred million features. The goal was to make it as easy as possible for people to share, find, and use open geospatial data for exploration, and collaboration. As a small company we were very pragmatic on our API and metadata while also attempting to evolve the developer ecosystem to use these small but important mechanisms.
For example, we embedded Dublin Core meta tags in the HTML and used OpenSearch-Geo for auto-discovery to reference our AtomPub feeds, which provided links to relevant formats and actionable endpoints.
A year ago we joined Esri, and I am interested in extending our use of lightweight, pragmatic linked open data formats. I believe that by making data prolifically available in both highly accessible (e.g. CSV) and modern, advanced (e.g. JSON-LD, ALPS) formats we can not only encourage but enable the community to evolve its adoption of these formats. We can utilize this linked data for a vastly improved user experience, similar to what Google is doing with Gmail. A small sketch of the idea follows.
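For a flavor of what that can look like, here is a hedged sketch of a dataset description in JSON-LD using schema.org vocabulary; the names, URL and bounding box are illustrative, not actual GeoCommons metadata:

{
  "@context": "http://schema.org",
  "@type": "Dataset",
  "name": "County Boundaries (illustrative example)",
  "description": "Polygon boundaries for counties, shared as open data.",
  "spatialCoverage": {
    "@type": "Place",
    "geo": { "@type": "GeoShape", "box": "38.7 -77.2 39.0 -76.9" }
  },
  "distribution": {
    "@type": "DataDownload",
    "contentUrl": "http://example.com/data/counties.csv",
    "encodingFormat": "text/csv"
  }
}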
Historically, I have seen most discussions of Linked Data devolve into deep ontological debates, overly focused on schemas in OWL or RDF and lacking the practicality of user experience and application capabilities. I am hoping that it’s possible to develop and deploy these ideas so that people do not need to know the technical specifications in order to realize the benefits.
Thematic web maps themselves provide some linked relationships. Layers are styled references to data services, implying a contextual relationship between layers within a specific geographic area of interest. Through styling, users have characterized attributes of interest, but are likely missing links to the units of measurement, statistical relevance, or methods of measurement.
What other examples are there, akin to Google or Apple’s embedded metadata, where annotated, and ideally even linked, data directly enhance the user experience? Relevant to the geospatial domain, what types of formats have gained prevalence and tool adoption? OpenStreetMap as LinkedGeoData was a good start but seems to be moving only slowly. What shortcomings still need to be addressed that we as a group can discuss and drive community discussion around?
And if you are interested in investigating and building this technology – we are actively hiring smart engineers and user experience designers – make sure to ping us.
A free day of everything related to future cities and the built environment at the premier 300-seat Barbican Cinema One:
Held at the 300-seat Barbican Cinema One in London, the open-to-all conference will deal with new ways of visualising cities in 3D using virtual realities, it will show how we are building an internet of things around which the city is being reinvented, and it will explore how the prosperity of cities relates to their scale and size. A particular feature of the day will be a focus on new ways of exploring movement patterns in cities using data sets from smart cards, from open data sources and from new methods for crowdsourcing not only data but ideas for future cities.
The event will be streamed live at http://conference.casa.ucl.ac.uk/ from 10am Friday 27th, 2013…..
The parts were created in the free version of SketchUp via a DXF plan and exported to .stl for import into MakerWare. 3D printing is still a hit-and-miss affair, so we printed each part out individually to minimise the risk of any printing errors on the Replicator.
In general, printing in the centre of the Replicator reduces errors; we also added a raft to each cog and printed at 100% to increase the strength of the final print. Each cog took approximately 2 hours to print, with the frame sections taking 3 to 4 hours.
The complete clock took 4 days to print; it runs on a 600g weight and requires winding every 48 hours. The clip below details the completed 3D-printed clock:
Paradoxically, as cyberspace provides a world without borders, human population is becoming more centralized. The increasing production of information in cities raises issues of privacy, access, and inclusion. Who will own the brains of Smart Cities? Fast Company sees a battle for control between "hacktivists" pushing for self-serve governance and companies providing opaque systems based on proprietary technology. Achieving balance depends on an agenda of openness, transparency and inclusiveness led by municipal government and enabled by open standards.
Big Data, Sensing and Augmented Reality – New Directions for The Crowd and Industry
Andrew Hudson-Smith
Director, The Bartlett Centre for Advanced Spatial Analysis, http://www.casa.ucl.ac.uk / http://www.digitalurban.org, @digitalurban
Every day we create 2.5 quintillion bytes of data — so much that 90% of the data in the world today has been created in the last two years alone. This data comes from everywhere: sensors used to gather climate information, posts to social media sites, digital pictures and videos, purchase transaction records, and cell phone GPS signals, to name a few (IBM, 2013). This data, compared to traditional data sources, can arguably be defined as ‘big’, with cities and urban environments the main source of this data creation. Every minute 100,000 tweets are sent globally, Google receives 2,000,000 search requests and users share 684,478 pieces of content on Facebook (Mashable, 2012). An increasing amount of this data stream is geolocated, from check-ins via Foursquare through to tweets and searches via Google Now. These streams of data that cities and individuals emit can be collected and viewed to make the data city visible, aiding our understanding of not only how urban systems operate but opening up the possibility of a real-time view of the city at large (Hudson-Smith, 2013). This paper and presentation explore the rise in big data and the need to link Building Information Modelling (BIM) to Geographic Information Systems (GIS), Citizen Science and finally the Internet of Things (IoT). This four-way linkage, we argue, is key to the generation of the future smart city, a city that will be viewed via augmented reality. We explore various augmented reality systems and conclude that the next decade will see the fall of the smart phone and the rise of electroencephalograph-embedded devices with information sent directly to our retinas – this is, we argue, the future of big data, sensing and augmented reality in relation to the built environment.
Over the last 5 years there has been a turning point in the availability of data related to the urban environment. Systems such as The London Data Store, a download data service from the Greater London Authority (GLA), stand out as first steps towards opening up data related to the city. The store was developed with the ethos of allowing anyone to access the data that the GLA and other public sector organisations hold and to use the data however they see fit – for free (GLA, 2013). With on-going development as part of a Europe-wide Smart Cities project known as iCity, the Data Store has, to date, stimulated over 70 mobile applications linking to the 500+ datasets and a combination of the 27 real-time live traffic and transport data feeds. The key to opening up our understanding of the city is the joining of such datasets with other feeds and systems. The collection and visualisation of live data, linked to current systems such as the established BIM and GIS sectors, and the emerging citizen science and IoT movements, is central to this ethos. The Internet of Things, a term attributed to the Auto-ID research group at MIT in 1999, denotes the idea that in future every object will have an online presence (de Jode et al, 2012). Information from occupancy rates through to ambient temperature can be collected using sensors linked to the IoT; combined with location and time attributes, it moves from the hyper-local view of the sensor through to the macro scale of an urban system. The IoT is still in its infancy, with systems ranging from Xively (https://xively.com/) and an array of numerical data through to Tales of Things (http://www.talesofthings.com) and its narrative-based take on objects. It is, however, estimated that over 6 billion objects will be connected to the internet by 2015 (Bunk, 2013). These objects will form the backbone of future smart buildings, BIM systems, and ultimately, the smart city for data analysis, predictive modelling and visualisation.
The current theme in many research fields is citizen science. As Haklay (2010) states, citizen science can take a form in which volunteers put their efforts into a purely scientific endeavour, such as mapping galaxies, or a different form that might be termed ‘community science’, in which scientific measurements and analysis are carried out by members of local communities so they can develop an evidence base and set action plans to deal with problems in their area. Largely driven by the rise in mobile phone ownership and access to networked technology, ‘the crowd’ is both a provider and user of data. Access to social networks is a key source of crowd-based data: almost all the main social network providers allow access to data feeds via an Application Programming Interface (API), which allows data to be collected or mined. These APIs are central to the current ability to make sense of these growing streams of data. One of the most popular current social networks is Twitter; created in 2006, the network now has 500 million registered users with over 340 million tweets made daily. Twitter allows users to send a message up to 140 characters in length; a tweet can contain links to other web-based content, user names and a user’s location. At The Bartlett Centre for Advanced Spatial Analysis we have developed a variety of techniques and toolkits to mine and track this variety of network data, from the IoT through to the crowd, often with location and time as the linking theme.
Arguably, the core unifying aspects of data are location and time. As such, on-site augmented reality systems, similar in characteristic to the forthcoming Google Glass, have the potential to join up the smart city. Powered by BIM, GIS, the IoT and the crowd, such systems will embed a new range of sensors to capture and share the ‘emotion of place’ while streaming in data. Taking a look at the current number of complex systems, it is clear that there is a need to join up and simplify urban data acquisition and analysis; the move to augmented reality may provide this unique opportunity.
Bunk, A. (2013), The Internet of Things, http://blog.bosch-si.com/
de Jode M., Barthel R., Rogers J., Karpovich A., Hudson-Smith A., Quigley M. and Speed C.”Enhancing the ‘second-hand’ retail experience with digital object memories”, Proceedings of the 2012 ACM Conference on Ubiquitous Computing, Pittsburgh, Pennsylvania, 2012, pp 451-460.
GLA. (2013), The London Datastore, http://data.london.gov.uk/datastore/about.
Haklay, M (2010) Geographical Citizen Science – Clash of Cultures and New Opportunities. In: (Proceedings) Workshop on the Role of Volunteered Geographic Information in Advancing Science, GIScience 2010.
Hudson-Smith (2013), Tagging and Tracking, Architectural Design, forthcoming.
IBM. (2013), Big Data at the Speed of Business, http://www-01.ibm.com/software/data/bigdata/
Mashable. (2012), How Much Data is Created Every Minute? http://mashable.com/2012/06/22/data-created-every-minute/
THAT WEST WING BIT ABOUT THE GALL-PETERS PROJECTION
We feel smug every time someone tweets or emails this to us, because we already knew the distortions of the Mercator projection and the social arguments for the Gall-Peters projection. It’s all we can do not to lecture you about it beyond the four minutes of the clip. Don’t get us wrong: we’re kind of giddy that the ever-highbrow West Wing introduced you to the subject, but we’ve seen it a million times.
Nothing is where you think it is.
“HASN’T EVERYTHING ALREADY BEEN MAPPED?”
Okay, this one isn’t a link people send us. But, every time. Every time we mention our job to someone new, this is what we hear in reply. Or something along those lines, anyway. We’ve heard it a million times and we’re tired of answering it.
You make maps? That’s so sad!
The inevitable conversation
BUSTER BLUTH
Buster thinks that blue on the map indicates land, LOLOLOLOL!!!111!!1!one. Okay, we can laugh at Buster for that one, but the harder joke to swallow is the one earlier in the episode, which kind of dismisses cartography because everything has been discovered by Magellan, Cortés, and NASA. Oh well. Cartographers are cool, so we’re Arrested Development fans, which means we’d seen this a million times before you ever sent it to us.
Never hurts to double check.
zOMG AMATEUR CARTOGRAPHERS
Every few months some notable outlet runs a story on the growing interest in things like OpenStreetMap and the ever-increasing accessibility of mapping tools and data. Admittedly these aren’t written for us—and they provide excellent exposure for good things—but they still fly around cartography circles. Hey, we are keenly aware that amateur cartographers are everywhere. Why do you think we get so cranky and act like know-it-alls? Because a million amateurs are going to STEAL OUR JOBS!
What’s next, robots?
Uncharted Territory: The Power of Amateur Cartographers
What Happens When Everyone Makes Maps?
etc.
26 MAPS THAT [WILL BLOW YOUR MIND/EXPLAIN EVERYTHING/ARE PRETTY/ARE FUNNY]
These appear almost monthly now and come in a few flavors, ranging from Buzzfeed nonsense to respectable journalism. They tend to follow a pattern, which is to include:
Trust me, the only satisfaction we get from these is seeing that some of our colleagues made the list. (Those would be the several good maps.) Otherwise we cringe. Also, we’ve already seen all the maps a million times. We’ve seen this 26 million times before.
Look at the title of this post! Ha ha ha!!!
40 maps that explain the world
40 maps they didn’t teach you in school
38 maps you never knew you needed
etc.
SOMETHINGOROTHER AS A SUBWAY MAP
Harry Beck must be rolling over in his grave. The subway map infographic craze was in full swing a couple of years ago, with people “visualizing” all kinds of things using this well-known style. Yeah, it’s a nice visual, but folks: if there isn’t actual topology to show, it shouldn’t be a subway-style map. Among the popular images that get passed around are some magnificent maps that actually make sense (often dealing with transportation, for example Cameron Booth’s maps), and a few of the others are clever enough to be worthwhile, but we cartographers do a lot of eye-rolling at the rest of them. A million eye rolls.
But this is one of the clever ones.
I am in reality only half as rude as the above post may suggest.
count of my blogposts by year: 2008, 136; 2009,81; 2010,46; 2011,16; 2012,4; 2013,2;
— NYGeog (@nygeog) August 8, 2013
Jason Birch: I haven't blogged in two years. Partially because of a technology shift at work away from open source, and partially because our industry is less exciting than it was five years ago, and partially because I'm lazy (and have better things to do in my spare time)
Bill Morris: Because I don't call it GIS blogging
Roger Diercks: I gave up blogging two years ago. I haven't undergone any significant technology shifts at work, but +Jason Birch otherwise squarely hit on exactly the same reasons I pulled the plug. I very much appreciate the engagement from my blog's readers, but blogging began to feel like a chore and less like an enjoyable creative outlet.
[Image: An overview of several cities.]
[Image: The epidemic in Houston.]
"P.s. please stop spelling it #murica or #mericuh or any other variation. It's #MERICA. #northerngirlprobs"In contrast, this erudite tweeter prefers the more guttural 'Murica spelling:
"I don't know Harry, I heard the French are assholes" true statement. Elated to be back in 'Murica"But sadly, there is no consensus around this important issue, which if left unchecked (or at least unmapped) could threaten to undermine the very foundation of the nation. Even more tragic is that someone [2] was so unthoughtful as to bring up this topic on the day in which all 'Mericans/'Muricans should join together in our hatred of everyone who doesn't acknowledge that we're so totally superior to them. As such, we dutifully bring you an investigation of this debate that you may not have even been aware of. You're welcome.
[Image: Source: DOLLY, n = 3702 #StandWithWendy tweets on June 25th and 26th, 2013; darker shading indicates greater intensity.]
[Image: The same tweets, normalized by the total number of tweets sent during the same time period; darker shading indicates greater intensity.]
[Image: Source: DOLLY, n = 3702 #StandWithWendy tweets, normalized by the total number of tweets sent during the same time period.]
Zombies exist, though perhaps not in an entirely literal sense. But the existence, even the outright prevalence, of zombies in the collective social imaginary gives them a ‘realness’, even though a zombie apocalypse has yet to happen. The zombie trope exists as a means through which society can playfully, if somewhat grimly and gruesomely, discover the intricacies of humanity’s relationship with nature and the socially constructed world that emerges from it. In this chapter, we present an analysis of the prevalence of zombies and zombie-related terminology within the geographically grounded parts of cyberspace, known as the geoweb (see also Haklay et al. 2008 and Graham 2010). Just as zombies provide a means to explore, imagine and reconstruct the world around us, so too do the socio-technical practices of the geoweb provide a means for better understanding human society (Shelton et al. 2013; Graham and Zook 2011; Zook et al. 2010; Zook and Graham 2007). In short, looking for and mapping geo-coded references to zombies on the web provides insight on the memes, mechanisms and the macabre of the modern world. Using a series of maps that visualize the virtual geographies of zombies, this chapter seeks to comprehend the ways in which both zombies and the geoweb are simultaneously reflective of and employed in producing new understandings of our world.
[Image: Tweets negatively referring to “dyke”.]
[Image: Hotspots for “wetback” tweets.]
[Image: Screenshot of TileMill.]
One day standing in line for lunch I met two guys who were also in attendance 10 years ago, but we'd never met before. It was the guy in front of me and the guy directly behind me. We ended up eating together. What are the odds of that? It's those little moments of connection that make these events so much fun.
This year I noticed an even more distinct focus on web mapping technologies and web development than usual. With QGIS 2.0 just about to be released, it would have been great to hear a state-of-QGIS talk. I'd also really appreciate more sessions on spatial analysis. Maybe I'll need to sign up for one next time. After all, there is more to geospatial than web development.
The opening plenary was one of the highlights. Erek Dyskant covered the use of FOSS4G technologies behind the Democratic National Committee's recent presidential campaign. A stack of FOSS4G software was deployed, including PostGIS, QGIS and web services. This stack gave all nationwide staff near-real-time access to current campaign data. Field offices were then in a great position to prioritize door-knocking and calling campaigns, and to maximize resources.
Another session of note had an educational focus with papers titled: The New Users, Adapting Web Mapping Curriculum to Open Source Technologies, and Building a Geospatially Competent Workforce with FOSS4G. This was especially interesting for me as I strive to keep my Introduction to Open Source GIS and Web Mapping course current in a rapidly changing field. I also heard valuable updates on MapServer, GeoServer, MapBox, OpenGeo, GDAL/OGR, Leaflet and OpenLayers 3.
The final session was a panel discussion on the use of FOSS4G in state and local governments. It was an interesting frank discussion. On one side it was about the political and bureaucratic hurdles in the way of organizations adopting FOSS4G. On the other were success stories of FOSS4G being utilized in state governments.
The Gala was held at the Mill City Museum in the ruins of the Gold Medal Flour mill on the Mississippi River. A gorgeous site. Seeing voluminous water is a treat coming from drought stricken New Mexico.
Plus I met a bunch of new folks! Kudos to the organizers for putting on another great show. It was a great conference!
************************************************************
END NOTE: If you're a geo-geek and into exercise, you've got to get a Suunto Ambit. Here is the data from my walk back to the conference hotel from the Gala event. I wasn't wearing my heart rate monitor, but it still collects elevation, barometric pressure, GPS position, speed, temperature etc., and allows export to KML. Oh, and you can navigate with it and it has a compass.
[Image: US population counts by county, plus cities with population greater than 250,000.]
[Image: Family size (purple/red = greater than 1 standard deviation above the mean; blue = below; no color = mean to 1 SD above). Both maps are derived from data in Summary Demographic Profile 1.]
[Image: Finally, a way to organize all of these open GIS files!]
There is an ever-expanding ecosystem of geospatial apps for iOS. For this project we are evaluating EPICollect and GIS Pro. EPICollect is a free app designed to collect point data with a custom form. GIS Pro is a very expensive app; however, with the price comes a very intuitive and robust data collection system.
Once data is collected QGIS is used to combine the data with other organizational datasets, conduct spatial analyses and prepare maps. GIS Cloud is being used for final online presentation.
This article presents an overview and initial results of a geoweb analysis designed to provide the foundation for a continued discussion of the potential impacts of ‘big data’ for the practice of critical human geography. While Haklay's (2012) observation that social media content is generated by a small number of ‘outliers’ is correct, we explore alternative methods and conceptual frameworks that might allow for one to overcome the limitations of previous analyses of user-generated geographic information. Though more illustrative than explanatory, the results of our analysis suggest a cautious approach toward the use of the geoweb and big data that are as mindful of their shortcomings as their potential.
Crampton, J.W., M. Graham, A. Poorthuis, T. Shelton, M. Stephens, M.W. Wilson and M. Zook. 2013. Beyond the Geotag: Situating ‘Big Data’ and Leveraging the Potential of the Geoweb. Cartography and Geographic Information Science 40(2): 130-139.
More specifically, we propose five extensions to the typical practice of mapping georeferenced data that we call going ‘beyond the geotag’: (1) going beyond social media that is explicitly geographic; (2) going beyond spatialities of the ‘here and now’; (3) going beyond the proximate; (4) going beyond the human to data produced by bots and automated systems, and (5) going beyond the geoweb itself, by leveraging these sources against ancillary data, such as news reports and census data. We see these extensions of existing methodologies as providing the potential for overcoming existing limitations on the analysis of the geoweb.
The principal case study focuses on the widely reported riots following the University of Kentucky men's basketball team's victory in the 2012 NCAA championship and its manifestation within the geoweb. Drawing upon a database of archived Twitter activity – including all geotagged tweets since December 2011–we analyze the geography of tweets that used a specific hashtag (#LexingtonPoliceScanner) in order to demonstrate the potential application of our methodological and conceptual program. By tracking the social, spatial, and temporal diffusion of this hashtag, we show how large databases of such spatially referenced internet content can be used in a more systematic way for critical social and spatial analysis.
"Arches has been purpose-built for the international cultural heritage field, and can be used to inventory and document all types of immovable heritage, including buildings and other structures, cultural landscapes, heritage ensembles or districts, as well as archaeological sites."On the Arches website, you will also find interesting background information about similar systems being deployed: http://archesproject.org/project-background/. Also be sure to check about their FAQ page: http://archesproject.org/faq/ or view the factsheet.
So, two things. First, we could stand to share knowledge better, cartographers! Everyone is pretty good at sharing code and data these days, but we fall short on sharing the why of things, especially those of us who went to school for this and everything.
Second, an attempt at uncovering the problems with choropleth mapping on the Mercator projection.
Now, perhaps nobody really talks about why small-scale Mercator choropleths are bad because the gist of the reason is intuitive enough: bigger looks like “more,” so any map projection that distorts area (especially as severely as Mercator does) will make some values look exaggerated and will thus be misinterpreted. Size comparison is at the heart of many types of statistical graphics, and obviously relative sizes need to be correct for the whole concept to make any sense at all.
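To put a number on “severely”: Mercator is conformal, with linear scale factor $k(\varphi) = \sec\varphi$ at latitude $\varphi$, so apparent areas are inflated by its square,

$$k^2(\varphi) = \sec^2\varphi,$$

which is already a factor of two at 45° and four at 60° – part of why Greenland can pass for the size of Africa despite being roughly fourteen times smaller.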
Indeed, this sometimes applies to areal mapping, for example “land-use or similar mapping in which a measure of the area occupied by some distribution is crucial to map interpretation” (Muehrcke and Muehrcke, Map Use 3rd ed.). If you need to compare areas, areas cannot be distorted. (Never mind that humans are terrible at estimating and comparing areas of irregular shapes, from what I hear.)
In the typical choropleth map, however, area is not directly the visual variable of interest, and we are not trying to measure it. Still we assume that relative sizes need to be true in order for the map to work. How do we know that? Well, I’m not sure. I flipped through all my cartography textbooks and to my surprise it’s not that I forgot the evidence for this—it’s that they really don’t cite anything on the subject. We accept it on faith and common sense, apparently, although I’d bet a shiny nickel that someone somewhere has done empirical studies to confirm it, or that somewhere buried in How Maps Work is an explanation. Please, if anybody can point me to some of the research behind all this, it would be appreciated!
It turns out, then, that this is not just an internet problem. A textbook education in cartography will not teach you, in scientific terms, why a choropleth Mercator map is worse than a choropleth sinusoidal map or a proportional symbol map. Interpretation of area in quantitative maps gets no quantitative explanation; instead it gets basically the same treatment as propaganda maps and the whole Peters thing, which paraphrased boils down to “bigger things totally look more prominent and important because they’re bigger.” Semiology of Graphics is the only book I have that really addresses size directly and as a matter of fact—noting among other things that “it is not possible to disregard it visually” and “in any map representing areas of unequal size, what is seen is [quantity] multiplied by the size of the area”—but even if he was correct, Bertin was pretty much making things up.
Mentioned more commonly but no more deeply explained is the need to normalize data to account for area in choropleth maps, i.e., not mapping counts. Considering this rule, the projection requirement, and a host of “ideal” enumeration unit characteristics, choropleth mapping just starts to sound like a terrible idea for anything at all. Size variation that is not directly related to numerical variation seems to cause nothing but problems. Danny Dorling’s arguments for cartograms and mapping human phenomena in human space, not geographic space, start to sound appealing.
Too bad cartograms are also kind of awful.
Hence my—and some peers’—disappointment in the most recent “challenge” from the MBTA, Greater Boston’s transit agency. To summarize a somewhat lengthy description page, they are essentially seeking new design ideas for their standard subway map—in the space of three weeks, for free, and with no rights retained by the cartographer. And if you win this contest? You get… um, fleeting glory, apparently.
I want to like the idea. The MBTA carries crippling debt, and as a somewhat regular user of the system I don’t want to see its service diminished or my fares increased, so I applaud any other funding or savings. But—and I’m looking for some kind of “third rail” wordplay here—this time they strike a nerve with those of us who have mapping jobs.
The T has run contests before. The most successful was a few years ago at the dawn of its open data age, resulting in some cool visualizations and interesting apps using schedule data, which shortly thereafter was supplanted by real-time tracking. These previous contests, though, were very much about openness. Yes, the clever angle was to get the community to create products at no cost to the agency, but at least these products were not owned by the agency. And there totally were prizes.
From the outside it’s easy to mistake modern cartography for a free endeavor driven by some desire to improve the world. Indeed, we do have a few altruistic motives, and the latest trends are all about openness: open data, open source code, etc. But even these things are not always free. Free to use, yes, but often enough someone has paid for them to be made in the first place. And this model doesn’t really apply to design. Good design is a part of any project, open or not, but when the job itself is design, we don’t jump at the chance to do it for someone else without compensation just because it’s fun. Like everyone else in the world, we do this to earn a living.
In short, if you can design a subway map that’s good enough for millions of people to use on a daily basis, you are very good at this. Maps are easy. Good maps are not. Your skills are valuable. Make maps for fun when it’s for your own satisfaction or for the causes you champion, but recognize your worth when it’s for others’ satisfaction. And make them recognize your worth, too.
In any case, while we’re on the subject, do enjoy Cameron Booth’s MBTA map redesign—which the MBTA can’t have for free—and Peter Dunn’s time-based map.
I'm still always looking for ways to improve my work space. I've used an ergonomic keyboard/mouse tray for the last 4 years. I try to get up and walk around frequently too. Two years ago, I read this article in the New York Times. I've wanted an adjustable height desk ever since. I then started hearing all the news reports about how unhealthy it is to sit all day. For example, this piece aired on NPR and this blog post came up on Mark's Daily Apple. It's certainly not shocking news. However, it got me thinking about how much sitting I do between work all day and lounging in the living room at night. I started realizing I not only want, but need, a desk that would allow me to stand at least part of the day.
I felt almost immediately that the GeekDesk would be the perfect fit. It was at the right price and it matches my existing office furniture. Just before the holidays I ordered my GeekDesk, figuring I could use the break to get it set up and tear the old one down. Originally, I ordered the GeekDesk v3 in the 47" width. I then realized the GeekDesk Max had a nice feature: it comes with a control pad that allows you to preset four desk heights. That same afternoon I changed my order to the GeekDesk Max in the same size. Once it arrived I realized that the wider one would work better, taking into account my CPU hanger and my space. The folks at GeekDesk were very accommodating with all these changes to my order. In fact, they have some of the best customer service I've ever encountered. In the end, they let me exchange just the parts that differed between the small and large desks: the table top and some braces.
I also purchased a CPU hanger to hold my computer underneath the desk. This keeps the computer with the desk as it raises and lowers. It was easy to install and works great with my HP workstation. I still need to either buy a new Humanscale keyboard tray or cut my existing keyboard track, because it is too long to fit under the desk. Aside from that, my GeekDesk is now completely set up and works great. It will adjust from 23" to 48". I absolutely love it! It's been a seamless transition and it feels very natural to stand. I haven't gotten into any routine yet. I noticed, though, that I seem to prefer standing in the morning while I'm reading and returning emails and surfing the web. Then, while I'm working on more challenging tasks, I tend to sit or use a stool. I now change the height of my desk half a dozen times a day!
If you need only the data for New Jersey, then use this file:
If you need the other states, look in the MOTF FTP link for similar file names with different state abbreviations. That zip file contains a vector shapefile of the estimated storm surge extent, based on field observation data collected through November 11, as well as a depth grid (ESRI raster) at 3-meter resolution. These files were created by interpolating the high water marks (collected by USGS under FEMA mission assignment) into a water surface elevation grid, and then subtracting the ground elevation (3-meter DEM) from the water surface elevations in order to provide estimated water depth over land (inundation).
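That last step is just a cell-by-cell subtraction. A minimal sketch of it, assuming the interpolated water surface and the DEM are already loaded as aligned floating-point numpy arrays in the same vertical datum (the file names here are hypothetical):

import numpy as np

water_surface = np.load("water_surface_elev.npy")  # interpolated from USGS high water marks
ground_elev = np.load("dem_3m.npy")                # 3-meter ground elevation model

depth = water_surface - ground_elev  # estimated water depth over land
depth[depth < 0] = np.nan            # ground above the water surface: not inundated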
Both the NJ Landscape Project habitat data and the Municipalities data are complex. It is hard to pinpoint a rendering issue when the data is very complex. I prepared a reference data set composed of a few multipart features. You can download the reference shapefile here.
I created the polygon features in ArcGIS Desktop as a shapefile. I then imported the shapefile into my PostGIS database twice, once using PG_GEOMETRY & ArcSDE, the other using OGR. I then prepared two WMS layers in GeoServer, one based on the SDE import and one on the OGR import. The SDE import causes GeoServer to improperly render the data. GeoServer can render the OGR import without issue.
I exported the geometries of each table as WKT: hexagons imported using SDE; hexagons imported using OGR. Note that the coordinates are essentially the same, but the “wrappers” around the coordinates are different. You can also tell that it is a rendering issue; in GeoServer’s layer preview mode, you can click on the OpenLayers map to identify a feature. Clicking on one of the erroneous polygons yields no results.
It appears that GeoServer is improperly handling MULTIPOLYGON features. POLYGON features (with multiple parts) are properly rendered. This is also what is being suggested in the support forum.
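To make the distinction concrete, here is a toy sketch of the two WKT wrappers using shapely (the coordinates are made up, not the actual test hexagons):

from shapely import wkt

# One MULTIPOLYGON geometry with two polygon parts: the form GeoServer mishandles.
multi = wkt.loads("MULTIPOLYGON (((0 0, 1 0, 1 1, 0 0)), ((2 2, 3 2, 3 3, 2 2)))")
print(multi.geom_type)  # -> MultiPolygon

# A plain POLYGON (here with an interior ring): rendered without issue.
poly = wkt.loads("POLYGON ((0 0, 4 0, 4 4, 0 4, 0 0), (1 1, 2 1, 2 2, 1 2, 1 1))")
print(poly.geom_type)   # -> Polygon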
One other issue I encountered was that OGR would not import the shapefile without the -nlt flag set.
$ ogr2ogr -f "PostgreSQL" PG:"pg connection string" ./testing.shp -nln test_hexagons_ogr
Warning 1: Geometry to be inserted is of type Multi Polygon, whereas the layer geometry type is Polygon. Insertion is likely to fail
ERROR 1: INSERT command for new feature failed.
ERROR: new row for relation "test_hexagons_ogr" violates check constraint "enforce_geotype_wkb_geometry"
Command: INSERT INTO "test_hexagons_ogr" ("wkb_geometry" , "id") VALUES ('snip'::GEOMETRY, 3) RETURNING "ogc_fid"
ERROR 1: Terminating translation prematurely after failed translation of layer testing (use -skipfailures to skip errors)
$ ogr2ogr -f "PostgreSQL" PG:"pg connection string" ./testing.shp -nln test_hexagons_ogr -nlt POLYGON
$
Several GIS data sets that were brought into PostGIS using ArcGIS 10.0 (with the PG_GEOMETRY keyword) have exhibited some bizarre rendering issues when rendered using GeoServer. Above, the Landscape Project data appears to have some random triangular polygons that stretch beyond New Jersey’s western border. I used ArcGIS to process this layer, as I wanted a Union (ESRI, not PostGIS, terminology) between Landscape and NJ’s municipality layer. ArcGIS Desktop’s method of storing multipart polygons in PostGIS seems valid – the table renders just fine in ArcGIS Desktop and QGIS – but is a little different from what GeoServer’s rendering engine is expecting.
Here’s a “fixed” version that enables GeoServer to render the data correctly.
The “fix” comes in the form of a simple table view. By forcing the geometry field through ST_Force_Collection and then setting the table view up as a new WMS layer, the problem polygons disappear. Frankly, I was surprised this worked, but I assumed that if ArcGIS and QGIS could handle the data, the problem wasn’t the data, but GeoServer’s ability to process it. ST_Force_Collection is just enough to massage the data into a GeoServer-friendly form.
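For reference, here is a minimal sketch of that kind of view, created from Python with psycopg2; the connection string and the table and column names (landscape_muni, the_geom) are hypothetical stand-ins for the real union table:

import psycopg2

# The view simply wraps the geometry in ST_Force_Collection so GeoServer
# reads a geometry collection instead of the multipolygon encoding it
# was mishandling.
conn = psycopg2.connect("dbname=gis user=gis")
cur = conn.cursor()
cur.execute("""
    CREATE OR REPLACE VIEW landscape_muni_fixed AS
    SELECT gid, ST_Force_Collection(the_geom) AS the_geom
    FROM landscape_muni;
""")
conn.commit()
conn.close()

You would then register the view (rather than the base table) as the WMS layer in GeoServer.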
I’m not sure if it’s something to do with ArcGIS’s PostGIS support directly, but I have already experienced issues with weirdness due to inconsistent handling of circular features between ArcGIS, PostGIS and OGR. I’m going to generate a shapefile of some bizarre multipart shapes and see how the shapes are handled via SDE & PG_GEOMETRY as well as OGR to PostGIS. I’ll share my findings here along with the source data.
Two weeks after Sandy, I took another shot from roughly the same spot.
The dunes present in the top photo are completely gone. Many of the shops fared well, with a little broken glass and bent signs, but overall they are intact.
Here’s another panorama of the boardwalk in front of the Music Pier.
Funtown Pier partially destroyed.
A roller coaster at Casino Pier can be seen sitting in the surf. Two buildings at the end of the Pier are entirely gone.
For those that are unfamiliar with the two piers, here’s a before and after.
One bright note: Lucy the Elephant survived!
As much as a decade ago, I remember running into amazingly high resolution aerial imagery of Cambridge, Massachusetts. You could see people in this imagery, which was not so common on the web at the time. I explored Cambridge a bit via the map, as I am wont to do with any map in front of me. I found what looked like some busy spots, identified the famous Harvard University, and so on. It was a strange, unknown place—a city I only knew in person as a collection of buildings glimpsed from highways or from across the river in Boston, where I had been a number of times. It was mostly only a place on a map, and it was up to my imagination to picture what it was like to be there.
Aerial image of Harvard Square, dated 2001 in Google Earth.
Then, some years later, circumstances brought me to Cambridge as a resident. Now a further four years after that, I obviously have a much different image of the city. I love this place, and I’m glad I’ve come to know it well, but there’s no longer any mystique. I kind of miss imaginary Cambridge.
Part of maps’ broad appeal is that they are captivating as canvases for imagination. They can represent lands we’ve never seen, offering a simple lattice of information but requiring us to fill in the gaps in our minds. We can explore maps and “know” places to be as fantastic as our minds will allow. Ultimately, I think, it leads us to explore the places in reality, and it can be shocking when reality doesn’t match our imagined expectations. The shock is not necessarily bad and may even be pleasant (except when, say, imaginary beauty turns out to be a trash-strewn real world); but if you’re like me, you lament the demise of the place your mind invented, even if the reality that supplanted it is better.
As web reference maps move toward less and less abstracted representations of the world, some observers have begun to wonder whether people are losing the interest or need to go out and explore real places and experience them in real life, because Street View can show you exactly what a place looks like, or Twitter maps can tell you exactly what people are talking about there, and so forth. I remain optimistic that modern maps will not be a substitute for reality, but rather will draw people in to experience what they know is happening in different places. The maps of old may have tantalized people with their sea monsters and blank spaces, but people didn’t stop climbing mountains when someone else had mapped their slopes with precision, and I didn’t avoid walking around town because I had already seen people-level aerial photos. Knowing what’s out there is as much of a draw as not knowing.
No, the victim in the march toward realistic maps is not real-world experience; the victim is imagination and a bit of the fun of reading maps. I don’t cease to imagine places when looking at a map. It’s just that my imagination is increasingly accurate. It used to be that for every place in the world there were actually two places: one in my mind and one on the ground. Soon, perhaps, there will be only one.
RIP, the last imaginary place on Earth.
Source: http://img.gawkerassets.com/img/17pobsq6vo22cjpg/xlarge.jpg
Our answer is: it doesn't matter! Here is why:
"Is Google worried that Apple’s defection will substantially reduce its user base, and, consequently, the advertising revenue it gains through maps? Does the search company fear that it could lose its place as the online mapping leader, a position that has long been one of its competitive advantages? Is it concerned that Apple might build a better, more useful maps app?"
Source: http://www.flagstaffshuttle.net/images/GC3.jpg
Magnetometer plot indicating a UXO or ferrous item at 3 m depth (source: www.conepenetration.com)
Approx. 100 m deep sinkhole in Guatemala: The world's largest sinkholes
This course is a perfect mix of theoretical and practical sessions, designed particularly for people in Islamabad who want to build their GIS capacity in evening sessions.
Contact details: pakgis@citypulse.com.pk, 0333 461 490 5. Venue: House 162, Street 91, I-8/4, Islamabad, Pakistan.
Dates and timings: 21-25 May 2012, 06:00 PM to 09:00 PM. Registration fee: Rs. 6,000 (normal), Rs. 5,500 (students).
See Training Schedule>> | Register Here>>
Download Airline Route Mapper
Originally available at http://arm.64hosts.com/
The Queensboro Bridge, site of the famous scene in Woody Allen's 'Manhattan'.
The view north from the Top of the Rock looking at Central Park.
The Dakota on Central Park West.
The old American Radiator Building, built in 1924, with the Empire State Building in the background. This is one of the most magnificent gothic art deco buildings in the city. It's now called the Bryant Park Hotel and is just south of Bryant Park and west of the New York Public Library.
Grand Central Station buzzing with activity on an early weekday morning.
The Seagram Building at Park and 52nd. A classic modern skyscraper built in 1957 and site of a famous scene from 'Breakfast at Tiffany's'.
The famous Flatiron Building at Madison Square Park.
Katz's Delicatessen near 1st Ave and 1st Street on the Lower East Side.
The lunch crowd at Katz's Deli. Site of a famous scene from 'When Harry Met Sally'.
The Brooklyn Bridge.
The route of one day's walk from the Lower East Side across the Manhattan Bridge into Red Hook in Brooklyn.
The view of the Brooklyn Bridge from the Manhattan Bridge.
Download KML file of the Coverage HERE:
An analysis of mapping potential from OrbView-3 images is given below (extracted from GIM International).
Information Content of High-Resolution Satellite Imagery
The information content of OrbView-3 and Ikonos imagery is compared, using the Zonguldak area in Turkey as test area. Although OrbView-3 images are qualitatively slightly inferior to Ikonos panchromatic scenes, they can be used for the generation of topographic maps at scale 1:10,000. However, they are not suited for 1:5,000 mapping, for which scale Ikonos images also show limitations.
In operation since 2004, OrbView-3 is one of the recent very high-resolution space sensors, offering images of 1m panchromatic and 4m multispectral Ground Sampling Distance (GSD). In mapping terms both geometric accuracy and information content are important, but the required geometric accuracy can be reached without difficulty provided that images are not degraded by atmosphere and sun-elevation effects. As a rule of thumb, the GSD should be at least 0.1mm at the map scale, corresponding to scale 1:10,000 for 1m GSD.
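The rule of thumb is simple arithmetic; a small sketch (the 0.1mm-at-map-scale constant is the one quoted above, and the GSD figures for QuickBird and Spot 5 come from the comparison below):

def map_scale_from_gsd(gsd_m, mm_on_map=0.1):
    """Largest map scale denominator supported by a ground sampling distance,
    using the rule of thumb that one pixel should span at least 0.1mm at map scale."""
    return gsd_m / (mm_on_map / 1000.0)

print(map_scale_from_gsd(1.0))   # OrbView-3/Ikonos, 1m GSD -> 10000.0, i.e. 1:10,000
print(map_scale_from_gsd(0.62))  # QuickBird, 0.62m GSD -> 6200.0, i.e. about 1:6,200
print(map_scale_from_gsd(5.0))   # Spot 5, 5m GSD -> 50000.0, i.e. 1:50,000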
Visual Comparison
Examination of information content has to be done by visual inspection (Figure 1). OrbView-3 and Ikonos have approximately the same resolution, but comparison shows that edges are sharper in the Ikonos image and that whilst OrbView-3 shows cars only as blobs, structural elements are visible in Ikonos. The GSD of 0.62m offered by QuickBird enables identification of more detail. On the other hand, the 5m GSD of Spot 5 limits the use of these images to the creation of maps of smaller scale. Buildings are still visible but they cannot be mapped in detail, and sometimes back-gardens will be identified as streets. Many of these differences result from sensor configuration, radiometric resolution, recording conditions and terrain characteristics.
Sensor Configuration
OrbView-3 uses staggered CCD-lines; two CCD-lines are shifted by 0.5 pixels against each other so that the pixel size projected on the ground for nadir view is 2m and adjacent pixels overlap 50% in both directions (Figure 2). The effective GSD resulting from such over-sampled pixels can differ from the nominal GSD of 1m. OrbView-3 takes 2,500 double lines per second, but the satellite footprint speed is 7.1km/sec, which requires continuous change of the view direction to slow down the angular speed. The resulting slowdown factor is 1.4 (Figure 3). The effective GSD as determined by point-spread analysis of sharp edges does not show loss of resolution against the nominal GSD, but it can be manipulated by contrast enhancement.
Radiometric Resolution
OrbView-3, Ikonos and QuickBird have a radiometric resolution of 11bit, with which 2,048 grey values can be represented. However, the grey values within one scene will not cover the whole range, and a qualified change from 11bit to 8bit grey values does not lead to significant loss of information. Only in some crucial areas do differences appear between the original 11bit and the derived 8bit grey values. Figure 4 shows more details in the roof in the original 11bit image than in its 8bit counterpart. This may be important for automatic image matching, but for mapping purposes it is unimportant because in both cases the building can be sufficiently well identified in all required detail.
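A "qualified change" here presumably means stretching the occupied part of the 11-bit range into 8 bits rather than simply truncating the low bits; one minimal way to do that reduction in numpy:

import numpy as np

def to_8bit(img11):
    """Linearly stretch an 11-bit image's occupied grey-value range
    (up to 2,048 levels) into the 8-bit range 0-255."""
    lo, hi = int(img11.min()), int(img11.max())
    return np.round((img11.astype(np.float64) - lo) / (hi - lo) * 255).astype(np.uint8)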
Recording Conditions
Haze, clouds and smoke may reduce contrast; enhancement is possible but the resulting image quality will not approach that of images taken under optimal conditions. Sun elevation and azimuth cause shadows that hinder identification of details (Figure 5). With a sun elevation angle of 63°, shadows in the OrbView-3 image are not as long as in the Ikonos image with a sun elevation angle of 41°. Shadows cause identification problems in scenes with narrow streets, high buildings and terrain inclination, as is the case in the north of the Zonguldak area, but sometimes shadows may support object identification. For example, a helicopter landing-pad might at first sight look like a roof, but missing shadow may indicate that it is on the same level as surrounding grassland.
Terrain Characteristics
Contrast is the dominant component of image interpretation, but identification of objects also depends on their characteristics. Planned areas, with larger, well-arranged buildings, can be more easily mapped than unplanned areas with smaller and irregular objects, especially when the latter occur in hilly terrain (Figure 6). Identification of objects in planned areas does not result in significant differences between OrbView-3 and Ikonos panchromatic images, while in unplanned areas the better image quality of Ikonos resulted in a larger number of identified objects. Not every building has a rectangular shape and, particularly in hilly terrain, walls may not be parallel. Figure 7 shows a building of irregular shape (a), a rectangular building (c) and a low building throwing little shadow (b). The latter has not been identified during the mapping exercise, mainly because of missing shadow. OrbView-3 cannot take panchromatic and colour images simultaneously as do Ikonos and QuickBird, so no direct pan-sharpening was possible. Mapping with pan-sharpened Ikonos and QuickBird images simplified object identification, but this does not mean that many more objects could be identified; the increase in their number was insignificant.
Results
Table 1 summarises the detection (DET) and recognition (REC) possibilities of features and objects in OrbView-3 and Ikonos imagery. Figure 8 shows maps created from panchromatic OrbView-3 and Ikonos images. All buildings and nearly all roads have been recognised in the Ikonos image; a few roads in shadowy areas have not been recognised. In the OrbView-3 mapping 93% of the buildings and 96% of the roads mapped with Ikonos are seen, while only 33% of the pavements could be identified. These results demonstrate that OrbView-3 images are well suited for creation of 1:10,000 topographic maps.
Biography of the Author(s)
Huseyin Topan is a PhD candidate in geodesy and photogrammetry at İstanbul Technical University, Turkey. His main research direction is the information content and geometry of high-resolution space imagery.
Gürcan Büyüksalih is professor in Photogrammetry at Zonguldak Karaelmas University, Turkey. He received his PhD from the University of Glasgow, UK, Department of Geography and Topographic Science. His research direction is the full range of photogrammetry, especially application of space imagery.
Karsten Jacobsen received a PhD in Photogrammetry from Leibniz University, Hanover, Germany. He is academic director of the Institute of Photogrammetry and Geo-information at the same university. His main research area is numerical photogrammetry, especially the use of space imagery.
.::Click HERE to request for this data set::.
A couple of our favorite cartographers are out there now rounding up work from all of our other favorite cartographers. If you’ve got a map to show off, submit it for consideration! If you know people who have maps to show off, encourage them to submit! The deadline is February 24; see all the instructions on the Atlas site.
Do it!
Administrative Boundaries Maps:
Balochistan
Union Council Map (Admin level 4) of District Jaffarabad – Balochistan
Union Council Map (Admin level 4) of District Killa Saifullah – Balochistan
Union Council Map (Admin level 4) of District Kohlu – Balochistan
Union Council Map (Admin level 4) of District Loralai – Balochistan
Union Council Map (Admin level 4) of District Nasirabad – Balochistan
Union Council Map (Admin level 4) of District Sibi – Balochistan
Khyber Pakhtunkhwa
Union Council Map (Admin level 4) of District Abbottabad – KPK
Union Council Map (Admin level 4) of District Bannu – KPK
Union Council Map (Admin level 4) of District Batagram – KPK
Union Council Map (Admin level 4) of District Buner – KPK
Union Council Map (Admin level 4) of District Charsadda – KPK
Union Council Map (Admin level 4) of District Chitral – KPK
Union Council Map (Admin level 4) of District Dera Ismail Khan – KPK
Union Council Map (Admin level 4) of District Hangu – KPK
Union Council Map (Admin level 4) of District Haripur – KPK
Union Council Map (Admin level 4) of District Karak – KPK
Union Council Map (Admin level 4) of District Kohat – KPK
Union Council Map (Admin level 4) of District Kohistan – KPK
Union Council Map (Admin level 4) of District Lakki Marwat – KPK
Union Council Map (Admin level 4) of District Lower Dir – KPK
Union Council Map (Admin level 4) of District Malakand P.a. – KPK
Union Council Map (Admin level 4) of District Mansehra – KPK
Union Council Map (Admin level 4) of District Mardan – KPK
Union Council Map (Admin level 4) of District Nowshera – KPK
Union Council Map (Admin level 4) of District Peshawar – KPK
Union Council Map (Admin level 4) of District Shangla – KPK
Union Council Map (Admin level 4) of District Swabi – KPK
Union Council Map (Admin level 4) of District Swat – KPK
Union Council Map (Admin level 4) of District Tank – KPK
Union Council Map (Admin level 4) of District Upper Dir – KPK
Punjab
Union Council Map (Admin level 4) of District Bahawalpur – Punjab
Union Council Map (Admin level 4) of District Bhakkar – Punjab
Union Council Map (Admin level 4) of District Dera Ghazi Khan – Punjab
Union Council Map (Admin level 4) of District Jhang – Punjab
Union Council Map (Admin level 4) of District Khushab – Punjab
Union Council Map (Admin level 4) of District Layyah – Punjab
Union Council Map (Admin level 4) of District Mianwali – Punjab
Union Council Map (Admin level 4) of District Multan – Punjab
Union Council Map (Admin level 4) of District Muzaffargarh – Punjab
Union Council Map (Admin level 4) of District Rahim Yar Khan – Punjab
Union Council Map (Admin level 4) of District Rajanpur – Punjab
Sindh
Union Council Map (Admin level 4) of District Dadu – Sindh
Union Council Map (Admin level 4) of District Ghotki – Sindh
Union Council Map (Admin level 4) of District Hyderabad – Sindh
Union Council Map (Admin level 4) of District Jacobabad – Sindh
Union Council Map (Admin level 4) of District Jamshoro – Sindh
Union Council Map (Admin level 4) of District Kashmore – Sindh
Union Council Map (Admin level 4) of District Khairpur – Sindh
Union Council Map (Admin level 4) of District Larkana – Sindh
Union Council Map (Admin level 4) of District Matiari – Sindh
Union Council Map (Admin level 4) of District Naushahro Feroze – Sindh
Union Council Map (Admin level 4) of District Qambar Shahdadkot – Sindh
Union Council Map (Admin level 4) of District Shaheed Benazirabad – Sindh
Union Council Map (Admin level 4) of District Shikarpur – Sindh
Union Council Map (Admin level 4) of District Sukkur – Sindh
Union Council Map (Admin level 4) of District Tando Muhammad Khan – Sindh
Union Council Map (Admin level 4) of District Thatta – Sindh
Urban Union Councils:
Urban Area Maps:
.::Click HERE to request for this data set::.
Special thanks to our very valued contributor for Lahore.
.::Click HERE to request for this data set::.
Chicago letterpress: Two-color prints of the downtown area, with a light blue background on the lake and rivers and either blue or black ink for the text. An addition compared with the poster prints is the inclusion of the ‘L’ transit lines.
San Francisco letterpress (2nd edition): In either blue or black ink, this one features a waterline effect around the city.
Manhattan letterpress: Two sections, upper and lower Manhattan. Available individually or as a set; with careful cutting you could splice them together and everything will properly line up.
Madison, Wisconsin: The old Axis Maps stomping grounds and home of our graduate institution, the University of Wisconsin. This one is a regular offset print and covers the isthmus and university areas.
Besides those we’ve got our old standard posters: Washington DC, New York, San Francisco, Chicago, and Boston.
So there it is. Get in any orders by this Friday to ensure delivery by Christmas!
Back in 2007, or maybe 2008, I agonized over choosing a domain name. Those were wild days, a time when we all had to try to compete with the more badass names of our friends’ websites (e.g., or really i.e., indiemaps). Eventually I settled on Cartogrammar for its mild wordplay. It was about the grammar of cartography or some such nonsense. It never was a cool name and I never did invent a meaning for it, but even worse is that it sounds like it has to do with cartograms, which I kind of hate. And why try to give myself branding, anyway? I’m already part of a company that has a name. Using my own name for a domain name seemed dull a few years ago, but now dot-comming myself just seems to make sense.
No bookmarks or anything are dying here. The new domain simply points to the same place as cartogrammar.com, so everything continues to work as usual. Just wanted to note that I’m dropping the Cartogrammar name from the site and that from now on I prefer to link to andywoodruff.com instead. (While I was making these changes, by the way, I also made some updates to my portfolio page.)
See you in hell, Cartogrammar!
Title: Land Capability Map of Sind.
Publication year: 1978
Publisher(s): Soil Survey of Pakistan, under the direction of Dr. M. Bashir Choudhri, Director General.
Language: English
Coordinates: N28°10 - N23°50 °55 - E71°10
Scale: 1:1,000,000
Title: Soil Map of Sind.
Publication year: 1978
Publisher(s): Soil Survey of Pakistan, under the direction of Dr. M. Bashir Choudhri, Director General.
Language: English
Coordinates: N28°10 - N23°50 °55 - E71°10
Scale: 1:1,000,000
Title: Soil Map of the Punjab.
Publication year: 1978
Publisher(s): Soil Survey of Pakistan, under the direction of Dr. M. Bashir Choudhri, Director General.
Language: English
Coordinates: N34°0 - N27°50 °10 - E75°10
Scale: 1:1,000,000
Title: Land Capability Map of the Punjab.
Publication year: 1978
Publisher(s): Soil Survey of Pakistan, under the direction of Dr. M. Bashir Choudhri, Director General.
Language: English
Coordinates: N34°0 - N27°50 °10 - E75°10
Scale: 1:1,000,000
I’ve recently returned from the annual meeting of the North American Cartographic Information Society in my old stomping grounds of Madison, Wisconsin. I’ve mentioned NACIS here in the past. It’s a wonderful organization and it holds the best conference ever.
While I will recap some of the conference (which was very good this year), this time I’ve been thinking about it as a good representation of the state of American cartography. Even if you don’t care about the conference, bear with me as I hit on a few of its points and contemplate their significance to the field.
How does design make a difference?
This was the tagline of the conference, and I’m not sure there was much of an answer. It’s not an easy question, really. We all agree that good design can make a difference over bad design, but what is design? Can we make maps with an absence of design, and if so what difference does design make over non-design?
Let’s assume there is some agreed-upon definition of “design” and think about what it means that this was the theme of the conference. In an era when it’s not always clear what a “cartographer” is, here is a core group of self-identified cartographers identifying themselves as designers. I’m among them and have encountered surprise when describing cartography to the uninitiated as by and large a design practice. Maybe now that anyone is a mapmaker, this attitude is what defines cartography. Maybe that’s how design makes a difference. Cartography isn’t making a map; it’s designing a map.
Art in cartography
Or maybe a cartographer is an artist. Tim Wallace organized a session on art in modern cartography, a topic that has come up many times over the years but this time stemmed from a series of blog posts that Tim instigated this past spring.
It continues to be an interesting debate because of its technological facets. Daniel Huffman argued for the art in “human cartography,” lamenting computer automation, which to be honest I see as a bit of a straw man. Aaron Straup Cope, if I am not misinterpreting his points, noted that newfangled ubiquitous, easy mapping creates more room for artistic cartography now that we don’t need to put all our efforts toward painstakingly accurate maps for navigation and the like.
Practical Cartography Day
The main NACIS conference is preceded by a day of more workshoppy talks, which this time I think comprised a representative slice of modern cartography. There was some of the usual fare, tips for traditional print or desktop cartography such as Alex Tait’s top ten reference cheat sheets. But nearly half the talks dealt with web cartography, with several hot shots covering hot topics. They included Nathaniel Vaughn Kelso of Stamen, AJ Ashton of Development Seed (I mean, have you read anything about web cartography lately that doesn’t mention TileMill?), Adam DuVander of the Map Scripting book, and my good pals Jeremy White of the New York Times and (with a presentation that alone was worth the price of admission) Zachary Forest Johnson of GeoIQ and other fame.
Thanks to cool guys Tim Wallace and Sam Pepple for crafting this session so well!
The new crowd
Speaking of those guys, in the six years that I’ve known NACIS I’ve been pleased to see how the membership has evolved to better reflect the reality of modern cartography. At the 2006 NACIS meeting, which was also in Madison and was the first one I attended, Schuyler Erle was invited to give a keynote address. He spoke, as was his wont, about the democratized cartography afforded by things like the still young Google Maps. Listening to the murmurs around the room, one could hear that many of the old school cartographers—the core constituency of NACIS—were appalled by the idea of amateur non-cartographers making maps. But now we seem to welcome these types, as it’s been proven that some of the best cartography is coming from people without cartography backgrounds but rather, often, web backgrounds. It is excellent to see, for instance, Messrs. Cope (who is “from the Internet”) and Migurski (who gave the keynote two years ago) from Stamen showing up among the “mainstream” cartographers, if that’s the right word. Even almighty Google now has a presence.
Meanwhile, I’m looking forward to seeing how my generation of bona fide cartographers helps shape the community. We’re the ones who were trained in cartography during this explosive period of web mapping, which perhaps gives us a different perspective on the field from that of the more established cartographers. NACIS meetings are attended by a fair number of students as well as people like me who are only a few years out of school, and some of them already have pretty strong and active voices.
Teaching cartography
So far in this post I’ve mostly ignored the academic side of cartography, and I should mention that NACIS comprises a mix of professionals and academics. For me the most fascinating session at this year’s conference was one that brought together both types: a panel discussion on teaching cartography. It sounds ridiculous, but I’ve never had such an easy time staying awake at a conference session. Many topics and challenges were discussed, like teaching software versus teaching concepts and thematic versus reference mapping. (Also, glad that panelist, Harvard scholar, and new local carto/drinking buddy Kirk Goldsberry was dragged to the conference for this.) But at a week’s removal, what’s really fascinating is my fascination itself. I sat there, engrossed in the discussions, kind of wondering why I, not being a cartography teacher, was so interested. Perhaps it’s just reflection on my own roots and where my education was good and where it was lacking. But more likely it’s that cartography is—and I don’t care if this sounds pathetic—my essence, and I care a lot about how it is taught or otherwise instilled in others. It matters to all of us who make maps in this time when, as I noted before, we’re not even sure what a cartographer is. However we arrived at map-making, let’s think about what people need to learn to practice the craft and how it can be taught.
Best week of the year
One of my happiest days a couple of years ago was when the top search term directing people to my website was “drinking in a bathtub,” which brought visitors to a post about a previous NACIS conference. I have certainly been much more serious this time, but don’t let that distract from the fact that NACIS is simply the best time you will ever have at a conference, especially if it’s in Madison. NACIS truly is a community, where the people you meet are more like friends than professional contacts. The conference organizers do an amazing job of establishing a productive but fun environment. (I want to thank them profusely but don’t want to list names for fear of leaving someone out. If you’re a current or future NACIS attendee you’ll know them.) The schmoozing is easy, and there is a healthy drinking culture among cartographers (I’d like to think that we at UW-Madison were pioneers in that area).
Consider it plugged. NACIS is awesome. Cartography is awesome.
The map viewer acts similarly to other ArcGIS programs such as ArcMap: it draws features using their location information and stores any attribute information included with the data. By including the attribute information, the map viewer allows you to create pop-up windows for your map based on this data.
To add a layer from a file from the map viewer:
– Click “Add”
– Click “Add Layer from file”
– Navigate to the file location on your computer
– Click “Import Layer”
If you are using Mozilla Firefox or Google Chrome, you can import the data directly by dragging the file from your computer and dropping it into the map.
There are a few things to keep in mind when adding features from a file:
– Delimited text files must have latitude and longitude information stored in separate fields
– The map viewer will display the first 1000 waypoint, track, and/or route features from a GPS exchange file
– You must compress the shapefile into a .zip file containing the .shp, .shx, .dbf and .prj files that comprise the shapefile. In addition, the zip file and shapefile must have the same name, and the zip must be under 1 MB, since larger datasets will experience poor performance (a packaging sketch follows this list)
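For example, here is a minimal packaging sketch in Python; "parcels" is a hypothetical shapefile base name:

import zipfile

# The zip must carry the same base name as the shapefile and include all
# four component files; ZIP_DEFLATED compression helps stay under 1 MB.
name = "parcels"
with zipfile.ZipFile(name + ".zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for ext in (".shp", ".shx", ".dbf", ".prj"):
        zf.write(name + ext)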
Interested in learning more? GeoSpatial Training Services is offering an introductory course on the basics of working with ArcGIS Online.
The Python GIS Programming Bootcamp will comprise 3 courses scheduled back to back, with a one-week break between each.
Each of the courses is instructor guided and self-paced. You also have access to all course materials for an entire year so if you need extra time it’s not a problem!
We are going to limit this first session to 20 participants so don’t wait to register. We expect it to fill very quickly given the popularity of our GIS Programming 101 for ArcGIS 10 course.
Suggested pre-requisite: GIS Programming 101 for ArcGIS 10: Mastering Python
The three courses that will be part of this bootcamp are listed below:
Intermediate Python Programming Concepts for GIS Programmers
Course dates: October 17th – October 28th
Open Source GIS Programming with Python
Course dates: November 7th – November 18th
Advanced ArcGIS Programming with Python
Course Dates: November 28th – December 9th
Bootcamp Price is $697
Don’t want to take all 3 courses but are still interested in one or two? You can register for them individually.
Hurricane Irene, which hit the East Coast of the U.S. over the weekend of August 27 - 28 2011, produced large amounts of flooding in some inland areas such as the state of Vermont (its worst natural disaster since 1927), while leaving other areas such as the Connecticut River valley of Massachusetts relatively unscathed. The Berkshire Mountains of western Massachusetts received between 6 and 8 inches of rain from Irene. In particular, the picturesque town of Shelburne Falls on the Deerfield River, which averages 4.4 inches of rain for the entire month of August, had to be evacuated due to flooding.
There are already lots of maps out there depicting Hurricane Irene and its effects, such as this New York Times map, but many of us would like to play with the data itself. Prompted by a question about local precipitation data, I located downloadable shapefiles from the National Weather Service's Advanced Hydrologic Prediction Service and put them together with storm track information from the National Hurricane Center to produce this map:
The precipitation data from the National Weather Service is provided as point shapefiles on a 4.3-km grid (rather than as rasters, possibly because this lets them leave out points with zero precipitation). The files with the shortest time period are for the "hydrologic day", ending at 12 Noon GMT. Two days of data were merged with ArcToolbox > Data Management Tools > General > Merge, and then the precipitation from coincident locations (sharing unique IDs and coordinates) was summed using ArcToolbox > Data Management Tools > Generalization > Dissolve (note that this short-changes North Carolina totals). The hurricane data is provided as a Google Earth KML document, which was converted to a table using GPS Visualizer and imported. The hurricane symbols are built from Character Marker Symbols using ESRI's Climate and Precipitation and Geometric Symbols fonts.
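For anyone scripting the same two geoprocessing steps, they look roughly like this in arcpy; the shapefile and field names are hypothetical stand-ins for the NWS files:

import arcpy

# Merge the two hydrologic-day point shapefiles into one feature class.
arcpy.Merge_management(["precip_day1.shp", "precip_day2.shp"], "precip_merged.shp")

# Collapse coincident grid points (shared unique ID) into single features,
# summing the two days' precipitation values.
arcpy.Dissolve_management("precip_merged.shp", "precip_2day.shp",
                          dissolve_field="GRID_ID",
                          statistics_fields=[["PRECIP", "SUM"]])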
Our self-paced GIS Training provides a fast, easy, inexpensive, and highly effective method for acquiring GIS skills. Here are some of the key features of our self-paced training:
Wiki Page: http://wiki.crisiscommons.org/
http://irene.betanyc.org/
Ushahidi Instance: http://irene.tethr.org/
Data and services being hosted through ArcGIS.com Irene group.
http://www.arcgis.com/home/
Esri Cyclone Map
For the past three years, during the month of July, I’ve made it a point to visit the Disneyland in San Diego. I am sure you are already questioning my sanity. “Really, Disneyland in San Diego?”
Well, yes. As all geo-geeks would agree, the International User Conference organized by Esri (it's "Ezree") in San Diego is definitely the "Disneyland" that attracts 14,000 geo-sapiens from all over the world. Here, the various geospatial offerings will take you from Alice in Wonderland to Space Mountain: "adventure(Is)land" is where you try out different geospatial software, and a "fantasyland" of a map gallery is filled with maps beyond imagination.
This year's conference provided an opportunity for a few of our crisismappers to meet. The group comprised members from Esri, Ushahidi, GISCorps and SBTF. With increased partnerships come more ideas and opportunities. An interesting relationship that blossomed during the meet-up was a possible partnership with GISCorps, a 10-year-old, dynamic group of professional volunteers providing GIS support to organizations dealing with crises. SBTF can leverage their GIS expertise in developing analysis capabilities to expand the amount of information made available to ground teams.
And of course, there's Esri! These folks are dedicated to improving the analysis of crowdsourced information collected at the time of a crisis. During several crises, including Haiti and Japan, Esri extended their support by offering data and publishing social media maps. Discussions with the Esri team have led to concrete steps to be taken towards improving interoperability of Ushahidi with the ArcGIS platform and also establishing a focus group of Ushahidi/GIS users. Members of SBTF's analysis team will be involved in the focus group.
In addition to being a great place for exchanging ideas, the User Conference allowed us to explore cross-applications of GIS and draw inspiration from such projects from around the world. The Ushahidi team’s blog post is one such example that was inspired by GIS for the War-Fighter application. Other useful takeaways for SBTF related to GIS tools are:
ArcGIS Online is a place to access and visualize maps and data created by GIS users. There are tons of data, maps, and applications posted by various federal and private organizations. Newbies to GIS can create their own map that can then be viewed on browser, desktop, or mobile platforms. The maps can also be shared via blog or email, or embedded in a website. The tools are highly intuitive; I encourage you to go create your first map today and become an official geo-geek! I believe that the crisismapping community can leverage this platform as a data repository to facilitate geospatial data exchange between agencies.
The option to create private groups is available, which helps maintain integrity and security in data transactions. When a crisis response deployment demands added security, such as a firewall, Esri's Portal for ArcGIS (due to be released soon) helps. It is simply ArcGIS Online behind a firewall, that is, data residing in a private cloud.
GeoCommons is, to quote their website, "a public community of geo users building open repository of data and maps for the world." This is yet another free platform to create, share, and analyze maps & data.
ArcGIS Explorer is another free product from the Esri family. Similar in functionality to ArcGIS Online, the Explorer provides free versions for both desktop and online, and enables creation of visually rich map presentations. It is important for Crisis Mappers to create powerful stories, now made possible by this application.
Community Maps Program is an initiative by Esri that provides most of the thematic data found in ArcGIS Online. Organizations interested in making their data content broadly available, from local governments to humanitarian aid agencies, utilize this program. I call this Esri's crowdsourcing model for data collection, though it is open only to the enterprise crowd.
High resolution satellite imagery and LIDAR were other hot topics discussed at the conference that SBTF could utilize for analysis during humanitarian crisis response deployments.
My affiliation with SBTF made this year's user conference awe-inspiring because I met fellow crisismappers who once were strangers, got together as a team, brainstormed, argued, supported, developed wonderful friendships, and joined hands in the quest to save lives. I realized the significance of Patrick Meier's (Co-founder, International Network of CrisisMappers) words: "Changing the world, one map at a time!"
Special thanks to Jessica Heinzelman, a fellow crisismapper, for sharing her thoughts and presentation at Esri User Conference.
Initially people often wonder:
The other main factor is unfamiliarity. Over the last two summers I've taught a semester-long course called Introduction to Open Source GIS at the local community college. The GIS program at my school, like most, is ESRI-centric. A majority of the students are very surprised to learn about the broad array of FOSS GIS software. Once exposed to FOSS software, such as QGIS, they ask, "Why doesn't everyone use it?" It comes down to a combination of these two factors.
In full disclosure, I also use ArcGIS almost every day. However, I also utilize all the leading FOSS GIS software. I consider them all valuable tools in my toolkit. One of the nice features of FOSS GIS software is that it's free. So there is absolutely nothing preventing you from downloading a FOSS GIS package and trying it out. If it doesn't meet your needs just uninstall it. My hope is to inspire people to do exactly this.
This year the FOSS4G Conference is in Denver, CO and very accessible to those of us in the USA. The timing couldn't be better to learn more about FOSS4G. While FOSS GIS software has been around since the 1980s, recent years have seen the software become much more mature and user-friendly. There are great FOSS GIS products for the desktop, web server, web client, spatial database and mobile GIS. There are now intuitive Windows installers for all the leading packages.
So, if you have questions like:
you should sign up for the Introduction to Geospatial Open Source at this year's FOSS4G conference.
Hope to see you there!
]]>This map has been developed and generously shared by Mohammad Ihsan Afridi, Geologist at the Pakistan Mineral Development Corporation, Peshawar, Pakistan.
Base data for this map were collected from the meteorological department, some from the Engineering Council (as the building design code of Pakistan), and the rest were digitized from the published map of the Geological Survey of Pakistan.
]]>To help you get started with building your ArcGIS Server Viewer applications, we are releasing Introduction to the ArcGIS Viewer for Flex as a free learning module. This is the first module in our Programming the ArcGIS Server API for Flex course. This has been a very popular course over the past two years and has now been updated for version 2.4 of the Flex API. The next session begins August 29th and includes the course modules you see below. This is a self-paced, web-based, instructor-guided course. One year of access to all course materials is included so you can review as necessary. This also includes any course updates that occur during the year.
The free module also includes 5 exercises. These exercises are in PDF format and can be found below the list of course modules.
Exercises for Introduction to the ArcGIS Server Viewer for Flex:
This course is broken into two sections: Introduction to VBA for ArcMap and Introduction to ArcObjects. They are meant to be taken in sequence. Exercises for the course come from the book Getting to Know ArcObjects which you’ll need to purchase if you intend to complete the exercises. However, the lectures provided above are free for all to use.
For more information on other courses please visit our website.
The first session of our newest instructor-guided, Internet-based Virtual GIS Classroom course, “Introduction to Managing ArcSDE with SQL Server,” will be delivered beginning September 26th. Learn how to configure SQL Server for ArcSDE, configure and optimize ArcSDE, perform database backup and recovery, manage ArcSDE, store and manage vector and raster data, and manage versioned and non-versioned databases.
Course Modules
This course is also now a part of our ArcGIS Server Bootcamp. The next session of the bootcamp begins September 12th. Register by August 15th for the pre-registration price of $615.00
Introduction to Web Development
Do you want to start developing web applications but aren’t sure where to begin? This introductory level course will introduce you to the basic web concepts you need to understand BEFORE you start developing web applications. This course will give you a solid understanding of things like web servers, IP and DNS addresses, dynamic versus static web pages, web hosting, HTML, CSS, and web programming languages including JavaScript. We highly recommend that you take this introductory level course as a pre-requisite to our Mastering the ArcGIS Server JavaScript API, Programming the ArcGIS Server API for Flex, Programming ArcGIS Server with Silverlight, Introduction to Programming the Google Maps API, and Introduction to OpenLayers Programming courses.
Modules
Module 1: Introduction to Web Development
Module 2: Web Architecture
Module 3: Identifying Computers by IP and DNS
Module 4: Static versus Dynamic Web Pages
Module 5: Web Hosting
Module 6: Introduction to HTML
Module 7: Introduction to Cascading Style Sheets (CSS)
Module 8: Design and Planning for the Web
Module 9: HTML Editors
Module 10: Tools for Creating Websites
Module 11: Web Programming Languages
Module 12: Introduction to JavaScript
This course will be available September 15th. Pre-purchase now and save $30.00
Introduction to ArcGIS Online
ArcGIS Online is a website that is created and maintained by ESRI and is dedicated to working with maps and data. The website acts like an online warehouse of maps, data, applications, and tools provided by ESRI, ESRI partners, and the GIS community. You can access the website at http://www.arcgis.com/home. In this course you will be introduced to ArcGIS Online and learn how to search for existing maps, access basemaps, create maps, access online tasks, share content and maps, and use the Community Maps Program.
Modules
Module 1: Introduction to ArcGIS Online
Module 2: Searching ArcGIS Online and Opening Maps
Module 3: Basemaps
Module 4: Creating Maps Using ArcGIS Online
Module 5: ArcGIS Online Tasks
Module 6: Sharing Content and Maps
Module 7: Community Maps Program and ArcGIS Online Groups
This course will be available September 1st. Pre-purchase now and save $30.00
This course is now part of our ArcGIS 10 Bundle. Save $100 on this bundle during the month of August when you enter the discount code ‘agisbundle‘ (no quotes) when purchasing online.
Our self-paced GIS training provides a fast, easy, inexpensive, and highly effective method for acquiring GIS skills. Here are some of the key features of our self-paced training:
Traditional face-to-face training
Web-based, instructor-guided training
The first ever Monday Mapping Meetup will be held at 9am on Monday, September 19th, at the Map Room in Chicago! This is a free, informal, coffee-laced gathering for NPO and community-group staff interested in learning more about community mapmaking. We'll look at free and user-friendly tools and data sources - like Google Maps, US Census Datasets, and GeoCommons - that other groups are using to build powerful online "mapplications" in their neighborhoods and cities. Bring a laptop (optional) and your map-related questions and ideas!
Although this event is free, we're asking all attendees to buy a coffee, juice, and/or bagel to thank the Map Room for graciously hosting our meetup! Map Room is a short walk east of the Armitage Blue Line stop; bike parking is available. Questions? Contact us.
]]>
To date we have had 731 respondents. Here are some of the highlights:
You can get all the results here.
All of these areas represent potential wilderness, and they would all be opened to road-building and off-road vehicle use, with the impacts those bring. Not only would existing protections for these areas be reversed, but future administrations would be prevented from ever protecting Wilderness Study Areas or unroaded Forest Service lands.
For more information go to: PEW Environmental Group
]]>This fall Bird's Eye View (with the support of the GeoTech Center) will be holding the FOSS4G Workshop for Educators at the Free and Open Source Software for Geospatial Conference (FOSS4G) in Denver, Colorado. This is exciting for at least two reasons. Having the FOSS4G Conference in North America, let alone the United States, is fairly uncommon. In recent years it has been held in Australia, South Africa and Spain. Secondly, the workshop will premier one of the only FOSS GIS curricula in the United States. Entitled Introduction to Open Source GIS and Web Mapping, it is currently being taught at Central New Mexico Community College.
Free and open source software comprises one of the fastest evolving sectors of GIS. While FOSS GIS software has been around since the 1980s, recent years have seen the software become much more mature and user friendly. There are great FOSS GIS products for the desktop, web server, web client, spatial database, and mobile GIS. Historically, ease of access and installation has been a major hurdle for those wanting to transition to FOSS GIS software. Now there are intuitive Windows installers for all the leading packages.
The course is expected to become increasingly important to the CNM program. In New Mexico, employers are starting to favor applicants with knowledge of both ESRI and FOSS applications. This is in part due to the economic times. Students at CNM and elsewhere learn GIS in pure ESRI environments. Most are shocked to discover how many capable FOSS GIS software packages exist.
The course sticks to a pure FOSS paradigm. For example, assignments and lectures are provided in Open Office versus Microsoft Office. The students are not introduced to much new GIS material in the course, save web mapping. Rather they are shown how to do things they have learned in other foundational courses using FOSS GIS software. The packages used include: Quantum GIS, GRASS GIS, GDAL/OGR, SpatiaLite, PostgreSQL/PostGIS, and MapServer. They are also introduced to open standards and open data. Midway through the semester they are given a final project assignment. For this they research a FOSS GIS package not being covered in the course lab, and during the last week of class they present their findings to the class. This exposes the students to a large number of new tools.
The web mapping portion is an introduction to web mapping and the web in general. Part of the overall goal for the course is to make it accessible to students who have completed the Introduction to GIS course. So, this course has no programming requirement. Google Maps (although not open source) is used as a gentle introduction to web mapping. Then students move on to labs where they use MapServer to create basic web mapping applications.
The workshop this fall will target educators wanting to incorporate FOSS GIS into their curricula, or those who are just curious about what FOSS GIS is and what it can do. The course goals, readings, labs and exam structures will be shared. Attendees will also get to try their hand at a lab or two. For more information visit the conference workshop page.
]]>The PakGIS project was started to support researchers, students, and professionals in their work and studies by providing geo-data that is generally not available free of cost.
As the first and biggest forum for free GIS data provision, we now receive a growing number of GIS data requests daily from all over the world.
How can you contribute?
For the sustainability of the initiative, we are looking for home-based volunteers for the following tasks:
We can train you in any GIS processing a task requires. All we need are dedicated, trustworthy people committed to the cause.
Benefits
Join PakGIS now by registering HERE
Please spread this message through email, Facebook, or other channels among your juniors and friends interested in GIS.
]]>For a $100 annual fee, interested and prospective users can purchase ArcGIS for Home Use, designated for noncommercial personal use only. The Home Use 12-month term license includes:
While available worldwide, US residents can order online and the rest of the world can purchase through local distributors. Online ordering for US customers opens July 11, 2011.
The ArcGIS puppy follows you everywhere!
]]>Because it depends on public transit data encoded using Google's open data standard for transit info, the General Transit Feed Specification, it's only available for certain cities. It's a follow-up to the Mapumental site, a UK version which has been in closed beta for quite a while. You can learn more about the technology behind Mapnificent in a blog article by the creator.
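As a rough illustration of why a common feed format makes this possible, here is a sketch that reads stop locations out of a GTFS feed, which is just a zip archive of CSV tables (the feed file name is hypothetical):

import csv
import io
import zipfile

# stops.txt is the GTFS table of stop names and coordinates.
with zipfile.ZipFile("city_gtfs.zip") as feed:  # hypothetical feed archive
    with feed.open("stops.txt") as raw:
        for stop in csv.DictReader(io.TextIOWrapper(raw, encoding="utf-8-sig")):
            print(stop["stop_name"], stop["stop_lat"], stop["stop_lon"])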
]]>
From AZ Central.com here's a fire progression up to June 6th. You can click on the link to go directly to the site.
From NASA here is a MODIS satellite image from June 4th showing the smoke plume spreading far into New Mexico. Click on the image to go directly to this NASA site.
From Wildfire Today here is a map of fire danger across the lower 48.
]]>
The notable modules include a web toolkit that bundles the Google, Gmail, Bing, Twitter, Wikipedia, and Flickr APIs with a robust HTML parser and web spider; an NLP toolkit for English; and a search algorithm plus a graph data structure for representing relationships.
A case study on Twitter opinion mining conducted during the 2010 Belgian elections is available for reference. I look forward to endless applications in the geospatial context, from geographically distributed product-branding analysis to critical data mining for crisis mappers.
]]>Dilbert’s “Salary Theorem” states that:
“Engineers and scientists can never earn as much as business executives and sales people.”
This theorem is supported by a mathematical equation based on the following two postulates:
1. Knowledge is Power.
2. Time is Money.
As every engineer knows: Power = Work / Time
Since: Knowledge = Power, and Time = Money, then Knowledge = Work / Money
Solving for Money, we get: Money = Work / Knowledge
Thus, as Knowledge approaches zero, Money approaches infinity, regardless of the amount of work done.
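For the rigor-minded, a tiny SymPy sketch (the symbols are my own, not Dilbert's) confirms the limit:

import sympy as sp

W, K = sp.symbols("W K", positive=True)  # Work and Knowledge
Money = W / K                            # Money = Work / Knowledge
print(sp.limit(Money, K, 0, "+"))        # oo: Money grows without bound as Knowledge -> 0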
Conclusion: The less you know, the more you make.
If you have difficulty following the above theorem, I am sure you must make a lot of money.
]]>Minutes after news of the Christchurch quake, crisis mappers activated the Standby Task Force's media monitoring and verification teams to help the NZ emergency managers.
Click here for the Christchurch Crowdsourced Crisismap.
Google contributed its part by deploying a Person Finder application, and 5,000 records are being tracked already.
Click here to access those records if you are looking for someone in the chaos.
Esri deployed an editable social media map today that organizes geotagged Ushahidi posts and relevant YouTube videos.
Click here to visit Esri’s NZ Earthquake Incident Viewer.
]]>We are in a new decade and have carried with us several seeds of fascinating technologies from 2010. Social media, mobile, cloud computing, location awareness, 3D, augmented reality, and touch UI were snazzy enough in 2010 and will continue to engage us through 2011. We will see cloud interventions dominate the mainstream tech arena, from watching TV to listening to music to paying bills (mobile-integrated payments like FaceCash) to targeted, location-aware mobile marketing and what not!
Believe it or not, consumers have turned into ‘cloud’sumers. Well, the geospatial world has not been spared either. We have evolved from using historic paper maps (wow!) to dynamic maps in mobile apps. iPads, iPhones, Androids, and digital slates will be our new “ArcView” stations. Cloud-enabled geospatial architectures will be the new engines driving the industry. Every day will be a new day, new life, and a new geo-space, with increased participation in volunteered geographic information (VGI) maps, crowdsourced data, and OpenStreetMap. Though a historically adapted concept, geodesign will be the geospatial buzzword, and I would call it ‘Digital Geodesign.’
We will constantly acquire social responsibilities with multiple citizenships in dynamic, spatial communities. By wearing those polymorphic digital suits, we will provide more feedback, share more often, shout out loud, build more 3D spaces, fly-through augmented reality apps, and care digitally!
Thou shalt witness the rise of ‘cloudsumers’ in 2011!
Thoughts ?
Tip: Success formula in 2011 for all the geo-sapiens out there: Collaborate more; communicate often; wear a ‘geodesigner’ badge, and hey, learn Python (rightly said, James Fee)!
]]>I have a good feeling about India’s long journey towards data transparency initiatives and strongly believe that Bhuvan is a good seed sown. It has immense capabilities to evolve into a full-scale Indian information portal. Kudos to the Bhuvan team! However, with the right technologies, constant collaboration, effective feedback, and efficient marketing, you can do better, Bhuvan!
]]>
Last week, Google launched Chrome OS, its new, lightweight, open source operating system that is entirely browser-based and relies heavily on apps and browser extensions. According to the Chrome blog, Google is working with several device manufacturers to release Chrome OS-loaded netbooks. With the iPad’s huge success, tech companies like Acer, HP, ASUS, Lenovo, Toshiba, TI, and Dell are in the pool, seriously considering Chrome OS. Even though the OS has launched officially, it is not in a consumable form for the public yet (expected after mid-2011). However, Google is sending out a test netbook named CR-48 to pilot testers around the country.
Tip: If you have applied for the pilot program and are anxiously awaiting the result, click here to use an unofficial CR-48 shipment tracker.
Will the Geo-community hop on?
A few months ago, we had a brainstorming discussion on LinkedIn over my post ‘Journey from Desktop to Cloud.’ While there were mixed feelings about cloud GIS services, most of the users were willing to adapt to the cloud environment gradually. Not to mention, Esri led the way by moving ArcGIS Server onto Amazon EC2 and launching ArcGIS Online, and ERDAS likewise launched Apollo on the cloud with Skygone. There are several pure browser-based GIS applications too.
Assuming Chrome OS secures fast adoption rates, the increased demand for web-based services may shift a percentage of the geospatial industry into the cloud. Although it is too early to speculate, and the change is not going to happen overnight, I do expect the migration (of average consumers) to be real in about 2 to 5 years. Eventually, would Chrome OS replace Windows and/or other operating systems? Negative. Still, gone are the days of unreliable networks. We have entered an era of sophisticated, fast, robust, scalable, and reliable networks. Based on initial reviews, Chrome development, and future trends, I would bet on Chrome OS sweeping up market share, as Android did.
Fingers crossed!
]]>For the past year and a half, my department, Academic Technology Services, has been working on a mapping project that we call Cityscapes. It's a “Web 2.0” tool to allow students to collaborate in their studies of urban neighborhoods, where geography should be an organizing theme. Think of Google Maps, then think of groups of students adding their own location markers and decorating them with photos, videos, and blogs.
The two sites we've created so far can be seen here:
http://www.ats.amherst.edu/tokyodemo
http://www.ats.amherst.edu/parisdemo
The Tokyo site is not completely open due to copyright considerations; if you would like an account, contact me.
My part of this project was preparing the historical maps that you see in the image below. This included georeferencing them but also turning them into properly positioned Google tiles.
[Attachment: Cityscapes Tokyo Animated.gif, 1.15 MB]
I have recently been writing my first script to handle a geoprocessing task related to our Cityscapes project, where we place georeferenced historical maps in a Google Maps framework that can be used in urban studies classes (more to come!). Since the beginning I have been converting the georeferenced maps into Google Maps tiles by hand, a process that takes a couple of hours. This is small compared to the georeferencing time, so even though it was a prime candidate for scripting I had been putting it off in order to complete more maps.
For about a year now I have been learning the scripting language Python, as it has become the new, open way to script ArcGIS (and also has many fans in other areas of scientific computing). As my summer Cityscapes efforts ended, and as I prepare for my Cityscapes presentation at the NEArc meeting on November 8, my education has finally shifted over to using Python with ArcGIS (9.3). The documentation I have been using includes the following items:
The script is being implemented through ArcToolbox, which makes it easier to provide script parameters, such as the input raster, through a GUI dialog. But because I want it to be usable by others, I have to write it in a general way, which means displaying information about the input raster, suggesting reasonable values for subsequent inputs, and checking for illegal values. Hence I have been learning about the ToolValidator class, apparently a new feature in version 9.3, and I've been struggling to understand some undocumented characteristics. Some of my observations follow.
First off, the ToolValidator class is only very loosely connected to its Tool dialog. In particular, it is not instantiated just once when the dialog opens, which would allow the preservation of state within the class as the user changes the dialog, but instead every time the dialog parameters are changed, and before each of its methods are called. Hence the dialog parameters themselves are the only way to store information as the user fills in the fields and they are validated.
The ToolValidator template provided for each script accesses the dialog parameters in its initialization method:
def __init__(self):
    import arcgisscripting as ARC
    self.GP = ARC.create(9.3)
    self.params = self.GP.GetParameterInfo()
The parameter values themselves are accessed through, for example, self.params[2].Value, but be forewarned that sometimes the property Value is a Value object, rather than just a string of characters (poor documentation on this is discussed here). This seems to be the case when the parameter refers to a file or other data structure, and if you want its name as a character string you must reference the property self.params[0].Value.Value or use the expression str( self.params[0].Value ). The latter conversion will occur automatically in contexts requiring a string, such as concatenation.
When a dialog is first opened, the ToolValidator method initializeParameters() is called. It can be used to insert initial values into the parameter fields, and also set some characteristics of those fields in the dialog. For example, to disable a parameter field (make it uneditable), you can include code such as:

self.params[2].Enabled = False
When a dialog field has been modified (which may include when the dialog first opens), verification that a parameter value is usable occurs at several places; the two steps you can customize are these:

- Step 2: updateParameters() comes next, and can be used to verify many characteristics of input values and avoid processing bad ones. It's also possible to correct the values directly, but in most cases it's better to send a message to the user as described in step 4 and let them correct them.
- Step 4: updateMessages() is called, where the messages returned from step 3 can be inspected and, if you want, reset with your own. These messages can direct users to fix their errors.

The result of the separation of steps 2 and 4 is that sometimes the same tests must take place, first to avoid processing bad data and second to send a message about it.
The ToolValidator method updateParameters() can be used to process new parameter values (e.g. to set related parameters). No information is provided about which of the dialog fields was modified, though, so you must check each parameter of interest to see if it is the one that needs to be validated:

if not self.params[0].HasBeenValidated :
    …
elif not self.params[1].HasBeenValidated :
    …
Once you find the modified parameter, you can ignore the update if it's due to the script itself (e.g. initialization) rather than being altered by the user:
if not self.params[1].altered : return
Before processing an input dataset, you should test for its existence:
if not self.GP.Exists( str( self.params[0] ) ) : return
If the data isn't actually there, this method will raise an error, but it won't be reported until later, hence the need to return at this point.
The ToolValidator method updateMessages() can be used to send the user a message about problems with the value in a dialog field. The result is a red ⊗ next to the input field, and clicking on it will display the message. Once again no information is provided about which of the dialog fields has the problem, so you must check each parameter of interest (even if this was done previously), e.g.:

if self.params[1].Value and self.params[1].Value <= 0 :
    self.params[1].ClearMessage()
    self.params[1].SetErrorMessage("This parameter must be positive.")
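Putting these pieces together, here is a minimal sketch of a complete ToolValidator. It is hypothetical, not the exact class behind my tile tool: it assumes parameter 0 is an input dataset, parameter 1 a number that must be positive, and parameter 2 a derived field that only makes sense once the input exists.

class ToolValidator:
    def __init__(self):
        import arcgisscripting as ARC
        self.GP = ARC.create(9.3)
        self.params = self.GP.GetParameterInfo()

    def initializeParameters(self):
        self.params[2].Enabled = False       # lock the derived field until input exists
        return

    def updateParameters(self):
        if not self.params[0].HasBeenValidated :
            if not self.params[0].altered : return   # ignore non-user updates
            if not self.GP.Exists( str( self.params[0].Value ) ) : return
            self.params[2].Enabled = True    # input exists, so unlock the derived field
        return

    def updateMessages(self):
        # The same test as in updateParameters() is repeated here to explain failures.
        if self.params[1].Value and self.params[1].Value <= 0 :
            self.params[1].ClearMessage()
            self.params[1].SetErrorMessage("This parameter must be positive.")
        return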
Providing basic documentation for the tool and its parameters is fairly easy, once you know that the simplest way to create and modify them is with the Documentation Editor. The ArcGIS documentation provides two options for opening the Editor, the first of which doesn't work, the second of which is available by opening ArcCatalog and menuing View > Toolbars > Metadata to bring up that toolbar. Then click on the script, click on its tab Metadata, and finally, in the Metadata toolbar, click on the button Edit metadata.
When the Documentation Editor opens, the left side lists the various types of information you can provide. The main tool description, visible when it first opens, is added in the section General Information, in the item Abstract. The various parameters are described in the section Help in the item Parameters, and they will all be listed to the right under Contents. Open the one you want to describe, click on Dialog Reference, click on the [A] Paragraph button, and then click in the box to the right to add text.
Warning: if you change a parameter name after creating a description for it, the latter will be lost!
Marble is cross-platform and has versions for the KDE environment and Qt.
]]>The Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Aqua satellite captured this image of Pakistan. This image uses a combination of infrared and visible light to increase the contrast between water and land. Water appears in varying shades of blue, vegetation is green, and bare ground is pinkish brown. Clouds are bright turquoise.
.::Click HERE to request this data set::.
]]>In the first article in our series on the ArcGIS Server API for Flex we covered some basic concepts including how to create maps and add layers. We covered both tiled and dynamic map service layers. In this brief article you will learn how to apply a definition expression to your dynamic map service layer to restrict the features displayed from a layer. For instance, in the figure below we are plotting only those counties that suffered a population loss from 2000 to 2007.
]]>Although these tools are remarkably easy to use, GeoCommons Maker makes the mistake of using the Mercator projection for geospatial data visualization.
]]>Today we are beginning a new series of posts on the ArcGIS Server API for Flex. The ArcGIS API for Flex allows the creation of Rich Internet applications on top of ArcGIS Server, and is based on the free Adobe Flex framework. The Flex framework is a client-side technology that is rendered by Flash Player 9 and above, or by Adobe AIR.
The term Rich Internet Applications or RIAs has become synonymous with Web 2.0 applications. But what are Rich Internet Applications? RIAs provide desktop functionality in a web application. They are engaging, interactive, and expressive applications with easy-to-use interfaces. RIAs provide increased productivity to end users. There is no waiting for full-page reloading after user actions. Instead, RIAs provide instant feedback to the user and are responsive to their actions. In GIS terms, RIA applications provide the ability to create attractive visualizations of geographic analysis, and the ability to interact with the data.
]]>
The Geneva and Barcelona accessibility maps were created by giving GPS-enabled camera phones to people using wheelchairs. Over a 6-month period, the mapmakers documented inaccessible barriers around their respective cities - each barrier was photographed and placed automatically on a mashup. Common barriers include stairways, inaccessible curbs, escalators, broken elevators, etc.
]]>
OpenAddresses includes a user web interface and a number of REST services. Data is typically hand-digitized or donated by institutions and public authorities.
]]>Google Maps provides a web mapping application wherein maps are produced in advance and served as a set of small tiles for assembly into one big image in the browser. The advantage of this approach is consistency of appearance and graphical quality of the map and, probably more important, enormous scalability that can be achieved. There is no need for server side processing to generate maps and individual map tiles are much smaller than the whole map presented at the user end, so they are able to be delivered and displayed much faster. The trade off is a big effort up front to generate nice looking maps and the need to fix zoom levels rather than allowing a continuous zoom, as is the case with older web mapping technologies. The approach has been copied by other online map technology providers. But what approach should you take if you’d like to present your own custom data on top of a Google Maps base layer without using markers, polylines, or polygons? Perhaps you have a large dataset stored in a shapefile and you’d simply like to convert this data to a format suitable for display in Google Maps. In this case it would make sense to pre-create custom map tiles of your data at various zoom levels and have them available for display.
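To make the fixed-zoom idea concrete, here is a sketch of the standard tile arithmetic (assuming the spherical-Mercator tiling scheme used by Google Maps, where the world is a 2^z × 2^z grid of tiles at zoom z); the coordinates below are just an example:

import math

def lonlat_to_tile(lon, lat, zoom):
    """Return the (x, y) indices of the map tile containing lon/lat at this zoom."""
    n = 2 ** zoom  # the world is an n x n grid of tiles at this zoom level
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

print(lonlat_to_tile(-87.65, 41.85, 12))  # the Chicago-area tile at zoom 12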
]]>MapServer is a widely used open source GIS platform, especially useful in web environments. It is written in C, and has two primary modes of usage: through a CGI script, or in a more programmatic manner, via MapScript, a set of bindings for many programming languages. Both methods are based on MapFiles, which contain specifications and parameters for a map (which can be based on a shapefile for instance), written in a declarative mini-language. The basic working of MapServer can be abstracted in an easy way: you give it a MapFile as its input, and it produces in return a (static) image of the resulting map (usually via a web server), which a client application is then free to manipulate in any useful way.
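As a taste of the MapScript route, here is a minimal Python sketch (it assumes the MapServer Python bindings are installed, and the MapFile name is hypothetical):

import mapscript

# Load the declarative map definition and render it to a static image.
m = mapscript.mapObj("example.map")  # hypothetical MapFile, e.g. one describing a shapefile
img = m.draw()                       # draws every enabled layer per the MapFile
img.save("example.png")              # the static map image a client can then display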
The planet Mars is currently enjoying its biennial dalliance with the Earth. Marching through the constellation Cancer, the Red Planet's nearness and opposition to the Sun produce an unusually bright spessartine gem against the black velvet sky. It now hangs above the eastern horizon in the evening, where it will remain for several months.
Mars has been the subject of intense human scrutiny for centuries, and a battery of NASA probes have visited the planet since the dawn of the space age. The first wave began in the 1960s with the Mariner flybys and orbiters, and ended in the early 1980s with the Viking orbiters and landers. Starting in the late 1990s, a new series of sophisticated craft invaded Mars. Probably most famous are the three robotic rovers deployed to directly explore and analyze its surface (Sojourner, Spirit, and Opportunity). Of more interest for geography are the three orbiting satellites that have remotely imaged the entirety of Mars, returning unprecedented amounts of information and transforming our understanding of the planet.
The Mars Global Surveyor, operating from 1999 to 2006, carried a number of scientific instruments, including a high-resolution camera with a horizontal resolution as small as 50 cm, a laser altimeter with a vertical resolution of 30 cm, a thermal emission spectrometer, and a magnetometer, and provided evidence for relatively recent water flow!
Mars Odyssey began studying the material composition of the planet in 2002, using another thermal emission imaging system and a gamma ray spectrometer, and confirmed the existence of subsurface frozen water.
The Mars Reconnaissance Orbiter reached Mars in 2006, and includes a telescopic camera with a resolution around half a meter per pixel. Using stereoimaging, improved measurements of elevation are also being calculated.
Currently the highest resolution topographic data is only available in certain areas. There is so much data that it isn't being processed in its entirety, but only where scientists have been particularly interested!
The best global topographic data currently available comes from the Mars Global Surveyor's Mars Orbiter Laser Altimeter (MOLA), which covers the planet in 1/128th-degree (28 arc second) measurements (and four times more detailed in the polar regions). For example, a "slice" of the planet extending from the north pole (left) to the south pole (right) along the prime meridian reveals the generally higher elevations in the southern hemisphere.
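(For scale: using the 3,396,000 m radius given in the .prj below, one degree of Martian longitude at the equator spans 2π × 3,396,000 / 360 ≈ 59,270 m, so a 1/128-degree posting works out to roughly 463 m.)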
One representation of MOLA data as a Digital Elevation Model (DEM) is in the Planetary Data System (PDS) IMG raster format, which can be viewed with the NASAView software:
http://starbeam.jpl.nasa.gov/tools/nasa-view.shtml
PDS/IMG is somewhat obscure; NASAView can save it as GIF or JPEG format, which is one way to quickly use it in other tools, but a lot of information is lost. Specifically, Martian elevation varies from -8,068 m to +21,134 m (a larger range than on Earth!), so the data is stored as 16-bit signed integers, while GIF and JPEG can only handle 8-bit unsigned integers (grayscale), forcing the output to be scaled and offset into values from 0 to 255.
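To make that loss concrete, here is a small NumPy sketch of the conversion (assuming the 720 × 1440 little-endian 16-bit grid described in the label below; the file name is hypothetical):

import numpy as np

# Read the raw PDS/IMG payload: 16-bit values, 720 rows x 1440 columns (from the label).
dem = np.fromfile("mola_dem.img", dtype="<i2").reshape(720, 1440)

# Squeeze the full elevation range into 0..255, as a GIF/JPEG export must.
lo, hi = int(dem.min()), int(dem.max())
gray = ((dem.astype(np.float64) - lo) / (hi - lo) * 255.0).astype(np.uint8)
# Each gray step now lumps together (hi - lo) / 255 metres of relief.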
Not surprisingly, the specialized image analysis program ENVI understands PDS/IMG as an external/generic format, which can be opened and resaved in another format such as ESRI's Grid for use with ArcGIS.
More generally, PDS/IMG is actually a pretty simple format, being a file of uncompressed data with a separate human-readable label file (.lbl) describing its layout. The data can be purely sequential or interleaved (if it's color/multi-band). It's therefore the same as what ESRI ArcGIS and IDL-VIS ENVI call Band Sequential (.bsq), Band Interleaved by Pixel (.bip) or Band Interleaved by Line (.bil). (Note that these formats are identical for grayscale/single-band data.) The only difference, it appears, is that ArcGIS expects a "header" file (.hdr) with a different layout of information. So only the latter file needs to be created, and the data file can simply have its file extension changed. A basic file is described below, but there is also a detailed description of the .hdr format here:
http://webhelp.esri.com/arcgisdesktop/9.3/index.cfm?topicname=BIL,_BIP,_and_BSQ_raster_files
The procedure to view the Mars data is as follows. First, give the data a projection by saving the following custom geographic coordinate system as a .prj file alongside the raster:

GEOGCS["ACS_Mars_MGS_MOLA",DATUM["D_<custom>",SPHEROID["_custom",3396000.0,0.0]],PRIMEM["Meridian Planum",-180.0],UNIT["Degree",0.017453292519943295]]
Corresponding Descriptions of Raster Characteristics

.hdr parameter | Corresponding .lbl parameter(s)
---|---
nrows 720 | lines = 720
ncols 1440 | line_samples = 1440
skipbytes 0 | record_bytes = 2880
nbands 1 | bands = 1
nbits 16 | sample_bits = 16
pixeltype unsignedint | sample_type = *_unsigned_integer
byteorder I | sample_type = lsb_*
layout BSQ | band_storage_type = band_sequential (BIP: sample_interleaved; BIL: line_interleaved)
nodata -3.4028226550889045e+38 | missing_constant = 16#FF7FFFFB#
xdim 0.25 | map_resolution = 4.0 <PIX/DEG>
ydim 0.25 | map_resolution = 4.0 <PIX/DEG>
ulxmap 0.125 | westernmost_longitude = 0 <DEG>
ulymap 89.875 | maximum_latitude = 90 <DEG>
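Since only the .hdr file needs to be written by hand, the table above translates directly into a few lines of Python; here is a sketch using those same label values (the output file name is hypothetical):

# Write an ESRI-style .hdr companion file from the PDS label values tabulated above.
hdr = {
    "nrows": 720, "ncols": 1440, "skipbytes": 0,
    "nbands": 1, "nbits": 16,
    "pixeltype": "unsignedint",  # per the label, though the text notes elevations are signed
    "byteorder": "I", "layout": "BSQ",
    "nodata": "-3.4028226550889045e+38",
    "xdim": 0.25, "ydim": 0.25,
    "ulxmap": 0.125, "ulymap": 89.875,
}
with open("mola_dem.hdr", "w") as out:  # hypothetical name; match the renamed data file
    for key, value in hdr.items():
        out.write("%s %s\n" % (key, value))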
Once the .hdr and .prj files are in place, a few more steps make the data usable in ArcGIS:

- Open the raster's Properties dialog, scroll down to Statistics, click on the button Options, and select Build Statistics….
- Menu Tools => Extensions… and turn on 3D Analyst; then menu View => Toolbars => 3D Analyst.
- In the 3D Analyst toolbar, click on the button Interpolate Line, and create a polyline along the path where you want to see the profile, by clicking once at each vertex and ending with a double-click. Then click on the button Create Profile Graph.

An alternative source of data is the Precision Experiment Data Records (PEDR), a point set (Digital Terrain Model/DTM) from which the raster DEM is derived:
http://pds-geosciences.wustl.edu/missions/mgs/pedr.html
As noted above, there are also some bits and pieces of newer, high-resolution Mars DTM available from the Mars Reconnaissance Orbiter, available here:
This data has an embedded header, which ArcGIS 9.3 is able to partially recognize, but it makes a few mistakes along the way. In particular:
PROJCS["Plate_Carree",GEOGCS["EQUIRECTANGULAR MARS",
DATUM["D_MARS",SPHEROID["MARS",3393830.0,0.0]],PRIMEM["Reference_Meridian",-180.0],
UNIT["Degree",0.0174532925199433]],PROJECTION["Plate_Carree"],
PARAMETER["false_easting",0.0],PARAMETER["false_northing",0.0],
PARAMETER["central_meridian",341.0599975585938],
PARAMETER["Standard_Parallel_1",20.0],UNIT["Millimeter",0.001]]
- Among the projection parameters, it mangles not only the Standard_Parallel_1 and the Central_Meridian, but also the spheroid radius.
- It does not pick up the NoData value; assign the following value: -3.4028226550889045e+38.

The cleanest way to import this data is to use a utility like tail (standard on Unix, an add-on for Windows) to remove the label from the beginning of the data file; with properly constructed .hdr and .prj files as described above, the data is ready to use.
An alternative, more black-box approach is described at ISIS support.
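If tail isn't handy, a few lines of Python do the same label-stripping job (a sketch; both file names and the label size are hypothetical, and the true offset must be read from the embedded label itself):

# Copy everything after the embedded PDS label into a bare raster file.
LABEL_BYTES = 20480  # hypothetical; read the real offset from the label

with open("dtm_with_label.img", "rb") as src, open("dtm.bil", "wb") as dst:
    src.seek(LABEL_BYTES)   # skip the embedded header
    dst.write(src.read())   # the remainder pairs with the hand-made .hdr/.prj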
The planet Mars is currently enjoying its biannual dalliance with the Earth. Marching through the constellation Cancer, the Red Planet's nearness and opposition to the Sun produce an unusually bright spessartine gem against the black velvet sky. It now hangs above the eastern horizon in the evening, where it will remain for several months.
Mars has been the subject of intense human scrutiny for centuries, and a battery of NASA probes have visited the planet since the dawn of the space age. The first wave began in the 1960s with the Mariner flybys and orbiters, and ended in the early 1980s with the Viking orbiters and landers. Starting in the late 1990s, a new series of sophisticated craft invaded Mars. Probably most famous are the three robotic rovers deployed to directly explore and analyze its surface (Sojourner, Spirit, and Opportunity). Of more interest for geography are the three orbiting satellites that have remotely imaged the entirety of Mars, returning unprecedented amounts of information and transforming our understanding of the planet.
The Mars Global Surveyor, operating from 1999 to 2006, carried a number of scientific instruments, including a high-resolution camera with a horizontal resolution as small as 50 cm, a laser altimeter with a vertical resolution of 30 cm, a thermal emission spectrometer, and a magnetometer, and provided evidence for relatively recent water flow!
Mars Odyssey began studying the material composition of the planet in 2002, using another thermal emission imaging system and a gamma ray spectrometer, and confirmed the existence of subsurface frozen water.
The Mars Reconnaissance Orbiter reached Mars in 2006 and includes a telescopic camera with a resolution of around half a meter per pixel. Using stereo imaging, it is also producing improved measurements of elevation.
Currently, the highest-resolution topographic data is available only for certain areas. There is so much data that it isn't being processed in its entirety, but only where scientists have taken a particular interest!
The best global topographic data currently available comes from the Mars Global Surveyor's Mars Orbiter Laser Altimeter (MOLA), which covers the planet in 1/128th-degree (about 28 arc seconds) measurements (and four times finer in the polar regions). For example, a "slice" of the planet along the prime meridian, running from the north pole to the south pole, reveals the generally higher elevations in the southern hemisphere.
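For a sense of scale, a quick back-of-the-envelope calculation (a Python sketch, using the 3,396,000 m spheroid radius from the MOLA coordinate system definition below) shows what 1/128th of a degree amounts to on the ground:

import math

R = 3_396_000.0                             # Mars spheroid radius (m), per the .prj definition below
meters_per_degree = 2 * math.pi * R / 360   # one degree along a meridian: ~59,270 m
print(meters_per_degree / 128)              # one MOLA cell: ~463 m

So even the best global grid resolves features no finer than about half a kilometer.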
One representation of MOLA data as a Digital Elevation Model (DEM) is in the Planetary Data System (PDS) IMG raster format, which can be viewed with the NASAView software:
http://starbeam.jpl.nasa.gov/tools/nasa-view.shtml
PDS/IMG is somewhat obscure; NASAView can save it as GIF or JPEG format, which is one way to quickly use it in other tools, but a lot of information is lost. Specifically, Martian elevation varies from -8,068 m to +21,134 m (a larger range than on Earth!), so the data is stored as 16-bit signed integers, while GIF and JPEG can only handle 8-bit unsigned integers (grayscale), forcing the output to be scaled and offset into values from 0 to 255.
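To see how much is lost, here is a minimal sketch of the scale-and-offset that an 8-bit export forces, using the elevation extremes just quoted:

lo, hi = -8068, 21134        # Martian elevation extremes (m), as above
step = (hi - lo) / 255       # elevation span per gray level: ~114.5 m

def to_8bit(elev):
    # scale and offset an elevation into the 0..255 range
    return round((elev - lo) / step)

def from_8bit(gray):
    # best-guess reconstruction of the original elevation
    return lo + gray * step

print(from_8bit(to_8bit(0.0)))   # ~ -52 m: a round trip can be off by up to ~57 m (half a step)

Every gray level has to stand in for roughly 115 m of elevation, so fine vertical detail is irretrievably blurred.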
Not surprisingly, the specialized image analysis program ENVI understands PDS/IMG as an external/generic format, which can be opened and resaved in another format such as ESRI's Grid for use with ArcGIS.
More generally, PDS/IMG is actually a pretty simple format: a file of uncompressed data with a separate human-readable label file (.lbl) describing its layout. The data can be purely sequential or interleaved (if it's color/multi-band), making it the same as what ESRI's ArcGIS and ENVI call Band Sequential (.bsq), Band Interleaved by Pixel (.bip), or Band Interleaved by Line (.bil). (Note that these formats are identical for grayscale/single-band data.) The only difference, it appears, is that ArcGIS expects a "header" file (.hdr) with a different layout of information, so only a new .hdr file needs to be created, and the data file can simply have its file extension changed. A basic file is described below, but there is also a detailed description of the .hdr format here:
http://webhelp.esri.com/arcgisdesktop/9.3/index.cfm?topicname=BIL,_BIP,_and_BSQ_raster_files
The procedure to view the Mars data is as follows. First, rename the data file to have a .bsq extension and create an accompanying .hdr file using the parameters in the table below. Then create a .prj file (with the same base name) containing the coordinate system definition:
GEOGCS["ACS_Mars_MGS_MOLA",DATUM["D_<custom>",SPHEROID["_custom",3396000.0,0.0]],PRIMEM["Meridian Planum",-180.0],UNIT["Degree",0.017453292519943295]]
Corresponding Descriptions of Raster Characteristics

| .hdr parameter | Corresponding .lbl parameter(s) |
|---|---|
| nrows 720 | lines = 720 |
| ncols 1440 | line_samples = 1440 |
| skipbytes 0 | record_bytes = 2880 |
| nbands 1 | bands = 1 |
| nbits 16 | sample_bits = 16 |
| pixeltype signedint | sample_type = *_integer |
| byteorder I | sample_type = lsb_* |
| layout BSQ | band_storage_type = band_sequential (sample_interleaved and line_interleaved correspond to BIP and BIL) |
| nodata -3.4028226550889045e+38 | missing_constant = 16#FF7FFFFB# |
| xdim 0.25 | map_resolution = 4.0 <PIX/DEG> |
| ydim 0.25 | map_resolution = 4.0 <PIX/DEG> |
| ulxmap 0.125 | westernmost_longitude = 0 <DEG> |
| ulymap 89.875 | maximum_latitude = 90 <DEG> |

(Since the elevations are signed 16-bit integers, per the range quoted above, pixeltype must be signedint.)
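Assembled from the left-hand column of the table, a complete .hdr for this 4-pixel-per-degree grid is just the following lines:

nrows 720
ncols 1440
nbands 1
nbits 16
pixeltype signedint
byteorder I
layout BSQ
skipbytes 0
ulxmap 0.125
ulymap 89.875
xdim 0.25
ydim 0.25
nodata -3.4028226550889045e+38

Save this next to the renamed .bsq data file, with the same base name.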
Once the raster loads in ArcMap, build statistics so that it displays with a proper stretch: open the layer's Properties dialog, scroll down to Statistics, click on the button Options, and select Build Statistics….

To see an elevation profile, first enable the extension: open menu Tools => Extensions…, and turn on 3D Analyst. Then menu View => Toolbars => 3D Analyst. On the 3D Analyst toolbar, click on the button Interpolate Line, and create a polyline along the path where you want to see the profile, by clicking once at each vertex and ending with a double-click. Then click on the button Create Profile Graph.
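As a quick sanity check outside ArcGIS entirely, a short Python sketch can read the bare data file directly (assuming the 720 × 1440, little-endian, signed 16-bit layout from the table above; the file name is a placeholder):

import numpy as np

# 720 rows x 1440 columns of little-endian ("byteorder I") signed 16-bit samples
dem = np.fromfile("mola_dem.bsq", dtype="<i2").reshape(720, 1440)
print(dem.min(), dem.max())   # should span roughly -8,068 to +21,134 m

If the minimum and maximum look like plausible Martian elevations, the layout parameters are right.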
An alternative source of data is the Precision Experiment Data Records (PEDR), a point set (Digital Terrain Model/DTM) from which the raster DEM is derived:
http://pds-geosciences.wustl.edu/missions/mgs/pedr.html
As noted above, there are also some bits and pieces of newer, high-resolution Mars DTMs from the Mars Reconnaissance Orbiter, available here:
This data has an embedded header, which ArcGIS 9.3 is able to partially recognize, but it makes a few mistakes along the way. In particular:
PROJCS["Plate_Carree",GEOGCS["EQUIRECTANGULAR MARS",
DATUM["D_MARS",SPHEROID["MARS",3393830.0,0.0]],PRIMEM["Reference_Meridian",-180.0],
UNIT["Degree",0.0174532925199433]],PROJECTION["Plate_Carree"],
PARAMETER["false_easting",0.0],PARAMETER["false_northing",0.0],
PARAMETER["central_meridian",341.0599975585938],
PARAMETER["Standard_Parallel_1",20.0],UNIT["Millimeter",0.001]]
Not only are the Standard_Parallel_1 and the Central_Meridian suspect, but so is the spheroid radius (3,393,830 m here, versus the 3,396,000 m in the MOLA definition above). ArcGIS also fails to recognize NoData; assign the following value manually: -3.4028226550889045e+38.
The cleanest way to import this data is to use a utility like tail (standard on Unix, an add-on for Windows) to remove the label from the beginning of the data file; with properly constructed .hdr and .prj files as described above, the data is ready to use.
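If tail isn't handy, the same job can be done in a few lines of Python. This is only a sketch, and it assumes the embedded label declares its own size via the standard PDS RECORD_BYTES and LABEL_RECORDS keywords (the file names are placeholders):

import re

def strip_pds_label(src, dst):
    # read enough of the file to capture the ASCII label
    with open(src, "rb") as f:
        head = f.read(65536).decode("ascii", errors="replace")

    # label size in bytes = RECORD_BYTES * LABEL_RECORDS
    record_bytes = int(re.search(r"RECORD_BYTES\s*=\s*(\d+)", head).group(1))
    label_records = int(re.search(r"LABEL_RECORDS\s*=\s*(\d+)", head).group(1))

    # copy everything after the label into a bare data file
    with open(src, "rb") as f_in, open(dst, "wb") as f_out:
        f_in.seek(record_bytes * label_records)
        f_out.write(f_in.read())

strip_pds_label("dtm_with_label.img", "dtm.bsq")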
An alternative, more black-box approach is described at ISIS support.
Full pricing information has yet to be released, but the service is currently free for publicly visible data.
With the ArcGIS Server Query Task you can perform attribute and spatial queries against the layers exposed by a map service, and you can combine the two query types into a single combined query. Some examples may be illustrative: an attribute query might search for all land parcels with a valuation greater than $100,000; a spatial query could find all land parcels that intersect a 100-year floodplain; and a combination query might search for all land parcels with a valuation greater than $100,000 whose geometry intersects the 100-year floodplain. In this article we'll take a look at the mechanics of querying data from a map service using the ArcGIS Server JavaScript API.
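For readers who want to experiment before diving into the JavaScript API, the same combined query can be issued directly against a map service's REST query endpoint. A minimal Python sketch follows; the service URL and field names are hypothetical, but where, geometry, geometryType, spatialRel, outFields, and f are the standard parameters of the REST query operation:

import json
import urllib.parse
import urllib.request

# hypothetical parcels layer: layer 0 of a published map service
url = "http://example.com/arcgis/rest/services/Parcels/MapServer/0/query"

params = urllib.parse.urlencode({
    "where": "VALUATION > 100000",            # attribute part of the query
    "geometry": "-95.5,29.5,-95.0,30.0",      # spatial part: a bounding box standing in for the floodplain
    "geometryType": "esriGeometryEnvelope",
    "spatialRel": "esriSpatialRelIntersects",
    "outFields": "PARCEL_ID,VALUATION",
    "f": "json",
})

with urllib.request.urlopen(url + "?" + params) as resp:
    result = json.load(resp)

print(len(result.get("features", [])), "matching parcels")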
In this second part, I produce a working implementation using UMN MapServer and OpenLayers. The working implementations can be found in the Polar Map section of Equal-Area-Maps.com.
The results of these articles can be seen in the new Polar Projections section of the Equal Area Maps website. Note that only one of these polar projections actually has the equal-area property.
Many ArcGIS Server applications need to be able to query data from a map service and display the results in a tabular structure. In this post I will show you how to use the DojoX DataGrid along with ItemFileReadStore and QueryTask from the ArcGIS Server JavaScript API to display your query results in tabular form. The process is really quite simple once you understand the basic concepts.
Recently, the Ordnance Survey launched their "free" OpenSpace service to allow users to add interactive maps to websites. This article is an overview of the service.
]]>Who knew? Zombie Chicken Power!