GITA Geospatial Infrastructure Solutions Conference 31 Report
With that in mind, it’s time to create a blueprint for investment in infrastructure of all kinds in the U.S., and to govern differently.
“We need to look at infrastructure as an interdependent system,” said Murphy in conclusion.
For GITA Award winners see www.gita.org
Power Panel – The Ties that Bind: GIS Technology for the Greater Good
A Power Panel held on Tuesday at GITA, moderated by Matt Ball of Vector 1 Media, brought together a number of distinguished panelists to discuss the topic, “GIS Technology for the Greater Good.”
Panelists included GITA Speaker Award winner Dr. Bob Austin of the City of Tampa; Ron Langhelm of Booz Allen Hamilton; Timothy Nyerges of the University of Washington; and Tom Nolan of Seattle Public Utilities.
Ball summarized what the panel would address: given GITA’s shifted focus on infrastructure, he encouraged panelists to speak to infrastructure data issues, including national defense and Homeland Security, emergency response, data sharing, educating the public, and the physical protection of assets. Within each of these topics, the human side of geospatial was to be taken into account.
Panelists were asked to summarize a project or situation in which they or their organization helped use GIS for the greater good.
Austin said his organization’s most recent contribution came in working with Michael Baker on Katrina recovery for FEMA, and earlier on manual damage assessment during the recovery from Hurricane Charley. “Just prior to Katrina we were able to get a sensor over New Orleans to get 163,000 people to qualify for reimbursement, which made life a lot easier for those families to get assistance.”
Langhelm reported that, after working on a number of disasters, including 9/11, the shuttle recovery, and an earthquake, he submitted a paper to FEMA recommending that geospatial response teams be built at the local level. After Hurricane Katrina, FEMA extended funding, and the response teams were deployed for the California wildfires.
Nolan related his experience developing some of the country’s first LiDAR through the Puget Sound Regional LiDAR System. A cooperative effort among many organizations, the project involved NASA as a pioneer, the USGS with its focus on seismic activity, and habitat studies driven by transportation requirements.
Nyerges spoke about his work on the National Standards Committee. His PhD advisor chaired the committee for the National Spatial Data Transfer (NSDT), and Nyerges posed the idea of spatial options in the digital world of cartography and GIS. “Computer systems don’t talk to one another very well. We need to look at aspects of how people and computer systems can get together better,” said Nyerges.
“We’re all falling in love with Microsoft,” Langhelm said. “In the old days it was hard to get people to understand technology. Interfaces are much better today; you can drill into other systems for more information than you used to be able to.”
Echoing that sentiment, Nyerges said, “Community is understanding the people community as well as the geospatial community. As we get into web-based stuff and what it means to interoperability, there is people interoperability as well.”
A concern shared by all participants was how quickly data ages. All agreed that the tools seem adequate, but that we need better ways to interact with the public and to show the public how to use the systems.
This raised further questions, such as: what did people have in mind when they created the databases? If users can’t understand the databases and the data is out of date, they will question the validity of the data, associate it with the aging infrastructure, and then question the usefulness of the GIS.
Sharing information between organizations can be useful when both organizations have something to gain by doing so. In providing data to the public, it’s important to keep in mind that high quality data is not cheap, and there is always software to be maintained. Issues of security are a priority when providing data to the public, also.
“Being an academic, I’m working on a $20 million proposal for a Data Net, a $100 million program being developed by the National Science Foundation, which intends to be what the Internet is not now,” said Nyerges. “The Internet doesn’t allow us to share data in a multidisciplinary way. The buzzword here is ‘ontology’ – what is the meaning of your database in the nature of your organization? Organizations have a different intent in building databases.”
Austin recounted a data-sharing example in which different City of Tampa departments maintained different address schemes. “The Parks Department didn’t want to use the same address scheme as the rest of the city. A 911 call revealed that the ambulance couldn’t find the location for a particular emergency, and although this incident did not result in great injury or fatality, it was dramatic enough to get attention to seek ways to share data.”
Another interesting data-sharing example: the code enforcement department began finding abandoned buildings. The fire department took interest because abandoned buildings are more likely to catch fire. The police department wanted the locations as well, since abandoned buildings are known sites for drug deals. Only when the departments realized they had a common need for the data did they begin sharing it.
Langhelm said that for the most part, the tools are evolving as quickly as people can articulate the need for them. The big problem is with data sharing, and just making sure your data is out there and available during times of crisis.
A question from the audience prompted further discussion of data sharing: what happens when people misuse data or don’t understand it? How do you ensure that it’s not misused?
Disconnected systems offer more opportunities for things to go wrong. A way of tracking data lineage (who created the data, and who has touched it since) could be built by software vendors.
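A minimal sketch of the kind of lineage record such tooling might maintain follows; the class and field names are illustrative, not any vendor’s actual API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class LineageEvent:
    """One touch on a dataset: who did what, and when."""
    actor: str       # person or system that modified the data
    action: str      # e.g. "created", "edited", "converted"
    timestamp: str   # ISO-8601 UTC time of the event


@dataclass
class Dataset:
    name: str
    history: List[LineageEvent] = field(default_factory=list)

    def record(self, actor: str, action: str) -> None:
        """Append a lineage event so later users can audit provenance."""
        self.history.append(
            LineageEvent(actor, action, datetime.now(timezone.utc).isoformat())
        )

    def provenance(self) -> List[str]:
        """Human-readable trail: who created the data and who touched it."""
        return [f"{e.timestamp} {e.actor}: {e.action}" for e in self.history]


# Example: trace a hypothetical parcel layer through two hands.
parcels = Dataset("city_parcels")
parcels.record("gis_team", "created")
parcels.record("fire_dept", "edited attributes")
print("\n".join(parcels.provenance()))
```

In practice such a trail would live alongside the data as metadata, so a downstream user questioning the data’s validity could see exactly how it was produced and when it was last touched.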
“The research world and the commercial world do not know how to get people from one level of knowledge to another,” noted Nyerges. A possible solution is to study what people do with data.
This may involve a “customer care center” where people access data, with a policy in place for finding out whether they got what they needed from the data. Helping people learn how to manage data could also be useful.
On the Exhibit Floor
Below are summaries of products seen at press conferences and on the exhibit floor. This year very few vendors were exhibiting new products; most instead gave an overview of their offerings and how they fit into the bigger picture.
One of the biggest buzzes I heard at the show was about Safe Software’s new advancements in spatial data access with the release of FME 2008, which includes the new FME Server and a new version of FME Desktop (formerly Spatial Direct), the standard for spatial ETL (extract, transform, and load).
Throughout the conference, a common theme was the need for spatial data access. According to Safe president Don Murray, FME Server targets large migrations and data distribution, while FME Desktop, the replacement for the former FME Spatial Direct, brings new robustness to desktop spatial data access.
FME 2008 lets users transform data over the web and supports true 3D geometries. FME Server is the first enterprise-ready spatial ETL solution built on a scalable SOA; it centralizes spatial data conversion and distribution tasks. This includes streaming live data into mashups and web applications, publishing spatial data for download and distribution over the web, and high-throughput data conversion that lets enterprises manage large spatial ETL projects more efficiently and participate in spatial data infrastructure (SDI) initiatives. More FME Engines can be added as needed.
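The extract-transform-load pattern a server like this centralizes can be sketched generically; the in-memory feature format and the coordinate-shift transform below are illustrative stand-ins, not FME’s actual API.

```python
# Generic spatial ETL sketch: read features, transform their
# coordinates, and write them out. The feature layout and the
# shift transform are illustrative assumptions.
from typing import Callable, Dict, List, Tuple

Coord = Tuple[float, float]
Feature = Dict[str, object]  # e.g. {"id": 1, "coords": (x, y)}


def extract(source: List[Feature]) -> List[Feature]:
    """Extract: pull features from a source dataset (here, in memory)."""
    return list(source)


def transform(features: List[Feature],
              fn: Callable[[Coord], Coord]) -> List[Feature]:
    """Transform: apply a coordinate transformation to every feature."""
    return [{**f, "coords": fn(f["coords"])} for f in features]


def load(features: List[Feature], sink: List[Feature]) -> None:
    """Load: write the transformed features to the target dataset."""
    sink.extend(features)


# Example: shift a local grid by a fixed offset, a stand-in for a
# real datum or projection change.
source = [{"id": 1, "coords": (10.0, 20.0)},
          {"id": 2, "coords": (0.0, 5.0)}]
target: List[Feature] = []
load(transform(extract(source), lambda c: (c[0] + 100.0, c[1] + 100.0)),
     target)
print(target)
```

A server-side ETL product wraps this same pipeline behind a service interface, so many clients can request conversions without each maintaining its own translation code.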
A newcomer on the exhibit floor this year was Microsoft, showcasing its upcoming spatial support in SQL Server, along with Virtual Earth for the public sector. According to Jerry Skaw, marketing communications manager for the Virtual Earth Business Unit, if you have data that needs to be on a map, Virtual Earth can help. The new spatial support in SQL Server will provide a back-end database for Virtual Earth.