This week on Tech Speak, we sit down with S&T program manager Norman Speicher to discuss the urgent need for interoperability standards in response to the increasing effects of climate change—think floods or wildfires. Much of the response and recovery information that is gathered by flood sensors, wildfire sensors, and local agencies exists in unique and proprietary formats. Interoperability standards would allow these large amounts of data (that are already being collected) to be combined and shared, creating a clearer picture to prevent, plan for, and respond to emergencies. Tune in to hear Norman talk us through what happens when data can’t be shared, as well as S&T’s collaboration with the Open Geospatial Consortium, which helps ensure that it can be in the future.
Guest: Norman Speicher, Program Manager, Office of Mission and Capability Support
[00:00:00] You could be looking at a particular value and not know definitively whether that's a temperature, a measurement, or some other value. With data standards, you know exactly what to expect.
[00:00:14] Dave: Welcome to Tech Speak, a mini episode of the Technologically Speaking Podcast. In this episode, we're going to hear from Norman Speicher, a program manager with the Office of Mission and Capability Support. We caught up with him at the Open Geospatial Consortium, or OGC, Workshop. Norman was there to understand and align the climate research objectives of the Science and Technology Directorate with the industry and government stakeholders represented at OGC. Here is Norman to help explain that a little better.
[00:00:41] Norman: OGC has long been a partner in developing interoperability standards. Data interoperability standards allow us to share data across organizations, across tiers of governments, across other stakeholders involved in response and recovery.
[00:00:59] Dave: We asked Norman to talk about how climate change impacts the DHS mission.
[00:01:04] Norman: Climate change impacts DHS in different ways. Examples include wildland fires and floods. So, broadly speaking, natural disasters. Most of the research originated around sensors, so wildfire sensors and flood sensors. The wildfire sensor is obvious: it detects wildfires, allowing unattended sensors to notify local agencies of the existence or the emergence of a wildfire. Flood sensors are monitoring tributaries and giving municipalities and other entities advance notice of rising waters, so that they can respond accordingly.
[00:01:46] Norman: So, it's fairly new technology in a relative sense. And without interoperability, we could have a flood sensor from one vendor and not be able to compare it with a flood sensor from a different company. So there's a sort of vendor lock-in, and we're forced to perhaps use one vendor, or perhaps forced to use multiple systems because we're using two different types of flood sensors. So, interoperability allows us to get away from proprietary solutions. Interoperability is an interesting problem. In many ways, it's not a technology issue. With enough time and inclination, you can work through interoperability issues. However, industry in many instances is resistant to interoperability because they rely on selling proprietary systems, and they may view interoperability as a threat. From the DHS perspective, we view the threat as not being able to collect and interpret all the data available. Each system has its own proprietary format, its own way of interpreting the data, or its own way of accessing the data. And so, if one county has a given sensor system and another county has a different sensor system, then at the state level it would be difficult to deal with a situation where flooding, not unexpectedly, is crossing jurisdictional boundaries.
[00:03:22] Norman: There's a considerable amount of data that exists with various entities, state, local, non-governmental organizations, but oftentimes it's not possible to easily share that data with other entities. That's really where the OGC and other interoperability standards come into play. Without those standards, that information is no more accessible than if it were on paper.
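[Editor's note: To make the interoperability point concrete, here is a minimal sketch, not from the episode, of how two hypothetical vendors' proprietary flood-sensor payloads could be normalized into one shared record, modeled loosely on the style of OGC observation standards. Every vendor format, field name, and function here is invented for illustration.]

```python
# Hypothetical sketch: two vendors report the same kind of flood reading
# in incompatible proprietary formats. Normalizing both into one common
# schema is what makes their data comparable across jurisdictions.
# All formats and field names below are assumptions for illustration.

FT_TO_M = 0.3048  # feet-to-metres conversion factor

def normalize_vendor_a(raw: dict) -> dict:
    # Vendor A (hypothetical) reports: {"lvl": 3.2, "u": "ft", "t": "...Z"}
    level = raw["lvl"] * FT_TO_M if raw["u"] == "ft" else raw["lvl"]
    return {
        "phenomenonTime": raw["t"],
        "result": level,
        "unitOfMeasurement": {"name": "metre", "symbol": "m"},
        "observedProperty": "water_level",
    }

def normalize_vendor_b(raw: dict) -> dict:
    # Vendor B (hypothetical) reports: {"water_m": 0.98, "ts": "...Z"}
    return {
        "phenomenonTime": raw["ts"],
        "result": raw["water_m"],
        "unitOfMeasurement": {"name": "metre", "symbol": "m"},
        "observedProperty": "water_level",
    }

a = normalize_vendor_a({"lvl": 3.2, "u": "ft", "t": "2017-08-27T06:00:00Z"})
b = normalize_vendor_b({"water_m": 0.98, "ts": "2017-08-27T06:00:00Z"})
# Once both readings share one schema and one unit, a state-level system
# can compare water levels across county lines regardless of vendor.
```

Without the shared target schema, each downstream consumer would need bespoke code for every vendor, which is the lock-in Norman describes.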
[00:03:55] Dave: We asked Norman if he could give us an example from a recent event to highlight the need for interoperability, and he talked about Hurricane Harvey, which ravaged Harris County and Houston, Texas, in 2017.
[00:04:05] Norman: Doing an operational experiment with Harris County Fire and Police Department, we were told about some of the interoperability issues that they had during Hurricane Harvey. In one of the particularly acute situations, they had good, I believe it's called, soil inundation data. So, they had information that gave them details as far as where flooding was likely to occur. However, they didn't have data on development. Much of the data that they had on soil inundation didn't factor in the realities that, in many instances, those areas had been developed and were now covered with concrete, so there was no soil exposed. In practical terms, what that meant is that they had flooding that they never expected, in areas where they never expected it, because their modeling was not accurate due to these missing elements. If that data had been available, and the systems that kept that data had been interoperable, they likely would've made different operational decisions.
[00:05:12] Dave: By helping OGC and its members establish interoperability and information sharing, Norman Speicher and his colleagues are helping to organize the data that is already being collected and to see that it is shared with first responders and with the cities and counties planning for natural disasters, to help mitigate the consequences of climate change. This has been Tech Speak. Thanks for listening. Follow us on social media at DHS SciTech. DHS, S C I T E C H. Bye.