Executive Interview: Ted Okada, CTO, Federal Emergency Management Agency (FEMA)
FEMA Going to Work on Visual AI, Textual AI, Operations AI, and AI for Privacy Preservation
A member of the Senior Executive Service and FEMA’s Chief Technology Officer (CTO), Ted Okada is responsible for leading the technology strategy and direction for a wide variety of mission, business, and enterprise systems.
He has spearheaded a broad range of continuous improvement initiatives involving geospatial technologies, data analytics, and cyber security, as well as a whole community approach to interoperable communications in the event of a disaster. Mr. Okada is the creator and executive sponsor of OpenFEMA—the public’s resource for FEMA’s data provided in open, machine-readable formats. Launched in 2012 after Hurricane Sandy, OpenFEMA provides timely, usable, and accurate information enabling collaboration with Whole Community partners in support of disaster survivors.
Mr. Okada is a graduate of Northwestern University with a B.A. in Mathematical Methods in the Social Sciences and Economics. His experience includes over thirty years in international relief and development, a decade in internet services architecture, and two technology start-ups. Prior to becoming FEMA’s first CTO, Mr. Okada served as the Director of U.S. Global Public Private Partnerships and as the Director of the Humanitarian Systems Group at Microsoft developing solutions to the world’s most vexing and least served humanitarian problems.
He recently spent a few minutes talking to AI Trends Editor, John P. Desmond.
AI Trends: Thank you for talking to AI Trends today. You have a wide range of responsibility for leading technology strategy and direction at FEMA. How’s it going there?
Ted Okada: After being at FEMA for eight years now, I can confidently say the thing I am most proud of is being part of this team. The workforce at FEMA is the most professional it has been in its history. The technical skills, the understanding of hazards and resiliency, the response and recovery efforts, and the dedication to disaster survivors all demonstrate the staff’s commitment to FEMA’s mission of helping people before, during, and after disasters. It is a pretty amazing group of people.
Government is complex, the nation is complex, and disasters are complex. Superstorm Sandy happened two months after I started at FEMA. Since then, we have had the difficult and challenging storms in 2017—Harvey, Irma, and Maria—as well as the California wildfires, and major hurricanes Florence and Michael in 2018, not to mention smaller hurricanes and flooding events.
Each of these disasters posed unique challenges and complexities to the workforce. However, this agency analyzed the lessons learned and operationalized them, utilizing new technologies to ensure FEMA is ready for this year’s flood and hurricane seasons—and it is always earthquake season. It is a privilege to serve the nation in this way.
AI Trends: Good for you. What would you say is the role of AI at FEMA?
Ted Okada: We look at AI in the way I think a lot of government agencies are now looking at it. We believe AI technologies have immediate applicability to FEMA in areas such as automating mundane, manual tasks and advanced data analytics. In the short term, that means augmenting a lot of our current work, allowing staff to focus on value-added, strategic tasks and improving outcomes for disaster survivors. We want to give our analysts, who are experts in everything from flood plain management to disaster response, the tools to do their job more effectively and efficiently.
Right now, I’m focused on the machine learning part on inferential reasoning, and the use of more mathematics or statistics-based inference to help improve processes. The idea is to use independent reasoning capabilities within AI and machine learning to effectively enhance our workforce. We need to augment the work of our analysts with other techniques like generative adversarial networks and other tools that extend inferential learning to augment decision-making.
In addition, there are many other uses for AI/ML in the emergency management space—some of which can be pursued by FEMA and others that are more appropriate for our Whole Community partners to pursue. I believe that FEMA should only be focused on AI as it directly pertains to our core mission. A couple of examples could be using sentiment analysis from social media to monitor disaster conditions or applying AI/ML techniques to imagery to identify the extent of damage after a disaster.
AI Trends: What are the current priorities for projects incorporating AI at FEMA?
Ted Okada: We are looking at four general buckets: Visual AI, Privacy Protection AI, Operations AI, and Textual AI.
The first type, Visual AI, is being used throughout government and has direct applicability to FEMA. Visual AI identifies objects in images and video captured by remote sensing. A good use case would be to view a particular property after a disaster and determine whether it suffered minor, major, or complete damage.
Or it can be used to quickly assess what’s happening on the ground after a flood or hurricane landfall to accelerate a disaster declaration process. We want to get our FEMA Administrator all the data he or she needs to make a rapid, evidence-based assessment.
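The damage-grading idea can be sketched as a minimal nearest-centroid classifier. The feature vectors and category centroids below are purely hypothetical stand-ins for what a real computer-vision pipeline would extract from imagery; they are assumptions for illustration only, not FEMA's actual model.

```python
import math

# Hypothetical features derived from post-disaster imagery:
# [fraction of roof intact, debris coverage, vegetation loss].
# The centroids per damage category are assumed for illustration.
CENTROIDS = {
    "minor":     [0.9, 0.1, 0.1],
    "major":     [0.5, 0.5, 0.4],
    "destroyed": [0.1, 0.9, 0.8],
}

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def grade_damage(features):
    """Assign the damage category whose centroid is nearest."""
    return min(CENTROIDS, key=lambda label: euclidean(features, CENTROIDS[label]))

# A property with most of its roof gone and heavy debris
# falls nearest the "destroyed" centroid:
print(grade_damage([0.2, 0.8, 0.7]))
```

In practice a deep network would produce the features and probabilities, but the decision step—mapping evidence to a damage category—has this same shape.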
The second area of focus is Privacy Protection AI. How we protect privacy, especially of survivors and citizens, is of utmost importance. We need to ensure that personal data is truly protected from any potential breach. FEMA is looking to use AI, ML, and inferential models to improve our privacy protection and privacy preservation efforts.
A lot of these folks have lost everything in a disaster. All they may have left is a slab of concrete, as was certainly true for survivors of California's wildfire season. It is critical that when the government quickly provides benefits to eligible survivors, their privacy is protected, particularly when data is shared between Federal agencies or with State governments.
The third bucket is Operations AI, or AI Ops: improving the standard operations within FEMA itself. We are finding that getting better efficiency and effectiveness in our IT operations requires better predictive analytics. How do we better predict failure?
That brings up a larger question about risk management. One thing that helps us understand predictive failure is to understand the possible risk, especially cyber risk. How do we institute a more risk-based framework when thinking about our government systems utilizing AI/ML?
Using historic data and analytical models, AI Ops can be used to help perform failure risk estimation, root cause analysis, and prediction of impending cyber threats—all of which can enhance our operational response and uptime.
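One simple form of the predictive-failure idea is flagging operational metrics that deviate sharply from their historical baseline. The sketch below uses a plain z-score test on hypothetical latency readings; real AI Ops tooling would use richer models, but the principle is the same.

```python
import statistics

def flag_anomalies(history, recent, threshold=3.0):
    """Flag recent metric readings that deviate sharply from history.

    history: past readings of an operational metric (e.g., response latency).
    recent: new readings to score.
    Returns the readings whose z-score exceeds the threshold.
    """
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return [x for x in recent if abs(x - mean) / stdev > threshold]

# Hypothetical latency history in milliseconds, steady around 100 ms:
history = [98, 101, 99, 102, 100, 97, 103, 100, 99, 101]
print(flag_anomalies(history, [100, 104, 250]))  # only the 250 ms spike is flagged
```

A spike like that, caught early, is the kind of signal that feeds failure risk estimation and root cause analysis before an outage occurs.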
Finally, the fourth area: we have a lot of words in government. We have units that look at simplifying our policies to make them more user-friendly for citizens and our state and county stakeholders. So we also have analysts who focus primarily on text. We'll look at word vectors and how we process textual data, and then turn that into a scoring framework. Textual AI, if you will.
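At its simplest, turning text into vectors means counting words and comparing the resulting vectors. The sketch below builds bag-of-words vectors and scores their similarity with cosine similarity; the example sentences are invented, and production systems would use learned word embeddings rather than raw counts.

```python
import math
from collections import Counter

def word_vector(text):
    """Bag-of-words count vector for a piece of text."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine similarity between two word-count vectors (0 = unrelated)."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

doc1 = word_vector("assistance for disaster survivors")
doc2 = word_vector("disaster assistance for survivors and families")
doc3 = word_vector("routine network maintenance schedule")

# Related policy text scores higher than unrelated text:
print(cosine_similarity(doc1, doc2) > cosine_similarity(doc1, doc3))  # True
```

A scoring framework for plain-language policy review can be built on exactly this kind of comparison between a draft and a plain-language reference corpus.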
Those four are the general areas where AI is relevant to FEMA today. We are still taking baby steps to lay the critical foundation for this effort such as our ability to identify, tag, enable, and govern data to support AI/ML technologies. Enablement begins with data, and the ability to generate machine-readable data.
AI Trends: OpenFEMA is an initiative of yours designed to get cooperation from the media, nonprofits, and universities. How is that working?
Ted Okada: Before I answer that, a quick clarification. OpenFEMA is really the public’s resource for accessing a vast amount of data that FEMA collects around various aspects of emergency management. One of the biggest benefits of OpenFEMA is that it enables external partners to analyze and use FEMA’s data to improve outcomes in their support of disaster survivors. Yes, this includes the media, nonprofits, and universities, but it also includes financial and insurance companies, mortgage and real estate companies, private sector companies, other Federal/State/Local government agencies, and citizens/individuals.
OpenFEMA has been a success and keeps growing. It launched soon after Superstorm Sandy in 2012, when it was first used to coordinate the release of Sandy Housing Assistance data for New Jersey and New York, necessary to apply for Community Development Block Grant funds. Since then, FEMA's data has been used to visualize flood risks and potential costs, inform fire safety guidance, create disaster response apps, and spur academic research, all of which helps increase understanding of risk for people before, during, and after disasters.
Another OpenFEMA success has been our API-driven approach. If your readers go to OpenFEMA.gov/open, they will see that FEMA's data sets are available both for download and via an application programming interface (API) that provides data in an easily digestible, machine-readable format.
The success of the OpenFEMA API is now influencing other API efforts within FEMA. FEMA is in the midst of multiple major system modernizations, all of which have a big need for API standardization and opening the aperture on the data they release.
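As a small illustration of what machine-readable access looks like, the sketch below builds an OpenFEMA query URL and shows how the JSON could be fetched. The dataset name and OData-style parameters follow OpenFEMA's published conventions at the time of writing, but readers should check the current API documentation before relying on them.

```python
import json
import urllib.parse
import urllib.request

BASE = "https://www.fema.gov/api/open/v2"

def build_query(dataset, filter_expr=None, top=10):
    """Build an OpenFEMA API URL with OData-style query parameters."""
    params = {"$top": str(top)}
    if filter_expr:
        params["$filter"] = filter_expr
    return f"{BASE}/{dataset}?{urllib.parse.urlencode(params)}"

def fetch(url):
    """Fetch and decode a JSON response (requires network access)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

url = build_query("DisasterDeclarationsSummaries", "state eq 'NJ'", top=5)
print(url)
# fetch(url) would return up to five New Jersey disaster declarations as JSON.
```

The same pattern works for any OpenFEMA dataset, which is what makes the API attractive to researchers, insurers, and app developers alike.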
So, how does this relate to AI?
When releasing our data to the public, privacy considerations are paramount. We are increasingly concerned about the sophistication of cyber-threats as well as the use of AI in cyberattacks and see increasing challenges in protecting and preserving privacy while still providing services to citizens. To that end, FEMA is beginning to explore the use of AI for privacy protection and privacy preservation. We are looking at techniques such as K-anonymity and differential privacy to mask, delete, or transform data such that privacy is assured.
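The k-anonymity idea can be sketched concretely: a data release is k-anonymous if every combination of quasi-identifiers (like generalized ZIP code and age bracket) appears in at least k records, so no individual stands out. The records below are invented for illustration.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Check k-anonymity: every combination of quasi-identifier
    values must appear in at least k records."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

# Hypothetical generalized survivor records (ZIP truncated, age bucketed):
records = [
    {"zip": "070**", "age": "30-39", "aid": 1200},
    {"zip": "070**", "age": "30-39", "aid": 800},
    {"zip": "112**", "age": "40-49", "aid": 1500},
    {"zip": "112**", "age": "40-49", "aid": 950},
]
print(is_k_anonymous(records, ["zip", "age"], k=2))  # True: each group has 2 records
```

Differential privacy goes further, adding calibrated statistical noise to query results so that no single record can be inferred even by an adversary with auxiliary data.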
AI Trends: How is the private software and services industry responding to AI-related software and system requirements at FEMA? Are you getting what you need from the industry? Are there any key industry partners in your efforts that bear mentioning?
Ted Okada: Widely attended conferences are very beneficial to get the word out about visual AI, textual AI, operations AI, and AI applied to privacy preservation. We see burgeoning expertise in all 50 states and territories in AI and want to leverage the best minds in the nation. If they have a solution, we want to know about it.
Getting noticed by industry journalists and analyst communities is also important, but there is a wider tech press related to emergency management that can be a great platform to highlight an industry innovation.
AI Trends: How is AI helping with cyber security?
Ted Okada: Local, county, city, state, and federal government agencies are all increasingly experiencing cyberattacks and ransomware attacks. Better identity management, facilitated by AI, will lead to better protection.
At FEMA, we are evaluating technologies around the emerging concept called “zero trust networking”, which is dependent upon a solid understanding of identity protection, security, and management. Part of zero trust is learning and adapting—trying to identify risky or aberrant user behavior that may compromise the network. This is where AI and ML come into play. AI will strengthen this approach to identity management in terms of how trust models and trust scores are developed and matured.
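Zero-trust scoring can be sketched as combining behavioral risk signals into a single trust score. The signal names and weights below are illustrative assumptions, not FEMA's actual model; a real system would learn these from user behavior over time.

```python
def trust_score(signals, weights=None):
    """Combine behavioral risk signals into a simple trust score in [0, 1].

    signals: dict of risk indicators scaled to [0, 1] (1 = riskiest),
    e.g. login at an unusual hour, new device, atypical data volume.
    Signal names and weights here are hypothetical.
    """
    weights = weights or {name: 1.0 for name in signals}
    total = sum(weights.values())
    risk = sum(signals[name] * weights[name] for name in signals) / total
    return 1.0 - risk

normal = {"odd_hour": 0.1, "new_device": 0.0, "data_volume": 0.2}
risky = {"odd_hour": 0.9, "new_device": 1.0, "data_volume": 0.8}
print(trust_score(normal) > trust_score(risky))  # True
```

In a zero-trust architecture, a score like this is re-evaluated continuously, and access decisions adapt as the score moves, rather than trusting a user permanently after one login.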
These kinds of augmenting technologies are going to be critical for how emergency management is carried out at all levels of government.
AI Trends: Do you have any advice for students who might be interested in a career in AI? What should, for example, high school students be thinking about? What should undergrads be studying? How can early and mid-career people come up the learning curve on AI? What would be your suggestion to them?
Ted Okada: I was a math major in college; mathematics is critical, starting with calculus and the applied mathematics that historically sat in engineering departments.
Things like differential equations, calculus of variations, algebraic topology and number theory, topology itself, graph theory, and numerical methods can all become relevant. Then pair that with statistics: everything from the management of data to statistical analysis and inference.
It's interesting that we generally have more analysts coding in R than in Python, and they leverage a wide base of statistical learning: inferential models, stochastic processes, scaling up to other techniques such as clustering, logistic regression, and classification.
Back to the R versus Python question. It's not that one is better than the other; what I found interesting is that a lot of these folks at FEMA who are now approaching ML come out of basic statistics backgrounds.
On my desk, I have the high school AP Statistics study guide for lunchtime brain teasers; it's better than Sudoku. Are you good at understanding analysis of variance and linear optimization? These are some of the basic things that you learn in high school, but they are also the same skills that you need when it comes to data science and, ultimately, machine learning.
Math will remain the cornerstone of machine learning and AI into the future. Programming languages, while important, change often and evolve rapidly. The reality is that math, statistics, statistical reasoning, and inferential thinking don't change much over time. So being grounded in the classics, if you will, of mathematics and applied/engineering mathematics will likely benefit you greatly.
AI Trends: Do you have to get involved in politics in your job at FEMA?
Ted Okada: No, FEMA’s mission transcends politics—“helping people before, during, and after disasters”.
However, I do believe if one desires to serve in government, one must understand government. And the best way to do that is to understand our history. One of my favorite quotes is by Harry Truman: “There is nothing new in the world except the history you do not know.” I am a big fan of the classics, from Homer’s Iliad and the Odyssey, to Thucydides and Herodotus, to Socrates, Plato, and Aristotle, all the way to our Western tradition as a democracy.
Trying to understand the political environment divorced from history is not going to make you successful in navigating work in government at any level. Being an informed citizen who has read the Declaration and the Constitution is going to make you a better government employee. That's been my experience. The more complex the challenges I face at FEMA, the deeper into our history I must go, both to understand them and to help develop practical solutions in support of FEMA's mission.
Learn more at FEMA and OpenFEMA.