Data Reference Model


Definition/Description (What) – defines the primary considerations for describing, discovering, delivering, and sharing common data using open standards and the promotion of uniform data management practices to sustain data as a national asset.

Purpose/Function (Why) – to promote the common identification, tagging, sharing, and reuse of appropriate geospatial data/information resources across communities. It contributes to the mission/business Operational Requirements Document to determine what data inputs and assets are required to meet the functional needs of the stakeholder. This section describes how to:

• Establish a process for baselining and documenting geospatial data inputs and datasets.

• Provide guidance for preparing data description, context, and sharing methods.

• Provide [limited] references to common operating data and other sources.

Stakeholder Performance Guide (Who & How) – driven by mission/business requirements and the associated functional capabilities identified in the Operational Requirements Document, data resource investment must be a shared responsibility agreed to by Executive Leadership, managed as an enterprise/corporate resource and service administered by a steward (e.g., Program Manager), and implemented by the Solution Architects.


Geospatial data are the biggest cost in an enterprise geospatial solution, and remain the primary need, challenge, and barrier for the geospatial community.

Data Reference Model Approach

Geospatial data identification, search, discovery, and access continue to be the primary challenge for the geospatial professional/user, but are now compounded by the fact that geospatial technology has become “commoditized” to a level where the general public has come to expect it to be just another query tool in their web browser. This is the same expectation that the geospatial Program Manager and Solution Architect face when providing geo-data and services to mission/business owners within their organizations who are not geospatial professionals.

The GIRA’s Data Reference Model will focus upon a practical approach to documenting the geospatial data requirements within and across an organization to meet mission/business requirements as well as provide guidance for data description, context, and sharing. The GIRA Data Reference Model is not:

  • A data management “how to” manual for building and maintaining data architectures.
  • A government-wide conceptual data model or fully attributed logical data model.
  • A set of eXtensible Markup Language (XML) schemas.
  • A replacement of existing data structures within an organization’s geospatial enterprise.

Geospatial Baseline Assessment Matrix

Once established, an organization’s Executive Steering Committee should have the authority and ability to initiate an enterprise-wide Geospatial Baseline Assessment. The baseline would include data requirements based upon mission/business functional needs taken from the Operational Requirements Document. Agencies should perform a business analysis that generates the geospatial data requirements, including data model, spatial and temporal coverage, accuracy, and quality. The potential sources for that geospatial information, ranging from self-production, to use of another agency’s data product, to direct acquisition from commercial sources, should be considered. This should yield a data architecture that defines information types and data requirements in terms of business needs.

The Baseline Assessment for Data can provide multiple benefits and support several reporting requirements. The Data Assessment is essentially an inventory and catalog of current data holdings as well as planned needs based upon functional requirements. Beyond its inventory role, the Matrix is a tool to help foster discussion with the stakeholder community to better understand the mission/business requirements.

“The creation of duplicative data and redundant capabilities often results from consumers’ inability to locate, access, understand, or trust that existing data assets meet their needs.”[1]

The Data Assessment Matrix is not exhaustive, and many themes of data within each category listed in Table 1 would require further discussion to identify the best available dataset to meet the business need. Agencies should use the Data Reference Model from the Federal Enterprise Architecture to help create and maintain their inventory. The inventory would facilitate the identification of “authoritative”[2] and/or desired datasets; identify redundant data assets for decommissioning; identify opportunities to reuse or extend a data asset rather than creating a new one; and create the opportunity to avoid redundancy costs through the establishment of enterprise licensing agreements.



The GeoCONOPS[3] defines authoritative data owned and/or produced by the federal entities supporting the National Response Framework as:

  • Rational Authority: Government agencies are by default the “authoritative” sources for data or services that they produce or for which they have a statutory responsibility.
  • Expert Authority: Scientifically authoritative data is defined in the realm of the various professions under which the standards and methodology for data are created.

These classifications provide clarity beyond the frequent notion that an authoritative data source is simply the entity trusted because of a subjective belief that it is the “best” or “most accurate” source for a specific data theme. The owner or authoritative source of any geospatial data is responsible for defining the business rules for the access and sharing of that information across the stakeholder community.

The Baseline Assessment would also serve to meet OMB’s Open Data Policy[4] responsibility to “Create and maintain an enterprise data inventory… that accounts for datasets used in the agency’s information systems.”

The Baseline Assessment discussion would begin with the Data Inputs (Table 1) covering the general types of data required to meet the mission/business functional capabilities of the organization’s stakeholders as identified in the Operational Requirements Document. This can generally be performed by the geospatial investment Program Manager and Solution Architects initiating discussions with both geospatial investment owners and business owners that may need geospatial functionality/services but do not intend to have a system investment. A follow-on, more detailed inventory would then be performed to fully identify datasets currently held or planned for acquisition. Table 2 provides an extract of the 6-page Baseline Dataset Assessment Matrix.


Table 1. Geospatial Baseline Assessment Matrix: Data Inputs



Table 2. Geospatial Baseline Assessment Matrix: Datasets (Extract)



[1] DoD 8320.02-G, Guidance for Implementing Net-Centric Data Sharing, April 2006, available at

[2] Department of Homeland Security, Geospatial Concept of Operation (GeoCONOPS) available at

[3] Ibid.

[4] OMB Memorandum M-13-13, Open Data Policy – Managing Information as an Asset (May 9, 2013), available at


Data Reference Model Alignment

The Data Reference Model’s (DRM) primary purpose is to promote the common identification, use, and appropriate sharing of data/information across the federal government. The DRM is a flexible and standards-based framework to enable information sharing and reuse via the standard description and discovery of common data and the promotion of uniform data management practices.[1] The DRM focuses upon two core questions: What information is available for sharing and re-use, and what are the information gaps needing correction?[2] The DRM provides a standard means by which data may be described, categorized, and shared while respecting security, privacy, and appropriate use of that information. It consists of three standardization areas:[3]

  • Data Description: Provides a way to uniformly describe data to convey meaning, thereby supporting its discovery and sharing.
  • Data Context: Facilitates discovery of data through an approach to the categorization of data according to taxonomies. Additionally, enables the definition of authoritative data assets within a common operating environment.
  • Data Sharing: Supports the access and exchange of data where access consists of ad-hoc requests (such as a query of a data asset), and exchange consists of fixed, recurring transactions between parties. This is enabled by capabilities provided by both the Data Context and Data Description standardization areas.

This standardized Data Reference model structure is depicted in Figure 1. [4]


Data Reference Model Structure

Figure 1. Data Reference Model (DRM) Structure


Data Description provides a means to uniformly describe data, thereby supporting its discovery and sharing. Traditionally, data description was solely focused on organizing and describing structured data [geographic base data layers]. With unstructured data [mission/business data that may contain a geographic element] as the largest focus of agencies’ data management challenges, the Federal Enterprise Architecture Framework’s DRM Description component has been revised to focus on the larger topic of metadata, which includes both traditional structured data and unstructured data description.[5]

Metadata is structured information that describes, explains, locates, or otherwise makes it easier to retrieve, use, or manage an information resource (NISO 2004, ISBN: 1-880124-62-9).[6] The challenge is to define and name standard metadata fields so that a data consumer has sufficient information to process and understand the described data. The more information that can be conveyed in a standardized regular format, the more valuable data becomes. Metadata can range from basic to advanced, from allowing one to discover the mere fact that a certain data asset exists and is about a general subject all the way to providing detailed information documenting the structure, processing history, quality, relationships, and other properties of a dataset that enable a potential user to determine its fitness of use for their purposes.

The International Organization for Standardization (ISO) 19115-1:2014, Geographic Information Metadata, Part 1: Fundamentals, defines the schema required for describing geographic information and services. It provides information about the identification, extent, quality, spatial and temporal schema, spatial reference, and distribution of digital geographic data.[7] The Metadata Standard is applicable to:

  • Cataloguing of datasets, clearinghouse activities, and the full description of datasets.
  • Geographic datasets, dataset series, and individual geographic features and feature properties.

The ISO Metadata Standard also defines:

  • Mandatory and conditional metadata sections, and metadata elements.
  • Minimum set of metadata required to serve the full range of metadata applications (e.g., data discovery, determining data fitness for use, data access, data transfer, and use of digital data).
  • Optional metadata elements—to allow for a more extensive standard description of geographic data.
  • Method for extending metadata to fit specialized needs.


What is commonly understood as metadata comprises:[8]

  • Identification information, i.e., information to uniquely identify the resource such as:
    • Title, abstract, reference dates, version, purpose, responsible parties, …
    • Data extent,
    • Browse graphics (overview, thumbnail, …), and
    • Possible usage.
  • Content Description, i.e., information identifying the feature catalogue(s) used and/or information about the coverage content.
  • Distribution information, i.e., information about the distributor of, and options for obtaining, the resource.
  • Legal and security constraints, i.e., restrictions placed on the data and metadata in the context of delivering, accessing, and using them.
  • Portrayal information, i.e., information identifying the portrayal catalogue used.
  • Reference system information, i.e., identification of the spatial and temporal system(s) used in the resource data.
  • Spatial Representation, i.e., information concerning the mechanisms (for example, coverage or vector) used to spatially represent the resource data.
  • Quality and validity information, i.e., a general assessment of the quality of the resource data including:
    • Quality measures related to the geometric, temporal, and semantic accuracy, the completeness, or the logical consistency of the data,
    • Lineage information including the description of the sources and processes applied to the sources, and
    • Validity information related to the range of space and time pertinent to the data; to whether the data has been checked against a measurement or performance standard; or to what extent the data is fit for purpose.
  • Maintenance information, i.e., information about the scope and frequency of updating of the resource data.
  • Information about metadata, i.e., identifier for the metadata itself, information about the language and character set of the metadata, metadata date stamp, metadata point of contact, and name and version of the metadata standard.
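As an illustration (not part of the GIRA itself), the descriptive elements above can be sketched as a minimal metadata record. The field names below are simplified stand-ins for the corresponding ISO 19115 elements, and the dataset values are hypothetical examples.

```python
# A minimal, simplified metadata record illustrating the descriptive
# elements listed above. Field names are illustrative stand-ins for
# ISO 19115 elements, not the standard's actual element names.
record = {
    "identification": {
        "title": "Flood Zones (hypothetical example)",
        "abstract": "Flood hazard zones for a sample region.",
        "reference_date": "2014-04-03",
        "responsible_party": "Example Agency",
        "extent": {"west": -125.0, "east": -66.9, "south": 24.5, "north": 49.4},
    },
    "distribution": {"format": "Shapefile", "online_resource": "https://example.gov/data"},
    "constraints": {"access": "public", "use": "none"},
    "reference_system": "EPSG:4326",
    "quality": {"lineage": "Compiled from surveyed source maps."},
    "maintenance": {"update_frequency": "annually"},
    "metadata": {"standard": "ISO 19115-1:2014", "language": "en", "date_stamp": "2014-04-03"},
}

def mandatory_fields_present(rec):
    """Check the minimal discovery fields a catalog would typically require."""
    ident = rec.get("identification", {})
    return all(k in ident for k in ("title", "abstract", "extent"))

print(mandatory_fields_present(record))  # True
```

A catalog ingest process could apply a check like `mandatory_fields_present` to reject records that lack the minimum elements needed for discovery.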


All nationally significant and other federally stewarded geospatial data should be documented with descriptive metadata to enable discovery, assessment of fitness-of-use, and sharing of geospatial data resources.[9] Geospatial metadata should be organized by a common schema in accordance with ISO metadata standards. Metadata explicitly defines distribution rights and restrictions to enable role-based access implemented through federal e-authentication initiatives.

The Office of Management and Budget’s Open Government Directive[10] requires agencies to expand access to information by making it available online in open formats. Specifically, this Memorandum requires agencies to collect or create information in a way that supports downstream information processing and dissemination activities. This includes using common core and extensible metadata for all new information creation and collection efforts.

Federally sponsored Metadata Working Groups promote the advancement, adoption and use of geospatial metadata and provide considerable expertise, documentation, training, and information for users including:



Data Context is any information that provides additional meaning to data and an understanding of the purposes for which it was created. The Data Context method can also be called “categorization” or “classification.”[11] OMB’s Open Data Policy responsibility to “Create and maintain an enterprise data inventory … that accounts for datasets used in the agency’s information systems”[12] provides input for the categorization. The Baseline Assessment (Matrix Datasets) allows organizations to begin to agree upon data taxonomies, definitions, and authoritative sources. Once data assets are inventoried, categorized, and shared in data registries, these catalogs become a source for discovering data, assessed based upon metadata that determines utility to the user. The catalog and its taxonomy are not meant to be fixed and unchanging, but flexible and scalable, so that new Subjects and Topics can be added as the organization’s business model and processes change.

The Federal Enterprise Architecture Framework V2.0 describes Data Categorization Methods[13] and provides best practices examples (Table 3) used to describe the common data assets.


Table 3. Federal Enterprise Architecture Framework V2.0: Data Categorization Method




Data Asset Catalog

A data asset catalog reduces time and cost to implement change by reducing the time to locate needed data, identifies redundant data assets for decommissioning, and identifies opportunities to reuse or extend a data asset rather than creating a new data asset.

Data Management Body of Knowledge (DMBOK), DAMA, April 2009.


Using the Data Taxonomy, agencies should inventory their data assets, associate or map the data assets to the Data Taxonomy and create a data catalog consisting of the Taxonomy Subject, Taxonomy Topic, Entity Name (table, class, file name), Attribute Name (column, attribute, field, tag), and Data Asset Population (a rule to limit scope of an association). The data asset catalog is populated through a “bottom up” process that associates the data contents of a data asset documented in the data asset’s data model to the Data Taxonomy.

DoD 8320.02-G, Guidance for Implementing Net-Centric Data Sharing, April 2006


An agency can create a data catalog with the following steps: 1) Inventory data assets and collect the data model or structure for each asset, 2) Map the asset characteristics to the Data Taxonomy, 3) Present the results in a data catalog. The data asset catalog provides the foundation of an enterprise data inventory, which lists and describes all agency data sets used in the agency’s information systems and is required by OMB’s Policy on Managing Government Information as an Asset.


In summary, an organization can create a geospatial data catalog with the following steps:

  1. Inventory data assets and collect the data model or structure for each asset.
  2. Map the asset characteristics to the DRM Taxonomy.
  3. Present the results in a data catalog that is exposed for Search and Discovery.
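The three steps above can be sketched as follows. The taxonomy subjects/topics, entity names, and attributes are hypothetical examples, not drawn from any federal taxonomy.

```python
# Sketch of the three catalog-building steps: inventory data assets,
# map each asset to a taxonomy, and present the result as a
# searchable catalog. All names below are hypothetical.

# Step 1: inventory data assets and collect each asset's data model.
inventory = [
    {"entity": "FloodZones", "attributes": ["zone_id", "risk_level", "geometry"]},
    {"entity": "RoadCenterlines", "attributes": ["road_id", "name", "geometry"]},
]

# Step 2: map asset characteristics to a (hypothetical) data taxonomy.
taxonomy_map = {
    "FloodZones": ("Hazards", "Flooding"),
    "RoadCenterlines": ("Transportation", "Roads"),
}

# Step 3: present the results as a flat catalog exposed for search.
catalog = [
    {
        "taxonomy_subject": taxonomy_map[asset["entity"]][0],
        "taxonomy_topic": taxonomy_map[asset["entity"]][1],
        "entity_name": asset["entity"],
        "attribute_names": asset["attributes"],
    }
    for asset in inventory
]

def search(catalog, subject):
    """Discover catalog entries by taxonomy subject."""
    return [e["entity_name"] for e in catalog if e["taxonomy_subject"] == subject]

print(search(catalog, "Transportation"))  # ['RoadCenterlines']
```

The "bottom up" association described above is what `taxonomy_map` represents: each entity in an asset's data model is tied to a Taxonomy Subject and Topic so that users can discover assets by business category rather than by physical table name.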

There are numerous geospatial data cataloging initiatives and websites available for search, discovery, posting and retrieval. Each catalog capability, however, uses its own taxonomy for the inventory and includes variations such as: Community Categories, Types, Groups, Tags, Layers, Name, Keyword, etc.

The geospatial community does not maintain a standardized, consensus driven or commonly applied taxonomy to catalog geospatial data assets.

Taxonomies describing nationally significant and other federally stewarded geospatial data should be documented using eXtensible Markup Language (XML) Topic Maps, Web Ontology Language (OWL), Resource Description Framework (RDF) hierarchies, or ISO 11179 classification schemes.[14] Taxonomies describing geospatial data can then be made accessible via services to facilitate efficient search, discovery, and data translation capabilities and to facilitate development of more detailed data schemas and logical data models.


Mapping each geospatial data asset in an agency’s data asset catalog to the agency’s data categorization taxonomy enables users to identify the data assets that satisfy their search criteria.[15] There are three primary search and discovery methods, as described in Table 4.


Table 4. Standards-based Content Search Methods



Federated Search

A real-time, simultaneous search of multiple resource collections that may reside on many separate domains. Federated Search utilizes a broker to accept a query request, broadcast it out to a number of providers, and aggregate the results into a combined set for the consumer.

Centralized Search

Operates by creating a central index of content obtained by crawling web sites and following web feeds. Search queries are then executed against the index.

Enterprise Metadata Catalog

A central catalog of discovery metadata organized into collections. The Enterprise Metadata Catalog searching mechanism supports precise criteria such as geospatial and temporal parameters as well as full-text search. The Defense Discoverable Metadata Specification (DDMS)[16] is an example of an appropriate metadata tagging standard for the contents of an Enterprise Metadata Catalog.

Regardless of the search method applied, each requires the effective use of Data Description and Data Context in order to identify the requested data.
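A federated-search broker of the kind described in Table 4 can be sketched as below. The two provider functions are in-memory stand-ins for remote catalog endpoints, and their holdings are hypothetical.

```python
# Sketch of a federated-search broker: accept one query, broadcast it
# to several providers, and aggregate the results into a combined,
# de-duplicated set. Providers stand in for remote catalogs.

def provider_a(query):
    data = ["National Flood Hazard Layer", "Flood Insurance Rate Maps"]
    return [d for d in data if query.lower() in d.lower()]

def provider_b(query):
    data = ["Flood Zones 2013", "Road Centerlines"]
    return [d for d in data if query.lower() in d.lower()]

def federated_search(query, providers):
    """Broker: broadcast the query to each provider and merge results."""
    results = []
    for provider in providers:
        for hit in provider(query):
            if hit not in results:  # de-duplicate across providers
                results.append(hit)
    return results

print(federated_search("flood", [provider_a, provider_b]))
```

A production broker would issue the provider calls concurrently over the network and handle timeouts, but the accept/broadcast/aggregate pattern is the same.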


Geospatial data sharing (e.g., identification of and access to requested data) is often the greatest need and obstacle for the user community. The Office of Management and Budget’s (OMB) Open Government Directive requires agencies to expand access to information by making it available “online in an open format that can be retrieved, downloaded, indexed, and searched by commonly used web search applications.”[17] Specifically, this Memorandum requires agencies to collect or create information in a way that supports downstream information processing and dissemination activities. This includes using machine-readable and open formats, data standards, and common core and extensible metadata for all new information creation and collection efforts.

CHALLENGE: “Federal, state, local, and tribal organizations typically use different definitions in the storage and exchange of like data across a community of interest.”[18]

Creating a standardized information exchange with agreed upon data descriptions, or Content Model, enables each participating organization to create the necessary interface to receive or provide data only once. Content models refer to community agreements on the elements, relationships between elements, semantics and so forth for a specific data set in a given domain.

Further, content models are implementation independent and vendor neutral. In order to automate and make the exchange of domain-specific geospatial data seamless, consensus needs to be built among the community participants on:

  • A shared data model for data exchange, in terms of a common understanding and agreement for how different systems “understand” each other;
  • Common definitions of the different data entities and their properties; and
  • Common controlled vocabularies and taxonomies.


Existing exchange partners can use a new participant’s data without having to write any interface or transformation. Also, this standardization improves the quality of information exchange by ensuring that the source and target mapping is accurate, by utilizing the exchange model and standardized data definitions.

The DRM abstract model can be implemented using different combinations of technical standards. For example, the Exchange Package concept in the Data Sharing standardization area may be represented via different messaging standards (e.g., an eXtensible Markup Language (XML) schema or an Electronic Data Interchange (EDI) transaction set) in a system architecture for purposes of information sharing.[19]
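As a hedged illustration of an XML-encoded exchange package, the sketch below builds and parses a small message with the Python standard library. The element names (`ExchangePackage`, `Payload`, etc.) are hypothetical illustrations of the Exchange Package concept, not taken from any actual messaging standard.

```python
# Build and parse a minimal XML "exchange package" using only the
# standard library. Element names are hypothetical, illustrating the
# DRM Exchange Package concept rather than a real schema.
import xml.etree.ElementTree as ET

pkg = ET.Element("ExchangePackage", attrib={"id": "EP-001"})
meta = ET.SubElement(pkg, "Metadata")
ET.SubElement(meta, "Supplier").text = "Agency A"
ET.SubElement(meta, "Consumer").text = "Agency B"
payload = ET.SubElement(pkg, "Payload")
feature = ET.SubElement(payload, "Feature", attrib={"type": "FloodZone"})
ET.SubElement(feature, "ZoneId").text = "AZ-12"

# Serialize for transmission to the exchange partner.
xml_text = ET.tostring(pkg, encoding="unicode")

# A receiving system parses the package and extracts the payload.
parsed = ET.fromstring(xml_text)
zone = parsed.find("./Payload/Feature/ZoneId").text
print(zone)  # AZ-12
```

Because both parties agree on the package structure in advance, the receiver's extraction code is written once and reused for every supplier that conforms to the exchange schema.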

The Geospatial Profile of the Federal Enterprise Architecture[20] describes geospatial data sharing in the following context. Geospatial data schemas define how geospatial data are organized, how geospatial objects relate to each other, and the attributes associated with each object. For maximum interoperability, these schemas must be based on standards for logical (abstract/database design) and physical (encoding/exchange) applications. The extension of the National Information Exchange Model (NIEM)[21] with Geo4NIEM is an example of a federally developed data schema that incorporates geospatial content. Community collaboration and harmonization of semantics and exchange schemas are used to provide common approaches and resolve discrepancies.

The FEA DRM provides an architectural pattern for sharing and exchanging data through a services-oriented strategy. Geospatial data should be encoded using appropriate interface standards and specifications to enable data exchange (fixed recurring transactions between data suppliers and consumers) and less structured requests for data access.

Geospatial architectures should leverage metadata catalogs as exposure mechanisms to enable consumers to discover the availability and fitness-of-use of relevant geospatial data while also providing an effective means to connect consumers with authoritative geospatial data through service-oriented discovery, brokering, and access. To facilitate data sharing, geospatial standards should:

  • Be open and vendor-neutral to enable exploitation by a broad range of technology solutions.
  • Be based on voluntary consensus standards (ISO/ANSI/FGDC/OGC) or community standards.
  • Promote encoding of full geographic information (i.e., raster and vector spatial data and their attributes) in support of multiple mission requirements.


[1] Office of Management and Budget, Federal Enterprise Architecture Framework, Version 2, January 29, 2013, available at

[2] Office of Management and Budget, A Common Approach to Federal Enterprise Architecture, May 2, 2012, available at

[3] Ibid.

[4] Office of Management and Budget, Consolidated Reference Model, Version 2.3, October 2007, available at

[5] Office of Management and Budget, Federal Enterprise Architecture Framework, Version 2.0, January 29, 2013.


[7] International Standards Organization, ISO 19115:2003 Geographic Information – Metadata, available at

[8] Defense Geospatial Information Working Group (DGIWG), DGIWG Metadata Vision – 906, September 30, 2013, available at

[9] Geospatial Profile of the Federal Enterprise Architecture (FEA), Version 2.0, March 06, 2009, available at

[10] OMB Memorandum M-10-06, Open Government Directive, December 8, 2009, available at

[11] Office of Management and Budget, Federal Enterprise Architecture Framework, Version 2.0, January 29, 2013.

[12] OMB Memorandum M-13-13, Open Data Policy – Managing Information as an Asset, May 9, 2013, available at

[13] Office of Management and Budget, Federal Enterprise Architecture Framework, Version 2.0, January 29, 2013. Appendix C: Data Reference Model.

[14] Geospatial Profile of the Federal Enterprise Architecture (FEA), Version 2.0, March 06, 2009, available at

[15] Office of Management and Budget, Federal Enterprise Architecture Framework, Version 2.0, January 29, 2013.


[17] OMB Memorandum M-10-06, Open Government Directive, December 8, 2009, available at

[18] Office of Management and Budget, Federal Enterprise Architecture Framework, Version 2.0, January 29, 2013.

[19] Office of Management and Budget, Consolidated Reference Model, Version 2.3, October 2007.

[20] Geospatial Profile of the Federal Enterprise Architecture (FEA), Version 2.0, March 06, 2009, available at

[21] National Information Exchange Model (NIEM).

Data Access And Policy

The goal of a data architecture is to facilitate accurate (timely and precise information), trusted (data authorities and security), and common (agreed-upon information source) data across the geospatial investments. Data access is key to understanding the “who and why” of data management. Access is a leveraged capability involving policy considerations. This consists of identity- and role-based access that relies upon standards defined through the Federal Identity, Credential, and Access Management (FICAM) roadmap.[1] Additionally, Information Sharing Agreements (ISA) and Memorandums of Agreement/Understanding must be structured and adaptive among mission partners to gain access to datasets to be used in the respective agency geospatial systems.

One of the five goals of the 2012 National Strategy for Information Sharing and Safeguarding (NSISS)[2] is to improve information discovery and access through common standards. Goal 2 of the strategy states:

“Secure discovery and access relies on identity, authentication and authorization controls, data tagging, enterprise-wide data correlation, common information sharing standards, and a rigorous process to certify and validate their use.”

National Strategy for Information Sharing and Safeguarding


The NSISS Goal 2.1 goes on to state:

“Discovery and access are distinct concepts: the first addresses a user’s ability to identify the existence of information, and the second relates to a user’s ability to retrieve it. Our national security demands relevant information is made discoverable, in accordance with existing laws and policies, to appropriate personnel. Discovery and access require clear and consistent policy and standards, as well as technical guidance for implementing interoperable process and technology.”

National Strategy for Information Sharing and Safeguarding



Data handled by various governmental authorities is subject to differing concerns regarding operational security, as well as the privacy, civil rights, and civil liberties of individuals and organizations described by the data or having access privileges to the data. As such, organizations will likely caveat source data with various access restrictions, and any operations on the source data will need to appropriately propagate those access controls through data access policies and reflect them in the metadata at varying degrees of granularity. Figure 2 graphically depicts the data access and policy element “wrappers” necessary for sharing.



Figure 2. Data Access and Policy Wrapper


The granularity of the metadata applied to each data object is critical for enabling repeatable fine-grained access control. This allows for maximizing information integration while minimizing risks associated with over-sharing. Each organization and its associated data stewards are ultimately responsible for defining the security policies that govern how data is acted upon. These policies should be machine executable to dynamically provide a grant/deny decision for each data object at run-time.
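A machine-executable policy of the kind described can be sketched as a run-time grant/deny check driven by per-object metadata. The roles and security markings below are hypothetical.

```python
# Sketch of a machine-executable access policy: each data object
# carries an access-control marking in its metadata, and a policy
# function returns a grant/deny decision per object at run time.
# Roles and markings are hypothetical.

data_objects = [
    {"id": 1, "theme": "FloodZones", "marking": "public"},
    {"id": 2, "theme": "CriticalInfrastructure", "marking": "restricted"},
]

# Policy: which roles may access objects with each marking.
policy = {
    "public": {"analyst", "citizen"},
    "restricted": {"analyst"},
}

def access_decision(user_role, data_object):
    """Return 'grant' or 'deny' for one object based on its marking."""
    allowed_roles = policy.get(data_object["marking"], set())
    return "grant" if user_role in allowed_roles else "deny"

# Evaluate the policy for every object at request time.
decisions = {obj["id"]: access_decision("citizen", obj) for obj in data_objects}
print(decisions)  # {1: 'grant', 2: 'deny'}
```

Because the decision is computed per object from its marking, finer-grained metadata directly yields finer-grained sharing: the same request can be granted for public objects and denied for restricted ones.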


[2] National Strategy for Information Sharing and Safeguarding, December 2012, available at

Data Resources

Geospatial data resource catalogs are too numerous to list, as many members of the geospatial community (federal, state, local, private, international, and academia) have ongoing initiatives to provide a variety of these data resources. The following are offered as examples of (primarily) government geospatial data resources available to users.


“The Geospatial Platform (GeoPlatform) is a managed portfolio of common geospatial data, services, and applications contributed and administered by trusted sources and hosted on a shared infrastructure, for use by governmental agencies and partners to meet their mission requirements and the broader needs of the Nation.”[1] The GeoPlatform was developed by the member agencies of the Federal Geographic Data Committee (FGDC), in coordination with the Federal CIO Council, as an interagency Federal initiative and OMB shared service, and is hosted by the U.S. Department of the Interior, as the Managing Partner for the Geospatial Platform.[2]

The GeoPlatform provides open-standards-compliant catalog web services supporting the GeoPlatform and Data.gov[3] (the official U.S. government site providing increased public access to federal government datasets). The shared catalog provides access via the Catalog Service for the Web (CSW) standard for both first-order and all metadata (including members of large collections) for harvested data, services, and applications. The Catalog Service for the Web is an Open Geospatial Consortium (OGC)[4] standard that defines common interfaces to discover, browse, and query metadata for data and services. The catalog enables both data and services searching via several methods (e.g., Categories, Types, Groups, Tags, Name, Keyword, etc.) and provides the metadata on each specific dataset or tool. Some datasets are downloadable, while others are extraction tools or widgets. The user must also be aware of, and agree to, the data policy use conditions for the requested dataset.
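To illustrate what a CSW discovery request looks like, the sketch below constructs a GetRecords query using the key-value pair (KVP) binding. The endpoint URL is a hypothetical placeholder, and the parameter set follows the general shape of CSW 2.0.2 GetRecords requests; a real client should consult the target catalog's capabilities document.

```python
# Sketch of a CSW GetRecords discovery request via the KVP binding.
# The endpoint is a hypothetical placeholder; parameters follow the
# general shape of OGC CSW 2.0.2 requests.
from urllib.parse import urlencode

endpoint = "https://catalog.example.gov/csw"  # hypothetical endpoint

params = {
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetRecords",
    "typeNames": "csw:Record",
    "elementSetName": "brief",          # brief metadata for discovery
    "resultType": "results",
    "constraintLanguage": "CQL_TEXT",
    "constraint": "AnyText LIKE '%flood%'",  # full-text style filter
}

url = endpoint + "?" + urlencode(params)
print("request=GetRecords" in url)  # True
```

Sending this URL to a live CSW endpoint would return an XML response containing brief metadata records, which the client then parses to present discovery results to the user.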

Many of the Federal datasets available on the GeoPlatform are part of the National Geospatial Data Asset (NGDA) portfolio management approach prescribed by OMB Circular A-16 Supplemental Guidance.[5] To ensure quality and usability, the data must be:

  • Discoverable – published and available.
  • Reliable – coordinated by a recognized national steward.
  • Consistent – supported by defined schema, standards and understood content definitions to ensure their integrity (including conformance with FGDC Standards as applicable).
  • Current and applicable – maintained regularly and adaptable to current needs.
  • Resourced – established and recognized as an enterprise investment.

The GeoPlatform provides a Datasets Published per Month summary for the previous 12-month period; the most recent summary at the time of this writing is displayed in Table 5.

Table 5. GeoPlatform Datasets Published per Month[6]

[Table 5 lists monthly dataset publication counts for May 2013 through April 2014 by contributing agency: Department of Agriculture, Department of Commerce, Department of Homeland Security, Department of the Interior, and Department of Transportation. The per-cell counts were not preserved in this copy.]
Data as of 04/03/2014 1:08 AM.

At the time of this writing, the GeoPlatform listed 80,603 datasets found with the dataset type of “geospatial.”


The Homeland Security Geospatial Concept of Operations (GeoCONOPS)[7] is a multiyear effort focused on the geospatial communities supporting Department of Homeland Security (DHS) and Federal Emergency Management Agency (FEMA) activities under the National Response Framework (NRF), in coordination with Presidential Policy Directive 8: National Preparedness (PPD-8), which describes the Nation’s approach to preparing for the threats and hazards that pose the greatest risk to the security of the United States. The GeoCONOPS, now in its sixth year, documents current geospatial practices and serves as a guide to federal departments and agencies providing support under NRF, PPD-8, and Stafford Act activities.

The GeoCONOPS is an interagency collaboration with strategic direction provided by the federal Geospatial Interagency Oversight Team (GIOT). The participants and intended audience of the GeoCONOPS include the GIOT members; the 15 Emergency Support Functions (ESFs), both primary and support; and other federal mission partners. The GeoCONOPS is updated yearly to ensure it meets the needs of all mission partners.

The intended audiences for this document are the geospatial communities supporting homeland security and emergency management activities, from the Joint Field Offices and operations centers to NRF headquarters entities. The GeoCONOPS outlines federal geospatial capabilities in support of state, local, and tribal authorities during homeland security and emergency management operations across the entire emergency management life cycle. The GeoCONOPS website[8] is a resource that provides:

  • A geospatial mission blueprint of the resources and capabilities available for support in the Homeland Security Enterprise.
  • Identification of points of coordination and collaboration.
  • Documentation of authoritative geospatial data sources.
  • Descriptions of best practices.
  • Identification of technical capabilities.

Table 6 is an extract from the GeoCONOPS Appendix: Authoritative Data Matrix, which lists over 1,200 data themes (datasets) by subcategory (Data Reference Model). These datasets are the desired resources, although many of the assets may not be available or may lack a URL link to their source. Users can search the online catalog’s data listings through the GeoCONOPS taxonomy using attributes including Type, Keywords, Category/Subcategory, etc.
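The kind of attribute search described above, over matrix entries keyed by subcategory, supporting ESF numbers, and mission areas, can be sketched as a simple filter. The two records below are stand-ins drawn from the Table 6 extract; the field names (`theme`, `subcategory`, `esf`, `mission_areas`) are illustrative assumptions, not the actual GeoCONOPS schema.

```python
# Illustrative stand-ins for Authoritative Data Matrix entries (field names assumed).
matrix = [
    {"theme": "State Fairgrounds", "subcategory": "Support Facilities",
     "esf": {5, 8, 11}, "mission_areas": {"Protection", "Mitigation", "Response"}},
    {"theme": "Automated Check Clearing Houses", "subcategory": "Banking and Credit",
     "esf": {5, 13}, "mission_areas": {"Protection", "Response"}},
]

def search(records, subcategory=None, esf=None):
    """Filter matrix entries by subcategory and/or supporting ESF number."""
    hits = []
    for rec in records:
        if subcategory is not None and rec["subcategory"] != subcategory:
            continue
        if esf is not None and esf not in rec["esf"]:
            continue
        hits.append(rec)
    return hits

print([r["theme"] for r in search(matrix, esf=8)])  # → ['State Fairgrounds']
```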

Table 6. GeoCONOPS Authoritative Data Matrix (Extract)

| Subcategory | Data Theme | Source | ESF(s) | Mission Area(s) |
|---|---|---|---|---|
| | Egg Production Farms | | 5, 11 | Protection, Response |
| | Sheep/Goat Farms | n/a,-lamb- mutton.aspx#.UbcWYpxyaYk | 5, 11 | |
| Support Facilities | Agriculture Chemical Manufacture | | 5, 10, 11 | Protection, Response |
| | State Fairgrounds | | 5, 8, 11 | Protection, Mitigation, Response |
| | U.S. Agriculture | | 5, 11 | |
| | Veterinary Pharmaceutical Manufacture | | 5, 10, 11 | Protection, Response |
| | Veterinary Services | Dun & Bradstreet (D&B) | 1 thru 13 | Prevention, Protection, Mitigation, Response, Recovery |
| Banking and Credit | Automated Check Clearing Houses | Federal Reserve | 5, 13 | Protection, Response |
| | Banking Institutions – National Credit Union Administration (NCUA) | | 5, 13 | Protection, Response |
| | Branches/Agencies of Foreign Banks | | 5, 13 | |
| | Credit Unions HQ | | 5, 13 | |
| | Farm Credit Administration (FCA) Financial Institutions | | 5, 13 | |



The GIS Inventory[9] of the National States Geographic Information Council (NSGIC) is a tool for states and their partners. Its primary purpose is to track data availability and the status of Geographic Information System (GIS) implementation in state and local governments to aid the planning and building of Spatial Data Infrastructures. The Random Access Metadata for Online Nationwide Assessments (RAMONA) database is a critical component of the GIS Inventory. RAMONA moves its FGDC-compliant metadata for each data layer to a web folder and a Catalog Service for the Web (CSW) endpoint that can be harvested by Federal programs (e.g., Data.gov) and others. This provides far greater opportunities for discovery of user information. The GIS Inventory allows the user to search by keywords including Theme or Place Names, Layer Category, Layer Name, Production Date, and Production Status. The GIS Inventory is maintained by individual users who document their own organizational information and data holdings.
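The harvesting step described above amounts to fetching CSW GetRecords responses and pulling metadata fields out of each record. The sketch below parses such a response with the standard library; the sample XML fragment and layer titles are illustrative, not actual GIS Inventory output.

```python
# Sketch: extracting titles from a CSW GetRecords response of the kind a
# harvester consuming a RAMONA-style CSW endpoint would receive.
# The sample response below is an illustrative fragment, not real output.
import xml.etree.ElementTree as ET

SAMPLE_RESPONSE = """<?xml version="1.0" encoding="UTF-8"?>
<csw:GetRecordsResponse xmlns:csw="http://www.opengis.net/cat/csw/2.0.2"
                        xmlns:dc="http://purl.org/dc/elements/1.1/">
  <csw:SearchResults numberOfRecordsMatched="2">
    <csw:SummaryRecord>
      <dc:title>Parcel Boundaries</dc:title>
      <dc:type>dataset</dc:type>
    </csw:SummaryRecord>
    <csw:SummaryRecord>
      <dc:title>Road Centerlines</dc:title>
      <dc:type>dataset</dc:type>
    </csw:SummaryRecord>
  </csw:SearchResults>
</csw:GetRecordsResponse>"""

NS = {"csw": "http://www.opengis.net/cat/csw/2.0.2",
      "dc": "http://purl.org/dc/elements/1.1/"}

def harvest_titles(xml_text):
    """Return the dc:title of every summary record in a GetRecords response."""
    root = ET.fromstring(xml_text)
    return [t.text for t in root.findall(".//csw:SummaryRecord/dc:title", NS)]

print(harvest_titles(SAMPLE_RESPONSE))  # → ['Parcel Boundaries', 'Road Centerlines']
```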

At the time of this writing, the NSGIC GIS Inventory listed 23,012 results under its “Browse Data Layers” tab.


[1] Geospatial Platform.

[2] Federal Geographic Data Committee, Steering Committee meeting minutes, April 19, 2012.

[5] OMB Memorandum M-11-03, Issuance of OMB Circular A-16 Supplemental Guidance, November 10, 2010.

[6] Geospatial Platform.

[7] Geospatial Concept of Operations (GeoCONOPS), Version 6.0, June 2014.

[9] National States Geographic Information Council (NSGIC).

Stakeholder Performance Guidance

The Performance Guidance summarizes the key decision points needed to determine the most effective and efficient design, development, and implementation of the geospatial system investment (Table 7).

Table 7. Stakeholder Performance Guide: Data







Executive Leadership

• Authorize a Business Needs Analysis to identify geospatial data requirements using the Baseline Assessment Matrix: Data.

• Agree upon authorized data sources to reduce redundancy, and determine Enterprise License Agreement (ELA) opportunities with data providers/vendors.

• Require that any/all funded data creation or enhancement initiatives (e.g., contract awards, cost-shares, grants, etc.) include metadata standard compliance.

• Work with other Executives to acknowledge the need to reduce data costs by leveraging investment and performing the Baseline Assessment based upon mission/business needs.

• Based upon the business/mission need identified during the Data Matrix assessment, may require a Service Level Agreement and cost share for availability, and an Enterprise License Agreement (ELA) with the vendor/provider.

• Working with Executive Leadership, approach the Chief Procurement Officer to require contract language covering all financial obligations.

• Serve as signatory with defined responsibility and stated, measurable results (e.g., an IT Asset Inventory for OMB Open Data Policy reporting and a quantifiable data resource inventory).

• The inventory facilitates the identification of desired datasets; identifies redundant data assets for decommissioning; identifies opportunities to reuse or extend a data asset rather than create a new one; and reduces redundancy costs through enterprise license agreements and cost sharing for economies of scale.

• Provides a way to uniformly describe data, thereby supporting its discovery and sharing, resulting in cost avoidance and compliance with the government Open Data Policy.

Program Manager

• Coordinate across the organization’s geospatial investment PMs to complete the Data Matrix and document the business/mission functional requirements that drive data needs.

• Determine which datasets will be used enterprise-wide based upon data content, currency, and availability.

• Work with PMs across the enterprise to verify that internally produced data include metadata with a common taxonomy and are cataloged for discovery.

• Post datasets in open standards to appropriate catalogs for discovery.

• PMs prepare the Data Matrix and schedule surveys and follow-on interviews with business owners to clarify Data Matrix findings and understand functional needs.

• Perform a detailed assessment of datasets and how they meet the mission/business functional requirements; this may require an ELA with broader use terms and additional attributes requiring cost share.

• Review procurement vehicles to ensure metadata standard compliance language. Develop a common taxonomy for cataloging the metadata enhanced data resources.

• Ensure enterprise data are exposed or “harvestable” to appropriate web catalog services.

• Awareness and understanding of enterprise data requirements and business/mission owner functional needs that drive data.

• Reduced contracting for vendor-provided data, ELA discounts for volume-based pricing, and single data steward responsibility as opposed to multiple posting/storage of datasets.

• Ability to identify, search, discover and share datasets across the enterprise.

• Facilitates the search for, identification of, and sharing of geospatial data, and supports compliance with the government Open Data Policy.

Solution Architect

• Data Assessment Matrix design and development.

• Ensure data are cataloged and available in open standards and posted to web catalog service.

• Assist with the Data Matrix interviews of mission/business owners to determine the functional requirements that drive data and application needs.

• Develop technical approach for ensuring enterprise data resources are available, vetted, and provided in compliance with open data requirements.

• Perform technical vetting and validation across investments for the desired To-Be end-state environment; understand functional requirements to optimize application development and data resource acquisition.

• Facilitates the search for, identification of, and sharing of geospatial data, and supports compliance with the government Open Data Policy.

Updated on February 4, 2020