Infrastructure Reference Model


Definition/Description (What) – the Infrastructure Reference Model (IRM) establishes a roadmap to achieve an organization’s mission/function through optimizing its information technology environment. The IRM will provide an architectural “blueprint” for effectively defining an organization’s current (Baseline/”As-Is”) and/or desired (Target/”To-Be”) geospatial system environments.

Purpose/Function (Why) – the IRM is used to inform, guide, and constrain the geospatial investment decisions for the enterprise/organization. Geospatial reference architecture should serve as a primary authoritative resource for organizational planning, a baseline from which to insert new technologies and capabilities into the infrastructure of the enterprise, and a documentation source for investment justification. The IRM provides a roadmap from the “As-Is” environment to the “To-Be” target environment using documentation tools and artifacts. It contributes to the investment governance structure (see Investment Governance) to baseline, align, transition, and mature geospatial investments across the enterprise. Specifically, the IRM:

  • Establishes a process for baselining and categorizing geospatial technology and functionality.
  • Defines a 3-Tier geospatial architecture as the To-Be target.
  • Provides reference artifacts for the 3-Tier functionality.

Stakeholder Performance Guide (Who & How) – Solution Architects serve as the primary developers and users of the Infrastructure Reference Model documentation, planning tools, and system artifacts necessary for the design and subsequent leverage/buy/build investment strategy. Program Managers must facilitate the cross-organizational collaboration necessary to document the technology investments and negotiate implementation options that support both mission and enterprise needs. Executive Leadership must assess the justification for approving/denying a proposed geospatial investment.


Design for the Mission … Develop for the Enterprise.

Alignment To The Enterprise Architecture Investment Planning Process

The Infrastructure Reference Model focuses on a practical approach to documenting the technical and functional capabilities and requirements of the geospatial enterprise. The IRM supports architectural analysis and reporting within an organization’s overall Enterprise Architecture. The IRM also unifies existing agency infrastructure portfolios and guidance by providing a foundation to advance the reuse and standardization of technology and service components. Aligning agency capital investments to the IRM leverages a common, standardized vocabulary, allowing intra/interagency discovery, collaboration, and interoperability.

Organizations and enterprise architectures will benefit from economies of scale by identifying and reusing the best solutions and technologies for applications that are developed, provided, or subscribed to in support of their business functions, mission, and target architecture.

Agencies must document and submit their Enterprise Architecture (EA) documentation to OMB. The EA provides the explicit description and documentation of the current and desired relationships among business and management processes and information technology. It describes the “current architecture” and “target architecture,” and it provides a strategy that enables the agency to support its current state and acts as the roadmap for transition to its target environment. These transition processes will include an agency’s capital planning and investment control (CPIC) processes. The EA should inform the CPIC process by defining the technologies and information critical to operating an agency’s business, and by creating a roadmap that enables the agency to transition from its current to its targeted state.

The EA helps the agency respond to changing business needs, and ensures that potential solutions support the agency’s targeted state. A proposed IT solution that does not comply with the EA should not be considered as a possible investment, and should not enter the CPIC process. The CPIC process helps Executive Leadership select, control, and evaluate investments that conform to the EA. For example, during the select stage of capital planning an agency identifies and investigates different potential solutions for an investment. An agency then selects the option with the best business case. If any of these alternatives does not conform to the EA, the agency should drop it from consideration.

Geospatial Baseline Assessment Matrix

Each agency must support the Enterprise Architecture with a complete inventory of agency information resources, including personnel, equipment, and funds devoted to information resources management and information technology, at an appropriate level of detail.[1] Agencies must implement the EA consistent with the following principles, among others:

  1. Develop information systems that facilitate interoperability, application portability, and scalability of electronic applications across networks of heterogeneous hardware, software, and telecommunications platforms.
  2. Meet information technology needs through cost effective intra-agency and interagency sharing, before acquiring new information technology resources.

To accomplish this requirement, the organization’s Executive Steering Committee should authorize the development of the Baseline Assessment Matrix, which begins to document the existing and planned geospatial infrastructure and technology investments across the organization’s enterprise (Table 1). The Baseline Assessment provides a framework to begin profiling the geospatial system infrastructure environment. The Solution Architects from across the organization’s geospatial investments should develop the Infrastructure Assessment Matrix to include all infrastructure core capabilities involved in or impacted by their geospatial system capability.


Table 1. Geospatial Baseline Assessment: Infrastructure



Additionally, the Solution Architects will need to prepare the Technology core capability matrix for the Baseline Assessment. Table 2 is an extract of the three-page Technology Assessment Matrix; combined with the Infrastructure Matrix, it provides the foundation for comparing existing and planned investments to determine the optimal technology infrastructure and how it should align to the Enterprise Architecture. It also provides the content necessary to respond to the OMB Common Approach to Federal Enterprise Architecture and to submit an Enterprise Roadmap,[2] which includes a summary of current architecture, including infrastructure.


Table 2. Geospatial Baseline Assessment: Technology (Extract)
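Because the matrix tables themselves are rendered as figures, it can help to see the kind of record each row captures. The sketch below is purely illustrative (every field name, investment, and capability value is hypothetical, not drawn from Tables 1 or 2); it shows how matrix rows could be compared programmatically to surface shared capabilities across investments:

```python
from dataclasses import dataclass, field

@dataclass
class BaselineAssessmentEntry:
    """One row of a hypothetical Geospatial Baseline Assessment Matrix.

    Field names are illustrative only; the Infrastructure and Technology
    matrices referenced in Tables 1 and 2 define the authoritative columns.
    """
    investment: str                 # name of the geospatial investment
    owning_program: str             # organization/program responsible
    lifecycle_stage: str            # e.g., "planned", "operational", "end-of-life"
    infrastructure: dict = field(default_factory=dict)  # core infrastructure capabilities
    technology: dict = field(default_factory=dict)      # core technology capabilities

# Example: two investments captured for side-by-side comparison.
entries = [
    BaselineAssessmentEntry(
        "Flood Mapping System", "Office A", "operational",
        infrastructure={"hosting": "on-premises data center"},
        technology={"database": "PostGIS", "web_services": "OGC WMS"},
    ),
    BaselineAssessmentEntry(
        "Situational Awareness Viewer", "Office B", "planned",
        infrastructure={"hosting": "cloud service offering"},
        technology={"database": "PostGIS", "web_services": "OGC WMS"},
    ),
]

# Overlapping technology answers are candidates for shared services.
shared = set(entries[0].technology.items()) & set(entries[1].technology.items())
```

In this toy comparison both investments report the same database and web-service technology, exactly the kind of duplication the Baseline Assessment is meant to surface for leverage or consolidation decisions.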




[1] Ibid.

[2] Office of Management and Budget, The Common Approach to Federal Enterprise Architecture, April 30, 2012.

Digital Government Strategy Alignment

The Digital Government Strategy[1] is intended to be disruptive, realizing the need to do more with less.

“[The Digital Strategy] gives the federal workforce the tools needed to carry out their mission of delivering services to all citizens—whether to a warfighter in the field retrieving geospatial imagery information … or a rural farmer accessing real-time forecast of seasonal precipitation. It provides a platform to fundamentally shift how government connects with, and provides services to, the American people.”[2]

To drive this transformation, the strategy is built upon four overarching principles:

  • An “Information-Centric” approach – focuses on ensuring data and content are accurate and available. Transforming unstructured content into structured data, ensuring all structured data are associated with valid metadata, and providing this information through web Application Programming Interfaces (APIs) helps to architect for interoperability using open standards. This approach also supports device-agnostic security and privacy controls, as attributes can be applied directly to the data and monitored through metadata, enabling agencies to focus on securing the data and not the device.
  • A “Shared Platform” approach – requires the reuse of resources to “innovate with less,” accelerating the adoption of new technologies, lowering costs, and reducing duplication. A shared platform approach to developing and delivering digital services and managing data must leverage existing services, build for multiple use cases at once, apply common standards and architectures, and produce shared government-wide solutions to ensure consistency in how information is created and delivered.
  • A “Customer-Centric” approach – means quality information is accessible, current and accurate at any time. It requires an understanding of stakeholder business/mission requirements (Operational Requirements Documentation) and makes content more broadly available and accessible in a device-agnostic way.
  • A platform of “Security and Privacy” – requires the transformation to happen in a way that ensures the safe and secure delivery and use of digital services to protect information and privacy. Architecting for openness and adopting new technologies have the potential to make devices and data vulnerable to malicious or accidental breaches of security and privacy. Architectures must adopt solutions in areas such as continuous monitoring; identity, authentication, and credential management; and cryptography (Security Reference Model) that support the shift from securing devices to securing the data itself and ensure that data is only shared with authorized users.

The development of any Federal geospatial infrastructure must align with the Digital Government Strategy principles.


[1] Digital Government: Building a 21st Century Platform to Better Serve the American People, May 23, 2012.


Geospatial 3-Tier Target Architecture

The Digital Government Strategy[1] establishes a conceptual model that acknowledges three “layers” of digital services (Figure 1). Three-tier architecture is a client-server architecture in which the functional process logic, data access, computer data storage, and user interface are developed and maintained as independent modules on separate platforms.[2] Three-tier architecture allows any one of the three tiers to be upgraded or replaced independently. The user interface, generally implemented in the client-side environment at the desktop or in a web browser, uses a standard graphical user interface, with different modules running on the platform layer that hosts the application server(s). The data layer includes a relational database management system that contains the computer data storage logic (e.g., schemas, metadata, topology, and ontologies). This 3-Tier architecture model includes the Presentation Layer, the Platform or Application Layer, and the Information or Database Layer.




Figure 1. Three-Layers/Tiers of Digital Services

The three layers of the 3-Tier model separate information creation from information presentation, allowing organizations to create content and data once, and then use it in different ways through hosted applications or publication services operating on the shared platform.



The Presentation Layer defines the manner in which information is organized and provided to customers. It displays information related to services available on a platform. This tier communicates with other tiers by sending results between the user applications and interfaces and the other tiers in the network, and it represents the way information delivery occurs (e.g., data or content), whether through desktop clients, websites, mobile applications, or other modes of delivery. The presentation layer must support open interface standards and allow application programmers, software developers, web service publishers, and device manufacturers extensibility in the presentation of information resources. The Infrastructure Assessment Matrix illustrates some of the common technical elements to consider in the implementation process, such as web browser support and application programmer interfaces and methods. The Infrastructure Assessment Matrix also includes the core infrastructure capabilities involved in or impacted by the target geospatial system capability.


The Platform or Application Layer, also known as the logic or business logic tier, controls application functionality and includes all the systems and processes used to manage the information. Examples include systems for content management, processes such as web API and application development, services that support mission-critical IT functions such as mapping and situational awareness, as well as the hardware used to access information (e.g., mobile devices). Solution Architects should use the Infrastructure Assessment Matrix as a blueprint to address common technical elements to consider when assessing infrastructure serviced through an on-premises data center or off-premises cloud service offering. These considerations include operating systems, user and system access controls, support for commercial or government off-the-shelf software, distributed processing capability, and search and indexing software.


The Information or Database Layer houses the database servers where information is stored and retrieved. Data in this tier is kept independent of application servers or business logic and contains the digital information. It includes structured information (e.g., the most common concept of “data”), such as geospatial data layers and metadata, plus unstructured information (e.g., content), such as fact sheets, guidance documentation, geospatial search indexes, or geocoding/geo-tagging dictionaries. The Infrastructure Assessment Matrix provides Solution Architects a frame of reference for technical considerations such as database software, support for federated search and queries using structured query language (SQL), the ability to provide geospatial search and indexing capabilities, and access control at the data and role levels.
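The separation of concerns walked through in the three layer descriptions above can be sketched in a few lines. This is a minimal illustration of the pattern, not any platform’s actual implementation; every class, method, and feature name is hypothetical:

```python
# Minimal 3-tier separation sketch (all names hypothetical): each tier talks
# only to the tier below it, mirroring Presentation / Platform / Data layers.

class DataLayer:
    """Information/Database tier: stores and retrieves records."""
    def __init__(self):
        self._features = {}  # stand-in for a geospatial database

    def put(self, feature_id, attributes):
        self._features[feature_id] = attributes

    def get(self, feature_id):
        return self._features.get(feature_id)

class PlatformLayer:
    """Platform/Application tier: business logic over the data tier."""
    def __init__(self, data: DataLayer):
        self._data = data

    def feature_summary(self, feature_id):
        attrs = self._data.get(feature_id)
        if attrs is None:
            return None
        return {"id": feature_id, **attrs}

class PresentationLayer:
    """Presentation tier: formats results for a client (web, mobile, desktop)."""
    def __init__(self, app: PlatformLayer):
        self._app = app

    def render(self, feature_id):
        summary = self._app.feature_summary(feature_id)
        if summary is None:
            return "not found"
        return f"Feature {summary['id']}: {summary['name']}"

data = DataLayer()
data.put("hydrant-42", {"name": "Fire Hydrant", "lat": 38.9, "lon": -77.0})
ui = PresentationLayer(PlatformLayer(data))
```

Because each tier reaches the next only through a narrow interface, the `DataLayer` could be swapped for a real database client without touching the presentation code, which is the upgrade-one-tier-independently property the text describes.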


[1] Digital Government: Building a 21st Century Platform to Better Serve the American People, May 23, 2012.

[2] 3-Tier Architecture definition.

Geospatial Target Architecture Artifacts

The goal of the Geospatial Interoperability Reference Architecture is to make geospatial information and technology more broadly accessible, geospatial investments more effective, and geospatial practitioners and business systems more productive. The GIRA provides a blueprint for architectural analysis and reporting within an Agency’s Enterprise Architecture. The reference implementations of the GIRA are intended to provide Solution Architects with go-to Target Architectures for the sensitive-but-unclassified and public domains that Government Program Managers can reuse and/or emulate. These reference implementations provide best practices for geospatial interoperability and information sharing to drive:

  • Discoverability – Discoverable by appropriate users, systems, and communities of interest.
  • Accessibility – Available in a usable form that is easily understood.
  • Understandability – Able to be used intelligently using commonly defined terms and intuitive interfaces and tools.
  • Interoperability – Readily consumed and combined with other geospatial capabilities (software, data, services, or systems) using open-standards or best practices for geospatial information and services exchange.
  • Reliability – Capabilities are consistently delivered over time.
  • Trust – Accuracy, currency, completeness, and source of capabilities (software, data, services, and systems) are available to users.

The Federal Geospatial Platform[1] is a FY 2011 budget initiative and Presidential call for action. Through the Federal Geographic Data Committee (FGDC), federal departments and agencies are developing the Geospatial Platform to more effectively share place-based products and services with the public. The Geospatial Platform will be a managed portfolio of common geospatial data, services, and applications contributed and administered by authoritative sources and hosted on a shared infrastructure, for use by government agencies and partners to meet their mission needs, and then made openly available.

The content of all datasets and services are required to be verified by the agencies to be consistent with federal privacy, national security, and information quality policies. Additionally, the Geospatial Platform provides access to data from various partners across state, tribal, regional, and local governments as well as non-governmental organizations. The overall goal is to reduce duplication of efforts and promote the use of open standards among agencies’ geospatial programs. The move to a standard Geospatial Platform offers many advantages to its users:

  • A “one-stop shop” to deliver trusted, nationally consistent geospatial products, with a preference towards interoperable web services.
  • Tools for the centralized discovery, access, and use of data and services managed and maintained in multiple agencies.
  • Tools that enable cross-government data to be displayed in a visual context.
  • Tools enabling on-line collaboration communities focused on mission and/or priority issues, where federal and non-federal agencies and partners can share and create geospatial data and map products to provide common understanding of information for decision making.
  • Problem-solving applications that are built once and reused many times.
  • A shared cloud computing infrastructure.

Figure 2 provides a high-level conceptual depiction of the Geospatial Platform. Some of the features include the migrated Geospatial Open Source catalog, which includes a search interface and community features. In addition to catalog search, users will be able to create and share maps. Agencies are also encouraged to provide content supporting their business cases.




Figure 2. Geospatial Platform Conceptual Model

The Geospatial Platform employs a multi-tiered, services-based architecture that supports open standards. Figure 3 provides a more detailed view of the technical architecture. The Geospatial Platform provides users with a standard web interface and developers with application programmer interfaces. Web services are provisioned on the platform layer. Data services are managed in a shared data layer.




Figure 3. Geospatial Platform Technical Architecture

The Geospatial Platform is expected to expand access to high-quality data, enabling the increased sharing and reuse of resources and resulting in reduced costs. The integrated approach will mean that the federal portfolio of geospatial data will be better managed, serve a broader audience, and be easier to use.


The Department of Homeland Security’s (DHS) Geospatial Information Infrastructure (GII)[2] is a target architecture for the enterprise platform to support multiple missions across the Homeland Security community. It provides access to a wide set of shared capabilities that support geospatial visualization, analysis, processing, modeling and simulation, and content delivery of geospatial information. The GII provides secure hosting services for geospatial web and mobile applications, and interoperable access to more than 600 layers of geospatial foundation and infrastructure information that includes high-resolution U.S. population information, pre- and post-incident imagery, public alerts and warnings, and derivative map products. It also includes a general-purpose web map viewer called OneView, interoperable web map services for desktop GIS users and system integrators based on OGC standards, support for multiple viewing solutions, and application programmer interfaces (APIs) that allow application developers to extend GII functionality (web services and data feeds) into customer-centric applications. Developers can use the GII APIs with the underlying binaries and programming references to build internet-based mobile or web mapping applications using GII services and components.

Figure 4 depicts the multi-tiered, services-based architecture for the GII. The GII technical architecture supports open standards for search, web services, APIs, and data publication. GII application hosting and web services are provisioned on the platform layer. Data services are managed in a shared data layer.




Figure 4. DHS Geospatial Information Infrastructure (GII) Technical Architecture


Figure 5 illustrates how GII services can be leveraged to support business application and systems as referenced by the DHS Common Operating Picture (COP).



Figure 5. DHS Common Operating Picture Aligned to DHS Geospatial Information Infrastructure






Geospatial Target Architecture Considerations

As part of the baseline assessment, Solution Architects should consider IT security, including cybersecurity; identity, credentialing, and access management; certification and accreditation (C&A); authority to operate (ATO) and trust boundaries; and network domain (e.g., public, sensitive-but-unclassified, classified). Solution Architects should also evaluate the resiliency requirements for the capability, including disaster recovery, failover, and surge capacity. Compliance requirements for Section 508 Accessibility and Privacy Impact Assessments (PIAs) are crucial, as is support for multiple open standards for geospatial information exchange and geospatial search. The Baseline Assessment Matrix provides a blueprint for assessing these technical considerations during the capital planning and investment process.

It is important that Agencies consider the maturity of the investment and not just the technical capability as part of the baseline assessment. Factors to consider are:

  • Where is the investment in its lifecycle? Is the technology near end of life or still emergent?
  • What is the schedule for technology refresh?
  • What is the concept of operations for operating and maintaining the investment?
  • How are technical support services provisioned? Is training or help desk support available?
  • Does the system have a completed certification and accreditation package or authority to operate?
  • Is the investment compliant with Agency enterprise architecture policy?
  • Is the investment compliant with federal privacy and accessibility requirements?

The Federal Cloud Computing Strategy states that, “When evaluating options for new IT deployments, OMB will require that agencies default to cloud-based solutions whenever a secure, reliable, cost-effective cloud option exists.”[1] The OMB also requires a Cloud Computing Alternatives Evaluation[2] in an agency’s capital planning submission specifying whether a cloud alternative was evaluated for the investment or for components/systems within the investment, per the Cloud First policy. All investments should answer this question regardless of the overall lifecycle stage of the investment, as operational investments may consider performing such an evaluation during or as a result of an operational analysis. The evaluation should indicate one of the following answers:

  1. The agency evaluated a cloud alternative and chose a cloud alternative for some or all of the investment.
  2. The agency evaluated a cloud alternative but did not choose a cloud alternative for any of the investment.
  3. The agency did not evaluate a cloud alternative but plans to evaluate a cloud alternative by the end of the Base Year.
  4. The agency did not evaluate a cloud alternative and does not plan to evaluate a cloud alternative by the end of the Base Year.
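For agencies tracking these answers across a portfolio of investments, the four reportable outcomes map naturally onto an enumeration. The sketch below is hypothetical tooling, not OMB terminology; the member names and helper function are illustrative shorthand for the four answers above:

```python
from enum import Enum

class CloudAlternativeEvaluation(Enum):
    """Shorthand for the four reportable cloud-alternative evaluation answers."""
    EVALUATED_AND_CHOSEN = 1   # evaluated; chose cloud for some or all of the investment
    EVALUATED_NOT_CHOSEN = 2   # evaluated; did not choose cloud for any of it
    PLANNED_BY_BASE_YEAR = 3   # not yet evaluated; plans to by end of the Base Year
    NOT_EVALUATED_NO_PLAN = 4  # not evaluated; no plan to evaluate

def requires_followup(answer: CloudAlternativeEvaluation) -> bool:
    """Hypothetical helper: flag investments that still owe an evaluation."""
    return answer in (
        CloudAlternativeEvaluation.PLANNED_BY_BASE_YEAR,
        CloudAlternativeEvaluation.NOT_EVALUATED_NO_PLAN,
    )
```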

As part of that evaluation, the Solution Architect and Program Manager should develop an operational requirements document (ORD) that is cross referenced against the Baseline Assessment Matrix. These artifacts should serve as the basis of comparison for completing the cloud computing alternative analysis. The evaluation process should be conducted in two phases.

The first phase should identify commercial and federal government candidates from a functional and technical perspective and flag them for further evaluation. An application is considered viable if it passes all the steps of the evaluation process, as discussed in the following paragraphs. If Phase 1 identifies no viable solutions, the alternatives analysis concludes at the end of Phase 1, and the recommended alternative is a custom build-out or maintaining the status quo. The second phase should compare the costs, benefits, and risks associated with each of the potential solutions identified in Phase 1 against the costs, benefits, and risks of a custom build-out or maintaining the status quo.


The Phase 1 evaluation process contains several steps that serve as filters to either eliminate solutions or pass them on for more detailed evaluation. Each step answers a particular set of questions:

  • Step 1 – Does the solution provide geospatial cloud services (i.e., operational requirements)? Does it appear to be a good choice for the Agency’s operating environment? Only the solutions with “yes” responses are passed to Step 2.
  • Step 2 – Does the solution feature appropriate technical capabilities? How does it compare with other solutions? Only applications with the highest technical capability scores are passed on to Step 3.
  • Step 3 – Are technical capabilities present in a robust, flexible, and easy-to-use fashion? Will the solution be difficult to integrate with the Agency’s operating environment and existing geospatial software packages and systems? Only applications that appear to have a high probability of success will be passed on to Step 4.
  • Step 4 – Is the solution proven technology currently used within the Agency or other federal agencies? Consult technical experts about the suitability of the solution to meet Agency operational requirements, how it compares with its competitors, its current user base, the financial stability of the vendor, and the future viability of the solution.
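The four Phase 1 steps act as successive filters: each step either eliminates a candidate or passes it forward. That pipeline can be sketched as below; all candidate records, field names, and scores are hypothetical placeholders, and the expert consultation of Step 4 is reduced to a simple boolean for illustration:

```python
# Sketch of the Phase 1 filtering steps; only candidates passing every
# step survive to the Phase 2 cost/benefit/risk comparison.

def phase1_filter(candidates):
    # Step 1: meets operational requirements and fits the operating environment.
    survivors = [c for c in candidates
                 if c["meets_ops_requirements"] and c["fits_environment"]]
    # Step 2: keep only the highest technical-capability scores.
    if survivors:
        best = max(c["capability_score"] for c in survivors)
        survivors = [c for c in survivors if c["capability_score"] == best]
    # Step 3: high probability of successful integration.
    survivors = [c for c in survivors if c["integration_risk"] == "low"]
    # Step 4: proven technology in government use (stands in for expert review).
    survivors = [c for c in survivors if c["proven_in_government"]]
    return survivors

candidates = [
    {"name": "Vendor A", "meets_ops_requirements": True, "fits_environment": True,
     "capability_score": 9, "integration_risk": "low", "proven_in_government": True},
    {"name": "Vendor B", "meets_ops_requirements": True, "fits_environment": True,
     "capability_score": 7, "integration_risk": "low", "proven_in_government": True},
    {"name": "Vendor C", "meets_ops_requirements": False, "fits_environment": True,
     "capability_score": 10, "integration_risk": "low", "proven_in_government": True},
]

viable = phase1_filter(candidates)
# An empty `viable` list signals the fallback: custom build-out or status quo.
```

In the toy data, Vendor C fails Step 1 despite its top score, and Vendor B is eliminated at Step 2, leaving a single viable candidate for Phase 2.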

The Phase 2 evaluation process compares the most viable candidate identified in Phase 1 to a custom build-out or status-quo approach in terms of costs, benefits, and risks.

  • Step 1 – For custom build-out or maintaining the status quo, use Agency infrastructure pricing, supplementary GSA software licensing, and existing FTE/contractor rates. For viable alternatives, compute the cost by adjusting the Agency cost factors for items that will be eliminated or changed to accommodate a commercial solution, then add the costs to acquire the commercial solution. Compare the two costs.
  • Step 2 – Identify a set of possible benefits. Ascertain the probability that these benefits will occur with the commercial solution and with a custom-built solution. Compare the two results.
  • Step 3 – Identify the risks for both solutions, along with the probabilities that the risks will occur and the impacts of those occurrences. Compare the two results.
  • Step 4 – Compare costs, benefits, and risks, and recommend a solution.
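The four Phase 2 steps amount to adjusting cost factors, probability-weighting benefits and risks, and comparing the results. A sketch with entirely hypothetical figures (probabilities are expressed in whole percent so the arithmetic stays exact):

```python
# Phase 2 sketch: compare a commercial candidate against custom build-out /
# status quo on cost, expected benefit, and expected risk impact.
# All dollar figures and probabilities below are hypothetical placeholders.

def adjusted_cost(infrastructure, licensing, labor, eliminated, acquisition):
    """Step 1: adjust agency cost factors, then add acquisition costs."""
    return infrastructure + licensing + labor - eliminated + acquisition

def expected_value(items):
    """Steps 2-3: probability-weight each (percent, dollar-impact) pair."""
    return sum(pct * impact for pct, impact in items) // 100

# Step 1: costs (status quo eliminates nothing and acquires nothing).
status_quo_cost = adjusted_cost(500_000, 80_000, 300_000, eliminated=0, acquisition=0)
commercial_cost = adjusted_cost(500_000, 80_000, 300_000,
                                eliminated=250_000, acquisition=150_000)

# Step 2: probability-weighted benefits.
commercial_benefit = expected_value([(80, 100_000), (50, 40_000)])
status_quo_benefit = expected_value([(90, 30_000)])

# Step 3: probability-weighted risk impacts.
commercial_risk = expected_value([(20, 200_000)])
status_quo_risk = expected_value([(40, 150_000)])

# Step 4: net comparison informs (but does not dictate) the recommendation.
commercial_net = commercial_cost - commercial_benefit + commercial_risk
status_quo_net = status_quo_cost - status_quo_benefit + status_quo_risk
```

With these placeholder numbers the commercial option nets out lower, but the point of the sketch is the structure of the comparison, not the outcome: each alternative is reduced to the same three probability-weighted terms before a recommendation is made.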


[1] Office of Management and Budget, Federal Cloud Computing Strategy, February 8, 2011.


Stakeholder Performance Guide

The Performance Guidance provides a summation of the key decision points necessary to determine the most effective and efficient design, development and implementation of the geospatial system investment (Table 3).


Table 3. Stakeholder Performance Guide: Infrastructure







Executive Leadership

• Executive Steering Committee authorization and commitment to perform Baseline Assessment Matrix: Infrastructure and Technology.

• Approve/disapprove a proposed IT solution depending upon its compliance with Enterprise Architecture for inclusion within CPIC process.

• Ensure Cloud Option assessment is performed as part of a proposed IT solution.

• Task Program Managers responsible for geospatial system oversight to develop and execute the Baseline Assessment.

• Ensure that the infrastructure/technology Baseline Matrix capabilities are aligned to the EA and that proposed new infrastructure/technology aligns to, and is not duplicative of, existing capabilities.

• Task Program Manager to apply the cloud process review as an option for the IT solution.

• Provides input for CPIC (53/300) and reporting to OMB, and establishes the enterprise baseline of the As-Is geospatial investments across the organization.

• Promotes interoperability, reduces redundant investments, and allows for cost share.

• Complies with Cloud First policy and provides economies for implementation.

Program Manager

• Coordinate across organization’s geospatial investments to ensure committed participation in Baseline Assessment.

• Identify opportunities for shared infrastructure and/or technology based upon Baseline Assessment Matrix comparison.

• Review the report of findings for cloud options for the IT solution and make recommendations to Executive Leadership.

• PMs identify and prioritize capability gaps and planned investments aligned to Operational Requirements Document (ORD) and prepare recommendations and/or options for Execs approval.

• Based upon gap analysis, identify candidate investments to leverage, eliminate, or newly develop based upon ORD priorities.

• Coordinate Cloud assessment process evaluation for IT solution architecture.

• Cross organization agreement for prioritized geospatial system development priorities and leveraged resource commitment.

• Reduce duplicative IT footprint and identify opportunity to leverage or reprioritize investments.

• Complies with Cloud First policy and provides economies for implementation.

Solution Architect

• Develop the Infrastructure Assessment Matrix from across the entire organization’s geospatial investments.

• Vet “new” technology insertions to EA and Technical Reference Model to ensure alignment with organization’s To-Be environment.

• Prepare a report of findings for cloud options for the IT solution.

• Work with other organization SAs to ensure a complete baseline assessment and perform capability gap analysis for As-Is and To-Be environments.

• Determine “optimal” solution if duplicative investments and ensure alignment to EA for “new” technology.

• Perform Cloud assessment process evaluation for IT solution architecture.

• Ensure broadest possible technical review, adoption and acceptance.

• Technical vetting and validation across investments for desired To-Be end-state environment and alignment to EA target ensures compatibility and reduces IT footprint cost.

• Provides awareness of architecture investment and solution options.

Updated on February 4, 2020