Accelerating data and analytics transformations in the public sector


An increasing number of organizations are embarking on programs to embed data and analytics at the heart of their operations, aware of the potential to transform performance. The McKinsey Global Institute estimates data and analytics could create value worth between $9.5 trillion and $15.4 trillion a year if embedded at scale—and $1.2 trillion of that in the public and social sectors. But there is a long way to go. A recent McKinsey survey shows that half of the respondents are still not using artificial intelligence (AI) anywhere within their organizations.

Adopting data and analytics is never easy. The transition to new technologies, new ways of working, and a new, data-driven culture is among the challenges. So too is a shortage of data and analytics talent. Yet all these challenges are arguably of greater magnitude in the public sector and more difficult to address (see sidebar, “Data and analytics transformation in the public sector: A bigger and trickier challenge”).

For instance, few private-sector companies are the size of government institutions: the US Department of Defense is the largest employer in the world.1 Government bodies are notoriously slow to approve new projects or reallocate resources. Receiving authority to operate on US government information systems can take a year or more due to security restrictions and protocols. And given budget constraints and hiring timelines, governments often lose the war with the private sector for top analytics talent.

Even so, a few public-sector organizations are making good progress in harnessing the power of digital and analytics. Learning from them, as well as from successful commercial organizations, we propose a five-part framework to help more government organizations make faster progress.

Set a bold aspiration—but one you can measure

Many public-sector organizations have data and analytics strategies. But many of those strategies share a common weakness: they are too broad. A strategic aspiration to, say, improve operations or use taxpayers’ money more efficiently will prove hard to realize without clear, quantitative targets, no matter how much money is invested. Instead, numerous small pilots will likely be launched that fail to scale, as the goalposts have been set too wide. One public-sector agency hired a chief data officer who soon spent more than $20 million on new data infrastructure and launched four interesting, but unrelated, analytics projects. None made it to production. Within a year, the whole program was under scrutiny with serious questions raised about the return on investment and how scale would ever be achieved.

To be clear, the starting point for transformational change has to be a bold aspiration that will help further the organization’s mission, and it must have unequivocal leadership backing. But it must also be measurable. Hence, the strategy should address two separate questions: What do we want to achieve, and what does success look like? For one organization, the answer might be a reduction in costs while maintaining or improving service, with success measured as a 20 percent cost reduction without denting satisfaction scores. For another, it might be faster operations while maintaining quality, measured as an 80 percent backlog reduction and a 20 percent increase in quality metrics.

Efforts stay focused when the strategy is formulated in this way, and people across the organization understand what will be gained—the first step toward winning workforce support.

Anchor use cases to the aspiration, not technology

With the aspiration clear, the next task is to select data and analytics use cases for deployment.

Too often, the lure of exciting new technologies drives use-case selection—an approach that risks pitting scarce resources against low-priority problems, or seeing projects lose momentum and funding when the initial buzz wears off, the people who backed the choice move on, and newer technologies emerge. Organizations can find themselves in a hype cycle, always chasing something new but never achieving impact.

To avoid this trap, use cases should be anchored to the organization’s (now clear) strategic aspiration, prioritized, then sequenced in a road map that allows for deployment while building capabilities. There are four steps to this approach.

First, identify the relevant activities and processes for delivering the organization’s mission—be that testing, contracting, and vendor management for procurement, or submission management, data analysis, and facilities inspection for a regulator—then identify the relevant data domains that support them.2

Second, draw up a list of potential data and analytics use cases for the activities and processes. Use cases should be framed as questions to be addressed, not tools to be built. Hence, a government agency aspiring to improve the uptime of a key piece of machinery by 20 percent while reducing costs by 5 percent might first ask, “How can we mitigate the risk of parts failure?” and not set out to build an AI model for predictive maintenance. The question-based framing ensures that use cases drive toward the aspiration and are not determined by the latest piece of technology. Importantly, several questions with corresponding use cases may nest behind the first “macro” one. For instance, to understand how to mitigate the risk of machine-parts failure, the organization might first have to ask, “How can we detect parts failure in operational data?”; then, “How can we predict parts failure?”; and finally, “What is the best action to take when we predict parts failure?”

Each of these questions has a separate data or analytics use case that builds on the previous one. Understanding the sequence helps organizations deploy resources and personnel effectively (exhibit).
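
To make the sequence concrete, here is a minimal sketch in Python that walks through the three nested questions—detect, predict, act—on simulated sensor data. Every detail (the readings, the failure threshold, the maintenance rules) is a hypothetical illustration, not an actual agency pipeline.

```python
# Minimal sketch of the three nested use cases on simulated data:
# (1) detect parts failure in operational data, (2) predict it,
# (3) decide what action to take. All values are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical daily vibration readings for one part; a slow upward
# drift over the last 30 days simulates wear.
readings = rng.normal(loc=1.0, scale=0.1, size=120)
readings[90:] += np.linspace(0.0, 0.5, 30)

# Use case 1 - detect: flag readings far outside the historical baseline.
baseline_mean = readings[:60].mean()
baseline_std = readings[:60].std()
z_scores = (readings - baseline_mean) / baseline_std
anomalies = z_scores > 3.0

# Use case 2 - predict: fit a trend to the recent readings and estimate
# when it crosses a (hypothetical) failure threshold.
FAILURE_THRESHOLD = 1.6
days = np.arange(30)
slope, intercept = np.polyfit(days, readings[-30:], deg=1)
current_level = slope * days[-1] + intercept
days_to_failure = (
    (FAILURE_THRESHOLD - current_level) / slope if slope > 0 else float("inf")
)

# Use case 3 - act: turn the prediction into a maintenance decision.
if days_to_failure < 14:
    action = "schedule maintenance within the next two weeks"
elif anomalies[-7:].any():
    action = "inspect the part at the next planned downtime"
else:
    action = "no action; keep monitoring"

print(f"anomalies flagged: {int(anomalies.sum())}")
print(f"estimated days until threshold: {days_to_failure:.1f}")
print(f"recommended action: {action}")
```

Note how the detection step's baseline feeds the prediction, and the prediction feeds the action rule—each use case builds on the one before it, which is why the sequence matters for staffing and resourcing.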

To anchor use cases to the aspiration, frame them as questions to be addressed, not tools to be built.

Third, prioritize use cases from the potentially hundreds that could drive results, using three criteria (a simple scoring sketch follows the list):

  • Impact: The value the use case can capture relative to the aspiration, and how quickly it can be captured.
  • Feasibility: The organization’s ability to execute the use case. For example, does it have the right data, talent, and technology?
  • Amplification: The extent to which a use case builds the organization’s ability to execute more of them—perhaps because it cleans data that can be used again or builds useful data architecture or skills. One public-sector agency chose a use case aimed at answering common, resource-consuming requests for information. A request by a departmental leader for spending figures, for example, could cascade into dozens of requests to people lower down in the organization to collect the data from different programs in the unit’s portfolio. A data and analytics project to build a dashboard with real-time answers to a range of questions not only captured efficiencies but also ingested, cleaned, and imposed order on a significant quantity of program, finance, and contracting data for other use cases.
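
One lightweight way to apply the three criteria is a weighted score per candidate use case. The sketch below assumes hypothetical use cases, scores, and weights; in practice the inputs would come from structured assessments rather than hard-coded numbers.

```python
# Minimal sketch of scoring use cases on impact, feasibility, and
# amplification. Names, scores, and weights are hypothetical.
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    impact: float         # value captured relative to the aspiration (0-10)
    feasibility: float    # data, talent, and technology readiness (0-10)
    amplification: float  # how much it enables later use cases (0-10)

    def score(self, weights=(0.5, 0.3, 0.2)) -> float:
        w_impact, w_feasibility, w_amplification = weights
        return (w_impact * self.impact
                + w_feasibility * self.feasibility
                + w_amplification * self.amplification)

candidates = [
    UseCase("Real-time spending dashboard", impact=6, feasibility=8, amplification=9),
    UseCase("Predictive maintenance model", impact=9, feasibility=5, amplification=4),
    UseCase("Inspection-report summaries", impact=4, feasibility=7, amplification=3),
]

# Rank candidates by weighted score, highest first.
for use_case in sorted(candidates, key=lambda uc: uc.score(), reverse=True):
    print(f"{use_case.score():.1f}  {use_case.name}")
```

In this illustration, the amplification weighting lifts a moderate-impact use case (the dashboard, which cleans and organizes data others can reuse) above a higher-impact one that enables less downstream work.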

Finally, sequence the prioritized use cases in a road map. Successful road maps do not necessarily begin with the highest-impact initiatives. Instead, use cases are sequenced with a view to their collective force. Those that require similar data or data systems can be grouped together to speed deployment (see the sketch below). But many best-in-class organizations build a lighthouse—that is, they implement 10 to 15 use cases within one organizational unit or focused on one topic. The concentration delivers change that can be seen, not incremental improvements, and so builds support for broader adoption. The US Air Force recently created such a lighthouse with the aim of improving aircraft readiness to deploy and allocating resources more efficiently. The goal was 80 percent readiness, a higher level of performance than military fleets often achieve. By sequencing multiple use cases for two platforms, it achieved the readiness goal for priority units six years earlier than originally projected.3
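
The grouping step lends itself to a simple sketch: order prioritized use cases by shared data domain so that related work ships together in waves. The use cases, domains, and scores below are hypothetical.

```python
# Minimal sketch of road-map sequencing: group prioritized use cases by
# the data domain they rely on so related work ships together in waves.
from itertools import groupby

# (name, primary data domain, priority score) - all hypothetical.
use_cases = [
    ("Spending dashboard", "finance", 8.1),
    ("Contract-risk flags", "finance", 7.4),
    ("Parts-failure detection", "maintenance", 7.9),
    ("Parts-failure prediction", "maintenance", 6.8),
    ("Inspection scheduling", "maintenance", 6.2),
]

# Sort by domain, then by descending score within each domain.
use_cases.sort(key=lambda uc: (uc[1], -uc[2]))
for wave, (domain, group) in enumerate(
    groupby(use_cases, key=lambda uc: uc[1]), start=1
):
    names = [name for name, _, _ in group]
    print(f"Wave {wave} ({domain}): {', '.join(names)}")
```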


Build the data infrastructure incrementally

Organizations will, without doubt, have to invest time and resources in cleaning data and building infrastructure. But it is important not to get bogged down in the endeavor and waste resources. We have seen organizations spend years trying to aggregate and clean raw data in a single location—previously an enterprise data warehouse and now more often a data lake—before beginning to execute use cases, only to discover the data lake has become a data swamp, full of poorly organized and governed data that is of limited use. The approach also risks locking in old technologies or delivering outdated solutions. One public-sector organization spent so long trying to build a single data repository that other units built their own solutions in the interim, further frustrating consolidation efforts.

A better approach is to ingest data and build the architecture incrementally. There are three considerations:

Architecture strategy

  • Consider whether bespoke data architecture is needed, perhaps due to specific security concerns or unique systems, or whether platform-as-a-service or off-the-shelf solutions will suffice.
  • Add the elements of the data architecture incrementally, in line with the use case road map. Not everything has to be in place from the start. Make sure, however, that the architecture is flexible enough to add new capabilities, as data and analytics needs scale.

Data ingestion and cleaning

  • As with the architecture, clean and ingest data as and when it is needed to support use cases in the road map, not before.
  • Design landing zones and data integration layers for raw data.
  • Create a conformed data layer—that is, a layer between the raw-data layer and the analytics tools and dashboards—where the data can be cleaned and integrated. The layer can serve as the source of truth—the gold standard of data for that domain. A minimal sketch of such a layer follows this list.
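
As a minimal sketch of what the conformed layer does, the following Python snippet (using pandas, with hypothetical table names, columns, and cleaning rules) takes raw landing-zone records and produces a cleaned, typed table that analytics tools could treat as the source of truth.

```python
# Minimal sketch of a conformed data layer: raw landing-zone records in,
# cleaned and typed records out. All names and rules are hypothetical.
import pandas as pd

# Raw landing zone: data arrives as-is from source systems.
raw_contracts = pd.DataFrame({
    "vendor": [" Acme Corp", "acme corp", "Beta LLC", None],
    "amount_usd": ["1000", "2500.50", "n/a", "750"],
    "signed_date": ["2023-01-15", "2023-02-15", "2023-03-01", "not a date"],
})

# Conformed layer: standardized names, numeric amounts, parsed dates,
# and invalid or incomplete records dropped.
conformed = (
    raw_contracts
    .assign(
        vendor=lambda df: df["vendor"].str.strip().str.title(),
        amount_usd=lambda df: pd.to_numeric(df["amount_usd"], errors="coerce"),
        signed_date=lambda df: pd.to_datetime(df["signed_date"], errors="coerce"),
    )
    .dropna(subset=["vendor", "amount_usd"])
    .drop_duplicates()
)
print(conformed)
```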

Data governance and ethics

  • Create a data-governance strategy for determining where data is located, who has access, and how it is being used. Best-in-class government data and analytics efforts create a central unit to establish policies and processes, and appoint data stewards—typically from a function familiar with the data—who are then responsible for ingesting, cleaning, and structuring the data domains. They also adopt iterative principles in day-to-day governance.
  • Implement tools and processes to ensure data is used ethically and bias is mitigated. Failure to do so is a huge reputational risk that can lead to a loss of public trust. Data governance and models must therefore embody explicit ethical principles that are guided and enforced by a risk-management framework, with controls to test fairness and ensure ethical outputs; one such control is sketched below.
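
One concrete fairness control is to compare a model's decision rates across demographic groups and alert when they diverge. The sketch below uses hypothetical decisions and the common 80 percent ("four-fifths") ratio rule of thumb; real controls would be tailored to the program and its legal context.

```python
# Minimal sketch of a fairness control: compare approval rates across
# groups and flag large gaps. Data and group labels are hypothetical.
from collections import defaultdict

# Hypothetical (group, decision) pairs from an eligibility model,
# where 1 means approved.
decisions = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_b", 1), ("group_b", 0), ("group_b", 0), ("group_b", 0),
]

totals = defaultdict(int)
approvals = defaultdict(int)
for group, decision in decisions:
    totals[group] += 1
    approvals[group] += decision

rates = {group: approvals[group] / totals[group] for group in totals}
print("approval rates:", rates)

# Flag any group whose rate falls below 80 percent of the highest rate
# (the "four-fifths" rule of thumb for disparate impact).
highest = max(rates.values())
for group, rate in rates.items():
    if rate < 0.8 * highest:
        print(f"ALERT: {group} approval rate {rate:.0%} fails the 80% ratio test")
```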

Design an empowered analytics function

A successful data and analytics transformation depends upon top leadership support. But those leaders then need to ensure that the function has the authority needed to move swiftly and with impact. There are two common public-sector missteps.

One is failing to make the data and analytics lead a senior post, which signals that data and analytics projects are a low priority and slows progress when requests for approval must escalate through layers of bureaucracy. Problems are compounded if the function then sits within IT, as the two functions often have conflicting interests: IT may want to reduce costs, while data and analytics will likely be requesting bold investments to achieve transformational outcomes.

The second common misstep is to appoint a chief data or chief analytics officer but withhold sufficient decision-making and policy authority, enforcement ability, or budget. Without this organizational leverage, they will struggle to deliver a fast, wide-scale transformation. Funding can be a particular challenge for public-sector organizations, where budget cycles sometimes dictate that requests for funds have to be made two to three years in advance. Establishing a seed fund on which the data and analytics leader can draw at any time helps ensure use cases aren’t put on hold while awaiting fund approval.

Other organizational choices are important too. There will need to be a clear, agile operating model that determines processes, decision rights, and accountability to help teams work fast while minimizing risk. Structure also matters. If the data and analytics function is too centralized, business units can feel sidelined and that data and analytics are being forced upon them. If too decentralized, it can be hard to prioritize data and analytics resources or to standardize and scale them across the organization. Often, successful organizations start with a center of excellence to focus efforts, then move to distributed models with data and analytics embedded in the business units as their analytics capabilities mature.

Talent is another key issue, and one that can be particularly challenging in the public sector (see sidebar, “Data and analytics talent: Who you need and where to look”).

Invest in changing the way people think and work

Some public-sector leaders report that persuading employees to let go of entrenched ways of working proved harder than identifying and executing use cases. Despite all the investment, new tools sat unused.

Six measures can help build a culture that embraces data and analytics and its power to aid decision making.

  • Create a communications plan to share the organization’s aspirations for data and analytics and introduce the use cases and the new way of working. Everyone, from frontline workers to senior leaders, needs to understand why they are being asked to replace old working habits with new tools. And everyone needs a tailored message that explains how the transformation makes their own jobs easier and raises the level of service they can provide to colleagues and citizens.
  • Ensure leadership backs the new approach publicly, making clear that achieving the organization’s goals depends upon it. Think of it as pushing outcomes that require analytics, rather than pushing analytics.
  • Appoint a business sponsor responsible for driving implementation and adoption of use cases. If a lighthouse unit or topic area has been chosen, the sponsor will be a leader from that unit or area.
  • Identify change leaders at all levels of the organization to champion initiatives and use cases and influence peers and direct reports.
  • Train the consumers of the use cases—those who are expected to make different decisions based on the outcome of the analytics or to operate differently. They will need to learn how to use new tools, relying on a smart alert to schedule timely equipment maintenance, for example, rather than instinct.
  • Establish and monitor metrics for success so the organization is held accountable for performance improvements, not just for data and analytics spend and deployment.

The framework described here does not diminish the effort, and ultimately the resources, that public-sector organizations will need to devote to harnessing the power of data and analytics. It does, however, suggest an approach that helps them plot a surer path toward that goal, overcoming the challenges that can dog the public sector and ensuring that effort and resources have lasting impact.
