The Use of People Analytics to Improve Organizational Health and Performance

September 18, 2017

Summary from September 15, 2017 EOM Panel Meeting


The Office of Management and Budget has asked the National Academy of Public Administration (NAPA) for insights on how to leverage the growing body of data on the organizational health and performance of federal agencies and programs. One senior OMB official laments: “I’ve got all these office-level indicators of employee behavior and performance. What do they tell me and how can I use them to improve government operations?”

The annual Federal Employee Viewpoint Survey (FEVS) provides data on 26,000 work units across the government, and the Office of Personnel Management has made these data available to federal managers. How are federal agencies analyzing and using these data?

In parallel, private sector corporations also collect and analyze employee data; in fact, an entire industry has grown up around the use of sophisticated “people analytics.” How do these firms analyze their data and use them to improve performance in their operations? Are there lessons or inspirations for the federal government?


The Trump Administration has kept the existing federal performance management framework, a first for an incoming administration. This gives OMB an opportunity to build upon it rather than start from scratch. The increased availability of people data at the organizational unit level in agencies, both survey data and individual personnel details, creates a transformational opportunity. But with these new data come some practical questions:

How can OMB develop an agenda focused on solving problems, rather than starting with conceptual solutions in search of problems? With data and analytics, it is possible to go beyond department- or agency-level assessments of organizational health and performance and delve into front-line operational units, where the work of government actually gets done:

  • Identify organizations that pose clear mission risks but that no one currently sees it as their job to follow up on; e.g., analysts have identified the 200 worst-rated units (out of 28,000) in the government and found that they have gotten worse over time.
  • Compare the practices of leaders in high and low performing units and reduce the gap.
  • Change the dynamics of what leaders focus on – not just budget and policy, but effective management.
    • What can we do that would be most helpful?
  • Focus on how we can change managers’ behaviors to matter to the bottom line.
    • The business case for improving management at the organizational unit level is ironclad; we don’t need to focus on making that case. We need to find ways to model that behavior at the operational and bureau levels, not at headquarters.
  • Engage organizational unit heads to model the behaviors and actions of leaders of high performing organizations.
  • Transform information into better management decisions.
  • Foster a culture of inquiry and force difficult conversations to happen without their being perceived as punitive actions rather than learning opportunities.
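The first bullet above, flagging the worst-rated units that are also getting worse over time, can be sketched in a few lines. This is a minimal, hypothetical illustration; the unit names, scores, and cutoff are invented, not drawn from FEVS data.

```python
# Hypothetical sketch: flag work units that are both low-rated and declining.
# Unit names, scores, and the cutoff are invented for illustration.
scores = {
    "Unit A": [62, 58, 54],   # engagement score by year, oldest first
    "Unit B": [71, 73, 74],
    "Unit C": [49, 47, 44],
}

def flag_mission_risks(scores, bottom_cutoff=55):
    """Return units whose latest score is below the cutoff and strictly declining."""
    flagged = []
    for unit, history in scores.items():
        latest = history[-1]
        declining = all(a > b for a, b in zip(history, history[1:]))
        if latest < bottom_cutoff and declining:
            flagged.append(unit)
    return sorted(flagged)

print(flag_mission_risks(scores))  # ['Unit A', 'Unit C']
```

In practice an analyst would rank all 28,000 units and take the bottom 200 rather than use a fixed cutoff, but the logic of combining a level check with a trend check is the same.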

The strategic focus in using employee feedback data should be on employees’ job engagement, not necessarily their satisfaction (15 of the 71 questions on the annual Federal Employee Viewpoint Survey are used to create the engagement index).
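To make the engagement index concrete, here is a minimal sketch assuming the index is computed as the average percent-positive across the engagement items, with responses of 4 or 5 on a five-point scale counted as positive. The item IDs and response values are invented for illustration.

```python
# Illustrative sketch of an engagement index: average percent-positive
# across a subset of survey items. Item IDs and responses are invented.
responses = {
    # item id -> responses from one work unit (1-5 scale; 4 or 5 = positive)
    "Q3":  [5, 4, 3, 4, 2],
    "Q11": [4, 4, 5, 3, 4],
    "Q53": [2, 3, 4, 5, 5],
}

def percent_positive(answers):
    """Share of responses (as a percentage) that are 4 or 5."""
    return 100 * sum(1 for a in answers if a >= 4) / len(answers)

def engagement_index(responses):
    """Average percent-positive across the engagement items."""
    return sum(percent_positive(a) for a in responses.values()) / len(responses)

print(round(engagement_index(responses), 1))  # 66.7
```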

What's New

Data and analytics have been around since WWII, through operations research, program evaluation, and economics. These evolved to include processes such as PerformanceStat. Now there are “organic,” real-time data that are relevant and useful to front-line supervisors.

  • How do we distill this data and provide it to managers in such a way as to speed up their learning curve?
  • What processes and what types of data do agencies use to make performance-related decisions?
  • What specifically can managers do to make a difference?

The Case of USDA

The Department of Agriculture ranked #16 out of 19 large federal agencies in 2015. By the end of 2016, it had improved its ranking to #9. Top-level leadership commitment was key. Leaders saw improving organizational engagement as a long-term process that is as much art as science.

  • The assistant secretary for management became the department’s “face” for its effort to improve employee engagement. He travelled the country, meeting with employees and holding town halls to learn directly from them what they felt they needed.
  • Each USDA program unit decided how to approach improvements. However, some initiatives were run centrally, from the department level, such as improvements in diversity and retention.
  • The department developed a cadre of internal coaches, created an engagement office, and used employee advisory councils to solicit input and serve as trusted communication channels.
  • To improve interactions among the CXO offices, the assistant secretary had each office head take on his job as assistant secretary for management so they could see the importance of the connections between their individual roles and the overall operation of the department.

The Case of GSA

Like USDA’s, GSA’s employee engagement scores had dropped several years in a row. In 2015, GSA ranked slightly below the median among mid-sized agencies. In response, GSA brought in a consultant to analyze its employee survey data at a high level, to help leadership set priorities for action. Leadership moved along two tracks: one agency-wide and the other at the office level.

  • GSA-wide efforts:
    • Conducted a follow up survey to get more granular responses (e.g., the question about “trust in leaders” was broken down to front-line, political, career SES, etc.)
    • Focused on communications and recognition – conducted periodic pulse surveys to assess effectiveness
    • Created an HR dashboard to track improvements in hiring processes
  • Office-level efforts:
    • Implemented plans, analytic support, designated local points of contact for follow up
    • Found performance ratings were not useful metrics

The Case of IBM

Much of existing HR data is “dark data,” but it is increasingly accessible. IBM can now “reimagine HR” through the use of artificial intelligence, and democratize decision making by putting data into the hands of managers.

  • Managers won’t use data in decision-making unless it is made simpler to access and interpret.
  • IBM created a “data lake” to analyze its HR data from multiple data sources.
  • It segmented its populations so it could create targeted interventions, for example individualizing learning and career progression and focusing on building team expertise.
  • It put the employee at the center of the design of its new performance management system, which is more tailored and involves more frequent sit-down discussions.
  • Concluded that “change management” is unnecessary if employees are actively engaged in helping co-create approaches and solutions.
  • Used other kinds of data beyond surveys: pay, location, organizational performance, skill levels, use of social media, and extent of internal connections.

IBM used frequent, targeted “pulse surveys” more than organization-wide annual surveys, asking more informal questions to gain insight and make course corrections. Surveys are more useful when they can be tailored to meet specific needs.
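IBM’s “data lake” approach, pulling employee records from multiple sources into one view and then segmenting the population for targeted interventions, can be sketched at a toy scale. Everything here is invented for illustration: the employee IDs, the fields, and the threshold.

```python
# Hypothetical sketch of combining multiple HR data sources by employee ID,
# then segmenting the population for targeted interventions.
# All records and thresholds are invented for illustration.
survey = {"e1": {"engagement": 72}, "e2": {"engagement": 55}}
personnel = {"e1": {"location": "HQ", "skill_level": 3},
             "e2": {"location": "Field", "skill_level": 1}}

def combine(*sources):
    """Merge per-employee records from several sources into one view."""
    merged = {}
    for source in sources:
        for emp_id, record in source.items():
            merged.setdefault(emp_id, {}).update(record)
    return merged

def segment(merged, key, threshold):
    """Split employees into two groups on one attribute."""
    high = [e for e, r in merged.items() if r.get(key, 0) >= threshold]
    low = [e for e, r in merged.items() if r.get(key, 0) < threshold]
    return {"high": high, "low": low}

combined = combine(survey, personnel)
print(segment(combined, "engagement", 60))  # {'high': ['e1'], 'low': ['e2']}
```

A real data lake would handle many more sources and messier keys, but the design idea is the same: join first, segment second, so interventions can be targeted on the full picture of each employee rather than on any single data source.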

The Case of the Department of Veterans Affairs

VA has conducted its own employee survey for about 20 years, reaching about 17,000 work units, and has hundreds of survey coordinators around the country. It is in the process of aligning its survey instrument with OPM’s FEVS.

VA has found that 70 percent of the variation in survey responses is based on the quality of frontline supervisors.
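A claim like “70 percent of the variation is based on the quality of frontline supervisors” typically rests on a variance decomposition: how much of the total spread in scores lies between supervisor groups versus within them. Here is a minimal sketch of that calculation; the supervisor groups and scores are invented for illustration.

```python
# Illustrative one-way variance decomposition: the share of total variation
# in survey scores that lies between supervisor groups rather than within
# them. Groups and scores are invented for illustration.
from statistics import mean

groups = {
    "supervisor_1": [80, 82, 78],
    "supervisor_2": [55, 52, 58],
    "supervisor_3": [70, 68, 72],
}

def between_group_share(groups):
    """Fraction of the total sum of squares explained by group means."""
    all_scores = [s for g in groups.values() for s in g]
    grand = mean(all_scores)
    total_ss = sum((s - grand) ** 2 for s in all_scores)
    between_ss = sum(len(g) * (mean(g) - grand) ** 2 for g in groups.values())
    return between_ss / total_ss

print(round(between_group_share(groups), 2))
```

In these toy numbers nearly all the variation is between supervisors; a result like VA’s 70 percent would mean the supervisor a unit reports to predicts most of the differences in how its employees respond.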

For agencies considering decentralized implementation of survey results, the “data literacy” of front-line managers will be a challenge, even in interpreting dashboards (e.g., nurses preferred a narrative version to charts and graphs). VA therefore had to present data in simplified forms for the smaller work units; as a result, it highly filters and simplifies the data that go out to front-line supervisors.

VA has both an agency-wide strategy and a bottom-up approach to its change efforts.

Discussion Highlights

  • Any approach has to be both top-down and bottom-up. The key is how to make the organization’s strategies and values real; we need to be able to answer, “What do you need to do to be successful?”
  • Better engagement is seen as leading to better results, but it is less clear how administrative processes and performance connect.
  • What should we be learning? What questions should we be asking? Not just for now (e.g., the importance of creating data literacy) but down the road (e.g., future data sources the government should be looking at). Our report should offer a vision of where the government should be heading in the future.
  • What should OMB be focusing on because no one else is? What could lead to exponential change? What first steps need to be taken?
  • We should lay out the ideal design and put it in the context of what we can do today, given current realities and constraints on how government works.
  • There is a broad lack of data literacy at lower levels of management, which inhibits the use of survey results as a tool for improving organizational performance. Data should be converted into narratives so that managers can better comprehend them and translate them into actions that improve organizational performance.
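The last point, converting data into narratives for managers with limited data literacy, can be sketched as a simple templating step. The thresholds, benchmark, and wording below are invented for illustration; a real system would tune them to the agency’s own survey.

```python
# Hypothetical sketch: translate a work unit's survey numbers into a short
# narrative a front-line supervisor can act on. The benchmark (65) and the
# trend thresholds are invented for illustration.
def narrate(unit, score, prior_score):
    """Build a one-sentence plain-language summary of a unit's scores."""
    change = score - prior_score
    if change > 1:
        trend = "improved since last year"
    elif change < -1:
        trend = "declined since last year"
    else:
        trend = "held steady since last year"
    standing = "above" if score >= 65 else "below"
    return (f"{unit}'s engagement score is {score}, {standing} the "
            f"government-wide benchmark, and has {trend}.")

print(narrate("Unit A", 58, 63))
```

The point of the sketch is the design choice: the manager sees one sentence in plain language, while the filtering and comparison logic stays with the analysts, which is essentially the approach VA described for its front-line supervisors.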