Better Data Collection Would Improve Analysis of NEPA Regulations

April 30, 2018

In June, the Council on Environmental Quality (CEQ) published an advance notice of proposed rulemaking (ANPRM) as a first step in considering revisions to its implementing rules for the National Environmental Policy Act (NEPA). Before the public comment period closed on August 20, I submitted a comment focusing on how CEQ can align NEPA regulations with regulatory best practices and improve data collection for conducting retrospective review. Regulatory agencies should routinely look back at existing regulations just as CEQ is doing, but CEQ's attempt to evaluate its NEPA regulations highlights how data availability is a significant obstacle to effective retrospective review. Without better data collection and reporting, it will be difficult to analyze how NEPA implementation has evolved over time and across agencies, and to investigate whether the Act is achieving its goals.

The NEPA process and CEQ’s ANPRM

NEPA requires agencies to consider environmental effects before undertaking a major federal action, while the implementing regulations instruct agencies how to go about that process. President Carter's Executive Order 11991 (1977) directed CEQ to issue regulations implementing the procedural provisions of NEPA. In short, CEQ's NEPA regulations govern how federal agencies (as well as state, local, tribal, or private actors involved in a project) incorporate environmental review into their decision-making. If a project does not fall under a Categorical Exclusion (CE), a type of action that has been identified as not having a significant effect on the environment, then agencies must conduct an Environmental Assessment (EA) and/or a more extensive Environmental Impact Statement (EIS).

CEQ explained that the ANPRM aims to gather public feedback on whether revising existing regulations would lead to “a more efficient, timely, and effective NEPA process consistent with the national environmental policy stated in NEPA.” This is an admirable start, but CEQ should go further by establishing a foundation for retrospective review.

Current situation: Minimal information on NEPA implementation

A detailed 2014 Government Accountability Office (GAO) report investigated the available data on NEPA implementation and analyses and found few sources. Government-wide data on the number and type of NEPA analyses are sparse. While the Environmental Protection Agency (EPA) maintains an EIS database with records going back to 1987, EISs constitute less than 1% of all NEPA analyses.

Furthermore, agencies rarely track the costs of completing NEPA reviews, and the benefits are often difficult to convey because of their largely qualitative nature. Even when costs are reported, their large variance limits the usefulness of high-level estimates: GAO found that an EA could range from $5,000 to $200,000 and an EIS from $250,000 to $2 million. NEPA also operates as an umbrella statute by integrating reviews required under other environmental laws, which makes it difficult to attribute costs and benefits to NEPA reviews rather than to other environmental requirements. The time frames for performing NEPA reviews suffer from a similar problem, since factors other than the NEPA process may delay a project. And while there is some information on completion times for EISs, such data are inconsistent for EAs and CEs.

Other ad hoc reports provide valuable but limited information on NEPA implementation (e.g., NEPA at 19, NEPA's Effectiveness after 25 Years, and Modernizing NEPA Implementation). These reports all raise important points about how to improve NEPA regulations, but they were produced infrequently, and progress on their recommendations was not monitored. More importantly, the data they contain are not sufficient for comparing consistent metrics across time and agencies. The bottom line is that reliable data on key measures are severely lacking, which constrains public understanding of whether NEPA implementation has improved over the years and where it can be most productively reformed.

How can data collection be improved?

To begin to resolve these data limitations, CEQ should revise its NEPA regulations to establish expectations for clear and comparable metrics that can be used to measure improvement. In particular, CEQ should consider establishing metrics that generate useful information on the NEPA process and inform ex post analysis. Furthermore, the agency should pair data collection improvements with provisions for retrospective review.

CEQ should direct agencies to collect and report the following key measures (see Recommendation 2 in my public comment):

  • Number and types of analyses
  • Completion times for EISs and EAs
  • Cost data for EISs and EAs
  • Document length of EISs and EAs

Each measure should be delineated by agency, state, and project type. Categories for project types may have to be established, too, but the important point is that trends can be tracked and evaluated across consistent dimensions. While initial attempts at government-wide data collection may be imperfect, a starting point that can be revised is critical, because progress often comes through incremental change.
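
To illustrate what comparable records might look like, here is a minimal sketch in Python of a standardized entry for a single NEPA analysis. The field names, categories, and the `NepaAnalysisRecord` structure are hypothetical illustrations, not anything CEQ has prescribed; the point is simply that every measure carries the same dimensions (agency, state, project type) so records can be compared across time and agencies.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional

class AnalysisType(Enum):
    """The three levels of NEPA review discussed above."""
    CE = "Categorical Exclusion"
    EA = "Environmental Assessment"
    EIS = "Environmental Impact Statement"

@dataclass
class NepaAnalysisRecord:
    """One standardized, comparable record per NEPA analysis.

    Field names are illustrative; actual definitions (e.g., when the
    'clock' starts and stops, what counts toward cost) would have to
    be fixed by CEQ so figures are comparable across agencies.
    """
    agency: str                      # e.g., "DOE"
    state: str                       # primary project location, e.g., "NM"
    project_type: str                # from a CEQ-established category list
    analysis_type: AnalysisType
    start_date: date                 # e.g., date a notice of intent issued
    completion_date: Optional[date]  # None while the review is ongoing
    cost_usd: Optional[float]        # preparation cost, if tracked
    page_count: Optional[int]        # document length for EAs/EISs

    @property
    def completion_days(self) -> Optional[int]:
        """Completion time, a metric GAO found inconsistently reported."""
        if self.completion_date is None:
            return None
        return (self.completion_date - self.start_date).days
```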

DOE’s Lessons Learned Program as a template

For agencies looking for a starting point to collect the needed data, the Department of Energy’s (DOE) Lessons Learned Program offers helpful insights—both in what agencies should and should not be doing. Since 1994, DOE has published quarterly reports on its NEPA compliance efforts.

DOE’s approach could be used as a template for other agencies, most fundamentally by demonstrating that it is possible for agencies to collect the requisite data. DOE uses its Lessons Learned reports to convey time and cost metrics associated with NEPA compliance, including EIS completion times, the length of EIS documents, and cost data on preparation. The agency has even broken down the proportion of its NEPA analyses in terms of EISs, EAs, and CEs.

But DOE’s quarterly reports also pose challenges to reviewers. Most notably, the reports have limited usefulness because key information is neither consistently included nor available in a public database. Even data that appear in nearly every report (such as time and cost statistics) are not always conveyed in a standard format. The presentation of time and cost data ranges from relatively simple in many reports to more granular in others (e.g., the detailed December 1995 report includes cost and completion-time data for both EISs and EAs, facts about the characteristics of specific projects, analysis of cost and time outliers, and trend analysis of EA and EIS data).

When the presentation of data is inconsistent, it takes more effort for stakeholders and researchers to analyze trends and outcomes. Simply reporting the data in a database with regular updates would greatly aid analysis of NEPA reviews. And seeking to provide consistency over time does not rule out evolution in reports, documents, and data collection. In fact, a public database would aid in these efforts as CEQ receives feedback on how data collection could be enhanced. In addition, establishing a comprehensive database with information from each agency would permit examination of interagency trends and comparison of outcomes among agencies, project types, and states.
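
As a rough sketch of why a comprehensive database matters, the following Python snippet (using pandas, with entirely invented figures) shows how interagency comparisons become simple queries once records from every agency share consistent fields like those sketched above.

```python
import pandas as pd

# Hypothetical combined table built from each agency's reported records;
# column names follow the record sketch above, and all values are made up.
df = pd.DataFrame([
    {"agency": "DOE", "state": "NM", "project_type": "energy",
     "analysis_type": "EIS", "completion_days": 820, "cost_usd": 1_400_000},
    {"agency": "DOT", "state": "TX", "project_type": "highway",
     "analysis_type": "EIS", "completion_days": 1_100, "cost_usd": 900_000},
    {"agency": "DOE", "state": "ID", "project_type": "energy",
     "analysis_type": "EA", "completion_days": 310, "cost_usd": 120_000},
])

# With consistent fields, comparing outcomes across agencies reduces to a
# simple grouped summary: median completion time and cost by agency and
# analysis type (the same query works for project types or states).
summary = (df.groupby(["agency", "analysis_type"])
             [["completion_days", "cost_usd"]]
             .median())
print(summary)
```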

Admittedly, a tradeoff often exists between consistency and adaptability, and CEQ may have to develop a transparent method for maintaining the definition of each data point over time. Simpler metrics, such as document length, are unlikely to pose any challenges. For complex identifiers (e.g., the North American Industry Classification System), organizations often publish concordances that make it possible to compare older and newer data. CEQ should be mindful of what historical data are available and how revisions could affect consistency, and agencies should clearly document any changes to their collection and reporting efforts.
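
To make the concordance idea concrete, here is a small hypothetical example in Python. The category names and mapping are invented for illustration; the approach mirrors the published old-to-new code mappings that accompany NAICS revisions, so historical records remain usable after a category scheme changes.

```python
# Illustrative concordance: if CEQ revised a (hypothetical) set of
# project-type categories, publishing an old->new mapping alongside the
# revision would keep historical data comparable with new data.
CONCORDANCE_V1_TO_V2 = {
    "energy": "energy_generation",      # category narrowed in the revision
    "roads": "surface_transportation",  # category renamed
    "other": "other",                   # unchanged
}

def harmonize(project_type_v1: str) -> str:
    """Map a v1 project type to its v2 equivalent for trend analysis."""
    return CONCORDANCE_V1_TO_V2.get(project_type_v1, "unmapped")

# Older records can then be analyzed alongside newer ones:
print(harmonize("roads"))  # -> "surface_transportation"
```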

Conclusion

Combined with instituting periodic reviews of NEPA regulations, better data collection and analysis would improve evaluation of the effectiveness of NEPA implementation. Data should be comparable across time and agencies and made publicly available. Agencies like DOE have already demonstrated that it is possible to collect and report such data, even if the methods of conveying the information to the public could be improved. But agencies need to begin somewhere. Opening up CEQ’s NEPA regulations for comment is an important initial step, and CEQ should go on to propose a regulatory change that will enhance the data available on NEPA compliance. By laying a foundation for retrospective review, CEQ will better align its NEPA regulations with regulatory best practices.

Read Mark Febrizio's public interest comment on CEQ's proposal.