
Two key features of America’s fishery management system are its federated structure and the strength of its conservation mandates. Many would agree that these traits have been pillars of our success: devolving decision-making authority to the regional level has helped make management responsive to local needs, while strong conservation mandates keep fish stocks healthy and productive now and into the future. However, these two features are also central to the growing fishery information system challenges that we face.

The need to meet the conservation mandates that have brought our fisheries back from the brink has driven demand for more accurate, precise, timely, and readily accessible data. For example, catch limits must be set based on “the best scientific information available,” often depending on a stock assessment built using an array of data streams. Catch limits and other management measures must be implemented by capturing and utilizing catch data in as close to real time as possible. And the rules must be enforced, both at sea and at the dock, through some combination of human and electronic systems.

Our decentralized structure has been one of several factors that have complicated our ability to meet these needs efficiently and effectively. At the regional and national levels, we have been saddled with an array of legacy data collection and management systems, practices and policies that prevent us from taking full advantage of modern technology and other tools to deliver better science, business and management products. These include:

Inadequate data availability: Despite strong interest from fishermen and managers alike in modernizing monitoring and reporting systems, fishery data remain slow to compile, incomplete, expensive and often inaccurate. Human onboard observers remain more prevalent in US fisheries than electronic monitoring systems, and most reporting continues to be done on paper forms. This represents a missed opportunity in many fisheries to ensure that scientists, managers and enforcement authorities have access to the high-quality, timely fishery-dependent data that they need. For example, in the New England groundfish fishery, coverage rates were reduced to just 14% after the observer cost burden was transferred to industry, further exacerbating concerns about compliance and the integrity of fishery-dependent data for the stock assessment process. Improved use of technology and increased efficiency could allow for increased coverage and reduced costs.

In recreational fisheries, catch estimates are largely based on survey data that are sometimes compiled months after fishing activity has occurred. Delays and inaccuracies in Marine Recreational Information Program data have led to repeated recreational overages in the Gulf of Mexico red snapper fishery, and the limitations of existing data collection systems and practices present a huge obstacle for those interested in designing better alternatives for anglers.[1]

Outdated and fragmented data management systems: Regional data management systems have grown organically over the last forty years, as managers and IT staff have digitized historic data sets and reacted to new demands from the law, the public, and fishery participants. This ad hoc evolution means that data management systems have not always been built to meet the needs of users in accordance with best practices, are often incompatible with each other, and in many instances have not kept pace with technological advances that have transformed how data are used, stored and organized in other industries. Data cannot easily be compared across gears, sectors, and regions because data are coded and captured differently depending on how they are collected. Most states and regions have their own distinct codes for species and gear, and there may be no code or timestamp that allows managers to match a logbook to an observer report for the same trip. If data are difficult for managers and fishermen to access, they can be even more difficult to access publicly. In early 2016, Congress found that NOAA’s per-request cost of complying with Freedom of Information Act requests was “uniquely high within the Department [of Commerce] and throughout the federal government”.
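
To illustrate the kind of harmonization work these incompatibilities force on analysts, the sketch below translates region-specific species codes to a shared code and matches a logbook record to an observer record for the same trip. The regional codes, field names, and the composite trip key are all hypothetical; they stand in for the crosswalk tables and matching rules that staff currently have to build by hand wherever a shared identifier is missing.

```python
# Minimal sketch of harmonizing two regional data sets; all codes,
# field names, and the trip-key convention are hypothetical.

# Hypothetical crosswalk: each region's species code mapped to a common code.
SPECIES_CROSSWALK = {
    ("ALASKA", "110"): "PCOD",      # Pacific cod under a numeric state code
    ("WESTCOAST", "PCD"): "PCOD",   # the same species under a letter code
}

def common_species(region, local_code):
    """Translate a region-specific species code to a shared code, if known."""
    return SPECIES_CROSSWALK.get((region, local_code))

def trip_key(record):
    """Build a composite key (vessel, landing date) to stand in for the
    shared trip identifier that many systems lack."""
    return (record["vessel_id"], record["landing_date"])

def match_logbook_to_observer(logbook_rows, observer_rows):
    """Pair logbook and observer records that appear to describe the same trip."""
    observed = {trip_key(r): r for r in observer_rows}
    return [(lb, observed.get(trip_key(lb))) for lb in logbook_rows]

# Example records with deliberately different coding conventions.
logbook = [{"vessel_id": "WN1234", "landing_date": "2016-05-02",
            "region": "ALASKA", "species_code": "110", "lbs": 5200}]
observer = [{"vessel_id": "WN1234", "landing_date": "2016-05-02",
             "region": "ALASKA", "species_code": "110", "sampled_lbs": 480}]

for lb, ob in match_logbook_to_observer(logbook, observer):
    print(common_species(lb["region"], lb["species_code"]),
          "logbook:", lb["lbs"], "observer sample:", ob and ob["sampled_lbs"])
```

Even in this toy example, matching fails as soon as a vessel identifier or landing date is recorded differently in the two systems, which is exactly the gap that a shared trip code or timestamp would close.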

Legacy systems not designed to meet current objectives: In many instances, systems that were originally designed to meet a specific need remain narrowly focused on that task rather than being integrated into a system that meets the full suite of government and stakeholder objectives. In some instances this is true across government functions, for example where an enforcement program is missing the opportunity to capture data that could be valuable to scientists. It is also apparent at the interface with industry. Buyers and processors generally create duplicate systems, and fishermen often have to fill out duplicate or triplicate forms. Once data enter the government system, they may not be accessible to the fishermen themselves, for whom such data could often be valuable in running their businesses. Historic and legal concerns about confidentiality and proprietary business information continue to have a chilling effect on data access, sometimes even for the parties whom the privacy restrictions were designed to protect. These redundancies and missed opportunities reduce overall cost-effectiveness, accuracy and performance. Reviews conducted by NMFS of data collection and management systems in each region note how seriously data stewardship is taken by agency scientists. But reviewers also note some serious gaps, as in this example from the Pacific Islands region’s review:

Current data management and information flow is complicated by multiple hardware and software systems, dispersed offices, and blurred lines of responsibility for data analysis and sharing as mission shifts require new lines of information flow. In some cases, there seem to be problems with the accessibility of data housed both within [the regional science center] and by partner organizations.[2]

While the demands for data continue to rise, NMFS’s budget has been highly constrained and remains below 2010 levels. Without a concerted effort to improve the information systems that drive America’s fisheries, NMFS will fail to capitalize on the performance gains and efficiencies that improved information infrastructure can deliver. Agency systems will not be able to handle new data streams from industry and citizen scientists. Obligations to support anglers and industry in an era of fiscal constraint will be compromised. And NMFS will risk falling behind other agencies, many of which have prioritized information system improvements under successive administrations.[3]

[1] A recent Government Accountability Office report highlighted challenges relating to recreational fisheries data. See: Government Accountability Office. 2015. Recreational Fisheries Management: The National Marine Fisheries Service Should Develop a Comprehensive Strategy to Guide Its Data Collection Efforts. GAO-16-131, a report to congressional requesters.

[2] Pacific Island Fishery Science Center. 2013. Review of Information for Fishery Stock Assessments. Prepared by Gordon Tribble, USGS Pacific Island Ecosystems Science Center.

[3] See, for example: Office of Management and Budget. 2005. Memorandum for the Heads of Executive Departments and Agencies on Improving Public Access to and Dissemination of Government Information and Using the Federal Enterprise Architecture Data Reference Model. M-06-02 from Clay Johnson, III.