ABSTRACT
The “Readiness Evaluation Tool for Oil Spills (RETOS™)” is an application upgraded in 2014 with the support of regional and international experts from industry and government, including organizations such as Oil Spill Response Limited (OSRL), the Regional Activity Centre / Regional Marine Pollution Emergency, Information and Training Centre – Caribe (RAC/REMPEITC-Caribe), and the International Maritime Organization (IMO). The ARPEL Manual and RETOS™ provide a comprehensive set of criteria for industry and governments to assess their level of oil spill response (OSR) planning and readiness. The assessment criteria, agreed upon by the participating companies and institutions, provide the foundation for a series of checklists whereby gaps can be identified in spill response planning and readiness programs. The background for the tools is the “Assessment of Oil Spill Response Capabilities: A Proposed International Guide for Oil Spill Response Planning and Readiness Assessment”, developed for the 2008 International Oil Spill Conference.
The RETOS™ Excel application and Manual list evaluation criteria according to the type of OSR program to be assessed. These tools have:
Seven different scopes and two general perspectives (government and industry), including facilities, companies’ business lines, and government national programs.
Three possible assessment levels for each scope, with OSR planning and readiness assessment criteria that become increasingly demanding.
Criteria in 10 different categories (topic areas) at each level, including identification of critical criteria deemed necessary for completeness at the basic level.
An additional category for institution-specific criteria.
Given that the criteria utilized relate to best international practices, RETOS™ represents a powerful tool for international benchmarking purposes. As of the end of 2016, workshops on how to use the tools have been presented in at least seven countries with over 400 total participants. RETOS™ has been used in more than 30 countries worldwide with most of those assessing national spill preparedness programs. Initial assessment results for Level A (basic) ranged from approximately 20% to 99% completion. Over 20 companies and institutions have utilized the tool with a similar range of results. Re-assessment provides a clear indication of progress toward higher levels of preparedness. The Manual and RETOS™ are currently available in English, Spanish, Portuguese and French and can be downloaded free of charge from the ARPEL web site (www.arpel.org).
INTRODUCTION
In 2007, organizers of the 2008 International Oil Spill Conference (IOSC) agreed to support development of general guidance to assess oil spill response (OSR) readiness programs. As part of that development, the 2008 IOSC Workshop Subcommittee prepared a broad suite of planning and readiness assessment elements to encourage improved response capacity. That initial work set a framework to aid development and maintenance of response management systems to improve OSR readiness, documented in the 2008 IOSC Guideline (Taylor et al., 2008a and 2008b).
Subsequent feedback received from the international community recommended transforming the 2008 IOSC Guideline into a more user-friendly management tool. ARPEL (2016) took the lead on this recommendation and developed the “ARPEL Oil Spill Response Planning and Readiness Assessment Manual” (the “Manual”) and its accompanying assessment tool, the Readiness Evaluation Tool for Oil Spills - RETOS™ (ARPEL, 2014).
RETOS™ has seven scopes or general OSR program areas:
Assessment scopes applicable to either Industry or Government
1. Facility: Generally local geographic context - Refineries, Well or Production Sites, Storage facilities, Tank farms, Floating Storage and Offloading/Floating Production Storage and Offloading, Transfer facilities, Privately-owned port
2. Multi-Facility / Asset Operations: Generally broad or wide geographic context - Pipeline operations, Vessel fleets (tankers, barges), Rail transport, Subsea pipelines and gathering systems
Assessment scopes applicable to Government
3. Port/City/Local: Port facilities, Municipalities
4. Area (Region, Province, State): Sub-national - State, Province, Multi-state/provincial
5. National (& Multi-National): Country-wide, National, Joint National, Multi-National
Assessment scopes applicable to Industry
6. Country or Business Line: Nation-wide Industry program, Pipelines (comprehensive for multiple operations), Fleets, Production, Drilling & Exploration
7. Corporate: Company OSR Program, OSR portion of Corporate HSE Programs, OSR programs defined in ISO and adopted international practices
For each scope, a user first selects an assessment Level (Figure 1). Assessment Levels (A, B, or C) are relative measures of the maturity of a program; a simplified data model of scopes, levels, and criteria is sketched after the level descriptions below. Programs considered complete at each of the three levels generally exhibit the following:
Level A - Preparedness at this level indicates that all components are in place, at least to a minimum level, to ensure a reasonable OSR management capacity. Plans are in place and fully implemented.
Level B – Programs at this level have planned and implemented OSR to more rigorous levels that reflect the feedback and evaluation process necessary for improvement and sustained management capability.
Level C - The top level reflects programs in search of excellence. These programs consistently incorporate feedback to improve sustained readiness through the application of best international practices in OSR concepts, management, planning, and competency.
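To make the structure described above concrete, the following minimal Python sketch models how an assessment could be represented in terms of scopes, levels, categories, and criteria. All class and field names are hypothetical illustrations; the actual RETOS™ tool is an Excel application, and its internal organization is not reproduced here.

```python
# Minimal, hypothetical data model of a RETOS(TM)-style assessment.
# Names, fields, and the score encoding are illustrative only.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class Level(Enum):
    A = "A"  # basic: all components in place, at least to a minimum level
    B = "B"  # more rigorous: feedback and evaluation processes implemented
    C = "C"  # excellence: best international practices consistently applied


@dataclass
class Criterion:
    text: str
    critical: bool = False          # "critical" criteria are identified at Level A only
    score: Optional[float] = None   # e.g., 1.0 complete, 0.5 partial, 0.0 missing, None = N/A


@dataclass
class Category:
    name: str                       # one of the 10 topic areas, or the institution-specific category
    criteria: List[Criterion] = field(default_factory=list)


@dataclass
class Assessment:
    scope: str                      # one of the seven scopes, e.g. "National (& Multi-National)"
    level: Level
    categories: List[Category] = field(default_factory=list)
```

In this framing, the editable institution-specific rows added in the 2014 upgrade would simply appear as one more Category appended to the list.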
Workshops, training, and use conducted after the rollout of RETOS™ between 2011 and 2013 provided multiple channels of feedback and suggestions for upgrades to the application. In 2012, experts from OSRL (Oil Spill Response Limited), ITOPF (International Tanker Owners Pollution Federation), IMO/IPIECA GI WACAF (Global Initiative for West and Central Africa), IPIECA, and OSPRI (Oil Spill Preparedness Regional Initiative for Caspian, Black Sea and Central Eurasia) evaluated the Manual and RETOS™. These experts made suggestions to improve user-friendliness of the tool and to foster its use within the Global Initiative context. Eight major aspects were built into an upgrade leading to RETOS™ V 2.0 and its accompanying Manual (Taylor et al., 2014). The 2014 upgrades were:
Identified “critical” criteria (for Level A only) that, if missing or incomplete, would not allow a program to qualify as complete, regardless of scores for individual criteria.
Added editable blank criteria rows to allow for up to ten institution-specific criteria to be inserted at any given level.
Added RETOS™ V 2.0 functionality for critical and institution-specific criteria such that an N/A, missing, or incomplete/partial score requires an assessor to add Comments and Recommendations.
Developed a more robust assessment report, the Global Performance Analysis (GPA), including sub-scores per assessment category, highlighting any categories with missing or incomplete critical criteria, and enabling simple display of results by category in a radar chart or spider-web diagram (Figure 2).
Enabled auto-generation of a Global Implementation Plan (GIP) in which critical criteria rated as either missing or partial are highlighted and listed as top priorities for improvement, followed by other criteria rated as either partial or missing (Table 1); a simplified sketch of this scoring and prioritization logic follows this list.
Added linkages to the GIP Report so that criteria ‘requiring action’ include a notation pointing to relevant information in the 2008 IOSC Guideline (see right-hand column in Table 1).
Updated the supporting information in the Manual (i.e., the Toolbox) with current literature and guidelines on best practices, including links to web sites; the Toolbox now contains over 150 publicly available technical and scientific references.
Upgraded the ARPEL Manual to reflect changes and improvements made to RETOS™.
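The scoring behaviour described in the upgrade list (category sub-scores for the GPA, the Level A critical-criteria gate, and the ordering of the GIP action list) can be summarized in a short sketch that builds on the hypothetical data model above. The 90% threshold and the “Complete”/“In development” labels follow the text and the report footnotes; the function names, score encoding, and exact formulas are illustrative assumptions rather than the tool’s internal logic.

```python
# Illustrative scoring and GIP ordering, reusing the Criterion/Category/Assessment
# dataclasses from the sketch above. Thresholds and formulas are assumptions.
from typing import Dict, List, Tuple


def category_subscores(assessment: Assessment) -> Dict[str, float]:
    """Percent completion per category, ignoring N/A criteria (score of None)."""
    subs = {}
    for cat in assessment.categories:
        applicable = [c.score for c in cat.criteria if c.score is not None]
        subs[cat.name] = 100.0 * sum(applicable) / len(applicable) if applicable else 0.0
    return subs


def overall_status(assessment: Assessment, threshold: float = 90.0) -> str:
    """Level A gating: a high overall score is not enough to be shown as
    'Complete' if any critical criterion is missing or only partially met."""
    subs = category_subscores(assessment)
    overall = sum(subs.values()) / len(subs) if subs else 0.0
    critical_gaps = [
        c for cat in assessment.categories for c in cat.criteria
        if c.critical and c.score is not None and c.score < 1.0
    ]
    return "Complete" if overall >= threshold and not critical_gaps else "In development"


def gip_action_list(assessment: Assessment) -> List[Tuple[str, str, bool]]:
    """Criteria needing action as (category, criterion, is_critical) tuples,
    with critical gaps listed first, mirroring the GIP report ordering."""
    actions = [
        (cat.name, c.text, c.critical)
        for cat in assessment.categories for c in cat.criteria
        if c.score is not None and c.score < 1.0
    ]
    return sorted(actions, key=lambda item: not item[2])  # critical items first
```

Under this framing, the footnote to Table 1 (a program scoring above 90% still shown as “In development” because critical criteria are partial or missing) corresponds to the gating in overall_status.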
GLOBAL IMPLEMENTATION
Since its initial rollout in 2011, the ARPEL RETOS™ tool and Manual have been used within industry groups and governments worldwide. Highlights of its use are:
Latin America
ARPEL coordinated workshops on the use and implementation of RETOS™ in at least nine venues between 2011 and 2016: Brazil, Ecuador (2), Trinidad & Tobago, Peru (2), Colombia, Venezuela (2), Mexico, Argentina (2), and Paraguay, with more than 300 total participants working with the tool on actual or fictitious OSR programs.
During 29–31 August 2012, RETOS™ was presented at the “Fourth Regional OPRC Forum on Oil Spill Prevention, Preparedness and Response in the Gulf of Mexico and the Caribbean” convened in Mexico by IMO and RAC/REMPEITC-Caribe (2012). Over 70 participants from Brazil, Cuba, Bahamas, Guyana, Jamaica, Mexico, Suriname, United States and Venezuela took part in this event, representing professionals from oil companies, oil spill response stakeholders, and officials from the maritime authorities of these countries.
In July 2013, ARPEL presented the beta version of RETOS™ V2.0 at the 20th meeting of ROCRAM-CA (Operational Network for Regional Cooperation among Maritime Authorities of Central America and the Dominican Republic) convened in Guatemala by COCATRAM (Central American Maritime Transport Commission). On this occasion, training was aimed at 25 officials from the maritime authorities of Guatemala, Costa Rica, El Salvador, Nicaragua, Panama, Honduras, and the Dominican Republic. These officials were directly involved with their National Contingency Plans and were responsible for assessing industry contingency plans, including those of oil companies operating in their countries (COCATRAM, 2013).
The RETOS™ V 2.0 readiness analysis tool was introduced at the Sub-regional OPRC Ratification and Implementation Workshop in Paramaribo, Suriname in November 2014 (RAC/REMPEITC-Caribe, 2014). The workshop, coordinated by RAC/REMPEITC-Caribe, brought together government officials from Suriname, Belize, French Guiana, and Guyana. Workshop participants from each country used the RETOS™ tool to perform a preliminary assessment (Level A) of their respective National Oil Spill Programs.
RETOS™ was also used as the primary tool to complete a gap analysis of the oil spill preparedness and readiness of thirteen Caribbean states during an IMO Regional Workshop on Oil Spill Contingency Planning, coordinated by RAC/REMPEITC-Caribe, that took place in Nassau, Bahamas in December 2016 (see Donohue et al., 2017, this conference).
ARPEL member companies have agreed to use RETOS™ as a benchmarking tool for pipeline systems and the approximately 100 maritime terminals in Latin American countries. ARPEL is presently preparing the Manual for benchmarking of these two major operational targets. Using RETOS™ for gap assessment and improvement is meant to lead to improved spill prevention and mitigation. The benchmarking exercise is expected to start in 2017.
Central and West Africa
GI WACAF presented the tool to a number of countries between 2014 and 2016: Côte d’Ivoire, Congo, Democratic Republic of Congo, Togo, Gabon, Ghana, Gambia, Liberia, Nigeria and Namibia (Anton Rhodes, 2016, personal communication; IMO/IPIECA, 2014–2016). The purpose of those RETOS™ sessions was to explain the tool, its aims, and how it works, and to gauge whether governments would be interested in using it within their countries. The feedback received was extremely positive, and GI WACAF introduced the tool into the regional work program. GI WACAF used the tool to undertake a full assessment of Gambian national oil spill response capability in May 2015 (GI WACAF, 2015). RETOS™ was subsequently used to evaluate plans and preparedness in Angola, Mauritania, Mozambique, and Namibia. English, French, and Portuguese versions are expected to be used for most of the remaining countries in the GI WACAF region. These will be Level A assessments and will be national in scope. The primary goal of these assessments is to help GI WACAF, and more importantly the countries themselves, identify where gaps exist in their national contingency plans and explore opportunities for countries with more robust programs to support those with the greatest needs.
Southeast Asia
GI-SEA has discussed plans to roll-out RETOS™ V 2.0 in ASEAN (Association of Southeast Asian Nations) although definite dates and countries where this will start are not yet finalized (Joselito Guevara, 2015, personal communication).
Baltic Sea
Palsson (2016) utilized RETOS™ V2.0 in cooperation with nine Baltic nations and the USA to evaluate and develop a benchmark for oil spill response programs in the region. The national OSR programs for Norway, Sweden, Finland, Latvia, Lithuania, Poland, Germany, Denmark, and Russia were evaluated at Level A with results ranging from 56% to 98% completion.
KEY FINDINGS
By the end of 2016, RETOS™ had been presented and/or used in more than 45 countries representing over 60 government and industry organizations (Tables 1 and 2). In 2016 alone, the Manual and RETOS™ files were downloaded by over 120 users worldwide representing industry, national governments, and consultants. To the best of our knowledge, the tools have been used to assess national-level plans in 35 countries worldwide (Table 2). Most non-national programs evaluated (Table 3) consisted of industry fixed facilities (installations) or wider operations (i.e., pipelines). Results of most of the assessments conducted (all Level A) showed that OSR programs typically achieved 60–70% completion. Of the 26 countries with a known Level A assessment, the average percent completion was 64%. Six countries reported completion above 80%, and only two were deemed complete; that is, each scored above 90% and had no missing or partial critical criteria.
Priority areas for closing gaps are provided in the GIP report generated by the RETOS™ tool (Table 1). The same report lists where additional information on each subject can be found in the 2008 IOSC Guideline, and the more than 150 hyperlinked references in the Manual provide personnel tasked with addressing the gaps with further guidelines and references.
RETOS™, its Manual, and the 2008 IOSC Guideline have successfully been used as best practice guidelines at local to regional levels for assessing and setting baselines for preparedness. Regional workshops have used the outcomes of the analysis to identify national to regional priorities and mechanisms to address gaps. Regional priorities for oil spill preparedness become clear as work groups compare and collate individual results into a broader preparedness framework. The example from the 2016 Bahamas regional workshop highlighted key categories in need of development within the eleven participating Caribbean national programs (Donohue et al., 2017): training and exercises, logistical readiness, finance-related preparedness, sustainability of readiness, and spill contingency planning. Following assessment of the national programs, the most common missing or partially complete critical criteria from the RETOS™ evaluation were:
Risk Assessment for OSR Planning
Health & Safety Protocols or Standards (Responders & Public)
Exercises (w/ training) – Tier 2 deployments & tabletop
Contact lists (completed & up to date)
Oil spill emergency funds
Spill mitigation measures
Legal - oil spill sampling and responders’ liability during cooperative schemes.
Workshops and user feedback provide valuable suggestions for future upgrades and revisions to the Manual and tool. The ARPEL project team has taken advantage of each opportunity to learn what changes users recommend to improve RETOS™ functionality. One recommendation from more recent applications was to add numbering to the criteria in the Manual so that users can more easily find the corresponding criteria listed in RETOS™ V 2.0 and match similar criteria between levels. Another recommendation, currently being explored, is to transition RETOS™ from an Excel-based application to a web- or tablet-based application.
CONCLUSIONS
The ARPEL Manual and RETOS™ are intended to help assess OSR planning and readiness and to identify challenges, information needs, and areas for improvement. OSR assessment criteria are the foundation for a consistent approach to gauge the level of OSR planning and readiness and to assist in identifying areas for improvement. The criteria in RETOS™ are not mandatory and are not intended to reflect or add any legal or regulatory requirements. The Manual and RETOS™ are oriented more towards the management of OSR readiness and less towards detailed operational aspects, such as specific amounts or types of equipment.
An important feature of RETOS™ is that criteria are specific to the scope of the OSR program being evaluated, providing assessments tailored to the needs of the user. The RETOS™ tool and Manual have been well received and provide a common basis, available to the broad international spill response community, for engaging in gap assessment and continual improvement. Since the criteria utilized relate to best international practices, RETOS™ represents a powerful tool for international benchmarking purposes. During the 2017 OPRC workgroup at IMO’s 4th Sub-Committee on Pollution Prevention and Response (PPR4) meeting, a new project to develop an OPRC implementation guide was discussed. Specific to RETOS™, it was recommended that programs first be assessed to determine what needs to be implemented and that, if more countries undertook a systematic benchmark assessment, those with advanced areas of preparedness could be identified and could assist with the exchange of best practices to help those in need of assistance (Keith Donohue, 2016, personal communication).
Benchmarking results from RETOS™ analyses, whether for national or industry programs, can be used to find synergies between countries or companies, promote opportunities for addressing common gaps, and identify strengths within programs that can be used to assist or guide others. Regionally applied, benchmarking activities can lead to a balance among spill preparedness programs and allow comparisons from region to region.
Intergovernmental organizations such as IMO, RAC/REMPEITC-Caribe, and COCATRAM, and government/industry partnerships such as the Global Initiative in the different regions, play an important role in enhancing the level of oil spill preparedness and readiness of their member States. By using RETOS™ as a benchmark, these organizations can efficiently channel resources by developing regional activities in areas where RETOS™ assessment reports reveal consistent or large gaps. Furthermore, these organizations, as well as individual countries, can periodically (e.g., every year) check and demonstrate the advances made by addressing the gaps and recommendations captured in the GIP reports.
Similarly, international businesses and non-governmental organizations, such as ARPEL, IPIECA, and OSRL, that strive to enhance the oil spill management and performance of their member companies can use RETOS™ for gap assessments and benchmarking purposes in any of the four scopes applicable to industry. The results of a periodic (e.g., annual) benchmarking, wisely utilized, would encourage companies to seek continuous improvement and reach higher levels of oil spill preparedness and readiness.
In all cases (i.e., intergovernmental organizations, government/industry initiatives, and industry associations), RETOS™ seems to be the ideal tool to identify gaps and set the bar for basic and increasingly higher levels of oil spill preparedness for all players. Furthermore, these institutions fulfill an important mission through cooperative activities, one of the most effective approaches to achieving higher levels of operational and management excellence in spill response.
RETOS™ V 2.0 is very easy to use. The Manual and RETOS™ V 2.0 are currently available in English, Spanish, Portuguese, and French and can be downloaded free of charge from the ARPEL web site (www.arpel.org). Translation into other languages may be undertaken with due consideration to copyrights.
REFERENCES
1 Although the minimum percentage to pass from Level A to Level B is 90%, the fact that critical criteria are partial or missing for the Response Coordination and Operational Response categories (shown in yellow) results in the overall assessment being shown as “In development”.
2 The table corresponds to the GPA report in Figure 2.