ABSTRACT

The increased public awareness of oil spills and their impacts following the 2010 Macondo oil spill incident, along with the changing global landscape of regulatory requirements and stakeholder expectations, presents numerous challenges to operators. Many of these challenges, as well as the incorporation of lessons learned from Macondo, can be addressed through robust preparedness.

Oil spill preparedness programs generally include a combination of response plans, incident management and response teams, response equipment and personnel (resources), training and exercises. However, what constitutes a robust preparedness program is often open to interpretation. While acknowledging the availability of a few preparedness assessment/audit tools, such as RETOS, it was felt that a fit-for-purpose assurance program that facilitated open communication, free exchange of ideas and sharing of best practices would result in a greater overall improvement to the company’s global level of preparedness. Consequently, ConocoPhillips retained Oil Spill Response Ltd. (OSRL) to assist in developing a process and methodology for evaluating ConocoPhillips’ existing oil spill preparedness programs. The evaluations were conducted at nine business units (BUs) in various countries whose operations ranged from deepwater exploration and production to onshore production to tanker operations. The main objective of this project was to achieve a high and consistent level of global preparedness through the identification of potential gaps in each BU’s preparedness program and implementation of associated improvement plans.

The preparedness assurance process consisted of several components including:

  • Detailed review of the oil spill/emergency response plans to evaluate contents and identify improvement opportunities and best practices

  • Evaluation of each BU’s Incident Management and Emergency Response Team size, structure, competency and lines of communication and coordination

  • Evaluation and validation of training and exercise programs

  • Evaluation of Tiers 1, 2 and 3 response resource availability including dispersant stockpiles

The results of the evaluations were shared with the BU emergency response leads and management and an improvement plan developed to address any identified gaps. Upon completion of the program, a report was prepared summarizing the preparedness evaluation results for each BU, highlighting best practices identified during the evaluations and providing a general assessment of what “good” looks like. This report was then shared with all BUs to, along with the individual improvement plans, enhance consistency and the level of spill preparedness across the company.

The primary objective of this paper is to explain the assurance process that was developed, share key lessons learned during its implementation and provide a summary of the general findings so that it can serve as a blueprint for other companies developing similar oil spill preparedness assurance programs.

INTRODUCTION

The 2010 Macondo incident in the Gulf of Mexico resulted in a heightened awareness by the oil and gas industry of the need to ensure an adequate level of preparedness to respond to oil spills of all sizes. There was also an increase in awareness and expectations among stakeholders and the wider oil spill response community for industry to maintain a high level of preparedness. To that end, ConocoPhillips worked with Oil Spill Response Ltd. (OSRL) to develop and implement an oil spill preparedness assurance program to assess the level of preparedness for its higher risk business units (BUs). The primary objective was to identify potential gaps and subsequently drive improvement, but also to enhance consistency in the level of oil spill preparedness across the enterprise. It is important to note that this was not a compliance audit or rigorous structured assessment, such as the RETOS approach, but rather an assurance program designed to facilitate dialogue and exchange ideas and best practices while assessing the BUs’ level of preparedness, with the ultimate goal of driving improvement and consistency.

This paper provides a description of the assurance program and its implementation as well as lessons learned and a summary of the general findings. The description is divided into the program’s individual components that include:

  • Plan Reviews

  • Site Visit

  • Results Briefing

  • Response Resource Capability Assessment

  • Findings Report

  • Improvement Plan

  • Assurance Program Summary Report

The assurance program was conducted over a three-year period and focused on those BUs that posed a risk of high consequence oil spills. Included were BUs with offshore exploration and/or production activities, marine tanker operations or those that are proximal to sensitive resources at risk, such as the North Slope of Alaska.

As with most newly developed programs, it was a learning process, with minor modifications made along the way to address lessons learned and improve the program. The key lessons learned are discussed at the end of this paper to help companies that may develop their own programs avoid similar issues. The general findings of the assurance program are also discussed to provide an idea of what to expect should a company decide to implement a similar oil spill preparedness assurance program.

ASSURANCE PROCESS

Plan and Document Reviews

The first step of the process was to review readily available oil spill preparedness documents such as oil spill response/contingency plans (OSRPs), incident or crisis management plans, response equipment inventories, etc. The OSRPs were the primary focus of the reviews. For some BUs, a single plan covered all of their operations, whereas other BUs had two or more OSRPs along with ancillary incident, emergency and/or crisis management plans. In the latter cases, only the primary OSRP(s) were reviewed in detail, with a cursory review conducted on the other plans. A list of questions and clarification needs was then developed from the document reviews for subsequent resolution by BU personnel. All plan reviews were completed prior to conducting the site visits to minimize the time onsite and disruption to the BU’s operations.

The OSRP reviews consisted of two components with the first consisting of a detailed, page by page plan review to identify gaps, improvement opportunities, errors, inconsistencies and best practices. The second component utilized a checklist to assess the presence and adequacy of various types of information ideally contained in an OSRP. The checklist was previously developed by OSRL but then modified in collaboration with ConocoPhillips so that it was fit for purpose. It is very comprehensive and divided into several major areas or topics, each with multiple subtopics or information types. The key topics or content items evaluated include:

  • Risk assessment

  • Initial action checklists

  • Notification details

  • Response resources, mobilization and response times for each tier

  • Incident management team (IMT)

  • Oil spill trajectory modeling

  • Sensitive environmental/socio-economic resources and locations (maps)

  • Training and exercise program

  • Waste management procedures/plan

Each subtopic/information type was ranked during the review based on the extent to which it was addressed in the plan. Due to its extensive nature, it was not possible to include the checklist in its entirety, but an example of a completed Incident Management Structure portion of the checklist is provided below in Table 1. The results of the detailed, page by page review of the OSRPs were documented as comments and suggested edits in a redline version of the plan that was provided to the BUs.

Table 1

OSRP Review Incident Management Structure Checklist

The completed checklist, redline version of the OSRP and the review results for any other associated documents were provided to the BU prior to the site visit. This enabled BU personnel to be prepared to provide clarification or additional documentation, where warranted.

Site Visit and Briefing

Following the submittal of the document review results to the BU, a site visit was conducted to validate information in the reviewed documents, interview selected IMT members, examine training and exercise records, and evaluate any Tier 1 response resources that may be stored at the location. The site visits were designed to be completed in two days to limit the disruption to BU operations. A list of the records and other documents to be reviewed onsite was provided, and the IMT member interviews were scheduled, in advance to maximize the efficiency of the visits.

The IMT members interviewed consisted of key positions for a spill response including:

  • Incident Commander

  • Operations Section Chief

  • Planning Section Chief

  • Logistics Section Chief

  • Environmental Unit Leader

The interviews were primarily intended to assess each member’s understanding of:

  • Duties and responsibilities of their assigned position

  • How they are activated during an incident

  • Initial actions they would take upon arrival at the Incident Command Post

  • What oil spill/emergency response training they received in the past two years

  • Organization and contents of their OSRP(s)

The interview questions consisted of general response related topics that were posed to all IMT members as well as more in-depth, position-specific topics to better assess their competency in that role. The questionnaire for the Operations Section Chief is provided below in Table 2.

Table 2

Operations Section Chief Interview Questionnaire

The documentation or records for the BU’s oil spill response related training and exercise program were reviewed while on site to validate consistency with what is described in the OSRP as well as responses to related interview questions. Any gaps or inconsistencies in the documentation or record keeping were noted.

The Tier 1 response equipment observations were conducted to confirm consistency with what was described in the plan as well as to visually assess the equipment condition as an indicator of the maintenance and testing program adequacy. Most BUs do not, however, maintain their Tier 1 response equipment at or near the main office where the majority of site visits were conducted and traveling to all equipment storage locations was not feasible due to time constraints. Consequently, most of these Tier 1 equipment evaluations were based on descriptions of the maintenance and testing program provided by the BU.

Following the completion of the site visit activities, the BU Emergency Response (ER) Lead and Health, Safety and Environment (HSE) Manager were briefed on the initial site visit findings including the key improvement opportunities/gaps and best practices identified, interview results and equipment evaluations. This ensured there were no significant surprises when the draft assurance reports were submitted to the BUs for review.

Response Resource Capability Evaluation

For each BU and/or selected assets or operations within the BU, an evaluation was made of the capabilities of the available Tier 1, 2 and 3 response resources relative to their overall spill risks. Mobilization times of the resources to each asset or operation were also considered. For clarity, the tier definitions considered local regulatory requirements but were generally defined as:

  • Tier 1 – Locally available resources (owned or contracted)

  • Tier 2 – Regional or national resources

  • Tier 3 – National or international resources

In general, the types of response resources were assessed based on their applicability to the oil types produced or handled by the BU and the prevalent environmental conditions at the asset/operation location(s). The quantity of each type of resource was also assessed relative to the associated Tier 1, 2 and 3 spill scenarios. In particular, the quantity of dispersants required to sustain a response to a worst credible case discharge as defined in the OSRPs (subsea injection and surface application, if applicable) was calculated and compared to the available Tier 3 dispersant stockpile quantities.
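The dispersant sufficiency comparison described above reduces to a simple planning calculation. The sketch below illustrates the arithmetic only; the discharge rate, response duration, dispersant-to-oil ratio (DOR) and stockpile volume are hypothetical planning assumptions, not ConocoPhillips figures.

```python
def dispersant_required(discharge_bbl_per_day: float,
                        response_days: int,
                        dor: float = 1 / 20) -> float:
    """Estimate the total dispersant volume (bbl) needed to treat a
    continuous discharge over the response duration. A 1:20 DOR is a
    common planning assumption for subsea injection; actual values
    depend on the oil and the application method."""
    return discharge_bbl_per_day * response_days * dor

# Illustrative worst credible case: 30,000 bbl/day sustained for a
# 30-day subsea injection response.
needed = dispersant_required(30_000, 30)

# Compare against an assumed Tier 3 stockpile volume (bbl).
stockpile_bbl = 40_000
shortfall = max(0.0, needed - stockpile_bbl)
print(f"Dispersant required: {needed:,.0f} bbl")
print(f"Stockpile shortfall: {shortfall:,.0f} bbl")
```

A shortfall flagged by this kind of calculation would indicate that contracted Tier 3 stockpiles alone may not sustain the response, prompting the resource availability discussion described above.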

Findings Report

Upon completion of the various BU assessment and evaluation activities, a report was prepared summarizing the oil spill preparedness assurance objectives, process and findings. An Executive Summary was included at the beginning of the report that summarized the findings and also identified the top 10–15 gaps or improvement opportunities to help the BU prioritize their efforts to address those issues. The completed checklists, similar to the example in Table 1, were included for each major topic with the color-coded rankings also intended to help prioritize their preparedness improvement efforts. In addition, the best practices identified during the OSRP reviews, interview results and other aspects of the evaluations were highlighted in the report. The report also included a summary of the detailed OSRP review findings as well as a copy of the redline version of the plan containing the actual comments and suggested edits.

The findings report was only provided to the BU ER Lead and HSE Manager to ensure it would not be perceived as an audit and to promote open and frank discussions with BU personnel during the process. It was initially submitted as a draft to provide an opportunity for the BU to dispute any findings, provide additional information, correct errors or misinterpretations, etc. The report was then finalized and resubmitted to the BU.

Improvement Plan

After reviewing the findings report, the BU prepared an improvement plan identifying which gaps/improvement opportunities they planned to address, their prioritization and a completion timeline. The ConocoPhillips corporate HSE group provided guidance or assistance, as requested, in developing the improvement plans. No formal criteria were established regarding which gaps needed to be addressed, their prioritization or timeline for completion as it was left to the discretion of the BU management. There was, however, a general expectation that at least the higher priority items would be addressed and the improvements identified in the plan implemented within two to three years.

Summary Report

Once all BU reports and improvement plans were completed, a final report was prepared to summarize the findings of all of the oil spill preparedness assurance assessments, identify common gaps/improvement opportunities and share examples of best practices. An overview of what “good” looks like for key components of an oil spill preparedness program was also prepared, based on professional judgement, the identified best practices and program aspects that received higher rankings for the majority of the BUs. This enables BUs to compare the adequacy of various components of their program against those of other BUs and provides a benchmark they can use to determine whether additional improvements are necessary to meet or exceed it. The examples of best practices can also be adopted by the BUs to facilitate improvements in their OSRPs and other components of their preparedness programs.

GENERAL FINDINGS

There was considerable variability between the BUs for the various components of their oil spill preparedness programs, but in the opinions of both the OSRL and ConocoPhillips personnel involved, the overall level of preparedness for all BUs was commensurate with their oil spill risks. The strengths and weaknesses of each preparedness program differed considerably between BUs, making it difficult to draw conclusions as to the reasons behind them. Several best practices were identified for each BU but, similarly, they were often associated with different components. This did, however, prove to be beneficial as it essentially created a library of best practice examples that spanned most of the OSRP topics and program components and, as such, can be used by BUs to help address whatever improvement opportunities they may have. The general findings for the key areas evaluated are provided in the following sections.

OSRPs

The OSRPs were found to be adequate for the associated spill risks, particularly when taking into account supporting documents referenced in the plans. There was, however, a large disparity in the content and organization of the plans, with some being better than others. As with the overall findings of the assurance assessment, there was little consistency in the strengths and weaknesses of the various plans. Much of the content of most plans was relatively generic, reference-type information rather than actionable, site/BU-specific information, which is not uncommon in OSRPs regardless of location or regulatory regime.

IMTs and ERTs

The size and organizational structure of the IMTs were deemed more than adequate for the spill risk in all cases, although the ERTs were not always well defined. Based on the interviews with selected IMT members, the competency of all teams was excellent. The interface between the ERTs in the field and their respective IMTs in the Command Post was, in many cases, unclear, as was the interface between the IMT and the ConocoPhillips corporate response teams; however, this was not viewed as significant. One unexpected finding was the limited familiarity of the IMT members with their respective OSRPs, although many members had developed their own “Quick Guides” containing the information they would likely need to execute the duties and responsibilities of their assigned position.

Training and Exercise Programs

Both the training and exercise programs for each BU were sufficient for the risk and exceeded expectations in several cases. Descriptions of the programs in some OSRPs were somewhat vague or not included whereas others were very detailed and considered a best practice. Documentation or record keeping for the training and exercise events for all BUs was very good and validated the adequacy of their programs.

Resource Availability

For most BUs, the Tier 1 resources were more than adequate, but in a few cases some of the types were not ideally matched to the operating environment or were not stored near the operations or assets. In those cases, though, the BUs were producing, or exploring for, natural gas, so this was not considered significant from a risk perspective. Tiers 2 and 3 resource availability was consistently good across all BUs.

LESSONS LEARNED

There were several lessons learned during the implementation of this oil spill preparedness assurance program. Many were minor or insignificant although some were more substantive and warrant sharing externally, particularly if a company is contemplating conducting a similar program of their own. These are described in the following paragraphs.

Assurance Program and Site Visits

The HSE Management and ER Lead of each BU should be fully briefed prior to initiating their assessment and conducting the site visit. More specifically they should be advised of the overall process, the objectives, how the program will benefit the BU and the minimal time commitment for them and their staff. They will be much more receptive to fully participating if they understand the benefits and know there will be minimal disruption to their operations.

Best Practices

It is important to highlight the best practices that have been adopted by the BU and/or incorporated into their OSRP versus the negative connotations of focusing solely on the gaps and improvement opportunities. The BUs should be recognized for what they are doing well in addition to identifying where they could improve.

Response Resources vs. Risk

When assessing the type and quantity of available response resources, the BU’s spill risk should be considered. BUs with only natural gas operations do not require the same suite of resources that oil operations do, and the potential volumes of both operational and worst credible case discharges should also be taken into account. Additionally, contracts with Tier 1, 2 and 3 response organizations do not always equate to adequate resource availability, particularly if the worst credible case discharge is substantial.

Rank Findings

The gaps or improvement opportunities identified during the assessments must be ranked or prioritized to maximize the benefit to the BU. The initial assessments provided only a list of items or issues to be addressed, and it was not always apparent which were most significant. Therefore, in subsequent findings reports, the gaps/improvement opportunities were ranked by relative importance (high, medium, low), along with identifying the top 10 or so that should be addressed first.
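The ranking approach described above amounts to sorting findings by a priority label and surfacing the top items first. The sketch below illustrates this; the findings themselves are hypothetical examples, not actual assessment results.

```python
# Hypothetical findings paired with high/medium/low priority labels.
findings = [
    ("Update notification contact list", "high"),
    ("Add waste management plan reference", "medium"),
    ("Clarify IMT activation procedure", "high"),
    ("Fix map legend typo", "low"),
]

# Sort by priority; Python's sort is stable, so findings with the
# same priority keep their original order.
priority_order = {"high": 0, "medium": 1, "low": 2}
ranked = sorted(findings, key=lambda f: priority_order[f[1]])

# Present the top items first so the BU can focus its effort.
for i, (item, prio) in enumerate(ranked[:10], start=1):
    print(f"{i}. [{prio.upper()}] {item}")
```

In practice each finding would also carry the checklist topic it belongs to, so the top-10 list in the Executive Summary can point back to the relevant report section.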

IMT Interview Questions

The questions used when interviewing IMT members should be designed to assess their knowledge of incident management and their assigned position as well as their competency. Initially, the same list of general questions was used for all interviewees to ensure consistency but did not achieve the desired results. Consequently, the questions were revised to be more probing and difficult and tailored to the role of the individual. There should still be some common, general questions but most should be position specific to adequately assess knowledge and competency.

No Punitive Measures

The facilitators of the assurance program should ensure that the findings do not result in punitive measures taken against the BU. Limiting distribution of the findings reports to just the BU HSE Management and ER Leads and not to upper management proved effective in fostering open discussion and collaboration which, in turn, maximized the benefits to the BUs.

Third Party Involvement

The involvement of third party experts, such as OSRL, in the design and implementation of this assurance program enhanced its credibility as well as the participation and engagement of the BUs. Justified or not, BUs do not always trust representatives from a corporate group so the involvement of a reputable third party can be critical to the success of a program such as this.

Summary Report

The report summarizing and comparing the findings for each BU as well as providing examples of best practices was the most impactful aspect of this program. In particular, the matrix (see example in Figure 1) enabled a direct comparison between BUs of the adequacy of each OSRP topic and preparedness program component and for each BU to identify their strengths and weaknesses relative to their counterparts. This also inadvertently provided additional motivation for some BUs to make supplemental improvements.

Figure 1

BU Findings Comparison Matrix
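A comparison matrix like Figure 1 can be assembled directly from the per-BU checklist rankings. The sketch below shows one way to tabulate and screen such rankings; the BU names, topics and scores are purely illustrative, not actual assessment results.

```python
# Illustrative checklist rankings (3 = good, 2 = adequate, 1 = gap).
# BU names, topics and scores are hypothetical examples.
topics = ["Risk assessment", "Notification", "IMT", "Training", "Waste mgmt"]
rankings = {
    "BU-A": [3, 2, 3, 3, 1],
    "BU-B": [2, 3, 3, 2, 2],
    "BU-C": [3, 3, 2, 3, 3],
}

# Print a simple BU findings comparison matrix: topics as rows,
# BUs as columns.
width = max(len(t) for t in topics)
header = " " * (width + 2) + "  ".join(f"{bu:>5}" for bu in rankings)
print(header)
for i, topic in enumerate(topics):
    row = "  ".join(f"{scores[i]:>5}" for scores in rankings.values())
    print(f"{topic:<{width}}  {row}")

# Flag topics where any BU ranked below 'adequate' for follow-up.
gaps = [t for i, t in enumerate(topics)
        if any(scores[i] < 2 for scores in rankings.values())]
print("Topics needing attention:", gaps)
```

Laid out this way, each BU can see at a glance where it sits relative to its counterparts for every topic, which is what made the matrix in the summary report an effective motivator for supplemental improvements.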

SUMMARY AND CONCLUSIONS

The objectives of the assurance program described above were to identify gaps and improvement opportunities in various components of a BU’s oil spill preparedness program as well as to drive improvement and consistency across all BUs within ConocoPhillips. Although the improvement and consistency efforts are still ongoing, they are progressing and the program was deemed to be very successful. The identification, highlighting and sharing of best practices was not part of the original design but was well received and will contribute significantly to the overall success of the program.

There were a number of learnings identified during the program’s implementation. Some were affirmations of features that had been incorporated into the program design, but many were unexpected and were immediately addressed by modifying the associated component. This also contributed to the success of the program. Another key factor was the involvement of third party experts, which enhanced the acceptance of the process and results by the BUs.

This assurance approach, versus a traditional audit or assessment where only gaps or non-compliances are identified, proved beneficial in many ways. It gave the BUs an opportunity to showcase the extra efforts they made, and best practices they developed, in their preparedness programs. The summary report, including the comparison matrix, enabled the BUs to compare their respective plans and other program components to a “what good looks like” benchmark as well as to each other which also provided additional motivation to make improvements. Additionally, the creation of a library of best practice examples will facilitate adoption by the BUs which will ultimately enhance the company’s collective preparedness profile.

ConocoPhillips and OSRL both believe the program outlined above is a comprehensive and effective approach for assessing the adequacy, and facilitating the improvement and consistency, of oil spill preparedness programs for multiple BUs or operations. We also believe that it can provide a blueprint for the development of oil spill preparedness assurance programs for other companies within the oil and gas industry.