Rigor and Discipline

The date was January 28, 1986. The event was the tenth and final flight of the Space Shuttle Challenger. Seventy-three seconds into flight, a failure in one of the booster rockets lifting Challenger into space destroyed the vehicle, killing all seven astronauts aboard.

When events like the Challenger explosion happen, you never forget where you were at the time. You remember the iconic photos and the national days of mourning for those lost. After the Challenger explosion, President Reagan appointed the Rogers Commission to investigate the disaster, and some of you may remember the news commentary on the Rogers Commission Report. If you didn’t study the reports from the incident, you likely aren’t aware of the stunning findings, the changes that were called for and, even more importantly, the effect the changes at NASA have had on industry – including the utility industry. It’s worth taking a look. You can read about lessons learned from the incident at https://ocw.mit.edu/courses/aeronautics-and-astronautics/16-891j-space-policy-seminar-spring-2003/readings/challengerlessons.pdf.

What the Rogers Commission found missing at NASA were rigor and discipline in the agency’s communications, decision-making and safety culture. The documentation describes an almost incomprehensible lack of safety culture and hazard awareness, and even a widely known, consistent failure to follow the agency’s own safety protocols. The commission found NASA’s organizational culture and decision-making to be key contributing factors in the incident. In particular, the commission determined that NASA engineers had been aware of a potential design flaw – cold weather could degrade the boosters’ rubber O-ring seals and cause them to fail – but they did nothing about it. The flaw had been a concern of rocket-booster manufacturer Morton Thiokol. NASA managers had even been reminded of the issue by NASA engineers, and calls from Morton Thiokol about the technical concerns on the morning of the launch went unreported to supervisors and managers.

To understand how negligent these actions were, you need to know that the rubber O-rings were classified as Criticality 1 components in the launch vehicle. Criticality 1 meant that the component had no backup system and that its failure could result in loss of the launch vehicle during liftoff. Not only that, the O-ring designer, Morton Thiokol, had limited the seals’ low-temperature performance to 40 degrees Fahrenheit. The temperature at liftoff the day of the incident was 36 degrees Fahrenheit, colder than any previous shuttle launch.

The Concern for Utilities
So, what is the concern for utilities based on the Challenger incident? It is this: If an organization as sophisticated as NASA – one full of highly disciplined professionals responsible for critical issues that directly relate to the survival of its workforce – can collapse so completely, what are the chances of a disaster happening at your utility, where the level of rigor is far lower?

In the third paragraph above, I used “rigor and discipline” to sum up the missing elements across NASA’s management spectrum as described in the Rogers Commission Report. Over the last year or two, I have begun to use “Rigor and Discipline” as a key theme, maybe even a motto, in all of my training seminars. The theme arose from the hundreds of incident investigations I’ve been a part of in the past 20 years. Whether the incidents resulted in simple equipment damage or were complex events that resulted in fatalities, I found those elements – rigor and discipline – missing again and again. As at NASA, the loss of rigor evolved over a period of time, often unnoticed. However, it didn’t start that way. In most cases at NASA, the initial deviations in protocol or process were noticed; they just weren’t resolved. In the worst case, deviation from protocols was a purposeful decision to speed something up, to save time or money, or to “get-r-done.” Sound familiar?

Institutional Deviation
Often, a deviation serves its purpose and nothing bad happens. It then becomes institutional deviation, meaning that the deviation from a safe work practice eventually becomes accepted across the organization (i.e., “institutionalized”). I call it institutionalized because of what I’ve seen in practice. On several occasions over the years, I have found an organization engaging in a practice that was plainly illegal or so far out of the ordinary that it was hazardous. In some of those cases, the only reason nothing bad had happened was that the various conditions hadn’t yet come together to cause an incident. In others, those conditions did come together, and the practice was recognized during the resulting incident investigation.

There’s also another phrase you’ve probably heard before: normalization of deviance. That phrase is attributed to the book “The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA” by Diane Vaughan. In her analysis of the Challenger disaster, she wrote, “Organizations establish safe best practices. One day it becomes expedient to deviate from one or more of these processes. Nothing untoward occurs. Over time, this becomes the new ‘normal.’ Other small steps away from this new normal occur. Then, a disaster happens.”

Vaughan said this deviance is normalized by repetition, but my experience is that deviating is itself what’s normal. It is human nature for people to shrug off rules they think are unimportant or inconsequential. It is also human nature to adjust the rules if the rules make it difficult to accomplish a goal. The deviation doesn’t have to be normalized by repetition; in human terms, it may simply be more normal to deviate once you find a way to justify it. The deviation then becomes accepted to the point that nobody wants to deal with trying to stop the practice. Speed limits, safety glasses, truck inspections, seat belts, taxes and face shields come to mind: It’s OK to go up to 10 mph over the speed limit; not everybody has side shields, particularly those who wear prescription glasses; DOT truck inspections are not properly done; many companies don’t have a seat-belt compliance plan; lots of people cheat on their taxes just a little; and face shields are not used appropriately at most utilities. Sure, some readers do some of these things well, but most don’t.

Discipline
This is where the second part of my motto – discipline – comes in. An employer’s procedures are only as good as the training done at rollout and the follow-up done in execution. It’s probably self-explanatory, but how good is a procedure you roll out if you don’t train the workforce on its purpose, your goals and how to implement it? And after the rollout, how much value do even the very best procedures have if they are not being followed in the workplace? Hence, discipline.

This is not the kind of discipline that takes place after violation of a rule; in fact, it’s far from it. With this discipline in place, there will never be a need for the discipline that follows someone breaking the rules. This kind of discipline is an element of safety culture. Discipline is the culture factor that sees procedures as immutable conditions. The disciplined worker sees that procedures or conditions are to be a certain way and that no other way is acceptable. Discipline applied to safety rules, procedures, training and best practices (rigor) ensures that the very conditions designed into the safe workplace are never violated.

Tailboard Trial
Let’s start our rigor and discipline trial with the tailboard. The first thing we have to do is examine the tailboard form itself. What is the goal of the form? Is it a record for the employer, a task for the crew or a tool that guides a crew through an effective job hazard analysis (JHA)? If it’s not a guide to an effective JHA, replace it. Too many forms have been created to meet the criteria found in the OSHA standard rather than to serve as a functional guide that gives the crew a model for effective use. When you consider your form, use the lessons from NASA: A series of checkboxes meant to cover every possible thing becomes a task. I recently saw a distribution crew JHA form with 112 checkboxes and four lines where the crew was to list the tasks for the day’s work. Remember this: A form that is filled out by reading straight across and down, box after box, becomes tedious to complete. Checklists should alternate between methods of recording, such as writing in information, circling items and checking boxes. There should be room to write in tasks, hazards and remediations. Create a form that flows, has continuity and makes sense. Don’t include redundant items or items that have no purpose with regard to hazard prevention or incident response.

Now that we have a draft form, send it out for use by a few crews and request their feedback. Consider the feedback, make changes if necessary and send the form out again for a trial. Once you have the right form, the most important part takes place: developing the training that will occur before you roll out the new form. Your training will show participants why JHAs are important – because hazard analysis is the number one activity that prevents incidents. A well-performed hazard analysis leverages all of the experience and training of every member of the crew. The training on the JHA also gives the crew skills for an effective approach to hazard analysis. The best training appears to be in two parts: managers first and then supervisors and foremen. Yes, I said “managers.”

Train your managers on the new form, its goals and how it should be filled out. Then, train your managers on their responsibilities associated with the form. The first responsibility: Own the tailboard process and support it by participating in and encouraging effective tailboards and JHAs. The second responsibility: Read the tailboards daily, give feedback to the crew supervisors and hold them accountable for passing that feedback to the crews. The third responsibility: When managers go out to see a crew, they should first review the crew’s tailboard form; if the crew is doing something that’s not on the tailboard, find out why. If the tailboard form is important to managers, it will be important to the supervisors, foremen and crews.

Next, train the supervisors, foremen and crews on the new form, its goals and how it should be filled out. Then, train your supervisors, foremen and crews on their responsibilities associated with the form. The first responsibility: Own the tailboard process and support it by participating in and encouraging effective tailboards and JHAs. Yes, that sounds just like the manager training – and it is. This methodology creates a continuum of interlocking procedures that ensures an effective program. This is what was missing at NASA. They had dozens of checklists and forms that went from clipboard to file. Ten years before the Challenger incident, NASA had in its files a form detailing a discussion among Morton Thiokol engineers about problems with the pliability of the boosters’ rubber O-ring seals when exposed to freezing temperatures.

Conclusion
Every critical safety procedure at your workplace, including those found in safety manuals and specific procedures for your crews, should be introduced, explained, trained on and documented much like the JHA tailboard example above. And every worker in your organization should know the expectations and goals of your safety program, including the motto “Rigor and Discipline” and why it matters.

About the Author: After 25 years as a transmission-distribution lineman and foreman, Jim Vaughn, CUSP, has devoted the last 20 years to safety and training. A noted author, trainer and lecturer, he is a senior consultant for the Institute for Safety in Powerline Construction. He can be reached at jim@ispconline.com.
