24 October 2013

Tackling the safety culture challenge

What type of failures in safety culture can be identified from oil and gas catastrophes of the last 40 years? Why are positive safety cultures not always created in organisations? How can you begin building a stronger safety culture? GL Noble Denton’s Applied Psychology & Human Factors Group considers these crucial questions.

Over the last 40 years there have been a number of high-profile disasters in the oil and gas industry. Names such as Flixborough, San Juanico, Piper Alpha, Texas City and, more recently, Deepwater Horizon have left an indelible mark on our conscience, especially when we remember the loss of human life, the damage to the environment and the harm to the reputations of companies and individuals. Tragically, these five incidents alone account for over 700 fatalities.

Failings related to safety culture

The factors contributing to each disaster are varied and far-reaching, with each issue coming into play at one critical point in time. However, weaknesses (both human and technical) festering in the organisation, reflecting the underlying safety culture, are often also contributory factors. A high-level review of the accident reports connected to the disasters highlighted above indicates weaknesses in safety culture related to:

  • Poor management commitment to safety
  • Prioritising cost-cutting and production above safety
  • Complacency about risks
  • Staffing issues and excessive workload
  • Inappropriate rewards and incentives for reporting incidents
  • Inadequate training for emergencies
  • Leaders inconsistently modelling safety behaviours
  • Absence of learning from past incidents
  • Fear of speaking up by staff
  • Poor competency of managers in risk/hazard management
  • Safety critical tasks not performed
  • Organisational change poorly managed
  • Inadequate communication and handover


Understanding Safety Culture

Why has the oil and gas industry been punctuated by major accidents since the 1970s? Why do organisations continue to record high levels of injuries? Why does the HSE (the UK Health and Safety Executive) continue to issue large numbers of prohibition and enforcement notices? Ultimately, why have so many companies failed to build strong safety cultures?

One reason is that managers often fail to properly understand safety culture, let alone how to improve it. Some consider it too abstract a concept and concentrate instead on tangible, operational, day-to-day health and safety management. Others may have advanced their safety approach by comprehensively addressing operational functions as well as more strategic-level issues, but still have an ill-defined understanding of what knits these aspects together: a good safety culture. This shortcoming ultimately reduces their overall effectiveness in the effort to improve safety.

In illuminating the concept of safety culture, the first steps are to define and then measure it. Definitions differ, but most would agree that safety culture comprises the collective values, attitudes and behaviours of the workforce in relation to safety. In terms of measurement, it is necessary to go beyond piecemeal, anecdotal views, or relying solely on the views of senior management. By this definition, measurement requires canvassing the views of those who live within and shape your organisation's culture – your people.

Unsurprisingly, a great deal of research has been conducted into the concept of safety culture, but research in itself is not sufficient to tackle its measurement and development effectively. Our team of psychologists has combined its academic background with hands-on experience of running safety culture surveys to devise a practical model for measuring safety culture. Our Safety Culture PROFILER Model is based on 10 key factors that represent safety culture (see Figure 1), all of which can be directly measured by collecting workforce views. This is done using our specially developed questionnaire, supplemented by facilitated focus groups.

Figure 1: GL Noble Denton’s Safety Culture PROFILER Model

Data from the questionnaires and focus groups becomes the evidence base, giving a clear picture of how well an organisation performs on each of the 10 factors. People's attitudes towards safety are quantified as a percentage score for each factor. The weaker areas can then inform decisions on where attention needs to be focused to drive improvements in safety and, ultimately, safety culture. Repeat surveying of the workforce can then show where progress has been made. In effect, we help to arm companies with the intelligence needed to make safety investment decisions and enhance their safety culture.
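As a rough illustration of how questionnaire data can be turned into the kind of per-factor percentage scores described above, the sketch below rolls individual ratings up into a score for each factor. The factor names, the 1-to-5 rating scale and the simple averaging are illustrative assumptions only; they are not GL Noble Denton's published scoring method.

# Illustrative sketch only: the article does not publish the PROFILER
# factors or its scoring method, so the factor names and the simple
# Likert-to-percentage conversion below are assumptions.

from collections import defaultdict
from statistics import mean

def factor_scores(responses, scale_max=5):
    """Turn questionnaire answers into a percentage score per factor.

    `responses` is a list of dicts such as
    {"factor": "Management commitment", "rating": 4},
    where `rating` is on a 1..scale_max agreement scale.
    """
    by_factor = defaultdict(list)
    for r in responses:
        by_factor[r["factor"]].append(r["rating"])

    # Average rating per factor, rescaled to a 0-100% score.
    return {
        factor: round(100 * (mean(ratings) - 1) / (scale_max - 1), 1)
        for factor, ratings in by_factor.items()
    }

if __name__ == "__main__":
    sample = [
        {"factor": "Management commitment", "rating": 4},
        {"factor": "Management commitment", "rating": 2},
        {"factor": "Learning from incidents", "rating": 5},
        {"factor": "Learning from incidents", "rating": 3},
    ]
    print(factor_scores(sample))
    # {'Management commitment': 50.0, 'Learning from incidents': 75.0}

Repeating the same calculation on a later survey gives a like-for-like comparison per factor, which is one way progress between surveys could be tracked.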

Concluding thoughts

It is clear from our work with other high-risk industries that it is not wise to place complete faith in a safety management system without a good understanding of how an organisation's culture affects the attitudes and behaviours of the employees tasked with managing safety. Of course, the caveat is that even with a strong safety culture a serious accident can still happen, but it is important to note that the chances of it happening are significantly reduced when the culture reflects a true commitment to safe working. Few would argue that there is a greater goal than protecting the lives of those we work with.

Source: http://www.gl-nobledenton.com/en/consulting/1492.php


04 October 2013

Enforcing a Security Policy

It's easy enough to write a security policy, but the devil's in the details when you start talking about enforcement.

By Anonymous | November 01, 2003 | CSO

Don't know about where you work, but in most places policy is a four-letter word. Management, especially, tends to bristle at the notion. "That's not the way we do things around here," they'll say. Or, "We don't need a policy. We've got bright people who will automatically want to do the right thing." Or how about, "I hired you to influence and to lead. If you have to rely on a piece of paper to get things done, maybe I've hired the wrong guy."

Nevertheless, I'm someone who's bullish on security policy for, I think, all the right reasons. Because, for one, it frames our work as CSOs. And because it also provides a hook to the resources we CSOs require. I've worked long and hard over the years to develop a solid security policy at my organization, and I've had some luck getting senior management buy-in. I even gave a presentation on security policy at a security conference a year or so ago. As I prepared my pitch, I couldn't help but wonder what the sponsors were hoping for. I mean, it was about boring, bureaucratic B.S. (and that's not a college degree, by the way). Well, as it turned out, it topped the hit parade in the participant evaluations, and I still get requests for copies of the presentation today.

I'm quite sure that it wasn't my phenomenal charisma that made such an impression, so I've circled back more than a few times to learn why people care about policy. One CSO in particular was interested in learning how I had approached the enforcement part of policy. And as I started to dig in to what I thought was familiar land, I hit a rock. While it's easy to spout off about the way things ought to work, it's another thing altogether to try to tell someone how to enforce the rules. Policy policing, it turns out, is not as easy as it sounds.

Many chief information officers and others at the top pay only lip service to supporting infosec policies. Nimda, and a few other wake-up calls, has changed that for some, because multiple attack vectors whacked enough critical business processes to bring new meaning to the concept of "intense displeasure" among business managers. "Hmmm," says the CEO, finally. "If we have a policy on this, maybe we need to be more forceful in enforcing it." Eureka.

History Lesson

My dictionary defines policy as "a plan or course of action, as of a government, political party or business, designed to influence and determine decisions, actions and other matters." Now, believe me, I'm all about influence. But determining decisions and actions? That's another matter. In fact, it's one hell of a stretch.

Think about the evolution of corporate security policy. Several decades ago it was pretty straightforward, although it wasn't very visible from a business-process perspective. We had a basic framework aimed primarily at managing a baseline security program, covering physical access, notification protocols, safety and perhaps some directives that emerged from an incident or event of note.
And then Al Gore invented the Internet. Do you suppose he had imagined the potential for doing business on such a highway? Did any of us imagine how insecure it would be? How about we put this incredible facility on our desktops? Who would've thunk some idiot would send uninvited trash to colleagues? It's clear that we most certainly need some business rules and other safeguards around this channel.
The past dozen or so years have been manna from heaven for policy partisans everywhere, what with the (continuing) influence of the lawyers, insurance carriers and employment laws. There was an explosive integration of technology into core business processes, with the resulting risks to intellectual property and business continuity. Add to that the Corporate Sentencing Guidelines, a plethora of industry-specific regulations, privacy, the Patriot Act, Sarbanes-Oxley, anthrax, SARS, terrorism threats....
We've got to have an envelope of policies and procedures with all that potential for disaster, don't we? You bet.

Details, Details

There are four parts to governance from my perspective:
  • Identifying and communicating risk: What's the problem?
  • Creating an accepted policy and guidance infrastructure: What do we expect accountable parties to do?
  • Developing processes to monitor conformance with policy: How do we know we are successful?
  • Preparing response capabilities for when the controls fail: If it hits the fan, who will do what to mitigate it?

Assessing compliance is not the problem. Not surprisingly, policy-compliance monitoring in the information security realm is largely automated. A number of products can be deployed to monitor and report on rule infractions. Both logical and physical access control and intrusion detection are highly sophisticated and online. A variety of business-process anomalies are identified with smart-transaction monitoring. Internal and external audits will assess and confirm compliance, and our investigations will reveal where policies were not followed. In short, a huge portion of the policy landscape is, or can be, tested in real time for conformance. Unfortunately, we aren't so easily able to do that with infractions of business and professional conduct policy, which is a huge element in your company's reputational risk.
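As a very small illustration of the kind of automated rule checking described above, the sketch below flags infractions in an access log. The event fields and the two example rules are hypothetical, and the commercial monitoring and intrusion-detection products the author refers to operate at far greater depth; this only shows the general shape of rule-based compliance reporting.

# Minimal sketch of automated policy-compliance monitoring. The event
# fields and rules are hypothetical examples, not a real product's API.

from dataclasses import dataclass

@dataclass
class AccessEvent:
    user: str
    resource: str
    hour: int         # 0-23, local time of the access
    authorised: bool  # did the access-control system grant it?

# Each rule is a (name, predicate) pair that flags a policy infraction.
RULES = [
    ("unauthorised access", lambda e: not e.authorised),
    ("out-of-hours access to finance systems",
     lambda e: e.resource.startswith("finance/") and not 8 <= e.hour <= 18),
]

def infractions(events):
    """Yield (rule name, event) for every event that breaks a rule."""
    for event in events:
        for name, predicate in RULES:
            if predicate(event):
                yield name, event

if __name__ == "__main__":
    log = [
        AccessEvent("alice", "finance/payroll", hour=23, authorised=True),
        AccessEvent("bob", "hr/records", hour=10, authorised=False),
    ]
    for rule, event in infractions(log):
        print(f"{rule}: {event.user} -> {event.resource}")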
So here we are with a comprehensive set of governance and asset-protection policies, and options for measuring compliance. But what about enforcement and sanctions? The devil, of course, is in the details.

Policies set expectations and assign accountability. They establish a legal framework, spelling out what is and isn't permitted. They define how management will govern. They provide direction to our security strategy and architecture. But when do you stop selling and start punishing? And who authorizes you to do so? Which brings us to the first of five lessons for my CSO friends.

Lesson One. The enforcement of policy should be directly connected to the consequences of inaction. In other words, you need to create consequences for not actively following the company's policy. You need to punish the yahoos who don't follow the rules.

Lesson Two. Unattended risk is unacceptable. The concept of corporate governance is morphing. Events have moved insurers, shareholders, regulators, legislators and directors to a much lower tolerance for risk-taking, both from a personal and a corporate perspective. Consequences are shifting to officers, directors and audit committee members, who are now held accountable when bad things happen.

Lesson Three. An uncommunicated policy does not exist. The more that policies are clearly tied to well-communicated, higher-likelihood risks, the more our constituents will understand and comply. Are you surprised that a policy on testing business continuity plans or building evacuations might have sold shortly after 9/11 to the same people who put up a fight when we called an annual drill a few months prior?

Lesson Four. The masses know when policies are hollow or inequitably enforced. It's the idea of enforcement that causes the kinds of reactions we often get from our customers. With pressure from insurers, regulators and boards, the frequency of cyberattacks and a raised bar on risk management, I think we're beyond having to justify an inventory of security policies. The rub is around what you intend to do about noncompliance. That is where your success at selling the policy to top management, and then communicating expectations to employees, is key to effectiveness.

Lesson Five. Do your homework and frame the business case for a policy. Isn't it amazing that when we catch an hourly employee doing something wrong, we have to hold management back from sending him to the gallows? But what happens when it's one of their own? "Do you realize how valuable this guy is?" they'll ask incredulously. I've had more than my share of time in the hot seat on issues such as that, and my best ally has always been our employment law counsel. (I've not had the same luck with HR types.) The lawyers know that uneven application of sanctions is an invitation to a lawsuit. General counsel should be in the loop on all policies that carry the potential for employee sanctions. You should be playing out "what if" scenarios and driving mutual stakes in the ground on how infractions will be pursued, regardless of rank. If you think there is an elitist culture working overtime at your company, you'd also do well to think hard about how you approach the investigation of white-collar wrongdoing. Findings need to be bulletproof. Meaningful sanctions are at work when someone at the accountable management level (on his watch) gets his bonus croaked or gets fired. I heard a story about one executive who was so careless with his laptop that, two weeks after his first one was stolen, his replacement was also taken. Both had highly proprietary data on them. He was sacked. You better believe that message was not lost on the survivors.

So let it be written. So let it be done.