Earlier this week – prompted by a new report on safety in academic chemical research labs – Science Careers posted an article titled “Practical steps to make lab workers safer”. The article highlights that students, postdocs and technicians “face the greatest risk of injury […] —yet they are often powerless to make their working lives safer”, and then lists a number of proposals for improvements. While I haven’t read the actual report (available for a mere $45), the basic idea resonates with recurring discussions I have had with friends and colleagues about highly impractical safety procedures. Sometimes these regulations are so stupid that I am convinced they are aimed less at ensuring actual safety than at creating an illusion of it, covering employers’ asses should it ever come to an accident and/or legal action.
The way I see it, institutes and departments fail on three counts when it comes to safety:
First, the regulations (or their implementation) are often nonsensical or out of proportion to the actual hazards employees are exposed to. A friend of mine, for example, told me she is required to wear full-body protective clothing (i.e. long trousers, closed shoes, a lab coat and gloves) to work with certain equipment in a small, windowless room stuffed with electronic equipment and lacking air conditioning – a room that is stiflingly hot even in winter and completely unbearable in summer. Or take the case of ethidium bromide (EtBr), a potential carcinogen used to stain DNA: the dangers of the dye are much debated, but labs are often absolutely paranoid and hysterical about the mere idea of using EtBr – yet the very same labs will then handle EtBr-stained samples with latex gloves (latex lets EtBr pass through) and image the stained samples using an open UV source, even though UV is a well-documented carcinogen.
Second, employees are not provided with (regular) training on safety procedures. The extent of training varies from lab to lab, but I think most institutions have a tendency to pass it off as a pure formality. As someone who works late nights (when there’s often no one else around) and who has had a couple of close calls (setting a beaker of ethanol on fire, and other minor mishaps), this really angers me. No institution should take safety lightly, and all employees should have the opportunity to gain extensive insight into the regulations specific to a given institution. In my current department I was even “allowed” to skip safety training altogether, because it was only offered in French – which I find wholly unacceptable. And even beyond initial training, there are constant technological advances that make lab equipment safer, new toxicology studies, and new regulations based on frequent accidents elsewhere. However, keeping up with these advances is usually outside the scope of scientists’ work, so even those who were once well trained might be desperately outdated after 10–15 years. To follow up on the EtBr vs. UV story: closed imaging systems, in which scientists are not exposed to UV, are available (and have been for many years), yet when I recently mentioned this to our safety officer (a rather senior member of staff), he shrugged it off as a modern fad that still needed to prove its usefulness before such an investment would be warranted.
Third, there is generally no forum for employees to give feedback about bad safety measures. At all of my previous institutions, lab safety was coordinated either by an office far removed from daily lab life, or by full-time scientists who volunteered some of their time to fill the role of safety officer. Neither of those solutions works. In the latter case, having to deal with actual problems is frequently seen as a hassle that competes with time for experiments. This results in half-baked solutions, adopted just to get the issue out of the way as quickly as possible. For example, some years ago a carbon dioxide detector went off in our cell culture room. When I told our (volunteer) safety officer, he turned off the alarm, saying it often malfunctioned, and that he’d have the electrician look into it in a couple of months (!!!), when the yearly checkup was done. Dedicated safety officers, in contrast, might be more effective, but they often have no contact with researchers and therefore do not recognize their needs. Likewise, they are often not available for discussion when safety regulations clash with a healthy work environment (see the aforementioned case of full protective gear in an unbearably hot room).
These three failures, combined with the global mobility of scientists – who may have encountered different regulations at different institutions – often result in scientists making up their own safety rules, which they believe are adequate. Not only is that a slippery slope (first your own rules for a couple of small things, and then…?), but it also means that the collective working environment becomes a hotchpotch of everyone’s personal rules, from inadequately labeled samples to dangerous substances handled without sufficient protection… To transform this mess from an illusion of safety into a culture of safety, the first step will be for institutions and scientists to work together on identifying problems and on implementing regulations sensibly, so that safety has a “high importance … all the time, not just when it is convenient or does not threaten personal or institutional productivity goals.”