In an increasingly digital world, Incognito considers the future of laboratory training and development, questioning the quality of current practices and assessing the “easy wins” that can be learned from other industries. With the end-goals of improving compliance, quality, and laboratory efficiency, as well as uplifting staff morale, Incognito shows that the entire industry can benefit.
I can say with some certainty that employers within Analytical Chemistry bemoan the skills and capabilities of graduates emerging from our academic systems. I can also say, with some insight, that the training provided to these graduates, once in industry, typically varies from poor to mediocre. Rarely have I seen examples of outstanding practice in workplace learning and development.
How many of us still have a large pile of standard operating procedures (SOPs) that must be “read and understood” as part of the on-boarding training process? Seriously, how much of this information do we think is retained by the recruit? Do we even bother to measure how much “understanding” has taken place?
Once the on-boarding process is complete, does the training quality improve? How many of us have designated trainers who have been assessed as capable and qualified to train in a particular test or technique? Of these trainers, how many have had any degree of training in “how to train”—the so-called Train the Trainer paradigm? Without some education and practice in andragogy and the most effective ways to deliver and assess training, we can assume that our training programmes are sub-optimal. Without an appreciation of cognitive and kinesthetic taxonomic levels, of training designs that maximize information retention and skills development, or of effective training techniques, how can we expect our trainers to be effective? How much do we rely upon accreditation of prior learning (APL), assuming that someone coming from a good university possesses fundamental knowledge or skills that they may not in fact have? How much do we assume that recruits with previous laboratory experience are skilled in basics such as gravimetric or volumetric measurement? Even where we have well-developed training programmes, perhaps using dedicated test articles to assess whether a trainee can produce the “correct” concentration, for example, how much do we focus on the production of this “correct result” rather than on the processes and underpinning knowledge associated with the various stages of the analysis? How often is training based on “grandfathering” of knowledge from more experienced team members—regardless of their own knowledge or understanding? I think we all know that inherent misunderstandings or poor practice can be amplified very quickly under this type of system.
I believe we can do much better in training analytical laboratory staff, and below are some thoughts that I hope will inspire you to question your current training practice and to develop your training programmes to meet the standards we associate with other industries.
Training Methodologies
First on my mind, some over-arching points relating to training methodologies. The new zeitgeist for successful learning indicates that educational events should be: social, engaging, personalized, and inclusive. I’ve read this list several times lately, but most recently it was quoted on a call with LinkedIn Learning (previously Lynda.com) and I’ll talk more about digital learning in a while. For now, let me just pose the question, “could your learning events be described using the list above?” I’ll be honest and say that my own learning events often fall short, in all of these categories.
The folks we use as “designated trainers” within the laboratory will typically have reached such a degree of knowledge and experience with a technique or test methodology that we are confident they can answer trainee questions, demonstrate good technique, and show a deep understanding of the methodologies and equipment involved. I say “typically”; I should have said “demonstrably”, because we really ought to be using subject matter or technique experts as our trainers: people who can consistently show that they truly have the required skills and knowledge. But how many of these folks will have been trained in the art and skill of training? Within larger organizations, which may have dedicated teams of trainers, this may well be the case, but how many smaller organizations could say the same? The way training is designed, delivered, assessed, and then transferred back into the workplace is of fundamental importance. Knowing how to prepare for training, the best ways to deliver an event to those with different learning styles, ensuring training techniques are used to maximize retention of information, and then properly assessing and ensuring the learning is transferred into daily practice is a specialist field, and it needs to be a primary consideration when building an effective work-based training and learning programme. This cannot simply be ignored because we work in a scientific discipline and these aspects feel like “soft skills”, and therefore not applicable to our highly rigorous environment. Trainers need to be taught to be effective; nothing would convince me otherwise, and it is a fundamental truth that expertise does not give anyone, by right, the skills to be an effective trainer.
Of course, by taking this approach we are somewhat shooting ourselves in the foot with regard to the effectiveness of our training programme. I'm guessing that staff with the required expertise who have also been trained in the andragogical aspects of effective training design and delivery will be few and far between within any organization. Therefore, we butt heads against the evergreen problem of “trainer availability”. Laboratory Scientist A needs to be trained and “signed off” prior to undertaking an analysis for which there is currently high demand, and the training is therefore “urgent”; however, there are no trainers available for scheduling in the next few weeks, and we have ourselves a problem. Well, here is where we need to get very creative, and I'll talk about the flexibility afforded by digital learning in a little while. Many businesses will link their HR or quality management systems (QMS) to the training of individuals, and it will be possible to schedule both on-boarding and on-the-job training events well into the future, but real life rarely follows even the best-planned schedule, and there must be enough capacity within the training system to flex with business requirements.
Grandfathering
In many organizations I see the process of grandfathering of knowledge and skills. This can be the case even where a good training documentation system exists, but where trainers very readily make statements such as, “the training documents say we do this, but for a while now we have been doing it this way” or “the training documents say we do this, but I always find it works better when I do it this way”. The age-old question of who polices the police is a gnarly one, but just ask yourself: who keeps your training documentation up to date, and who checks that your trainers are delivering the information to the correct levels? Further, who decides what knowledge and skills can be covered by APL when someone enters our business from another reputable company? Do we even bother to assess whether that PhD-level chemist can use a positive displacement pipette properly? I could write a book about this subject, but I suspect it wouldn't sell many copies, so for now I'll leave you with the questions to ponder.
Assessment of Understanding
The last of my training methodology comments is around assessment and the so-called “transference of training” into the workplace. Again, there is so much I could write, but I'll take just a few examples to highlight the major points. The principle of “read and understand”, I would contend, actually means “read and acknowledge that you read”, as very often there is little assessment of understanding. Why is this? Is it because it takes too much time and effort to design an effective assessment of the subject material? Is it because we need to get policies or test methods into circulation swiftly and there isn't time to design and deploy an effective assessment? Is it because having someone read the policy, procedure, or test method is better than doing nothing at all? Or is it really because this approach is enough to cover ourselves from a quality perspective, ticking a box so we can state that “everyone was trained in the document”? Is it really too much trouble to use one of the really nice software-based e-learning assessment tools to generate a meaningful “quiz” to ensure that the basics of the subject matter really are understood? I'm sure there are many examples where “read and understand” really requires some higher taxonomic level of understanding, such as analysis or evaluation, where concepts need to be applied to new situations or used to draw conclusions from the evidence presented. Can these levels of understanding be assessed using software tools, or do we need to be more creative in evaluating the effectiveness of training and ensuring that learnings will be carried into the workplace? Assessment methods will need to be more comprehensive, and rigorous coaching and evaluation employed on an ongoing basis, to ensure the learnings are effectively transferred into daily working practice. What means do you have available to ensure this is happening? How do you assess the acquisition of new skills?
Typically by “observation of the trainee” making up a mobile phase, using a pH meter, using a pipette, and so on. I guess this is somewhat effective, but read on as I pick up on the points of confirmation bias and the value of failure. Some laboratories will assess a “whole process”, such as measuring the “correct” concentration for a pre-characterized test article using a chromatographic technique, which is very much a step in the right direction in terms of assessing at higher taxonomic levels, but how much confirmation bias is there in this process? What drivers are there for a trainer to observe impartially whilst a trainee makes mistakes and ultimately arrives at the “incorrect” answer for the test article concentration? How tempting is it for the trainer to intervene when they see an issue, to steer the trainee towards the right result at the end of the assessment process, because who wants to go through the training and assessment process all over again? Well, the temptation will be less if your trainers have been properly educated as educators, since they will know the positive benefits of being allowed to fail and then deconstructing the failure for the benefit of improved understanding; and if there is an effective and flexible training scheduling programme in place, there will be time to allow this learning-through-failure process to happen.
Continuing Professional Development (CPD)
An interesting question to ask, when assessing the quality of a training system, is, “to what level is knowledge, in its purest form, a fundamental aspect?” I believe that—understandably, given the maturity of our science and the industrial context—we train “to do”. Our training courses are based around SOPs, analytical methods, and instruments. I grant that there may be sections in these documents on “Background” or “Theory”, but they are often scant and lack the detail needed to be useful in, say, troubleshooting or method optimization. You may be thinking, “well, this is something which builds with a chemist over their career”, or “surely this should be part of ongoing CPD activities”, and to a large extent I'd agree with you. But are we good at providing opportunities for CPD? Do we consider it necessary for our profession? If the answer is yes, then the next question must surely be: why have our professional bodies not mandated that this be part of our ongoing training and career development? I grant you there have been welcome moves in this direction over recent years, but I don't see the clamour for the CPD points certificate at the end of external training courses, conferences, or symposia in analytical chemistry that I always see at clinical diagnostics and toxicology conferences.
Digital Training
At this point I should admit that I've been leading this discussion in a particular direction, in fact towards two very important aspects that I believe point to the future of training and learning in our industry. The first is simple, a single word in fact: digital.
I’ve recently been studying how to better use digital learning technologies in staff development, which involved a review of what is being done in allied industries. Below are just a few examples, with an explanation of why I think they are such good ideas.
Video training resources, more specifically videos of experts demonstrating skills, can be highly useful in solving many of the problems highlighted above. The quality of the training is “fixed”: our experts do the right things, and their techniques can be reviewed by senior staff or external peers for the avoidance of doubt. The scheduling issues with training can be overcome—digital resources are accessible anytime and anywhere the trainee has internet access. Here’s the killer part of implementation though—in a case study that I have followed, a peer group of trainees then gather to train each other, based on what they have learned online.
Their skills demonstrations are filmed using GoPro-type cameras, or simply their mobile phones, and the videos are uploaded to a site for review and feedback from the subject matter expert trainers. However, it gets better: the trainees are then required to use their wearable or mobile tech to record the same operations during “daily” work over a set period or number of events, and these videos are then uploaded for review to ensure training transference into the workplace. This approach is evolution rather than revolution, but it has so much that is positive going for it. Preparing to teach others to do something is, in fact, the best way to learn. So the “cohort learning” aspect is perfect in this regard, and it ticks so many of the boxes from our original list of ideal training requirements: social, engaging, personalized, and inclusive. The results from this study are already showing that training delivered in this manner is much more effective than traditional laboratory training approaches.
Measurable Delivery of Knowledge-based Learning and Assessments
Of course, digital delivery methods also allow measurable delivery of knowledge-based learning, such as that available from LCGC’s very own CHROMacademy (1), a host of instrument manufacturers’ websites, and a wide variety of other internet resources. The advantage of the CHROMacademy platform is the number of assessments available and the way these assessments are built to measure key facets of the learning with varying question sets—trainees are not learning the assessment answers, because the assessments change each time. Again, this improves the availability and breadth of the knowledge-based learning, and also standardizes the rigour of the assessment process. For those who are sceptical of the value of digital learning, please move over and allow the digital-native generations to educate you on the amount of information that, within their daily lives, they derive from digital and online sources.
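As an illustration of the varying-question-set idea, here is a minimal sketch in Python. The question bank, topic names, and wording below are entirely my own hypothetical examples (not taken from CHROMacademy or any real platform), but the pattern they demonstrate, drawing one randomly chosen variant of each concept per sitting in shuffled order, is what prevents trainees from simply memorizing the assessment answers.

```python
import random

# Hypothetical question bank: each concept maps to several equivalent
# question variants, so no two quiz sittings need be identical.
QUESTION_BANK = {
    "mobile_phase_prep": [
        "Why should buffer pH be adjusted before adding the organic modifier?",
        "What error does temperature introduce when measuring solvents volumetrically?",
    ],
    "pipetting": [
        "When is a positive displacement pipette preferred over air displacement?",
        "How does pre-wetting the tip affect the dispensed volume?",
    ],
}

def build_quiz(bank, seed=None):
    """Draw one randomly chosen variant per concept, in shuffled order."""
    rng = random.Random(seed)  # seeded for reproducible sittings if desired
    quiz = [(topic, rng.choice(variants)) for topic, variants in bank.items()]
    rng.shuffle(quiz)
    return quiz
```

Every sitting still covers each concept exactly once, so coverage is standardized even though the individual questions vary.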
Combine all of these digital assets, learning, and assessment opportunities into a digital platform (a learning management system [LMS]) and one has the foundation of something which could be transformational in laboratory training and development. Heck, if we can then combine the confirmation of transference into the workplace with digital badges (the millennial equivalent of football stickers, as far as I can tell!), then, with some standardization, we could perhaps begin to build a digital curriculum that could be used on an industry-wide basis, which is the second important proposal I wanted to lead us towards. Why can’t we have a postgraduate industrial training standard in analytical chemistry, just like so many other allied industries?
Barrier Breaking for The Greater Good
OK, so let’s just pause to consider some barriers to the implementation of such a system. Who is going to build the digital assets and record the training videos? Someone still needs to arrange, and where appropriate oversee, the practical sessions where trainees are involved in “whole process” training. Someone will cry “foul” because the production of an industry-wide curriculum for learning seems too democratic (why should we pay to help raise the standards of our competitors?) or because the material isn’t pertinent to “the way we do things here”. Are we not grown up enough as an industry to overcome these barriers? With the exception of the most complex process-related tasks, could our trainee cohort, given their prior exposure to digital learning and demonstrations, not be trusted to work in a safe and organized manner? Are there not a tonne of digital resources from vendors and reputable providers to lighten the burden of video production and at least let us give this approach a try? Are we not big enough to admit that “the way we do things here” isn’t always the best way? Isn’t it about time we had some standardization and expectation setting for our analytical laboratory staff? Do we not have enough differentiators within our businesses to render analytical skills to the lesser leagues of “why we win”?
Conclusion
I’ll leave everyone who made it this far to reflect on what, in an increasingly digital world, might just be the future of laboratory training and development. We may want to ask ourselves whether we are truly delivering a great industrial training and learning experience to our staff, and how anything we do to improve the effectiveness of laboratory training might pay us back very quickly in terms of improved compliance, quality, and laboratory efficiency. Not to mention the morale uplift in staff who feel more “invested in” and are able to proudly display the digital badges collecting nicely towards their Registered Analytical Chemist certification.
Reference
1. CHROMacademy, www.chromacademy.com
Contact Author: Incognito
E-mail the Editor: kjones@mjhlifesciences.com