Elements of Laboratory Technology Management

This discussion is less about specific technologies than about the ability to use advanced laboratory technologies effectively. When we say “effectively,” we mean that those products and technologies successfully address needs in your lab and improve the lab’s ability to function. If they don’t do that, you’ve wasted your money. Additionally, if the technology in question hasn’t been deployed according to a deliberate plan, your funded projects may not achieve everything they could. When applied thoughtfully, the available technologies should transform lab work from a labor-intensive effort into an intellectually intensive one, making the most effective use of people and resources.

People come to the subject of laboratory automation from widely differing perspectives. To some it’s about robotics, to others it’s about laboratory informatics, and still others view it as simply data acquisition and analysis. It all depends on what your interests are and, more importantly, what your immediate needs are.

People began working in this field in the 1940s and 1950s, with the work focused on analog electronics to improve instrumentation; this was the first phase of lab automation. Most notable was the development of scanning spectrophotometers and process chromatographs. Those who first encountered this equipment in its mature form didn’t think much of it, taking it as the way things had always been. Others who had to deal with products like the Spectronic 20[a] (a single-beam manual spectrophotometer), and use it to develop visible spectra one wavelength measurement at a time, appreciated the automation of scanning instruments.

Mercury switches and timers triggered by cams on a rotating shaft provided chromatographs with the ability to automatically take samples, actuate back flush valves, and take care of other functions without operator intervention. This left the analyst with the task of measuring peaks, developing calibration curves, and performing calculations, at least until data systems became available.

The direction of laboratory automation changed significantly when computer chips became available. In the 1960s, companies such as PerkinElmer were experimenting with the use of computer systems for data acquisition as precursors to commercial products. The availability of general-purpose computers such as the PDP-8 and PDP-12 series (along with the Lab 8e) from Digital Equipment, with other models available from other vendors, made it possible for researchers to connect their instruments to computers and carry out experiments. The development of microprocessors from Intel (4004, 8008) led to the evolution of “intelligent” laboratory equipment ranging from processor-controlled stirring hot-plates to chromatographic integrators.

As researchers learned to use these systems, their application rapidly progressed from data acquisition to interactive control of the experiments, including data storage, analysis, and reporting. Today, the product set available for laboratory applications includes data acquisition systems, laboratory information management systems (LIMS), electronic laboratory notebooks (ELNs), laboratory robotics, and specialized components to help researchers, scientists, and technicians apply modern technologies to their work.

While there is a lot of technology available, the question remains “how do you go about using it?” Not only do we need to know how to use it, but we also must do so while avoiding our own biases about how computer systems operate. Our familiarity with using computer systems in our daily lives may cause us to assume they are doing what we need them to do, without questioning how it actually gets done. “The vendor knows what they are doing” is a poor reason for not testing and evaluating control parameters to ensure they are suitable and appropriate for your work.
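
As a small illustration of what testing a vendor’s control parameters can look like, the Python sketch below checks whether a default data acquisition sampling rate provides enough points across the narrowest chromatographic peak a method is expected to produce. The 20-points-per-peak threshold is a common rule of thumb rather than a vendor specification, and the 5 Hz default and 2-second peak width are made-up numbers; the point is only that defaults should be checked against your own work rather than trusted blindly.

```python
# Hypothetical check of a vendor-default sampling rate for a chromatographic method.
# The 20-points-per-peak threshold is a common rule of thumb, not a vendor spec,
# and the numbers in the example are made up.

def points_per_peak(sampling_rate_hz: float, peak_width_s: float) -> float:
    """Data points acquired across a peak of the given base width."""
    return sampling_rate_hz * peak_width_s

def check_sampling_rate(sampling_rate_hz: float, narrowest_peak_s: float,
                        minimum_points: int = 20) -> None:
    pts = points_per_peak(sampling_rate_hz, narrowest_peak_s)
    if pts < minimum_points:
        needed_hz = minimum_points / narrowest_peak_s
        print(f"Only {pts:.0f} points across a {narrowest_peak_s:.0f} s peak; "
              f"consider raising the rate to at least {needed_hz:.1f} Hz.")
    else:
        print(f"{pts:.0f} points across the narrowest peak; the default looks adequate.")

# Example: a 5 Hz default evaluated against 2-second-wide early-eluting peaks.
check_sampling_rate(sampling_rate_hz=5.0, narrowest_peak_s=2.0)
```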

View the full article on LIMSforum

Laboratory Technology Planning and Management: The Practice of Laboratory Systems Engineering

What separates successful advanced laboratories from all the others? It’s largely their ability to meet their goals through the effective use of resources: people, time, money, equipment, data, and information. The fundamental goals of laboratory work haven’t changed, but labs are under increased pressure to do more, do it faster, and deliver a better return on investment (ROI). Laboratory managers have turned to electronic technologies (e.g., computers, networks, robotics, microprocessors, and database systems) to meet those demands. However, without effective planning, technology management, and education, those technologies will only get labs part of the way to meeting their needs. We need to learn how to close the gap between getting part of the way there and getting where we need to be. The practice of science has changed; we need to meet that change to be successful.

This document was written to get people thinking more seriously about the technologies used in laboratory work and how those technologies contribute to meeting the challenges labs are facing. There are three primary concerns:

  1. The need for planning and management: When digital components began to be added to lab systems, it was a slow, incremental process: integrators and microprocessors grew in capability as the marketplace accepted them. That development gave us the equipment we have now, equipment that can be used in isolation or in a networked, integrated system. In either case, these systems need attention in their application and management to protect electronic laboratory data, ensure that the data can be used effectively, and ensure that the systems and products put in place are the right ones and that they fully contribute to improvements in lab operations.
  2. The need for more laboratory systems engineers (LSEs): There is increasing demand for people who have the education and skills needed to accomplish the points above and provide research and testing groups with the support they need.[a]
  3. The need to collaborate with vendors: In order to develop the best products for laboratory work, vendors need more user input. Too often vendors have an idea for a product, or for modifications to existing products, yet lack a fully qualified audience to bounce ideas off of. With the planning described in the first concern in place, we should be able to approach vendors and say, with confidence, “this is what is needed” and explain why.

If the audience for this work were product manufacturing or production facilities, everything said here would already be familiar history. The efficiency and productivity of production operations directly impact profitability and customer satisfaction, so optimizing those operations is treated as an essential goal. The same level of attention must be applied to laboratory operations to accelerate research and testing, reduce cost, and improve productivity. Aside from a few lab installations in large organizations, that attention isn’t given, because people aren’t educated about its importance. The purpose of this work is to present ideas about which laboratory technology challenges can be addressed through planning activities, organized as a series of goals.

View the article on LIMSforum

Considerations in the Automation of Laboratory Procedures

Scientists have been dealing with the issue of laboratory automation for decades, and during that time the meaning of those words has expanded from the basics of connecting an instrument to a computer, to the possibility of a fully integrated informatics infrastructure beginning with sample preparation and continuing on to the laboratory information management system (LIMS), electronic laboratory notebook (ELN), and beyond. Throughout this evolution there has been one underlying concern: how do we go about doing this?

The answer to that question has changed from a focus on hardware and programming, to today’s need for a lab-wide informatics strategy. We’ve moved from the bits and bytes of assembly language programming to managing terabytes of files and data structures.

The high-end of the problem—the large informatics database systems—has received significant industry-wide attention in the last decade. The stuff on the lab bench, while the target of a lot of individual products, has been less organized and more experimental. Failed or incompletely met promises have to yield to planned successes. How we do it needs to change. This document is about the considerations required when making that change. The haphazard “let’s try this” method has to give way to more engineered solutions and a realistic appraisal of the human issues, as well as the underlying technology management and planning.

Why is this important? Whether you are conducting intense laboratory experiments to produce data and information or making chocolate chip cookies in the kitchen, three things remain important: productivity, the quality of the products, and the cost of running the operation. In either case, if the productivity isn’t high enough, you won’t be able to justify your work; if the quality isn’t there, no one will want what you produce. Conducting laboratory work and making cookies have a lot in common. Your laboratories exist to answer questions. What happens if I do this? What is the purity of this material? What is the structure of this compound? The range of laboratories asking these questions is extensive, covering essentially the entire array of lab bench and scientific work, including chemistry, life sciences, physics, and electronics labs. The more efficiently we answer those questions, the more likely it is that these labs will continue operating and that you’ll achieve the goals your organization has set. At some point, it comes down to performance against goals and the return on the investment organizations make in lab operations.

This article looks at conditions that need to be met before you embark on the automation of a laboratory process. It comes down to a key factor: is it worth it? What will you gain by doing it, how much effort will it take, and will it significantly improve lab operations?
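
One way to make the “is it worth it?” question concrete is a simple payback estimate. The sketch below, written in Python with entirely hypothetical figures, compares the up-front purchase and integration cost of automating a procedure against the monthly labor savings; it deliberately leaves out harder-to-quantify benefits such as improved data quality, reproducibility, or faster turnaround, which should also factor into the decision.

```python
# Minimal "is it worth it?" arithmetic for automating a single lab procedure.
# All figures are placeholders; substitute your lab's own numbers.

def payback_months(purchase_cost: float, integration_cost: float,
                   hours_saved_per_month: float, loaded_hourly_rate: float,
                   monthly_support_cost: float = 0.0) -> float:
    """Months until cumulative labor savings cover the up-front investment."""
    monthly_savings = hours_saved_per_month * loaded_hourly_rate - monthly_support_cost
    if monthly_savings <= 0:
        raise ValueError("No net monthly savings; the project can't be justified on cost alone.")
    return (purchase_cost + integration_cost) / monthly_savings

# Example: a $40,000 workstation plus $10,000 of integration work, saving
# 60 analyst-hours a month at a $75/hour loaded rate, with $500/month in support.
print(f"Payback in {payback_months(40_000, 10_000, 60, 75, 500):.1f} months")
```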

The material can be accessed through this link to the LIMSwiki.

What is Laboratory Systems Engineering?

Laboratory systems engineering (LSE) is intended as a multidisciplinary field of work encompassing an understanding of:

  • The relevant science: This is necessary because the LSE would connect with laboratory personnel and their work environment, understand what they are doing, and translate their needs into working systems. This could be a specialization point for an LSE in electronics, life science applications, physical sciences, etc. However, the basic principles of data acquisition, processing, storage, analysis, and so on are common across sciences, so moving from one scientific discipline to another would not be difficult.
  • Laboratory informatics: This includes LIMS, ELN, SDMS, LES, and IDS systems, digital communications connections (serial, parallel, GPIB, etc.), relevant protocols, analog data acquisition, and related processes.
  • Robotics: This includes electronics and mechanical engineering principles, and the application of commercial systems to lab work.
  • Information systems technologies: This includes hardware, operating systems, database applications, and communications. LSEs would not be a replacement for or duplication of traditional IT services; they would bridge the gap between IT services roles and the application of those technologies to lab work. In many lab applications, the computer is just one piece of a system that is only fully functional if all the components work together. While this is less of a concern when a system (instruments and computers) is purchased and installed by the vendor, it becomes a significant problem when mixed-vendor solutions are being used, for example, the connection of an instrument data system to a LIMS, SDMS, or ELN (a minimal sketch of that kind of connection follows this list).
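
As a rough illustration of the bridging work described in the last item, the sketch below reads a results file exported by an instrument data system and posts each record to a LIMS over HTTP. The CSV column names, the /api/v1/results endpoint, and the token-based authentication are assumptions made for the example; a real integration would follow the specific LIMS vendor’s interface documentation and the lab’s data integrity requirements.

```python
# Minimal sketch: push results exported by an instrument data system into a LIMS.
# The CSV column names and the REST endpoint below are illustrative assumptions;
# a real integration would follow the LIMS vendor's interface documentation.

import csv
import json
from urllib import request

LIMS_URL = "https://lims.example.org/api/v1/results"  # hypothetical endpoint
API_TOKEN = "replace-with-a-real-token"

def load_results(path: str) -> list[dict]:
    """Read an exported results file with sample_id, analyte, value, and units columns."""
    with open(path, newline="") as f:
        return [
            {
                "sample_id": row["sample_id"],
                "analyte": row["analyte"],
                "value": float(row["value"]),
                "units": row["units"],
            }
            for row in csv.DictReader(f)
        ]

def post_result(result: dict) -> int:
    """Send one result record to the LIMS and return the HTTP status code."""
    req = request.Request(
        LIMS_URL,
        data=json.dumps(result).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    for record in load_results("instrument_export.csv"):
        status = post_result(record)
        print(f"{record['sample_id']} / {record['analyte']}: HTTP {status}")
```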

Their skill set should include working with teams and the ability to lead them in the successful development, execution, and completion of projects. This would require interpersonal skills to work with people at various management levels.

Part of the LSE’s role is to examine new technologies and see how they can be applied to lab work. Another is to assess the lab’s needs and anticipate technologies that need to be developed.

An earlier version of this skill set was described under “Laboratory Automation Engineering,” drafted in 2005/2006**. In the almost two decades since that article was released, laboratory informatics and information technologies have become more demanding and sophisticated, requiring a change in the field’s name to reflect those points.

It would also be helpful to have a background in general systems theory. This field will help LSEs and those they work with describe and understand the interactions between the laboratory informatics systems used in laboratory work. Why is that important in this context? Laboratory informatics is an interconnected set of components that ideally will operate with minimal human intervention; general systems theory* will help describe those components and their interactions.

* “Systems are studied by the general systems theory—an interdisciplinary theory about the nature of complex organizations in nature, society, and science, and is a framework by which one can investigate and/or describe any group of elements that are functioning together to fulfill some objective (whether intended, designed, man-made, or not).” From: https://doi.org/10.1016/B978-0-12-381414-2.00004-X

** “Are You a Laboratory Automation Engineer?”