Organizational Memory and Laboratory Knowledge Management: Its Impact on Laboratory Information Flow and Electronic Notebooks

When we enter into discussions about laboratory automation, computing, and informatics, most of the effort is focused on information management (LIMS, LIS, SDMS), data generation (IDS, LES), and robotics (sample manipulation in preparation for data generation). That’s where the products are and where most of the justification for labs and projects originates.

During the course of laboratory experiments, testing, and research, a lot of information, data, and reports are produced that may or may not be well managed. These materials are a valuable product of lab work and, taken together, form the organization’s memory: the history of what has been done, why it was done, and what the results were, including successes and blind alleys. Ever find yourself remembering that someone did some work on a topic that’s now of interest, and wondering where that material is?

Developing an organizational memory as part of the lab’s informatics structure, or extending it into a larger organizational matrix, is an important aspect of realizing a return on investment (ROI) in lab work. Taking steps to make that information resource more usable can raise the ROI significantly by avoiding duplication of work, reducing the time people spend on manual searches, and coordinating work from multiple sources.
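
To make the idea concrete, here is a minimal, hypothetical sketch (in Python) of an organizational memory index: lab reports are indexed by keyword so that past work can be found by topic rather than by remembering who did it. The report structure and function names are assumptions for illustration, not a description of any particular product.

```python
# Minimal sketch of an "organizational memory" keyword index over lab reports.
# The report structure and function names are illustrative assumptions only.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class LabReport:
    report_id: str
    author: str
    title: str
    abstract: str


def build_index(reports):
    """Map each lowercase keyword to the set of report IDs that mention it."""
    index = defaultdict(set)
    for report in reports:
        for word in (report.title + " " + report.abstract).lower().split():
            index[word.strip(".,;:%")].add(report.report_id)
    return index


def search(index, *keywords):
    """Return the IDs of reports that contain all of the given keywords."""
    matches = [index.get(k.lower(), set()) for k in keywords]
    return set.intersection(*matches) if matches else set()


reports = [
    LabReport("R-001", "A. Chen", "Stability study of compound X",
              "Accelerated stability testing at 40 C and 75% RH."),
    LabReport("R-002", "B. Ortiz", "HPLC method for compound X",
              "Reversed-phase method development and validation."),
]
index = build_index(reports)
print(search(index, "compound", "stability"))   # expected: {'R-001'}
```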

With the advent of artificial intelligence systems, ELN, and LIMS/LIS, we have the basis for developing an organizational memory system. That is what this article is about.

To view the material through the LIMSwiki system, click here. (Note: there is no cost for accessing LIMSwiki material; you simply have to log in to your account.)

Guide for Management: Successfully Applying Laboratory Systems to Your Organization’s Work

Laboratory informatics involves a collection of technologies that range from sample storage management and robotics to database/workflow management systems such as laboratory information management systems (LIMS) and electronic laboratory notebooks (ELN), with a lot of task-specific tools in between. These components were designed by a number of vendors who saw specific needs and developed products to address them. Those products in turn were presented to laboratories as a means of solving their instrumental data collection and analysis, sample preparation, data management, and document management issues. With many needs and so many ways to address them, how do you go about choosing a set of products that will work for you?

That is what this set of webinars is all about. We introduce the technologies and position them for you so that you can see how they may or may not apply to your work. Then we address the very real-world topic of justifying the investment needed to put those tools to use in your laboratories.

Once that foundation has been put in place we cover:

  • Technology planning and education: Planning is essential for success in this work. We look at how to go about it, who to involve, and methodologies for carrying out the work. We also look at the associated knowledge necessary to be effective.
  • Implementation: Informatics systems can be a challenge to implement. We look at what is needed to minimize risks and make the implementation easier, as well as the support requirements needed to manage their use in your laboratory environment.
  • Regulatory guidelines and compliance: We address how regulatory guidelines and compliance requirements can affect every laboratory application.
  • The future: What developments will arise and be needed in the future? We wrap up the series with those details.

The material in the link below gives you access to a series of slides and transcripts of a webinar series consisting of an introduction plus:

  • Laboratory Informatics Technologies
  • Laboratory Informatics and Return on Investment
  • Technology Planning and Education
  • LIMS/LIS, ELN, SDMS, IT, and Education
  • Supporting Laboratory Systems
  • Instrument Data Systems
  • Laboratory Processes

This link gives you access to the material through LIMSwiki.

Elements of Laboratory Technology Management

This discussion is less about specific technologies than it is about the ability to use advanced laboratory technologies effectively. When we say “effectively,” we mean that those products and technologies should be used successfully to address needs in your lab, and that they improve the lab’s ability to function. If they don’t do that, you’ve wasted your money. Additionally, if the technology in question hasn’t been deployed according to a deliberate plan, your funded projects may not achieve everything they could. Optimally, when applied thoughtfully, the available technologies should result in the transformation of lab work from a labor-intensive effort to one that is intellectually intensive, making the most effective use of people and resources.

People come to the subject of laboratory automation from widely differing perspectives. To some it’s about robotics, to others it’s about laboratory informatics, and still others view it as simply data acquisition and analysis. It all depends on what your interests are and, more importantly, what your immediate needs are.

People began working in this field in the 1940s and 1950s, with the work focused on analog electronics to improve instrumentation; this was the first phase of lab automation. Most notable were the development of scanning spectrophotometers and process chromatographs. Those who first encountered this equipment didn’t think much of it, taking it as the way things had always been. Others, who had to work with products like the Spectronic 20[a] (a single-beam manual spectrophotometer) and develop visible spectra one wavelength measurement at a time, appreciated the automation of scanning instruments.

Mercury switches and timers triggered by cams on a rotating shaft gave chromatographs the ability to automatically take samples, actuate back-flush valves, and handle other functions without operator intervention. This left the analyst with the task of measuring peaks, developing calibration curves, and performing calculations, at least until data systems became available.

The direction of laboratory automation changed significantly when computer chips became available. In the 1960s, companies such as PerkinElmer were experimenting with the use of computer systems for data acquisition as precursors to commercial products. The availability of general-purpose computers such as the PDP-8 and PDP-12 series (along with the Lab 8e) from Digital Equipment, with other models available from other vendors, made it possible for researchers to connect their instruments to computers and carry out experiments. The development of microprocessors from Intel (4004, 8008) led to the evolution of “intelligent” laboratory equipment ranging from processor-controlled stirring hot-plates to chromatographic integrators.

As researchers learned to use these systems, their application rapidly progressed from data acquisition to interactive control of the experiments, including data storage, analysis, and reporting. Today, the product set available for laboratory applications includes data acquisition systems, laboratory information management systems (LIMS), electronic laboratory notebooks (ELNs), laboratory robotics, and specialized components to help researchers, scientists, and technicians apply modern technologies to their work.

While there is a lot of technology available, the question remains: how do you go about using it? Not only do we need to know how to use it, but we also must do so while avoiding our own biases about how computer systems operate. Our familiarity with computer systems in our daily lives may lead us to assume they are doing what we need them to do, without questioning how it actually gets done. “The vendor knows what they are doing” is a poor reason for not testing and evaluating control parameters to ensure they are suitable and appropriate for your work.
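
One simple way to act on that advice is to challenge the configured data system with a reference standard of known value before accepting the default settings. The sketch below, with illustrative names and numbers only, shows the kind of acceptance check involved.

```python
# Hypothetical acceptance check: before relying on vendor-default processing
# parameters, analyze a reference standard of known value and confirm that the
# reported result falls within an agreed tolerance. Numbers are illustrative.
def within_tolerance(reported: float, certified: float, tolerance_pct: float) -> bool:
    """True if the reported value is within +/- tolerance_pct of the certified value."""
    return abs(reported - certified) <= certified * tolerance_pct / 100.0


# Example: a check standard certified at 50.0 mg/L with a 2% acceptance window.
reported = 50.7   # value the data system produced with its default settings
if within_tolerance(reported, certified=50.0, tolerance_pct=2.0):
    print("Default parameters reproduce the certified value; settings look acceptable.")
else:
    print("Result out of tolerance; review integration and calibration settings.")
```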

View the full article on LIMSforum

Notes on Instrument Data Systems

The goal of this brief paper is to examine what it will take to advance laboratory operations in terms of technical content, data quality, and productivity. Advancements in the past have been incremental and isolated, the result of an individual’s or group’s work rather than part of a broad industry plan. Disjointed, uncoordinated, incremental improvements have to give way to planned, directed methods, so that appropriate standards and products can be developed and mutually beneficial R&D programs instituted. We’ve long since entered a phase where the cost of technology development and implementation is too high to rely on a “let’s try this” approach as the dominant methodology. Making progress in lab technologies is too important to be done without some direction (i.e., deliberate planning). Individual insights, inspiration, and “out of the box” thinking are always valuable; they can inspire a change in direction. But building to a purpose is equally important. This paper revisits past developments in instrument data systems (IDS), looks at issues that need attention as we venture further into the use of integrated informatics systems, and suggests some directions further development can take.

There is a second aspect beyond planning that also deserves attention: education. Yes, there are people who really know what they are doing with instrumental systems and data handling. However, that knowledge base isn’t universal across labs. Many industrial labs and schools have people using instrument data systems with no understanding of what is happening to their data. Others such as Hinshaw and Stevenson et al. have commented on this phenomenon in the past:

Chromatographers go to great lengths to prepare, inject, and separate their samples, but they sometimes do not pay as much attention to the next step: peak detection and measurement … Despite a lot of exposure to computerized data handling, however, many practicing chromatographers do not have a good idea of how a stored chromatogram file—a set of data points arrayed in time—gets translated into a set of peaks with quantitative attributes such as area, height, and amount.[1]

At this point, I noticed that the discussion tipped from an academic recitation of technical needs and possible solutions to a session driven primarily by frustrations. Even today, the instruments are often more sophisticated than the average user, whether he/she is a technician, graduate student, scientist, or principal investigator using chromatography as part of the project. Who is responsible for generating good data? Can the designs be improved to increase data integrity?[2]

We can expect that the same issue holds true for even more demanding individual or combined techniques. Unless lab personnel are well-educated in both the theory and the practice of their work, no amount of automation—including any IDS components—is going to matter in the development of usable data and information.
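
To make the quoted concern concrete, the following minimal sketch shows the kind of processing an IDS performs on a stored chromatogram: regions of the signal above a baseline threshold are reported as peaks with a height and a trapezoidal area. The threshold approach, data, and function names are simplified assumptions for illustration; commercial data systems use far more sophisticated baseline and peak models.

```python
# Illustrative sketch only: turn a stored chromatogram (time, signal) into
# peaks with height and area. Real IDS algorithms are far more elaborate.
def find_peaks(signal, threshold):
    """Return (start_index, end_index) pairs for regions where signal > threshold."""
    peaks, start = [], None
    for i, y in enumerate(signal):
        if y > threshold and start is None:
            start = i                      # peak begins
        elif y <= threshold and start is not None:
            peaks.append((start, i))       # peak ends
            start = None
    if start is not None:
        peaks.append((start, len(signal)))
    return peaks


def peak_area(times, signal, start, end):
    """Trapezoidal integration of the signal over one detected peak."""
    area = 0.0
    for i in range(start, end - 1):
        dt = times[i + 1] - times[i]
        area += 0.5 * (signal[i] + signal[i + 1]) * dt
    return area


# Tiny synthetic chromatogram containing a single peak.
times = [i * 0.5 for i in range(21)]
signal = [0.1, 0.1, 0.2, 0.3, 0.8, 2.0, 4.5, 7.0, 8.2, 7.1,
          4.6, 2.1, 0.9, 0.4, 0.2, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]
for start, end in find_peaks(signal, threshold=0.5):
    height = max(signal[start:end])
    print(f"peak from t={times[start]} to t={times[end - 1]}, "
          f"height={height:.1f}, area={peak_area(times, signal, start, end):.2f}")
```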

The IDS entered the laboratory initially as an aid to analysts doing their work. Its primary role was to off-load tedious measurements and calculations, giving analysts more time to inspect and evaluate lab results. The IDS has since morphed from a convenience to a necessity, and then to being a presumed part of an instrument system. That raises two sets of issues that we’ll address here regarding people, technologies, and their intersections:

1. People: Do the users of an IDS understand what is happening to their data once it leaves the instrument and enters the computer? Do they understand the settings that are available and the effect they have on data processing, as well as the potential for compromising the results of the analytical bench work? Are lab personnel educated so that they are effective and competent users of all the technologies used in the course of their work? (A small illustration of how processing settings can change reported results follows this list.)

2. Technologies: Are the systems we are using up to the task that has been assigned to them as we automate laboratory functions?
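
As a small, hypothetical illustration of the first question, the sketch below processes the same raw trace with two different smoothing settings; the reported peak height changes even though the underlying data did not. The moving-average approach and window sizes are arbitrary choices for the example, not a description of any vendor’s algorithm.

```python
# Hypothetical illustration: the same raw signal reported with two different
# smoothing settings gives different peak heights. Window sizes are arbitrary.
def moving_average(signal, window):
    """Simple centered moving average; window should be a small odd integer."""
    half = window // 2
    smoothed = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        smoothed.append(sum(signal[lo:hi]) / (hi - lo))
    return smoothed


raw = [0.1, 0.1, 0.2, 0.5, 3.0, 9.0, 3.2, 0.6, 0.2, 0.1, 0.1]  # sharp, narrow peak

for window in (1, 5):   # window = 1 leaves the data untouched
    peak_height = max(moving_average(raw, window))
    print(f"smoothing window = {window}: reported peak height = {peak_height:.2f}")
```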

View the entire article on LIMSforum

Directions in Laboratory Systems: One Person’s Perspective

Introduction

The purpose of this work is to provide one person’s perspective on planning for the use of computer systems in the laboratory, and with it a means of developing a direction for the future. Rather than concentrating on “science first, support systems second,” it reverses that order, recommending the construction of a solid support structure before populating the lab with systems and processes that produce knowledge, information, and data (K/I/D).

Intended audience

This material is intended for those working in laboratories of all types. The biggest benefit will come to those working in startup labs, since they have a clean slate to work with, and to those just entering scientific work, as it will help them understand the roles of the various systems. Those working in existing labs will also benefit by seeing a different perspective than they may be used to, giving them an alternative path for evaluating their current structure and how they might adjust it to improve operations.

However, all labs in a given industry can benefit from this guide, since one of its key points is the development of industry-wide guidelines for solving technology management and planning issues, improving personnel development, and more effectively addressing common projects in automation, instrument communications, and vendor relationships (resulting in lower costs and higher success rates). Such guidelines would also provide a basis for evaluating new technologies (reducing risks to early adopters) and for fostering product development that reflects the product requirements of a particular industry.

Link to the online article (found on LIMSforum)