1.2.5.1 Lab Automation Integration and Interoperability
Lab instrument integration and interoperability to support higher levels of lab automation have evolved quickly and will continue to do so, driven by pressure from scientists and lab managers who, above all, want better ways to manage and control their equipment [44–46]. Fields as diverse as chemical synthesis [47] and next-generation sequencing (NGS) [48] are seeking to better automate their workflows, both to improve speed and quality and to align with the growing demands of AI in support of generative and experimental design as well as decision-making [49]. An additional stimulus toward increased automation, integration, and interoperability is experiment reproducibility. The reproducibility crisis that exists in science today is desperately in need of resolution [50]. It manifests not only in the inability to confidently replicate externally published experiments but also in the inability to reproduce internal experiments – those performed within individual organizations. Poor reproducibility and uncertainty over experimental data will also reduce confidence in the outputs from AI; the mantra "rubbish in, rubbish out" will thus continue to hold true! Appropriate automation and effective data management can support this vital need for repeatability, for example, of biological protocols [51]. This will be especially important to support and justify the lab-as-a-service business model, which we have mentioned previously. It is our belief that the increased reliability and enhanced data-gathering capability offered by automation initiatives in the LotF will be one important way to help address the challenge of reproducibility.
New automation will continually become available, whether as an upgrade or replacement for existing equipment and workflows, as an enhancement to augment current automation, or as a means to scale up more manual or emerging science workflows. When considering new automation, the choices facing lab managers and scientists will depend on whether the setting is a completely new lab environment (a "green-field site") or an existing one (a "brown-field site").
As mentioned previously, the growth of integration protocols such as IoT [52] is expanding the options for connecting equipment and automation [53]. How different workflows are integrated – from single measurements (e.g. balance readings), via medium-throughput workflows (e.g. plate-based screening), to high-data-volume processes such as high content screening (HCS) involving images and video – can be totally reimagined. IoT could enable the interconnectivity of a huge range of lab objects and devices, such as freezers, temperature control units, and fume hoods, which previously would have been largely standalone, with minimal physical connectivity. All these devices could be actively connected into expanded data streams and workflows, where the measurements they take, for example, temperature, humidity, and air pressure, become a more integral part of the experiment record. This enhanced set of data collected during experiments in the LotF will be hugely valuable during later analysis, helping to spot subtle trends and potential anomalies. Furthermore, these rich datasets could play an increasing role as AI is used more and more for data analysis; small fluctuations in the lab environment do have a significant impact on experimental results and hence on reproducibility. Beyond this passive sensor monitoring, there is also the potential for these devices to be actively controlled remotely; this opens up options for further automation and interaction between static devices and lab robots programmed to perform tasks involving these devices. As always, it will be necessary to select appropriate automation based on the lab's needs, the benefits the new automation and workflows can provide, and hence the overall return on investment (ROI).
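As a concrete illustration of folding ambient measurements into the experiment record, the minimal sketch below subscribes to readings from networked lab sensors over MQTT, a lightweight publish/subscribe protocol widely used in IoT deployments. The broker address, topic layout, and payload fields are hypothetical, and the paho-mqtt 1.x callback API is assumed.

```python
import json
from datetime import datetime, timezone

import paho.mqtt.client as mqtt  # assumes the paho-mqtt 1.x client API

experiment_log = []  # in practice this would feed an ELN or data lake


def on_connect(client, userdata, flags, rc):
    # Hypothetical topic scheme: lab/<device-id>/environment
    client.subscribe("lab/+/environment")


def on_message(client, userdata, msg):
    # e.g. payload {"temperature_c": 21.4, "humidity_pct": 45}
    reading = json.loads(msg.payload)
    experiment_log.append({
        "device": msg.topic.split("/")[1],
        "received_at": datetime.now(timezone.utc).isoformat(),
        **reading,
    })


client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("mqtt-broker.lab.example", 1883)  # hypothetical internal broker
client.loop_forever()
```

Every reading thus arrives timestamped and attributed to a device, ready to be joined to the experiment record at analysis time.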
While the potential of these new systems to improve process efficiency is clear, there is once again one vital aspect that needs to be considered carefully as part of the whole investment: the data. These LotF automation systems will be capable of generating vast volumes of data. It is critical to have a clear plan for how that data will be annotated and where it will be stored (to make it findable and accessible), in such a way as to make it appropriate for use (interoperable) and aligned to the data life cycle that your research requires (reusable). A further vital consideration will be whether there are any regulatory compliance or validation requirements.
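One way to build such a plan into the data itself is to capture each measurement with FAIR-oriented metadata at the point of creation. The sketch below is only an illustration of the idea; the field names, vocabulary labels, and retention policy shown are hypothetical rather than any published standard.

```python
import json
import uuid
from dataclasses import asdict, dataclass, field
from datetime import datetime, timezone


@dataclass
class MeasurementRecord:
    # Findable: a globally unique identifier for the record
    record_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    # Accessible: where the raw data lives (hypothetical URI)
    data_uri: str = ""
    # Interoperable: controlled vocabulary for quantity and unit
    quantity: str = "temperature"
    unit: str = "degC"
    value: float = 0.0
    instrument_id: str = ""
    protocol_ref: str = ""  # link back to the protocol or ELN entry
    # Reusable: provenance and life cycle information
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    retention_policy: str = "raw-10-years"  # hypothetical policy label


record = MeasurementRecord(
    data_uri="s3://lab-archive/run-0042/freezer-01.parquet",
    value=-80.2,
    instrument_id="freezer-01",
    protocol_ref="eln://experiments/EXP-1234",
)
print(json.dumps(asdict(record), indent=2))
```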
As stated previously, a key consideration with IoT will be the security of the individual items of equipment and of the overall interconnected automation [54, 55]. With such a likely explosion in the number of networked devices [56], each one could be vulnerable. Consequently, lab management will need to work closely with colleagues in IT network and security to mitigate any security risks. When bringing in new equipment, it will be ever more important to validate its credentials and ensure it complies with relevant internal and external security protocols.
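What "validating credentials" looks like in practice will vary by organization; one common building block is checking that a device presents a certificate signed by the lab's internal certificate authority before admitting it to the network. The sketch below uses only Python's standard library and assumes a hypothetical device endpoint and CA bundle.

```python
import socket
import ssl


def validate_device(host: str, port: int, ca_file: str) -> dict:
    """Connect to a device's TLS endpoint and verify its certificate
    against the lab's internal CA; raises ssl.SSLError on failure."""
    context = ssl.create_default_context(cafile=ca_file)
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()  # subject, issuer, validity dates


# Hypothetical device address and internal CA bundle
cert = validate_device("plate-reader-07.lab.example", 8443, "lab-internal-ca.pem")
print(cert["subject"], cert["notAfter"])
```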
While the roles of lab scientist and lab manager will clearly be heavily affected by these new systems, so too will the physical lab itself. Having selected which areas should have more, or more enhanced and integrated, lab automation, it is highly likely that significant physical changes to the lab will have to be made, either to accommodate the new systems themselves or to support enhanced networking needs.
In parallel with the lab environment undergoing significant change over the coming decades, new generations of scientists will be entering the workforce. Their expectations of what makes the LotF efficient and rewarding will differ from those of previous generations. The UX [57] for these new scientists should be a key consideration when implementing some of the changes mentioned in this book. For example, apps on mobile phones and tablets have transformed people's personal lives, but the development and adoption of apps for the lab has been slower. The enhanced use of automation will very likely need to be managed through apps; they will therefore become a standard part of the LotF. One cultural caveat around the growth of lab apps should be flagged here. With apps enabling much more sophisticated control of automation operating 24/7, via mobile phones, outside "human working hours," consideration will need to be given to the new scientists' work/life balance. If handled sensitively, though, developments such as lab apps could offer much-increased efficiency and safety, as well as reducing experiment and equipment issues.
Voice-activated lab workflows are also an emerging area, just as voice assistants have become popular in the home and in office digital workflows [58]. For the laboratory environment, the current challenge being addressed is how to enrich the vocabulary of the devices with the specific language of the lab: not only basic lab terms but also domain-specific language, whether that of biology, chemistry, physics, or another scientific discipline. As with IoT, specific pilots could not only help with the assessment of a voice-controlled device or system but also highlight possible integration issues with the rest of the workflow. A lab workflow where the scientist has to use both hands, like a pianist, is a possible use case where voice activation and recording could have benefits. The ability to receive alerts or updates while working on unfamiliar equipment would also help to support better, safer experimentation.
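The vocabulary-enrichment problem is, at its core, one of mapping transcribed speech onto canonical lab terms and commands. The toy sketch below sits downstream of whatever speech-to-text engine is used and illustrates the idea in plain Python; the vocabulary and command set are hypothetical.

```python
import re

# Hypothetical domain vocabulary: spoken synonyms -> canonical instrument names
VOCAB = {
    "spinner": "centrifuge",
    "centrifuge": "centrifuge",
    "plate reader": "plate_reader",
    "reader": "plate_reader",
}

COMMANDS = ("start", "stop", "pause")


def parse_command(transcript: str):
    """Map a transcribed phrase to (command, instrument, settings)."""
    text = transcript.lower()
    command = next((c for c in COMMANDS if c in text), None)
    instrument = next(
        (canon for spoken, canon in VOCAB.items() if spoken in text), None
    )
    rpm = re.search(r"(\d+)\s*rpm", text)
    settings = {"rpm": int(rpm.group(1))} if rpm else {}
    return command, instrument, settings


print(parse_command("start the spinner at 3000 rpm"))
# -> ('start', 'centrifuge', {'rpm': 3000})
```

A production system would of course draw its vocabulary from curated domain ontologies rather than a hand-written dictionary, but the mapping step is the same in spirit.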
As with voice control, the use of AR and virtual reality (VR) in the lab has shown its value in early pilots and in some production systems [59]. AR is typically deployed via smart glasses, of which a wide range is now in production. There are already a number of use cases where AR in the lab shows promise, including supporting a scientist in learning a new instrument or guiding them through an unfamiliar experiment. These examples will only grow in the LotF. To take another, rather mundane example, pipetting is one of the most familiar activities in the lab. In the LotF, where low-throughput manual pipetting is still performed, AR overlays could support the process and reduce errors. AR devices will likely supplement and enhance what a scientist can already do and allow them to focus even more productively.
Another area of lab UX being driven by equivalents in consumer devices is how the scientist physically interacts with devices beyond a simple keyboard and buttons. Technologies such as gesture control and multitouch interfaces will very likely play an increasing role in controlling LotF automation. As with voice activation, these input and control devices will likely evolve to support the whole lab and not just a single instrument. Items such as projected keyboards could also bring big benefits, making the lab even more digitally and technologically mature.
As mentioned before, there is another technology which could play a significant role in enhancing the UX in the LotF: the "digital twin" [60]. In brief, a digital twin is an in silico representation of a person, a process, or a thing. Its role has been evolving in recent years, such that digital twins can now be seen as virtual replicas of physical environments or objects which managers, data scientists, and business users can use to run simulations, prepare decisions, and manage operations [42, 61]. This technology has the potential to impact the LotF in two primary areas: (i) simulation and (ii) remote control.
Starting with simulation: unlike the physical world, which only ever shows a picture of the present, a digital twin can review the past and simulate the future. The digital twin can therefore become an environment in which to pilot not only emerging technologies such as voice activation, AR, VR, and multigesture devices but also novel or redesigned workflows, without the need for full-scale deployment. Indeed, with increased computational capability (provided by exascale computing and ultimately quantum computing – see Section 1.2.5.2), the processes that operate within the LotF will be simulatable to such a degree of sophistication that a person will be able to see, in silico, a high-resolution representation of the technology, experiment, or process they are looking to perform, in a simulation of the lab in which it will run. This digital twin will allow the operator to check, for example, that the novel process is likely to run smoothly and deliver the hoped-for output. While digital twin technology may be more applicable to the protocol-driven lab, it may also have applicability in the research lab as a means of exploring "what-if" scenarios prior to doing the actual physical experiment.
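At its simplest, piloting a redesigned workflow in a twin amounts to running a model of the lab's instruments and schedules and comparing outcomes before touching the real lab. The sketch below gives a deliberately minimal, discrete-event flavour of this; the stages, processing times, and proposed change are all hypothetical.

```python
def simulate_batch(n_plates: int, stage_minutes: dict) -> float:
    """Simulate n_plates flowing through sequential stages, each stage
    processing one plate at a time (a simple pipeline model).
    Returns total elapsed minutes until the last plate finishes."""
    stage_free_at = {stage: 0.0 for stage in stage_minutes}
    finish = 0.0
    for _plate in range(n_plates):
        t = 0.0  # time this plate becomes available
        for stage, duration in stage_minutes.items():
            start = max(t, stage_free_at[stage])  # wait for the stage to free up
            t = start + duration
            stage_free_at[stage] = t
        finish = t
    return finish


# Compare the current layout with a proposed one, entirely in silico
current = {"wash": 5.0, "incubate": 30.0, "read": 10.0}
proposed = {"wash": 5.0, "incubate": 30.0, "read": 4.0}  # faster reader

print(simulate_batch(24, current))   # total minutes for a 24-plate batch
print(simulate_batch(24, proposed))
```

Even this toy model makes the bottleneck visible (here, incubation dominates, so a faster reader barely helps), which is precisely the kind of question a twin answers before capital is spent.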
Turning to digital twins and improved remote control: massively improved computational technology combined with advances in AR and VR will allow operators, who might be located nowhere near the lab in which their experiment is being run, to don appropriate AR/VR headsets and walk into an empty space that will "feel" to them as if they are right inside the lab, or even right inside the experiment itself. The potential for scientists to "walk" into the active site of an enzyme and "manually" dock the molecules they have designed, or for an automation operator to "step into" the reaction vessel running the large-scale manufacture of, say, a chemical intermediate to check that there are no clumps or localized issues (e.g. overheating), will revolutionize how the LotF can operate, making it more likely to be successful and, importantly, safer.
One final, obvious application of digital twin technology is where the LotF is not even on Earth. Running experiments in low or zero gravity can lead to interesting, sometimes unexpected findings [62]. This has led to numerous experiments being performed on the International Space Station [63]. But expecting a trained astronaut to be able to run any experiment or protocol effectively, from organic synthesis to genetic manipulation, is asking a great deal. Digital twin technology could make the LotF in zero gravity a much more compelling proposition [64].
Returning to the area of instrument integration and interoperability, a more practical consideration is how different instruments communicate with each other, and how the data they generate is shared.
Within any lab there is, and always will be, a wide range of instruments from different manufacturers, likely bought over several years to support the business workflows. This "kit diversity" creates a challenge when you want to define a protocol that links two or more instruments which do not use the same control language. SiLA-2 [65] is a communication standard [66] for lab instruments, such as plate readers, liquid handling devices, and other analytical equipment, designed to enable interoperability. As indicated throughout this section, the ability to fully connect devices will enable a more flexible and agile lab environment, making it possible to track, monitor, and remotely control automation assets. This will further enable enhanced robotic process automation (RPA), as well as an easier transition to scale-up and transfer to remote parties. Devices connected for one workflow will be easily repurposable for other tasks, without the need for monolithic communication design and coding, as the sketch below illustrates.
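SiLA-2 itself standardizes instrument "features" (commands and properties) carried over gRPC; the sketch below does not use the real SiLA-2 stack, but illustrates the underlying interoperability idea of a shared command interface wrapping each vendor's proprietary control language. All class names, commands, and wire strings here are hypothetical.

```python
from abc import ABC, abstractmethod


class Instrument(ABC):
    """A common command interface, loosely in the spirit of SiLA-2 features."""

    @abstractmethod
    def execute(self, command: str, **params) -> dict:
        ...


class AcmePlateReader(Instrument):
    def execute(self, command: str, **params) -> dict:
        # Translate the canonical command into vendor A's protocol
        if command == "read_plate":
            return {"status": "ok", "raw": f"ACME:READ:{params['wavelength_nm']}"}
        raise ValueError(f"unsupported command: {command}")


class BetaLiquidHandler(Instrument):
    def execute(self, command: str, **params) -> dict:
        # Translate the canonical command into vendor B's protocol
        if command == "dispense":
            return {"status": "ok", "raw": f"BETA;DISP;{params['volume_ul']}uL"}
        raise ValueError(f"unsupported command: {command}")


def run_workflow(steps, instruments):
    """Chain heterogeneous instruments behind the one shared interface."""
    return [instruments[name].execute(cmd, **params) for name, cmd, params in steps]


instruments = {"reader": AcmePlateReader(), "handler": BetaLiquidHandler()}
steps = [
    ("handler", "dispense", {"volume_ul": 50}),
    ("reader", "read_plate", {"wavelength_nm": 450}),
]
print(run_workflow(steps, instruments))
```

Because the workflow speaks only the shared interface, swapping in a different vendor's reader means writing one new adapter, not rewriting the workflow.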
Data in all its forms will remain the dominant high-value output from lab experiments. As with protocols and communications, there need to be standards to support full data integration and interoperability within and between research communities. Over the years, data standards have evolved to support many aspects of the life science process, whether for the registration of new chemical entities [67], images [68], or macromolecular structures [69], or for describing the experiment data itself. Analytical instrument data (e.g. from nuclear magnetic resonance (NMR) machines, chromatographs, and mass spectrometers) are produced by a myriad of instruments, and the need to analyze and compare data from different machines, and to support data life cycle access in a retrievable format, has driven the creation of the Allotrope Data Format (ADF) [70]. This is a vendor-neutral format, generated initially for liquid chromatography, with plans to expand to other analytical data. Wide community-driven efforts such as those from Allotrope, SLAS, IMI [71], or the Pistoia Alliance [72] highlight the value of research communities coming together in life sciences, as happens in other industries such as finance and telecommunications. Such collaborative efforts will be needed even more in the future.
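ADF is built on top of HDF5; the sketch below uses plain h5py to convey the flavour of storing an analytical trace with self-describing, vendor-neutral metadata. It is emphatically not the ADF schema itself, which additionally involves Allotrope's taxonomies and data models; the file layout and attribute names here are illustrative only.

```python
import h5py
import numpy as np

# A synthetic chromatogram standing in for real instrument output
signal = np.abs(np.random.default_rng(0).normal(size=1_200))

with h5py.File("run_0042.h5", "w") as f:
    dset = f.create_dataset("chromatography/uv_trace", data=signal)
    # Self-describing metadata travels with the data, not in a vendor silo
    dset.attrs["unit"] = "mAU"
    dset.attrs["sampling_rate_hz"] = 10.0
    dset.attrs["instrument_vendor"] = "VendorA"
    dset.attrs["method_ref"] = "eln://methods/LC-007"  # hypothetical link

# Any HDF5-capable tool can now read the trace and its context back
with h5py.File("run_0042.h5", "r") as f:
    trace = f["chromatography/uv_trace"]
    print(trace.attrs["unit"], trace[:10])
```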
In summary, the use of open standards will be critical for the success of the LotF, as the range of devices grows and science drives changes. There will need to be reliable, robust ways for the instruments, workflows, and data to be shared and accessible in order to support flexible, open-access, and cross-disciplinary collaborations, innovation, and knowledge exchange. The automation in the LotF will need to be effective across many different sizes and types of labs – from large, high-throughput labs doing screening or sequencing, to midsize labs with some automation workbenches, to the long tail of labs with a few specialist instruments. In planning for a new lab, creating a holistic vision of the design will be a key first element. That vision will include the future processes that your lab will want to tackle, as well as the potential new technologies to be deployed in the lab, e.g. IoT, AR, or voice control. Additionally, new skills will need to be acquired by those involved to help implement these changes, and an investment in staff and their training remains vital. Furthermore, in the future there will likely be an ecosystem of lab environments, both local and more distributed, to consider; the LotF will be smarter and more efficient, but not simply through the adoption of a single device.