While their use in the lab began as “glorified plate handlers,” robots are proving more valuable and taking on more responsibilities.
Sam Michael says the evolution of laboratory robots looks a bit backward. Rather than starting out small and growing in size and complexity, lab robot systems began large and have since slimmed down and sleeked up.
Sam Michael says that while large monolithic robots are still being used in the lab, smaller, sleeker robots are also becoming popular. Photo: NCATS
“We had these big monolithic systems,” says Michael, the director of automation and compound management at the National Center for Advancing Translational Sciences (NCATS), Rockville, Md. “Now you have these much smaller robots. There has been this huge push to have little robotic arms, which have a special direct drive or magnetic drive. You don’t need all of the safety features now because the arm, if you bump it, will stop itself,” he told Laboratory Equipment.
NCATS is a highly automated national user facility that runs high-throughput screening of compounds for research organizations that don’t have that level of automation or much of a compound library. Using NCATS, Yale University researchers recently found a drug originally designed for cancer that can restore memory and reverse cognitive problems in mice with Alzheimer’s-type symptoms.
No matter what size or shape, Michael says, robots are becoming more useful than the “glorified plate transportation” systems for which they were first intended.
From high-throughput screening to chemical and biochemical synthesis, robots and other forms of automation have found a home in the modern laboratory. Their roles have expanded and they are being advanced for even greater responsibilities, freeing researchers from manual labor and letting them focus on more strategic tasks.
Ross King, a professor of machine intelligence at the University of Manchester’s School of Computer Science and the Manchester Institute of Biotechnology in the U.K., calls robot scientists “a natural extension of the trend of increased involvement of automation in science.” King has been a driving force behind the development of Eve and Adam, two laboratory robotic systems.
“They can automatically develop and test hypotheses to explain observations, run experiments using laboratory robotics, interpret the results to amend their hypotheses, and then repeat the cycle, automating high-throughput, hypothesis-led research,” he told Laboratory Equipment. “Robot scientists are also well suited to recording scientific knowledge. As the experiments are conceived and executed automatically by computer, it is possible to completely capture and digitally curate all aspects of the scientific process.”
“Chemical synthesis is ripe for automation,” King adds. “It should be possible to press a button and have a robotics system make any compound a human synthetic chemist can make.”
Synthesis for the masses
Martin Burke, MD, a chemistry professor at the University of Illinois, Urbana-Champaign, is keenly focused on automating chemical synthesis, a laborious task that has traditionally required Ph.D.-level artisans working tirelessly in the lab to concoct new small molecules. His new device, simply referred to as “the Machine,” uses a building-block approach and tries to replicate what is found in nature.
The Burke research group (Martin Burke, center) has been working to break down complex molecules into smaller building blocks that can be easily assembled. Photo: University of Illinois
“We wanted to take a very complex process and make it simple. The way things have evolved in the field, small molecule synthesis is a highly individualized customized process,” Burke told Laboratory Equipment. “We are advocating for a more generalized automated approach to making small molecules.”
The Machine can assemble complex small molecules at the click of a mouse, like a 3-D printer at the molecular level. The automated process has the potential to greatly speed up drug development and enable other technologies that rely on small molecules.
The Burke group’s strategy has been to break down the complex molecules into smaller building blocks that can be easily assembled. The chemical building blocks all have the same connector piece and can be stitched together with one simple reaction, the way interconnecting plastic blocks can have different shapes but all snap together.
To automate the building block assembly, Burke’s group devised a simple catch-and-release method that adds one building block at a time, rinsing the excess away before adding the next one. They demonstrated that their machine could build 14 different classes of small molecules, including ones with difficult-to-manufacture ring structures, all using the same automated building block assembly.
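The cycle described above can be sketched in outline. This is a hypothetical illustration, not Burke’s actual control software; the function names and the representation of blocks as strings are assumptions for clarity.

```python
# Hypothetical sketch of iterative building-block assembly: each cycle
# couples one block to the growing chain, then a catch-and-release
# purification rinses away excess reagent before the next block is added.

def couple(chain, block):
    """Stand-in for the coupling reaction joining one block to the chain."""
    return chain + [block]

def catch_and_release(chain):
    """Stand-in for purification: the product is caught on a solid support,
    excess reagents are rinsed away, and the product is released unchanged."""
    return chain

def assemble(blocks):
    chain = []
    for block in blocks:
        chain = couple(chain, block)      # add exactly one building block
        chain = catch_and_release(chain)  # rinse before the next cycle
    return chain

product = assemble(["A", "B", "C"])
```

Because every block shares the same connector chemistry, the same two-step cycle repeats unchanged regardless of which block is added, which is what makes the process automatable.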
The technology has been licensed to REVOLUTION Medicines, Inc., a company that Burke co-founded. The company is initially focusing on anti-fungal medications.
“It is expected that the technology will similarly create new opportunities in other therapeutic areas as well, as the industrialization of the technology will help refine and broaden its scope and scalability,” says Burke, who has been working on automating synthesis for the past 10 years.
“If we can hand over the power of making molecules to everyone, from biologists to synthesists to some kid working in his garage, it is incredibly exciting to think about what could be accomplished at the molecular scale. We could create functional devices that could literally change the world.”
At Manchester University, King developed Eve with Steve Oliver, professor of biotechnology at the University of Cambridge. Eve is designed to automate early-stage drug design.
"The Machine” uses a building block approach that tries to replicate what is found in nature. Photo: University of Illinois
“Eve exploits its artificial intelligence to learn from early successes in her screens and select compounds that have a high probability of being active against the chosen drug target,” Oliver says. “A smart screening system, based on genetically engineered yeast, is used. This allows Eve to exclude compounds that are toxic to cells and select those that block the action of the parasite protein while leaving any equivalent human protein unscathed. This reduces the costs, uncertainty and time involved in drug screening.”
First, Eve systematically tests each member from a large set of compounds in conventional mass screening. The compounds are screened against assays that are designed to be automatically engineered and can be generated much faster and more cheaply than current assays. This enables more types of assays to be applied and more efficient use of screening facilities, thereby increasing the probability of a discovery within a given budget.
Eve’s robotic system is capable of screening more than 10,000 compounds per day. King and Oliver recently demonstrated its power by aiding in the identification of promising new drug candidates for malaria and neglected tropical diseases, such as African sleeping sickness and Chagas’ disease.
While simple to automate, mass screening remains relatively slow and wasteful as every compound in the library is tested. It is also unintelligent, as it makes no use of what is learned during screening.
To improve the process, Eve selects at random a subset of the library to find compounds that pass the first assay—any “hits” are re-tested multiple times to reduce the probability of false positives. Taking this set of confirmed hits, Eve uses statistics and machine learning to predict new structures that might score better against the assays. Although it currently does not have the ability to synthesize such compounds, future versions of the robot could potentially incorporate this feature, King says.
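The select–test–learn loop described above can be sketched as follows. The library, assay, and ranking model here are toy stand-ins, not Eve’s actual system; in practice the model would be trained on chemical structure descriptors rather than a single number.

```python
import random

# Hypothetical compound library: each compound has one numeric "feature"
# standing in for a molecular descriptor.
random.seed(0)
library = [{"id": i, "feature": random.random()} for i in range(1000)]

def assay(compound):
    """Toy assay: compounds with feature > 0.9 are 'active' (illustrative)."""
    return compound["feature"] > 0.9

# 1. Screen a random subset of the library, not the whole thing.
subset = random.sample(library, 100)
first_pass_hits = [c for c in subset if assay(c)]

# 2. Re-test hits multiple times to weed out false positives
# (our toy assay is deterministic, so all hits are confirmed).
confirmed = [c for c in first_pass_hits if all(assay(c) for _ in range(3))]

# 3. "Learn" from confirmed hits: a trivial model ranks untested compounds
# by closeness to the mean feature of the confirmed hits, and the top-ranked
# candidates are queued for the next screening cycle.
if confirmed:
    center = sum(c["feature"] for c in confirmed) / len(confirmed)
    tested_ids = {c["id"] for c in subset}
    untested = [c for c in library if c["id"] not in tested_ids]
    next_batch = sorted(untested, key=lambda c: abs(c["feature"] - center))[:50]
```

The point of the loop is economy: each cycle spends assay time only on compounds the current model considers promising, rather than exhaustively screening the library.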
Similarly, at NCATS, Michael describes his group’s “wells to cells” technique, which helps cut down on time and data.
“If you assume with high-throughput screening that the hit rate you want is about half of a percent, that means 99.5% of your data are irrelevant,” he says.
“A technique we implement is wells to cells, where we do sort of a first pass of that experiment with a lower content reader,” Michael explains. “For example, if it’s a GFP (green fluorescent protein) read and we know that if our target is going to express GFP it’s a hit, we will do an early pass on a lower content reader and then only read those specific wells by automating the analysis to hit some threshold value.”
“Instead of reading the entire 1,536 well plate, we will read only 50 wells that might be a combination of controls plus hits. Instead of generating a gigabyte of data, [researchers] can cut it down to megabytes because you are reading only wells of interest.”
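The two-pass reading scheme Michael describes can be illustrated with a small sketch. The plate layout, threshold, and reader function below are hypothetical stand-ins, not NCATS code.

```python
# Hypothetical "wells to cells" sketch: a fast low-content read of every
# well, then an expensive high-content read of only the wells that cross
# a GFP-intensity threshold, plus the control wells.

PLATE_WELLS = 1536
GFP_THRESHOLD = 0.8              # assumed hit cutoff on the quick read
CONTROL_WELLS = set(range(32))   # assume the first 32 wells are controls

def low_content_read(well):
    """Stand-in for a fast, coarse GFP intensity measurement."""
    # Toy signal: a handful of wells express GFP strongly.
    return 0.95 if well % 100 == 7 else 0.1

# First pass: score every well cheaply.
scores = {w: low_content_read(w) for w in range(PLATE_WELLS)}

# Second pass: only threshold-crossing wells and controls get the
# expensive high-content read.
wells_of_interest = CONTROL_WELLS | {
    w for w, s in scores.items() if s >= GFP_THRESHOLD
}
```

With hit rates around half a percent, the detailed read covers tens of wells instead of 1,536, which is where the gigabytes-to-megabytes reduction Michael mentions comes from.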
Regardless of those techniques, King says the prime challenge to lab robotics is adding intelligence to the system.
“The deepest problem is adding flexibility of intelligence so that the system can change its representation of knowledge,” he says.
The laboratory of Ross King at the University of Manchester has been designing robots to automate early-stage drug design.
“A lot of people are trying to implement machine learning,” adds Michael, explaining that today’s systems remain open loop with humans playing the integral role. “You have automated systems and the informatics to analyze the data, and now there is a huge push to see if you can automate the component of analyzing that data and feeding it back into another experiment.”
So will robots take over the lab? Possibly, but not in the foreseeable future, as the intuition in drug discovery still resides in humans. King says the current strength remains humans working with robots.
“Computers are now much better than humans at chess, but humans and computers working together are still better than computers alone,” he adds.