7 Secret Tricks to Optimize Emulsion PCR in the NGS Workflow

Next-generation sequencing (NGS) has revolutionized our understanding of life science, with more and more laboratories set to jump on the NGS bandwagon in the race to unlock genetic puzzles. The continuously increasing volume of samples means that manual processing is rapidly becoming a thing of the past. Can lab automation help scientists to gain and maintain a competitive advantage?

Library prep automation: What if you don’t have it? 

The inherent complexity of the NGS sample preparation workflow and the challenges it presents should not be underestimated. On top of this, there is the aspect of human error. No matter how careful we are, we all make mistakes. This may only come to light when aberrant results are generated, which means more work and higher costs.
The clonal amplification step of the NGS workflow is pivotal to success. Why? After library prep, it generates multiple copies of each template for sequencing, amplifying the detectable signal of a single target DNA sequence. It comprises emulsion making, emulsion PCR, emulsion breaking, and bead enrichment. Suboptimal clonal amplification can result in:

  • Bias from AT- and GC-rich regions
  • Polyclonal beads with multiple templates
  • Beads with no template

As with everything, clonal amplification can be performed in a number of ways. It can be carried out in a flow cell using bridge amplification, which Illumina uses in their patterned flow cell technology. Other companies use emulsion PCR, which is also at the core of digital PCR.

Harness the power of emulsion PCR

Why should we be excited about emulsion PCR? This is a powerful technique that has gained traction in many application areas, such as reduction of amplification bias, clonal amplification for NGS, enhanced multiplexing, digital PCR, and haplotyping. But, without the necessary know-how, it can be tricky to deploy. So what do you need to take into account?

1. Having the right enzyme

Firstly, having the right enzyme for the job is critical, as this greatly impacts the number of DNA copies amplified. Enzyme properties, such as thermal stability and proofreading activity, affect both the length and the quality of the DNA fragments produced. Surprisingly, a 2015 enzyme comparison found that only 6 of the 13 enzymes tested gave good results in all of the downstream high-throughput DNA sequencing tests performed, meaning you have to navigate a marketing minefield to find the best product:

  • AmpliTaq Gold® (Applied Biosystems)
  • Biotaq® (Bioline)
  • FastStart® High Fidelity PCR System (Roche)
  • HotStarTaq® DNA Polymerase (QIAGEN)
  • Phusion® High Fidelity DNA Polymerase (Finnzymes)
  • Taq DNA Polymerase (Roche)

2. Select the right beads

For efficient DNA binding, it is vital that the beads do not clump. The amount of DNA bound per bead also determines the read length, so the bead concentration must be kept within the recommended range for optimal results.

3. Optimal number of monoclonal beads is around 30%

Emulsion making is a stochastic process, which means beating the statistics is a bit hit or miss. Template loading per compartment follows a Poisson distribution, so even at the ideal mean occupancy of one template per bead, only about 37% of beads carry exactly one template; in practice, the achievable fraction of monoclonal beads is therefore around 30%.
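
To see where this figure comes from, here is a minimal sketch of the Poisson statistics. The mean occupancies scanned below are illustrative assumptions, not values from a specific protocol:

```python
# Why ~30% monoclonal beads is near the practical optimum: if template
# loading per compartment is Poisson-distributed, the monoclonal fraction
# P(k = 1) = lam * exp(-lam) peaks at ~36.8% when the mean occupancy
# lam = 1, and real-world losses push the achievable fraction lower.
import math

def poisson_pmf(k: int, lam: float) -> float:
    """Probability of exactly k templates in a compartment at mean occupancy lam."""
    return (lam ** k) * math.exp(-lam) / math.factorial(k)

for lam in (0.5, 1.0, 1.5):
    empty = poisson_pmf(0, lam)   # beads with no template
    mono = poisson_pmf(1, lam)    # monoclonal beads (exactly one template)
    poly = 1.0 - empty - mono     # polyclonal beads (two or more templates)
    print(f"lam={lam:.1f}: empty={empty:.1%}, "
          f"monoclonal={mono:.1%}, polyclonal={poly:.1%}")
```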

4. Perfect combination of enzyme and beads is no guarantee of success

However, the perfect combination of enzyme and beads is no guarantee of success. PCR can be a difficult process to get right. Prerequisites are a highly stable water-in-oil emulsion, optimized reaction components, and a small reaction volume. As if that were not enough, the droplet size must be uniform to avoid negative impacts on the length of DNA that can be amplified. “Shake and bake” methods tend to generate non-uniform droplet sizes, which has a knock-on effect on the results.

5. Optimize cycling program

Having found the ideal reaction conditions, the next step involves optimizing the cycling program. Suboptimal conditions will give suboptimal results, so it is worth investing some time to get this right. For example, it is very important to choose an appropriate annealing temperature to optimize stringency. Too low a temperature can result in unwanted cross-reactions and/or primer-dimer formation and can also slow the reaction kinetics. Conversely, too high a temperature can destabilize annealing of the primer to the desired target site. Likewise, the extension temperature (if different from the annealing temperature) plays an important role: it must be optimal for the enzyme used, both to accelerate the reaction kinetics and to help prevent formation of secondary structure in the target that could block complete extension.

Annealing and extension times are equally important to ensure successful primer annealing and full-length extension of the amplicon region. For emulsion PCR, we found that somewhat longer extension times help to ensure full-length amplicon formation and good overall reaction efficiency, which in turn increases the yield of the desired product.
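
As a starting point for the annealing temperature, a common rule of thumb is to anneal a few degrees below the lower melting temperature (Tm) of the primer pair. The sketch below uses the basic GC-content Tm formula for primers longer than 13 nt; the primer sequences are hypothetical placeholders:

```python
# A rough starting-point calculation (a rule of thumb, not a validated method):
# estimate each primer's Tm with the basic GC-content formula
#   Tm = 64.9 + 41 * (GC - 16.4) / N   (for primers > 13 nt)
# and set the annealing temperature ~5 degC below the lower of the two Tms.

def primer_tm(seq: str) -> float:
    """Rough Tm estimate (degC) for a primer longer than 13 nt."""
    seq = seq.upper()
    gc = seq.count("G") + seq.count("C")
    return 64.9 + 41.0 * (gc - 16.4) / len(seq)

def annealing_temp(fwd: str, rev: str, offset: float = 5.0) -> float:
    """Starting point: offset degC below the lower of the two primer Tms."""
    return min(primer_tm(fwd), primer_tm(rev)) - offset

fwd = "ACGTGGTCTGACGCTCAGTG"   # hypothetical forward primer
rev = "TGCAGGACCTTGGCTAGCAA"   # hypothetical reverse primer
print(f"Suggested starting annealing temperature: {annealing_temp(fwd, rev):.1f} degC")
```

A gradient PCR around the calculated value is the usual way to confirm the optimum experimentally.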

6. Emulsions for PCR

Emulsions for PCR comprise an oil phase, a surfactant, and an aqueous phase whose droplets function as micro-reactors. A plethora of emulsification methods has been developed, and it can be difficult to navigate their intricacies to find a reliable and reproducible solution.

7. Emulsion breaking

Emulsion breaking is usually carried out by centrifugation, but can also be performed by applying a vacuum. It is a delicate process: bead loss remains an issue, and if bead aggregates form, they are likely to be lost in the wash steps.

Library prep automation: How do you get it?

Workflow automation in life science saves time, reduces costs, increases productivity, and helps to eliminate human error – the goal of almost all labs. But what automated technologies are available for creating emulsions? These currently include microfluidics, dripping techniques (often with ultrasonic actuators), and dynamic pipetting.

Dynamic pipetting (fast up-and-down pipetting) has the advantage of being easy to automate, and it is typically faster and yields more droplets than, for example, microfluidic techniques. It is cost-effective, robust, and reproducible once the right set of parameters has been determined. One disadvantage is that the droplet size distribution produced by dynamic pipetting is somewhat broader than that produced by the microfluidic method, although it is still within a very usable range. The most important parameter influencing the characteristics of the emulsion is the pressure built up in the tip, which controls the liquid speed and therefore the droplet size. This pressure depends on the plunger speed of the syringe and the orifice size of the tip. Since the pressure required to drive a given flow scales with the fourth power of the orifice radius, the orifice size is highly critical, and high-precision tips with minimal tolerance are a prerequisite. When properly adjusted, the automated method is highly reproducible – much more so than manual methods.
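
The fourth-power sensitivity follows from the Hagen–Poiseuille law. Here is a minimal sketch, assuming the tip orifice behaves like a short cylindrical channel; the flow rate, viscosity, and dimensions are illustrative assumptions:

```python
# Hagen-Poiseuille: dP = 8 * mu * L * Q / (pi * r^4), so the pressure needed
# to drive a fixed flow scales with 1/r^4 -- a 10% tolerance on the orifice
# radius shifts the pressure (and hence droplet size) by roughly 30-50%.
import math

def pressure_drop(flow_m3_s: float, radius_m: float,
                  length_m: float, viscosity_pa_s: float) -> float:
    """Pressure drop (Pa) across a cylindrical channel under laminar flow."""
    return 8.0 * viscosity_pa_s * length_m * flow_m3_s / (math.pi * radius_m ** 4)

# Illustrative values (assumptions, not measured): 10 uL/s through a 1 mm
# long orifice, water-like viscosity, nominal orifice radius 100 um.
q = 10e-9     # flow rate, m^3/s (10 uL/s)
mu = 1.0e-3   # viscosity, Pa*s (water at ~20 degC)
L = 1.0e-3    # channel length, m

for r in (90e-6, 100e-6, 110e-6):   # +/-10% around the nominal radius
    print(f"r = {r*1e6:.0f} um -> dP = {pressure_drop(q, r, L, mu):.0f} Pa")
```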

Develop a cutting-edge system

When developing an automated system, a high level of systems engineering expertise and the ability to cover all bases are essential. The system must be reliable, and it is vital to prevent problems such as bead loss; knowing how to keep the beads homogeneously resuspended assures the success of downstream steps. Many factors need to be considered during system development. A state-of-the-art pipetting system is crucial for advanced liquid handling, particularly when dealing with very small volumes and sensitive enzymatic reactions. Among other advantages, automated liquid handling streamlines the process, reduces variability, and helps to eliminate errors. Probably the most critical factor is the uniformity and number of emulsion droplets. In our experience, the instrument best suited for precise droplet-size measurement at reasonable throughput is the MasterSizer from Malvern.

Summary

The complex, multifaceted nature of workflow automation can only be mastered by working in a well-coordinated, interdisciplinary team.

  • Successful workflow automation relies on an integrated systems engineering approach involving fields such as chemistry, biology, engineering, physics, statistics, and many more.
  • The first step to solving a problem is to break it down into bite-sized pieces. This means defining Key Performance Indicators (KPIs) and appropriate measurement methods.
  • Apply Design of Experiments (DOE) to optimize the process parameters – see the sketch after this list.
  • Partner with a global systems engineering provider like HSE•AG to optimize costs and quality, minimize development risks, and gain a long-term partner for system development.
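
As an illustration of the DOE bullet above, here is a minimal sketch of a two-level full factorial design for emulsion PCR process parameters. The factor names and levels are hypothetical placeholders; in practice, each run would be executed, the KPIs measured (e.g., droplet size distribution, monoclonal bead fraction), and a model fitted to locate the optimum:

```python
# A two-level full factorial design: every combination of high/low levels
# for each factor, 2^4 = 16 runs in total. Factors are hypothetical examples.
from itertools import product

factors = {
    "plunger_speed":  ("low", "high"),   # drives the pressure in the tip
    "bead_conc":      ("low", "high"),   # beads per reaction
    "surfactant_pct": ("low", "high"),   # emulsion stability
    "extension_time": ("low", "high"),   # full-length amplicon formation
}

for run_id, levels in enumerate(product(*factors.values()), start=1):
    settings = dict(zip(factors, levels))
    print(f"Run {run_id:2d}: {settings}")
```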

Would you like to discuss the complexity of the NGS sample preparation workflow and get valued advice on how to optimize it?


Konstantin Lutze

Touch Base with us

Successful automation of life science and diagnostics workflows is a highly complex undertaking. With our key technology and application knowledge, as well as our extensive experience, we will help you to shorten your time-to-market and grow your business.