

Concerns with conclusions in the article by Sherwood et al ‘Key differences between 13 KRAS mutation detection technologies and their relevance for clinical practice’
George Karlin-Neumann

Department of Scientific Affairs, Digital Biology Center, Bio-Rad, California, USA

Correspondence to Dr George Karlin-Neumann; george_karlin-neumann{at}



  • Letter to the editor

This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial.


Dear Editor,

We wish to raise some questions and concerns about the design, execution and evaluation of results in Sherwood et al.1

Concerning the study design, it is not clear that the same amount of sample material was analysed by all of the technologies tested. For example, while it appears that most platforms processed 1 μL of each sample in table 2 (ie, 50 or 100 mutant copies per test reaction), it seems that for Oncomine duplicate libraries were made and their results averaged in table 3, potentially enhancing the assessed sensitivity of this platform. Were the data for all other platforms derived from a single experimental run of the sample test plate? For the droplet digital PCR (ddPCR) tests, was 9 μL of each of these undiluted samples used per reaction (as written)? If so, that would imply that 9×50 and 9×100 mutant copies were not detectable by the QX200 platform, whereas it is well documented that as few as a couple of mutant copies can be detected among as many as 100 000 wild-type copies (eg, refs 2–4). The Methods section would benefit from greater clarity on these points.
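To make the scale of this discrepancy concrete, the following purely illustrative calculation (the helper names are ours; the per-microlitre copy numbers are those given above, and the 100 000-copy wild-type background is the figure cited from refs 2–4) sketches the mutant input implied by a 9 μL load and the mutant allele fraction that published work reports as detectable:

```python
# Back-of-the-envelope arithmetic for the sample-input question raised above.
# Copy numbers per microlitre and the 9 uL load are taken from the letter;
# the 100 000 wild-type copy background is the figure cited from refs 2-4.

def mutant_copies_loaded(copies_per_ul: float, volume_ul: float) -> float:
    """Mutant template copies expected in a single reaction."""
    return copies_per_ul * volume_ul

def mutant_allele_fraction(mutant_copies: float, wildtype_copies: float) -> float:
    """Mutant allele fraction (MAF) given mutant and wild-type copy counts."""
    return mutant_copies / (mutant_copies + wildtype_copies)

# If 9 uL of undiluted sample were loaded per reaction, as the Methods state:
for copies_per_ul in (50, 100):
    loaded = mutant_copies_loaded(copies_per_ul, 9)
    print(f"{copies_per_ul} copies/uL x 9 uL = {loaded:.0f} mutant copies/reaction")

# Detection of ~2 mutant copies against 100 000 wild-type copies (refs 2-4)
# corresponds to a MAF of roughly 0.002%:
print(f"MAF = {mutant_allele_fraction(2, 100_000):.5%}")
```

On these assumptions, each reaction would have contained 450 or 900 mutant copies, several hundredfold more than the low single-digit copy inputs reported as detectable in the cited literature, which is why a reported failure to detect them is so surprising.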

Data analysis and the authors’ conclusions raise a number of questions. It is surprising that several of the best-performing platforms (eg, Idylla and Oncomine) appeared to perform ~10× more sensitively than stated by their manufacturers (compare tables 3 and 4), whereas the ddPCR platform, widely recognised to be among the most sensitive and specific (reviewed in ref 5), grossly underperformed, by several orders of magnitude. These discrepancies were not addressed.

For those technologies performing best in this study (ie, Idylla, Oncomine and UltraSEEK), could these outcomes have been influenced by relying on the vendors themselves to run and analyse their own systems? In contrast, there is no discussion of the data quality behind the apparently flawed ddPCR testing, which was run by a party other than the manufacturer. That the QX200 results were reported only qualitatively (table 3) is very surprising, as the inherent benefit of digital PCR is absolute quantitation; this alone should raise a red flag. Were appropriate ‘no template controls’ run to assess possible contamination in this run? Were the recommended positive and negative controls present to permit proper manual thresholding of droplet clusters and data analysis, or was only autothresholding relied on? These data raise clear concerns and need to be reconciled with the extensive peer-reviewed literature demonstrating markedly different results with the QX200 ddPCR technology, including a recent international interlaboratory study across 21 European and North American labs showing reproducible KRAS mutant detection down to a 0.17% mutant allele fraction (MAF).6 Additional analytical validation data for all five KRAS PrimePCR assays used in the Sherwood et al study can be found at the URLs listed in the legend to figure 1. This figure also illustrates the high specificity of the three KRAS G12 assays (C, D and V), in contrast to the authors’ findings.

Figure 1

Specificity of PrimePCR KRAS G12C, G12D and G12V Rare Mutation Detection Assays. The mutant allele is present at ~0.5% in ~40 000 genome copies/well. Additional analytical validation data showing detection limits <0.1% mutant fraction for all five assays reported in Sherwood et al (2017) are available in the ‘Validation Data’ tabs at: G12C c.34G>T:; G12V c.35G>T:; G12D c.35G>A:; G13D c.38G>A:; Q61H c.183A>T:

We look forward to the authors’ clarification of these issues.


George Karlin-Neumann, PhD

Director of Scientific Affairs

The Digital Biology Center, Bio-Rad




  • Provenance and peer review: Not commissioned; internally peer reviewed.
