
Multiplex PCR-ASE Functionalized Microfluidic Diagnostic Platform for the Detection of Clarithromycin Resistance Mutations in Helicobacter pylori

Sensors and Actuators B: Chemical (2023)

Abstract
With the widespread use of clarithromycin to treat Helicobacter pylori infection, the number of resistant strains has increased greatly and has become a major obstacle to successful treatment. Here, a novel multiplex PCR-allele-specific extension (PCR-ASE) functionalized microfluidic diagnostic platform, offering low cost, ease of use, and high sensitivity, is presented for the first time. A biphasic-amplification multiplex PCR-ASE assay detects different mutations at positions 2142/2143 of the 23S rRNA gene in a single tube, and the results can be read with the naked eye using a nucleic acid detection strip (NADS). The multiplex platform showed good performance, with an analytical sensitivity of approximately 50 copies/reaction and the ability to detect hetero-resistance at proportions as low as 0.5%; furthermore, it showed no cross-reactivity with 10,000 copies of the wild-type template. The performance of the platform in detecting clarithromycin resistance in 25 tissue biopsies was compared with Sanger sequencing and found to be in complete concordance. Owing to its efficiency and simplicity, the multiplex PCR-ASE functionalized detection platform is well suited for clinical application, detecting clarithromycin resistance within 2 h on a conventional thermal cycler.
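As a rough illustration (not part of the paper), the reported 0.5% hetero-resistance limit can be related to the ~50 copies/reaction analytical sensitivity with a simple copy-number calculation. The minimal Python sketch below assumes a total template input of 10,000 copies/reaction for the hetero-resistance mixture; the abstract reports that figure only for the wild-type cross-reactivity test, so it is used here purely for illustration.

# Illustrative sketch (not from the paper): relates the reported 0.5% hetero-resistance
# limit to the ~50 copies/reaction analytical sensitivity.

ANALYTICAL_SENSITIVITY_COPIES = 50  # reported limit of detection, copies/reaction


def mutant_copies(total_copies: int, mutant_fraction: float) -> float:
    """Number of mutant-allele copies in a mixed (hetero-resistant) template."""
    return total_copies * mutant_fraction


def detectable(total_copies: int, mutant_fraction: float) -> bool:
    """True if the mutant copy number meets the reported analytical sensitivity."""
    return mutant_copies(total_copies, mutant_fraction) >= ANALYTICAL_SENSITIVITY_COPIES


if __name__ == "__main__":
    # Assumed 10,000-copy reaction at the reported 0.5% hetero-resistance limit:
    # 10,000 x 0.005 = 50 mutant copies, which coincides with the 50-copy sensitivity.
    print(mutant_copies(10_000, 0.005))   # 50.0
    print(detectable(10_000, 0.005))      # True

Under this assumed input, the 0.5% hetero-resistance limit corresponds to roughly 50 mutant copies per reaction, consistent with the stated analytical sensitivity.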
Key words
Clarithromycin resistance, Helicobacter pylori, Nucleic acid detection strip, Multiplex PCR-allele-specific extension