
Development of Loop-Mediated Isothermal Amplification Assay for the Rapid Detection of Pyrenophora graminea in Barley Seeds

Agronomy-Basel (2023)

Qinghai University

Abstract
Barley leaf stripe, caused by Pyrenophora graminea, is an important systemic seed-borne disease of barley worldwide. Barley is a major cereal crop on the Qinghai–Tibet Plateau, where production is threatened by leaf stripe, particularly in organic farming areas. Detecting the pathogen in infected seeds is therefore crucial for managing the disease. In this study, a loop-mediated isothermal amplification (LAMP) assay was developed to detect the pathogen, using primers designed from the sequence of the pig14 gene (GenBank: AJ277800) of P. graminea. The optimal concentrations of MgSO4, dNTPs, and enzyme in the LAMP reaction were established as 10.0 mM, 1.0 mM, and 8 U, respectively, in a 25 μL reaction volume. The assay performed optimally at 63 °C for 70 min with high reliability, and the minimum detection limit was 1 × 10⁻² ng·μL⁻¹ in the 25 μL reaction system. The specificity of the LAMP assay for P. graminea was validated against eight fungal species. DNA extracted from P. graminea-infected barley seeds prepared by incubation, intact-seed, and smashed-seed treatments was tested by LAMP, and the pathogen was detected in all cases. The LAMP assay developed here could facilitate on-site detection of P. graminea in barley seeds, provide information for seed health certification, and support decisions on seed treatment in leaf stripe management.
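As a rough aid to the reaction arithmetic reported above, the sketch below works out stock volumes and the template mass implied by the detection limit. The stock concentrations and the reading of the detection limit as a final in-reaction concentration are assumptions for illustration only; the abstract states only the final amounts (10.0 mM MgSO4, 1.0 mM dNTPs, 8 U enzyme, 25 μL volume) and the 1 × 10⁻² ng·μL⁻¹ limit.

```python
# Sketch of the reagent arithmetic for the 25 uL LAMP reaction described above.
# Stock concentrations below are hypothetical; only the final amounts are from the abstract.

REACTION_VOLUME_UL = 25.0

def stock_volume_ul(final_conc_mm: float, stock_conc_mm: float,
                    reaction_volume_ul: float = REACTION_VOLUME_UL) -> float:
    """Volume of stock to add, from C1 * V1 = C2 * V2."""
    return final_conc_mm * reaction_volume_ul / stock_conc_mm

# Assumed stocks: 100 mM MgSO4, 10 mM dNTPs, 8 U/uL polymerase.
mgso4_ul = stock_volume_ul(final_conc_mm=10.0, stock_conc_mm=100.0)  # 2.5 uL
dntps_ul = stock_volume_ul(final_conc_mm=1.0, stock_conc_mm=10.0)    # 2.5 uL
enzyme_ul = 8.0 / 8.0                                                # 1.0 uL for 8 U

# If the 1e-2 ng/uL detection limit is read as a final in-reaction concentration,
# it corresponds to this much template DNA per 25 uL reaction.
template_ng_at_limit = 1e-2 * REACTION_VOLUME_UL                     # 0.25 ng

print(f"MgSO4: {mgso4_ul} uL, dNTPs: {dntps_ul} uL, enzyme: {enzyme_ul} uL")
print(f"Template at detection limit: {template_ng_at_limit} ng per reaction")
```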
Key words
Pyrenophora graminea, loop-mediated isothermal amplification, rapid detection technology of disease, optimal condition