Development of SSR Markers Based on Transcriptome Sequencing in Marigold (Tagetes erecta)
Acta Horticulturae Sinica (2018)
Abstract
Transcriptome sequencing of flower buds of marigold (Tagetes erecta L.) yielded 48,953 unigenes. Using the MISA software, 20,666 SSR loci were detected in 13,849 of these unigenes, corresponding to an occurrence frequency of 28.29% and an average distribution distance of 2.51 kb. Tri- and tetranucleotide repeats were the dominant motif types, accounting for 50.16% and 20.94% of all SSR loci, respectively. ATG/ATG and AAAC/GTTT were the dominant tri- and tetranucleotide repeat motifs, accounting for 13.82% and 3.66% of all SSR repeat types. Thirty-six pairs of SSR primers targeting different dominant repeat motif types were randomly selected and synthesized, and their amplification efficiency and polymorphism were validated using genomic DNA from 20 marigold inbred lines as templates. Thirty primer pairs amplified clear and stable target bands, giving an effective amplification rate of 80.56%; 13 of these primer pairs were polymorphic, with mean He and PIC values of 0.275 and 0.608, respectively. These results indicate that the unigene data generated by marigold transcriptome sequencing are an effective source for developing SSR markers, and that the large set of SSR markers obtained provides a reliable choice of markers for genetic diversity analysis and genetic map construction in marigold.
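For context, below is a minimal Python sketch of how the marker statistics reported in the abstract (He, PIC, and SSR occurrence frequency) are conventionally computed. The formulas are the standard expected-heterozygosity and polymorphism information content definitions (Nei 1973; Botstein et al. 1980); the per-locus allele frequencies used in the example are hypothetical and are not taken from the paper.

```python
# Minimal sketch (not from the paper): conventional computation of the
# marker statistics reported in the abstract.

def expected_heterozygosity(freqs):
    """He = 1 - sum(p_i^2) over allele frequencies p_i at one locus (Nei 1973)."""
    return 1.0 - sum(p * p for p in freqs)

def pic(freqs):
    """Polymorphism information content (Botstein et al. 1980):
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2
    """
    s1 = sum(p * p for p in freqs)
    s2 = sum(2 * freqs[i] ** 2 * freqs[j] ** 2
             for i in range(len(freqs))
             for j in range(i + 1, len(freqs)))
    return 1.0 - s1 - s2

# Hypothetical allele frequencies for one SSR locus scored on 20 inbred lines.
p = [0.45, 0.30, 0.15, 0.10]
print(f"He  = {expected_heterozygosity(p):.3f}")
print(f"PIC = {pic(p):.3f}")

# SSR occurrence frequency as reported in the abstract:
# SSR-containing unigenes / total unigenes = 13,849 / 48,953 ≈ 28.29 %.
# (The 2.51 kb average distribution distance would additionally require the
# total assembled unigene length, which the abstract does not state.)
print(f"frequency = {13_849 / 48_953:.2%}")
```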
Key words
Tagetes erecta L., SSR, transcriptome