
Study on the Mechanism of "Treating Different Diseases with Same Method" of Banxia Xiexin Decoction (半夏泻心汤) in Treating Chronic Atrophic Gastritis and Insomnia Based on Network Pharmacology

Journal of Liaoning University of Traditional Chinese Medicine (2021)

Wangjing Hospital, China Academy of Chinese Medical Sciences

Abstract
Objective: To explore, using network pharmacology, the common mechanism by which Banxia Xiexin Decoction treats both chronic atrophic gastritis and insomnia ("treating different diseases with the same method").

Methods: The main active components of Banxia Xiexin Decoction were screened through the Traditional Chinese Medicine Systems Pharmacology database (TCMSP), and their potential targets were predicted. Disease targets related to chronic atrophic gastritis and insomnia were collected from the GeneCards, OMIM, TTD, DrugBank, and DisGeNET databases. The drug targets were intersected with the disease targets to obtain the shared targets, which were imported into Cytoscape to construct a "Banxia Xiexin Decoction component-shared target" network. The STRING database was used to build the protein-protein interaction network of the shared targets, which was then imported into Cytoscape for topological analysis and visualization. Gene Ontology (GO) enrichment analysis and Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway enrichment analysis of the shared targets were performed with the Metascape database.

Results: Screening yielded 138 main active components of Banxia Xiexin Decoction relevant to treating chronic atrophic gastritis and insomnia, acting on 58 shared targets. GO enrichment analysis covered 4508 biological processes, 407 molecular functions, and 347 cellular components. KEGG pathway enrichment analysis identified 187 signaling pathways, mainly involving the advanced glycation end products-receptor for advanced glycation end products (AGE-RAGE) signaling pathway, the T-cell receptor signaling pathway, T helper 17 (Th17) cell differentiation, the hypoxia-inducible factor-1 (HIF-1) signaling pathway, the vascular endothelial growth factor (VEGF) signaling pathway, the nuclear factor-κB (NF-κB) signaling pathway, and pathways in cancer. These findings suggest that Banxia Xiexin Decoction may exert a common therapeutic effect on chronic atrophic gastritis and insomnia by regulating inflammatory responses, immune function, oxidative stress, and angiogenesis.

Conclusion: The "treating different diseases with the same method" effect of Banxia Xiexin Decoction on chronic atrophic gastritis and insomnia involves multiple components, targets, and pathways, providing a reference for subsequent experimental research and clinical application.
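The core of the Methods is a set intersection: targets predicted for the decoction's components are intersected with the target sets of each disease to obtain the shared targets. A minimal sketch of that step, using hypothetical gene symbols purely for illustration (the study's actual target lists come from TCMSP and the GeneCards/OMIM/TTD/DrugBank/DisGeNET databases):

```python
# Hypothetical gene symbols for illustration only; not the study's data.
drug_targets = {"TNF", "IL6", "AKT1", "VEGFA", "PTGS2", "CASP3"}
gastritis_targets = {"TNF", "IL6", "AKT1", "VEGFA", "EGFR"}
insomnia_targets = {"TNF", "IL6", "AKT1", "HTR2A", "VEGFA"}

# Shared targets: drug targets present in BOTH disease target sets --
# the basis of the "treating different diseases with the same method" analysis.
shared = drug_targets & gastritis_targets & insomnia_targets
print(sorted(shared))  # → ['AKT1', 'IL6', 'TNF', 'VEGFA']
```

In the study, the resulting shared-target list (58 genes) was then exported to Cytoscape for network construction and to STRING/Metascape for interaction and enrichment analysis.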