
Application of the National Virtual Simulation Experiment Teaching Course Sharing Platform in the Experimental Teaching Practice of the "Biotechnological Pharmaceutics" Course

ZHANG Xiaokai, LUO Ping, CHENG Ping, ZOU Quanming, ZHAO Zhuo

China Pharmaceuticals (2023)

Army Medical University of the Chinese People's Liberation Army

Abstract
Objective: To improve the experimental teaching effect of the "Biotechnological Pharmaceutics" course. Methods: Four-year undergraduate pharmacy students of the class of 2020 at the Army Medical University of the Chinese People's Liberation Army were taken as the teaching subjects and randomly divided into a blended online-offline group (observation group) and an offline-only group (control group). Taking classic biotechnological pharmaceutics experiments as examples, the control group was taught in the traditional way, while in the observation group the teachers used the corresponding virtual experiment courses of Zhejiang Chinese Medical University on the National Virtual Simulation Experiment Teaching Course Sharing Platform to supplement traditional offline teaching with virtual simulation experiment teaching. The learning outcomes of the two groups were compared. Results: All students in the observation group reported that the online learning approach reduced their learning difficulty, increased their interest, and strengthened their confidence in mastering experimental skills. For each experiment, the observation group could complete the teaching process on the day of instruction, which was shorter than the experiment time required by the control group. The average experimental course score of the observation group was (93.78 ± 1.41) points, significantly higher than that of the control group at (75.94 ± 0.00) points (P < 0.05). Conclusion: As an auxiliary means for classroom teaching of the "Biotechnological Pharmaceutics" course, virtual simulation experiment teaching can effectively solve the problems faced by traditional experimental courses, such as limited class hours, limited hands-on opportunities, and unsatisfactory teaching effects, and supports the cultivation of practical operating skills in biotechnological pharmaceutics professionals.
Key words
"Biotechnological Pharmaceutics",experiment teaching,virtual simulation,online teaching