
The Role of Model Architecture and Scale in Predicting Molecular Properties: Insights from Fine-Tuning RoBERTa, BART, and LLaMA

Youngmin Lee, Andrew S. I. D. Lang, Duoduo Cai, R. Stephen Wheat

CoRR (2024)

Abstract
This study introduces a systematic framework to compare the efficacy of Large Language Models (LLMs) for fine-tuning across various cheminformatics tasks. Employing a uniform training methodology, we assessed three well-known models, RoBERTa, BART, and LLaMA, on their ability to predict molecular properties using the Simplified Molecular Input Line Entry System (SMILES) as a universal molecular representation format. Our comparative analysis involved pre-training 18 configurations of these models, with varying parameter sizes and dataset scales, followed by fine-tuning them on six benchmarking tasks from DeepChem. We maintained consistent training environments across models to ensure reliable comparisons. This approach allowed us to assess the influence of model type, size, and training dataset size on model performance. Specifically, we found that LLaMA-based models generally offered the lowest validation loss, suggesting their superior adaptability across tasks and scales. However, we observed that absolute validation loss is not a definitive indicator of model performance, contradicting previous research, at least for fine-tuning tasks: instead, model size plays a crucial role. Through rigorous replication and validation, involving multiple training and fine-tuning cycles, our study not only delineates the strengths and limitations of each model type but also provides a robust methodology for selecting the most suitable LLM for specific cheminformatics applications. This research underscores the importance of considering model architecture and dataset characteristics in deploying AI for molecular property prediction, paving the way for more informed and effective utilization of AI in drug discovery and related fields.
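
The paper itself does not include code; the sketch below is only a rough illustration of the fine-tuning setup the abstract describes: a transformer encoder consuming SMILES strings and regressing a molecular property from a DeepChem benchmark. The ESOL (Delaney) solubility task stands in for one of the six unnamed benchmark tasks, and the public ChemBERTa checkpoint stands in for the authors' own pre-trained models; both are assumptions, not details from the paper.

```python
import deepchem as dc
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the ESOL (Delaney) solubility task from DeepChem's MoleculeNet;
# with the "Raw" featurizer, each dataset's .ids field holds SMILES strings.
tasks, (train, valid, test), _ = dc.molnet.load_delaney(featurizer="Raw")

# A public SMILES-pretrained RoBERTa checkpoint, used here as a stand-in
# for the models the authors pre-trained themselves.
checkpoint = "seyonec/ChemBERTa-zinc-base-v1"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=1, problem_type="regression"
)

def make_loader(ds, batch_size=32):
    """Tokenize a DeepChem dataset's SMILES and pair them with its labels."""
    enc = tokenizer(list(ds.ids), padding=True, truncation=True,
                    max_length=128, return_tensors="pt")
    labels = torch.tensor(ds.y, dtype=torch.float32)
    return DataLoader(
        TensorDataset(enc["input_ids"], enc["attention_mask"], labels),
        batch_size=batch_size, shuffle=True,
    )

loader = make_loader(train)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):
    for input_ids, attention_mask, labels in loader:
        out = model(input_ids=input_ids, attention_mask=attention_mask,
                    labels=labels)  # regression head, so .loss is MSE
        out.loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

The same loop applies to BART- or LLaMA-style checkpoints by swapping the checkpoint name and Auto classes, although models at LLaMA scale would typically require parameter-efficient fine-tuning rather than full-parameter updates as shown here.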