Structure of the LarB-Substrate Complex and Identification of a Reaction Intermediate During Nickel-Pincer Nucleotide Cofactor Biosynthesis

Biochemistry (2023)

Michigan State University

Abstract
LarB catalyzes the first step in biosynthesis of the nickel-pincer nucleotide cofactor by converting nicotinic acid adenine dinucleotide (NaAD) to AMP and pyridinium-3,5-biscarboxylic acid mononucleotide (P2CMN). Prior studies had shown that LarB uses CO2 for substrate carboxylation and reported the structure of a Lactiplantibacillus plantarum LarB–NAD(+) complex, revealing a covalent linkage between Cys221 and C4 of the pyridine ring. This interaction was proposed to promote C5 carboxylation, with C5-carboxylated NaAD suggested to activate magnesium-bound water, leading to phosphoanhydride hydrolysis. Here, we extended the analysis of wild-type LarB by using ultraviolet-visible spectroscopy to obtain additional evidence for cysteinyl side chain attachment to the ring of NAD(+), demonstrating that this linkage is not a crystallization artifact. Using the S127A variant of L. plantarum LarB, a form of the enzyme with a reduced rate of NaAD hydrolysis, we examined its interaction with the authentic substrate. The intermediate arising from C5 carboxylation of NaAD, dinicotinic acid adenine dinucleotide (DaAD), was identified by mass spectrometry. S127A LarB exhibited spectroscopic evidence of a Cys221–NAD(+) adduct, but a covalent enzyme–NaAD linkage was not detectable. We determined the structure of the S127A LarB–NaAD complex, providing new insights into the enzyme mechanism, and tentatively identified the position and mode of CO2 binding. The crystal structure revealed the location of the previously disordered Glu180 side chain but showed that it is not well positioned to abstract the C5 proton of the adduct species, which would restore aromaticity as Cys221 is expelled. Based on these combined results, we propose a revised catalytic mechanism for LarB.