Mapping N- to C-terminal Allosteric Coupling Through Disruption of a Putative CD74 Activation Site in D-dopachrome Tautomerase.
Journal of Biological Chemistry (2023)
Brown University
Abstract
The macrophage migration inhibitory factor (MIF) protein family consists of MIF and D-dopachrome tautomerase (also known as MIF-2). These homologs share 34% sequence identity while maintaining nearly indistinguishable tertiary and quaternary structure, which is likely a major contributor to their overlapping functions, including the binding and activation of the cluster of differentiation 74 (CD74) receptor to mediate inflammation. Previously, we investigated a novel allosteric site, Tyr99, that modulated N-terminal catalytic activity in MIF through a "pathway" of dynamically coupled residues. In a comparative study, we revealed an analogous allosteric pathway in MIF-2 despite its unique primary sequence. Disruptions of the MIF and MIF-2 N termini also diminished CD74 activation at the C terminus, though the receptor activation site is not fully defined in MIF-2. In this study, we use site-directed in vitro and in vivo biochemistry to explore the putative CD74 activation region of MIF-2 based on homology to MIF. We also confirm its reciprocal structural coupling to the MIF-2 allosteric site and N-terminal enzymatic site. Thus, we provide further insight into the CD74 activation site of MIF-2 and its allosteric coupling for immunoregulation.
Key words
NMR, allostery, cytokine, protein dynamics, receptor activation
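The 34% sequence identity quoted in the abstract is a standard pairwise-alignment statistic. As a minimal sketch (not part of the paper, and assuming Biopython >= 1.80 is installed), the Python snippet below computes percent identity over the aligned columns of a global alignment; the two sequence fragments are hypothetical placeholders, not the real MIF/MIF-2 (DDT) sequences, which would be retrieved from a database such as UniProt.

```python
# A minimal sketch, assuming Biopython (>=1.80); the sequences below are
# hypothetical placeholders, NOT the actual MIF/MIF-2 sequences.
from Bio.Align import PairwiseAligner, substitution_matrices

def percent_identity(seq_a: str, seq_b: str) -> float:
    """Percent identity over aligned (non-gap) columns of a global alignment."""
    aligner = PairwiseAligner()
    aligner.mode = "global"
    aligner.substitution_matrix = substitution_matrices.load("BLOSUM62")
    aligner.open_gap_score = -10.0   # typical protein gap penalties
    aligner.extend_gap_score = -0.5
    best = aligner.align(seq_a, seq_b)[0]  # highest-scoring alignment
    matches = aligned = 0
    # best.aligned holds the matched (start, end) blocks in each sequence
    for (a0, a1), (b0, b1) in zip(*best.aligned):
        for i, j in zip(range(a0, a1), range(b0, b1)):
            aligned += 1
            matches += seq_a[i] == seq_b[j]
    return 100.0 * matches / aligned

# Placeholder fragments for illustration only.
print(f"{percent_identity('MKTAYIAKQRQISFVKSHFSRQ', 'MKSAYLAKERQLSFIKAHWSKQ'):.1f}%")
```

Note that reported identity values depend on the alignment parameters and on whether gapped columns are counted in the denominator, so small discrepancies from published figures are expected.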