TSH Receptor and IGF1 Receptor Expression in Circulating Fibrocytes in the Pathogenesis of Graves’ Orbitopathy

crossref(2024)

Institute of Post Graduate Medical Education and Research and SSKM Hospital

Abstract
Purpose
Graves' Orbitopathy (GO), an autoimmune disorder linked to Graves' Disease (GD), manifests as inflammation of the orbital tissues and extraocular muscles (EOMs), driven by key receptors such as TSHR and IGF1R. Only a subset of individuals with GD develop clinically significant orbitopathy, and the reason for this remains unclear. This study aimed to elucidate this connection by: i) assessing IGF1R expression and its correlation with TSHR on circulating fibrocytes; ii) investigating fibrocyte conversion to fibroblasts upon serum treatment; and iii) analysing cytokine and chemokine expression in fibrocytes after serum exposure, in an Indian population.

Methods and Results
Flow cytometry analysis of IGF1R in peripheral blood from 30 GO patients, 30 GD patients, and 20 healthy controls (HC) revealed significantly elevated IGF1R+ fibrocytes in GO (11%) versus GD (2.4%) and HC (0.1%). Immunocytochemistry of cultured fibrocytes confirmed colocalization of TSHR and IGF1R, which was notably higher in GO. Treating HC-derived fibrocytes with GO patient serum triggered fibroblast transformation, marked by increased fibrotic markers (CD90, alpha-SMA). Moreover, sandwich ELISA demonstrated elevated levels of cytokines and chemokines, including IL-6, IL-8, TNF-α, MCP-1, and HA, in GO serum-treated HC fibrocytes.

Conclusion
These results highlight the potential pathogenicity of TSHR and IGF1R on fibrocytes in GO, suggesting a role in orbital tissue remodelling and inflammation. The observed receptor colocalization may drive GO pathogenesis, providing insights into targeted therapeutic strategies for this debilitating condition.