Non-coding RNA and Gene Expression Analyses of Papillary Renal Neoplasm with Reversed Polarity (PRNRP) Reveal Distinct Pathological Mechanisms from Other Renal Neoplasms.
Pathology (2024)
Biogipuzkoa Health Research Institute
Abstract
Papillary renal neoplasm with reversed polarity (PRNRP) is a recently described rare renal neoplasm. Traditionally, it was considered a variant of papillary renal cell carcinoma (PRCC); however, several studies have reported significant differences between PRNRP and PRCC in clinical, morphological, immunohistochemical and molecular features. Nonetheless, PRNRP remains a poorly understood entity. We used microarray analysis to elucidate the non-coding RNA (ncRNA) and gene expression profiles of 10 PRNRP cases and compared them with those of other renal neoplasms. Unsupervised cluster analysis showed that PRNRP cases had expression profiles distinct from those of clear cell renal cell carcinoma (ccRCC) and PRCC cases at the ncRNA level, but less distinct at the gene expression level. An integrated omics approach identified miRNA:gene interactions that distinguished PRNRP from PRCC, and we validated 10 differentially expressed miRNAs and six genes by quantitative RT-PCR. We found that levels of the miRNAs miR-148a, miR-375 and miR-429 were up-regulated in PRNRP cases compared with ccRCC and PRCC. miRNA target genes, including the KRAS and VEGFA oncogenes and CXCL8, which regulates VEGFA, were also differentially expressed between renal neoplasms. Gene set enrichment analysis (GSEA) revealed differential activation of metabolic pathways between PRNRP and PRCC cases. Overall, this is by far the largest molecular study of PRNRP cases and the first to investigate either their ncRNA or gene expression by microarray assays.
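The unsupervised cluster analysis mentioned above can be illustrated with a minimal sketch. This is not the authors' pipeline: the sample groups, expression values, miRNA count, and distance/linkage choices below are all synthetic assumptions, used only to show how hierarchical clustering separates samples with distinct expression profiles.

```python
# Hedged sketch: unsupervised hierarchical clustering of a synthetic
# miRNA expression matrix. Two simulated groups ("PRNRP-like" and
# "PRCC-like") are given opposite expression patterns so that
# correlation-based clustering separates them. All values are synthetic.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

# Rows = samples, columns = 20 hypothetical miRNAs.
# PRNRP-like samples: first 10 miRNAs up-regulated; PRCC-like: the reverse.
prnrp_pattern = np.concatenate([np.full(10, 5.0), np.full(10, 2.0)])
prcc_pattern = np.concatenate([np.full(10, 2.0), np.full(10, 5.0)])
prnrp_like = prnrp_pattern + rng.normal(0.0, 0.3, size=(5, 20))
prcc_like = prcc_pattern + rng.normal(0.0, 0.3, size=(5, 20))
expr = np.vstack([prnrp_like, prcc_like])

# Correlation distance with average linkage, a common choice for
# expression profiles; then cut the tree into two clusters.
dist = pdist(expr, metric="correlation")
tree = linkage(dist, method="average")
labels = fcluster(tree, t=2, criterion="maxclust")

print(labels)  # first five samples share one label, last five the other
```

In this toy setup the two groups differ in expression *pattern*, which is what correlation distance detects; groups differing only in overall magnitude would need a metric such as Euclidean distance instead.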
Key words
Renal cancer, miRNA, transcriptome, pathway analysis, papillary renal neoplasm with reversed polarity