Beta-Amyloid and Its Asp7 Isoform: Morphological and Aggregation Properties and Effects of Intracerebroventricular Administration
Brain Sciences (2024)
Abstract
Background/Objectives: One of the hallmarks of Alzheimer’s disease (AD) is the accumulation of aggregated beta-amyloid (Aβ) protein in the form of senile plaques within brain tissue. Senile plaques contain various post-translational modifications of Aβ, including the prevalent isomerization of the Asp7 residue. The Asp7 isomer has been shown to exhibit increased neurotoxicity and to induce amyloidogenesis in the brain tissue of transgenic mice. The toxicity of Aβ peptides may be partly mediated by their structure and morphology. Accordingly, in this study we analyzed the structural and aggregation characteristics of the Asp7 isoform of Aβ42 and compared them to those of synthetic Aβ42. We also investigated the effects of intracerebroventricular (i.c.v.) administration of these peptides, a method often used to induce AD-like symptoms in rodent models. Methods: Atomic force microscopy (AFM) was used to compare the morphological and aggregation properties of Aβ42 and Asp7 iso-Aβ42. The effects of i.c.v. stereotaxic administration of the peptides were assessed via behavioral analysis and in vivo estimation of reactive oxygen species (ROS) using a scanning ion-conductance microscope with a confocal module. Results: AFM measurements revealed structural differences between the two peptides, most notably in their soluble toxic oligomeric forms. The i.c.v. administration of Asp7 iso-Aβ42 induced spatial memory deficits in rats and elevated oxidative stress levels in vivo, suggesting a potential role of ROS in the pathogenic mechanism of the peptide. Conclusions: These findings support further investigation of Asp7 iso-Aβ42 in translational research on AD and suggest its involvement in neurodegenerative processes.
Keywords
Alzheimer’s disease, amyloid, AFM, ROS, rat model, intracerebroventricular injection