Abstract A24: Comprehensive PI3K Pathway Inhibition Through Combination of the PI3Kβ/δ Inhibitor AZD8186 and the mTORC1/2 Inhibitor AZD2014 Drives Tumor Regression In Vivo

Molecular Cancer Therapeutics (2015)

AstraZeneca

Abstract
AZD8186 inhibits the PI3K isoforms PI3Kβ and PI3Kδ. In solid tumors in which the tumor suppressor PTEN is deleted, mutated, or downregulated, PI3Kβ becomes a key driver of tumor cell growth. In hematological tumors such as DLBCL, PTEN is downregulated; moreover, PI3Kδ is important in signaling from the B-cell receptor, creating potential for targeted treatment of hematological malignancies. AZD8186 has single-agent activity in a range of preclinical models representative of different tumor types. However, the efficacy of agents targeting the PI3K pathway is anticipated to be limited by compensatory PI3K isoform activation or relief of feedback loops. As a result, it is likely that maximal benefit will be seen when combining different agents that target the PI3K pathway. Unlike other inhibitors of the PI3K pathway, AZD8186 does not affect peripheral glucose levels. However, AZD8186 specifically modulates tumor FDG uptake in the PTEN-null tumor model 786-0, and this is associated with tumor-specific modulation of PI3K pathway biomarkers. Therefore, in PTEN-null tumors, AZD8186 can be combined with other PI3K pathway inhibitors to give increased pathway suppression without increasing normal tissue toxicity. To test this, AZD8186 was combined with the mTORC1/2 inhibitor AZD2014 in a number of different PTEN-null tumors. The combination increased the reduction in tumor growth and even induced tumor regression, which is associated with increased depth or duration of pathway suppression. Interestingly, tumor regressions can be achieved with intermittent dosing of both AZD8186 and AZD2014, implying that acute complete pathway suppression is sufficient to drive the anti-tumor effect. These data establish the potential for AZD8186 to be used in combination with an mTOR inhibitor, with the ability to customize dose and schedule to optimize both tolerability and anti-tumor effect.
Further exploration of the combination opportunities for AZD8186 with other molecularly targeted agents would inform the potential for inhibitors of PI3Kβ and δ to provide benefit in different tumor types. Citation Format: Urs Hancox, Urszula Polanska, Lyndsay Hanson, Rebecca Ellston, Julia Maynard, Manfred Kraus, Jon Curwen, Teresa Klinowska, Lara Ward, Francisco Cruzalegui, Stephen Green, Stefan Symeonides, Kathryn Cronin, Simon Barry. Comprehensive PI3K pathway inhibition through combination of the PI3Kβ/δ inhibitor AZD8186 and the mTORC1/2 inhibitor AZD2014 drives tumor regression in vivo. [abstract]. In: Proceedings of the AACR Special Conference: Targeting the PI3K-mTOR Network in Cancer; Sep 14-17, 2014; Philadelphia, PA. Philadelphia (PA): AACR; Mol Cancer Ther 2015;14(7 Suppl):Abstract nr A24.