Efficient Desorption and Reuse of Collector from the Flotation Concentrate: A Case Study of Scheelite
International Journal of Minerals, Metallurgy and Materials (2024)
Central South University
Abstract
Flotation is the most common method of obtaining concentrate: collectors adsorb selectively onto target minerals, rendering them hydrophobic and floatable. In the hydrometallurgy of the concentrate, however, collectors remaining on its surface can damage ion-exchange resins and increase the chemical oxygen demand (COD) of the wastewater. In this work, we propose a new scheme: desorbing the collectors from the concentrate at the ore dressing plant and reusing them in the flotation flowsheet. The lead nitrate and benzohydroxamic acid (Pb-BHA) complex is a common collector in scheelite flotation. In this study, different physical (stirring or ultrasonic waves) and chemical (strongly acidic or alkaline environments) methods for promoting the desorption of the Pb-BHA collector from scheelite concentrate were explored. Single-mineral desorption tests showed that at pulp pH 13 with 15 min of ultrasonic treatment, the desorption rates of Pb and BHA from the scheelite concentrate reached their maxima of 90.48% and 63.75%, respectively. Run-of-mine ore flotation tests revealed that reusing the desorbed Pb and BHA reduced the collector dosage by 30% for BHA and 25% for Pb. The strongly alkaline environment broke the chemical bonds between Pb and BHA, and the cavitation effect of the ultrasonic waves effectively weakened the interaction between the Pb-BHA collector and scheelite surfaces. This combination of ultrasonic treatment and a strongly alkaline environment can effectively desorb collectors from concentrate, providing "clean" scheelite concentrate for metallurgical plants, while reuse of the desorbed collector in the flotation flowsheet reduces reagent costs for ore dressing plants.
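The percentages above can be reproduced from two simple relations: desorption rate as the fraction of initially adsorbed collector that is recovered, and the fresh-reagent dosage after crediting the recycled collector. The sketch below illustrates this arithmetic; the formulas and all input quantities (the 100 mg adsorbed basis and the 600 g/t baseline dosage) are illustrative assumptions, not values taken from the paper's experimental data.

```python
# Illustrative sketch only: the desorption-rate formula
# (desorbed / adsorbed x 100%) and the input amounts are assumptions
# for demonstration, not experimental values from the paper.

def desorption_rate(desorbed_mg: float, adsorbed_mg: float) -> float:
    """Percentage of adsorbed collector removed from the concentrate surface."""
    return desorbed_mg / adsorbed_mg * 100.0

def dosage_with_reuse(fresh_dosage_g_per_t: float, reduction_pct: float) -> float:
    """Fresh-reagent dosage after crediting the recycled (desorbed) collector."""
    return fresh_dosage_g_per_t * (1.0 - reduction_pct / 100.0)

# Recovering 63.75 mg of a hypothetical 100 mg of adsorbed BHA corresponds
# to the 63.75% BHA desorption rate reported in the abstract.
print(round(desorption_rate(63.75, 100.0), 2))
# A 30% BHA dosage reduction applied to a hypothetical 600 g/t baseline:
print(round(dosage_with_reuse(600.0, 30.0), 2))
```

The same two relations apply to Pb (90.48% desorption, 25% dosage reduction); only the input numbers change.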
Keywords
scheelite concentrate, collector, desorption, reuse, flotation