Process Optimization and CFD Simulation in External Loop Airlift Reactor and Sectionalized External Loop Airlift for Application of Wastewater Treatment
International Journal of Chemical Reactor Engineering (2022)
Inst Chem Technol
Abstract
Industrial wastewater treatment is carried out in an external loop airlift reactor (EL-ALR) and a sectionalized EL-ALR. The airflow rate is optimized in both reactors for better degradation of the wastewater. Degradation of 74% is obtained in the sectionalized EL-ALR, compared to the EL-ALR, for continuous aeration up to 32 h. At higher superficial gas velocity (U_G), the percentage degradation decreases owing to shear stress on the microorganisms. Degradation extents of 77 and 80% are obtained with hydrogen peroxide (H2O2) alone and with a combination of H2O2 and Fenton's reagent in the EL-ALR, respectively. Computational fluid dynamics (CFD) simulations are validated against experimental gas hold-up and liquid circulation velocity in the EL-ALR and the sectionalized EL-ALR over a broad range of superficial gas velocity, 0.0024 ≤ U_G ≤ 0.016 m/s. The CFD and experimental values of gas hold-up and liquid circulation velocity are in good agreement.
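The superficial gas velocity U_G used throughout the abstract is, by standard definition, the volumetric gas flow rate divided by the riser cross-sectional area. A minimal sketch of this relation (the riser diameter and flow rate below are hypothetical illustrative values, not taken from the paper):

```python
import math

def superficial_gas_velocity(q_gas_m3_s: float, riser_diameter_m: float) -> float:
    """U_G = volumetric gas flow rate / riser cross-sectional area (m/s)."""
    area = math.pi * riser_diameter_m ** 2 / 4.0
    return q_gas_m3_s / area

# Hypothetical example: 0.1 m riser diameter, 2e-5 m^3/s air flow
ug = superficial_gas_velocity(2e-5, 0.1)
```

With these assumed values, U_G falls inside the 0.0024–0.016 m/s range studied in the paper.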
Key words
biological treatment, chemical oxygen demand (COD), computational fluid dynamics (CFD), optimization approach, oxidizing agent, small scale industrial waste