
Nearshore Submerged Wave Farm Optimisation: A Multi-Objective Approach

Applied Ocean Research (2022)

University of Western Australia

Abstract
To be commercially viable, wave energy converters (WECs) will need to be deployed in arrays or "wave farms" to generate significant amounts of energy, and the costs of these farms will need to be minimised. However, when designing a wave farm, a number of trade-offs must be made between competing objectives; for example, between power production potential and installation costs, with the optimal design for one objective not necessarily favourable for the other. In this study, we developed a multi-objective optimisation methodology that allows rigorous evaluation of the trade-offs amongst multiple objectives. We demonstrate the methodology for four objectives: (1) maximising power production, (2) minimising the foundation loads, (3) minimising the number of foundations and (4) minimising the total export cable length required. However, the method is flexible and can be used to optimise a range of other parameters. A case study applying the developed probability-based evolutionary strategy to the multi-objective optimisation of a wave farm was conducted for a proposed development site in Albany, Western Australia. The wave farms were composed of 5, 10 and 20 fully submerged cylindrical point-absorber-type WECs similar to Carnegie Clean Energy's CETO-6 device. Simulations show that the optimal layouts favouring maximum power formed a single line perpendicular to the predominant wave direction; the optimal layouts favouring minimum cable length and a minimum number of foundations formed multiple lines; and the optimal layouts favouring minimum foundation loads formed multiple lines aligned with the predominant wave direction. By applying a cost model and non-dominated sorting, the methodology allowed us to quantify the trade-offs between power production and cost.
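The abstract refers to ranking candidate wave-farm layouts by non-dominated sorting over the four objectives. The sketch below illustrates that idea only; it is not the authors' code, and the `Layout` fields, helper names and numerical values are assumptions for demonstration. A layout is kept on the Pareto front if no other layout is at least as good in every objective and strictly better in at least one.

```python
# Minimal sketch (not the paper's implementation) of Pareto non-dominated
# sorting over the four objectives named in the abstract. Field names and
# example values are illustrative assumptions.
from dataclasses import dataclass
from typing import List


@dataclass
class Layout:
    power_mw: float            # mean power production, to be maximised
    foundation_load_kn: float  # peak foundation load, to be minimised
    n_foundations: int         # number of foundations, to be minimised
    cable_length_m: float      # total export cable length, to be minimised


def dominates(a: Layout, b: Layout) -> bool:
    """True if a is at least as good as b in every objective and strictly
    better in at least one (power maximised, the other three minimised)."""
    at_least_as_good = (a.power_mw >= b.power_mw
                        and a.foundation_load_kn <= b.foundation_load_kn
                        and a.n_foundations <= b.n_foundations
                        and a.cable_length_m <= b.cable_length_m)
    strictly_better = (a.power_mw > b.power_mw
                       or a.foundation_load_kn < b.foundation_load_kn
                       or a.n_foundations < b.n_foundations
                       or a.cable_length_m < b.cable_length_m)
    return at_least_as_good and strictly_better


def pareto_front(layouts: List[Layout]) -> List[Layout]:
    """Return the non-dominated (first Pareto front) layouts."""
    return [x for x in layouts
            if not any(dominates(y, x) for y in layouts if y is not x)]


if __name__ == "__main__":
    candidates = [
        Layout(12.4, 850.0, 20, 5200.0),  # higher power, higher cost
        Layout(11.8, 700.0, 10, 3900.0),  # cheaper, slightly less power
        Layout(11.9, 900.0, 20, 6100.0),  # dominated by the first layout
    ]
    for layout in pareto_front(candidates):
        print(layout)
```

In the paper's framework, a sort of this kind would sit inside the evolutionary loop to rank candidate layouts before the cost model is applied; the values above are placeholders, not results from the study.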
Key words
Wave farms, Wave energy converters, Multi-objective optimisation, Wave power, LCoE, Loads