
Enhanced Robustness of a Bridge-Type RF-MEMS Switch for Enabling Applications in 5G and 6G Communications

Sensors (2022)

Universitat Politècnica de Catalunya (UPC)

Abstract
In this paper, new suspended-membrane, double-ohmic-contact RF-MEMS switch configurations are proposed. Double-diagonal (DDG) beam suspensions, with either two or three anchoring points, are designed and optimized to minimize membrane deformation due to residual fabrication stresses; they exhibit smaller mechanical deformation and higher stiffness, with a larger release force, than the previously designed single-diagonal beam suspensions. The two-anchor DDGs are designed in two different orientations, in-line and 90°-rotated. The membrane may include a window to minimize coupling to the lower electrode. The devices are integrated in a coplanar-waveguide transmission structure and fabricated using an eight-mask surface-micromachining process on high-resistivity silicon, with dielectric-free actuation electrodes and glass protective caps. The RF-MEMS switch behavior is assessed from measurements of the device S-parameters in the ON and OFF states. The fabricated devices feature a measured pull-in voltage of 76.5 V/60 V for the windowed/non-windowed two-anchor DDG membranes and 54 V/49.5 V for the windowed/non-windowed three-anchor DDG membranes, in good agreement with 3D mechanical simulations. The measured ON-state insertion loss is better than 0.7 dB/0.8 dB and the OFF-state isolation is better than 40 dB/31 dB up to 20 GHz for the in-line/90°-rotated devices, also in good agreement with 2.5D electromagnetic simulations.
Key words
RF-MEMS switch, beam suspension, coplanar waveguide
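As context for the figures quoted in the abstract, the sketch below computes the textbook parallel-plate pull-in voltage, V_pi = sqrt(8·k·g0³ / (27·ε0·A)), and converts an |S21| magnitude to dB, the negative of which gives the insertion loss (ON state) or isolation (OFF state). This is a minimal illustration of the metrics, not the paper's method: the spring constant, gap, and electrode area used here are hypothetical placeholders, since the actual device geometry is not given in the abstract.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity (F/m)

def pull_in_voltage(k, g0, area):
    """Textbook parallel-plate pull-in voltage (V) for an electrostatic
    actuator with spring constant k (N/m), initial gap g0 (m), and
    electrode area (m^2): V_pi = sqrt(8*k*g0^3 / (27*EPS0*area))."""
    return math.sqrt(8.0 * k * g0**3 / (27.0 * EPS0 * area))

def s21_to_db(s21_mag):
    """|S21| in dB; its negative is the insertion loss (ON state)
    or the isolation (OFF state)."""
    return 20.0 * math.log10(s21_mag)

if __name__ == "__main__":
    # Hypothetical values, NOT from the paper: a stiff bridge
    # (k = 60 N/m), a 2 um gap, and a 100 um x 100 um electrode.
    k, g0, area = 60.0, 2.0e-6, 100e-6 * 100e-6
    print(f"Estimated pull-in voltage: {pull_in_voltage(k, g0, area):.1f} V")
    # An OFF-state |S21| of 0.01 corresponds to 40 dB isolation.
    print(f"|S21| = 0.01 -> {-s21_to_db(0.01):.0f} dB isolation")
```

With these placeholder values the estimate comes out around 40 V, the same order of magnitude as the 49.5–76.5 V pull-in voltages reported for the fabricated devices; the stiffer DDG suspensions described in the abstract trade a higher actuation voltage for a larger release force and better robustness against residual-stress deformation.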