
Sampled-Data Consensus for Multiagent Systems over Semi-Markov Switching Networks under Denial-of-Service Attacks

IEEE Trans Syst Man Cybern Syst (2025)

Department of Mathematics

Abstract
This article investigates the almost sure consensus (ASC) problem for sampled-data multiagent systems (MASs) operating over semi-Markov switching networks (SMSNs) and subject to different types of denial-of-service (DoS) attacks. During real-time information exchange, communication links between agents fail randomly, so each possible network topology occurs with a certain probability and its sojourn time is stochastic. This necessitates a more general switching signal to describe the stochastic switching of the network. To this end, a semi-Markov chain is introduced to characterize the switching signal of the stochastic interaction networks; its sojourn-time distribution can be an arbitrary continuous-time distribution and depends on both the current and the next state. In addition, this article examines the impact of two distinct types of DoS attacks on MASs. The first type consists of random DoS attacks, which are likewise modeled by a semi-Markov chain to capture the stochastic nature of attack durations. The second type consists of deterministic DoS attacks, characterized by their frequency and duration. A new stochastic analysis method based on the law of large numbers is proposed to analyze ASC for MASs with SMSNs under DoS attacks. The effectiveness of the proposed approach is demonstrated through two illustrative numerical examples.
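To make the switching model concrete, here is a minimal sketch (not from the paper; all transition probabilities and rates are hypothetical) of a semi-Markov switching signal over three candidate topologies: the next topology is drawn from an embedded Markov chain, and the sojourn time is sampled from a distribution that depends on both the current and the next state, as the abstract describes.

```python
import random

# Embedded chain: P[i][j] = Pr(next topology = j | current = i).
# Diagonal is zero so every switch changes the topology.
P = [[0.0, 0.6, 0.4],
     [0.5, 0.0, 0.5],
     [0.7, 0.3, 0.0]]

# RATE[i][j]: rate of the (here, exponential) sojourn-time distribution
# used when switching from topology i to topology j. A semi-Markov chain
# permits any continuous-time distribution here; exponential is just the
# simplest choice for illustration.
RATE = [[1.0, 2.0, 0.5],
        [1.5, 1.0, 2.5],
        [0.8, 1.2, 1.0]]

def semi_markov_signal(t_end, state=0, seed=0):
    """Return a list of (switch_time, topology_index) pairs covering [0, t_end]."""
    rng = random.Random(seed)
    t, path = 0.0, [(0.0, state)]
    while t < t_end:
        nxt = rng.choices(range(3), weights=P[state])[0]
        t += rng.expovariate(RATE[state][nxt])  # sojourn depends on (state, nxt)
        path.append((t, nxt))
        state = nxt
    return path

path = semi_markov_signal(10.0)
```

In a consensus simulation, `path` would drive which Laplacian (topology) governs the agents' dynamics on each interval; a deterministic DoS attack would additionally mask communication during attack intervals constrained by frequency and duration.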
Key words
Almost sure consensus (ASC), denial-of-service (DoS) attacks, sampled-data control, semi-Markov switching networks (SMSNs)