Scale problems in data fusion applications to civil engineering

crossref(2024)

Abstract
In the last decade many researchers have investigated the opportunity, at first, to integrate and, more recently, to fuse data observed from different sources in order to enrich the available information and to find new correlations for explaining and solving problems. This approach has been successful in civil engineering and in related fields. Data integration makes it possible to identify a problem and, simultaneously, to diagnose the main causes of decay. In general, following a data integration approach, data from different sources (e.g. satellite, photogrammetry, lidar, ground penetrating radar) are evaluated to feed models whose main objective is to explain a specific phenomenon, such as the evolution of damage, the risk assessment of a landslide, or the stability of a bridge. Under this framework the data are considered singularly and autonomously, but within a unique environment. Among models and digital platforms, BIM can help significantly to manage the data.

The data fusion approach goes beyond integration because the data are not only gathered in one environment but are merged with reference to a single-scale digital twin. It is based on the discretization of the spatial and time domains, so that the information from the different sources is assigned to the discretized cells. The main problems to be tackled concern (1) the identification of an adequate dimension for the discretization cells, in terms of both spatial and time scale, and (2) the up- or downscaling of the raw data. The dimension of the discretization cell must be designed considering the scale of the problem under study. For example, the structural risk assessment of a bridge needs a spatial scale on the order of 10^0 m in dx and dy and 10^-3 m in dz, with 10^0 days in the time domain.
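As an illustration of the cell assignment described above, the sketch below bins multi-source (x, y, t) observations into a regular spatio-temporal grid. The cell sizes, the `assign_to_cells` function, and the sample coordinates are hypothetical, chosen only to mirror the bridge-scale cells (10^0 m, 10^0 days) quoted in the abstract; the paper's actual discretization (including the finer dz resolution) is not reproduced here.

```python
import numpy as np

# Hypothetical cell sizes for a bridge-scale problem: dx, dy in metres,
# dt in days, following the 10^0 m / 10^0 days scales quoted above.
DX, DY, DT = 1.0, 1.0, 1.0

def assign_to_cells(points, origin=(0.0, 0.0, 0.0)):
    """Map (x, y, t) observations to integer cell indices.

    points : array of shape (n, 3) with columns x [m], y [m], t [days].
    Returns an (n, 3) integer array of (i, j, k) cell indices.
    """
    pts = np.asarray(points, dtype=float)
    off = np.asarray(origin, dtype=float)
    return np.floor((pts - off) / np.array([DX, DY, DT])).astype(int)

# Samples from two sources (e.g. lidar and In-SAR) landing on the same grid:
lidar = np.array([[0.2, 0.7, 0.5], [1.4, 0.1, 0.5]])
insar = np.array([[0.9, 0.9, 0.4]])
cells_lidar = assign_to_cells(lidar)   # -> [[0, 0, 0], [1, 0, 0]]
cells_insar = assign_to_cells(insar)   # -> [[0, 0, 0]]
```

Once every observation carries a cell index, samples from different sensors that share an index can be fused within that cell.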
If the problem to be investigated is a landslide, the spatial scale can differ: dx and dy can be on the order of 10^1 m, dz on the order of 10^-2 m, and the time interval can be related to months.

The second relevant problem is the standardization of the data onto a uniform space and time scale. Typically, this implies the need to upscale some very accurate data and to downscale coarser data. Upscaling algorithms reduce the information, whereas downscaling produces artificial data by statistically or physically based prediction. Declustering methods have been applied to upscale clouds of data and reduce their number according to the relevant scale. Kriging and block kriging have been applied to downscaling problems in order to generate artificial samples at the relevant scale.

The case of the ancient Roman bridge "Ponte Sisto" has been investigated by fusing lidar, In-SAR and GPR data in a digital discrete model. Kriging has been applied to downscale the In-SAR data, while ARIMA models have been used to upscale the GPR and lidar data.

This research is supported by the project "PIASTRE", accepted and funded by the Lazio Region, Italy (PR FESR Lazio 2021-2027 – "Riposizionamento Competitivo RSI"), CUP: F83D23000470009.
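The declustering-based upscaling mentioned above can be sketched in a few lines: dense samples falling in the same grid cell are collapsed into a single averaged value, so one representative sample survives per cell. This is a generic cell-declustering stand-in, not the authors' implementation; the `decluster` function, the cell size, and the sample data are all hypothetical.

```python
import numpy as np

def decluster(points, values, cell=1.0):
    """Upscale a dense point cloud by averaging values per grid cell.

    points : (n, 2) array of x/y coordinates in metres.
    values : (n,) array of the observed quantity (e.g. lidar elevations).
    cell   : cell edge length; one averaged sample survives per cell.
    Returns a dict mapping (i, j) cell indices to mean values.
    """
    idx = np.floor(np.asarray(points, dtype=float) / cell).astype(int)
    buckets = {}
    for key, v in zip(map(tuple, idx), values):
        buckets.setdefault(key, []).append(v)
    return {c: float(np.mean(vs)) for c, vs in buckets.items()}

pts = np.array([[0.1, 0.1], [0.4, 0.6], [1.2, 0.3]])
vals = np.array([2.0, 4.0, 10.0])
print(decluster(pts, vals))  # {(0, 0): 3.0, (1, 0): 10.0}
```

Downscaling runs the other way: it needs a predictive model (here, kriging under a fitted variogram, or ARIMA in the time domain) to generate artificial samples on the finer grid, which a simple averaging scheme like this cannot provide.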