Search Results (2,820)

Search Parameters:
Keywords = outliers

22 pages, 23594 KiB  
Article
Research on a DBSCAN-IForest Optimisation-Based Anomaly Detection Algorithm for Underwater Terrain Data
by Mingyang Li, Maolin Su, Baosen Zhang, Yusu Yue, Jingwen Wang and Yu Deng
Water 2025, 17(5), 626; https://doi.org/10.3390/w17050626 - 21 Feb 2025
Abstract
The accurate acquisition of underwater topographic data is crucial for the representation of river morphology and early warning of water hazards. Owing to the complexity of the underwater environment, there are inevitably outliers in monitoring data, which objectively reduce the accuracy of the data; therefore, anomalous data detection and processing are key in effectively using data. To address anomaly detection in underwater terrain data, this paper presents an optimised DBSCAN-IForest algorithm model, which adopts a distributed computation strategy. First, the K-distance graph and Kd-tree methods are combined to determine the key computational parameters of the DBSCAN algorithm, and the DBSCAN algorithm is applied to perform preliminary cluster screening of underwater terrain data. The isolated forest algorithm is subsequently used to carry out refined secondary detection of outliers in multiple subclusters that were initially screened. Finally, the algorithm performance is verified through example calculations using a dataset of about 8500 underwater topographic points collected from the Yellow River Basin, which includes both elevation and spatial distribution attributes; the results show that compared with other methods, the algorithm has greater efficiency in outlier detection, with a detection rate of up to 93.75%, and the parameter settings are more scientifically sound and reasonable. This research provides a promising framework for anomaly detection in underwater terrain data. Full article
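
A minimal sketch of the two-stage idea the abstract describes (preliminary DBSCAN screening guided by a K-distance curve, then Isolation Forest inside each subcluster), using scikit-learn on synthetic points; it is not the authors' code, and the neighbour count, eps heuristic, and contamination rate are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
points = rng.normal(size=(2000, 3))  # synthetic stand-in for (x, y, elevation) soundings

# Step 1: K-distance curve (kd-tree backed) to suggest an eps value for DBSCAN.
k = 8
nn = NearestNeighbors(n_neighbors=k, algorithm="kd_tree").fit(points)
kth_dist = np.sort(nn.kneighbors(points)[0][:, -1])
eps = np.quantile(kth_dist, 0.95)  # crude stand-in for locating the curve's elbow

# Step 2: preliminary cluster screening; label -1 marks noise points.
labels = DBSCAN(eps=eps, min_samples=k).fit_predict(points)
outliers = set(np.flatnonzero(labels == -1))

# Step 3: refined secondary detection with Isolation Forest inside each subcluster.
for cluster_id in set(labels) - {-1}:
    idx = np.flatnonzero(labels == cluster_id)
    flags = IsolationForest(contamination=0.01, random_state=0).fit_predict(points[idx])
    outliers.update(idx[flags == -1])

print(f"{len(outliers)} candidate anomalies out of {len(points)} points")
```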

32 pages, 8818 KiB  
Article
Latent Outlier Exposure in Real-Time Anomaly Detection at the Large Hadron Collider
by Thomas Dartnall Stern, Amit Kumar Mishra and James Michael Keaveney
Computers 2025, 14(3), 79; https://doi.org/10.3390/computers14030079 - 20 Feb 2025
Abstract
We propose a novel approach to real-time anomaly detection at the Large Hadron Collider, aimed at enhancing the discovery potential for new fundamental phenomena in particle physics. Our method leverages the Latent Outlier Exposure technique and is evaluated using three distinct anomaly detection models. Among these is a novel adaptation of the variational autoencoder’s reparameterisation trick, specifically optimised for anomaly detection. The models are validated on simulated datasets representing collider processes from the Standard Model and hypothetical Beyond the Standard Model scenarios. The results demonstrate significant advantages, particularly in addressing the formidable challenge of developing a signal-agnostic, hardware-level anomaly detection trigger for experiments at the Large Hadron Collider. Full article
(This article belongs to the Special Issue Machine Learning Applications in Pattern Recognition)
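
For readers unfamiliar with the reparameterisation trick the abstract adapts, here is a minimal PyTorch sketch of the standard version (sample z = mu + sigma * eps so the sampling step stays differentiable); the paper's anomaly-detection-specific adaptation and the Latent Outlier Exposure loss are not reproduced.

```python
import torch

def reparameterise(mu: torch.Tensor, log_var: torch.Tensor) -> torch.Tensor:
    """Draw z ~ N(mu, sigma^2) differentiably: z = mu + sigma * eps, eps ~ N(0, I)."""
    std = torch.exp(0.5 * log_var)
    eps = torch.randn_like(std)
    return mu + eps * std

def kl_to_standard_normal(mu: torch.Tensor, log_var: torch.Tensor) -> torch.Tensor:
    """Per-event KL(q(z|x) || N(0, I)), the usual VAE regulariser."""
    return -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp(), dim=1)

# An encoder would emit mu and log_var for each collider event; zeros here as placeholders.
mu = torch.zeros(4, 8)
log_var = torch.zeros(4, 8)
z = reparameterise(mu, log_var)
print(z.shape, kl_to_standard_normal(mu, log_var))
```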

11 pages, 1449 KiB  
Article
The Causal Relationship Between Dietary Factors and the Risk of Intracranial Aneurysms: A Mendelian Randomization Analysis
by Longyuan Li, Jiaxuan Li, Mei Chang, Xin Wu, Ziqian Yin, Zhouqing Chen and Zhong Wang
Biomedicines 2025, 13(3), 533; https://doi.org/10.3390/biomedicines13030533 - 20 Feb 2025
Abstract
Background: Some studies have shown that dietary factors can influence the occurrence of intracranial aneurysms (IAs). This study aimed to investigate whether dietary factors and habits are associated with intracranial aneurysms using Mendelian randomization (MR) analysis. Methods: A two-sample MR study was conducted to evaluate the association of dietary factors with IAs. Heterogeneity was evaluated using Cochran’s Q test, and horizontal pleiotropy was assessed through MR-Egger regression and MR-pleiotropy residual sum and outlier (MR-PRESSO). Results: Fresh fruit intake (OR: 0.28, 95% CI: [0.13, 0.59]) was related to a decreased risk of IAs. Lamb/mutton intake may be associated with IAs, although the meta-analysis results were not significant (OR: 1.43, 95% CI: [0.27, 7.67]). Furthermore, MR analyses based on two aneurysm databases showed that alcohol intake was not associated with IAs (alcoholic drinks per week: OR: 1.057, 95% CI: [0.788, 1.42]; OR: 0.509, 95% CI: [0.1665, 1.56]; alcohol intake frequency: OR: 1.084, 95% CI: [0.909, 1.29]; OR: 1.307, 95% CI: [0.814, 2.1]). Our results showed no causal relationship between coffee intake and IAs (OR: 1.149, 95% CI: [0.575, 2.3]; OR: 0.863, 95% CI: [0.2979, 2.5]). Other dietary intakes were also found to have no causal relationship with IAs. Conclusions: This study found that fresh fruit intake was associated with a reduced risk of IAs. Lamb/mutton intake may be associated with IAs. However, other dietary factors, including alcohol intake and coffee intake, were found not to be associated with IAs. Full article
(This article belongs to the Section Neurobiology and Clinical Neuroscience)
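
As background for the abstract's two-sample MR workflow, a minimal NumPy sketch of the usual core computation: per-SNP Wald ratios, a fixed-effect inverse-variance-weighted estimate, and Cochran's Q for heterogeneity, on made-up summary statistics; the MR-Egger and MR-PRESSO sensitivity analyses are not shown and all numbers are illustrative.

```python
import numpy as np

# Illustrative per-SNP summary statistics (exposure effects, outcome effects, outcome SEs).
beta_x = np.array([0.08, 0.11, 0.05, 0.09])
beta_y = np.array([-0.02, -0.04, -0.01, -0.03])
se_y = np.array([0.010, 0.020, 0.010, 0.015])

weights = beta_x**2 / se_y**2            # first-order IVW weights
ratios = beta_y / beta_x                 # per-SNP Wald ratio estimates
beta_ivw = np.sum(weights * ratios) / np.sum(weights)
se_ivw = np.sqrt(1.0 / np.sum(weights))

q_stat = np.sum(weights * (ratios - beta_ivw) ** 2)   # Cochran's Q, df = len(ratios) - 1
odds_ratio = np.exp(beta_ivw)                         # OR when outcome effects are log-odds
print(f"OR = {odds_ratio:.3f}, SE(beta) = {se_ivw:.3f}, Q = {q_stat:.2f}")
```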

17 pages, 6915 KiB  
Article
Dam Deformation Data Preprocessing with Optimized Variational Mode Decomposition and Kernel Density Estimation
by Siyu Chen, Chaoning Lin, Yanchang Gu, Jinbao Sheng and Mohammad Amin Hariri-Ardebili
Remote Sens. 2025, 17(4), 718; https://doi.org/10.3390/rs17040718 - 19 Feb 2025
Abstract
Deformation is one of the critical response quantities reflecting the structural safety of dams. To enhance outlier identification and denoising in dam deformation monitoring data, this study proposes a novel preprocessing method based on optimized Variational Mode Decomposition (VMD) and Kernel Density Estimation (KDE). The approach systematically processes data in three steps: First, VMD decomposes raw data into intrinsic mode functions without recursion. The parallel Jaya algorithm is used to adaptively optimize VMD parameters for improved decomposition. Second, the intrinsic mode functions containing outlier and noise characteristics are identified and separated using sample entropy and correlation coefficients. Finally, KDE thresholds are applied for outlier localization, while a data superposition method ensures effective denoising. Validation using simulated deformation data and Global Navigation Satellite Systems (GNSS)-based observed horizontal deformation from dam engineering demonstrates the method’s robustness in accurately identifying outliers and denoising data, achieving superior preprocessing performance. Full article
(This article belongs to the Special Issue Dam Stability Monitoring with Satellite Geodesy II)
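
A minimal sketch of the KDE-thresholding step the abstract mentions: once a noisy, outlier-bearing component has been separated, points in low-density regions of its kernel density estimate are flagged. The VMD decomposition and Jaya optimisation are omitted, and the quantile threshold is an assumption.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
component = rng.normal(0.0, 0.5, 1000)                    # stand-in for a separated noisy mode
component[[50, 400, 777]] += np.array([6.0, -5.0, 7.5])   # injected gross errors

kde = gaussian_kde(component)
density = kde(component)                    # estimated density at each sample
threshold = np.quantile(density, 0.01)      # flag the lowest-density 1% of points
outlier_idx = np.flatnonzero(density < threshold)
print(outlier_idx)
```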

21 pages, 3622 KiB  
Article
Predictive Modelling of Weld Bead Geometry in Wire Arc Additive Manufacturing
by Kristijan Šket, Miran Brezočnik, Timi Karner, Rok Belšak, Mirko Ficko, Tomaž Vuherer and Janez Gotlih
J. Manuf. Mater. Process. 2025, 9(2), 67; https://doi.org/10.3390/jmmp9020067 - 19 Feb 2025
Abstract
This study investigates the predictive modelling of weld bead geometry in wire arc additive manufacturing (WAAM) through advanced machine learning methods. While WAAM is valued for its ability to produce large, complex metal parts with high deposition rates, precise control of the weld bead remains a critical challenge due to its influence on mechanical properties and dimensional accuracy. To address this problem, this study utilized machine learning approaches—Ridge regression, Lasso regression and Bayesian ridge regression, Random Forest and XGBoost—to predict the key weld bead characteristics, namely height, width and cross-sectional area. A Design of experiments (DOE) was used to systematically vary the welding current and travelling speed, with 3D weld bead geometries captured by laser scanning. Robust data pre-processing, including outlier detection and feature engineering, improved modelling accuracy. Among the models tested, XGBoost provided the highest prediction accuracy, emphasizing its potential for real-time control of WAAM processes. Overall, this study presents a comprehensive framework for predictive modelling and provides valuable insights for process optimization and the further development of intelligent manufacturing systems. Full article
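
A minimal scikit-learn sketch of the kind of regression comparison the abstract reports, predicting a bead dimension from welding current and travel speed with cross-validation on synthetic data; the actual DOE data, feature engineering, and the XGBoost model (which would slot in alongside these regressors where the xgboost package is available) are not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import BayesianRidge, Lasso, Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform([80, 200], [200, 800], size=(60, 2))   # welding current (A), travel speed (mm/min)
bead_width = 0.05 * X[:, 0] - 0.004 * X[:, 1] + rng.normal(0, 0.2, 60)  # synthetic response (mm)

models = {
    "ridge": make_pipeline(StandardScaler(), PolynomialFeatures(2), Ridge(alpha=1.0)),
    "lasso": make_pipeline(StandardScaler(), PolynomialFeatures(2), Lasso(alpha=0.01, max_iter=10000)),
    "bayesian_ridge": make_pipeline(StandardScaler(), PolynomialFeatures(2), BayesianRidge()),
    "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, bead_width, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {r2:.3f}")
```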

27 pages, 7952 KiB  
Article
Evaluation of Long-Term Performance of Six PM2.5 Sensor Types
by Karoline K. Barkjohn, Robert Yaga, Brittany Thomas, William Schoppman, Kenneth S. Docherty and Andrea L. Clements
Sensors 2025, 25(4), 1265; https://doi.org/10.3390/s25041265 - 19 Feb 2025
Abstract
From July 2019 to January 2021, six models of PM2.5 air sensors were operated at seven air quality monitoring sites across the U.S. in Arizona, Colorado, Delaware, Georgia, North Carolina, Oklahoma, and Wisconsin. Common PM sensor data issues were identified, including repeat zero measurements, false high outliers, baseline shift, varied relationships between the sensor and monitor, and relative humidity (RH) influences. While these issues are often easy to identify during colocation, they are more challenging to identify or correct during deployment since it is hard to differentiate between real pollution events and sensor malfunctions. Air sensors may exhibit wildly different performances even if they have the same or similar internal components. Commonly used RH corrections may still have variable bias by hour of the day and seasonally. Most sensors show promise in achieving the U.S. Environmental Protection Agency (EPA) performance targets, and the findings here can be used to improve their performance and reliability further. This evaluation generated a robust dataset of colocated air sensor and monitor data, and by making it publicly available along with the results presented in this paper, we hope the dataset will be an asset to the air sensor community in understanding sensor performance and validating new methods. Full article
(This article belongs to the Special Issue Recent Trends in Air Quality Sensing)
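
A minimal pandas sketch of screening for two of the sensor-data issues the abstract lists, repeat zero measurements and false high outliers; the window lengths and thresholds are illustrative assumptions, not the paper's QC criteria, and the RH correction step is not shown.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
pm = pd.Series(rng.gamma(2.0, 4.0, size=24 * 14))   # two weeks of hourly PM2.5 (ug/m3), synthetic
pm.iloc[100:106] = 0.0                              # stuck-at-zero episode
pm.iloc[200] = 900.0                                # false high outlier

repeat_zero = pm.eq(0) & pm.shift(1).eq(0) & pm.shift(2).eq(0)    # three or more zeros in a row
rolling_med = pm.rolling(24, center=True, min_periods=8).median()
false_high = pm > 10 * rolling_med.clip(lower=5)                  # far above the local baseline

flags = pd.DataFrame({"repeat_zero": repeat_zero, "false_high": false_high})
print(flags.sum())
```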

20 pages, 2002 KiB  
Article
Implementing Anomaly-Based Intrusion Detection for Resource-Constrained Devices in IoMT Networks
by Georgios Zachos, Georgios Mantas, Kyriakos Porfyrakis and Jonathan Rodriguez
Sensors 2025, 25(4), 1216; https://doi.org/10.3390/s25041216 - 17 Feb 2025
Abstract
Internet of Medical Things (IoMT) technology has emerged from the introduction of the Internet of Things in the healthcare sector. However, the resource-constrained characteristics and heterogeneity of IoMT networks make these networks susceptible to various types of threats. Thus, it is necessary to develop novel security solutions (e.g., efficient and accurate Anomaly-based Intrusion Detection Systems), considering the inherent limitations of IoMT networks, before these networks reach their full potential in the market. In this paper, we propose an AIDS specifically designed for resource-constrained devices within IoMT networks. The proposed lightweight AIDS leverages novelty detection and outlier detection algorithms instead of conventional classification algorithms to achieve (a) enhanced detection performance against both known and unknown attack patterns and (b) minimal computational costs. Full article
(This article belongs to the Section Sensor Networks)
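
A minimal sketch of the novelty-detection framing the abstract describes for a lightweight IDS: fit a one-class model on benign traffic features only, then flag deviations, so unknown attack patterns need no labelled examples. The features, the model choice (a One-Class SVM here), and the parameters are assumptions, not the proposed AIDS.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Synthetic flow features for benign IoMT traffic, e.g. bytes, mean inter-arrival time, packet count.
benign = rng.normal(loc=[500.0, 0.20, 20.0], scale=[80.0, 0.05, 5.0], size=(1000, 3))
attack = rng.normal(loc=[4000.0, 0.01, 300.0], scale=[300.0, 0.005, 30.0], size=(20, 3))

scaler = StandardScaler().fit(benign)
detector = OneClassSVM(nu=0.01, gamma="scale").fit(scaler.transform(benign))  # benign-only training

test = np.vstack([benign[:5], attack[:5]])
print(detector.predict(scaler.transform(test)))   # +1 = consistent with benign, -1 = anomalous
```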

29 pages, 13905 KiB  
Article
A Comparative Study of Unsupervised Machine Learning Methods for Anomaly Detection in Flight Data: Case Studies from Real-World Flight Operations
by Sameer Kumar Jasra, Gianluca Valentino, Alan Muscat and Robert Camilleri
Aerospace 2025, 12(2), 151; https://doi.org/10.3390/aerospace12020151 - 17 Feb 2025
Abstract
This paper provides a comparative study of unsupervised machine learning (ML) methods for anomaly detection in flight data monitoring (FDM). The study applies various unsupervised ML techniques to real-world flight data and compares the results to the current state-of-the-art flight data analysis techniques applied in industry. The results are validated by the industrial experts. The study finds that a hybrid Local Outlier Factor (LOF) approach provides significant advantages compared to the current state of the art and other ML techniques because it requires less hyperparameter tuning, reduces the number of false positives, provides an ability to establish trends amongst the entire fleet and has an ability to investigate anomalies at each timestep within every flight. Finally, the study provides an in-depth review for some of the cases highlighted by the hybrid LOF and discusses the particular cases providing insights from an academic and flight safety/operational point of view. The analysis conducted by the human expert regarding the outcomes produced by an ML technique is predominantly absent in scholarly research, thereby offering extra value. The study presents a compelling argument for transitioning from the current approach, based on analyzing occurrences through the exceedances of a threshold value, towards an ML-based method which provides a proactive nature of data analysis. The study shows that there is an untapped opportunity to process flight data and achieve valuable information for enhancing air transport safety and improved aviation operations. Full article
(This article belongs to the Section Air Traffic and Transportation)
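
A minimal sketch of ranking flights with the Local Outlier Factor, the detector at the core of the hybrid approach the abstract favours; the hybridisation, the per-timestep analysis, and the real FDM features are the paper's, so reducing each flight to a fixed-length summary vector here is an assumption.

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
# 300 flights x 6 per-flight summary features (e.g. touchdown g, max pitch, approach speed, ...).
fleet = rng.normal(size=(300, 6))
fleet[7] += 4.0                                   # one synthetic "abnormal" flight

lof = LocalOutlierFactor(n_neighbors=30)
labels = lof.fit_predict(fleet)                   # -1 marks flights flagged as outlying
scores = -lof.negative_outlier_factor_            # larger = more anomalous
worst = np.argsort(scores)[::-1][:5]
print(worst, labels[worst])
```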

29 pages, 3674 KiB  
Article
Advanced Tax Fraud Detection: A Soft-Voting Ensemble Based on GAN and Encoder Architecture
by Masad A. Alrasheedi, Samia Ijaz, Ayed M. Alrashdi and Seung-Won Lee
Mathematics 2025, 13(4), 642; https://doi.org/10.3390/math13040642 - 16 Feb 2025
Abstract
The world prevalence of the two types of authorized and fraudulent transactions makes it difficult to distinguish between the two operations. The small percentage of fraudulent transactions, in turn, gives rise to the class imbalance problem. Hence, an adequately robust fraud detection mechanism must exist for tax systems to avoid their collapse. It has become significantly difficult to obtain any dataset, specifically a tax return dataset, because of the rising importance of privacy in a society where people generally feel squeamish about sharing personal information. Because of this, we arrive at the decision to synthesize our dataset by employing publicly available data, as well as enhance them through Correlational Generative Adversarial Networks (CGANs) and the Synthetic Minority Oversampling Technique (SMOTE). The proposed method includes a preprocessing stage to denoise the data and identify anomalies, outliers, and dimensionality reduction. Then the data have undergone enhancement using the SMOTE and the proposed CGAN techniques. A unique encoder design has been proposed, which serves the purpose of exposing the hidden patterns among legitimate and fraudulent records. This research found anomalous deductions, income inconsistencies, recurrent transaction manipulations, and irregular filing practices that distinguish fraudulent from valid tax records. These patterns are identified by encoder-based feature extraction and synthetic data augmentation. Several machine learning classifiers, along with a voting ensemble technique, have been used both with and without data augmentation. Experimental results have shown that the proposed Soft-Voting technique outperformed the original without an ensemble method. Full article
(This article belongs to the Special Issue Application of Artificial Intelligence in Decision Making)
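
A minimal sketch of the rebalancing plus soft-voting stage the abstract describes, using SMOTE from imbalanced-learn and a scikit-learn VotingClassifier on a synthetic imbalanced dataset; the CGAN augmentation, the custom encoder, and the actual base learners are not reproduced, and the classifiers shown are assumptions.

```python
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.98, 0.02], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)   # oversample the minority (fraud) class

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("nb", GaussianNB()),
    ],
    voting="soft",                                  # average the predicted class probabilities
)
ensemble.fit(X_bal, y_bal)
print("held-out accuracy:", ensemble.score(X_te, y_te))
```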

17 pages, 1391 KiB  
Article
Optimizing Sensor Data Interpretation via Hybrid Parametric Bootstrapping
by Victor V. Golovko
Sensors 2025, 25(4), 1183; https://doi.org/10.3390/s25041183 - 14 Feb 2025
Abstract
The Chalk River Laboratories (CRL) site in Ontario, Canada, has long been a hub for nuclear research, which has resulted in the accumulation of legacy nuclear waste, including radioactive materials such as uranium, plutonium, and other radionuclides. Effective management of this legacy requires precise contamination and risk assessments, with a particular focus on the concentration levels of fissile materials such as U235. These assessments are essential for maintaining nuclear criticality safety. This study estimates the upper bounds of U235 concentrations. We investigated the use of a hybrid parametric bootstrapping method and robust statistical techniques to analyze datasets with outliers, then compared these outcomes with those derived from nonparametric bootstrapping. This study underscores the significance of measuring U235 for ensuring safety, conducting environmental monitoring, and adhering to regulatory compliance requirements at nuclear legacy sites. We used publicly accessible U235 data from the Eastern Desert of Egypt to demonstrate the application of these statistical methods to small datasets, providing reliable upper limit estimates that are vital for remediation and decommissioning efforts. This method seeks to enhance the interpretation of sensor data, ultimately supporting safer nuclear waste management practices at legacy sites such as CRL. Full article
(This article belongs to the Special Issue Sensors and Extreme Environments)
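
A minimal NumPy sketch contrasting a parametric bootstrap (under an assumed lognormal model) with a nonparametric bootstrap for an upper bound on a small sample's mean, in the spirit of the abstract; the distribution choice, confidence level, and data are illustrative, not the paper's hybrid procedure or the Egyptian dataset.

```python
import numpy as np

rng = np.random.default_rng(0)
conc = np.array([1.2, 0.8, 1.5, 2.1, 0.9, 1.1, 3.4, 1.3])   # small illustrative concentration sample

def bootstrap_upper_bound(sample, parametric, n_boot=20000, level=0.95):
    n = len(sample)
    if parametric:   # refit an assumed lognormal model, then resample from it
        mu, sigma = np.log(sample).mean(), np.log(sample).std(ddof=1)
        draws = rng.lognormal(mu, sigma, size=(n_boot, n))
    else:            # resample the observed values with replacement
        draws = rng.choice(sample, size=(n_boot, n), replace=True)
    return np.quantile(draws.mean(axis=1), level)

print("parametric 95% upper bound on the mean:   ", bootstrap_upper_bound(conc, True))
print("nonparametric 95% upper bound on the mean:", bootstrap_upper_bound(conc, False))
```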

12 pages, 4767 KiB  
Article
Disentangling Multiannual Air Quality Profiles Aided by Self-Organizing Map and Positive Matrix Factorization
by Stefano Fornasaro, Aleksander Astel, Pierluigi Barbieri and Sabina Licen
Toxics 2025, 13(2), 137; https://doi.org/10.3390/toxics13020137 - 14 Feb 2025
Abstract
The evaluation of air pollution is a critical concern due to its potential severe impacts on human health. Currently, vast quantities of data are collected at high frequencies, and researchers must navigate multiannual, multisite datasets trying to identify possible pollutant sources while addressing the presence of noise and sparse missing data. To address this challenge, multivariate data analysis is widely used with an increasing interest in neural networks and deep learning networks along with well-established chemometrics methods and receptor models. Here, we report a combined approach involving the Self-Organizing Map (SOM) algorithm, Hierarchical Clustering Analysis (HCA), and Positive Matrix Factorization (PMF) to disentangle multiannual, multisite data in a single elaboration without previously separating the sites and years. The approach proved to be valid, allowing us to detect the site peculiarities in terms of pollutant sources, the variation in pollutant profiles during years and the outliers, affording a reliable interpretation. Full article
(This article belongs to the Special Issue Atmospheric Emissions Characteristics and Its Impact on Human Health)
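
A minimal sketch of the SOM plus hierarchical-clustering portion of the workflow the abstract outlines, assuming the MiniSom package; the grid size, training length, and cluster count are arbitrary, the data are synthetic, and the PMF source-apportionment step is not shown.

```python
import numpy as np
from minisom import MiniSom
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
X = rng.random((500, 5))                        # 500 records x 5 pollutants, already scaled to [0, 1]

som = MiniSom(8, 8, X.shape[1], sigma=1.5, learning_rate=0.5, random_seed=0)
som.random_weights_init(X)
som.train_random(X, 5000)

codebook = som.get_weights().reshape(-1, X.shape[1])            # 64 prototype pollutant profiles
clusters = fcluster(linkage(codebook, method="ward"), t=4, criterion="maxclust")

# Map every record to its best-matching unit, then to that unit's cluster label.
bmu = np.array([np.ravel_multi_index(som.winner(x), (8, 8)) for x in X])
record_cluster = clusters[bmu]
print(np.bincount(record_cluster)[1:])
```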

21 pages, 2447 KiB  
Article
Advancing Taxonomy with Machine Learning: A Hybrid Ensemble for Species and Genus Classification
by Loris Nanni, Matteo De Gobbi, Roger De Almeida Matos Junior and Daniel Fusaro
Algorithms 2025, 18(2), 105; https://doi.org/10.3390/a18020105 - 14 Feb 2025
Abstract
Traditionally, classifying species has required taxonomic experts to carefully examine unique physical characteristics, a time-intensive and complex process. Machine learning offers a promising alternative by utilizing computational power to detect subtle distinctions more quickly and accurately. This technology can classify both known (described) and unknown (undescribed) species, assigning known samples to specific species and grouping unknown ones at the genus level—an improvement over the common practice of labeling unknown species as outliers. In this paper, we propose a novel ensemble approach that integrates neural networks with support vector machines (SVM). Each animal is represented by an image and its DNA barcode. Our research investigates the transformation of one-dimensional vector data into two-dimensional three-channel matrices using discrete wavelet transform (DWT), enabling the application of convolutional neural networks (CNNs) that have been pre-trained on large image datasets. Our method significantly outperforms existing approaches, as demonstrated on several datasets containing animal images and DNA barcodes. By enabling the classification of both described and undescribed species, this research represents a major step forward in global biodiversity monitoring. Full article
(This article belongs to the Special Issue Algorithms in Data Classification (2nd Edition))
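
A minimal PyWavelets sketch of one way to turn a one-dimensional vector into a small two-dimensional, three-channel array via the DWT so an image-pretrained CNN can consume it; this is an illustrative construction, not the paper's exact mapping, and the wavelet, decomposition level, and target size are assumptions.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
barcode_vec = rng.random(512)                   # stand-in for a numerically encoded DNA barcode

coeffs = pywt.wavedec(barcode_vec, "db2", level=2)   # [cA2, cD2, cD1]
side = 16
channels = []
for c in coeffs:
    c = np.resize(c, side * side)               # pad/trim each band to a fixed length
    channels.append(c.reshape(side, side))
image_like = np.stack(channels, axis=-1)        # (16, 16, 3), consumable by an image CNN backbone
print(image_like.shape)
```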

24 pages, 1345 KiB  
Article
iEVEM: Big Data-Empowered Framework for Intelligent Electric Vehicle Energy Management
by Siyan Guo and Cong Zhao
Systems 2025, 13(2), 118; https://doi.org/10.3390/systems13020118 - 13 Feb 2025
Abstract
Recent years have witnessed an unprecedented boom of Electric Vehicles (EVs). However, EVs’ further development confronts critical bottlenecks due to EV Energy (EVE) issues like battery hazards, range anxiety, and charging inefficiency. Emerging data-driven EVE Management (EVEM) is a promising solution but still faces fundamental challenges, especially in terms of reliability and efficiency. This article presents iEVEM, the first big data-empowered intelligent EVEM framework, providing systematic support to the essential driver-, enterprise-, and social-level intelligent EVEM applications. Particularly, a layered data architecture from heterogeneous EVE data management to knowledge-enhanced intelligent solution design is provided, and an edge–cloud collaborative architecture for the networked system is proposed for reliable and efficient EVEM, respectively. We conducted a proof-of-concept case study on a typical EVEM task (i.e., EV energy consumption outlier detection) using real driving data from 4000+ EVs within three months. The experimental results show that iEVEM achieves a significant boost in reliability and efficiency (i.e., up to 47.48% higher in detection accuracy and at least 3.07× faster in response speed compared with the state-of-the-art approaches). As the first intelligent EVEM framework, iEVEM is expected to inspire more intelligent energy management applications exploiting skyrocketing EV big data. Full article
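
As a rough illustration of the proof-of-concept task mentioned in the abstract (EV energy consumption outlier detection), a minimal pandas sketch that flags trips whose consumption is extreme within their vehicle group using a robust z-score; the grouping key, threshold, and data are assumptions and bear no relation to the iEVEM implementation.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
model = rng.choice(["A", "B"], size=2000)                    # vehicle model per trip
kwh = rng.normal(np.where(model == "A", 15.0, 19.0), 1.5)    # kWh per 100 km, synthetic
trips = pd.DataFrame({"model": model, "kwh_per_100km": kwh})
trips.loc[5, "kwh_per_100km"] = 60.0                         # injected anomalous trip

def robust_z(x: pd.Series) -> pd.Series:
    med = x.median()
    mad = (x - med).abs().median()
    return 0.6745 * (x - med) / mad                          # MAD-based robust z-score

trips["outlier"] = trips.groupby("model")["kwh_per_100km"].transform(robust_z).abs() > 3.5
print(trips["outlier"].sum())
```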

21 pages, 2359 KiB  
Review
Integrative Analysis of Metabolome and Proteome in the Cerebrospinal Fluid of Patients with Multiple System Atrophy
by Nimisha Pradeep George, Minjun Kwon, Yong Eun Jang, Seok Gi Kim, Ji Su Hwang, Sang Seop Lee and Gwang Lee
Cells 2025, 14(4), 265; https://doi.org/10.3390/cells14040265 - 12 Feb 2025
Abstract
Multiple system atrophy (MSA) is a progressive neurodegenerative synucleinopathy. Differentiating MSA from other synucleinopathies, especially in the early stages, is challenging because of its overlapping symptoms with other forms of Parkinsonism. Thus, there is a pressing need to clarify the underlying biological mechanisms and identify specific biomarkers for MSA. The metabolic profile of cerebrospinal fluid (CSF) is known to be altered in MSA. To further investigate the biological mechanisms behind the metabolic changes, we created a network of altered CSF metabolites in patients with MSA and analysed these changes using bioinformatic software. Acknowledging the limitations of metabolomics, we incorporated proteomic data to improve the overall comprehensiveness of the study. Our in silico predictions showed elevated ROS, cytoplasmic inclusions, white matter demyelination, ataxia, and neurodegeneration, with ATP concentration, neurotransmitter release, and oligodendrocyte count predicted to be suppressed in MSA CSF samples. Machine learning and dimension reduction are important multi-omics approaches as they handle large amounts of data, identify patterns, and make predictions while reducing variance without information loss and generating easily visualised plots that help identify clusters, patterns, or outliers. Thus, integrated multiomics and machine learning approaches are essential for elucidating neurodegenerative mechanisms and identifying potential diagnostic biomarkers of MSA. Full article

16 pages, 4746 KiB  
Article
SPS-RCNN: Semantic-Guided Proposal Sampling for 3D Object Detection from LiDAR Point Clouds
by Hengxin Xu, Lei Yang, Shengya Zhao, Shan Tao, Xinran Tian and Kun Liu
Sensors 2025, 25(4), 1064; https://doi.org/10.3390/s25041064 - 11 Feb 2025
Abstract
Three-dimensional object detection using LiDAR has attracted significant attention due to its resilience to lighting conditions and ability to capture detailed geometric information. However, existing methods still face challenges, such as a high proportion of background points in the sampled point set and limited accuracy in detecting distant objects. To address these issues, we propose semantic-guided proposal sampling-RCNN (SPS-RCNN), a multi-stage detection framework based on point–voxel fusion. The framework comprises three components: a voxel-based region proposal network (RPN), a keypoint sampling stream (KSS), and a progressive refinement network (PRN). In the KSS, we propose a novel semantic-guided proposal sampling (SPS) method, which increases the proportion of foreground points and enhances sensitivity to outliers through multilevel sampling that integrates proposal-based local sampling and semantic-guided global sampling. In the PRN, a cascade attention module (CAM) is employed to aggregate features from multiple subnets, progressively refining region proposals to improve detection accuracy for medium- and long-range objects. Comprehensive experiments on the widely used KITTI dataset demonstrate that SPS-RCNN improves detection accuracy and exhibits enhanced robustness across categories compared to the baseline. Full article
(This article belongs to the Section Sensing and Imaging)
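
A minimal sketch of the intuition behind semantic-guided sampling as the abstract describes it, biasing the sampled keypoints toward points with high predicted foreground scores while keeping some uniform samples for coverage; this is a conceptual stand-in on random data, not the SPS-RCNN implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_points = 20000
points = rng.uniform(-40.0, 40.0, size=(n_points, 3))   # LiDAR point cloud stand-in
fg_score = rng.beta(0.5, 5.0, size=n_points)            # per-point semantic foreground score

n_keypoints = 2048
probs = fg_score / fg_score.sum()
semantic_idx = rng.choice(n_points, size=n_keypoints // 2, replace=False, p=probs)   # score-biased
uniform_idx = rng.choice(n_points, size=n_keypoints // 2, replace=False)             # coverage
keypoints = points[np.unique(np.r_[semantic_idx, uniform_idx])]

print(keypoints.shape)
print("mean score, semantic vs uniform:", fg_score[semantic_idx].mean(), fg_score[uniform_idx].mean())
```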
