New Anomaly Detection Benchmark Unlocks Insights for Future Algorithm Development
The ADBench project evaluated 30 anomaly detection algorithms on 57 datasets, examining how detection performance varies with the level of supervision, the type of anomaly, and the presence of noisy or corrupted data. Across nearly 100,000 experiments, the researchers drew out insights into how supervision and anomaly types shape detection performance. They also provide a benchmark suite against which future methods can be compared, including datasets drawn from natural language processing and computer vision. The ADBench project is open-source, making it straightforward for others to replicate and build upon the results.
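To make the kind of comparison such a benchmark automates more concrete, here is a minimal sketch that scores two unsupervised detectors on a synthetic dataset with ROC-AUC. It uses the PyOD library rather than ADBench's own pipeline, and the dataset, detector choices, and split are illustrative assumptions, not the benchmark's actual configuration.

```python
# Illustrative sketch only: compare two unsupervised detectors by ROC-AUC,
# the style of comparison a benchmark like ADBench runs at scale.
# The synthetic data and detector settings here are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from pyod.models.iforest import IForest
from pyod.models.knn import KNN

rng = np.random.RandomState(42)
# Synthetic data: 950 inliers near the origin, 50 shifted outliers.
X = np.vstack([rng.normal(0, 1, size=(950, 10)),
               rng.normal(4, 1, size=(50, 10))])
y = np.hstack([np.zeros(950), np.ones(50)])  # 1 marks an anomaly

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)

detectors = {"IForest": IForest(random_state=42), "KNN": KNN()}
for name, det in detectors.items():
    det.fit(X_train)                        # unsupervised: labels unused
    scores = det.decision_function(X_test)  # higher score = more anomalous
    print(f"{name}: ROC-AUC = {roc_auc_score(y_test, scores):.3f}")
```

A full benchmark repeats this loop over many datasets, supervision settings, and corruption schemes, which is the scale at which ADBench reports its findings.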