Intrusion prevention system (IPS) benchmarking is notoriously difficult to do well. Vendor-provided benchmarks are often measured under idealized conditions that bear little resemblance to real-world deployments. Here’s how to conduct meaningful IPS evaluations.
The Problem with Vendor Benchmarks
Vendor throughput figures are typically measured with all signatures disabled or a minimal signature set, using synthetic traffic with large packet sizes, and with no other features enabled. Real-world performance — with full signature sets, mixed traffic, and all features active — can be 50-80% lower than vendor claims.
Testing Methodology
Effective IPS benchmarking requires:

- realistic traffic profiles that match your environment
- full signature sets enabled
- detection accuracy testing against known attack traffic
- false positive rate measurement with normal business traffic
- performance measurement at your actual peak traffic load, not theoretical maximums
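One way to make "realistic traffic profiles" and "actual peak load" concrete is to write the test plan down as data before driving any traffic generator. The sketch below is a minimal illustration, not tied to any particular tool: the `TrafficProfile` class, the protocol mix fractions, and the 4.0 Gbps peak figure are all hypothetical placeholders you would replace with numbers derived from your own flow records.

```python
from dataclasses import dataclass

# Hypothetical benchmark plan: the mix fractions and peak rate below are
# placeholders; derive real values from your environment's flow records.
@dataclass
class TrafficProfile:
    mix: dict[str, float]   # protocol -> fraction of total traffic
    peak_gbps: float        # your measured peak load, not the appliance's rated max

def target_rates(profile: TrafficProfile) -> dict[str, float]:
    """Scale the protocol mix to the environment's actual peak load (Gbps)."""
    total = sum(profile.mix.values())
    if abs(total - 1.0) > 1e-6:
        raise ValueError(f"traffic mix fractions sum to {total}, expected 1.0")
    return {proto: frac * profile.peak_gbps for proto, frac in profile.mix.items()}

profile = TrafficProfile(
    mix={"https": 0.55, "http": 0.15, "dns": 0.05, "smtp": 0.10, "other": 0.15},
    peak_gbps=4.0,  # illustrative peak; benchmark at your real peak
)
print(target_rates(profile))
```

Feeding a declared profile like this into whatever generator you use keeps the test reproducible and makes it obvious when a result was measured at an unrealistic load.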
Key Metrics to Measure
- Throughput at full signature load (not with signatures disabled)
- Latency added to normal traffic
- Detection rate against a known attack corpus
- False positive rate against baseline traffic
- Performance degradation under attack load
- Recovery time after a blocking event