What is the most efficient way to scan thousands of addresses?

Dear All,

I am setting up a system that will scan around 130k addresses each week, and I am looking for advice on optimising the scanning process so that it completes as efficiently as possible, ideally without compromising too much on thoroughness. We’ve previously been using Nessus for this, but due to their increasing prices we’re moving to OpenVAS for the time being. With Nessus it helped to slice the work into chunks of 100 IP addresses; it’s not at all clear to me whether that remains a good strategy with OpenVAS.

I have the option of spreading the load over multiple scanners, which we didn’t have with Nessus due to the licensing model.

It’s also not clear to me how best to tune the parameters in /etc/openvassd.conf for performance (in my case, optimising for speed).
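For context, the two openvassd.conf settings that most directly affect throughput are max_hosts (how many hosts a task scans in parallel) and max_checks (how many plugins run in parallel against each host). A sketch of a speed-oriented configuration might look like the following; the values are purely illustrative, not recommendations, and the right numbers depend on scanner resources and network capacity:

```
# Hosts tested in parallel per scan task (illustrative value)
max_hosts = 40
# Plugins run in parallel against each host (illustrative value)
max_checks = 5
```

Raising these increases concurrency but also memory and network load on the scanner VM, so changes are best validated against a small slice first.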

I’d appreciate hearing from someone with greater experience in such matters!



P.S. My scanners are running on VMs running Ubuntu 18.04.1 LTS
OpenVAS Scanner 5.1.2
OpenVAS Manager 7.0.3
Greenbone Security Assistant 7.0.3

Slicing is a valid strategy. It depends a bit on how, and whether, you want to aggregate the results
into a single report. 100 seems to me like a very small slice; I use far bigger slices.
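For what it’s worth, slicing a large address list into fixed-size chunks is straightforward to script; a minimal sketch (the chunk size of 2000 is just an illustration, not a recommended value):

```python
def chunk(addresses, size):
    """Split a list of addresses into slices of at most `size` entries."""
    return [addresses[i:i + size] for i in range(0, len(addresses), size)]

# Example: 5000 synthetic addresses split into slices of 2000
addresses = [f"10.0.{i // 256}.{i % 256}" for i in range(5000)]
slices = chunk(addresses, 2000)
# 5000 addresses -> 3 slices of 2000, 2000 and 1000
```

Each slice can then become the host list of its own target/task, which also makes it easy to distribute slices across multiple scanners.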

At Greenbone we work with some customers on such large-scale scans. My recommendation is to use the GMP interface and script the scans. Our work for these customers has resulted in some GMP scripts that we publish in gvm-tools, see here:

A typical challenge is that your hosts are hidden in a larger address range where most of the IPs are “dead”.

For large-scale scanning I also recommend spending some time on tuning the Scan Configuration.
The port list is another major factor in performance. The port list in particular is a trade-off between speed and tolerance for false negatives, which makes it important to define what you want to achieve with the scans: the actual purpose will drive your technical decisions.
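To make the trade-off concrete: the number of ports in the list scales the per-host scanning work. A small hypothetical helper (count_ports is my own illustration, not part of any OpenVAS tooling) that counts the ports in a GMP-style port-range string shows the difference:

```python
def count_ports(port_range):
    """Count the ports covered by a GMP-style port range string,
    e.g. 'T:1-1024,U:53' (T: = TCP, U: = UDP)."""
    total = 0
    for part in port_range.split(","):
        spec = part.split(":")[-1]  # drop the T:/U: protocol prefix if present
        if "-" in spec:
            lo, hi = spec.split("-")
            total += int(hi) - int(lo) + 1
        else:
            total += 1
    return total

count_ports("T:1-1024,U:53")   # 1025 ports per host
count_ports("T:1-65535")       # 65535 ports per host
```

A list of privileged TCP ports plus a few well-known UDP ports is roughly 60x less per-host probing than an all-TCP list, at the cost of missing services on unusual ports.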

One practical recommendation from my experience with large-scale scans: use the “random” host order in your task configuration.
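In GMP this corresponds to the hosts_ordering element of the create_task command. A sketch of building that command as raw GMP XML with the standard library (the config and target IDs are placeholder values, and in practice a client library such as gvm-tools/python-gvm would send this over the GMP socket):

```python
import xml.etree.ElementTree as ET

def create_task_xml(name, config_id, target_id):
    """Build a GMP <create_task> command with random host ordering."""
    cmd = ET.Element("create_task")
    ET.SubElement(cmd, "name").text = name
    ET.SubElement(cmd, "config", id=config_id)    # scan configuration UUID
    ET.SubElement(cmd, "target", id=target_id)    # target UUID
    ET.SubElement(cmd, "hosts_ordering").text = "random"
    return ET.tostring(cmd, encoding="unicode")

xml_cmd = create_task_xml("weekly-slice-01", "cfg-placeholder", "tgt-placeholder")
```

Random ordering spreads load across subnets instead of hammering one network segment at a time.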


Dear Jan,

Thank you for your response to my questions; it will be very helpful.

I am scripting the process with the Python library pyvas (in fact I had to fork the library and add a lot of functionality to it, in order to preserve some of the workflows we had with Nessus). It takes a feed from our database of registered IP addresses, so it will only attempt to scan registered addresses.
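Since the address feed comes from a database of individual IPs, one way to keep target definitions compact is to collapse adjacent registered addresses into CIDR blocks before handing them to the scanner; a sketch using the standard-library ipaddress module (the sample addresses are made up):

```python
import ipaddress

def collapse(addresses):
    """Collapse individual IP addresses into the smallest set of CIDR networks."""
    nets = [ipaddress.ip_network(a) for a in addresses]
    return [str(n) for n in ipaddress.collapse_addresses(nets)]

collapse(["10.0.0.0", "10.0.0.1", "10.0.0.2", "10.0.0.3", "192.0.2.7"])
# → ["10.0.0.0/30", "192.0.2.7/32"]
```

This keeps each target's host list short while still scanning only registered addresses.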

Thank you for your suggestions regarding randomising the host order, and tuning the scan configuration.

I have found this documentation on the scan configuration. Are you aware of any documentation online that delves deeper into the performance implications of each setting?

Thanks again,


gvm-tools is also implemented in Python and actively supported by Greenbone.

Please do not rely on that ancient “compendium”.

We have this documentation available online; it focuses on the use of a GSM, but it may still provide some helpful hints for your Source Edition setup:


This manual also offers some hints on performance considerations: