I’m using the latest GVM version to scan a system with my private VT. The task is interrupted and the following exception is raised:
ERROR: (ospd.ospd) 32ae4066-28f6-4d4c-b69a-242149d65ef2: Exception too many values to unpack (expected 9) while scanning
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/ospd/ospd.py", line 596, in start_scan
self.exec_scan(scan_id)
File "/usr/local/lib/python3.10/dist-packages/ospd_openvas/daemon.py", line 1177, in exec_scan
got_results = self.report_openvas_results(kbdb, scan_id)
File "/usr/local/lib/python3.10/dist-packages/ospd_openvas/daemon.py", line 815, in report_openvas_results
return self.report_results(results, scan_id)
File "/usr/local/lib/python3.10/dist-packages/ospd_openvas/daemon.py", line 922, in report_results
rseverity = vthelper.get_severity_score(vt_aux)
File "/usr/local/lib/python3.10/dist-packages/ospd_openvas/vthelper.py", line 243, in get_severity_score
return CVSS.cvss_base_v3_value(severity_vector)
File "/usr/local/lib/python3.10/dist-packages/ospd/cvss.py", line 112, in cvss_base_v3_value
_ver, _av, _ac, _pr, _ui, _s, _c, _i, _a = cls._parse_cvss_base_vector(
ValueError: too many values to unpack (expected 9)
The reason is that the severity_vector tag of my VT contains not only the CVSS base score metrics but also temporal score metrics.
I could simply adjust the severity_vector tag to contain only the base score metrics, but I think it would be better to handle this exception so the task is not interrupted.
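For illustration, here is a minimal sketch (a hypothetical helper, not the actual ospd code or patch) of one way to tolerate such vectors: drop any metrics beyond the base set before unpacking, so a vector that also carries temporal metrics (E, RL, RC) no longer breaks the 9-value unpack.

```python
# Hypothetical helper: reduce a CVSS v3 vector to its base metrics only,
# so parsers expecting exactly the CVSS prefix + 8 base metrics don't fail.

# The CVSS version prefix plus the 8 base metric keys.
BASE_V3_KEYS = ("CVSS", "AV", "AC", "PR", "UI", "S", "C", "I", "A")

def strip_to_base_vector(vector: str) -> str:
    """Keep only the version prefix and base score metrics of a CVSS v3 vector."""
    parts = vector.split("/")
    # Each part looks like "KEY:VALUE"; keep only base-metric keys.
    kept = [p for p in parts if p.split(":", 1)[0] in BASE_V3_KEYS]
    return "/".join(kept)

# A vector like mine, with temporal metrics appended:
vec = "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H/E:F/RL:O/RC:C"
print(strip_to_base_vector(vec))
# -> CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H
```

The resulting string has exactly nine slash-separated fields, so an unpack of the form `_ver, _av, _ac, _pr, _ui, _s, _c, _i, _a = ...` would succeed.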
Hi @panajo1017
Thanks for reporting this issue. I have just created a patch to handle this situation in a better way.
Feel free to create a GitHub issue in the corresponding project’s issue tracker next time.
Please do note that there are QA measures in place which prevent VTs with such unsupported severity vectors from entering any official feed in the first place (it still makes sense to harden the related component, as already done in the PR above).