Hi All,
Need help to conduct Vulnerability assessment on UPS device.
UPS: Eaton 9px
Tools?
Methodology?
Reporting format? and samples to refer to?
Regards,
Jagadeesh
I'd recommend Nessus. Tenable seems to have vendor-specific plugins, and their Tenable.io product does a pretty good job at report generation.
Nessus Professional is a better proposition from a licensing perspective, but its reporting is limited. You can export the data as CSV and run your own analysis, or use the built-in report filters to narrow down the scope and severity of the discovered vulnerabilities.
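As a minimal sketch of the "export as CSV and run your own analysis" approach: the snippet below narrows an export to high-severity findings. The column names (`Risk`, `Name`, etc.) and the sample rows are assumptions for illustration; adjust them to match the columns your actual export contains.

```python
import csv
from io import StringIO

def high_severity(csv_text, keep=("High", "Critical")):
    """Return rows from a Nessus-style CSV export whose Risk column
    matches one of the severities in `keep`. Column names are
    assumptions -- check them against your real export."""
    reader = csv.DictReader(StringIO(csv_text))
    return [row for row in reader if row.get("Risk") in keep]

# Hypothetical sample export (not real scan data):
sample = """Plugin ID,Name,Risk,Host
10001,Example low finding,Low,192.0.2.10
10002,Example critical finding,Critical,192.0.2.10
"""

for row in high_severity(sample):
    print(row["Name"], "-", row["Risk"])
```

The same pattern extends easily to grouping by host or plugin ID once the data is out of the scanner.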
Thank you for your response.
I'd make sure you have Eaton or your vendor available, and that the UPS is not protecting your entire production environment - some of them have collapsed during scans. (I scanned a 93-series using Nessus and it went fine, but that was luck, I guess.)
We didn't run the assessment against the UPS alone; we also took physical protection into consideration.
Good luck!
Jagadeesh,
ISACA's Phoenix Chapter published an example report here:
https://www.isaca.org/chapters1/phoenix/events/Documents/V_Scan_Handout_Sample_Report.pdf
Nessus is a good tool and well respected, and you may be able to rely on it as a plug-and-play system without background knowledge. I would be a little concerned, however, if you are not familiar with performing a vulnerability assessment; in that case you should either (a) find an experienced consultant and shadow them, or (b) at least follow the methodology in a guide such as the Certified Ethical Hacker certification manual.
I have several reasons for caution. The first is that unless you're aware of what certain tests are doing behind the curtain of your tool's GUI, you may accidentally run a destructive test - bricking your UPS. Second, like all automated tools, Nessus can produce false positives and false negatives, so treat the baseline an automated tool gives you as a starting point for further research: either to find something you didn't know you had, or to eliminate what looks like a critical vulnerability that doesn't actually apply. I had a third, but I had to run off to do something at work and I forgot... If I remember, I'll come back and post.
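One cheap sanity check along those lines is to cross-reference the scanner's findings against your own inventory of what is actually installed, so findings naming absent products get flagged for false-positive review. A sketch, with entirely hypothetical finding and inventory data:

```python
def triage_findings(findings, installed):
    """Split scanner findings into those naming installed software and
    those naming products absent from the inventory (candidates for
    false-positive review). All data here is hypothetical."""
    confirmed, suspect = [], []
    for f in findings:
        (confirmed if f["product"] in installed else suspect).append(f)
    return confirmed, suspect

installed = {"openssh", "nginx"}          # from your asset inventory
findings = [
    {"id": "F1", "product": "openssh", "issue": "outdated version"},
    {"id": "F2", "product": "iis", "issue": "critical RCE"},  # not installed
]

confirmed, suspect = triage_findings(findings, installed)
print("review as possible false positives:", [f["id"] for f in suspect])
```

This obviously doesn't replace manual verification, but it quickly separates "worth chasing" from "probably noise."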
Sincerely,
Eric B.
@oms Thank you for bringing up this very valid point regarding the physical availability of the units under scan.
I omitted that since I am accustomed to highly available infrastructures, with racks supplied with power from two PDUs, each served by a separate UPS, with management on separate networks.
This may not be the case in someone else's environment, and the implications of accidentally tripping the unit may be quite nasty.
Regards,
Vladimir
Eric,
Can you shed a bit more light on preferences for manual pen-testing of specialized equipment?
I do not possess deep enough knowledge to argue for or against it, but I am trying to figure out whether there is added risk in going manual.
My reasoning is that if Tenable or its competitors actually have manufacturer-specific knowledge, they have likely crafted plugins that take a more targeted approach and may list the caveats in the test descriptions. In the manual process, by contrast, you are forced to rely solely on the competence of the tester and their familiarity with the particular target.
Would this, in your opinion, not increase the probability that some things will be omitted or, conversely, make it likelier to cause exactly the issues you are trying to avoid?
Thank you,
Vladimir
Vladimir,
@vt100 wrote:
Can you shed a bit more light on preferences for manual pen-testing of specialized equipment?
I do not possess deep enough knowledge to argue for or against it, but I am trying to figure out whether there is added risk in going manual.
Sure. These are my thoughts and opinions, and in no way should be considered authoritative best practice.
There is a psychological factor in performing tests outside of a GUI or automated tool that helps the testing process, the report writing, and any oral defense of the findings (which typically occurs when you're evaluating and making claims of deficiency against someone else's configuration): you are much more deliberate throughout, because the work requires much more attention and focus. With an automated or GUI tool, it has been my experience that folks simply "select all," come back when it's done to read the report, and often have no idea how the tool reached its conclusions.
Although understanding your target is a good fundamental practice for all testing, I believe it is in severe danger of being overlooked when using a push-button automated toolkit. A tester planning a manual test is encouraged to obtain source documentation about the target, such as its operating system, installed applications, and patch levels. For example, knowing that a system relies on a certain version of a Red Hat kernel and runs custom applications (a), (b), and (c) will help you identify existing vulnerabilities before initiating a fatal test or several hours or days of unnecessary ones.
The next benefit is in the selection of the tool and the switches and arguments you set, compared with a tool where these choices are hidden behind checkboxes and buttons. Rather than relying on a "select all" function in a GUI tool to do a bunch of magic, you have direct knowledge of the purpose of the command-line tool, the operations it performed, and why you selected that test in the first place.
The final major benefit of running a manual test is that the tester looks at the target system from the vantage point of a threat rather than the perspective of a tool. For example, it's unlikely that running an automated tool will detect a malicious IoT process using a port-knocking protocol. However, running a sniffer on the final wire to the target, filtering out all of the traffic you expect from your initial assessment, can identify the traffic that is unexpected. Building out the tests based on a threat-hunting methodology rather than a tool-based methodology gives you both a much deeper and a much wider look at your target.
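The "filter out the expected, keep the unexpected" step can be sketched as a simple set difference over captured flows. The snippet below works on (src, dst, port) tuples already parsed from a capture; the addresses and the baseline are hypothetical, and in practice you'd feed it flows extracted from a real pcap rather than a hard-coded list.

```python
def unexpected_flows(observed, baseline):
    """Return the observed (src, dst, port) flows that were not in the
    expected baseline established during the initial assessment.
    All flow data here is hypothetical."""
    return [flow for flow in observed if flow not in baseline]

baseline = {
    ("10.0.0.5", "10.0.0.1", 161),   # expected: SNMP to the NMS
    ("10.0.0.5", "10.0.0.2", 443),   # expected: HTTPS management
}
observed = [
    ("10.0.0.5", "10.0.0.1", 161),
    ("10.0.0.5", "203.0.113.9", 4444),  # unexpected outbound connection
]

for flow in unexpected_flows(observed, baseline):
    print("unexpected:", flow)
```

Anything that survives the filter is exactly the traffic your initial assessment didn't predict, which is where the threat-hunting attention goes.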
@vt100 wrote:
Would this, in your opinion, not increase probability that some things may be omitted or, conversely, are likelier to cause the issues you are trying to avoid?
This is why following an established pen test methodology is important, and why I suggested hiring an experienced pen tester and shadowing them for the first run. If your pen test engagement literally consists of them coming in, running Nessus, and leaving you with a printout of the report (I've seen consultants charge customers $1m for this service), then these aren't pros - they're scammers.
Sincerely,
Eric B.
"I've seen consultants charge customers $1m for this service"
Ouch!
Completely on board with your logic. I'm just not sure about the feasibility of manual pen-testing for SMBs. Many of those entities are not even staffed to handle their IT operations properly, let alone security.
Also, I am a bit concerned about the "capture the flag" mentality of some pen-testers.
If you could share from personal experience: what is the typical size of company (by number of employees or revenue, whichever you think applies better) that can afford, and actually engages, professionals capable of skillfully applying the methodology you've described?
Vladimir,
@vt100 wrote:
If you could share from personal experience: what is the typical size of company (by number of employees or revenue, whichever you think applies better) that can afford, and actually engages, professionals capable of skillfully applying the methodology you've described?
This is a tough question. I haven't done this kind of work commercially in a long time.
My clients ranged widely, from multi-billion-dollar pharmas in NJ to small accounting firms with 3 people in HI. The majority of clients who specifically sought pen testing were community banks, community medical centers, and law firms with 50-1000 employees. My billing rate was $135-190 an hour, depending on the services requested. I honestly can't tell you what any of my flat-rate estimates were at this point.
I was once billed out at $400 an hour as an expert witness against another firm that had been engaged to do a pen test but had only run Nessus and printed the report as the final product. They failed to even vet the report: it was full of false positives for products the customer didn't even have installed, and it cost the customer additional losses chasing down where these nonexistent services were supposedly running on their network.
Sincerely,
Eric B.