Hi all,
why are output and input controls useful to mitigate data contamination?
From my point of view it has something to do with throughput, so my only guess was that they mitigate data throughput.
Does anyone have an idea why they mitigate data contamination?
Thanks
OliLue
Garbage in/Garbage out.
If you can control what goes into a system, process, storage, function, or programme and limit it to correct content/format - you know its lineage, it's traceable - you can do a whole lot. But if you're not geared for it, it's expensive…
Data flow mapping in the privacy world is a good example.
Network storage: scan what you put in for malware, scan what you take out. Clean pipe.
APIs that really control what they accept will stop you getting badly formatted cruft.
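To make that concrete, here's a minimal sketch of an API endpoint that only accepts well-formed input. The field names and types are illustrative assumptions, not from any real API:

```python
import json

# Hypothetical required fields for an API payload (illustrative only).
REQUIRED = {"user_id": int, "email": str}

def accept(raw: bytes):
    """Parse and check a payload; reject badly formatted cruft early."""
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError:
        return None  # not even valid JSON
    if not isinstance(payload, dict):
        return None  # valid JSON, but not the shape we expect
    for field, ftype in REQUIRED.items():
        if not isinstance(payload.get(field), ftype):
            return None  # missing or wrong-typed field
    return payload

print(accept(b'{"user_id": 7, "email": "a@b.com"}'))  # accepted dict
print(accept(b'{"user_id": "7"}'))                    # None (wrong type, missing field)
print(accept(b'not json at all'))                     # None
```

Everything downstream of `accept` can then assume the data is in a known-good shape, which is exactly the "clean pipe" idea.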
You might consider looking at the Biba model, which prevents an entity writing to a higher integrity level than allowed, to preserve the integrity of information at that level. It's not the same thing, but it constrains where I/O can go.
In all cases you're hoping to control what goes in and what comes out based on classification, type, source, destination - a really good example is having applications parameterise the SQL queries they generate on the fly, so you can cut injection etc. off at the source.
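A quick sketch of that last point, using Python's built-in `sqlite3` with a toy in-memory table (the table and values are made up for illustration):

```python
import sqlite3

# Toy in-memory database with an illustrative users table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

# Untrusted input that would subvert a naively concatenated query string.
user_input = "alice' OR '1'='1"

# Parameterised query: the driver treats user_input strictly as data,
# never as SQL, so the injection payload simply matches no rows.
rows = conn.execute(
    "SELECT id FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # []

# The legitimate value still works through the same placeholder.
rows = conn.execute(
    "SELECT id FROM users WHERE name = ?", ("alice",)
).fetchall()
print(rows)  # [(1,)]
```

The `?` placeholder is what cuts injection off at the source: the query's structure is fixed before any user data touches it.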
If you examine the OWASP Top 10 or SANS Top 25 vulnerabilities you'll find that some of the common vulnerabilities relate to systems accepting unsanitised input, e.g. XSS and SQL injection. Also have a look at MITRE CWE-20. Data can be parsed as code very easily.
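The CWE-20 style fix is allowlist validation at the point of input. A minimal sketch, where the field names and patterns are assumptions chosen for illustration:

```python
import re

# Hypothetical allowlist patterns: accept only the formats we expect,
# rather than trying to blocklist every dangerous character.
PATTERNS = {
    "username": re.compile(r"^[A-Za-z0-9_]{3,20}$"),
    "amount": re.compile(r"^\d+(\.\d{1,2})?$"),
}

def validate(field: str, value: str) -> bool:
    """Return True only if value matches the allowlisted pattern for field."""
    pattern = PATTERNS.get(field)
    return bool(pattern and pattern.fullmatch(value))

print(validate("username", "OliLue"))                     # True
print(validate("username", "<script>alert(1)</script>"))  # False - XSS payload rejected
print(validate("amount", "1250.00"))                      # True
print(validate("amount", "1; DROP TABLE users"))          # False - injection attempt rejected
```

Because only known-good shapes get through, data that looks like code (script tags, SQL fragments) never reaches a parser that might execute it.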
In terms of output validation, it's always a good idea to have reasonableness checks on values, e.g. is it reasonable that I'm paying several million to a small supplier or individual? Is the data in a field even of the correct format or data type, or am I pushing the challenge of validation to a downstream system, which may itself omit any sort of validation?
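An output-side reasonableness check along those lines might look like this. The payee tiers and thresholds are invented for illustration, not real business rules:

```python
# Hypothetical per-payee-type limits for a payment leaving the system.
REASONABLE_LIMITS = {
    "individual": 10_000,
    "small_supplier": 100_000,
    "enterprise": 10_000_000,
}

def payment_is_reasonable(payee_type: str, amount: float) -> bool:
    """Flag payments that exceed what we'd expect for this payee type."""
    limit = REASONABLE_LIMITS.get(payee_type)
    if limit is None:
        return False  # unknown payee type: fail closed
    return 0 < amount <= limit

print(payment_is_reasonable("small_supplier", 4_500.00))   # True
print(payment_is_reasonable("small_supplier", 3_000_000))  # False: several million to a small supplier
print(payment_is_reasonable("contractor", 500))            # False: unrecognised payee type
```

The point is that the check runs before the value leaves your system, so a downstream consumer with no validation of its own never sees the contaminated output.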