[Info-vax] A few tools for improving software security

Arne Vajhøj arne at vajhoej.dk
Fri Jan 19 19:42:09 EST 2024


On 1/19/2024 4:46 PM, John Dallman wrote:
> Hundreds of researchers are only a few times as effective as one. I've
> done some of this, and worked with people who do more: given good tools,
> a few man-months can make significant improvements. In the order in which
> they were used:
> 
> Valgrind is a binary instrumentation tool: it decompiles an executable
> into a higher-level abstract machine language, inserts instrumentation
> and recompiles that into the machine language. It doesn't need source,
> although it does need debug symbols. https://valgrind.org/ It gets used
> for building many kinds of execution monitors and checking tools.
> 
> The only one I've used is Memcheck, which finds usage of uninitialised
> data, by the brute-force technique of tracking the initialisation status
> of every byte of memory in the process. This is too slow to use in
> production, but I spent a few months tracking down and fixing all the
> uses of uninitialised memory, and we basically stopped getting
> unrepeatable run-time errors. Now one of the junior engineers keeps it
> running through all our tests on a regular basis, fixing newly added
> uses of uninitialised data.
> 
> Our QA team use another Valgrind tool, Callgrind, to track which files and
> functions are called by all of our tests. Keeping this up to date keeps a
> few machines busy full-time, but it's invaluable. The data is kept in an
> SQL database in inverted form, so when developers change source, they can
> find out which test cases go through the changed files in seconds, and
> run those test cases to find and fix regressions before they release
> changes into source-management.
> 
> Valgrind runs on Unix-like operating systems. I run it on x86-64 Linux,
> which is by far the most-used platform.

Those tools are mostly C/C++-centric, right?

If one were to be a little cynical: tools to mitigate those languages'
lack of the strictness found in other languages.
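
To make the Memcheck discussion above concrete, here is a minimal sketch
(function name and buffer size are mine, not from the post) of the classic
bug it reports, and the fix:

```c
#include <stdlib.h>
#include <string.h>

/* A classic bug Memcheck catches: malloc() returns uninitialised
 * memory, so reading it before writing is undefined:
 *
 *   char *buf = malloc(64);
 *   size_t n = strlen(buf);   // Memcheck: "Conditional jump or move
 *                             //  depends on uninitialised value(s)"
 *
 * The fixed version writes every byte it will later read: */
size_t copied_length(const char *src)
{
    char *buf = malloc(64);
    if (buf == NULL)
        return 0;
    strncpy(buf, src, 63);    /* pads the rest with '\0' if src is short */
    buf[63] = '\0';           /* every byte strlen() can reach is defined */
    size_t n = strlen(buf);
    free(buf);
    return n;
}

/* Run the whole program under Memcheck with, e.g.:
 *   valgrind --tool=memcheck --track-origins=yes ./a.out
 */
```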

> Static analysis starts with a compile-like process, in which you run all
> your source through an analysis tool that behaves like a compiler.
> There's then a pretty heavyweight process of matching up all the
> functions, variables and possible execution paths, and then you have data
> that can be used to spot security problems, via further automatic
> analysis. It's kind of like "lint" on a whole-program basis. The tool I
> use is Coverity, which is commercial, and got picked for us by the larger
> company. It isn't brilliant, but it works. It runs on Windows, Linux and
> Mac.
> 
> <https://www.synopsys.com/software-integrity/static-analysis-tools-sast/coverity.html>

I have heard good things about Coverity.

There are alternatives. Commercial: CAST, Klocwork, etc. Open source:
SonarQube, PMD, etc.

Such tools can actually find a lot.
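
A typical example of what they find (my own sketch, not from the post) is
a path-sensitive resource leak, which Coverity-style tools report under
checker names like RESOURCE_LEAK:

```c
#include <stdio.h>

/* The defect pattern: on the early-return path the handle escapes.
 *
 *   FILE *f = fopen(path, "r");
 *   if (fgetc(f) == EOF)
 *       return -1;            // leak: f is never closed on this path
 *   fclose(f);
 *
 * The fixed version checks fopen() and closes on every path: */
int first_byte(const char *path)
{
    FILE *f = fopen(path, "r");
    if (f == NULL)             /* unchecked fopen() is also flagged */
        return -1;
    int c = fgetc(f);          /* EOF for an empty file */
    fclose(f);                 /* closed on every path */
    return c;
}
```

The point is that the analyser follows every execution path through the
function, which is exactly what human reviewers tend to miss.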

> Fuzz testing is the main thing that security researchers do. Digging
> through disassembly listings for vulnerabilities by eye is very slow and
> unreliable. You need to be able to run the software you're "fuzzing" and
> to be able to feed it a data file. You use a fuzz testing tool which runs
> the software and monitors it for segmentation violations and other
> exceptions. It also mutates the data file, by modifying, inserting and
> removing bytes. A bit of correlation analysis lets it find its way to
> minimal test cases for exceptions. This burns up a fair bit of CPU power:
> people in my team have had four or eight threads on an old x86-64 box
> running for about a year. In that time, we've found and fixed a bit more
> than one defect per working day.
> 
> There are lots of frameworks for fuzzing, but AFL
> <https://en.wikipedia.org/wiki/American_Fuzzy_Lop_(software)> is
> considered pretty good, and is what we use.
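
For readers who have not seen one, a fuzz target is usually just a small
harness around a parser. This is a minimal sketch (the format and function
name are invented for illustration) of the kind of bounds check a fuzzer
exercises:

```c
#include <stdint.h>
#include <stddef.h>

/* parse_record: hypothetical parser for a tiny length-prefixed format
 * (one length byte, then the payload). Returns the payload length, or
 * -1 on malformed input. */
int parse_record(const uint8_t *data, size_t size)
{
    if (size < 1)
        return -1;
    size_t len = data[0];
    if (len > size - 1)        /* without this check, a fuzzer finds the
                                  buffer over-read almost immediately */
        return -1;
    return (int)len;
}

#ifdef FUZZ_MAIN
/* AFL-style harness: read the input file AFL hands us on the command
 * line and feed it to the parser; AFL mutates the file and watches the
 * process for crashes, not exit codes. Typical invocation:
 *   afl-fuzz -i testcases -o findings -- ./target @@
 */
#include <stdio.h>
int main(int argc, char **argv)
{
    if (argc < 2)
        return 1;
    FILE *f = fopen(argv[1], "rb");
    if (f == NULL)
        return 1;
    uint8_t buf[4096];
    size_t n = fread(buf, 1, sizeof buf, f);
    fclose(f);
    parse_record(buf, n);
    return 0;
}
#endif
```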

Arne

More information about the Info-vax mailing list