---
title: Most breaches actually begin in corp
date: 2023-12-07
---
Readers of my blog will know that while I believe Rust is an excellent
tool for developers to leverage when building software, there is a
disconnect between the developers using Rust's features to improve
their software and many of the advocates who talk about the language,
and I believe that disconnect is counterproductive for Rust advocacy.

For example, I see [takes like these][linkedin] frequently, which
generally advocate that if *only* we adopted memory safe languages,
we would solve all security problems in computing forever:

> If it's estimated that writing in a memory safe language prevented
> 750 vulnerabilities (in just one codebase!) and IBM calculated [1]
> the average cost of a data breach is $4.45 million, that's over
> $3.3 *billion* saved by moving to memory safety.

[linkedin]: https://www.linkedin.com/feed/update/urn:li:activity:7138201685847453697/

Don't get me wrong: it sure would be nice to change to a memory safe
language and save $3.3 billion in losses, but in reality it's far
more complicated than that.

Every year, Verizon's security group releases a [Data Breach
Investigations Report][dbir]. These reports are *fascinating*, and I
highly recommend reading them if you're interested in the past year's
notable data breaches and how they actually happened.

[dbir]: https://www.verizon.com/business/resources/Tbcb/reports/2023-data-breach-investigations-report-dbir.pdf

What we learn from these reports is that, in general:

* Over 70% of data breaches involve a human element rather than a
  software vulnerability: for example, a phishing attack or a
  misconfiguration of a service.

* Almost 50% of data breaches involve compromised credentials, such
  as leaked OAuth tokens that did not expire.

* Roughly 15% of data breaches have phishing as their root cause.

* Only about 5% of data breaches come from the exploitation of a
  software vulnerability.

Don't get me wrong -- software vulnerabilities are bad and should be
fixed in an expedient manner. However, to circle back to the example
I quoted earlier: if we consider a data breach to carry a price tag
of $4.45 million, and we are talking about 750 security incidents in
practice, then only about 38 of those incidents would have the
potential to have memory safety as their root cause -- a much smaller
total of $169.1 million that could be attributed to memory safety.

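To make the arithmetic explicit, here is the calculation as a small
Rust program (the incident count and per-breach cost come from the
quoted post; the ~5% share comes from the DBIR figures above):

```rust
// Back-of-the-envelope check of the figures quoted in this post.
fn main() {
    let incidents = 750.0_f64;        // vulnerabilities cited in the LinkedIn post
    let cost_per_breach = 4.45e6_f64; // IBM's average cost of a data breach, USD
    let vuln_share = 0.05_f64;        // ~5% of breaches exploit a software vuln

    // 750 * 0.05 = 37.5, rounded up to 38 incidents.
    let memory_safety_incidents = (incidents * vuln_share).round();
    let attributable_cost = memory_safety_incidents * cost_per_breach;

    assert_eq!(memory_safety_incidents, 38.0);
    assert_eq!(attributable_cost, 169.1e6); // $169.1 million, not $3.3 billion
    println!("{} incidents, ${}", memory_safety_incidents, attributable_cost);
}
```
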
The point is not that we shouldn't refactor, or even rewrite,
software to improve its memory safety. But we should be honest about
why we are doing it. While memory safety *is* important, the real
benefit of this refactoring work is improving the *clarity* of the
underlying software's technical design: technical constraints can be
enforced using Rust's trait system, for example -- a form of
behavioral modeling.

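As a sketch of what that behavioral modeling can look like -- the
types and trait below are hypothetical, not from any real project --
a design rule like "only validated input may reach the storage layer"
can be turned into a compile-time guarantee:

```rust
// Hypothetical example: encode "only validated input reaches storage"
// in the type system. All names here are illustrative.

struct RawInput(String);
struct ValidatedInput(String);

trait Storable {
    fn payload(&self) -> &str;
}

// Only the validated type implements the trait, so `store` cannot be
// called with unvalidated data -- the compiler rejects the attempt.
impl Storable for ValidatedInput {
    fn payload(&self) -> &str {
        &self.0
    }
}

fn validate(input: RawInput) -> Option<ValidatedInput> {
    // Toy validation rule: reject anything containing a quote character.
    if input.0.contains('\'') {
        None
    } else {
        Some(ValidatedInput(input.0))
    }
}

fn store<T: Storable>(record: &T) -> usize {
    // A real implementation would write to a database; here we just
    // report the payload length.
    record.payload().len()
}

fn main() {
    let raw = RawInput("hello".to_string());
    let validated = validate(raw).expect("input should pass validation");
    assert_eq!(store(&validated), 5);
    // store(&RawInput("x".to_string())); // does not compile: not Storable
    println!("ok");
}
```

The design choice is that `RawInput` simply never implements
`Storable`, so storing unvalidated data becomes a type error rather
than a code-review finding.
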
By leveraging features such as traits to enforce behavioral correctness
of the code you are writing, you wind up having a much better
vulnerability posture *overall*, not just in the area of memory safety.
This is the reason why refactoring software to use code written in Rust
and other modern languages with these features is advantageous.

This is a far more interesting story than the talking points about
memory safety I hear. At this point, with features such as `FORTIFY`
and AddressSanitizer (ASan), it is possible to address memory safety
defects without having to go to such lengths to refactor pre-existing
code.

Features like ASan do not even have to carry significant runtime
performance penalties. To illustrate the point: in 2021, Justine
Tunney proposed building a modified version of Alpine with ASan
enabled, using a production-tuned variant of [the ASan runtime
included in her Cosmopolitan libc project][cosmo-asan]. It was
estimated that enabling ASan with this runtime would result in only a
3 to 5% performance reduction over code built without ASan. Adopting
this work would have immediately derisked the use of memory unsafe
code in all packages, as they would be built with ASan by default.

[cosmo-asan]: https://github.com/jart/cosmopolitan/blob/master/libc/intrin/asan.c

And, of course, even with the borrow checker, traits, type
enforcement, and the other code verification features provided by the
Rust compiler, you still have `unsafe {}` blocks, and the Rust
compiler provides support for ASan as a mitigation for these blocks.
So you *still* need ASan even in a memory safe world: even when you
build perfect memory safe abstractions over a memory unsafe world,
you are still building on top of a memory unsafe world.

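To make this concrete, here is a minimal illustration (my own
example, not from any particular codebase) of a safe wrapper whose
soundness rests on a hand-written invariant that the borrow checker
cannot verify:

```rust
// A safe wrapper over an unsafe operation: the abstraction is sound
// only if the documented invariant actually holds.
fn first_byte(bytes: &[u8]) -> Option<u8> {
    if bytes.is_empty() {
        return None;
    }
    // SAFETY: we just checked that the slice is non-empty, so reading
    // index 0 is in bounds. The borrow checker cannot check this for
    // `get_unchecked`; if the emptiness check above were ever removed,
    // only a runtime tool like ASan would catch the resulting
    // out-of-bounds read.
    Some(unsafe { *bytes.get_unchecked(0) })
}

fn main() {
    assert_eq!(first_byte(b"hi"), Some(b'h'));
    assert_eq!(first_byte(b""), None);
    println!("ok");
}
```

rustc's sanitizer support (the nightly `-Zsanitizer=address` flag)
can instrument exactly this kind of block at runtime, which is why
ASan remains useful even in a memory safe world.
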
The point here isn't that these abstractions are meaningless. They do
provide significant harm reduction when working with otherwise memory
unsafe interfaces, but even the most perfect abstraction is still, by
its very nature of being an abstraction, leaky. Instead, we should
recognize *why* Rust improves memory safety, and how the techniques
which improve memory safety can also be used to enforce elements of
the underlying software's design at compile time. This is a much
better story than the handwaving I usually see about memory safety
from advocates.