Microsoft Zero-Days Sold and Then Used

Yet another article about cyberweapons arms manufacturers and their particular supply chain. This one is about Windows and Adobe Reader zero-day exploits sold by an Austrian company named DSIRF.

There’s an entire industry devoted to undermining all of our security. It needs to be stopped.

Posted on July 29, 2022 at 10:08 AM • 30 Comments

Comments

Jenkins July 29, 2022 10:23 AM

Is the “industry” a child of the intelligence agencies?
That’s the basis of the book “This Is How They Tell Me the World Ends.”

Leo July 29, 2022 10:30 AM

How can this industry be stopped? Has there ever been an industry that has been effectively stopped?

Measures like making it illegal may slow or disincentivize the participants, but as long as there are motives for undermining our security, there will always be a market to do so.

I fear that the only way to accomplish this is to outlaw general purpose computing.

Quantry July 29, 2022 11:20 AM

Why not rather incent software makers who traditionally sell us betas with back doors?

Seems like finding zero-days is a good thing.
Punish those who produce them,
those who fund their creation,
and those who use them illegally.

Or simply don’t trust.

Maq Maq July 29, 2022 11:31 AM

Governments should outlaw, honeypot, prosecute, jail, seize assets, and allow victims to sue. Let’s make their lives miserable; they deserve it.

Frank B. July 29, 2022 1:00 PM

Maybe the solution is that the U.S. government should sift through all Microsoft code line by line to see if they can find any security problems the same way they’re planning to do with open source software.

This way they can “get two birds stoned at once” as Ricky on Trailer Park Boys says.

lurker July 29, 2022 1:38 PM

@Bruce

There’s an entire industry devoted to undermining all of our security. It needs to be stopped.

One way to stop an industry, or rather pause it until it finds a workaround, is to stop its supply of raw materials. It’s long past time that Microsoft and Adobe were stopped from selling rubbish.

They say there’s something called a free market where rubbish can be sold to anyone foolish enough to buy. But some items are regulated in the name of public safety, e.g. motorcars and electricity supply. What will it take for software to be regarded as a matter of public safety?

Clive Robinson July 29, 2022 2:46 PM

@ Bruce, ALL,

Re : Economics of sin.

“There’s an entire industry devoted to undermining all of our security. It needs to be stopped.”

It’s not really an “industry” but an “economy”, and we need to think of it that way, especially as its value easily rivals the GDP of several smaller nation states.

Whilst it is not actually illegal to sell zero days[1], those involved often behave as though it might be. So in many respects the various stages of the total supply chain have many hallmarks of a “Black Market”, much the same as the “Arms Trade” that many Western Governments publicly decry, yet keenly encourage in various ways.

But without a doubt it is an economy based on a market that satisfies the basic rules of,

“Supply and Demand”

The supply side is abundant[3], and almost every day a new demand class arises.

Stopping either would at best be difficult, verging on impossible.

Perhaps the easiest to change would be the production methods of the commercial / consumer software industry. NASA, amongst one or two others, has methods that can deliver defect-free software, as well as hardware systems that detect and work around hardware defects. However the processes involved do not sit at all well with the way we produce commercial consumer systems. Some argue that it would “kill the ICT industry” and thus the modern economy…

[1] It’s almost impossible to come up with a legally suitable definition of a zero day, or any other potential privacy or security violating “error”, as I know from first-hand experience. It has been tried for over four decades and has failed every time; even though legislators have gone for ridiculously broad over-scoping, they still fail[2]. As a result it’s become a perception issue and one of those,

“We don’t ask and you must not tell us”

“Look the other way” markets, where neither selling nor buying is technically illegal or even unlawful, as long as you have the fig-leaf pretence of no knowledge of what “the good” is going to be used for, or who it might get passed on to as an “end user”. So effectively the market has,

“Chinese Walls at every step of the supply chain.”

[2] Mostly, the legal definition failure with regard to zero days and their ilk is because of three main issues,

1, Any error is a vulnerability[3].
2, The “Observer Point of View” issue[4].
3, An error is “assumed not” to be an attack[5].

Of which the second, almost by definition, is not something that can be measured and thus fairly judged[4].

[3] As we know, “To err is human”, and there is the old statistic, going back at least as far as Fred Brooks’s “The Mythical Man-Month” essays of the mid-1970s, of,

“One error in every five lines of code”

Whether or not it is true does not matter, because it became first perceived wisdom, then mantra, and now a saw, or saying. The reality is the “Bug:LOC” measure, with rapid “knock it out, then fix” coding giving 1:10 after “knocking it out”. Which, with a degree of “fix”, can get down to 0.1:1k in commercial “release” code by the likes of Microsoft. With strong processes NASA has 0:0.5M examples. Unfortunately things are not linear: doubling LOC rather more than doubles bugs. Likewise the effort to halve bugs is rather more than double, and rises exponentially. Importantly, what is often forgotten is that all errors / bugs can, under the right circumstances, be exploited as either an attack or a step in an attack. With that as a given, the problem is that the attacker has “agency”, thus all the “random” chance / probability arguments normally used with the likes of risk analysis cease to be applicable.
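
As a rough, hypothetical illustration of what those ratios mean at scale (naively assuming linear scaling, which as just noted understates the problem, and an assumed 50 million LOC codebase):

```typescript
// Back-of-the-envelope sketch only: expected residual defects at the
// defect densities quoted above, for an assumed 50 million LOC codebase.
const LINES_OF_CODE = 50_000_000;

// defects per line of code at each stage / process maturity
const defectDensities: Record<string, number> = {
  '"knock it out" code (1 : 10)': 1 / 10,
  'commercial release code (0.1 : 1k)': 0.1 / 1_000,
  'high-assurance process (approaching 0 : 0.5M)': 0,
};

for (const [stage, density] of Object.entries(defectDensities)) {
  const expected = Math.round(LINES_OF_CODE * density);
  console.log(`${stage}: ~${expected.toLocaleString()} expected defects`);
}
// Prints roughly 5,000,000 / 5,000 / 0 respectively.
```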

[4] There is a major issue in deciding good/bad or right/wrong, which is that it is,

1, Entirely subjective.
2, Context sensitive.
3, Changes with time.

Even killing is subjective, with little wrong attached when the context is “self defence”. Worse, the judgment is made by a later “Observer” in the context of a probably shifting “Point of View” (POV). So whilst you might start out thinking something is good/right, with time your POV can change, or be changed, so you end up believing it is bad/wrong.

[5] A continuous failing we see in the ICT industry is that an error or bug is “assumed” to be by accident, not design, and therefore is not an attack or a step in an attack. The reality is the opposite is becoming more likely. For instance “hard coded passwords” are still mainly considered a mistake, not an implanted backdoor, especially if they look like they were part of a test harness used during development that got left in by “accident”. Many would have hoped “SolarWinds” would have been a wake-up call, but no, it was a five-minute wonder and is now all but forgotten less than a year later. Even at the time SANS published an article[6] that said,

“Supply chain attacks are not common”

It was then, and still is now, completely untrue. Supply chain attacks, to my knowledge, have been going on for longer than the ICT industry has existed[7], where “knock-offs” get substituted in the physical commodity supply chain. The first software ICT supply chain issue to hit the MSM was probably the PC malware (RavMonE.exe) being loaded onto some Apple Video iPods somewhere in production back in 2006.

[6] https://www.sans.org/blog/what-you-need-to-know-about-the-solarwinds-supply-chain-attack/

[7] Supply chain attacks are also known as “passing off” and almost certainly go back before recorded history. One fun one was back in the “spice trade” days, when some spices, such as peppercorns and nutmeg, were incredibly valuable, with Portuguese sailors carving fake nutmegs out of wood and profiting by passing them off as real ones back in the 1500s. Something that apparently was still going on in the US as late as the 1800s.

software engineer July 29, 2022 3:03 PM

Some of you are effectively asking “what would it take to force all software to be made in a way that is guaranteed absolutely bug free”?

I’ll take a stab at answering that. It is possible, yes. But it would cost ONE THOUSAND TIMES more… That cheap piece of software you paid $30 for? $30,000, please?

Oh, you think open source is the solution? Not to this problem it’s not. Imagine everyone going to prison unless they all spend a thousand times more hours (for free!) writing everything… watch open source start shrinking…

Or, you’d just end up with a black market of more reasonably priced stuff from out of the country, because you can’t mandate such ridiculousness worldwide, can you?

NASA can afford to spend $23 million on a toilet (google it), but the average person cannot. Oh, I guess I may be underestimating by a few orders of magnitude… sigh.

software engineer July 29, 2022 3:30 PM

To those who might think that it’s impossible to have absolutely bug free code, look here: https://en.wikipedia.org/wiki/Formal_verification

Here’s a hint: yes, it’s possible! It’s relatively easier in a very small code base. It quickly becomes exponentially harder with a larger code base. Harder equals much more expensive (in time, money, or whatever measurement of effort you use, even if it’s the number of times you have to threaten your prison workers, which is what the whole world would be under any regime enforcing this).
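
To give a feel for the small end of the scale, here is a toy sketch in Lean 4 (a hypothetical example; the function and theorem names are made up, and tactic availability varies with toolchain version):

```lean
-- Toy illustration: a one-line function and a machine-checked proof
-- of one property about it. Real systems need very many such proofs,
-- and the proof effort grows far faster than the code it covers.
def double (n : Nat) : Nat := n + n

theorem double_is_even (n : Nat) : double n % 2 = 0 := by
  unfold double
  omega   -- decision procedure for linear integer arithmetic
```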

software engineer July 29, 2022 3:49 PM

All of this is not to say that it’s impossible to improve anything. It absolutely can be improved from where it is now, without ending the world.

For example, programmers could be taught what “Formal Verification” actually is. Which is to say, they could be taught that it exists, firstly. After that, they could be taught the value of an automated test suite. Nobody should be graduating with those kinds of degrees anymore, unless they’ve stopped writing test-less code. “Red-green-refactor” should be the mantra in every programming class from the very first introductory line of code written all the way through graduating university. No short cuts allowed.
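
As a minimal sketch of the test-first idea (hypothetical TypeScript using Node’s built-in test runner; the function and its rules are made up for illustration):

```typescript
// Illustrative only: write the failing test first (red), then the
// smallest implementation that passes (green), then refactor.
import { test } from 'node:test';
import assert from 'node:assert/strict';

// Hypothetical function under test.
function isStrongPassword(pw: string): boolean {
  return pw.length >= 12 && /[0-9]/.test(pw) && /[a-z]/i.test(pw);
}

test('rejects short passwords', () => {
  assert.equal(isStrongPassword('abc123'), false);
});

test('accepts long mixed passwords', () => {
  assert.equal(isStrongPassword('correcthorse42battery'), true);
});
```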

Additionally, some of the principles of “Formal Verification” could be worked into existing software frameworks and languages in a way that doesn’t impose such a crazy burden on everyone.

With these two things, I believe it will start to become more commonly known, even among those who are not formally trained in programming. A large percentage of programmers are self-taught. A huge percentage of open source is written by self-taught programmers, because of the low barrier to entry. Even me: my degree is in business, I’ve been coding since I was 12, and I’m 49 now.

And finally, it doesn’t have to be a binary thing: 100% bug free vs 100% full of bugs. All we have to do is eke it up a bit to the better and we have a great improvement! And then don’t be satisfied with that, keep eking it up some more… forever.

software engineer July 29, 2022 4:16 PM

It’s already happening to some degree… for example, typeless languages like Perl, Ruby, and even JavaScript are on the decline relative to strongly typed languages like TypeScript. Why? Because it’s easier to write less-buggy code when you use a good editor together with core features of the language to make sure your variables are definitely the things you expect them to be! So a misspelling causes loud red highlights in your editor instead of silently causing very difficult-to-find bugs. Also, languages like C++ and Go have mostly usurped languages like C in new projects… Search for pointers vs references, for example 🙂
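
A hypothetical sketch of the kind of one-character misspelling a type checker catches at compile time rather than at runtime:

```typescript
// Illustrative sketch only: a tiny typo that a type checker turns into
// a compile-time error instead of a silent runtime bug.
interface User {
  userName: string;
  isAdmin: boolean;
}

function greet(user: User): string {
  // return `Hello, ${user.usrName}`;
  //   ^ TypeScript: Property 'usrName' does not exist on type 'User'.
  //   In untyped JavaScript this would silently yield "Hello, undefined".
  return `Hello, ${user.userName}`;
}

console.log(greet({ userName: 'Ada', isAdmin: false }));
```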

This could be more widespread though. And it could become more common at lower and lower levels…. even hardware. Don’t be satisfied with where we are, always seek to improve.

Jonathan Wilson July 29, 2022 6:23 PM

A big part of the problem is that law enforcement and intelligence agencies across the western world have decided that being able to take advantage of vulnerabilities in software to allow spying on (or collecting evidence on) bad guys (whether they be criminals of all sorts, terrorists, “enemy” countries or otherwise) is more important than working to close these vulnerabilities so as to keep these devices secure from said bad guys.

A good example is when the FBI decided it would rather drop the charges against people it claimed were guilty of child pornography offenses than be forced to reveal exactly how it hacked the Tor Browser and identified these individuals.

Ted July 29, 2022 9:51 PM

@software engineer

It quickly becomes exponentially harder with a larger code base.

Looking at Microsoft’s blog post on this, I don’t envy the work it takes to analyze and guard against cyber mercenary attacks on their products.

From the book “Modern Operating Systems”:

The source code for Windows is over 50 million lines of code. The source code for Linux is over 20 million lines of code… It would take a bit over seven bookcases to hold the full code of Windows 10. Can you imagine getting a job maintaining an operating system and on the first day having your boss bring you to a room with these seven bookcases of code and say: ‘‘Go learn that.’’ And this is only for the part that runs in the kernel.

https://www.microsoft.com/security/blog/2022/07/27/untangling-knotweed-european-private-sector-offensive-actor-using-0-day-exploits/

https://twitter.com/herbertbos/status/1537833750138060805

SpaceLifeForm July 29, 2022 11:00 PM

@ Ted, software engineer, Clive

It quickly becomes exponentially harder with a larger code base.

The codebase expands because of new hardware.

There is always a new driver. Not well thought out.

Many, many exploits abuse drivers.

Just to be clear, I am talking about kernel code, not userland.

Userland code expansion can just be a side effect.

Richard July 30, 2022 2:52 AM

Why not tax the big software companies to make a large pot of money, then use it to reward whitehats who find zero-day exploits and publish them publicly to force them to be fixed? The reward could be scaled to the potential risk envelope of the exploit. For example a zero-day for Windows that could impact every Windows user could be rewarded $10m.

Clive Robinson July 30, 2022 3:35 AM

@ SpaceLifeForm, software engineer, Ted, ALL,

Re : Hardware needs software that needs hardware.

“There is always a new driver. Not well thought out.”

But new hardware also brings “new capabilities”, and that necessitates other changes as well, not just drivers.

In the traditionally designed OSs on personal computers and up, that means changes not just to the drivers, but above the driver in kernel space, and in other parts of the OS, to make the new capability available to userland in all the ways it can be.

That can be an increasingly thick wedge of software as you go up the stack.

It’s one of the reasons, as @Nick P used to point out, we needed to move towards “User Land I/O” with properly designed IOMMUs.

But it does not entirely solve the problem.

We currently have a single memory space model. That is, all the RAM is core, which is the root cause of attacks via the likes of “RowHammer”.

If you can attack the memory where the MMU tables are stored then the MMU becomes ineffective as a privacy / security measure.

This is the essence of a “bubbling up attack” where a user can make an attack deep down in the computing stack. That is, below the ISA layer, below the CPU layer, below the virtualisation layer, right down through the logic gate layer down to the “analog layer” of the silicon where manufacturing choices are exploited.

Just about everything in modern computer design rests on one no-longer-true assumption,

“RAM will securely store ‘state'”.

Have a think on how the flipping of just one bit can cause problems all the way up the computing stack.
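
As a toy sketch (hypothetical flag layout, nothing like a real page-table entry format): flip one bit in a permissions field and a “read-only” mapping silently becomes writable.

```typescript
// Toy illustration only; real page-table entries are far more complex.
// A single flipped bit in a permissions field changes the meaning of
// everything built on top of it.
const PRESENT  = 0b001;
const WRITABLE = 0b010;
const USER     = 0b100;

let pteFlags = PRESENT | USER; // a present, user-readable, read-only page

const isWritable = (flags: number) => (flags & WRITABLE) !== 0;
console.log(isWritable(pteFlags)); // false: the page is read-only

// One bit flips (e.g. a RowHammer-style disturbance error) ...
pteFlags ^= WRITABLE;

console.log(isWritable(pteFlags)); // true: the "read-only" page is now writable
```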

At the end of the day one of the EmSec / TEMPEST fundamental ideas is,

“Isolation and segregation”

Of hardware and data flows. Unfortunately it goes against the specmanship drive for getting the most bang for your buck out of every part. Hence, as I’ve previously pointed out, you run into,

“Efficiency -v- Security”

Issues big time. RowHammer came about due to an “Efficiency -v- Security” compromise in DRAM chip design. It’s not the only one; there are many more. All those little nasties in the CPU that have followed Spectre and Meltdown are again down to “Efficiency -v- Security” compromises in CPU design.

It does not matter how many formal methods or other top-down methods you apply to software: if an attacker can destroy the fundamental axiom of them all, that “state stored in core will not change”, then all their security promises vanish.

It’s one of the reasons why the “Castles -v- Prisons” design was started. An idea before its time, but the clock can clearly be heard ticking.

TimH July 30, 2022 8:55 AM

“There’s an entire industry devoted to undermining all of our security. It needs to be stopped.”

It won’t be gubmints that stop it, because they want the product. After all, Obama let AT&T off for illegally tapping calls for the NSA in at least that switching room in San Francisco.

Look at DHS buying people’s data from the surveillance data brokers that aggregate stuff sent back by phone apps, for a more recent example.

Clive Robinson July 30, 2022 9:33 AM

@ TimH, ALL,

“It won’t be gubmints that stop it, because they want the product.”

Yup, and if it’s not the “government” it will be the “Law Enforcement Agencies” and even the local/regional civil servants.

When RIPA first surfaced in the UK under David Blunkett as part of the Tony Blair government, the list of agencies and people who were in effect given the right to spy on you was very long and frankly quite shocking.

In the UK we have the “Association of Chief Police Officers” (ACPO), who have at every point cried out for greater data access powers and have consistently stated factual inaccuracies about “End to End Encryption” (E2EE).

And by US standards the UK is so liberal in its attitudes that some think we must all be “lily-livered Commies” or some such…

Society as we know it can only survive if there is “privacy”. The fact that the Government and its agencies want to destroy privacy tells you that they also want to destroy society. History tells us that this sort of destruction of society gives rise in the short term to a “Police State”, but in time even that gets destroyed and you end up back at that awful system of the “King Game”…

If you look carefully you can identify not just politicians but corporate and religious leaders lusting over such power to control.

There is little doubt that such behaviour is an aberrant mental disorder and currently incurable…

So no, there is not any government that will realistically stop this. If they try then more likely than not there will be “An unfortunate series of events” as history has shown.

Michael Lowe July 30, 2022 10:55 AM

The more features and functionality added to the application, the larger the code base, and thus the greater the possibility of bugs and zero-day vulnerabilities.

MK July 30, 2022 7:41 PM

Geez. You guys, especially the self-taught ones, who think they know all about formal proofs of program correctness need to go (back) to school. Formal proofs sometimes work with toy programs, but fall down completely when you get to co-routines, threaded execution, and network interactions.

I think that professional code is much better than 1 error in 5 lines. However:
hxxps://dilbert.com/strip/1995-11-13

SpaceLifeForm July 30, 2022 9:25 PM

@ MK, Security Sam, ALL

A bit of Haiku

Trusting formal proof
Security theatre
Silicon turtles

Building a toolchain
Reflections on trusting trust
Code is not simple

The problems are hard
There is no strong guarantee
In the universe

I have solution
One can believe that nonsense
Cosmic ray says no

But I can Observe
You do not see reality
It is illusion

But there are turtles
I can see the earth is flat
Compiler problem

Clive Robinson July 31, 2022 2:54 AM

@ MK,

“I think that professional code is much better than 1 error in 5 lines.”

I would hope so, especially with modern syntax highlighting editors and the like.

But… “It’s what the mantra says” from back in Fred Brooks’s essay days so long, long ago.

As I said, Microsoft say about one bug in 10,000 LOC in releases, and as indicated NASA has half a million LOC without errors. So we know better, a lot better, can be achieved. The question is at “what cost”…

Because around one error in a thousand lines of code is about where “Specification bugs” start creeping through the door, which takes things into a whole different territory.

MK July 31, 2022 11:12 AM

@Clive Robinson,
“Because around one error in a thousand lines of code, is about where “Specification bugs” start creeping through the door, which takes things into a whole different territory.”

1999
“NASA lost a $125 million Mars orbiter because a Lockheed Martin engineering team used English units of measurement while the agency’s team used the more conventional metric system for a key spacecraft operation, according to a review finding released Thursday.”

MikeA July 31, 2022 11:18 AM

@Clive

“NASA has half a million LOC without errors”

Without known errors.

Just glanced and it appears that VIPER is still (recently) active research into formal proof:

https://ntrs.nasa.gov/citations/19930011854

But I so agree that at some point the errors originate in the spec, or some layer of translation.

Clive Robinson July 31, 2022 12:47 PM

@ MikeA, MK, ALL,

Re : One bug in 5LOC

“Without known errors.”

Yup, there is always the “Unknown Unknowns” to deal with when they happen[1], along with all those fun below-the-CPU-ISA-level things like RowHammer going on at the bottom of the computing stack that “bubble up”, which no top-down software methodology the programmers can implement will ever prevent or mitigate…

The real problems are as I once said,

“You only know you are a killer once you kill someone.”

Likewise,

“You only know you’ve spent too little on defence when you are attacked.”

That is, there is an epoch when probability collapses and becomes certainty, or more simply,

“The unknown becomes known”

And as with Trinity 77 years ago almost to the day, we embark on a new era or reality. And if you are lucky the toast is butter side up when it lands.

[1] Note, I’m only quoting what Microsoft and NASA have themselves said.

Gert-Jan August 1, 2022 6:45 AM

Would it help if the law required the seller to inform the software producer of the bug and to wait at least 30 (?) days before being allowed to sell the information about the bug to others?

This could reduce cybercrime.

(Since every country would make exemptions for their intelligence agencies, it wouldn’t reduce state sponsored hacking)

Mercury Rising August 1, 2022 8:09 AM

Generally speaking, a lot of the “processes” that are in place depend on the majority of people in society acting ethically. Every security feature requires that some person somewhere be trusted, so there is really no way around this. The only way to have trust in a society where no one can be trusted is to monitor and track as much as possible.

J Dee August 1, 2022 8:34 AM

@Bruce
There’s an entire industry devoted to undermining all of our security. It needs to be stopped.

If all monetary transactions and all online communication were stored in a blockchain-based Web 3, then perhaps we could put a stop to these.

sooth_sayer August 7, 2022 9:48 PM

Need to be stopped — dah!
However MSFT is busy updating windows machines EVERY MONTH — they have no time to fix the crappy code they are pushing.
There is no rigour left to software these days — it’s crapware that MSFT has decided not to improve and just use the whole installed base of users as guinea pigs.

I think MSFT will lose the OS market to maybe Apple — it will be a shame but they don’t care about their software at all now.
