[Dailydave] Deep down the certificate pinning rabbit hole of "Tor Browser Exposed"

Ryan Duff ry at nduff.com
Thu Sep 15 10:06:38 EDT 2016


Hey everyone,



I spent a decent portion of my day looking into the claim by the Tor-Fork
developer that you could get cross-platform RCE on Tor Browser if you're
able to both MitM a connection and forge a single TLS certificate for
addons.mozilla.org. This is well within the capability of any decently
resourced nation-state. Definitely read @movrcx's write-up first to see his
claim. It's here:
https://hackernoon.com/tor-browser-exposed-anti-privacy-implantation-at-mass-scale-bd68e9eb1e95#.vh1a04yxy



Instead of making you read a whole lot to find out the issue, I'll give you
the bottom line up front (BLUF): Firefox uses its own static key pinning
method for its own Mozilla certs instead of using HPKP. The enforcement of
the static method appears to be much weaker than the HPKP method and is
flawed to the point that it is bypassable in this attack scenario. The bug
appears to be fixed as of the September 4th nightly build of Firefox but is
still unpatched in the current production versions of both Firefox and Tor
Browser. I'm posting about this publicly now instead of waiting because
@movrcx's post already shows how to take advantage of these flaws, even
though it doesn't describe (and he didn't know at the time) exactly how and
why the attack bypassed certificate pinning.



Regarding my motivation for looking at this: I'm neither a Tor-Fork nor a
Tor supporter. However, I've been a big fan of cross-platform RCE for a very
long time. We go way back. I also had concerns that anything legitimate
@movrcx may have found wouldn't be taken seriously because of the inherent
drama involved in going against Tor in such a public way. To his credit,
when I reached out to him after reading his blog post, he was excellent to
work with and didn't get defensive at all as I poked holes in his claims.
The result was us both learning a lot about how Firefox handles certificate
pinning and discovering that his attack works for reasons that weren't
obvious to either of us when we started researching today. I also need to
caveat this by saying I only have a few hours of work into all of this.
While I know there is definitely a bug and I have a strong hypothesis about
where Mozilla's problem is rooted, I'm not positive about it. More lab work
and validation (no pun intended) is definitely needed. That's why I'm
posting here instead of staking more big claims in a blog. Any input from
anyone who understands Firefox certificate validation internals in detail
would be very appreciated. I did my research, but I obviously don't know
what I don't know, and as you'll see, there is quite a bit of nuance here.
I'm going to go through the things @movrcx did and then explain my research
into why they worked even though they really shouldn't have.



I want to give a quick overview of @movrcx's claims, address some of the
criticism they have received on social media, and hopefully get you
believing that there is at least SOMETHING wrong happening here.



Here is a simple overview of @movrcx's attack (again, read his post first:
https://hackernoon.com/tor-browser-exposed-anti-privacy-implantation-at-mass-scale-bd68e9eb1e95#.vh1a04yxy ):
Firefox and Tor Browser update their extensions automatically by checking
addons.mozilla.org about every 24 hours (and will check on launch if it's
been more than 24 hours since the last check). Tor Browser specifically
ships with HTTPS Everywhere and NoScript pre-installed, and both use the
normal Firefox extension update mechanism. To attack this, you just follow
these steps (a rough sketch of the client-side update flow follows the
list):



1) Write a malicious extension to be your payload and then have it signed
by Mozilla using their fully automated process.

2) Generate a forged certificate for addons.mozilla.org that validates up
through any CA built in to the Firefox certificate store (not an easy task,
but definitely doable by a nation-state, and something that should be
covered by Tor's threat model).

3) Man-in-the-middle the target's traffic to addons.mozilla.org when the
browser tries to update NoScript or HTTPS Everywhere.

4) Serve your malicious extension instead of the requested update to the
target.

5) Win with no user interaction.
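
Here is the rough Python sketch of the client-side update flow those steps
sit in the middle of. The versioncheck hostname comes from the debug log
further down; the manifest format, the "update_link" field name, and the
exact path (elided in the log) are assumptions for illustration only, not
Mozilla's actual code or schema.

# Rough sketch of the extension auto-update flow being attacked.
# Field names and the manifest format are assumptions, not Mozilla's schema.
import requests

VERSIONCHECK_HOST = "https://versioncheck-bg.addons.mozilla.org/update/"  # full path elided

def fetch_update(addon_id, current_version):
    # Check #1 the attack must beat: the TLS certificate presented on this
    # connection. A forged-but-"valid" addons.mozilla.org cert gets past it.
    resp = requests.get(VERSIONCHECK_HOST,
                        params={"id": addon_id, "version": current_version})
    update_link = resp.json().get("update_link")  # assumed manifest field
    if not update_link:
        return None
    # Check #2: the downloaded XPI must carry a Mozilla signature. Step 1 of
    # the attack (automated signing of the malicious extension) beats this.
    return requests.get(update_link).content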



The naysaying about this claim mostly revolved around one point (which I
will disprove here). That point is: "Mozilla pins the cert for
addons.mozilla.org and that pinning should prevent this attack. The only
reason it worked for @movrcx is that Firefox will bypass pinning when a
certificate validates through a user-added CA cert." This is wrong because,
unlike Firefox, Tor Browser strictly enforces certificate pinning. No
matter what cert @movrcx added, the connection still should have failed.
Yet, he was able to get it to work. That is what got me interested in this
and drove me to dig in.



There is actually quite a bit to unpack here, and I'll start with how
Firefox handles certificate pinning enforcement levels. When a certificate
is pinned, a SHA-256 hash of its public key (the SPKI) is hardcoded into the
browser, and depending on which certificate in the chain is pinned (CA,
intermediate, or end-entity cert), the site's TLS certificate must either
chain through or match the pinned one. However, Firefox actually has 4
enforcement settings, controlled by the preference
security.cert_pinning.enforcement_level in about:config. These are the
enforcement levels (a short sketch of the resulting decision logic follows
the list):



0. Pinning disabled

1. Allow User MITM (pinning not enforced if the trust anchor is a
user-inserted CA; this is the default)

2. Strict. Pinning is always enforced.

3. Enforce test mode.
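
Here is that sketch, in Python, of what the levels boil down to as a
decision. This is just my reading of the documented behavior, not Mozilla's
actual implementation.

# My reading of security.cert_pinning.enforcement_level as a decision
# function. Not Mozilla's code; just the documented semantics of the levels.
def should_enforce_pin(level, root_is_user_added):
    if level == 0:                     # 0: pinning disabled entirely
        return False
    if level == 1:                     # 1: default; "allow user MITM"
        return not root_is_user_added  #    skip pinning for user-added trust anchors
    return True                        # 2: strict (Tor Browser's setting)
                                       # 3: strict, plus test-mode pinsets are enforced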



The root of the naysaying was the fact that Firefox's default is
enforcement level 1. This means that when a user adds their own CA ("like
@movrcx did", but not really, as you'll see) and the site's certificate
validates through that CA, Firefox will bypass certificate pinning. They do
this so companies can run TLS-intercepting security appliances whose root
certs have to be installed on client machines. By bypassing pinning for
those user-added certs, the users will still be able to access Gmail or
other sites that would normally fail because of pinning. So, game over,
right?


Not so fast. Under that explanation, the attack should still fail in
practice against Tor Browser. As I stated above, Tor Browser actually
changed the default enforcement level to 2. You can see the ticket here:
https://trac.torproject.org/projects/tor/ticket/16206 . Because of this, NO
CA cert other than the one associated with the pinned certificate should
work. Even if you add your own CA cert to the Firefox store, strict
enforcement should mean that a connection to a site with a pinned cert
attempting to validate through that custom CA will fail. So, double game
over, right?! Nope. Somehow, @movrcx was able to get his extension to load
by validating it through his custom cert despite strict pinning
enforcement. How is this possible? We must go deeper...



When @movrcx initially added his CA cert (which was the Burp CA cert, so he
could perform the required MitM) to the certificate store, it DID fail to
validate. This should be game over as well, right? Not quite. Let's look at
the debug output from the failure:



1473825046500  addons.update-checker  WARN  Request failed:
versioncheck-bg.addons.mozilla.org/update/Version…{73a6fe31-595d-460b-a920-fcc0f8843232}
&version=2.9.0.14&maxAppVersion=*&status=userEnabled
&appID={ec8030f7-c20a-464f-9b0e-13a3a9e97384}&appVersion=45.3.0
&appOS=Linux&appABI=x86_64-gcc3&locale=en-US&currentAppVersion=45.3.0
&updateType=112&compatMode=normal -
[Exception... "Certificate issuer is not built-in."  nsresult: "0x80004004
(NS_ERROR_ABORT)"  location: "JS frame :: resource://gre/modules/CertUtils.jsm
:: checkCert :: line 171"  data: no]



See that exception? It doesn't say "failed to validate pinned certificate"
or anything like that. Instead, it says "Certificate issuer is not
built-in." Could it be that if the cert validates through a CA that is
built in to the browser, it would work? Let's find out. @movrcx added the
Burp certificate to certdata.txt
( https://hg.mozilla.org/mozilla-central/raw-file/tip/security/nss/lib/ckfw/builtins/certdata.txt ),
compiled Tor Browser with it included, retried his MitM, and it worked!
Now, there are actually TWO security failures here. The first one we
discussed: strict enforcement of pinning should have made this fail
regardless of the new certificate. But there is a second failure: adding a
certificate this way should not have made it recognized as "built in".
Firefox uses libNSS for verification of built-in certificates. That's done
through a module called CKBI ("Cryptoki Built-Ins", i.e. the built-in root
certificates), which is implemented in libNSS as libnssckbi.so. There is a
special "built-in" entitlement that libNSS looks for, and it is only
supposed to come from the read-only libnssckbi.so. Mozilla updates this
certificate list and ships it with each new version of libNSS (which ships
with Firefox). A brief rundown of this policy can be found here:
https://wiki.mozilla.org/NSS:Root_certs
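
For what it's worth, my best guess at what checkCert is doing when it
throws that exception is something like the following. This is only a
sketch of the logic as I understand it, not the actual CertUtils.jsm code,
and the token name is an assumption about how NSS labels certs that come
from libnssckbi.so.

# Guess at the shape of the "issuer is not built-in" check; not CertUtils.jsm.
BUILTIN_TOKEN = "Builtin Object Token"  # assumed name of the libnssckbi.so token

def issuer_is_built_in(cert):
    # Walk up the presented chain to its self-signed root...
    root = cert
    while root.issuer is not None and root.issuer is not root:
        root = root.issuer
    # ...and require that the root is served by the read-only built-in roots
    # module rather than by the user's editable certificate database.
    return BUILTIN_TOKEN in root.token_names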



You actually can add your own CA manually and get this entitlement, but it
requires compiling a utility that is not normally included with Mozilla's
libNSS into your own custom copy of libNSS (instructions on how to do that
are here: http://wiki.cacert.org/NSSLib ). You then must compile your
custom libNSS into your custom Firefox build. But, as I stated above, that
is not what @movrcx did.



Let's recap where we are at this point (I told you this rabbit hole was
deep)... We have a situation where strict pinning enforcement is enabled
and seems to work, in that it doesn't let the extension-update connection
validate when a CA cert is added to the Firefox store through its
interface. However, it's not failing due to pinning enforcement, but
because the certificate it's trying to use doesn't have the built-in
entitlement. When we compile the same cert in directly, it is then
recognized as built in even though it doesn't have the specific entitlement
libNSS should be looking for. That's two checks it should have failed, and
both were bypassed. So... WTF, right? Why is this happening?! Deeper we
go...



My answer presented itself after a discussion with Erinn Atwater
(@errorinn), a PhD student in Computer Science at the University of
Waterloo in Ontario. I saw that she was asking pointed questions on Twitter
along the same lines as my research, so I shot her a DM. She had actually
put a bunch of students on this task last night because she had the same
concerns I did about this not being taken seriously. It turns out that
Mozilla doesn't use normal HPKP for certs related to their own operations
(like addons.mozilla.org). Instead, they use a form of static key pinning.
These validation weaknesses are limited to those statically pinned certs;
HPKP itself seems to be fully functional (i.e., not broken) in Firefox.
That's great for every other site but obviously bad for Mozilla and the
certs they have statically pinned.



I don't have enough data to state this with 100% certainty, but it appears
that with statically pinned certs, the only requirement for validation is
that the certificate validates through a "built in" CA. However, since it's
not using libNSS to do the validation, it looks like "built in" for
statically pinned cert validation just means that the CA was there at
compile time. That is why the certificate didn't validate when it was added
through the Firefox interface but did when it was compiled in. In turn,
that means that any “addons.mozilla.org” certificate that validates through
any CA that is shipped with Firefox should bypass pinning restrictions and
work. That also means that @movrcx's attack should work as advertised.
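
Put as pseudo-Python, the difference between what a static pin check should
be doing and what it appears to be doing here looks roughly like this. Both
functions are sketches with made-up names, and the second one is my
hypothesis, not something I have confirmed against the Firefox source.

# What static pinning should mean vs. my hypothesis of the buggy behavior.
def static_pin_check_expected(chain_spki_hashes, pinset_for_host):
    # At least one SPKI hash in the presented chain must appear in the
    # hardcoded pinset for addons.mozilla.org.
    return any(h in pinset_for_host for h in chain_spki_hashes)

def static_pin_check_observed(root_ca, compiled_in_roots):
    # Hypothesis: the pinset is effectively ignored; the only question asked
    # is whether the chain terminates in a CA that was present at compile time.
    return root_ca in compiled_in_roots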



You can actually test this without compiling any custom browsers! Try going
to pinning-test.badssl.com in Firefox. It SHOULD give a TLS error... but it
doesn't in most versions of Firefox. Mozilla ships a static pin for this
test host, which you can find here:
https://dxr.mozilla.org/mozilla-central/source/security/manager/ssl/StaticHPKPins.h#283 .
The entry has a static pin set to "AAAAAA=", which doesn't match any known
CA, so the connection should fail validation. Yet, since the test site has
a valid cert from Google, it validates through a CA that is built in to the
browser and the page loads. That is the core bug that makes this attack
work.
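
If you want to sanity-check a pin yourself, the value is just the base64 of
the SHA-256 of a certificate's DER-encoded public key (SubjectPublicKeyInfo).
Here's one way to compute it, using the Python cryptography package (any
equivalent tooling works):

# Compute an HPKP-style pin: base64(SHA-256(DER-encoded SubjectPublicKeyInfo)).
import base64, hashlib
from cryptography import x509
from cryptography.hazmat.primitives import serialization

def pin_from_pem(pem_bytes):
    cert = x509.load_pem_x509_certificate(pem_bytes)
    spki = cert.public_key().public_bytes(
        serialization.Encoding.DER,
        serialization.PublicFormat.SubjectPublicKeyInfo)
    return base64.b64encode(hashlib.sha256(spki).digest()).decode()

# e.g. pin_from_pem(open("some-ca.pem", "rb").read())
# Nothing in the chain served by pinning-test.badssl.com will hash to the
# all-A pin above, which is exactly why that connection should be rejected.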



The good news is that Erinn's team was able to determine that this bug is
fixed in the September 4th Firefox nightly build. Unfortunately, it's not
clear which bug the fix was associated with or what change in the code
facilitated it. The changelog for that build can be found here:
http://forums.mozillazine.org/viewtopic.php?f=23&t=3022271 . None of the
public bugs seem to line up, though there are 2 restricted bugs at the top
of the list, and it's possible this issue is one or both of those. If you
want to test the builds yourself, the issue occurs in
https://ftp.mozilla.org/pub/firefox/nightly/2016/09/2016-09-03-00-40-09-mozilla-aurora/
and does not occur in
https://ftp.mozilla.org/pub/firefox/nightly/2016/09/2016-09-04-00-40-02-mozilla-aurora/ .
I'm not sure when this will be pushed to the production versions of Tor
Browser and Firefox, but at least a fix seems to be in the pipeline.



So, that's it for the bug. As you have seen, the attack as described by
@movrcx should work as advertised, but it's not at all obvious at first why
it works. If you know how pinning is supposed to work, you would probably
claim with gusto that the attack can't work, as many on Twitter were very
quick to do. Yet, it does work.



While Tor Browser will pick up the fix from the Mozilla patch, I believe
they should actually change how they handle extensions overall. It seems
ridiculous to me that they use Mozilla's auto-update process for extensions
at all. If NoScript or HTTPS Everywhere introduced a new vulnerability in
an update, all Tor users would get it within a day of using the browser.
Also, with the paranoia their organization seems to have, I would think
Mozilla being compelled to push a malicious extension to specific Tor users
would be a real concern of theirs. To me, the logical solution would be to
build NoScript and HTTPS Everywhere themselves, sign those extensions with
their own key, hardcode the corresponding public key into Tor Browser, and
then do their own cryptographic validation of extensions locally. Extension
updates would ship with Tor Browser updates, exactly the way Tor Browser's
Firefox updates are delivered.
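
To sketch what that local check could look like (purely illustrative; the
key type, packaging, and key management are all choices for the Tor Browser
folks to make, and the key bytes below are just a placeholder):

# Illustrative only: Tor Browser ships a hardcoded public key and refuses any
# bundled extension whose signature doesn't verify against it.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

TBB_EXTENSION_PUBKEY = bytes.fromhex("00" * 32)  # placeholder; baked in at build time

def extension_is_trusted(xpi_bytes, signature):
    key = Ed25519PublicKey.from_public_bytes(TBB_EXTENSION_PUBKEY)
    try:
        key.verify(signature, hashlib.sha256(xpi_bytes).digest())
        return True
    except InvalidSignature:
        return False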



If you made it this far, you are a trooper. I hope it was informative. If
you have any questions or input, please share. It’s always possible I
missed something major and all of this was a big mistake. ;-) If so, please
tell me! Thanks again to @movrcx for being cool and working through all of
this with me. Also, thanks to @errorinn for sharing her and her team’s
research with me. It was key in closing the last information gap I had.



Thanks again for reading!



-Ryan Duff

https://twitter.com/flyryan