
Onion service related changes to HTTPS handling #15560

Merged
merged 2 commits into mastodon:main from issue/15501 on Feb 11, 2021

Conversation

@cohosh
Contributor

cohosh commented Jan 16, 2021

This fixes the way mastodon is loaded over onion services.

This change will allow the onion service version of a Mastodon instance to load everything over HTTP while keeping clearnet access to the site HTTPS-only.

Shoutout to @psjbeisler for finding this guide for onion services and rails by riseup.net: https://riseup.net/security/network-security/tor/onionservices-best-practices#onion-services-and-rails-4

The changes in this pull request are in line with the advice listed there. Importantly, these changes won't make access to clearnet (non-onion-service) versions of the site less secure, for the following reasons:

  • Removing the force_ssl directive merely removes an additional layer of redirects. This is already taken care of in the nginx config through a redirect from HTTP to HTTPS and the setting of HSTS headers.
  • The secure flag for cookies is set in a configuration file using the secure_headers gem rather than in the source code, so that it applies only to HTTPS access of the clearnet site and not to plain-HTTP onion service access (see the sketch below).

See #15501 for more discussion
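A minimal sketch of what that configuration-file approach can look like, assuming a config/initializers/secure_headers.rb initializer (illustrative only, not this PR's literal diff):

  # config/initializers/secure_headers.rb (illustrative sketch, not the actual diff)
  SecureHeaders::Configuration.default do |config|
    # The gem adds the "Secure" flag only when the request itself came in over
    # HTTPS, so clearnet (HTTPS) cookies stay Secure while plain-HTTP onion
    # service requests are left untouched.
    config.cookies = {
      secure: true,
      httponly: true,
      samesite: { lax: true }
    }
  end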

@cohosh
Contributor Author

cohosh commented Jan 16, 2021

Looks like tests are failing :( this needs a bit more work it seems.

Gemfile.lock Outdated
@@ -808,3 +812,9 @@ DEPENDENCIES
webpacker (~> 5.2)
webpush
xorcist (~> 1.1)

RUBY VERSION
ruby 2.7.2p137
Contributor

Not required.

@cohosh cohosh marked this pull request as draft January 17, 2021 19:26
@cohosh cohosh force-pushed the issue/15501 branch 3 times, most recently from 60d0c91 to f2b8f29 on January 17, 2021 23:49
@cohosh
Contributor Author

cohosh commented Jan 18, 2021

Thanks, I cleaned up Gemfile.lock. There was an additional test that was failing because it checked to make sure that http requests were being redirected to https. Because we are now relying on the nginx configuration to do that, I disabled the test.

I understand if the reviewers have some misgivings about this. Another option is to add a Rails environment called "onion" that is identical to the "production" environment, except with the forced redirects in the code disabled (sketched below). It would be used only by instance operators who want to run their instance as both an onion and a clearnet service.

Thoughts/preferences?
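A purely hypothetical sketch of what such an environment could look like (this file is not part of the PR; the name and settings are assumptions):

  # config/environments/onion.rb (hypothetical sketch, not part of this PR)
  # Reuse all production settings, then relax the HTTPS-only parts for the
  # onion deployment; nginx keeps redirecting clearnet HTTP to HTTPS.
  require_relative 'production'

  Rails.application.configure do
    config.force_ssl = false
  end

Operators would then run the onion-facing processes with RAILS_ENV=onion, while the clearnet deployment keeps using the production environment.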

@cohosh cohosh marked this pull request as ready for review January 18, 2021 00:08
Base automatically changed from master to main January 20, 2021 10:31
@cohosh
Contributor Author

cohosh commented Feb 4, 2021

Hey, just checking in to see if there's something I can do to move this forward.

@shleeable @Jenn650 do you have any opinions on the strategy taken here?

@jtracey jtracey mentioned this pull request Feb 10, 2021
@Gargron
Member

Gargron commented Feb 11, 2021

Usage of the secure_headers gem here is superior to #15666, because in that PR your server can only run either through Tor or not, whereas here both access methods would be supported. However, the force_ssl handling and the PR title are better in the other PR.

Ideally one of you would combine them into one.

@dr-bonez
Contributor

@cohosh I'm going to manually apply a subset of your changes to #15666

@cohosh
Contributor Author

cohosh commented Feb 11, 2021

@cohosh I'm going to manually apply a subset of your changes to #15666

Sounds good, thanks :)

@cohosh
Contributor Author

cohosh commented Feb 11, 2021

@cohosh I'm going to manually apply a subset of your changes to #15666

I had already started this and responded before I realized how close I was to doing the same thing. I pushed it here; feel free to use it if it looks good, or feel free to proceed with yours.

@cohosh cohosh changed the title Issue/15501 Onion service related changes to HTTPS handling Feb 11, 2021
@dr-bonez
Contributor

LOL great. I'm running a quick manual test of federation on an onion instance I'm running; I'll push up when that's done. @Gargron at that point both PRs will be identical, so merge whichever one you want. @cohosh I think this branch still needs to be rebased though.

@cohosh
Contributor Author

cohosh commented Feb 11, 2021

@Gargron do you want me to rebase this on the latest master?

@Gargron
Member

Gargron commented Feb 11, 2021

Yeah there's a conflict in Gemfile.lock that prevents merging.

@dr-bonez
Contributor

Tested .onion federation and it works.

@Gargron Gargron merged commit e79f8dd into mastodon:main Feb 11, 2021
@cohosh
Contributor Author

cohosh commented Feb 11, 2021

Thanks @dr-bonez and @Gargron !!

@ClearlyClaire
Contributor

ClearlyClaire commented Feb 11, 2021

I think the use of secure_headers has been a bit hasty, there are a few possible issues:

  • The HSTS header is currently defined in the sample nginx configuration file we provide, so this could result in two HSTS headers.
  • We already define X-Frame-Options in a context-dependent way. Here, secure_headers is going to make that less strict for most pages, and too strict for the things we can specifically embed.
  • X-Content-Type-Options and X-XSS-Protection are already set in config/environments/production.rb. Those use the same values as secure_headers, so I think the latter just overwrites them and it doesn't turn out to be an issue, though.
  • secure_headers decides whether to set the secure flag on cookies depending on whether the request was made over HTTPS or HTTP. While fine in theory, I'm worried that this is going to make setups with misconfigured reverse proxies (where the “https” scheme isn't properly forwarded) insecure.

EDIT: Updated comment wrt. cookies

@cohosh
Contributor Author

cohosh commented Feb 11, 2021

Thanks for looking into this.

I think the use of secure_headers has been a bit hasty, there are a few possible issues:

* The HSTS header is currently defined in the sample nginx configuration file we provide, so this could result in two HSTS headers.

* We already define `X-Frame-Options` in a context-dependent way. Here, `secure_headers` is going to make that less strict for most pages, and too strict for the things we can specifically embed.

* `X-Content-Type-Options` and `X-XSS-Protection` are already set in `config/environments/production.rb`. Those use the same values as `secure_headers`, so I think the latter just overwrites them and it doesn't turn out to be an issue, though.

Can we solve this by adding the following?

  config.hsts = SecureHeaders::OPT_OUT
  config.x_frame_options = SecureHeaders::OPT_OUT
  config.x_xss_protection = SecureHeaders::OPT_OUT
  config.x_content_type_options = SecureHeaders::OPT_OUT

I already did that for the content security policy, but I was a bit hasty and didn't check the other values.
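In full, those opt-outs would make the secure_headers initializer look roughly like this (a sketch of the idea, not a tested diff):

  # Opt out of everything nginx and production.rb already handle, keeping only
  # the cookie flag behaviour (illustrative sketch).
  SecureHeaders::Configuration.default do |config|
    config.csp                    = SecureHeaders::OPT_OUT # already opted out
    config.hsts                   = SecureHeaders::OPT_OUT # set by nginx
    config.x_frame_options        = SecureHeaders::OPT_OUT # set per controller
    config.x_xss_protection       = SecureHeaders::OPT_OUT # set in production.rb
    config.x_content_type_options = SecureHeaders::OPT_OUT # set in production.rb

    config.cookies = { secure: true, httponly: true, samesite: { lax: true } }
  end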

* `secure_headers` decides whether to set the secure flag on cookies depending on whether the request was made over HTTPS or HTTP. While fine in theory, I'm worried that this is going to make setups with misconfigured reverse proxies (where the “https” scheme isn't properly forwarded) insecure.

Yeah, TBH this solution isn't ideal. My understanding of the secure flag is that it should be a browser-side thing, where the browser decides whether or not to accept secure cookies depending on whether the connection is secure (which usually means HTTPS, but in the case of Tor Browser, HTTP connections to onion sites are treated as secure). But from what I can tell, Ruby on Rails is doing some extra server-side work and not sending secure cookies over plaintext HTTP connections at all, which causes this to not work properly. In an ideal world, we should be able to always set these cookies as secure and it would just work for HTTP onion sites.

I do think it's important to keep support for HTTPS onion services, but I'm willing to defer to your judgement on the best way to proceed here (maybe going with something closer to @dr-bonez's solution for now and filing an issue upstream with Ruby on Rails)?

@ClearlyClaire
Contributor

it will just work for HTTP onion sites

it will? I have to admit I don't know much about how Tor Browser handles that, but if it indeed works for onion sites even on HTTP, why not revert that part and use secure cookies unconditionally?

@cohosh
Contributor Author

cohosh commented Feb 11, 2021

it will just work for HTTP onion sites

it will? I have to admit I don't know much about how Tor Browser handles that, but if it indeed works for onion sites even on HTTP, why not revert that part and use secure cookies unconditionally?

It doesn't work right now because of the server-side issue with Ruby on Rails. The Mastodon server won't send secure cookies over an HTTP connection at all, causing things to break. What I meant was that in an ideal world, the secure flag on cookies would be handled entirely browser-side, and if it were, Tor Browser would already accept secure cookies on HTTP onion sites because of this patch: https://bugzilla.mozilla.org/show_bug.cgi?id=1633015

@cohosh
Contributor Author

cohosh commented Feb 11, 2021

So in order to get this to work, we might have to file an upstream issue with Ruby on Rails to get them to send secure cookies regardless of the connection, or make an exception for .onion sites.

@ClearlyClaire
Contributor

As far as I understand, that isn't an issue with Rails, but with the secure_headers gem specifically:
https://github.com/github/secure_headers/blob/5592e9ac79358ce5cfae2eb9b20228618aad3953/lib/secure_headers/middleware.rb#L33-L40

Since Tor Browser seems to consider .onion sites secure now, I advocate reverting the secure_headers-related changes and having cookies use the secure flag regardless of how the website is accessed.

@cohosh
Contributor Author

cohosh commented Feb 11, 2021

I tried that before on a production deployment, and it didn't work, even without the secure_headers gem. My guess is it's an issue with both the gem and with Rails.

@cohosh
Contributor Author

cohosh commented Feb 11, 2021

Even @dr-bonez had to disable secure headers for .onion sites, for the same reason. So, if you're concerned about the effects this has on non-.onion sites, we could go with their solution of deciding whether to set the secure flag based on whether the domain is a .onion or not. It makes HTTPS .onions a little less secure, but it is perhaps a good compromise until Rails is fixed.

@dr-bonez
Contributor

I don't think it does make HTTPS onions less secure. With SameSite set to lax, as long as the main page uses HTTPS, the cert is being validated, so your browser is verifying the authenticity of the cert. At that point, the domain has been verified, and anything that comes from that domain is trusted regardless of transport protocol, since the URL is self-authenticating.

@dr-bonez
Contributor

I will also note that the next version of nginx has a directive to modify cookie flags, so this may become a moot point. We can just remove the secure flag from Rails and set it in the nginx conf once that's stable.

@jtracey
Contributor

jtracey commented Feb 11, 2021

@dr-bonez Authenticity is only meaningful if it's visible to the user; if a user is browsing an .onion domain of a site with an openweb version, they should be able to see that they correspond. If the user is redirected to the .onion site from the openweb site, or if there is ever a single resource loaded from the https version of the onion site, then yes the browser could tell that the sites correspond even once https is gone -- but the user wouldn't, so that doesn't give you much. Basically, persistent https grants you the ability to mitigate someone mirroring your dual openweb/onion service instance as an onion service (potentially similarly named ;)), and the adversary surreptitiously linking it to victims as authentic. A user can click, at any time, the onion button while browsing https://www.facebookcorewwwi.onion/ and be able to tell if they accidentally clicked a link to https://www.facebookmaincomi.onion/ or whatever, without having to memorize the full legit facebook onion address.

(Also, my experience accords with the above comments: Rails refuses to send secure cookies over HTTP, and this can be fixed with bleeding-edge nginx in lieu of that portion of this PR, but who knows how long it'll be before that's in distros like Debian Stable.)

@dr-bonez
Contributor

dr-bonez commented Feb 11, 2021

I understand the point about why HTTPS is useful for onion sites. My point is that the secure flag on cookies isn't really necessary here. The point of that flag is to prevent an attacker from letting you access the real site so your cookie gets issued, then spoofing the HTTP version and redirecting you to it, so that your browser automatically sends your cookie to the attacker. In this case, the SameSite flag would prevent the cookie from being sent to any domain that's different, regardless of whether it looks similar. The user doesn't decide when cookies get sent, so they can't be tricked into sending cookies to a bad domain.

@jtracey
Contributor

jtracey commented Feb 11, 2021

Ahh, my mistake, yes that sounds right to me.

@cohosh
Contributor Author

cohosh commented Feb 11, 2021

Cool, going with @dr-bonez's approach of disabling the secure cookie flag only for .onion sites sounds like the best solution here.
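For illustration, one way to express that with the gem already used in this PR would be a per-request named override; the override name and the ApplicationController hook below are assumptions based on the secure_headers README, not the actual #15666 diff:

  # Hypothetical sketch: leave cookie flags untouched for requests to the onion
  # host, keep the default (Secure) behaviour everywhere else.
  SecureHeaders::Configuration.override(:onion_service) do |config|
    config.cookies = SecureHeaders::OPT_OUT
  end

  class ApplicationController < ActionController::Base
    before_action do
      if request.host.end_with?('.onion')
        SecureHeaders.use_secure_headers_override(request, :onion_service)
      end
    end
  end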

@ClearlyClaire
Contributor

I tried that before on a production deployment, and it didn't work, even without the secure_headers gem. My guess is it's an issue with both the gem and with Rails.

Oh, OK. I had misunderstood and thought Rails dropped the secure flag somehow, while what Rails actually does is refuse to send the cookie altogether, which kind of makes sense.

What secure_headers does is drop the secure bit instead of dropping the whole cookie, effectively making the app insecure over HTTP and behind misconfigured reverse proxies, although this indeed doesn't matter for Hidden Services, for which the connection is secure anyway regardless of whether it's using HTTP or HTTPS.

It's probably possible to monkey-patch secure_headers so that its decision is based on requesting a .onion domain rather than whether the protocol is HTTPS, which would make me more confident in that change. I will have a look.

@Gargron
Member

Gargron commented Feb 11, 2021

If the only thing we use from secure_headers is the cookie handling and we're going to monkey-patch it, why not just write our own Rack middleware?

@ClearlyClaire
Contributor

Good point. We'd have avoided the cookie-parsing stuff, I guess.
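For reference, the kind of middleware being suggested could be as small as the following rough sketch (class name and placement are made up for the example; not what was merged, and a real version would need more careful Set-Cookie handling):

  require 'rack'

  # Append the Secure flag to outgoing cookies unless the request came in on an
  # onion host. Rack 2.x joins multiple Set-Cookie values with newlines.
  class FlagCookiesSecure
    def initialize(app)
      @app = app
    end

    def call(env)
      status, headers, body = @app.call(env)

      unless Rack::Request.new(env).host.to_s.end_with?('.onion')
        if (cookies = headers['Set-Cookie'])
          headers['Set-Cookie'] = cookies.split("\n").map { |cookie|
            cookie =~ /;\s*secure/i ? cookie : "#{cookie}; Secure"
          }.join("\n")
        end
      end

      [status, headers, body]
    end
  end

Such a middleware would have to sit above ActionDispatch::Cookies in the stack so that it sees the final Set-Cookie header on the way out.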

chrisguida pushed a commit to Start9Labs/mastodon that referenced this pull request Feb 26, 2022
* Enable secure cookie flag for https only

* Disable force_ssl for .onion hosts only

Co-authored-by: Aiden McClelland <[email protected]>
@vmstan vmstan mentioned this pull request Nov 17, 2023