Gate Timestamps behind existing permission prompts #64
The often-cited paper, Fantastic Timers and Where to Find Them, suggests that the platform contains a wide range of implicit timing sources in addition to its explicit clocks. While some of those are of lower resolution than the explicit timers, they can be amplified into usable clocks. Since blocking all those timers behind a user permission doesn't seem tenable, browsers have chosen a different path to tackle this issue.
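As a rough illustration of why coarsening alone falls short (a Node.js sketch; the 10ms clamp and the calibration scheme are illustrative assumptions, not any browser's actual behavior), an attacker can count loop iterations between ticks of a coarse clock to recover a finer-grained time base:

```javascript
// Sketch: clock-edge interpolation. Even with a coarse clock, counting
// loop iterations between tick transitions yields a finer duration estimate.

const COARSE_MS = 10; // pretend the platform clock is clamped to 10ms
const coarseNow = () => Math.floor(Date.now() / COARSE_MS) * COARSE_MS;

// Busy-wait until the coarse clock ticks over, counting iterations.
function ticksUntilEdge() {
  const start = coarseNow();
  let count = 0;
  while (coarseNow() === start) count++;
  return count;
}

// Calibrate: average iterations per full coarse interval.
function calibrate(rounds = 5) {
  let total = 0;
  ticksUntilEdge(); // align to an edge first
  for (let i = 0; i < rounds; i++) total += ticksUntilEdge();
  return total / rounds;
}

const perInterval = calibrate();
console.log(`~${(perInterval / COARSE_MS).toFixed(0)} loop iterations per ms`);
```

The more work that fits between two ticks, the finer the effective resolution the attacker can reconstruct.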
If you haven't yet, I suggest you read @creis' excellent document on Long-Term Web Browser Mitigations for Spectre. With all the above said, browsers are free to clamp their clocks' resolution as they see fit. Given the fast-moving pace of this area, I doubt any of those implementation decisions should be baked into the spec.
@yoavweiss Thank you for your comments above. I was familiar with many of these efforts, but not all, so I'm grateful for the links. Thanks much!
This is an interesting idea too (although it could be its own fingerprinting target). First party vs. third party, etc., could also be useful signals.
I take your point here, but I disagree. The current practice of proposing and adopting standards that include clear risks of privacy loss, and then relying on browsers to nerf / muck with those standards, has been a catastrophe for browser privacy. It's bad for web compat reasons, for baking in developer expectations that this privacy-risky functionality will be around forever, and as precedent for future standards. We'll be in a much better place privacy-wise if standards address privacy more directly in their normative parts, even if some vendors will still go further than the standard. Saying "let's standardize the functionality, but things are moving too fast to deal with mitigations" seems likely to only increase the privacy headache down the road. Would you / other authors be receptive to talking with PING about the kinds of signals / situations where high resolution timestamps could be made available?
I've opened our PARTIAL, INCOMPLETE enumeration of precise-enough clocks: https://bugs.chromium.org/p/chromium/issues/detail?id=798795

The sheer number and variety of these clocks means to me that (a) we can never coarsen them all without breaking legitimate functionality and web compatibility; and (b) we can't create a permission prompt that would defend against attack while still being meaningful.

See also the Attenuating Clocks section of the Post-Spectre Threat Model Re-Think: https://chromium.googlesource.com/chromium/src/+/master/docs/security/side-channel-threat-model.md#attenuating-clocks

I think our goal going forward should be that an origin cannot time another origin without explicit opt-in from the timee (such as with the Timing-Allow-Origin header or a similar mechanism).
FWIW, when Firefox deployed our 2ms clamp (later reduced to 1ms + jitter), it applied to every explicit clock exposed by the web platform, not just performance.now(). (If we missed one, it's a bug.) Additionally, to try to hurt implicit timers we integrated Fuzzyfox, but have not had occasion to try enabling it (or to do much performance/compatibility testing on it beyond some initial manual QA efforts). In general, I agree with you though: with arbitrary amplification, you can't coarsen timers enough to defeat a medium-bandwidth attack while also maintaining web compatibility.
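A minimal sketch of clamping plus jitter (assumptions: the 1ms resolution and the hash-based per-interval jitter are illustrative, not Firefox's actual implementation, which derives jitter from a per-session secret): each clamped interval gets a randomized midpoint, so reported values round up or down unpredictably and interval edges can't be located precisely.

```javascript
// Sketch: clamp a raw timestamp to a fixed resolution, then displace each
// interval's rounding threshold by deterministic per-interval jitter.

const RESOLUTION_MS = 1;

// Deterministic pseudo-random value in [0, 1) per clamped interval.
// A real implementation would use a keyed cryptographic hash.
function intervalJitter(intervalIndex, seed = 0x9e3779b9) {
  let h = (intervalIndex ^ seed) >>> 0;
  h = Math.imul(h ^ (h >>> 16), 0x45d9f3b) >>> 0;
  h = Math.imul(h ^ (h >>> 16), 0x45d9f3b) >>> 0;
  return ((h ^ (h >>> 16)) >>> 0) / 2 ** 32;
}

function clampedNow(rawMs) {
  const interval = Math.floor(rawMs / RESOLUTION_MS);
  // Jittered midpoint: timestamps past it report the next interval's
  // start; earlier ones report this interval's start.
  const midpoint = (interval + intervalJitter(interval)) * RESOLUTION_MS;
  return (rawMs >= midpoint ? interval + 1 : interval) * RESOLUTION_MS;
}

console.log(clampedNow(1234.567)); // some multiple of RESOLUTION_MS
```

Because the jitter is fixed per interval, repeated reads remain monotonic even though individual edges are displaced.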
@tomrittervg
What I take away from your message is that standardizing privacy-risky endpoints (in this case, extremely high resolution timers), and then relying on non-standardized, vendor-dependent mitigations, is a losing strategy, or at least one with quickly diminishing marginal returns. If I'm reading you right, that seems to suggest we should limit the number of, and situations in which, such timers are available in the web API, no? Which is my point / goal / suggestion with this issue. :)
@psnyde2 I strongly agree with both points made by Chris. Yes, there are differences in the resolution that different platforms expose, and those will continue to evolve based on the capabilities of each architecture, the underlying hardware, etc. Per Chris's point, there isn't a meaningful defense or "access to time" prompt that we can present to the user. Based on this, speaking on behalf of Chrome, our position on this proposal is "no". I won't speak on @tomrittervg's behalf, but given that FF and Safari clock resolutions are the same as …
@igrigorik I think there is a misunderstanding. The proposal is not to create additional prompts, but to gate these additional timing sources behind existing prompts, since those existing prompts seem to map 1-to-1 to the stated use cases for the timing information. Again, there is no suggestion of additional "date" or "access to time" prompts. Point taken that there are other troublesome timing sources present, but the goal here is to avoid adding further technical / privacy debt that will need to be paid down later. When you realize you're in a privacy hole, the first thing to do is to stop digging.
@snyderp what existing prompts are you referring to? |
Permissions for full screen, VR / usb device use, midi use, etc
I don't think the proposed change makes much sense because:
Because users don't expect that using fullscreen on a given website may pose a new security or privacy threat, it would not be okay to grant any security- or privacy-sensitive API even if the corresponding permission has been granted.
1a. We're discussing standardization here: what should exist in the platform and where things should move, not just codifying what exists in one or two implementations. 1b. WebKit and Chromium are not the only games in town… 1c. Standards also act as a signal for where the web should move in the future, what developers should target, etc.
Restricting access to …
I think we're at a disagreement, so I don't want to turn this into an argument. But to state the case one last time, and to try to avoid any misunderstanding, the motivations behind the proposal (and objections to the standard as is) are:
Peter, thanks for the feedback. I would like to suggest that we step back from the "don't be cavalier" and "privacy catastrophe" framing, as neither is helpful in the technical discussion where the current disagreement lies. Per the discussion above:

(a) we don't believe it's possible to eliminate, coarsen, or jitter all explicit and implicit clocks in the platform (note);
(b) we enforce Timing-Allow-Origin opt-ins on cross-origin resources, which control access to high resolution timestamps for such resources;
(c) we don't believe there is a meaningful prompt that can be presented to the user to ask for access to a clock, and we disagree with the proposed strategy of gating access to a clock behind other permission prompts (it's unintuitive and unexpected for both the user and the developer, and it blocks valid use cases that may not otherwise require a prompt);
(d) browsers have already adopted various strategies to mitigate #56, ranging from coarsening clocks to the equivalent of Date.now(), to platform-level re-architecture, e.g. site isolation.

As a result, we disagree with both the motivation for and the proposed method of this proposal.
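The Timing-Allow-Origin opt-in mentioned above can be sketched roughly as follows (a simplified model of the Resource Timing "TAO check"; real browsers operate on serialized origins and additionally handle redirects and opaque origins):

```javascript
// Sketch of the Timing-Allow-Origin opt-in check: same-origin resources
// always pass; otherwise the response header must list "*" or the
// requesting document's origin.

function passesTaoCheck(requestOrigin, resourceOrigin, taoHeader) {
  if (requestOrigin === resourceOrigin) return true; // same-origin
  if (taoHeader == null) return false; // no opt-in header present
  const values = taoHeader.split(",").map((v) => v.trim());
  return values.includes("*") || values.includes(requestOrigin);
}

// When the check fails, a conforming browser zeroes the detailed timing
// fields rather than omitting the resource timing entry entirely.
console.log(passesTaoCheck("https://a.example", "https://b.example", "*")); // true
```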
I agree with @igrigorik, and will add: we'll also be able to get rid of a lot of cross-origin timing side-channels by getting the Open Web Platform to a point where clients don't send credentials in requests for a page's sub-resources unless the sub-resource origin has opted in to being called from the main page's origin (if different). One version of that idea is whatwg/html#4175, although one can imagine others. Like Site Isolation, it's a stronger defense than hacking on clocks, and less bad for the OWP's overall utility. That's the direction we should be going.
I appreciate the above input, and I appreciate that my suggestion may not be the best way to address this. But I strongly think it's necessary to include normative mitigations or to reduce the availability of the timers. It's just not a defense of any standard to say "the standard calls for X, but it's not a problem because browsers can just not do X" (i.e. Safari and Firefox, Date.now(), etc.).

Are there other patterns / strategies that can be standardized (i.e. in normative text) that would keep the functionality out of the common path and restrict it to only the cases where it's likely needed? Even things like requiring a user gesture in the frame, making it unavailable to third-party frames, requiring aliasing with Date.now() in third-party frames, etc. Anything would be better for privacy (and for future privacy + web compat work) than the current "available to everyone, all the time".

@noncombatant Re: whatwg/html#4175, it's a neat idea! But it wouldn't address a large number of the history-sniffing and related privacy concerns, plus trackers are likely to opt in anyway, no?

@igrigorik Just to summarize the above in reply to your (a)-(d) notes: (a) you might be right here, but reducing the number of clocks, especially high resolution ones, makes it easier to deal with the existing ones.
TL;DR: I'm happy to close this issue, since it's tied to my specific proposal, and re-open another, broader issue describing the problem instead of a specific solution, if that'd be better organizationally. But the current pattern (not limited to this spec) of:
is the anti-pattern that makes privacy improvements such a web-compat problem today (and came up as a pattern to avoid at the most recent AC meeting).
@snyderp I suspect the source of misunderstanding is in the threat model you think high resolution timers pose. With Spectre and Meltdown, timers can be used to read arbitrary process memory. That means that limiting access to them in third party contexts or behind user interaction won't help much, as first party sites pose a similar threat. This is also why browsers chose to protect against them by augmenting process boundaries (Site Isolation, CORB, CORP, COOP/COEP, and probably other future things that start with CO).
How? How would barring access to explicit clocks help in mitigating the implicit ones?
Can you elaborate on that? How would limiting one explicit timer help establish user trust? Again, I think this comes down to misunderstandings around the threat model.
To reiterate previous points: this standard does not introduce new risks, as there are plenty of timers in the platform. You keep repeating that we should block access to this specific one, but we'd need a bit more reasoning as to how that would help users and their privacy and security.
I think it would be helpful if you could clearly point out the threat model that you think timers pose in general, and this one in particular. From my perspective, you can keep it in this issue.
Can you point out specific examples where A exposed information, B followed it, we were then able to get rid of the information exposure in A, but not able to get rid of it in B?
I'd appreciate it if you could point me to the minutes.
I'm told that the current thinking of the committee is to add language that permits UAs to gate access but does not require them to. Implementers are going to do whatever they need to do to protect their users.
For attacks, I imagine we're thinking of all the same kinds of attacks (though for this narrow concern, I'm not considering things like Spectre, etc.). So: attacks like this …, environment-learning attacks like …, the related work / threat models discussed in the Fantastic Timers paper and below, etc.
This is the webcompat issue being discussed. If we break benign code paths that depend on implicit timers, that's breakage that's likely a-ok; that's authors relying on unmade promises. If the platform makes promises saying "it's okay to rely on high resolution time", and that later has to be coarsened or removed and breaks high-res-dependent benign code, that's a real problem. So, again, not promising / building APIs that provide high res timers preserves web-compat-friendly privacy paths going forward. It preserves possibility space, where further committing to high res timers removes that possibility space.

Similarly, there are practical differences between explicit and implicit timing sources. If a vendor, for example, sees a page doing strange / suspicious things to build timing signals, that's a signal to impose new interventions (or even force-close a worker / frame). If, though, the platform says "it is okay to rely on high res timers" through explicit sources, that makes it much more difficult to protect users. E.g., browsers implementing the normative sections of this standard makes it much more difficult to get to a Fuzzyfox-like privacy-preserving web, w/o breaking existing code that has been built to rely on the promises in the proposal.
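As a rough illustration of the Fuzzyfox idea mentioned earlier (an illustrative toy only; the real design also defers event delivery and derives epoch lengths from a secret, which this hypothetical `FuzzyClock` class does not): time is frozen within randomized-length epochs, so all reads inside an epoch return the same value.

```javascript
// Sketch: a "fuzzy" clock that only advances at epoch boundaries of
// randomized length. Reads within an epoch are indistinguishable.
class FuzzyClock {
  constructor(maxEpochMs = 100) {
    this.maxEpochMs = maxEpochMs;
    this.epochStart = Date.now();
    this.epochLen = 1 + Math.random() * (maxEpochMs - 1);
  }
  now() {
    const real = Date.now();
    // Advance through any completed epochs, re-randomizing each length.
    while (real - this.epochStart >= this.epochLen) {
      this.epochStart += this.epochLen;
      this.epochLen = 1 + Math.random() * (this.maxEpochMs - 1);
    }
    return this.epochStart; // frozen until the current epoch ends
  }
}

const clock = new FuzzyClock();
console.log(clock.now());
```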
I'm sorry but I'm having difficulty parsing this. Not trying to be difficult, but can you rephrase?
https://www.w3.org/2019/04/09-ac-minutes.html#item04 (if you do a search for my name, or Chaals, or "normative", you can get to the stubs of the conversation during the panel discussion)

The meta-point in this issue is that this functionality seems likely to be user-serving on only a very, very small number of websites (e.g. in previous research I was involved in, "Most Websites Don't Need to Vibrate", our testers couldn't find a single site that needed the Vibration API for user-serving purposes). As high res timers (implicit and explicit) have well-known privacy issues, it would be good to explore ways of making these timers less available, and to figure out ways to limit their use to when they'll be used for things users care about (instead of turning users into debugging / feedback tools for websites, etc.).
Thank you for this! I think many of these are very promising ideas, but, tbh, adding them in non-normative sections of the document doesn't help the situation; it just leads to web compatibility problems. If there is a floor (not a ceiling) of mitigation that the group thinks would be useful, then the best course of action is to add it as normative text. If there aren't agreed-upon successful / practical mitigations, then that's a reason to keep working on the standard until those mitigations exist (or until the functionality requiring mitigation no longer exists). Put differently (sincerely, w/o snark): what is gained by standardizing high resolution timers if a suggested mitigation is "resolution reduction / aliasing with low resolution timers"?
OK, thanks. I ask because you suggested that permissions or user gestures may be helpful to prevent attacks here, as well as limiting the API in third party contexts.
@tomrittervg - That's a great list of mitigations. I'm happy to add those as part of the Security & Privacy section.
As we've seen in the past, today's mitigations could be insufficient tomorrow. Similarly, some of today's mitigations may become unnecessary as browsers evolve (e.g. out-of-process iframes) or as content opts in to restricting its own access to third party resources. In the past, as a group, we were reluctant to set temporary mitigations in stone as part of the normative spec language. This has not changed, AFAIK.
I may misunderstand, but all the attacks above seem equally executable in a 3p frame as in the 1p. Which ones seemed like they'd require being the 1p / top level frame?
Would it suffice then to modify the standard to be low-resolution now, and then to revise the standard later on, once a cross-browser / implementation-independent / normative-text solution to mitigate the privacy concern is in place?
This is exactly what I was saying. Earlier you suggested we may want to limit access to this API in 3P contexts (e.g. "requiring aliasing with Date.now() in 3p frames"). I was trying to understand why you think that made sense. I guess we agree that it doesn't.
So far we've seen zero interest in doing that on this thread from all three independent browser engine vendors. The spec does not mandate a specific resolution; implementations are required to make the resolution as granular as possible while not so granular that it enables attacks on their users. Where that line is drawn will vary based on browser architecture, CPU architecture, the current attack landscape, and probably other factors. I understand that this is not the outcome you were hoping for. Nevertheless, that's the decision the group has reached. I'm therefore closing this issue, and the related #20.
#204
Following up from #20
The privacy risks associated with these high res timestamps are documented in the draft, as well as in some of the issues (e.g. #56) and a long list of research papers (happy to provide citations, but they're already linked elsewhere in the repo).
The usefulness of the timestamps is also well documented in #56, and it seems there are cases where they would be very useful to web users.
However, these seem to be the rare cases, not what users will encounter on the vast majority of pages. And the most compelling examples of the timestamps' usefulness are gated behind permission prompts (the Fullscreen API for games, USB inputs and WebVR for VR uses, etc.).
I suggest only making these timestamps available when the user has approved one of the following permissions (using the permissions in Chrome currently):

- fullscreen
- VR / USB device use
- MIDI use
Could also add to the above list if there are other permissions that'd be relevant. But it seems like this would be a way of blocking the privacy-violating uses of these timestamps (at least in the common case), while also making them available in the cases where they're likely to be useful.
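The gating being proposed could be sketched as follows (a hypothetical API shape: the permission names, the `timestampFor` helper, and the 100ms fallback resolution are all illustrative assumptions, not from any spec): full resolution is exposed only when a relevant permission has already been granted, and a coarsened value otherwise.

```javascript
// Sketch: expose high resolution timestamps only to contexts that have
// been granted one of a small set of existing permissions.

const HIGH_RES_PERMISSIONS = new Set(["fullscreen", "midi", "usb", "vr"]);
const COARSE_RESOLUTION_MS = 100; // illustrative fallback resolution

function timestampFor(rawMs, grantedPermissions) {
  const allowed = [...grantedPermissions].some((p) =>
    HIGH_RES_PERMISSIONS.has(p)
  );
  if (allowed) return rawMs; // full platform resolution
  // Otherwise clamp to the coarse fallback resolution.
  return Math.floor(rawMs / COARSE_RESOLUTION_MS) * COARSE_RESOLUTION_MS;
}

console.log(timestampFor(1234.567, new Set(["midi"]))); // 1234.567
console.log(timestampFor(1234.567, new Set())); // 1200
```

In a browser, the granted-permission set would come from the permission prompts the page has already cleared, rather than being passed in explicitly.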