Two types of privacy - Seirdy


Privacy" can mean different things in different contexts. Tracking-reduction and tracking-evasion represent different goals with some conflict and overlap



Preface

Threat modelling provides important context to security and privacy advice. Measures necessary to protect against an advanced threat are different from those effective against unsophisticated threats. Moreover, threats don't always fall along a simple one-dimensional axis from "simple" to "advanced". I appreciate seeing communities acknowledge this complexity.

When qualifying privacy recommendations with context, I think we should go further than describing threat models: we should acknowledge different types of privacy. "Privacy" means different things to different people. Even a single person may use the word "privacy" differently depending on their situation. Understanding a user's unique situation(s), including their threat models, can inform us when we select the best approach. How do we choose between reducing a footprint's spread and its size?

Introduction

I highlight two main approaches to privacy: "tracking reduction" and "tracking evasion".

Tracking reduction (TR)
TR aims to reduce the amount of data collected about an exposed user. It reduces a footprint's spread, primarily by blocking trackers. Sometimes this can increase the size of a footprint.

Tracking evasion (TE)
TE reduces the amount of data exposed by a user. Rather than eliminating data collection itself, TE prevents useful data from being made available in the first place. In other words, it reduces a footprint's size. (note 1)

There is gray area between these two extremes, and not every privacy measure falls neatly into one of these two categories. I'll address that later in this article.

Note: this article focuses primarily on Web browsers; however, its concepts can apply to any software capable of tracking users. Let's get started.

Tracking reduction (TR)

TR is suitable for casual threat models. These techniques typically aim to remove trackers or to block malicious traffic. If someone just wants to browse the web with less tracking, they're probably not expecting a "nuclear option" that removes all their personalization. That user is more likely to be concerned with manipulation by personalized ads, or something vague such as being "followed around" as they browse websites while signed out.

These users are likely okay with being identified by a site; several of their accounts are probably linked to the same identity. However, when they log into "example.com", they'd rather not ping trackers from "facebook.com" or "amazon-adsystem.com". Of course, data-sharing could happen on the backend. Users may accept that there's little they can do about this beyond reading a privacy policy and filing suit upon violations.

In other words, TR falls closer to "wants" on the (somewhat contrived) "wants versus needs" spectrum. It addresses the gray area between personal preferences and real, present threats. Our goal is to reduce tracking where we can, without significantly degrading the user experience.

Sample TR techniques

- Badness enumeration: content-blocking and firewalls.
- Sending headers such as Global Privacy Control or Do Not Track (a brief sketch of the receiving end follows this list).
- Opting out of tracking when given the choice.
- Exercising legal rights (e.g., rights granted under the GDPR or CCPA) to remove information.
- Preferring services whose privacy policies indicate less data collection and/or sharing.
- Turning off telemetry.
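To make the header-sending item above concrete: when enabled, browsers send Global Privacy Control as the "Sec-GPC: 1" request header and Do Not Track as "DNT: 1". The sketch below is a hypothetical illustration, not something from the original article: a tiny Node/TypeScript server checking for either signal. The header names are real; the handling logic, port, and response text are my own illustrative choices.

```ts
// Hypothetical sketch: a server honoring GPC/DNT opt-out signals.
// The header names ("Sec-GPC", "DNT") are real; the handling is illustrative.
import { createServer } from "node:http";

createServer((req, res) => {
  // Node lowercases incoming header names.
  const gpc = req.headers["sec-gpc"] === "1"; // Global Privacy Control
  const dnt = req.headers["dnt"] === "1";     // Do Not Track (legacy)
  const optedOut = gpc || dnt;

  // A cooperative site would skip tracking and data sales for opted-out visitors.
  res.setHeader("Content-Type", "text/plain");
  res.end(optedOut ? "Opt-out signal received; tracking disabled.\n"
                   : "No opt-out signal received.\n");
}).listen(8080);
```

Nothing forces a server to behave this way; as discussed later in the Do Not Track section, GPC differs from DNT mainly in that some jurisdictions attach legal weight to it.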
Attack surface

I mentioned content-blocking, which typically happens through browser extensions and/or third-party filter lists. These can add attack surface; be mindful of the trade-off. Even trusted extensions like uBlock Origin are no exception; exercise restraint when adding third-party filter lists. I covered this topic a bit more in "A layered approach to content blocking". Safe yet limited approaches to content filtering should lay a foundation, topped off by risky yet powerful approaches that users selectively enable.

Tracking evasion (TE)

TE prevents an adversary from collecting meaningful information tied to one's identity. Unlike "opting out" and blocking well-known third-party trackers, tracking evasion distrusts all parties by default. This approach assumes that tracking is equally likely to happen through both first-party and third-party trackers. Therefore, a list of known third-party trackers is irrelevant to tracking evasion. Users following this approach in its purest form treat every party capable of tracking as a hostile tracker.

TE techniques typically revolve around minimizing the size of one's fingerprint, either through fingerprint normalization or randomization. (note 2)

Sample TE techniques

- Using the Tor Browser.
- Avoiding identifiable browser extensions.
- Randomizing typing behavior (a good example is kloak, a keystroke anonymizer used by Whonix).
- Using a coarse scroll interval.

Half-measures are ineffective

If an adversary employs multiple fingerprinting vectors, then normalizing or randomizing a small subset of those vectors might make a user stand out even more.

xkcd comic: license plate

Comic transcript:
Cueball is walking in from the right, holding a license plate up with both hands for an off-panel Megan to see. It is possible to see the plate, but here it looks like all I's (or 1's).
Cueball: Check out my personalized license plate!
Megan (off-panel): "1I1-III1"?
Cueball: It's perfect!
Cueball: No one will be able to correctly record my plate number!
Cueball: I can commit any crime I want!
Megan: Sounds foolproof.
Soon, at a crime scene:
Witness: The thief's license plate was all "1"s or something.
Police officer 1: Oh. That guy.
Police officer 2: His address is on a post-it in the squad car.
(Transcript based on the explain xkcd wiki entry for xkcd #1105.)

TE carries the risk of "springing a leak". While TR presents an incremental solution, TE is much harder: it's only a slight exaggeration to describe TE as a binary "all-or-nothing" approach.

Other relevant discussions worth reading

- Arkenfox user.js issue 1274: "Effort towards a common browser fingerprint"
- PrivacyGuides discussion 341: "Great browser re-write-reboot"

Conflicts between TR and TE

Good approaches to TR often weaken TE. Conflating the two can have harmful consequences.

Badness enumeration: content blocking

While content blocking through badness enumeration is useful for TR, it's generally antithetical to TE. A website can use a first-party script and/or inspect server logs to determine information such as the following (a brief sketch follows this list):

- Which network requests succeed or fail.
- Whether certain elements (e.g. certain images) load successfully.
- Whether injected content (e.g. from an ad-blocker) is present. (note 3)
- Whether content-blocking changes which elements are in or near the viewport. (note 4)
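As a concrete illustration of the first two signals, here is a hypothetical sketch of a first-party script that plants a "bait" element with ad-like class names and requests a path that filter lists tend to block. The class names and the path are placeholders I made up; they come neither from the article nor from any specific filter list.

```ts
// Hypothetical first-party detection sketch (browser TypeScript).
// The bait class names and the request path are illustrative placeholders.
async function detectContentBlocker(): Promise<{ baitHidden: boolean; requestBlocked: boolean }> {
  // 1. Plant a "bait" element; cosmetic filters often hide elements with
  //    ad-like class names.
  const bait = document.createElement("div");
  bait.className = "ad adsbox ad-banner";
  bait.style.cssText = "position:absolute;top:-999px;height:10px;width:10px;";
  document.body.appendChild(bait);

  // Give cosmetic filtering a moment to apply.
  await new Promise((resolve) => setTimeout(resolve, 100));
  const baitHidden =
    bait.offsetHeight === 0 || getComputedStyle(bait).display === "none";
  bait.remove();

  // 2. Request a same-origin path that network filters commonly block.
  let requestBlocked = false;
  try {
    await fetch("/ads/banner.js", { method: "HEAD", cache: "no-store" });
  } catch {
    requestBlocked = true; // a blocked request rejects with a network error
  }

  return { baitHidden, requestBlocked };
}

// The pattern of which baits are hidden and which requests fail hints at
// which filters are enabled -- and that pattern is itself fingerprintable.
detectContentBlocker().then(console.log);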
These pieces of information (and others I haven't included) can tell a website whether a browser has a content-blocker, and which filters a user has enabled. The latter can uniquely identify a user; this compromises TE. Nonetheless, these techniques aren't in widespread use, and I think that sharing first-party fingerprinting data between different organizations is the exception rather than the norm. As long as this remains the case, content-blocking remains a viable technique for TR.

Badness enumeration shouldn't be applied in a TE context. The Tor Browser's design philosophy explicitly includes a "no filters" directive and recommends against their use, to reduce fingerprinting. As of mid-2022, the version of the Tor Browser included in Tails includes a content-filtering extension anyway. The entire point of using the Tor Browser Bundle is to maximize TE; Tails compromises this goal.

Brave has a similar problem. Its content-blocker allows users to create custom filters or activate additional filter lists. Brave also tries to enable TE through fingerprint randomization. Combining the two likely makes users even more unique. Brave could mitigate this flaw by having its "advanced" level of fingerprinting protection also normalize the content-blocking filters applied. (note 5)

Do Not Track (DNT)

DNT was an HTTP request header indicating that a user does not wish to be tracked. Unfortunately, unlike Global Privacy Control, there was no legal obligation to obey DNT. The DNT header ended up being used as a fingerprinting vector to track users instead of a way for them to avoid tracking. WebKit removed DNT support because it was antithetical to TE, and its utility for TR was too low to justify these harms.

Exceptions and overlap

There is gray area between TE and TR. Some techniques don't neatly fit into one of the two categories. Here's an incomplete list of those techniques, for illustrative purposes:

Disabling browser features
The fingerprintability of disabling JavaScript, the Reporting API, and hyperlink auditing is typically dwarfed by the fingerprinting made possible by enabling them. I struggle to categorize this technique. On one hand, feature-toggles represent an uncommon browser configuration that may prevent some trackers from running (TR); on the other, it treats all external parties equally and can reduce fingerprinting vectors (TE). I'm inclined to say that feature-disabling is closer to TE than TR only if enough people share the same configuration. (note 6)

Goodness enumeration
Assuming that all actors are actively hostile might be overkill. A user may follow a TE approach and/or disable browser features, while also maintaining a list of trusted exceptions. Trusted sites may use disabled features or have access to a larger fingerprint. This represents a "middle ground" of sorts between the convenience of TR and the effectiveness of TE.

Amnesia
The most common amnesiac technique is clearing cookies. A more thorough technique is using a disposable VM that's erased and re-created between sessions. Rather than reduce or evade tracking, these measures reduce the persistence of trackers (and/or malware) that slip through other defenses (a small sketch follows).
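The sketch below shows a lighter-weight cousin of the disposable-VM idea: launching a browser with a throwaway profile and deleting it afterwards. Chromium's --user-data-dir flag is real; the Node/TypeScript wrapper, the binary name, and the directory naming are my own illustrative assumptions, not something the article prescribes.

```ts
// Hypothetical amnesia sketch: a disposable browsing session backed by a
// throwaway profile directory that is deleted when the browser exits.
import { spawnSync } from "node:child_process";
import { mkdtempSync, rmSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

const profile = mkdtempSync(join(tmpdir(), "disposable-profile-"));
try {
  // --user-data-dir points Chromium at the throwaway profile, so cookies,
  // cache, and history land there instead of in the persistent profile.
  spawnSync("chromium", [`--user-data-dir=${profile}`], { stdio: "inherit" });
} finally {
  // Amnesia: nothing from the session persists past this point.
  rmSync(profile, { recursive: true, force: true });
}
```

Like cookie-clearing, this limits persistence rather than collection; a full disposable VM goes further by also discarding OS-level state.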
The list goes on. "TR versus TE" is an important perspective to have, but it isn't the only lens through which we should view privacy-enhancing techniques. Let's be mindful of the TR/TE framework's limitations.

How to make privacy recommendations

Privacy-enhancement recommendations need to account for whether a preference for TR over TE exists. Whenever applicable, please do the following:

- Clarify whether a privacy-enhancing technique has a focus on TR or TE.
- When discussing TR techniques, mention any compromises made from a TE perspective.
- Most importantly: recognize that different people need to make different trade-offs. Someone with special needs might require some fingerprintable personalizations. If non-negotiable personalizations make tracking-evasion too difficult, you might need to steer the person towards tracking-reduction and explain the trade-offs involved. (note 7)

A single solution isn't enough for everyone. In fact, it's not usually enough for a...