Fig. 1: Barnacles on a log.
Organic accumulation of life, as found on a beach.
The web was not designed with reader privacy in mind. Each interaction with a website can be observed by the site’s publisher. Worse still, most websites nowadays include dozens of resources from other companies, such as embedded videos, images, fonts, and ‘like’ or ‘share’ buttons. Each of those companies is *also* able to track readers — and they are doing so en masse.
The website publisher exposes its visitors to such companies’ malpractice. The GDPR speaks of joint data controllers with shared responsibility, and recent court cases have held website publishers responsible for enabling data collection by a third party (see e.g. CJEU case C‑40/17, July 2019).
The owner of a website carries a responsibility not to expose its visitors to trackers.
Fig. 2: Barnacles that log.
Organic accumulation of trackers, as found on a popular news website.
All of this makes it easy for a website to accrue a large number of trackers that not only cause the website to load more slowly and work more sluggishly, but also cause serious privacy headaches for readers and compliance problems for website admins. And much like real barnacles, barnacle trackers are often found in places that are hard to reach and inspect. Under the waterline of the website, so to speak.
Readers (or regulators) can check whether privacy policies are actually implemented and enforced simply by inspecting the site’s CSP headers. A malicious site admin could of course work around the CSP, but that requires considerably more work than just embedding a random tracker.
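Such an inspection is easy to script. A minimal sketch in Python: fetch a page, read its `Content-Security-Policy` header, and split it into directives so the allowed third-party origins are visible at a glance (the URL is a placeholder, not a real endpoint):

```python
from urllib.request import urlopen


def parse_csp(csp: str) -> dict[str, list[str]]:
    """Split a CSP header value into a mapping of
    directive name -> list of allowed sources."""
    directives: dict[str, list[str]] = {}
    for part in csp.split(";"):
        tokens = part.split()
        if tokens:
            directives[tokens[0]] = tokens[1:]
    return directives


def fetch_csp(url: str) -> dict[str, list[str]]:
    """Fetch a page and parse its CSP header (empty dict if absent)."""
    with urlopen(url) as resp:
        return parse_csp(resp.headers.get("Content-Security-Policy", ""))


# Example: every host named in script-src or frame-src is a party
# that can observe the reader.
policy = parse_csp("default-src 'self'; script-src 'self' https://cdn.example.com")
```

Any source other than `'self'` in directives like `script-src` or `frame-src` names a third party the publisher has chosen to admit.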
Every change to the CSP has to be approved by the DPO or legal team, who should in any case maintain a list of processing partners. In many cases this step could be automated.
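That automation could be as simple as deriving the CSP header value directly from the DPO’s list of approved processors, so the policy and the legal record cannot drift apart. A sketch, with an illustrative (hypothetical) allowlist:

```python
# Hypothetical allowlist maintained by the DPO / legal team.
# Each directive maps to the sources approved for it.
APPROVED_PROCESSORS: dict[str, list[str]] = {
    "default-src": ["'self'"],
    "img-src": ["'self'"],
    "script-src": ["'self'", "https://cdn.example.com"],
}


def build_csp(directives: dict[str, list[str]]) -> str:
    """Serialise the allowlist into a Content-Security-Policy
    header value."""
    return "; ".join(
        f"{name} {' '.join(sources)}" for name, sources in directives.items()
    )


header_value = build_csp(APPROVED_PROCESSORS)
```

Adding a new tracker then requires a change to the allowlist, which is exactly the review point the DPO wants.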
For third-party processing based on consent, a server-side configuration could make the CSP’s content dependent on the presence of a consent-storing cookie.
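A minimal sketch of that idea: serve a strict policy by default, and only widen it when the request carries a consent cookie. The cookie name, its value, and the analytics origin are assumptions for illustration:

```python
from http.cookies import SimpleCookie

# Policy served to readers who have not consented.
STRICT_CSP = "default-src 'self'"
# Policy served once consent for analytics is recorded
# (analytics.example.com is a placeholder origin).
CONSENTED_CSP = "default-src 'self'; script-src 'self' https://analytics.example.com"


def csp_for_request(cookie_header: str) -> str:
    """Choose the CSP header value based on the request's
    Cookie header: widen it only if consent=analytics is set."""
    cookies = SimpleCookie(cookie_header)
    morsel = cookies.get("consent")
    if morsel is not None and morsel.value == "analytics":
        return CONSENTED_CSP
    return STRICT_CSP
```

The same branching could live in nginx or Apache configuration; the point is that without the cookie, the browser itself refuses to load the tracker.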