The EU’s proposed new privacy law would codify and expand the Right to Be Forgotten (RTBF) established in the 2014 decision Google Spain v AEPD and Mario Costeja González. It would require some range of Internet platforms to remove content that allegedly infringes on the privacy rights of individuals. This range definitely includes search engines, which already must comply with RTBF procedures, but it might also include social networks like Facebook, depending on whether they are considered “data controllers” when individuals post private information about themselves or others.

The New York Times Editorial Board laments that the new law would require Internet platforms to disappear (technically, de-index or delist) content requested to be removed, pending a decision about whether the requests are valid: “Disturbingly, news organizations and other websites would not have an opportunity to object to those immediate removals and might not even have a chance to protest permanent deletions.”

What The Times misses is that this is ALREADY the state of RTBF compliance. Only individuals whose requests to delist information are refused have the right to appeal. Publishers cannot appeal a decision to disappear their content. Nor does the public, deprived of potentially newsworthy information, have any rights. This is because the appeal goes to Data Protection Authorities, which are responsible for privacy, not for freedom of expression.

Although the RTBF seeks to be sensitive to the tradeoffs between individual privacy and freedom of expression, the mechanics of implementation leave it entirely to unaccountable private parties to strike the balance. All the incentives favor removing content, because there is no penalty for, or even review of, delisting decisions. For that reason, it’s not surprising that Google grants 42% of all delisting requests even under the current law, which is milder than what the new EU privacy law contemplates.
Frankly, it would not be surprising if that number climbed much higher. The problem is not just that content is disappeared; there may be good reason to delist content that injures privacy interests and has no public-interest value. The problem is that neither publishers nor the public knows why choices are being made to disappear content.

In an open letter to Google on the one-year anniversary of the RTBF decision, academics documented the kinds of data that Google and other intermediaries should release. This would be aggregate information containing no personally identifying details. It’s not clear whether the intermediaries are even collecting this data; they certainly aren’t releasing it. Google’s transparency report provides very little information about why it makes delisting decisions, and almost all of it is anecdotal.

If the EU adopts its new privacy law, it should include transparency requirements so that we know how private intermediaries are being enlisted to shape our information environment.