The ubiquitous use of advanced data mining technology in consumer markets not only threatens privacy but may also unfairly disadvantage protected groups through various forms of algorithmic discrimination. At the same time, however, regulators may use data technology to mitigate privacy and discrimination concerns (see also Hacker, 2017). The growing differentiation of services based on Big Data thus harbors the potential for both greater societal inequality and greater equality. Anti-discrimination law and transparency alone, however, cannot curb Big Data’s negative externalities while fostering its positive effects.
To rein in Big Data’s risks while harnessing its potential, BCCP Fellow Philipp Hacker and NYU’s Bilyana Petkova argue that regulatory strategies from behavioral economics (see also Hacker, 2016), contract law, and criminal law theory should be adapted to data-driven market environments.
Four strategies stand out. First, an active choice regime should be mandated for services offered in the digital economy, enabling users to choose between data-collecting services (paid for with data) and data-free services (paid for with money). This would give consumers a real and salient choice between exposure to data analytics, with its attendant risks of privacy harm and discrimination, and simple monetary payment. Second, the authors propose ex post judicial control of unfair privacy policies to prevent contracts that unreasonably favor data-collecting companies. Third, they suggest democratizing data collection through regular user surveys and through data compliance officers partially elected by users. Fourth, concerning regulators’ use of Big Data analytics for enforcement purposes, they trace personalization techniques to the old precept of treating like cases alike and different cases differently. The income- and wealth-responsive fines powered by Big Data that they propose offer a glimpse into the future of mitigating economic and legal inequality through personalized law.
Throughout these strategies, the authors show how the salience of data collection can be coupled with measures to prevent discrimination against, and exploitation of, users. Finally, they discuss all four proposals in the context of different test cases: social media, student education software, and the credit and cell phone markets.