Worldcoin, a cryptocurrency startup co-founded by OpenAI chief Sam Altman, has scanned the eyes and faces of more than 450,000 people across two dozen countries as part of its goal to amass 1 billion users.
However, the San Francisco-based company’s data collection tactics and lack of transparency have sparked sharp criticism over privacy risks.
Mass Biometric Collection Raises Red Flags
At the core of Worldcoin’s offering is an orb-shaped device that scans users’ irises and other biometric data to generate a unique digital ID.
The company claims this is necessary to prevent fraud and ensure each person gets one share of its cryptocurrency.
But privacy advocates argue this results in a centralised database of people’s biometric information that could be hacked or exploited.
“Digital ID systems increase state and corporate control over individuals’ lives and rarely live up to the extraordinary benefits technocrats tend to attribute to them,” said Big Brother Watch’s Madeleine Stone.
Dubious Practices to Lure Users in Developing Nations
Much of Worldcoin’s recruitment has focused on developing countries where people are more vulnerable due to lower incomes and weaker data protections.
Tactics have included misleading claims, pressure for personal details, payments to officials, and targeting of students.
“It’s a race to see who gets the most data in this AI-driven economy,” said digital anthropologist Payal Arora.
With tighter regulations in the EU and the US, entrepreneurs seeking AI training data look to the developing world, where it is cheaper and easier to collect biometric data at scale.
Flimsy Consent and Data Security Concerns
Many users report handing over biometric data and personal details such as email addresses without being told how the data would be used.
Worldcoin’s claim that email addresses and phone numbers were optional is contradicted by accounts of field staff demanding this information to sign people up.
While Worldcoin says biometric data stays on devices and is deleted after upload, security experts question whether these measures are genuinely hack-proof.
“It’s usually an economic question…if this project is as successful as they want it to be, then it’s going to become more profitable to try and tackle,” said cryptography professor Jeremy Clark.
GDPR Compliance Murky Despite EU Presence
Although Worldcoin operates in Europe, its privacy policy appeared to claim exemptions from the EU’s GDPR data protection law.
But legal experts say such carve-outs are not possible, and the company would still be subject to GDPR’s requirements and penalties.
Worldcoin maintains it is GDPR compliant but has not released its data protection impact assessment for public scrutiny, as recommended by the European Commission.
Technical Chaos Undermines Worldcoin’s Ambitions
Worldcoin’s technology has suffered numerous glitches, from iris recognition failures to disappearing accounts during testing.
Hardware malfunctions with its orb devices have hampered recruitment efforts, while a buggy app has drained phone batteries and deleted user profiles.
Some early adopters now wonder whether Worldcoin’s promised payments will ever materialise, as the ability to trade its tokens remains limited.
The company’s ability to scale up while resolving these issues remains uncertain.
Worldcoin’s Moonshot Aims Clash With On-the-Ground Realities
While Worldcoin’s founders may aspire to create a fairer, decentralised future, the project’s practices on the ground reveal a disconnect from that vision.
“Forget all those people,” said Arora, referring to the legions of test subjects. To Worldcoin, they are data to feed its AI, not rights-bearing humans.
As the project advances from its messy testing phase, its lofty ideals will be tested against growing calls for accountability. For now, Worldcoin’s globe-spanning eyeball grab has laid bare the human costs of Silicon Valley’s latest moonshot.