LinkedIn has spent the past year building what it describes as the most comprehensive identity verification system of any major professional network, requiring executives to confirm their titles, businesses to prove their official status, and recruiters to validate their workplace credentials. The aim is to stamp out the fraudulent accounts and recruitment scams that have plagued the platform. That push may now be complicated by its own infrastructure.
LinkedIn’s third-party ID verification partner, Persona, has come under fire for reportedly sharing users’ personal information with its own data partners and for collecting expanded data on users who verify their identity through the platform.
A privacy researcher who investigated the process found that Persona not only collects standard personal details such as name, address, and date of birth, but also extracts facial geometry data from photos, pinpoints geographic location, and examines behavioral biometrics. All of that information can be shared with a global network of partners, vendors, and sub-processors that handle personal data on Persona’s behalf.
Persona confirmed it had addressed potential vulnerabilities and that no secrets or customer data were exposed through the highlighted gaps. LinkedIn, when contacted for comment, referred only to Persona’s statement and offered no additional clarification of its own.
Persona’s statement reads:
On February 16, 2026, security researchers @vmfunc, @MDLcsgo, and @DziurwaF published a blog post identifying exposed frontend source maps on a non-production subdomain under withpersona-gov.com. Within an hour of learning about the post, we disabled the subdomain, confirmed that no secrets or customer data were exposed, and began a thorough internal review.
According to the researcher:
Here’s something I almost missed. Buried in a table on page 6 of the privacy policy, under “legitimate interests”:
They use uploaded images of identity documents — that’s my passport — to train their AI. They’re teaching their system to recognize what passports look like in different countries. They also use your selfie to “identify improvements in the Service.”
Persona has struck a humble tone, though the gesture may come across as tone-deaf at this point:
We recognize that trust in a company like Persona isn’t built by writing a single blog post. It’s built over time, through consistent transparency and action. We handle something deeply sensitive: helping verify that people are who they say they are. That responsibility demands that we engage openly when we make mistakes.
The company had earlier stated explicitly that no personal data is used for AI training, that any biometric data is deleted immediately after processing, and that all other personal data is deleted within 30 days.
