LinkedIn Faces Lawsuit Over Alleged Use of Customer Data in AI Training

Redact
3 min read

Update: this lawsuit has been withdrawn, as it was brought on the basis that LinkedIn used private messages to train AI. While LinkedIn did not do this, it did automatically opt all users into AI training without obtaining informed consent (TechMonitor).

In a recent development, Microsoft-owned LinkedIn is under legal scrutiny for allegedly disclosing customer information to train artificial intelligence models. The lawsuit claims that LinkedIn used personal data without users’ consent, raising significant concerns about privacy and data protection (Reuters).


What Is the Case About?

According to Reuters, the lawsuit alleges that LinkedIn collected and shared customer data to develop and enhance AI technologies. This practice, if true, could involve using sensitive user information, such as profiles, messages, and activity logs, without explicit permission.

LinkedIn has responded, stating that “these are false claims with no merit” (BBC), though the merits of the lawsuit will remain uncertain until an outcome is reached.

While LinkedIn has stated its commitment to privacy and compliance with data protection laws, the case highlights ongoing debates around:

  • Transparency in AI training: How data is collected, shared, and used.
  • Consent: Whether users are adequately informed and can opt out of such practices.
  • Legal and ethical standards: How companies balance innovation with user rights.

What Does This Mean for You?

Even if you don’t use LinkedIn, this lawsuit underscores a broader issue: the use of personal data to fuel AI advancements. Similar practices may occur across platforms, leaving many users unaware of how their online activity contributes to AI development.

Given Trump’s recent decision to revoke Biden’s AI protections and provide federal support to AI developers, there is a distinct possibility that non-consensual data collection for AI training could expand dramatically in the coming years.

The implications of this include:

  • Privacy Risks: Your personal data could be used in ways you didn’t intend.
  • Ethical Concerns: How companies handle data affects trust and accountability.
  • Proactive Protection: Users need tools to take control of their digital footprint.

How Redact.dev Helps You Protect Your Data

At Redact.dev, we empower users to regain control of their online presence by offering tools to:

  • Identify and delete old content that could be used for purposes like AI training.
  • Clean up profiles across platforms like LinkedIn, Twitter, and more.
  • Stay informed with updates about privacy risks and data protection.

As legal cases like this unfold, Redact.dev ensures your digital footprint aligns with your expectations.


Conclusion

The LinkedIn lawsuit serves as a critical reminder of the importance of online privacy in an AI-driven world. Whether the allegations hold up in court or not, staying vigilant about your data is essential. With Redact.dev, you can take proactive steps to protect your privacy and ensure your data doesn’t end up where you don’t want it.

© 2025 Redact - All rights reserved