How LinkedIn’s AI is learning from your data

by Dr James Birt

LinkedIn’s recent privacy update has sparked a vital discussion about data ethics and user consent, exposing the disparity between the protections afforded to Australian users and those available to European users.

The platform's latest feature, introduced in September, allows LinkedIn to use user data—including posts, messages and interactions—to train artificial intelligence models.  

This decision has raised significant concerns about data transparency and the effectiveness of opt-out options.

The new feature, termed “Data for Generative AI Improvement,” is activated by default.

This means that unless users actively choose to opt out, their data is automatically included in LinkedIn’s AI training processes.

This default setting often results in users being included in data-sharing without their explicit awareness or consent, raising important questions about control over personal information and its management.

LinkedIn’s approach reflects a broader, troubling trend in data privacy practices.

Opting out does not undo past use of data: information already collected and incorporated into AI development is not erased.

This reliance on user passivity to advance AI technology highlights a deeper problem in how companies handle personal data.

Opting out of LinkedIn’s data-sharing feature is straightforward but varies slightly between devices.

On the desktop version, users must open the settings and privacy menu, select “Data Privacy”, and toggle off the “Data for Generative AI Improvement” option (“Use my data for training content creation AI models”).

On mobile, users must tap their profile picture, go to “Settings”, then “Data Privacy”, and toggle off the same option. However, this action only prevents future data use; it does not affect data already used for AI development.

LinkedIn’s practices contrast sharply with the protections available in the European Union.

Under the General Data Protection Regulation, the EU mandates strict rules on data collection and usage, requiring explicit consent before using personal data for purposes such as AI training.

LinkedIn adheres to these regulations by excluding EU users' data from AI training, showing a higher standard of user protection.

Australian users, however, do not enjoy similar safeguards, exposing a significant gap in privacy protections.

This disparity highlights the inadequacies in Australian data privacy laws.

Currently, Australian protections fall short compared to European standards, leaving users vulnerable to extensive data scraping practices without adequate recourse.

LinkedIn's decision to implement default data-sharing settings without comprehensive prior notification or user consent exacerbates these concerns.

LinkedIn’s use of user data for AI training before updating its terms of service has been criticised.  

Users were not initially informed of this data use, raising questions about the transparency of LinkedIn’s data practices.

Although LinkedIn has since updated its privacy policy, this reactive approach underscores a broader issue with how technology companies manage user data.

This situation mirrors a similar controversy involving Meta, the parent company of Facebook and Instagram.  

Meta has admitted to scraping public data from Australian users to train its AI models without offering an opt-out option similar to those available in the EU.

This lack of privacy options for Australian users represents a significant lapse in data protection, which critics argue requires stronger legislative action.

The central issue here is the inadequacy of current privacy laws to address the realities of data collection and AI development.  

Users are often left to navigate complex privacy settings and opt-out processes that can be confusing and insufficient.

Protecting personal data should not fall solely on users; instead, regulations must ensure that companies like LinkedIn and Meta are held accountable for their data practices.

In response to these concerns, the Australian government must strengthen privacy laws to better safeguard users from invasive data practices.

Anticipated reforms to the Privacy Act should address the shortcomings highlighted by recent reviews, prioritising comprehensive data protections and enforcing stricter consent requirements for data usage.

As the digital landscape evolves, so must our approach to data privacy.

Relying on opt-out mechanisms, particularly when data has already been collected, is inadequate.

A proactive framework is needed to ensure that users are fully informed and have meaningful control over their data.  

Until such measures are in place, the challenge of protecting personal information amidst aggressive data practices will persist.

For now, Australian users must remain vigilant about their privacy settings and be aware of how their data is used.

  • Dr James Birt is an Associate Professor of Creative Media Studies at Bond University and Associate Dean of External Engagement for the Faculty of Society & Design. 
