Understanding the Digital Services Act and the Online Safety Act and Their Impact on Recruitment

The digital landscape has transformed significantly over the past decade. The delivery of harmful and toxic content to vulnerable adults and minors has led campaigners to put intense pressure on legislators to develop frameworks that address online safety and misinformation. The European Union’s Digital Services Act (DSA) and the UK’s Online Safety Act (OSA) are significant measures aimed at enhancing online safety, data protection, and content compliance. This blog explores the problems these Acts are designed to solve, their key measures, and their impact on data privacy, data protection, and content compliance recruitment in the UK and Europe.

Problems Addressed by the Digital Services Act and the Online Safety Act

Misinformation and Disinformation

Both the DSA and OSA aim to combat the spread of misinformation and disinformation. With the rise of social media, false information can spread rapidly, influencing public opinion and undermining trust in democratic processes. These legislative frameworks seek to ensure that digital platforms are held accountable for the content they disseminate.

Illegal Content

Another significant issue is the presence of illegal content, including hate speech, terrorist propaganda, and child exploitation material. The DSA and OSA mandate stricter content moderation practices to swiftly identify and remove such content, thereby protecting users and maintaining the integrity of online platforms.

Data Privacy and Protection

In an era where data is a valuable commodity, ensuring data privacy and protection is paramount. The DSA and OSA introduce measures to safeguard personal data, requiring platforms to adhere to stringent data handling and storage practices to prevent breaches and misuse.

Transparency and Accountability

Both Acts emphasise the need for transparency and accountability in how digital platforms operate. This includes clear reporting mechanisms for content removal, transparency in advertising, and disclosure of how algorithms are used in content moderation and recommendation systems.

Key Measures of the Digital Services Act

Scope and Definitions

The DSA applies to a broad range of digital services, including online intermediaries, hosting services, online platforms, and very large online platforms (VLOPs), defined as those with more than 45 million monthly active users in the EU, roughly 10% of the EU’s population.

Risk Assessments and Mitigation

VLOPs are required to conduct annual risk assessments to identify and mitigate systemic risks related to the dissemination of illegal content, impacts on fundamental rights, and threats to public health and security.
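
To make the exercise concrete, here is a minimal sketch of how a platform might record entries in a systemic-risk register, assuming a simple likelihood-times-impact scoring scale; the risk categories echo the DSA’s themes, but the field names and scale are illustrative assumptions rather than anything the Act prescribes.

```python
from dataclasses import dataclass

# Illustrative systemic-risk register entry. The categories mirror the
# DSA's themes (illegal content, fundamental rights, public health and
# security); the 1-5 scoring scale is an assumption, not a DSA rule.
@dataclass
class RiskEntry:
    category: str      # e.g. "illegal content", "fundamental rights"
    description: str
    likelihood: int    # 1 (rare) to 5 (almost certain)
    impact: int        # 1 (minor) to 5 (severe)
    mitigation: str

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

register = [
    RiskEntry("illegal content", "Coordinated spread of terrorist material",
              likelihood=2, impact=5, mitigation="Hash-matching plus human review"),
    RiskEntry("fundamental rights", "Over-removal chilling lawful speech",
              likelihood=3, impact=3, mitigation="Appeals channel and sampling audits"),
]

# Surface the highest-scoring risks first for the annual review.
for entry in sorted(register, key=lambda e: e.score, reverse=True):
    print(f"[{entry.score:>2}] {entry.category}: {entry.description}")
```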

Transparency Reporting

Platforms must publish regular transparency reports detailing their content moderation activities, including the number of content removals, the reasons for removal, and the processes involved.
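
As a rough illustration of what feeding such a report might look like in practice, the sketch below aggregates individual moderation actions into headline figures; the record fields and reason labels are assumptions made for the example, not terms defined by the DSA.

```python
from collections import Counter
from datetime import date

# Hypothetical log of moderation actions; the field names are
# illustrative, not taken from the DSA.
actions = [
    {"date": date(2024, 3, 1), "reason": "hate speech", "source": "user report"},
    {"date": date(2024, 3, 2), "reason": "hate speech", "source": "automated"},
    {"date": date(2024, 3, 5), "reason": "terrorist content", "source": "automated"},
]

# Aggregate the log into the kind of headline figures a transparency
# report summarises: total removals, removals by reason, and how each
# item was detected.
report = {
    "total_removals": len(actions),
    "removals_by_reason": dict(Counter(a["reason"] for a in actions)),
    "removals_by_detection_source": dict(Counter(a["source"] for a in actions)),
}

for key, value in report.items():
    print(f"{key}: {value}")
```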

User Empowerment

The DSA mandates that platforms provide users with tools to report illegal content easily and to appeal content removal decisions. Additionally, users should have more control over their data and how it is used in advertising and content recommendations.
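
One way to picture this notice-and-appeal flow is as a small data model: a user files a notice about a piece of content, moderation acts on it, and the uploader can contest the decision. The sketch below is a hypothetical illustration, with names and statuses chosen for the example rather than drawn from the DSA’s text.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical model of the user-facing flow: a notice about content,
# followed by an appeal against the resulting decision.
@dataclass
class Notice:
    content_id: str
    reporter_id: str
    reason: str
    filed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: str = "open"        # open -> actioned / rejected

@dataclass
class Appeal:
    notice: Notice
    appellant_id: str
    grounds: str
    outcome: str = "pending"    # pending -> upheld / overturned

notice = Notice(content_id="post-123", reporter_id="user-42",
                reason="suspected illegal hate speech")
notice.status = "actioned"      # moderation removed the content

# The uploader contests the removal through the appeal channel.
appeal = Appeal(notice=notice, appellant_id="user-7",
                grounds="Content is lawful political commentary")
print(appeal.outcome)           # "pending" until a reviewer decides
```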

Key Measures of the Online Safety Act

Scope and Definitions

The OSA primarily targets user-to-user services, such as social media platforms and other online services that host user-generated content, as well as search services. It aims to ensure that these platforms take proactive steps to protect users from illegal and harmful content.

Duty of Care

Platforms have a duty of care to protect users, especially minors, from harmful content. This includes implementing robust age verification measures and content filtering tools.
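
As a simplified illustration of where such a check sits, the sketch below gates age-restricted content on a computed age. Real OSA-grade age assurance relies on verified signals such as third-party checks rather than a self-declared birth date, and the threshold here is an assumption for the example.

```python
from datetime import date

# Assumed threshold for this category of restricted content.
MINIMUM_AGE = 18

def age_on(birth_date: date, today: date) -> int:
    # Standard age calculation: subtract a year if the birthday
    # has not yet occurred this calendar year.
    return today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )

def can_view_restricted(birth_date: date) -> bool:
    return age_on(birth_date, date.today()) >= MINIMUM_AGE

print(can_view_restricted(date(2010, 6, 1)))  # False until the user turns 18
```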

Regulatory Oversight

The Office of Communications (Ofcom) is designated as the regulator responsible for enforcing the OSA. Ofcom has the authority to impose fines and other penalties on platforms that fail to comply with the Act’s provisions.

Transparency and Reporting

Similar to the DSA, the OSA requires platforms to be transparent about their content moderation practices and to publish regular reports on their efforts to tackle illegal and harmful content.

Impact on Data Privacy and Data Protection

Enhanced Data Privacy Measures

Both the DSA and OSA introduce stringent data privacy measures to protect users. Platforms must ensure that personal data is handled in compliance with existing regulations like the General Data Protection Regulation (GDPR). This includes obtaining explicit user consent for data collection and processing, implementing robust security measures to prevent data breaches, and providing users with greater control over their data.

Data Handling and Storage Practices

The Acts require platforms to adopt secure data handling and storage practices. This includes ensuring that data is stored securely, minimising data retention periods, and conducting regular audits to ensure compliance with data protection standards.
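
A minimal sketch of retention-period enforcement, assuming a single 90-day window and a simple record shape (real policies vary by data category and jurisdiction), might look like this:

```python
from datetime import datetime, timedelta, timezone

# Assumed retention window; real policies differ per data category.
RETENTION = timedelta(days=90)

def purge_expired(records, now=None):
    # Keep only records created within the retention window and
    # report how many were purged.
    now = now or datetime.now(timezone.utc)
    cutoff = now - RETENTION
    kept = [r for r in records if r["created_at"] >= cutoff]
    return kept, len(records) - len(kept)

records = [
    {"id": 1, "created_at": datetime(2024, 1, 2, tzinfo=timezone.utc)},
    {"id": 2, "created_at": datetime.now(timezone.utc)},
]

records, purged = purge_expired(records)
print(f"purged {purged} expired record(s); {len(records)} remain")
```

A scheduled job along these lines, run alongside regular audits, is one simple way to demonstrate that retention limits are actually enforced rather than merely documented.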

Impact on Compliance Costs

Implementing these data privacy and protection measures can be costly for platforms. They may need to invest in new technologies, hire additional compliance staff, and conduct regular training for employees to ensure adherence to the regulations.

Impact on Content Compliance Recruitment

Increased Demand for Compliance Professionals

The stringent requirements of the DSA and OSA have led to an increased demand for compliance professionals. Companies need experts who can navigate the complex regulatory landscape, conduct risk assessments, and implement effective content moderation strategies. This has resulted in a surge in recruitment for roles such as compliance officers, data protection officers, and content moderators.

Specialised Skill Sets

The evolving regulatory environment necessitates specialised skill sets. Compliance professionals need a deep understanding of data privacy laws, content moderation techniques, and the ability to analyse and mitigate risks associated with digital content. This has led to growing demand for training and certification programmes to equip professionals with the necessary skills.

Cross-Border Compliance

Given the global nature of digital platforms, compliance professionals must navigate multiple regulatory frameworks. This requires an understanding of international laws and the ability to ensure that platforms comply with regulations in different jurisdictions. This complexity has further driven the demand for skilled compliance professionals.

Conclusion

The Digital Services Act and the Online Safety Act represent significant steps towards creating a safer and more transparent digital environment. By addressing issues such as misinformation, illegal content, and data privacy, these legislative frameworks aim to protect users and hold digital platforms accountable. The stringent requirements of these Acts have profound implications for data privacy, data protection, and content compliance recruitment. As companies strive to comply with these regulations, the demand for skilled compliance professionals continues to grow, underscoring the importance of expertise in navigating the evolving digital regulatory landscape.