
The Importance of Protecting Freedom of Thought in the Digital Age

Freedom of thought is a fundamental human right that allows individuals to think freely and to express and exchange ideas without fear of censorship or persecution. In today’s digital age, this right is facing new and unprecedented challenges. In a recent policy brief titled “Protecting Freedom of Thought in the Digital Age,” the Centre for International Governance Innovation (CIGI) highlights these challenges and provides recommendations for protecting this essential human right. 

The Challenges to Freedom of Thought in the Digital Age 

The CIGI policy brief highlights that the digital age has brought new challenges to freedom of thought. One of the main challenges is the increasing use of digital technologies to monitor individuals’ online activities, including their thoughts and beliefs. Governments and private companies are now capable of collecting vast amounts of personal data, raising concerns about privacy and freedom of expression. 

Another challenge is the spread of disinformation and fake news, which can undermine freedom of thought by influencing public opinion and promoting falsehoods. The CIGI brief notes that this can have a significant impact on democratic processes, including elections. 

Finally, the brief highlights the growing power of technology companies in shaping public discourse. Social media platforms and search engines can influence what information is seen and heard, raising concerns about the potential for bias and the suppression of certain viewpoints. 

Protecting Freedom of Thought: Recommendations 

The CIGI policy brief provides several recommendations for protecting freedom of thought in the digital age. One of the main recommendations is the need for stronger legal frameworks to protect individuals’ privacy and freedom of expression online. This includes measures such as data protection laws and laws to regulate the use of surveillance technologies.  

The brief also recommends increased transparency and accountability from technology companies. This includes greater transparency about the algorithms used by social media platforms and search engines, as well as greater accountability for the impact of these technologies on public discourse.  

Finally, the brief highlights the importance of education and digital literacy in promoting freedom of thought. This includes teaching individuals how to identify and critically evaluate information online, as well as promoting media literacy and digital citizenship. 

Conclusion 

In conclusion, the CIGI policy brief “Protecting Freedom of Thought in the Digital Age” highlights the challenges facing freedom of thought in the digital age and provides recommendations for protecting this essential human right. As technology continues to shape our lives and societies, it is essential that we take steps to safeguard our fundamental freedoms, including freedom of thought. By implementing the recommendations outlined in this brief, we can work towards a more just and democratic digital future. 


Data Privacy Laws for Children 

As technology becomes more prevalent in our daily lives, protecting personal data, especially that of children, has become increasingly important.

In recent years, governments around the world have enacted laws to protect children’s data privacy, but there is still much work to be done. 

One of the most comprehensive data privacy laws for children is the Children’s Online Privacy Protection Act (COPPA), which was enacted in the United States in 1998.

COPPA requires websites and online services to obtain parental consent before collecting personal information from children under the age of 13. The law also requires websites to post a clear and concise privacy policy, which must explain what information is being collected, how it is being used, and how it will be shared. 

Additionally, COPPA requires websites to provide parents with the option to review and delete their child’s personal information. 
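
To make the parental-consent requirement concrete, here is a minimal Python sketch of the kind of age gate COPPA contemplates. The field names and the helper are purely illustrative assumptions, not part of any real compliance toolkit, and actual COPPA compliance also involves verifiable consent mechanisms, privacy notices, and record-keeping.

```python
from dataclasses import dataclass

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13


@dataclass
class User:
    age: int
    has_verified_parental_consent: bool = False  # hypothetical flag


def may_collect_personal_info(user: User) -> bool:
    """Return True only if collection is permitted under a COPPA-style rule:
    children under 13 require verifiable parental consent first."""
    if user.age < COPPA_AGE_THRESHOLD:
        return user.has_verified_parental_consent
    return True


print(may_collect_personal_info(User(age=12)))                                      # False
print(may_collect_personal_info(User(age=12, has_verified_parental_consent=True)))  # True
print(may_collect_personal_info(User(age=16)))                                      # True
```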

COPPA has been successful in protecting children’s data privacy, but there are concerns that the law is outdated and does not adequately address newer technologies such as social media and mobile apps. To address these concerns, the Federal Trade Commission (FTC), which enforces COPPA, has proposed updates to the law. These updates include expanding COPPA’s coverage to include social media platforms and mobile apps, as well as strengthening parental consent requirements. 

Other countries have also enacted data privacy laws for children. 

In the European Union, the General Data Protection Regulation (GDPR) includes specific provisions for the protection of children’s data privacy. The GDPR requires parental consent for the processing of children’s personal data up to the age of 16, although individual EU member states can choose to lower this age to 13. 

In Australia, the Privacy Act 1988 includes a set of 13 Australian Privacy Principles (APPs) that govern the handling of personal information by Australian government agencies and many businesses, and its protections extend to children’s personal information. The Office of the Australian Information Commissioner advises organizations to assess whether a child has the capacity to consent and to obtain consent from a parent or guardian where the child does not. 

Despite the existence of these laws, there are still concerns that companies are not doing enough to protect children’s data privacy.  

A 2019 study by the FTC found that many mobile apps aimed at children were collecting data without parental consent, and a 2020 study by the Norwegian Consumer Council found similar issues with popular social media platforms. 

It is important for parents and caregivers to be aware of these data privacy laws and to take steps to protect their children’s personal information. This includes reading privacy policies, reviewing app permissions, and talking to their children about online privacy. 

In conclusion, protecting children’s data privacy is crucial in today’s digital age. While laws such as COPPA and the GDPR have been enacted to address this issue, there is still much work to be done to ensure that companies are following these regulations and adequately protecting children’s personal information. Parents and caregivers can play a role in this by educating themselves and their children about online privacy and taking steps to protect their personal data. 

Further Reading: 

Federal Trade Commission. (n.d.). Children’s Online Privacy Protection Rule: A Six-Step Compliance Plan for Your Business.

Federal Trade Commission. (2020). Complying with COPPA: Frequently Asked Questions.

European Commission. (2018). Data protection rules for children.


Data Privacy in the Workplace 

Employee Monitoring 

Employee monitoring refers to the use of different surveillance and data collection techniques by an employer. These techniques may include key cards, biometrics, and other electronic monitoring practices, as well as employee monitoring software (such as computer & workstation monitoring, internet & social media monitoring, video & audio monitoring, etc.). 

Most businesses keep tabs on their workforce to improve employee concentration and productivity while ensuring data security by monitoring how employees use information and what information they access. Based on what is being observed and whether the employee is aware that they are being tracked, these employee monitoring techniques can be roughly characterized as invasive or non-invasive. The main employee monitoring tools are discussed below. 

Computer and workstation monitoring – While the EU allows computer monitoring provided employees are notified in advance and it is done for legitimate business purposes without limiting employee rights to privacy, various Acts in the US legalize electronic surveillance of all actions on company-owned computers. 

Internet and social media monitoring – Employers are permitted to create social media policies in the US. Although the GDPR doesn’t have specific guidelines for monitoring social media and internet use at work, its privacy laws may have restrictions on what you can and cannot monitor. 

Monitoring screen content and keystrokes – Employers are permitted to use this technique on company-owned computers in the US, but given the nature of the tool, it is advised that they get employee consent. In most cases, using monitoring tools that record keystrokes or take images of employees’ displays is prohibited by the GDPR. 

Monitoring private messages and emails – Any email or private communication sent or received on a company-owned device is regarded as corporate property in the US. Because of this, it is acceptable for businesses to monitor confidential emails and messages. Email monitoring is legal under the GDPR if the employee is informed and consents to it, the information collected about the employee through email monitoring is handled securely, and the company has a retention policy for emails and deletes them when that period expires. 

Monitoring company phone conversations – Employers may only listen to calls and voicemails for proper business purposes in the US. However, since businesses have the right to monitor their own phones, there is some ambiguity when an employee uses a corporate phone for a personal call. Voicemails and phone calls are considered personal information under the GDPR; therefore, corporations must first get the participants’ consent before listening in on them. 

Video surveillance – Video surveillance is permitted under federal law when done for legitimate business-related purposes, such as general safety or theft prevention. However, audio recording should not be included with the video recording. Most surveillance footage features people who have not consented to being recorded, and under the GDPR, identifiable faces are considered personal data. 

Monitoring personal devices – Monitoring of personal devices is permitted in the US if the employer has established clear standards regarding it. The GDPR is highly strict regarding personal device monitoring since it places a strong emphasis on defending employees’ privacy. It forbids employers from accessing the personal information on employees’ devices through scanning software. 

Monitoring employee location – Although it is strongly advised to tell employees and gain their approval, US regulations do not specifically regulate location monitoring. If a business wants to track the whereabouts of remote employees, the GDPR mandates that they carry out a DPIA (Data Protection Impact Assessment), which then provides a legal basis for monitoring staff. 

GDPR and CCTV 

For many businesses, CCTV is an essential security tool. However, without the proper CCTV policy in place, you can end up breaking rigorous privacy regulations that protect people’s rights. 

The GDPR, which emphasizes that personal data should only be kept for as long as required, applies to any surveillance operations conducted outside of a person’s domestic property. Employers must therefore be able to justify the use of surveillance, identify those who are recorded, state how long they want to retain the footage, and describe how the data will be maintained and protected. 

Recommendations 

Look at the relevant laws – To be sure they are adhering to the law, employers should consult law firms. This is particularly crucial when specific employment laws change and revised procedures are required, and especially for employers who monitor a remote workforce spread across the globe. 

Be transparent about everything – Even where it is not strictly required, it is usually a good idea to be open and honest with employees regarding monitoring procedures. They will be more accepting of the measures if they understand the reasoning behind them. 

Use employee-friendly tools – Employers should refrain from using tools that secretly watch their employees, such as background-running keyloggers. Employee trust could decline as a result, and there may also be legal repercussions. 


Privacy Law in Australia

Australia’s privacy law is governed by the Privacy Act 1988, which outlines the principles of privacy protection and regulates the handling of personal information by private and public organizations. 

This article aims to provide a comprehensive guide to privacy law in Australia, covering the Privacy Act, data protection, and privacy rights in the country. 

Privacy Act 1988 

The Privacy Act 1988 is the primary law governing privacy in Australia. It applies to private sector organizations with an annual turnover of over AUD 3 million, all Australian government agencies, and some other organizations such as health service providers, credit reporting agencies, and businesses that handle tax file numbers. 

The Privacy Act regulates the collection, use, storage, and disclosure of personal information by organizations. It also provides individuals with the right to access and correct their personal information held by organizations. 

Data Protection 

The Privacy Act also contains the Australian Privacy Principles (APPs), which set out the standards for handling personal information. The APPs cover various aspects of data protection, including the collection, use, and disclosure of personal information, data quality and security, and the right to access and correct personal information. 

Under the APPs, organizations must generally obtain an individual’s consent before collecting sensitive information, and they must only collect personal information that is reasonably necessary for their functions or activities. Organizations must also take reasonable steps to ensure that personal information is accurate, up-to-date, and secure. 

Privacy Rights in Australia 

In addition to the rights provided by the Privacy Act, individuals in Australia have other privacy protections. The Australian Constitution does not explicitly recognize a right to privacy, and the courts have not established a general constitutional right to privacy; protection instead comes primarily from legislation and, in limited circumstances, the common law, which together help individuals guard against unreasonable intrusions into their private lives and maintain control over their personal information. 

In addition, Australia has enacted other laws that protect privacy rights, such as the Telecommunications (Interception and Access) Act 1979, which regulates the interception of communications, and the Spam Act 2003, which regulates the sending of unsolicited electronic messages. 

Conclusion 

Privacy is a crucial aspect of individual freedom, and it is essential to understand how it is protected in your country. In Australia, privacy law is governed by the Privacy Act 1988, which regulates the handling of personal information by organizations. The Act contains the Australian Privacy Principles, which set out the standards for data protection. Individuals in Australia also have additional privacy protections under other legislation, such as the Telecommunications (Interception and Access) Act 1979 and the Spam Act 2003. By understanding privacy law in Australia, individuals can better protect their personal information and maintain control over their privacy. 


Privacy Law within the African Union

The African Union (AU) is a continental organization composed of 55 member states in Africa, with the aim of promoting peace, prosperity, and development across the continent. 

The Legal Framework 

The AU has established legal instruments that guide privacy law across its member states. One such instrument is the African Union Convention on Cyber Security and Personal Data Protection, also known as the Malabo Convention. This convention was adopted in 2014 and provides a framework for data protection and privacy within the AU.  

The Key Principles  

The Malabo Convention outlines key principles of privacy law within the AU, which include:  

  1. Data Protection: This principle emphasizes the protection of personal data, ensuring that individuals’ data is collected, processed, and stored in a lawful and secure manner. 
  2. Consent: This principle requires that individuals provide their informed consent before their personal data is collected and processed. 
  3. Purpose Limitation: This principle states that personal data should only be collected and used for the specific purpose for which it was collected, and not for any other unrelated purposes. 
  4. Data Minimization: This principle emphasizes that only the minimum amount of personal data necessary for the intended purpose should be collected and processed. 
  5. Data Security: This principle requires that appropriate technical and organizational measures be in place to protect personal data from unauthorized access, loss, or destruction.  

Data Subject Rights  

The Malabo Convention recognizes the rights of data subjects, which include: 

  1. Right to Access: Data subjects have the right to access their personal data that is being processed by data controllers. 
  2. Right to Rectification: Data subjects have the right to request correction of inaccurate or incomplete personal data. 
  3. Right to Erasure: Data subjects have the right to request the deletion of their personal data in certain circumstances. 
  4. Right to Object: Data subjects have the right to object to the processing of their personal data for certain reasons, such as direct marketing or profiling.  

The Obligations of Data Controllers and Processors  

The Malabo Convention imposes obligations on data controllers and processors, which include: 

  1. Lawful Processing: Personal data should be processed in accordance with applicable data protection laws and regulations. 
  2. Data Breach Notification: Data controllers and processors are required to notify data subjects and relevant authorities in the event of a data breach that could result in harm to the data subjects. 
  3. Cross-Border Data Transfers: Personal data can only be transferred outside of the AU if the receiving country has an adequate level of data protection, or if appropriate safeguards are in place.  

Enforcement and Remedies  

The Malabo Convention provides for enforcement mechanisms and remedies for violations of privacy law within the AU. This may include sanctions, fines, and other legal actions against data controllers and processors who fail to comply with the provisions of the convention.  

In addition, individual AU member states may have their own data protection laws and regulations that complement the convention.  


 Privacy Regulations in Canada  

Earning and maintaining consumer trust is not just a good idea – it’s central to creating brand loyalty. Great companies know that loyalty is an asset that will pay dividends in terms of growth and profitability for years to come. And every company today recognizes that trust goes hand in hand with personal and data privacy. Protecting your customers’ information is more vital than ever before.

Conversely, failing to protect your customers’ information is getting much more expensive, thanks to more laws and regulations and more costly penalties in jurisdictions around the world. Canada’s Bill C-27, introduced in 2022 and expected to become law in 2023, is one of the most recent examples of privacy statutes designed to give consumers more control over their personal information. The new legislation also amends the current approach to enforcement and penalties against companies.  

Bill C-27, also known as the Digital Charter Implementation Act, 2022, is designed to protect consumer privacy more fully through the Consumer Privacy Protection Act (CPPA); it also creates new requirements for “algorithmic transparency” through the Artificial Intelligence and Data Act. Part 2 of Bill C-27 proposes a substantial transformation in the enforcement of the CPPA through a new organization, the Personal Information and Data Protection Tribunal, a specialized administrative body that will have the power to directly levy monetary penalties against organizations for contraventions of the CPPA.  

It’s expected that Bill C-27 will become Canadian law sometime in 2023, replacing the Personal Information Protection and Electronic Documents Act (PIPEDA) of 2000. It codifies into law the ten principles of Canada’s Digital Charter, including “Strong Enforcement and Real Accountability.” Under PIPEDA, Canada’s federal Office of the Privacy Commissioner (OPC) is responsible for enforcement and has substantial powers of investigation and audit; however, its reports are generally non-binding. The OPC can “name and shame” a company to nudge it toward compliance, but at the moment it must apply to the federal court to request enforcement of its recommendations.   

For example, consider the 2019 data breach at a Canadian financial services firm, which affected more than 40% of the company’s clients and members and went unreported for six months. The breach quickly became headline news, and eventually the Privacy Commissioner issued a report highlighting the firm’s lack of oversight and accountability and made several recommendations. In this case, the OPC was restricted to issuing recommendations only and was unable to levy any administrative monetary penalty – although it is worth noting that a class action was brought directly by those affected by the breach. Under the proposed Personal Information and Data Protection Tribunal Act, this would change: the OPC would be able to make recommendations to the newly established Tribunal, and any decision of the Tribunal would be final and binding, not subject to appeal.  

Consequently, both the OPC and the new Tribunal will have substantially more power to enforce certain provisions of Bill C-27 directly against organizations. If you do business in Canada, you’re going to have to be much more vigilant and accountable about how you’re gathering and using data. You’ll be required to create and maintain privacy management programs that reflect the volume and sensitivity of the information being collected. If you don’t comply, you can face administrative monetary penalties.  

Data breach notification under Bill C-27  

  • One area of particular concern is the notification of data breaches. Companies doing business in Canada will have to be more proactive (and faster) about reporting data breaches, and failing to do so could cost a substantial amount, in addition to any hit to your reputation. Under section 94(1) of the CPPA, failure to implement sufficient security safeguards that results in a data breach could see companies liable for AMPs of up to 3% of global annual revenue or CAN$10 million – whichever is higher. Significantly, failing to report a data breach is even more expensive: the maximum fine is 5% of global revenue or CAN$25 million (again, whichever number is higher); see the calculation sketch after this list.   
  • Data breaches, of course, generate headlines, but now they can also result in substantial penalties. And it is important to note that data breaches are not the only situations where AMPs may be levied. Under the new law, the Tribunal will be able to impose fines for misusing personal information or for not enabling proper consumer access to collected information. Consumers have a right to know what has been collected about them and the right to have it disposed of properly if desired. Finally, under section 107(1) of the new privacy legislation, individuals whose privacy rights have been violated will be able to bring a private right of action against the company responsible – another potential source of reputational damage and financial exposure.  
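
The penalty caps above follow a simple "greater of a percentage of revenue or a fixed floor" rule. The Python sketch below illustrates that arithmetic for a hypothetical company with CAN$2 billion in global annual revenue; the function name and figures are illustrative only, and the actual amount of any AMP would be determined by the Tribunal.

```python
def max_amp(global_annual_revenue_cad: float, pct: float, floor_cad: float) -> float:
    """Maximum administrative monetary penalty: the greater of a percentage of
    global annual revenue or a fixed floor, per the caps described above."""
    return max(global_annual_revenue_cad * pct, floor_cad)


revenue = 2_000_000_000  # hypothetical CAN$2B in global annual revenue

# Insufficient safeguards leading to a breach: up to 3% or CAN$10 million.
print(f"Safeguards failure cap: CAN${max_amp(revenue, 0.03, 10_000_000):,.0f}")  # CAN$60,000,000

# Failing to report a breach: up to 5% or CAN$25 million.
print(f"Unreported breach cap:  CAN${max_amp(revenue, 0.05, 25_000_000):,.0f}")  # CAN$100,000,000
```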

What you should be doing to prepare 

  • Your company should already be actively engaged in ensuring that you closely govern what kind of information you gather from customers and users as well as how you collect, store, and manage that data – legislation or no legislation. Failing to do this will put your reputation at risk.   
  • But given the likely passage of Bill C-27 and the establishment of the new Tribunal, you should also make sure your privacy management strategy is up to date. Put automated systems and guardrails in place to ensure compliance with the new provisions of the law in general and to monitor for data breaches. Doing so will help you avoid problems in the first place – and if a problem does occur, you should be able to identify it and report it quickly, which will help you avoid penalties.   

If this sounds a bit like the European Union’s General Data Protection Regulation (GDPR), you’re right. Canada and others (notably several states in the U.S.) have used GDPR as a model for the monitoring and enforcement of data privacy. 


Summary of data regulations in Florida 

On June 6, 2023, Governor Ron DeSantis signed Senate Bill 262 to create the Florida Digital Bill of Rights (FDBR). The law is scheduled to go into effect on July 1, 2024. Although the FDBR resembles other newly enacted state privacy laws, it has several unique aspects that add additional levels of analysis to determining multi-state privacy compliance. 

To qualify as a data controller under the FDBR, an organization must have $1 billion in global gross revenue and satisfy one of the following:

  • Derive 50 percent of its global gross revenue from the sale of advertisements online. 
  • Operate a consumer smart speaker and voice command service. 
  • Operate an app store or digital distribution platform with at least 250,000 different software applications.

Based on these requirements, it is clear that the FDBR is targeting large technology and advertising companies. However, the definitions of “processor” and “third party” do not include the same threshold criteria as those for a data controller, so there are still compliance implications for businesses that process data on behalf of data controllers, as well as those that receive personal data in a third-party capacity but do not otherwise satisfy the data controller threshold. 
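
As a rough illustration of that threshold analysis, the following Python sketch encodes the controller test summarized above. The field names are hypothetical, and a real applicability assessment would turn on the statute's definitions rather than this simplification.

```python
from dataclasses import dataclass


@dataclass
class Organization:
    global_gross_revenue_usd: float
    online_ad_revenue_share: float   # fraction of global gross revenue from online ad sales
    operates_smart_speaker: bool
    app_store_app_count: int


def is_fdbr_controller(org: Organization) -> bool:
    """Rough FDBR data-controller test: $1B in global gross revenue AND at
    least one of the three activity prongs listed above."""
    if org.global_gross_revenue_usd < 1_000_000_000:
        return False
    return (
        org.online_ad_revenue_share >= 0.50
        or org.operates_smart_speaker
        or org.app_store_app_count >= 250_000
    )


# Example: a large retailer with no ad, smart-speaker, or app-store business is out of scope.
retailer = Organization(5_000_000_000, 0.02, False, 0)
print(is_fdbr_controller(retailer))  # False
```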

Like other data privacy laws, the FDBR provides exemptions to various entities regulated by federal law (e.g., the Health Insurance Portability and Accountability Act, the Gramm-Leach-Bliley Act, and the Fair Credit Reporting Act). It does not apply to individuals acting in a commercial or employment capacity. 

The FDBR provides consumers residing in Florida with the following data privacy rights:

  • Access rights, including a right to confirm whether the controller is processing any data at all. 
  • Correction rights. 
  • Deletion rights concerning data provided by or about the consumer. 
  • Data portability rights. 
  • Opt-out rights related to the sale of personal information, targeted marketing, and profiling. 
  • Opt-out rights related to the collection of sensitive data. 
  • Opt-out rights for the collection of personal data through voice recognition features. 

The FDBR sets forth specific processes for how data controllers must receive, process, and respond to individuals who exercise their privacy rights, including establishing a privacy rights appeals process. 

The FDBR provides that a data controller must obtain a consumer’s consent before they:

  • Use the consumer’s personal data for a purpose that is neither reasonably necessary nor compatible with the purpose for which the personal data is processed, as disclosed to the consumer. 
  • Process sensitive personal data of a consumer. 
  • Enroll the consumer in certain financial incentive programs. 

Like other privacy laws, the FDBR specifically prohibits using “dark patterns.”  Though the FDBR does not define dark patterns, it does state that consent cannot be obtained through acceptance of general or broad terms of use or by hovering over, muting, pausing, or closing a given piece of content.

The FDBR creates obligations for organizations that are not otherwise deemed data controllers. Specifically, all for-profit entities that conduct business in Florida and collect personal data are prohibited from selling a consumer’s sensitive data without first obtaining the consumer’s consent.

In addition to the typical obligations on controllers and processors seen in other states’ laws, the FDBR limits the retention of personal data. Controllers or processors may only retain personal data until the initial purpose for the collection is satisfied, until the contract for which the data was collected or obtained has expired or been terminated, or until two years after the consumer’s last interaction with the regulated business.
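
One way to reason about this retention rule is as a "delete once any trigger fires" check. The Python sketch below is an illustrative simplification with hypothetical parameter names, not guidance on how the FDBR's retention triggers interact in practice.

```python
from datetime import date, timedelta

TWO_YEAR_BACKSTOP = timedelta(days=365 * 2)  # two years after the consumer's last interaction


def must_delete(purpose_satisfied: bool,
                contract_ended: bool,
                last_interaction: date,
                today: date) -> bool:
    """Personal data should no longer be retained once any retention trigger
    summarized above has occurred."""
    two_years_elapsed = (today - last_interaction) >= TWO_YEAR_BACKSTOP
    return purpose_satisfied or contract_ended or two_years_elapsed


# The purpose and contract are still live, but the consumer's last interaction
# was three years ago, so the two-year backstop applies.
print(must_delete(False, False, date(2021, 6, 1), date(2024, 6, 1)))  # True
```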

The FDBR requires a data controller to post a privacy notice, which must be updated annually. In addition to the notices regarding the sale of sensitive or biometric data, a controller that operates a search engine is also required to disclose the parameters used in ranking search results, including any prioritization or de-prioritization of partisan or political ideology in those results. 

Enforcement 

Data controllers must undertake data impact assessments before engaging in certain processing activities. The Florida attorney general is granted the authority to request such assessments at any time. 

The FDBR grants the state Department of Legal Affairs exclusive authority to enforce the FDBR, and a violation of the FDBR is deemed an unfair and deceptive trade practice. The FDBR authorizes civil penalties of up to $50,000 per violation but does not create a private right of action. The FDBR includes a 45-day cure period that the Department of Legal Affairs may provide before initiating an enforcement action. 

Data Breach Notifications 

The FDBR amends the state’s data breach notification law. Florida’s data breach statute previously identified the following categories of data as personal information that, if compromised, could potentially trigger a data breach notification:

  • Government identifiers (e.g., Social Security number, a driver’s license or identification card number, a passport number, military identification number);  
  • Certain financial account numbers and access codes;  
  • Medical data and health insurance policy numbers;  
  • Certain usernames or e-mail addresses in conjunction with their passwords.

The FDBR expanded this list of protected personal data to include an individual’s biometric data and any information regarding an individual’s geolocation when connected to an individual’s name. 

This amendment is especially important for organizations that use cookies, pixels, and tags on their website to identify an individual, such as through their social media account, and track their location, as such data may be subject to data breach notification requirements. The FDBR’s definition of geolocation does not correspond to the definition of “precise geolocation data” used elsewhere in the law and is likely broader in scope.


 New Zealand’s Privacy Regulations  

What’s required under New Zealand’s new privacy legislation, and how can your organization comply? New Zealand’s Privacy Act 2020 came into force on December 1, 2020 to strengthen data protection. The law establishes 13 information privacy principles that govern how organizations can collect, store, use, and share data. It also includes new rules for notifying individuals about data breaches and strengthens enforcement mechanisms. As a result, it’s essential for every company that operates in New Zealand to understand what’s required under the law. 

Who is subject to New Zealand’s Privacy Act 2020? 

The Privacy Act 2020 applies to any organization that collects, stores or handles personal information about New Zealand residents. 

Specifically, the law covers:

  • New Zealand agencies: any organization based in New Zealand.
  • Overseas agencies: any organization not based in New Zealand when carrying on business in the country.
  • Individuals: Any individual who is not a resident of New Zealand who collects or stores personal information while in the country (regardless of where the subject of that information is located). 

But it does grant several exceptions for: 

  • New Zealand government agencies 
  • Ombudsman 
  • News entities carrying on news activities 
  • Overseas governments performing government functions 

A Note on Scope 

The Privacy Act 2020 has an extraterritorial scope, meaning it does not matter where personal information is collected or where the individual is located if the subject of the data is a New Zealand resident. Additionally, the law only allows organizations to transfer personal information to another country if that country’s privacy laws are comparable to New Zealand’s.

How does New Zealand enforce the Privacy Act 2020?

The Office of the Privacy Commissioner is responsible for enforcing the Privacy Act 2020. The Commissioner can investigate any instances of potential non-compliance following a complaint or on its own initiative. Upon investigation, the Commissioner can issue a compliance notice requiring an organization to take action or to stop certain activities. The Commissioner can also provide advice to the New Zealand government and organizations on the application of the Privacy Act 2020. 

What is the Penalty for Non-Compliance? 

Instances of non-compliance with the Privacy Act 2020, including not responding to requests for information from individuals and failing to notify the Commissioner about a serious privacy breach, are criminal offences and carry fines of up to $10,000 NZD. Affected individuals can also issue complaints to the Human Rights Review Tribunal, which can order the offending organization to pay damages. 

What is considered a privacy breach?  

A privacy breach is any unauthorized or accidental access to personal information; any disclosure, alteration, loss, or destruction of personal information; or any action that prevents an organization from accessing information, whether temporarily or permanently. 

What is the standard for serious harm?  

The Commissioner offers an online self-assessment tool to determine whether a privacy breach meets the standard for serious harm (a simple illustrative checklist follows the list below). It considers: 

  • If personal information is involved. 
  • Whether or not the personal information is sensitive. 
  • Who has obtained or may obtain the data. 
  • The harm that may be caused to affected individuals. 
  • Any action already taken to reduce the risk of harm. 
  • If the data is protected by a security measure. 
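
For illustration only, the Python sketch below turns those considerations into a simple checklist that flags which factors point toward serious harm. It is not the Commissioner's methodology; the online self-assessment tool remains the authoritative guide.

```python
def serious_harm_factors(personal_info_involved: bool,
                         info_is_sensitive: bool,
                         recipient_unknown_or_malicious: bool,
                         harm_to_individuals_likely: bool,
                         mitigations_taken: bool,
                         data_was_protected: bool) -> list[str]:
    """Collect the factors from the list above that point toward serious harm."""
    flags = []
    if personal_info_involved:
        flags.append("personal information is involved")
    if info_is_sensitive:
        flags.append("the information is sensitive")
    if recipient_unknown_or_malicious:
        flags.append("the recipient is unknown or may misuse the data")
    if harm_to_individuals_likely:
        flags.append("harm to affected individuals is plausible")
    if not mitigations_taken:
        flags.append("no mitigating action has been taken yet")
    if not data_was_protected:
        flags.append("the data was not protected (e.g., unencrypted)")
    return flags


# Example: an unencrypted export of sensitive data obtained by an unknown party.
print(serious_harm_factors(True, True, True, True, False, False))
```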

How to Respond to a Notifiable Privacy Breach 

Organizations that experience a notifiable privacy breach must notify the Privacy Commissioner and affected individuals as soon as practicable after becoming aware that the breach occurred. 

Notifying the Commissioner 

These notifications should include this information: 

  • Contact details for the organization and person issuing the notification. 
  • Timeline details about the breach, including when it occurred and when it was discovered. 
  • Details about the breach, including how many people were affected, the type of personal information involved, and who might have the information. 
  • Details about the harm that may be caused to affected individuals following the breach. 
  • Steps the organization has taken or intends to take to notify individuals. 
  • Whether or not any other organizations were affected by the breach. 
  • Whether or not the organization has notified any other agencies about the breach. 

Notifying Individuals 

 These notifications must include: 

  • Details about the breach, including when it happened, the personal information involved, and who might have the information (however it cannot identify that party unless it’s necessary to lessen a serious threat to the life or health of individuals). 
  • Steps the organization has taken or intends to take in response to the breach. 
  • Steps that affected individuals can take to mitigate or avoid potential harm. 
  • Confirmation that the organization has notified the Commissioner about the breach. 
  • A note that affected individuals have the right to make a complaint to the Commissioner. 
  • Contact details for a person within the organization who can field inquiries. 

Note: These notifications cannot identify any other affected individuals. To avoid a delay, organizations can share information in increments if it’s not all available immediately. 

Exceptions for issuing a notification 

Organizations are not required to notify affected individuals or give public notice if they believe the notification would: 

  • Prejudice the security or defense of New Zealand, international relations of the New Zealand government, or the maintenance of the law 
  • Endanger the safety of any person or reveal a trade secret 
  • Be contrary to the affected individual’s interests, if that individual is under the age of 16 
  • Be likely to prejudice the individual’s health, in consultation with the individual’s health practitioner (where practicable) 

Organizations can also delay notifying affected individuals or giving public notice if they believe the risks of issuing the notification outweigh the benefits. 

What Types of Incidents Require Notification Under the Privacy Act 2020? 

Any privacy breach that meets the standard of creating a serious risk for the individuals whose data is involved requires notification under the Privacy Act 2020. Common examples include: 

  • Phishing, malware, or Trojans. 
  • Man-in-the-middle attacks: an attacker intercepts a digital conversation by sitting between the two parties involved, which gives them access to the information being shared. 
  • Lost or stolen data: any case in which personal information gets lost or stolen, even if it was an accident. Organizations will need to assess what data was involved and who might have access to the data, among other factors. 
  • Data theft and exfiltration: techniques that allow attackers to gain unauthorized access to data and then move that data to their own servers or devices. This theft can create serious harm depending on the personal information involved. 

How Can Organizations Prepare for Compliance with the Privacy Act 2020? 

Under the Privacy Act 2020, organizations must appoint a privacy officer responsible for: 

  • Monitoring compliance with the law’s 13 information privacy principles 
  • Fielding requests made under the law 
  • Working with the Commissioner on any investigations 
  • Proactively preparing for incident response 

As part of this effort, the privacy officer should prepare for three phases of incident response: 

  • Readiness 
  • Response 
  • Ongoing Management 

Data Privacy Regulations in Qatar 

Qatar was the first Gulf country to pass a national data privacy law, paving the way for other Gulf countries to follow suit. In 2016, Qatar enacted Law No. 13 of 2016 Concerning Personal Data Privacy Protection (the “PDPPL”). The PDPPL establishes a baseline of personal data protection, provides data subject rights, and prescribes guidelines for organizations processing personal data within Qatar.

Furthermore, on January 31, 2021, the Ministry of Transport and Communications (the “MOTC”) released a set of 14 guidelines on the PDPPL for regulated organizations, as well as guidelines for data subjects. The PDPPL applies to all personal data that is electronically processed or subject to processing within the country, except within the Qatar Financial Centre free zone. 

The PDPPL defines certain obligations for data controllers regarding the processing of sensitive personal data, data subject privacy notification, breach notification, data subject rights, and cross-border transfer, to name a few. However, when the law was first enacted in 2016, it did not go into much detail about how organizations must comply. To overcome that shortcoming, the National Cyber Governance and Assurance Affairs (NCGAA) issued several guidelines to help organizations comply with the PDPPL. 

Who Needs to Comply with Qatar’s PDPPL 

Almost every data privacy and protection law defines certain obligations for organizations or entities that are subject to the law, the territorial limitations of the law, and the type of personal data that the law applies to.

The Qatar PDPPL applies to all personal data that is gathered, obtained, or extracted electronically, including data obtained through a combination of traditional and electronic data processing means. 

Exceptions 

However, there are certain exemptions to the type of personal data that is subject to the law. The PDPPL doesn’t apply to personal data that is used as statistical data, such as the personal data used for the census. Furthermore, the PDPPL may also not apply to personal data that is processed in private or family settings. 

Obligations for Organizations Under Qatar’s PDPPL 

General Data Processing Requirements 

Qatar’s PDPPL requires that the controller meet the following requirements when processing personal data or sensitive personal data:

  • The personal data must be processed in a legitimate and honest manner. 
  • The controller should take into account the controls, designs, and other services while processing personal data. 
  • The controller should ensure that the technical, financial, and administrative measures set forth by the regulatory authorities to protect the data are met. 
  • The controller shall not keep any personal data for a period that exceeds the period necessary for the purpose of collection. 
  • The controller must inform the individual of certain prescribed information before processing their data. 

Data Protection Impact Assessment (DPIA)  

The need for performing a data protection impact assessment (DPIA) was vaguely hinted at in the official text of the Qatar PDPPL under Article 11, paragraph 1, and Article 13. For instance, the text states that the controller shall review “privacy protection measures before proceeding with new processing operations.” In light of this text, the PDPPL Guidelines recommend data controllers (but not all controllers) conduct an impact assessment to identify any risks associated with processing personal data or if the processing may result in any harm to the personal data or privacy of any individual. Moreover, organizations can be subjected to a fine of QAR 1,000,000 (USD 275,000) for failing to carry out a DPIA.    

Rights of Individuals 

The PDPPL outlines a set of rights that the legislation provides to individuals whose personal data is subject to processing, such as:

  • Right to Withdraw Consent: An individual has the right to withdraw their prior consent to further processing. 
  • Right to Object to Processing of Personal Data: An individual has the right to object to the processing of their personal data if such processing isn’t necessary or if the data is collected through illegal or unfair means. 
  • Right to Omission or Erasure of Personal Data: An individual has the right to request the erasure or deletion of their data if the processing is not necessary, the data was collected through unfair means, or the purpose of the processing ceases to exist. 
  • Right to Correction: Individuals have the right to request corrections to their personal data through a verified and accurate request. 
  • Right to Access: An individual has the right to request access to the personal data that is collected about them. 

Important Exemptions  

The legislation allows the Competent Authority to process some personal data without abiding by certain provisions of the law if the processing is in the interest of protecting international relations, national security, or economic and financial interests. In such cases, the Competent Authority must create a separate record of the processing of such personal data. Similarly, a data controller is exempted from certain provisions in the following cases:   

  • Performing a task related to  the public interest;  
  • Implementing a legal obligation or an order rendered by a competent court;  
  • Protecting the vital interest of the individual;  
  • Processing personal data for scientific research purposes;  
  • Processing information necessary for an investigation into a criminal offense through an official request of investigative bodies.  

Breach Notification Requirements  

The PDPPL Guidelines introduce a 72-hour deadline for making the notification once a breach is detected. Apart from the deadline, the Guidelines also elaborate on the circumstances that may lead to “serious harm” to an individual’s privacy, such as:   

  • Processing of sensitive data.  
  • Performing automated-decision making.  
  • Collection of personal data via third parties.  
  • Direct marketing.  
  • Processing of employees’ data.  
  • Cross-border transfer.  

Penalties for Non-Compliance  

Financial and criminal penalties for violations and non-compliance are common components of many data protection and privacy laws. The Qatar data protection law, however, imposes only financial penalties – which can be severe – for violations and non-compliance, and no criminal penalties such as imprisonment. The penalties range from QAR 1,000,000 to QAR 5,000,000, depending on the Article that has been violated.  


Data Privacy in South Korea 

South Korea has emerged as a technological powerhouse, boasting one of the most advanced digital infrastructures globally. It’s a nation known for its blazing-fast internet speeds, tech-savvy population, and thriving e-commerce ecosystem. In 2018, South Korea participated in the OECD Digital Government Index (DGI), which evaluates and measures the maturity of e-government policies and their implementation as part of a coherent government-wide approach. This participation allowed the Korean government to review its progress in six dimensions: digital by design, government as a platform, data-driven public sector, open by default, user-centric, and proactive. South Korea has also been able to draw insights and lessons from its partners and the OECD in the areas of digital identity, data-driven public sector, and service design and delivery through the work of the E-Leaders thematic groups. The country ranked first among the 29 OECD countries surveyed in the 2019 OECD Digital Government Index, performing strongly across all six dimensions. 

As South Koreans embrace digital lifestyles, the volume of data generated is staggering. Every click, every purchase, and every interaction with online services leaves a trail of digital footprints. This surge in data creation has necessitated the development of robust data privacy regulations to safeguard the rights and freedoms of individuals. 

While data privacy is a concern worldwide, the approach to addressing it varies from one country to another. Many nations have enacted comprehensive data protection laws, such as the European Union’s General Data Protection Regulation (GDPR), which serve as global benchmarks for data privacy. South Korea, too, has established its legal framework to ensure the protection of personal information. South Korea’s data privacy regulations are aligned with international standards, particularly in addressing the processing of personal data and the rights of data subjects. These regulations are crucial not only for the protection of South Korean citizens but also for facilitating international data transfers and collaborations with countries that have similar data protection standards. 

In the upcoming sections of this blog series, we will delve deeper into the specifics of data privacy in South Korea. We will explore the legal framework, key regulations such as the Personal Information Protection Act (PIPA), the roles of data protection authorities, recent data privacy incidents and breaches, and best practices for businesses operating in South Korea or planning to do so, including how privacy policies are enforced. 

Data Privacy Laws and Regulations in South Korea 

  1. The Legal Framework 

Data privacy and protection laws of South Korea consist of a General Law and Specific Sector Laws. 

General Law – The Personal Information Protection Act (PIPA) of South Korea, in force since 2011, serves as the primary legislation governing the collection, use, and handling of personal information. PIPA is the cornerstone of South Korea’s approach to data privacy and aligns with international standards to ensure robust protection for individuals’ data. The collection and use of personal information (PI) is governed mainly by PIPA, which works together with its Enforcement Decree (PIPA-ED), its primary implementing regulation. It is important to note that there have been several amendments to this law over time. The March 14, 2023 amendment introduced a data subject’s right to data portability and the right to contest automated decision-making; unified the Special Provisions for ICSPs (as defined below) with the general provisions for data handlers; relaxed certain consent requirements for the processing of personal data; diversified the legal bases for transferring personal data overseas; gave the PIPC the power to suspend overseas personal data transfers; and imposed an obligation on data handlers to destroy pseudonymized data. Companies must therefore closely monitor amendments to both PIPA and its Enforcement Decree to ensure compliance with any additional data protection requirements that they may be subject to. 

Specific Sector Laws – In addition to PIPA, South Korea has established other regulations that complement its data privacy framework. These include: 

  • The Act on Promotion of IT Network Use and Information Protection (Network Act): This law addresses issues related to data breach notifications, requiring organizations to promptly notify both authorities and affected individuals in the event of a data breach. 
  • The Act on Credit Information Use and Protection (Credit Information Act): This law focuses on credit information, ensuring that the handling of such sensitive data is subject to stringent regulations. 
  2. Personal Information Protection Commission 

The Personal Information Protection Commission (PIPC) is the central authority responsible for overseeing and enforcing data privacy regulations in South Korea. The PIPC plays a vital role in monitoring compliance, investigating violations, and providing guidance to organizations on data protection best practices; it publishes its guidelines on its website. Non-compliance with PIPA can result in fines, business suspensions, and even criminal penalties in severe cases. For businesses operating in South Korea, compliance with PIPA is not optional; it’s a legal obligation. Companies must invest in data protection measures, including staff training, data security technologies, and compliance checks. Failure to do so exposes businesses not only to financial penalties but also to reputational damage. 

  3. Personal Information (PI) Acquisition/Consent Principle 

According to Article 15(2) of PIPA and Article 22(1) of the Network Act, a data handler must notify the data subject of the following before obtaining consent: 

  • purposes of collection/use of personal information  
  • items of personal information to be collected 
  • duration of retention/use of personal information  

The data handler is also required to: 

  • not use the personal information for any other purpose (Article 18 of PIPA, Article 24 of IC Network Act) 
  • publicly disclose its privacy policy (Article 30 of PIPA, Article 27-2 of IC Network Act) and notify the data subject of the specific usage of personal information at least once a year (Article 30-2 of IC Network Act) 
  • process personal information in such a manner as to minimize the possible infringement upon the privacy of the data subjects (Article 3(6) of PIPA) 

When disclosing PI to third parties, the data handler must notify the data subject of the following before obtaining consent (a minimal notice-record sketch follows this list): 

  • the recipient of the PI 
  • the purpose for which the recipient will use the PI 
  • particulars of the PI to be provided 
  • period of retention and use by the recipient 
  • the data subjects’ right to refuse his/her consent and outline any disadvantages, if any, which may follow from such refusal 
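
As a rough illustration, the Python sketch below models the notice items described above as a single record that could be reviewed before a consent prompt is shown. The field names are illustrative, not statutory terms, and the third-party fields are simply left empty when no disclosure to a third party occurs.

```python
from dataclasses import dataclass


@dataclass
class ConsentNotice:
    """Items to present to the data subject before obtaining consent
    (with extra items when PI will be disclosed to a third party)."""
    purposes_of_use: list          # purposes of collection/use of the PI
    data_items_collected: list     # items of PI to be collected
    retention_period: str          # duration of retention/use
    # Third-party disclosure extras (empty when no disclosure occurs):
    recipient: str = ""
    recipient_purpose: str = ""
    recipient_retention_period: str = ""
    right_to_refuse_and_consequences: str = ""


notice = ConsentNotice(
    purposes_of_use=["account creation", "service delivery"],
    data_items_collected=["name", "email address"],
    retention_period="until account deletion",
)
print(notice)
```
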
  4. Data Privacy Breach Sanctions and Enforcement in South Korea  

Collecting PI without consent attracts a variety of sanctions, including imprisonment with labour, fines, and administrative fines.  

  • Obtaining PI without the data subject’s consent (or another legal basis), or collecting the PI of a child under the age of 14 without the legal representative’s consent, may be subject to an administrative fine of up to KRW 50 million. 
  • Using or disclosing PI to a third party without the data subject’s consent attracts a fine of up to KRW 50 million. 
  • Obtaining consent to process PI by fraud or unjust means is subject to imprisonment with labour for up to 3 years or a fine of up to KRW 30 million. 
  • Failing to provide the data subject with the information prescribed under the consent principle when collecting PI is subject to an administrative fine of up to KRW 30 million. 
  5. Notable Data Privacy Incidents in South Korea 

South Korea consistently ensures the effective enforcement of notice and consent regulations, with the Korea Communications Commission (‘KCC’) responsible for upholding the data privacy provisions outlined in the Network Act and, since the 2020 amendments to PIPA, the PIPC responsible for enforcing PIPA. There have been numerous instances in which the KCC imposed penalty surcharges as a consequence of breaches of consent requirements. 

In 2019, the KCC levied a penalty surcharge against an information and communications service provider (ICSP) for gathering or utilizing personal information without proper consent. The official rationale behind this ruling was that the ICSP had failed to secure separate consent after duly informing individuals of the legally mandated information, including the specific personal data to be collected or used and the reasons for such collection or usage. 

Prior to the 2020 amendments to the PIPA coming into effect, on July 15, 2020, the KCC issued a corrective directive and imposed a penalty surcharge of KRW 180 million upon an international media platform operator. This action was taken in response to the operator’s unauthorized collection of personal information belonging to minors under the age of 14, conducted without the consent of their legal guardians. 

Furthermore, on April 28, 2021, the PIPC enacted penalties and levied a fine against a chatbot developer for breaching the provisions of PIPA. The violations included the developer’s failure to adequately inform users of its other services that their messages would be used to train a popular AI chatbot via machine learning, and its failure to obtain explicit consent from users for this purpose. It is worth highlighting that the PIPC determined that merely incorporating a clause into the terms required for user application login did not meet the criteria for establishing “explicit consent” as mandated by PIPA. 

On July 12, 2023, the Personal Information Protection Commission (PIPC) announced its decision, in which it imposed an administrative fine of KRW 27 million (approx. $20,480) and a penalty of KRW 6.8 billion (approx. $5,192,120) on LG U+ Co., Ltd., for violations of the Personal Information Protection Act (PIPA), following a data breach. 

These instances are significant because they demonstrate a shift from past practice: South Korean privacy regulators are now more proactive in applying stringent sanctions to non-Korean entities subject to the pertinent data protection laws in South Korea. 

South Korea’s approach to data privacy issues and the swift sanctions imposed on violators speak to the nation’s commitment to protecting the rights of its citizens. The nation’s proactive stance on data privacy serves as an example for the world. Amendments to data privacy laws are both necessary and inevitable as the digital space keeps evolving. A good way for businesses to start is to identify tools that design privacy in from the beginning – and that’s where the Epistimis Modeling Tool comes in handy. The Epistimis Modeling Tool (EMT) is a solution for businesses seeking to enhance their data privacy practices while staying compliant, and an innovative tool that helps businesses adapt and thrive in this data-driven era. To mention just a few functions, the Epistimis Modeling Tool: 

  • requires no coding skills or knowledge 
  • designs privacy in from the beginning of the business design 
  • covers all business models irrespective of jurisdiction 
  • provides on-the-go updates of rule amendments.