Marketing Privacy Glossary |
Term | Acronym | Definition |
A | | |
Access Control List | ACL | A list of objects, and who is allowed to access each object. |
Act on Protection of Personal Information | APPI – Japan | Japanese law applying to businesses that hold personal information about more than 5,000 people. It requires companies to specify the purpose for which personal information is used. Data subjects can request disclosure of information held about them. |
Active Data Collection | | Collection of data provided directly by the subject. |
Ads/Marketing Compliance Manager | | System to manage data regulations related to advertising and marketing activities, such as gathering consent. |
Affective computing | | Branch of artificial intelligence dealing with measurement or simulation of human emotions |
Algorithmic Trust | | Psychological phenomenon in which people perceive algorithms as more trustworthy than humans.
Anonymization | | The process of removing personal identifiers from a data set, so that identity cannot be derived from the remaining data. Anonymization is irreversible. Compare pseudonymization. |
Appropriation | | Using another person’s identity without their approval. Also called identity theft. |
Authentication | | The process of ensuring a person (or other entity) possesses a piece of information they have previously provided. Compare with verification, which ensures a person is who they claim to be. For example: a password proves a user is authorized to access a social media account (authentication) but additional proof is needed to show the account was opened by the person whose name is on it (verification). |
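As an illustration of the authentication half of this distinction, the sketch below checks that a user possesses a previously provided password by comparing salted hashes; the password and helper names are hypothetical, and a real system would add rate limiting and other safeguards.

```python
# Illustrative sketch of authentication: confirming possession of a previously
# provided secret (a password) by comparing salted hashes, never the raw password.
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# At sign-up: store the salt and the hash, never the password itself.
salt = os.urandom(16)
stored_hash = hash_password("correct horse battery staple", salt)

# At login: recompute the hash and compare in constant time.
def authenticate(attempt: str) -> bool:
    return hmac.compare_digest(hash_password(attempt, salt), stored_hash)

print(authenticate("correct horse battery staple"))  # True
print(authenticate("wrong password"))                # False
```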
Automated Policy Inheritance | | Ability to govern data by the rules under which it was originally captured, regardless of where the data is subsequently used.
B | | |
Biometric Information Privacy Act | BIPA – US, Illinois | Illinois law dating to 2008 that restricts collection of biometric data and gives private individuals the right to sue for damages after a violation. |
Bodily Privacy | | Privacy related to a person’s body, such as physical searches or drug tests. |
Breach Disclosure | | Practices related to informing authorities or subjects if their data is exposed. |
Bring Your Own Device | BYOD | Practice of allowing workers to access company systems through their personal devices. |
Bring Your Own Identity | BYOID | Practice of enabling Web site visitors to authenticate themselves by connecting to identities they have established in other systems such as Facebook, LinkedIn, Google, Amazon, etc.
Browser Fingerprinting | | Practice of identifying a device over time by storing and comparing a combination of technical attributes associated with the Web browser used on that device, typically without explicit permission; provides an alternative to other identification methods; may violate privacy rules. |
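As a rough illustration of the technique, the sketch below derives a stable identifier by hashing a combination of browser attributes; the attribute names and values are hypothetical examples, and real fingerprinting scripts collect many more signals.

```python
# Illustrative sketch: deriving a fingerprint by hashing browser attributes.
# The attribute names and values below are hypothetical examples.
import hashlib

def browser_fingerprint(attributes: dict) -> str:
    """Combine browser attributes into a single stable identifier."""
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

fingerprint = browser_fingerprint({
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen_resolution": "1920x1080",
    "timezone": "Europe/Paris",
    "installed_fonts": "Arial,Calibri,Times New Roman",
    "language": "fr-FR",
})
print(fingerprint)  # same attributes -> same identifier on a later visit
```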
C | | |
California Consumer Privacy Act | CCPA – US, California | Privacy law implemented in the State of California in 2020; includes extensive personal rights regarding data use, including opt-out from the sale of personal information and data portability.
California Privacy Rights and Enforcement Act of 2020 (also known as Proposition 24) | CPRA – US, California | California law, appearing as Proposition 24 in the November 2020 election, that expands the privacy rights provided within the California Consumer Privacy Act (CCPA).
Children’s Online Privacy Protection Act | COPPA – US | U.S. federal law governing how Web sites treat data for people under age 13. |
C-I-A Triad | | Information security principles: confidentiality, integrity, availability. |
Commission nationale de l’informatique et des libertés | CNIL – France | The national data protection authority for France.
Consent | | Permission granted by a data owner to use their information for specified purposes; may be implicit or explicit. |
Consent Management Platform | CMP | System that collects consent in compliance with legal requirements. |
Content Data | | The actual text, images, and other information contained within a communication, or information derived from this; contrasts with metadata, which is limited to routing, etc. |
Contextual Advertising | | Advertising based on the content of a Web site or search query where the ad appears; does not require information about the individual receiving the advertisement. |
Cookie | | Small file installed on a Web browser to capture user information and share it with the cookie owner. |
Cookie Consent Manager | | System that collects consent to use Web browser cookies to store information about a user. |
Cookie Directive | | Amendment to the European Union ePrivacy Directive, adopted in 2009, that requires user consent to installation of cookies and other online tracking technologies. The ePrivacy Regulation, intended to replace this directive, is still under negotiation.
Corporate Owned, Personally Enabled | COPE | Business practice of providing employees with company-owned computing devices that are also enabled for personal use.
Cross Border Data Transfers | | Movement of data from one legal jurisdiction to another. May be forbidden or governed by rules to ensure protections granted in the original jurisdiction are maintained. |
Customer Access | | Ability for a consumer to review and manage data collected about them; see Data Subject Access Requests. |
Customer Identity and Access Management | CIAM | Technology that manages, authenticates, and verifies customer identity and profile data |
Cybersquatting | | Creation of a Web domain name similar to another, popular domain, done to divert traffic or force a purchase by the rightful owner.
D | | |
Dark Patterns | | Methods used to trick users into taking unintended actions, including purchases or revealing personal data. |
Data Aggregation | | Analytical method that creates summary measures (sum, average, median, etc.) of similar data items, often to obscure information about single individuals. |
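A minimal sketch of the idea, using hypothetical purchase records: individual amounts are rolled up into per-region summary measures so that no single person's value is released.

```python
# Illustrative sketch: aggregating individual purchase records into summary
# measures so that no single person's value is exposed. Records are hypothetical.
from statistics import mean, median

purchases = [
    {"customer_id": 1, "region": "North", "amount": 120.0},
    {"customer_id": 2, "region": "North", "amount": 75.5},
    {"customer_id": 3, "region": "South", "amount": 42.0},
    {"customer_id": 4, "region": "South", "amount": 300.0},
]

by_region = {}
for row in purchases:
    by_region.setdefault(row["region"], []).append(row["amount"])

for region, amounts in by_region.items():
    print(region, "count:", len(amounts), "sum:", sum(amounts),
          "mean:", round(mean(amounts), 2), "median:", median(amounts))
```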
Data Anonymization | | See Anonymization. |
Data Breach | | Any unauthorized access to data collected by an organization. |
Data Breach Notification (EU) | | The process of informing authorities and data subjects whose data has been exposed by a breach. Many privacy regulations impose specific requirements for when, how, and how quickly notifications must be made.
Data Classification | | Process of determining the type of data stored in a particular object or field, so the data can be handled as required for that data type. |
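A simplified sketch of pattern-based classification, using illustrative regular expressions; production classifiers typically combine patterns with dictionaries, column-name hints, and machine learning.

```python
# Illustrative sketch: classifying field values by pattern so they can be
# handled according to their data type. Patterns are simplified examples.
import re

PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "phone": re.compile(r"^\+?\d[\d\s().-]{7,}$"),
}

def classify(value: str) -> str:
    for label, pattern in PATTERNS.items():
        if pattern.match(value):
            return label
    return "unclassified"

for sample in ["jane@example.com", "123-45-6789", "+1 (555) 010-0100", "hello"]:
    print(sample, "->", classify(sample))
```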
Data Controller | | Under GDPR, an organization that determines the purposes and means by which personal data is processed. Compare Data Processor. |
Data De-Identification | | The process of removing personal identifiers from a data set, so that identity cannot be derived from the remaining data. May be reversible (pseudonymization) or irreversible (anonymization). |
Data Lifecycle Management | DLM | Systematic approach to managing data from acquisition through use to disposal. |
Data Mapping | | Process of tracking where each type of data is stored in company systems, to facilitate data management processes. |
Data Masking | | Process of hiding actual values of data elements without changing the format. |
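A minimal sketch, assuming the goal is to hide most characters while keeping separators and the final characters so the masked value still fits the original layout.

```python
# Illustrative sketch: masking a value while preserving its format, so that
# downstream systems expecting the original layout still work. Values are hypothetical.
def mask(value: str, visible_suffix: int = 4, mask_char: str = "*") -> str:
    """Replace all characters except the last few, keeping separators intact."""
    keep_from = len(value) - visible_suffix
    return "".join(
        ch if (not ch.isalnum() or i >= keep_from) else mask_char
        for i, ch in enumerate(value)
    )

print(mask("4111-1111-1111-1234"))   # ****-****-****-1234
print(mask("jane.doe@example.com"))  # ****.***@*******.com
```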
Data Minimization | | The practice of collecting and using the minimum amount of personal data needed for a particular purpose. |
Data Pipeline | | Technology to automate the process of ingesting, preparing, and exposing data for analytics and operations. |
Data Portability | | The ability to move personal data from one system to another, despite format or structural differences. Goal is to avoid lock-in by the original system. |
Data Privacy | | Ideas and practices relating to control of personal data, especially by the subject. |
Data Processor | | Under GDPR, an organization that processes personal data on behalf of a Data Controller. |
Data Protection Authority | DPA | National body responsible for enforcing data protection regulations under the 1995 Data Protection Directive of the European Union. Now called Supervisory Authorities under GDPR.
Data Protection by Default | | Requirement under GDPR to collect the minimum required amount of personal data and to use it for only the specified purposes. |
Data Protection by Design | | Requirement under GDPR to design systems to implement data protection principles and safeguards. |
Data Protection Impact Assessment | DPIA | Analysis that assesses the impact on fundamental rights created by a proposed data collection process or project and identifies steps to control the risks. Required under GDPR in advance of processing that is likely to create a high risk to individuals.
Data Redaction | | Technique to preserve privacy by removing a portion of data, such as names from a document or digits from an ID number. |
Data Removal | | Practices related to deleting data from company systems, either on request or on retention schedule. May require removing or masking data in connected systems, back-ups, etc. Includes keeping records to prove requested removals have taken place. |
Data Retention | | Practices related to storing and processing data for specified time periods and deleting it after the period ends, as defined in contracts.
Data Schema | | Structure used to organize stored data. |
Data Subject Access Rights, or Data Subject Access Request | DSAR | Processes related to accepting and executing requests by a data subject to an organization to review, change, and delete data about the subject held by the organization. |
Data Subject Rights Management | | Processes related to giving subjects control over data an organization has collected about them. |
Data Watermarks | | Technique for tagging data with its origin in ways that cannot be removed and are hidden from unauthorized users. |
DataOps | | Methodology to improve data quality and currency in support of analytics and data processing. |
Deletion Validation | | Technique for confirming that data has been erased. |
Denial of Service | DoS | Cyber attack that disrupts a system by creating an unmanageable volume of interactions |
DIFC Data Protection Law No.5 of 2020 | DIFC – UAE | UAE privacy law effective October 2020. Designed to match EU privacy standards. |
Differential Privacy | | Technique for sharing information derived from a data set while preserving privacy, typically by adding statistical noise to results so they cannot be used to detect whether any specific individual's records are included.
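A minimal sketch of the Laplace mechanism, one common way to achieve differential privacy: noise scaled to the query's sensitivity and a privacy budget (epsilon) is added to an aggregate result. The records and epsilon value are hypothetical.

```python
# Illustrative sketch of a Laplace mechanism: random noise proportional to the
# query's sensitivity divided by the privacy budget (epsilon) is added to an
# aggregate result before release. Records are hypothetical.
import random

def laplace_noise(scale: float) -> float:
    # The difference of two independent exponential draws is Laplace-distributed.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def private_count(records, predicate, epsilon: float = 0.5) -> float:
    true_count = sum(1 for r in records if predicate(r))
    sensitivity = 1.0  # adding or removing one person changes a count by at most 1
    return true_count + laplace_noise(sensitivity / epsilon)

people = [{"age": a} for a in (23, 31, 37, 44, 52, 61, 68)]
print(private_count(people, lambda p: p["age"] >= 40))  # noisy count; true value is 4
```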
Digital Fingerprinting | | Practice of identifying a device over time by storing and comparing a combination of technical attributes associated with the device, typically without explicit permission. See browser fingerprinting. |
Digital Rights Management | DRM | Techniques to track ownership and use of digital assets. Typically applied to content such as writing, music, or video but may apply to any type of data. |
Disassociability | | Technique of removing information from a data set that can be used to identify an individual while still allowing a system to meet its purpose. |
DNS spoofing | | Attack method that corrupts the Domain Name System by altering entries that direct traffic to the proper IP address, sending traffic somewhere else |
Do Not Track | DNT | Request by a data subject that a system not capture or share information about the subject’s behavior. Usually applied to tracking Web site behavior for marketing purposes. |
Draft Decree on Personal Data Protection | DPDP – Vietnam | Vietnam proposed law governing collection and use of personal data |
E | | |
Encryption | | Technique of transforming data so it cannot be understood but can be transformed back into its original form with an algorithm and/or key. In this sense, encryption is reversible. Compare to masking or anonymization, which are not reversible. |
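A short sketch of the contrast, assuming the third-party Python cryptography package is installed: the encrypted value can be recovered with the key, while the hash digest cannot be turned back into the original.

```python
# Illustrative sketch contrasting reversible encryption with one-way hashing.
# Assumes the third-party "cryptography" package is installed; the sample
# value is hypothetical.
import hashlib
from cryptography.fernet import Fernet

email = b"jane.doe@example.com"

# Encryption: reversible with the key.
key = Fernet.generate_key()
cipher = Fernet(key)
token = cipher.encrypt(email)
assert cipher.decrypt(token) == email  # original value recovered

# Hashing: not reversible; the original cannot be recovered from the digest.
digest = hashlib.sha256(email).hexdigest()
print(token, digest)
```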
European Union Agency for Fundamental Rights | FRA – EU | The independent center of reference and excellence for promoting and protecting human rights in the EU. |
F | | |
Family Education Rights and Privacy Act | FERPA – US | U.S. federal law governing access to student data held by educational institutions. |
Federal Information Security Management Act | FISMA – US | U.S. law establishing information security framework for federal agencies |
Federated Learning of Cohorts | FLoC | Method for grouping consumers based on browser behaviors without revealing personal identities. Used for privacy-safe ad targeting.
First-Party Data | | Data about a person collected by a company as part of a direct relationship, such as during a visit to the company’s Web site or when making a purchase from the company. Compare second- and third-party data |
Fuzzing | | Software testing or attack method that submits large amounts of random data to a system. |
G | | |
General Data Protection Regulation | GDPR – EU | European Union regulation governing treatment of data collected about EU residents, adopted in 2016 and taking effect in May 2018. Defines rights of individuals and imposes requirements on organizations collecting personal data. |
Google Privacy Sandbox | | Technology being developed by Google to enable ad targeting without use of cookies. |
Gramm-Leach-Bliley Act | GLBA – US | U.S. law establishing data sharing notification and security rules for financial institutions. |
Granular consent options | | Practice of defining consent options related to specific uses or purposes, data types, or data users.
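An illustrative sketch of what a granular consent record might look like, with consent tracked per purpose rather than as a single flag; the field and purpose names are hypothetical.

```python
# Illustrative sketch of a granular consent record: consent is tracked per
# purpose rather than as a single yes/no flag. Field names are hypothetical.
from datetime import datetime, timezone

consent_record = {
    "subject_id": "cust-1001",
    "collected_at": datetime.now(timezone.utc).isoformat(),
    "purposes": {
        "email_marketing": True,
        "personalized_ads": False,
        "analytics": True,
        "third_party_sharing": False,
    },
}

def has_consent(record: dict, purpose: str) -> bool:
    """Treat missing purposes as no consent."""
    return record["purposes"].get(purpose, False)

print(has_consent(consent_record, "personalized_ads"))  # False
print(has_consent(consent_record, "analytics"))         # True
```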
H | | |
Health Information Technology for Economic and Clinical Health Act | HITECH – US | U.S. law establishing breach reporting and notification rules for health information |
Health Insurance Portability and Accountability Act | HIPAA – US | United States law that regulates health insurance and includes data privacy, security, and confidentiality provisions. Healthcare providers and others covered by HIPAA have permission to use patient data if it relates to treatment, payment, or other healthcare operations. Any use of patient protected health information (PHI) for marketing or sales requires specific consent.
I | | |
Identifiability | | Degree to which data can be connected to a specific individual, in terms of precision (determining which individual), confidence (connecting to the right individual), and security (avoiding misrepresentation). |
Identifier for Advertisers | IDFA | ID for Apple devices made available for advertising tracking. Apple rules added in 2020 require user consent for sharing with advertisers, limiting coverage. |
Identity Verification | | The process of ensuring that a person is who they claim to be. Compare with authentication, which confirms that a person possesses a piece of information they have previously provided. For example: a password proves a user is authorized to access a social media account (authentication) but additional proof is needed to show the account was opened by the person whose name is on it (verification). |
Incognito mode | | Browser feature that enables users to browse without giving sites access to stored identifiers or tracking mechanisms; also refers to similar features in other systems.
Information Commissioner’s Office | ICO – UK | Independent body that upholds information rights within the UK.
Information Governance | | Process of managing data from acquisition through use to disposal, including privacy compliance.
Information Lifecycle | | Systematic approach to managing data from acquisition through use to disposal. |
J | | |
K | | |
L | | |
Lawfulness of Processing | | The GDPR principle that any processing of personal data must have a legal justification. There are six justifications: consent, contract, compliance with a legal obligation, public interest, vital interest, and legitimate interest.
Legal Basis for Processing | | see Lawfulness of Processing |
Lei Geral de Proteção de Dados | LGPD – Brazil | Brazilian privacy law that took effect in 2020 and includes extensive personal rights regarding data use. It applies to almost all sectors of the economy, public and private; has extraterritorial scope; and defines personal data so broadly that virtually any data can be considered personal and subject to the law.
Ley Federal de Protección de Datos Personales en Posesión de los Particulares | FDPL – Mexico | Mexican law that went into effect in 2012 and closely follows the APEC Privacy Framework. It protects any data that could lead to identifying a person, and data controllers may only collect data relevant to their commercial purposes. Personal data must be deleted when the controller no longer needs or uses it.
M | | |
Mandatory Access Control | MAC | Data access control built into an operating system. |
Metadata (also see DRM) | | Data that describes the data elements stored in a system. |
Mobile Device Forensic Tools | MDFT | Technology to recover digital evidence from mobile devices, including mobile phones and other devices with communication abilities. |
Mobile Device Management | MDM | Technology that allows remote control over mobile devices, typically by a corporate owner providing the device to its workers. |
Multi-Factor Identification | | Authentication process requiring two or more pieces of information, such as a password plus fingerprint. |
N | | |
Noise Addition | | Injection of false information into a data set to make subject identification more difficult. |
Notifiable Data Breaches Act | NDB – Australia | Australia law establishing breach reporting and notification rules |
O | | |
Onward Transfer | | Transfer of data made by someone who did not receive it directly from the original data collector (controller). Example: a subcontractor for a data processor. |
P | | |
Payment Card Industry Data Security Standard | PCI DSS | Global security standard for data related to credit and debit card processing.
Perimeter Controls | | Technology that protects entry into a network from the outside. |
Persistent Storage | | Storage of data in a medium that retains it indefinitely, such as tape or a hard drive. |
Personal Data | | Term defined in GDPR article 4 (1) as “‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;”. Personal data can be used to identify a specific individual, either by itself or in combination with other information. |
Personal Data Protection Act | PDPA – Singapore | Singapore law governing collection and use of personal data. |
Personal Data Protection Bill | PDP – India | India proposed law governing collection and use of personal data. |
Personal Data Protection Draft | PDP – Indonesia | Indonesia proposed law governing collection and use of personal data. |
Personal Information | | Term defined in CCPA Section 1798.140(o)(1) as “information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household”. Personal information can be associated with a specific individual but may not identify them. |
Personal Information Protection and Electronic Documents Act | PIPEDA – Canada | Canada law that applies to private sector organizations across Canada that collect, use, or disclose personal information. Individuals have the right to access data held by organizations and have a right to challenge accuracy. PII can only be used for purposes for which collected and must be protected. |
Personally Identifiable Information | PII | Term used primarily in the US to describe data that uniquely identifies a specific individual, such as a Social Security Number or email address. Sometimes also includes data that can identify a specific individual when used in combination with other data, such as gender, Zip code, and date of birth. |
Policy Definition | | Technology to define rules that govern use of personal data, often based on the data type, subject location, consent status, and other conditions. |
Policy Enforcement | | Technology to enforce data privacy policies that are defined within a system. |
Principal Agent Problem | | Problem that an agent may act in her own best interests, to the detriment of the principal she is representing |
Privacy Act | US | U.S. law governing collection of data about individuals by federal agencies |
Privacy Assessment | | Review of processes and technologies that an organization applies to privacy compliance. Compare with Privacy Impact Assessment, which applies only to a specific project. |
Privacy by Design | PbD | Systems engineering approach that includes data privacy as a fundamental consideration. |
Privacy Impact Assessment | PIA | Process or project analysis that defines what personal information is involved, how it is handled to comply with privacy regulations, and how risks are mitigated. |
Privacy Impact Assessment Triggers | | Events that require a privacy impact assessment to be conducted, such as merging data sets containing personal data. |
Privacy-Enhancing Technologies | PET | Technology that protects or preserves personal privacy, often by enabling subjects to expose the minimum amount of personal data needed to achieve a task.
Private Facts | | Personal information that is not publicly known, is not a legitimate public concern, and the subject prefers to remain private. |
Profiling | | Techniques to classify or predict individual behavior based on personal data. |
Programmatic Buying | | Any form of automated advertising media buying, especially forms based on evaluating personal data of ad recipients. |
Programmatic Digital Out-of-Home | pDOOH | Out-of-home digital advertising, such as electronic billboards and in-store signs, that is sold via automated bidding.
Proportionality | | Concept of balancing the amount of personal data collected for a process against the value and risks of that process. |
Protection of Personal Information Act | POPIA – South Africa | South African privacy law that protects personal information processed in South Africa and applies to any organization processing such information there.
Provisioning | | Assignment of resources to a person or system, typically when setting up a new account. |
Pseudonymization | | The process of replacing personal identifiers in a data set with artificial values, so that identity can subsequently be derived only with additional data. Pseudonymization is reversible. Compare anonymization.
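A minimal sketch of reversible pseudonymization: direct identifiers are replaced with opaque pseudonyms, and a separately stored mapping table allows authorized re-identification. Records and field names are hypothetical.

```python
# Illustrative sketch: replacing direct identifiers with pseudonyms while keeping
# a separately stored mapping table that allows re-identification by authorized
# users. Records and field names are hypothetical.
import uuid

mapping = {}  # pseudonym -> original identifier; store separately and securely

def pseudonymize(record: dict) -> dict:
    pseudonym = uuid.uuid4().hex
    mapping[pseudonym] = record["email"]
    out = dict(record)
    out["email"] = pseudonym
    return out

def reidentify(pseudonym: str) -> str:
    return mapping[pseudonym]  # only possible with access to the mapping

safe = pseudonymize({"email": "jane.doe@example.com", "plan": "premium"})
print(safe)                       # email replaced by an opaque pseudonym
print(reidentify(safe["email"]))  # reversible, given the mapping table
```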
Public Safety Data Sets | | Data sets containing information related to public safety, such as crime statistics, traffic accident locations, flood plains, vehicle recalls, and epidemiological records. |
Purpose Limitation (Principle of Finality) | | Principle that the purpose for which data will be used should be specified when it is collected and subsequent use must be compatible with those purposes or justified by other legal bases. |
Q | | |
R | | |
Record of Processing Activities | ROPA | Requirement under GDPR for data controllers to keep records of personal data processing activities and to put protections in place.
Rectification | | The right for a subject to require corrections to inaccurate data an organization holds about the subject. |
Right of Access | | The right for a subject to view data that an organization holds about the subject. |
Right to Be Forgotten | RTBF | See Right to Erasure |
Right to Erasure | | The right for a subject to require an organization delete data it holds about the subject. |
Right to Restriction | | The right for a subject to restrict, under specified conditions, how an organization uses data it holds about that subject. |
Risk Analysis & Alert | | Analysis of the risks posed by personal data held by an organization or used in a particular project or process. |
Role-Based Access Controls | | Data access controls based on assigning specific rights to specific user roles, with the intent of granting access to data only when it is needed for a specific purpose.
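An illustrative sketch of the pattern: permissions attach to roles, users acquire permissions only through their assigned roles, and access checks consult both mappings. Role, user, and permission names are hypothetical.

```python
# Illustrative sketch of role-based access control: permissions are attached to
# roles, and users acquire permissions only through their assigned roles.
# Role, user, and permission names are hypothetical.
ROLE_PERMISSIONS = {
    "marketing_analyst": {"read_aggregates"},
    "support_agent": {"read_profile", "update_contact_info"},
    "privacy_officer": {"read_profile", "export_data", "delete_data"},
}

USER_ROLES = {
    "alice": {"marketing_analyst"},
    "bob": {"support_agent", "privacy_officer"},
}

def is_allowed(user: str, permission: str) -> bool:
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )

print(is_allowed("alice", "delete_data"))  # False
print(is_allowed("bob", "delete_data"))    # True
```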
S | | |
Sarbanes-Oxley Act | SOX – US | U.S. law establishing governance and auditing requirements for public companies.
Secondary Use | | Using personal information for purposes not specified when it was collected. |
Second-Party Data | | Data about a person collected by a company as part of a direct relationship and then shared with another company. Compare first- and third-party data. |
Sensitivity Label | | Label assigned by data subjects to indicate how important they feel it is to keep the data private. |
Service Level Agreement | SLA | Set of performance measures that a vendor agrees to meet. |
Single Factor Identification | | Authentication process requiring a single piece of information, such as a password. |
Social Engineering | | Security attack method that relies on tricking authorized users into unintentionally enabling a breach, such as revealing a password or installing malicious software. |
Spoofing | | See DNS spoofing.
Standard Contractual Clauses | SCC | Standard contract terms that specify how personal data will be used to ensure compliance with GDPR rules when the data is transferred outside of the European Economic Area (EEA) to a location where data protection has not been assured through an adequacy decision.
Statistical Noise | | Random variations in data values. May be introduced purposely as part of a differential privacy process to obscure actual values that can be used to reidentify individuals whose personal data is included in a data set. |
Subject Erasure Handling | | Process of removing subject data from an organization’s systems in response to a subject access request. |
Sugging | | Selling under the guise of research.
Supervisory Authority | | National body responsible for enforcing data protection regulations under GDPR. Formerly known as Data Protection Authority. |
Surveillance-as-a-Service | | Business model that is based on customers paying to be surveilled. |
T | | |
Telematics | | Technology to collect and distribute data generated by vehicles or related devices. |
Territorial Privacy | | Privacy related to a person’s location, such as one’s home or vehicle. |
Third Party Consent | | Consent for a property search granted by someone with legal access to the property who is not the subject of the search, such as a co-tenant. |
Third Party Data Sharing | | Data shared with someone who lacks a direct relationship to the subject. |
Third-Party Data | | Data about a person purchased from a company that lacks a direct relationship to that person, such as a data compiler. The original source may have been a company with a direct relationship or a collection technique that works without a direct relationship. Compare first- and second-party data. |
Time-Stamping | | Process of recording when an activity took place, used in audits and verification. |
Tokenization | | Process of replacing a specific data element with another element, often generic (e.g. replacing person’s name with ‘Name’). |
Transient Storage | | Data storage that lasts only a brief period, such as during an interaction. |
Transparency and Consent Framework | TCF | Technical standards, policies, and provider registries developed by IAB Europe to help publishers comply with GDPR consent regulations.
U | | |
UK GDPR | | The post-Brexit transposition of the GDPR into UK law. |
Unambiguous Consent | | Consent that meets GDPR standard for being a clear, voluntary indication of user intent. |
User-based Access Controls | | Access based on rights granted to a specific user. |
V | | |
W | | |
Workflow Management | | Technology to follow a structured process to execute a task. |
X | | |
Y | | |
Z | | |
Zero Knowledge Proofs | ZKP | Cryptographic method for proving that a statement about data is true without revealing the data itself.
Zero-Day Vulnerability | 0-day | Security flaw that can be exploited before the software developer becomes aware of it or releases a patch to fix it.
Zero-Party Data | | Data that a person intentionally provides to a company within a direct relationship, such as a survey response. Compare first-party data. |