What Counts as Personal Data Under the GDPR? Real-World Scenarios and Examples


Kevin Henry

Data Privacy

April 10, 2025

8 minute read

Under the GDPR, personal data means any information relating to an identified or identifiable natural person (the data subject). If a piece of information can directly identify someone—or be combined with other elements to single them out—it qualifies as identifiable data and is protected.

This guide walks through real-world scenarios across common data types. Along the way, you’ll see how data processing principles like data minimization, consent requirements, and pseudonymization apply in practice, including when special category data demands extra safeguards.

Identification Details and Contact Information

What’s in scope

  • Names, aliases, screen names, and handles that point to a specific person.
  • Postal addresses (home or work tied to a person), apartment numbers, and P.O. boxes assigned to individuals.
  • Email addresses and phone numbers, including role accounts if they route to one person (for example, “ceo@company.com”).
  • Government and official identifiers such as passport numbers, driver’s licenses, national IDs, and social security numbers.
  • Customer or employee IDs that can be linked back to a person through a lookup table.

Real‑world scenarios

  • CRM records with names, email, job title, and direct dial numbers used by a sales team.
  • Shipping labels and courier tracking that display a recipient’s name and home address.
  • Scanned business cards stored in a contact app for outreach and networking.
  • Visitor badges linking a printed name to an internal access log and CCTV footage.

Practical tips

  • Apply data minimization: collect only the contact fields you truly need for the stated purpose.
  • Replace names with pseudonymous IDs where possible and store the mapping separately with strict access controls.
  • Be cautious with “role-based” details; if they point to a single person, they are still personal data.
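The pseudonymization tip above can be sketched as follows. This is a minimal illustration, not a prescribed scheme: the record fields and the `subj_` token format are assumptions, and in practice the mapping table would live in a separate system with its own access controls and audit logging.

```python
import secrets

def pseudonymize(records):
    """Replace direct identifiers with random tokens.

    Returns (pseudonymized_records, mapping). The mapping should be
    stored separately, under stricter access controls, so people working
    with the records cannot trivially re-identify data subjects.
    """
    mapping = {}  # token -> original identifiers; keep this locked down
    out = []
    for rec in records:
        # Random token, deliberately not derived from the name or email,
        # so the pseudonym cannot be reversed without the mapping table.
        token = "subj_" + secrets.token_hex(8)
        mapping[token] = {"name": rec["name"], "email": rec["email"]}
        out.append({"id": token, "job_title": rec["job_title"]})
    return out, mapping

# Hypothetical CRM rows for illustration only
crm = [{"name": "A. Jensen", "email": "a.jensen@example.com",
        "job_title": "Buyer"}]
pseudo, key_table = pseudonymize(crm)
```

Note that the output is still personal data under the GDPR: whoever holds `key_table` can re-identify every record.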

Location and Online Identifiers

What’s in scope

  • Precise GPS coordinates, geofences, home/work location history, and travel itineraries.
  • IP addresses, device identifiers (IDFA/GAID), cookie IDs, advertising segments, and telemetry identifiers.
  • Wi‑Fi/Bluetooth beacons, license plate numbers, and vehicle identifiers when linkable to a person.

Real‑world scenarios

  • Ride‑hailing trip histories that reveal pickup/drop‑off patterns tied to a rider’s account.
  • Retail foot‑traffic analytics using Wi‑Fi pings to estimate repeat visits by the same device.
  • Server logs with IP addresses linked to login attempts and fraud analysis.
  • Adtech profiles built from cookie IDs and mobile advertising IDs across websites and apps.

Practical tips

  • Prefer coarse location (for example, city or region) when precise coordinates are not necessary.
  • Rotate or reset device identifiers where feasible, and limit retention windows for logs.
  • Hashing IPs or IDs may be useful, but remember that pseudonymization does not remove GDPR scope.
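One common way to implement the hashing tip is a keyed hash with a secret "pepper" held apart from the logs — a sketch under assumptions (the pepper value and truncation length here are illustrative, and the rotation policy is a design choice, not a rule from the text):

```python
import hashlib
import hmac

# Secret pepper stored separately from the logs (illustrative value).
# Rotating it periodically limits how long hashed IPs stay linkable
# across log sets.
PEPPER = b"rotate-me-regularly"

def pseudonymize_ip(ip: str) -> str:
    """Keyed hash of an IP address for log storage.

    This is pseudonymization, not anonymization: anyone holding the
    pepper can re-link a known IP to its hash, so the output remains
    personal data in GDPR terms.
    """
    return hmac.new(PEPPER, ip.encode(), hashlib.sha256).hexdigest()[:16]
```

A plain unsalted hash would be weaker still, since IPv4 space is small enough to enumerate and reverse by brute force.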

Biometric and Health Data

Special category data

Biometric and health data are special category data that require heightened safeguards. Biometric data refers to physical, physiological, or behavioral characteristics processed for the purpose of uniquely identifying someone (for example, a face template or fingerprint hash). Health data includes any information about physical or mental health, diagnoses, treatment, or related inferences.

Real‑world scenarios

  • Face or fingerprint templates used for phone unlock or building access control.
  • Voiceprints for call center authentication and fraud prevention.
  • Patient intake forms, lab results, prescriptions, and insurance claims in healthcare workflows.
  • Wearable metrics such as heart rate, sleep patterns, or menstrual cycle tracking in wellness apps.

Practical tips

  • Check consent requirements and Article 9 conditions before processing special category data; obtain explicit consent if relying on consent.
  • Use privacy by design: store biometric templates rather than raw images; encrypt data in transit and at rest.
  • Conduct Data Protection Impact Assessments (DPIAs) and strictly limit access on a need‑to‑know basis.
  • Note that an ordinary photograph is personal data; it becomes biometric data only when processed for unique identification.

Financial and Employment Information

What’s in scope

  • Bank account numbers, IBANs, card PANs (prefer tokenized storage), and transaction histories.
  • Credit scores, loan applications, and collections records linked to an individual.
  • Payroll details, salary, benefits, tax IDs, and time‑and‑attendance logs.
  • Resumes/CVs, performance reviews, training records, and disciplinary notes.
  • Trade union membership (special category), which requires extra protection.

Real‑world scenarios

  • E‑commerce platforms retaining card tokens and billing addresses for subscription renewals.
  • Payroll providers processing salary, bank details, and tax filings for employees.
  • Applicant tracking systems holding resumes, interview notes, and background checks.
  • Expense management tools storing receipts that reveal merchants, locations, and travel patterns.

Practical tips

  • Use tokenization and strong encryption; restrict who can view full account or card numbers.
  • Define clear retention periods tied to legal and business needs; delete or archive promptly when no longer required.
  • Apply purpose limitation—do not repurpose HR data (for example, wellness details) for unrelated analytics without a valid lawful basis.
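The retention tip above can be made mechanical with a per-category schedule and a scheduled purge job. The categories and periods below are hypothetical placeholders; real values must come from your own legal and business requirements.

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention schedule (category -> maximum age).
RETENTION = {
    "payroll": timedelta(days=6 * 365),   # e.g. a statutory retention period
    "applicant_cv": timedelta(days=180),  # e.g. unsuccessful applicants
}

def is_expired(record, now=None):
    """True if a record has outlived its retention period and should be
    deleted or archived by the purge job."""
    now = now or datetime.now(timezone.utc)
    return now - record["created_at"] > RETENTION[record["category"]]

old_cv = {"category": "applicant_cv",
          "created_at": datetime(2024, 1, 1, tzinfo=timezone.utc)}
```

Running such a check on a schedule, and logging what was deleted and why, also produces the documentation trail that accountability under the GDPR expects.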

Personal Characteristics and Educational Records

What’s in scope

  • Age, date of birth, marital status, nationality, and spoken languages.
  • Photos, audio, or video that identify a person; handwriting samples; distinctive physical attributes.
  • Beliefs, political opinions, religious or philosophical views, racial or ethnic origin, sexual orientation, and trade union membership (special category data when processed).
  • Preferences and profiles (for example, reading history or viewing habits) when linked to an individual.
  • Student records: transcripts, grades, attendance, accommodations, student IDs, and disciplinary actions.

Real‑world scenarios

  • Social media profiles combining photos, bios, and friend graphs into a rich identity.
  • EdTech platforms storing coursework, proctoring recordings, and academic performance trends.
  • Alumni directories listing names, degrees, graduation years, and current employers.

Practical tips

  • Avoid collecting sensitive attributes unless strictly necessary; if processed, implement enhanced safeguards.
  • Consider whether images or text reveal special category data indirectly (for example, religious attire or political context).
  • Provide accessible controls so individuals can correct inaccuracies in educational or profile records.

Combining Data Points for Identification

The mosaic effect

Multiple seemingly harmless data points can form identifiable data when combined. For instance, a partial ZIP code, birth month, and a unique job title may be enough to single out a data subject in a small community.

Pseudonymization vs. anonymization

  • Pseudonymization replaces direct identifiers (for example, name or email) with a token or hash. It reduces risks but remains personal data because re‑identification is possible through a key.
  • Anonymization removes links to a person in a way that is practically irreversible. Truly anonymous data falls outside the GDPR, but achieving this standard requires rigorous techniques and ongoing re‑identification testing.

Derived and inferred data

Profiles, scores, and predictions produced by analytics are personal data if they relate to a person or affect them. Treat inferences with the same care as source attributes, especially when they may reveal special category data.

Risk‑reducing practices

  • Data minimization: collect the smallest set of attributes needed for your purpose and strip quasi‑identifiers when possible.
  • Separate data environments and keys; enforce strict access to re‑identification mechanisms.
  • Use aggregation thresholds and suppression for reports to prevent singling out individuals.
  • Define short, purpose‑bound retention periods and document your lawful basis for each processing activity.
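The aggregation-threshold practice can be sketched as small-cell suppression: groups below a minimum size are withheld from reports so no individual can be singled out. The threshold of 5 is a common illustrative choice, not a figure mandated by the GDPR.

```python
from collections import Counter

K_THRESHOLD = 5  # illustrative minimum group size for publishable counts

def suppressed_counts(rows, key):
    """Aggregate rows by `key`, suppressing any group smaller than the
    threshold so the report cannot single out individuals."""
    counts = Counter(row[key] for row in rows)
    return {group: (n if n >= K_THRESHOLD else "<5")
            for group, n in counts.items()}

# Hypothetical visit log: one large group, one small one
visits = [{"city": "Lyon"}] * 12 + [{"city": "Aarhus"}] * 2
```

Suppression alone is not full anonymization — combining several suppressed reports can still leak information — but it is a practical first line of defense against the mosaic effect described above.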

Key takeaways

What counts as personal data under the GDPR is broader than many expect: if a person can be identified directly or indirectly, the GDPR likely applies. Favor pseudonymization, minimal collection, and strict purpose limitation to reduce risk while respecting individuals’ rights.

FAQs

What types of information are considered personal data under the GDPR?

Any information relating to an identified or identifiable person is personal data. This includes names and contact details; location data and online identifiers (IP, cookies, device IDs); financial and employment records; photos, audio, and profiles; and special category data such as health, biometric identifiers used for unique identification, racial or ethnic origin, religious or philosophical beliefs, political opinions, sexual orientation, and trade union membership.

How does combining data points affect personal data classification?

Combining indirect elements can make a person identifiable even if no single field seems sensitive. This “mosaic effect” means quasi‑identifiers (for example, ZIP code, job role, and timestamps) together may pinpoint a data subject. Such datasets remain personal data unless effectively anonymized; pseudonymization alone does not remove GDPR obligations.

Are biometric identifiers protected under the GDPR?

Yes. When biometric characteristics (like a face template, fingerprint, or voiceprint) are processed for the purpose of uniquely identifying someone, they are special category data. Processing typically requires an Article 9 condition—often explicit consent—plus strong security, limited access, and clear retention controls.

What obligations do organizations have when processing personal data?

Organizations must follow core principles: lawfulness, fairness, and transparency; purpose limitation; data minimization; accuracy; storage limitation; integrity and confidentiality; and accountability. They need a valid lawful basis (for example, contract, legitimate interests with balancing, legal obligation, vital interests, public task, or consent). Honor data subject rights (access, rectification, erasure, restriction, portability, objection), meet consent requirements when relying on consent, secure data with appropriate technical and organizational measures, manage processors, conduct DPIAs for high‑risk processing, practice privacy by design and by default, and notify supervisory authorities of certain breaches within 72 hours.
