Claude vs Kimi for Privacy
Claude offers stronger privacy assurances, with Anthropic's transparent, safety-first approach and documented controls over whether user data is used for training, while Kimi lacks comparable public privacy documentation and oversight mechanisms. For privacy-conscious users, Claude is the clearer choice, though neither model includes the explicit source citations or opt-out controls that some privacy advocates demand.
Head-to-Head for Privacy
| Criteria | Claude | Kimi | Winner |
|---|---|---|---|
| Code Transparency | Fully proprietary code | Partially open-source for inspection | Kimi |
| Privacy Documentation | Comprehensive English policy details | Limited English privacy documentation | Claude |
| Safety Philosophy | Safety is stated core value | Reasoning-focused, less safety emphasis | Claude |
| Regional Data Jurisdiction | US company, Western regulations | China-based, different legal framework | Tie |
See Them Answer a Privacy Question
We asked both models the same privacy-focused prompt.
“Draft a privacy policy for a health tracking app that complies with both GDPR and CCPA, and explain how user biometric data should be stored, shared, and deleted upon request.”
Privacy Policy: HealthTrack Application
Comprehensive Privacy Policy Compliant with GDPR and CCPA
Effective Date: [DATE] Last Updated: [DATE] Version: 1.0
Plain Language Commitment: We will explain each section in plain language alongside the formal legal language. Look for the "What This Means" callouts throughout this document.
TABLE OF CONTENTS
- Introduction and Scope
- Legal Basis for Data Processing
- What Data We Collect
- Biometric Data: Special Provisions
- How We Use Your Data
- Data Storage and Security
- Data Sharing and Disclosure
- Your Rights Under GDPR
- Your Rights Under CCPA
- Data Retention and Deletion
- Biometric Data: Storage, Sharing, and Deletion Protocol
- Children's Privacy
- Cross-Border Data Transfers
- Cookies and Tracking Technologies
- Changes to This Policy
- Contact Information and Complaints
SECTION 1: INTRODUCTION AND SCOPE
1.1 Who We Are
[Company Name] ("Company," "we," "us," or "our") operates the HealthTrack application ("App"), a health and wellness tracking platform. We are:
- GDPR: The Data Controller responsible for your personal data
- CCPA: A Business that collects consumers' personal information
Registered Address: [Address]
Data Protection Officer (DPO): [Name] — privacy@healthtrack.com
EU Representative: [Name/Company] — [Address]
1.2 Scope of This Policy
This policy applies to:
| Scope | Coverage |
|---|---|
| Geographic | EEA residents (GDPR), California residents (CCPA), all other users |
| Platforms | iOS app, Android app, web dashboard |
| Services | Health tracking, analytics, coaching features |
| Data Types | All personal and biometric data collected |
What This Means: This policy covers everyone who uses HealthTrack. If you live in Europe or California, you have additional specific rights we explain in detail.
SECTION 2: LEGAL BASIS FOR DATA PROCESSING
2.1 GDPR Legal Bases
We process your data only when we have a valid legal basis. For each category:
| Data Category | Legal Basis | Notes |
|---|---|---|
| Account creation | Contract (Art. 6(1)(b)) | Necessary to provide the service |
| Biometric/health data | Explicit Consent (Art. 9(2)(a)) | Sensitive category — requires opt-in |
| Safety features | Vital Interests (Art. 6(1)(d)) | Emergency situations only |
| Legal compliance | Legal Obligation (Art. 6(1)(c)) | Required by law |
| App improvements | Legitimate Interests (Art. 6(1)(f)) | After balancing test conducted |
| Marketing | Consent (Art. 6(1)(a)) | Separate opt-in required |
2.2 Explicit Consent for Biometric Data
Because biometric and health data constitutes Special Category Data under GDPR Article 9, we:
- Obtain separate, specific, informed, and unambiguous consent before collection
- Present consent requests in plain language, not buried in terms
- Allow consent to be withdrawn at any time without detriment
- Maintain records of all consent with timestamps and version tracking
- Never make service access conditional on biometric data consent
2.3 CCPA Framework
Under CCPA/CPRA, we acknowledge:
- Biometric data is a Sensitive Personal Information category
- We do not sell biometric or health data (and honor opt-outs for all other categories)
- We do not share biometric data for cross-context behavioral advertising
- Users may limit the use of sensitive personal information
What This Means: We will always ask for your specific permission before collecting health or body-related data. You can say no, and we cannot punish you for saying no.
SECTION 3: WHAT DATA WE COLLECT
3.1 Data Collection Categories
A. Account and Identity Data
• Name and username
• Email address
• Password (hashed — we never store plaintext passwords)
• Date of birth
• Gender (optional)
• Profile photo (optional)
• Account creation date and IP address
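The note above that passwords are hashed, never stored in plaintext, can be sketched with Python's standard-library `hashlib.scrypt`. The cost parameters here are illustrative, not a production tuning recommendation:

```python
import hashlib
import hmac
import secrets

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a one-way hash; the plaintext password is never stored."""
    salt = secrets.token_bytes(16)
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=2**14, r=8, p=1, maxmem=2**26, dklen=32)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt,
                               n=2**14, r=8, p=1, maxmem=2**26, dklen=32)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

A per-user random salt ensures identical passwords produce different digests, and `hmac.compare_digest` avoids timing side channels during verification.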
B. Health and Fitness Data (Requires Separate Consent)
• Heart rate and heart rate variability (HRV)
• Blood oxygen levels (SpO2)
• Sleep patterns and sleep stages
• Menstrual cycle data
• Caloric intake and nutritional information
• Exercise type, duration, and intensity
• Body weight, BMI, and body measurements
• Blood glucose readings (if manually entered)
• Medication tracking (if used)
• Mental wellness scores and mood logs
• Step count and activity levels
C. Biometric Data (Requires Explicit Separate Consent) ⚠️
• Fingerprint data (for app authentication only)
• Face ID / facial recognition data (for app authentication only)
• Continuous heart rate biometrics from wearables
• Unique physiological identifiers derived from health patterns
D. Device and Technical Data
• Device type, model, and operating system
• App version
• IP address (anonymized after 30 days)
• Crash reports and error logs
• Time zone and language settings
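The policy says IP addresses are anonymized after 30 days but does not specify a method; one common approach (assumed here) is truncation — zeroing the host bits so the address no longer identifies a device:

```python
import ipaddress

def anonymize_ip(ip: str) -> str:
    """Truncate an IP address: keep a /24 prefix for IPv4 and a /48
    prefix for IPv6 (a widely used anonymization scheme)."""
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(network.network_address)

print(anonymize_ip("203.0.113.42"))             # 203.0.113.0
print(anonymize_ip("2001:db8::8a2e:370:7334"))  # 2001:db8::
```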
E. Usage Data
• Features accessed and frequency
• Session duration
• Navigation patterns within the app
• Goals set and progress
F. Data We Do NOT Collect
✗ Social Security Numbers
✗ Financial account numbers or credit card data
✗ Insurance information
✗ Precise location beyond what you explicitly share
✗ Contacts or communications from your device
✗ Any data from minors below the applicable age threshold (see Section 12)
3.2 How Data Is Collected
| Method | Description |
|---|---|
| Direct input | You manually enter data |
| Wearable sync | Connected devices (Apple Watch, Fitbit, Garmin, etc.) |
| HealthKit/Google Fit | Platform health APIs (with your permission) |
| Automatic sensors | Phone sensors with your permission |
| Third-party sources | Only with your explicit authorization |
What This Means: We collect health information you share with us or that comes from your fitness devices. We collect fingerprint or face data ONLY to let you log in securely — not for any other purpose.
SECTION 4: BIOMETRIC DATA — SPECIAL PROVISIONS
4.1 Definition of Biometric Data
For purposes of this policy, biometric data means:
Unique physical or behavioral characteristics used to identify or authenticate individuals, including but not limited to: fingerprint templates, facial geometry data, heart rate patterns that create unique physiological signatures, and any derived identifiers.
This definition is intentionally broad and consistent with:
- GDPR Article 4(14)
- Illinois BIPA (Biometric Information Privacy Act)
- CCPA/CPRA sensitive personal information category
- Washington My Health My Data Act
4.2 Biometric Data Consent Requirements
Before collecting any biometric data, we will:
Step 1 — Disclosure: Present a clear notice stating:
- The specific biometric data to be collected
- The specific purpose(s) for collection
- How long data will be retained
- Whether data will be shared and with whom
Step 2 — Written Consent: Obtain your affirmative, written (digital) consent separately from our general terms.
Step 3 — Consent Record: Store your consent with:
- Date and time of consent
- Version of policy at time of consent
- Specific purposes consented to
- Method of consent given
Step 4 — Ongoing Consent Management: Provide easy access to withdraw consent at any time through Settings → Privacy → Biometric Data Consent.
4.3 Specific Biometric Use Limitations
| Biometric Type | Permitted Use | Prohibited Use |
|---|---|---|
| Fingerprint data | App authentication only | Marketing, profiling, sharing |
| Face ID data | App authentication only | Identity verification for third parties |
| Heart rate patterns | Health insights for YOU | Advertising targeting, employer sharing |
| Sleep biometrics | Personal health dashboard | Insurance decisions, sales |
| Physiological signatures | Trend analysis for you | Law enforcement (absent court order) |
SECTION 5: HOW WE USE YOUR DATA
5.1 Primary Purposes
| Purpose | Data Used | Legal Basis |
|---|---|---|
| Providing health tracking service | All health data | Contract + Consent |
| Authentication and security | Biometric (authentication only) | Legitimate interest + Consent |
| Personalized health insights | Health and activity data | Consent |
| App functionality and bug fixing | Usage and device data | Legitimate interest |
| Customer support | Account data, relevant health context | Contract |
5.2 Secondary Purposes (Opt-In Only)
| Purpose | Opt-In Required | Data Used |
|---|---|---|
| Research contributions | ✅ Yes — explicit | Anonymized/aggregated health data |
| Personalized coaching | ✅ Yes | Health trends |
| Health notifications | ✅ Yes | Relevant health metrics |
| Marketing communications | ✅ Yes | Contact info only |
5.3 What We Will Never Do
✗ Sell your personal or biometric data to third parties
✗ Share biometric data with employers, insurers, or government agencies (except under valid legal compulsion — see Section 7.4)
✗ Use health data for targeted advertising
✗ Share identifiable health data for research without explicit consent
✗ Use your data to make automated decisions with legal or similarly significant effects without human review
✗ Discriminate against you for exercising your privacy rights
What This Means: Your health data helps US give YOU better insights. We do not profit by selling your information or use it against you in any way.
SECTION 6: DATA STORAGE AND SECURITY
6.1 Storage Architecture
Infrastructure
Primary Storage: AWS / Azure [specify] — Region: [US-East/EU-West]
Backup Storage: [Provider] — Encrypted backups
Data Residency:
- EU users: Data stored in EU-based servers
- US users: Data stored in US-based servers
- Other regions: [Specify]
Encryption Standards
| Data State | Standard | Key Management |
|---|---|---|
| At rest | AES-256 encryption | AWS KMS / Azure Key Vault |
| In transit | TLS 1.3 minimum | Certificate pinning |
| Biometric templates | AES-256 + additional tokenization | Hardware Security Module (HSM) |
| Database encryption | Transparent Data Encryption (TDE) | Separate key management |
| Backup encryption | AES-256 | Separate backup keys |
6.2 Biometric-Specific Storage Protections
Biometric data receives additional security layers beyond standard personal data:
Layer 1: Device-level storage where technically possible
(Fingerprint/Face ID processed locally via Secure Enclave / TrustZone)
Layer 2: If server-side storage is necessary:
- Stored as irreversible mathematical representations (templates)
- Never stored as raw biometric images
- Isolated in dedicated, air-gapped database segments
- Accessible only by authenticated, authorized processes
Layer 3: Access controls
- Strict role-based access (minimum 2 personnel with access)
- All access logged with immutable audit trail
- No human-readable access to biometric templates
- Multi-factor authentication required for all access
Layer 4: Monitoring
- Continuous anomaly detection
- Automated alerts for unusual access patterns
- Regular penetration testing (quarterly minimum)
- Annual third-party security audits
6.3 Security Certifications and Standards
We maintain compliance with:
- SOC 2 Type II — Annual audit
- ISO 27001 — Information security management
- HIPAA Security Rule — Technical safeguards (where applicable)
- NIST Cybersecurity Framework — Ongoing implementation
6.4 Data Breach Response
In the event of a confirmed breach involving personal data:
GDPR Timeline:
- Notify supervisory authority: Within 72 hours
- Notify affected individuals: Without undue delay when high risk exists
- Document all breaches in our breach register regardless of notification requirement
CCPA/CPRA Timeline:
- Notify affected California residents: In the most expedient time possible
- Notify California Attorney General: If breach affects 500+ California residents
What This Means: Your data is locked with military-grade encryption. Biometric data is treated with an extra level of protection. If something goes wrong, we will tell you quickly.
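The breach-notification timelines in Section 6.4 reduce to simple deadline arithmetic; a minimal sketch, with function names of my own choosing:

```python
from datetime import datetime, timedelta, timezone

GDPR_AUTHORITY_DEADLINE = timedelta(hours=72)

def notification_deadline(confirmed_at: datetime) -> datetime:
    """GDPR: notify the supervisory authority within 72 hours of
    becoming aware of the breach."""
    return confirmed_at + GDPR_AUTHORITY_DEADLINE

def must_notify_california_ag(affected_ca_residents: int) -> bool:
    """Section 6.4's threshold: AG notice when 500+ California
    residents are affected."""
    return affected_ca_residents >= 500

confirmed = datetime(2025, 3, 1, 9, 0, tzinfo=timezone.utc)
assert notification_deadline(confirmed) == datetime(2025, 3, 4, 9, 0,
                                                    tzinfo=timezone.utc)
assert must_notify_california_ag(500)
assert not must_notify_california_ag(120)
```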
SECTION 7: DATA SHARING AND DISCLOSURE
7.1 Sharing Categories Overview
| Recipient | Data Shared | Legal Basis | Your Control |
|---|---|---|---|
| Cloud infrastructure providers | All (as processors) | Processing agreement | Cannot opt out (essential) |
| Analytics providers (anonymized) | Aggregated stats only | Legitimate interest | Opt-out available |
| Healthcare providers | Only if YOU share/export | Your action | Full control |
| Research partners | Anonymized only with consent | Consent | Opt-in required |
| Emergency services | Relevant health data | Vital interests | Cannot opt out |
| Law enforcement | Legally required only | Legal obligation | See Section 7.4 |
7.2 Service Provider (Processor) Requirements
All third parties who process your data must:
✅ Sign a Data Processing Agreement (DPA) meeting GDPR Article 28 requirements
✅ Demonstrate equivalent security standards
✅ Process data only on our documented instructions
✅ Assist with data subject rights requests
✅ Delete or return data upon contract termination
✅ Submit to audits upon request
✅ Notify us of breaches within 24 hours
Current Service Providers (Processors):
| Provider | Purpose | Location | Safeguard |
|---|---|---|---|
| [Cloud Provider] | Infrastructure | US/EU | SCCs + DPA |
| [Analytics Provider] | App analytics | [Location] | DPA — anonymized only |
| [Support Platform] | Customer service | [Location] | DPA |
| [Email Provider] | Transactional emails | [Location] | DPA |
Full list available at: [URL] or upon request
7.3 Data We Do NOT Share
❌ Biometric data — Never shared with third parties for any commercial purpose
❌ Identifiable health data — Never sold or shared for advertising
❌ Any data — Not sold to data brokers
❌ Health data — Not shared with employers, insurance companies, financial institutions, or housing providers
7.4 Legal Compulsion Disclosure
We may be required to disclose data to law enforcement or government agencies. When this occurs:
Our Process:
- We will review all requests for legal sufficiency
- We will challenge overbroad or legally deficient requests
- We will notify you before disclosure where legally permitted
- We will provide minimum necessary data only
- We will document all disclosures in our transparency report
We will not voluntarily provide biometric data to law enforcement absent:
- A valid court order or warrant
- Emergency circumstances involving imminent threat to life
- Your explicit consent
What This Means: We share data with companies that help us run the app — and they're all contractually bound to protect your information. We do not sell your data. Ever.
SECTION 8: YOUR RIGHTS UNDER GDPR
Applicable to residents of the European Economic Area (EEA), UK, and Switzerland
8.1 Your Rights Summary
| Right | What It Means | How to Exercise | Response Time |
|---|---|---|---|
| Right of Access (Art. 15) | Get a copy of all your data | In-app or email | 30 days |
| Right to Rectification (Art. 16) | Correct inaccurate data | In-app or email | 30 days |
| Right to Erasure (Art. 17) | Delete your data ("right to be forgotten") | In-app or email | 30 days |
| Right to Restrict Processing (Art. 18) | Limit how we use your data | Email request | 30 days |
| Right to Data Portability (Art. 20) | Receive data in machine-readable format | In-app export | 30 days |
| Right to Object (Art. 21) | Object to processing based on legitimate interests | Email request | Immediately upon receipt |
| Right to Withdraw Consent (Art. 7(3)) | Withdraw any previously given consent | In-app settings | Immediately |
| Rights re: Automated Decisions (Art. 22) | Human review of automated decisions | Email request | 30 days |
8.2 How to Exercise GDPR Rights
Option 1 — In-App (Fastest): Settings → Privacy & Data → My Rights → Select Right
Option 2 — Email: privacy@healthtrack.com, with subject line "GDPR Rights Request — [Your Name] — [Right Requested]"
Option 3 — Written Request: [Physical Address]
Verification: We will verify your identity before processing rights requests. We will not request excessive information — typically email verification is sufficient.
No Fees: We do not charge for rights requests unless they are manifestly unfounded or excessive, in which case we will inform you before charging.
8.3 Right to Lodge a Complaint
You have the right to complain to your national data protection authority:
- EU residents: Your national DPA (list at edpb.europa.eu)
- UK residents: Information Commissioner's Office (ico.org.uk)
- Switzerland: Federal Data Protection and Information Commissioner
SECTION 9: YOUR RIGHTS UNDER CCPA/CPRA
Applicable to California residents
9.1 CCPA/CPRA Rights Summary
| Right | Description | Response Time |
|---|---|---|
| Right to Know | What personal information we collect, use, share | 45 days (extendable) |
| Right to Delete | Request deletion of your personal information | 45 days |
| Right to Correct | Correct inaccurate personal information | 45 days |
| Right to Opt-Out of Sale/Sharing | Stop sale or sharing of personal info | Honored immediately |
| Right to Limit Use of Sensitive PI | Restrict use of sensitive personal information | Honored immediately |
| Right to Non-Discrimination | Exercise rights without penalty | Always applies |
| Right to Know About Automated Decision-Making | Understand automated profiling | 45 days |
9.2 Sensitive Personal Information Under CCPA/CPRA
Your biometric and health data is classified as Sensitive Personal Information (SPI). You have the right to direct us to:
- Use SPI only to provide the service you requested
- Not use SPI for inferring characteristics about you
- Not share SPI for cross-context behavioral advertising
To exercise this right: [Do Not Share My Sensitive Personal Information] — [Link]
9.3 "Do Not Sell or Share My Personal Information"
We do not sell personal information as defined by CCPA.
However, you may still opt out of any data sharing: [Do Not Sell or Share My Personal Information] — [Link]
9.4 How to Submit CCPA Requests
Online: [Privacy Request Portal URL]
Toll-Free Number: 1-800-XXX-XXXX (available Monday–Friday, 9 AM–5 PM PT)
Email: ccpa@healthtrack.com
Verification Process:
- We will verify your identity using information we already hold
- We may ask you to confirm up to two pieces of personal information
- We will not require creation of a new account
- Authorized agents may submit requests with proper documentation
9.5 Authorized Agents
California residents may designate an authorized agent to exercise rights on their behalf. We require either:
- Written authorization signed by you, or
- Power of attorney pursuant to the California Probate Code
We may also verify the request directly with you.
SECTION 10: DATA RETENTION AND DELETION
10.1 Retention Schedule
| Data Category | Retention Period | Basis |
|---|---|---|
| Account data | Duration of account + 2 years | Contract/Legal |
| Health and fitness data | Duration of account + 1 year (unless deletion requested) | Consent |
| Biometric data | Duration of consent or 3 years, whichever is shorter | Consent |
| Authentication biometrics | Session/device only (not retained server-side where possible) | Security |
| Financial transaction records | 7 years | Legal (tax/accounting) |
| Security logs | 2 years | Legitimate interest |
| Anonymized/aggregated data | Indefinitely (cannot be re-identified) | Legitimate interest |
| Backup data | Rolling 90-day backup window | Security |
10.2 Automatic Deletion Triggers
Data will be automatically deleted when:
⏰ Biometric data consent expires — deleted within 30 days
⏰ Account inactive for 2 consecutive years — notification sent, then deletion
⏰ Subscription cancellation — health data retained 1 year, then deleted
⏰ Consent withdrawn — relevant data deleted within 30 days
⏰ Account deletion requested — see Section 11
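The automatic triggers above amount to a scheduled retention sweep; a minimal sketch, with illustrative record fields and only a subset of the Section 10.1 windows:

```python
from datetime import datetime, timedelta, timezone

# Retention windows from the triggers above (illustrative subset).
RETENTION = {
    "biometric_consent_expired": timedelta(days=30),
    "consent_withdrawn": timedelta(days=30),
    "account_inactive": timedelta(days=365 * 2),
}

def due_for_deletion(records: list[dict], now: datetime) -> list[dict]:
    """Return records whose retention trigger has elapsed. A daily
    scheduled job would route hits to the deletion pipeline."""
    return [
        r for r in records
        if now - r["trigger_at"] >= RETENTION[r["trigger"]]
    ]

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "trigger": "consent_withdrawn",
     "trigger_at": now - timedelta(days=45)},   # past the 30-day window
    {"id": 2, "trigger": "consent_withdrawn",
     "trigger_at": now - timedelta(days=10)},   # still inside the window
]
assert [r["id"] for r in due_for_deletion(records, now)] == [1]
```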
SECTION 11: BIOMETRIC DATA — COMPLETE STORAGE, SHARING, AND DELETION PROTOCOL
This section provides the detailed technical and procedural framework for biometric data handling.
11.1 Biometric Data Storage Protocol
Collection Phase
Step 1: CONSENT GATE
↓
User must complete separate biometric consent flow
Consent is version-controlled and timestamped
↓
Step 2: MINIMIZATION CHECK
↓
Only collect biometric data strictly necessary for stated purpose
Default: Process biometrics on-device where possible
↓
Step 3: TRANSFORMATION
↓
Raw biometric data (e.g., fingerprint scan) is IMMEDIATELY converted
to a mathematical template/hash
Original biometric image/recording is NEVER stored
↓
Step 4: TOKENIZATION
↓
Template is assigned a random token ID
Token is stored in main database
Actual biometric template stored in isolated secure vault
The two are never stored together
↓
Step 5: ENCRYPTION
↓
Template encrypted with AES-256
Key stored in Hardware Security Module (HSM)
↓
Step 6: ISOLATION
↓
Stored in dedicated biometric data store
Separated from all other personal data
Separate access controls and audit logging
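Steps 3–6 above can be sketched in a few lines. This is a toy illustration: a salted hash stands in for a real biometric template transform, and plain dicts stand in for the encrypted vault and main database, which in production would be separately secured stores with AES-256 and HSM-managed keys (Step 5):

```python
import hashlib
import secrets

# Two deliberately separate stores (Steps 4 and 6): the main database
# holds only tokens; the isolated vault maps token -> template. They
# are never joined except through an authenticated lookup.
main_db: dict[str, str] = {}         # user_id -> token
secure_vault: dict[str, bytes] = {}  # token -> salted template

def enroll_biometric(user_id: str, raw_scan: bytes) -> None:
    """Raw scan -> irreversible template -> tokenized storage."""
    salt = secrets.token_bytes(16)
    template = hashlib.sha256(salt + raw_scan).digest()  # Step 3: one-way
    del raw_scan               # drop the local reference; never persisted
    token = secrets.token_hex(16)                        # Step 4: random token
    main_db[user_id] = token
    secure_vault[token] = salt + template                # Steps 5-6 elided

enroll_biometric("user-123", b"\x01\x02\x03-fingerprint-scan")
assert main_db["user-123"] in secure_vault
```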
Storage Locations
Authentication Biometrics (Fingerprint/Face ID):
├── PRIMARY: Device Secure Enclave (iOS) / StrongBox (Android)
│ └── Never leaves the device
│ └── No server-side storage
│
└── FALLBACK (if device doesn't support secure storage):
└── Server-side encrypted template vault
├── AES-256 encryption
├── HSM-managed keys
└── Isolated network segment
Health Biometrics (Heart rate patterns, physiological signatures):
├── Encrypted at-rest (AES-256)
├── User-specific encryption keys
├── Stored separate from identity data
└── Accessible only through authenticated API with audit logging
Data Flow Diagram
User Device App Server Secure Vault Your Dashboard
│ │ │ │
│──Biometric Input──▶│ │ │
│ │──Transform to──────▶│ │
│ │ Template │ │
│ │──Store Token──────▶│ │
│ │ │──Encrypted─────────▶│
│ │ │ Template │
│◀─Confirmation──────│ │ │
│ │ │ │
[Raw biometric data deleted from all processing nodes immediately]
11.2 Biometric Data Sharing Protocol
Sharing Decision Tree
Request to share biometric data?
│
▼
Is this for authentication?
├── YES → Process locally, no sharing
└── NO → Continue
│
▼
Is this a legal compulsion (court order/warrant)?
├── YES → Review validity → Challenge if deficient
│ Notify user → Disclose minimum necessary
└── NO → Continue
│
▼
Is this for research?
├── YES → Explicit separate consent obtained?
│ ├── YES → Anonymize/aggregate → Share
│ └── NO → Deny request
└── NO → Continue
│
▼
Is this a service provider (processor)?
├── YES → DPA signed? Security verified? Minimum necessary?
│ ├── YES → Permitted under specific conditions
│ └── NO → Deny request
└── NO → DENY — No legitimate basis exists
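The decision tree above can be expressed as a single function; the request flags are illustrative names mirroring each branch, not a real API:

```python
def may_share_biometrics(request: dict) -> bool:
    """Evaluate the sharing decision tree; default is deny."""
    if request.get("purpose") == "authentication":
        return False  # processed locally; nothing leaves the device
    if request.get("legal_compulsion"):
        # Valid court order/warrant only, after legal review (Section 7.4).
        return request.get("order_reviewed_valid", False)
    if request.get("purpose") == "research":
        return (request.get("explicit_consent", False)
                and request.get("anonymized", False))
    if request.get("purpose") == "processing":
        return (request.get("dpa_signed", False)
                and request.get("security_verified", False)
                and request.get("minimum_necessary", False))
    return False  # no legitimate basis exists

assert not may_share_biometrics({"purpose": "authentication"})
assert may_share_biometrics({"purpose": "research",
                             "explicit_consent": True, "anonymized": True})
assert not may_share_biometrics({"purpose": "marketing"})
```

Encoding the policy as deny-by-default code means a new, unanticipated request category falls through to the final `return False`, matching the tree's last branch.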
Absolute Sharing Prohibitions
BIOMETRIC DATA WILL NEVER BE SHARED WITH:
❌ Employers or prospective employers
❌ Insurance companies (health, life, disability)
❌ Financial institutions for credit decisions
❌ Advertisers or marketing platforms
❌ Data brokers
❌ Government agencies without valid legal compulsion
❌ Social media platforms
❌ Other apps or platforms without explicit, specific consent
❌ International recipients without adequate protections
11.3 Biometric Data Deletion Protocol
This is the most critical section for biometric data rights. We commit to the following:
Deletion Triggers
| Trigger | Deletion Timeline | Process |
|---|---|---|
| User deletion request | 30 days maximum (target: 48 hours for biometrics) | Automated + verified |
| Consent withdrawal | 30 days maximum | Automated |
| Account deletion | 30 days maximum | Comprehensive sweep |
| Retention period expiry | Automatic at expiry | Scheduled job |
| Specific consent expiry | Automatic at expiry | Scheduled job |
| Legal basis no longer applies | Immediate | Triggered process |
Deletion Process — Step by Step
DELETION REQUEST RECEIVED
│
▼
STEP 1: VERIFICATION (24-48 hours)
├── Verify identity of requestor
├── Log deletion request with timestamp
└── Send confirmation of receipt to user
│
▼
STEP 2: SCOPE IDENTIFICATION (48-72 hours)
├── Identify all biometric data across ALL systems:
│ ├── Primary database
│ ├── Backup systems
│ ├── Archive storage
│ ├── Disaster recovery copies
│ ├── Third-party processors
│ └── Anonymized dataset contributions (if applicable)
└── Generate deletion manifest
│
▼
STEP 3: PRIMARY DELETION (Within 7 days of verification)
├── Delete biometric templates from secure vault
├── Delete associated tokens from main database
├── Delete consent records (retain record THAT consent existed — not the data)
├── Delete any derived data unique to this user's biometrics
└── Overwrite deleted storage locations (DoD 5220.22-M standard or equivalent)
│
▼
STEP 4: PROCESSOR DELETION (Within 14 days)
├── Issue deletion instructions to all data processors
├── Obtain written confirmation of deletion
└── Document in deletion log
│
▼
STEP 5: BACKUP AND ARCHIVE DELETION (Within 30 days)
├── Flag backups containing this user's biometric data for purging
├── Purge from all backup rotations
└── Confirm backup deletion
│
▼
STEP 6: VERIFICATION AND AUDIT (Within 30 days)
├── Technical verification that deletion is complete
├── Review deletion manifest against all known storage locations
├── Generate deletion certificate
└── Notify user of completion with deletion certificate
│
▼
STEP 7: RETAIN PROOF OF DELETION
├── Maintain log that data existed and was deleted (no biometric data retained)
├── Retain consent record showing consent was given and withdrawn
└── Retain deletion request record for legal compliance
What "Deletion" Technically Means
We define deletion to mean:
CRYPTOGRAPHIC DELETION:
For encrypted data: We destroy the encryption keys, rendering
the data permanently inaccessible and cryptographically unrecoverable.
Timeline: Immediate for key destruction
PHYSICAL DELETION:
We overwrite the storage locations using secure deletion standards.
Data cannot be reconstructed through forensic means.
Timeline: Within 30 days
BACKUP DELETION:
Backups are purged on their next rotation cycle within 30 days.
No backup contains recoverable biometric data after this period.
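Cryptographic deletion can be demonstrated with a toy model: data exists only as ciphertext, keys live in a separate vault, and destroying the key makes the data unrecoverable even if backups linger. A one-time pad is used here purely so the demo is stdlib-only; production systems would use AES-256 with HSM-managed keys, as described in Section 6:

```python
import secrets

key_vault: dict[str, bytes] = {}    # per-user keys (HSM in production)
ciphertexts: dict[str, bytes] = {}  # encrypted data, may persist in backups

def store(user_id: str, data: bytes) -> None:
    key = secrets.token_bytes(len(data))  # one-time pad key (toy)
    key_vault[user_id] = key
    ciphertexts[user_id] = bytes(a ^ b for a, b in zip(data, key))

def read(user_id: str) -> bytes:
    key = key_vault[user_id]              # raises KeyError once deleted
    return bytes(a ^ b for a, b in zip(ciphertexts[user_id], key))

def crypto_delete(user_id: str) -> None:
    """Destroy the key: the ciphertext may linger in backups but is
    now permanently unrecoverable."""
    del key_vault[user_id]

store("user-123", b"heart-rate template")
assert read("user-123") == b"heart-rate template"
crypto_delete("user-123")
try:
    read("user-123")
    raise AssertionError("data should be unrecoverable")
except KeyError:
    pass  # key destroyed; decryption is impossible
```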
Deletion Limitations and Exceptions
We will notify you if we cannot immediately delete data because of:
| Exception | Basis | Duration |
|---|---|---|
| Active legal hold | Court order or litigation | Duration of hold |
| Regulatory investigation | Legal obligation | Duration of investigation |
| Fraud prevention | Up to 12 months | Legal basis documented |
Even in these cases:
- Data is locked with restricted access
- Not used for any other purpose
- Deleted immediately when the exception no longer applies
Deletion Confirmation
Upon completed deletion, you will receive:
DELETION CERTIFICATE
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
User ID: [Token — not real ID for security]
Request Date: [DATE]
Completion Date: [DATE]
Data Types Deleted: [List]
Systems Cleared: [List]
Processor Confirmations: [List]
Certificate ID: [Unique ID for verification]
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
This certificate confirms deletion in accordance with
GDPR Article 17 and California Civil Code § 1798.105 (CCPA)
What This Means: When you ask us to delete your biometric data, we actually delete it — from everywhere, including backups. We'll send you written proof. Target: deleted within 48 hours; complete across all systems within 30 days.
SECTION 12: CHILDREN'S PRIVACY
We do not knowingly collect data from children:
- Under 16 in the EEA (GDPR Article 8)
- Under 16 in California (CCPA)
- Under 13 elsewhere (COPPA)
If we discover we have collected data from a child below the applicable age:
- We will delete it immediately without request
- We will notify the parent/guardian if contact information is available
- We will review how the collection occurred and prevent recurrence
Age Verification: We implement reasonable age verification at signup. We do not collect biometric data to verify age.
SECTION 13: CROSS-BORDER DATA TRANSFERS
13.1 GDPR Transfer Safeguards
When transferring EU/EEA personal data outside the EEA, we use:
| Mechanism | Application |
|---|---|
| Standard Contractual Clauses (SCCs) | Primary transfer mechanism — all non-adequate countries |
| Adequacy Decisions | UK, Switzerland, Canada, and other adequate countries |
| Binding Corporate Rules | Intra-group transfers |
| Explicit Consent | Case-by-case, where appropriate |
Transfer Impact Assessments (TIA): We conduct TIAs for all transfers to countries without adequacy decisions, assessing:
- Laws of the recipient country
- Risk of government access
- Supplementary technical measures in place
13.2 Biometric Data Transfer Restrictions
Biometric data transfers are subject to heightened restrictions:
✅ Transfer permitted:
- Within the EU/EEA
- To countries with adequacy decisions
- Via SCCs with completed TIA showing adequate protection
❌ Transfer not permitted:
- To countries with demonstrated government surveillance risks without supplementary measures
- To processors that cannot demonstrate equivalent security
- Without a documented transfer mechanism on file
SECTION 14: COOKIES AND TRACKING TECHNOLOGIES
14.1 App Tracking
| Tracker Type | Purpose | Basis | Control |
|---|---|---|---|
| Essential session tokens | App function | Contract | Cannot disable |
Detailed Breakdown
When privacy is a priority, the comparison between Claude and Kimi hinges less on features and more on jurisdiction, data governance, and organizational trust frameworks — and those differences are significant.
Claude is built by Anthropic, a US-based AI safety company with a clear and publicly available privacy policy. Free and Pro users should be aware that conversations may be reviewed to improve models, though Anthropic provides opt-out mechanisms and offers stronger data protections at the API level and for enterprise customers. Anthropic has pursued SOC 2 compliance, and its privacy documentation is detailed and written in accessible English. For professionals handling sensitive content — think draft legal briefs, medical research summaries, or business strategy documents — Claude's privacy posture is auditable and held to recognizable Western legal standards (GDPR, CCPA, etc.).
Kimi is developed by Moonshot AI, a Chinese company headquartered in Beijing. This raises an immediate and material concern for privacy-conscious users: data processed through Kimi may be subject to Chinese cybersecurity and data sovereignty laws, including the Data Security Law (DSL) and the Personal Information Protection Law (PIPL), which can compel companies to share data with Chinese government authorities under certain conditions. For most casual users this may seem abstract, but for anyone handling proprietary business data, sensitive personal information, or confidential client material, this is a concrete risk that must be weighed seriously. Kimi's documentation is also primarily in Chinese, making independent privacy auditing difficult for non-Chinese-speaking users or compliance teams.
In practical terms: if you are a freelancer drafting personal blog posts or a student brainstorming ideas, the difference may feel negligible. But if you are a lawyer reviewing case notes, a healthcare professional summarizing patient research, or a business analyst working with competitive intelligence, routing that data through a Chinese-jurisdiction AI service introduces risks that are hard to justify — regardless of how capable the model is.
Kimi's partially open-weight approach does offer some theoretical transparency into the model itself, but open weights do not translate into visibility over where your data goes or how it is stored.
Recommendation: For privacy-sensitive use cases, Claude is the clear choice. It offers a more established, auditable data governance framework, operates under recognizable Western legal standards, and provides enterprise-grade controls for organizations that need them. Kimi is a capable model, but its jurisdictional exposure makes it unsuitable for users for whom data confidentiality is non-negotiable.
Try privacy tasks with Claude and Kimi
Compare in Multichat — free. Join 10,000+ professionals who use Multichat.