RAIHR Framework · Dimension 3 of 7

Privacy

Why HR professionals are now on the frontline of employee and candidate data protection



More data, less visibility

The average enterprise HR function today handles more personal data than it did five years ago — and understands less about what happens to it.

AI-powered hiring tools process resumes, video interviews, psychometric assessments, and behavioral signals. Engagement platforms continuously collect pulse survey responses and communication patterns. Predictive analytics models ingest years of performance, attendance, and compensation history. Workforce planning tools pull data across systems that were never designed to speak to each other.

Each of these tools collects data on real people — candidates who applied in good faith, employees who filled out surveys because HR asked them to. And in many organizations, no one has a clear picture of where that data goes, how long it is retained, what it is used for beyond the immediate stated purpose, or what protections apply when it crosses a vendor's servers in another jurisdiction.

That is a privacy problem. And HR owns it.


What privacy means in AI-assisted HR

Privacy, in the context of AI-assisted HR, is not primarily about data security — though security matters too. It is about data governance: whether personal data collected for one purpose is used for that purpose and only that purpose, whether people know what is being collected and why, and whether the organization maintains control over data it is responsible for.

The core principles that apply:

  • Purpose limitation: data collected to evaluate a candidate for a specific role should not be used to train a vendor's model, build an industry benchmark product, or assess future candidates at other organizations without explicit consent.
  • Data minimization: AI tools should collect only the data genuinely necessary for their stated function. A video interview that also captures facial micro-expressions and voice stress patterns is collecting significantly more than a standard interview — and HR needs to ask whether that additional collection is justified.
  • Retention limits: personal data should not be held indefinitely. Candidate data from unsuccessful applications, employee data from tools no longer in use, and assessment outputs from departed employees all need defined retention and deletion timelines.
  • Cross-border data flows: when candidate or employee data is processed by a vendor whose servers operate in another country, different legal frameworks apply. HR cannot manage this risk without knowing where the data goes.
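For teams that track retention deadlines programmatically, the timeline logic described above can be sketched in a few lines. The record types and retention periods below are hypothetical placeholders for illustration, not RAIHR recommendations — actual periods depend on jurisdiction and your organization's policy:

```python
from datetime import date, timedelta

# Hypothetical retention periods by record type (illustrative values only).
RETENTION_DAYS = {
    "unsuccessful_candidate": 180,   # e.g. 6 months after the hiring decision
    "assessment_output": 365,        # e.g. 1 year after collection
    "departed_employee": 730,        # e.g. 2 years after departure
}

def deletion_due(record_type: str, collected_on: date) -> date:
    """Return the date by which a record of this type must be deleted."""
    return collected_on + timedelta(days=RETENTION_DAYS[record_type])

def overdue(record_type: str, collected_on: date, today: date) -> bool:
    """True if the record has passed its defined retention limit."""
    return today > deletion_due(record_type, collected_on)
```

The point is not the code itself but the discipline it encodes: every category of personal data gets an explicit deletion deadline, and "indefinitely" is never a valid value.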

Privacy in HR AI is not about locking data in a vault. It is about knowing what data you have, what it is being used for, who has access to it, and whether that use is what the people who provided it would reasonably expect.


Why HR is the last human checkpoint

HR sits in an unusual position in the privacy landscape. It is often the function that collects the most sensitive personal data in the organization — health information, disciplinary records, family circumstances, performance assessments — but it is rarely the function with the most data governance expertise.

The result is a gap: HR procures and deploys tools that process sensitive personal data, while the privacy implications of those tools are either assumed to be the vendor's responsibility or referred entirely to legal and IT teams who may lack the HR context to evaluate them meaningfully.

Neither assumption is safe. Vendors are data processors — they act on your instructions. The data controller responsibility sits with your organization, which means it sits with HR as the function that made the procurement decision, designed the data collection process, and communicates with candidates and employees.

When a candidate's video interview data ends up being used to train a shared model that serves the vendor's other clients, that is HR's problem — even if the vendor contract permitted it in language no one read carefully. When an employee's wellness app data is retained by a vendor after the contract ends, that is HR's problem too.


What privacy governance looks like in practice

Scenario 1: The vendor uses your data to train their model

Your organization has used an AI interview assessment platform for two years. You discover — in a contract renewal discussion — that the vendor's terms allow them to use candidate assessment data from all clients to improve their model. Your candidates' responses are being used to make the tool better for the vendor's other customers.

A privacy-literate HR professional recognizes this as a purpose limitation violation. Candidates provided their data to be assessed for a position at your organization — not to contribute to a commercial AI product. The right response is to negotiate the removal of this clause, request confirmation that previously collected data has not been used in this way, and evaluate whether the tool's continued use is appropriate given this practice.

Scenario 2: You do not know where the data goes

During a vendor evaluation, your team asks where candidate data is stored and processed. The vendor says their infrastructure is "cloud-based" and data may be processed in multiple regions depending on server availability.

A privacy-literate HR professional understands that this answer tells them nothing useful. Data storage jurisdiction determines which legal framework applies, what cross-border transfer mechanisms are required, and what rights candidates have over their data. The follow-up questions are specific: which countries? What legal basis for transfer? What happens to data retained in those jurisdictions after contract termination?

If the vendor cannot answer these questions, that is a red flag — not a detail to address later.

Scenario 3: A former candidate requests their data

Six months after an unsuccessful application, a candidate submits a data subject access request asking for all personal data your organization holds about them — including any AI-generated assessments.

A privacy-literate HR professional understands that this request covers AI-processed data, not just traditional HR records. They work with the vendor to retrieve the relevant assessment outputs, communicate transparently with the candidate about what data was collected and how it was used, and ensure deletion is completed in accordance with retention policy.

"The AI vendor holds that data, not us" is not a valid response. The data controller obligation does not transfer to the vendor.


Key questions to ask

1. Does your contract allow the vendor to use our candidates' or employees' data to train their models or build benchmark products? If so, on what basis, and can that clause be removed?
2. In which countries is our candidates' and employees' data stored and processed? What legal mechanism covers cross-border transfers to each jurisdiction?
3. What data does this tool actually collect — including data collected beyond its stated primary function (e.g., behavioral signals, biometric indicators, communication metadata)?
4. What is the vendor's data retention policy? What happens to our data if we terminate the contract, and within what timeframe is deletion confirmed?
5. Are candidates and employees informed — in plain language, before they participate — what data is collected, how it is used, and what their rights are?
6. What is our process for responding to data subject access and deletion requests that involve AI-processed data held by vendors?
7. Has a data protection impact assessment been completed for this tool, given the sensitivity of the data it processes?

Build the judgment to protect people's data

Privacy is one of seven dimensions in the RAIHR framework for responsible AI in HR. The ability to identify a purpose limitation violation in a vendor contract, to know what questions to ask about data residency, to respond correctly to a data subject access request involving AI outputs — these are not skills that come automatically with HR experience. They require deliberate learning.

The RAIHR Certified Practitioner program tests privacy judgment through scenario-based examination grounded in real HR situations. You will be assessed not on your ability to cite regulation, but on your ability to recognize what the right action is — and why the plausible alternatives fall short.

The certification is open to all HR professionals — regardless of seniority or technical background. No coding knowledge required. Open-book examination. 90 minutes. The question is not whether you can recall definitions. It is whether you can make the right call.

Ready to get certified? Register at raihr.org


RAIHR · Responsible AI in HR · raihr.org

This article is part of the RAIHR Framework Series covering all seven dimensions of the certification program.

