What Matters Now: Embracing the New Era of Disclosures for All HR Technology Stakeholders

Learn about the rising importance of disclosure as a critical tool to maintain trust and legitimacy.


Ernest Ng

Ernest Ng, PhD, serves as the VP of Strategy and Research at HiredScore, specializing in business strategy and AI-driven HR innovations. His extensive experience includes roles at Salesforce, The Walt Disney Company, and the California Department of Education. Additionally, he teaches HR Analytics as an Adjunct Associate Professor at the University of Southern California.


One prominent theme that emerged strongly across the HR tech ecosystem, and one we feel acutely as an AI vendor for HR, was the rising importance of disclosure as a critical tool to maintain trust and legitimacy across four relationships:

  1. Employer Disclosure with Candidates/Employees
  2. Solution Provider Disclosures to the Buyer
  3. Organizational Disclosures to the Government
  4. Industry Analyst Disclosures to Consumers

If ethics and trust are not at the forefront of your mind, you’d better study up fast. Regulation and compliance are going to become even more foundational for HR and business leaders striving to transform their organizations with AI. One area of safety that is rapidly progressing is disclosure and awareness - from data privacy and data use to consent and opt-outs.

When building trust in any relationship, transparency is key; it creates the foundation for many of the regulations and conversations surrounding ethical business practices, awareness of intentions, and reduced risk. Below are my thoughts on four key areas of disclosure:

Employer Disclosure with Candidates/Employees

As one of many examples, much has been made of New York City’s Local Law 144 and its impact on the HR space over the past year. Many of our clients and prospects asked us how this local law impacts their use of HiredScore. You can view our statement here, but central to this rule is the mandate that employers disclose their use of AI to candidates and employees. Further, in response to this law, employers are increasingly giving candidates and employees the option to opt out of certain types of AI products, especially those that analyze an applicant against job requirements, promotion options, and other types of assessment. Additionally, the employer using AI must provide a bias audit and a summary of the results in a “clear and conspicuous manner”. It’s not that AI can’t be used in the hiring process; it’s that there needs to be full disclosure and clearly visible audits so that the job seeker can make an informed decision.

For our customers, we provide configuration options that enable applicants (employees and external candidates) to opt out of our candidate scoring solution and our Fetch solution, which finds other jobs an applicant qualifies for, either when they apply or in the future if they are rejected. As a leader in the space of AI-for-HR opt-out, we have observed interesting patterns related to consumer control and value.

For example, since this feature went live in July 2023, fewer than 8% of candidates, on average, opt out of having their resume resurfaced for another role in the future. If candidates believe a product will provide them value, they aren’t hesitant to opt in, even when they are told that it uses AI and will be incorporated into their current or future hiring processes. Can we be certain that this 8% only ever wants the specific job they applied for at your company and, further, fully understands the benefits of the feature they are opting out of? No, but it is the applicant’s decision to make, even if misinformed, and the growing use of clear disclosure language is an important part of the trust rapport between applicant and employer.
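
To make the opt-out mechanics concrete, here is a minimal, purely hypothetical sketch of how an applicant's consent choices might gate AI-driven features in an application workflow. The CandidateConsent structure, its field names, and the evaluate_application function are illustrative assumptions for this post, not HiredScore's actual configuration or API.

    from dataclasses import dataclass

    @dataclass
    class CandidateConsent:
        # Hypothetical record of an applicant's AI-related opt-out choices.
        opted_out_of_scoring: bool = False  # opt out of AI scoring against job requirements
        opted_out_of_fetch: bool = False    # opt out of resume resurfacing for future roles

    def evaluate_application(candidate: dict, job: dict, consent: CandidateConsent) -> dict:
        # Illustrative gate: each AI feature runs only if the applicant has not opted out.
        results = {"ai_disclosure_shown": True}  # the disclosure itself is always presented
        if not consent.opted_out_of_scoring:
            # Placeholder "scoring": overlap of listed skills, standing in for a real model.
            overlap = set(candidate.get("skills", [])) & set(job.get("required_skills", []))
            results["score"] = len(overlap)
        if not consent.opted_out_of_fetch:
            results["eligible_for_fetch"] = True  # resume may be matched to future openings
        return results

    # Example: an applicant who opts out of scoring but allows resume resurfacing.
    consent = CandidateConsent(opted_out_of_scoring=True)
    print(evaluate_application({"skills": ["sql"]}, {"required_skills": ["sql", "python"]}, consent))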

There are countless other areas where information disclosures by employers to candidates and employees have been used to mitigate unfair business practices. From salary ranges to data use, arbitration, and NDAs, disclosures across the board are becoming critical best practices to ensure that the end user is protected, informed, and given the right to choose their preference.

Solution Provider Disclosures to the Buyer

The second slide you see in every technology solution provider’s sales pitch owes its existence to the Securities Exchange Act of 1934 and the Private Securities Litigation Reform Act of 1995. The safe harbor statement is a disclosure to the consumer that some statements may be forward-looking and not based on historical fact. It is a best practice that reminds buyers to base all purchasing decisions on currently available products, not on assertions that may or may not come to pass. This disclosure is meant to protect the solution provider, giving them the ability to project, predict, and presume possibilities without fear of major repercussions, while ensuring buyers are informed that the statements contain speculation.

This helps to build trust, ensure the consumer is informed, and put guardrails on solution providers, deterring them from engaging in deceptive practices. As such, if you choose to buy a product based on future promises, knowing full well that they are not currently available, and those promises do not come to fruition, that’s your choice. Balancing out the information asymmetry is critical to ensuring fairness.

Organizational Disclosures to the Government

Much like the transparency and disclosure requirements of NYC 144, the proposed EU AI Act requires organizations to disclose to the relevant EU regulators the AI they are developing, promoting, and employing, depending on the risk level involved. AI used for employment, worker management, and access to self-employment is classified as high-risk and is thereby subject to registration in an EU-wide public database and to disclosure of risk management, data governance, documentation, oversight, accuracy, and conformity practices. These disclosures are established to keep organizations accountable for their actions and to ensure that every part of society benefits from the advancements in AI, not just certain individuals and organizations.

Additionally, the new Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence requires that “developers of the most powerful AI systems share their safety test results and other critical information with the U.S. government. In accordance with the Defense Production Act, the Order will require that companies developing any foundation model that poses a serious risk to national security, national economic security, or national public health and safety must notify the federal government when training the model, and must share the results of all red-team safety tests. These measures will ensure AI systems are safe, secure, and trustworthy before companies make them public.” Whether it’s these AI regulations, tax filings, or SOX reporting, such government regulations are, at their core, organizational disclosures to the government for the benefit of the public.

Industry Analyst Disclosures to Consumers

The practice of disclosure is a familiar concept, understood by service providers, organizations, and consumers alike. Across various industries, disclosures are recommended to mitigate conflicts of interest. For example, doctors presenting at medical conferences disclose financial conflicts, government officials submit financial disclosure forms (you can find stock trade trackers built off this information on the internet), and the FTC Act makes it clear that sponsored posts, articles, and content need to be properly disclosed to the consumer. Surprisingly, this is one area where the HR industry lags behind perhaps every other area of technology, from B2B to B2C, in both awareness and accountability.

When I used to speak at these HR industry events as a representative for Salesforce, it was clear that I was representing the organization’s interests. Vendor-sponsored presentations are typically clearly identified, so audience members are aware that vendors have paid for the chance to promote their products, results, executives, and clients. However, unlike in other industries, there is no requirement for presenters and speakers to disclose conflicts of interest, and the practitioners who attend these conferences are only now starting to ask whether they have unknowingly circulated, built business cases on, or trusted a biased, pay-to-play perspective. In short, the HR industry must address this as a critical area for improvement, both to maintain consumer trust and to comply with the laws that govern disclosure. The goal is not to stop anyone from profiting from the rewards of the HR tech space, but simply to rise to the standards of the industries around us and require enhanced transparency and a commitment to disclosure - from board seats held to subscription or payment models for coverage and mention.

This is why many financial institutions do not let their stock analysts purchase the individual stocks they cover, and why, on most TV programs, a disclosure of the financial positions held appears before or after a financial institution’s representative speaks about their analysis. Hiding conflicts of interest erodes trust and is a core component of deceptive business practices. If I were a stock analyst who held an equity position in a certain stock, went on TV to pump up that stock using insider information without disclosing my position, and then sold the stock after a run-up, I would be jailed for insider trading. Analyst positions carry great power to influence the market, and many analysts are aware of it. As such, without a robust disclosure practice, the consumer is not just at a huge disadvantage but at risk of misunderstanding the information presented to them, and the reputation of the HR industry itself will tarnish.

For organizations, these are huge purchasing decisions, sometimes totaling tens of millions of dollars. For the HR leader tasked with that decision, it can be career-defining or career-limiting; I’ve seen many careers altered by an imprudent technology purchasing decision. And for the solution provider, a sale into a marquee logo can make or break the organization. These are life-altering decisions, and the word of industry analysts can hold huge sway. Shouldn’t consumers know which factors are influencing their analysis or recommendations? It’s a small ask for such an impactful decision.

If we are going to mature as an industry, we’ve got to perfect our ability to disclose across all four of these areas. The recent developments - from employers themselves, privacy experts, and most recently governments and AI leaders - have yielded meaningful progress and positive results for our industry. However, there is still room for improvement among our industry experts. Let’s all commit to being better about disclosures for the benefit of all our stakeholders.
