
Unpacking how new Ethical AI laws will impact workforces in 2023 and beyond

The webinar recap

Hear from industry leaders at Reejig and Workology as they unpack:

  • What is the new NYC Local Law 144 and how does it impact your organization?
  • Is this a global movement? What other legislation should organizations be aware of?
  • What should you be asking vendors when it comes to Ethical AI? And how should you be working with your privacy, security and risk teams?



Top advice from our expert panel:

Key takeaways:
  • If you use AI for any form of talent decision-making in your organization and it results in discrimination, you are liable, whether the decision was made by you or by the AI.
  • New legislation is being introduced in January 2023 to ensure that vendors cannot ‘mark their own homework’. Independent bias audits in talent AI are becoming mandated under NYC Local Law 144, and this is just the start of a global ripple effect.
  • Now is the time to stop and ask your vendors: has your talent AI been independently audited? And if not, why not?

Reejig’s perspective on regulating and legislating Ethical AI

There’s a global movement by regulators to demand more accountability and more explainability from AI-based tools, especially those operating in areas that come with a higher risk of bias like human resources. 

The upcoming NYC legislation, Local Law 144, is just the start of a growing number of regulations that give organizations the opportunity to review the HR AI vendors they’re using and the efforts those vendors are making to eliminate bias in employment decisions.

This is no longer an ‘ideal’ — it’s a must that comes with real consequences for organizations that do not have independently audited vendors on board.

We’re committed to using artificial intelligence for good. That’s why, in partnership with the University of Technology Sydney, we developed the world’s first independently audited Ethical Talent AI certification, setting a new benchmark in trust and ethics for our industry. Our algorithms are compliant with global regulations on equal opportunity, anti-discrimination, and human rights, so you can trust you’re making good and fair decisions for your people.

These good and fair decisions don’t just help you remain compliant, they serve to make people feel seen, feel heard, and avoid wasted potential by placing skills, experience, and future potential at the forefront of every employment decision.

What should talent professionals be asking vendors when it comes to Ethical AI?
  • Have you undergone an independent bias audit?
  • If so, who conducted the audit, and did its scope cover your processes, data, and algorithms?
  • When was the audit done, how long is it valid for, and do you have plans to repeat it annually?
  • What was the independent audit process and will you share the audit outcomes?
  • How do you support notice and consent requirements?
  • What data sources do you collect your talent information from?
  • What personal characteristics and job information do you collect? How do you use it to inform your selection process?
  • How is your AI and machine learning built to reduce bias and provide ethical decision-making support? 
  • How are you regulating your AI and reviewing its selection process in relation to ethical standards and bias reduction?

We're here to help!

If you want to learn more about how you can best manage the upcoming Ethical AI laws, get in touch with one of the Reejig Ethical AI specialists.

Our team can support you to:

  • Understand the general impact of NYC Local Law 144 on your organization
  • Understand other global legislation regarding Ethical Talent AI
  • Get advice on how to best work with your privacy, security and risk teams to manage Talent AI

Get in touch
