Last week, Elon Musk's xAI filed a federal lawsuit to strike down Colorado's new AI anti-discrimination law — a law that would require employers to audit the AI tools they use in hiring and tell applicants when an AI is being used to screen them.

The legal argument: the law is unconstitutional, impermissibly vague, and certain to chill AI development, xAI claims.

Here's what that lawsuit means for NYC business owners: very little. Because New York City already passed nearly the same law three years ago. It's been on the books since 2023. And if you use AI in your hiring process — software that screens resumes, ranks candidates, or scores applications — you may already be out of compliance.

What Colorado's Law Actually Says

Colorado Senate Bill 205, which took effect this spring, requires any business deploying "high-risk" AI systems — defined as tools that make consequential decisions about employment, housing, credit, or education — to:

  1. Conduct annual bias audits of those systems

  2. Notify applicants when AI is being used to evaluate them

  3. Allow applicants to request a human review instead

  4. Disclose what data the AI is using

xAI's lawsuit argues the law is unconstitutionally vague and that it imposes impossible compliance burdens on small businesses.

But here's the thing: New York City's Local Law 144 already imposes virtually identical requirements — and it's been in effect since July 2023.

NYC Local Law 144: The Law You Might Already Be Violating

If your business has 4 or more employees in New York City and uses any automated employment decision tool (AEDT) to screen, rank, or evaluate job candidates or employees, Local Law 144 applies to you.

An AEDT is defined broadly: any "computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence" that issues simplified output — scores, classifications, recommendations — used to screen or rank candidates. That includes:

  • Resume screening software (HireVue, Workday, Greenhouse with AI ranking enabled)

  • Applicant tracking systems with AI-powered candidate scoring

  • Video interview analysis tools

  • Automated reference check platforms

The requirements are specific:

Annual bias audits. You must have an independent auditor evaluate your AEDT annually and publish the results publicly. The audit must calculate selection rates and impact ratios by sex, race/ethnicity, and intersectional categories.

Candidate notification. At least 10 business days before using an AEDT to evaluate a candidate, you must notify them, in the job posting or on your website, that an AI tool will be used, what job qualifications and characteristics it will assess, and how they can request an alternative selection process.

Public posting. Audit results must be published on your website and remain posted for at least six months after you last use the tool.

Penalties. First violation: $500. Each subsequent violation: $500 to $1,500, and each day a noncompliant tool is used counts as a separate violation. The city's Department of Consumer and Worker Protection (DCWP) handles enforcement.

Who Is Actually Exposed in NYC

Here's where small business owners need to pay attention: the law doesn't just apply to companies buying fancy enterprise AI tools. It applies to anyone using third-party software that includes AI-powered candidate screening — even if you didn't know the AI was there.

Most modern applicant tracking systems — including platforms commonly used by small and mid-sized businesses — have AI ranking features that are often turned on by default. You may be using an AEDT without knowing it.

Check your current hiring software. Log in to your ATS and look for any features described as "AI-powered," "smart screening," "candidate scoring," "automated ranking," or "fit score." If any of those are enabled, you may be subject to Local Law 144.

Who's enforcing. The DCWP has been in active enforcement since late 2024. Job applicants can file complaints directly, and civil rights organizations have been monitoring compliance closely.

What to Do Right Now: A Five-Step Compliance Checklist

1. Inventory your hiring tools.
List every platform your business uses in hiring: job boards, ATS platforms, interview scheduling software, background check vendors. Identify which ones use AI to make or influence decisions about candidates.

2. Read the terms.
For each platform, search for documentation on AI-powered features. Most vendors have disclosure pages. Ask your account rep directly: "Does this tool use AI to score or rank candidates?"

3. Contact your vendor about bias audits.
Reputable platforms — Greenhouse, Lever, Workday — either conduct their own bias audits or provide audit-ready documentation you can use. If your vendor can't provide this, that's a red flag.

4. Add disclosure language to your job postings.
Every job posting that uses an AEDT must include a notification. Draft a standard disclosure that covers: (a) that an AI tool is used, (b) what data it uses, and (c) how candidates can request a human alternative. Add it to your standard job posting template.

5. If you're using AI in hiring and haven't audited: pause and get advice.
The fines are real, and class action exposure is growing. If your business has been using AI hiring tools for more than a year without conducting a bias audit, consult an employment attorney before your next hire.

Why the xAI Lawsuit Actually Matters for NYC

Even if Musk's lawsuit succeeds in killing Colorado's law — and legal observers give it mixed odds — it will have zero direct effect on Local Law 144. The NYC law is a city ordinance, not a state statute, and it stands or falls independently.

But the litigation matters for a different reason: it signals that AI companies plan to fight these laws aggressively. That means enforcement is coming before reform. If you're using AI in hiring and haven't gotten compliant yet, the regulatory window is closing.

The Colorado fight also previews a battle coming to Albany. New York State is currently drafting its own AI accountability legislation — broader than the city's law, covering housing, credit, and more. If it passes, compliance will become even more complex.

The Bigger Picture

xAI argues these laws will hurt small businesses. But the actual small business risk right now isn't compliance costs — it's getting caught using a tool you didn't know was regulated.

Most violations NYC will see in the next year will come from businesses that didn't know they were using an AEDT. The software vendor enabled it by default, the owner never checked the settings, and now there's a complaint on file at the DCWP.

That's a fixable problem. It takes an afternoon to audit your hiring stack, update your job posting templates, and reach out to your ATS vendor for documentation.

The fine for not doing it: up to $1,500 per day, per violation.

Do the math.

The Metro Intel covers NYC business, housing, and local policy. Forward this to a business owner who needs it.
