Jay’s Story: A Zip Code, A Denied Loan, A Pattern Repeated

6/1/2025 · 1 min read

Jay owns a local repair shop and wanted to expand. Solid credit. Steady income. Responsible financial history. But when he applied for a small business loan, the automated approval system flagged him as “high risk.”

Why?
Because Jay lives in a zip code with a high concentration of minority-owned businesses—an area historically subject to redlining and loan discrimination. The AI model, trained on decades of biased lending data, internalized that “risk” without understanding its origins.
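To make that mechanism concrete, here is a minimal, hypothetical Python sketch (not the lender's actual system, and using made-up synthetic data) of how a model trained on historically biased approval decisions, with zip code as an input, can learn to score two financially identical applicants differently by neighborhood:

```python
# Hypothetical illustration only: a toy model trained on synthetic "historical"
# approvals that were biased against a redlined zip code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Synthetic applicants: a genuine creditworthiness signal and a zip-code flag.
credit_score = rng.normal(650, 60, n)    # real signal of repayment ability
redlined_zip = rng.integers(0, 2, n)     # 1 = historically redlined area

# Historical labels reflect past discrimination: applicants from redlined
# zip codes were denied more often at the SAME credit score.
logit = 0.02 * (credit_score - 650) - 1.5 * redlined_zip
approved = rng.random(n) < 1 / (1 + np.exp(-logit))

# Train on the biased history, with zip code included as a feature.
X = np.column_stack([credit_score, redlined_zip])
model = LogisticRegression().fit(X, approved)

# Two applicants with identical finances, different zip codes.
same_credit = [[700, 0], [700, 1]]
print(model.predict_proba(same_credit)[:, 1])  # approval odds drop for the redlined zip
```

In this toy setup, the model is never told anything about race or redlining. It simply learns that the flagged zip code predicted past denials and carries that pattern forward.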

Bias Insight:

  • The AI penalized Jay not for who he was, but for where he lived.

  • It replicated structural inequality without context.

  • It perpetuated financial exclusion.

This kind of algorithmic decision-making compounds historical harm. It locks out deserving individuals based on shadows of the past—turning inequality into code.