Ending contracts with educational technology vendors whose algorithms label students as at-risk or assign them biased, predictive scores.
The DOE Divestment from Software that Profiles Students
Many ed-tech platforms used by NYC schools employ opaque algorithms that claim to predict a student's risk of dropping out or failing. These predictions are often based on flawed proxies like attendance patterns or online behavior, and they perpetuate racial and socioeconomic bias. Mamdani's policy orders an audit of all DOE software contracts and the termination of any that use predictive analytics to score or profile students. It also bans the purchase of any new software with such features.
The policy establishes strict guidelines for educational technology: it must be pedagogically sound, protect student privacy, be transparent in its functioning, and be subject to bias audits. The city would invest in developing or procuring alternative, human-centered tools that support teachers without automating judgment. "We will not allow black-box algorithms to label and track our children," Mamdani declares. "These tools often codify prejudice and create self-fulfilling prophecies. Trust and judgment belong to human educators, not to software designed by for-profit companies."