Landlords are increasingly turning to AI programs to screen prospective tenants. The prevalence of incorrect, outdated, or misleading information in such reports is increasing costs and barriers to housing, especially among people of color.
Yeah, computers are not flawless, and AI is no exception. It's also subject to the usual biases that can be introduced into a system, either directly through the programming or through the datasets used for training. I recall that a few years ago Google's image recognition technology was mistakenly labeling Black people as gorillas, reportedly because the tech hadn't been tested on a diverse enough range of people. Computers and AI can be useful tools, but people need to keep those limitations in mind when using them.