From the Editor's Desk
How Machine Learning Pushes Us to Define Fairness

Bias is machine learning's original sin. It is embedded in machine learning's essence: the system learns from data, and so is prone to picking up the human biases that the data represents. For example, an ML hiring system trained on existing American employment data is likely to "learn" that being a woman correlates poorly with being a CEO.
Cleaning the data so thoroughly that the system will discover no hidden, pernicious correlations can be extraordinarily difficult. Even with the greatest of care, an ML system might find biased patterns so subtle and complex that they hide from the best-intentioned human attention. Hence the current, and necessary, focus among computer scientists, policymakers, and anyone concerned with social justice on how to keep bias out of AI.
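As a rough illustration of the point above (not part of the original editorial), the toy sketch below, assuming Python with NumPy and scikit-learn and entirely synthetic data, shows how a classifier trained on historically skewed hiring records picks up the group attribute as a predictor, and how a simple demographic-parity check can surface the resulting gap in selection rates.

```python
# Toy sketch: a model trained on historically biased hiring data reproduces
# that bias; a demographic-parity check makes the gap visible.
# All data and numbers here are synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic "historical" data: a group attribute (0/1), years of experience,
# and hiring outcomes that were historically skewed against group 1.
group = rng.integers(0, 2, n)
experience = rng.normal(10, 3, n)
qualified = experience + rng.normal(0, 2, n) > 11
hired = qualified & ((group == 0) | (rng.random(n) < 0.4))

X = np.column_stack([group, experience])
model = LogisticRegression().fit(X, hired)

# The model "learns" the historical skew: the group attribute gets a nonzero
# weight even though it has no bearing on competence in this toy setup.
print("learned coefficients (group, experience):", model.coef_[0])

# Demographic-parity check: compare predicted selection rates across groups.
preds = model.predict(X)
rate_0 = preds[group == 0].mean()
rate_1 = preds[group == 1].mean()
print(f"selection rate, group 0: {rate_0:.2f}")
print(f"selection rate, group 1: {rate_1:.2f}")
print(f"demographic-parity gap:  {abs(rate_0 - rate_1):.2f}")
```

Demographic parity is only one of several competing definitions of fairness, which is exactly the tension the title points to: deciding which definition should apply is a human judgment, not something the model can settle on its own.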