- Michael Feindt is the strategic advisor at Blue Yonder, a leading digital fulfillment and supply chain solutions provider. With a background in physics and data science, he has played a key role in embedding AI into Blue Yonder's customer supply chain and merchandising processes.
- Feindt shares his thoughts on how AI can be harnessed to eliminate human biases in the workplace and at various pain points in the supply chain.
- Because of his work, Business Insider named Feindt to our annual list of the 10 leaders transforming supply chain in Europe.
- Visit Business Insider's Transforming Business homepage for more stories.
Whether in response to government legislation or a sense of moral duty, in recent years a growing number of businesses have sought to eliminate as much internal discrimination as they possibly can.
Human-made decisions tend to be inherently unfair: not deliberately "wrong," but unfair all the same. This happens because we live in a world where everything is not equal, so it makes sense that data reflecting activity in this kind of environment will show bias.
If society is biased, naïve AI will be too
It's an unfortunate fact of life that bias is part of everyday society. Higher earners pay higher taxes, which is not balanced, but many people would still say it is fair. However, that same high earner wouldn't then be charged more for a loaf of bread, so we have an inconsistency in terms of expenditure proportionate to income. This is the way the world works, and some would argue it is unfair. Early AI made decisions based on data produced by this world, so it will make future decisions based on this historical bias, ensuring the cycle continues.
Hiring patterns have been an area of particular concern, with some high-profile cases hitting the headlines recently. Many businesses are now trying to make this fairer by revising their hiring policies and setting themselves numerical quotas, but algorithms won’t help them to do this unless they change the information that is being fed into AI in the first place. If it’s not guided to become fairer, it will find the best candidates, but not factor in issues around gender or race as part of its decisions.
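The point about changing the information fed into AI can be made concrete. A minimal, purely illustrative sketch (the field names and candidate record are hypothetical, not any real hiring system) shows the simplest intervention a business might try: excluding protected attributes from the features a model ever sees.

```python
# Hypothetical sketch: stripping protected attributes from candidate
# records before they reach a hiring model. All field names are invented.

PROTECTED = {"gender", "race"}

def strip_protected(candidate: dict) -> dict:
    """Return a copy of the candidate record without protected attributes."""
    return {k: v for k, v in candidate.items() if k not in PROTECTED}

candidate = {
    "years_experience": 7,
    "degree": "MSc",
    "gender": "F",
    "race": "white",
}

features = strip_protected(candidate)
print(features)  # {'years_experience': 7, 'degree': 'MSc'}
```

Note that this alone does not make the system fair: other fields can act as proxies for the removed attributes, which is why the article argues the model must also be explicitly guided.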
Can we make AI more fair?
So is it actually possible for businesses to develop algorithms that can pull fair decisions out of data that is inherently unfair? To put it simply, they must find a way to provide AI with the right data inputs and instruct it to behave as ethically as possible: ignoring and unwinding historical biases, and confidently leaving the business's past behaviour behind.
This sounds like a tall order, but it is well within the realms of possibility to adapt the instructions that are given, thereby removing discrimination. This bounces the issue back to the business, as it is completely down to them to decide exactly what "fair" is. It's not enough to have a general feeling of what is "right"; businesses need to give exact quantities and measures for what they want AI to do.
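One way to turn "fair" into an exact quantity, sketched below with made-up numbers, is to compare selection rates between groups. The 0.8 threshold used here is the "four-fifths rule" from US hiring guidance; it is one possible measure a business might adopt, not something the article prescribes.

```python
# Illustrative sketch: an exact fairness measure for a hiring outcome.
# The decision lists are invented data, not real hiring records.

def selection_rate(decisions):
    """Fraction of candidates in a group who were selected (True)."""
    return sum(decisions) / len(decisions)

group_a = [True, True, False, True, False]    # 3 of 5 selected -> 0.6
group_b = [True, False, False, False, False]  # 1 of 5 selected -> 0.2

# Ratio of the disadvantaged group's rate to the advantaged group's rate.
ratio = selection_rate(group_b) / selection_rate(group_a)
print(f"selection-rate ratio: {ratio:.2f}")

if ratio < 0.8:  # the "four-fifths rule" threshold
    print("flag: outcome fails the four-fifths rule")
```

A business that has decided what "fair" means can then feed a measure like this back into its process as a hard constraint rather than a vague aspiration.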
When you think about it, this is a much more difficult undertaking than it might first sound. Do you actually want the same number of male IT professors as female ones if there have been five times as many men studying IT? Or do you look further back into the mists of time and consider the fact that women were not encouraged into the world of IT at a younger age?
While two-thirds of medicine students in Germany are women, almost three-quarters of electro-engineering students are male. Should we be telling AI to aim for an equal gender split among these professionals, or does it need to factor these social differences into its thinking? There's a lot to take into consideration here, and businesses cannot rush this kind of decision.
It may take some time and soul-searching, but this will be well worth their while when businesses start to see AI driving them towards fairer decisions. Key areas where people often experience discrimination, such as hiring and pay, will be levelled out.
Although we've seen examples of them maintaining the status quo in a bad way, data and algorithms actually have the power to wind back cultural imbalance and create a fairer world. Although the businesses involved may not have seen it that way, this year's AI hiring headlines and subsequent protests can serve as a powerful wake-up call for us all: simply put, it's down to us whether AI is a force for good or for ill. If you provide it with data and instructions designed to shape the world in a certain way, AI will do that. So if businesses are willing to put in the time and effort to set things on a fairer course, AI can set about fighting discrimination and injustice.