Over the past several years, ethical & responsible artificial intelligence has come into increasing focus within the technology industry. The main reason is that we are learning that machine learning can produce all sorts of unintended consequences. For example, Amazon worked for many years to design an artificial intelligence based hiring algorithm, but the project had to be scrapped when the team could not stop it from discriminating against women.
What is artificial intelligence?
What is machine learning?
What does ethical & responsible artificial intelligence plan to solve?
By combining the previous definitions, we can see the challenge. If a machine learning algorithm is trained on a flawed data set, the resulting artificial intelligence will produce erroneous results. For example, an employer may find that its interview team is biased against women. When that bias is aggregated at scale, an artificial intelligence program may decide to interview fewer women, or none at all, because their interviews have not typically resulted in hiring decisions.
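To make the mechanism concrete, here is a minimal, hypothetical sketch of how this can happen. All of the data, group names, and functions below are invented for illustration: a toy "model" simply learns the historical hiring rate for each group and then recommends interviews based on that rate, blindly reproducing whatever bias is in the records.

```python
# Hypothetical illustration: a toy model that learns from biased
# historical hiring outcomes. All data and names here are invented.
from collections import defaultdict

# Historical records: (group, was_hired). These outcomes reflect a
# biased interview process, not candidate ability.
history = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", False), ("group_b", False), ("group_b", True),
    ("group_b", False),
]

def learn_hire_rates(records):
    """Compute the historical hiring rate per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
    for group, hired in records:
        counts[group][0] += int(hired)
        counts[group][1] += 1
    return {g: hired / total for g, (hired, total) in counts.items()}

def should_interview(group, rates, threshold=0.5):
    """Recommend an interview only if the group's past hire rate
    clears the threshold -- reproducing the historical bias."""
    return rates.get(group, 0.0) >= threshold

rates = learn_hire_rates(history)
# group_a (rate ~0.67) gets interviews; group_b (rate 0.25) is filtered
# out entirely, even though the low rate came from biased interviewers.
```

The "fix" is not in the code at all: the logic is correct given its inputs, which is exactly why a flawed data set quietly becomes a flawed decision system.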
The next generation is going to design the bulk of the products and services that we use on a daily basis. While the current generation was not aware of these concerns and is now trying to re-design systems to account for them, the next generation can have an ethical and responsible AI mindset from the start.
They will be able to understand the potential ethical and moral implications of any technology products and services that they design. In addition, they can serve as a safeguard when they enter organizations, making sure that leadership takes a stand on this important issue.
With the next generation having a mandate to build technology-based products and services that will change the world, they must also make sure that their innovations and inventions do no harm to society.