The Australian Human Rights Commission is trying to work out the good and evil of technology


Technology has a light and a dark side, and can be a force for good or evil: autonomous weapons making kill decisions without human intervention on one hand, drones flying into disaster zones to deliver life-saving medical supplies on the other.

The Australian Human Rights Commission today launched a major project on human rights and technology, publishing an issues paper.

“Facial recognition technology, artificial intelligence (AI) that predicts the future, neural network computing … these are no longer science fiction,” says Edward Santow, the Human Rights Commissioner, launching the Human Rights and Technology Issues Paper.

“These developments promise enormous economic and social benefits. But the scope and pace of change also pose profound challenges.

“Technology should exist to serve humanity. Whether it does will depend on how it is deployed, by whom and to what end.”

It has been suggested that new technology is changing what it means to be human.

Economic inequalities may emerge with advances in technology. Jobs will be lost, and new ones created, with the rise of robotics. Some industries will rise and others fall as technology disrupts concentrations of economic power.

“There are numerous examples of the convergence of technology with human beings, ranging from the extraordinary to the mundane,” says Santow.

“Whether it be deep-brain stimulation aiming to treat degenerative brain conditions, or robots that behave like us, it is increasingly difficult to draw a bright line between the physical and digital worlds.”

The Issues Paper explores the rapid rise of new technology and how it affects human rights in Australia.

The project will consider how law, policy, incentives and other measures can be used to promote human rights in a new era of technological development.

Last week a who’s who of CEOs, engineers and scientists from the technology industry signed a pledge against the development of lethal autonomous weapons.

Among them, Toby Walsh from the University of NSW who said: “We cannot hand over the decision as to who lives and who dies to machines. They do not have the ethics to do so. I encourage you and your organizations to pledge to ensure that war does not become more terrible in this way.”
