Artificial Intelligence, or AI, is increasingly becoming a part of daily modern life.

But so too is the difficulty in distinguishing where the real world ends and AI begins.

UTV’s Up Close has investigated whether it’s something to be feared or embraced.

AI is often associated with ‘machine learning’ – the ability of a computer programme or machine to learn from data without being explicitly programmed with step-by-step commands.

It’s used across a multitude of sectors.

For example, researchers at Queen’s University Belfast are using the technology to develop mini submersibles that are learning the difference between debris and ocean life with the aim of eventually decluttering underwater environments.

Prof Iain Styles told Up Close that machines need to be shown “lots of examples of the thing we want to do and then they learn how to do that without further information from us”.

“They learn differently to us, so they typically need to see many more examples of the thing we need them to do. Once they can do it, they are quite focused on the task,” he added.

While there may be widespread agreement that AI can be a force for good, what if it’s not properly regulated and controlled?

A more sinister side of the technology is that it can be used to create ‘deep-fakes’ – appropriating someone’s likeness and voice to disseminate false information.

UTV’s Up Close team created their own ‘deep-fake’ of investigative journalist Niall Donnelly.

The technology wrote a script, copied his voice and made a moving image out of a still picture.

It took less than an hour to do, and the results were pretty convincing. The ease with which these so-called deep-fakes can be created is concerning for those working in online safety.

Child Protection Specialist Jim Gamble told Up Close that AI can cause difficulties for law-enforcement when trying to catch predators.

“A predator who has abused a child can be involved in that abuse themselves. They can capture that on film so they can revisit it,” he said.

“They can use artificial intelligence to change their appearance and the appearance of the child, and that makes it much, much more difficult, when it goes into circulation, to identify who the predator is, to locate and hold them to account, but critically to identify who the real child is that’s been used in that image which has been transformed, so you can find them and rescue them from that abuse.”

Mr Gamble said law enforcement needs to harness AI in a way that turns it back on the criminals.

“What you want to do is undermine their confidence in their own competence and their own ability to go online, because at the minute they think they can go on and do anything.

“The chances of being caught are far too low, so using this technology we could actually accelerate law enforcement’s ability to infiltrate these groups and engage in these networks and identify and locate the people behind it.”

As far as regulation is concerned, AI ethicist Nel Watson said: “We urgently need to have a global effort to make sense of this technology in the best way that we possibly can, infusing the very best of our humanity into these machines so that they in turn can help uphold our own humanity.”

If you have been affected by any of the issues in this article, please contact:

Women’s Aid

Nexus

Childline
