Most people have at least a vague idea that giant technology companies such as Facebook and Google mine their personal details to drive businesses worth billions of dollars. Yet, judging by the almost universal acceptance of the practice, few see this as a cyber threat.
For researchers at the Jeff Bleich Centre for the US Alliance in Digital Technology, Security and Governance (JBC), it is just one of a suite of issues raised by recent and emerging technologies that threaten our security, health and even our country’s future.
“I'm interested in the strategic and social implications of emerging digital technologies,” says Dr Maryanne Kelton, a Senior Lecturer in International Relations at Flinders University and a research leader for the JBC.
“There's been a lot of work on the cyber technical threats such as ransomware and malware, but the work that we do looks at the social and the strategic implications.
“We analyse the effects on individuals, societies and political systems, and the effects for national security of these threats to democracy. We also consider the effects these technologies have on young people, and what that might mean when aggregated up into a national security question.”
One particular problem Kelton sees for countries such as Australia, with no homegrown information company, is that we do not have control of our own data.
“We are effectively dealing with an information space that is operated and controlled by big international companies, trans-nationally,” she says.
It is also an issue that troubles Dr Zac Rogers, a Research Lead at JBC. He is currently collaborating with Defence to explore the impact of the human/computer interface on Australia’s internal and external security, national interests, defence planning, and strategy.
“As a country, we are a small data sample but we're also a data consumer,” he says. “We don't have the companies that hoover up most of this data, package it, process it, and sell it, so we have no provenance over the way they do that.”
But for Rogers the problem starts with our understanding of the technology itself.
“Artificial intelligence was a term invented by the field of computation back in the 1950s, when they needed to attract funding. The people who actually work in machine learning don't use these terms.
“But by using the term AI, we’ve allowed data science to be conflated with data commerce – the businesses that have been financialised over the past 15 years. Data science is slow, not very sexy, and takes a long time to reveal its findings.
The Digital Wild West
“The conflation between data science and data commerce in the minds of the public is almost irretrievable – even in academia, and in parts of the military and governments. They’ve all listened too much to vendors and not to scientists.
“It’s the data scientists who are saying, ‘put the brakes on this’.”
Machine learning works well when dealing with systems that behave like machines, but becomes problematic when dealing with human beings, society and social systems, Rogers says.
Because machine learning systems are only as good as the data they are fed, the quality and fairness of that data become vital. And, Rogers argues, we have little visibility into either.
“The algorithms are black boxes owned by corporations and we don't get to look in them. But through analysis we have discovered that the data these companies are using is problematic.
“Say an insurance firm wants to understand the risk profile of a certain class of consumer – the likelihood that they might become unwell.
“They get data from a huge ecosystem of data brokers who buy and sell these products and profiles and scores about people, but are largely unregulated. We don't really know how they come to their assumptions.
“But the insurance company just accepts the statistics, which may say an individual has a 70 per cent risk of developing a disease.”
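Rogers’s scenario can be sketched in a few lines of code. The snippet below is purely illustrative – the broker feed, the field names and the premium rule are invented for this article, not drawn from any real insurer or broker – but it shows how an opaque, unverifiable score can flow straight into a financial decision.

```python
# Purely illustrative sketch: the broker feed, field names and premium
# rule below are invented for this example, not taken from any real system.

# A consumer profile as it might arrive from a data broker. How the
# 'disease_risk' score was derived -- which data, which model, which
# assumptions -- is invisible to everyone downstream.
broker_profile = {
    "consumer_id": "c-1234",
    "disease_risk": 0.70,  # the "70 per cent" figure, accepted on trust
}

def price_premium(base_premium: float, profile: dict) -> float:
    """Load the premium in proportion to the broker's opaque risk score."""
    risk = profile["disease_risk"]  # no audit trail, no way to contest it
    return base_premium * (1 + risk)

# A number nobody can verify becomes a real financial outcome.
print(price_premium(1000.0, broker_profile))  # 1700.0
```

The point is not the arithmetic but the missing provenance: nothing in the pipeline records where the score came from or gives anyone a way to contest it.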
Is the law keeping up?
In the face of this digital Wild West, many ask whether the law as it stands is fit for purpose to bring it under control.
Professor Tania Leiman, Dean of Law at Flinders University and a research affiliate at the JBC, says that’s the wrong question.
“The challenge is not so much whether the law’s kept up, but whether it's responding effectively. We don't want the law to be running too far ahead, because that usually means that regulators are making bets on technology that's still emerging – and they may not bet on the right technologies,” she says.
We need to work out whether we are regulating particular technologies or whether we are trying to regulate for outcomes, Leiman says.
“What we see historically is that when new tech arrives, it does prompt new regulatory responses. The industrial revolution, for example, was the genesis of a whole lot of new types of legislation.
“We also have to think carefully about who the legislation is benefiting (and why!) – is it the owners of the technology who seek to profit from it, or the people who are likely to be adversely impacted by it?”
A greater problem is jurisdictional, she says, with multinational giants such as Facebook and Google able to navigate around laws made in a single country.
“But I don't think that's a reason not to have some firm boundaries,” says Leiman. “Not every organisation is a big multinational and regulatory boundaries can be really important in guiding behaviour of smaller organisations.
“We need much greater public discussion so people can understand the implications of this. It’s really complex. The challenge is that any regulatory framework might ask people to give up something to ensure we can introduce some protections. So far, what we've seen is that people are generally voting in favour of convenience – and often without realising that’s what they are doing.”
Melissa de Zwart, Professor in Digital Technology, Security and Governance at the JBC, believes democracies need to band together to deal with the problem, while admitting it is not going to be easy.
“I'm not suggesting that we necessarily need to have some kind of international treaty, but I think we should have a coordinated legislative response to require these companies to be more responsible about their content,” she says. “Regulation really needs to be directed towards altering the business model that currently encourages negative and destructive content.”
But will our legislators find the backbone to stand up against the tech giants? Apart from ensuring we are fully informed of the short and long-term costs, including that of convenience, Dr Kelton suggests that being directly affected – say by trolling or misinformation – may spur legislators to address the issue, as recent defamation cases against Twitter users might suggest. Legislators and policy makers do have agency.
But in the end it may be individuals who drive the change.
“Most of us exist in families and many have young children,” says Kelton. “Once we recognise some of the problems these technologies are causing, particularly in terms of wellbeing or mental health, that might be quite motivating.”