CPRC CEO Lauren Solomon's opening response at the 2019 University of Melbourne Digital Citizens Conference
Thank you so much to University of Melbourne for hosting such an important and timely conference.
I'm very humbled and, I must admit, a little intimidated by the collection of brains and personalities in the room today, many of whose work I've been following for quite some time!
Thank you for the opportunity to participate in such an exciting and ground-breaking event.
What I thought I'd do in responding to my wonderfully thoughtful and experienced panellists is pick up on two key themes that both raised in their presentations, and reflect on what these have meant to us at the Consumer Policy Research Centre as we go about our work.
And then provide some reflections on public policy change more broadly.
So, spoiler – I'm not an expert on human rights or privacy. I'm possibly what could be best described as a lapsed resource economist who's spent a fair bit of time looking into those humans that sit behind the demand curves:
- How people practically experience the market and the choices they’re presented with;
- What this means for how they exercise autonomy; and,
- Ultimately what impact this has on their welfare.
So perhaps it initially surprised me how much we’ve had to be open to these other fields of human rights and privacy in our work on digital markets.
It’s so tempting to just jump right in and look at this through the prism of a normal supply chain…
See we’ve got this resource (data) and it’s being mined from consumers and processed by companies. Value is being extracted and shifted from one part of the economy to another, and we just need to cleanly define the resource and come up with a way to value it and see how that evolves at different stages of the supply chain. Right?
Well not really. Because this resource is inextricably intertwined with a whole bunch of other stuff – the resource actually contains things fundamental to other aspects of our society – and indeed our very being.
Some would argue our data IS us.
Transparency is also important for policymakers and regulators to design effective interventions. If we don’t understand the data ecosystem, then how can we hope to intervene effectively to improve things, as opposed to inadvertently making them worse?
We’re hopeful reform in this area will shine a better light on data practices, to bring our policy frameworks into the digital age because ultimately transparency is going to be central to more informed and empowered consumers and more effective policy and regulatory solutions.
Accountability and explainability
Both Ed and Sven touched on the emerging issues in AI when it comes to explainability and accountability, both of which are linked to transparency.
Data-driven decisions are increasingly being made about us without any transparency, and without any accountability as to how or why those decisions have been reached.
Let’s just boil this down in simple terms:
"Hey WizardScoresU – why did you decide I've got an online gambling habit and bump me 10 points on my risk profile, so now firms are likely to charge me more for everything because they see me as a credit risk?"
"Because WizardScoresU says so, and he's always right."
"But why? I've never gambled, don't have any intention to, and don't have a family history of it."
"The WizardScoresU uses data, and the data never lies."
"Well, can you at least tell me what I can do to fix it?"
"The WizardScoresU technology is our secret sauce; you can add more data and try again later."
Now I reckon that to most people that exchange sounds like nonsense.
People don’t react well to being judged at the best of times, let alone by a faceless system that they can’t make any sense of.
And while this technology might be used to make decisions about what products we buy, as Ed highlighted… it can also impact who ends up in jail… or gets prioritised for life-saving treatments… or pursued by government debt collectors.
I’m not sure how “computer says no” might go down at future Senate Inquiries as an explanation for major decisions of government either.
As a society we really should be questioning whether we are OK with these automated decisions being made without an explicit obligation to explain how a decision was reached, an ability to correct the record if the data is wrong, or a way to have the decision challenged.
Explainability is going to be one of those fundamental planks to build trust in AI and machine-learning technologies going forward.
The role of consumer and community expectations in public policy
I think it is entirely fair, appropriate and necessary for more people to start asking questions about what sort of digital world we want to live in.
Technology is undeniably having a major impact – on the way we see and engage with the world around us.
Technology in many ways acts as a filter. A filter that shapes our experience and our engagement with others and with markets. Whether we’re talking about multi-sided platforms, digital personal assistants or accessing the daily news, our experience is increasingly being shaped by outside forces.
That experience can be shaped in a way which improves our welfare or detracts from it.
Those deploying technologies are the ones with the key to how that filter works, and often they don’t even fully understand the intricacies.
When we think about consumer markets, the fact that those actors hold the keys to that filter gives them an awful lot of power – and that should give us reason to pause.
Are we OK with initiatives like we've seen in the UK, with Amazon gaining access to public NHS records to deliver in-home, personalised health advice?
Are we ok with technologies being developed with the sole objective of replicating aspects of humans?
In May this year, AI firm Dessa announced they'd created the first synthetic voice of podcaster Joe Rogan. It talks like Joe, sounds like Joe, and uses Joe's turn of phrase. If you engaged with it, you'd think you were talking to Joe.
They were pretty open about this ground-breaking technology and the risks it posed – for national security, for deep-fakes influencing elections, for consumer scams.
All Dessa said they need is enough voice data, and a synthetic voice can be generated for anyone… Look out, Caron Beaton-Wells, you might be next!
We’re playing a big game of catch up here.
Governance frameworks need to be designed in a way that adequately addresses both the good and bad potential uses of data and technology.
Blind faith in either the tech-evangelist future or the Black Mirror future will only do society a great disservice.
In fact, blind faith in any one discipline to solve these emerging problems is just as problematic.
Now is a time to build greater shared understanding across the disciplines, to listen to the issues being raised by others.
But perhaps more importantly, to listen to what the community and consumers are saying they want and need out of all of this, because in the long term, technology will be rejected if it does not have the trust of the public.
What consumers told us through our research was that:
- Consumers value their privacy – 80% or more of consumers didn't want their phone number, contacts, messages or phone ID to be shared with third parties, and 91% wanted companies to collect only the data necessary for delivering the service.
- Consumers want more control – 95% wanted options to opt out of certain types of data being collected and shared.
- They want their data to be used fairly – 88% of consumers did not agree with companies giving them a different price from other consumers based on their online browsing or payment behaviour.
As someone who has spent most of their time in and around public policy, there is one observation I'd make:
Wherever consumer and community expectations fall out of alignment with their experience of firms and markets, public pressure grows, and governments eventually intervene.
And what we've seen in Australia is what's been called a rolling thunder of interventions in response to market failures – Royal Commissions into banking and aged care, and energy inquiries at state and national level.
Rapid-fire interventions have been deployed attempting to rebalance some of the power in those markets back towards consumers after years of mistreatment and in many cases exploitation.
We believe continuous and active vigilance in the monitoring of consumer outcomes and experiences in markets by policy and regulatory bodies is essential if we are to have sustainable policy frameworks in place, no matter whether we’re talking about banking, telecommunications, AI or data and information markets.
We also need space within the policy community to openly discuss the opportunities and risks of emerging technologies and markets. To develop clearer shared understanding of the different factors at play.
At the consultation session hosted by Ed and the Human Rights Commission a few months ago we saw the World Economic Forum representative make the observation that Australia was one of the most fractured jurisdictions in terms of our collective understanding of AI and technology issues.
From a public policy perspective, at CPRC we think that’s deeply concerning and disappointing.
And it’s part of the reason why our research has focussed on the benefits of developing a more coherent, economy-wide framework for how data governance works in Australia, encouraging policymakers to better tie together disparate strands of work around a Consumer Data Right, the ACCC Digital Platforms Inquiry and Public Data Sharing and Release Legislation.
We’ve also taken an active role at CPRC in connecting the sectors and disciplines through our funding of research partnerships:
- With a University of Melbourne team exploring consumer profiling and new data collection technologies.
- With a Queensland University of Technology team pursuing a proactive and positive Data Care agenda to give consumers greater autonomy.
- And with the Human Rights Commission to explore bias and discrimination in data processing.
We have brilliant minds around the country and many are gathered here in this room today.
I guess the bad news is… no one is coming to save us. The good news is… no one is coming to save us.
We collectively get to decide this future – by working together, by taking an active interest in what that future looks like, by asking tough questions, and by being open to changing our perspective.
And by taking responsibility for engaging more with consumers and the community on these issues.
Technology is moving fast, and most people in their daily lives simply do not have the time to sit down and deeply ponder these issues – they're trying to get the kids to school; juggling caring for ageing parents; struggling to meet spiralling basic living costs while wages stagnate; and trying to remain connected to others, often online.
If technology can help them do all of that a bit easier, then it’s completely rational behaviour to accept short-term benefit in the face of opaque longer-term risks.
This isn’t about placing all of the responsibility and blame on consumers and citizens.
It's about the policy community developing and fostering ways for technology to make our lives easier in ways which also improve our welfare – both in the short and the long term.
I think that's a future that we can and should fight for. And I'm sure many of you here today are going to be part of that journey. I wish you all the best for the many bright ideas, deliberations and debates over the coming days, and we look forward to being a small part of that big conversation.