You can also jump onto the hashtag #SomethingDigital or #NightNomads, or find us under @SD_BNE on Twitter to grab some more meaty insights.
>> On the ‘dark side of digital’
- The dark side of data isn’t about the data itself. It’s about the human element.
- As we move rapidly in our digital advancements, we’re not effectively considering the risks and we’re not putting ourselves in the shoes of an adversary.
>> On the rapid development of AI + risks
- As AI comes online, this is only increasing the risk [of a cyberattack], because you’re providing hackers with a data-rich environment.
- We know that software is eating the world, and now AI is eating data.
- AI is increasingly learning and taking on new functions, but it is also introducing new risks into every digital system, e.g. data bias, adversarial attacks, performance risks and explainability risks.
>> On a ‘trust revolution’
Davey introduced the term ‘trust revolution’ to the discussion, arguing that as we move into the artificial intelligence stage of our technology, we need a new kind of trust that goes beyond cybersecurity alone.
- I believe trust will be a strong pillar of our new coming [digital] age … we need to move from cybersecurity to cybertrust, in a way that doesn’t create a robot dystopian future.
>> On the accountability of our tech
When it comes to tech and the intersection of data, privacy and security, we are still very much in a grey area, as Davey pointed out:
- What matters more to me isn’t just what happens in Brisbane or in NY. What matters is what happens when the tech is taken out of our hands and applied in places where it’s not just our quarterly outcomes but where it’s life and death.
- As our systems become more and more autonomous, they are becoming less and less accountable for their outcomes.
- When it comes to AI we need to focus the research: for an agency to use AI, it also needs to be accountable for it.
>> On the use of our data
We’re all aware that our data is being used by XYZ, but we are not taking as much action as we could to protect it or to limit how companies use it.
- We are still waking up to how much we explicitly trust corporations with our data. We think that the government should have less, but we’re willing to give companies more.
- We can’t just ignore the fact that our data is being productised and therefore we are being productised.
- According to companies, you’ve read their terms and conditions [for the use of your data] … I don’t think people really understand how much info they’re giving out or how it’s being used.