Talking data, trust and security with Davey Gibian

Ahead of our 3rd and final Night Nomads event “The simplicity of data weaponisation”, we caught up with our International Keynote speaker (and all-round legend) Davey Gibian.

A specialist in data science, cybersecurity and geospatial systems + AI, Davey has a career spanning the White House, Silicon Valley, Wall Street & active war zones. So we couldn’t think of anyone more qualified to discuss with us all things data + privacy + security.

Keen for a sneak peek? Read the intriguing Q&A with Davey below on data weaponisation, data security, and WHY we need new language to discuss cybersecurity in an AI-enabled age … and then join us on March 26 to hear from him in person!

– – – – – – – – – – – – – – – – – – – – – – – – – – – –

>> Question: Data was always our saviour, helping us build better products, market efficiently and be more relevant. How did it suddenly turn into a weapon? 

The notion that data was suddenly turned into a weapon is a fallacy.

We tend to think of advances in tech as primarily serving good. But data is just like any other tool. The bow and arrow is as good for hunting as it is for war, and nuclear technology is as useful for generating power as it is for planetary annihilation. In the world of data, the ability to create more nuanced targeting and propaganda goes hand in hand with the ability to sell products better.

So data did not suddenly become used for malign intent; it’s simply that malign actors finally learned how to use it. And the scary thing is, this is only the beginning. Code libraries and purchasable datasets are making it easier than ever to use massive data tools, for both good and bad.
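
To give a feel for just how low that barrier has dropped, here is a minimal, hypothetical sketch: with nothing more than off-the-shelf Python libraries (pandas and scikit-learn), a profile dataset can be carved into target segments in a dozen lines. The column names and values are invented for illustration, and the same snippet serves a marketing team and a propaganda operation equally well.

```python
# Hypothetical sketch: segmenting a profile dataset with off-the-shelf
# libraries. Column names and values are invented for illustration; in
# practice the rows would come from a purchased dataset.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Stand-in for a purchased dataset: one row per person.
profiles = pd.DataFrame({
    "age":          [23, 31, 45, 52, 29, 38, 61, 27],
    "income":       [42_000, 58_000, 91_000, 75_000, 39_000, 66_000, 83_000, 47_000],
    "time_on_site": [12.0, 3.5, 8.2, 1.1, 14.3, 6.7, 0.9, 11.5],
    "ad_clicks":    [9, 1, 4, 0, 12, 3, 0, 8],
})

# Normalise so no single column dominates the distance metric.
scaled = StandardScaler().fit_transform(profiles)

# A handful of lines is enough to carve the audience into target segments.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
profiles["segment"] = kmeans.fit_predict(scaled)

# Each segment can now be messaged differently: to sell products,
# or for something far less benign.
print(profiles.groupby("segment").size())
```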

>> Question: How is data misuse (or weaponisation) impacting businesses today?

Let’s start by looking at the consumer side.

Our information is being weaponised using techniques originally developed for military-grade precision. At the extreme end of this, companies compile a digital profile of you. That profile feeds on itself, serving you recommendations based on whichever preferences are observed to influence you most.

This is a combination of digital surveillance and the creation of a self-fulfilling prophecy of the self. If you are only being fed information designed to affect you in a certain way, you will never learn the other side, you will never see opposing points of view, and you will never grow.

It’s the opposite of what is supposed to happen in a college classroom, where competing points of view are given weight. Data weaponisation gives companies the ability not only to surveil us, but to determine who we are going to be.

So how does this impact companies?

Well, it has good and bad effects, like any tool. On the good side, companies can better target consumers and potential customers.

On the bad side, it opens up an interesting mix of challenges. They include:

  • More targeted cybersecurity and phishing attacks against employees;
  • When competitors target the same customers with the same set of algorithms, competition intensifies, making it harder to break into new segments;
  • Compliance and legal exposure when targeting consumers, for example an AI that gives different loan rates to minority communities based on zip code (see the sketch after this list);
  • As companies store more data, they open themselves up to greater risk of cyber attacks, because the value of that data increases.
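
On the compliance point above, here is a minimal, hypothetical sketch of the kind of check a team might run before shipping such a model: it computes approval rates per zip code and flags any group approved at less than 80% of the best-performing group’s rate, echoing the “four-fifths” rule of thumb used in US disparate-impact analysis. The records and threshold are illustrative assumptions, not legal advice.

```python
# Hypothetical disparate-impact check on loan decisions grouped by zip code.
# The records below are synthetic; a real audit would use production data
# and proper legal guidance.
from collections import defaultdict

decisions = [
    {"zip": "11201", "approved": True},
    {"zip": "11201", "approved": True},
    {"zip": "11201", "approved": False},
    {"zip": "60623", "approved": True},
    {"zip": "60623", "approved": False},
    {"zip": "60623", "approved": False},
]

# Approval rate per zip code.
totals, approvals = defaultdict(int), defaultdict(int)
for record in decisions:
    totals[record["zip"]] += 1
    approvals[record["zip"]] += record["approved"]

rates = {z: approvals[z] / totals[z] for z in totals}
best = max(rates.values())

# Four-fifths rule of thumb: flag groups approved at < 80% of the best rate.
for zip_code, rate in rates.items():
    ratio = rate / best
    flag = "POTENTIAL DISPARATE IMPACT" if ratio < 0.8 else "ok"
    print(f"{zip_code}: approval rate {rate:.0%}, ratio {ratio:.2f} -> {flag}")
```

The point is not that zip codes are inherently suspect, but that they can act as proxies for protected attributes, which is exactly how the compliance risk arises.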

Generally speaking, companies have a lot to gain by using data tools more effectively. But using these tools opens them up to unexpected risks.

>> Question: Why is it important for everyone, particularly those who are entrusted with our data, to take an active role in protecting it? 

The vast majority of people seem to distrust businesses on a macro level. In fact, the major barometer of trust in organisations – the Edelman Trust Barometer – has shown a decline in public trust across all organisations, including business, media, government and NGOs.

And yet, paradoxically, consumers provide companies with incredible detail about every facet of their lives. I don’t think this is because they trust these companies, but rather because they don’t yet realise just how much privacy they are giving away. So the organisations entrusted with protecting this data are not even seen by the public as its protectors.

We need to scrutinise and interrogate any business or application that collects and stores data to ensure it is protecting information we entrust to it.

There are many tactical reasons why it is important to protect data. From banking details to personal medical information to tax records, data protection keeps people’s identities secure.

On a more strategic level, data protection is about trust. If businesses and organisations do not keep data secure then people will lose trust in them. That means we could lose out on all the benefits and efficiencies of the effective use of data.

>> Question: In light of the complexity of the threats at hand, is cyber security a lost battle?

Not yet! It is a battle in which the only strategy is to keep playing.

If a company or organisation falls behind, it will suddenly be at greater risk of attack. So companies need to partner with organisations able to stay at the cutting edge of cyber attack techniques.

Most of cybersecurity is cyber hygiene. Just like locking the front door at night, or rolling down the metal gate in front of a Brooklyn shop at closing time, cyber hygiene means not making yourself a target.

Consumers are only just catching on to this, as we wake up to the risks that data breaches pose to personal information.

Now, if we talk about quantum computing and what it is about to do when it comes online, that’s a different story.

Finally, I will point out, because it is top of mind for me these days, that we need new language to discuss cybersecurity in an AI-enabled age. AI introduces completely new cyber threats that are not being effectively addressed by traditional cybersecurity companies. These threats include:

  • Biased data sets
  • AI performance risks (new environment risks)
  • Adversarial attacks on AI systems (see the sketch after this list)
  • AI explainability concerns
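
To make “adversarial attacks on AI systems” concrete, here is a small, hypothetical sketch of one classic technique, the fast gradient sign method (FGSM): the input to a model is nudged in exactly the direction that increases the model’s loss. The toy model, random input and label below are placeholders chosen purely for illustration.

```python
# Minimal sketch of one well-known adversarial attack (FGSM) on a toy model.
# The model, input and label are placeholders purely for illustration.
import torch
import torch.nn as nn

torch.manual_seed(0)

# A tiny stand-in classifier: 20 input features, 2 classes.
model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(1, 20, requires_grad=True)   # "clean" input
y = torch.tensor([1])                        # its true label

# Forward pass and gradient of the loss with respect to the input.
loss = loss_fn(model(x), y)
loss.backward()

# FGSM: nudge every feature slightly in the direction that increases the loss.
epsilon = 0.1
x_adv = x + epsilon * x.grad.sign()

with torch.no_grad():
    clean_pred = model(x).argmax(dim=1).item()
    adv_pred = model(x_adv).argmax(dim=1).item()

print(f"prediction on clean input: {clean_pred}, on perturbed input: {adv_pred}")
```

On a real image classifier, a perturbation this small is often invisible to a human yet enough to flip the prediction, which is part of why perimeter-style defences do not catch it.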

This is the field of AI Security, which I think is going to be a major concern in the coming years.

So it is not lost yet, but cybersecurity is continuing to get more complex as new systems are brought online.

– – – – – – – – – – – – – – – – – – – – – – – – – – – –
To hear more of Davey’s outstanding insights, join us on March 26 for “The simplicity of data weaponisation”. During this event Davey will discuss his experiences working with governments, security forces and activists to protect our technology, AI in particular, from weaponisation. Join us to learn how:

 
