DOING DATA TOGETHER | PROMOTED CONTENT
Put people at the centre of how you use information
Firas Khnaisser discusses why diverse teams help organisations get the most from their data
More often than not, when we talk about data, we’re actually talking about people. Yet perceptions of data couldn’t be further from being human.
This means we’re often scared, bored or mistrusting of data.
The word “data” has come to encompass so many things that we don’t know what people mean when they say it. Pundits tout it as the solution to everything (much like everything else they’re selling). So it’s no wonder folk have become sceptical – it is high time we shifted the emphasis from data back to people.
If I gave two identical, clean, relevant datasets to two separate teams, would Team One derive the same insights from the data as Team Two? Would Team Two share the insights in the same way to allow for improved decision-making or operational efficiency? Would Team One be able to take action on this data? And if it did, would it have the right people to make it happen?
All of those things are dependent on skillsets, outlooks and experience – attributes that people bring. Yet for too long we’ve had a very linear focus on those attributes, defining teams based on their skillsets and not necessarily on their outlook and experience.
To be truly people-centric when working with data (or should I say people?), you need diverse teams that reflect society at large. Only by democratising talent can we truly democratise data and make sure we’re doing the right thing by society.
Don’t get me wrong, I’m not underplaying the role of having good data – I’m trying to illustrate the point that the value doesn’t necessarily lie in the data itself but in how you use it.
This reminds me of Information is Beautiful author David McCandless’ reference to “data as the new soil”. I love that positioning because it’s active, not passive.
The much more famous notion of data being “the new oil” is passive and assumes that by finding this magical resource all your wishes will come true, which we all know couldn’t be further from the truth. The soil analogy puts the onus back on people – that we ultimately reap what we sow – and that’s very important.
There’s a lot of talk nowadays about ethics in artificial intelligence, or AI. That’s great but equally puzzling to me. How did the chat about ethics in data and AI go mainstream?
Then I got my answer from Dr Ewa Luger, Chancellor’s Fellow in Digital Arts and Humanities at the University of Edinburgh. She has argued that AI, and technology more broadly, is moving at such a fast pace that regulation cannot keep up. Regulation tells you what you have to do, whereas ethics helps you understand what you should do. So, in the absence of regulation we have to fall back on ethics for answers – which also translates to falling back on us, the people and the organisations we work for.
For example, AI algorithms are now widely used in sensitive social spheres – including credit scoring, employment, education, policing, criminal justice and mental health. We now know that black people are particularly disadvantaged by these algorithms due to inherent bias in the historical data used to build them. This means that if these algorithms are not examined, historical injustices that have happened to black communities will not only persist in the present but continue into the future.
So it’s up to people and
DMA SCOTLAND
dma.org.uk/value-of-data