News from the front desk issue 423: At a Sydney dinner this week, a few people started shooting the breeze, wondering what the bright new world of technology and data mining would bring and how it would change our world. As you do.
We know there are giants silently tramping the planet, gathering up billions of data sets to find patterns and predict behaviour. With this information, we can save energy, preserve what remains of our depleted environment (hopefully), minimise waste and bring help to people in need.
In cities and buildings, the changes can be impressive. People such as Bruce Duyshart of Meld say this process of learning and improving is never-ending.
The more data you can feed into the system – about cars on the street, cars in the basement, bikes and people movement – the more smoothly and efficiently your premises will run. And the more efficient your cities.
It’s all about the people, Duyshart says. “What insights have I gained that will result in better user experience or better operational experience?”
Offices and apartment buildings alike will be data mined to produce the best results for people.
But those who control that world of deep learning need the results to be predictive or there’d be no point, right? And it’s a small step to see that alongside will emerge the more nefarious dark arts of influence and dare we say it, the temptation to suggest behaviour and, finally, to almost ensure it.
It could be as innocent as a reward for not using the lifts during peak hour, so changing our lunch habits. Energy saving, worthy stuff. It could be small financial rewards if we buy this or that.
And we know, let’s face it, that our election tendencies are already up for grabs.
The people working inside Google-like labs are the best in the world, but soon they will be redundant as AI takes its learnings and leaves us all for dust.
It’s not hard to realise that those who control AI will be the wealthy landlords of the future. As one of our dinner hosts suggested, property might have been our most valuable commodity in the past, but in the future, it will be information.
Yet we give it all away willingly, seduced by the shiny apple that will entertain us and save us from fearful boredom.
It’s a world where emerging behemoths are organising for control of AI. Under cover of commercial in confidence. Code encrypted.
But who is controlling the controllers?
According to Angie Abdilla from Indigenous-led social enterprise Old Ways New, the key to good governance with AI lies in Indigenous knowledge systems.
In Indigenous lore two principles are front and centre – caring for country and caring for kin, or community. Get these two right and the rest pretty much looks after itself. It’s about balance and complex systems. And according to Abdilla, the evidence is there for the taking.
“It’s why this is the longest living continuum of culture on the driest continent on Earth. And when we see who is going to have the best governance for the emerging technology, of course it’s the Indigenous people because of those two facts,” she says.
An intriguing event in Hawaii
In March, Abdilla, who was a panellist at The Fifth Estate’s Tomorrowland 2018, will take part in an intriguing event that she’s helped to organise in Hawaii to see how this thinking can start to make its way into the corridors of influence over AI and technology. Globally.
The organisers of the event are the Massachusetts Institute of Technology and Concordia University in Canada, and it’s funded by the Canadian Institute for Advanced Research, which, Abdilla says, is known for its leading work in AI.
“It’s a big deal,” she says. And the purpose of the work is pure “r & d” and “looking at provocations – how can Indigenous knowledge systems inform AI”.
The event will comprise around 40 mostly Indigenous people with superior technology skills, “leading experts who have the capacity to work with us and our objectives… incredible people”. They range from Maori people developing machine learning applications in language and speech recognition, to coders, energy experts, cultural theorists and philosophers.
And it’s been carefully balanced for diversity.
That’s why the search has taken a year to find the right composition for the group, Abdilla says.
“We know that Indigenous knowledge systems are incredibly nuanced and complex and are proven to have the capacity to manage the delicate balance of our environment and communities in the most successful way we’ve ever seen.
“So if we look at what this technology looks like and how we develop it, it needs to be based on those two core principles because most of the issues we have with AI come down to governance.”
In the case of a large urban renewal project, her focus is on the process of a smart cities approach based on inclusivity.
“How do we create a smart cities plan looking at the future governance of urban environments? Of course it’s about technology and of course, it’s about the strategies.
“How do we create better systems? Because the problems we have with AI are around governance; it’s about ethics and bias. Most issues come down to those areas, that’s really what governance is.”
Abdilla’s previous work with urban development programs involved attending COP 21 UNPI, where this notion started to unfold as Indigenous leaders worked on a program “to inform nation-states and their policies around climate change, because most of the time Indigenous people do not contribute to climate change themselves but they are the most affected by it.”
Worldwide, the evidence is in that Indigenous people are working on the most interesting and innovative ways of managing country – ways that are respectful and based on thousands of years of practice, she says.
Now imagine if Google and AI were smart enough to embrace that kind of thinking.