Stephen Scheeler

Facebook has never been under more scrutiny, especially this week after the Christchurch atrocity. Speaking at the Green Building Council of Australia’s Transform event this week, a former chief with the global giant stepped through the unintended consequences of the company’s rapid growth and the subsequent lessons for sustainable digital transformation.

Stephen Scheeler came to Australia in 2013 to run Facebook’s Australia and New Zealand business when the company was first listed on the stock market.

He now goes by the title “the Digital CEO”: an advisor, mentor and speaker dedicated to driving the uptake of digital tools in Australian businesses. He says the region is falling behind, putting the economy, businesses and “the future of Australia for our children and grandchildren” at risk.

But as a former employee of Facebook, Scheeler approaches his role with caution.

“What this all boils down to is an ethical question. How in the future should we treat the individual’s privacy and their digital identity?” 

Displaying a photo to prove he worked with Mark Zuckerberg, he says that the founder of Facebook is “a genius, not perfect, but smarter than pretty much anyone you’ve ever met. And other geniuses have said that… they’ve had that opinion,” he says. 

“I’m also of the opinion that he wants to do social good and change the world for the better. The things that are happening at the moment mean I think Mark [Zuckerberg] is not sleeping well and that he and the entirety of Facebook are trying to find a solution.”

Scheeler uses the tech company’s rapid growth and its various shortcomings to provide powerful context and lessons for businesses and institutions in the process of taking up digital tools.

The platform started off with the intention to connect people – a mission as valiant as “motherhood and apple pie” according to Scheeler. 

Although he believes this ethos endures in some capacity, the company has morphed into much more than a platform for users to communicate with one another.

He says there were three key components to Facebook’s rise before the company started to “come apart at the seams”.

The first is that it’s a platform for authentic identity. Unlike Twitter, where users can choose a level of anonymity, Facebook doesn’t accommodate fake or anonymous users and seeks to flush them out.

The implication of this, says Scheeler, is that the platform knows exactly who it’s talking to and when, which is extremely appealing to advertisers.

The next step for the tech company was creating personalisation at scale. Using machine learning and algorithms, the company set out to create news feeds where users see only the most “relevant” posts from others.

Finally, the company brought in data science to drive deliberate, data-fuelled growth based on the social imperative that people want to connect with one another. 

By drawing out insights from the vast quantities of data becoming available, the company was able to drive user engagement and create the “modern Facebook we have today.”

The result, Scheeler says, is a “personal, daily newspaper for 1.3 billion humans.” 

So what could go wrong? What’s a filter bubble?

First, there’s the problem of “filter bubbles” where users are stuck consuming the same ideas and information, which Scheeler says is “great for advertisers but not good for public discourse”.

There are also the more high-profile incidents: election hacking, data breaches and data misuse such as the Cambridge Analytica scandal.

Why did it go so wrong?

Scheeler believes Facebook came unstuck for a number of reasons.

The company grew too fast for a start, with small teams of inexperienced developers building products and releasing them to millions of people without considering the potential risks and unintended consequences. 

These teams also tended to operate on the assumption that people would only use the tools for good, Scheeler added.

Essentially, he says the problem was “too much data, not enough ethics.”

How your innocent photo can end up in an unknown narrative 

On a macro level, Scheeler says digital disruption devoid of ethics can lead to the erosion of faith in institutions.

There’s also the danger that people’s “nano-bits” of data are being aggregated into “actionable data sets”. 

This is essentially when benign data – like a photo of someone walking down the street – becomes knitted together with other data to be used in various unknown applications. 

We need trust and privacy – and transparency on our data

Scheeler hopes that in the future, everyone and everything will have a “data identity”. This means “you’ll be able to answer questions about who has your data and what they are doing with it.”

He also says people will have sovereign control over their data, and that data ownership will be rules-based, not unlike the rules for property ownership.

And data will also be monetisable, or not, depending on what people choose – “most people don’t know how their data is being monetised at the moment.”

He says that among the important tech-enabled developments are protocols that allow machine learning without centralised data storage. Known as “federated learning”, these protocols let insights move from company to company without the raw data leaving its source or the person’s identity being revealed.
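The idea behind federated learning can be sketched in a few lines. The toy below is purely illustrative (the organisation names, data and function names are hypothetical, not from the talk): each party computes a model parameter on data it never shares, and only those parameters are aggregated centrally.

```python
# Toy sketch of federated averaging, the core aggregation step used in
# federated learning. Each "client" computes a parameter locally; the
# server combines parameters weighted by dataset size, and never sees
# the underlying records. All names and data are hypothetical.

def local_mean(client_data):
    """A client fits a trivial 'model' (a mean) on data that stays local."""
    return sum(client_data) / len(client_data)

def federated_average(client_params, client_sizes):
    """The server averages parameters, weighted by each client's data size."""
    total = sum(client_sizes)
    return sum(p * n for p, n in zip(client_params, client_sizes)) / total

# Three hypothetical organisations, each holding its data privately.
clients = {
    "org_a": [1.0, 2.0, 3.0],
    "org_b": [10.0],
    "org_c": [4.0, 6.0],
}

params = [local_mean(data) for data in clients.values()]
sizes = [len(data) for data in clients.values()]
global_param = federated_average(params, sizes)
```

The weighted average here equals the mean over the pooled data, yet no raw record ever left its owner; real federated learning systems apply the same principle to neural network weights rather than simple means.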

He says the outcome will be that people will start moving to organisations that they trust and that are transparent about their activities. This is already starting to happen, he says, with Facebook in the bottom 10 of a list of 100 organisations ranked according to trust.

Join the Conversation



  1. It was a great presentation by Stephen Scheeler, and I’m pleased that TFE posted this article.
    Facebook is a very interesting case study. There is a big question around whether it is fit to continue to be a pillar of our digital world, or whether it is simply not capable of evolving fast enough. It is having a prolonged trust crisis, and this largely boils down to ethics, and in turn to its values – those that are in its DNA and lived by, rather than espoused. What are those values, and to what extent have they evolved from what was set in place by Zuckerberg when he created a tool to assess women on his uni campus? Such values permeate a corporate culture and define the ethical basis of the decisions it makes and the numerous algorithms that Facebook develops – and thus the impact it has in the world, for better or worse. A new generation of technology companies may now enter the market founded on solid ethical frameworks, with a clear social purpose and relevant lived values. Could they even compete with the tech titans on the strength of their ability to build much greater trust capital with their stakeholders?
    This is of course not only about the tech industry; the case study is highly relevant to the property sector, especially as it evolves into a digital sector in its own right. As we look at proptech or smart cities, let’s enquire into the ethical frameworks and values that underpin both the decisions to deploy the technology and the organisations doing it. If these are clear and appropriate, then we are starting to have a sound basis for trust. In time the property industry’s level of ethics and responsibility in its use of technology will be tested by the community, the media and ultimately regulation. We must be on the front foot with this, just as we were with environmental sustainability.
    Read Crossing the Threshold to learn more: