Twitter’s “next big ideas” will not be driven by marketers or members of the exec team, but by users of its recently launched prototype app twttr, according to the platform’s head of brand strategy Alex Josephson.
The experimental space, which was launched in January and rolled out to select users in March, allows the company to test out new ideas and gather feedback from highly engaged users to build new features.
Talking to Marketing Week at the Cannes Lions Festival of Creativity yesterday (17 June), Josephson says Twitter’s users have always been critical in driving product innovation but the prototype app allows the business to formalise that development process.
“A lot of the core features of Twitter today are actually features we didn’t invent,” he says. “The hashtag was invented by our users, and so was the @ symbol – once we saw the behaviour we decided to make it an official product. That’s really where we draw our inspiration from.
“Every change we make to the product is based on what we’re seeing the audience do from a behaviour standpoint. That includes moving from 140 to 280 characters. That was based on data we were seeing of some tweets just not being sent because people had run out of room.”
Taking that one step further, the prototype app allows Twitter to collect ideas from users proactively and test them in a controlled environment before rolling them out.
“The next big ideas for Twitter will probably graduate from the prototyping app, so once again we’re sticking to our roots and really relying on our users to influence the product,” he adds.
Tackling inappropriate content
While innovation is important, and making the product as engaging as possible is one of Twitter’s key priorities, Josephson says its top priority is ensuring the platform and the conversations people have on it are “healthy” and non-abusive.
“The health of the conversation is tremendously important to us, and there have been a lot of advancements in this area over the past 12 to 18 months,” he says.
He admits that in the past “too much onus” was put on users to report offensive activity, so one of these measures is the introduction of machine learning technology that proactively detects suspicious behaviour on the platform.
“The most important thing to us is that people are able [to communicate in a safe environment]. Twitter is open to the public and when people come to talk about any topic of conversation you’re going to see all sides and perspectives. It’s important then for us to maintain a healthy environment for those conversations. That’s true for every platform.”
Since implementing the tech, 40% of the instances Twitter has taken action on were detected by machine learning; as a result, abuse reports are down 16% year on year.
At the same time Josephson says daily active users continue to rise – up between 9% and 14% each quarter for the past two years.
“Healthy conversation leads to more users on the app every day, which leads to better inventory for advertisers,” he adds.
But given trolls and extremist groups continue to find a way through, Josephson says regulation also has a role to play.
“Regulation is important where it makes sense, and we’re very supportive of that,” he says. “We have specific efforts geared directly towards extremist groups and taking action on them, for example. We’ve done that to a high degree and will continue to do that.”
He concludes: “We really rely on our terms of service and have continued to update those policies over the past year, and been very transparent about those changes. We take regulation very seriously. When manipulation is found, say with the US election in 2016, we shared all that data immediately with the government, regulators and academic institutions because no one company can solve this problem alone; it’s too huge.”