In the wake of Cambridge Analytica, the misappropriation of data, #deletefacebook, calls for regulation, and upcoming US Congressional testimony, Facebook announced a series of initiatives to restrict access to data and protect people on the platform. What is more remarkable, however, is that Mark Zuckerberg also organized a last-minute call with media and analysts to explain these efforts and to answer tough questions for an hour.
Let's start with the company's news.
In order to better protect Facebook users' information, the company is making the following changes in nine priority areas over the coming months (source: Facebook):
Events API: Until today, people could grant an app permission to get information about events they host or attend. This allowed users to add Facebook events to calendar, ticketing, or other apps. According to the company, events on Facebook contain information about other people's attendance as well as posts on the event wall. Starting today, apps using the API will no longer be able to access the guest list or the posts on the event wall.
Groups API: Currently, apps need a group admin's permission to access group content; for secret groups, an administrator's approval is required. However, groups contain information about people and conversations, and Facebook wants to make sure everything is protected. Going forward, any third-party app using the Groups API will need approval from both Facebook and a group admin, to ensure the app benefits the group. Apps will no longer be able to access a group's member list. Facebook is also removing personal information, such as names and profile photos, attached to posts and comments.
Pages API: Previously, any third-party app could use the Pages API to read posts and comments on any Page. This let developers build tools to help Page owners perform common tasks, such as scheduling posts and replying to comments or messages. At the same time, it gave apps access to more data than necessary. Now, Facebook wants to make sure that Page information is only available to apps that provide useful services to the community. Going forward, all access to the Pages API will need to be approved by Facebook.
Facebook Login: Two weeks ago, Facebook announced changes to Facebook Login. As of today, Facebook will need to approve all apps that request access to information such as check-ins, likes, photos, posts, videos, events, and groups. In addition, the company no longer allows apps to request access to personal information such as religious or political views, relationship status and details, custom friend lists, work history, reading activity, video watch activity, and games activity. Soon, Facebook will also remove a developer's ability to request data from people if there has been no activity on the app for at least three months.
Instagram Platform API: Facebook is accelerating the deprecation of the Instagram Platform API, effective today.
Search and Account Recovery: Previously, users could enter a phone number or email address into Facebook search to find people. According to Facebook, "malicious actors" abused these features to collect public profile information by submitting phone numbers or email addresses. Given the scale and sophistication of this activity, Facebook believes most Facebook users could have had their public profile scraped in this way. The feature is now disabled. Changes to account recovery also reduce the risk of scraping.
Call and Text History: Call and text history was part of an opt-in feature for people using Messenger or Facebook Lite on Android. Facebook has reviewed this feature and confirms that it does not collect the content of messages. Logs older than one year will be deleted. Moreover, broader data, such as call duration, will no longer be collected.
Data Providers and Partner Categories: Facebook is shutting down Partner Categories, a product that allowed third-party data providers to offer their targeting directly on Facebook. The company said that "although this is common practice in the industry (…), it will help improve the privacy of people on Facebook."
App Controls: Users will see a link at the top of their News Feed where they can review which apps they use and what information they have shared with those apps. Users will also get an easier way to remove apps they no longer want.
Cambridge Analytica may have had data on 87 million people
Facebook also made a surprising announcement. After careful review, the company estimates that Cambridge Analytica may have collected information on 87 million people. 81.6% of those users resided in the United States, with the remainder dispersed among the Philippines, Indonesia, the United Kingdom, Mexico, Canada, India, and other countries. The original New York Times reports estimated that the number of affected users was closer to 50 million.
Mark Zuckerberg faces the media, showing maturity, naivety, and strategic sense
In a rare gesture, Mark Zuckerberg invited the press and analysts to a call the next day, where he shared the latest steps the company is taking to protect user data on the platform and to protect users against misinformation. I have to give Mark credit. After going AWOL following the Cambridge Analytica data SNAFU, he embarked on an enlightened media tour. He sincerely wants us to know that he has made mistakes, that he is learning from them, and that he is trying to do the right thing. On our call, he stayed beyond his allotted time to answer difficult questions for a good 60 minutes.
From the beginning, Mark approached the discussion by acknowledging that he and the rest of Facebook had not done enough.
"It is clear now that we have not done enough to prevent abuse … this goes for false information , foreign interference, elections, hate speech, in addition to developers and data privacy, "said Zuckerberg. "We did not take a broad view of our responsibility – it was my fault."
He also committed to correcting these mistakes while focusing on protecting users' data and, ultimately, their experience on Facebook.
"It's not enough to connect people, we have to make sure that these links are positive and that they bring people together," he said. "It's not enough to give a voice to people. We must make sure that people do not use this voice to hurt people or spread misinformation. And it's not enough to give people tools to manage applications. We need to make sure that all these developers also protect people's information. We must ensure that all members of our ecosystem protect the information. "
Zuckerberg admitted that data protection is only part of the company's multi-faceted strategy to get the platform back on track. Misinformation, security concerns, and user polarization still threaten facts, truth, and the upcoming elections.
He shared some of the big steps Facebook is taking to tackle these problems. "Yesterday we took a big step by taking down the IRA's Russian pages," he boasted. "Since we became aware of this activity… we have worked to eliminate the IRA in order to protect the integrity of elections around the whole world. In all, we have about 15,000 people working on security and content review, and we will have more than 20,000 by the end of the year. This will be a major focus for us."
He added:" While we are doing this, we have also spotted and identified this network of fake accounts that the IRA has used so that we can work to delete them completely from Facebook. first action we took against the IRA and Russia itself. And this included the identification and demolition of a Russian press organization. We still have work to do here. "
Highlights, Observations, and Findings
This conversation was quite dense – in fact, it took hours of going back through the conversation just to put this article together – so I understand if you do not have the time to read the entire interview or listen to all of the questions and answers. To help, I present some of the highlights, insights, and takeaways from our time.
1. Mark Zuckerberg wants you to know that he is really sorry. He repeatedly explained that he feels the weight of his mistakes, his misunderstandings, and his errors of judgment in everything related to user data, false information, election tampering, and polarization. He also wants you to know that he is learning from his mistakes and that his priority is to solve these problems while regaining trust. Think of it as a multi-year strategy that Facebook has been running for a year.
2. Facebook now estimates that up to 87 million users, not 50 million, mostly in the United States, may have been affected by Kogan's personality quiz app. Facebook does not know how much user data was sold to or used by Cambridge Analytica. According to the company, this was not a data breach. People willingly took Kogan's quiz.
3. Facebook has also potentially exposed millions of user profiles to scraping over the years due to lax API standards on other fronts. Users who did not turn off the email/phone-number search should assume that their public information was scraped by sophisticated actors who evaded the rate limits. The extent of this scraping, and how the data was used by third parties, is unknown. Facebook has since disabled the feature. Even so, it is unacceptable that this was not taken seriously earlier and that we are only hearing about it now. Facebook must do its part by exposing the bad actors who collected information for harmful purposes. Facebook's failure to know how badly players were gaming its system makes the company a bad actor in its own right. Ignorance is bliss, until it isn't.
4. Mark thinks that Facebook has not done enough to explain user privacy, how the company earns money, and how it does and does not use content and user data. That is changing.
5. Mark and the board/shareholders believe that he is still the right person for the job. Two journalists asked directly whether he would resign, by force or by choice. His response was categorical: "no." His reasoning is that it is his ship and he is the one who will fix everything. He said repeatedly that he wants to do the right thing. While I applaud his "awakening," he has made big mistakes as a leader, and fixing them will take more than promises. I still believe that Facebook would benefit from seasoned strategic leadership to establish/renew a social contract between users, Facebook, and its partners. After all, the company is fighting wars on many fronts. And the company has shown a pattern of neglecting or ignoring important events and then apologizing afterward. We can assume that this pattern will only continue.
6. There is still a lot of naivety at play here with respect to user trust, data, and the weaponization of information against Facebook users. Even as the company aims to correct its wrongs, the company and its key players have not yet earned back that trust. There is a history of missing important events here. And Mark has a history of minimizing these events, acting too late, and apologizing after the fact. "I did not know" is not an acceptable answer. Even if the company is making significant progress, there is no reason to believe that data thieves, information terrorists, and shape-shifting swindlers are not already a step ahead of the Facebook team. Remember, after the 2016 election, Mark said it was "crazy" to think that false information could in any way influence an election. He has since walked back that reaction, but it was still his initial response and his belief.
7. Facebook is already taking action against economic actors, government interference, and lack of truthfulness, and promises to do more. It has since deleted thousands of Russian IRA accounts. Russia responded that it considers Facebook's moves "censorship."
8. It's not all Facebook's fault, according to Facebook. Mark places some of the responsibility on Facebook users who have not read the terms of service, managed their data settings, or understood what happens when you put your whole life online. In his opinion, and it's a hard pill to swallow, no one forced users to take a personality quiz. Nobody forces people to share every aspect of their lives online. Although the company is making it easier for users to understand and manage what they share, users still do not grasp that this free service comes with an agreement: as a user, their attention is for sale.
9. Going forward, Facebook is not as worried about data breaches as it is about user manipulation and psyops. According to Mark, users are more likely to be subjected to social engineering threats than to hacking or intrusion. Social engineering is the use of centralized planning and coordinated effort to manipulate individuals into disclosing information for fraudulent purposes. It can also aim to manipulate individual perspectives and behaviors and to influence social change (for better or for worse). Users are not yet equipped to fully understand if, when, and how they are likely to be manipulated. We need to understand how we influence one another based on our own cognitive biases and how we choose to share and perceive information in real time.
10. Facebook really wants you to know that it does not sell user data to advertisers. But it also acknowledges that it could have done, and will do, a better job of helping users understand Facebook's business model. Beyond fighting the information war, Facebook is also prioritizing better ad targeting, better news feeds, and creating/distributing better products and services that users love.
11. Although up to 87 million users may have been affected by Kogan's personality quiz, some of that information was sold to and used by Cambridge Analytica, and user data was compromised in several other ways, #deletefacebook has had no significant impact. Still, Mark says the fact that the movement grew at all is "not good." This leads to a separate but related conversation about users' addiction to, and dependence on, these platforms.
12. Users cannot rely on Facebook, YouTube, Twitter, Reddit, and the others to protect them. The respective leaders of each of these platforms MUST fight the bad actors to protect users. So far, they are not doing enough. Users are in many ways unwitting pawns, which amounts not just to social engineering but to an information war, a psyop in its own right, meant to cause chaos, disruption, or worse. Make no mistake: people, their minds, and their beliefs are under attack, and not just by the "bad actors." We are witnessing real villainy that, whatever its intent, damages and abuses people, human relationships, truth, and both digital and real-world democracy.
People and their relationships with one another are being radicalized and weaponized right under their noses. Nobody is teaching people how this happens. Moreover, we still have not exposed the social design secrets that make these apps and services addictive. In the face of social disorder, people readily share everything about themselves online and believe that they control their own experiences, their situational awareness, and the resulting emotions. I do not know whether people could really leave even if they wanted to, and that is what scares me the most. Regulation is approaching.
Questions and answers in full: the whole story, straight from Zuckerberg
Please note that this call lasted 60 minutes and that the following is not a full transcript. I went through the whole conversation to pull out the main points and context.
David McCabe, Axios: "Since the numbers [around the IRA] have changed dramatically, why should lawmakers trust that they are getting a complete and accurate picture now?"
Zuckerberg: "We will find more content over time.As long as there will be people in Russia who will try to find ways to exploit these systems, the battle will be endless. You never completely solve security, it's an arms race, in retrospect, we were late and we did not invest in advance, and I'm sure we're making progress against those adversaries. are very sophisticated, it would be a mistake to assume that you can fully solve a problem like this … "
Rory Cellan-Jones, BBC: "In November 2016, you dismissed claims that false information could have tilted the election. Are you taking this seriously enough…?"
Zuckerberg: " Yes. I clearly made a mistake in rejecting the fake news as crazy as [not] that had an impact. What I think is clear at this point is that it was too flippant. I should never have called him crazy. This is clearly a problem that requires painstaking work … It is an important area of work for us. "
Ian Sherr, CNET: "You have just announced that 87 million people were affected by Cambridge Analytica. How long have you known this number, given that the 50 million figure has been around for quite some time? It feels like the data keeps changing and we are not getting a clear picture of what is happening here."
Zuckerberg: " We have just finalized our understanding of the situation over the last few days. We did not publish the 50 million number … we wanted to wait to have a complete understanding. Just to give you a complete picture of this topic, we do not have a logbook yet when the application of [Aleksandr] Kogan interviewed the friends of all … We wanted to have a view of the world. together and a conservative estimate. I'm pretty confident, based on our analysis, that it's not more than $ 87 million. It could very well be less … "
David Ingram, Reuters: "Why were there no audits of the use of the Social Graph API between 2010 and 2015?"
Zuckerberg: "In retrospect, I think we should have done more all the time." Just to talk about how we thought at that time, simply as an explanation, I do not try to defend this now … I think our view in many aspects of our relationship with people was that our job was to give them tools and that was mainly the responsibility of people in the way they chose to use them … I think it was in hindsight to have this limited vision, but the reason we acted like we were We did when someone chose to share their data and that platform was designed from the same way as the application quiz of personality, we think so, Kogan broke the rules. And, it broke expectations, but also people chose to share this data with them. But today, considering what we know, not just developers, but all of our tools and our place in society, service is so important in people's lives, I think we need to understand a broader vision of our responsibility. We are not only creating tools for which we must take responsibility for the results achieved in the way people use these tools as well. That's why we did not do it at the time. Knowing what I know today, we clearly should have done more and we will go forward.
Cecilia Kang, NY Times: "Mark, you have indicated that you might be comfortable with some regulation. I would like to ask you about the privacy rules that are about to take effect in Europe… the GDPR. Would you be comfortable with these types of data protection regulations in the United States and for global users?"
Zuckerberg: "Regulations like the GDPR are very positive … everywhere, not just in Europe."
Tony Romm, Washington Post: "Do you believe that this [data scraping] was a violation of your 2011 consent decree with the FTC?"
Zuckerberg: " We have worked hard to make sure that we comply with it.The reality here is that we need to take a broader view of our responsibilities rather than simple legal responsibility. do the right thing and make sure that people's information is protected. We do investigations, we block the platform, etc. I think our responsibilities to people who use Facebook are greater than those written in that order and that's what I want to hold back.
Hannah Kuchler, Financial Times: "Investors are quite concerned about whether this is the result of corporate governance issues at Facebook. Has the board discussed the possibility of you stepping down as chairman?"
Zuckerberg: " Ahhh, not to my knowledge. "
Alexis Madrigal, Atlantic: "Have you ever made a decision that benefited Facebook's business, but not the community?"
Zuckerberg: "What makes our product difficult to manage and use, it's not the trade-offs Think it's pretty easy, because in the long run, business will be better if you serve people, I just think it would be clear-sighted to focus on short term on value for people and I do not think we are short term.All the tough decisions we have to make are actually trade-offs between people.One of the big differences between the type of product that we build, that's why I call it a community and what are, in my opinion, the specific governance issues that different people who use Facebook have different interests.Some people want to share a political speech that they believe is valid, while others think it is a hate speech. These are questions of real values and trade-offs between free expression on the one hand and ensuring that it is a safe community on the other … we do it in a static environment. Social norms are continually changing and are different in every country in the world. It is difficult to obtain these compromises and we do not always get them right. "
Alyssa Newcomb, NBC News: "You have said that you made mistakes in the past. Do you still think that you are the best person to lead Facebook forward?"
Zuckerberg: " Yes. I think life is about learning from mistakes and what you need to do to move forward. The reality is that when you build something like Facebook that is unprecedented in the world, there will be things that you will miss … I do not think anyone is going to be perfect. I think what people can hold us accountable for is learning from our mistakes and continually making better results, while continuing to evolve our vision of our responsibilities. And, in the end, if we build things that people like and if that makes their lives better. I think it's important not to lose sight of it all. I am the first to admit that we have not taken a broad view of our responsibilities. I also think that it's important to keep in mind that there are billions of people who love the services we build because they get real value … C & D Is something that I'm really proud of my business to do … "
Josh Constine, TechCrunch: "Facebook has explained that the account recovery and search tools using emails and phone numbers could have been used to scrape information about all Facebook users. If Facebook knew about this more than a month ago, why didn't it immediately inform the public?"
Zuckerberg: " We have examined this issue and have understood it in recent days as part of the audit. of our global system. Everyone has a setting on Facebook that controls, it's just in your privacy settings, so people can search for you through your contact information. Most people have this enabled and that is the default. Many people have also turned it off. This is not everyone. Definitely, the potential here would be that over the period of time around this feature, people have been able to erase public information. Il est raisonnable de s’attendre à ce que, si ce paramètre était activé, à un moment donné au cours des dernières années, quelqu'un ait probablement accédé à votre information publique de cette façon. "
Will Oremus, Slate: "You run a company that relies on people being willing to share data that is then used to target them with ads. We also now know that it can be used to manipulate them in ways they do not expect. We also know that you protect your own privacy in certain respects. You have acknowledged putting tape over your webcam at one point. I believe you bought the lots around one of your homes for more privacy. What other steps do you take to protect your privacy online? As a Facebook user, would you sign up for apps like the personality quiz?"
Zuckerberg: " J'utilise certainement beaucoup d'applications. Je suis un grand utilisateur d'Internet. Afin de protéger la vie privée, je conseille aux gens de suivre de nombreuses pratiques exemplaires en matière de sécurité. Activer l'authentification à deux facteurs. Changez vos mots de passe régulièrement. Ne faites pas des outils de récupération de mot de passe des informations que vous mettez à la disposition du public … regardez et comprenez que la plupart des attaques seront de l'ingénierie sociale et non des personnes essayant de pénétrer dans les systèmes de sécurité. Pour Facebook en particulier, je pense que l'une des choses que nous devons faire … ne sont que les contrôles de confidentialité que vous avez déjà. En particulier avant l'événement GDPR, les gens vont demander si nous allons mettre en œuvre toutes ces choses. Ma réponse à cette question est que nous avons eu presque tout ce qui a été mis en œuvre pendant des années … le fait que la plupart des gens n’en soient pas conscients est un problème. Nous devons faire un meilleur travail pour mettre ces outils devant les gens et pas seulement les offrir. J'encourage les gens à les utiliser et à s'assurer qu'ils sont à l'aise avec la façon dont leurs informations sont utilisées sur nos systèmes et autres. "
Sarah Frier, Bloomberg: "[The data] could be anywhere now. What do you hope the audits will find, and what will you not be able to find?"
Zuckerberg: " Aucune mesure de sécurité ne sera parfaite. Cependant, une grande partie de la stratégie doit impliquer de modifier les aspects économiques des mauvais acteurs potentiels pour qu’ils ne valent pas la peine de faire ce qu’ils pourraient faire autrement. Nous n'allons pas pouvoir sortir et trouver chaque mauvaise utilisation des données. Ce que nous pouvons faire, c’est rendre les choses plus difficiles pour les gens qui vont de l’avant, changer les calculs pour ceux qui envisagent de faire quelque chose de rudimentaire et je pense que nous pourrons éventuellement découvrir une grande quantité de mauvaises activités. de ce qui existe et nous serons en mesure de faire des vérifications pour nous assurer que les gens se débarrassent de c es données. "
Steve Kovach, Business Insider: "Have you fired anyone over this problem, or any data privacy problem?"
Zuckerberg: " Je ne l'ai pas fait. Je pense que nous travaillons encore à cela. Au bout du compte, c'est ma responsabilité. J'ai commencé cet endroit. Je le lance Je suis responsable de ce qui se passe ici. Je vais faire le meilleur travail pour aider à faire avancer les choses. Je ne cherche pas à jeter quelqu'un d'autre dans le bus pour les erreurs que nous avons commises ici. "
Nancy Cortez, CBS News: "Your critics say that Facebook's business model depends on collecting personal data. How can you reassure users that their information will not be used in ways they do not expect?"
Zuckerberg: " Je pense que nous pouvons mieux expliquer ce que nous avons réellement faire. Il y a beaucoup d'idées fausses sur ce que nous faisons que je pense que nous n'avons pas réussi à clarifier pendant des années. Premièrement, la grande majorité des données que Facebook connaît à votre sujet, c'est parce que vous avez choisi de les partager. Il ne suit pas … nous ne suivons pas et nous n'achetons pas et ne vendons pas [data] … En ce qui concerne l'activité publicitaire, cela représente une part relativement moindre de ce que nous faisons. La plupart des activités sont des personnes qui partagent des informations sur Facebook. C'est pourquoi je pense que les gens comprennent la quantité de contenu, car ils y mettent toutes les photos et les informations. Pour une raison quelconque, nous n’avons pas été en mesure de rejeter cette notion depuis des années, que nous vendions des données à des annonceurs. Nous ne le faisons pas. Cela va tout simplement à l'encontre de nos propres incitations … Nous pouvons certainement mieux expliquer cela et rendre ces choses compréhensibles. La réalité est que la façon dont nous gérons le service est que les gens partagent des informations, nous les utilisons pour aider les gens à se connecter et améliorer les services, et nous diffusons des annonces pour en faire un service gratuit pour tous. 19659003]Rebecca Jarvis, ABC News: “Cambridge Analytica has tweeted now since this conversation began, 'When Facebook contacted us to let us know the data had been improperly obtained, we immediately deleted the raw data from our file server, and began the process of searching for and removing any of its derivatives in our system.' Now that you have this finalized understanding, do you agree with Cambridge Analytica's interpretation in this tweet and will Facebook be pursuing legal action against them?”
Zuckerberg: “I don't think what we announced today is connected to what they just said at all. What we announced with the 87 million is the maximum number of people that we could calculate could have been accessed. We don’t know how many people’s information Kogan actually got. We don’t know what he sold to Cambridge Analytica. We don’t know today what they have in their system. What we have said, and what they agreed to, is a full forensic audit of their systems so we can get those answers. But at the same time, the UK government and the ICO are conducting a government investigation, and that takes precedence. We’ve stood down temporarily… and once that’s done, we’ll resume ours so we can get answers to the questions you’re asking and ultimately make sure that none of the data persists or is being used improperly. At that point, if it makes sense, we will take legal action, if we need to do that to protect people’s information.”
Alex Kantrowitz, Buzzfeed: “Facebook’s so good at making money. I wonder if your problems could be somewhat mitigated if the company didn’t try to make so much. You could still run Facebook as a free service, but collect significantly less data and offer significantly less ad targeting… so, I wonder if that would put you and society at less risk.”
Zuckerberg: “People tell us that if they’re going to see ads, they want the ads to be good. The way the ads are good is making it so that when someone tells us they have an interest… the ads are actually relevant to what they care about. Like most of the hard decisions that we make, this is one where there’s a trade-off between values people really care about. On the one hand, people want relevant experiences, and on the other hand, I do think that there’s some discomfort with how data is used in systems like ads. I think the feedback is overwhelmingly on the side of wanting a better experience…”
Nancy Scola, Politico: “When you became aware in 2015 that Cambridge Analytica inappropriately accessed this Facebook data, did you know that firm’s role in American politics and in Republican politics in particular?”
Zuckerberg: “I certainly didn’t. One of the things in retrospect…people ask, ‘why didn’t you ban them back then?’ We banned Kogan’s app from the platform back then. Why didn’t we do that? It turns out, in our understanding of the situation, that they weren’t any of Facebook’s services back then. They weren’t an advertiser, although they went on to become one in the 2016 election. They weren’t administering tools and they didn’t build an app directly. They were not really a player we had been paying attention to.”
Carlos Hernandez, Expansion: “Mark, you mentioned that one of the most important aspects of Facebook is the people. And one of the biggest issues around the use of these social platforms is how hard it is for users to understand how these companies store data and use their information. With everything that is happening, how can you help users better understand how Facebook, WhatsApp, and Instagram are collecting and using data?”
Zuckerberg: “I think we need to do a better job of explaining the principles that the service operates under. But, the main principles are, you have control of everything you put on the service, most of the content that Facebook knows about you is because you chose to share that content with friends and put it on your profile and we’re going to use data to make the services better…but, we’re never going to sell your information. If we can get to a place where we can communicate that in a way people understand it, then we have a shot at distilling this down to a simpler thing. That’s certainly not something we’ve succeeded at doing historically.”
Kurt Wagner, Recode: “There’s been the whole #deletefacebook thing from a couple of weeks ago, there’s been advertisers who have said that they’re pulling advertising money or pulling their pages down altogether. I’m wondering if, on the back end, you have seen any actual change in usage from users or change in ad buys over the last couple weeks…”
Zuckerberg: “I don’t think there’s been any meaningful impact that we’ve observed. But look, it’s not good. I don’t want anyone to be unhappy with our services or what we do as a company. Even if we can’t measure a change in the usage of the products or the business…it still speaks to feeling like this was a massive breach of trust and we have a lot of work to do to repair that.”
Fernando Santillanes, Grupo Milenio: “There’s a lot of concern in Mexico about fake news. Associating with media to identify these fake articles is not enough. What do you say to all the Facebook users who want to see Facebook take a more active position to detect and suppress fake news?”
Zuckerberg: “This is an important question. 2018 is going to be an important year for protecting election integrity around the world. Let me talk about how we’re fighting fake news across the board. There are three different types of activity that require different strategies for fighting them. It’s important people understand all of what we’re doing here. The three basic categories are, 1) there are economic actors who are basically spammers, 2) governments trying to interfere in elections, which is basically a security issue and 3) polarization and lack of truthfulness in what you describe as the media.”
In response to economic actors, he explained, “These are folks like the Macedonian trolls. What these folks are doing, it’s just an economic game. It’s not ideological at all. They come up with the most sensational thing they can in order to get you to click on it so they can make money on ads. If we can make it so that the economics stop working for them, then they’ll move on to something else. These are literally the same type of people who have been sending you Viagra emails in the 90s. We can attack it on both sides. On the revenue side, we make it so that they can’t run on the Facebook ad network. On the distribution side, we make it so that as we detect this stuff, it gets less distribution on News Feeds.”
The second category involves national security issues, i.e. Russian election interference. Zuckerberg’s response to solve this problem involves identifying bad actors, “People are setting up these large networks of fake accounts and we need to track that really carefully in order to remove it from Facebook entirely as a security issue.”
The third category is about media, which Zuckerberg believes requires deeper fact checking. “We find that fact checkers can review high volume things to show useful signals and remove from feeds if it’s a hoax. But there’s still a big polarization issue. Even if someone isn’t sharing something that’s false, they are cherry picking facts to tell one side of a story where the aggregate picture ends up not being true. There, the work we need to do is to promote broadly trusted journalism. The folks who, people across society, believe are going to take the full picture and do a fair and thorough job.”
He closed on that topic on an optimistic note, “Those three streams, if we can do a good job on each of those, will make a big dent across the world and that’s basically the roadmap that we’re executing.”
Casey Newton, The Verge: “With respect to some of the measures you’re putting into place to protect election integrity and to reduce fake news…how are you evaluating the effectiveness of the changes you’re making and how will you communicate wins and losses…?”
Zuckerberg: “One of the big things we’re working on now is a major transparency effort to be able to share the prevalence of different types of bad content. One of the big issues that we see is a lot of the debate around fake news or hate speech happens through anecdotes. People see something that’s bad and shouldn’t be allowed on the service and they call us out on it, and frankly they’re right, it shouldn’t be there and we should do a better job of taking that down. But, what I think is missing from the debate today is the prevalence of these different categories of bad content. Whether it’s fake news and all the different kinds therein, hate speech, bullying, terror content, all of the things that I think we can all agree are bad and we want to drive down, the most important thing there is to make sure that the numbers that we put out are accurate. We wouldn’t be doing anyone a favor by putting out numbers and coming back a quarter later saying, ‘hey, we messed this up.’ Part of transparency is to inform the public debate and build trust. If we have to go back and restate those because we got it wrong, the calculation internally is that it’s much better to take a little longer to make sure we’re accurate than to put something out that might be wrong. We should be held accountable and measured by the public. It will help create more informed debate. And, my hope over time is that the playbook and scorecard that we put out will also be followed by other internet platforms so that way there can be a standard measure across the industry.”
Barbara Ortutay, Associated Press, “What are you doing differently now to prevent things from happening and not just respond after the fact?”
Zuckerberg: “Going forward, a lot of the new product development has already internalized this perspective of the broader responsibility we’re trying to take to make sure our tools are used well. Right now, if you take the election integrity work, in 2016 we were behind where we wanted to be. We had a more traditional view of the security threats. We expected Russia and other countries to try phishing and other security exploits, but not necessarily the misinformation campaign that they did. We were behind. That was a really big miss. We want to make sure we’re not behind again. We’ve been proactively developing AI tools to detect trolls who are spreading fake news or foreign interference…we were able to take down thousands of fake accounts. We’re making progress. It’s not that there’s no bad content out there. I don’t want to ever promise that we’re going to find everything…we need to strengthen our systems. Across the different products that we are building, we are starting to internalize a lot more that we have this broader responsibility. The last thing that I’ll say on this, I wish I could snap my fingers and in six months, we’ll have solved all of these issues. I think the reality is that given how complex Facebook is, and how many systems there are, and how we need to rethink our relationship with people and our responsibility across every single part of what we do, I do think this is a multiyear effort. It will continue to get better every month.”
As I once said, and believe more today than before, with social media comes great responsibility. While optimism leads to great, and even unprecedented, innovation, it can also prevent seeing what lies ahead to thwart looming harm and destruction. Zuckerberg and company have to do more than fix what’s broken. They have to look forward to break, and subsequently fix, what trolls, hackers and bad actors are already seeking to undermine. And it’s not just Facebook. YouTube, Google, Instagram, Reddit, 4chan/8chan et al., have to collaborate and coordinate massive efforts to protect users, suppress fake news and promote truth.
Brian Solis is principal analyst and futurist at Altimeter, the digital analyst group at Prophet. Brian is a world-renowned keynote speaker and 7x best-selling author. His latest book, X: Where Business Meets Design, explores the future of brand and customer engagement through experience design. Invite him to speak at your event or bring him in to inspire and change executive mindsets.