Facebook, Google, Twitter Join Trust Project To Target 'Fake News'
By Shirley Siluk / Sci-Tech Today
Published: November 17, 2017
Over the past year, tech giants have come under increasing scrutiny for their roles in spreading viral misinformation that might have helped to decide the 2016 presidential election.

Some of them are now testing the use of "trust indicators" to highlight news sources that meet certain quality and reliability standards. Developed over the past three years by news organization representatives working with the nonpartisan Trust Project, these indicators are aimed at providing readers with more transparency about the news outlets, journalists, financial sponsorship, and methods behind the stories they read, hear, or see.

Among the tech companies that have agreed to use such indicators for their content are Bing, Facebook, Google, and Twitter. The decision is the latest sign these companies are starting to recognize the extent of the problem with misinformation, propaganda, and "fake news" online.

'Harder than Ever To Tell What's Accurate'

Sally Lehrman, a former San Francisco Examiner writer and editor and a journalism instructor at California's Santa Clara University, began talking with news editors in 2014 about the impact that technology was having on the quality of news reporting. Her work led to the launch of the Trust Project, now hosted by the university's Markkula Center for Applied Ethics.

"In today's digitized and socially networked world, it's harder than ever to tell what's accurate reporting, advertising, or even misinformation," Lehrman said in yesterday's announcement from the Trust Center. "An increasingly skeptical public wants to know the expertise, enterprise and ethics behind a news story. The Trust Indicators put tools into people's hands, giving them the means to assess whether news comes from a credible source they can depend on."

In addition to agreeing to use the project's trust indicators, Bing, Facebook, Google, and Twitter are looking into other ideas that can better highlight reliable news reporting.

The Trust Project identifies key trust indicators in eight categories: best practices and standards, author expertise, type of work, citations and references, methods, local sourcing, diverse sourcing, and efforts to seek public feedback.

For example, a news article using the trust indicators might provide links or information about the publisher's mission and funding, the journalist's experience, other sources for background information, and reporting processes.
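As a rough illustration of how those eight categories might be captured for a single story, here is a minimal sketch in Python. The field names and sample values are hypothetical; the article does not prescribe any specific format.

```python
# Hypothetical sketch of the eight Trust Project indicator categories
# filled in for one article. Field names and values are illustrative
# only; the project does not mandate this exact structure.
article_trust_indicators = {
    "best_practices": "https://example.com/standards",   # publisher mission, funding, standards
    "author_expertise": "Jane Doe, 12 years covering science policy",
    "type_of_work": "news report",                        # as opposed to opinion or sponsored content
    "citations_and_references": ["https://example.com/background-study"],
    "methods": "Based on interviews with three named researchers",
    "local_sourcing": "Reported on the ground in Santa Clara, Calif.",
    "diverse_voices": "Sources drawn from academia, industry, and advocacy groups",
    "public_feedback": "corrections@example.com",
}
```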

'Important Contextual Information'

Facebook said yesterday that it has started testing a trust indicator module with a small group of publishers and plans to expand that use over the next few months. The module allows publishers to upload links through the Brand Asset Library with more information about their ownership structure, as well as their policies on ethics, fact-checking, and corrections. That information will then appear alongside the publisher's articles in the News Feed.

"We believe that helping people access this important contextual information can help them evaluate if articles are from a publisher they trust, and if the story itself is credible," product manager Andrew Anker wrote in a Facebook announcement yesterday. "This step is part of our larger efforts to combat false news and misinformation on Facebook -- providing people with more context to help them make more informed decisions, advance news literacy and education, and working to reinforce indicators of publisher integrity on our platform."

Google said it will employ a similar approach by allowing news publishers to embed information about trust indicators into the HTML code of articles and Web sites.

"When tech platforms like Google crawl the content, we can easily parse out the information (such as Best Practices, Author Info, Citations & References, Type of Work)," search group product manager Jeff Chang wrote yesterday on the Google blog. "This works like the ClaimReview schema tag we use for fact-checking articles. Once we’ve done that, we can analyze the information and present it directly to the user in our various products."

The next step will be to find ways to display such trust indicators alongside articles that appear on Google Search, Google News, and other products, Chang said.

At the end of last month, senior executives from Google, Facebook, and Twitter appeared before the Senate Judiciary Committee in Washington, D.C., to offer testimony and answer questions about suspicious online activities that might have pushed Russia-sponsored misinformation to an estimated 126 million Americans in the lead-up to the 2016 presidential election.

That represented a dramatic shift from late last year, when Facebook co-founder and CEO Mark Zuckerberg dismissed as "crazy" the suggestion that his platform had any influence on the election of Donald Trump as president of the United States.

Speaking at the University of Kansas yesterday about his efforts to talk with people across the U.S. since the election, Zuckerberg acknowledged his platform had more influence than he initially recognized.

"I think it's very clear at this point that the Russians tried to use these tools to sow distrust leading up to the 2016 election and afterwards," he said. "What they did is wrong. And it is our responsibility to do everything we can to prevent them or anyone else from doing this again."

Image credit: iStock; Artist's concept.
