CSUN Alumnus McNally Discusses Using AI to Help Facebook and Google Fight Fraud and Abuse
You can’t fight fake news without real people — someone has to review dubious posts to decide whether the standards of journalistic due diligence were followed.
But Michael McNally ’88 (Computer Science), Facebook’s director of engineering for news feed science and integrity, uses artificial intelligence (AI) to accelerate the process of weeding out misinformation among the hundreds of billions of stories posted on Facebook every day. McNally writes algorithms that look for telltale signs of the systematic spread of dishonesty, identifying posts and pages that seek to profit or gain influence from sensationalism and falsehoods.
It was a challenge McNally felt he had to accept after several years of fighting fraud within Google’s advertising service.
“I moved from Google to Facebook with a sense of social purpose or meaning,” said McNally. “I felt that harm had been done to the public sphere, that there had been misinformation, polarization, deception. It’s a privilege to join a team of leaders to defend the public interest.”
McNally returned to his alma mater, California State University, Northridge, on March 7 to inspire a new generation of students interested in careers in technology. He gave a talk titled “From CSUN, to defending Google and Facebook against fraud and abuse,” hosted by the CSUN Alumni Association, in a packed University Student Union Grand Salon. McNally discussed his current fight against misinformation, his past successes and the role CSUN played in preparing him for it all.
McNally was born in Burbank and attended Valley Alternative School in Van Nuys. He’s a second-generation Matador: His mother completed a Master of Fine Arts at CSUN. He called himself a “classic nerd, classic geek,” who loved animation and “things that beeped and whistled and so forth.” He initially taught himself computer programming, pestering the staff at a computer store until they gave him access to the machines in the back. In high school, he taught his school’s only computer programming class.
At CSUN, he studied computer science with a minor in English and creative writing — he thought he might write science fiction one day. He parlayed his programming skills into jobs developing computer games in two of California’s most prominent valleys, San Fernando and Silicon. He became interested in the problem-solving possibilities of AI and went to UCLA, where he earned a master’s degree and completed all but the dissertation for a Ph.D.
He said the computer programming skills he developed at CSUN were directly relevant to his roles at Google and Facebook. But he also learned things beyond the academic foundation. He took dance classes and public speaking to push himself past his comfort zone. In other words, CSUN taught him the value of embracing big challenges.
“College was a wonderful place to go out and do hard things,” McNally said. “I just pushed myself into things that are really, really hard. When I talk about fake news, well, that’s really, really hard.”
When users log in to Facebook, the content they see in their news feed is determined by an algorithm that ranks posts from their friends, groups and “liked” pages. The ranking is based on individual and group data: content the user has liked, and content that users with similar interests have liked.
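Facebook’s actual ranking system is proprietary and far more elaborate, but a minimal sketch can illustrate the idea of blending individual and group signals into a single score. Every name, signal and weight below is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    # Hypothetical signals; a real feed ranker would use many more features.
    personal_affinity: float   # how closely the post matches content this user has liked
    similar_user_affinity: float  # engagement from users with similar interests
    recency: float             # freshness of the post, scaled 0..1

def rank_score(post: Post, w_personal=0.5, w_social=0.3, w_recency=0.2) -> float:
    """Combine individual and group signals into one ranking score.
    The weights here are illustrative, not Facebook's."""
    return (w_personal * post.personal_affinity
            + w_social * post.similar_user_affinity
            + w_recency * post.recency)

feed = [
    Post("a", 0.9, 0.2, 0.8),
    Post("b", 0.3, 0.9, 0.5),
]
# Highest-scoring posts appear first in the news feed.
feed.sort(key=rank_score, reverse=True)
print([p.post_id for p in feed])  # ['a', 'b']
```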
The downside to a free and open platform such as Facebook, McNally said, is that bad actors will try to exploit it. Businesses can profit by attracting cheap traffic through attention-grabbing, sensational posts. The more attention a post gets — whether it’s from people who agree with it or who are scandalized by it — the more money the post makes. A recent attention-grabbing post McNally cited was about a lost ship reappearing in the Bermuda Triangle — completely untrue, but the story spread quickly through social media.
Fake news posts also are created by individuals who want to influence a debate by spreading misinformation. (Given the political meaning now attached to the term “fake news,” Facebook instead uses the term “false news,” McNally said.) As a recent example, McNally pointed to posts created in response to the mass shooting in Parkland, Fla., including posts that described student survivors as “crisis actors” paid to travel to the scenes of tragedies to advance a political agenda.
McNally asked audience members to reflect on their reactions when they first heard about the shooting.
“Can you remember what passed through your mind when you heard someone had taken an assault rifle, broken into a school, and murdered 17 people — students and faculty?” he asked the audience, then repeated some of their responses. “‘Sadness.’ ‘Anger.’ ‘Shock.’ ‘Frustration.’
“Try this on for size: How would you think about people who responded not like you did, but they responded, ‘Yes, this is a great opportunity to make a buck’?” he said. “Some people responded to tragedy with a desire to exploit the community. And that’s what we’re fighting.”
McNally supports two Facebook teams, News Feed Science and News Feed Integrity. The News Feed Science team studies how users interact with content, determining what people like and what they don’t, with the goal of showing users content they are most likely to enjoy. The News Feed Integrity team works to reduce misinformation, spam, clickbait and other material that Facebook users consider low quality.
Facebook uses multiple professional fact-checking services to make the final call on whether articles were written using good-faith journalistic practices. If the fact checkers determine an article was not written to journalistic standards, Facebook can demote it in its rankings, making it less likely to appear in people’s news feeds.
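The talk didn’t detail how demotion works internally. One plausible, purely illustrative mechanism is to multiply a flagged story’s ranking score by a penalty factor, so it is suppressed rather than deleted (the penalty value below is invented):

```python
FACT_CHECK_PENALTY = 0.2  # hypothetical multiplier for stories rated false

def adjusted_score(base_score: float, rated_false: bool) -> float:
    """Demote, don't delete: a flagged story keeps a much lower score,
    so it can still appear but rarely tops anyone's feed."""
    return base_score * FACT_CHECK_PENALTY if rated_false else base_score

print(adjusted_score(0.8, rated_false=True))   # ~0.16
print(adjusted_score(0.8, rated_false=False))  # 0.8
```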
Algorithms can find telltale signals of accounts pushing misinformation and hate speech, identify the entities responsible, trace the relationships among them and sort them into groups. (In technical terms, machine learning can form bottom-up clusters based on data relationships between entities.) The Site Integrity Team can then choose how to respond, including taking down accounts found to be in violation of policy.
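The talk doesn’t specify the clustering method. As one illustrative possibility, a bottom-up grouping can be built by linking accounts that share infrastructure and merging the connected pieces with a union-find structure. The account data below is entirely made up:

```python
from collections import defaultdict

# Hypothetical accounts and the infrastructure each one uses.
# Real systems draw on far richer signals; this only shows the clustering idea.
accounts = {
    "page_A": {"ip": "1.2.3.4", "payment": "acct_9"},
    "page_B": {"ip": "1.2.3.4", "payment": "acct_7"},
    "page_C": {"ip": "5.6.7.8", "payment": "acct_9"},
    "page_D": {"ip": "9.9.9.9", "payment": "acct_1"},
}

parent = {a: a for a in accounts}

def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path compression
        x = parent[x]
    return x

def union(x, y):
    parent[find(x)] = find(y)

# Link accounts that share any attribute value (an edge in the entity graph).
by_value = defaultdict(list)
for acct, attrs in accounts.items():
    for key, value in attrs.items():
        by_value[(key, value)].append(acct)
for group in by_value.values():
    for other in group[1:]:
        union(group[0], other)

# Collect the bottom-up clusters: accounts with the same root belong together.
clusters = defaultdict(list)
for acct in accounts:
    clusters[find(acct)].append(acct)
print(list(clusters.values()))
# [['page_A', 'page_B', 'page_C'], ['page_D']]: A, B and C form one linked cluster
```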
Another issue: Some purveyors of misinformation have figured out how to game Facebook’s algorithm, which rewards engagement. The result is engagement bait: memes that explicitly ask users to “like” the post, tag friends or type “amen.”
“It’s not wrong in itself, but if you do it a lot, it gets spammy,” McNally said. “If it tells you to engage, it’s more likely to go viral.”
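McNally didn’t describe the detector itself. As a rough sketch of the underlying idea, a first-pass filter might look for explicit calls to engage; the phrase list here is hypothetical, and a production system would rely on trained classifiers rather than keywords:

```python
import re

# Hypothetical trigger phrases; real detection would use a model trained on
# text, image and engagement features, not a hand-written list.
BAIT_PATTERNS = [
    r"\btag a friend\b",
    r"\btype amen\b",
    r"\blike if you agree\b",
    r"\bshare if\b",
]

def looks_like_engagement_bait(text: str) -> bool:
    """Flag posts that explicitly ask users to like, tag, comment or share."""
    lowered = text.lower()
    return any(re.search(p, lowered) for p in BAIT_PATTERNS)

print(looks_like_engagement_bait("Type AMEN if you believe!"))    # True
print(looks_like_engagement_bait("Here's our quarterly report."))  # False
```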
McNally said AI also helps identify spammer pages covered with ads, which can eat up bandwidth and slow down a user’s experience. Facebook can demote both engagement bait posts and low-quality pages. Facebook also enforces stricter ad policies, and tries to prevent pages and domains with sketchy reputations from advertising.
“We try to fight it on every point of the chain,” McNally said. “[The efforts] all stack on top of each other, to have the aggregate effect to deprive them of income and influence.”
At the end of McNally’s talk, CSUN students lined up at microphones on both sides of the Grand Salon to ask questions. Asked what roles exist for non-programmers in the digital economy, for example, McNally mentioned public relations, marketing and legal roles, as well as design focused on user experience.
“There’s quite a bit of diversity of roles,” McNally said.
Hamid Johari, interim dean of CSUN’s College of Engineering and Computer Science, who introduced McNally at the event, said it was important for students from a variety of backgrounds to see McNally as an example.
“One of the biggest benefits of this event is the engagement of students, and to see how a technical background can take them to some important things that affect really large numbers of people,” Johari said. “Facebook has over a billion users. To be able to do something good for a large number of people is an amazing impact.”