Florida Tech Research Analyzing Misinformation’s Roots

On social media sites like Facebook, the spread of misinformation about political figures has risen in recent years. These companies have come under congressional scrutiny for their role in running ads and promoting user content that deliberately deceive the public.

New Florida Tech research examines the dynamics of misinformation (the spread of misleading or outright false information) and how to counter it.

Florida Tech computer engineering and sciences associate professor Georgios Anagnostopoulos is working on the project, “Deep Agent: A Framework for Information Spread and Evolution in Social Networks.” The $432,459 grant was awarded by UCF and the Defense Advanced Research Projects Agency (DARPA), with work that began in October 2017 and concludes in December. The research is part of the SocialSim program in DARPA’s Information Innovation Office.

Most often, misinformation is employed to sway public opinion. For example, an organization, company or state actor may use social media to influence the general public’s disposition on hot-button political and social issues.

Misinformation can also be deployed for financial gain. To fund state needs or illicit operations, hackers may spread false information to persuade people to invest in a particular cryptocurrency. Once the cryptocurrency reaches a certain price point, those behind the misinformation campaign sell their holdings, making substantial profits.

To combat misinformation, researchers first seek to understand how it diffuses over social networks. When modeling these diffusion processes, Anagnostopoulos and his collaborators draw on social science theories, analyzing people’s motivations and modeling collaborative attitudes between people as well as attentive behavior. Building those theories into the models makes it possible to understand how and why misinformation spreads and to predict where it will spread next.

Most of the effort’s models are agent-based, a powerful modeling technique that simulates the actions and interactions of many individual users or entities in order to reproduce and explain system-wide emergent behaviors.
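To give a flavor of what agent-based simulation of information spread looks like in practice, here is a minimal sketch in Python. It is purely illustrative and not the team’s Deep Agent framework: the network structure, share probability, and seeding are all hypothetical choices.

```python
import random

import networkx as nx  # assumed available; any graph library would work

# Hypothetical parameters, not taken from the Deep Agent project.
SHARE_PROB = 0.15   # chance an exposed agent reshares a false story
N_AGENTS = 1000
STEPS = 20

random.seed(42)
# A scale-free graph as a stand-in for a social network.
graph = nx.barabasi_albert_graph(N_AGENTS, m=3)

# Start with a handful of seed accounts pushing the misinformation.
believers = set(random.sample(list(graph.nodes), 5))

for step in range(STEPS):
    newly_convinced = set()
    for agent in believers:
        # Each believer exposes its neighbors; some of them reshare.
        for neighbor in graph.neighbors(agent):
            if neighbor not in believers and random.random() < SHARE_PROB:
                newly_convinced.add(neighbor)
    believers |= newly_convinced
    print(f"step {step:2d}: {len(believers)} agents have seen or shared the story")
```

Running many such simulations while varying the agents’ behavioral rules is what lets researchers connect individual-level assumptions to the system-wide spreading patterns observed on real platforms.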

The echo chamber, a metaphor for an environment in which a person pays attention only to information sources that support their existing beliefs and opinions, has been found to play a major role in the spread of misinformation. The team is also looking into modeling how trust relationships evolve and how those relationships influence that spread.
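As a hypothetical illustration of how an echo-chamber effect could be layered onto such a simulation (this is a bounded-confidence-style sketch, not the researchers’ actual model), each agent might hold an opinion score and only accept content from sufficiently like-minded sources, with trust drifting toward whoever it listens to:

```python
# Hypothetical echo-chamber rule: an agent only "pays attention" to a story
# if the sharer's opinion is close enough to its own.
TOLERANCE = 0.2   # how similar a source must be to be heard at all
INFLUENCE = 0.1   # how far opinion/trust shifts after an accepted share

def accepts(receiver_opinion: float, sharer_opinion: float) -> bool:
    """Echo-chamber filter: ignore sources outside the agent's comfort zone."""
    return abs(receiver_opinion - sharer_opinion) <= TOLERANCE

def update_opinion(receiver_opinion: float, sharer_opinion: float) -> float:
    """Accepted shares pull opinions (and implicitly trust) closer together."""
    return receiver_opinion + INFLUENCE * (sharer_opinion - receiver_opinion)

# Tiny demonstration: two agents exchanging shares over a few rounds.
a, b = 0.30, 0.45
for round_ in range(5):
    if accepts(a, b):
        a = update_opinion(a, b)
    if accepts(b, a):
        b = update_opinion(b, a)
    print(f"round {round_}: a={a:.3f}, b={b:.3f}")
```

Under a rule like this, like-minded agents reinforce one another while dissimilar sources are filtered out entirely, which is one simple way a simulation can reproduce echo-chamber dynamics.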

In the future, Anagnostopoulos foresees a deeper understanding of this phenomenon and of how to combat it. However, while better education of the general public on this subject will help everyone understand how easily one can be misled, no one is immune.

“You may think you’re a critical thinker, but our DNA programs us to be like this,” Anagnostopoulos said. “While there are actual benefits to our innate trusting behavior, such as reinforcing our social bonds, these instincts can be easily turned against us.”
