In today’s fractured online landscape, it is harder than ever to identify harmful actors such as trolls and misinformation spreaders.

Often, efforts to spot malicious accounts focus on analysing what they say. However, our latest research suggests we should be paying more attention to what they do – and how they do it.

We have developed a way to identify potentially harmful online actors based solely on their behavioural patterns – the way they interact with others – rather than the content they share. We presented our results at the recent ACM Web Conference, and were awarded Best Paper.
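
To make the behavioural idea concrete, here is a minimal, hypothetical sketch of what content-free detection could look like. The paper's actual features and model are not described above, so the feature names (replies per day, distinct reply targets, reply delay, thread starts), the toy data, and the choice of scikit-learn's RandomForestClassifier are all illustrative assumptions, not the authors' method.

```python
# Illustrative sketch only: classify accounts from behavioural signals
# (how they interact) rather than from post content. All feature names,
# numbers, and the model choice here are hypothetical.
from dataclasses import dataclass
from sklearn.ensemble import RandomForestClassifier

@dataclass
class AccountBehaviour:
    replies_per_day: float         # how often the account replies to others
    distinct_targets_ratio: float  # unique accounts replied to / total replies
    median_reply_delay_min: float  # how quickly it jumps into threads
    thread_starts_ratio: float     # threads started / total posts

def to_vector(b: AccountBehaviour) -> list[float]:
    return [b.replies_per_day, b.distinct_targets_ratio,
            b.median_reply_delay_min, b.thread_starts_ratio]

# Tiny toy training set: behavioural vectors labelled 1 (flagged) or 0 (not).
train_accounts = [
    (AccountBehaviour(40.0, 0.15, 2.0, 0.05), 1),  # constant fast replies to few targets
    (AccountBehaviour(35.0, 0.20, 3.0, 0.10), 1),
    (AccountBehaviour(3.0, 0.80, 45.0, 0.40), 0),  # occasional, varied, slower interactions
    (AccountBehaviour(5.0, 0.70, 30.0, 0.35), 0),
]

X = [to_vector(b) for b, _ in train_accounts]
y = [label for _, label in train_accounts]

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Score a new account purely from its interaction pattern, never its text.
new_account = AccountBehaviour(replies_per_day=28.0, distinct_targets_ratio=0.25,
                               median_reply_delay_min=4.0, thread_starts_ratio=0.08)
print(model.predict_proba([to_vector(new_account)]))  # class probabilities [not flagged, flagged]
```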

  • refalo@programming.dev · 3 points · 1 day ago (edited)

    I think this applies to every person on Earth. Some just want to watch the world burn, or they don’t even realize that they have no idea what they’re talking about.

    Usually when I see someone with a horrible take, I check their post history and it’s nothing but the same attitude over and over. Dogmatism, egotism, and otherwise straight-up incompetence, Dunning-Kruger style.

    • sirdorius@programming.dev · 2 points · 1 day ago (edited)

      I’ve known a few people like this in real life, but they are usually quickly excluded from social circles as they get quite tiresome. But online they can just continue doing it, especially if it’s not bad enough to warrant a ban.

      • refalo@programming.dev · 2 points · 1 day ago

        What frustrates me is that it’s almost impossible to find a real-time chat platform for technical subjects that isn’t completely dominated by this type of person filling the logs 24/7 and making the whole experience exhausting.