Hi Lisa, thank you for taking the time to participate in our speaker Q&A! Would you first like to introduce yourself and tell us about the projects you are currently working on?
My name is Lisa Sugiura and I am a Reader in Cybercrime and Gender in the School of Criminology and Criminal Justice at the University of Portsmouth. My research interests are in the area of online gender-based violence and I am working on projects involving online misogynist and male supremacist groups, supporting victims of sexual abuse, and the use of technology in domestic abuse.
You are a panellist at the upcoming Cyber Security and Data Protection Summit, where you will discuss the Online Safety Bill. How successful do you think the bill will be in ending misogynistic and sexist language online? Do you feel the bill does enough to tackle violence against women, or are there proposals that should be added to it?
I believe that the bill has the potential to make a significant difference to the prevalence of abuse suffered by UK internet users. However, for it to appropriately protect those who are disproportionately targeted by online abusers – namely women, and especially Black women and other racialised and minoritised people – it needs to be strengthened by explicitly including VAWG, incorporating intersectional considerations, and taking into account the lived experiences of those subjected to online abuse. Currently, women are missing from the bill, so their inclusion is left to secondary legislation and processes. There is much rhetoric about how online misogyny is a recognised problem that the bill is set to tackle, but it is unclear exactly how this will be achieved.
Research from Refuge found that 1 in 3 UK women have experienced online abuse perpetrated on social media or other online platforms at some point in their lives. Do individual companies (e.g. Meta, Snapchat, and dating apps) need to take a certain level of responsibility as organisations, separate from government guidance and legislation?
Yes, there needs to be a greater emphasis on individual platform responsibility, particularly in relation to platform design, systems, and processes. There needs to be more leadership from platform providers in tackling online gender-based violence, and gender impact assessments need to be built into the standard process for tech development projects. Platforms should do more than merely comply with the legislation: they should take steps to meaningfully improve the experiences of users most at risk of being targeted by abusers, be proactive in preventing gendered and sexual abuse and abusers on their sites, and actually act on violations with real consequences. Mitigating abuse is not a current priority of technology business models, and far more is needed to ensure safer online spaces. There is also the problem of how clandestine social media sites notorious for abuse and offensive behaviours will be able to evade the bill and continue to operate with impunity.
Do you believe that social media and online attitudes are becoming more negative? Or, do you believe that derogatory language and violence towards women online is becoming more overt?
I believe that contemporary misogyny has found a new avenue online, enabling it to disseminate and advance in unprecedented ways, appealing to and reaching new generations. It is not necessarily the case that social media and online attitudes are becoming more negative towards women; misogyny is not unique to cyberspace. Online, however, patriarchal ideologies about women’s inferiority and other sexist and misogynistic tropes are regurgitated and amplified, often propagated by men with significant reach and appeal.
In your experience and research, do derogatory comments, hate speech, and similar abuse online increase after divisive political or social events?
Yes, hate-fuelled ideologies are linked with the wider socio-political climate, and they are not confined to online spaces but are exacerbated and enabled by digital technologies. Also, when leaders who are outwardly misogynist and racist are able to maintain powerful and public platforms, such hate is validated and emboldens online supremacist movements.
Targeting hate speech could be effective; however, this is perhaps a short-term solution to a wider societal problem. In your opinion, what one solution could be trialled to reduce sexism on a long-term basis? Do you think these interventions should be placed in the physical or the cyber world?
These problems are products of both society and technology, so we need a combination of legal, social, and technical solutions that are intersectional, victim-survivor centred, and trauma-informed to protect against further harms. On a long-term basis, we need to extend sexual health promotion and sexual violence prevention education with young people to include the ethical use of technologies and a focus on the need for consent, and to build content around consensual technology use into sexual violence prevention approaches more broadly. We also need to speak with and engage young men and boys about men’s violence against women in a way that doesn’t demonise them but also doesn’t diminish the reality of women’s lived experiences.
‘Online harm’ as a term has been considered by some to be too vague to regulate. How do you think the phrasing can be more clearly defined to make the Online Safety Bill more effective?
To effectively tackle online gender-based violence, it needs to be explicitly included in the definition of harms. A definition that doesn’t include an intersectional understanding of the gendered and racialised nature of online abuse in the bill’s primary legislation will result in the bill failing to make meaningful change for those who are most frequently subjected to online abuse. I appreciate, however, that the rapid pace of technological development and the continually shifting nature of online abuse mean that any definition needs to be flexible enough to apply to emerging harms, so that it is not made redundant or less effective in the future. The definition of harms should also consider physical as well as psychological impacts, not only for intended victims but also for bystanders who may be affected by viewing the content, especially when they belong to the same minoritised groups.
We can’t wait to have you speak at the summit this November. What are you most looking forward to about speaking and attending?
I am looking forward to engaging and networking with people across a broad range of sectors and professions and hearing different perspectives. I am also grateful to have the opportunity to highlight the significant issues that are impacting women’s and girls’ lives, and to work towards finding solutions.
And a final question: which historical figure, from the past or present, would you like to invite for dinner, and why?
Can I have two? Ruth Bader Ginsburg and bell hooks: both such inspirational women who fought for and succeeded in enacting positive social change. They would have amazing stories to tell.
Lisa joined us to speak on the panel 'Online Safety Bill: Promoting User Safety' at the Cyber Security & Data Protection Summit on 17th November 2022.
Dr Lisa Sugiura
Dr Lisa Sugiura is a Reader in Cybercrime and Gender in the School of Criminology and Criminal Justice at the University of Portsmouth.