Police in Warwickshire record 57 sexual grooming crimes in two years with record numbers of children targeted on Instagram across the country

More than 50 grooming crimes were recorded by police in Warwickshire over the last two years, data obtained by the NSPCC has revealed.

A total of 57 offences of sexual communication with a child were recorded by Warwickshire Police in 2017/18 and 2018/19.

In England and Wales there were 4,373 offences of sexual communication with a child recorded in the year to April 2019 compared with 3,217 in the previous year.

The offence came into force on April 3, 2017, following an NSPCC campaign.

The data obtained from 43 police forces in England and Wales under Freedom of Information laws also revealed that, where age was provided, one in five victims were aged just 11 or younger.

In 2018/19, the number of recorded offences in England and Wales involving Instagram, which is owned by Facebook, was more than double that of the previous year.

Overall in the last two years, Facebook-owned apps (Facebook, Messenger, Instagram, WhatsApp) and Snapchat were used in 73 per cent of the instances where Warwickshire Police recorded and provided the communication method. Instagram was used in a third of them.

The Government has indicated it will publish a draft Online Harms Bill early next year, following the NSPCC’s Wild West Web campaign.

The proposals would introduce independent regulation of social networks, with tough sanctions if they fail to keep children safe on their platforms.

The NSPCC believes it is now crucial that Boris Johnson’s Government makes a public commitment to draw up these Online Harms laws and, as a matter of urgency, implements robust regulation forcing tech firms to protect children.

Peter Wanless, NSPCC Chief Executive, said: “It’s now clearer than ever that Government has no time to lose in getting tough on these tech firms.

“Despite the huge amount of pressure that social networks have come under to put basic protections in place, children are being groomed and abused on their platforms every single day.

"These figures are yet more evidence that social networks simply won’t act unless they are forced to by law. The Government needs to stand firm and bring in regulation without delay.”

Freya (whose name has been changed to protect her anonymity) was 12 when, while she was staying at a friend’s house, a stranger bombarded her Instagram account with sexual messages and videos.

Her mum Pippa (whose name has been changed to protect anonymity) told the NSPCC: “She was quiet and seemed on edge when she came home the next day. I noticed her shaking and knew there was something wrong so encouraged her to tell me what the problem was.

“When she showed me the messages, I just felt sick. It was such a violation and he was so persistent.

"He knew she was 12, but he kept bombarding her with texts and explicit videos and images.

"Freya didn’t even understand what she was looking at.

"There were pages and pages of messages, he just didn’t give up.

“Our children should be safe in their bedrooms, but they’re not. They should be safe from messages from strangers if their accounts are on private, but they’re not.”

The NSPCC’s Wild West Web campaign is calling for social media regulation to require platforms to:

Take proactive action to identify and prevent grooming on their sites by:

Using Artificial Intelligence to detect suspicious behaviour

Sharing data with other platforms to better understand the methods offenders use and flag suspicious accounts

Turning off friend suggestion algorithms for children and young people, as they make it easier for groomers to identify and target children

Design young people’s accounts with the highest privacy settings, such as geo-locators turned off by default, contact details kept private and unsearchable, and livestreaming limited to contacts only.

The charity wants to see tough sanctions for tech firms that fail to protect their young users – including steep fines for companies, boardroom bans for directors, and a new criminal offence for platforms that commit gross breaches of the duty of care.