OPINION

I have a writer friend who writes erotica of a, shall we say, ‘niche’ variety. Now, just as Agatha Christie didn’t go around murdering people, this writer has no real-life interest in or experience of the things she writes about. Some of her fans, however, seem to assume otherwise. Worse, they apparently want to get in on the action.

In a recent post to a writers' discussion group we're in, she asked how she should handle certain queries from fans. She was getting a disturbing number asking for intimate details about her real-life family. More disturbingly, some asked for photos and descriptions of her children.

As it turns out, though, this sort of creepy predatoriness is not reserved for erotica authors.

One in eight Australians on dating apps has received a request to facilitate child sexual exploitation or abuse, according to a survey of 10,000 people.

The Australian Institute of Criminology (AIC) has released a report showing that a staggering 12.4 per cent of respondents had received at least one of these five requests:

• A request for photos of their children or other children they had access to

• Pressure to provide sexual images of those children

• A request to meet those children before it was “appropriate”

• Requests for information of a sexual nature about those children (e.g. breast size, whether they had their period)

• An offer of payment for photos, videos or live streams of those children

Now, some of those may be innocent enough. Maybe. But most are not so much red flags as gigantic, lurid warning beacons.

While a request for photos may sound “quite innocuous”, researchers found the majority of cases were in fact of a sinister nature.

AIC Deputy Director Rick Brown said the “results were quite disturbing”.

And some are, on further investigation, obviously very far from innocent.

“We found that about half (48.5 per cent) of those that had been asked for an image, reported being pressured to provide sexual images of children,” he said.

“Sixty-nine per cent reported being asked questions of a sexual nature about the children, and in about 63 per cent of cases, the individual was offered payment for photos, videos or live streams of the children.”

Here’s the thing, though: predators only try this stuff on if they think it’s going to work. The sheer prevalence of the tactics is, criminologists say, an indicator that they’re hitting paydirt often enough to spur them on.

“It would be reasonable to assume that at least some of those [people] would have followed through, just by the very numbers,” he said.

“One would suspect that it’s successful for some [predators].”

And, like most predators, they target the easier marks.

The survey uncovered five factors that made a person more likely to receive requests from child predators.

Younger people, First Nations people, people whose first language was not English, and people with a disability or long-term illness were all at higher risk, according to Dr Brown.

People who'd chosen to link their social media accounts to their dating app profiles were also at higher risk.

And if you think it's just creepy men in raincoats, think again. Men were just as likely as women to be targeted, and just as often the requests came from women — though criminologists suspect at least some of those "women" are men posing as female.

At least, unlike certain other social media companies, the dating apps are trying to improve safety for their users.

Match Group – which owns Tinder, Hinge, OkCupid and Plenty of Fish – introduced a “law enforcement portal” in late 2021 and optional ID verification in late 2023.

There’s also an in-app safety centre and reporting mechanisms that can lead to a profile being banned.

Still, don’t put anything online that you wouldn’t be comfortable telling a random stranger in the street and, for God’s sake, get a grip on privacy settings.
