Amazon has pulled from its e-commerce website several children’s clothing items displaying a sexually explicit message, following a CBC News investigation.
The items, sold by third-party sellers, included a dress, T-shirt, summer hat and hoodies boldly displaying the message “I love c–k,” using a heart emoji. Sometimes a rooster emoji replaced the word “c–k,” which can mean a rooster but is also slang for a male appendage.
Ads for the products showed children modelling the clothing.
“This is disgusting,” said Karolina Zikova of Chilliwack, B.C., who alerted Amazon, CBC News and the Canadian Centre for Child Protection to the problem last week after discovering one of the items when shopping on Amazon.ca.
“It can be connected with pedophilia,” she said.
Following a CBC inquiry, the Seattle-based e-commerce company removed the items.
“The larger question is how does this type of material even hit their services?” said Signy Arnason, associate executive director of the Canadian Centre for Child Protection, which also flagged the problem to Amazon.
“It’s normalizing the sexual commodification of children,” Arnason said.
‘Who is buying these things?’
Zikova first discovered the items when searching on Amazon for a bathing suit for her eight-year-old niece. That’s when she came across an ad for a “girl’s sporty swimsuit,” which showed a young girl wearing a white bathing suit with the message “I love c–k” displayed repeatedly.
“I was quite shocked because it’s a girl that’s maybe seven, eight years old in the picture,” she said. “How is it possible that somebody is selling it, and who is buying these things?”
Zikova complained to Amazon, which removed the bathing suit from the site.
Concerned that Amazon may still be selling similar items, Zikova further searched the site. This time, she was dismayed to find an ad for a children’s hoodie displaying the same explicit message, modelled by a young boy.
Zikova contacted Amazon using its online chat option, but this time she was unsuccessful in getting the item removed.
According to the online chat transcript, the employee she conversed with did not appear to understand the scope of Zikova’s complaint. After Zikova protested, the employee said someone from a different department would contact her.
She said she hadn’t heard from Amazon by the following day, so she contacted CBC News.
“I hoped that … it will go public, so they will actually have to do something about it.”
In the meantime, Zikova uncovered several other children’s items bearing the same “I love c–k” message. They included a “Christmas dress for girls” modelled by a young girl and marketed as “funny.”
“How is this funny?” she said.
Amazon told CBC News in an email on Sunday that the items violate the company’s offensive products policy, which bans children’s items with adult content, including sexual references.
“All sellers must follow our selling guidelines and those who do not will be subject to action, including potential removal of their account,” an Amazon spokesperson said.
Amazon said the second employee Zikova spoke with should have followed proper procedure to resolve the complaint and that the company is providing additional training to customer service staff as a result of what happened.
The company also reported it had conducted an investigation to ensure no similar products remained on its site.
However, the following day, the Canadian Centre for Child Protection informed CBC News that a similar item was still available on Amazon’s Canadian site: a T-shirt for both adults and kids that referred to a sexual act involving “daddy” and “c–k.”
The organization said it notified Amazon about the T-shirt on Monday morning.
CBC News contacted the third-party seller, Khang Cò, which removed the T-shirt late Monday night.
“It’s our mistake when picking the product,” a representative with the business wrote in an online message. “Thanks for letting me know.”
Amazon said on Tuesday it’s now reviewing its product catalogue for any listings it may have previously missed.
Both Zikova and Arnason, with the child protection centre, said they want Amazon to adopt tighter controls to prevent similar items from appearing on its site.
“You would not find a retailer that would be able to put [these items] in their window,” Arnason said. “They would be shut down, police would be involved.”
Amazon said that its technology, as well as dedicated staff, constantly scans all products listed for sale to find and immediately remove ones that violate its policies.
Last year, the online retailer deleted the N-word from a product description of a black-coloured action figure and admitted to CBC that its safeguards failed to screen out the racist term.
Given the size of Amazon’s marketplace, it would be difficult for the company to vet every single product, according to retail analyst Alex Arifuzzaman.
The company offers hundreds of millions of items, many from third-party sellers.
“It’s never going to be perfect,” said Arifuzzaman, with Toronto-based InterStratics Consultants. “There’s always going to be, kind of, things seeping through the edges there.”
Still, he said, Amazon needs to search for ways to improve its vetting process.
“There has to be some kind of an innovative solution,” Arifuzzaman said, such as making third-party sellers sign an agreement guaranteeing the products they sell aren’t offensive.
“And if it is, then there is some kind of a penalty in place,” he said.