Kids as young as seven are accessing age-restricted items such as weapons, alcohol and vapes in the United Kingdom, digital identity platform Luciditi is warning, as it launches a new age verification product this week.
The British company is arguing that the country needs to step up its enforcement of incoming regulations to prevent companies from selling age-restricted products and services to underage people online.
“The law is very clear in that it is an offense to sell items such as weapons, alcohol and vapes to under-age children online and yet our study reveals that many online businesses across the UK still don’t have either the technological capabilities or the appetite to adhere to it,” says Ian Moody, co-founder and CEO of Luciditi.
When the regulation is finally adopted, however, the company will have a solution ready: Its new online age check product, Luciditi Age Assurance, can be deployed across an online retailer’s website or mobile app. The product does not reveal the identity of the shopper, the company says in a release.
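The pattern described here, confirming an age attribute without disclosing who the shopper is, is commonly built on signed attribute tokens. The sketch below is purely illustrative and assumes a shared signing key between verifier and retailer; the function names and token format are hypothetical and not Luciditi’s actual API.

```python
# Hypothetical sketch of a privacy-preserving age check: the verifier
# issues a signed token asserting only "over_18", never the shopper's
# name or date of birth. Names and flow are illustrative assumptions.
import hmac
import hashlib
import json

SHARED_KEY = b"demo-key"  # assumption: a key shared with the verifier

def issue_token(over_18: bool) -> dict:
    """Verifier side: sign an attribute-only claim (no identity data)."""
    claim = json.dumps({"over_18": over_18}).encode()
    sig = hmac.new(SHARED_KEY, claim, hashlib.sha256).hexdigest()
    return {"claim": claim.decode(), "sig": sig}

def check_token(token: dict) -> bool:
    """Retailer side: allow the sale only if the signed claim is valid."""
    claim = token["claim"].encode()
    expected = hmac.new(SHARED_KEY, claim, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        return False  # tampered or forged token
    return json.loads(claim)["over_18"]

print(check_token(issue_token(True)))   # True: sale allowed
print(check_token(issue_token(False)))  # False: sale blocked
```

The retailer learns a single boolean, which is why such schemes can satisfy age-restriction rules without collecting identity documents at the point of sale.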
Luciditi provides digital identity verification, Right to Rent and Right to Work checks certified under the Digital Identity and Attributes Trust Framework (DIATF), and passwordless biometric authentication. Facial age assurance and age verification with ID checks and selfie biometrics are also among its product suite.
While the UK may not have enough regulations for online sales, it is preparing a new law on content restrictions for minors. The Online Safety Bill obligates all internet platforms with British users to prevent minors from accessing harmful and illegal content such as child sexual abuse, hate crimes, self-harm, terrorism, illegal drugs or weapons and more.
Biometrics may become an important part of how the Online Safety Bill is finally implemented. Both voice and face biometrics have been floated as a solution for age verification despite criticism from privacy organizations, including a new warning this week from the Electronic Frontier Foundation.
Australia’s online safety commissioner argues for stronger age verification
Access to prohibited items such as alcohol and exposure to unsavory content are not the only dangers lurking online for minors. Australia’s regulator for online safety warned this week that platforms should introduce stronger age verification to prevent kids from being coerced into producing sexual abuse material.
eSafety Commissioner Julie Inman Grant presented research conducted by the agency which showed that one in eight (12 percent) child sexual abuse videos and photographs were “self-generated,” meaning that children were groomed or coerced by predators into producing the material.
Predators found their victims on social media sites or online gaming platforms, while the materials were distributed through messaging platforms. The analysis covered more than 1,300 child sexual abuse material (CSAM) reports collected within the past year, ABC reports.
The commissioner argues that while parents play an important role, tech companies are equally responsible for making their products safe for the community. The age self-declaration measures on social media are too easy to circumvent and must be made stronger.
“We do need more rigorous age verification, but I’d say verification technologies overall,” says Inman Grant.
Earlier this year, the eSafety commissioner published age verification recommendations, the Roadmap for Age Verification, which proposed trials of age-assurance technologies. The Australian federal government said that the recommendations will not be followed until it finalizes a new industry code for preventing children from accessing legal but age-inappropriate adult content.
In August, Australia’s Albanese government declined to impose mandatory age verification for adult content, citing the immaturity of current technology options.