The tech—which uses artificial intelligence (AI) to identify addicts who have asked to be barred from betting venues—is set to be rolled out across gambling venues in the state of New South Wales (NSW) next year.
Supporters say it will help curb problem gambling in a country where the addiction affects about 1 percent of the population and annual losses run to billions of dollars.
But the technology is “invasive, dangerous and undermines our most basic and fundamental rights,” said Samantha Floreani, program lead at Digital Rights Watch, a nonprofit.
“We should be exceptionally wary of introducing it into more areas of our lives, and it should not be seen as a simple quick-fix solution to complex social issues,” she said.
‘Best opportunity’
Facial recognition systems use AI to match live images of a person against a database of images—in this case a gallery of people who have voluntarily signed up to a “self-exclusion” scheme for problem gamblers.
If the camera identifies someone on the statewide database, a member of staff is alerted so they can be denied entry to casinos or escorted away from slot machines in hotels and bars.
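The article does not detail the vendor's actual system, model or matching threshold. As a rough illustration only, the minimal Python sketch below shows the general shape of such a check: a face embedding from a live camera frame is compared against a register of enrolled, self-excluded patrons, and a match above a similarity cut-off triggers a staff alert. The `embed` step, the `check_against_register` helper and the `SIMILARITY_THRESHOLD` value are hypothetical stand-ins, not details of the NSW deployment.

```python
# Conceptual sketch only: the real system's model, threshold and data
# handling are not described in the article. Embeddings here are random
# toy vectors; a real deployment would use a trained face-recognition model.
import numpy as np

SIMILARITY_THRESHOLD = 0.8  # hypothetical cut-off for declaring a match


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_register(live_embedding: np.ndarray,
                           register: dict[str, np.ndarray]) -> str | None:
    """Compare a live capture against the self-exclusion register.

    Returns the matched record ID if any enrolled embedding is similar
    enough, otherwise None (no alert is raised).
    """
    best_id, best_score = None, 0.0
    for record_id, enrolled in register.items():
        score = cosine_similarity(live_embedding, enrolled)
        if score > best_score:
            best_id, best_score = record_id, score
    return best_id if best_score >= SIMILARITY_THRESHOLD else None


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy "register" of enrolled embeddings (random vectors for illustration).
    register = {f"excluded-{i}": rng.normal(size=128) for i in range(3)}
    # A live capture resembling one enrolled person (noisy copy of their vector).
    live = register["excluded-1"] + rng.normal(scale=0.1, size=128)
    match = check_against_register(live, register)
    if match:
        print(f"Alert staff: possible match with record {match}")
    else:
        print("No match: no alert raised")
```

In practice the accuracy of such a system depends heavily on the quality of the embeddings and the chosen threshold, which is part of why critics cite misidentification risk.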
“We think this is the best opportunity we’ve got in preventing people who have self-excluded from entering the venues,” said John Green, director of Australian Hotels Association NSW.
The data collected will be secured and encrypted, and will not be accessible to any third party, including the police, or even to the gambling venues themselves, Green said.
However, digital rights groups said the technology was ineffective at stopping problem gambling and could later be repurposed for wider surveillance, adding that such projects underline the need for tougher privacy and data rights laws to protect citizens.
Increasingly used
“People who opt into self-exclusion programs deserve meaningful support, rather than having punitive surveillance technology imposed upon them,” said Floreani of Digital Rights Watch. “And those who have not opted into these programs ought to be able to go to the pub without having their faces scanned and their privacy undermined.”
Facial recognition technology is increasingly used for everything from unlocking mobile phones to checking in for flights. It has also been adopted by some police forces.
Advocates say it helps keep public order, solve crime and even find missing people.
Critics say there is little evidence it reduces crime, and that it carries an inherent risk of bias and misidentification, especially for darker-skinned people and women.
Gambling industry bodies have said the facial recognition cameras would only be used to enforce the self-exclusion scheme.
A draft law introduced in the NSW parliament last month, which would formally legalize the tech in clubs and pubs, includes language that would allow other uses, such as identifying people banned for being too drunk.
The Australian Human Rights Commission last year called for a ban on the technology until it is better regulated with “stronger, clearer and more targeted” human rights protections.
There is also growing pushback against facial recognition in Europe, the United States and elsewhere, with companies including Microsoft and Amazon ending or curbing sales of the technology to the police.