Australian gambling venues have taken their pursuit of safe gambling a step further by installing a facial-recognition system to identify problem gamblers.
The system, which relies on artificial intelligence (AI), will scan everyone entering a gaming venue through small cameras, most of them installed at the facility's entrance. The scans are intended to identify problem gamblers who have been barred from gambling facilities or enrolled in the self-exclusion registry, whether by themselves or by close family members and friends.
The Warilla Hotel in Illawarra already has the system in place, meaning guests are scanned before entering the venue. According to reports, more gaming venues in New South Wales will adopt the technology by next year.
While the move is aimed at promoting safe gambling, many have raised concerns, calling it intrusive and a breach of privacy. In response, ClubsNSW and the Australian Hotels Association NSW (AHA NSW) said they have taken measures to ensure absolute privacy protection.
According to AHA NSW director John Green, the personal data gathered will be encrypted and protected, and kept inaccessible to any third parties, including law enforcement and gambling establishments.
“We think this is the best opportunity we’ve got in preventing people who have self-excluded from entering the venues,” said Green.
On the other hand, Samantha Floreani, program lead at Digital Rights Watch, warned that the bodies involved should be cautious about incorporating the technology into even more aspects of people's lives.
“It is invasive, dangerous and undermines our most basic and fundamental rights,” said Floreani.
“We should be exceptionally wary of introducing it into more areas of our lives, and it should not be seen as a simple quick-fix solution to complex social issues.
“People who opt into self-exclusion programs deserve meaningful support, rather than having punitive surveillance technology imposed upon them. And those who have not opted into these programs ought to be able to go to the pub without having their faces scanned and their privacy undermined.”
Many have demanded that the rollout be reversed and have called for the attention of legislative bodies. To that end, the non-government body Digital Rights Watch is urging that Australia's Privacy Act 1988 be revised to more adequately address the use of facial-recognition technology.