In communist Hong Kong and China there is no privacy of anything including data!
For Hong Kong (HK) people, personal data is now even more important after the illegal annexation of the SAR under the national security law by the Chinese Communist Party (CCP)!
THIS IS THE HK GOVERNMENT AT ITS FINEST: In 2017, the HK government's Registration and Electoral Office (REO) lost, or had stolen, two laptops containing the voter registration details of 3.7 million HKers! Was anyone found guilty of anything? Was anyone disciplined? Was any follow-up action taken to further secure HKers' data?
As always, when HK government is being held to account silence reigned/rained and we all got wet!
In the RTHK article below, was the policeman charged with a violation of the law and taken to court? Note that to date, NOT one single policeman has been charged with anything related to the HK protests, despite all that the UN special rapporteurs have written about, etc.!
Does the HK government expect HK people to believe anything it says or does on data privacy or anything else? HK urgently needs political reform!
HK people know that all of their data held by mainland Chinese companies including banks is already in the hands of CCP.
Alibaba, and the companies it offers services to, use commercial face recognition to suppress Uyghurs in China! See the article below.
So when can we in HK expect mainland Chinese companies to start using face recognition technology tied to our private data to oppress HK people?
In Hong Kong the fox is in charge of the chickens! What will the cunning fox do next?
The HK government's COVID-19 response = mass surveillance = data privacy & human rights violations. How can Hongkongers evade this state intrusion?
Prosperity and stability in HK depend upon HK people's consent to vaccination during COVID-19. Isn't the government dreaming if it thinks it can win the total submission and cooperation of the people needed to achieve the 70-80% vaccination rate required for herd immunity?
Cop showing ID card to camera breached privacy laws
RTHK 22 December 2020 (format added)
The Privacy Commissioner said on Tuesday an investigation into an incident involving a police officer displaying a reporter’s ID card in front of a live-recording camera during an anti-government protest last year found that the officer breached privacy laws.
On December 26, an officer at Tai Po Mega Mall demanded to see the ID card and press cards of a Stand News reporter covering the protest. He held up the journalist's two press cards to a TV camera, and then did the same with the ID card, for around 40 seconds.
Following an investigation, the watchdog said the officer had used the reporter's personal data without his consent, and that his action was neither consistent with nor directly related to the purpose of stop-and-searches, namely verifying a reporter's identity.
In a report, the Privacy Commissioner said the officer had breached a data protection principle listed in the Personal Data (Privacy) Ordinance.
The commissioner recommended that the police revise their Force Procedures Manual to make sure officers are aware of and comply with data protection principles.
The report said the police manual should cover principles stated in the privacy laws, that a data user should only use and disclose personal data for the original purpose of collection, and that they should take steps to make sure the data they hold is protected against unauthorised or accidental access or use.
It called for clear policies and guidelines to make sure frontline officers would protect people’s personal data when they carry out stop and searches.
The Privacy Commissioner also called on the police to boost their training for officers to establish a culture for personal data privacy and enhance their “professional image and service quality.”
In response, the police said they accept the findings of the investigation and will follow up on the recommendations proposed by the watchdog.
They also said they had rebuked the officer concerned and will carry out a disciplinary review.
The force said it will review relevant policies and guidelines and enhance training for officers to make sure they understand and abide by the privacy laws.
Alibaba Cloud Services Offered Racial Profiling of Uyghurs to Commercial Clients: Report
RFA 22 December 2020 (format added)
Online sales and tech giant Alibaba is offering facial recognition services to its cloud customers that enables them to detect members of the ethnic minority Uyghur group, who are already being subjected to mass incarceration in "re-education" camps by the ruling Chinese Communist Party (CCP) in the northwestern region of Xinjiang, according to a report by a group tracking video surveillance around the world.
While China's law enforcement agencies are already known to use this kind of technology to carry out racial profiling of Uyghurs and other ethnic minorities, the report by IPVM said this is the first time it has been detected being used commercially in China.
Alibaba, which is listed on the New York Stock Exchange with a market capitalization of around U.S.$700 billion, "openly offers Uyghur/'ethnic minority' recognition as ... service [on Alibaba Cloud], allowing customers to be alerted any time Alibaba detects a Uyghur," the report found.
It said Alibaba Cloud had "quickly deleted" mentions of Uyghurs and minority detection on its website after IPVM contacted the company for comment.
Alibaba Cloud then claimed, without evidence or explanation, that these features were only used "within a testing environment," said IPVM, which collaborated with the New York Times on reports about the racial profiling of Uyghurs using facial recognition; the newspaper has also published its own report.
Earlier this month, IPVM and The Washington Post said that fellow Chinese tech giants Huawei and Megvii had also tested and validated 'Uyghur alarms' as part of facial recognition software intended to be used by police as a part of a nationwide video surveillance network.
Alibaba Cloud, also known as Aliyun, claims three million customers around the world, and is China's largest cloud service.
The reference to Uyghur facial recognition was part of Alibaba's "Cloud Shield" solution, which offers clients the ability to detect and recognize text, pictures, videos, and voices containing pornography, politics, terrorism, advertisements, and spam.
"Terrorism" has frequently been used by the CCP as a catch-all charge targeting Uyghurs for normal religious activities, including wearing modest clothing, growing beards, reading the Quran, and fasting during Ramadan.
A version of Alibaba Cloud's content security API page cached by Google on Dec. 13 showed a query that could determine whether a person "is a member of an ethnic minority" by analyzing their facial features.
Elsewhere, the Cloud Shield API Guide places the word Uyghur in parentheses after the word "minority."
"The technology works for any video or pictures with Uyghur faces, so even an anodyne prerecorded video of a Uyghur explaining her first day of university would be flagged if the Alibaba Cloud client had toggled on this feature and it worked as intended," the IPVM report said.
It said Alibaba owns Youku, a video platform similar to YouTube, that the company's e-commerce platforms make heavy use of livestreaming, and that Weibo, one of China's largest social media apps, also uses Alibaba Cloud.
The detection of Uyghurs isn't offered by Alibaba's global services in English, suggesting that the function is only offered in China, the report said.
"Soon after IPVM and The New York Times reached out for comment, Alibaba removed the API Guide mentioning Uyghurs [and] the 'ethnic minority' face detection feature from the two pages mentioning it," it said, adding that Alibaba's claim that it was only "testing" the service was not supported by any evidence.
An August 2020 API on the website of fellow cloud services provider Kingsoft, which held a U.S.$10 billion IPO on the NASDAQ in May, also offered "Uyghur, non-Uyghur" face detection, IPVM said.
Kingsoft has since deleted this API from its website, denying it could identify Uyghurs.
"The Subject API was not able to distinguish or identify individuals of Uyghur background," the company said in a statement. "The labeling on the basis of any race is inappropriate and inconsistent with Kingsoft Cloud’s policies and values."
"This misleading product is being withdrawn and we will conduct a full review of our API platform," it said.
"It has been well documented that police in China use Uyghur 'alerts' in their video surveillance systems," IPVM said.
"Alibaba's offering of this explicitly racist technology to its vast Cloud clientele shows the repression of Uyghurs goes well beyond law enforcement," it said in conclusion.
Reported by Lin Peiyu for RFA's Mandarin Service. Translated and edited by Luisetta Mudie.