The Way You Walk

Move over, facial recognition: there’s a new camera-based identification technology on the scene. Welcome, gait recognition. 🚶‍♂️

Chinese company Watrix can now identify and track people based on the way they walk. So just do a Monty Python silly walk everywhere? Nope, that won’t do the trick. 🙅‍♂️

According to Huang Yongzhen, the CEO of Watrix: “You don’t need people’s cooperation for us to be able to recognize their identity. Gait analysis can’t be fooled by simply limping, walking with splayed feet or hunching over, because we’re analyzing all the features of an entire body.”
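The core idea, matching people by a whole-body movement signature rather than any single trait, can be sketched as a toy nearest-neighbor lookup over gait feature vectors. Everything here (the feature choices, names, numbers, and threshold) is purely illustrative and not Watrix’s actual method:

```python
import numpy as np

# Toy gait "gallery": each person is summarized by a feature vector of
# whole-body movement traits (stride length, cadence, limb angles, ...).
# All names and numbers are made up for illustration.
gallery = {
    "person_a": np.array([0.62, 1.9, 0.31, 0.75]),
    "person_b": np.array([0.48, 2.3, 0.44, 0.60]),
    "person_c": np.array([0.71, 1.7, 0.28, 0.81]),
}

def identify(probe, gallery, threshold=0.99):
    """Return the closest gallery identity by cosine similarity,
    or None if nobody is similar enough."""
    best_name, best_sim = None, -1.0
    for name, vec in gallery.items():
        sim = np.dot(probe, vec) / (np.linalg.norm(probe) * np.linalg.norm(vec))
        if sim > best_sim:
            best_name, best_sim = name, sim
    return best_name if best_sim >= threshold else None

# A probe close to person_a's vector: the same walker, slightly noisy capture.
probe = np.array([0.63, 1.88, 0.30, 0.76])
print(identify(probe, gallery))
```

Because the match is over many body-wide features at once, perturbing one of them (a limp, splayed feet) only nudges the vector a little, which is the gist of Huang’s claim.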

It appears that China is going to lead the way in all forms of tracking and recognition so it can all be bundled up in a citizen control monitoring system. Big Brother is watching indeed. 👁️

Src: Outer Places

Wu-Tang Was Right

Cash does rule everything around us. 💰

There seems to be a trend amongst Chinese tech companies to deflect when asked about what societal implications their tech could have by shrugging and talking dollar signs. 💲💲💲

Exhibit A:

“We’re not really thinking very far ahead, you know, whether we’re having some conflicts with humans, those kinds of things,” [SenseTime co-founder Tang Xiao’ou] said. “We’re just trying to make money.”

Src: Bloomberg

Exhibit B:

According to Huang Yongzhen, the CEO of Watrix: “You don’t need people’s cooperation for us to be able to recognize their identity. Gait analysis can’t be fooled by simply limping, walking with splayed feet or hunching over, because we’re analyzing all the features of an entire body.”

Src: Outer Places

Exhibit C:

“We don’t support the government,” [Su Qingfeng, the head of ZTE’s Venezuela unit,] said. “We are just developing our market.”

Src: Reuters

I find it interesting that state-supported companies in a Communist country keep using Capitalism as a shield. 🛡️

SenseTime is Watching 👀

Computer vision is the engine behind China’s Panopticon, and SenseTime is the engine behind many of those computer vision capabilities. A lot of what they have developed has a dystopian feel to it: hidden cameras scanning faces (among other things) and triggering the appropriate actions via AI. 📹

“That’s really how they see future interactions,” says Jean-François Gagné, who runs Canadian startup Element AI Inc. “You don’t need to log in to your computer, you don’t need to get a boarding pass, you don’t need to do anything anywhere. You’re just recognized.”

Not gonna lie, that does sound pretty cool. No more remembering passwords or tickets, no panicked pocket checking. But it glosses over the tyrannical implications of the tech, like freezing a dissident out of everything based on their face. 👤

Src: Bloomberg

They’re Making a List, And Checking it Lots 📇

Oh yay, China is exporting its panopticon. 🤐

First up? Venezuela. The land of oil, soaring inflation, an imploding economy, and a leader who’s super into overt political intimidation. 🔨

It can’t be that bad, right? 🤷

“What we saw in China changed everything,” said the member of the Venezuelan delegation, technical advisor Anthony Daquin. His initial amazement, he said, gradually turned to fear that such a system could lead to abuses of privacy by Venezuela’s government. “They were looking to have citizen control.”

Uh, maybe it can. 😟

This holiday season’s must-have gift? A build-your-own-authoritarian-regime kit. 🎁

Src: Reuters

The Spy Who Doctored You 👩‍⚕️

So many mixed emotions on this bit of news. Place a device in your small apartment/home and let it monitor your health via electromagnetic disturbances, even through walls. 😨

On one hand, it’s really cool to think of the positive impacts this could have: uncovering trends, monitoring habits, and divorcing data collection from the feebleness of human memory, since there’s no device to remember to put on or charge. And then there are the home automation aspects it could be adapted for. The truly immersive smart home hub. 💡

On the other hand, the surveillance possibilities and sketchy corporate uses beyond the benign health tracking are rather terrifying. I could see China undertaking a mass rollout of devices like these to augment their camera and digital tracking network. Just as I could see insurance companies requiring the use of these devices to issue policies, and using the data to “personalize” pricing in real time, and not to the benefit of the customer. 🙀

These are exciting and terrifying times. 🔮

Src: MIT Tech Review

When People Do Bad Things With Algorithms 😈

There is a lot of concern surrounding decisions made by algorithms and the bias baked into those systems, but that is far from the only concern. These podcast episodes do a tremendous job illustrating what happens when people use neutral data for the wrong things, when the reason for collecting the data becomes perverted. At the end of the day, AI systems and algorithms are the products of humans, and we are far from perfect, logical, and rational. We’re still our own worst enemy. 👤

Src: The Crime Machine, Part 1

Src: The Crime Machine, Part 2

With Great Computing Power Comes Great Responsibility? 🚨

Maybe that open letter decrying autonomous weapons wasn’t the best choice? 🤔

Relax, nothing crazy happened. Paul Scharre just brought up some really good points in this interview with the MIT Tech Review. They boil down to this: the best way to impact the smart weapons sector is to help educate and steer policy, not to stay away from it. 🔖

The open letter is the typical tech sector response to a problem like this, too: avoid it and shift blame. “We’re just engineers.” 🙄

Smart weapons are coming one way or another, and I like the idea of having the people concerned about them involved in their creation and regulation. ⚒

Src: MIT Tech Review

Facial Rec Tech Mess 😟

This article is short but meaty. Synopsis: a lot of people are concerned about the current state of facial recognition and what it could mean for the future. I’m going to use the same headings as the MIT post and offer my random thoughts. 💭

The questioners: The call for regulation and safeguards around facial recognition has been sounded. It is definitely a field that warrants a closer look by various watchdog groups due to the concerns and potentials outlined below. 📯

Eye spies: China has a very robust recognition system in place. China is also an authoritarian government that controls information and has a social credit scoring system in place. Facial recognition can allow for a level of monitoring and control that hasn’t been truly feasible until now, whether governmental or military. And when the tech giants are asking for regulation, you know something’s up. Do we want to be like China? 🇨🇳

Oh, (big) brother: News flash, facial recognition might not be perfect! My bigger concern is that Amazon’s response to the ACLU’s findings is that “the system was used incorrectly.” Really? That’s the response? Issue #1: blaming the user has not been going well for tech companies lately, so I’m not sure this was the best course of action. Issue #2: WHY CAN THE SYSTEM BE USED INCORRECTLY?!?!? Sorry for the yelling, but if the ACLU can use it incorrectly, that means every law enforcement agency using the software can also use it incorrectly. This seems like a big problem with the system. Maybe make the system simple and foolproof before sending it out into the wild to determine people’s fates and futures. 🤦 🤦‍♀️

Bias baked in: Nothing new here, but another reminder that bias is a very real factor in these systems and needs to be addressed early and often in the process. One big step to help would be creating and using more diverse data sets. 👐

Src: MIT Tech Review

Targeted Listening 👂

Google recently demoed an AI model that can focus on one voice in a noisy environment. It might sound easy because we do this pretty naturally, but machines aren’t like us: mics pick up everything in their range without distinguishing, kind of like how a photo can look different from what you saw with your own eyes. 👀
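The general shape of the trick is to estimate a mask over the mixed signal’s frequencies that keeps only the target voice. Google’s actual model learns that mask with a neural network; the sketch below fakes the “voices” with pure tones and hand-picks the mask, purely to illustrate the idea:

```python
import numpy as np

# Toy "targeted listening": two speakers mixed into one signal, then a
# frequency mask keeps only the target voice. The tones, frequencies,
# and hand-picked mask below are illustrative, not a real separator.
rate = 8000
t = np.arange(rate) / rate              # one second of "audio"
target = np.sin(2 * np.pi * 220 * t)    # "target speaker" at 220 Hz
other = np.sin(2 * np.pi * 523 * t)     # "background speaker" at 523 Hz
mixture = target + other                # what the microphone hears

# Move to the frequency domain and zero out everything except a band
# around the target speaker's frequency (the "mask").
spectrum = np.fft.rfft(mixture)
freqs = np.fft.rfftfreq(len(mixture), d=1 / rate)
mask = np.abs(freqs - 220) < 50         # keep only content near 220 Hz
recovered = np.fft.irfft(spectrum * mask, n=len(mixture))

# The recovered signal should match the target, not the mixture.
err = np.max(np.abs(recovered - target))
print(f"max reconstruction error: {err:.6f}")
```

A real system has to infer that mask from overlapping, time-varying voices, which is exactly the part the learned model supplies.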

This is yet another AI advance that is, as Android Police said, cool and terrifying. On the cool side, this could make digital assistants really useful. And who knows what other beneficial uses will be dreamed up. On the terrifying side, Big Brother could be listening as well as watching. 🕴

Src: Android Police