Artificial Intelligence (AI), often used interchangeably with Machine Learning, can be applied to images, and that is both a blessing and a curse. In healthcare it's a great adjunct to diagnosis: several companies sell software that analyzes x-rays and flags areas the algorithm thinks the doctor should evaluate. I've been lucky enough to work with some dental AI systems, and I think they are the future as a diagnostic aid.
However, just as a program can learn what tooth decay looks like on an image, there are companies out there selling programs that can easily and rapidly identify an individual in a photograph, or even on a live CCTV camera on a city street. China is probably the world leader in tracking its citizens: cities on the Chinese mainland have cameras everywhere, and facial recognition programs can follow where you go at every moment. That's scary.
Of course, that would never happen in a democracy, right? Well, maybe U.S. citizens aren't being tracked *everywhere* they go by our government, but with our addiction to taking photos and sharing them online, it's certainly easy to develop "patterns of life" about individuals. That means someone, somewhere, knows that every Monday through Thursday you're in the gym from 6:00 to 7:30 am. Or... that you were at a peaceful Black Lives Matter protest.
In most instances the AI algorithms are owned and run by the social media companies. When you sign up for Facebook, Instagram, or Snapchat, nobody (or hardly anyone) actually reads the Terms of Service (TOS), and you end up agreeing to let a company do things with your data. However, not too long ago a company named Clearview AI came up with a concept that isn't exactly ethical, but they did it anyway.
Clearview AI developed a program that scoured the web and grabbed every photo it could find. Automatically harvesting data (including images) that users never agreed to hand over is called "scraping," and Clearview scraped a lot of images from the web. It used all of those images to train its AI engine, and that engine got pretty good at recognizing faces. Then Clearview went to other businesses and law enforcement agencies and offered them the ability to identify faces captured on surveillance video.
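To make "scraping" concrete, here's a minimal sketch of the technique. The URL is hypothetical and this is nothing like the scale Clearview operated at, but the core loop really is this simple: pull down a page, find every image tag, save the files. Nobody pictured ever opted in.

```python
# A minimal image-scraping sketch. The URL is hypothetical; real scrapers
# crawl millions of pages, but the basic loop looks like this.
import os
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://example.com/photos"  # hypothetical page to scrape

def scrape_images(page_url: str, out_dir: str = "scraped") -> None:
    os.makedirs(out_dir, exist_ok=True)
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for i, tag in enumerate(soup.find_all("img")):
        src = tag.get("src")
        if not src:
            continue
        img_url = urljoin(page_url, src)  # resolve relative paths
        data = requests.get(img_url, timeout=10).content
        with open(os.path.join(out_dir, f"img_{i}.jpg"), "wb") as f:
            f.write(data)

scrape_images(PAGE_URL)
```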
The good news is that some researchers and developers are now fighting back. I just read an article in MIT Technology Review that highlights programs you can use to make your images unreadable to the AI engines. Cool, huh?
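The article doesn't come with code, but the idea behind these "cloaking" tools is worth sketching: nudge an image's pixels by amounts too small for a human eye to notice, chosen specifically to push a recognition model's features off target (Fawkes, from the University of Chicago, is one real tool built on this idea). Below is a toy illustration with a random linear map standing in for an actual face-recognition network, so treat it as the concept, not the tool:

```python
# Toy illustration of image "cloaking": shift pixels by an imperceptible
# epsilon in the direction that most changes what the model "sees".
# A random linear map stands in for a real face-recognition network.
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((64, 64))              # stand-in for a grayscale photo
W = rng.standard_normal((64 * 64, 128))   # stand-in "embedding model"

def embed(img):
    return img.ravel() @ W                # model's feature vector

# Gradient of the squared feature norm w.r.t. the pixels; taking its sign
# is the same trick FGSM-style adversarial perturbations use.
grad = (W @ (2 * embed(image))).reshape(64, 64)
epsilon = 2 / 255                         # far below what the eye notices
cloaked = np.clip(image + epsilon * np.sign(grad), 0.0, 1.0)

print("max pixel change:", np.abs(cloaked - image).max())   # about 0.008
print("feature shift:", np.linalg.norm(embed(cloaked) - embed(image)))
```

The printed numbers show the trade-off: the pixel changes stay under one percent of the brightness range, while the model's feature vector moves measurably.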
This is going to become a cat-and-mouse game, just like security: hackers find weaknesses, companies close them, and hackers then go hunting for more.