PimEyes and Clearview AI

face recognition

Sunday, January 26, 2020

Folks, it doesn’t get any more obvious than this. You are constantly being tracked, and your information is being mined for programs you were never aware of or signed up for. This isn’t paranoid, 1984-style thinking; it’s simply the way the world is moving.

A few months ago we came across articles about how protesters in Hong Kong were donning hats and face masks to defeat facial recognition, and how London police were pairing street cameras with new AI algorithms to identify people, all “in the name” of preventing future crimes.

New programs are coming out, and we will point out some that you might already be using or may consider using, depending on which side of the equation you’re on.

One example is PimEyes, which Steemhunt described as a search engine that can track and recognize people’s faces.

Now, you may have already used a similar tool such as TinEye for reverse image searches to find the origin of various photos, but PimEyes is tailored specifically toward faces.

Our staff has also seen a steady uptick in online fraud and scams, especially among the “lonely hearts club,” and we’re talking about singles who just want to find a date, a hookup, or a partner on online dating sites. Unfortunately, many apps and sites are “The Wild Wild West”: somewhat unregulated, and impossible to completely safeguard against shysters who just want your phone number or your Gmail address, or to lure you onto another platform like Hangouts and send you phishing links and malware. (We know this digresses into a different topic, but bear with us.) The upside is that reverse search engines like TinEye and PimEyes can sift through thousands of photos in seconds and help you recognize whether you, as a prospective single, have found another prospective single or a fraudster who is stealing an innocent party’s photo and trying to pass it off as their own. Gasp!
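As a rough illustration of what happens under the hood of a face-based reverse search, here is a minimal sketch using the open-source face_recognition Python library. To be clear, this is not PimEyes’ or TinEye’s actual code (which isn’t public); the file names and the 0.6 distance cutoff are illustrative assumptions. Each photo is boiled down to a numeric “embedding,” and two faces are treated as a likely match when their embeddings are close enough.

```python
# Minimal sketch of face-based reverse image matching.
# Uses the open-source `face_recognition` library (pip install face_recognition);
# this is NOT PimEyes' or Clearview's actual code -- just an illustration.
import face_recognition

# Hypothetical file names for illustration.
profile_photo = face_recognition.load_image_file("dating_profile.jpg")
known_photo = face_recognition.load_image_file("suspected_original.jpg")

# Each detected face is reduced to a 128-number "embedding" (a numeric fingerprint).
profile_encodings = face_recognition.face_encodings(profile_photo)
known_encodings = face_recognition.face_encodings(known_photo)

if not profile_encodings or not known_encodings:
    print("No face found in one of the images.")
else:
    # Smaller distance = more similar faces; 0.6 is the library's usual cutoff.
    distance = face_recognition.face_distance([known_encodings[0]],
                                              profile_encodings[0])[0]
    if distance < 0.6:
        print(f"Likely the same person (distance {distance:.2f}).")
    else:
        print(f"Probably different people (distance {distance:.2f}).")
```

A service like PimEyes presumably does something conceptually similar at massive scale: precompute embeddings for crawled photos, then compare your query face against all of them.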

Now, according to a January 18, 2020 article in The New York Times, Clearview AI is one company working with law enforcement, scraping social media and the wider web in an effort to pull together a facial database larger than any ever seen.

Most people agree that Facebook-style social media tracking is creepy and an invasion of privacy, and most people agree with the basic concept of a fundamental right to privacy.

The app has already been used to solve cases of shoplifting, identity theft, and other crimes. Until now, many had held back from such a massive scouring of data. But with this new tech comes AI and, according to the NYT’s analysis of the program code, augmented reality identification, along with the possibility of weaponizing data searches.

You know that Checkmate-style background-check site you used when you were worried an ex was cheating, or perhaps when you wanted to actually follow and stalk someone or dig up juicy dirt… or maybe even, lest we say it, use some of it in the political arena? Well, in the wrong hands this data could likewise be exploited. But isn’t that always the case? A butter knife, a car, or a loaded weapon can be used for useful purposes or to destroy lives, depending on the intent. The same goes for many industries, such as weapons, that people want to regulate but don’t know where to draw the line without unduly burdening those with helpful intentions along with those without. Same with privacy.

Apple has recently been featured in the news again because it tries to push for and stand up for consumer privacy, in part because that is good business. As Spock puts it, “The needs of the many outweigh the needs of the few,” per a September 12, 2013 philosophy article by Ari Armstrong entitled ‘Spock’s Illogic: “The Needs of the Many Outweigh the Needs of the Few”’, but then who are we to judge what ultimately serves “the greater good”? You see this theme recurring over and over in places like Will Smith’s I, Robot, and it’s covered in EthicsSage’s March 10, 2015 article. You see hard decisions made by superheroes facing moral quandaries, such as Batman in The Dark Knight trying to decide whether to save Harvey Dent or his love interest, or Peter Parker in Spider-Man 2 caught between saving a vehicle full of people or his crush, Mary Jane. And in Watchmen, Ozymandias decides that the greater good of humanity getting along is more important than the taint of a “small lie” versus exposing the truth.

In reality, what is good and what is not is often determined by humans, and we’re often at odds between doing what seems right and balancing what we feel.

If, for instance, one person had been allowed to keep her seat on a bus during the Civil Rights movement of the ’60s instead of having to get up, would history have changed, or would another person have been there to take her place? As in Armstrong’s article in The Objective Standard, doing the opposite of what societal pressures dictate may not be, as Kirk asks, “the logical thing to do,” but it might be “the human thing to do,” as Spock replies. Sometimes we have to think for ourselves. This has happened at various times: Oskar Schindler in Schindler’s List, The Sound of Music, a priest secretly marrying Romeo and Juliet against their families’ wishes, and so on. We even see the blurring of what is good and what is not in Wicked, in the relationships between its characters.

Now, as new tech apps and companies come into the fray, another Facebook/Cambridge Analytica-style event will likely come along, with some company emphasizing profits over the human right to privacy or some other ethical concern. It’s only a matter of time, and if it’s not one company it will be another. As you saw in Jurassic Park or the 2014 RoboCop remake, the bottom line will often drive what companies do, “ethics be d*mned.” So we always need a counterbalance and grounded views so that nothing goes too far.

It is interesting to note that, according to the NYT article, facial recognition tools have been around for roughly 20 years. The article also mentions a “Smartcheckr” facial recognition tool, and the company had a large amount of seed funding. To be clear, these apps have been great research tools. We’ve always wanted to find photos, music, song lyrics, and anything else, including memories, at the drop of a hat. How often have you forgotten something and wished you could recall it? With this type of technology you can help find missing loved ones and also solve crimes. How cool is that?

The database boasts an amazing 8 billion images, versus a local police department that has only millions of photos.

Another database, FACES, is also a facial recognition tool, but it searches only state-provided images. This shows that sometimes the tech at private companies is way ahead of the government’s, and sometimes it’s the other way around.

False positives
However, no tech is 100% perfect, and even these programs sometimes glitch or erroneously create false matches, misidentifying people who happen to have similar facial features. And that can be really bad.
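To get a feel for why this matters at scale, here’s a back-of-the-envelope sketch. The false-match rate below is purely an assumed figure for illustration, not a published number for Clearview or any other product; the point is that even a tiny per-comparison error rate turns into thousands of wrong candidates when one probe photo is searched against billions of images.

```python
# Back-of-the-envelope false-positive math for a one-to-many face search.
# All numbers below are illustrative assumptions, not vendor specifications.

gallery_size = 8_000_000_000       # images in the database (figure cited above)
false_match_rate = 1e-6            # assume 1-in-a-million false matches per comparison

# Expected number of wrong people returned for a single searched face:
expected_false_matches = gallery_size * false_match_rate
print(f"Expected false matches per search: {expected_false_matches:,.0f}")
# ~8,000 incorrect candidates -- a human reviewer still has to weed them out,
# and the person being searched for may not even be in the database at all.
```

In other words, a “hit” from a system like this is a lead that still needs human verification, not proof of identity.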

Proprietary data violations
Also, in order to get all this data, the company has had to covertly scour, scrape, and reap websites, including various social media sites that clearly have “do not scrape” policies, meaning outsiders aren’t allowed to consume their bandwidth or save their content for their own business purposes. You can imagine how many individual privacy rights are being violated as well: just because you put your photo online at the time for your friends doesn’t mean you authorized someone to take that photo and monetize it. On a side note, a lot of starting actors and indie extras often aren’t thoroughly aware, although it is likely in their paperwork, that all they are doing is serving as free publicity and background for a commercial or film.

Clearly, the company may be exceeding these sites’ acceptable use policies.
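For context on what “acceptable use” looks like on the technical side, a well-behaved crawler typically consults a site’s robots.txt file before fetching anything (terms of service then impose further limits on what you may do with the data). Below is a minimal sketch using Python’s standard urllib.robotparser module; the domain, path, and user-agent string are made-up placeholders. It shows the kind of check a compliant scraper performs, and which a “grab everything” operation simply skips.

```python
# Minimal sketch: check a site's robots.txt before crawling.
# The domain, path, and user-agent below are illustrative placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://www.example-social-site.com/robots.txt")
robots.read()  # download and parse the robots.txt rules

target = "https://www.example-social-site.com/photos/someone.jpg"
if robots.can_fetch("ExampleImageBot/1.0", target):
    print("robots.txt permits fetching this URL for this bot.")
else:
    print("robots.txt disallows it -- a compliant crawler would stop here.")

# Note: robots.txt is a convention, not an enforcement mechanism; terms of
# service and privacy law are separate (and stricter) constraints.
```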

However, when it’s framed as a tool for solving crimes, the view of most in the upper echelons has been that it’s legal, as long as it’s used for what it was intended.

There has been a lot of debate about privacy and privacy laws, and also about self-policing. Some people are very much for this new technology and some are very much against it. What will the future be like? Will we ever have a country with really strong privacy laws, or will we one day need to cover our faces and worry about someone grabbing a piece of our biometric data, something like a fingerprint, and identifying everything about us? Maybe today it’s eye recognition, and one day it’s a drop of tears, or DNA, or our blood, sweat, or other fluids, from which they can determine who the rest of our family is, what our kids and grandchildren look like, our whole genetic history, the diseases we’re predisposed to, and the school or job we’re at. It’s pictures today, but one day we may have a DNA search engine: just swab your tongue or touch a metal finger reader to find out everything you ever wanted to know about a prospective client, date, employee, or spouse. Maybe it will predict when they’ll get sick, whether they’ll cheat, what day they might die, whether they’ll go to jail, or whether they’re predisposed to being a rich inventor or to committing crimes.

Do we want a world like that? These are questions we have to start asking today to figure out how our tech progresses.

Author: savvywealthmedia
