Dangerous Technologies: the smartphone can lead to a global catastrophe

Amid fears about personal data being handed over to the state and third parties, we rarely stop to think that we voluntarily give away far more valuable information: our biometrics.

There are many ways to identify a person biometrically: voice, heartbeat, gait, vein pattern, retina and many others. Have you ever wondered why a company with a market capitalization of $1 trillion, which can afford any technological whim, chose face recognition out of all the possible options and combinations?

The technical details

Without going too deep into the technical details: roughly 30,000 infrared dots are projected onto the surface of the face. The resulting depth “portrait” is run through sophisticated mathematics to identify the user’s unique face.
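As a rough illustration of the idea only (Apple’s actual pipeline is not public), the depth map can be thought of as being reduced to a numeric template that is compared against the one enrolled on the device; all names and thresholds below are invented for the sketch:

```swift
import Foundation

// Hypothetical sketch: the depth map is reduced to a fixed-length numeric
// template, and identification is a distance check against the template
// enrolled on the device. Apple's real implementation is not public.
typealias FaceTemplate = [Double]

/// Euclidean distance between two templates of equal length.
func distance(_ a: FaceTemplate, _ b: FaceTemplate) -> Double {
    precondition(a.count == b.count, "templates must have the same length")
    return sqrt(zip(a, b).map { ($0 - $1) * ($0 - $1) }.reduce(0, +))
}

/// Accept the face only if it is close enough to the enrolled template.
/// The threshold is an illustrative value, not a real one.
func matches(_ probe: FaceTemplate, enrolled: FaceTemplate,
             threshold: Double = 0.6) -> Bool {
    return distance(probe, enrolled) < threshold
}
```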

In its description of the technology, Apple claims that face data is stored on the device and is not passed to intelligence agencies or third parties, just as a fingerprint is stored only on the device and de facto serves purely as a means of verification. Banking applications, for example, receive a guaranteed yes-or-no answer when a mobile wallet owner tries to make a transaction, but never see the fingerprint itself.
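On iOS this is exactly how the public API behaves: the LocalAuthentication framework hands the app only a success-or-failure result, never the biometric data itself. A minimal sketch (the payment wording is illustrative):

```swift
import LocalAuthentication

let context = LAContext()
var error: NSError?

// Check that biometric authentication (Face ID or Touch ID) is available.
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Confirm the payment") { success, evalError in
        // The app only ever receives a Bool; the fingerprint or face data
        // itself never leaves the device.
        if success {
            // proceed with the transaction
        } else {
            // fall back to a passcode or cancel
        }
    }
}
```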

For now, the claims about secure storage of biometrics can be trusted (if only because no one has managed to prove otherwise and send the company’s shares tumbling). But the catch is that if Apple or Google one day decides to share that data with someone in a future version of its products, nothing will stop them, because by clicking “I agree” on the user policy you have said goodbye to your data once and for all.

The exception to the rule is the European Union, where the GDPR (General Data Protection Regulation) came into force this year; it is designed to punish companies severely for using private data without the owner’s knowledge. Fines are measured as a percentage of a company’s annual turnover, so the law is taken seriously.

Or, more precisely, a lie detector

Let’s assume that this future has arrived and companies are actively passing biometrics on to third parties (with, of course, the permission of the user, who once again clicks “I agree” without reading what the agreement actually says).

This is where things get interesting. There are 57 muscles in the human face, some of them paired. Apple’s camera has a very high sensor density and tracks even the smallest changes in the movement of these muscles in real time. Of course, this can and should be done if the goal is to identify the person. So far, so good.

But the science of psychology was not invented yesterday. Anyone who has watched Lie to Me knows that even tiny involuntary eye movements, pupil contractions, lip tremors and thousands of other combinations of microscopic facial expressions can be used, with a high degree of probability, to read a person’s psychological state: to determine whether they are lying, excited or depressed.

According to Paul Ekman, who developed the methodology, the accuracy of these models can reach 80%, although that requires long-term observation of the subject. The contextual advertising industry, however, will be satisfied with far less: there are billions of consumers, and there is no need to guess the state of each of them exactly.
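Purely as a hypothetical illustration of what such profiling could look like in code: hand-picked features extracted from a stream of tracked facial landmarks (pupil variation, blink rate, lip-corner movement) are pushed through a toy scoring rule. Every field, weight and threshold below is invented; a real system would rely on trained models rather than hard-coded numbers.

```swift
import Foundation

// Hypothetical per-frame measurements derived from tracked facial landmarks.
// All fields and weights below are invented for illustration only.
struct FrameFeatures {
    let pupilDiameter: Double   // millimetres
    let lipCornerShift: Double  // horizontal displacement, millimetres
    let blinked: Bool
}

enum EstimatedState { case calm, agitated }

/// Toy scoring rule: high pupil-size variance, frequent blinking and lip
/// tremor push the score towards "agitated".
func estimateState(from frames: [FrameFeatures]) -> EstimatedState {
    guard !frames.isEmpty else { return .calm }
    let pupils = frames.map { $0.pupilDiameter }
    let mean = pupils.reduce(0, +) / Double(pupils.count)
    let variance = pupils.map { ($0 - mean) * ($0 - mean) }.reduce(0, +) / Double(pupils.count)
    let blinkRate = Double(frames.filter { $0.blinked }.count) / Double(frames.count)
    let lipTremor = frames.map { abs($0.lipCornerShift) }.reduce(0, +) / Double(frames.count)

    let score = 2.0 * variance + 1.5 * blinkRate + 0.5 * lipTremor
    return score > 1.0 ? .agitated : .calm
}
```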

In a world of high technology, where every mobile device carries microphones and cameras, the necessary data can be collected simply by switching on the microphone and cameras at the right moment. It is just a matter of intent, and of legally watertight wording in the user agreement.

Of course, building working artificial-intelligence models capable of detecting micro-emotions requires training the algorithms on a large enough sample of “test subjects”. But that is hardly a problem. Machines already identify faces and objects in photos quite accurately for search purposes, so a narrow task like detecting micro-emotions can be solved too, and in the near future.

Carrots, sticks and freedom for the user

The carrot, in fact, is already here: profiling by psychological type could raise the effectiveness of contextual advertising severalfold. For example, your psychological profile, cross-referenced with your search history, purchases and other data, would allow advertisers to choose the exact wording of ads that works specifically on you. In a private conversation, colleagues at Apple and Facebook did not dispel my concerns.

Today, face recognition is simply password-free identification, a convenient and practical feature. Everything I have described above has not happened yet. But in a couple of years you may find yourself in a situation where you look at your device and, as if through a window, someone climbs through it into your brain. And smartphone and social-network developers are not the only ones working in this direction.

I know of at least three startups developing psychological-profiling technology for retail: with the help of in-store surveillance cameras and loyalty-program apps, a person’s face is run through physiognomic-analysis models so that, based on the machine’s verdict about the customer’s psychotype and state, salespeople can follow a script tailored to that type.

We are all online, so democratic principles such as the right to privacy must be safeguarded in the digital world as well as in the real world, as the difference between them is no longer obvious.