
The saying goes that our eyes are the window to the soul. Perhaps in time they will serve a less romantic purpose, as windows to make money.

Researchers at Carnegie Mellon University in Pittsburgh, one of the leading institutions for artificial intelligence research, have embarked on a study that uses facial recognition algorithms to track the expressions of traders. Their goal: to find correlations between mood swings and market moves. If traders seem enthusiastic, it might be time to buy. Are there more furrowed brows than usual? It might be time to sell. A provisional US patent application was filed on September 13, 2022.

“The market is driven by human emotions,” says Mario Savvides, lead scientist on the project. “What came to mind is, can we abstract things like expressions or movements as early indications of volatility? Everyone is getting excited, or everyone is shrugging or scratching their head or leaning forward… Did everyone have a reaction within a five-second period?”

The main phase of the study will take place over 12 months starting in the third quarter of 2023 and will involve about 70 traders at investment firms, mostly located in the US. All will have cameras mounted on their computers to record their faces and gestures throughout the day, according to Savvides. The cameras will be linked to software from Oosto, an Israeli company formerly known as AnyVision Interactive Technologies Ltd., which hopes to develop an alert system for trends in traders’ faces, or a volatility index that it can sell to investment firms.

Oosto, which makes facial recognition scanners for airports and workplaces, declined to name the companies in the study, but said those companies would have early access to any new tools that emerge from the research. Each individual’s footage will remain on their own computer or on their firm’s premises; only numerical data representing their expressions and gestures will be uploaded to the researchers.

A person’s face can be mapped with 68 distinct landmark points that change position frequently, according to Savvides, who co-authored a 2017 study on facial “landmarks.”
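The study doesn’t say which landmark model the researchers rely on, but the 68-point scheme is the one popularized by the open-source dlib library. As a rough illustration of what tracking those points looks like in code, here is a minimal sketch, assuming dlib and OpenCV are installed and the pretrained predictor file has been downloaded separately; the image file name is a placeholder:

```python
# Minimal sketch: locating the 68 facial landmark points with dlib.
# "shape_predictor_68_face_landmarks.dat" must be downloaded from dlib.net;
# "trader_frame.jpg" stands in for a frame from a webcam feed.
import dlib
import cv2

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

frame = cv2.imread("trader_frame.jpg")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

for face in detector(gray):
    shape = predictor(gray, face)
    # Collect the 68 (x, y) landmark coordinates for this face.
    points = [(shape.part(i).x, shape.part(i).y) for i in range(68)]
    print(f"Detected {len(points)} landmark points")
```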

The system will also track a trader’s gaze to see if they’re talking to a colleague or looking at their screen, and notice if their peers are doing the same. “We have a whole toolbox of search algorithms that we will test to see if they correlate with a market signal,” Savvides said. “We are looking for needles in a haystack.”
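To make the needle-hunting concrete, the kind of test implied here could look like the following sketch, which is not the CMU pipeline but an illustrative stand-in: checking whether a per-minute count of simultaneous trader reactions leads a realized-volatility series. The file names and column names are hypothetical placeholders.

```python
# Illustrative only: does a burst of trader reactions correlate with
# (or lead) market volatility? File and column names are hypothetical.
import pandas as pd

# Per-minute counts of simultaneous reactions (e.g., brow furrows).
reactions = pd.read_csv("reaction_counts.csv", index_col="timestamp", parse_dates=True)
# Per-minute realized volatility of some instrument.
volatility = pd.read_csv("volatility.csv", index_col="timestamp", parse_dates=True)

df = reactions.join(volatility, how="inner")

# Test a few lead times (in minutes) between reactions and volatility.
for lag in (0, 1, 5, 15):
    corr = df["reaction_count"].corr(df["realized_vol"].shift(-lag))
    print(f"lead {lag:>2} min: correlation = {corr:.3f}")
```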

Advertisers already use facial analysis to study how engaging an ad is, retailers use it to see how bored customers are, and hiring managers use it, rather creepily, to determine whether a job candidate seems enthusiastic enough.

The stock market study seems, at first glance, more dystopian. Trading algorithms have tried for years to harness information from the weather, social media or satellites, but there’s something a little demeaning about traders themselves being mined for data. Researchers are arguably also putting traders into an endless feedback loop in which their actions and decisions become derivative and their lemming-like behavior is amplified. If you thought the market was already driven by a herd mentality, this will probably make things worse. But that’s how the market works.

“Everyone on the street is talking,” says a London trader (not part of the study) who said he would find these peer-sentiment alerts useful. “All we do is discuss ideas and share information… Nonverbal communication is massive.” Years ago, trading floors were noisy places where people would talk on three or four phone lines at once; now many traders communicate through chat rooms, and talking is minimal.

But the study also points to another uncomfortable phenomenon: facial recognition is here to stay, and its more controversial cousin, facial analysis, could be too. Despite all the concern that has been raised around facial recognition, including the mistakes it can make as a surveillance tool, tens of millions of us still unhesitatingly use it to unlock our phones.

Facial analysis of the kind being used at Carnegie Mellon opens up an even bigger can of worms. Last summer, Microsoft Corp. pledged to retire its facial analysis tools, which estimated a person’s age, gender and emotional state, admitting that the system could be unreliable and invasive.(1) That might not matter much to traders, who are eager to exploit whatever data they can for an edge. But this study, if successful, could jumpstart research on facial analysis for other purposes, such as assessing a person’s emotional state during a work meeting.

“If you are making a business deal over Zoom, can you have an AI read faces to tell if someone is cheating or being a hard bargainer?” Savvides asks. “It’s possible. Why not?”

Zoom Video Communications Inc. introduced a feature last year that tracks sentiment in recorded work meetings. Called Zoom IQ, the software is aimed at sales professionals and gives participants a score between 0 and 100, with anything above 50 indicating greater engagement in the conversation. The system doesn’t use facial analysis; instead it tracks how much each speaker participates and how long they wait to respond, and delivers the score at the end of the meeting.
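Zoom has not published how the score is actually computed, but a toy version built only from the signals described above — talk-time share and response promptness — might look like this hypothetical sketch; every name and weight in it is an assumption for illustration:

```python
# Hypothetical illustration only: Zoom IQ's real formula is not public.
# A toy 0-100 engagement score from talk-time share and response delay.
from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    seconds_speaking: float      # total time this person spoke
    avg_response_delay_s: float  # average pause before they respond

def engagement_score(p: Participant, meeting_seconds: float) -> float:
    talk_share = min(p.seconds_speaking / meeting_seconds, 1.0)
    # Faster responses score higher; 10+ seconds of hesitation scores 0 here.
    promptness = max(0.0, 1.0 - p.avg_response_delay_s / 10.0)
    return round(100 * (0.6 * talk_share + 0.4 * promptness), 1)

alice = Participant("Alice", seconds_speaking=900, avg_response_delay_s=2.0)
print(engagement_score(alice, meeting_seconds=1800))  # prints 62.0
```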

More than two dozen rights groups have called on Zoom to stop working on the feature, arguing that sentiment analysis is based on pseudoscience and is “inherently biased.” A Zoom spokesperson said the company still sells the software and that it “turns customer interactions into meaningful information.”

It can be argued that the Carnegie Mellon researchers shouldn’t care what their facial analysis tool tells them about traders’ emotions; they just need to spot the patterns that point to correlations and turn those numbers into a trading algorithm. But the downside of turning emotions into a number is just that: you risk devaluing one of the most fundamental characteristics of being human. It might be better if it doesn’t catch on.

More from Bloomberg Opinion:

• Why casinos are spying on their ultra-rich clients: Parmy Olson

• Watch out, here come the 2023 predictions: John Authers

• Magnus Carlsen’s strongest future opponent is AI: Tyler Cowen

(1) Amazon continues to sell facial analysis software that estimates someone’s gender and also guesses whether they’re happy, confused, upset or something else.

This column does not necessarily reflect the opinion of the editorial board or of Bloomberg LP and its owners.

Parmy Olson is a Bloomberg Opinion columnist covering technology. A former Wall Street Journal and Forbes reporter, she is the author of “We Are Anonymous.”

More stories like this are available at bloomberg.com/opinion
