In 2008, United Airlines baggage handlers in Chicago severely damaged Dave Carroll's Taylor guitar, worth $3,500. After nine months of unsuccessful pleas for compensation, Carroll wrote the song "United Breaks Guitars" – a YouTube hit with more than 17 million views. Four weeks later, the airline's stock fell about 10 percent, a loss of $180 million.
There are nearly 33,000 broadcasting stations in the U.S. alone, and every week adults consume more than 45 hours of TV and radio combined (not to mention the three billion internet users worldwide). The debate about how media influences social behaviour – from shaping eating habits to fuelling riots to moving the stock market, as in the United Airlines case – is ongoing, and intensifying.
If media exposure alters our perception, can we quantify its effect on brand reputation? Signal processing may have the answer.
Measuring public opinion is no longer just a powerful tool for governments and agencies, but a business asset. As companies worldwide spend hundreds of billions yearly on advertising and branding, how do they know the campaigns work?
Perception is the process by which we interpret and organize sensations to produce meaningful experiences of the world. We form attitudes toward perceived situations unconsciously, but detecting those attitudes is far more complicated, which is why most attitude assessments rely on surveys. Social media makes attitudes toward brands easier to measure automatically, since information is often #tagged. Yet although consumers may trust social media more than advertising when choosing brands, the medium's influence is limited because it is still viewed with scepticism; information delivered by TV and radio stations, even when relayed on social media, carries more credibility. Tracking live TV and radio programmes has historically been a laborious manual job in which people had to collect, clip, tag and interpret multimedia on the fly. Until now.
Signal processing is changing the game. A multimedia team at SAIL LABS in Austria is working on a fully automated solution for cross-media monitoring and analytics in 26 European languages, ranging from Spanish to Russian. The automatic image, video and audio recognition system is fed by cross-media collection tools and is trained and tested with multiple lexicons to detect sentiment and derive brand-related intelligence.
How exactly does it work? People interpret news through verbal and non-verbal communication. This is easy for most people, but machines have difficulty understanding context and grasping hidden meanings (think IBM Watson on Jeopardy). SAIL LABS' system overcomes this limitation by integrating six key components that provide information retrieval and analytics for any type of heterogeneous media communication. The Media Mining Feeder & Indexer for TV, FM radio, video (YouTube) and the internet (social media, feeds, websites) is SAIL LABS' framework for continuously and concurrently capturing and recording unstructured data from multiple sources and processing it into structured, searchable and easily accessible information. The text, image and video documents from social media are then jointly analysed by a processing engine based on state-of-the-art algorithms for image and speech recognition – the Media Mining Indexer – to automatically identify relevant manifestations or patterns in the signals.
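To make the capture-and-index stage concrete, here is a minimal sketch in Python of how one captured item might be turned into a structured, searchable record. The class and function names are hypothetical illustrations for this article, not SAIL LABS' actual API.

```python
# Minimal sketch of a capture-and-index step: all names here are
# hypothetical and stand in for whatever the real pipeline uses.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class MediaDocument:
    """Structured record produced from one captured media item."""
    source: str                 # e.g. "TV", "FM radio", "YouTube", "social media"
    channel: str                # station, feed, or account the item came from
    captured_at: datetime
    transcript: str = ""        # speech-recognition output (empty for pure text/images)
    detected_objects: list = field(default_factory=list)   # e.g. logos, faces
    metadata: dict = field(default_factory=dict)            # language, speaker, topics, ...


def index_capture(source: str, channel: str, raw_text: str, objects=None) -> MediaDocument:
    """Turn one unstructured capture into a searchable, structured document."""
    return MediaDocument(
        source=source,
        channel=channel,
        captured_at=datetime.now(timezone.utc),
        transcript=raw_text,
        detected_objects=list(objects or []),
    )


if __name__ == "__main__":
    doc = index_capture(
        source="TV",
        channel="Example News Channel",
        raw_text="... the airline announced a new compensation policy today ...",
        objects=["airline_logo"],
    )
    print(doc.source, doc.captured_at.isoformat(), doc.detected_objects)
```

The point of such a record is simply that every downstream component – recognition, sentiment analysis, search – can work from one consistent, structured representation rather than raw streams.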
Image recognition detects logos, people and other objects relevant to a given brand. Speech recognition picks out relevant keywords, phrases, products or brand names, and natural language processing applies models based on deep-learning neural network technologies. Sentiment analysis algorithms then determine the speaker's or writer's attitude and assign a polarity to the analysed document. Content is indexed and enriched with information about the language, speaker, named entities, topics and sentiment. The enriched content is subsequently sent to a Media Mining Server for storage and can be accessed for efficient search, retrieval and advanced analytics through a Media Mining Client.
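As a rough illustration of the enrichment step, the toy sketch below assigns a polarity with a hand-written lexicon and tags brand mentions before storage. A production system would rely on trained deep-learning models; the word lists, field names and scoring rule here are invented purely for illustration.

```python
# Toy lexicon-based polarity scoring and document enrichment.
# The lexicons and thresholds are illustrative only, not a real model.

POSITIVE = {"praised", "award", "improved", "reliable", "loved"}
NEGATIVE = {"breaks", "damaged", "delay", "complaint", "boycott"}


def polarity(text: str) -> str:
    """Assign a coarse polarity label based on simple word counts."""
    words = [w.strip(".,!?\"'").lower() for w in text.split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"


def enrich(doc: dict, brand_terms: set) -> dict:
    """Attach sentiment and brand mentions before sending the document to storage."""
    text = doc.get("transcript", "")
    doc["sentiment"] = polarity(text)
    doc["brand_mentions"] = sorted(
        t for t in brand_terms if t.lower() in text.lower()
    )
    return doc


if __name__ == "__main__":
    doc = {"transcript": "United breaks guitars and ignores the complaint."}
    print(enrich(doc, {"United"}))
    # -> {'transcript': ..., 'sentiment': 'negative', 'brand_mentions': ['United']}
```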
The SAIL LABS system takes on the laborious, continuous task of monitoring and labelling cross-media content, delivering information about trends, relations, global hot spots and actors in real time. A wide set of components for visualization and analysis, ontologies and social media analytics is available for turning raw, unstructured data into actionable knowledge and intelligence. The system supports brand managers in defining strategies and actions based on objective metrics collected on the spot, tracing events as they unfold in every corner of the world.
As the SAIL LABS machine grows more sophisticated, could we be looking at a future where our decisions are made entirely by machines? Signal processing and AI are game changers when it comes to letting machines watch the news for us. If successful, they may change the way we consume information for the better, complementing our daily tasks and transforming how we access it.
Anca Popescu is with the EU Satellite Center's Capability Development Division in Madrid and has extensive expertise in business innovation and product development. She received her Ph.D. in Electronics and Telecommunications Engineering from University Politehnica of Bucharest in 2011, specializing in signal and satellite image processing. You can reach Anca through LinkedIn.