Amenity Analytics offers an innovative approach to text analytics, combining machine learning with NLP and other forms of artificial intelligence. Our technology is guided by industry experts with the goal of solving real business problems.
The task of analyzing earnings calls is fraught with challenges.
First, there is the issue of coverage: no analyst can physically listen to every major conference call during earnings season. Then there is the process of identifying, evaluating, and weighing investment signals from these calls, a complicated exercise whether done manually or by basic algorithms.
In addition, how would analysts view this data in aggregate, whether it is the performance of one company over time or of multiple companies within a particular sector?
A text analytics solution built on NLP technology performs much of the heavy lifting when it comes to uncovering valuable information from earnings calls.
NLP can automatically synthesize and summarize these calls and extract signals around sentiment and targeted events. Analysts can easily navigate transcripts by key language drivers such as headwinds and tailwinds. They can view clustering of positive and negative sentiment—or trends over time and across one or multiple companies—to uncover themes that are otherwise missed in traditional analysis.
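To make that concrete, here is a minimal sketch of how transcript sentences might be grouped by driver language. The keyword lists and the `tag_sentences` helper are illustrative assumptions, not Amenity's actual taxonomy.

```python
# Minimal sketch: flag earnings-call sentences by "driver" language.
# The keyword lists below are invented for illustration.

DRIVERS = {
    "headwind": ["headwind", "pressure", "decline", "weakness"],
    "tailwind": ["tailwind", "growth", "momentum", "strength"],
}

def tag_sentences(transcript: str) -> dict:
    """Group transcript sentences under the drivers whose keywords they mention."""
    tagged = {driver: [] for driver in DRIVERS}
    for sentence in transcript.split("."):
        lowered = sentence.lower()
        for driver, keywords in DRIVERS.items():
            if any(kw in lowered for kw in keywords):
                tagged[driver].append(sentence.strip())
    return tagged

calls = ("We saw strong growth in cloud revenue. "
         "Currency remains a headwind this quarter.")
print(tag_sentences(calls))
```

Running the tagger over many transcripts and dates is what makes the clustering and trend views described above possible.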
This information serves as a much needed supplement for analysts, whether or not they are able to listen to the calls, by providing them with deeper insights around management commentary.
One of the Amenity products available for demo is an out-of-the-box, cloud-based platform trained on financial data, including earnings call transcripts and SEC filings.
Through a demo, an investment management company can see what true NLP is capable of. More importantly, a demo distinguishes NLP-based text analytics from other, similar-sounding tools marketed to finance that rely heavily on bag-of-words, text mining, and machine learning. Some of the key areas for comparison are accuracy, speed, quality of extractions, and workflow integration.
A demo also serves as an “idea-starter” for additional customization. Amenity can dive deep into the company’s domain expertise and tailor the platform to the client’s proprietary way of looking at things.
The investment arm of a financial institution used Amenity's API to create a data set on the biggest players in the autonomous car market to support its strategy and business development efforts.
Amenity extracted the most critical autonomous car industry news on a daily basis, identifying actionable patterns across the industry over time. The industry analysis model featured custom event types and modified versions of core taxonomies to best identify insights that are meaningful to the autonomous car industry.
Amenity was able to achieve a high degree of accuracy using its proprietary NLP API, which includes tokenization, lemmatization, named entity recognition (NER), dependency parsing, and semantic role labeling. The result was a comprehensive analysis that provided a 360-degree view of the autonomous car industry: financials, the supply chain, retail, OEMs, suppliers, and technological trends.
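As a rough illustration of what some of those pipeline stages do (not Amenity's proprietary implementation), here is a toy pipeline with rule-based tokenization, a crude suffix-stripping lemmatizer, and a capitalization-heuristic NER:

```python
# Toy versions of three pipeline stages: tokenization, lemmatization, NER.
# Real systems use statistical models; these rules are stand-ins.
import re

def tokenize(text: str) -> list:
    """Split text into word tokens."""
    return re.findall(r"[A-Za-z']+", text)

def lemmatize(token: str) -> str:
    """Crude suffix stripping, a stand-in for a real lemmatizer."""
    for suffix in ("ing", "ed", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

def find_entities(tokens: list) -> list:
    """Heuristic NER: runs of capitalized tokens are candidate entities."""
    entities, current = [], []
    for tok in tokens + [""]:
        if tok[:1].isupper():
            current.append(tok)
        else:
            if current:
                entities.append(" ".join(current))
            current = []
    return entities

sentence = "Waymo expanded testing of autonomous cars in Phoenix"
tokens = tokenize(sentence)
print([lemmatize(t) for t in tokens])
print(find_entities(tokens))
```

Dependency parsing and semantic role labeling then build on these stages to work out who did what to whom in each sentence.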
Companies are sitting on a massive trove of customer data in the form of phone transcripts, emails, and surveys. The challenge lies in extracting, analyzing, and summarizing this type of data that is not easy to process.
Another consideration is keeping the customer’s personal data secure in order to remain compliant with company and government data protection policies.
Advanced forms of NLP are designed to handle the unique syntax, vocabulary, and other nuances of human language.
An NLP vendor can provide an on-premises solution, eliminating the complications of transferring data to a third-party vendor and verifying that the vendor processes and handles the data according to the insurance company’s data protection policies. With an on-premises deployment, the insurance company hosts the software on its own servers and is responsible for processing and storing all data.
The insurance company supplies the NLP vendor with a taxonomy of topic areas that they are most concerned about, such as whether a service representative was rude or nice, or if a representative resolved an issue quickly or slowly. The NLP can be built to take into account various forms of language that relate to the various topics. It can also identify multiple sentiments and events within one call.
Example: "My first rep was great, but he transferred me to another rep, who then transferred me to another rep. It took two months for my claim to be resolved."
The NLP vendor can develop a model that quickly extracts, categorizes, and summarizes these various events, and also provides a sentiment score for those events.
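As a sketch of how such a model might behave, the snippet below runs a tiny, invented taxonomy against a call like the example above. The topics, phrases, and scores are assumptions for illustration; a production system would go far beyond simple phrase matching.

```python
# Hedged sketch of taxonomy-based event extraction with sentiment.
# The taxonomy below is invented for illustration.

TAXONOMY = {
    "rep_quality": {"great": 1, "helpful": 1, "rude": -1},
    "transfers":   {"transferred": -1},
    "resolution":  {"quickly": 1, "two months": -1, "slow": -1},
}

def extract_events(call: str) -> list:
    """Return (topic, phrase, sentiment) triples found in the call text."""
    lowered = call.lower()
    events = []
    for topic, phrases in TAXONOMY.items():
        for phrase, sentiment in phrases.items():
            if phrase in lowered:
                events.append((topic, phrase, sentiment))
    return events

call = ("My first rep was great, but he transferred me to another rep, "
        "who then transferred me to another rep. "
        "It took two months for my claim to be resolved.")
for event in extract_events(call):
    print(event)
```

Note how one call yields multiple events with conflicting sentiment: a positive rep-quality signal alongside negative transfer and resolution signals.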
At the start of the decision-making process, underwriters are generally provided with only two main data sources: the client-provided information in the application form and previous claims filed against the applicant’s insurance policy. From there, they may use Google search, social media, and online news sites to locate any red flags that would impact the final evaluation.
Because the research methodology is manual and ad-hoc, there is a strong likelihood for unknown risks in the portfolio.
The insurance company can use NLP technology to automate the risk monitoring process.
A vendor with the ability to customize and quickly deploy its technology can build a platform that captures news from a wide array of cloud-based documents, whether that is news about private or public companies, SEC filings, or earnings call transcripts.
The platform then scans the content for the insurance company’s specific view of risk. If there is a news story about the applicant in the context of that risk, the underwriter can factor that information into his or her decision.
The build-out may involve several stages before the automated process improves on the existing one. After the vendor creates the initial model, the insurance company assesses the results to determine which areas need fine-tuning. The vendor then adjusts the model. There may be several iterations before finally arriving at a custom solution that automates risk detection according to the specific way the insurance company views risk.
The underwriter can now log onto the platform dashboard, select the relevant company, and see the number of mentions around the associated risk topic over a period of time.
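A simplified version of the query behind such a dashboard might look like the following. The risk terms, sample articles, and `risk_mentions` helper are made up for illustration.

```python
# Sketch of the dashboard query: count risk-topic mentions for a company
# by month. Risk terms and articles are illustrative, not real data.
from collections import Counter

RISK_TERMS = ["lawsuit", "recall", "data breach"]

articles = [
    {"company": "Acme Corp", "month": "2020-01",
     "text": "Acme Corp faces a lawsuit over contract terms"},
    {"company": "Acme Corp", "month": "2020-02",
     "text": "Regulators order a recall of a key product"},
    {"company": "Acme Corp", "month": "2020-02",
     "text": "A data breach exposed customer records"},
]

def risk_mentions(articles, company):
    """Count articles mentioning any risk term, grouped by month."""
    counts = Counter()
    for article in articles:
        if article["company"] != company:
            continue
        if any(term in article["text"].lower() for term in RISK_TERMS):
            counts[article["month"]] += 1
    return counts

print(risk_mentions(articles, "Acme Corp"))
```

A rising count for a given applicant is exactly the kind of signal the underwriter would fold into the final evaluation.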
Key account executives are looking to arm their sales reps with timely, relevant lead intelligence on a broad range of topics related to their customers: business trends, company reputation, competitive intelligence, product launches, market insights, deals, financial and legal news, and so on. This information helps reps determine why, when, and how to approach companies for ad placement.
Executives may task a strategic insights group or another department with gathering and distributing this information. In either case, the process is ad-hoc and manual. They often perform research using Google Alerts, web searches, and social media, and record and track this information in MS Word documents or spreadsheets.
It is difficult to cut through the noise in order to identify trends.
In addition, executives have no systematic way of measuring and scoring this information for overall sentiment (how positive or negative was the story) and impactfulness (is it a big or small story). A news story about the recall of a client’s product is a critical event—one that the sales rep should jump on—as opposed to a story about the client winning an award.
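One hedged sketch of scoring stories on those two axes, using invented keyword weights rather than any documented methodology:

```python
# Illustrative two-axis scoring of news stories: sentiment and impact.
# The keywords and weights below are assumptions for illustration.

SENTIMENT = {"recall": -1.0, "award": 0.5, "lawsuit": -0.8}
IMPACT    = {"recall":  0.9, "award": 0.2, "lawsuit":  0.7}

def score_story(headline: str):
    """Return (sentiment, impact) based on keywords in the headline."""
    lowered = headline.lower()
    hits = [kw for kw in SENTIMENT if kw in lowered]
    if not hits:
        return 0.0, 0.0
    sentiment = sum(SENTIMENT[kw] for kw in hits) / len(hits)
    impact = max(IMPACT[kw] for kw in hits)
    return sentiment, impact

print(score_story("Client issues nationwide recall of flagship product"))
print(score_story("Client wins industry award"))
```

Even this toy version separates the critical recall story (very negative, high impact) from the award story (mildly positive, low impact), which is the triage sales reps need.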
A cloud-based, NLP solution can be used to track specific names and sectors; scan the news, major publications, social media, and wire services for relevant insights; and surface those articles through an interactive dashboard.
Based on the information provided by the media company, the NLP vendor creates a specific taxonomy that allows the company to watch for commentary from trusted online sources that talks in a very specific way about its clients and their products. The NLP then tracks the volume of these highlights, along with their sentiment, over time. It can also compare the performance of the company's clients to those clients' competitors and to the industry sector average.
The NLP technology integrates with customer applications like Salesforce so that sales teams have a holistic view of client intelligence where they are accessing and tracking customer data.
Users can also access this information directly from a web-based platform. The platform is interactive and allows for filtering by company, sector, and topic. It can also send emails that are customized to the users.
Sales reps are able to understand fairly quickly how their clients are being discussed or perceived in the media. They can provide recommendations on campaigns, including which media areas to consider for ad placement, so that they can enhance or counteract media mentions.
Reps can even be specific about the product or service being advertised, given the topics in the news. For example, a high-profile publication may run an article that is fairly positive about the efficacy of a brand's face cream.
The sales rep could reach out to that advertiser client to let them know the media company has inventory with the outlet that published the article and to pitch ad placements there, since those placements would be contextually relevant and land favorably alongside the content.
An enterprise company wanted to use social media data to better understand areas of customer concern. Amenity parsed the company's social media mentions to reveal a detailed, customized picture of consumer sentiment.
By text mining this underutilized consumer data, Amenity was able to perform a social sentiment analysis for the company.
Using its proprietary NLP API, Amenity created customized taxonomies relevant to social sentiment analysis. Amenity's text mining software then performed information extractions on key topics and areas of concern. These topics were then sorted into event types.
Sentiment scores were created based on the number of positive and negative extractions associated with a given topic. The output was a detailed dataset on the Voice of the Customer that assessed brand perceptions and the topics driving customer sentiment.
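A minimal sketch of that scoring step, assuming a simple net-sentiment formula (the formula and example counts are assumptions, not Amenity's published method):

```python
# Net sentiment per topic from counts of positive and negative
# extractions. The (pos - neg) / (pos + neg) formula is an assumption.

def sentiment_score(positive: int, negative: int) -> float:
    """Net sentiment in [-1, 1]: (pos - neg) / (pos + neg)."""
    total = positive + negative
    return (positive - negative) / total if total else 0.0

# Hypothetical extraction counts per topic: (positive, negative).
extractions = {"shipping": (12, 48), "product quality": (80, 20)}
for topic, (pos, neg) in extractions.items():
    print(topic, round(sentiment_score(pos, neg), 2))
```

Scores near -1 or +1 flag the topics driving customer sentiment most strongly in either direction.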