


Five success stories explore the business value of natural language processing
Data is now one of the most valuable enterprise commodities. According to CIO.com's "State of the CIO 2022" report, 35% of IT leaders said that data and business analytics will account for the largest share of their organization's IT investment this year, and 58% of respondents said they will increase their investment in data analytics in the coming year.
While data comes in many forms, perhaps the largest untapped pool is text, whether patents, product specifications, academic publications, market research, news, or social feeds, and the volume of text keeps growing. According to Foundry's 2022 Data and Analytics Study, 36% of IT leaders say that managing this unstructured data is one of the biggest challenges they face. That is why research firm Lux Research points to natural language processing (NLP) technology, and topic modeling in particular, as a key tool for unlocking the value of data.
Natural language processing is a branch of artificial intelligence (AI) used to train computers to understand, process and generate language. Search engines, machine translation services, and voice assistants are all powered by natural language processing. Topic modeling is a natural language processing technique that breaks down an idea into subcategories of common concepts defined by phrases. According to Lux Research, topic modeling allows organizations to associate documents with specific topics and then extract data, such as growth trends in a topic over time. Topic modeling can also be used to establish a "fingerprint" for a given document and then discover other documents with similar fingerprints.
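The "fingerprint" idea can be illustrated with a toy sketch: represent each document as a normalized vector of topic weights and compare fingerprints with cosine similarity. The topic lexicons below are invented for illustration only; a real topic model (for example, LDA) learns its topics from the corpus rather than relying on hand-written keyword lists.

```python
from math import sqrt

# Hypothetical topic lexicons, invented for illustration. A real topic
# model (e.g. LDA) learns topics from the corpus instead of using
# hand-written keyword lists.
TOPICS = {
    "batteries": {"battery", "anode", "cathode", "lithium"},
    "translation": {"translate", "language", "locale", "terminology"},
}

def topic_fingerprint(text):
    """Represent a document as a unit-normalised vector of topic weights."""
    words = text.lower().split()
    counts = [sum(w in lexicon for w in words) for lexicon in TOPICS.values()]
    norm = sqrt(sum(c * c for c in counts)) or 1.0
    return [c / norm for c in counts]

def similarity(a, b):
    """Cosine similarity between two fingerprints (already unit-normalised)."""
    return sum(x * y for x, y in zip(a, b))

battery_doc = "improving lithium battery anode chemistry"
cell_doc = "new cathode materials for lithium cells"
locale_doc = "terminology management in the translate workflow"

fp_a, fp_b, fp_c = map(topic_fingerprint, (battery_doc, cell_doc, locale_doc))
print(similarity(fp_a, fp_b) > similarity(fp_a, fp_c))  # the two battery documents match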
As enterprises become increasingly interested in AI, they are turning to natural language processing to unlock the value of unstructured data contained in text documents. Research firm MarketsandMarkets predicts that the natural language processing market will grow from US$15.7 billion in 2022 to US$49.4 billion in 2027, with a compound annual growth rate (CAGR) of 25.7% during this period.
Let’s take a look at five examples of how organizations are using natural language processing to create business results.
Eli Lilly: Doing business globally through natural language processing
Multinational pharmaceutical company Eli Lilly is using natural language processing to help more than 30,000 employees around the world communicate within the company and Share accurate, timely information externally. Lilly has developed a homegrown IT solution called Lilly Translate that uses natural language processing and deep learning to generate content translations through a proven API layer.
For years, Eli Lilly relied on third-party human translation vendors to translate everything from internal training materials to formal technical exchanges with regulators. Now, Lilly Translate service provides users and systems with real-time translation of Word, Excel, PowerPoint and text, while keeping the document format unchanged. Eli Lilly uses deep learning language models trained on life sciences and Lilly content to help improve translation accuracy, creating refined language models that recognize Lilly-specific terminology and industry-specific technical language while maintaining the format of regulated documents.
Timothy F. Coleman, vice president, information and digital solutions officer, Lilly, said: “Lilly Translate touches every area of the company, from human resources to corporate audit services to the ethics and compliance hotline, Finance, Sales and Marketing, Regulatory Affairs, and many other areas. This saves a huge amount of time, as translation now takes seconds instead of weeks, freeing up key resources to focus on other important tasks. Business activities.”
Coleman’s advice: Support projects that are driven by passion. Lilly Translate began as a passion project by a curious software engineer whose idea was to solve a pain point in the Lilly Regulatory Affairs system portfolio: business partners were constantly experiencing delays and friction in their translation services. Coleman shared the idea and technical vision with other executives and managers and immediately gained project support from Eli Lilly's global regulatory affairs international leadership, who advocated for investment in the tool.
"[The idea] was a great combination of the opportunity to explore and learn about emerging technologies, and what started out as a great learning opportunity has now become one that Lilly software engineers grab and run." Great project opportunity.”
Accenture: Using natural language processing to analyze contracts
Accenture is using natural language processing for legal analysis. Accenture's Legal Intelligent Contract Exploration (ALICE) project helps this global services company with 2,800 professionals conduct text searches in its millions of contracts, including searching for contract terms.
ALICE uses "word embedding", a natural language processing method, which can help compare words based on semantic similarity. The model examines contract documents paragraph by paragraph, looking for keywords to determine whether the paragraph is relevant to a specific contract clause type. For example, words like "flood," "earthquake," or "disaster" often appear with a "force majeure" clause.
Mike Maresca, global managing director for digital business transformation, operations and enterprise analytics at Accenture, said: “As we continue to leverage this capability and continue to enhance it, its use continues to expand and we see additional value. opportunities, and we're looking for new ways to get value from existing data."
Accenture said the project significantly reduces the time lawyers spend manually reading documents to obtain specific information.
Maresca’s advice: Don’t be afraid to delve deeper into natural language processing. “If innovation is part of the culture, you can’t be afraid of failure and let’s experiment and iterate.”
Verizon: Using natural language processing to respond to customer requests
Verizon’s business Service assurance departments are using natural language processing and deep learning to automatically process customer request reviews. The department receives more than 100,000 inbound requests each month, and previously they had to read and take action until Verizon's IT arm—Global Technology Solutions (GTS)—built the AI-Enabled Digital Worker for Service Assurance.
This Digital Worker combines web-based deep learning technology with natural language processing to read repair orders sent primarily via email and the Verizon portal. It automatically responds to the most common requests, such as reports. The current work order status or repair progress is updated, and more complex issues are submitted to human engineers.
"By automating responses to these requests, we can respond within minutes instead of hours after the email is sent," said Stefan Toth, executive director of systems engineering, Global Technology Solutions (GTS), Verizon Business Group explain.
In February 2020, Verizon stated that Digital Worker had saved nearly 10,000 man-hours per month since the second quarter of last year.
Toth’s advice: seek open source. "Look around, network with your business partners, and I'm sure you'll find opportunities. Think about open source and experiment before making a large financial commitment. We found that there is a lot of open source software available now."
Great Wolf Lodge: Using natural language processing-driven AI to track guest sentiment
Artificial Intelligence Lexicographer (GAIL) developed by hospital and entertainment chain Great Wolf Lodge sifts through comments in monthly surveys to determine Whether the author may be a troll, a critic, or a neutral party.
This AI tool utilizes natural language processing and was trained on more than 67,000 reviews specifically for the service industry. GAIL runs in the cloud and uses an in-house developed algorithm to discover the key factors that indicate how respondents feel about Great Wolf Lodge. Great Wolf Lodge stated that as of September 2019, GAIL's accuracy can reach 95%. For a small part of the information that GAIL cannot understand, Great Wolf Lodge will use traditional text analysis to process it.
Great Wolf Lodge Chief Information Officer Edward Malinowski said: "We want to be better able to interact with guests in every aspect."
Great Wolf Lodge's business operations team uses GAIL-generated Insights to adjust their service, the company is currently developing a chatbot to answer guests' frequently asked questions about Great Wolf Lodge service.
Malinowski’s advice: Avoid technology for technology’s sake. Choose tools that strike the right balance between technology and practicality and are aligned with business goals. "You have to be careful about what's a gimmick and what's a real solution to a problem." Provider Contracts app that automatically reads notes on each contract regarding payment, deductibles, and unrelated expense instructions, then calculates pricing and updates claims.
The application blends natural language processing and special database software to identify payment attributes and build additional data that can be automatically read by the system. As a result, many claims are settled overnight.
The app allows Aetna's more than 50 claims adjudicators to refocus on contracts and claims that require higher-level thinking, as well as coordination among different health insurance companies.
"It comes down to providing a better experience for the end user," Aetna Chief Technology Officer Claus Jensen said. The software will help Aetna become a better partner to providers and patients in the healthcare ecosystem. "We do more than just pay bills and answer questions on the phone."
Aetna estimates that as of July 2019, the app has helped them save $6 million annually in processing and rework costs .
Jensen’s advice: Narrow your focus and take your time. In an ideal world, companies would implement AI that can solve very niche problems. Jensen said broad-based solutions are vague and ultimately fail, and if Aetna applies general-purpose AI to their business, it certainly won't work. In addition, Aetna spent several months instrumenting the process, writing rules, and testing the application. Jensen said many people don't have the patience to slow down and do things the right way.
The above is the detailed content of Five success stories explore the business value of natural language processing. For more information, please follow other related articles on the PHP Chinese website!

Hot AI Tools

Undresser.AI Undress
AI-powered app for creating realistic nude photos

AI Clothes Remover
Online AI tool for removing clothes from photos.

Undress AI Tool
Undress images for free

Clothoff.io
AI clothes remover

AI Hentai Generator
Generate AI Hentai for free.

Hot Article

Hot Tools

Notepad++7.3.1
Easy-to-use and free code editor

SublimeText3 Chinese version
Chinese version, very easy to use

Zend Studio 13.0.1
Powerful PHP integrated development environment

Dreamweaver CS6
Visual web development tools

SublimeText3 Mac version
God-level code editing software (SublimeText3)

Hot Topics



This site reported on June 27 that Jianying is a video editing software developed by FaceMeng Technology, a subsidiary of ByteDance. It relies on the Douyin platform and basically produces short video content for users of the platform. It is compatible with iOS, Android, and Windows. , MacOS and other operating systems. Jianying officially announced the upgrade of its membership system and launched a new SVIP, which includes a variety of AI black technologies, such as intelligent translation, intelligent highlighting, intelligent packaging, digital human synthesis, etc. In terms of price, the monthly fee for clipping SVIP is 79 yuan, the annual fee is 599 yuan (note on this site: equivalent to 49.9 yuan per month), the continuous monthly subscription is 59 yuan per month, and the continuous annual subscription is 499 yuan per year (equivalent to 41.6 yuan per month) . In addition, the cut official also stated that in order to improve the user experience, those who have subscribed to the original VIP

Improve developer productivity, efficiency, and accuracy by incorporating retrieval-enhanced generation and semantic memory into AI coding assistants. Translated from EnhancingAICodingAssistantswithContextUsingRAGandSEM-RAG, author JanakiramMSV. While basic AI programming assistants are naturally helpful, they often fail to provide the most relevant and correct code suggestions because they rely on a general understanding of the software language and the most common patterns of writing software. The code generated by these coding assistants is suitable for solving the problems they are responsible for solving, but often does not conform to the coding standards, conventions and styles of the individual teams. This often results in suggestions that need to be modified or refined in order for the code to be accepted into the application

To learn more about AIGC, please visit: 51CTOAI.x Community https://www.51cto.com/aigc/Translator|Jingyan Reviewer|Chonglou is different from the traditional question bank that can be seen everywhere on the Internet. These questions It requires thinking outside the box. Large Language Models (LLMs) are increasingly important in the fields of data science, generative artificial intelligence (GenAI), and artificial intelligence. These complex algorithms enhance human skills and drive efficiency and innovation in many industries, becoming the key for companies to remain competitive. LLM has a wide range of applications. It can be used in fields such as natural language processing, text generation, speech recognition and recommendation systems. By learning from large amounts of data, LLM is able to generate text

Large Language Models (LLMs) are trained on huge text databases, where they acquire large amounts of real-world knowledge. This knowledge is embedded into their parameters and can then be used when needed. The knowledge of these models is "reified" at the end of training. At the end of pre-training, the model actually stops learning. Align or fine-tune the model to learn how to leverage this knowledge and respond more naturally to user questions. But sometimes model knowledge is not enough, and although the model can access external content through RAG, it is considered beneficial to adapt the model to new domains through fine-tuning. This fine-tuning is performed using input from human annotators or other LLM creations, where the model encounters additional real-world knowledge and integrates it

Editor |ScienceAI Question Answering (QA) data set plays a vital role in promoting natural language processing (NLP) research. High-quality QA data sets can not only be used to fine-tune models, but also effectively evaluate the capabilities of large language models (LLM), especially the ability to understand and reason about scientific knowledge. Although there are currently many scientific QA data sets covering medicine, chemistry, biology and other fields, these data sets still have some shortcomings. First, the data form is relatively simple, most of which are multiple-choice questions. They are easy to evaluate, but limit the model's answer selection range and cannot fully test the model's ability to answer scientific questions. In contrast, open-ended Q&A

Machine learning is an important branch of artificial intelligence that gives computers the ability to learn from data and improve their capabilities without being explicitly programmed. Machine learning has a wide range of applications in various fields, from image recognition and natural language processing to recommendation systems and fraud detection, and it is changing the way we live. There are many different methods and theories in the field of machine learning, among which the five most influential methods are called the "Five Schools of Machine Learning". The five major schools are the symbolic school, the connectionist school, the evolutionary school, the Bayesian school and the analogy school. 1. Symbolism, also known as symbolism, emphasizes the use of symbols for logical reasoning and expression of knowledge. This school of thought believes that learning is a process of reverse deduction, through existing

Editor | KX In the field of drug research and development, accurately and effectively predicting the binding affinity of proteins and ligands is crucial for drug screening and optimization. However, current studies do not take into account the important role of molecular surface information in protein-ligand interactions. Based on this, researchers from Xiamen University proposed a novel multi-modal feature extraction (MFE) framework, which for the first time combines information on protein surface, 3D structure and sequence, and uses a cross-attention mechanism to compare different modalities. feature alignment. Experimental results demonstrate that this method achieves state-of-the-art performance in predicting protein-ligand binding affinities. Furthermore, ablation studies demonstrate the effectiveness and necessity of protein surface information and multimodal feature alignment within this framework. Related research begins with "S

According to news from this website on July 5, GlobalFoundries issued a press release on July 1 this year, announcing the acquisition of Tagore Technology’s power gallium nitride (GaN) technology and intellectual property portfolio, hoping to expand its market share in automobiles and the Internet of Things. and artificial intelligence data center application areas to explore higher efficiency and better performance. As technologies such as generative AI continue to develop in the digital world, gallium nitride (GaN) has become a key solution for sustainable and efficient power management, especially in data centers. This website quoted the official announcement that during this acquisition, Tagore Technology’s engineering team will join GLOBALFOUNDRIES to further develop gallium nitride technology. G
