Business Wire

BOOST.AI

11.5.2023 09:02:35 CEST | Business Wire | Press release

Boost.ai Unveils Large Language Model Enhancements to Conversational AI Platform

Boost.ai, a leading conversational AI solution provider, today announced Version 12 of its platform, the first in a series of planned updates incorporating Large Language Model (LLM)-enriched features. This iteration focuses on key customer experience (CX) improvements, including content suggestion, content rewriting and accelerated generation of training data. The update uses generative AI to suggest messaging content to AI Trainers within the boost.ai platform, generating suggested responses and drastically reducing implementation times for new intents. With this latest release, boost.ai reinforces its commitment to researching, developing, releasing, and maintaining responsible implementations of LLM-powered, enterprise-quality conversational AI features that further enhance the customer experience.

ChatGPT has dominated headlines throughout 2023, yet questions about the accuracy of its underlying models, GPT-3.5 and GPT-4, persist. With an 85% accuracy rate, LLMs indicate the potential of this technology in consumer-facing applications, but in their current raw state they lack the dependability required for direct integration into a bank's or insurance firm's systems. Boost.ai helps resolve this issue by seamlessly integrating the predictive capabilities of LLMs with the enterprise-grade control of its conversational AI platform, creating a Hybrid Natural Language Understanding (NLU) system that offers unmatched accuracy, flexibility, and cost-effectiveness. By expertly weaving together the right AI components for each task, boost.ai’s platform ensures precise and accurate answers, harnessing the benefits of LLMs while still adhering to stringent quality assurance requirements.

“LLM technology offers great promise, but most applications just aren’t properly designed to securely and scalably support real-world businesses. With worries about accuracy or even inappropriate behavior, established institutions like banks could not risk direct access to this iteration of generative AI, until now,” said Jerry Haywood, CEO of boost.ai. “By pairing LLMs with our conversational AI, we’re able to ensure accuracy and open the door for customers in sensitive industries like financial services. We’re proud to be pioneering a way forward for businesses to harness this tech right now. It’s available for customers to use and enhance their existing solution, helping them achieve speed to value significantly sooner whilst minimizing the risks currently dominating headlines.”

With a Hybrid NLU approach, enterprises gain the ability to combine boost.ai’s market-leading intent management, context handling, and dialogue management solutions with powerful LLM-enriched tools. Boost.ai’s existing intent engine is highly trained with guardrails in place to help guide the LLM, increasing overall accuracy and reducing the number of false positives. The end result is a chatbot that can confidently provide answers to inquiries, and a more streamlined development path that radically enhances how boost.ai customers can build scalable customer experiences for chat and voice.
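The hybrid pattern described above, a trained intent engine acting as a guardrail in front of a generative model, can be sketched in a few lines. This is an illustrative sketch only: the function names, confidence threshold, and fallback behavior are hypothetical assumptions for demonstration, not boost.ai's actual API or implementation.

```python
from dataclasses import dataclass

@dataclass
class IntentPrediction:
    intent: str
    confidence: float

def classify_intent(utterance: str) -> IntentPrediction:
    # Stand-in for a trained intent engine; a real system would run a
    # supervised classifier over curated, human-reviewed training data.
    known_phrases = {
        "reset my password": "password_reset",
        "what is my balance": "account_balance",
    }
    for phrase, intent in known_phrases.items():
        if phrase in utterance.lower():
            return IntentPrediction(intent, 0.95)
    return IntentPrediction("unknown", 0.20)

# Hypothetical threshold: below it, the intent engine defers to the LLM.
CONFIDENCE_THRESHOLD = 0.80

def route(utterance: str) -> str:
    """Hybrid routing: serve a curated answer when the intent engine is
    confident, otherwise fall back to guarded LLM generation."""
    pred = classify_intent(utterance)
    if pred.confidence >= CONFIDENCE_THRESHOLD:
        # High confidence: return the pre-approved answer for the intent,
        # keeping false positives and off-script replies in check.
        return f"curated_answer:{pred.intent}"
    # Low confidence: defer to an LLM constrained by guardrails
    # (e.g. restricted to approved knowledge sources and tone).
    return "llm_fallback:guarded_generation"
```

The design choice this sketch illustrates is that the deterministic intent layer always gets the first pass, so the generative model only handles the long tail of inquiries the curated model cannot answer confidently.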

“With our Hybrid NLU, we’ve been able to surpass the performance of either model on its own, providing our customers with the best of both worlds,” said Lars Ropeid Selsås, Founder of boost.ai. “Boost.ai will continue to operate at the cutting edge of AI possibility, refining our platform so that our customers are always receiving the best possible technology and service.”

On May 9th, Boost.ai co-founder Henry Vaage Iversen and the technical team conducted a webinar highlighting the strengths of Hybrid NLU and mapping out a path for businesses to start using LLM technology to their advantage. For a recording of the full webinar, please visit: https://www.boost.ai/webinars/enhancing-cx-and-reducing-operational-costs-with-boost-ai-large-language-models

About Boost.ai

Boost.ai is a global leader in conversational AI optimized for scale. Boasting the industry's most robust intent portfolio, Boost.ai is pioneering an era of broad-scope virtual agents to deliver the most advanced and scalable technology on the market. With consistent resolution rates of 90%, Boost.ai’s market-leading virtual agent supports enterprise customers across key industries throughout the United States and Europe, including banking, insurance, telecom, retail and more. In 2021, Boost.ai was named a Major Player in the IDC MarketScape for Worldwide Conversational AI Platforms for Customer Service. Key customers include Santander Bank, MSU Federal Credit Union, Aspire General Services, Tokio Marine and more. Learn more at boost.ai.


View source version on businesswire.com: https://www.businesswire.com/news/home/20230511005082/en/
