Business Wire

GCORE

6.6.2024 10:31:30 CEST | Business Wire | Press release

Gcore Unveils Inference at the Edge – Bringing AI Applications Closer to End Users for Seamless Real-Time Performance

Gcore, the global edge AI, cloud, network, and security solutions provider, today announced the launch of Gcore Inference at the Edge, a breakthrough solution that provides ultra-low latency experiences for AI applications. This innovative solution enables the distributed deployment of pre-trained machine learning (ML) models to edge inference nodes, ensuring seamless, real-time inference.

This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20240606719181/en/


Gcore Inference at the Edge empowers businesses across diverse industries with cost-effective, scalable, and secure AI model deployment (Graphic: Gcore)

Gcore Inference at the Edge empowers businesses across diverse industries—including automotive, manufacturing, retail, and technology—with cost-effective, scalable, and secure AI model deployment. Use cases such as generative AI, object recognition, real-time behavioural analysis, virtual assistants, and production monitoring can now be rapidly realised on a global scale.

Gcore Inference at the Edge runs on Gcore's extensive global network of 180+ edge nodes, all interconnected by Gcore's sophisticated low-latency smart routing technology. Each high-performance node sits at the edge of the Gcore network, strategically placing servers close to end users. Inference at the Edge runs on NVIDIA L40S GPUs, market-leading chips designed specifically for AI inference. When a user sends a request, an edge node routes it to the nearest available inference region with the lowest latency, achieving a typical response time of under 30 ms.
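The routing decision described above can be sketched in a few lines: given measured round-trip latencies from an edge node to candidate inference regions, the node selects the region with the lowest figure. This is an illustrative sketch only, not Gcore's actual routing code; the region names and latency values are hypothetical.

```python
# Illustrative sketch of lowest-latency region selection
# (hypothetical regions and measurements, not Gcore's implementation).

def pick_region(latencies_ms):
    """Return the candidate region with the lowest measured latency.

    latencies_ms: dict mapping region name -> round-trip latency in ms.
    """
    if not latencies_ms:
        raise ValueError("no candidate regions")
    # min() with a key function returns the dict key whose value is smallest.
    return min(latencies_ms, key=latencies_ms.get)

# Example: hypothetical measurements taken from one edge node.
measurements = {"frankfurt": 12.4, "amsterdam": 9.8, "warsaw": 21.1}
print(pick_region(measurements))  # amsterdam
```

In practice such a decision would also account for region availability and current load, but the core idea is a minimisation over measured latencies.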

The new solution supports a wide range of foundational ML models and custom models. Available open-source foundation models in the Gcore ML Model Hub include LLaMA Pro 8B, Mistral 7B, and Stable Diffusion XL. Models can be selected and trained agnostically to suit any use case before being distributed globally to Gcore Inference at the Edge nodes. This addresses a significant challenge for development teams: AI models are typically run on the same servers they were trained on, resulting in poor performance.

Benefits of Gcore Inference at the Edge include:

  • Cost-effective deployment: A flexible pricing structure ensures customers only pay for the resources they use.
  • Inbuilt DDoS protection: ML endpoints are automatically protected from DDoS attacks through Gcore’s infrastructure.
  • Outstanding data privacy and security: The solution features built-in compliance with GDPR, PCI DSS, and ISO/IEC 27001 standards.
  • Model autoscaling: Autoscaling is available to handle load spikes, so a model is always ready to support peak demand and unexpected surges.
  • Unlimited object storage: Scalable S3-compatible cloud storage that grows with evolving model needs.
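The autoscaling behaviour listed above can be illustrated with a minimal policy that sizes replicas to the current request rate, clamped between a floor and a ceiling. This is a simplified sketch under assumed thresholds; the function name and parameters are hypothetical and not part of the Gcore product.

```python
# Minimal autoscaling sketch (hypothetical policy, not Gcore's implementation):
# scale replica count to the observed request rate, within fixed bounds.
import math

def desired_replicas(current_rps, rps_per_replica, min_replicas=1, max_replicas=10):
    """Return the replica count needed to serve current_rps requests/second,
    assuming each replica handles rps_per_replica, clamped to [min, max]."""
    needed = math.ceil(current_rps / rps_per_replica)
    return max(min_replicas, min(max_replicas, needed))

# A spike of 450 req/s with replicas rated at 100 req/s each needs 5 replicas.
print(desired_replicas(450, 100))  # 5
```

Real autoscalers typically add smoothing and cooldown periods to avoid flapping, but the core calculation is this load-to-capacity ratio.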

Andre Reitenbach, CEO of Gcore, comments: “Gcore Inference at the Edge empowers customers to focus on getting their machine learning models trained, rather than worrying about the costs, skills, and infrastructure required to deploy AI applications globally. At Gcore, we believe the edge is where the best performance and end-user experiences are achieved, and that is why we are continuously innovating to ensure every customer receives unparalleled scale and performance. Gcore Inference at the Edge delivers all the power with none of the headache, providing a modern, effective, and efficient AI inference experience.”

Learn more at https://gcore.com/inference-at-the-edge

About Gcore
Gcore is the global edge AI, cloud, network, and security solutions provider. Gcore provides its solutions to global leaders in numerous industries. The company manages its own global IT infrastructure across six continents, delivering some of the best network performance in Europe, Africa, and LATAM, with an average response time of 30 ms worldwide. Gcore’s network consists of 180+ points of presence around the world in reliable Tier IV and Tier III data centres, with a total capacity exceeding 200 Tbps.


View source version on businesswire.com: https://www.businesswire.com/news/home/20240606719181/en/

About Business Wire

Business Wire
101 California Street, 20th Floor
San Francisco, CA 94111

http://businesswire.com
