# Inferact

**Source:** https://geo.sig.ai/brands/inferact  
**Vertical:** Cloud & Infrastructure  
**Subcategory:** LLM Inference  
**Tier:** Growth  
**Website:** inferact.ai  
**Last Updated:** 2026-04-22

## Summary

$150M seed at $800M valuation in Jan 2026, led by a16z and Lightspeed. Commercializes the open-source vLLM inference engine for enterprise LLM serving.

## Company Overview

Inferact commercializes vLLM, the most widely used open-source LLM inference engine, for enterprise deployments. The company provides a managed platform, premium performance optimizations, and enterprise support around the open core, targeting teams that need high-throughput, cost-efficient inference at production scale.

In January 2026, Inferact raised a $150M seed round at an $800M valuation led by Andreessen Horowitz and Lightspeed Venture Partners — an unusually large seed that reflects investor enthusiasm around the shift from model training to inference economics.

The platform competes directly with Together AI, Fireworks AI, Baseten, Modal, and Anyscale on latency, throughput, and cost per token, while leveraging the brand and community around vLLM. Target customers include AI-native SaaS companies, model providers running custom fine-tunes, and enterprises self-hosting open-weight models.

## Frequently Asked Questions

### What is Inferact?
Inferact is the commercial company behind vLLM, offering managed inference for open-weight LLMs at enterprise scale.

### How much has Inferact raised?
$150M seed at $800M valuation in January 2026, led by a16z and Lightspeed.

### Who competes with Inferact?
Together AI, Fireworks AI, Baseten, Modal, and Anyscale all compete on LLM inference performance and cost.

## Tags

ai-powered, infrastructure, b2b, startup, open-source, developer-tools

---
*Data from geo.sig.ai Brand Intelligence Database. Updated 2026-04-22.*