
Open-Source Agentic AI Langbase Makes AI ‘Composable’, Transforming 50 Million Developers into AI Engineers


“I never wanted to be a founder… I’m a developer; I love writing code, and I want to enable anyone to build with AI.”

Conventional startup wisdom shuns such an honest admission. But there’s a palpable charisma when speaking with Ahmad Awais, the Founder and CEO of Langbase, that reflects a rarefied form of leadership. If this persuasive raison d’être alone is what convinces some of the greatest investors in the world to back his Agentic AI startup, then all the YC/a16z/Khosla principles for startup success are debunked.

If this is the key mentality to startup success, lock us up and throw away the key.

But it’s not; we’re free for another day. Awais is what we would crown a ‘tech OG’, having built much of the web’s foundational infrastructure we enjoy today, this website and its features included. His persona is the culmination of decades of rich experience. It’s earned. Now he’s channeling that experience into what’s become the leading platform for building Large Language Model (LLM) apps.

In an exclusive interview with StartupHub.ai, Awais makes his first media reveal, providing commentary on his startup’s progress to date and his plans to enable the future of Agentic AI.

COVID fortunes

No stranger to the developer community, with over two decades of experience, Awais has made lasting contributions to WordPress Core, the Node.js Foundation, and the open-source React ecosystem. It was the unexpected success of a side project during the early days of COVID-19 that set the stage for Langbase.

Ahmad Awais, founder and CEO of Langbase, is the former VP of Developer Relations at RapidAPI and a founding member of the Google Developers Advisory Board (gDAB). His work in open-source engineering has earned a host of accolades, including the prestigious GitHub Stars Award. His contributions have been integral to the open-source web community, spanning projects like the NASA Mars Ingenuity Helicopter mission (which he facetiously touts as his short-lived run in space exploration) and WordPress Core. Credit: Langbase.

“I built Corona CLI, a command-line tool to track COVID-19 data. It wasn’t supposed to be over-engineered, but it went viral,” Awais explained. The simple project quickly gained unexpected traction, processing over 6 billion API requests. Its success caught the eye of OpenAI’s Greg Brockman and became a watershed moment, prompting Awais to tinker in the AI space.

Initially skeptical, Awais found inspiration in the capabilities of OpenAI’s models. After securing early access to GPT-3 in mid-2020 and engaging with Brockman, he began to experiment, realizing the potential of integrating AI with developer tools. This insight—particularly in automating code suggestions to streamline the development process—became the impetus for Langbase.

“The biggest beneficiaries of AI are likely to be web developers—they just don’t realize it yet.”

Langbase is Awais’ answer to the growing complexity gridlocking the AI landscape. “Developers are going to use AI the most, at least for the next five to ten years,” he mused. Langbase is built on the premise of composability, or ‘Composable AI,’ a concept deeply rooted in software engineering but applied to AI in a way that empowers developers.

Drawing an analogy between the evolution of web development and the inextricable concept of components, he recounted how the web transitioned from monolithic structures—where everything was bundled into a single HTML file, with CSS and JavaScript interwoven—to a more modular approach. This allowed a piece of HTML, CSS, and JavaScript to be managed independently, enabling developers to update components across projects without breaking the entire system. A similar evolution occurred with servers, where developers initially struggled to run multiple languages like PHP and Python together (“used to be a lot of duct tape involved,” he noted) until Docker containers made them portable and scalable. Now, components have made their way into AI. It’s called AI PIPEs.

The latest advancements in composability have been driven by the rise of personalization techniques, such as few-shot learning, Retrieval-Augmented Generation (RAG), and ReAct (Reasoning and Acting). Awais recognized that developers needed more robust tools, including system instructions, forking and versioning of prompts, and the ability to fine-tune models. “Then, Google introduced a 2 million token length capability, reducing the necessity for RAG, and suddenly, prompt engineering became cool again.” This shift led to further experimentation, evaluations, and the development of inference API engines.
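
For readers newer to these terms, here is a minimal conceptual sketch of the RAG pattern mentioned above: retrieve the most relevant snippets from a knowledge base and prepend them to the prompt before calling the model. The keyword scoring and the callModel parameter are simplified placeholders of our own, not any particular vendor’s API.

```typescript
// Conceptual sketch of Retrieval-Augmented Generation (RAG).
// The retrieval scoring and callModel are simplified placeholders,
// not any specific provider's API.

type Doc = { id: string; text: string };

// Naive keyword-overlap scoring stands in for real embedding similarity search.
function retrieve(query: string, docs: Doc[], k = 3): Doc[] {
  const terms = query.toLowerCase().split(/\s+/);
  return docs
    .map((d) => ({
      doc: d,
      score: terms.filter((t) => d.text.toLowerCase().includes(t)).length,
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map((x) => x.doc);
}

async function answerWithRag(
  question: string,
  docs: Doc[],
  callModel: (prompt: string) => Promise<string>
): Promise<string> {
  // Inject the retrieved context into the prompt so the model can
  // ground its answer in the provided documents.
  const context = retrieve(question, docs)
    .map((d) => d.text)
    .join("\n---\n");
  const prompt = `Answer using only this context:\n${context}\n\nQuestion: ${question}`;
  return callModel(prompt);
}
```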

Langbase answers the call for the next step in that succession, continuing the never-ending loop of innovation in software engineering for AI.

Composable AI for Every Developer

Langbase makes AI development accessible to every developer, not just ML experts. Awais envisions a future where AI infrastructure is as easy to use as ‘npm install’. “Our mission is clear: AI for all. Not just ML wizards, but every single developer.” That vision is reflected in Langbase’s flagship product: PIPEs, custom-built AI agents available as an API.

A PIPE, which stands for Prompt Instructions Personalization Engine, is a modular AI component that functions similarly to a Docker container or a React component, but for AI. PIPEs are designed to be highly composable, auto-scalable, and forkable, allowing developers to mix and match different AI models, datasets, and instructions to build complex AI functionality.

Awais considers it the smallest possible unit, staying true to composability.
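
To make that concrete, here is a rough sketch of what such a unit might look like as a data structure in TypeScript. The field names and shape are our own illustration of the concept, not Langbase’s actual schema.

```typescript
// Illustrative sketch of a "pipe" as the smallest composable AI unit.
// Field names are hypothetical, not Langbase's actual schema.
interface PipeDefinition {
  name: string;                       // unique, forkable identifier
  model: string;                      // whichever supported LLM the pipe runs on
  instructions: string;               // system prompt defining the agent's behavior
  variables?: Record<string, string>; // slots personalized per request
  memory?: string[];                  // attached data sources for retrieval
}

// Example: a small, single-purpose agent that can be forked and reused.
const emailSummarizer: PipeDefinition = {
  name: "email-summarizer",
  model: "openai:gpt-4o-mini",        // hypothetical provider:model string
  instructions: "Summarize the given email in three bullet points.",
  variables: { tone: "neutral" },
};
```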

Each PIPE can connect to various LLMs and integrate with different data sources, enabling developers to customize the behavior and performance of the AI based on their specific needs. PIPEs also come with built-in infrastructure, which means developers don’t have to worry about setting up and managing the underlying technology—Langbase handles it. 

In other words, developers can focus on crafting innovative AI solutions without the hindrance of backend operations.

The crux of it is that every PIPE comes with a corresponding API, so one PIPE can connect to another. To that end, Langbase is demonstrably composable, and hence, composable AI.
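
As a rough illustration of that idea, the sketch below chains two agents over HTTP, feeding the output of one into the other. The endpoint path, payload shape, and pipe names are hypothetical placeholders rather than Langbase’s documented API.

```typescript
// Hypothetical sketch of pipe-to-pipe composition over HTTP.
// The endpoint, payload shape, and pipe names are placeholders,
// not Langbase's documented API.
const API_KEY = "YOUR_API_KEY"; // placeholder credential

async function runPipe(pipeName: string, input: string): Promise<string> {
  const res = await fetch(`https://api.example.com/v1/pipes/${pipeName}/run`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({ input }),
  });
  const data = (await res.json()) as { output: string };
  return data.output;
}

// Compose two agents: the output of the first becomes the input of the second.
async function summarizeAndTranslate(email: string): Promise<string> {
  const summary = await runPipe("email-summarizer", email);
  return runPipe("english-to-french-translator", summary);
}
```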

The platform supports over 100 LLMs from the likes of OpenAI, Anthropic, Google, and emerging players such as Mistral and Fireworks. This universal compatibility, combined with Langbase’s zero-configuration infrastructure, enables developers to experiment with different LLMs, optimize their applications, and deploy AI features at an unprecedented pace and scale.

⌘ Langbase Studio: The composable approach to AI. API-first, real-time collab, zero-config observability. From devs to CxOs, everyone’s in the loop. It’s about shipping AI products that matter, quickly. Credit: Langbase.

“If you’re just giving your customers prompt engineering, that’s very limiting. You need to provide them with an entire composable pipeline,” Awais emphasized. This approach allows developers to personalize and optimize AI models for their specific needs, whether it’s summarizing emails, generating code, or building complex AI-driven applications.

Langbase’s composable nature extends beyond just the models. For good measure, Awais added managed long-term Memory and native RAG features to Langbase, with lower latency and at a lower cost than prevailing market offerings. Those features alone could be standalone startup premises. The platform also includes LLMOps, preview deployments, memory management, customizable rate limits, and personalized APIs.

As an open-source offering, Langbase contributes to a landscape where AI-related contributions on GitHub surged by 45% in the past year, according to GitHub’s Octoverse report, The State of Open Source and Rise of AI in 2023. Features like open PIPEs (open-source AI) allow developers to see what others are building, replicate it, and even improve upon it.

They’re targeting 50 million developers the world over, all primed to ship AI products.

This developer-first approach has already attracted hundreds of paying users and customers, and the platform has processed over 271 million API requests this year alone. “We’ve processed 63 billion AI tokens this year… This is working,” Awais asserted.

Langbase is also equipped with a host of observability features to bring full production visibility. Expected LLM costs, traceability, and logs of each run are all baked into the interface.

In the near future, the platform will feature an LLM router to optimize the selection of the right LLM for each task. “Two major new products in our roadmap, we’re excited to launch soon,” Awais noted.

On comparisons to LangChain, Awais was magnanimous. But under a repeated line of questioning, the ‘duct tape’ metaphor did come about. “It’s useful until you want to build something.”

Looking ahead, Awais sees Langbase as a foundational platform that will continue to evolve with the needs of developers, laser-focused on providing a top-notch developer experience.

Awais admitted his original intention was to fund someone else to build Langbase. But the tidal currents of entrepreneurship were too strong for him to resist. 

Earlier this year, Langbase raised a pre-seed funding round from 24 elite angel investors, including several a16z scouts, Guy Podjarny (founder of Snyk), Luca Maestri (CFO of Apple), and GitHub co-founder Tom Preston-Werner, who likened the platform to “what GitHub would look like if it were built for the software of today, with LLMs.”

In the mitosis-like market of AI tools, where a new, indistinguishable offering appears every week, Composable AI and developer upfitting set Langbase in the upper-right corner of the first quadrant. Awais’ prediction: the web will be the biggest beneficiary of AI. As such, “this is the paradigm people need to know about.”

