New in June 2023

LangChain adds Baseten integration

LangChain isn’t just a popular open-source toolkit for building with LLMs; it’s a testament to the energy radiating from the AI dev ecosystem. LangChain provides tooling and standards for building apps like chatbots, agents, and document-specific question answering across a wide range of models and platforms.

from langchain.llms import Baseten

# Replace MODEL_VERSION_ID with the version ID of a model deployed on your Baseten account
wizardlm = Baseten(model="MODEL_VERSION_ID", verbose=True)
wizardlm("What is the difference between a Wizard and a Sorcerer?")

With LangChain’s new Baseten integration, you can power any LangChain app with models hosted on your Baseten account. Build with LLMs like Falcon, WizardLM, and Alpaca in just a few lines of code!
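The wrapper drops straight into standard LangChain patterns. Here’s a minimal sketch of using a Baseten-hosted model inside an LLMChain; the prompt, variable names, and MODEL_VERSION_ID are illustrative placeholders, not part of the integration itself:

from langchain import LLMChain, PromptTemplate
from langchain.llms import Baseten

# Wrap a model deployed on your Baseten account
llm = Baseten(model="MODEL_VERSION_ID")

# A simple prompt template with one input variable
prompt = PromptTemplate(
    input_variables=["character_class"],
    template="Explain the core abilities of a {character_class} in one paragraph.",
)

# Chain the prompt and the Baseten-hosted LLM together
chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run(character_class="Wizard"))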

Get started:

All-new model logs on Baseten

Next time you deploy a model on Baseten, check out the refreshed model logs tab to see deployment progress and information on any issues.

Model deployment logs on the new logs tab

Falcon soars to top LLM leaderboard

Hugging Face’s Open LLM Leaderboard shows the rapid progress that open-source LLMs have made since LLaMA was introduced in February. For the last few weeks, there’s been a new champion on the board: Falcon-40B.

Falcon-40B Instruct has everything you need in an open-source LLM: it performs well on benchmark tasks, has a GPT-like interface, and is licensed for commercial use.

To experiment with Falcon-40B, Baseten engineer Sid Shanker deployed the model and tested it on a variety of tasks from writing recipes to fiction to code. Check out his writeup to see for yourself how Falcon performed.

Falcon-40B runs on two A100 GPUs. But you can run the still-impressive seven-billion-parameter version on a single A10. Deploy Falcon-7B in two clicks from our model library for inexpensive experimentation, or deploy the full model from GitHub with Truss for more robust tasks.
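Once deployed, you can call the model like any other Baseten model. Here’s a rough sketch of invoking a Falcon-7B deployment over the REST API with Python; the version ID, payload shape, and response format are assumptions, so check the model page in your Baseten account for the exact call:

import os
import requests

# Call a Falcon-7B deployment on Baseten (illustrative; adjust the version ID
# and payload to match what your model page shows)
resp = requests.post(
    "https://app.baseten.co/model_versions/MODEL_VERSION_ID/predict",
    headers={"Authorization": f"Api-Key {os.environ['BASETEN_API_KEY']}"},
    json={"prompt": "Write a short poem about GPUs."},
)
resp.raise_for_status()
print(resp.json())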

Foundation models 101

With Baseten’s model library, you can get your own instance of open-source foundation models on autoscaling production infrastructure. But what is a foundation model? And once you have it, how can you adapt it to fit your use case?

Foundation models are models that:

  • are trained on a dataset that is both broad in scope and massive in size

  • can be further adapted to a wide variety of downstream applications

For a complete rundown of how these models are created and used, here’s a primer on foundation models.

The whole point of foundation models is adapting them to your needs. Check out this overview of prompt engineering, vector databases, and fine-tuning to pick the approach that works for your project.
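As a taste of the vector database approach, here’s a minimal sketch that retrieves a relevant snippet and folds it into the prompt for a Baseten-hosted model. The documents, question, and MODEL_VERSION_ID are placeholders, and the in-memory FAISS index stands in for a managed vector database:

from langchain.embeddings import HuggingFaceEmbeddings
from langchain.llms import Baseten
from langchain.vectorstores import FAISS

# Toy knowledge base; in practice, these would be your own documents
docs = [
    "Baseten deploys models behind autoscaling endpoints.",
    "Truss packages a model's code, weights, and environment for serving.",
]

# Embed the documents and index them in an in-memory vector store
db = FAISS.from_texts(docs, HuggingFaceEmbeddings())

# Retrieve the most relevant document and fold it into the prompt
question = "How do I package a model for deployment?"
context = db.similarity_search(question, k=1)[0].page_content
prompt = f"Answer using this context:\n{context}\n\nQuestion: {question}"

# Send the augmented prompt to a model deployed on your Baseten account
llm = Baseten(model="MODEL_VERSION_ID")
print(llm(prompt))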

Thanks all!

— The team at Baseten
