Jul 6, 2021
To keep sensitive data like API keys and passwords secret, you can now store them in the secret manager instead of including them in plaintext in your code, and retrieve them only when needed. To do so, open the secret manager from the settings menu.
Once the secret data is there and given a name, you can retrieve it in your code using context.get_secret('my_secret_name').
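For example, a worklet that calls an external API might retrieve its key as in the sketch below. Only context.get_secret comes from this entry; the secret name, the URL, and the surrounding function are illustrative placeholders.

    import requests

    # Hypothetical worklet: the function and URL are illustrative; only
    # context.get_secret(...) is the documented call from this entry.
    def fetch_report(context):
        # Retrieve the stored secret by the name you gave it in the secret manager
        api_key = context.get_secret('my_secret_name')

        # Use the secret at call time instead of hard-coding it in source
        response = requests.get(
            'https://api.example.com/report',
            headers={'Authorization': f'Bearer {api_key}'},
        )
        return response.json()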
May 5, 2021
Your Baseten applications and machine learning models are restricted to your organization — no one outside of your Baseten organization can access them.
However, you can now explicitly share an application with the outside world. You do so through the Share UI in the application header.
Note: once you make an application public, all its views, worklets, and queries are publicly accessible.
Apr 14, 2021
Allowing you to write Python code without worrying about its execution environment is at the core of Baseten — no Dockerfiles, no Kubernetes deployments, no Flask app boilerplate.
However, this doesn't come at the expense of control. You now have full control over the Python environment that executes your code, whether you want to install a third-party package from PyPI or a custom wheel.
See it in action below and read about it on our docs.
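As a rough sketch of what this enables, suppose you add the requests package from PyPI (plus a custom wheel) through the environment settings; the package choice, version pin, wheel name, and function below are illustrative, not part of Baseten's documented interface.

    # Assumed environment additions (configured in the Baseten UI, not in code):
    #   requests==2.25.1        <- a third-party package from PyPI
    #   my_internal_utils.whl   <- a custom wheel you upload
    #
    # Once installed, your code can import them like any other package.
    import requests

    def check_status(url):
        # Uses the PyPI package installed through the custom environment
        return requests.get(url).status_code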
Mar 3, 2021
Until now, there were two ways to make your Baseten apps data-aware:
Call external API endpoints from worklets
Connect to external data stores
Both of these require an external service or data store to already exist, whereas many Baseten users want to build self-sufficient apps that generate new data (e.g., data labeling or content moderation).
Today we're announcing the availability of the Baseten data store: a managed database plus data modeling and access tools built on familiar technologies like Postgres and SQLAlchemy. No more provisioning RDS instances, managing security groups, or writing DDL statements. Instead, you can define your data model, iterate on it, and access it using both raw SQL and the SQLAlchemy ORM.
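To give a sense of the workflow, here is a minimal SQLAlchemy-style sketch of defining and querying a table. The Label model, its columns, and the engine setup are hypothetical stand-ins; in practice, Baseten manages the underlying Postgres database for you.

    from sqlalchemy import Column, Integer, String, create_engine
    from sqlalchemy.orm import declarative_base, sessionmaker

    Base = declarative_base()

    # Hypothetical table for a data-labeling app
    class Label(Base):
        __tablename__ = 'labels'
        id = Column(Integer, primary_key=True)
        item_id = Column(String, nullable=False)
        value = Column(String, nullable=False)

    # In-memory SQLite engine just to keep this sketch self-contained
    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)
    Session = sessionmaker(bind=engine)

    session = Session()
    session.add(Label(item_id='image-42', value='cat'))
    session.commit()

    # Query back through the ORM rather than raw SQL
    print(session.query(Label).filter_by(value='cat').count())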
Mar 2, 2021
Baseten makes it easy for you to deploy your own machine learning models behind RESTful, scalable APIs.
Now, we're introducing the Baseten model zoo, which lets you deploy state-of-the-art, pretrained models on Baseten. The model zoo includes models from Hugging Face, TensorFlow Hub, and OpenAI.
Using these models is easy — choose your model, hit deploy, and in minutes have APIs ready to be used in your application or within Baseten's application building framework.
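Once deployed, a model can be invoked over its REST API. The sketch below is illustrative only: the endpoint URL, auth header format, and request shape are placeholders to be replaced with the values shown for your deployed model in Baseten.

    import requests

    # Placeholder values: substitute the endpoint and API key for your model
    MODEL_ENDPOINT = 'https://app.baseten.co/models/MODEL_ID/predict'
    API_KEY = 'YOUR_API_KEY'

    response = requests.post(
        MODEL_ENDPOINT,
        headers={'Authorization': f'Api-Key {API_KEY}'},
        json={'inputs': ['This model zoo release is great!']},
    )
    print(response.json())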
We're adding new models often — if there's any model you'd like to see in the Baseten model zoo, please let us know; we'd love to add them.