Generative AI and Streamlit: A perfect match

The future is about to get interesting…

Five years ago, we started Streamlit to help you share your work through beautiful web apps. Our vision was to create a thin wrapper around the Python ecosystem, allowing you to bring APIs, models, and business logic to life just by typing.

Today, generative AI is exploding into the Python ecosystem, and Streamlit has been right there. In fact, it's so easy that…

…LLMs can write a Streamlit app for you!

[Video, 0:35: an LLM writing a Streamlit app]

Streamlit is the UI powering the LLM movement

How is it that LLMs are so good at writing Streamlit apps? It's because of all of the Streamlit code the community has shared!

  • Over 190,000 snippets of Streamlit code (and counting) exist on GitHub alone, all of which have helped train GPT-4 and other LLMs. This means that analysts, data scientists, and even students can quickly perform analyses, rough out new apps, and weave auto-generated Streamlit fragments throughout other apps.
  • Over 5,000 LLM-powered Streamlit apps (and counting) have already been built on the Community Cloud alone. And these numbers are growing rapidly every day.

What does this mean for the future?

With its simple API and design, Streamlit has surfed each wave of generative AI advances, helping you effortlessly bring LLM-powered apps to life.

But what is even more important is how the rest of the LLM ecosystem is using Streamlit and how we can weave this all together for even more powerful apps.

💬 Chat interfaces as a key UI paradigm

Streamlit is perfectly suited for generative AI with its simple API, real-time visualization, interactive abilities, and user-friendly interfaces.

We are releasing chat elements soon that will help you make all the LLM-powered apps you can dream up.

⚙️ Easily connect to the LLM ecosystem

As LLMs continue to gain popularity, a vast ecosystem of tools is emerging around them. Here are some of the tools that already work well with Streamlit.

🦜🔗 LangChain's callback system to see what an LLM is thinking

As we move beyond simple chatbots to more complex applications, it'll be critical to understand how LLMs think and the steps they take to answer questions. We've enabled you to add intermediate step information to your Streamlit + LangChain app with just one command:

[Video, 0:15: an agent's intermediate steps rendered live in a Streamlit app]

In addition to LangChain, there are other tools in the LLM ecosystem that we're excited about. We'll be announcing more integrations soon, so stay tuned!

❄️ Unlock business insights with LLMs in Snowflake

Since our acquisition by Snowflake in 2022, we've been integrating Streamlit's functionality into Snowflake's enterprise-grade data solutions, used by some of the largest companies in the world.

We'll share more at Snowflake Summit 2023, June 26-29 (you can still register here). Here are a few Streamlit + LLM sessions:

  1. Unleashing the Power of Large Language Models with Snowflake with Adrien Treuille, Streamlit co-founder, and Richard Meng, Senior Software Engineer.
  2. GPTZero: Idea to Iteration Powered by Streamlit and Snowflake with Caroline Frasca and Edward Tian.
  3. Keynote: Generative AI and LLMs in the enterprise

Check out more LLM talks here (you can watch them online after the Summit).

Let's build the future together

This community's engagement, creativity, and innovation are absolutely incredible. Thank you for inspiring our roadmap with your contributions.

Your support, combined with the rapid advancements coming from generative AI, will enable us to release new features that help shape the future of apps with LLMs. We want to hear from you!

In the meantime, check out our new generative AI hub and join us on the forum to share your ideas.

Let's do this together!

Love,

Adrien, Amanda, Thiago, and everyone at Streamlit. 🎈
