What is Hugging Face?
The Hugging Face Hub is a collaboration platform that hosts a huge collection of open-source models and datasets for machine learning: think of it as GitHub for ML. The Hub facilitates sharing and collaboration by making it easy to discover, learn about, and interact with useful ML assets from the open-source community. The Hub also integrates with the Transformers library: models loaded with Transformers are downloaded from the Hub.
Concepts around Hugging Face
Hugging Face Transformers
Hugging Face Transformers is an open-source Python library that provides access to thousands of pre-trained Transformer models for natural language processing (NLP), computer vision, audio tasks, and more. It simplifies the process of implementing Transformer models by abstracting away the complexity of training or deploying models in lower-level ML frameworks like PyTorch, TensorFlow, and JAX.
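As a quick illustration, the library's pipeline API reduces a task like summarization to a couple of lines. The helper below is a minimal sketch: the transformers import is deferred on purpose so the function can also be exercised with a stand-in summarizer, and the default model is downloaded from the Hub on first use.

```python
def summarize(text, summarizer=None):
    """Return a summary of `text` using a Transformers pipeline.

    If no pipeline is supplied, the default summarization pipeline is
    created, which downloads a model from the Hub on first use.
    """
    if summarizer is None:
        from transformers import pipeline  # heavy import, deferred on purpose
        summarizer = pipeline("summarization")
    # Pipelines return a list of dicts, one per input.
    return summarizer(text)[0]["summary_text"]
```

Calling `summarize("...")` with no second argument uses the library's default summarization model; a specific model can be selected with `pipeline("summarization", model=...)`.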
Hugging Face Spaces
Spaces is a service on the Hugging Face Hub that provides an easy-to-use GUI for building and deploying web-hosted ML demos and apps. The service lets you quickly build ML demos, upload your own apps to be hosted, or select from a number of pre-configured ML applications to deploy instantly.
Let’s deploy an ML application on Hugging Face
Step 1: Create a Hugging Face account
Go to hf.co, click “Sign Up”, and create an account. Add your billing information: within your HF account, go to Settings > Billing and add your credit card to the payment information section.
We need an NVIDIA T4 small GPU to run the summarization model in this tutorial.
Step 2: Create the space
Once logged in, go to the “Spaces” tab and create a new Streamlit Space with NVIDIA T4 small hardware. Choose a sleep time to avoid being billed while your application is idle, and make the Space public or private. Once the Space is created (hello-world in this example), you can clone it with git and add your code.
git clone [email protected]:spaces/doveaia/hello-world
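As an alternative to the web UI, the huggingface_hub Python library can also create a Space programmatically. The sketch below assumes you have already authenticated with `huggingface-cli login`; the repo id is the one from this tutorial.

```python
from huggingface_hub import create_repo

def make_streamlit_space(repo_id: str, private: bool = False) -> str:
    """Create a Streamlit Space on the Hub and return its URL."""
    url = create_repo(
        repo_id,
        repo_type="space",      # a Space rather than a model or dataset repo
        space_sdk="streamlit",  # matches the sdk field in the Space's README
        private=private,
    )
    return str(url)

# Requires an authenticated token, so it is not executed here:
# make_streamlit_space("doveaia/hello-world")
```

Hardware and sleep time can still be adjusted afterwards in the Space's settings page.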
There is a README.md in the repo containing a YAML header block describing the Space:
```yaml
---
title: Hello World
emoji: 🐢
colorFrom: red
colorTo: gray
sdk: streamlit
sdk_version: 1.41.1
app_file: app.py
pinned: false
---
```
The takeaway from this metadata is that the app file is named app.py.
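With the Space cloned, deployment is just the usual git cycle: add files, commit, and push, and the Space rebuilds. The sketch below uses a local bare repository as a stand-in for the Space remote so the commands can be tried offline; on a real Space the remote is the clone URL above, and the file contents are only illustrative.

```shell
# Local bare repo standing in for the Space remote (offline demo only).
rm -rf /tmp/hello-world /tmp/hello-world-remote.git
git init --bare /tmp/hello-world-remote.git
git clone /tmp/hello-world-remote.git /tmp/hello-world
cd /tmp/hello-world

# Add the application files (illustrative contents; the real app.py
# is developed in the next step).
printf 'import streamlit as st\n' > app.py
printf 'streamlit\ntransformers\ntorch\n' > requirements.txt

git add app.py requirements.txt
git -c user.name=demo -c user.email=demo@example.com commit -m "Add summarization app"
git push origin HEAD   # on a real Space, this push triggers a rebuild
```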
Step 3: The application code
```python
import streamlit as st
from transformers import pipeline

st.title("💬 Chatbot")

# Initialize the chat history on first run.
if "messages" not in st.session_state:
    st.session_state["messages"] = [
        {"role": "assistant", "content": "Hello. I can summarize text."}
    ]

# Replay the conversation so far.
for msg in st.session_state.messages:
    st.chat_message(msg["role"]).write(msg["content"])

if prompt := st.chat_input():
    # Load the default summarization model (downloaded on first use).
    summarizer = pipeline("summarization")
    st.session_state.messages.append({"role": "user", "content": prompt})
    st.chat_message("user").write(prompt)
    summary = summarizer(prompt)
    msg = summary[0]["summary_text"]
    st.session_state.messages.append({"role": "assistant", "content": msg})
    st.chat_message("assistant").write(msg)
```
In this code, we use the pipeline function from the Transformers library to load a summarization model. The Python dependencies, such as transformers and torch, are listed in a requirements.txt file in the Space repo.
Step 4: Test the app
To test the application, go to the Space URL: https://huggingface.co/spaces/doveaia/hello-world