Hugging Face Transformers Library (Part 1)
I knew AI was more than LLMs and wrappers, but scrolling through Hugging Face left me overwhelmed pretty quickly.
This is me taking small (hopefully consistent) steps to get familiar with the landscape.
New Python Project
Python has a great package and environment manager called uv. It feels like good old npm. A few key commands:
uv init example
uv add transformers
uv add --dev ruff
uv run script.py
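For reference, after those commands the generated pyproject.toml looks roughly like this (the version numbers here are illustrative, not what uv will actually pin):

```toml
[project]
name = "example"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "transformers>=4.44",
]

# dev-only tools added via `uv add --dev`
[dependency-groups]
dev = [
    "ruff>=0.6",
]
```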
The same folks also made ruff, which is basically ESLint and Prettier for Python:
ruff check
ruff format
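As a toy example of what the linter catches, `ruff check` flags an unused import as F401, and `ruff format` normalizes spacing and quotes (exact output depends on your configured rules; this snippet is just an illustration):

```python
import os  # never used below -> ruff check reports F401


def greet(name):
    # ruff format would also normalize quote style and spacing here
    return "hello, " + name


print(greet("world"))
```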
Running Transformers
I barely scratched the surface of the Transformers library, but with a little help from an LLM, here’s a tiny script that runs sentiment analysis:
from transformers import pipeline
from torch import cuda


def run():
    if cuda.is_available():
        device = f"GPU: {cuda.get_device_name(0)}"
    else:
        device = "CPU only"
    print(device)

    classifier = pipeline(
        "sentiment-analysis",
        # Hugging Face model
        model="distilbert/distilbert-base-uncased-finetuned-sst-2-english",
        # pinning model version
        revision="714eb0f",
    )

    sentences = [
        "My first experience with the transformers library.",
        "I might struggle with it at first.",
        "But let's proceed!",
    ]

    for text in sentences:
        result = classifier(text)[0]
        print(f"{text} - {result['label']} ({result['score']:.2%})")


if __name__ == "__main__":
    run()
Output:
GPU: NVIDIA GeForce GTX 1650
Device set to use cuda:0
My first experience with the transformers library. - POSITIVE (90.80%)
I might struggle with it at first. - NEGATIVE (99.91%)
But let's proceed! - POSITIVE (99.34%)
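The percentage in each line is the softmax probability of the winning label over the model's two raw output scores (logits). A minimal sketch of that computation in plain Python, with made-up logit values (the real ones come from the model):

```python
import math


def softmax(logits):
    # numerically stable softmax: subtract the max before exponentiating
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


# hypothetical raw logits for [NEGATIVE, POSITIVE]
logits = [-1.2, 3.4]
probs = softmax(logits)
label = "POSITIVE" if probs[1] > probs[0] else "NEGATIVE"
print(f"{label} ({probs[1]:.2%})")
```

This is why the scores always land between 0% and 100% and why confident predictions sit near the extremes.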
Next Steps
Hugging Face recommends starting with their LLM Course.