@hanouticelina hanouticelina commented Jun 9, 2025

Hi there, I'm Célina from Hugging Face 🤗,
This PR introduces support for Hugging Face's Inference Providers (documentation here) as an LLM provider. With this integration, you can now access 650+ open-source chat models hosted on the Hugging Face Hub, including deepseek-ai/DeepSeek-R1-0528 and the latest Qwen3 models!

Our API is fully aligned with the OpenAI REST API specs, so I mainly followed the existing OpenAIProvider implementation.
There don't appear to be any tests for LLM providers, so I tested the integration by running the example script:
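Since the endpoint follows the OpenAI chat-completions spec, the provider mostly amounts to targeting the Hugging Face router with a Hub model id and building the same request body an OpenAIProvider would. A minimal sketch of that idea, where the class name, the `build_request` helper, and the `HF_TOKEN` default are illustrative assumptions rather than this PR's actual code:

```python
# Illustrative sketch of an OpenAI-compatible Hugging Face provider.
# Names and defaults here are hypothetical, not the PR's actual code.
import os


class HuggingFaceProvider:
    # Inference Providers expose an OpenAI-compatible REST endpoint.
    BASE_URL = "https://router.huggingface.co/v1"

    def __init__(self, model, api_key=None):
        self.model = model  # a Hub model id, e.g. "deepseek-ai/DeepSeek-R1-0528"
        self.api_key = api_key or os.environ.get("HF_TOKEN", "")

    def build_request(self, messages, **kwargs):
        # Same JSON body shape as OpenAI's /chat/completions.
        return {"model": self.model, "messages": messages, **kwargs}


provider = HuggingFaceProvider("deepseek-ai/DeepSeek-R1-0528")
payload = provider.build_request(
    [{"role": "user", "content": "Hello!"}], max_tokens=64
)
```

Because the request body is identical to OpenAI's, swapping providers only changes the base URL, the credential, and the model id.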

from memvid.chat import MemvidChat
from memvid.encoder import MemvidEncoder

# Build a video memory from a few text chunks
chunks = ["Important fact 1", "Important fact 2", "Historical event details"]
encoder = MemvidEncoder()
encoder.add_chunks(chunks)
encoder.build_video("memory.mp4", "memory_index.json")

# Chat with your memory using the Hugging Face provider
chat = MemvidChat("memory.mp4", "memory_index.json", llm_provider="huggingface")
chat.start_session()
response = chat.chat("What do you know about historical events?")
print(response)

Looking forward to your feedback!

cc @Wauplin for visibility

@hanouticelina hanouticelina changed the title Add hugging Face as a Provider Jun 9, 2025
@hanouticelina hanouticelina changed the title Add Hugging Face as a Provider Jun 9, 2025