Google Cloud 🤝 Hugging Face


AI & ML interests

Google Cloud and Hugging Face


google-cloud-partnership's activity

pagezyhf posted an update 21 days ago
pagezyhf posted an update 23 days ago
It’s the 2nd of December, here’s your Cyber Monday present 🎁!

We’re cutting prices on Hugging Face Inference Endpoints and Spaces!

Our folks at Google Cloud are treating us to a 40% price cut on GCP NVIDIA A100 GPUs for the next 3️⃣ months, and there are further reductions of 20% to 50% across all instances.

Sounds like the perfect time to give Inference Endpoints a try! Get started today and find the full pricing details in our documentation.
https://ui.endpoints.huggingface.co/
https://huggingface.co./pricing
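
If you'd rather script the setup than click through the UI, here is a minimal sketch using huggingface_hub's create_inference_endpoint. The model, region, and instance values below are placeholders, not official recommendations; check the pricing page and the Endpoints UI for the exact names of the discounted GCP A100 instances.

```python
# Minimal sketch: create an Inference Endpoint on Google Cloud with huggingface_hub.
# The model, region, and instance_type/instance_size values are placeholders —
# pick the exact A100 instance names from the Inference Endpoints UI / pricing page.
from huggingface_hub import create_inference_endpoint

endpoint = create_inference_endpoint(
    "my-a100-endpoint",                       # hypothetical endpoint name
    repository="HuggingFaceH4/zephyr-7b-beta",  # placeholder text-generation model
    framework="pytorch",
    task="text-generation",
    vendor="gcp",                             # deploy on Google Cloud
    region="us-central1",                     # placeholder region
    accelerator="gpu",
    instance_type="nvidia-a100",              # placeholder instance name
    instance_size="x1",                       # placeholder instance size
    type="protected",
)

endpoint.wait()  # block until the endpoint is up and running
print(endpoint.client.text_generation("Hello!", max_new_tokens=32))
```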
pagezyhf posted an update 29 days ago
Hello Hugging Face Community,

If you use Google Kubernetes Engine to host your ML workloads, this series of videos is a great way to kickstart your journey of deploying LLMs in less than 10 minutes! Thank you @wietse-venema-demo!

To watch in this order:
1. Learn what Hugging Face Deep Learning Containers are
https://youtu.be/aWMp_hUUa0c?si=t-LPRkRNfD3DDNfr

2. Learn how to deploy an LLM with our Deep Learning Container using Text Generation Inference (see the query sketch at the end of this post)
https://youtu.be/Q3oyTOU1TMc?si=V6Dv-U1jt1SR97fj

3. Learn how to scale your inference endpoint based on traffic
https://youtu.be/QjLZ5eteDds?si=nDIAirh1r6h2dQMD

If you want more of these short tutorials, or have a theme in mind, let me know!
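
And once the Text Generation Inference container from video 2 is running on GKE, querying it from Python is just a few lines with huggingface_hub's InferenceClient. A minimal sketch below, assuming you have already exposed the TGI service; the URL is a placeholder.

```python
# Minimal sketch: query a TGI server deployed on GKE (as in video 2).
# The URL is a placeholder for however you exposed the service
# (e.g. a LoadBalancer IP or a local kubectl port-forward).
from huggingface_hub import InferenceClient

client = InferenceClient("http://localhost:8080")  # placeholder TGI endpoint URL

response = client.text_generation(
    "What is Kubernetes in one sentence?",
    max_new_tokens=64,
)
print(response)
```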
jeffboudier posted an update about 1 month ago
pagezyhf posted an update about 1 month ago
Hello Hugging Face Community,

I'd like to share a bit more about the Deep Learning Containers (DLCs) we built with Google Cloud to transform the way you build AI with open models on this platform!

With pre-configured, optimized environments for PyTorch Training (GPU) and Inference (CPU/GPU), Text Generation Inference (GPU), and Text Embeddings Inference (CPU/GPU), the Hugging Face DLCs offer:

⚑ Optimized performance on Google Cloud's infrastructure, with TGI, TEI, and PyTorch acceleration.
πŸ› οΈ Hassle-free environment setup, no more dependency issues.
πŸ”„ Seamless updates to the latest stable versions.
πŸ’Ό Streamlined workflow, reducing dev and maintenance overheads.
πŸ”’ Robust security features of Google Cloud.
☁️ Fine-tuned for optimal performance, integrated with GKE and Vertex AI.
πŸ“¦ Community examples for easy experimentation and implementation.
πŸ”œ TPU support for PyTorch Training/Inference and Text Generation Inference is coming soon!

Find the documentation at https://huggingface.co./docs/google-cloud/en/index
If you need support, open a conversation on the forum: https://discuss.huggingface.co/c/google-cloud/69
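
To give a feel for the Vertex AI path, here is a minimal sketch with the google-cloud-aiplatform SDK. The container URI, model ID, and machine/accelerator values are placeholders; the documentation linked above has the exact DLC image URIs and end-to-end examples.

```python
# Minimal sketch: serve an open model on Vertex AI with the Hugging Face TGI DLC.
# The container URI, model ID, and machine/accelerator values are placeholders —
# take the exact values from the Hugging Face Google Cloud documentation.
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="us-central1")  # placeholder project/region

model = aiplatform.Model.upload(
    display_name="zephyr-tgi",
    serving_container_image_uri="<HF TGI DLC image URI from the docs>",  # placeholder
    serving_container_environment_variables={
        "MODEL_ID": "HuggingFaceH4/zephyr-7b-beta",  # placeholder model
        "NUM_SHARD": "1",
    },
    serving_container_ports=[8080],
)

endpoint = model.deploy(
    machine_type="g2-standard-8",   # placeholder machine type
    accelerator_type="NVIDIA_L4",   # placeholder accelerator
    accelerator_count=1,
)

print(endpoint.predict(instances=[{"inputs": "Hello!"}]))
```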
jeffboudier posted an update 3 months ago
alvarobartt updated a Space 3 months ago
jeffboudier posted an update 3 months ago
Inference Endpoints got a bunch of cool updates yesterday; here are my top 3.
jeffboudier posted an update 3 months ago
Pro tip: if you're a Firefox user, you can set up Hugging Chat as an integrated AI assistant, with contextual links to summarize or simplify any text. Handy!

In this short video, I show how to set it up.