How to install and run an LLM (DeepSeek-r1) on your local machine without Internet | Ollama | ChatBoxAI

How to install and run an LLM (DeepSeek-r1) on your local machine without Internet.

Model name: deepseek-r1 model [https://ollama.com/library/deepseek-r1]
Model engine platform: Ollama [https://github.com/ollama/ollama]
AI client application: ChatBox [https://chatboxai.app/]
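
The overall workflow is: install Ollama, pull the deepseek-r1 model once while online (e.g. with `ollama pull deepseek-r1`), and from then on everything runs offline, since Ollama serves the model through a local HTTP API (default port 11434) that desktop clients such as ChatBox connect to. Below is a minimal Python sketch of querying that local API directly; it assumes Ollama is already running on its default port and that the deepseek-r1 model has already been pulled, and the prompt text is just an illustrative placeholder.

import json
import urllib.request

# Minimal sketch: send a prompt to a locally running Ollama server.
# Assumes: Ollama is serving on its default port 11434 and the model
# "deepseek-r1" has already been pulled (ollama pull deepseek-r1).
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "deepseek-r1",   # model name as listed on ollama.com/library
    "prompt": "Explain in one sentence what running an LLM locally means.",
    "stream": False,          # ask for a single JSON response instead of a stream
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read().decode("utf-8"))

print(result["response"])  # the text generated by the local model

A GUI client like ChatBox does essentially the same thing under the hood: you point it at the local Ollama endpoint, select deepseek-r1, and chat without any Internet connection.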

--------------------------------------------------------------------------------------

Schedule a meeting for any queries, guidance, or counselling:
https://calendly.com/naveenautomationlabs

Subscribe to this channel and press the bell icon to get interesting videos on Selenium and Automation:
https://www.youtube.com/c/Naveen%20AutomationLabs?sub_confirmation=1

Follow me on my Facebook Page:
https://www.facebook.com/groups/naveenqtpexpert/

Join our Automation community on Telegram for knowledge sharing and group discussion:
https://t.me/joinchat/9FrG-KzGlvxjNmQ1

Naveen AutomationLabs Paid Courses:
GitHub Course:
https://naveenautomationlabs.com/gitcourse/

Java & Selenium:
https://naveenautomationlabs.com/selenium-java-full-paid-course-recorded-videos/

Java & API + Postman + RestAssured + HttpClient:
https://naveenautomationlabs.com/manual-automation-testing-of-webservices-api/
Categories: prompts ia
Keywords: DeepSeek-r1, LLM installation, run LLM locally
