{"id":642,"date":"2024-12-29T18:25:02","date_gmt":"2024-12-29T17:25:02","guid":{"rendered":"https:\/\/www.pingwho.org\/?p=642"},"modified":"2026-02-22T20:13:14","modified_gmt":"2026-02-22T19:13:14","slug":"ollama-ou-comment-heberger-votre-intelligence-artificielle","status":"publish","type":"post","link":"https:\/\/www.pingwho.org\/index.php\/29\/12\/2024\/ollama-ou-comment-heberger-votre-intelligence-artificielle\/","title":{"rendered":"Ollama! Or How to Host Your Own Artificial Intelligence?"},"content":{"rendered":"\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"721\" src=\"https:\/\/www.pingwho.org\/wp-content\/uploads\/2024\/12\/ollama_cloud-1024x721.png\" alt=\"\" class=\"wp-image-703\" srcset=\"https:\/\/www.pingwho.org\/wp-content\/uploads\/2024\/12\/ollama_cloud-1024x721.png 1024w, https:\/\/www.pingwho.org\/wp-content\/uploads\/2024\/12\/ollama_cloud-300x211.png 300w, https:\/\/www.pingwho.org\/wp-content\/uploads\/2024\/12\/ollama_cloud-768x541.png 768w, https:\/\/www.pingwho.org\/wp-content\/uploads\/2024\/12\/ollama_cloud-1536x1082.png 1536w, https:\/\/www.pingwho.org\/wp-content\/uploads\/2024\/12\/ollama_cloud-2048x1443.png 2048w\" sizes=\"auto, (max-width: 706px) 89vw, (max-width: 767px) 82vw, 740px\" \/><\/figure>\n\n\n\n<p><strong><a href=\"https:\/\/ollama.com\" data-type=\"link\" data-id=\"http:\/\/github.com\/ollama\/ollama\/README.md\" target=\"_blank\" rel=\"noreferrer noopener\">Ollama<\/a><\/strong> is an <strong>open-source tool<\/strong> under the <strong><a href=\"https:\/\/mit-license.org\/\">MIT license<\/a><\/strong> that lets you easily host your own <strong>advanced language model<\/strong> on your computer or server. 
It provides a simple interface for interacting with these models and integrating them into your applications.<\/p>\n\n\n\n<p>With <strong>Ollama<\/strong>, you can:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>Host<\/strong> popular <strong><a href=\"https:\/\/ollama.com\/search\" data-type=\"link\" data-id=\"https:\/\/ollama.com\/search\">language models<\/a><\/strong> such as <strong>Llama 2, Falcon 40B, and OPT-IML-2.7B<\/strong> on your own hardware.<\/li>\n\n\n\n<li><strong>Interact<\/strong> with these models through a command-line interface or the <strong>HTTP API<\/strong> provided by <strong>Ollama<\/strong>.<\/li>\n\n\n\n<li>Use <strong>Ollama as a gateway<\/strong> to reach models hosted on other servers, such as <strong>the models offered by NVIDIA<\/strong>.<\/li>\n<\/ul>\n\n\n\n<p>To use <strong>Ollama<\/strong>, you need a computer with a <strong>graphics card compatible with the language model<\/strong> you want to run. Ollama supports <strong>NVIDIA CUDA and AMD ROCm<\/strong> graphics cards, which makes it possible to run large models on powerful machines.<\/p>\n\n\n\n<p><strong><a href=\"https:\/\/www.gentoo.org\">Gentoo Linux<\/a><\/strong> had no standard way to install Ollama. The ebuilds packaging the application are directly installable and fully functional. 
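<\/p>\n\n\n\n<p>As a minimal sketch of how such an overlay could be wired into Portage (the file name, checkout location, and <code>.git<\/code> sync URI below are assumptions, not taken from the overlay's own documentation), a <code>repos.conf<\/code> fragment might look like this:<\/p>\n\n\n\n

```ini
# /etc/portage/repos.conf/pingwho-overlay.conf (hypothetical file name)
[pingwho-overlay]
# Assumed checkout location; any path under /var/db/repos works.
location = /var/db/repos/pingwho-overlay
sync-type = git
# Assumed clone URI for the overlay repository.
sync-uri = https://github.com/jaypeche/pingwho-overlay.git
auto-sync = yes
```

<p>After an <code>emaint sync -r pingwho-overlay<\/code>, the package should then be installable with <code>emerge -av sci-ml\/ollama-bin<\/code>, possibly after accepting <code>~arch<\/code> keywords for it.<\/p>\n\n\n\n<p>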
They are available in this dedicated overlay: <a href=\"https:\/\/github.com\/jaypeche\/pingwho-overlay\/tree\/master\/sci-ml\/ollama-bin\" target=\"_blank\" rel=\"noreferrer noopener\"><strong>https:\/\/github.com\/jaypeche\/pingwho-overlay\/tree\/master\/sci-ml\/ollama-bin<\/strong><\/a><\/p>\n\n\n\n<p><strong>For security<\/strong>, this package also requires <a href=\"https:\/\/github.com\/jaypeche\/pingwho-overlay\/tree\/master\/acct-user\/ollama\" target=\"_blank\" rel=\"noreferrer noopener\"><strong>acct-user\/ollama<\/strong><\/a> and <a href=\"https:\/\/github.com\/jaypeche\/pingwho-overlay\/tree\/master\/acct-group\/ollama\" data-type=\"link\" data-id=\"https:\/\/ftp.pingwho.org\/pub\/gentoo\/ftp\/overlay\/pingwho-overlay\/acct-group\/ollama\/\" target=\"_blank\" rel=\"noreferrer noopener\"><strong>acct-group\/ollama<\/strong><\/a>, so that the service runs under <strong>its own dedicated user<\/strong> with restricted shell access.<\/p>\n\n\n\n<p>The <strong>build logs<\/strong> are available here:<\/p>\n\n\n\n<p><a href=\"https:\/\/gist.github.com\/jaypeche\/55d6c1fb1f6799a6ee027deb3e9bd3a9\"><strong>https:\/\/gist.github.com\/jaypeche\/55d6c1fb1f6799a6ee027deb3e9bd3a9<\/strong><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Ollama is an open-source tool under the MIT license that lets you easily host your own advanced language model on your computer or server. It provides a simple interface for interacting with these models and integrating them into your applications. 
With Ollama, you can: To use Ollama, you need a computer &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/www.pingwho.org\/index.php\/29\/12\/2024\/ollama-ou-comment-heberger-votre-intelligence-artificielle\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &ldquo;Ollama! Or How to Host Your Own Artificial Intelligence?&rdquo;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20,1,24,22],"tags":[],"class_list":["post-642","post","type-post","status-publish","format-standard","hentry","category-admin","category-general","category-reseau","category-security"],"_links":{"self":[{"href":"https:\/\/www.pingwho.org\/index.php\/wp-json\/wp\/v2\/posts\/642","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.pingwho.org\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.pingwho.org\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.pingwho.org\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.pingwho.org\/index.php\/wp-json\/wp\/v2\/comments?post=642"}],"version-history":[{"count":40,"href":"https:\/\/www.pingwho.org\/index.php\/wp-json\/wp\/v2\/posts\/642\/revisions"}],"predecessor-version":[{"id":726,"href":"https:\/\/www.pingwho.org\/index.php\/wp-json\/wp\/v2\/posts\/642\/revisions\/726"}],"wp:attachment":[{"href":"https:\/\/www.pingwho.org\/index.php\/wp-json\/wp\/v2\/media?parent=642"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.pingwho.org\/index.php\/wp-json\/wp\/v2\/categories?post=642"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.pingwho.org\/index.php\/wp-json\/wp\/v2\/tags?post=642"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}",
"templated":true}]}}