<!DOCTYPE html>
<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
</head>
<body>
<p>Hi All,</p>
<p>We were chatting before the most recent NetSIG about the new
llamafile app, which has excellent support for IPv6. The app runs
a web server that is reachable over IPv6. The new llamafile
takes a -m parameter that points to a GGUF LLM model file.</p>
<p><b>Old way</b><br>
./google_gemma-3-4b-it-Q6_K.llamafile --server -v2 --host
lxcllama.example.com<br>
<b>New way</b><br>
llamafile -m model.gguf --server --port 8080</p>
<p>Find the new llamafile at:</p>
<p> <a class="moz-txt-link-freetext" href="https://github.com/mozilla-ai/llamafile/releases/tag/0.10.0">https://github.com/mozilla-ai/llamafile/releases/tag/0.10.0</a></p>
<p>You can find GGUF models (LLM model files) at:</p>
<p> <a class="moz-txt-link-freetext" href="https://huggingface.co/models?library=gguf">https://huggingface.co/models?library=gguf</a></p>
<p>I start my llamafile using this command:</p>
<p> ./llamafile-0.10.0 -m Qwen3.5-9B.Q4_K_M.gguf --server --port
8080 --host lxcllama.example.com </p>
<p>This way, any web browser at my house can access the LLM.</p>
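<p>Browsers aren't the only clients: the built-in server speaks a
llama.cpp-style HTTP API, so you can script against it too. Here is a
minimal Python sketch, assuming the server started above is running and
exposes an OpenAI-compatible /v1/chat/completions endpoint (the host
name is just the one from my setup):</p>

```python
import json
from urllib import request

# Assumed endpoint: llamafile's server exposes an OpenAI-compatible API.
# Host/port match the llamafile command shown above.
LLAMAFILE_URL = "http://lxcllama.example.com:8080/v1/chat/completions"

def build_payload(prompt, max_tokens=128):
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": "local",  # local servers typically accept any model name here
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def ask(prompt):
    """POST the prompt to the llamafile server and return the reply text."""
    body = json.dumps(build_payload(prompt)).encode("utf-8")
    req = request.Request(
        LLAMAFILE_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        answer = json.load(resp)
    return answer["choices"][0]["message"]["content"]

# Example (requires the server to be running):
#   print(ask("Say aloha in Hawaiian."))
```

<p>Since the host name resolves over IPv6, the same script works from
any machine on the network, no IPv4 required.</p>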
<p>Happy LLM-ing,</p>
<p>Craig...</p>
<pre class="moz-signature" cols="72">--
IPv6 is the future, the future is here
<a class="moz-txt-link-freetext" href="http://ipv6hawaii.org/">http://ipv6hawaii.org/</a></pre>
</body>
</html>