{"id":13685,"date":"2025-09-12T02:19:21","date_gmt":"2025-09-12T06:19:21","guid":{"rendered":"https:\/\/spinor.info\/weblog\/?p=13685"},"modified":"2025-09-12T02:19:21","modified_gmt":"2025-09-12T06:19:21","slug":"apocalypse-proof-ai","status":"publish","type":"post","link":"https:\/\/spinor.info\/weblog\/?p=13685","title":{"rendered":"Apocalypse-proof AI"},"content":{"rendered":"<p>GPT, Claude, Gemini, Grok&#8230; great services. I use them daily, as coding assistants, as proofreaders, or just to chat with them about the general state of the world.<\/p>\n<p>But they all reside in the cloud. Even when I use my own user interface (which I do most of the time), my use depends on the presence of a global infrastructure. Should that global infrastructure disappear, for whatever reason &#8212; cyberattack, political decisions, war &#8212; my user interface would turn useless, an empty shell with nothing within.<\/p>\n<p>Well, at least that was the case until yesterday. As of today, I have an alternative.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-13686\" src=\"https:\/\/spinor.info\/weblog\/wp-content\/uploads\/2025\/09\/llava-captures.png\" alt=\"\" width=\"669\" height=\"521\" srcset=\"https:\/\/spinor.info\/weblog\/wp-content\/uploads\/2025\/09\/llava-captures.png 669w, https:\/\/spinor.info\/weblog\/wp-content\/uploads\/2025\/09\/llava-captures-300x234.png 300w, https:\/\/spinor.info\/weblog\/wp-content\/uploads\/2025\/09\/llava-captures-150x117.png 150w\" sizes=\"(max-width: 669px) 100vw, 669px\" \/><\/p>\n<p>Not a great alternative, to be sure. The 7B-parameter Llama model is very small and its capabilities are limited. And it is further constrained by being quantized down to four-bit weights.<\/p>\n<p>Which makes it all the more surprising that even such a simple model can faithfully execute zero-shot instructions, such as a system prompt that tells it how to use Google. 
And more than that, it has the smarts to use Google when its information is out of date.<\/p>\n<p>I never expected this from such a small, &#8220;toy&#8221; model that was released almost two years ago, in late 2023. But it makes me all the more happy that I have now integrated Llava (that is, Llama with vision!) into my WISPL front-end.<\/p>\n<p>Should disaster strike, we may no longer have access to &#8220;bleeding edge&#8221; frontier models like GPT-5 or Claude 4.1. But good old Llava, with all its limitations, runs entirely locally, on my aging Xeon server, and does not even require a GPU to deliver slow but acceptable performance.<\/p>\n<p>I won&#8217;t be using Llava daily, to be sure. But it&#8217;s there&#8230; consider it insurance.<\/p>","protected":false},"excerpt":{"rendered":"<p>GPT, Claude, Gemini, Grok&#8230; great services. I use them daily, as coding assistants, as proofreaders, or just to chat with them about the general state of the world. But they all reside in the cloud. 
Even when I use my own user interface (which I do most of the time) my use depends on the <a href='https:\/\/spinor.info\/weblog\/?p=13685' class='excerpt-more'>[&#8230;]<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[58,35],"tags":[],"class_list":["post-13685","post","type-post","status-publish","format-standard","hentry","category-cybernetics","category-personal","category-58-id","category-35-id","post-seq-1","post-parity-odd","meta-position-corners","fix"],"_links":{"self":[{"href":"https:\/\/spinor.info\/weblog\/index.php?rest_route=\/wp\/v2\/posts\/13685","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/spinor.info\/weblog\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/spinor.info\/weblog\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/spinor.info\/weblog\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/spinor.info\/weblog\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=13685"}],"version-history":[{"count":1,"href":"https:\/\/spinor.info\/weblog\/index.php?rest_route=\/wp\/v2\/posts\/13685\/revisions"}],"predecessor-version":[{"id":13687,"href":"https:\/\/spinor.info\/weblog\/index.php?rest_route=\/wp\/v2\/posts\/13685\/revisions\/13687"}],"wp:attachment":[{"href":"https:\/\/spinor.info\/weblog\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=13685"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/spinor.info\/weblog\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=13685"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/spinor.info\/weblog\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=13685"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}