{ "@context":[ "https://www.w3.org/ns/activitystreams", {"Hashtag":"as:Hashtag"} ], "published":"2023-11-29T22:35:23.514Z", "attributedTo":"https://gopinath.org/actors/rahul", "to":["https://www.w3.org/ns/activitystreams#Public","https://wandering.shop/users/cstross"], "cc":["https://gopinath.org/actors/rahul/followers"], "content":"

Running LLMs from a single file! With llamafile from Mozilla. All you need is:

curl -LO https://huggingface.co/jartine/llava-v1.5-7B-GGUF/resolve/main/llamafile-server-0.1-llava-v1.5-7b-q4
chmod 755 llamafile-server-0.1-llava-v1.5-7b-q4
./llamafile-server-0.1-llava-v1.5-7b-q4
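
Once the server is up, you get a local web UI, and (since llamafile bundles llama.cpp's server) you can also query it over HTTP. A sketch, assuming the default port 8080 and the llama.cpp /completion API:

curl http://localhost:8080/completion -H 'Content-Type: application/json' -d '{"prompt":"Why run an LLM locally?","n_predict":64}'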


Now I am waiting for an easy way to fine-tune an LLM on a Mac (this is mostly doable now) or on a plain CPU; then we will be within reach of the world of custom agents promised by @cstross

", "mediaType":"text/html", "attachment":[], "tag":[ {"type":"Mention","name":"@cstross@wandering.shop","href":"https://wandering.shop/users/cstross"} ], "type":"Note", "id":"https://gopinath.org/objects/bIno66_QH3U" }