LLM Inference

A Fast Chat LLM Inference Module for Lilypad


This LLM Inference Module is a community-contributed module developed at AugmentHack.xyz. The repo for this module can be found here.

See the original AugmentHack entry below:

CLI Usage


lilypad run fastchat:v0.0.1 "paramsStr"


Where "paramsStr" is the IPFS CID of a JSON file containing the question for the LLM. For example, https://ipfs.io/ipfs/QmcPjQwVcJiFge3yNjVL2NoZsTQ3GBpXAZe21S2Ncg16Gt is a bare file CID whose contents are:

{
    "template": "You are a friendly chatbot assistant that responds conversationally to users' questions. \n Keep the answers short, unless specifically asked by the user to elaborate on something. \n \n Question: {question} \n \n Answer:",
    "parameters": {"question": "What is a chatbot?"}
}
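The module fills each placeholder in "template" (here, {question}) with the matching value from "parameters" to build the prompt sent to the model. A minimal sketch of that substitution; the variable names are illustrative, not the module's actual code:

```python
import json

# Input document with the same structure as the example CID above
params_json = """
{
    "template": "You are a friendly chatbot assistant that responds conversationally to users' questions. \\n Keep the answers short, unless specifically asked by the user to elaborate on something. \\n \\n Question: {question} \\n \\n Answer:",
    "parameters": {"question": "What is a chatbot?"}
}
"""

params = json.loads(params_json)

# Substitute every parameter into its {placeholder} in the template
prompt = params["template"].format(**params["parameters"])
print(prompt)
```

The resulting prompt ends with "Answer:", so the model's completion continues directly from it.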

To use it you would run:

lilypad run fastchat:v0.0.1 QmcPjQwVcJiFge3yNjVL2NoZsTQ3GBpXAZe21S2Ncg16Gt


The output will be an IPFS CID; running the above input produces a link to the result directory:


Under link/output/result.json you will see:

{"question": "What is a chatbot?", "text": "<pad> A  chatbot  is  a  computer  program  that  can  interact  with  users  in  a  conversational  manner.  It  is  designed  to  answer  questions  and  provide  information  in  a  way  that  is  natural  and  conversational.\n"}
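As the example shows, the "text" field carries the model's "<pad>" token and doubled spaces. A small post-processing sketch for cleaning it up; this is illustrative and not part of the module itself:

```python
import json
import re

# Raw contents of output/result.json, as in the example above
raw = ('{"question": "What is a chatbot?", "text": "<pad> A  chatbot  is  a'
       '  computer  program  that  can  interact  with  users  in  a'
       '  conversational  manner.\\n"}')

result = json.loads(raw)

# Drop the "<pad>" token and collapse repeated whitespace
text = result["text"].replace("<pad>", "")
text = re.sub(r"\s+", " ", text).strip()
print(text)
```

This yields a plain sentence suitable for display to the user.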

LLM Module Code
