Hayloft
A UI tool for LLM frameworks that makes it easy to track, store, and compare prompts and completions across different sessions.
Installation
Install the package with pip:
pip install hayloft
Usage
Start the hayloft server:
hayloft start
Trace the logs of your script at http://localhost:7000.
LlamaIndex
Install llama-index and create an example.py file as below. Put the examples folder from the llama_index repo next to the file.
import os
os.environ["OPENAI_API_KEY"] = 'YOUR_OPENAI_API_KEY'
from llama_index import SimpleDirectoryReader, VectorStoreIndex
from hayloft.llama_index import grab_logs
# capture llama_index logs so they show up in the hayloft UI
grab_logs()
documents = SimpleDirectoryReader("examples/paul_graham_essay/data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
query_engine.query("What did the author do growing up?")
Alternatively, you can start live sessions from the hayloft UI; just modify the code as follows:
import os
os.environ["OPENAI_API_KEY"] = 'YOUR_OPENAI_API_KEY'
from llama_index import SimpleDirectoryReader, VectorStoreIndex
from hayloft.llama_index import listen
def agent(query: str):
    documents = SimpleDirectoryReader("examples/paul_graham_essay/data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    query_engine = index.as_query_engine()
    query_engine.query(query)

# register the agent so live sessions can be started from the hayloft UI
listen(agent)
Start the script:
python example.py
BabyAGI
Clone the BabyAGI fork repo, set up a virtual environment, and install all dependencies:
git clone git@github.com:eturchenkov/babyagi-hayloft.git && cd babyagi-hayloft
pip install -r requirements.txt
Adjust the config in the .env file.
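The exact settings depend on the fork; as a rough sketch (variable names follow upstream BabyAGI's .env.example and the values here are placeholders, so check the file in the repo), the .env file might look like this:
OPENAI_API_KEY=YOUR_OPENAI_API_KEY
OBJECTIVE=Write a short market report on e-bikes
INITIAL_TASK=Develop a task list
Then start the babyagi.py script: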
python babyagi.py