Helicone Community Page

Updated 3 months ago

Integrating with llama-index

Why isn't this working?

import streamlit as st
from pathlib import Path
from llama_index import ServiceContext, SummaryIndex
from llama_index.llms import OpenAI

try:
    summary_index = SummaryIndex.from_documents(documents, show_progress=True)
    summary_index.storage_context.persist(Path(f"{thisdoc_dir}/index"))
    # prompt_helper belongs on ServiceContext, not OpenAI; qmemory is not a
    # valid OpenAI kwarg; custom headers are passed via default_headers
    chat_engine = summary_index.as_query_engine(
        response_mode="tree_summarize",
        service_context=ServiceContext.from_defaults(
            prompt_helper=prompt_helper,
            llm=OpenAI(temperature=temperature, model=qmodel,
                       max_tokens=qmax_tokens,
                       api_base="https://oai.hconeai.com/v1",
                       default_headers=get_helicone_header("llamaindexer"))))
    st.success(f'saved index to storage in {thisdoc_dir}/index')
except Exception as e:
    st.error(e)
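For reference, get_helicone_header is not shown in the snippet above. A minimal sketch of what such a helper typically returns, assuming the standard Helicone proxy headers (Helicone-Auth for the API key, Helicone-Property-* for custom tags); the function name, body, and the HELICONE_API_KEY env var are assumptions, not the asker's actual code:

```python
import os

def get_helicone_header(app_name):
    """Hypothetical sketch: Helicone's proxy authenticates with a
    Helicone-Auth bearer header, and Helicone-Property-* headers tag
    requests for filtering in the dashboard."""
    return {
        "Helicone-Auth": f"Bearer {os.environ['HELICONE_API_KEY']}",
        "Helicone-Property-App": app_name,
    }
```

Whatever the real helper does, it must return a plain dict of header names to values so it can be passed straight through to the underlying OpenAI client.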
1 comment
Are you able to provide more context? What error are you receiving?