Support for the ChatGPT endpoint is now added to lambdaprompt[1]! (It solves a similar problem to langchain, with almost no boilerplate!) Props to OpenAI for making such a usable endpoint; it was very easy to wrap.

Example code using the new function and endpoint:

    import lambdaprompt as lp

    # the system prompt is a template; variables are filled in at call time
    convo = lp.AsyncGPT3Chat([{'system': 'You are a {{ type_of_bot }}'}])
    await convo("What should we get for lunch?", type_of_bot="pirate")
> As a pirate, I would suggest we have some hearty seafood such as fish and chips or a seafood platter. We could also have some rum to wash it down! Arrr!
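
Since the chat object is initialized with a message history, presumably the same `convo` can be called again to continue the conversation. That's an assumption on my part rather than something confirmed from the repo, but a follow-up would look something like:

    # assumed usage: convo keeps the chat history, so a follow-up call
    # continues the same conversation
    await convo("And what about dessert?")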

(To use lambdaprompt, just `pip install lambdaprompt` and `export OPENAI_API_KEY=...`.)
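
If you'd rather set the key from inside Python instead of exporting it in the shell, something like this should work (assuming lambdaprompt simply reads OPENAI_API_KEY from the environment, which is what the export suggests):

    import os

    # hypothetical inline alternative to `export OPENAI_API_KEY=...`;
    # set this before using lambdaprompt
    os.environ["OPENAI_API_KEY"] = "..."

    import lambdaprompt as lp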

[1] https://github.com/approximatelabs/lambdaprompt


