You now have a Postgres instance on port 5438 that you can log into with postgres:postgres.
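For example, you can check the connection with psql (any Postgres client works; here we connect to the default `postgres` database, which may be named differently in your setup):

```bash
# connect to the local Percolate Postgres instance on port 5438
# enter `postgres` when prompted for the password
psql -h localhost -p 5438 -U postgres
```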
Environment variables needed for interacting with LLMs/APIs can be managed in different ways, but the percolate CLI is a generally useful way to bootstrap your environment.
You can install percolate-db with pip, but let's use the codebase for now...
```bash
cd clients/python/percolate
# if you have API keys like OPEN_AI_KEY, these are synced into your local instance
python percolate/cli.py add env --sync
```
Another thing you can do is index Percolate files so you can ask questions about Percolate. This will use your OpenAI key to generate embeddings.
```bash
python percolate/cli.py index
```
Now you can ask questions from the CLI:
```bash
python percolate/cli.py ask 'are there SQL functions in Percolate for interacting with models like Claude?'
```
Percolate is a database - it wraps Postgres and adds extensions for vector and graph data. It also pushes agentic AI down into the data tier. Using your favourite Postgres client, run:
```sql
select * from percolate('What is the capital of ireland?')
-- try different models
-- select * from percolate('how can percolate help me with creating agentic systems',
--                         'deepseek-chat')
-- see what Models are in Percolate by default
-- select * from p8."LangaugeModelApi"
```
This trivial example tests that we are connected to a language model, without using tools or data.
If we want to use an agent, we can try the built-in ones as an example.
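A minimal sketch of what that can look like, assuming a `percolate_with_agent` function and a built-in `p8.PercolateAgent` agent are available in your instance (names may differ in your version):

```sql
-- ask a built-in agent a question from SQL
-- (assumes percolate_with_agent and p8.PercolateAgent exist in your instance)
select * from percolate_with_agent(
    'are there SQL functions in Percolate for interacting with models like Claude?',
    'p8.PercolateAgent'
)
```

The shape is the same as the plain `percolate(...)` call above, except the question is routed through an agent, which can make use of tools and the data indexed earlier.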