zurfer 14 days ago

Congrats on the launch and kudos for open sourcing everything.

I'll put you on this list: https://github.com/Snowboard-Software/awesome-ai-analytics

One thing I'd love more clarity on as a user: how can no information leave my computer when local LLMs are only on the roadmap, and how does that square with being the fastest way to analyze data (for me, local LLMs are slow)?

  • RamiAwar 12 days ago

    Thank you!

    On the info leaving your machine: since we're using OpenAI, the table metadata is sent to the API to draft the SQL. But the conversations, messages, and stored results all stay local in an SQLite DB. No cloud involved. With local LLMs it will indeed be FULLY airtight.
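
    To make that split concrete, here's a rough sketch of the idea (not our actual code; the file names and functions are just illustrative): only the schema metadata is collected for the LLM prompt, while chat history is written to a local SQLite file.

        import sqlite3

        def extract_table_metadata(db_path: str) -> str:
            """Collect only schema information (CREATE TABLE statements), nothing else."""
            with sqlite3.connect(db_path) as conn:
                rows = conn.execute(
                    "SELECT sql FROM sqlite_master WHERE type = 'table'"
                ).fetchall()
            return "\n".join(sql for (sql,) in rows if sql)

        def store_message_locally(history_db: str, role: str, content: str) -> None:
            """Conversations and results stay in a local SQLite DB on disk."""
            with sqlite3.connect(history_db) as conn:
                conn.execute("CREATE TABLE IF NOT EXISTS messages (role TEXT, content TEXT)")
                conn.execute("INSERT INTO messages VALUES (?, ?)", (role, content))

        # The only thing that would go to the LLM provider is the schema string:
        schema_prompt = extract_table_metadata("mydata.db")
        store_message_locally("history.db", "user", "Show me monthly revenue")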

    We don't support local LLMs yet because we want to ensure high-quality results. We only expose models after testing them thoroughly (eval pipeline work is nearly done; probably less than a week left). So soon we'll be releasing more supported models, including local LLMs, but only if they're good enough and fast enough. Speed is one concern for local LLMs, but quality is another: it's pretty meh right now. We haven't tested all of them yet though, so I can't generalize. The eval pipeline will make this much easier.
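
    For a sense of what an eval pipeline like that does, here's a minimal sketch (the generate_sql hook, database path, and test cases are placeholders, not our real pipeline): each candidate model's generated SQL is executed and the rows are compared against expected results.

        import sqlite3
        from typing import Callable, List, Tuple

        TestCase = Tuple[str, list]  # (natural-language question, expected result rows)

        def run_sql(db_path: str, sql: str) -> list:
            with sqlite3.connect(db_path) as conn:
                return conn.execute(sql).fetchall()

        def evaluate(model: str,
                     generate_sql: Callable[[str, str], str],
                     db_path: str,
                     cases: List[TestCase]) -> float:
            """Score a model by executing its generated SQL and comparing row results."""
            passed = 0
            for question, expected_rows in cases:
                try:
                    sql = generate_sql(model, question)
                    if run_sql(db_path, sql) == expected_rows:
                        passed += 1
                except sqlite3.Error:
                    pass  # invalid SQL counts as a failure
            return passed / len(cases) if cases else 0.0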

marwanj 12 days ago

Great job! Kudos for finally creating privacy-centered AI. Can't wait to experiment with it.

  • RamiAwar 11 days ago

    Thank you! Will send you a private demo :)