Show HN: CodeZen – A simple CLI tool to ask LLM questions about your codebase

github.com

3 points by mmaorc a year ago

Hello HN!

I regularly use LLMs to ask questions about my codebases and have been looking for an efficient way to do it. Using ChatGPT is tedious because it requires manually copying and pasting file contents, and GitHub Copilot Chat (beta) falls short when a query needs information from multiple files.

This is why I created a tiny CLI tool named CodeZen. It lets you ask a question and sends it to GPT with the entire codebase as context. For example, you can use `codezen "write me a readme.md file"` to create an entire readme file for your project.

It’s pretty bare-bones right now - it only works for codebases small enough to fit in the LLM's context window. Still, it's been quite handy for me, and I thought it might be useful for others too.
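
Roughly, the idea is just: walk the repo, glue the file contents into one big prompt, and send it to the chat API along with your question. Here's a simplified sketch of that approach (illustrative only, not the actual source; it assumes the openai Python package >= 1.0 and an OPENAI_API_KEY in the environment, and "gpt-4" is just an example model name):

    # Simplified sketch (illustrative, not CodeZen's actual source).
    import sys
    from pathlib import Path

    from openai import OpenAI  # assumes openai>=1.0 and OPENAI_API_KEY set

    def build_context(root: str = ".") -> str:
        """Concatenate every readable text file under root into one prompt blob."""
        parts = []
        for path in sorted(Path(root).rglob("*")):
            if path.is_file() and ".git" not in path.parts:
                try:
                    parts.append(f"--- {path} ---\n{path.read_text()}")
                except (UnicodeDecodeError, OSError):
                    continue  # skip binary or unreadable files
        return "\n\n".join(parts)

    question = sys.argv[1]
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4",  # example model name
        messages=[
            {"role": "system",
             "content": "Answer questions about this codebase:\n" + build_context()},
            {"role": "user", "content": question},
        ],
    )
    print(response.choices[0].message.content)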

Looking forward to hearing your feedback and suggestions.

Maor.

melx a year ago

> ...sends it to GPT with the entire codebase as context.

I haven't (so far) used an LLM for "code assistance". However, I would not want to go the route of sending everything to OpenAI (or similar).

Maybe offer some sort of `.*ignore` file that lists files to be excluded from the (API) request.

  • mmaorc a year ago

    Thanks for the feedback! I already use the .gitignore file to decide which files to skip. I should also add an option to extend that ignore list with additional files.

    • melx a year ago

      Yes, but while adding files to .gitignore keeps your tool from sending them to the AI, it also stops git from tracking them, which breaks the project for anyone using your tool. Hence the idea of an ignore file separate from git's, but with similar behaviour to .gitignore.

      • mmaorc a year ago

        Yes, I completely agree. This is what I meant, but I wasn't clear enough. I'm working on adding `.czignore` functionality right now.

      • mmaorc a year ago

        Fixed. You can use version 0.3.0, which supports a .czignore file with the same format as .gitignore.
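
        Under the hood it's just gitignore-style pattern matching; something like this sketch using the pathspec library would do it (illustrative, not necessarily CodeZen's exact code, and the file names are made up):

          # Illustrative sketch using the pathspec library (pip install pathspec);
          # not necessarily how CodeZen implements .czignore.
          import pathspec

          with open(".czignore") as f:
              spec = pathspec.PathSpec.from_lines("gitwildmatch", f)

          print(spec.match_file("secrets/api_keys.env"))  # True -> excluded from the request
          print(spec.match_file("src/main.py"))           # False -> included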

kordlessagain a year ago

I was indexing some of Georgi Gerganov's llama.cpp code yesterday with DoctorGPT (https://github.com/FeatureBaseDB/DoctorGPT/tree/main#readme) and was thinking about doing what you are doing with CodeZen: reading in a whole repo. I see you run into limits when indexing larger files; I've used a workaround for that in DoctorGPT. I'll dig into your code, and maybe we can chat about it if you're interested.
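
The usual trick for files that don't fit in the context window is to split them into overlapping chunks and index each chunk separately; a rough sketch of that idea (chunk sizes illustrative, not DoctorGPT's or CodeZen's actual logic):

    # Rough sketch of chunked indexing (sizes illustrative).
    def chunk_text(text: str, max_chars: int = 8000, overlap: int = 500) -> list[str]:
        """Split text into overlapping chunks that each fit a context budget."""
        chunks = []
        start = 0
        while start < len(text):
            chunks.append(text[start:start + max_chars])
            start += max_chars - overlap
        return chunks

    # Each chunk can then be summarized or embedded separately, and only the
    # relevant chunks stuffed into the prompt for a given question.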