Show HN: A lightweight LLM proxy to get structured results from most LLMs

l1m.io

Hey HN!

After struggling with complex prompt engineering and unreliable parsing, we built L1M, a simple API that lets you extract structured data from unstructured text and images.

    curl -X POST https://api.l1m.io/structured \
      -H "Content-Type: application/json" \
      -H "X-Provider-Url: demo" \
      -H "X-Provider-Key: demo" \
      -H "X-Provider-Model: demo" \
      -d '{
        "input": "A particularly severe crisis in 1907 led Congress to enact the Federal Reserve Act in 1913",
        "schema": {
          "type": "object",
          "properties": {
            "events": {
              "type": "array",
              "items": {
                "type": "object",
                "properties": {
                  "name": { "type": "string" },
                  "year": { "type": "number" }
                }
              }
            }
          }
        }
      }'

This is a component we unbundled from our larger product because we think it's useful on its own.

It's fully open source (MIT license) and you can:

- Use it with text or images
- Bring your own model (OpenAI, Anthropic, or any compatible API)
- Run it locally with Ollama for privacy (see the sketch below)
- Cache responses with a customizable TTL
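
For the local Ollama setup, here's a minimal sketch. It assumes you're running the open-source l1m server yourself (the port is a placeholder, check the README for the default), that X-Provider-Url accepts an OpenAI-compatible base URL like Ollama's, and that the model named below is one you've already pulled:

    # Assumptions: l1m server running locally on <l1m-port> (placeholder),
    # Ollama serving its OpenAI-compatible API at localhost:11434,
    # and "llama3.1" already pulled. The key is likely ignored by Ollama.
    curl -X POST http://localhost:<l1m-port>/structured \
      -H "Content-Type: application/json" \
      -H "X-Provider-Url: http://localhost:11434/v1" \
      -H "X-Provider-Key: ollama" \
      -H "X-Provider-Model: llama3.1" \
      -d '{
        "input": "Invoice #1042: $42.50 due on April 3rd",
        "schema": {
          "type": "object",
          "properties": {
            "amount": { "type": "number" },
            "due_date": { "type": "string" }
          }
        }
      }'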

The code is at https://github.com/inferablehq/l1m, with SDKs for Node.js, Python, and Go.

Would love to hear if this solves a pain point for you!