# aicodesign (Python)

Provenance and review tracking for AI-generated code.

In fast-moving environments, using LLMs to generate code accelerates development, but it introduces varying levels of risk. aicodesign provides lightweight Python decorators to explicitly mark the review status and trust boundaries of AI-generated code running in production.

## Installation

### Using pip

```shell
pip install aicodesign
```

### Using uv

```shell
uv add aicodesign
```

## The Three Tiers of Trust

This library standardizes AI code into three distinct categories based on human verification:

### 1. `@ai_draft` (High Risk)

- Code Reviews: 0
- Test Reviews: 0
- Concept: The code and its tests were generated by an LLM and pushed without human review. It is a raw draft. Emits a runtime logger warning when executed.

### 2. `@ai_blackbox` (Medium Risk)

- Code Reviews: 0
- Test Reviews: 1+ (Human Verified)
- Concept: The internal logic is unreviewed (a black box), but the code is bounded by strict, human-reviewed unit tests. We know what it does, even if we haven't audited how it does it.

### 3. `@ai_co_signed` (Lower Risk)

- Code Reviews: 1 (Human Verified)
- Test Reviews: 1+ (Human Verified)
- Concept: A human developer has reviewed the AI's logic and tests, officially putting their name on the line alongside the LLM. Requires a mandatory `reviewer` argument.

## Usage Examples

```python
from aicodesign import ai_draft, ai_blackbox, ai_co_signed

# Tier 1: Pure AI Draft
@ai_draft(ticket="HFT-101")
def calculate_momentum_alpha(prices):
    # Unreviewed logic and tests
    pass

# Tier 2: AI Blackbox
@ai_blackbox(ticket="HFT-102", notes="Tests verify strict output boundaries")
def parse_exchange_feed(payload):
    # Logic is unreviewed, but a human vetted the test harness
    pass

# Tier 3: Co-Signed Code
@ai_co_signed(reviewer="alice.dev", ticket="HFT-103")
def update_order_book(book, new_orders):
    # A human has audited the logic and tests
    pass
```

## Introspection

All decorators attach metadata to the functions they wrap, making it easy to build CI/CD guardrails or runtime telemetry that tracks AI code execution:

```python
print(update_order_book.__ai_provenance__)  # Output: "co_signed"
print(update_order_book.__ai_reviewer__)    # Output: "alice.dev"
```
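Building on that metadata, a CI guardrail could scan a module and flag any function still at the draft tier. `find_unreviewed` below is a hypothetical helper (not part of the library) that does exactly that:

```python
import inspect

def find_unreviewed(module) -> list:
    """Return names of functions tagged as raw AI drafts,
    e.g. to fail a CI gate before they reach production."""
    return [
        name
        for name, obj in inspect.getmembers(module, inspect.isfunction)
        if getattr(obj, "__ai_provenance__", None) == "draft"
    ]
```

A CI step could then fail the build whenever `find_unreviewed` returns a non-empty list.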

## License

MIT