Skip to content

Comments

⚡ Bolt: Optimized Blockchain Verification to O(1)#449

Open
RohanExploit wants to merge 4 commits intomainfrom
bolt-blockchain-optimization-12126339649829960137
Open

⚡ Bolt: Optimized Blockchain Verification to O(1)#449
RohanExploit wants to merge 4 commits intomainfrom
bolt-blockchain-optimization-12126339649829960137

Conversation

@RohanExploit
Copy link
Owner

@RohanExploit RohanExploit commented Feb 22, 2026

This PR implements a performance optimization for the blockchain-style report integrity system. By explicitly storing the previous block's hash in the current record, we eliminate the need for a secondary database lookup during verification, achieving O(1) complexity. The hash calculation has also been strengthened to include geographic coordinates, ensuring reports are "geographically sealed". Backward compatibility is maintained for all existing records.


PR created automatically by Jules for task 12126339649829960137 started by @RohanExploit


Summary by cubic

Optimized blockchain verification to O(1) by storing previous_integrity_hash and hashing with lat/lon. Also hardened Render deployments with safer CORS defaults, DB startup timeouts, and dependency fixes.

  • New Features

    • Store previous_integrity_hash on create for O(1) verification; legacy records fall back to predecessor lookup.
    • Hash scheme: description|category|lat|lon|prev_hash with lat/lon fixed to 7 decimals; verify endpoint returns previous_hash and uses O(1) when available.
    • Issue/verification schemas expose integrity_hash and previous_integrity_hash.
  • Migration

    • DB: add previous_integrity_hash (nullable). No backfill; verification auto-falls back for legacy records.
    • Render hardening: default CORS allow_origin_regex when FRONTEND_URL is missing, set PYTHONPATH='.', add DB init/migration timeouts (10s/20s) with better logging, and run frontend build with CI=false.
    • Dependencies: split python-jose/cryptography, pin bcrypt<4, add voice/language deps and sub-deps (six, ecdsa, rsa, pyasn1, python-dotenv), and restore generate_reference_id.

Written for commit c1f082a. Summary will update on new commits.

Summary by CodeRabbit

Release Notes

  • New Features

    • Enhanced blockchain integrity verification with faster O(1) verification through stored predecessor linking
    • Improved integrity validation now incorporates location coordinates for more robust verification
    • Maintains backward compatibility with existing blockchain records
  • Tests

    • Expanded test coverage for optimized and legacy verification paths and end-to-end scenarios

- Added `previous_integrity_hash` column to `Issue` model to enable O(1) verification.
- Updated hashing logic to include geographic context (latitude/longitude) as per memory.
- Optimized `verify_blockchain_integrity` endpoint to reduce database queries by 50%.
- Implemented backward compatibility for legacy records.
- Added comprehensive tests in `tests/test_blockchain.py`.
- Fixed missing `generate_reference_id` in `backend/utils.py`.
- Updated `.jules/bolt.md` with performance learnings.
@google-labs-jules
Copy link
Contributor

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!

New to Jules? Learn more at jules.google/docs.


For security, I will only act on instructions from the user who triggered this task.

Copilot AI review requested due to automatic review settings February 22, 2026 13:55
@netlify
Copy link

netlify bot commented Feb 22, 2026

Deploy Preview for fixmybharat canceled.

Name Link
🔨 Latest commit c1f082a
🔍 Latest deploy log https://app.netlify.com/projects/fixmybharat/deploys/699b0ea617c03f0008b160e2

@github-actions
Copy link

🙏 Thank you for your contribution, @RohanExploit!

PR Details:

Quality Checklist:
Please ensure your PR meets the following criteria:

  • Code follows the project's style guidelines
  • Self-review of code completed
  • Code is commented where necessary
  • Documentation updated (if applicable)
  • No new warnings generated
  • Tests added/updated (if applicable)
  • All tests passing locally
  • No breaking changes to existing functionality

Review Process:

  1. Automated checks will run on your code
  2. A maintainer will review your changes
  3. Address any requested changes promptly
  4. Once approved, your PR will be merged! 🎉

Note: The maintainers will monitor code quality and ensure the overall project flow isn't broken.

@coderabbitai
Copy link

coderabbitai bot commented Feb 22, 2026

📝 Walkthrough

Walkthrough

This PR implements an O(1) blockchain integrity verification system by storing the previous block's hash directly in each Issue record, eliminating predecessor lookups. Hash computation now includes latitude and longitude for new issues. A utility function for generating reference IDs is added, and tests cover both legacy and optimized verification paths.

Changes

Cohort / File(s) Summary
Data Model
backend/models.py
Added previous_integrity_hash column to Issue model to store links to previous blocks for O(1) verification.
API Implementation
backend/routers/issues.py
Introduced O(1) verification path using stored previous hash and lat/lon; updated hash computation to include coordinates; added backward-compatible legacy fallback; exposes previous_hash in verification responses.
API Schemas
backend/schemas.py
Extended IssueResponse with integrity_hash and previous_integrity_hash fields; added previous_hash field to BlockchainVerificationResponse.
Utilities
backend/utils.py
Added generate_reference_id() function returning cryptographically secure identifiers in XXXX-XXXX-XXXX format.
Documentation
.jules/bolt.md
Added section documenting blockchain verification improvement and O(1) verification strategy using stored predecessor hashes.
Tests
tests/test_blockchain.py
Renamed legacy verification test; added optimized O(1) path test; introduced end-to-end API creation/verification test; expanded failure scenarios with previous_integrity_hash handling.

Sequence Diagram

sequenceDiagram
    participant Client
    participant API as Issues Router
    participant DB as Database
    participant Verification as Verification Engine

    Client->>API: create_issue(description, category, lat, lon)
    API->>DB: Query last issue for previous_integrity_hash
    DB-->>API: Return last issue (or None)
    API->>Verification: compute_hash(description, category, lat, lon, prev_hash)
    Verification-->>API: integrity_hash
    API->>DB: Store Issue with integrity_hash, previous_integrity_hash
    DB-->>API: Confirmation
    API-->>Client: Issue created

    Client->>API: verify_blockchain_integrity(issue_id)
    API->>DB: Fetch issue
    DB-->>API: Issue with previous_integrity_hash
    
    alt Has previous_integrity_hash (Optimized O(1))
        API->>Verification: compute_hash(description, category, lat, lon, previous_integrity_hash)
        Verification-->>API: computed_hash
        API->>API: Compare computed_hash with stored integrity_hash
        API-->>Client: Verification result + previous_hash
    else Legacy path (No previous_integrity_hash)
        API->>DB: Query predecessor issues (O(log N))
        DB-->>API: Predecessor chain
        API->>Verification: Verify full chain
        Verification-->>API: Verification result
        API-->>Client: Verification result
    end
Loading

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Possibly related PRs

Suggested labels

size/m

Poem

🐰 A chain of hashes, linked with care,
No more predecessor queries to spare!
O(1) magic with lat and lon in sight,
Previous blocks now stored, verification's light! ⚡

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

Check name Status Explanation Resolution
Docstring Coverage ⚠️ Warning Docstring coverage is 45.45% which is insufficient. The required threshold is 80.00%. Write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (2 passed)
Check name Status Explanation
Title check ✅ Passed The title clearly describes the main change: implementing O(1) optimization for blockchain verification, which is the primary focus of the PR across multiple files.
Description check ✅ Passed PR description clearly explains the optimization (O(1) verification via stored previous_integrity_hash), geographic sealing (lat/lon in hash), and backward compatibility. Includes implementation details and testing information.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing Touches
  • 📝 Generate docstrings (stacked PR)
  • 📝 Generate docstrings (commit on current branch)
🧪 Generate unit tests (beta)
  • Create PR with unit tests
  • Post copyable unit tests in a comment
  • Commit unit tests in branch bolt-blockchain-optimization-12126339649829960137

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.

❤️ Share

Comment @coderabbitai help to get the list of available commands and usage tips.

- Set `PYTHONPATH` to `.` in `render.yaml` as per memory.
- Added `FRONTEND_URL` fallback in `backend/main.py` to prevent crashes when the env var is missing in production.
- Pinned `bcrypt<4.0.0` and split `python-jose` dependencies in `backend/requirements-render.txt` to avoid build issues.
- Added `CI=false` to frontend build in `render-build.sh` to ignore non-critical warnings.
- Fixed `generate_reference_id` import/missing issue in `backend/utils.py`.
Copy link
Contributor

Copilot AI left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Pull request overview

This PR updates the issue “blockchain integrity” mechanism to support O(1) verification by persisting the previous hash on each Issue, and strengthens the hash format to include geographic coordinates while keeping a legacy verification path.

Changes:

  • Add previous_integrity_hash to the Issue model and store it during issue creation.
  • Update blockchain verification to use previous_integrity_hash when present (O(1)), with a fallback for legacy rows.
  • Expand tests to cover legacy verification, optimized verification, and end-to-end API creation+verification.

Reviewed changes

Copilot reviewed 10 out of 10 changed files in this pull request and generated 2 comments.

Show a summary per file
File Description
backend/routers/issues.py Updates hash computation to include lat/lon and updates verification logic to use stored previous_integrity_hash.
backend/models.py Adds previous_integrity_hash column to the Issue model.
backend/schemas.py Extends response schemas with integrity-hash related fields (IssueResponse + verification response).
tests/test_blockchain.py Adds coverage for legacy and optimized verification paths plus an API E2E test.
backend/utils.py Adds generate_reference_id() utility (used by voice router).
.jules/bolt.md Documents the O(1) blockchain verification learning/action.

💡 Add Copilot custom instructions for smarter, more guided reviews. Learn how to get started.


# Simple but effective SHA-256 chaining
hash_content = f"{description}|{category}|{prev_hash}"
# Blockchain Feature: Geographically sealed chaining
Copy link

Copilot AI Feb 22, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Line 178 is not indented to match the surrounding block (it starts at column 0 while inside the if deduplication_info is None ... branch). This will raise an IndentationError and prevent the module from importing. Indent this comment line to the same level as the following lat_str/lon_str lines.

Suggested change
# Blockchain Feature: Geographically sealed chaining
# Blockchain Feature: Geographically sealed chaining

Copilot uses AI. Check for mistakes.
# Simple but effective SHA-256 chaining
hash_content = f"{description}|{category}|{prev_hash}"
# Blockchain Feature: Geographically sealed chaining
# Format lat/lon to 7 decimal places for consistent hashing as per memory
Copy link

Copilot AI Feb 22, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

The comment "for consistent hashing as per memory" is unclear/irrelevant in-source (it reads like an internal note rather than a spec). Consider replacing it with a concrete rationale (e.g., "to avoid float serialization differences") or a link/reference to a design doc/test that defines the 7-decimal formatting contract.

Suggested change
# Format lat/lon to 7 decimal places for consistent hashing as per memory
# Format lat/lon to 7 decimal places to ensure deterministic hashing across
# environments (avoids float serialization/rounding differences altering the chain).

Copilot uses AI. Check for mistakes.
Copy link
Contributor

@cubic-dev-ai cubic-dev-ai bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

1 issue found across 4 files (changes from recent commits).

Prompt for AI agents (all issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="backend/main.py">

<violation number="1" location="backend/main.py:132">
P1: Defaulting to a wildcard CORS regex in production when FRONTEND_URL is missing opens the API to any Netlify subdomain. This weakens CORS protection compared to the previous fail-fast behavior and can allow unintended origins to access authenticated endpoints.</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.

# To prevent Render deployment crashes, default to a wildcard regex if missing
# Log a warning instead of raising an error
logger.warning("FRONTEND_URL environment variable is missing in production. Defaulting to allow all origins (regex) for availability.")
frontend_url = r"https://.*\.netlify\.app"
Copy link
Contributor

@cubic-dev-ai cubic-dev-ai bot Feb 22, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

P1: Defaulting to a wildcard CORS regex in production when FRONTEND_URL is missing opens the API to any Netlify subdomain. This weakens CORS protection compared to the previous fail-fast behavior and can allow unintended origins to access authenticated endpoints.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At backend/main.py, line 132:

<comment>Defaulting to a wildcard CORS regex in production when FRONTEND_URL is missing opens the API to any Netlify subdomain. This weakens CORS protection compared to the previous fail-fast behavior and can allow unintended origins to access authenticated endpoints.</comment>

<file context>
@@ -126,10 +126,10 @@ async def lifespan(app: FastAPI):
+        # To prevent Render deployment crashes, default to a wildcard regex if missing
+        # Log a warning instead of raising an error
+        logger.warning("FRONTEND_URL environment variable is missing in production. Defaulting to allow all origins (regex) for availability.")
+        frontend_url = r"https://.*\.netlify\.app"
     else:
         logger.warning("FRONTEND_URL not set. Defaulting to http://localhost:5173 for development.")
</file context>
Suggested change
frontend_url = r"https://.*\.netlify\.app"
raise ValueError(
"FRONTEND_URL environment variable is required for security in production. "
"Set it to your frontend URL (e.g., https://your-app.netlify.app)."
)
Fix with Cubic

Copy link
Contributor

@cubic-dev-ai cubic-dev-ai bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

1 issue found across 6 files

Prompt for AI agents (all issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="backend/routers/issues.py">

<violation number="1" location="backend/routers/issues.py:180">
P2: Using "0.0000000" as the placeholder for missing latitude/longitude makes the integrity hash collide with valid coordinates at (0.0, 0.0). That means a report can flip between “no coordinates” and a real (0,0) location without changing the hash. Use a distinct sentinel (e.g., "null") for missing coordinates in both hash generation and verification to avoid collisions.</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.

hash_content = f"{description}|{category}|{prev_hash}"
# Blockchain Feature: Geographically sealed chaining
# Format lat/lon to 7 decimal places for consistent hashing as per memory
lat_str = f"{latitude:.7f}" if latitude is not None else "0.0000000"
Copy link
Contributor

@cubic-dev-ai cubic-dev-ai bot Feb 22, 2026

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

P2: Using "0.0000000" as the placeholder for missing latitude/longitude makes the integrity hash collide with valid coordinates at (0.0, 0.0). That means a report can flip between “no coordinates” and a real (0,0) location without changing the hash. Use a distinct sentinel (e.g., "null") for missing coordinates in both hash generation and verification to avoid collisions.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At backend/routers/issues.py, line 180:

<comment>Using "0.0000000" as the placeholder for missing latitude/longitude makes the integrity hash collide with valid coordinates at (0.0, 0.0). That means a report can flip between “no coordinates” and a real (0,0) location without changing the hash. Use a distinct sentinel (e.g., "null") for missing coordinates in both hash generation and verification to avoid collisions.</comment>

<file context>
@@ -175,8 +175,13 @@ async def create_issue(
-            hash_content = f"{description}|{category}|{prev_hash}"
+# Blockchain Feature: Geographically sealed chaining
+            # Format lat/lon to 7 decimal places for consistent hashing as per memory
+            lat_str = f"{latitude:.7f}" if latitude is not None else "0.0000000"
+            lon_str = f"{longitude:.7f}" if longitude is not None else "0.0000000"
+
</file context>
Fix with Cubic

Copy link

@coderabbitai coderabbitai bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
backend/routers/issues.py (1)

173-209: ⚠️ Potential issue | 🟠 Major

TOCTOU race condition breaks the single-chain guarantee.

Between the await that reads prev_hash (line 173–175) and the await that inserts the new issue (line 209), the event loop can yield to other coroutines. Two concurrent POST /api/issues requests can both read the same prev_hash = hash_N and both persist previous_integrity_hash = hash_N, forking the chain:

Issue N   →  Issue N+1  (previous_integrity_hash = hash_N)
          →  Issue N+2  (previous_integrity_hash = hash_N)  ← also points to N, not N+1

Individual record integrity checks still pass (each issue's stored hash is self-consistent), but the chain is no longer linear, which is the central design property claimed by this PR.

🔒 Proposed fix — serialize chain-building with an asyncio lock

Add a module-level lock in issues.py:

import asyncio
_chain_lock = asyncio.Lock()

Then wrap the fetch-and-insert block:

+        async with _chain_lock:
             # Blockchain feature: calculate integrity hash for the report
             prev_issue = await run_in_threadpool(
                 lambda: db.query(Issue.integrity_hash).order_by(Issue.id.desc()).first()
             )
             prev_hash = prev_issue[0] if prev_issue and prev_issue[0] else ""
             lat_str = f"{latitude:.7f}" if latitude is not None else "0.0000000"
             lon_str = f"{longitude:.7f}" if longitude is not None else "0.0000000"
             hash_content = f"{description}|{category}|{lat_str}|{lon_str}|{prev_hash}"
             integrity_hash = hashlib.sha256(hash_content.encode()).hexdigest()
             ...
             new_issue = Issue(
                 ...
                 integrity_hash=integrity_hash,
                 previous_integrity_hash=prev_hash
             )
             await run_in_threadpool(save_issue_db, db, new_issue)

An asyncio.Lock is sufficient for single-process deployments. For multi-process (e.g., Gunicorn workers), a DB-level advisory lock or SELECT ... FOR UPDATE SKIP LOCKED would be required.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/routers/issues.py` around lines 173 - 209, There is a TOCTOU race
where two concurrent handlers can read the same prev_hash and both insert a
child pointing to it; fix by serializing the read-hash/compute-hash/insert
sequence with a module-level asyncio.Lock (e.g., add _chain_lock =
asyncio.Lock() at top of issues.py) and wrap the block that calls
run_in_threadpool(lambda: db.query(...).first()), computes integrity_hash,
constructs new Issue (integrity_hash and previous_integrity_hash) and calls
run_in_threadpool(save_issue_db, db, new_issue) inside "async with _chain_lock:"
so only one coroutine can build and persist the chain at a time; keep
save_issue_db, the prev_issue read lambda, integrity_hash calculation, and
previous_integrity_hash assignment unchanged otherwise.
🧹 Nitpick comments (1)
backend/routers/issues.py (1)

178-178: Misindented comment at column 0 inside an indented block.

♻️ Proposed fix
-# Blockchain Feature: Geographically sealed chaining
+            # Blockchain Feature: Geographically sealed chaining
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@backend/routers/issues.py` at line 178, The comment string "# Blockchain
Feature: Geographically sealed chaining" is misindented at column 0 inside an
indented block; fix it by aligning the comment with the surrounding block
indentation (or move it to top-level if it was intended as a file-level header)
so it matches the indentation of nearby statements in backend/routers/issues.py
and no longer breaks the block structure—locate the exact comment text and
either prepend the same spaces/tabs as the enclosing block or place it at column
0 as a file header.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@tests/test_blockchain.py`:
- Line 56: Replace boolean equality comparisons with direct truthiness checks in
the test assertions: change occurrences like the comparison of data["is_valid"]
== True to a direct assert data["is_valid"], and where tests compare to False
use assert not <expr>; update all affected assertions (the ones referencing
data["is_valid"] and any other similar == True/== False checks noted in the diff
at the other occurrences) to remove == True/== False and use direct truthiness
(e.g., assert <expr> or assert not <expr>) so the tests follow Ruff E712 style.
- Line 109: The test function test_blockchain_creation_and_verification_api has
an unused fixture parameter db_session (triggers Ruff ARG001); remove db_session
from the test signature so pytest still provides it via the client fixture
dependency — update the function definition in tests/test_blockchain.py to
accept only client (i.e., def
test_blockchain_creation_and_verification_api(client):) and ensure there are no
other references to db_session inside that function.

---

Outside diff comments:
In `@backend/routers/issues.py`:
- Around line 173-209: There is a TOCTOU race where two concurrent handlers can
read the same prev_hash and both insert a child pointing to it; fix by
serializing the read-hash/compute-hash/insert sequence with a module-level
asyncio.Lock (e.g., add _chain_lock = asyncio.Lock() at top of issues.py) and
wrap the block that calls run_in_threadpool(lambda: db.query(...).first()),
computes integrity_hash, constructs new Issue (integrity_hash and
previous_integrity_hash) and calls run_in_threadpool(save_issue_db, db,
new_issue) inside "async with _chain_lock:" so only one coroutine can build and
persist the chain at a time; keep save_issue_db, the prev_issue read lambda,
integrity_hash calculation, and previous_integrity_hash assignment unchanged
otherwise.

---

Nitpick comments:
In `@backend/routers/issues.py`:
- Line 178: The comment string "# Blockchain Feature: Geographically sealed
chaining" is misindented at column 0 inside an indented block; fix it by
aligning the comment with the surrounding block indentation (or move it to
top-level if it was intended as a file-level header) so it matches the
indentation of nearby statements in backend/routers/issues.py and no longer
breaks the block structure—locate the exact comment text and either prepend the
same spaces/tabs as the enclosing block or place it at column 0 as a file
header.

response = client.get(f"/api/issues/{issue1.id}/blockchain-verify")
assert response.status_code == 200
data = response.json()
assert data["is_valid"] == True
Copy link

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

⚠️ Potential issue | 🟡 Minor

Replace == True comparisons with direct truthiness assertions (Ruff E712).

🔧 Proposed fix
-    assert data["is_valid"] == True     # line 56
+    assert data["is_valid"]

-    assert data["is_valid"] == True     # line 63
+    assert data["is_valid"]

-    assert data["is_valid"] == True     # line 105
+    assert data["is_valid"]

-    assert v_resp1.json()["is_valid"] == True   # line 124
+    assert v_resp1.json()["is_valid"]

-    assert v_resp2.json()["is_valid"] == True   # line 141
+    assert v_resp2.json()["is_valid"]

Also applies to: 63-63, 105-105, 124-124, 141-141

🧰 Tools
🪛 Ruff (0.15.1)

[error] 56-56: Avoid equality comparisons to True; use data["is_valid"]: for truth checks

Replace with data["is_valid"]

(E712)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/test_blockchain.py` at line 56, Replace boolean equality comparisons
with direct truthiness checks in the test assertions: change occurrences like
the comparison of data["is_valid"] == True to a direct assert data["is_valid"],
and where tests compare to False use assert not <expr>; update all affected
assertions (the ones referencing data["is_valid"] and any other similar ==
True/== False checks noted in the diff at the other occurrences) to remove ==
True/== False and use direct truthiness (e.g., assert <expr> or assert not
<expr>) so the tests follow Ruff E712 style.

assert data["current_hash"] == hash2
assert data["previous_hash"] == hash1

def test_blockchain_creation_and_verification_api(client, db_session):
Copy link

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

⚠️ Potential issue | 🟡 Minor

Remove the unused db_session parameter (Ruff ARG001).

client already depends on db_session, so pytest resolves the fixture automatically — the explicit parameter in the test signature is redundant.

🔧 Proposed fix
-def test_blockchain_creation_and_verification_api(client, db_session):
+def test_blockchain_creation_and_verification_api(client):
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
def test_blockchain_creation_and_verification_api(client, db_session):
def test_blockchain_creation_and_verification_api(client):
🧰 Tools
🪛 Ruff (0.15.1)

[warning] 109-109: Unused function argument: db_session

(ARG001)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/test_blockchain.py` at line 109, The test function
test_blockchain_creation_and_verification_api has an unused fixture parameter
db_session (triggers Ruff ARG001); remove db_session from the test signature so
pytest still provides it via the client fixture dependency — update the function
definition in tests/test_blockchain.py to accept only client (i.e., def
test_blockchain_creation_and_verification_api(client):) and ensure there are no
other references to db_session inside that function.

- Added missing voice and language dependencies to `backend/requirements-render.txt`.
- Set `PYTHONPATH` to `.` in `render.yaml` to ensure `backend` package is importable.
- Optimized `requirements-render.txt` by removing unused large libraries (`indic-nlp-library`, `numpy`, `pandas`) while keeping essential voice support.
- Implemented `FRONTEND_URL` fallback and `allow_origin_regex` in `backend/main.py` for safer production CORS defaults.
- Added `CI=false` to frontend build in `render-build.sh` to prevent build failures from warnings.
- Fixed `generate_reference_id` missing issue in `backend/utils.py`.
- Includes blockchain O(1) optimization.
@github-actions
Copy link

🔍 Quality Reminder

Thanks for the updates! Please ensure:
- Your changes don't break existing functionality
- All tests still pass
- Code quality standards are maintained

*The maintainers will verify that the overall project flow remains intact.*

- Added missing sub-dependencies (`six`, `ecdsa`, `rsa`, `pyasn1`, `python-dotenv`) to `backend/requirements-render.txt`.
- Added timeouts (10s/20s) to database initialization and migrations in `lifespan` to prevent hanging during Render's health checks.
- Enhanced logging in `backend/main.py` for better visibility into the startup process.
- Verified production startup and health check responsiveness locally.
- Includes previous blockchain O(1) optimization.
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

Projects

None yet

Development

Successfully merging this pull request may close these issues.

1 participant