Quickstart & Expanded Project Documentation

This document contains the expanded development, usage, testing, and migration guidance moved out of the top-level README for brevity.

Development

To get started locally:

# Clone the repository
git clone https://git.allucanget.biz/allucanget/calminer.git
cd calminer

# Create and activate a virtual environment
python -m venv .venv
.\.venv\Scripts\Activate.ps1

# Install dependencies
pip install -r requirements.txt

# Start the development server
uvicorn main:app --reload

Docker-based setup

To build and run the application using Docker instead of a local Python environment:

# Build the application image (multi-stage build keeps runtime small)
docker build -t calminer:latest .

# Start the container on port 8000
docker run --rm -p 8000:8000 calminer:latest

# Supply environment variables (e.g., Postgres connection)
docker run --rm -p 8000:8000 `
  -e DATABASE_DRIVER="postgresql" `
  -e DATABASE_HOST="db.host" `
  -e DATABASE_PORT="5432" `
  -e DATABASE_USER="calminer" `
  -e DATABASE_PASSWORD="s3cret" `
  -e DATABASE_NAME="calminer" `
  -e DATABASE_SCHEMA="public" `
  calminer:latest

If you run Postgres or Redis locally, consider writing a Docker Compose stack that pairs them with the app container. The image expects the database to be reachable, with migrations already applied, before it serves traffic.
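A minimal Compose sketch for pairing the app with Postgres (service names, the `s3cret` password, and the `postgres:16` tag are illustrative, not part of the repository):

```yaml
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: calminer
      POSTGRES_PASSWORD: s3cret
      POSTGRES_DB: calminer
  app:
    build: .
    depends_on:
      - db
    environment:
      DATABASE_DRIVER: postgresql
      DATABASE_HOST: db        # service name doubles as hostname on the Compose network
      DATABASE_PORT: "5432"
      DATABASE_USER: calminer
      DATABASE_PASSWORD: s3cret
      DATABASE_NAME: calminer
      DATABASE_SCHEMA: public
    ports:
      - "8000:8000"
```

Remember to run the migration step against the `db` service before the app starts handling requests.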

Usage Overview

  • API base URL: http://localhost:8000/api
  • Key routes include creating scenarios, parameters, costs, consumption, production, equipment, maintenance, and reporting summaries. See the routes/ directory for full details.

Dashboard Preview

  1. Start the FastAPI server and navigate to /.
  2. Review the headline metrics, scenario snapshot table, and cost/activity charts sourced from the current database state.
  3. Use the "Refresh Dashboard" button to pull freshly aggregated data via /ui/dashboard/data without reloading the page.

Testing

Run the unit test suite:

pytest

E2E tests use Playwright and a session-scoped live_server fixture that starts the app at http://localhost:8001 for browser-driven tests.
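The live_server pattern — start the app in a background thread for the whole test session, hand the tests its URL, then shut it down — can be sketched with only the standard library (the real fixture boots the FastAPI app on port 8001; the stand-in handler and free-port selection here are illustrative):

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen


class _PingHandler(BaseHTTPRequestHandler):
    """Stand-in for the real application; always answers 200 OK."""

    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        # Keep test output quiet.
        pass


def start_live_server():
    """Start a throwaway server on a free port; return (base_url, stop_callable)."""
    server = HTTPServer(("127.0.0.1", 0), _PingHandler)  # port 0 = pick any free port
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return f"http://127.0.0.1:{server.server_address[1]}", server.shutdown
```

In the actual suite this logic lives in a session-scoped pytest fixture so every Playwright test shares one server process.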

Migrations & Baseline

A consolidated baseline migration (scripts/migrations/000_base.sql) captures all schema changes required for a fresh installation. The script is idempotent: it creates the currency and measurement_unit reference tables, ensures consumption and production records expose unit metadata, and enforces the foreign keys used by CAPEX and OPEX.
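Idempotent DDL in this style guards every statement so reruns are harmless; a sketch of the pattern (table and column names are illustrative, not copied from 000_base.sql):

```sql
CREATE TABLE IF NOT EXISTS currency (
    id SERIAL PRIMARY KEY,
    code VARCHAR(3) NOT NULL UNIQUE,
    symbol VARCHAR(8) NOT NULL,
    is_active BOOLEAN NOT NULL DEFAULT TRUE
);

ALTER TABLE production_output
    ADD COLUMN IF NOT EXISTS unit_id INTEGER REFERENCES measurement_unit (id);
```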

Configure granular database settings in your PowerShell session before running migrations:

$env:DATABASE_DRIVER = 'postgresql'
$env:DATABASE_HOST = 'localhost'
$env:DATABASE_PORT = '5432'
$env:DATABASE_USER = 'calminer'
$env:DATABASE_PASSWORD = 's3cret'
$env:DATABASE_NAME = 'calminer'
$env:DATABASE_SCHEMA = 'public'
python scripts/setup_database.py --run-migrations --seed-data --dry-run
python scripts/setup_database.py --run-migrations --seed-data

The dry-run invocation reports which steps would execute without making changes. The live run applies the baseline (if not already recorded in schema_migrations) and seeds the reference data relied upon by the UI and API.

The application still accepts DATABASE_URL as a fallback if the granular variables are not set.
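The granular variables map onto a standard SQLAlchemy-style URL; a sketch of the fallback precedence (the function name and exact precedence are illustrative — the application's settings code is authoritative):

```python
import os


def resolve_database_url(env=os.environ):
    """Prefer granular DATABASE_* variables; fall back to DATABASE_URL."""
    granular = ("DATABASE_DRIVER", "DATABASE_HOST", "DATABASE_USER",
                "DATABASE_PASSWORD", "DATABASE_NAME")
    if all(env.get(key) for key in granular):
        port = env.get("DATABASE_PORT", "5432")
        return (f"{env['DATABASE_DRIVER']}://{env['DATABASE_USER']}:"
                f"{env['DATABASE_PASSWORD']}@{env['DATABASE_HOST']}:"
                f"{port}/{env['DATABASE_NAME']}")
    # Granular settings incomplete: use the single-URL fallback.
    return env["DATABASE_URL"]
```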

Database bootstrap workflow

Provision or refresh a database instance with scripts/setup_database.py. Populate the required environment variables (an example lives at config/setup_test.env.example) and run:

# Load test credentials (PowerShell)
Get-Content .\config\setup_test.env.example |
  ForEach-Object {
    if ($_ -and -not $_.StartsWith('#')) {
      $name, $value = $_ -split '=', 2
      Set-Item -Path Env:$name -Value $value
    }
  }

# Dry-run to inspect the planned actions
python scripts/setup_database.py --ensure-database --ensure-role --ensure-schema --initialize-schema --run-migrations --seed-data --dry-run -v

# Execute the full workflow
python scripts/setup_database.py --ensure-database --ensure-role --ensure-schema --initialize-schema --run-migrations --seed-data -v

Typical log output confirms:

  • Admin and application connections succeed for the supplied credentials.
  • Database and role creation are idempotent (already present when rerun).
  • SQLAlchemy metadata either reports missing tables or logs "All tables already exist".
  • Migrations list pending files and finish with "Applied N migrations" (a new database reports "Applied 1 migrations" for 000_base.sql).

After a successful run the target database contains all application tables plus schema_migrations, and that table records each applied migration file. New installations only record 000_base.sql; upgraded environments retain historical entries alongside the baseline.
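The schema_migrations bookkeeping follows a common apply-once pattern — run each pending file, then record it — sketched here against SQLite for brevity (the real script targets Postgres; names are illustrative):

```python
import sqlite3


def apply_migrations(conn, migrations):
    """Apply (name, sql) pairs not yet recorded in schema_migrations; return count applied."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)")
    done = {row[0] for row in conn.execute("SELECT name FROM schema_migrations")}
    applied = 0
    for name, sql in migrations:
        if name in done:
            continue  # already recorded: skip on rerun
        conn.executescript(sql)
        conn.execute("INSERT INTO schema_migrations (name) VALUES (?)", (name,))
        applied += 1
    conn.commit()
    return applied
```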

Seeding reference data

scripts/seed_data.py provides targeted control over the baseline datasets when the full setup script is not required:

python scripts/seed_data.py --currencies --units --dry-run
python scripts/seed_data.py --currencies --units

The seeder upserts the canonical currency catalog (USD, EUR, CLP, RMB, GBP, CAD, AUD) using ASCII-safe symbols (USD$, EUR, etc.) and the measurement units referenced by the UI (tonnes, kilograms, pounds, liters, cubic_meters, kilowatt_hours). The setup script invokes the same seeder when --seed-data is provided and verifies the expected rows afterward, warning if any are missing or inactive.
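An upsert is what keeps the seeder rerunnable; the pattern, sketched with an ON CONFLICT clause against SQLite (Postgres accepts the same syntax; the catalog below is abbreviated and the table layout is illustrative):

```python
import sqlite3

# Abbreviated stand-in for the canonical catalog.
CURRENCIES = [("USD", "USD$"), ("EUR", "EUR"), ("CLP", "CLP$")]


def seed_currencies(conn, rows=CURRENCIES):
    """Insert currencies, updating the symbol in place if the code already exists."""
    conn.execute("""CREATE TABLE IF NOT EXISTS currency (
        code TEXT PRIMARY KEY, symbol TEXT NOT NULL,
        is_active INTEGER NOT NULL DEFAULT 1)""")
    conn.executemany(
        """INSERT INTO currency (code, symbol) VALUES (?, ?)
           ON CONFLICT (code) DO UPDATE SET symbol = excluded.symbol""",
        rows,
    )
    conn.commit()
```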

Rollback guidance

scripts/setup_database.py now tracks compensating actions when it creates the database or application role. If a later step fails, the script replays those rollback actions (dropping the newly created database or role and revoking grants) before exiting. Dry runs never register rollback steps and remain read-only.

If the script reports that some rollback steps could not complete—for example because a connection cannot be established—rerun the script with --dry-run to confirm the desired end state and then apply the outstanding cleanup manually:

python scripts/setup_database.py --ensure-database --ensure-role --dry-run -v

# Manual cleanup examples when automation cannot connect
psql -d postgres -c "DROP DATABASE IF EXISTS calminer"
psql -d postgres -c "DROP ROLE IF EXISTS calminer"

After a failure and rollback, rerun the full setup once the environment issues are resolved.
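The compensating-action mechanism amounts to a LIFO undo stack replayed on failure; a sketch of the idea (not the script's actual API):

```python
def run_with_rollback(steps):
    """Run (action, compensate) pairs; on failure, replay compensations in reverse order."""
    undo = []
    try:
        for action, compensate in steps:
            action()
            if compensate is not None:
                undo.append(compensate)
    except Exception:
        while undo:
            undo.pop()()  # most recent action is undone first
        raise
```

Dry runs would simply never push onto the undo stack, which matches the read-only behavior described above.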

CI pipeline environment

The .gitea/workflows/test.yml job spins up a temporary PostgreSQL 16 container and runs the setup script twice: once with --dry-run to validate the plan and again without it to apply migrations and seeds. No external secrets are required; the workflow sets the following environment variables for both invocations and for pytest:

| Variable | Value | Purpose |
| --- | --- | --- |
| DATABASE_DRIVER | postgresql | Signals the driver to the setup script |
| DATABASE_HOST | 127.0.0.1 | Points to the linked job service |
| DATABASE_PORT | 5432 | Default service port |
| DATABASE_NAME | calminer_ci | Target database created by the workflow |
| DATABASE_USER | calminer | Application role used during tests |
| DATABASE_PASSWORD | secret | Password for both admin and app role |
| DATABASE_SCHEMA | public | Default schema for the tests |
| DATABASE_SUPERUSER | calminer | Setup script uses the same role for admin actions |
| DATABASE_SUPERUSER_PASSWORD | secret | Matches the Postgres service password |
| DATABASE_SUPERUSER_DB | calminer_ci | Database to connect to for admin operations |

The workflow also updates DATABASE_URL for pytest to point at the CI Postgres instance. Existing tests continue to work unchanged, since SQLAlchemy reads the URL exactly as it does locally.

Because the workflow provisions everything inline, no repository or organization secrets need to be configured for basic CI runs. If you later move the setup step to staging or production pipelines, replace these inline values with secrets managed by the CI platform.

Database Objects

The database contains tables such as capex, opex, chemical_consumption, fuel_consumption, water_consumption, scrap_consumption, production_output, equipment_operation, ore_batch, exchange_rate, and simulation_result.

Current implementation status (2025-10-21)

  • Currency normalization: a currency table and backfill scripts exist; routes accept currency_id and currency_code for compatibility.
  • Simulation engine: scaffolding in services/simulation.py backs /api/simulations/run, which returns in-memory results; persistence to models/simulation_result is planned.
  • Reporting: services/reporting.py provides summary statistics used by POST /api/reporting/summary.
  • Tests & coverage: unit and E2E suites exist; recent local coverage is >90%.
  • Remaining work: authentication, persist simulation runs, CI/CD and containerization.

Where to look next