55 Commits

Author SHA1 Message Date
cbaff5614a feat(docker): add script to run Docker container for calminer application
Some checks failed
CI / lint (push) Successful in 17s
Deploy - Coolify / deploy (push) Failing after 5s
CI / test (push) Successful in 1m3s
CI / build (push) Successful in 2m18s
2025-11-15 19:58:55 +01:00
f9feb51d33 refactor(docker): update .gitignore for devcontainer files and remove version from docker-compose
Some checks failed
CI / lint (push) Successful in 16s
Deploy - Coolify / deploy (push) Failing after 5s
CI / test (push) Successful in 1m4s
CI / build (push) Successful in 2m19s
2025-11-15 15:41:43 +01:00
eb2687829f refactor(navigation): remove legacy navigation.js and integrate logic into navigation_sidebar.js
Some checks failed
CI / lint (push) Successful in 17s
Deploy - Coolify / deploy (push) Failing after 5s
CI / test (push) Successful in 1m21s
CI / build (push) Successful in 2m25s
2025-11-15 13:53:50 +01:00
ea101d1695 Merge branch 'main' of https://git.allucanget.biz/allucanget/calminer
Some checks failed
CI / lint (push) Successful in 17s
Deploy - Coolify / deploy (push) Failing after 5s
CI / test (push) Successful in 1m3s
CI / build (push) Successful in 2m15s
2025-11-14 21:22:37 +01:00
722f93b41c refactor(ci): remove deployment context capture and simplify API call for Coolify deployment 2025-11-14 21:20:37 +01:00
e2e5e12f46 Merge pull request 'merge develop to main' (#13) from develop into main
Some checks failed
Deploy - Coolify / deploy (push) Failing after 6s
CI / lint (push) Successful in 17s
CI / test (push) Successful in 1m3s
CI / build (push) Successful in 2m16s
Reviewed-on: #13
2025-11-14 20:49:10 +01:00
4e60168837 Merge https://git.allucanget.biz/allucanget/calminer into develop
All checks were successful
CI / lint (push) Successful in 16s
CI / lint (pull_request) Successful in 16s
CI / test (push) Successful in 1m4s
CI / test (pull_request) Successful in 1m2s
CI / build (push) Successful in 1m49s
CI / build (pull_request) Successful in 1m51s
2025-11-14 20:32:03 +01:00
dae3b59af9 feat(ci): add Kubernetes deployment toggle and update conditions for deployment steps
All checks were successful
CI / lint (push) Successful in 16s
CI / test (push) Successful in 1m3s
CI / build (push) Successful in 1m53s
CI / lint (pull_request) Successful in 16s
CI / test (pull_request) Successful in 1m3s
CI / build (pull_request) Successful in 1m51s
2025-11-14 20:14:53 +01:00
839399363e fix(ci): update registry handling and add image push step in CI workflow
All checks were successful
CI / lint (push) Successful in 16s
CI / test (push) Successful in 1m4s
CI / build (push) Successful in 1m45s
2025-11-14 20:08:26 +01:00
fa8a065138 feat(ci): enhance CI workflow with metadata outputs and add Coolify deployment workflow
All checks were successful
CI / lint (push) Successful in 16s
CI / test (push) Successful in 1m3s
CI / build (push) Successful in 1m48s
2025-11-14 19:55:06 +01:00
cd0c0ab416 fix(ci-build): update conditions for push permissions in CI workflow
Some checks failed
CI / lint (push) Failing after 1s
CI / test (push) Has been skipped
CI / build (push) Has been skipped
2025-11-14 19:21:48 +01:00
854b1ac713 Merge pull request 'feat:v2' (#12) from develop into main
All checks were successful
CI / lint (push) Successful in 16s
CI / test (push) Successful in 1m3s
CI / build (push) Successful in 2m17s
Reviewed-on: #12
2025-11-14 18:02:54 +01:00
25fd13ce69 Merge branch 'main' into develop
All checks were successful
CI / lint (push) Successful in 16s
CI / lint (pull_request) Successful in 16s
CI / test (push) Successful in 1m3s
CI / build (push) Successful in 1m56s
CI / test (pull_request) Successful in 1m3s
CI / build (pull_request) Successful in 1m51s
2025-11-14 18:02:43 +01:00
0fec805db1 Delete templates/dashboard.html
Some checks failed
CI / build (push) Has been cancelled
CI / test (push) Has been cancelled
2025-11-14 18:02:33 +01:00
3746062819 chore: remove cicache workflow file
All checks were successful
CI / lint (push) Successful in 17s
CI / test (push) Successful in 1m3s
CI / build (push) Successful in 1m54s
CI / lint (pull_request) Successful in 15s
CI / test (pull_request) Successful in 1m2s
CI / build (pull_request) Successful in 1m46s
2025-11-14 16:34:17 +01:00
958c165721 chore: add .gitattributes for text handling and line endings
All checks were successful
CI / lint (push) Successful in 16s
CI / test (push) Successful in 1m4s
CI / build (push) Successful in 1m56s
CI / deploy (push) Has been skipped
2025-11-14 14:21:16 +01:00
6e835c83eb fix(Dockerfile): implement fallback mechanisms for apt update and install
All checks were successful
CI / lint (push) Successful in 16s
CI / test (push) Successful in 1m2s
CI / build (push) Successful in 1m49s
CI / deploy (push) Has been skipped
2025-11-14 14:12:02 +01:00
75924fca84 feat(ci): add CI workflows for linting, testing, and building
Some checks failed
CI / lint (push) Successful in 15s
CI / test (push) Successful in 1m2s
CI / build (push) Failing after 29s
CI / deploy (push) Has been skipped
2025-11-14 13:45:10 +01:00
ac9ffddbde fix(ci): downgrade upload-artifact action to v3 for compatibility
Some checks failed
CI / build (push) Failing after 41s
CI / deploy (push) Has been skipped
CI / lint (push) Successful in 15s
CI / test (push) Successful in 1m12s
2025-11-14 13:31:26 +01:00
4e5a4c645d chore: remove Playwright installation steps from CI workflow
Some checks failed
CI / lint (push) Successful in 15s
CI / test (push) Failing after 1m2s
CI / build (push) Has been skipped
CI / deploy (push) Has been skipped
2025-11-14 13:26:33 +01:00
e9678b6736 chore: remove CI workflow file and update test files for improved structure and functionality
Some checks failed
CI / lint (push) Successful in 15s
CI / test (push) Failing after 16s
CI / build (push) Has been skipped
CI / deploy (push) Has been skipped
2025-11-14 13:25:02 +01:00
e5e346b26a Update templates/dashboard.html
Some checks failed
CI / build (push) Has been skipped
CI / test (push) Failing after 17s
CI / deploy (push) Has been skipped
CI / lint (push) Successful in 16s
2025-11-14 13:11:08 +01:00
b0e623d68e fix(tests): use secure token generation for access token in navigation client
Some checks failed
CI / lint (push) Successful in 15s
CI / build (push) Has been skipped
CI / test (push) Failing after 18s
CI / deploy (push) Has been skipped
2025-11-14 13:08:09 +01:00
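The fix above replaces a predictable token with a securely generated one; a minimal sketch of what that might look like (the helper name and byte length are illustrative, not the repository's actual code):

```python
import secrets


def make_access_token(n_bytes: int = 32) -> str:
    """Generate an unpredictable, URL-safe access token for a test client.

    secrets.token_urlsafe draws from the OS CSPRNG, unlike random.choice,
    which Bandit flags as unsuitable for security-sensitive values.
    """
    return secrets.token_urlsafe(n_bytes)
```

Each call produces a fresh token, so fixtures never reuse a guessable value.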
30dbc13fae fix(init_db): correct SQL syntax for navigation link insertion
Some checks failed
CI / test (push) Has been skipped
CI / build (push) Has been skipped
CI / lint (push) Failing after 15s
CI / deploy (push) Has been skipped
2025-11-14 12:51:48 +01:00
31b9a1058a refactor: remove unused imports and streamline code in calculations and navigation services
Some checks failed
CI / test (push) Has been skipped
CI / build (push) Has been skipped
CI / lint (push) Failing after 14s
CI / deploy (push) Has been skipped
2025-11-14 12:28:48 +01:00
bcd993d57c feat(changelog): document completion of UI alignment initiative and style consolidation
Some checks failed
CI / test (push) Has been skipped
CI / build (push) Has been skipped
CI / lint (push) Failing after 15s
CI / deploy (push) Has been skipped
2025-11-13 22:34:31 +01:00
1262a4a63f Refactor CSS styles and introduce theme variables
- Removed redundant CSS rules and consolidated styles across dashboard, forms, imports, projects, and scenarios.
- Introduced new color variables in theme-default.css for better maintainability and consistency.
- Updated existing styles to utilize new color variables, enhancing the overall design.
- Improved responsiveness and layout of various components, including tables and cards.
- Ensured consistent styling for buttons, links, and headers across the application.
2025-11-13 22:30:58 +01:00
fb6816de00 Add form styles and update button classes for consistency
- Introduced a new CSS file for form styles (forms.css) to enhance form layout and design.
- Removed deprecated button styles from imports.css and updated button classes across templates to use the new utility classes.
- Updated various templates to reflect the new button styles, ensuring a consistent look and feel throughout the application.
- Refactored form-related styles in main.css and removed redundant styles from projects.css and scenarios.css.
- Ensured responsive design adjustments for form actions in smaller viewports.
2025-11-13 21:18:32 +01:00
4d0e1a9989 feat(navigation): Enhance navigation links and add legacy route redirects
Some checks failed
CI / test (push) Has been skipped
CI / build (push) Has been skipped
CI / lint (push) Failing after 14s
CI / deploy (push) Has been skipped
- Updated navigation links in `init_db.py` to include href overrides and parent slugs for profitability, opex, and capex planners.
- Modified `NavigationService` to handle child links and href overrides, ensuring proper routing when context is missing.
- Adjusted scenario detail and list templates to use new route names for opex and capex forms, with legacy fallbacks.
- Introduced integration tests for legacy calculation routes to ensure proper redirection and error handling.
- Added tests for navigation sidebar to validate role-based access and link visibility.
- Enhanced navigation sidebar tests to include calculation links and contextual URLs based on project and scenario IDs.
2025-11-13 20:23:53 +01:00
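The href-override and legacy-fallback behaviour described above can be sketched roughly as follows; the function, link fields, and URL patterns are hypothetical stand-ins for the `NavigationService` logic, not the actual implementation:

```python
def resolve_href(link: dict, context: dict) -> str:
    """Resolve a navigation link's target URL.

    An explicit href override wins; otherwise build a contextual URL from
    the project and scenario IDs, falling back to a legacy calculation
    route when context is missing (matching the redirect tests above).
    """
    if link.get("href_override"):
        return link["href_override"]
    project_id = context.get("project_id")
    scenario_id = context.get("scenario_id")
    if project_id and scenario_id:
        return f"/projects/{project_id}/scenarios/{scenario_id}/{link['slug']}"
    # Legacy fallback route; the app redirects these to contextual URLs.
    return f"/calculations/{link['slug']}"
```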
ed8e05147c feat: update status codes and navigation structure in calculations and reports routes 2025-11-13 17:14:17 +01:00
522b1e4105 feat: add scenarios list page with metrics and quick actions
Some checks failed
CI / test (push) Has been skipped
CI / build (push) Has been skipped
CI / lint (push) Failing after 15s
CI / deploy (push) Has been skipped
- Introduced a new template for listing scenarios associated with a project.
- Added metrics for total, active, draft, and archived scenarios.
- Implemented quick actions for creating new scenarios and reviewing project overview.
- Enhanced navigation with breadcrumbs for better user experience.

refactor: update Opex and Profitability templates for consistency

- Changed titles and button labels for clarity in Opex and Profitability templates.
- Updated form IDs and action URLs for better alignment with new naming conventions.
- Improved navigation links to include scenario and project overviews.

test: add integration tests for Opex calculations

- Created new tests for Opex calculation HTML and JSON flows.
- Validated successful calculations and ensured correct data persistence.
- Implemented tests for currency mismatch and unsupported frequency scenarios.

test: enhance project and scenario route tests

- Added tests to verify scenario list rendering and calculator shortcuts.
- Ensured scenario detail pages link back to the portfolio correctly.
- Validated project detail pages show associated scenarios accurately.
2025-11-13 16:21:36 +01:00
4f00bf0d3c feat: Add CRUD tests for project and scenario models 2025-11-13 11:06:39 +01:00
3551b0356d feat: Add comprehensive test suite for project and scenario models 2025-11-13 11:05:36 +01:00
521a8abc2d feat: Migrate to Pydantic's @field_validator and implement lifespan handler in FastAPI 2025-11-13 09:54:09 +01:00
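The lifespan-handler migration mentioned above replaces FastAPI's deprecated `@app.on_event("startup")`/`"shutdown"` hooks with a single async context manager. A stdlib-only sketch of the pattern (the hooks recorded here are placeholders; the real code would run the DB initializer and dispose the engine):

```python
import asyncio
from contextlib import asynccontextmanager

# Records which phases ran; stands in for real startup/shutdown side effects.
events = []


@asynccontextmanager
async def lifespan(app):
    events.append("startup")   # e.g. run the idempotent DB initializer
    yield                      # application serves requests here
    events.append("shutdown")  # e.g. dispose the database engine


async def _demo() -> None:
    # FastAPI would drive this via FastAPI(lifespan=lifespan); we drive it
    # manually to show the ordering.
    async with lifespan(app=None):
        assert events == ["startup"]


asyncio.run(_demo())
```

With FastAPI, the same object is passed as `FastAPI(lifespan=lifespan)`, so startup and shutdown logic live in one place.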
1feae7ff85 feat: Add Processing Opex functionality
- Introduced OpexValidationError for handling validation errors in processing opex calculations.
- Implemented ProjectProcessingOpexRepository and ScenarioProcessingOpexRepository for managing project and scenario-level processing opex snapshots.
- Enhanced UnitOfWork to include repositories for processing opex.
- Updated sidebar navigation and scenario detail templates to include links to the new Processing Opex Planner.
- Created a new template for the Processing Opex Planner with form handling for input components and parameters.
- Developed integration tests for processing opex calculations, covering HTML and JSON flows, including validation for currency mismatches and unsupported frequencies.
- Added unit tests for the calculation logic, ensuring correct handling of various scenarios and edge cases.
2025-11-13 09:26:57 +01:00
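The validation cases listed above (currency mismatch, unsupported frequency) suggest logic along these lines; this is a simplified sketch with illustrative field names, not the repository's actual calculation service:

```python
class OpexValidationError(ValueError):
    """Raised when processing-opex inputs fail validation (sketch)."""


def total_monthly_opex(components: list[dict], parameters: dict) -> float:
    """Sum component costs normalized to a monthly figure.

    Every component must match the scenario currency and use a supported
    billing frequency, mirroring the integration tests described above.
    """
    currency = parameters["currency"]
    # Conversion factors from each billing frequency to a per-month amount.
    factors = {"monthly": 1.0, "quarterly": 1 / 3, "annual": 1 / 12}
    total = 0.0
    for comp in components:
        if comp["currency"] != currency:
            raise OpexValidationError(
                f"currency mismatch: {comp['currency']} != {currency}"
            )
        factor = factors.get(comp["frequency"])
        if factor is None:
            raise OpexValidationError(f"unsupported frequency: {comp['frequency']}")
        total += comp["amount"] * factor
    return total
```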
1240b08740 feat: Persist initial capex calculations and enhance navigation links in UI 2025-11-12 23:52:06 +01:00
d9fd82b2e3 feat: Implement initial capex calculation feature
- Added CapexComponentInput, CapexParameters, CapexCalculationRequest, CapexCalculationResult, and related schemas for capex calculations.
- Introduced calculate_initial_capex function to aggregate capex components and compute totals and timelines.
- Created ProjectCapexRepository and ScenarioCapexRepository for managing capex snapshots in the database.
- Developed capex.html template for capturing and displaying initial capex data.
- Registered common Jinja2 filters for formatting currency and percentages.
- Implemented unit and integration tests for capex calculation functionality.
- Updated unit of work to include new repositories for capex management.
2025-11-12 23:51:52 +01:00
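The `calculate_initial_capex` aggregation described above (component totals plus a spend timeline) might look roughly like this; the component schema shown is a guess at the shape, not the actual Pydantic models:

```python
def calculate_initial_capex(components: list[dict]) -> dict:
    """Aggregate capex components into a grand total and a per-year timeline.

    Each component is assumed to carry an `amount` and the `year` (offset
    from project start) in which it is spent.
    """
    total = sum(c["amount"] for c in components)
    timeline: dict[int, float] = {}
    for c in components:
        timeline[c["year"]] = timeline.get(c["year"], 0.0) + c["amount"]
    # Sort the timeline so templates can render years in order.
    return {"total": total, "timeline": dict(sorted(timeline.items()))}
```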
6c1570a254 feat: Update favicon handling to use FileResponse and add favicon.ico 2025-11-12 22:42:09 +01:00
b1a6df9f90 feat: Add profitability calculation schemas and service functions
- Introduced Pydantic schemas for profitability calculations in `schemas/calculations.py`.
- Implemented service functions for profitability calculations in `services/calculations.py`.
- Added new exception class `ProfitabilityValidationError` for handling validation errors.
- Created repositories for managing project and scenario profitability snapshots.
- Developed a utility script for verifying authenticated routes.
- Added a new HTML template for the profitability calculator interface.
- Implemented a script to fix user ID sequence in the database.
2025-11-12 22:22:29 +01:00
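At the heart of any profitability calculator like the one above is a discounted-cash-flow computation; a minimal net-present-value sketch (the actual service functions in `services/calculations.py` are more elaborate):

```python
def npv(rate: float, cash_flows: list[float]) -> float:
    """Net present value of a cash-flow series.

    cash_flows[0] occurs at t=0 (typically the negative initial capex),
    later entries are discounted by (1 + rate) per period.
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))
```

For example, investing 100 and receiving 110 one period later breaks even exactly at a 10% discount rate.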
6d496a599e feat: Resolve test suite regressions and enhance token tamper detection
feat: Add UI router to application for improved routing
style: Update breadcrumb styles in main.css and remove redundant styles from scenarios.css
2025-11-12 20:30:40 +01:00
1199813da0 feat: Add plotly to requirements for enhanced data visualization 2025-11-12 19:42:09 +01:00
acf6f50bbd feat: Add NPV comparison and distribution charts to reporting
Some checks failed
CI / lint (push) Successful in 15s
CI / build (push) Has been skipped
CI / test (push) Failing after 17s
CI / deploy (push) Has been skipped
- Implemented NPV comparison chart generation using Plotly in ReportingService.
- Added distribution histogram for Monte Carlo results.
- Updated reporting templates to include new charts and improved layout.
- Created new settings and currencies management pages.
- Enhanced sidebar navigation with dynamic URL handling.
- Improved CSS styles for chart containers and overall layout.
- Added new simulation and theme settings pages with placeholders for future features.
2025-11-12 19:39:27 +01:00
ad306bd0aa feat: Refactor database initialization for SQLite compatibility 2025-11-12 18:30:35 +01:00
ed4187970c feat: Implement SQLite support with environment-driven backend switching 2025-11-12 18:29:49 +01:00
0fbe9f543e fix: Update .gitignore to include additional SQLite database files 2025-11-12 18:21:39 +01:00
80825c2c5d chore: Update changelog with recent verification and documentation updates 2025-11-12 18:17:09 +01:00
44a3bfc1bf fix: Remove unnecessary 'uvicorn' command from docker-compose.override.yml 2025-11-12 18:17:04 +01:00
1f892ebdbb feat: Implement SQLAlchemy enum helper and normalize enum values in database initialization 2025-11-12 18:11:19 +01:00
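Normalizing enum values during database initialization, as this commit describes, typically means coercing raw strings (member names or stored values, in any case) to the enum's canonical value. A stdlib-only sketch; the enum and helper shown are hypothetical, not the code in `models/enums.py`:

```python
import enum


class ProjectStatus(enum.Enum):
    DRAFT = "draft"
    ACTIVE = "active"


def normalize_enum(value, enum_cls: type[enum.Enum]) -> str:
    """Coerce a raw DB string or enum member to the canonical stored value.

    Accepts member instances, member names, or values, case-insensitively,
    so seed data survives round-trips regardless of how it was written.
    """
    if isinstance(value, enum_cls):
        return value.value
    text = str(value).strip().lower()
    for member in enum_cls:
        if text in (member.value.lower(), member.name.lower()):
            return member.value
    raise ValueError(f"{value!r} is not a valid {enum_cls.__name__}")
```

With SQLAlchemy, the same idea is often paired with `Enum(..., values_callable=...)` so the column stores the `.value` strings rather than member names.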
bcdc9e861e feat: Enhance CSS with custom properties for theming and layout adjustments 2025-11-12 18:11:02 +01:00
23523f70f1 feat: Add comprehensive tests for database initialization and seeding 2025-11-12 16:38:20 +01:00
8ef6724960 feat: Add database initialization, reset, and verification scripts 2025-11-12 16:30:17 +01:00
6e466a3fd2 Refactor database initialization and remove Alembic migrations
- Removed legacy Alembic migration files and consolidated schema management into a new Pydantic-backed initializer (`scripts/init_db.py`).
- Updated `main.py` to ensure the new DB initializer runs on startup, maintaining idempotency.
- Adjusted session management in `config/database.py` to prevent DetachedInstanceError.
- Introduced new enums in `models/enums.py` for better organization and clarity.
- Refactored various models to utilize the new enums, improving code maintainability.
- Enhanced middleware to handle JSON validation more robustly, ensuring non-JSON requests do not trigger JSON errors.
- Added tests for middleware and enums to ensure expected behavior and consistency.
- Updated changelog to reflect significant changes and improvements.
2025-11-12 16:29:44 +01:00
9d4c807475 feat: Update logo images in footer and header templates 2025-11-12 16:00:11 +01:00
9cd555e134 feat: Add pre-commit configuration for code quality tools 2025-11-12 12:07:39 +01:00
edf86a5447 Update templates/dashboard.html
Some checks failed
CI / build (push) Has been cancelled
CI / test (push) Has been cancelled
2025-11-12 11:22:33 +01:00
132 changed files with 13848 additions and 3195 deletions

.gitattributes

@@ -0,0 +1,3 @@
* text=auto
Dockerfile text eol=lf


@@ -0,0 +1,232 @@
name: CI - Build
on:
  workflow_call:
  workflow_dispatch:
jobs:
  build:
    outputs:
      allow_push: ${{ steps.meta.outputs.allow_push }}
      ref_name: ${{ steps.meta.outputs.ref_name }}
      event_name: ${{ steps.meta.outputs.event_name }}
      sha: ${{ steps.meta.outputs.sha }}
    runs-on: ubuntu-latest
    env:
      DEFAULT_BRANCH: main
      REGISTRY_URL: ${{ secrets.REGISTRY_URL }}
      REGISTRY_USERNAME: ${{ secrets.REGISTRY_USERNAME }}
      REGISTRY_PASSWORD: ${{ secrets.REGISTRY_PASSWORD }}
      REGISTRY_CONTAINER_NAME: calminer
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Collect workflow metadata
        id: meta
        shell: bash
        env:
          DEFAULT_BRANCH: ${{ env.DEFAULT_BRANCH }}
        run: |
          git_ref="${GITEA_REF:-${GITHUB_REF:-}}"
          ref_name="${GITEA_REF_NAME:-${GITHUB_REF_NAME:-}}"
          if [ -z "$ref_name" ] && [ -n "$git_ref" ]; then
            ref_name="${git_ref##*/}"
          fi
          event_name="${GITEA_EVENT_NAME:-${GITHUB_EVENT_NAME:-}}"
          sha="${GITEA_SHA:-${GITHUB_SHA:-}}"
          if [ -z "$sha" ]; then
            sha="$(git rev-parse HEAD)"
          fi
          if [ "$ref_name" = "${DEFAULT_BRANCH:-main}" ] && [ "$event_name" != "pull_request" ]; then
            echo "allow_push=true" >> "$GITHUB_OUTPUT"
          else
            echo "allow_push=false" >> "$GITHUB_OUTPUT"
          fi
          echo "ref_name=$ref_name" >> "$GITHUB_OUTPUT"
          echo "event_name=$event_name" >> "$GITHUB_OUTPUT"
          echo "sha=$sha" >> "$GITHUB_OUTPUT"
      - name: Validate registry configuration
        shell: bash
        run: |
          set -euo pipefail
          if [ -z "${REGISTRY_URL}" ]; then
            echo "::error::REGISTRY_URL secret not configured. Configure it with your Gitea container registry host." >&2
            exit 1
          fi
          server_url="${GITEA_SERVER_URL:-${GITHUB_SERVER_URL:-}}"
          server_host="${server_url#http://}"
          server_host="${server_host#https://}"
          server_host="${server_host%%/*}"
          server_host="${server_host%%:*}"
          registry_host="${REGISTRY_URL#http://}"
          registry_host="${registry_host#https://}"
          registry_host="${registry_host%%/*}"
          registry_host="${registry_host%%:*}"
          if [ -n "${server_host}" ] && ! printf '%s' "${registry_host}" | grep -qi "${server_host}"; then
            echo "::warning::REGISTRY_URL (${REGISTRY_URL}) does not match current Gitea host (${server_host}). Ensure this registry endpoint is managed by Gitea." >&2
          fi
          registry_repository="${registry_host}/allucanget/${REGISTRY_CONTAINER_NAME}"
          echo "REGISTRY_HOST=${registry_host}" >> "$GITHUB_ENV"
          echo "REGISTRY_REPOSITORY=${registry_repository}" >> "$GITHUB_ENV"
      - name: Set up QEMU and Buildx
        uses: docker/setup-buildx-action@v3
      - name: Log in to gitea registry
        if: ${{ steps.meta.outputs.allow_push == 'true' }}
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY_HOST }}
          username: ${{ env.REGISTRY_USERNAME }}
          password: ${{ env.REGISTRY_PASSWORD }}
      - name: Build image
        id: build-image
        env:
          REGISTRY_REPOSITORY: ${{ env.REGISTRY_REPOSITORY }}
          REGISTRY_CONTAINER_NAME: ${{ env.REGISTRY_CONTAINER_NAME }}
          SHA_TAG: ${{ steps.meta.outputs.sha }}
          PUSH_IMAGE: ${{ steps.meta.outputs.allow_push == 'true' && env.REGISTRY_HOST != '' && env.REGISTRY_USERNAME != '' && env.REGISTRY_PASSWORD != '' }}
        run: |
          set -eo pipefail
          LOG_FILE=build.log
          if [ "${PUSH_IMAGE}" = "true" ]; then
            docker buildx build \
              --load \
              --tag "${REGISTRY_REPOSITORY}:latest" \
              --tag "${REGISTRY_REPOSITORY}:${SHA_TAG}" \
              --file Dockerfile \
              . 2>&1 | tee "${LOG_FILE}"
          else
            docker buildx build \
              --load \
              --tag "${REGISTRY_CONTAINER_NAME}:ci" \
              --file Dockerfile \
              . 2>&1 | tee "${LOG_FILE}"
          fi
      - name: Push image
        if: ${{ steps.meta.outputs.allow_push == 'true' }}
        env:
          REGISTRY_REPOSITORY: ${{ env.REGISTRY_REPOSITORY }}
          SHA_TAG: ${{ steps.meta.outputs.sha }}
        run: |
          set -euo pipefail
          if [ -z "${REGISTRY_REPOSITORY}" ]; then
            echo "::error::REGISTRY_REPOSITORY not defined; cannot push image" >&2
            exit 1
          fi
          docker push "${REGISTRY_REPOSITORY}:${SHA_TAG}"
          docker push "${REGISTRY_REPOSITORY}:latest"
      - name: Upload docker build logs
        if: failure()
        uses: actions/upload-artifact@v4
        with:
          name: docker-build-logs
          path: build.log
  deploy:
    needs: build
    if: needs.build.outputs.allow_push == 'true'
    runs-on: ubuntu-latest
    env:
      REGISTRY_URL: ${{ secrets.REGISTRY_URL }}
      REGISTRY_CONTAINER_NAME: calminer
      KUBE_CONFIG: ${{ secrets.KUBE_CONFIG }}
      STAGING_KUBE_CONFIG: ${{ secrets.STAGING_KUBE_CONFIG }}
      PROD_KUBE_CONFIG: ${{ secrets.PROD_KUBE_CONFIG }}
      K8S_DEPLOY_ENABLED: ${{ secrets.K8S_DEPLOY_ENABLED }}
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Resolve registry repository
        run: |
          set -euo pipefail
          if [ -z "${REGISTRY_URL}" ]; then
            echo "::error::REGISTRY_URL secret not configured. Configure it with your Gitea container registry host." >&2
            exit 1
          fi
          registry_host="${REGISTRY_URL#http://}"
          registry_host="${registry_host#https://}"
          registry_host="${registry_host%%/*}"
          registry_host="${registry_host%%:*}"
          registry_repository="${registry_host}/allucanget/${REGISTRY_CONTAINER_NAME}"
          echo "REGISTRY_HOST=${registry_host}" >> "$GITHUB_ENV"
          echo "REGISTRY_REPOSITORY=${registry_repository}" >> "$GITHUB_ENV"
      - name: Report Kubernetes deployment toggle
        run: |
          set -euo pipefail
          enabled="${K8S_DEPLOY_ENABLED:-}"
          if [ "${enabled}" = "true" ]; then
            echo "Kubernetes deployment is enabled for this run."
          else
            echo "::notice::Kubernetes deployment steps are disabled (set secrets.K8S_DEPLOY_ENABLED to 'true' to enable)."
          fi
      - name: Capture commit metadata
        id: commit_meta
        run: |
          set -euo pipefail
          message="$(git log -1 --pretty=%B | tr '\n' ' ')"
          echo "message=$message" >> "$GITHUB_OUTPUT"
      - name: Set up kubectl for staging
        if: env.K8S_DEPLOY_ENABLED == 'true' && contains(steps.commit_meta.outputs.message, '[deploy staging]')
        uses: azure/k8s-set-context@v3
        with:
          method: kubeconfig
          kubeconfig: ${{ env.STAGING_KUBE_CONFIG }}
      - name: Set up kubectl for production
        if: env.K8S_DEPLOY_ENABLED == 'true' && contains(steps.commit_meta.outputs.message, '[deploy production]')
        uses: azure/k8s-set-context@v3
        with:
          method: kubeconfig
          kubeconfig: ${{ env.PROD_KUBE_CONFIG }}
      - name: Deploy to staging
        if: env.K8S_DEPLOY_ENABLED == 'true' && contains(steps.commit_meta.outputs.message, '[deploy staging]')
        run: |
          kubectl set image deployment/calminer-app calminer=${REGISTRY_REPOSITORY}:latest
          kubectl apply -f k8s/configmap.yaml
          kubectl apply -f k8s/secret.yaml
          kubectl rollout status deployment/calminer-app
      - name: Collect staging deployment logs
        if: env.K8S_DEPLOY_ENABLED == 'true' && contains(steps.commit_meta.outputs.message, '[deploy staging]')
        run: |
          mkdir -p logs/deployment/staging
          kubectl get pods -o wide > logs/deployment/staging/pods.txt
          kubectl get deployment calminer-app -o yaml > logs/deployment/staging/deployment.yaml
          kubectl logs deployment/calminer-app --all-containers=true --tail=500 > logs/deployment/staging/calminer-app.log
      - name: Deploy to production
        if: env.K8S_DEPLOY_ENABLED == 'true' && contains(steps.commit_meta.outputs.message, '[deploy production]')
        run: |
          kubectl set image deployment/calminer-app calminer=${REGISTRY_REPOSITORY}:latest
          kubectl apply -f k8s/configmap.yaml
          kubectl apply -f k8s/secret.yaml
          kubectl rollout status deployment/calminer-app
      - name: Collect production deployment logs
        if: env.K8S_DEPLOY_ENABLED == 'true' && contains(steps.commit_meta.outputs.message, '[deploy production]')
        run: |
          mkdir -p logs/deployment/production
          kubectl get pods -o wide > logs/deployment/production/pods.txt
          kubectl get deployment calminer-app -o yaml > logs/deployment/production/deployment.yaml
          kubectl logs deployment/calminer-app --all-containers=true --tail=500 > logs/deployment/production/calminer-app.log
      - name: Upload deployment logs
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: deployment-logs
          path: logs/deployment
          if-no-files-found: ignore


@@ -0,0 +1,44 @@
name: CI - Lint
on:
  workflow_call:
  workflow_dispatch:
jobs:
  lint:
    runs-on: ubuntu-latest
    env:
      APT_CACHER_NG: http://192.168.88.14:3142
    steps:
      - uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.12"
      - name: Configure apt proxy
        run: |
          if [ -n "${APT_CACHER_NG}" ]; then
            echo "Acquire::http::Proxy \"${APT_CACHER_NG}\";" | tee /etc/apt/apt.conf.d/01apt-cacher-ng
          fi
      - name: Install system packages
        run: |
          apt-get update
          apt-get install -y build-essential libpq-dev
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install -r requirements-test.txt
      - name: Run Ruff
        run: ruff check .
      - name: Run Black
        run: black --check .
      - name: Run Bandit
        run: bandit -c pyproject.toml -r tests


@@ -0,0 +1,73 @@
name: CI - Test
on:
  workflow_call:
  workflow_dispatch:
jobs:
  test:
    runs-on: ubuntu-latest
    env:
      APT_CACHER_NG: http://192.168.88.14:3142
      DB_DRIVER: postgresql+psycopg2
      DB_HOST: 192.168.88.35
      DB_NAME: calminer_test
      DB_USER: calminer
      DB_PASSWORD: calminer_password
    services:
      postgres:
        image: postgres:17
        env:
          POSTGRES_USER: ${{ env.DB_USER }}
          POSTGRES_PASSWORD: ${{ env.DB_PASSWORD }}
          POSTGRES_DB: ${{ env.DB_NAME }}
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.12"
      - name: Configure apt proxy
        run: |
          if [ -n "${APT_CACHER_NG}" ]; then
            echo "Acquire::http::Proxy \"${APT_CACHER_NG}\";" | tee /etc/apt/apt.conf.d/01apt-cacher-ng
          fi
      - name: Install system packages
        run: |
          apt-get update
          apt-get install -y build-essential libpq-dev
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install -r requirements-test.txt
      - name: Run tests
        env:
          DATABASE_DRIVER: ${{ env.DB_DRIVER }}
          DATABASE_HOST: postgres
          DATABASE_PORT: 5432
          DATABASE_USER: ${{ env.DB_USER }}
          DATABASE_PASSWORD: ${{ env.DB_PASSWORD }}
          DATABASE_NAME: ${{ env.DB_NAME }}
        run: |
          pytest --cov=. --cov-report=term-missing --cov-report=xml --cov-fail-under=80 --junitxml=pytest-report.xml
      - name: Upload test artifacts
        if: always()
        uses: actions/upload-artifact@v3
        with:
          name: test-artifacts
          path: |
            coverage.xml
            pytest-report.xml

.gitea/workflows/ci.yml

@@ -0,0 +1,30 @@
name: CI
on:
  push:
    branches:
      - main
      - develop
      - v2
  pull_request:
    branches:
      - main
      - develop
  workflow_dispatch:
jobs:
  lint:
    uses: ./.gitea/workflows/ci-lint.yml
    secrets: inherit
  test:
    needs: lint
    uses: ./.gitea/workflows/ci-test.yml
    secrets: inherit
  build:
    needs:
      - lint
      - test
    uses: ./.gitea/workflows/ci-build.yml
    secrets: inherit


@@ -1,212 +0,0 @@
name: CI
on:
push:
branches: [main, develop, v2]
pull_request:
branches: [main, develop]
jobs:
lint:
runs-on: ubuntu-latest
env:
APT_CACHER_NG: http://192.168.88.14:3142
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: "3.12"
# - name: Cache pip dependencies
# uses: actions/cache@v4
# with:
# path: /root/.cache/pip
# key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt', 'requirements-test.txt', 'pyproject.toml') }}
# restore-keys: |
# ${{ runner.os }}-pip-
- name: Configure apt proxy
run: |
if [ -n \"${APT_CACHER_NG}\" ]; then
echo "Acquire::http::Proxy \"${APT_CACHER_NG}\";" | tee /etc/apt/apt.conf.d/01apt-cacher-ng
fi
- name: Install system packages
run: |
apt-get update
apt-get install -y build-essential libpq-dev
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install -r requirements-test.txt
- name: Run Ruff
run: ruff check .
- name: Run Black
run: black --check .
- name: Run bandit
run: bandit -c pyproject.toml -r tests
test:
runs-on: ubuntu-latest
needs: lint
env:
APT_CACHER_NG: http://192.168.88.14:3142
DB_DRIVER: postgresql+psycopg2
DB_HOST: 192.168.88.35
DB_NAME: calminer_test
DB_USER: calminer
DB_PASSWORD: calminer_password
services:
postgres:
image: postgres:17
env:
POSTGRES_USER: ${{ env.DB_USER }}
POSTGRES_PASSWORD: ${{ env.DB_PASSWORD }}
POSTGRES_DB: ${{ env.DB_NAME }}
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: "3.12"
- name: Get pip cache dir
id: pip-cache
run: |
echo \"path=$(pip cache dir)\" >> $GITEA_OUTPUT
echo \"Pip cache dir: $(pip cache dir)\"
# - name: Cache pip dependencies
# uses: actions/cache@v4
# with:
# path: /root/.cache/pip
# key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt', 'requirements-test.txt', 'pyproject.toml') }}
# restore-keys: |
# ${{ runner.os }}-pip-
- name: Configure apt proxy
run: |
if [ -n \"${APT_CACHER_NG}\" ]; then
echo "Acquire::http::Proxy \"${APT_CACHER_NG}\";" | tee /etc/apt/apt.conf.d/01apt-cacher-ng
fi
- name: Install system packages
run: |
apt-get update
apt-get install -y build-essential libpq-dev
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install -r requirements-test.txt
- name: Run tests
env:
DATABASE_DRIVER: ${{ env.DB_DRIVER }}
DATABASE_HOST: postgres
DATABASE_PORT: 5432
DATABASE_USER: ${{ env.DB_USER }}
DATABASE_PASSWORD: ${{ env.DB_PASSWORD }}
DATABASE_NAME: ${{ env.DB_NAME }}
run: |
pytest --cov=. --cov-report=term-missing --cov-report=xml --junitxml=pytest-report.xml
- name: Upload test artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: test-artifacts
path: |
coverage.xml
pytest-report.xml
build:
runs-on: ubuntu-latest
needs:
- lint
- test
env:
DEFAULT_BRANCH: main
REGISTRY_URL: ${{ secrets.REGISTRY_URL }}
REGISTRY_USERNAME: ${{ secrets.REGISTRY_USERNAME }}
REGISTRY_PASSWORD: ${{ secrets.REGISTRY_PASSWORD }}
REGISTRY_CONTAINER_NAME: calminer
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Collect workflow metadata
id: meta
shell: bash
run: |
ref_name=\"${GITHUB_REF_NAME:-${GITHUB_REF##*/}}\"
event_name=\"${GITHUB_EVENT_NAME:-}\"
sha=\"${GITHUB_SHA:-}\"
if [ \"$ref_name\" = \"${DEFAULT_BRANCH:-main}\" ]; then
echo \"on_default=true\" >> \"$GITHUB_OUTPUT\"
else
echo \"on_default=false\" >> \"$GITHUB_OUTPUT\"
fi
echo \"ref_name=$ref_name\" >> \"$GITHUB_OUTPUT\"
echo \"event_name=$event_name\" >> \"$GITHUB_OUTPUT\"
echo \"sha=$sha\" >> \"$GITHUB_OUTPUT\"
- name: Set up QEMU and Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to gitea registry
if: ${{ steps.meta.outputs.on_default == 'true' }}
uses: docker/login-action@v3
continue-on-error: true
with:
registry: ${{ env.REGISTRY_URL }}
username: ${{ env.REGISTRY_USERNAME }}
password: ${{ env.REGISTRY_PASSWORD }}
- name: Build image
id: build-image
env:
REGISTRY_URL: ${{ env.REGISTRY_URL }}
REGISTRY_CONTAINER_NAME: ${{ env.REGISTRY_CONTAINER_NAME }}
SHA_TAG: ${{ steps.meta.outputs.sha }}
PUSH_IMAGE: ${{ steps.meta.outputs.on_default == 'true' && steps.meta.outputs.event_name != 'pull_request' && env.REGISTRY_URL != '' && env.REGISTRY_USERNAME != '' && env.REGISTRY_PASSWORD != '' }}
run: |
set -eo pipefail
LOG_FILE=build.log
if [ "${PUSH_IMAGE}" = "true" ]; then
docker buildx build \
--push \
--tag "${REGISTRY_URL}/allucanget/${REGISTRY_CONTAINER_NAME}:latest" \
--tag "${REGISTRY_URL}/allucanget/${REGISTRY_CONTAINER_NAME}:${SHA_TAG}" \
--file Dockerfile \
. 2>&1 | tee "${LOG_FILE}"
else
docker buildx build \
--load \
--tag "${REGISTRY_CONTAINER_NAME}:ci" \
--file Dockerfile \
. 2>&1 | tee "${LOG_FILE}"
fi
- name: Upload docker build logs
if: failure()
uses: actions/upload-artifact@v4
with:
name: docker-build-logs
path: build.log


@@ -1,298 +0,0 @@
name: CI
on:
push:
branches: [main, develop, v2]
pull_request:
branches: [main, develop]
jobs:
lint:
runs-on: ubuntu-latest
env:
APT_CACHER_NG: http://192.168.88.14:3142
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: "3.12"
# - name: Get pip cache dir
# id: pip-cache
# run: |
# echo "path=$(pip cache dir)" >> $GITEA_OUTPUT
# echo "Pip cache dir: $(pip cache dir)"
# - name: Cache pip dependencies
# uses: actions/cache@v4
# with:
# path: ${{ steps.pip-cache.outputs.path }}
# key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt', 'requirements-test.txt', 'pyproject.toml') }}
# restore-keys: |
# ${{ runner.os }}-pip-
- name: Configure apt proxy
run: |
if [ -n "${APT_CACHER_NG}" ]; then
echo "Acquire::http::Proxy \"${APT_CACHER_NG}\";" | tee /etc/apt/apt.conf.d/01apt-cacher-ng
fi
- name: Install system packages
run: |
apt-get update
apt-get install -y build-essential libpq-dev
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install -r requirements-test.txt
- name: Run Ruff
run: ruff check .
- name: Run Black
run: black --check .
- name: Run Bandit
run: bandit -c pyproject.toml -r tests
test:
runs-on: ubuntu-latest
needs: lint
env:
APT_CACHER_NG: http://192.168.88.14:3142
DB_DRIVER: postgresql+psycopg2
DB_HOST: 192.168.88.35
DB_NAME: calminer_test
DB_USER: calminer
DB_PASSWORD: calminer_password
services:
postgres:
image: postgres:17
env:
POSTGRES_USER: ${{ env.DB_USER }}
POSTGRES_PASSWORD: ${{ env.DB_PASSWORD }}
POSTGRES_DB: ${{ env.DB_NAME }}
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: "3.12"
# - name: Get pip cache dir
# id: pip-cache
# run: |
# echo "path=$(pip cache dir)" >> $GITEA_OUTPUT
# echo "Pip cache dir: $(pip cache dir)"
# - name: Cache pip dependencies
# uses: actions/cache@v4
# with:
# path: ${{ steps.pip-cache.outputs.path }}
# key: ${{ runner.os }}-pip-${{ hashFiles('requirements.txt', 'requirements-test.txt', 'pyproject.toml') }}
# restore-keys: |
# ${{ runner.os }}-pip-
- name: Configure apt proxy
run: |
if [ -n "${APT_CACHER_NG}" ]; then
echo "Acquire::http::Proxy \"${APT_CACHER_NG}\";" | tee /etc/apt/apt.conf.d/01apt-cacher-ng
fi
- name: Install system packages
run: |
apt-get update
apt-get install -y build-essential libpq-dev
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install -r requirements-test.txt
- name: Install Playwright system dependencies
run: playwright install-deps
- name: Install Playwright browsers
run: playwright install
- name: Run tests
env:
DATABASE_DRIVER: ${{ env.DB_DRIVER }}
DATABASE_HOST: postgres
DATABASE_PORT: 5432
DATABASE_USER: ${{ env.DB_USER }}
DATABASE_PASSWORD: ${{ env.DB_PASSWORD }}
DATABASE_NAME: ${{ env.DB_NAME }}
run: |
pytest --cov=. --cov-report=term-missing --cov-report=xml --cov-fail-under=80 --junitxml=pytest-report.xml
- name: Upload test artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: test-artifacts
path: |
coverage.xml
pytest-report.xml
build:
runs-on: ubuntu-latest
needs:
- lint
- test
env:
DEFAULT_BRANCH: main
REGISTRY_URL: ${{ secrets.REGISTRY_URL }}
REGISTRY_USERNAME: ${{ secrets.REGISTRY_USERNAME }}
REGISTRY_PASSWORD: ${{ secrets.REGISTRY_PASSWORD }}
REGISTRY_CONTAINER_NAME: calminer
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Collect workflow metadata
id: meta
shell: bash
run: |
ref_name="${GITHUB_REF_NAME:-${GITHUB_REF##*/}}"
event_name="${GITHUB_EVENT_NAME:-}"
sha="${GITHUB_SHA:-}"
if [ "$ref_name" = "${DEFAULT_BRANCH:-main}" ]; then
echo "on_default=true" >> "$GITHUB_OUTPUT"
else
echo "on_default=false" >> "$GITHUB_OUTPUT"
fi
echo "ref_name=$ref_name" >> "$GITHUB_OUTPUT"
echo "event_name=$event_name" >> "$GITHUB_OUTPUT"
echo "sha=$sha" >> "$GITHUB_OUTPUT"
- name: Set up QEMU and Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to gitea registry
if: ${{ steps.meta.outputs.on_default == 'true' }}
uses: docker/login-action@v3
continue-on-error: true
with:
registry: ${{ env.REGISTRY_URL }}
username: ${{ env.REGISTRY_USERNAME }}
password: ${{ env.REGISTRY_PASSWORD }}
- name: Build image
id: build-image
env:
REGISTRY_URL: ${{ env.REGISTRY_URL }}
REGISTRY_CONTAINER_NAME: ${{ env.REGISTRY_CONTAINER_NAME }}
SHA_TAG: ${{ steps.meta.outputs.sha }}
PUSH_IMAGE: ${{ steps.meta.outputs.on_default == 'true' && steps.meta.outputs.event_name != 'pull_request' && env.REGISTRY_URL != '' && env.REGISTRY_USERNAME != '' && env.REGISTRY_PASSWORD != '' }}
run: |
set -eo pipefail
LOG_FILE=build.log
if [ "${PUSH_IMAGE}" = "true" ]; then
docker buildx build \
--push \
--tag "${REGISTRY_URL}/allucanget/${REGISTRY_CONTAINER_NAME}:latest" \
--tag "${REGISTRY_URL}/allucanget/${REGISTRY_CONTAINER_NAME}:${SHA_TAG}" \
--file Dockerfile \
. 2>&1 | tee "${LOG_FILE}"
else
docker buildx build \
--load \
--tag "${REGISTRY_CONTAINER_NAME}:ci" \
--file Dockerfile \
. 2>&1 | tee "${LOG_FILE}"
fi
- name: Upload docker build logs
if: failure()
uses: actions/upload-artifact@v4
with:
name: docker-build-logs
path: build.log
deploy:
runs-on: ubuntu-latest
needs: build
if: github.ref == 'refs/heads/main' && github.event_name != 'pull_request'
env:
REGISTRY_URL: ${{ secrets.REGISTRY_URL }}
REGISTRY_CONTAINER_NAME: calminer
KUBE_CONFIG: ${{ secrets.KUBE_CONFIG }}
STAGING_KUBE_CONFIG: ${{ secrets.STAGING_KUBE_CONFIG }}
PROD_KUBE_CONFIG: ${{ secrets.PROD_KUBE_CONFIG }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Set up kubectl for staging
if: github.ref == 'refs/heads/main' && contains(github.event.head_commit.message, '[deploy staging]')
uses: azure/k8s-set-context@v3
with:
method: kubeconfig
kubeconfig: ${{ env.STAGING_KUBE_CONFIG }}
- name: Set up kubectl for production
if: github.ref == 'refs/heads/main' && contains(github.event.head_commit.message, '[deploy production]')
uses: azure/k8s-set-context@v3
with:
method: kubeconfig
kubeconfig: ${{ env.PROD_KUBE_CONFIG }}
- name: Deploy to staging
if: github.ref == 'refs/heads/main' && contains(github.event.head_commit.message, '[deploy staging]')
run: |
# Update image in deployment
kubectl set image deployment/calminer-app calminer=${REGISTRY_URL}/allucanget/${REGISTRY_CONTAINER_NAME}:latest
# Apply any config changes
kubectl apply -f k8s/configmap.yaml
kubectl apply -f k8s/secret.yaml
# Wait for rollout
kubectl rollout status deployment/calminer-app
- name: Collect staging deployment logs
if: github.ref == 'refs/heads/main' && contains(github.event.head_commit.message, '[deploy staging]')
run: |
mkdir -p logs/deployment/staging
kubectl get pods -o wide > logs/deployment/staging/pods.txt
kubectl get deployment calminer-app -o yaml > logs/deployment/staging/deployment.yaml
kubectl logs deployment/calminer-app --all-containers=true --tail=500 > logs/deployment/staging/calminer-app.log
- name: Deploy to production
if: github.ref == 'refs/heads/main' && contains(github.event.head_commit.message, '[deploy production]')
run: |
# Update image in deployment
kubectl set image deployment/calminer-app calminer=${REGISTRY_URL}/allucanget/${REGISTRY_CONTAINER_NAME}:latest
# Apply any config changes
kubectl apply -f k8s/configmap.yaml
kubectl apply -f k8s/secret.yaml
# Wait for rollout
kubectl rollout status deployment/calminer-app
- name: Collect production deployment logs
if: github.ref == 'refs/heads/main' && contains(github.event.head_commit.message, '[deploy production]')
run: |
mkdir -p logs/deployment/production
kubectl get pods -o wide > logs/deployment/production/pods.txt
kubectl get deployment calminer-app -o yaml > logs/deployment/production/deployment.yaml
kubectl logs deployment/calminer-app --all-containers=true --tail=500 > logs/deployment/production/calminer-app.log
- name: Upload deployment logs
if: always()
uses: actions/upload-artifact@v4
with:
name: deployment-logs
path: logs/deployment
if-no-files-found: ignore


@@ -0,0 +1,78 @@
name: Deploy - Coolify
on:
push:
branches:
- main
workflow_dispatch:
jobs:
deploy:
runs-on: ubuntu-latest
env:
COOLIFY_BASE_URL: ${{ secrets.COOLIFY_BASE_URL }}
COOLIFY_API_TOKEN: ${{ secrets.COOLIFY_API_TOKEN }}
COOLIFY_APPLICATION_ID: ${{ secrets.COOLIFY_APPLICATION_ID }}
COOLIFY_DEPLOY_ENV: ${{ secrets.COOLIFY_DEPLOY_ENV }}
DOCKER_COMPOSE_PATH: docker-compose.prod.yml
ENV_FILE_PATH: deploy/.env
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Prepare compose bundle
run: |
set -euo pipefail
mkdir -p deploy
cp "$DOCKER_COMPOSE_PATH" deploy/docker-compose.yml
if [ -n "$COOLIFY_DEPLOY_ENV" ]; then
printf '%s\n' "$COOLIFY_DEPLOY_ENV" > "$ENV_FILE_PATH"
elif [ ! -f "$ENV_FILE_PATH" ]; then
echo "::error::COOLIFY_DEPLOY_ENV secret not configured and deploy/.env missing" >&2
exit 1
fi
- name: Validate Coolify secrets
run: |
set -euo pipefail
missing=0
for var in COOLIFY_BASE_URL COOLIFY_API_TOKEN COOLIFY_APPLICATION_ID; do
if [ -z "${!var}" ]; then
echo "::error::Missing required secret: $var"
missing=1
fi
done
if [ "$missing" -eq 1 ]; then
exit 1
fi
- name: Trigger deployment via Coolify API
run: |
set -euo pipefail
api_url="$COOLIFY_BASE_URL/api/v1/deploy"
payload=$(jq -n --arg uuid "$COOLIFY_APPLICATION_ID" '{ uuid: $uuid }')
response=$(curl -sS -w '\n%{http_code}' \
-X POST "$api_url" \
-H "Authorization: Bearer $COOLIFY_API_TOKEN" \
-H "Content-Type: application/json" \
-d "$payload")
body=$(echo "$response" | head -n -1)
status=$(echo "$response" | tail -n1)
echo "Deploy response status: $status"
echo "$body"
printf '%s' "$body" > deploy/coolify-response.json
if [ "$status" -ge 400 ]; then
echo "::error::Deployment request failed"
exit 1
fi
- name: Upload deployment bundle
if: always()
uses: actions/upload-artifact@v3
with:
name: coolify-deploy-bundle
path: |
deploy/docker-compose.yml
deploy/.env
deploy/coolify-response.json
if-no-files-found: warn

.gitignore

@@ -47,8 +47,14 @@ htmlcov/
logs/
# SQLite database
data/
*.sqlite3
test*.db
local*.db
# Act runner files
.runner
# Devcontainer files
.devcontainer/devcontainer.json
.devcontainer/docker-compose.yml

.pre-commit-config.yaml

@@ -0,0 +1,13 @@
repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.6.1
hooks:
- id: ruff
- repo: https://github.com/psf/black-pre-commit-mirror
rev: 24.8.0
hooks:
- id: black
- repo: https://github.com/PyCQA/bandit
rev: 1.7.9
hooks:
- id: bandit


@@ -41,8 +41,25 @@ if url:
finally:
sock.close()
PY
APT_PROXY_CONFIG=/etc/apt/apt.conf.d/01proxy
apt_update_with_fallback() {
if ! apt-get update; then
rm -f "$APT_PROXY_CONFIG"
apt-get update
fi
}
apt_install_with_fallback() {
if ! apt-get install -y --no-install-recommends "$@"; then
rm -f "$APT_PROXY_CONFIG"
apt-get update
apt-get install -y --no-install-recommends "$@"
fi
}
apt_update_with_fallback
apt_install_with_fallback build-essential gcc libpq-dev
pip install --upgrade pip
pip wheel --no-deps --wheel-dir /wheels -r requirements.txt
apt-get purge -y --auto-remove build-essential gcc
@@ -88,8 +105,25 @@ if url:
finally:
sock.close()
PY
APT_PROXY_CONFIG=/etc/apt/apt.conf.d/01proxy
apt_update_with_fallback() {
if ! apt-get update; then
rm -f "$APT_PROXY_CONFIG"
apt-get update
fi
}
apt_install_with_fallback() {
if ! apt-get install -y --no-install-recommends "$@"; then
rm -f "$APT_PROXY_CONFIG"
apt-get update
apt-get install -y --no-install-recommends "$@"
fi
}
apt_update_with_fallback
apt_install_with_fallback libpq5
rm -rf /var/lib/apt/lists/*
EOF
@@ -102,13 +136,12 @@ RUN pip install --upgrade pip \
COPY . /app
RUN chown -R appuser:app /app
USER appuser
EXPOSE 8003
ENTRYPOINT ["uvicorn"]
CMD ["main:app", "--host", "0.0.0.0", "--port", "8003", "--workers", "4"]


@@ -8,4 +8,6 @@ The system is designed to help mining companies make informed decisions by simul
## Documentation & quickstart
- Detailed developer, architecture, and operations guides live in the companion [calminer-docs](../calminer-docs/) repository. Please see the [README](../calminer-docs/README.md) there for instructions.
- For a local run, create a `.env` (see `.env.example`), install requirements, then execute `python -m scripts.init_db` followed by `uvicorn main:app --reload`. The initializer is safe to rerun and seeds demo data automatically.
- To wipe and recreate the schema in development, run `CALMINER_ENV=development python -m scripts.reset_db` before invoking the initializer again.
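The quickstart bullets above can be collected into one shell sequence. This is a sketch based on the commands the README lists; the `.env` contents and database settings are placeholders for your own environment:

```shell
# Local quickstart sketch (commands taken from the README bullets above).
cp .env.example .env                       # fill in DATABASE_* values for your setup
python -m pip install -r requirements.txt  # application dependencies
python -m scripts.init_db                  # idempotent: safe to rerun, seeds demo data
uvicorn main:app --reload                  # development server with auto-reload
```

To start from a clean schema in development, run `CALMINER_ENV=development python -m scripts.reset_db` before `init_db`, as noted above.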


@@ -1,35 +0,0 @@
[alembic]
script_location = alembic
sqlalchemy.url = %(DATABASE_URL)s
[loggers]
keys = root,sqlalchemy,alembic
[handlers]
keys = console
[formatters]
keys = generic
[logger_root]
level = WARN
handlers = console
[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine
[logger_alembic]
level = INFO
handlers =
qualname = alembic
[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic
[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s


@@ -1,62 +0,0 @@
from __future__ import annotations
from logging.config import fileConfig
from alembic import context
from sqlalchemy import engine_from_config, pool
from config.database import Base, DATABASE_URL
from models import * # noqa: F401,F403 - ensure models are imported for metadata registration
# this is the Alembic Config object, which provides access to the values within the .ini file.
config = context.config
if config.config_file_name is not None:
fileConfig(config.config_file_name)
# Interpret the config file for Python logging.
# This line sets up loggers basically.
config.set_main_option("sqlalchemy.url", DATABASE_URL)
target_metadata = Base.metadata
def run_migrations_offline() -> None:
"""Run migrations in 'offline' mode."""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
)
with context.begin_transaction():
context.run_migrations()
def run_migrations_online() -> None:
"""Run migrations in 'online' mode."""
connectable = engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)
with connectable.connect() as connection:
context.configure(connection=connection, target_metadata=target_metadata)
with context.begin_transaction():
context.run_migrations()
def run_migrations() -> None:
if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()
run_migrations()


@@ -1,17 +0,0 @@
"""${message}"""
revision = ${repr(revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}
from alembic import op
import sqlalchemy as sa
def upgrade() -> None:
${upgrades if upgrades else "pass"}
def downgrade() -> None:
${downgrades if downgrades else "pass"}


@@ -1,718 +0,0 @@
"""Combined initial schema"""
from __future__ import annotations
from datetime import datetime, timezone
from alembic import op
import sqlalchemy as sa
from passlib.context import CryptContext
from sqlalchemy.sql import column, table
# revision identifiers, used by Alembic.
revision = "20251111_00"
down_revision = None
branch_labels = None
depends_on = None
password_context = CryptContext(schemes=["argon2"], deprecated="auto")
mining_operation_type = sa.Enum(
"open_pit",
"underground",
"in_situ_leach",
"placer",
"quarry",
"mountaintop_removal",
"other",
name="miningoperationtype",
)
scenario_status = sa.Enum(
"draft",
"active",
"archived",
name="scenariostatus",
)
financial_category = sa.Enum(
"capex",
"opex",
"revenue",
"contingency",
"other",
name="financialcategory",
)
cost_bucket = sa.Enum(
"capital_initial",
"capital_sustaining",
"operating_fixed",
"operating_variable",
"maintenance",
"reclamation",
"royalties",
"general_admin",
name="costbucket",
)
distribution_type = sa.Enum(
"normal",
"triangular",
"uniform",
"lognormal",
"custom",
name="distributiontype",
)
stochastic_variable = sa.Enum(
"ore_grade",
"recovery_rate",
"metal_price",
"operating_cost",
"capital_cost",
"discount_rate",
"throughput",
name="stochasticvariable",
)
resource_type = sa.Enum(
"diesel",
"electricity",
"water",
"explosives",
"reagents",
"labor",
"equipment_hours",
"tailings_capacity",
name="resourcetype",
)
DEFAULT_PRICING_SLUG = "default"
def _ensure_default_pricing_settings(connection) -> int:
settings_table = table(
"pricing_settings",
column("id", sa.Integer()),
column("slug", sa.String()),
column("name", sa.String()),
column("description", sa.Text()),
column("default_currency", sa.String()),
column("default_payable_pct", sa.Numeric()),
column("moisture_threshold_pct", sa.Numeric()),
column("moisture_penalty_per_pct", sa.Numeric()),
column("created_at", sa.DateTime(timezone=True)),
column("updated_at", sa.DateTime(timezone=True)),
)
existing = connection.execute(
sa.select(settings_table.c.id).where(
settings_table.c.slug == DEFAULT_PRICING_SLUG
)
).scalar_one_or_none()
if existing is not None:
return existing
now = datetime.now(timezone.utc)
insert_stmt = settings_table.insert().values(
slug=DEFAULT_PRICING_SLUG,
name="Default Pricing",
description="Automatically generated default pricing settings.",
default_currency="USD",
default_payable_pct=100.0,
moisture_threshold_pct=8.0,
moisture_penalty_per_pct=0.0,
created_at=now,
updated_at=now,
)
result = connection.execute(insert_stmt)
default_id = result.inserted_primary_key[0]
if default_id is None:
default_id = connection.execute(
sa.select(settings_table.c.id).where(
settings_table.c.slug == DEFAULT_PRICING_SLUG
)
).scalar_one()
return default_id
def upgrade() -> None:
bind = op.get_bind()
# Enumerations
mining_operation_type.create(bind, checkfirst=True)
scenario_status.create(bind, checkfirst=True)
financial_category.create(bind, checkfirst=True)
cost_bucket.create(bind, checkfirst=True)
distribution_type.create(bind, checkfirst=True)
stochastic_variable.create(bind, checkfirst=True)
resource_type.create(bind, checkfirst=True)
# Pricing settings core tables
op.create_table(
"pricing_settings",
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column("name", sa.String(length=128), nullable=False),
sa.Column("slug", sa.String(length=64), nullable=False),
sa.Column("description", sa.Text(), nullable=True),
sa.Column("default_currency", sa.String(length=3), nullable=True),
sa.Column(
"default_payable_pct",
sa.Numeric(precision=5, scale=2),
nullable=False,
server_default=sa.text("100.00"),
),
sa.Column(
"moisture_threshold_pct",
sa.Numeric(precision=5, scale=2),
nullable=False,
server_default=sa.text("8.00"),
),
sa.Column(
"moisture_penalty_per_pct",
sa.Numeric(precision=14, scale=4),
nullable=False,
server_default=sa.text("0.0000"),
),
sa.Column("metadata", sa.JSON(), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.UniqueConstraint("name", name="uq_pricing_settings_name"),
sa.UniqueConstraint("slug", name="uq_pricing_settings_slug"),
)
op.create_index(
op.f("ix_pricing_settings_id"),
"pricing_settings",
["id"],
unique=False,
)
op.create_table(
"pricing_metal_settings",
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column(
"pricing_settings_id",
sa.Integer(),
sa.ForeignKey("pricing_settings.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column("metal_code", sa.String(length=32), nullable=False),
sa.Column("payable_pct", sa.Numeric(
precision=5, scale=2), nullable=True),
sa.Column(
"moisture_threshold_pct",
sa.Numeric(precision=5, scale=2),
nullable=True,
),
sa.Column(
"moisture_penalty_per_pct",
sa.Numeric(precision=14, scale=4),
nullable=True,
),
sa.Column("data", sa.JSON(), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.UniqueConstraint(
"pricing_settings_id",
"metal_code",
name="uq_pricing_metal_settings_code",
),
)
op.create_index(
op.f("ix_pricing_metal_settings_id"),
"pricing_metal_settings",
["id"],
unique=False,
)
op.create_index(
op.f("ix_pricing_metal_settings_pricing_settings_id"),
"pricing_metal_settings",
["pricing_settings_id"],
unique=False,
)
op.create_table(
"pricing_impurity_settings",
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column(
"pricing_settings_id",
sa.Integer(),
sa.ForeignKey("pricing_settings.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column("impurity_code", sa.String(length=32), nullable=False),
sa.Column(
"threshold_ppm",
sa.Numeric(precision=14, scale=4),
nullable=False,
server_default=sa.text("0.0000"),
),
sa.Column(
"penalty_per_ppm",
sa.Numeric(precision=14, scale=4),
nullable=False,
server_default=sa.text("0.0000"),
),
sa.Column("notes", sa.Text(), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.UniqueConstraint(
"pricing_settings_id",
"impurity_code",
name="uq_pricing_impurity_settings_code",
),
)
op.create_index(
op.f("ix_pricing_impurity_settings_id"),
"pricing_impurity_settings",
["id"],
unique=False,
)
op.create_index(
op.f("ix_pricing_impurity_settings_pricing_settings_id"),
"pricing_impurity_settings",
["pricing_settings_id"],
unique=False,
)
# Core domain tables
op.create_table(
"projects",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("name", sa.String(length=255), nullable=False),
sa.Column("location", sa.String(length=255), nullable=True),
sa.Column("operation_type", mining_operation_type, nullable=False),
sa.Column("description", sa.Text(), nullable=True),
sa.Column(
"pricing_settings_id",
sa.Integer(),
sa.ForeignKey("pricing_settings.id", ondelete="SET NULL"),
nullable=True,
),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("name"),
)
op.create_index(op.f("ix_projects_id"), "projects", ["id"], unique=False)
op.create_index(
"ix_projects_pricing_settings_id",
"projects",
["pricing_settings_id"],
unique=False,
)
op.create_table(
"scenarios",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("project_id", sa.Integer(), nullable=False),
sa.Column("name", sa.String(length=255), nullable=False),
sa.Column("description", sa.Text(), nullable=True),
sa.Column("status", scenario_status, nullable=False),
sa.Column("start_date", sa.Date(), nullable=True),
sa.Column("end_date", sa.Date(), nullable=True),
sa.Column("discount_rate", sa.Numeric(
precision=5, scale=2), nullable=True),
sa.Column("currency", sa.String(length=3), nullable=True),
sa.Column("primary_resource", resource_type, nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.ForeignKeyConstraint(
["project_id"], ["projects.id"], ondelete="CASCADE"),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(op.f("ix_scenarios_id"), "scenarios", ["id"], unique=False)
op.create_index(
op.f("ix_scenarios_project_id"),
"scenarios",
["project_id"],
unique=False,
)
op.create_table(
"financial_inputs",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("scenario_id", sa.Integer(), nullable=False),
sa.Column("name", sa.String(length=255), nullable=False),
sa.Column("category", financial_category, nullable=False),
sa.Column("cost_bucket", cost_bucket, nullable=True),
sa.Column("amount", sa.Numeric(precision=18, scale=2), nullable=False),
sa.Column("currency", sa.String(length=3), nullable=True),
sa.Column("effective_date", sa.Date(), nullable=True),
sa.Column("notes", sa.Text(), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.ForeignKeyConstraint(
["scenario_id"], ["scenarios.id"], ondelete="CASCADE"),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
op.f("ix_financial_inputs_id"),
"financial_inputs",
["id"],
unique=False,
)
op.create_index(
op.f("ix_financial_inputs_scenario_id"),
"financial_inputs",
["scenario_id"],
unique=False,
)
op.create_table(
"simulation_parameters",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("scenario_id", sa.Integer(), nullable=False),
sa.Column("name", sa.String(length=255), nullable=False),
sa.Column("distribution", distribution_type, nullable=False),
sa.Column("variable", stochastic_variable, nullable=True),
sa.Column("resource_type", resource_type, nullable=True),
sa.Column("mean_value", sa.Numeric(
precision=18, scale=4), nullable=True),
sa.Column(
"standard_deviation",
sa.Numeric(precision=18, scale=4),
nullable=True,
),
sa.Column(
"minimum_value",
sa.Numeric(precision=18, scale=4),
nullable=True,
),
sa.Column(
"maximum_value",
sa.Numeric(precision=18, scale=4),
nullable=True,
),
sa.Column("unit", sa.String(length=32), nullable=True),
sa.Column("configuration", sa.JSON(), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.ForeignKeyConstraint(
["scenario_id"], ["scenarios.id"], ondelete="CASCADE"),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
op.f("ix_simulation_parameters_id"),
"simulation_parameters",
["id"],
unique=False,
)
op.create_index(
op.f("ix_simulation_parameters_scenario_id"),
"simulation_parameters",
["scenario_id"],
unique=False,
)
# Authentication and RBAC tables
op.create_table(
"users",
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column("email", sa.String(length=255), nullable=False),
sa.Column("username", sa.String(length=128), nullable=False),
sa.Column("password_hash", sa.String(length=255), nullable=False),
sa.Column("is_active", sa.Boolean(),
nullable=False, server_default=sa.true()),
sa.Column(
"is_superuser",
sa.Boolean(),
nullable=False,
server_default=sa.false(),
),
sa.Column("last_login_at", sa.DateTime(timezone=True), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.UniqueConstraint("email", name="uq_users_email"),
sa.UniqueConstraint("username", name="uq_users_username"),
)
op.create_index(
"ix_users_active_superuser",
"users",
["is_active", "is_superuser"],
unique=False,
)
op.create_table(
"roles",
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column("name", sa.String(length=64), nullable=False),
sa.Column("display_name", sa.String(length=128), nullable=False),
sa.Column("description", sa.Text(), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.UniqueConstraint("name", name="uq_roles_name"),
)
op.create_table(
"user_roles",
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("role_id", sa.Integer(), nullable=False),
sa.Column(
"granted_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.Column("granted_by", sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(["user_id"], ["users.id"], ondelete="CASCADE"),
sa.ForeignKeyConstraint(["role_id"], ["roles.id"], ondelete="CASCADE"),
sa.ForeignKeyConstraint(
["granted_by"], ["users.id"], ondelete="SET NULL"),
sa.PrimaryKeyConstraint("user_id", "role_id"),
sa.UniqueConstraint("user_id", "role_id",
name="uq_user_roles_user_role"),
)
op.create_index(
"ix_user_roles_role_id",
"user_roles",
["role_id"],
unique=False,
)
# Seed roles and default admin
roles_table = table(
"roles",
column("id", sa.Integer()),
column("name", sa.String()),
column("display_name", sa.String()),
column("description", sa.Text()),
)
op.bulk_insert(
roles_table,
[
{
"id": 1,
"name": "admin",
"display_name": "Administrator",
"description": "Full platform access with user management rights.",
},
{
"id": 2,
"name": "project_manager",
"display_name": "Project Manager",
"description": "Manage projects, scenarios, and associated data.",
},
{
"id": 3,
"name": "analyst",
"display_name": "Analyst",
"description": "Review dashboards and scenario outputs.",
},
{
"id": 4,
"name": "viewer",
"display_name": "Viewer",
"description": "Read-only access to assigned projects and reports.",
},
],
)
admin_password_hash = password_context.hash("ChangeMe123!")
users_table = table(
"users",
column("id", sa.Integer()),
column("email", sa.String()),
column("username", sa.String()),
column("password_hash", sa.String()),
column("is_active", sa.Boolean()),
column("is_superuser", sa.Boolean()),
)
op.bulk_insert(
users_table,
[
{
"id": 1,
"email": "admin@calminer.local",
"username": "admin",
"password_hash": admin_password_hash,
"is_active": True,
"is_superuser": True,
}
],
)
user_roles_table = table(
"user_roles",
column("user_id", sa.Integer()),
column("role_id", sa.Integer()),
column("granted_by", sa.Integer()),
)
op.bulk_insert(
user_roles_table,
[
{
"user_id": 1,
"role_id": 1,
"granted_by": 1,
}
],
)
# Ensure a default pricing settings record exists for future project linkage
_ensure_default_pricing_settings(bind)
def downgrade() -> None:
# Drop RBAC
op.drop_index("ix_user_roles_role_id", table_name="user_roles")
op.drop_table("user_roles")
op.drop_table("roles")
op.drop_index("ix_users_active_superuser", table_name="users")
op.drop_table("users")
# Drop domain tables
op.drop_index(
op.f("ix_simulation_parameters_scenario_id"),
table_name="simulation_parameters",
)
op.drop_index(op.f("ix_simulation_parameters_id"),
table_name="simulation_parameters")
op.drop_table("simulation_parameters")
op.drop_index(
op.f("ix_financial_inputs_scenario_id"), table_name="financial_inputs"
)
op.drop_index(op.f("ix_financial_inputs_id"),
table_name="financial_inputs")
op.drop_table("financial_inputs")
op.drop_index(op.f("ix_scenarios_project_id"), table_name="scenarios")
op.drop_index(op.f("ix_scenarios_id"), table_name="scenarios")
op.drop_table("scenarios")
op.drop_index("ix_projects_pricing_settings_id", table_name="projects")
op.drop_index(op.f("ix_projects_id"), table_name="projects")
op.drop_table("projects")
# Drop pricing settings ancillary tables
op.drop_index(
op.f("ix_pricing_impurity_settings_pricing_settings_id"),
table_name="pricing_impurity_settings",
)
op.drop_index(
op.f("ix_pricing_impurity_settings_id"),
table_name="pricing_impurity_settings",
)
op.drop_table("pricing_impurity_settings")
op.drop_index(
op.f("ix_pricing_metal_settings_pricing_settings_id"),
table_name="pricing_metal_settings",
)
op.drop_index(
op.f("ix_pricing_metal_settings_id"),
table_name="pricing_metal_settings",
)
op.drop_table("pricing_metal_settings")
op.drop_index(op.f("ix_pricing_settings_id"),
table_name="pricing_settings")
op.drop_table("pricing_settings")
# Drop enumerations
resource_type.drop(op.get_bind(), checkfirst=True)
stochastic_variable.drop(op.get_bind(), checkfirst=True)
distribution_type.drop(op.get_bind(), checkfirst=True)
cost_bucket.drop(op.get_bind(), checkfirst=True)
financial_category.drop(op.get_bind(), checkfirst=True)
scenario_status.drop(op.get_bind(), checkfirst=True)
mining_operation_type.drop(op.get_bind(), checkfirst=True)


@@ -1,38 +0,0 @@
"""Add performance_metrics table"""
from __future__ import annotations
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "20251111_01"
down_revision = "20251111_00"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"performance_metrics",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("timestamp", sa.DateTime(), nullable=True),
sa.Column("metric_name", sa.String(), nullable=True),
sa.Column("value", sa.Float(), nullable=True),
sa.Column("labels", sa.String(), nullable=True),
sa.Column("endpoint", sa.String(), nullable=True),
sa.Column("method", sa.String(), nullable=True),
sa.Column("status_code", sa.Integer(), nullable=True),
sa.Column("duration_seconds", sa.Float(), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(op.f("ix_performance_metrics_timestamp"), "performance_metrics", ["timestamp"], unique=False)
op.create_index(op.f("ix_performance_metrics_metric_name"), "performance_metrics", ["metric_name"], unique=False)
op.create_index(op.f("ix_performance_metrics_endpoint"), "performance_metrics", ["endpoint"], unique=False)
def downgrade() -> None:
op.drop_index(op.f("ix_performance_metrics_endpoint"), table_name="performance_metrics")
op.drop_index(op.f("ix_performance_metrics_metric_name"), table_name="performance_metrics")
op.drop_index(op.f("ix_performance_metrics_timestamp"), table_name="performance_metrics")
op.drop_table("performance_metrics")


@@ -1,134 +0,0 @@
"""Add metadata columns to roles table"""
from __future__ import annotations
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "20251112_00_add_roles_metadata_columns"
down_revision = "20251111_01"
branch_labels = None
depends_on = None
ROLE_BACKFILL = (
("admin", "Administrator", "Full platform access with user management rights."),
(
"project_manager",
"Project Manager",
"Manage projects, scenarios, and associated data.",
),
("analyst", "Analyst", "Review dashboards and scenario outputs."),
(
"viewer",
"Viewer",
"Read-only access to assigned projects and reports.",
),
)
def upgrade() -> None:
op.add_column(
"roles",
sa.Column("display_name", sa.String(length=128), nullable=True),
)
op.add_column(
"roles",
sa.Column("description", sa.Text(), nullable=True),
)
op.add_column(
"roles",
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=True,
server_default=sa.text("timezone('UTC', now())"),
),
)
op.add_column(
"roles",
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
nullable=True,
server_default=sa.text("timezone('UTC', now())"),
),
)
connection = op.get_bind()
for name, display_name, description in ROLE_BACKFILL:
connection.execute(
sa.text(
"""
UPDATE roles
SET display_name = :display_name,
description = COALESCE(description, :description)
WHERE name = :name
AND display_name IS NULL
"""
),
{
"name": name,
"display_name": display_name,
"description": description,
},
)
connection.execute(
sa.text(
"""
UPDATE roles
SET display_name = INITCAP(REPLACE(name, '_', ' '))
WHERE display_name IS NULL
"""
)
)
connection.execute(
sa.text(
"""
UPDATE roles
SET created_at = timezone('UTC', now())
WHERE created_at IS NULL
"""
)
)
connection.execute(
sa.text(
"""
UPDATE roles
SET updated_at = timezone('UTC', now())
WHERE updated_at IS NULL
"""
)
)
op.alter_column(
"roles",
"display_name",
existing_type=sa.String(length=128),
nullable=False,
)
op.alter_column(
"roles",
"created_at",
existing_type=sa.DateTime(timezone=True),
nullable=False,
server_default=sa.text("timezone('UTC', now())"),
)
op.alter_column(
"roles",
"updated_at",
existing_type=sa.DateTime(timezone=True),
nullable=False,
server_default=sa.text("timezone('UTC', now())"),
)
def downgrade() -> None:
op.drop_column("roles", "updated_at")
op.drop_column("roles", "created_at")
op.drop_column("roles", "description")
op.drop_column("roles", "display_name")



@@ -1,83 +1,124 @@
# Changelog

## 2025-11-15

- Fixed dev container setup by reviewing logs, identifying mount errors, implementing fixes, and validating the configuration.

## 2025-11-14

- Completed Coolify deployment automation with workflow and documentation.
- Improved build workflow for registry authentication and tagging.
- Updated production compose and added deployment guidance.
- Added optional Kubernetes deployment toggle.

## 2025-11-13

- Aligned UI styles and ensured accessibility.
- Restructured navigation under project-scenario-calculation hierarchy.
- Reorganized documentation for better structure.
- Refactored navigation sidebar with database-driven data.
- Migrated sidebar rendering to API endpoint.
- Created templates for data import and export.
- Updated relationships for projects, scenarios, and profitability.
- Enhanced scenario frontend templates with project context.
- Scoped profitability calculator to scenario level.
- Added navigation links for opex planner.
- Documented opex planner features.
- Integrated opex calculations with persistence and tests.
- Implemented capex calculations end-to-end.
- Added basic profitability calculations.
- Developed reporting endpoints and templates.
- Integrated charting for visualizations.
- Performed manual testing of capex planner.
- Added unit tests for opex service.
- Added integration tests for opex.

## 2025-11-12

- Fixed reporting dashboard error by correcting route reference.
- Completed navigation validation by adding missing routes and templates for various pages.
- Fixed template rendering error with URL objects.
- Integrated charting for interactive visualizations.
- Verified local application startup and routes.
- Fixed docker-compose configuration.
- Verified deployment pipeline.
- Documented data models.
- Updated performance model to clear warnings.
- Replaced migration system with simpler initializer.
- Removed hardcoded secrets from tests.
- Centralized security scanning config.
- Fixed admin setup with migration.
- Resolved code style warnings.
- Enhanced deploy logging.
- Fixed CI template issue.
- Added SQLite database support.

## 2025-11-11

- Combined old migration files into one initial schema.
- Added base routing to redirect users to login or dashboard.
- Added end-to-end tests for login flow.
- Updated templates to use logo image consistently.
- Centralized currency validation across the app.
- Updated services to show friendly error messages.
- Linked projects to pricing settings.
- Bootstrapped pricing settings at startup.
- Extended pricing support with persisted data.
- Added financial helpers for NPV, IRR, payback.
- Documented financial metrics.
- Implemented Monte Carlo simulation engine.
- Cleaned up reporting contexts.
- Consolidated migration history.
- Added migration script and updated entrypoint.
- Configured test coverage.
- Standardized colors and typography.
- Improved navigation with chevron buttons.
- Established test suites with coverage.
- Configured CI pipelines for tests and security.
- Added deployment automation with Docker and Kubernetes.
- Completed monitoring instrumentation.
- Implemented performance monitoring.
- Added metric storage and endpoints.
- Created middleware for metrics.
- Extended monitoring router.
- Added migration for metrics table.
- Completed concurrent testing.
- Implemented deployment automation.
- Set up Kubernetes manifests.
- Configured CI/CD workflows.
- Documented deployment processes.
- Validated deployment setup.

## 2025-11-10

- Added tests for guard dependencies.
- Added integration tests for authorization.
- Implemented admin bootstrap settings.
- Retired old RBAC plan document.
- Completed authentication and RBAC features.
- Documented import/export field mappings.
- Added import service for CSV/Excel.
- Expanded import workflow with previews and commits.
- Added audit logging for imports/exports.

## 2025-11-09

- Captured implementation status and roadmap.
- Added core database models and migration setup.
- Introduced repository helpers for data operations.
- Added tests for repository behaviors.
- Exposed CRUD APIs for projects and scenarios.
- Connected routers to HTML views.
- Implemented client-side enhancements.
- Added scenario comparison validator.
- Delivered new dashboard experience.
- Extended repositories with utilities.
- Updated detail pages with new visuals.
- Fixed route registration issues.
- Added end-to-end tests for lifecycles.
- Updated template responses.
- Introduced security utilities.
- Added authentication routes.
- Implemented session middleware.
- Delivered seeding utilities.
- Secured routers with RBAC.


@@ -11,12 +11,21 @@ def _build_database_url() -> str:
"""Construct the SQLAlchemy database URL from granular environment vars. """Construct the SQLAlchemy database URL from granular environment vars.
Falls back to `DATABASE_URL` for backward compatibility. Falls back to `DATABASE_URL` for backward compatibility.
Supports SQLite when CALMINER_USE_SQLITE is set.
""" """
legacy_url = os.environ.get("DATABASE_URL", "") legacy_url = os.environ.get("DATABASE_URL", "")
if legacy_url and legacy_url.strip() != "": if legacy_url and legacy_url.strip() != "":
return legacy_url return legacy_url
use_sqlite = os.environ.get("CALMINER_USE_SQLITE", "").lower() in ("true", "1", "yes")
if use_sqlite:
# Use SQLite database
db_path = os.environ.get("DATABASE_PATH", "./data/calminer.db")
# Ensure the directory exists
os.makedirs(os.path.dirname(db_path), exist_ok=True)
return f"sqlite:///{db_path}"
driver = os.environ.get("DATABASE_DRIVER", "postgresql") driver = os.environ.get("DATABASE_DRIVER", "postgresql")
host = os.environ.get("DATABASE_HOST") host = os.environ.get("DATABASE_HOST")
port = os.environ.get("DATABASE_PORT", "5432") port = os.environ.get("DATABASE_PORT", "5432")
@@ -54,7 +63,15 @@ def _build_database_url() -> str:
DATABASE_URL = _build_database_url() DATABASE_URL = _build_database_url()
engine = create_engine(DATABASE_URL, echo=True, future=True) engine = create_engine(DATABASE_URL, echo=True, future=True)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine) # Avoid expiring ORM objects on commit so that objects returned from UnitOfWork
# remain usable for the duration of the request cycle without causing
# DetachedInstanceError when accessed after the session commits.
SessionLocal = sessionmaker(
autocommit=False,
autoflush=False,
bind=engine,
expire_on_commit=False,
)
Base = declarative_base() Base = declarative_base()
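The resolution order in `_build_database_url` can be sketched as a pure function over an environment mapping. This is an illustrative reimplementation, not the project's code, and the granular-variable defaults (`host`, `name`, `user`) are assumptions, since that part of the hunk is truncated above:

```python
def build_database_url(env: dict[str, str]) -> str:
    """Sketch of the resolution order: DATABASE_URL wins, then the
    CALMINER_USE_SQLITE toggle, then granular PostgreSQL variables
    (simplified; directory creation omitted, defaults are hypothetical)."""
    legacy = env.get("DATABASE_URL", "").strip()
    if legacy:
        return legacy  # backward-compatible single-URL override
    if env.get("CALMINER_USE_SQLITE", "").lower() in ("true", "1", "yes"):
        return f"sqlite:///{env.get('DATABASE_PATH', './data/calminer.db')}"
    driver = env.get("DATABASE_DRIVER", "postgresql")
    host = env.get("DATABASE_HOST", "localhost")
    port = env.get("DATABASE_PORT", "5432")
    name = env.get("DATABASE_NAME", "calminer")
    user = env.get("DATABASE_USER", "calminer")
    password = env.get("DATABASE_PASSWORD", "")
    return f"{driver}://{user}:{password}@{host}:{port}/{name}"
```

Taking a mapping instead of reading `os.environ` directly also makes the precedence trivially testable.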


@@ -23,6 +23,7 @@ from services.session import (
from services.unit_of_work import UnitOfWork from services.unit_of_work import UnitOfWork
from services.importers import ImportIngestionService from services.importers import ImportIngestionService
from services.pricing import PricingMetadata from services.pricing import PricingMetadata
from services.navigation import NavigationService
from services.scenario_evaluation import ScenarioPricingConfig, ScenarioPricingEvaluator from services.scenario_evaluation import ScenarioPricingConfig, ScenarioPricingEvaluator
from services.repositories import pricing_settings_to_metadata from services.repositories import pricing_settings_to_metadata
@@ -64,6 +65,14 @@ def get_pricing_metadata(
return pricing_settings_to_metadata(seed_result.settings) return pricing_settings_to_metadata(seed_result.settings)
def get_navigation_service(
uow: UnitOfWork = Depends(get_unit_of_work),
) -> NavigationService:
if not uow.navigation:
raise RuntimeError("Navigation repository is not initialised")
return NavigationService(uow.navigation)
def get_pricing_evaluator( def get_pricing_evaluator(
metadata: PricingMetadata = Depends(get_pricing_metadata), metadata: PricingMetadata = Depends(get_pricing_metadata),
) -> ScenarioPricingEvaluator: ) -> ScenarioPricingEvaluator:
@@ -153,6 +162,28 @@ def require_authenticated_user(
return user return user
def require_authenticated_user_html(
request: Request,
session: AuthSession = Depends(get_auth_session),
) -> User:
"""HTML-aware authenticated dependency that redirects anonymous sessions."""
user = session.user
if user is None or session.tokens.is_empty:
login_url = str(request.url_for("auth.login_form"))
raise HTTPException(
status_code=status.HTTP_303_SEE_OTHER,
headers={"Location": login_url},
)
if not user.is_active:
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN,
detail="User account is disabled.",
)
return user
def _user_role_names(user: User) -> set[str]: def _user_role_names(user: User) -> set[str]:
roles: Iterable[Role] = getattr(user, "roles", []) or [] roles: Iterable[Role] = getattr(user, "roles", []) or []
return {role.name for role in roles} return {role.name for role in roles}
@@ -186,12 +217,55 @@ def require_any_role(*roles: str) -> Callable[[User], User]:
return require_roles(*roles) return require_roles(*roles)
def require_project_resource(*, require_manage: bool = False) -> Callable[[int], Project]: def require_roles_html(*roles: str) -> Callable[[Request], User]:
"""Ensure user is authenticated for HTML responses; redirect anonymous to login."""
required = tuple(role.strip() for role in roles if role.strip())
if not required:
raise ValueError("require_roles_html requires at least one role name")
def _dependency(
request: Request,
session: AuthSession = Depends(get_auth_session),
) -> User:
user = session.user
if user is None:
login_url = str(request.url_for("auth.login_form"))
raise HTTPException(
status_code=status.HTTP_303_SEE_OTHER,
headers={"Location": login_url},
)
if user.is_superuser:
return user
role_names = _user_role_names(user)
if not any(role in role_names for role in required):
raise HTTPException(
status_code=status.HTTP_403_FORBIDDEN,
detail="Insufficient permissions for this action.",
)
return user
return _dependency
def require_any_role_html(*roles: str) -> Callable[[Request], User]:
"""Alias of require_roles_html for readability."""
return require_roles_html(*roles)
def require_project_resource(
*,
require_manage: bool = False,
user_dependency: Callable[..., User] = require_authenticated_user,
) -> Callable[[int], Project]:
"""Dependency factory that resolves a project with authorization checks.""" """Dependency factory that resolves a project with authorization checks."""
def _dependency( def _dependency(
project_id: int, project_id: int,
user: User = Depends(require_authenticated_user), user: User = Depends(user_dependency),
uow: UnitOfWork = Depends(get_unit_of_work), uow: UnitOfWork = Depends(get_unit_of_work),
) -> Project: ) -> Project:
try: try:
@@ -216,13 +290,16 @@ def require_project_resource(*, require_manage: bool = False) -> Callable[[int],
def require_scenario_resource( def require_scenario_resource(
*, require_manage: bool = False, with_children: bool = False *,
require_manage: bool = False,
with_children: bool = False,
user_dependency: Callable[..., User] = require_authenticated_user,
) -> Callable[[int], Scenario]: ) -> Callable[[int], Scenario]:
"""Dependency factory that resolves a scenario with authorization checks.""" """Dependency factory that resolves a scenario with authorization checks."""
def _dependency( def _dependency(
scenario_id: int, scenario_id: int,
user: User = Depends(require_authenticated_user), user: User = Depends(user_dependency),
uow: UnitOfWork = Depends(get_unit_of_work), uow: UnitOfWork = Depends(get_unit_of_work),
) -> Scenario: ) -> Scenario:
try: try:
@@ -248,14 +325,17 @@ def require_scenario_resource(
def require_project_scenario_resource( def require_project_scenario_resource(
*, require_manage: bool = False, with_children: bool = False *,
require_manage: bool = False,
with_children: bool = False,
user_dependency: Callable[..., User] = require_authenticated_user,
) -> Callable[[int, int], Scenario]: ) -> Callable[[int, int], Scenario]:
"""Dependency factory ensuring a scenario belongs to the given project and is accessible.""" """Dependency factory ensuring a scenario belongs to the given project and is accessible."""
def _dependency( def _dependency(
project_id: int, project_id: int,
scenario_id: int, scenario_id: int,
user: User = Depends(require_authenticated_user), user: User = Depends(user_dependency),
uow: UnitOfWork = Depends(get_unit_of_work), uow: UnitOfWork = Depends(get_unit_of_work),
) -> Scenario: ) -> Scenario:
try: try:
@@ -279,3 +359,42 @@ def require_project_scenario_resource(
) from exc ) from exc
return _dependency return _dependency
def require_project_resource_html(
    *, require_manage: bool = False
) -> Callable[[int], Project]:
    """HTML-aware project loader that redirects anonymous sessions."""
    return require_project_resource(
        require_manage=require_manage,
        user_dependency=require_authenticated_user_html,
    )


def require_scenario_resource_html(
    *,
    require_manage: bool = False,
    with_children: bool = False,
) -> Callable[[int], Scenario]:
    """HTML-aware scenario loader that redirects anonymous sessions."""
    return require_scenario_resource(
        require_manage=require_manage,
        with_children=with_children,
        user_dependency=require_authenticated_user_html,
    )


def require_project_scenario_resource_html(
    *,
    require_manage: bool = False,
    with_children: bool = False,
) -> Callable[[int, int], Scenario]:
    """HTML-aware project-scenario loader redirecting anonymous sessions."""
    return require_project_scenario_resource(
        require_manage=require_manage,
        with_children=with_children,
        user_dependency=require_authenticated_user_html,
    )
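The `_html` wrappers above all reuse one pattern: a factory that closes over an injectable authentication callable, so the same loader can answer with a JSON 401 for API clients or a redirect for browser sessions. A minimal stand-alone sketch of that pattern (plain Python, hypothetical names, no FastAPI):

```python
from typing import Callable


def require_resource(
    *,
    require_manage: bool = False,
    user_dependency: Callable[[], str] = lambda: "api-user",
) -> Callable[[int], dict]:
    """Factory returning a loader; swapping user_dependency changes how
    anonymous callers are handled (JSON 401 vs. HTML redirect)."""
    def _dependency(resource_id: int) -> dict:
        user = user_dependency()  # stands in for Depends(user_dependency)
        return {"id": resource_id, "user": user, "manage": require_manage}
    return _dependency


def require_resource_html(**kwargs) -> Callable[[int], dict]:
    # An "_html" variant just pre-binds a different authentication callable.
    return require_resource(user_dependency=lambda: "html-user", **kwargs)
```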


@@ -31,7 +31,6 @@ services:
    # Override command for development with reload
    command:
      [
-       "uvicorn",
        "main:app",
        "--host",
        "0.0.0.0",


@@ -2,11 +2,7 @@ version: "3.8"
services:
  app:
-   build:
-     context: .
-     dockerfile: Dockerfile
-     args:
-       APT_CACHE_URL: ${APT_CACHE_URL:-}
    image: git.allucanget.biz/allucanget/calminer:latest
    environment:
      - ENVIRONMENT=production
      - DEBUG=false


@@ -1,5 +1,3 @@
version: "3.8"
services: services:
app: app:
build: build:

main.py

@@ -1,8 +1,10 @@
import logging
from contextlib import asynccontextmanager
from typing import Awaitable, Callable

from fastapi import FastAPI, Request, Response
from fastapi.staticfiles import StaticFiles
from fastapi.responses import FileResponse

from config.settings import get_settings
from middleware.auth_session import AuthSessionMiddleware
@@ -10,21 +12,78 @@ from middleware.metrics import MetricsMiddleware
from middleware.validation import validate_json
from routes.auth import router as auth_router
from routes.dashboard import router as dashboard_router
from routes.calculations import router as calculations_router
from routes.imports import router as imports_router
from routes.exports import router as exports_router
from routes.projects import router as projects_router
from routes.reports import router as reports_router
from routes.scenarios import router as scenarios_router
from routes.ui import router as ui_router
from routes.navigation import router as navigation_router
from monitoring import router as monitoring_router
from services.bootstrap import bootstrap_admin, bootstrap_pricing_settings
from scripts.init_db import init_db as init_db_script

-app = FastAPI()
logger = logging.getLogger(__name__)
async def _bootstrap_startup() -> None:
settings = get_settings()
admin_settings = settings.admin_bootstrap_settings()
pricing_metadata = settings.pricing_metadata()
try:
try:
init_db_script()
except Exception:
logger.exception(
"DB initializer failed; continuing to bootstrap (non-fatal)")
role_result, admin_result = bootstrap_admin(settings=admin_settings)
pricing_result = bootstrap_pricing_settings(metadata=pricing_metadata)
logger.info(
"Admin bootstrap completed: roles=%s created=%s updated=%s rotated=%s assigned=%s",
role_result.ensured,
admin_result.created_user,
admin_result.updated_user,
admin_result.password_rotated,
admin_result.roles_granted,
)
try:
seed = pricing_result.seed
slug = getattr(seed.settings, "slug", None) if seed and getattr(
seed, "settings", None) else None
created = getattr(seed, "created", None)
updated_fields = getattr(seed, "updated_fields", None)
impurity_upserts = getattr(seed, "impurity_upserts", None)
logger.info(
"Pricing settings bootstrap completed: slug=%s created=%s updated_fields=%s impurity_upserts=%s projects_assigned=%s",
slug,
created,
updated_fields,
impurity_upserts,
pricing_result.projects_assigned,
)
except Exception:
logger.info(
"Pricing settings bootstrap completed (partial): projects_assigned=%s",
pricing_result.projects_assigned,
)
except Exception: # pragma: no cover - defensive logging
logger.exception(
"Failed to bootstrap administrator or pricing settings")
@asynccontextmanager
async def app_lifespan(_: FastAPI):
await _bootstrap_startup()
yield
app = FastAPI(lifespan=app_lifespan)
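The wiring above replaces the deprecated `@app.on_event("startup")` hook with FastAPI's lifespan context manager. The control flow can be sketched without FastAPI at all (plain asyncio; the `events` list is illustrative only, standing in for the real bootstrap work):

```python
import asyncio
from contextlib import asynccontextmanager

events: list[str] = []


async def _bootstrap_startup() -> None:
    # Stand-in for the real work: DB init, admin seeding, pricing seed.
    events.append("bootstrap")


@asynccontextmanager
async def app_lifespan(app: object):
    # Code before `yield` runs once before the app serves requests;
    # code after `yield` would run on shutdown (none is needed here).
    await _bootstrap_startup()
    yield


async def main() -> None:
    async with app_lifespan(None):
        events.append("serving")

asyncio.run(main())
```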
app.add_middleware(AuthSessionMiddleware)
app.add_middleware(MetricsMiddleware)
-logger = logging.getLogger(__name__)
@app.middleware("http") @app.middleware("http")
async def json_validation( async def json_validation(
@@ -38,42 +97,23 @@ async def health() -> dict[str, str]:
return {"status": "ok"} return {"status": "ok"}
@app.on_event("startup") @app.get("/favicon.ico", include_in_schema=False)
async def ensure_admin_bootstrap() -> None: async def favicon() -> Response:
settings = get_settings() static_directory = "static"
admin_settings = settings.admin_bootstrap_settings() favicon_img = "favicon.ico"
pricing_metadata = settings.pricing_metadata() return FileResponse(f"{static_directory}/{favicon_img}")
try:
role_result, admin_result = bootstrap_admin(settings=admin_settings)
pricing_result = bootstrap_pricing_settings(metadata=pricing_metadata)
logger.info(
"Admin bootstrap completed: roles=%s created=%s updated=%s rotated=%s assigned=%s",
role_result.ensured,
admin_result.created_user,
admin_result.updated_user,
admin_result.password_rotated,
admin_result.roles_granted,
)
logger.info(
"Pricing settings bootstrap completed: slug=%s created=%s updated_fields=%s impurity_upserts=%s projects_assigned=%s",
pricing_result.seed.settings.slug,
pricing_result.seed.created,
pricing_result.seed.updated_fields,
pricing_result.seed.impurity_upserts,
pricing_result.projects_assigned,
)
except Exception: # pragma: no cover - defensive logging
logger.exception(
"Failed to bootstrap administrator or pricing settings")
app.include_router(dashboard_router)
app.include_router(calculations_router)
app.include_router(auth_router)
app.include_router(imports_router)
app.include_router(exports_router)
app.include_router(projects_router)
app.include_router(scenarios_router)
app.include_router(reports_router)
app.include_router(ui_router)
app.include_router(monitoring_router)
app.include_router(navigation_router)
app.mount("/static", StaticFiles(directory="static"), name="static")


@@ -8,6 +8,7 @@ from starlette.middleware.base import BaseHTTPMiddleware, RequestResponseEndpoin
from starlette.types import ASGIApp

from config.settings import Settings, get_settings
from sqlalchemy.orm.exc import DetachedInstanceError
from models import User
from monitoring.metrics import ACTIVE_CONNECTIONS
from services.exceptions import EntityNotFoundError
@@ -66,21 +67,42 @@ class AuthSessionMiddleware(BaseHTTPMiddleware):
        resolved = self._resolve_session(request)

        # Track active sessions for authenticated users
-       if resolved.session.user and resolved.session.user.is_active:
        try:
            user_active = bool(resolved.session.user and getattr(
                resolved.session.user, "is_active", False))
        except DetachedInstanceError:
            user_active = False
        if user_active:
            AuthSessionMiddleware._active_sessions += 1
            ACTIVE_CONNECTIONS.set(AuthSessionMiddleware._active_sessions)
        response: Response | None = None
        try:
            response = await call_next(request)
            return response
        finally:
-           # Decrement on response
-           if resolved.session.user and resolved.session.user.is_active:
            # Always decrement the active sessions counter if we incremented it.
            if user_active:
                AuthSessionMiddleware._active_sessions = max(
                    0, AuthSessionMiddleware._active_sessions - 1)
                ACTIVE_CONNECTIONS.set(AuthSessionMiddleware._active_sessions)
-           self._apply_session(response, resolved)
            # Only apply session cookies if a response was produced by downstream
# application. If an exception occurred before a response was created
# we avoid raising another error here.
import logging
if response is not None:
try:
self._apply_session(response, resolved)
except Exception:
logging.getLogger(__name__).exception(
"Failed to apply session cookies to response"
)
else:
logging.getLogger(__name__).debug(
"AuthSessionMiddleware: no response produced by downstream app (response is None)"
)
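The dispatch change above follows a standard discipline: record whether the counter was incremented, then decrement inside `finally` so an exception in the downstream app cannot leak the count. A minimal sketch, with a module-level integer standing in for the Prometheus gauge:

```python
active_sessions = 0  # stands in for ACTIVE_CONNECTIONS


def dispatch(user_active: bool, fail: bool = False) -> None:
    global active_sessions
    if user_active:
        active_sessions += 1
    try:
        if fail:
            raise RuntimeError("downstream error")
    finally:
        # Decrement only if we incremented, clamped at zero.
        if user_active:
            active_sessions = max(0, active_sessions - 1)


dispatch(True)                 # normal request
try:
    dispatch(True, fail=True)  # downstream raises; counter still restored
except RuntimeError:
    pass
```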
    def _resolve_session(self, request: Request) -> _ResolutionResult:
        settings = self._settings_provider()
@@ -123,6 +145,7 @@ class AuthSessionMiddleware(BaseHTTPMiddleware):
            session.user = user
            session.scopes = tuple(payload.scopes)
            session.set_role_slugs(role.name for role in getattr(user, "roles", []) if role)
            return True

    def _try_refresh_token(
@@ -144,6 +167,7 @@ class AuthSessionMiddleware(BaseHTTPMiddleware):
            session.user = user
            session.scopes = tuple(payload.scopes)
            session.set_role_slugs(role.name for role in getattr(user, "roles", []) if role)
            access_token = create_access_token(
                str(user.id),


@@ -10,10 +10,14 @@ async def validate_json(
) -> Response:
    # Only validate JSON for requests with a body
    if request.method in ("POST", "PUT", "PATCH"):
-       try:
-           # attempt to parse json body
-           await request.json()
-       except Exception:
-           raise HTTPException(status_code=400, detail="Invalid JSON payload")
        # Only attempt JSON parsing when the client indicates a JSON content type.
        content_type = (request.headers.get("content-type") or "").lower()
        if "json" in content_type:
            try:
                # attempt to parse json body
                await request.json()
            except Exception:
                raise HTTPException(
                    status_code=400, detail="Invalid JSON payload")
    response = await call_next(request)
    return response


@@ -1,14 +1,11 @@
"""Database models and shared metadata for the CalMiner domain.""" """Database models and shared metadata for the CalMiner domain."""
from .financial_input import FinancialCategory, FinancialInput from .financial_input import FinancialInput
from .metadata import ( from .metadata import (
COST_BUCKET_METADATA, COST_BUCKET_METADATA,
RESOURCE_METADATA, RESOURCE_METADATA,
STOCHASTIC_VARIABLE_METADATA, STOCHASTIC_VARIABLE_METADATA,
CostBucket,
ResourceDescriptor, ResourceDescriptor,
ResourceType,
StochasticVariable,
StochasticVariableDescriptor, StochasticVariableDescriptor,
) )
from .performance_metric import PerformanceMetric from .performance_metric import PerformanceMetric
@@ -17,20 +14,43 @@ from .pricing_settings import (
    PricingMetalSettings,
    PricingSettings,
)
-from .project import MiningOperationType, Project
-from .scenario import Scenario, ScenarioStatus
-from .simulation_parameter import DistributionType, SimulationParameter
from .enums import (
    CostBucket,
    DistributionType,
FinancialCategory,
MiningOperationType,
ResourceType,
ScenarioStatus,
StochasticVariable,
)
from .project import Project
from .scenario import Scenario
from .simulation_parameter import SimulationParameter
from .user import Role, User, UserRole, password_context
from .navigation import NavigationGroup, NavigationLink
from .profitability_snapshot import ProjectProfitability, ScenarioProfitability
from .capex_snapshot import ProjectCapexSnapshot, ScenarioCapexSnapshot
from .opex_snapshot import (
ProjectOpexSnapshot,
ScenarioOpexSnapshot,
)
__all__ = [
    "FinancialCategory",
    "FinancialInput",
    "MiningOperationType",
    "Project",
"ProjectProfitability",
"ProjectCapexSnapshot",
"ProjectOpexSnapshot",
"PricingSettings", "PricingSettings",
"PricingMetalSettings", "PricingMetalSettings",
"PricingImpuritySettings", "PricingImpuritySettings",
"Scenario", "Scenario",
"ScenarioProfitability",
"ScenarioCapexSnapshot",
"ScenarioOpexSnapshot",
"ScenarioStatus", "ScenarioStatus",
"DistributionType", "DistributionType",
"SimulationParameter", "SimulationParameter",
@@ -47,4 +67,6 @@ __all__ = [
"UserRole", "UserRole",
"password_context", "password_context",
"PerformanceMetric", "PerformanceMetric",
"NavigationGroup",
"NavigationLink",
]

models/capex_snapshot.py

@@ -0,0 +1,111 @@
from __future__ import annotations
from datetime import datetime
from typing import TYPE_CHECKING
from sqlalchemy import JSON, DateTime, ForeignKey, Integer, Numeric, String
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.sql import func
from config.database import Base
if TYPE_CHECKING: # pragma: no cover
from .project import Project
from .scenario import Scenario
from .user import User
class ProjectCapexSnapshot(Base):
"""Snapshot of aggregated capex metrics at the project level."""
__tablename__ = "project_capex_snapshots"
id: Mapped[int] = mapped_column(Integer, primary_key=True)
project_id: Mapped[int] = mapped_column(
ForeignKey("projects.id", ondelete="CASCADE"), nullable=False, index=True
)
created_by_id: Mapped[int | None] = mapped_column(
ForeignKey("users.id", ondelete="SET NULL"), nullable=True, index=True
)
calculation_source: Mapped[str | None] = mapped_column(
String(64), nullable=True)
calculated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
currency_code: Mapped[str | None] = mapped_column(String(3), nullable=True)
total_capex: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
contingency_pct: Mapped[float | None] = mapped_column(
Numeric(12, 6), nullable=True)
contingency_amount: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
total_with_contingency: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
component_count: Mapped[int | None] = mapped_column(Integer, nullable=True)
payload: Mapped[dict | None] = mapped_column(JSON, nullable=True)
created_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
updated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now(), onupdate=func.now()
)
project: Mapped[Project] = relationship(
"Project", back_populates="capex_snapshots"
)
created_by: Mapped[User | None] = relationship("User")
def __repr__(self) -> str: # pragma: no cover
return (
"ProjectCapexSnapshot(id={id!r}, project_id={project_id!r}, total_capex={total_capex!r})".format(
id=self.id, project_id=self.project_id, total_capex=self.total_capex
)
)
class ScenarioCapexSnapshot(Base):
"""Snapshot of capex metrics for an individual scenario."""
__tablename__ = "scenario_capex_snapshots"
id: Mapped[int] = mapped_column(Integer, primary_key=True)
scenario_id: Mapped[int] = mapped_column(
ForeignKey("scenarios.id", ondelete="CASCADE"), nullable=False, index=True
)
created_by_id: Mapped[int | None] = mapped_column(
ForeignKey("users.id", ondelete="SET NULL"), nullable=True, index=True
)
calculation_source: Mapped[str | None] = mapped_column(
String(64), nullable=True)
calculated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
currency_code: Mapped[str | None] = mapped_column(String(3), nullable=True)
total_capex: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
contingency_pct: Mapped[float | None] = mapped_column(
Numeric(12, 6), nullable=True)
contingency_amount: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
total_with_contingency: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
component_count: Mapped[int | None] = mapped_column(Integer, nullable=True)
payload: Mapped[dict | None] = mapped_column(JSON, nullable=True)
created_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
updated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now(), onupdate=func.now()
)
scenario: Mapped[Scenario] = relationship(
"Scenario", back_populates="capex_snapshots"
)
created_by: Mapped[User | None] = relationship("User")
def __repr__(self) -> str: # pragma: no cover
return (
"ScenarioCapexSnapshot(id={id!r}, scenario_id={scenario_id!r}, total_capex={total_capex!r})".format(
id=self.id, scenario_id=self.scenario_id, total_capex=self.total_capex
)
)
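The contingency columns in the snapshot models presumably relate as `contingency_amount = total_capex × contingency_pct` with `total_with_contingency` as their sum. This is a hypothetical illustration only; the actual aggregation lives in the calculation services, not in these models:

```python
from decimal import Decimal

# Hypothetical relationship between the snapshot columns above,
# assuming contingency_pct is stored as a fraction.
total_capex = Decimal("1000.00")
contingency_pct = Decimal("0.10")
contingency_amount = (total_capex * contingency_pct).quantize(Decimal("0.01"))
total_with_contingency = total_capex + contingency_amount
```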

models/enums.py

@@ -0,0 +1,96 @@
from __future__ import annotations
from enum import Enum
from typing import Type
from sqlalchemy import Enum as SQLEnum
def sql_enum(enum_cls: Type[Enum], *, name: str) -> SQLEnum:
"""Build a SQLAlchemy Enum that maps using the enum member values."""
return SQLEnum(
enum_cls,
name=name,
create_type=False,
validate_strings=True,
values_callable=lambda enum_cls: [member.value for member in enum_cls],
)
class MiningOperationType(str, Enum):
"""Supported mining operation categories."""
OPEN_PIT = "open_pit"
UNDERGROUND = "underground"
IN_SITU_LEACH = "in_situ_leach"
PLACER = "placer"
QUARRY = "quarry"
MOUNTAINTOP_REMOVAL = "mountaintop_removal"
OTHER = "other"
class ScenarioStatus(str, Enum):
"""Lifecycle states for project scenarios."""
DRAFT = "draft"
ACTIVE = "active"
ARCHIVED = "archived"
class FinancialCategory(str, Enum):
"""Enumeration of cost and revenue classifications."""
CAPITAL_EXPENDITURE = "capex"
OPERATING_EXPENDITURE = "opex"
REVENUE = "revenue"
CONTINGENCY = "contingency"
OTHER = "other"
class DistributionType(str, Enum):
"""Supported stochastic distribution families for simulations."""
NORMAL = "normal"
TRIANGULAR = "triangular"
UNIFORM = "uniform"
LOGNORMAL = "lognormal"
CUSTOM = "custom"
class ResourceType(str, Enum):
"""Primary consumables and resources used in mining operations."""
DIESEL = "diesel"
ELECTRICITY = "electricity"
WATER = "water"
EXPLOSIVES = "explosives"
REAGENTS = "reagents"
LABOR = "labor"
EQUIPMENT_HOURS = "equipment_hours"
TAILINGS_CAPACITY = "tailings_capacity"
class CostBucket(str, Enum):
"""Granular cost buckets aligned with project accounting."""
CAPITAL_INITIAL = "capital_initial"
CAPITAL_SUSTAINING = "capital_sustaining"
OPERATING_FIXED = "operating_fixed"
OPERATING_VARIABLE = "operating_variable"
MAINTENANCE = "maintenance"
RECLAMATION = "reclamation"
ROYALTIES = "royalties"
GENERAL_ADMIN = "general_admin"
class StochasticVariable(str, Enum):
"""Domain variables that typically require probabilistic modelling."""
ORE_GRADE = "ore_grade"
RECOVERY_RATE = "recovery_rate"
METAL_PRICE = "metal_price"
OPERATING_COST = "operating_cost"
CAPITAL_COST = "capital_cost"
DISCOUNT_RATE = "discount_rate"
THROUGHPUT = "throughput"


@@ -1,13 +1,11 @@
from __future__ import annotations

from datetime import date, datetime
-from enum import Enum
from typing import TYPE_CHECKING

from sqlalchemy import (
    Date,
    DateTime,
-   Enum as SQLEnum,
    ForeignKey,
    Integer,
    Numeric,
@@ -19,23 +17,13 @@ from sqlalchemy.orm import Mapped, mapped_column, relationship, validates
from sqlalchemy.sql import func

from config.database import Base
-from .metadata import CostBucket
from .enums import CostBucket, FinancialCategory, sql_enum
from services.currency import normalise_currency

if TYPE_CHECKING:  # pragma: no cover
    from .scenario import Scenario

-class FinancialCategory(str, Enum):
-    """Enumeration of cost and revenue classifications."""
-
-    CAPITAL_EXPENDITURE = "capex"
-    OPERATING_EXPENDITURE = "opex"
-    REVENUE = "revenue"
-    CONTINGENCY = "contingency"
-    OTHER = "other"

class FinancialInput(Base):
    """Line-item financial assumption attached to a scenario."""
@@ -47,10 +35,10 @@ class FinancialInput(Base):
    )
    name: Mapped[str] = mapped_column(String(255), nullable=False)
    category: Mapped[FinancialCategory] = mapped_column(
-       SQLEnum(FinancialCategory), nullable=False
        sql_enum(FinancialCategory, name="financialcategory"), nullable=False
    )
    cost_bucket: Mapped[CostBucket | None] = mapped_column(
-       SQLEnum(CostBucket), nullable=True
        sql_enum(CostBucket, name="costbucket"), nullable=True
    )
    amount: Mapped[float] = mapped_column(Numeric(18, 2), nullable=False)
    currency: Mapped[str | None] = mapped_column(String(3), nullable=True)


@@ -1,45 +1,7 @@
from __future__ import annotations

from dataclasses import dataclass
-from enum import Enum
from .enums import ResourceType, CostBucket, StochasticVariable

-class ResourceType(str, Enum):
-    """Primary consumables and resources used in mining operations."""
-
-    DIESEL = "diesel"
-    ELECTRICITY = "electricity"
-    WATER = "water"
-    EXPLOSIVES = "explosives"
-    REAGENTS = "reagents"
-    LABOR = "labor"
-    EQUIPMENT_HOURS = "equipment_hours"
-    TAILINGS_CAPACITY = "tailings_capacity"

-class CostBucket(str, Enum):
-    """Granular cost buckets aligned with project accounting."""
-
-    CAPITAL_INITIAL = "capital_initial"
-    CAPITAL_SUSTAINING = "capital_sustaining"
-    OPERATING_FIXED = "operating_fixed"
-    OPERATING_VARIABLE = "operating_variable"
-    MAINTENANCE = "maintenance"
-    RECLAMATION = "reclamation"
-    ROYALTIES = "royalties"
-    GENERAL_ADMIN = "general_admin"

-class StochasticVariable(str, Enum):
-    """Domain variables that typically require probabilistic modelling."""
-
-    ORE_GRADE = "ore_grade"
-    RECOVERY_RATE = "recovery_rate"
-    METAL_PRICE = "metal_price"
-    OPERATING_COST = "operating_cost"
-    CAPITAL_COST = "capital_cost"
-    DISCOUNT_RATE = "discount_rate"
-    THROUGHPUT = "throughput"

@dataclass(frozen=True)

models/navigation.py

@@ -0,0 +1,125 @@
from __future__ import annotations
from datetime import datetime
from typing import List, Optional
from sqlalchemy import (
Boolean,
CheckConstraint,
DateTime,
ForeignKey,
Index,
Integer,
String,
UniqueConstraint,
)
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.sql import func
from sqlalchemy.ext.mutable import MutableList
from sqlalchemy import JSON
from config.database import Base
class NavigationGroup(Base):
__tablename__ = "navigation_groups"
__table_args__ = (
UniqueConstraint("slug", name="uq_navigation_groups_slug"),
Index("ix_navigation_groups_sort_order", "sort_order"),
)
id: Mapped[int] = mapped_column(Integer, primary_key=True)
slug: Mapped[str] = mapped_column(String(64), nullable=False)
label: Mapped[str] = mapped_column(String(128), nullable=False)
sort_order: Mapped[int] = mapped_column(
Integer, nullable=False, default=100)
icon: Mapped[Optional[str]] = mapped_column(String(64))
tooltip: Mapped[Optional[str]] = mapped_column(String(255))
is_enabled: Mapped[bool] = mapped_column(
Boolean, nullable=False, default=True)
created_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
updated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now(), onupdate=func.now()
)
links: Mapped[List["NavigationLink"]] = relationship(
"NavigationLink",
back_populates="group",
cascade="all, delete-orphan",
order_by="NavigationLink.sort_order",
)
def __repr__(self) -> str: # pragma: no cover
return f"NavigationGroup(id={self.id!r}, slug={self.slug!r})"
class NavigationLink(Base):
__tablename__ = "navigation_links"
__table_args__ = (
UniqueConstraint("group_id", "slug",
name="uq_navigation_links_group_slug"),
Index("ix_navigation_links_group_sort", "group_id", "sort_order"),
Index("ix_navigation_links_parent_sort",
"parent_link_id", "sort_order"),
CheckConstraint(
"(route_name IS NOT NULL) OR (href_override IS NOT NULL)",
name="ck_navigation_links_route_or_href",
),
)
id: Mapped[int] = mapped_column(Integer, primary_key=True)
group_id: Mapped[int] = mapped_column(
ForeignKey("navigation_groups.id", ondelete="CASCADE"), nullable=False
)
parent_link_id: Mapped[Optional[int]] = mapped_column(
ForeignKey("navigation_links.id", ondelete="CASCADE")
)
slug: Mapped[str] = mapped_column(String(64), nullable=False)
label: Mapped[str] = mapped_column(String(128), nullable=False)
route_name: Mapped[Optional[str]] = mapped_column(String(128))
href_override: Mapped[Optional[str]] = mapped_column(String(512))
match_prefix: Mapped[Optional[str]] = mapped_column(String(512))
sort_order: Mapped[int] = mapped_column(
Integer, nullable=False, default=100)
icon: Mapped[Optional[str]] = mapped_column(String(64))
tooltip: Mapped[Optional[str]] = mapped_column(String(255))
required_roles: Mapped[list[str]] = mapped_column(
MutableList.as_mutable(JSON), nullable=False, default=list
)
is_enabled: Mapped[bool] = mapped_column(
Boolean, nullable=False, default=True)
is_external: Mapped[bool] = mapped_column(
Boolean, nullable=False, default=False)
created_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
updated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now(), onupdate=func.now()
)
group: Mapped[NavigationGroup] = relationship(
NavigationGroup,
back_populates="links",
)
parent: Mapped[Optional["NavigationLink"]] = relationship(
"NavigationLink",
remote_side="NavigationLink.id",
back_populates="children",
)
children: Mapped[List["NavigationLink"]] = relationship(
"NavigationLink",
back_populates="parent",
cascade="all, delete-orphan",
order_by="NavigationLink.sort_order",
)
def is_visible_for_roles(self, roles: list[str]) -> bool:
if not self.required_roles:
return True
role_set = set(roles)
return any(role in role_set for role in self.required_roles)
def __repr__(self) -> str: # pragma: no cover
return f"NavigationLink(id={self.id!r}, slug={self.slug!r})"

models/opex_snapshot.py

@@ -0,0 +1,123 @@
from __future__ import annotations
from datetime import datetime
from typing import TYPE_CHECKING
from sqlalchemy import JSON, Boolean, DateTime, ForeignKey, Integer, Numeric, String
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.sql import func
from config.database import Base
if TYPE_CHECKING: # pragma: no cover
from .project import Project
from .scenario import Scenario
from .user import User
class ProjectOpexSnapshot(Base):
"""Snapshot of recurring opex metrics at the project level."""
__tablename__ = "project_opex_snapshots"
id: Mapped[int] = mapped_column(Integer, primary_key=True)
project_id: Mapped[int] = mapped_column(
ForeignKey("projects.id", ondelete="CASCADE"), nullable=False, index=True
)
created_by_id: Mapped[int | None] = mapped_column(
ForeignKey("users.id", ondelete="SET NULL"), nullable=True, index=True
)
calculation_source: Mapped[str | None] = mapped_column(
String(64), nullable=True)
calculated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
currency_code: Mapped[str | None] = mapped_column(String(3), nullable=True)
overall_annual: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
escalated_total: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
annual_average: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
evaluation_horizon_years: Mapped[int | None] = mapped_column(
Integer, nullable=True)
escalation_pct: Mapped[float | None] = mapped_column(
Numeric(12, 6), nullable=True)
apply_escalation: Mapped[bool] = mapped_column(
Boolean, nullable=False, default=True)
component_count: Mapped[int | None] = mapped_column(Integer, nullable=True)
payload: Mapped[dict | None] = mapped_column(JSON, nullable=True)
created_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
updated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now(), onupdate=func.now()
)
project: Mapped[Project] = relationship(
"Project", back_populates="opex_snapshots"
)
created_by: Mapped[User | None] = relationship("User")
def __repr__(self) -> str: # pragma: no cover
return (
"ProjectOpexSnapshot(id={id!r}, project_id={project_id!r}, overall_annual={overall_annual!r})".format(
id=self.id,
project_id=self.project_id,
overall_annual=self.overall_annual,
)
)
class ScenarioOpexSnapshot(Base):
"""Snapshot of opex metrics for an individual scenario."""
__tablename__ = "scenario_opex_snapshots"
id: Mapped[int] = mapped_column(Integer, primary_key=True)
scenario_id: Mapped[int] = mapped_column(
ForeignKey("scenarios.id", ondelete="CASCADE"), nullable=False, index=True
)
created_by_id: Mapped[int | None] = mapped_column(
ForeignKey("users.id", ondelete="SET NULL"), nullable=True, index=True
)
calculation_source: Mapped[str | None] = mapped_column(
String(64), nullable=True)
calculated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
currency_code: Mapped[str | None] = mapped_column(String(3), nullable=True)
overall_annual: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
escalated_total: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
annual_average: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
evaluation_horizon_years: Mapped[int | None] = mapped_column(
Integer, nullable=True)
escalation_pct: Mapped[float | None] = mapped_column(
Numeric(12, 6), nullable=True)
apply_escalation: Mapped[bool] = mapped_column(
Boolean, nullable=False, default=True)
component_count: Mapped[int | None] = mapped_column(Integer, nullable=True)
payload: Mapped[dict | None] = mapped_column(JSON, nullable=True)
created_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
updated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now(), onupdate=func.now()
)
scenario: Mapped[Scenario] = relationship(
"Scenario", back_populates="opex_snapshots"
)
created_by: Mapped[User | None] = relationship("User")
def __repr__(self) -> str: # pragma: no cover
return (
"ScenarioOpexSnapshot(id={id!r}, scenario_id={scenario_id!r}, overall_annual={overall_annual!r})".format(
id=self.id,
scenario_id=self.scenario_id,
overall_annual=self.overall_annual,
)
)
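`ScenarioOpexSnapshot` persists both `overall_annual` and `escalated_total` alongside `escalation_pct` and `evaluation_horizon_years`. The actual calculation lives in the (excluded-from-coverage) `services/calculations.py` and is not shown in this diff; a hedged sketch of one plausible escalation formula consistent with those fields:

```python
def escalated_total(annual: float, escalation_pct: float, years: int) -> float:
    """Hypothetical: sum annual opex, escalated by a fixed percentage each year."""
    rate = escalation_pct / 100.0
    return sum(annual * (1 + rate) ** year for year in range(years))

# With apply_escalation disabled (rate 0), the total is just annual * years.
print(escalated_total(100.0, 0.0, 5))   # 500.0
print(round(escalated_total(100.0, 10.0, 2), 2))  # 100 + 110 = 210.0
```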

View File

@@ -0,0 +1,133 @@
from __future__ import annotations
from datetime import datetime
from typing import TYPE_CHECKING
from sqlalchemy import JSON, DateTime, ForeignKey, Integer, Numeric, String
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.sql import func
from config.database import Base
if TYPE_CHECKING: # pragma: no cover
from .project import Project
from .scenario import Scenario
from .user import User
class ProjectProfitability(Base):
"""Snapshot of aggregated profitability metrics at the project level."""
__tablename__ = "project_profitability_snapshots"
id: Mapped[int] = mapped_column(Integer, primary_key=True)
project_id: Mapped[int] = mapped_column(
ForeignKey("projects.id", ondelete="CASCADE"), nullable=False, index=True
)
created_by_id: Mapped[int | None] = mapped_column(
ForeignKey("users.id", ondelete="SET NULL"), nullable=True, index=True
)
calculation_source: Mapped[str | None] = mapped_column(
String(64), nullable=True)
calculated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
currency_code: Mapped[str | None] = mapped_column(String(3), nullable=True)
npv: Mapped[float | None] = mapped_column(Numeric(18, 2), nullable=True)
irr_pct: Mapped[float | None] = mapped_column(
Numeric(12, 6), nullable=True)
payback_period_years: Mapped[float | None] = mapped_column(
Numeric(12, 4), nullable=True
)
margin_pct: Mapped[float | None] = mapped_column(
Numeric(12, 6), nullable=True)
revenue_total: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
opex_total: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True
)
sustaining_capex_total: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True
)
capex: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
net_cash_flow_total: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True
)
payload: Mapped[dict | None] = mapped_column(JSON, nullable=True)
created_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
updated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now(), onupdate=func.now()
)
project: Mapped[Project] = relationship(
"Project", back_populates="profitability_snapshots")
created_by: Mapped[User | None] = relationship("User")
def __repr__(self) -> str: # pragma: no cover
return (
"ProjectProfitability(id={id!r}, project_id={project_id!r}, npv={npv!r})".format(
id=self.id, project_id=self.project_id, npv=self.npv
)
)
class ScenarioProfitability(Base):
"""Snapshot of profitability metrics for an individual scenario."""
__tablename__ = "scenario_profitability_snapshots"
id: Mapped[int] = mapped_column(Integer, primary_key=True)
scenario_id: Mapped[int] = mapped_column(
ForeignKey("scenarios.id", ondelete="CASCADE"), nullable=False, index=True
)
created_by_id: Mapped[int | None] = mapped_column(
ForeignKey("users.id", ondelete="SET NULL"), nullable=True, index=True
)
calculation_source: Mapped[str | None] = mapped_column(
String(64), nullable=True)
calculated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
currency_code: Mapped[str | None] = mapped_column(String(3), nullable=True)
npv: Mapped[float | None] = mapped_column(Numeric(18, 2), nullable=True)
irr_pct: Mapped[float | None] = mapped_column(
Numeric(12, 6), nullable=True)
payback_period_years: Mapped[float | None] = mapped_column(
Numeric(12, 4), nullable=True
)
margin_pct: Mapped[float | None] = mapped_column(
Numeric(12, 6), nullable=True)
revenue_total: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
opex_total: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True
)
sustaining_capex_total: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True
)
capex: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True)
net_cash_flow_total: Mapped[float | None] = mapped_column(
Numeric(18, 2), nullable=True
)
payload: Mapped[dict | None] = mapped_column(JSON, nullable=True)
created_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now()
)
updated_at: Mapped[datetime] = mapped_column(
DateTime(timezone=True), nullable=False, server_default=func.now(), onupdate=func.now()
)
scenario: Mapped[Scenario] = relationship(
"Scenario", back_populates="profitability_snapshots")
created_by: Mapped[User | None] = relationship("User")
def __repr__(self) -> str: # pragma: no cover
return (
"ScenarioProfitability(id={id!r}, scenario_id={scenario_id!r}, npv={npv!r})".format(
id=self.id, scenario_id=self.scenario_id, npv=self.npv
)
)

View File

@@ -1,10 +1,14 @@
from __future__ import annotations
from datetime import datetime
-from enum import Enum
from typing import TYPE_CHECKING, List
-from sqlalchemy import DateTime, Enum as SQLEnum, ForeignKey, Integer, String, Text
from .enums import MiningOperationType, sql_enum
from .profitability_snapshot import ProjectProfitability
from .capex_snapshot import ProjectCapexSnapshot
from .opex_snapshot import ProjectOpexSnapshot
from sqlalchemy import DateTime, ForeignKey, Integer, String, Text
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.sql import func
@@ -15,18 +19,6 @@ if TYPE_CHECKING: # pragma: no cover
from .pricing_settings import PricingSettings
-class MiningOperationType(str, Enum):
-    """Supported mining operation categories."""
-    OPEN_PIT = "open_pit"
-    UNDERGROUND = "underground"
-    IN_SITU_LEACH = "in_situ_leach"
-    PLACER = "placer"
-    QUARRY = "quarry"
-    MOUNTAINTOP_REMOVAL = "mountaintop_removal"
-    OTHER = "other"
class Project(Base):
    """Top-level mining project grouping multiple scenarios."""
@@ -36,7 +28,9 @@ class Project(Base):
    name: Mapped[str] = mapped_column(String(255), nullable=False, unique=True)
    location: Mapped[str | None] = mapped_column(String(255), nullable=True)
    operation_type: Mapped[MiningOperationType] = mapped_column(
-        SQLEnum(MiningOperationType), nullable=False, default=MiningOperationType.OTHER
        sql_enum(MiningOperationType, name="miningoperationtype"),
        nullable=False,
        default=MiningOperationType.OTHER,
    )
    description: Mapped[str | None] = mapped_column(Text, nullable=True)
    pricing_settings_id: Mapped[int | None] = mapped_column(
@@ -60,6 +54,51 @@ class Project(Base):
        "PricingSettings",
        back_populates="projects",
    )
profitability_snapshots: Mapped[List["ProjectProfitability"]] = relationship(
"ProjectProfitability",
back_populates="project",
cascade="all, delete-orphan",
order_by=lambda: ProjectProfitability.calculated_at.desc(),
passive_deletes=True,
)
capex_snapshots: Mapped[List["ProjectCapexSnapshot"]] = relationship(
"ProjectCapexSnapshot",
back_populates="project",
cascade="all, delete-orphan",
order_by=lambda: ProjectCapexSnapshot.calculated_at.desc(),
passive_deletes=True,
)
opex_snapshots: Mapped[List["ProjectOpexSnapshot"]] = relationship(
"ProjectOpexSnapshot",
back_populates="project",
cascade="all, delete-orphan",
order_by=lambda: ProjectOpexSnapshot.calculated_at.desc(),
passive_deletes=True,
)
@property
def latest_profitability(self) -> "ProjectProfitability | None":
"""Return the most recent profitability snapshot, if any."""
if not self.profitability_snapshots:
return None
return self.profitability_snapshots[0]
@property
def latest_capex(self) -> "ProjectCapexSnapshot | None":
"""Return the most recent capex snapshot, if any."""
if not self.capex_snapshots:
return None
return self.capex_snapshots[0]
@property
def latest_opex(self) -> "ProjectOpexSnapshot | None":
"""Return the most recent opex snapshot, if any."""
if not self.opex_snapshots:
return None
return self.opex_snapshots[0]
    def __repr__(self) -> str:  # pragma: no cover - helpful for debugging
        return f"Project(id={self.id!r}, name={self.name!r})"

View File

@@ -1,25 +1,27 @@
from __future__ import annotations
from datetime import date, datetime
-from enum import Enum
from typing import TYPE_CHECKING, List
from sqlalchemy import (
    Date,
    DateTime,
-    Enum as SQLEnum,
    ForeignKey,
    Integer,
    Numeric,
    String,
    Text,
    UniqueConstraint,
)
from sqlalchemy.orm import Mapped, mapped_column, relationship, validates
from sqlalchemy.sql import func
from config.database import Base
from services.currency import normalise_currency
-from .metadata import ResourceType
from .enums import ResourceType, ScenarioStatus, sql_enum
from .profitability_snapshot import ScenarioProfitability
from .capex_snapshot import ScenarioCapexSnapshot
from .opex_snapshot import ScenarioOpexSnapshot
if TYPE_CHECKING:  # pragma: no cover
    from .financial_input import FinancialInput
@@ -27,18 +29,14 @@ if TYPE_CHECKING: # pragma: no cover
from .simulation_parameter import SimulationParameter
-class ScenarioStatus(str, Enum):
-    """Lifecycle states for project scenarios."""
-    DRAFT = "draft"
-    ACTIVE = "active"
-    ARCHIVED = "archived"
class Scenario(Base):
    """A specific configuration of assumptions for a project."""
    __tablename__ = "scenarios"
    __table_args__ = (
        UniqueConstraint("project_id", "name",
                         name="uq_scenarios_project_name"),
    )
    id: Mapped[int] = mapped_column(Integer, primary_key=True, index=True)
    project_id: Mapped[int] = mapped_column(
@@ -47,7 +45,9 @@ class Scenario(Base):
    name: Mapped[str] = mapped_column(String(255), nullable=False)
    description: Mapped[str | None] = mapped_column(Text, nullable=True)
    status: Mapped[ScenarioStatus] = mapped_column(
-        SQLEnum(ScenarioStatus), nullable=False, default=ScenarioStatus.DRAFT
        sql_enum(ScenarioStatus, name="scenariostatus"),
        nullable=False,
        default=ScenarioStatus.DRAFT,
    )
    start_date: Mapped[date | None] = mapped_column(Date, nullable=True)
    end_date: Mapped[date | None] = mapped_column(Date, nullable=True)
@@ -55,7 +55,7 @@ class Scenario(Base):
        Numeric(5, 2), nullable=True)
    currency: Mapped[str | None] = mapped_column(String(3), nullable=True)
    primary_resource: Mapped[ResourceType | None] = mapped_column(
-        SQLEnum(ResourceType), nullable=True
        sql_enum(ResourceType, name="resourcetype"), nullable=True
    )
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), nullable=False, server_default=func.now()
@@ -78,6 +78,27 @@ class Scenario(Base):
        cascade="all, delete-orphan",
        passive_deletes=True,
    )
profitability_snapshots: Mapped[List["ScenarioProfitability"]] = relationship(
"ScenarioProfitability",
back_populates="scenario",
cascade="all, delete-orphan",
order_by=lambda: ScenarioProfitability.calculated_at.desc(),
passive_deletes=True,
)
capex_snapshots: Mapped[List["ScenarioCapexSnapshot"]] = relationship(
"ScenarioCapexSnapshot",
back_populates="scenario",
cascade="all, delete-orphan",
order_by=lambda: ScenarioCapexSnapshot.calculated_at.desc(),
passive_deletes=True,
)
opex_snapshots: Mapped[List["ScenarioOpexSnapshot"]] = relationship(
"ScenarioOpexSnapshot",
back_populates="scenario",
cascade="all, delete-orphan",
order_by=lambda: ScenarioOpexSnapshot.calculated_at.desc(),
passive_deletes=True,
)
    @validates("currency")
    def _normalise_currency(self, key: str, value: str | None) -> str | None:
@@ -86,3 +107,27 @@ class Scenario(Base):
    def __repr__(self) -> str:  # pragma: no cover
        return f"Scenario(id={self.id!r}, name={self.name!r}, project_id={self.project_id!r})"
@property
def latest_profitability(self) -> "ScenarioProfitability | None":
"""Return the most recent profitability snapshot for this scenario."""
if not self.profitability_snapshots:
return None
return self.profitability_snapshots[0]
@property
def latest_capex(self) -> "ScenarioCapexSnapshot | None":
"""Return the most recent capex snapshot for this scenario."""
if not self.capex_snapshots:
return None
return self.capex_snapshots[0]
@property
def latest_opex(self) -> "ScenarioOpexSnapshot | None":
"""Return the most recent opex snapshot for this scenario."""
if not self.opex_snapshots:
return None
return self.opex_snapshots[0]

View File

@@ -1,13 +1,13 @@
from __future__ import annotations
from datetime import datetime
-from enum import Enum
from typing import TYPE_CHECKING
from .enums import DistributionType, ResourceType, StochasticVariable, sql_enum
from sqlalchemy import (
    JSON,
    DateTime,
-    Enum as SQLEnum,
    ForeignKey,
    Integer,
    Numeric,
@@ -17,22 +17,11 @@ from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.sql import func
from config.database import Base
-from .metadata import ResourceType, StochasticVariable
if TYPE_CHECKING:  # pragma: no cover
    from .scenario import Scenario
-class DistributionType(str, Enum):
-    """Supported stochastic distribution families for simulations."""
-    NORMAL = "normal"
-    TRIANGULAR = "triangular"
-    UNIFORM = "uniform"
-    LOGNORMAL = "lognormal"
-    CUSTOM = "custom"
class SimulationParameter(Base):
    """Probability distribution settings for scenario simulations."""
@@ -44,13 +33,13 @@ class SimulationParameter(Base):
    )
    name: Mapped[str] = mapped_column(String(255), nullable=False)
    distribution: Mapped[DistributionType] = mapped_column(
-        SQLEnum(DistributionType), nullable=False
        sql_enum(DistributionType, name="distributiontype"), nullable=False
    )
    variable: Mapped[StochasticVariable | None] = mapped_column(
-        SQLEnum(StochasticVariable), nullable=True
        sql_enum(StochasticVariable, name="stochasticvariable"), nullable=True
    )
    resource_type: Mapped[ResourceType | None] = mapped_column(
-        SQLEnum(ResourceType), nullable=True
        sql_enum(ResourceType, name="resourcetype"), nullable=True
    )
    mean_value: Mapped[float | None] = mapped_column(
        Numeric(18, 4), nullable=True)
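This commit consistently swaps raw `SQLEnum(...)` columns for a shared `sql_enum(..., name="...")` helper. The helper itself lives in `.enums` and is not shown in this diff; a plausible sketch, assuming it wraps `sqlalchemy.Enum` with an explicit database type name and persists the enum *values* ("draft", "open_pit", ...) rather than member names:

```python
import enum
from sqlalchemy import Enum as SQLEnum

def sql_enum(enum_cls: type[enum.Enum], *, name: str) -> SQLEnum:
    # Hypothetical reconstruction: fix the DB-side type name so Alembic
    # migrations stay deterministic, and store member values, not names.
    return SQLEnum(
        enum_cls,
        name=name,
        values_callable=lambda cls: [member.value for member in cls],
    )

class ScenarioStatus(str, enum.Enum):
    DRAFT = "draft"
    ACTIVE = "active"
    ARCHIVED = "archived"

status_type = sql_enum(ScenarioStatus, name="scenariostatus")
print(status_type.name)  # scenariostatus
```

Naming the type explicitly (instead of relying on SQLAlchemy's derived default) is what lets every model reuse one PostgreSQL enum type per Python enum.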

View File

@@ -27,10 +27,12 @@ branch = true
source = ["."]
omit = [
    "tests/*",
-    "alembic/*",
    "scripts/*",
    "main.py",
    "routes/reports.py",
    "routes/calculations.py",
    "services/calculations.py",
    "services/importers.py",
    "services/reporting.py",
]
@@ -39,6 +41,6 @@ skip_empty = true
show_missing = true
[tool.bandit]
-exclude_dirs = ["alembic", "scripts"]
exclude_dirs = ["scripts"]
skips = ["B101", "B601"]  # B101: assert_used, B601: shell_injection (may be false positives)

View File

@@ -1,2 +1 @@
-r requirements.txt
-alembic

View File

@@ -14,3 +14,4 @@ python-jose
python-multipart
openpyxl
prometheus-client
plotly

View File

@@ -5,7 +5,6 @@ from typing import Any, Iterable
from fastapi import APIRouter, Depends, HTTPException, Request, UploadFile, status
from fastapi.responses import HTMLResponse, RedirectResponse
-from fastapi.templating import Jinja2Templates
from pydantic import ValidationError
from starlette.datastructures import FormData
@@ -43,9 +42,10 @@ from services.session import (
)
from services.repositories import RoleRepository, UserRepository
from services.unit_of_work import UnitOfWork
from routes.template_filters import create_templates
router = APIRouter(tags=["Authentication"])
-templates = Jinja2Templates(directory="templates")
templates = create_templates()
_PASSWORD_RESET_SCOPE = "password-reset"
_AUTH_SCOPE = "auth"

routes/calculations.py (new file, 2119 lines)

File diff suppressed because it is too large.

View File

@@ -4,14 +4,14 @@ from datetime import datetime
from fastapi import APIRouter, Depends, Request
from fastapi.responses import HTMLResponse, RedirectResponse
-from fastapi.templating import Jinja2Templates
from routes.template_filters import create_templates
from dependencies import get_current_user, get_unit_of_work
from models import ScenarioStatus, User
from services.unit_of_work import UnitOfWork
router = APIRouter(tags=["Dashboard"])
-templates = Jinja2Templates(directory="templates")
templates = create_templates()
def _format_timestamp(moment: datetime | None) -> str | None:

View File

@@ -7,7 +7,6 @@ from typing import Annotated
from fastapi import APIRouter, Depends, HTTPException, Request, Response, status
from fastapi.responses import HTMLResponse, StreamingResponse
-from fastapi.templating import Jinja2Templates
from dependencies import get_unit_of_work, require_any_role
from schemas.exports import (
@@ -24,10 +23,12 @@ from services.export_serializers import (
from services.unit_of_work import UnitOfWork
from models.import_export_log import ImportExportLog
from monitoring.metrics import observe_export
from routes.template_filters import create_templates
logger = logging.getLogger(__name__)
router = APIRouter(prefix="/exports", tags=["exports"])
templates = create_templates()
@router.get( @router.get(
@@ -49,7 +50,6 @@ async def export_modal(
    submit_url = request.url_for(
        "export_projects" if dataset == "projects" else "export_scenarios"
    )
-    templates = Jinja2Templates(directory="templates")
    return templates.TemplateResponse(
        request,
        "exports/modal.html",

View File

@@ -5,9 +5,12 @@ from io import BytesIO
from fastapi import APIRouter, Depends, File, HTTPException, UploadFile, status
from fastapi import Request
from fastapi.responses import HTMLResponse
-from fastapi.templating import Jinja2Templates
-from dependencies import get_import_ingestion_service, require_roles
from dependencies import (
    get_import_ingestion_service,
    require_roles,
    require_roles_html,
)
from models import User
from schemas.imports import (
    ImportCommitRequest,
@@ -17,9 +20,10 @@ from schemas.imports import (
    ScenarioImportPreviewResponse,
)
from services.importers import ImportIngestionService, UnsupportedImportFormat
from routes.template_filters import create_templates
router = APIRouter(prefix="/imports", tags=["Imports"])
-templates = Jinja2Templates(directory="templates")
templates = create_templates()
MANAGE_ROLES = ("project_manager", "admin")
@@ -32,7 +36,7 @@ MANAGE_ROLES = ("project_manager", "admin")
)
def import_dashboard(
    request: Request,
-    _: User = Depends(require_roles(*MANAGE_ROLES)),
    _: User = Depends(require_roles_html(*MANAGE_ROLES)),
) -> HTMLResponse:
    return templates.TemplateResponse(
        request,
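Several routes in this commit switch from `require_roles` to a `require_roles_html` variant for page-rendering endpoints. The dependency implementations are not part of this diff; a hypothetical sketch of the distinction, assuming the only difference is the failure mode (JSON error for APIs, login redirect for browser pages):

```python
# Hypothetical sketch of the *_html dependency variants: API routes raise a
# JSON-rendered error on failure, while HTML routes redirect to the login page.
def require_roles(*roles):
    def checker(user_roles: set[str]) -> str:
        if user_roles.isdisjoint(roles):
            raise PermissionError("403")  # surfaced as a JSON error response
        return "ok"
    return checker

def require_roles_html(*roles):
    def checker(user_roles: set[str]) -> str:
        if user_roles.isdisjoint(roles):
            return "redirect:/auth/login"  # browser-friendly failure mode
        return "ok"
    return checker

print(require_roles_html("admin")({"viewer"}))  # redirect:/auth/login
```

Splitting the two keeps API clients getting machine-readable 403s while unauthenticated users browsing `/imports` land on the login form instead of a raw error page.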

routes/navigation.py (new file, 63 lines)
View File

@@ -0,0 +1,63 @@
from __future__ import annotations
from datetime import datetime, timezone
from fastapi import APIRouter, Depends, Request
from dependencies import (
get_auth_session,
get_navigation_service,
require_authenticated_user,
)
from models import User
from schemas.navigation import (
NavigationGroupSchema,
NavigationLinkSchema,
NavigationSidebarResponse,
)
from services.navigation import NavigationGroupDTO, NavigationLinkDTO, NavigationService
from services.session import AuthSession
router = APIRouter(prefix="/navigation", tags=["Navigation"])
def _to_link_schema(dto: NavigationLinkDTO) -> NavigationLinkSchema:
return NavigationLinkSchema(
id=dto.id,
label=dto.label,
href=dto.href,
match_prefix=dto.match_prefix,
icon=dto.icon,
tooltip=dto.tooltip,
is_external=dto.is_external,
children=[_to_link_schema(child) for child in dto.children],
)
def _to_group_schema(dto: NavigationGroupDTO) -> NavigationGroupSchema:
return NavigationGroupSchema(
id=dto.id,
label=dto.label,
icon=dto.icon,
tooltip=dto.tooltip,
links=[_to_link_schema(link) for link in dto.links],
)
@router.get(
"/sidebar",
response_model=NavigationSidebarResponse,
name="navigation.sidebar",
)
async def get_sidebar_navigation(
request: Request,
_: User = Depends(require_authenticated_user),
session: AuthSession = Depends(get_auth_session),
service: NavigationService = Depends(get_navigation_service),
) -> NavigationSidebarResponse:
dto = service.build_sidebar(session=session, request=request)
return NavigationSidebarResponse(
groups=[_to_group_schema(group) for group in dto.groups],
roles=list(dto.roles),
generated_at=datetime.now(tz=timezone.utc),
)
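`_to_link_schema` recurses through `dto.children`, so arbitrarily nested menu trees map cleanly from service DTOs to response schemas. A dependency-free sketch of the same recursion over plain dataclasses and dicts (hypothetical data, no FastAPI or Pydantic):

```python
from dataclasses import dataclass, field

@dataclass
class LinkDTO:
    id: str
    label: str
    href: str
    children: list["LinkDTO"] = field(default_factory=list)

def to_link_dict(dto: LinkDTO) -> dict:
    # Same shape as _to_link_schema: copy the scalars, recurse into children.
    return {
        "id": dto.id,
        "label": dto.label,
        "href": dto.href,
        "children": [to_link_dict(child) for child in dto.children],
    }

tree = LinkDTO(
    id="projects",
    label="Projects",
    href="/projects",
    children=[LinkDTO(id="new", label="New project", href="/projects/new")],
)
print(to_link_dict(tree)["children"][0]["href"])  # /projects/new
```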

View File

@@ -4,23 +4,26 @@ from typing import List
from fastapi import APIRouter, Depends, Form, HTTPException, Request, status
from fastapi.responses import HTMLResponse, RedirectResponse
-from fastapi.templating import Jinja2Templates
from dependencies import (
    get_pricing_metadata,
    get_unit_of_work,
    require_any_role,
    require_any_role_html,
    require_project_resource,
    require_project_resource_html,
    require_roles,
    require_roles_html,
)
from models import MiningOperationType, Project, ScenarioStatus, User
from schemas.project import ProjectCreate, ProjectRead, ProjectUpdate
from services.exceptions import EntityConflictError
from services.pricing import PricingMetadata
from services.unit_of_work import UnitOfWork
from routes.template_filters import create_templates
router = APIRouter(prefix="/projects", tags=["Projects"])
-templates = Jinja2Templates(directory="templates")
templates = create_templates()
READ_ROLES = ("viewer", "analyst", "project_manager", "admin")
MANAGE_ROLES = ("project_manager", "admin")
@@ -79,7 +82,7 @@ def create_project(
)
def project_list_page(
    request: Request,
-    _: User = Depends(require_any_role(*READ_ROLES)),
    _: User = Depends(require_any_role_html(*READ_ROLES)),
    uow: UnitOfWork = Depends(get_unit_of_work),
) -> HTMLResponse:
    projects = _require_project_repo(uow).list(with_children=True)
@@ -101,7 +104,8 @@ def project_list_page(
    name="projects.create_project_form",
)
def create_project_form(
-    request: Request, _: User = Depends(require_roles(*MANAGE_ROLES))
    request: Request,
    _: User = Depends(require_roles_html(*MANAGE_ROLES)),
) -> HTMLResponse:
    return templates.TemplateResponse(
        request,
@@ -122,7 +126,7 @@ def create_project_form(
)
def create_project_submit(
    request: Request,
-    _: User = Depends(require_roles(*MANAGE_ROLES)),
    _: User = Depends(require_roles_html(*MANAGE_ROLES)),
    name: str = Form(...),
    location: str | None = Form(None),
    operation_type: str = Form(...),
@@ -221,7 +225,8 @@ def delete_project(
)
def view_project(
    request: Request,
-    project: Project = Depends(require_project_resource()),
    _: User = Depends(require_any_role_html(*READ_ROLES)),
    project: Project = Depends(require_project_resource_html()),
    uow: UnitOfWork = Depends(get_unit_of_work),
) -> HTMLResponse:
    project = _require_project_repo(uow).get(project.id, with_children=True)
@@ -256,8 +261,9 @@ def view_project(
)
def edit_project_form(
    request: Request,
    _: User = Depends(require_roles_html(*MANAGE_ROLES)),
    project: Project = Depends(
-        require_project_resource(require_manage=True)
        require_project_resource_html(require_manage=True)
    ),
) -> HTMLResponse:
    return templates.TemplateResponse(
@@ -283,8 +289,9 @@ def edit_project_form(
)
def edit_project_submit(
    request: Request,
    _: User = Depends(require_roles_html(*MANAGE_ROLES)),
    project: Project = Depends(
-        require_project_resource(require_manage=True)
        require_project_resource_html(require_manage=True)
    ),
    name: str = Form(...),
    location: str | None = Form(None),

View File

@@ -1,17 +1,19 @@
from __future__ import annotations
-from datetime import date, datetime
from datetime import date
from fastapi import APIRouter, Depends, HTTPException, Query, Request, status
from fastapi.encoders import jsonable_encoder
from fastapi.responses import HTMLResponse
-from fastapi.templating import Jinja2Templates
from dependencies import (
    get_unit_of_work,
    require_any_role,
    require_any_role_html,
    require_project_resource,
    require_scenario_resource,
    require_project_resource_html,
    require_scenario_resource_html,
)
from models import Project, Scenario, User
from services.exceptions import EntityNotFoundError, ScenarioValidationError
@@ -24,96 +26,10 @@ from services.reporting import (
    validate_percentiles,
)
from services.unit_of_work import UnitOfWork
from routes.template_filters import create_templates
router = APIRouter(prefix="/reports", tags=["Reports"])
-templates = Jinja2Templates(directory="templates")
templates = create_templates()
-# Add custom Jinja2 filters
-def format_datetime(value):
-    """Format a datetime object for display in templates."""
-    if not isinstance(value, datetime):
-        return ""
-    if value.tzinfo is None:
-        # Assume UTC if no timezone
-        from datetime import timezone
-        value = value.replace(tzinfo=timezone.utc)
-    # Format as readable date/time
-    return value.strftime("%Y-%m-%d %H:%M UTC")
-def currency_display(value, currency_code):
-    """Format a numeric value with currency symbol/code."""
-    if value is None:
-        return ""
-    # Format the number
-    if isinstance(value, (int, float)):
-        formatted_value = f"{value:,.2f}"
-    else:
-        formatted_value = str(value)
-    # Add currency code
-    if currency_code:
-        return f"{currency_code} {formatted_value}"
-    return formatted_value
-def format_metric(value, metric_name, currency_code=None):
-    """Format metric values appropriately based on metric type."""
-    if value is None:
-        return ""
-    # For currency-related metrics, use currency formatting
-    currency_metrics = {'npv', 'inflows', 'outflows',
-                        'net', 'total_inflows', 'total_outflows', 'total_net'}
-    if metric_name in currency_metrics and currency_code:
-        return currency_display(value, currency_code)
-    # For percentage metrics
-    percentage_metrics = {'irr', 'payback_period'}
-    if metric_name in percentage_metrics:
-        if isinstance(value, (int, float)):
-            return f"{value:.2f}%"
-        return f"{value}%"
-    # Default numeric formatting
-    if isinstance(value, (int, float)):
-        return f"{value:,.2f}"
-    return str(value)
-def percentage_display(value):
-    """Format a value as a percentage."""
-    if value is None:
-        return ""
-    if isinstance(value, (int, float)):
-        return f"{value:.2f}%"
-    return f"{value}%"
-def period_display(value):
-    """Format a period value (like payback period)."""
-    if value is None:
-        return ""
-    if isinstance(value, (int, float)):
-        if value == int(value):
-            return f"{int(value)} years"
-        return f"{value:.1f} years"
-    return str(value)
-templates.env.filters['format_datetime'] = format_datetime
-templates.env.filters['currency_display'] = currency_display
-templates.env.filters['format_metric'] = format_metric
-templates.env.filters['percentage_display'] = percentage_display
-templates.env.filters['period_display'] = period_display
 READ_ROLES = ("viewer", "analyst", "project_manager", "admin")
 MANAGE_ROLES = ("project_manager", "admin")
@@ -167,7 +83,7 @@ def project_summary_report(
         percentile_values = validate_percentiles(percentiles)
     except ValueError as exc:
         raise HTTPException(
-            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+            status_code=status.HTTP_422_UNPROCESSABLE_CONTENT,
             detail=str(exc),
         ) from exc
@@ -220,7 +136,7 @@ def project_scenario_comparison_report(
     unique_ids = list(dict.fromkeys(scenario_ids))
     if len(unique_ids) < 2:
         raise HTTPException(
-            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+            status_code=status.HTTP_422_UNPROCESSABLE_CONTENT,
            detail="At least two unique scenario_ids must be provided for comparison.",
         )
     if fmt.lower() != "json":
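The `list(dict.fromkeys(...))` idiom in this hunk deduplicates while preserving order, which is why repeated `scenario_ids` collapse before the two-scenario check; a quick self-contained illustration:

```python
# dict.fromkeys keeps only the first occurrence of each key, in order,
# so converting back to a list yields an order-preserving dedupe.
scenario_ids = [7, 3, 7, 9, 3]
unique_ids = list(dict.fromkeys(scenario_ids))
print(unique_ids)           # [7, 3, 9]
print(len(unique_ids) < 2)  # False -> the comparison is allowed to proceed
```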
@@ -234,7 +150,7 @@ def project_scenario_comparison_report(
         percentile_values = validate_percentiles(percentiles)
     except ValueError as exc:
         raise HTTPException(
-            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+            status_code=status.HTTP_422_UNPROCESSABLE_CONTENT,
             detail=str(exc),
         ) from exc
@@ -242,7 +158,7 @@ def project_scenario_comparison_report(
         scenarios = uow.validate_scenarios_for_comparison(unique_ids)
     except ScenarioValidationError as exc:
         raise HTTPException(
-            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+            status_code=status.HTTP_422_UNPROCESSABLE_CONTENT,
             detail={
                 "code": exc.code,
                 "message": exc.message,
@@ -313,7 +229,7 @@ def scenario_distribution_report(
         percentile_values = validate_percentiles(percentiles)
     except ValueError as exc:
         raise HTTPException(
-            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+            status_code=status.HTTP_422_UNPROCESSABLE_CONTENT,
             detail=str(exc),
         ) from exc
@@ -335,8 +251,8 @@ def scenario_distribution_report(
 )
 def project_summary_page(
     request: Request,
-    project: Project = Depends(require_project_resource()),
-    _: User = Depends(require_any_role(*READ_ROLES)),
+    project: Project = Depends(require_project_resource_html()),
+    _: User = Depends(require_any_role_html(*READ_ROLES)),
     uow: UnitOfWork = Depends(get_unit_of_work),
     include: str | None = Query(
         None,
@@ -370,7 +286,7 @@ def project_summary_page(
         percentile_values = validate_percentiles(percentiles)
     except ValueError as exc:
         raise HTTPException(
-            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+            status_code=status.HTTP_422_UNPROCESSABLE_CONTENT,
             detail=str(exc),
         ) from exc
@@ -399,8 +315,8 @@ def project_summary_page(
 )
 def project_scenario_comparison_page(
     request: Request,
-    project: Project = Depends(require_project_resource()),
-    _: User = Depends(require_any_role(*READ_ROLES)),
+    project: Project = Depends(require_project_resource_html()),
+    _: User = Depends(require_any_role_html(*READ_ROLES)),
     uow: UnitOfWork = Depends(get_unit_of_work),
     scenario_ids: list[int] = Query(
         ..., alias="scenario_ids", description="Repeatable scenario identifier."),
@@ -421,7 +337,7 @@ def project_scenario_comparison_page(
     unique_ids = list(dict.fromkeys(scenario_ids))
     if len(unique_ids) < 2:
         raise HTTPException(
-            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+            status_code=status.HTTP_422_UNPROCESSABLE_CONTENT,
             detail="At least two unique scenario_ids must be provided for comparison.",
         )
@@ -430,7 +346,7 @@ def project_scenario_comparison_page(
         percentile_values = validate_percentiles(percentiles)
     except ValueError as exc:
         raise HTTPException(
-            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+            status_code=status.HTTP_422_UNPROCESSABLE_CONTENT,
             detail=str(exc),
         ) from exc
@@ -438,7 +354,7 @@ def project_scenario_comparison_page(
         scenarios = uow.validate_scenarios_for_comparison(unique_ids)
     except ScenarioValidationError as exc:
         raise HTTPException(
-            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+            status_code=status.HTTP_422_UNPROCESSABLE_CONTENT,
             detail={
                 "code": exc.code,
                 "message": exc.message,
@@ -476,8 +392,10 @@ def project_scenario_comparison_page(
 )
 def scenario_distribution_page(
     request: Request,
-    scenario: Scenario = Depends(require_scenario_resource()),
-    _: User = Depends(require_any_role(*READ_ROLES)),
+    _: User = Depends(require_any_role_html(*READ_ROLES)),
+    scenario: Scenario = Depends(
+        require_scenario_resource_html()
+    ),
     uow: UnitOfWork = Depends(get_unit_of_work),
     include: str | None = Query(
         None,
@@ -501,7 +419,7 @@ def scenario_distribution_page(
         percentile_values = validate_percentiles(percentiles)
     except ValueError as exc:
         raise HTTPException(
-            status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
+            status_code=status.HTTP_422_UNPROCESSABLE_CONTENT,
             detail=str(exc),
         ) from exc


@@ -6,14 +6,16 @@ from typing import List
 from fastapi import APIRouter, Depends, Form, HTTPException, Request, status
 from fastapi.responses import HTMLResponse, RedirectResponse
-from fastapi.templating import Jinja2Templates
 from dependencies import (
     get_pricing_metadata,
     get_unit_of_work,
     require_any_role,
+    require_any_role_html,
     require_roles,
+    require_roles_html,
     require_scenario_resource,
+    require_scenario_resource_html,
 )
 from models import ResourceType, Scenario, ScenarioStatus, User
 from schemas.scenario import (
@@ -31,9 +33,10 @@ from services.exceptions import (
 )
 from services.pricing import PricingMetadata
 from services.unit_of_work import UnitOfWork
+from routes.template_filters import create_templates

 router = APIRouter(tags=["Scenarios"])
-templates = Jinja2Templates(directory="templates")
+templates = create_templates()

 READ_ROLES = ("viewer", "analyst", "project_manager", "admin")
 MANAGE_ROLES = ("project_manager", "admin")
@@ -170,6 +173,63 @@ def create_scenario_for_project(
     return _to_read_model(created)

+@router.get(
+    "/projects/{project_id}/scenarios/ui",
+    response_class=HTMLResponse,
+    include_in_schema=False,
+    name="scenarios.project_scenario_list",
+)
+def project_scenario_list_page(
+    project_id: int,
+    request: Request,
+    _: User = Depends(require_any_role_html(*READ_ROLES)),
+    uow: UnitOfWork = Depends(get_unit_of_work),
+) -> HTMLResponse:
+    try:
+        project = _require_project_repo(uow).get(
+            project_id, with_children=True)
+    except EntityNotFoundError as exc:
+        raise HTTPException(
+            status_code=status.HTTP_404_NOT_FOUND, detail=str(exc)
+        ) from exc
+    scenarios = sorted(
+        project.scenarios,
+        key=lambda scenario: scenario.updated_at or scenario.created_at,
+        reverse=True,
+    )
+    scenario_totals = {
+        "total": len(scenarios),
+        "active": sum(
+            1 for scenario in scenarios if scenario.status == ScenarioStatus.ACTIVE
+        ),
+        "draft": sum(
+            1 for scenario in scenarios if scenario.status == ScenarioStatus.DRAFT
+        ),
+        "archived": sum(
+            1 for scenario in scenarios if scenario.status == ScenarioStatus.ARCHIVED
+        ),
+        "latest_update": max(
+            (
+                scenario.updated_at or scenario.created_at
+                for scenario in scenarios
+                if scenario.updated_at or scenario.created_at
+            ),
+            default=None,
+        ),
+    }
+    return templates.TemplateResponse(
+        request,
+        "scenarios/list.html",
+        {
+            "project": project,
+            "scenarios": scenarios,
+            "scenario_totals": scenario_totals,
+        },
+    )

 @router.get("/scenarios/{scenario_id}", response_model=ScenarioRead)
 def get_scenario(
     scenario: Scenario = Depends(require_scenario_resource()),
@@ -263,7 +323,7 @@ def _scenario_form_state(
 def create_scenario_form(
     project_id: int,
     request: Request,
-    _: User = Depends(require_roles(*MANAGE_ROLES)),
+    _: User = Depends(require_roles_html(*MANAGE_ROLES)),
     uow: UnitOfWork = Depends(get_unit_of_work),
     metadata: PricingMetadata = Depends(get_pricing_metadata),
 ) -> HTMLResponse:
@@ -301,7 +361,7 @@ def create_scenario_form(
 def create_scenario_submit(
     project_id: int,
     request: Request,
-    _: User = Depends(require_roles(*MANAGE_ROLES)),
+    _: User = Depends(require_roles_html(*MANAGE_ROLES)),
     name: str = Form(...),
     description: str | None = Form(None),
     status_value: str = Form(ScenarioStatus.DRAFT.value),
@@ -374,6 +434,7 @@ def create_scenario_submit(
                 "projects.view_project", project_id=project_id
             ),
             "error": str(exc),
+            "error_field": "currency",
             "default_currency": metadata.default_currency,
         },
         status_code=status.HTTP_400_BAD_REQUEST,
@@ -408,7 +469,8 @@ def create_scenario_submit(
             "cancel_url": request.url_for(
                 "projects.view_project", project_id=project_id
             ),
-            "error": "Scenario could not be created.",
+            "error": "Scenario with this name already exists for this project.",
+            "error_field": "name",
             "default_currency": metadata.default_currency,
         },
         status_code=status.HTTP_409_CONFLICT,
@@ -428,8 +490,9 @@ def create_scenario_submit(
 )
 def view_scenario(
     request: Request,
+    _: User = Depends(require_any_role_html(*READ_ROLES)),
     scenario: Scenario = Depends(
-        require_scenario_resource(with_children=True)
+        require_scenario_resource_html(with_children=True)
     ),
     uow: UnitOfWork = Depends(get_unit_of_work),
 ) -> HTMLResponse:
@@ -469,8 +532,9 @@ def view_scenario(
 )
 def edit_scenario_form(
     request: Request,
+    _: User = Depends(require_roles_html(*MANAGE_ROLES)),
     scenario: Scenario = Depends(
-        require_scenario_resource(require_manage=True)
+        require_scenario_resource_html(require_manage=True)
     ),
     uow: UnitOfWork = Depends(get_unit_of_work),
     metadata: PricingMetadata = Depends(get_pricing_metadata),
@@ -503,8 +567,9 @@ def edit_scenario_form(
 )
 def edit_scenario_submit(
     request: Request,
+    _: User = Depends(require_roles_html(*MANAGE_ROLES)),
     scenario: Scenario = Depends(
-        require_scenario_resource(require_manage=True)
+        require_scenario_resource_html(require_manage=True)
     ),
     name: str = Form(...),
     description: str | None = Form(None),
@@ -569,6 +634,7 @@ def edit_scenario_submit(
                 "scenarios.view_scenario", scenario_id=scenario.id
             ),
             "error": str(exc),
+            "error_field": "currency",
             "default_currency": metadata.default_currency,
         },
         status_code=status.HTTP_400_BAD_REQUEST,

routes/template_filters.py Normal file

@@ -0,0 +1,147 @@
from __future__ import annotations

import logging
from datetime import datetime, timezone
from typing import Any

from fastapi import Request
from fastapi.templating import Jinja2Templates

from services.navigation import NavigationService
from services.session import AuthSession
from services.unit_of_work import UnitOfWork

logger = logging.getLogger(__name__)


def format_datetime(value: Any) -> str:
    """Render datetime values consistently for templates."""
    if not isinstance(value, datetime):
        return ""
    if value.tzinfo is None:
        value = value.replace(tzinfo=timezone.utc)
    return value.strftime("%Y-%m-%d %H:%M UTC")


def currency_display(value: Any, currency_code: str | None) -> str:
    """Format numeric values with currency context."""
    if value is None:
        return ""
    if isinstance(value, (int, float)):
        formatted_value = f"{value:,.2f}"
    else:
        formatted_value = str(value)
    if currency_code:
        return f"{currency_code} {formatted_value}"
    return formatted_value


def format_metric(value: Any, metric_name: str, currency_code: str | None = None) -> str:
    """Format metrics according to their semantic type."""
    if value is None:
        return ""
    currency_metrics = {
        "npv",
        "inflows",
        "outflows",
        "net",
        "total_inflows",
        "total_outflows",
        "total_net",
    }
    if metric_name in currency_metrics and currency_code:
        return currency_display(value, currency_code)
    percentage_metrics = {"irr", "payback_period"}
    if metric_name in percentage_metrics:
        if isinstance(value, (int, float)):
            return f"{value:.2f}%"
        return f"{value}%"
    if isinstance(value, (int, float)):
        return f"{value:,.2f}"
    return str(value)


def percentage_display(value: Any) -> str:
    """Format numeric values as percentages."""
    if value is None:
        return ""
    if isinstance(value, (int, float)):
        return f"{value:.2f}%"
    return f"{value}%"


def period_display(value: Any) -> str:
    """Format period values in years."""
    if value is None:
        return ""
    if isinstance(value, (int, float)):
        if value == int(value):
            return f"{int(value)} years"
        return f"{value:.1f} years"
    return str(value)


def register_common_filters(templates: Jinja2Templates) -> None:
    templates.env.filters["format_datetime"] = format_datetime
    templates.env.filters["currency_display"] = currency_display
    templates.env.filters["format_metric"] = format_metric
    templates.env.filters["percentage_display"] = percentage_display
    templates.env.filters["period_display"] = period_display


def _sidebar_navigation_for_request(request: Request | None):
    if request is None:
        return None
    cached = getattr(request.state, "_navigation_sidebar_dto", None)
    if cached is not None:
        return cached
    session_context = getattr(request.state, "auth_session", None)
    if isinstance(session_context, AuthSession):
        session = session_context
    else:
        session = AuthSession.anonymous()
    try:
        with UnitOfWork() as uow:
            if not uow.navigation:
                logger.debug("Navigation repository unavailable for sidebar rendering")
                sidebar_dto = None
            else:
                service = NavigationService(uow.navigation)
                sidebar_dto = service.build_sidebar(session=session, request=request)
    except Exception:  # pragma: no cover - defensive fallback for templates
        logger.exception("Failed to build sidebar navigation during template render")
        sidebar_dto = None
    setattr(request.state, "_navigation_sidebar_dto", sidebar_dto)
    return sidebar_dto


def register_navigation_globals(templates: Jinja2Templates) -> None:
    templates.env.globals["get_sidebar_navigation"] = _sidebar_navigation_for_request


def create_templates() -> Jinja2Templates:
    templates = Jinja2Templates(directory="templates")
    register_common_filters(templates)
    register_navigation_globals(templates)
    return templates


__all__ = [
    "format_datetime",
    "currency_display",
    "format_metric",
    "percentage_display",
    "period_display",
    "register_common_filters",
    "register_navigation_globals",
    "create_templates",
]
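Since the registered filters are pure functions, their behavior is easy to check in isolation. A minimal standalone sketch follows; the logic is copied from the listing above rather than imported, so no project module paths are assumed:

```python
# Standalone copies of two filters from routes/template_filters.py,
# reproduced here for illustration only.

def currency_display(value, currency_code):
    """Format numeric values with currency context."""
    if value is None:
        return ""
    formatted = f"{value:,.2f}" if isinstance(value, (int, float)) else str(value)
    return f"{currency_code} {formatted}" if currency_code else formatted

def period_display(value):
    """Format period values in years."""
    if value is None:
        return ""
    if isinstance(value, (int, float)):
        return f"{int(value)} years" if value == int(value) else f"{value:.1f} years"
    return str(value)

print(currency_display(1234567.5, "USD"))  # USD 1,234,567.50
print(period_display(3.0))                 # 3 years
print(period_display(2.75))                # 2.8 years
```

In a template these run as `{{ scenario.npv | currency_display(scenario.currency) }}` once `create_templates()` has registered them on the Jinja2 environment.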

routes/ui.py Normal file

@@ -0,0 +1,109 @@
from __future__ import annotations

from fastapi import APIRouter, Depends, Request
from fastapi.responses import HTMLResponse

from dependencies import require_any_role_html, require_roles_html
from models import User
from routes.template_filters import create_templates

router = APIRouter(tags=["UI"])
templates = create_templates()

READ_ROLES = ("viewer", "analyst", "project_manager", "admin")
MANAGE_ROLES = ("project_manager", "admin")


@router.get(
    "/ui/simulations",
    response_class=HTMLResponse,
    include_in_schema=False,
    name="ui.simulations",
)
def simulations_dashboard(
    request: Request,
    _: User = Depends(require_any_role_html(*READ_ROLES)),
) -> HTMLResponse:
    return templates.TemplateResponse(
        request,
        "simulations.html",
        {
            "title": "Simulations",
        },
    )


@router.get(
    "/ui/reporting",
    response_class=HTMLResponse,
    include_in_schema=False,
    name="ui.reporting",
)
def reporting_dashboard(
    request: Request,
    _: User = Depends(require_any_role_html(*READ_ROLES)),
) -> HTMLResponse:
    return templates.TemplateResponse(
        request,
        "reporting.html",
        {
            "title": "Reporting",
        },
    )


@router.get(
    "/ui/settings",
    response_class=HTMLResponse,
    include_in_schema=False,
    name="ui.settings",
)
def settings_page(
    request: Request,
    _: User = Depends(require_any_role_html(*READ_ROLES)),
) -> HTMLResponse:
    return templates.TemplateResponse(
        request,
        "settings.html",
        {
            "title": "Settings",
        },
    )


@router.get(
    "/theme-settings",
    response_class=HTMLResponse,
    include_in_schema=False,
    name="ui.theme_settings",
)
def theme_settings_page(
    request: Request,
    _: User = Depends(require_any_role_html(*READ_ROLES)),
) -> HTMLResponse:
    return templates.TemplateResponse(
        request,
        "theme_settings.html",
        {
            "title": "Theme Settings",
        },
    )


@router.get(
    "/ui/currencies",
    response_class=HTMLResponse,
    include_in_schema=False,
    name="ui.currencies",
)
def currencies_page(
    request: Request,
    _: User = Depends(require_roles_html(*MANAGE_ROLES)),
) -> HTMLResponse:
    return templates.TemplateResponse(
        request,
        "currencies.html",
        {
            "title": "Currency Management",
        },
    )

run_docker.ps1 Normal file

@@ -0,0 +1,2 @@
docker run -d --name calminer-app --env-file .env -p 8003:8003 -v "${PWD}\logs:/app/logs" --restart unless-stopped calminer:latest
docker logs -f calminer-app

schemas/calculations.py Normal file

@@ -0,0 +1,346 @@
"""Pydantic schemas for calculation workflows."""
from __future__ import annotations

from typing import List

from pydantic import BaseModel, Field, PositiveFloat, ValidationError, field_validator

from services.pricing import PricingResult


class ImpurityInput(BaseModel):
    """Impurity configuration row supplied by the client."""

    name: str = Field(..., min_length=1)
    value: float | None = Field(None, ge=0)
    threshold: float | None = Field(None, ge=0)
    penalty: float | None = Field(None)

    @field_validator("name")
    @classmethod
    def _normalise_name(cls, value: str) -> str:
        return value.strip()


class ProfitabilityCalculationRequest(BaseModel):
    """Request payload for profitability calculations."""

    metal: str = Field(..., min_length=1)
    ore_tonnage: PositiveFloat
    head_grade_pct: float = Field(..., gt=0, le=100)
    recovery_pct: float = Field(..., gt=0, le=100)
    payable_pct: float | None = Field(None, gt=0, le=100)
    reference_price: PositiveFloat
    treatment_charge: float = Field(0, ge=0)
    smelting_charge: float = Field(0, ge=0)
    moisture_pct: float = Field(0, ge=0, le=100)
    moisture_threshold_pct: float | None = Field(None, ge=0, le=100)
    moisture_penalty_per_pct: float | None = None
    premiums: float = Field(0)
    fx_rate: PositiveFloat = Field(1)
    currency_code: str | None = Field(None, min_length=3, max_length=3)
    opex: float = Field(0, ge=0)
    sustaining_capex: float = Field(0, ge=0)
    capex: float = Field(0, ge=0)
    discount_rate: float | None = Field(None, ge=0, le=100)
    periods: int = Field(10, ge=1, le=120)
    impurities: List[ImpurityInput] = Field(default_factory=list)

    @field_validator("currency_code")
    @classmethod
    def _uppercase_currency(cls, value: str | None) -> str | None:
        if value is None:
            return None
        return value.strip().upper()

    @field_validator("metal")
    @classmethod
    def _normalise_metal(cls, value: str) -> str:
        return value.strip().lower()


class ProfitabilityCosts(BaseModel):
    """Aggregated cost components for profitability output."""

    opex_total: float
    sustaining_capex_total: float
    capex: float


class ProfitabilityMetrics(BaseModel):
    """Financial KPIs yielded by the profitability calculation."""

    npv: float | None
    irr: float | None
    payback_period: float | None
    margin: float | None


class CashFlowEntry(BaseModel):
    """Normalized cash flow row for reporting and charting."""

    period: int
    revenue: float
    opex: float
    sustaining_capex: float
    net: float


class ProfitabilityCalculationResult(BaseModel):
    """Response body summarizing profitability calculation outputs."""

    pricing: PricingResult
    costs: ProfitabilityCosts
    metrics: ProfitabilityMetrics
    cash_flows: list[CashFlowEntry]
    currency: str | None


class CapexComponentInput(BaseModel):
    """Capex component entry supplied by the UI."""

    id: int | None = Field(default=None, ge=1)
    name: str = Field(..., min_length=1)
    category: str = Field(..., min_length=1)
    amount: float = Field(..., ge=0)
    currency: str | None = Field(None, min_length=3, max_length=3)
    spend_year: int | None = Field(None, ge=0, le=120)
    notes: str | None = Field(None, max_length=500)

    @field_validator("currency")
    @classmethod
    def _uppercase_currency(cls, value: str | None) -> str | None:
        if value is None:
            return None
        return value.strip().upper()

    @field_validator("category")
    @classmethod
    def _normalise_category(cls, value: str) -> str:
        return value.strip().lower()

    @field_validator("name")
    @classmethod
    def _trim_name(cls, value: str) -> str:
        return value.strip()


class CapexParameters(BaseModel):
    """Global parameters applied to capex calculations."""

    currency_code: str | None = Field(None, min_length=3, max_length=3)
    contingency_pct: float | None = Field(0, ge=0, le=100)
    discount_rate_pct: float | None = Field(None, ge=0, le=100)
    evaluation_horizon_years: int | None = Field(10, ge=1, le=100)

    @field_validator("currency_code")
    @classmethod
    def _uppercase_currency(cls, value: str | None) -> str | None:
        if value is None:
            return None
        return value.strip().upper()


class CapexCalculationOptions(BaseModel):
    """Optional behaviour flags for capex calculations."""

    persist: bool = False


class CapexCalculationRequest(BaseModel):
    """Request payload for capex aggregation."""

    components: List[CapexComponentInput] = Field(default_factory=list)
    parameters: CapexParameters = Field(
        default_factory=CapexParameters,  # type: ignore[arg-type]
    )
    options: CapexCalculationOptions = Field(
        default_factory=CapexCalculationOptions,  # type: ignore[arg-type]
    )


class CapexCategoryBreakdown(BaseModel):
    """Breakdown entry describing category totals."""

    category: str
    amount: float = Field(..., ge=0)
    share: float | None = Field(None, ge=0, le=100)


class CapexTotals(BaseModel):
    """Aggregated totals for capex workflows."""

    overall: float = Field(..., ge=0)
    contingency_pct: float = Field(0, ge=0, le=100)
    contingency_amount: float = Field(..., ge=0)
    with_contingency: float = Field(..., ge=0)
    by_category: List[CapexCategoryBreakdown] = Field(default_factory=list)


class CapexTimelineEntry(BaseModel):
    """Spend profile entry grouped by year."""

    year: int
    spend: float = Field(..., ge=0)
    cumulative: float = Field(..., ge=0)


class CapexCalculationResult(BaseModel):
    """Response body for capex calculations."""

    totals: CapexTotals
    timeline: List[CapexTimelineEntry] = Field(default_factory=list)
    components: List[CapexComponentInput] = Field(default_factory=list)
    parameters: CapexParameters
    options: CapexCalculationOptions
    currency: str | None


class OpexComponentInput(BaseModel):
    """Opex component entry supplied by the UI."""

    id: int | None = Field(default=None, ge=1)
    name: str = Field(..., min_length=1)
    category: str = Field(..., min_length=1)
    unit_cost: float = Field(..., ge=0)
    quantity: float = Field(..., ge=0)
    frequency: str = Field(..., min_length=1)
    currency: str | None = Field(None, min_length=3, max_length=3)
    period_start: int | None = Field(None, ge=0, le=240)
    period_end: int | None = Field(None, ge=0, le=240)
    notes: str | None = Field(None, max_length=500)

    @field_validator("currency")
    @classmethod
    def _uppercase_currency(cls, value: str | None) -> str | None:
        if value is None:
            return None
        return value.strip().upper()

    @field_validator("category")
    @classmethod
    def _normalise_category(cls, value: str) -> str:
        return value.strip().lower()

    @field_validator("frequency")
    @classmethod
    def _normalise_frequency(cls, value: str) -> str:
        return value.strip().lower()

    @field_validator("name")
    @classmethod
    def _trim_name(cls, value: str) -> str:
        return value.strip()


class OpexParameters(BaseModel):
    """Global parameters applied to opex calculations."""

    currency_code: str | None = Field(None, min_length=3, max_length=3)
    escalation_pct: float | None = Field(None, ge=0, le=100)
    discount_rate_pct: float | None = Field(None, ge=0, le=100)
    evaluation_horizon_years: int | None = Field(10, ge=1, le=100)
    apply_escalation: bool = True

    @field_validator("currency_code")
    @classmethod
    def _uppercase_currency(cls, value: str | None) -> str | None:
        if value is None:
            return None
        return value.strip().upper()


class OpexOptions(BaseModel):
    """Optional behaviour flags for opex calculations."""

    persist: bool = False
    snapshot_notes: str | None = Field(None, max_length=500)


class OpexCalculationRequest(BaseModel):
    """Request payload for opex aggregation."""

    components: List[OpexComponentInput] = Field(default_factory=list)
    parameters: OpexParameters = Field(
        default_factory=OpexParameters,  # type: ignore[arg-type]
    )
    options: OpexOptions = Field(
        default_factory=OpexOptions,  # type: ignore[arg-type]
    )


class OpexCategoryBreakdown(BaseModel):
    """Category breakdown for opex totals."""

    category: str
    annual_cost: float = Field(..., ge=0)
    share: float | None = Field(None, ge=0, le=100)


class OpexTimelineEntry(BaseModel):
    """Timeline entry representing cost over evaluation periods."""

    period: int
    base_cost: float = Field(..., ge=0)
    escalated_cost: float | None = Field(None, ge=0)


class OpexMetrics(BaseModel):
    """Derived KPIs for opex outputs."""

    annual_average: float | None
    cost_per_ton: float | None


class OpexTotals(BaseModel):
    """Aggregated totals for opex."""

    overall_annual: float = Field(..., ge=0)
    escalated_total: float | None = Field(None, ge=0)
    escalation_pct: float | None = Field(None, ge=0, le=100)
    by_category: List[OpexCategoryBreakdown] = Field(default_factory=list)


class OpexCalculationResult(BaseModel):
    """Response body summarising opex calculations."""

    totals: OpexTotals
    timeline: List[OpexTimelineEntry] = Field(default_factory=list)
    metrics: OpexMetrics
    components: List[OpexComponentInput] = Field(default_factory=list)
    parameters: OpexParameters
    options: OpexOptions
    currency: str | None


__all__ = [
    "ImpurityInput",
    "ProfitabilityCalculationRequest",
    "ProfitabilityCosts",
    "ProfitabilityMetrics",
    "CashFlowEntry",
    "ProfitabilityCalculationResult",
    "CapexComponentInput",
    "CapexParameters",
    "CapexCalculationOptions",
    "CapexCalculationRequest",
    "CapexCategoryBreakdown",
    "CapexTotals",
    "CapexTimelineEntry",
    "CapexCalculationResult",
    "OpexComponentInput",
    "OpexParameters",
    "OpexOptions",
    "OpexCalculationRequest",
    "OpexCategoryBreakdown",
    "OpexTimelineEntry",
    "OpexMetrics",
    "OpexTotals",
    "OpexCalculationResult",
    "ValidationError",
]
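The capex schemas above only declare shapes; the snippet below sketches one plausible aggregation that fills `CapexTotals`-style fields from component rows. Plain dicts stand in for the Pydantic models, and the rules (contingency as a flat percentage of the overall total, category shares as percentages of that total) are assumptions for illustration, not the service implementation:

```python
from collections import defaultdict

def aggregate_capex(components, contingency_pct=0.0):
    # Sum component amounts, then derive contingency and category shares
    # in the shape of CapexTotals / CapexCategoryBreakdown.
    overall = sum(c["amount"] for c in components)
    contingency_amount = overall * contingency_pct / 100
    by_category = defaultdict(float)
    for c in components:
        by_category[c["category"].strip().lower()] += c["amount"]
    breakdown = [
        {"category": cat, "amount": amt,
         "share": (amt / overall * 100) if overall else None}
        for cat, amt in sorted(by_category.items())
    ]
    return {
        "overall": overall,
        "contingency_pct": contingency_pct,
        "contingency_amount": contingency_amount,
        "with_contingency": overall + contingency_amount,
        "by_category": breakdown,
    }

totals = aggregate_capex(
    [
        {"name": "Crusher", "category": "Processing", "amount": 800.0},
        {"name": "Haul road", "category": "infrastructure", "amount": 200.0},
    ],
    contingency_pct=10.0,
)
print(totals["with_contingency"])  # 1100.0
```

Note the category normalisation (`strip().lower()`) mirrors the `_normalise_category` validator, so "Processing" and "processing" roll up into one breakdown row.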

schemas/navigation.py Normal file

@@ -0,0 +1,36 @@
from __future__ import annotations

from datetime import datetime
from typing import List

from pydantic import BaseModel, Field


class NavigationLinkSchema(BaseModel):
    id: int
    label: str
    href: str
    match_prefix: str | None = Field(default=None)
    icon: str | None = Field(default=None)
    tooltip: str | None = Field(default=None)
    is_external: bool = Field(default=False)
    children: List["NavigationLinkSchema"] = Field(default_factory=list)


class NavigationGroupSchema(BaseModel):
    id: int
    label: str
    icon: str | None = Field(default=None)
    tooltip: str | None = Field(default=None)
    links: List[NavigationLinkSchema] = Field(default_factory=list)


class NavigationSidebarResponse(BaseModel):
    groups: List[NavigationGroupSchema]
    roles: List[str] = Field(default_factory=list)
    generated_at: datetime


NavigationLinkSchema.model_rebuild()
NavigationGroupSchema.model_rebuild()
NavigationSidebarResponse.model_rebuild()


@@ -0,0 +1,112 @@
"""Utility script to verify key authenticated routes respond without errors."""
from __future__ import annotations

import json
import os
import sys
import urllib.parse
from http.client import HTTPConnection
from http.cookies import SimpleCookie
from typing import Dict, List, Tuple

HOST = "127.0.0.1"
PORT = 8000

cookies: Dict[str, str] = {}


def _update_cookies(headers: List[Tuple[str, str]]) -> None:
    for name, value in headers:
        if name.lower() != "set-cookie":
            continue
        cookie = SimpleCookie()
        cookie.load(value)
        for key, morsel in cookie.items():
            cookies[key] = morsel.value


def _cookie_header() -> str | None:
    if not cookies:
        return None
    return "; ".join(f"{key}={value}" for key, value in cookies.items())
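The cookie-jar helpers rely on `http.cookies.SimpleCookie` to parse `Set-Cookie` values, keeping only the name/value pair and ignoring attributes like `Path` and `HttpOnly`. A self-contained check of that behavior:

```python
from http.cookies import SimpleCookie

cookies = {}

def update_cookies(set_cookie_values):
    # Mirrors _update_cookies: merge each Set-Cookie header into the jar.
    # SimpleCookie exposes one morsel per cookie name; attributes such as
    # Path and HttpOnly become morsel attributes, not separate entries.
    for value in set_cookie_values:
        cookie = SimpleCookie()
        cookie.load(value)
        for key, morsel in cookie.items():
            cookies[key] = morsel.value

update_cookies(["session=abc123; Path=/; HttpOnly", "csrf=xyz; Path=/"])
header = "; ".join(f"{k}={v}" for k, v in cookies.items())
print(header)  # session=abc123; csrf=xyz
```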
def request(method: str, path: str, *, body: bytes | None = None, headers: Dict[str, str] | None = None) -> Tuple[int, Dict[str, str], bytes]:
    conn = HTTPConnection(HOST, PORT, timeout=10)
    prepared_headers = {"User-Agent": "route-checker"}
    if headers:
        prepared_headers.update(headers)
    cookie_header = _cookie_header()
    if cookie_header:
        prepared_headers["Cookie"] = cookie_header
    conn.request(method, path, body=body, headers=prepared_headers)
    resp = conn.getresponse()
    payload = resp.read()
    status = resp.status
    reason = resp.reason
    response_headers = {name: value for name, value in resp.getheaders()}
    _update_cookies(list(resp.getheaders()))
    conn.close()
    print(f"{method} {path} -> {status} {reason}")
    return status, response_headers, payload


def main() -> int:
    status, _, _ = request("GET", "/login")
    if status != 200:
        print("Unexpected status for GET /login", file=sys.stderr)
        return 1
    admin_username = os.getenv("CALMINER_SEED_ADMIN_USERNAME", "admin")
    admin_password = os.getenv("CALMINER_SEED_ADMIN_PASSWORD", "M11ffpgm.")
    login_payload = urllib.parse.urlencode(
        {"username": admin_username, "password": admin_password}
    ).encode()
    status, headers, _ = request(
        "POST",
        "/login",
        body=login_payload,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    if status not in {200, 303}:
        print("Login failed", file=sys.stderr)
        return 1
    location = headers.get("Location", "/")
    redirect_path = urllib.parse.urlsplit(location).path or "/"
    request("GET", redirect_path)
    request("GET", "/")
    request("GET", "/projects/ui")
    status, headers, body = request(
        "GET",
        "/projects",
        headers={"Accept": "application/json"},
    )
    projects: List[dict] = []
    if headers.get("Content-Type", "").startswith("application/json"):
        projects = json.loads(body.decode())
    if projects:
        project_id = projects[0]["id"]
        request("GET", f"/projects/{project_id}/view")
        status, headers, body = request(
            "GET",
            f"/projects/{project_id}/scenarios",
            headers={"Accept": "application/json"},
        )
        scenarios: List[dict] = []
        if headers.get("Content-Type", "").startswith("application/json"):
            scenarios = json.loads(body.decode())
        if scenarios:
            scenario_id = scenarios[0]["id"]
request("GET", f"/scenarios/{scenario_id}/view")
print("Cookies:", cookies)
return 0
if __name__ == "__main__":
raise SystemExit(main())


@@ -0,0 +1,15 @@
from sqlalchemy import create_engine, text
from config.database import DATABASE_URL
engine = create_engine(DATABASE_URL, future=True)
sqls = [
"CREATE SEQUENCE IF NOT EXISTS users_id_seq;",
"ALTER TABLE users ALTER COLUMN id SET DEFAULT nextval('users_id_seq');",
"SELECT setval('users_id_seq', COALESCE((SELECT MAX(id) FROM users), 1));",
"ALTER SEQUENCE users_id_seq OWNED BY users.id;",
]
with engine.begin() as conn:
for s in sqls:
print('EXECUTING:', s)
conn.execute(text(s))
print('SEQUENCE fix applied')


@@ -1,9 +0,0 @@
#!/usr/bin/env sh
set -e
PYTHONPATH="/app:${PYTHONPATH}"
export PYTHONPATH
python -m scripts.run_migrations
exec "$@"

scripts/init_db.py (new file, 1468 lines)

File diff suppressed because it is too large.

scripts/reset_db.py (new file, 91 lines)

@@ -0,0 +1,91 @@
"""Utility to reset development Postgres schema artifacts.
This script drops managed tables and enum types created by `scripts.init_db`.
It is intended for local development only; it refuses to run if CALMINER_ENV
indicates production or staging. The operation is idempotent: missing objects
are ignored. Use with caution.
"""
from __future__ import annotations
import logging
import os
from dataclasses import dataclass
from typing import Iterable
from sqlalchemy import text
from sqlalchemy.engine import Engine
from config.database import DATABASE_URL
from scripts.init_db import ENUM_DEFINITIONS, _create_engine
logger = logging.getLogger(__name__)
@dataclass(slots=True)
class ResetOptions:
drop_tables: bool = True
drop_enums: bool = True
MANAGED_TABLES: tuple[str, ...] = (
"simulation_parameters",
"financial_inputs",
"scenarios",
"projects",
"pricing_impurity_settings",
"pricing_metal_settings",
"pricing_settings",
"user_roles",
"users",
"roles",
)
FORBIDDEN_ENVIRONMENTS: set[str] = {"production", "staging", "prod", "stage"}
def _ensure_safe_environment() -> None:
env = os.getenv("CALMINER_ENV", "development").lower()
if env in FORBIDDEN_ENVIRONMENTS:
raise RuntimeError(
f"Refusing to reset database in environment '{env}'. "
"Set CALMINER_ENV to 'development' to proceed."
)
def _drop_tables(engine: Engine, tables: Iterable[str]) -> None:
if not tables:
return
with engine.begin() as conn:
for table in tables:
logger.info("Dropping table if exists: %s", table)
conn.execute(text(f"DROP TABLE IF EXISTS {table} CASCADE"))
def _drop_enums(engine: Engine, enum_names: Iterable[str]) -> None:
if not enum_names:
return
with engine.begin() as conn:
for enum_name in enum_names:
logger.info("Dropping enum type if exists: %s", enum_name)
conn.execute(text(f"DROP TYPE IF EXISTS {enum_name} CASCADE"))
def reset_database(*, options: ResetOptions | None = None, database_url: str | None = None) -> None:
"""Drop managed tables and enums for a clean slate."""
_ensure_safe_environment()
opts = options or ResetOptions()
engine = _create_engine(database_url or DATABASE_URL)
if opts.drop_tables:
_drop_tables(engine, MANAGED_TABLES)
if opts.drop_enums:
_drop_enums(engine, ENUM_DEFINITIONS.keys())
logger.info("Database reset complete")
if __name__ == "__main__":
logging.basicConfig(level=logging.INFO)
reset_database()
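The environment guard is the key safety mechanism here: the reset refuses to run whenever `CALMINER_ENV` names a protected environment. A standalone sketch mirroring `_ensure_safe_environment`:

```python
import os

# Mirrors _ensure_safe_environment above: refuse to run destructive
# resets unless CALMINER_ENV points at local development.
FORBIDDEN_ENVIRONMENTS = {"production", "staging", "prod", "stage"}

def ensure_safe_environment(env_var: str = "CALMINER_ENV") -> None:
    env = os.getenv(env_var, "development").lower()
    if env in FORBIDDEN_ENVIRONMENTS:
        raise RuntimeError(
            f"Refusing to reset database in environment '{env}'. "
            "Set CALMINER_ENV to 'development' to proceed."
        )

os.environ["CALMINER_ENV"] = "development"
ensure_safe_environment()  # no-op in development

os.environ["CALMINER_ENV"] = "Production"  # case-insensitive match
try:
    ensure_safe_environment()
except RuntimeError as exc:
    print(exc)
```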


@@ -1,42 +0,0 @@
"""Utility for applying Alembic migrations before application startup."""
from __future__ import annotations
import logging
from pathlib import Path
from alembic import command
from alembic.config import Config
from dotenv import load_dotenv
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
def _load_env() -> None:
"""Ensure environment variables from .env are available."""
load_dotenv()
def _alembic_config(project_root: Path) -> Config:
config_path = project_root / "alembic.ini"
if not config_path.exists():
raise FileNotFoundError(f"Missing alembic.ini at {config_path}")
config = Config(str(config_path))
config.set_main_option("script_location", str(project_root / "alembic"))
return config
def run_migrations(target_revision: str = "head") -> None:
"""Apply Alembic migrations up to the given revision."""
project_root = Path(__file__).resolve().parent.parent
_load_env()
config = _alembic_config(project_root)
logger.info("Applying database migrations up to %s", target_revision)
command.upgrade(config, target_revision)
logger.info("Database migrations applied successfully")
if __name__ == "__main__":
run_migrations()

scripts/verify_db.py (new file, 86 lines)

@@ -0,0 +1,86 @@
"""Verify DB initialization results: enums, roles, admin user, pricing_settings."""
from __future__ import annotations
import logging
from sqlalchemy import create_engine, text
from config.database import DATABASE_URL
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
ENUMS = [
'miningoperationtype',
'scenariostatus',
'financialcategory',
'costbucket',
'distributiontype',
'stochasticvariable',
'resourcetype',
]
SQL_CHECK_ENUM = "SELECT typname FROM pg_type WHERE typname = ANY(:names)"
SQL_ROLES = "SELECT id, name, display_name FROM roles ORDER BY id"
SQL_ADMIN = "SELECT id, email, username, is_active, is_superuser FROM users WHERE id = 1"
SQL_USER_ROLES = "SELECT user_id, role_id, granted_by FROM user_roles WHERE user_id = 1"
SQL_PRICING = "SELECT id, slug, name, default_currency FROM pricing_settings WHERE slug = 'default'"
def run():
engine = create_engine(DATABASE_URL, future=True)
with engine.connect() as conn:
print('Using DATABASE_URL:', DATABASE_URL)
# enums
res = conn.execute(text(SQL_CHECK_ENUM), dict(names=ENUMS)).fetchall()
found = [r[0] for r in res]
print('\nEnums found:')
for name in ENUMS:
print(f' {name}:', 'YES' if name in found else 'NO')
# roles
try:
roles = conn.execute(text(SQL_ROLES)).fetchall()
print('\nRoles:')
if roles:
for r in roles:
print(f' id={r.id} name={r.name} display_name={r.display_name}')
else:
print(' (no roles found)')
except Exception as e:
print('\nRoles query failed:', e)
# admin user
try:
admin = conn.execute(text(SQL_ADMIN)).fetchone()
print('\nAdmin user:')
if admin:
print(f' id={admin.id} email={admin.email} username={admin.username} is_active={admin.is_active} is_superuser={admin.is_superuser}')
else:
print(' (admin user not found)')
except Exception as e:
print('\nAdmin query failed:', e)
# user_roles
try:
ur = conn.execute(text(SQL_USER_ROLES)).fetchall()
print('\nUser roles for user_id=1:')
if ur:
for row in ur:
print(f' user_id={row.user_id} role_id={row.role_id} granted_by={row.granted_by}')
else:
print(' (no user_roles rows for user_id=1)')
except Exception as e:
print('\nUser_roles query failed:', e)
# pricing settings
try:
p = conn.execute(text(SQL_PRICING)).fetchone()
print('\nPricing settings (slug=default):')
if p:
print(f' id={p.id} slug={p.slug} name={p.name} default_currency={p.default_currency}')
else:
print(' (default pricing settings not found)')
except Exception as e:
print('\nPricing query failed:', e)
if __name__ == '__main__':
run()


@@ -1,10 +1,12 @@
"""Service layer utilities.""" """Service layer utilities."""
from .pricing import calculate_pricing, PricingInput, PricingMetadata, PricingResult from .pricing import calculate_pricing, PricingInput, PricingMetadata, PricingResult
from .calculations import calculate_profitability
__all__ = [ __all__ = [
"calculate_pricing", "calculate_pricing",
"PricingInput", "PricingInput",
"PricingMetadata", "PricingMetadata",
"PricingResult", "PricingResult",
"calculate_profitability",
] ]


@@ -162,12 +162,21 @@ def bootstrap_pricing_settings(
     uow.set_project_pricing_settings(project, default_settings)
     assigned += 1

-    logger.info(
-        "Pricing bootstrap result: slug=%s created=%s updated_fields=%s impurity_upserts=%s projects_assigned=%s",
-        seed_result.settings.slug,
-        seed_result.created,
-        seed_result.updated_fields,
-        seed_result.impurity_upserts,
-        assigned,
-    )
-    return PricingBootstrapResult(seed=seed_result, projects_assigned=assigned)
+    # Capture logging-safe primitives while the UnitOfWork (and session)
+    # are still active to avoid DetachedInstanceError when accessing ORM
+    # instances outside the session scope.
+    seed_slug = seed_result.settings.slug if seed_result and seed_result.settings else None
+    seed_created = getattr(seed_result, "created", None)
+    seed_updated_fields = getattr(seed_result, "updated_fields", None)
+    seed_impurity_upserts = getattr(seed_result, "impurity_upserts", None)
+
+    logger.info(
+        "Pricing bootstrap result: slug=%s created=%s updated_fields=%s impurity_upserts=%s projects_assigned=%s",
+        seed_slug,
+        seed_created,
+        seed_updated_fields,
+        seed_impurity_upserts,
+        assigned,
+    )
+    return PricingBootstrapResult(seed=seed_result, projects_assigned=assigned)

services/calculations.py (new file, 535 lines)

@@ -0,0 +1,535 @@
"""Service functions for financial calculations."""
from __future__ import annotations
from collections import defaultdict
from statistics import fmean
from services.currency import CurrencyValidationError, normalise_currency
from services.exceptions import (
CapexValidationError,
OpexValidationError,
ProfitabilityValidationError,
)
from services.financial import (
CashFlow,
ConvergenceError,
PaybackNotReachedError,
internal_rate_of_return,
net_present_value,
payback_period,
)
from services.pricing import PricingInput, PricingMetadata, PricingResult, calculate_pricing
from schemas.calculations import (
CapexCalculationRequest,
CapexCalculationResult,
CapexCategoryBreakdown,
CapexComponentInput,
CapexTotals,
CapexTimelineEntry,
CashFlowEntry,
OpexCalculationRequest,
OpexCalculationResult,
OpexCategoryBreakdown,
OpexComponentInput,
OpexMetrics,
OpexParameters,
OpexTotals,
OpexTimelineEntry,
ProfitabilityCalculationRequest,
ProfitabilityCalculationResult,
ProfitabilityCosts,
ProfitabilityMetrics,
)
_FREQUENCY_MULTIPLIER = {
"daily": 365,
"weekly": 52,
"monthly": 12,
"quarterly": 4,
"annually": 1,
}
def _build_pricing_input(
request: ProfitabilityCalculationRequest,
) -> PricingInput:
"""Construct a pricing input instance including impurity overrides."""
impurity_values: dict[str, float] = {}
impurity_thresholds: dict[str, float] = {}
impurity_penalties: dict[str, float] = {}
for impurity in request.impurities:
code = impurity.name.strip()
if not code:
continue
code = code.upper()
if impurity.value is not None:
impurity_values[code] = float(impurity.value)
if impurity.threshold is not None:
impurity_thresholds[code] = float(impurity.threshold)
if impurity.penalty is not None:
impurity_penalties[code] = float(impurity.penalty)
pricing_input = PricingInput(
metal=request.metal,
ore_tonnage=request.ore_tonnage,
head_grade_pct=request.head_grade_pct,
recovery_pct=request.recovery_pct,
payable_pct=request.payable_pct,
reference_price=request.reference_price,
treatment_charge=request.treatment_charge,
smelting_charge=request.smelting_charge,
moisture_pct=request.moisture_pct,
moisture_threshold_pct=request.moisture_threshold_pct,
moisture_penalty_per_pct=request.moisture_penalty_per_pct,
impurity_ppm=impurity_values,
impurity_thresholds=impurity_thresholds,
impurity_penalty_per_ppm=impurity_penalties,
premiums=request.premiums,
fx_rate=request.fx_rate,
currency_code=request.currency_code,
)
return pricing_input
def _generate_cash_flows(
*,
periods: int,
net_per_period: float,
capex: float,
) -> tuple[list[CashFlow], list[CashFlowEntry]]:
"""Create cash flow structures for financial metric calculations."""
cash_flow_models: list[CashFlow] = [
CashFlow(amount=-capex, period_index=0)
]
cash_flow_entries: list[CashFlowEntry] = [
CashFlowEntry(
period=0,
revenue=0.0,
opex=0.0,
sustaining_capex=0.0,
net=-capex,
)
]
for period in range(1, periods + 1):
cash_flow_models.append(
CashFlow(amount=net_per_period, period_index=period))
cash_flow_entries.append(
CashFlowEntry(
period=period,
revenue=0.0,
opex=0.0,
sustaining_capex=0.0,
net=net_per_period,
)
)
return cash_flow_models, cash_flow_entries
def calculate_profitability(
request: ProfitabilityCalculationRequest,
*,
metadata: PricingMetadata,
) -> ProfitabilityCalculationResult:
"""Calculate profitability metrics using pricing inputs and cost data."""
if request.periods <= 0:
raise ProfitabilityValidationError(
"Evaluation periods must be at least 1.", ["periods"]
)
pricing_input = _build_pricing_input(request)
try:
pricing_result: PricingResult = calculate_pricing(
pricing_input, metadata=metadata
)
except CurrencyValidationError as exc:
raise ProfitabilityValidationError(
str(exc), ["currency_code"]) from exc
periods = request.periods
revenue_total = float(pricing_result.net_revenue)
revenue_per_period = revenue_total / periods
processing_total = float(request.opex) * periods
sustaining_total = float(request.sustaining_capex) * periods
capex = float(request.capex)
net_per_period = (
revenue_per_period
- float(request.opex)
- float(request.sustaining_capex)
)
cash_flow_models, cash_flow_entries = _generate_cash_flows(
periods=periods,
net_per_period=net_per_period,
capex=capex,
)
# Update per-period entries to include explicit costs for presentation
for entry in cash_flow_entries[1:]:
entry.revenue = revenue_per_period
entry.opex = float(request.opex)
entry.sustaining_capex = float(request.sustaining_capex)
entry.net = net_per_period
discount_rate = (request.discount_rate or 0.0) / 100.0
npv_value = net_present_value(discount_rate, cash_flow_models)
try:
irr_value = internal_rate_of_return(cash_flow_models) * 100.0
except (ValueError, ZeroDivisionError, ConvergenceError):
irr_value = None
try:
payback_value = payback_period(cash_flow_models)
except (ValueError, PaybackNotReachedError):
payback_value = None
total_costs = processing_total + sustaining_total + capex
total_net = revenue_total - total_costs
if revenue_total == 0:
margin_value = None
else:
margin_value = (total_net / revenue_total) * 100.0
currency = request.currency_code or pricing_result.currency
try:
currency = normalise_currency(currency)
except CurrencyValidationError as exc:
raise ProfitabilityValidationError(
str(exc), ["currency_code"]) from exc
costs = ProfitabilityCosts(
opex_total=processing_total,
sustaining_capex_total=sustaining_total,
capex=capex,
)
metrics = ProfitabilityMetrics(
npv=npv_value,
irr=irr_value,
payback_period=payback_value,
margin=margin_value,
)
return ProfitabilityCalculationResult(
pricing=pricing_result,
costs=costs,
metrics=metrics,
cash_flows=cash_flow_entries,
currency=currency,
)
def calculate_initial_capex(
request: CapexCalculationRequest,
) -> CapexCalculationResult:
"""Aggregate capex components into totals and timelines."""
if not request.components:
raise CapexValidationError(
"At least one capex component is required for calculation.",
["components"],
)
parameters = request.parameters
base_currency = parameters.currency_code
if base_currency:
try:
base_currency = normalise_currency(base_currency)
except CurrencyValidationError as exc:
raise CapexValidationError(
str(exc), ["parameters.currency_code"]
) from exc
overall = 0.0
category_totals: dict[str, float] = defaultdict(float)
timeline_totals: dict[int, float] = defaultdict(float)
normalised_components: list[CapexComponentInput] = []
for index, component in enumerate(request.components):
amount = float(component.amount)
overall += amount
category_totals[component.category] += amount
spend_year = component.spend_year or 0
timeline_totals[spend_year] += amount
component_currency = component.currency
if component_currency:
try:
component_currency = normalise_currency(component_currency)
except CurrencyValidationError as exc:
raise CapexValidationError(
str(exc), [f"components[{index}].currency"]
) from exc
if base_currency is None and component_currency:
base_currency = component_currency
elif (
base_currency is not None
and component_currency is not None
and component_currency != base_currency
):
raise CapexValidationError(
(
"Component currency does not match the global currency. "
f"Expected {base_currency}, got {component_currency}."
),
[f"components[{index}].currency"],
)
normalised_components.append(
CapexComponentInput(
id=component.id,
name=component.name,
category=component.category,
amount=amount,
currency=component_currency,
spend_year=component.spend_year,
notes=component.notes,
)
)
contingency_pct = float(parameters.contingency_pct or 0.0)
contingency_amount = overall * (contingency_pct / 100.0)
grand_total = overall + contingency_amount
category_breakdowns: list[CapexCategoryBreakdown] = []
if category_totals:
for category, total in sorted(category_totals.items()):
share = (total / overall * 100.0) if overall else None
category_breakdowns.append(
CapexCategoryBreakdown(
category=category,
amount=total,
share=share,
)
)
cumulative = 0.0
timeline_entries: list[CapexTimelineEntry] = []
for year, spend in sorted(timeline_totals.items()):
cumulative += spend
timeline_entries.append(
CapexTimelineEntry(year=year, spend=spend, cumulative=cumulative)
)
try:
currency = normalise_currency(base_currency) if base_currency else None
except CurrencyValidationError as exc:
raise CapexValidationError(
str(exc), ["parameters.currency_code"]
) from exc
totals = CapexTotals(
overall=overall,
contingency_pct=contingency_pct,
contingency_amount=contingency_amount,
with_contingency=grand_total,
by_category=category_breakdowns,
)
return CapexCalculationResult(
totals=totals,
timeline=timeline_entries,
components=normalised_components,
parameters=parameters,
options=request.options,
currency=currency,
)
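The capex aggregation above reduces to three steps: sum component amounts per category, compute each category's share of the overall total, and add a percentage contingency on top. A self-contained sketch with made-up component tuples of `(category, amount)`:

```python
from collections import defaultdict

# Illustrative components only; the real inputs are CapexComponentInput models.
components = [("equipment", 700.0), ("infrastructure", 200.0), ("equipment", 100.0)]
contingency_pct = 10.0

# Accumulate per-category and overall totals in one pass.
category_totals: dict[str, float] = defaultdict(float)
overall = 0.0
for category, amount in components:
    category_totals[category] += amount
    overall += amount

# Share of each category as a percentage of the overall spend.
shares = {c: t / overall * 100.0 for c, t in sorted(category_totals.items())}

# Contingency is applied to the pre-contingency total.
contingency_amount = overall * contingency_pct / 100.0
with_contingency = overall + contingency_amount

print(overall)          # 1000.0
print(shares)           # equipment ≈ 80%, infrastructure ≈ 20%
print(with_contingency)  # 1100.0
```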
def calculate_opex(
request: OpexCalculationRequest,
) -> OpexCalculationResult:
"""Aggregate opex components into annual totals and timeline."""
if not request.components:
raise OpexValidationError(
"At least one opex component is required for calculation.",
["components"],
)
parameters: OpexParameters = request.parameters
base_currency = parameters.currency_code
if base_currency:
try:
base_currency = normalise_currency(base_currency)
except CurrencyValidationError as exc:
raise OpexValidationError(
str(exc), ["parameters.currency_code"]
) from exc
evaluation_horizon = parameters.evaluation_horizon_years or 1
if evaluation_horizon <= 0:
raise OpexValidationError(
"Evaluation horizon must be at least 1 year.",
["parameters.evaluation_horizon_years"],
)
escalation_pct = float(parameters.escalation_pct or 0.0)
apply_escalation = bool(parameters.apply_escalation)
category_totals: dict[str, float] = defaultdict(float)
timeline_totals: dict[int, float] = defaultdict(float)
timeline_escalated: dict[int, float] = defaultdict(float)
normalised_components: list[OpexComponentInput] = []
max_period_end = evaluation_horizon
for index, component in enumerate(request.components):
frequency = component.frequency.lower()
multiplier = _FREQUENCY_MULTIPLIER.get(frequency)
if multiplier is None:
raise OpexValidationError(
f"Unsupported frequency '{component.frequency}'.",
[f"components[{index}].frequency"],
)
unit_cost = float(component.unit_cost)
quantity = float(component.quantity)
annual_cost = unit_cost * quantity * multiplier
period_start = component.period_start or 1
period_end = component.period_end or evaluation_horizon
if period_end < period_start:
raise OpexValidationError(
(
"Component period_end must be greater than or equal to "
"period_start."
),
[f"components[{index}].period_end"],
)
max_period_end = max(max_period_end, period_end)
component_currency = component.currency
if component_currency:
try:
component_currency = normalise_currency(component_currency)
except CurrencyValidationError as exc:
raise OpexValidationError(
str(exc), [f"components[{index}].currency"]
) from exc
if base_currency is None and component_currency:
base_currency = component_currency
elif (
base_currency is not None
and component_currency is not None
and component_currency != base_currency
):
raise OpexValidationError(
(
"Component currency does not match the global currency. "
f"Expected {base_currency}, got {component_currency}."
),
[f"components[{index}].currency"],
)
category_totals[component.category] += annual_cost
for period in range(period_start, period_end + 1):
timeline_totals[period] += annual_cost
normalised_components.append(
OpexComponentInput(
id=component.id,
name=component.name,
category=component.category,
unit_cost=unit_cost,
quantity=quantity,
frequency=frequency,
currency=component_currency,
period_start=period_start,
period_end=period_end,
notes=component.notes,
)
)
evaluation_horizon = max(evaluation_horizon, max_period_end)
try:
currency = normalise_currency(base_currency) if base_currency else None
except CurrencyValidationError as exc:
raise OpexValidationError(
str(exc), ["parameters.currency_code"]
) from exc
timeline_entries: list[OpexTimelineEntry] = []
escalated_values: list[float] = []
overall_annual = timeline_totals.get(1, 0.0)
escalated_total = 0.0
for period in range(1, evaluation_horizon + 1):
base_cost = timeline_totals.get(period, 0.0)
if apply_escalation:
factor = (1 + escalation_pct / 100.0) ** (period - 1)
else:
factor = 1.0
escalated_cost = base_cost * factor
timeline_escalated[period] = escalated_cost
escalated_total += escalated_cost
timeline_entries.append(
OpexTimelineEntry(
period=period,
base_cost=base_cost,
escalated_cost=escalated_cost if apply_escalation else None,
)
)
escalated_values.append(escalated_cost)
category_breakdowns: list[OpexCategoryBreakdown] = []
total_base = sum(category_totals.values())
for category, total in sorted(category_totals.items()):
share = (total / total_base * 100.0) if total_base else None
category_breakdowns.append(
OpexCategoryBreakdown(
category=category,
annual_cost=total,
share=share,
)
)
metrics = OpexMetrics(
annual_average=fmean(escalated_values) if escalated_values else None,
cost_per_ton=None,
)
totals = OpexTotals(
overall_annual=overall_annual,
escalated_total=escalated_total if apply_escalation else None,
escalation_pct=escalation_pct if apply_escalation else None,
by_category=category_breakdowns,
)
return OpexCalculationResult(
totals=totals,
timeline=timeline_entries,
metrics=metrics,
components=normalised_components,
parameters=parameters,
options=request.options,
currency=currency,
)
__all__ = [
"calculate_profitability",
"calculate_initial_capex",
"calculate_opex",
]
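The opex timeline applies compound escalation from period 2 onward, using `factor = (1 + escalation_pct / 100) ** (period - 1)`. A minimal sketch of that loop for a flat base cost (values are illustrative):

```python
# Mirrors the escalation loop in calculate_opex: period 1 is unescalated,
# later periods compound by the escalation percentage.
def escalated_timeline(base_cost: float, periods: int, escalation_pct: float) -> list[float]:
    return [
        base_cost * (1 + escalation_pct / 100.0) ** (period - 1)
        for period in range(1, periods + 1)
    ]

timeline = escalated_timeline(base_cost=100.0, periods=3, escalation_pct=5.0)
print([round(v, 2) for v in timeline])  # [100.0, 105.0, 110.25]
print(round(sum(timeline), 2))          # 315.25
```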


@@ -26,3 +26,36 @@ class ScenarioValidationError(Exception):
    def __str__(self) -> str:  # pragma: no cover - mirrors message for logging
        return self.message
@dataclass(eq=False)
class ProfitabilityValidationError(Exception):
"""Raised when profitability calculation inputs fail domain validation."""
message: str
field_errors: Sequence[str] | None = None
def __str__(self) -> str: # pragma: no cover - mirrors message for logging
return self.message
@dataclass(eq=False)
class CapexValidationError(Exception):
"""Raised when capex calculation inputs fail domain validation."""
message: str
field_errors: Sequence[str] | None = None
def __str__(self) -> str: # pragma: no cover - mirrors message for logging
return self.message
@dataclass(eq=False)
class OpexValidationError(Exception):
"""Raised when opex calculation inputs fail domain validation."""
message: str
field_errors: Sequence[str] | None = None
def __str__(self) -> str: # pragma: no cover - mirrors message for logging
return self.message

services/navigation.py (new file, 203 lines)

@@ -0,0 +1,203 @@
from __future__ import annotations
from dataclasses import dataclass, field
from typing import Iterable, List, Sequence
from fastapi import Request
from models.navigation import NavigationLink
from services.repositories import NavigationRepository
from services.session import AuthSession
@dataclass(slots=True)
class NavigationLinkDTO:
id: int
label: str
href: str
match_prefix: str | None
icon: str | None
tooltip: str | None
is_external: bool
children: List["NavigationLinkDTO"] = field(default_factory=list)
@dataclass(slots=True)
class NavigationGroupDTO:
id: int
label: str
icon: str | None
tooltip: str | None
links: List[NavigationLinkDTO] = field(default_factory=list)
@dataclass(slots=True)
class NavigationSidebarDTO:
groups: List[NavigationGroupDTO]
roles: tuple[str, ...]
class NavigationService:
"""Build navigation payloads filtered for the current session."""
def __init__(self, repository: NavigationRepository) -> None:
self._repository = repository
def build_sidebar(
self,
*,
session: AuthSession,
request: Request | None = None,
include_disabled: bool = False,
) -> NavigationSidebarDTO:
roles = self._collect_roles(session)
groups = self._repository.list_groups_with_links(
include_disabled=include_disabled
)
context = self._derive_context(request)
mapped_groups: List[NavigationGroupDTO] = []
for group in groups:
if not include_disabled and not group.is_enabled:
continue
mapped_links = self._map_links(
group.links,
roles,
request=request,
include_disabled=include_disabled,
context=context,
)
if not mapped_links and not include_disabled:
continue
mapped_groups.append(
NavigationGroupDTO(
id=group.id,
label=group.label,
icon=group.icon,
tooltip=group.tooltip,
links=mapped_links,
)
)
return NavigationSidebarDTO(groups=mapped_groups, roles=roles)
def _map_links(
self,
links: Sequence[NavigationLink],
roles: Iterable[str],
*,
request: Request | None,
include_disabled: bool,
context: dict[str, str | None],
include_children: bool = False,
) -> List[NavigationLinkDTO]:
resolved_roles = tuple(roles)
mapped: List[NavigationLinkDTO] = []
for link in sorted(links, key=lambda x: (x.sort_order, x.id)):
if not include_children and link.parent_link_id is not None:
continue
if not include_disabled and (not link.is_enabled):
continue
if not self._link_visible(link, resolved_roles, include_disabled):
continue
href = self._resolve_href(link, request=request, context=context)
if not href:
continue
children = self._map_links(
link.children,
resolved_roles,
request=request,
include_disabled=include_disabled,
context=context,
include_children=True,
)
match_prefix = link.match_prefix or href
mapped.append(
NavigationLinkDTO(
id=link.id,
label=link.label,
href=href,
match_prefix=match_prefix,
icon=link.icon,
tooltip=link.tooltip,
is_external=link.is_external,
children=children,
)
)
return mapped
@staticmethod
def _collect_roles(session: AuthSession) -> tuple[str, ...]:
roles = tuple((session.role_slugs or ()) if session else ())
if session and session.is_authenticated:
return roles
if "anonymous" in roles:
return roles
return roles + ("anonymous",)
@staticmethod
def _derive_context(request: Request | None) -> dict[str, str | None]:
if request is None:
return {"project_id": None, "scenario_id": None}
project_id = request.path_params.get(
"project_id") if hasattr(request, "path_params") else None
scenario_id = request.path_params.get(
"scenario_id") if hasattr(request, "path_params") else None
if not project_id:
project_id = request.query_params.get("project_id")
if not scenario_id:
scenario_id = request.query_params.get("scenario_id")
return {"project_id": project_id, "scenario_id": scenario_id}
def _resolve_href(
self,
link: NavigationLink,
*,
request: Request | None,
context: dict[str, str | None],
) -> str | None:
if link.route_name:
if request is None:
fallback = link.href_override
if fallback:
return fallback
# Fallback to route name when no request is available
return f"/{link.route_name.replace('.', '/')}"
requires_context = link.slug in {
"profitability",
"profitability-calculator",
"opex",
"capex",
}
if requires_context:
project_id = context.get("project_id")
scenario_id = context.get("scenario_id")
if project_id and scenario_id:
try:
return str(
request.url_for(
link.route_name,
project_id=project_id,
scenario_id=scenario_id,
)
)
except Exception: # pragma: no cover - defensive
pass
try:
return str(request.url_for(link.route_name))
except Exception: # pragma: no cover - defensive
return link.href_override
return link.href_override
@staticmethod
def _link_visible(
link: NavigationLink,
roles: Iterable[str],
include_disabled: bool,
) -> bool:
role_tuple = tuple(roles)
if not include_disabled and not link.is_enabled:
return False
if not link.required_roles:
return True
role_set = set(role_tuple)
return any(role in role_set for role in link.required_roles)
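The `_link_visible` check above boils down to: disabled links are hidden unless explicitly included, links without `required_roles` are public, and otherwise any overlap between the session's roles and the link's required roles grants visibility. A standalone sketch, with a toy `Link` dataclass standing in for the `NavigationLink` ORM model:

```python
from dataclasses import dataclass

@dataclass
class Link:
    # Stand-in for NavigationLink; only the fields the check needs.
    label: str
    required_roles: tuple[str, ...] = ()
    is_enabled: bool = True

def link_visible(link: Link, roles: tuple[str, ...], include_disabled: bool = False) -> bool:
    if not include_disabled and not link.is_enabled:
        return False
    if not link.required_roles:
        return True  # no role requirement: visible to everyone
    role_set = set(roles)
    return any(role in role_set for role in link.required_roles)

print(link_visible(Link("Dashboard"), ("anonymous",)))               # True
print(link_visible(Link("Admin", ("admin",)), ("anonymous",)))       # False
print(link_visible(Link("Admin", ("admin",)), ("admin", "editor")))  # True
print(link_visible(Link("Hidden", is_enabled=False), ("admin",)))    # False
```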


@@ -8,6 +8,9 @@ import math
from typing import Mapping, Sequence
from urllib.parse import urlencode

import plotly.graph_objects as go
import plotly.io as pio

from fastapi import Request
from models import FinancialCategory, Project, Scenario
@@ -515,6 +518,7 @@ class ReportingService:
                    "label": "Download JSON",
                }
            ],
            "chart_data": self._generate_npv_comparison_chart(reports),
        }

    def build_scenario_comparison_context(
@@ -611,8 +615,64 @@ class ReportingService:
                    "label": "Download JSON",
                }
            ],
            "chart_data": self._generate_distribution_histogram(report.monte_carlo) if report.monte_carlo else "{}",
        }
def _generate_npv_comparison_chart(self, reports: Sequence[ScenarioReport]) -> str:
"""Generate Plotly chart JSON for NPV comparison across scenarios."""
scenario_names = []
npv_values = []
for report in reports:
scenario_names.append(report.scenario.name)
npv_values.append(report.deterministic.npv or 0)
fig = go.Figure(data=[
go.Bar(
x=scenario_names,
y=npv_values,
name='NPV',
marker_color='lightblue'
)
])
fig.update_layout(
title="NPV Comparison Across Scenarios",
xaxis_title="Scenario",
yaxis_title="NPV",
showlegend=False
)
return pio.to_json(fig) or "{}"
def _generate_distribution_histogram(self, monte_carlo: ScenarioMonteCarloResult) -> str:
"""Generate Plotly histogram for Monte Carlo distribution."""
if not monte_carlo.available or not monte_carlo.result or not monte_carlo.result.samples:
return "{}"
# Get NPV samples
npv_samples = monte_carlo.result.samples.get(SimulationMetric.NPV, [])
if len(npv_samples) == 0:
return "{}"
fig = go.Figure(data=[
go.Histogram(
x=npv_samples,
nbinsx=50,
name='NPV Distribution',
marker_color='lightgreen'
)
])
fig.update_layout(
title="Monte Carlo NPV Distribution",
xaxis_title="NPV",
yaxis_title="Frequency",
showlegend=False
)
return pio.to_json(fig) or "{}"
def _build_cash_flows(scenario: Scenario) -> tuple[list[CashFlow], ScenarioFinancialTotals]:
cash_flows: list[CashFlow] = []


@@ -15,8 +15,16 @@ from models import (
PricingImpuritySettings,
PricingMetalSettings,
PricingSettings,
ProjectCapexSnapshot,
ProjectProfitability,
ProjectOpexSnapshot,
NavigationGroup,
NavigationLink,
Role,
Scenario,
ScenarioCapexSnapshot,
ScenarioProfitability,
ScenarioOpexSnapshot,
ScenarioStatus,
SimulationParameter,
User,
@@ -27,6 +35,59 @@ from services.export_query import ProjectExportFilters, ScenarioExportFilters
from services.pricing import PricingMetadata
def _enum_value(e):
"""Return the underlying value for Enum members, otherwise return as-is."""
return getattr(e, "value", e)
class NavigationRepository:
"""Persistence operations for navigation metadata."""
def __init__(self, session: Session) -> None:
self.session = session
def list_groups_with_links(
self,
*,
include_disabled: bool = False,
) -> Sequence[NavigationGroup]:
stmt = (
select(NavigationGroup)
.options(
selectinload(NavigationGroup.links)
.selectinload(NavigationLink.children)
)
.order_by(NavigationGroup.sort_order, NavigationGroup.id)
)
if not include_disabled:
stmt = stmt.where(NavigationGroup.is_enabled.is_(True))
return self.session.execute(stmt).scalars().all()
def get_group_by_slug(self, slug: str) -> NavigationGroup | None:
stmt = select(NavigationGroup).where(NavigationGroup.slug == slug)
return self.session.execute(stmt).scalar_one_or_none()
def get_link_by_slug(
self,
slug: str,
*,
group_id: int | None = None,
) -> NavigationLink | None:
stmt = select(NavigationLink).where(NavigationLink.slug == slug)
if group_id is not None:
stmt = stmt.where(NavigationLink.group_id == group_id)
return self.session.execute(stmt).scalar_one_or_none()
def add_group(self, group: NavigationGroup) -> NavigationGroup:
self.session.add(group)
self.session.flush()
return group
def add_link(self, link: NavigationLink) -> NavigationLink:
self.session.add(link)
self.session.flush()
return link
class ProjectRepository:
"""Persistence operations for Project entities."""
@@ -202,7 +263,9 @@ class ScenarioRepository:
return self.session.execute(stmt).scalar_one()
def count_by_status(self, status: ScenarioStatus) -> int:
status_val = _enum_value(status)
stmt = select(func.count(Scenario.id)).where(
Scenario.status == status_val)
return self.session.execute(stmt).scalar_one()
def recent(self, limit: int = 5, *, with_project: bool = False) -> Sequence[Scenario]:
@@ -219,9 +282,10 @@ class ScenarioRepository:
limit: int | None = None,
with_project: bool = False,
) -> Sequence[Scenario]:
status_val = _enum_value(status)
stmt = (
select(Scenario)
.where(Scenario.status == status_val)
.order_by(Scenario.updated_at.desc())
)
if with_project:
@@ -311,7 +375,11 @@ class ScenarioRepository:
stmt = stmt.where(Scenario.name.ilike(name_pattern))
if filters.statuses:
# Accept Enum members or raw values in filters.statuses
status_values = [
_enum_value(s) for s in (filters.statuses or [])
]
stmt = stmt.where(Scenario.status.in_(status_values))
if filters.start_date_from:
stmt = stmt.where(Scenario.start_date >=
@@ -355,6 +423,310 @@ class ScenarioRepository:
self.session.delete(scenario)
class ProjectProfitabilityRepository:
"""Persistence operations for project-level profitability snapshots."""
def __init__(self, session: Session) -> None:
self.session = session
def create(self, snapshot: ProjectProfitability) -> ProjectProfitability:
self.session.add(snapshot)
self.session.flush()
return snapshot
def list_for_project(
self,
project_id: int,
*,
limit: int | None = None,
) -> Sequence[ProjectProfitability]:
stmt = (
select(ProjectProfitability)
.where(ProjectProfitability.project_id == project_id)
.order_by(ProjectProfitability.calculated_at.desc())
)
if limit is not None:
stmt = stmt.limit(limit)
return self.session.execute(stmt).scalars().all()
def latest_for_project(
self,
project_id: int,
) -> ProjectProfitability | None:
stmt = (
select(ProjectProfitability)
.where(ProjectProfitability.project_id == project_id)
.order_by(ProjectProfitability.calculated_at.desc())
.limit(1)
)
return self.session.execute(stmt).scalar_one_or_none()
def delete(self, snapshot_id: int) -> None:
stmt = select(ProjectProfitability).where(
ProjectProfitability.id == snapshot_id
)
entity = self.session.execute(stmt).scalar_one_or_none()
if entity is None:
raise EntityNotFoundError(
f"Project profitability snapshot {snapshot_id} not found"
)
self.session.delete(entity)
class ScenarioProfitabilityRepository:
"""Persistence operations for scenario-level profitability snapshots."""
def __init__(self, session: Session) -> None:
self.session = session
def create(self, snapshot: ScenarioProfitability) -> ScenarioProfitability:
self.session.add(snapshot)
self.session.flush()
return snapshot
def list_for_scenario(
self,
scenario_id: int,
*,
limit: int | None = None,
) -> Sequence[ScenarioProfitability]:
stmt = (
select(ScenarioProfitability)
.where(ScenarioProfitability.scenario_id == scenario_id)
.order_by(ScenarioProfitability.calculated_at.desc())
)
if limit is not None:
stmt = stmt.limit(limit)
return self.session.execute(stmt).scalars().all()
def latest_for_scenario(
self,
scenario_id: int,
) -> ScenarioProfitability | None:
stmt = (
select(ScenarioProfitability)
.where(ScenarioProfitability.scenario_id == scenario_id)
.order_by(ScenarioProfitability.calculated_at.desc())
.limit(1)
)
return self.session.execute(stmt).scalar_one_or_none()
def delete(self, snapshot_id: int) -> None:
stmt = select(ScenarioProfitability).where(
ScenarioProfitability.id == snapshot_id
)
entity = self.session.execute(stmt).scalar_one_or_none()
if entity is None:
raise EntityNotFoundError(
f"Scenario profitability snapshot {snapshot_id} not found"
)
self.session.delete(entity)
class ProjectCapexRepository:
"""Persistence operations for project-level capex snapshots."""
def __init__(self, session: Session) -> None:
self.session = session
def create(self, snapshot: ProjectCapexSnapshot) -> ProjectCapexSnapshot:
self.session.add(snapshot)
self.session.flush()
return snapshot
def list_for_project(
self,
project_id: int,
*,
limit: int | None = None,
) -> Sequence[ProjectCapexSnapshot]:
stmt = (
select(ProjectCapexSnapshot)
.where(ProjectCapexSnapshot.project_id == project_id)
.order_by(ProjectCapexSnapshot.calculated_at.desc())
)
if limit is not None:
stmt = stmt.limit(limit)
return self.session.execute(stmt).scalars().all()
def latest_for_project(
self,
project_id: int,
) -> ProjectCapexSnapshot | None:
stmt = (
select(ProjectCapexSnapshot)
.where(ProjectCapexSnapshot.project_id == project_id)
.order_by(ProjectCapexSnapshot.calculated_at.desc())
.limit(1)
)
return self.session.execute(stmt).scalar_one_or_none()
def delete(self, snapshot_id: int) -> None:
stmt = select(ProjectCapexSnapshot).where(
ProjectCapexSnapshot.id == snapshot_id
)
entity = self.session.execute(stmt).scalar_one_or_none()
if entity is None:
raise EntityNotFoundError(
f"Project capex snapshot {snapshot_id} not found"
)
self.session.delete(entity)
class ScenarioCapexRepository:
"""Persistence operations for scenario-level capex snapshots."""
def __init__(self, session: Session) -> None:
self.session = session
def create(self, snapshot: ScenarioCapexSnapshot) -> ScenarioCapexSnapshot:
self.session.add(snapshot)
self.session.flush()
return snapshot
def list_for_scenario(
self,
scenario_id: int,
*,
limit: int | None = None,
) -> Sequence[ScenarioCapexSnapshot]:
stmt = (
select(ScenarioCapexSnapshot)
.where(ScenarioCapexSnapshot.scenario_id == scenario_id)
.order_by(ScenarioCapexSnapshot.calculated_at.desc())
)
if limit is not None:
stmt = stmt.limit(limit)
return self.session.execute(stmt).scalars().all()
def latest_for_scenario(
self,
scenario_id: int,
) -> ScenarioCapexSnapshot | None:
stmt = (
select(ScenarioCapexSnapshot)
.where(ScenarioCapexSnapshot.scenario_id == scenario_id)
.order_by(ScenarioCapexSnapshot.calculated_at.desc())
.limit(1)
)
return self.session.execute(stmt).scalar_one_or_none()
def delete(self, snapshot_id: int) -> None:
stmt = select(ScenarioCapexSnapshot).where(
ScenarioCapexSnapshot.id == snapshot_id
)
entity = self.session.execute(stmt).scalar_one_or_none()
if entity is None:
raise EntityNotFoundError(
f"Scenario capex snapshot {snapshot_id} not found"
)
self.session.delete(entity)
class ProjectOpexRepository:
"""Persistence operations for project-level opex snapshots."""
def __init__(self, session: Session) -> None:
self.session = session
def create(
self, snapshot: ProjectOpexSnapshot
) -> ProjectOpexSnapshot:
self.session.add(snapshot)
self.session.flush()
return snapshot
def list_for_project(
self,
project_id: int,
*,
limit: int | None = None,
) -> Sequence[ProjectOpexSnapshot]:
stmt = (
select(ProjectOpexSnapshot)
.where(ProjectOpexSnapshot.project_id == project_id)
.order_by(ProjectOpexSnapshot.calculated_at.desc())
)
if limit is not None:
stmt = stmt.limit(limit)
return self.session.execute(stmt).scalars().all()
def latest_for_project(
self,
project_id: int,
) -> ProjectOpexSnapshot | None:
stmt = (
select(ProjectOpexSnapshot)
.where(ProjectOpexSnapshot.project_id == project_id)
.order_by(ProjectOpexSnapshot.calculated_at.desc())
.limit(1)
)
return self.session.execute(stmt).scalar_one_or_none()
def delete(self, snapshot_id: int) -> None:
stmt = select(ProjectOpexSnapshot).where(
ProjectOpexSnapshot.id == snapshot_id
)
entity = self.session.execute(stmt).scalar_one_or_none()
if entity is None:
raise EntityNotFoundError(
f"Project opex snapshot {snapshot_id} not found"
)
self.session.delete(entity)
class ScenarioOpexRepository:
"""Persistence operations for scenario-level opex snapshots."""
def __init__(self, session: Session) -> None:
self.session = session
def create(
self, snapshot: ScenarioOpexSnapshot
) -> ScenarioOpexSnapshot:
self.session.add(snapshot)
self.session.flush()
return snapshot
def list_for_scenario(
self,
scenario_id: int,
*,
limit: int | None = None,
) -> Sequence[ScenarioOpexSnapshot]:
stmt = (
select(ScenarioOpexSnapshot)
.where(ScenarioOpexSnapshot.scenario_id == scenario_id)
.order_by(ScenarioOpexSnapshot.calculated_at.desc())
)
if limit is not None:
stmt = stmt.limit(limit)
return self.session.execute(stmt).scalars().all()
def latest_for_scenario(
self,
scenario_id: int,
) -> ScenarioOpexSnapshot | None:
stmt = (
select(ScenarioOpexSnapshot)
.where(ScenarioOpexSnapshot.scenario_id == scenario_id)
.order_by(ScenarioOpexSnapshot.calculated_at.desc())
.limit(1)
)
return self.session.execute(stmt).scalar_one_or_none()
def delete(self, snapshot_id: int) -> None:
stmt = select(ScenarioOpexSnapshot).where(
ScenarioOpexSnapshot.id == snapshot_id
)
entity = self.session.execute(stmt).scalar_one_or_none()
if entity is None:
raise EntityNotFoundError(
f"Scenario opex snapshot {snapshot_id} not found"
)
self.session.delete(entity)
class FinancialInputRepository:
"""Persistence operations for FinancialInput entities."""


@@ -2,6 +2,7 @@ from __future__ import annotations
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from hmac import compare_digest
from typing import Any, Dict, Iterable, Literal, Type
from jose import ExpiredSignatureError, JWTError, jwt
@@ -176,6 +177,14 @@ def _decode_token(
except JWTError as exc:  # pragma: no cover - jose error bubble
raise TokenDecodeError("Unable to decode token") from exc
expected_token = jwt.encode(
decoded,
settings.secret_key,
algorithm=settings.algorithm,
)
if not compare_digest(token, expected_token):
raise TokenDecodeError("Token contents have been altered.")
try:
payload = _model_validate(TokenPayload, decoded)
except ValidationError as exc:
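The added check re-serialises the decoded claims and compares the result against the presented token in constant time. The same tamper-detection idea can be sketched with only the standard library's `hmac` module; everything here (`SECRET`, `sign`, `verify`) is illustrative and deliberately avoids the jose API used above:

```python
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # illustrative key, not the app's settings.secret_key


def sign(payload: dict) -> str:
    # Deterministic serialisation (sort_keys) so re-signing the same
    # payload always yields the same token string.
    body = json.dumps(payload, sort_keys=True)
    mac = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{mac}"


def verify(token: str) -> dict:
    body, _, mac = token.rpartition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    # compare_digest runs in constant time, so the comparison itself
    # leaks no timing information about how many characters matched.
    if not hmac.compare_digest(mac, expected):
        raise ValueError("Token contents have been altered.")
    return json.loads(body)


token = sign({"sub": "alice"})
print(verify(token))  # {'sub': 'alice'}
```

One caveat worth noting about the re-encode-and-compare approach: it only works when serialisation is deterministic (stable key order, identical headers), otherwise a legitimately issued token can fail the equality check.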


@@ -1,7 +1,7 @@
from __future__ import annotations
from dataclasses import dataclass
from typing import Iterable, Literal, Optional, TYPE_CHECKING
from fastapi import Request, Response
@@ -67,6 +67,7 @@ class AuthSession:
tokens: SessionTokens
user: Optional["User"] = None
scopes: tuple[str, ...] = ()
role_slugs: tuple[str, ...] = ()
issued_access_token: Optional[str] = None
issued_refresh_token: Optional[str] = None
clear_cookies: bool = False
@@ -77,7 +78,10 @@ class AuthSession:
@classmethod
def anonymous(cls) -> "AuthSession":
return cls(
tokens=SessionTokens(access_token=None, refresh_token=None),
role_slugs=(),
)
def issue_tokens(
self,
@@ -100,6 +104,10 @@ class AuthSession:
self.tokens = SessionTokens(access_token=None, refresh_token=None)
self.user = None
self.scopes = ()
self.role_slugs = ()
def set_role_slugs(self, roles: Iterable[str]) -> None:
self.role_slugs = tuple(dict.fromkeys(role.strip().lower() for role in roles if role))
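The one-liner in `set_role_slugs` packs three normalisation steps into a single expression. A stand-alone sketch of the same idiom (the function name here is hypothetical):

```python
def normalise_role_slugs(roles):
    # dict.fromkeys preserves first-seen order while removing duplicates;
    # falsy entries (empty strings, None) are dropped by the `if role`
    # guard, and surviving slugs are stripped and lower-cased.
    return tuple(dict.fromkeys(role.strip().lower() for role in roles if role))


print(normalise_role_slugs([" Admin", "admin", "", "Viewer"]))  # ('admin', 'viewer')
```

Using `dict.fromkeys` instead of `set` is the deliberate choice here: both deduplicate, but only the dict keeps insertion order, so role precedence is stable.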
def extract_session_tokens(request: Request, strategy: SessionStrategy) -> SessionTokens:


@@ -13,14 +13,21 @@ from services.repositories import (
PricingSettingsRepository,
PricingSettingsSeedResult,
ProjectRepository,
ProjectProfitabilityRepository,
ProjectOpexRepository,
ProjectCapexRepository,
RoleRepository,
ScenarioRepository,
ScenarioProfitabilityRepository,
ScenarioOpexRepository,
ScenarioCapexRepository,
SimulationParameterRepository,
UserRepository,
ensure_admin_user as ensure_admin_user_record,
ensure_default_pricing_settings,
ensure_default_roles,
pricing_settings_to_metadata,
NavigationRepository,
)
from services.scenario_validation import ScenarioComparisonValidator
@@ -36,9 +43,16 @@ class UnitOfWork(AbstractContextManager["UnitOfWork"]):
self.scenarios: ScenarioRepository | None = None
self.financial_inputs: FinancialInputRepository | None = None
self.simulation_parameters: SimulationParameterRepository | None = None
self.project_profitability: ProjectProfitabilityRepository | None = None
self.project_capex: ProjectCapexRepository | None = None
self.project_opex: ProjectOpexRepository | None = None
self.scenario_profitability: ScenarioProfitabilityRepository | None = None
self.scenario_capex: ScenarioCapexRepository | None = None
self.scenario_opex: ScenarioOpexRepository | None = None
self.users: UserRepository | None = None
self.roles: RoleRepository | None = None
self.pricing_settings: PricingSettingsRepository | None = None
self.navigation: NavigationRepository | None = None
def __enter__(self) -> "UnitOfWork":
self.session = self._session_factory()
@@ -47,9 +61,21 @@ class UnitOfWork(AbstractContextManager["UnitOfWork"]):
self.financial_inputs = FinancialInputRepository(self.session)
self.simulation_parameters = SimulationParameterRepository(
self.session)
self.project_profitability = ProjectProfitabilityRepository(
self.session)
self.project_capex = ProjectCapexRepository(self.session)
self.project_opex = ProjectOpexRepository(
self.session)
self.scenario_profitability = ScenarioProfitabilityRepository(
self.session
)
self.scenario_capex = ScenarioCapexRepository(self.session)
self.scenario_opex = ScenarioOpexRepository(
self.session)
self.users = UserRepository(self.session)
self.roles = RoleRepository(self.session)
self.pricing_settings = PricingSettingsRepository(self.session)
self.navigation = NavigationRepository(self.session)
self._scenario_validator = ScenarioComparisonValidator()
return self
@@ -65,9 +91,16 @@ class UnitOfWork(AbstractContextManager["UnitOfWork"]):
self.scenarios = None
self.financial_inputs = None
self.simulation_parameters = None
self.project_profitability = None
self.project_capex = None
self.project_opex = None
self.scenario_profitability = None
self.scenario_capex = None
self.scenario_opex = None
self.users = None
self.roles = None
self.pricing_settings = None
self.navigation = None
def flush(self) -> None:
if not self.session:


@@ -2,17 +2,6 @@
--dashboard-gap: 1.5rem;
}
.dashboard-header {
align-items: center;
}
.header-actions {
display: flex;
gap: 0.75rem;
flex-wrap: wrap;
justify-content: flex-end;
}
.dashboard-metrics {
display: grid;
gap: var(--dashboard-gap);
@@ -20,36 +9,6 @@
margin-bottom: 2rem;
}
.metric-card {
background: var(--card);
border-radius: var(--radius);
padding: 1.5rem;
box-shadow: var(--shadow);
border: 1px solid var(--color-border);
display: flex;
flex-direction: column;
gap: 0.35rem;
}
.metric-card h2 {
margin: 0;
font-size: 1rem;
color: var(--muted);
text-transform: uppercase;
letter-spacing: 0.08em;
}
.metric-value {
font-size: 2rem;
font-weight: 700;
margin: 0;
}
.metric-caption {
color: var(--color-text-subtle);
font-size: 0.85rem;
}
.dashboard-grid {
display: grid;
gap: var(--dashboard-gap);
@@ -67,16 +26,6 @@
gap: var(--dashboard-gap);
}
.table-link {
color: var(--brand-2);
text-decoration: none;
}
.table-link:hover,
.table-link:focus {
text-decoration: underline;
}
.timeline {
list-style: none;
margin: 0;
@@ -107,7 +56,9 @@
padding: 0.75rem;
border-radius: var(--radius-sm);
background: rgba(209, 75, 75, 0.16);
background: color-mix(in srgb, var(--color-danger) 16%, transparent);
border: 1px solid rgba(209, 75, 75, 0.3);
border: 1px solid color-mix(in srgb, var(--color-danger) 30%, transparent);
}
.links-list a {
@@ -128,23 +79,4 @@
.grid-sidebar {
grid-template-columns: repeat(auto-fit, minmax(260px, 1fr));
}
.header-actions {
justify-content: flex-start;
}
}
@media (max-width: 640px) {
.metric-card {
padding: 1.25rem;
}
.metric-value {
font-size: 1.75rem;
}
.header-actions {
flex-direction: column;
align-items: stretch;
}
}

static/css/forms.css Normal file

@@ -0,0 +1,111 @@
.form {
display: flex;
flex-direction: column;
gap: 1.25rem;
}
.form-grid {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(240px, 1fr));
gap: 1.25rem;
}
.form-group {
display: flex;
flex-direction: column;
gap: 0.5rem;
}
.form-group label {
font-weight: 600;
color: var(--text);
color: var(--color-text-primary);
}
.form-group input,
.form-group select,
.form-group textarea {
padding: 0.75rem 0.85rem;
border-radius: var(--radius-sm);
border: 1px solid var(--card-border);
background: rgba(8, 12, 19, 0.78);
background: color-mix(in srgb, var(--color-bg-elevated) 78%, transparent);
color: var(--text);
color: var(--color-text-primary);
transition: border-color 0.15s ease, background 0.2s ease,
box-shadow 0.2s ease;
}
.form-group textarea {
resize: vertical;
min-height: 120px;
}
.form-group input:focus,
.form-group select:focus,
.form-group textarea:focus {
outline: 2px solid var(--brand-2);
outline: 2px solid var(--color-brand-bright);
outline-offset: 1px;
}
.form-group input:disabled,
.form-group select:disabled,
.form-group textarea:disabled {
cursor: not-allowed;
opacity: 0.6;
}
.form-group--error input,
.form-group--error select,
.form-group--error textarea {
border-color: rgba(209, 75, 75, 0.6);
border-color: color-mix(in srgb, var(--color-danger) 60%, transparent);
box-shadow: 0 0 0 1px rgba(209, 75, 75, 0.3);
box-shadow: 0 0 0 1px color-mix(in srgb, var(--color-danger) 30%, transparent);
}
.field-help {
margin: 0;
font-size: 0.85rem;
color: var(--color-text-subtle);
}
.field-error {
margin: 0;
font-size: 0.85rem;
color: var(--danger);
color: var(--color-danger);
}
.form-actions {
display: flex;
flex-wrap: wrap;
gap: 0.75rem;
justify-content: flex-end;
}
.form-fieldset {
border: 1px solid var(--color-border);
border-radius: var(--radius);
background: rgba(21, 27, 35, 0.85);
background: var(--color-surface-overlay);
box-shadow: var(--shadow);
padding: 1.5rem;
display: flex;
flex-direction: column;
gap: 1.25rem;
}
.form-fieldset legend {
font-weight: 700;
padding: 0 0.5rem;
color: var(--text);
color: var(--color-text-primary);
}
@media (max-width: 640px) {
.form-actions {
justify-content: stretch;
}
}


@@ -1,7 +1,8 @@
.import-upload {
background-color: rgba(21, 27, 35, 0.85);
background-color: var(--color-surface-overlay);
border: 1px dashed var(--color-border);
border-radius: var(--radius);
padding: 1.5rem;
margin-bottom: 1.5rem;
}
@@ -11,7 +12,7 @@
}
.import-upload__dropzone {
border: 2px dashed var(--color-border);
border-radius: var(--radius-sm);
padding: 2rem;
text-align: center;
@@ -19,8 +20,10 @@
}
.import-upload__dropzone.dragover {
border-color: #f6c648;
border-color: var(--color-brand-bright);
background-color: rgba(241, 178, 26, 0.08);
background-color: var(--color-highlight);
}
.import-upload__actions {
@@ -35,18 +38,6 @@
gap: 0.5rem;
}
.btn-ghost {
background: transparent;
border: none;
cursor: pointer;
padding: 0.25rem 0.5rem;
color: var(--text-muted);
}
.btn-ghost:hover {
color: var(--primary-color);
}
.toast {
position: fixed;
right: 1rem;
@@ -55,9 +46,9 @@
align-items: center;
gap: 0.75rem;
padding: 1rem 1.25rem;
border-radius: var(--radius);
color: var(--color-text-invert);
box-shadow: var(--shadow);
z-index: 1000;
}
@@ -66,15 +57,18 @@
}
.toast--success {
background-color: var(--success);
background-color: var(--color-success);
}
.toast--error {
background-color: var(--danger);
background-color: var(--color-danger);
}
.toast--info {
background-color: var(--info);
background-color: var(--color-info);
}
.toast__close {


@@ -1,3 +1,80 @@
:root {
/* Radii & layout */
--radius: 14px;
--radius-sm: 10px;
--panel-radius: var(--radius);
--table-radius: var(--radius-sm);
--container: 1180px;
/* Spacing & typography */
--space-2xs: 0.25rem;
--space-xs: 0.5rem;
--space-sm: 0.75rem;
--space-md: 1rem;
--space-lg: 1.5rem;
--space-xl: 2rem;
--space-2xl: 3rem;
--font-size-xs: 0.75rem;
--font-size-sm: 0.875rem;
--font-size-base: 1rem;
--font-size-lg: 1.25rem;
--font-size-xl: 1.5rem;
--font-size-2xl: 2rem;
}
html,
body {
height: 100%;
}
body {
margin: 0;
font-family: ui-sans-serif, system-ui, -apple-system, "Segoe UI", "Roboto",
Helvetica, Arial, "Apple Color Emoji", "Segoe UI Emoji";
color: var(--text);
background: linear-gradient(180deg, var(--bg) 0%, var(--bg-2) 100%);
line-height: 1.45;
}
.header-actions {
display: flex;
gap: 0.75rem;
flex-wrap: wrap;
justify-content: flex-end;
}
h1,
h2,
h3,
h4,
h5,
h6 {
margin: 0 0 0.5rem 0;
font-weight: 700;
line-height: 1.2;
}
h1 {
font-size: var(--font-size-2xl);
}
h2 {
font-size: var(--font-size-xl);
}
h3 {
font-size: var(--font-size-lg);
}
p {
margin: 0 0 1rem 0;
}
a {
color: var(--brand);
}
.report-overview {
margin-bottom: 2.5rem;
}
@@ -25,6 +102,16 @@
margin-top: 3rem;
}
.chart-container {
width: 100%;
height: 400px;
background: rgba(15, 20, 27, 0.8);
border-radius: var(--radius-sm);
border: 1px solid rgba(255, 255, 255, 0.05);
box-shadow: inset 0 1px 0 rgba(255, 255, 255, 0.06);
margin-bottom: 1rem;
}
.section-header {
margin-bottom: 1.25rem;
}
@@ -64,6 +151,36 @@
color: var(--text);
}
.metric-card {
background: var(--color-surface-overlay);
border-radius: var(--radius);
padding: 1.5rem;
box-shadow: var(--shadow);
border: 1px solid var(--color-border);
display: flex;
flex-direction: column;
gap: 0.35rem;
}
.metric-card h2 {
margin: 0;
font-size: 1rem;
color: var(--color-text-muted);
text-transform: uppercase;
letter-spacing: 0.08em;
}
.metric-value {
font-size: 2rem;
font-weight: 700;
margin: 0;
}
.metric-caption {
color: var(--color-text-subtle);
font-size: 0.85rem;
}
.metrics-table {
width: 100%;
border-collapse: collapse;
@@ -81,7 +198,7 @@
.metrics-table th {
font-weight: 600;
color: var(--color-text-dark);
}
.metrics-table tr:last-child td,
@@ -92,23 +209,30 @@
.definition-list {
margin: 0;
display: grid;
gap: 1.25rem 2rem;
grid-template-columns: repeat(auto-fit, minmax(220px, 1fr));
}
.definition-list div {
display: grid;
grid-template-columns: minmax(140px, 0.6fr) minmax(0, 1fr);
gap: 0.5rem;
align-items: baseline;
}
.definition-list dt {
margin: 0;
font-weight: 600;
color: var(--color-text-muted);
text-transform: uppercase;
font-size: 0.75rem;
letter-spacing: 0.08em;
}
.definition-list dd {
margin: 0;
font-size: 1rem;
color: var(--color-text-primary);
}
.scenario-card {
@@ -138,6 +262,13 @@
}
.scenario-meta {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(220px, 1fr));
gap: 1.25rem;
}
.scenario-card .scenario-meta {
display: block;
text-align: right;
}
@@ -183,6 +314,201 @@
color: var(--muted);
}
.quick-link-list {
list-style: none;
margin: 0;
padding: 0;
display: flex;
flex-direction: column;
gap: 1rem;
}
.quick-link-list li a {
font-weight: 600;
color: var(--brand-2);
text-decoration: none;
}
.quick-link-list li a:hover,
.quick-link-list li a:focus {
text-decoration: underline;
}
.quick-link-list p {
margin: 0.25rem 0 0;
color: var(--color-text-subtle);
font-size: 0.9rem;
}
.scenario-list {
list-style: none;
margin: 0;
padding: 0;
display: flex;
flex-direction: column;
gap: 1rem;
}
.scenario-item {
background: rgba(21, 27, 35, 0.85);
background: color-mix(in srgb, var(--color-surface-default) 85%, transparent);
border: 1px solid var(--color-border);
border-radius: var(--radius);
padding: 1.25rem;
display: flex;
flex-direction: column;
gap: 1rem;
}
.scenario-item__body {
display: flex;
flex-direction: column;
gap: 1rem;
}
.scenario-item__header {
display: flex;
flex-wrap: wrap;
align-items: center;
gap: 0.75rem;
justify-content: space-between;
}
.scenario-item__header h3 {
margin: 0;
font-size: 1.1rem;
}
.scenario-item__header a {
color: inherit;
text-decoration: none;
}
.scenario-item__header a:hover,
.scenario-item__header a:focus {
text-decoration: underline;
}
.scenario-item__meta {
display: grid;
gap: 0.75rem;
grid-template-columns: repeat(auto-fit, minmax(150px, 1fr));
}
.scenario-item__meta dt {
margin: 0;
font-size: 0.75rem;
color: var(--color-text-muted);
text-transform: uppercase;
letter-spacing: 0.08em;
}
.scenario-item__meta dd {
margin: 0;
font-size: 0.95rem;
}
.scenario-item__actions {
display: flex;
gap: 0.75rem;
flex-wrap: wrap;
}
.scenario-item__actions .btn--link {
padding: 0;
}
.status-pill {
display: inline-flex;
align-items: center;
gap: 0.35rem;
padding: 0.35rem 0.85rem;
border-radius: 999px;
font-size: 0.75rem;
text-transform: uppercase;
letter-spacing: 0.08em;
}
.status-pill--draft {
background: rgba(59, 130, 246, 0.15);
color: #93c5fd;
background: color-mix(in srgb, var(--color-info) 18%, transparent);
color: color-mix(in srgb, var(--color-info) 70%, white);
}
.status-pill--active {
background: rgba(34, 197, 94, 0.18);
color: #86efac;
background: color-mix(in srgb, var(--color-success) 18%, transparent);
color: color-mix(in srgb, var(--color-success) 70%, white);
}
.status-pill--archived {
background: rgba(148, 163, 184, 0.24);
color: #cbd5f5;
background: color-mix(in srgb, var(--color-text-muted) 24%, transparent);
color: color-mix(in srgb, var(--color-text-muted) 60%, white);
}
.empty-state {
color: var(--color-text-muted);
font-style: italic;
}
.table {
width: 100%;
border-collapse: collapse;
border-radius: var(--table-radius);
overflow: hidden;
box-shadow: var(--shadow);
}
.table th,
.table td {
padding: 0.75rem 1rem;
border-bottom: 1px solid var(--color-border);
background: rgba(21, 27, 35, 0.85);
background: color-mix(in srgb, var(--color-surface-default) 85%, transparent);
}
.table tbody tr:hover {
background: rgba(241, 178, 26, 0.12);
background: var(--color-highlight);
}
.table-link {
color: var(--brand-2);
text-decoration: none;
margin-left: 0.5rem;
}
.table-link:hover,
.table-link:focus {
text-decoration: underline;
}
.table-responsive {
width: 100%;
overflow-x: auto;
-webkit-overflow-scrolling: touch;
border-radius: var(--table-radius);
margin: 0;
}
.table-responsive .table {
min-width: 640px;
}
.table-responsive::-webkit-scrollbar {
height: 6px;
}
.table-responsive::-webkit-scrollbar-thumb {
background: rgba(255, 255, 255, 0.2);
background: color-mix(in srgb, var(--color-text-invert) 20%, transparent);
border-radius: 999px;
}
.page-actions .button {
  text-decoration: none;
  background: transparent;
@@ -199,96 +525,25 @@
  background: rgba(241, 178, 26, 0.14);
  border-color: var(--brand);
}
-:root {
-  --bg: #0b0f14;
-  --bg-2: #0f141b;
-  --card: #151b23;
-  --text: #e6edf3;
-  --muted: #a9b4c0;
-  --brand: #f1b21a;
-  --brand-2: #f6c648;
-  --brand-3: #f9d475;
-  --accent: #2ba58f;
-  --danger: #d14b4b;
-  --shadow: 0 10px 30px rgba(0, 0, 0, 0.35);
-  --radius: 14px;
-  --radius-sm: 10px;
-  --container: 1180px;
-  --muted: var(--muted);
-  --color-text-subtle: rgba(169, 180, 192, 0.6);
-  --color-text-invert: #ffffff;
-  --color-text-dark: #0f172a;
-  --color-text-strong: #111827;
-  --color-border: rgba(255, 255, 255, 0.08);
-  --color-border-strong: rgba(255, 255, 255, 0.12);
-  --color-highlight: rgba(241, 178, 26, 0.08);
-  --color-panel-shadow: rgba(0, 0, 0, 0.25);
-  --color-panel-shadow-deep: rgba(0, 0, 0, 0.35);
-  --color-surface-alt: rgba(21, 27, 35, 0.7);
-  --space-2xs: 0.25rem;
-  --space-xs: 0.5rem;
-  --space-sm: 0.75rem;
-  --space-md: 1rem;
-  --space-lg: 1.5rem;
-  --space-xl: 2rem;
-  --space-2xl: 3rem;
-  --font-size-xs: 0.75rem;
-  --font-size-sm: 0.875rem;
-  --font-size-base: 1rem;
-  --font-size-lg: 1.25rem;
-  --font-size-xl: 1.5rem;
-  --font-size-2xl: 2rem;
-  --panel-radius: var(--radius);
-  --table-radius: var(--radius-sm);
-}
-* {
-  box-sizing: border-box;
-}
-html,
-body {
-  height: 100%;
-}
-body {
-  margin: 0;
-  font-family: ui-sans-serif, system-ui, -apple-system, "Segoe UI", "Roboto",
-    Helvetica, Arial, "Apple Color Emoji", "Segoe UI Emoji";
-  color: var(--text);
-  background: linear-gradient(180deg, var(--bg) 0%, var(--bg-2) 100%);
-  line-height: 1.45;
-}
-h1,
-h2,
-h3,
-h4,
-h5,
-h6 {
-  margin: 0 0 0.5rem 0;
-  font-weight: 700;
-  line-height: 1.2;
-}
-h1 {
-  font-size: var(--font-size-2xl);
-}
-h2 {
-  font-size: var(--font-size-xl);
-}
-h3 {
-  font-size: var(--font-size-lg);
-}
-p {
-  margin: 0 0 1rem 0;
-}
-a {
-  color: var(--brand);
-}
+.breadcrumb {
+  display: flex;
+  align-items: center;
+  gap: 0.5rem;
+  font-size: 0.9rem;
+  color: var(--muted);
+  margin-bottom: 1.2rem;
+}
+.breadcrumb a {
+  color: var(--brand-2);
+  text-decoration: none;
+}
+.breadcrumb a::after {
+  content: ">";
+  margin-left: 0.5rem;
+  color: var(--muted);
+}
.app-layout {
@@ -321,23 +576,32 @@ a {
  display: flex;
  align-items: center;
  gap: 1rem;
+  padding: 0.5rem 1rem;
+  border-radius: 0.75rem;
+}
+a.sidebar-brand {
+  text-decoration: none;
+}
+a.sidebar-brand:hover,
+a.sidebar-brand:focus {
+  color: var(--color-text-invert);
+  background-color: rgba(148, 197, 255, 0.18);
}
.sidebar-nav-controls {
  display: flex;
  justify-content: center;
-  gap: 0.5rem;
-  margin: 1rem 0;
+  gap: 1rem;
+  margin: 0;
}
.nav-chevron {
-  width: 40px;
-  height: 40px;
+  width: 5rem;
+  height: 5rem;
  border: none;
-  border-radius: 50%;
-  background: rgba(255, 255, 255, 0.1);
+  background: rgba(0, 0, 0, 0.5);
  color: rgba(255, 255, 255, 0.88);
-  font-size: 1.2rem;
+  font-size: 4.5rem;
  font-weight: bold;
  cursor: pointer;
  display: flex;
@@ -348,8 +612,9 @@ a {
.nav-chevron:hover,
.nav-chevron:focus {
-  background: rgba(255, 255, 255, 0.2);
-  transform: scale(1.05);
+  background: rgba(0, 0, 0, 0.1);
+  color: rgba(255, 255, 255, 1);
+  transform: scale(0.9);
}
.nav-chevron:disabled {
@@ -504,7 +769,7 @@ a {
.dashboard-header {
  display: flex;
-  align-items: flex-start;
+  align-items: center;
  justify-content: space-between;
  gap: 1.5rem;
  margin-bottom: 2rem;
@@ -846,36 +1111,6 @@ a {
  font-size: var(--font-size-lg);
}
.form-grid {
display: grid;
gap: var(--space-md);
max-width: 480px;
}
.form-grid label {
display: flex;
flex-direction: column;
gap: var(--space-sm);
font-weight: 600;
color: var(--text);
}
.form-grid input,
.form-grid textarea,
.form-grid select {
padding: 0.6rem var(--space-sm);
border: 1px solid var(--color-border-strong);
border-radius: 8px;
font-size: var(--font-size-base);
}
.form-grid input:focus,
.form-grid textarea:focus,
.form-grid select:focus {
outline: 2px solid var(--brand-2);
outline-offset: 1px;
}
.btn {
  display: inline-flex;
  align-items: center;
@@ -883,28 +1118,101 @@ a {
  gap: 0.5rem;
  padding: 0.65rem 1.25rem;
  border-radius: 999px;
-  border: none;
+  border: 1px solid var(--btn-secondary-border);
  cursor: pointer;
  font-weight: 600;
-  background-color: var(--color-border);
-  color: var(--color-text-dark);
-  transition: transform 0.15s ease, box-shadow 0.15s ease;
+  background-color: var(--btn-secondary-bg);
+  color: var(--btn-secondary-color);
+  text-decoration: none;
+  transition: transform 0.15s ease, box-shadow 0.15s ease,
+    background-color 0.2s ease, border-color 0.2s ease;
}
.btn:hover,
.btn:focus {
  transform: translateY(-1px);
  box-shadow: 0 4px 10px var(--color-panel-shadow);
+  background-color: var(--btn-secondary-hover);
}
-.btn.primary {
-  background-color: var(--brand-2);
-  color: var(--color-text-invert);
+.btn--primary,
+.btn.primary,
+.btn.btn-primary {
+  background-color: var(--btn-primary-bg);
+  border-color: transparent;
+  color: var(--btn-primary-color);
}
-.btn.primary:hover,
-.btn.primary:focus {
-  background-color: var(--brand-3);
+.btn--primary:hover,
+.btn--primary:focus,
+.btn.primary:hover,
+.btn.primary:focus,
+.btn.btn-primary:hover,
+.btn.btn-primary:focus {
+  background-color: var(--btn-primary-hover);
+}
+.btn--secondary,
+.btn.secondary,
+.btn.btn-secondary {
+  background-color: var(--btn-secondary-bg);
+  border-color: var(--btn-secondary-border);
+  color: var(--btn-secondary-color);
+}
+.btn--secondary:hover,
+.btn--secondary:focus,
+.btn.secondary:hover,
+.btn.secondary:focus,
+.btn.btn-secondary:hover,
+.btn.btn-secondary:focus {
+  background-color: var(--btn-secondary-hover);
+}
+.btn--link,
+.btn.btn-link,
+.btn.link {
+  padding: 0.25rem 0;
+  border: none;
+  background: transparent;
+  color: var(--btn-link-color);
+  margin: 0;
+  box-shadow: none;
+}
+.btn--link:hover,
+.btn--link:focus,
+.btn.btn-link:hover,
+.btn.btn-link:focus,
+.btn.link:hover,
+.btn.link:focus {
+  transform: none;
+  box-shadow: none;
+  color: var(--btn-link-hover);
+  text-decoration: underline;
+}
+.btn--ghost {
+  background: transparent;
+  border: 1px solid transparent;
+  color: var(--btn-ghost-color);
+}
+.btn--ghost:hover,
+.btn--ghost:focus {
+  background: rgba(255, 255, 255, 0.1);
+  border-color: rgba(255, 255, 255, 0.2);
+}
+.btn--icon {
+  padding: 0.4rem;
+  border-radius: 50%;
+  line-height: 0;
+}
+.btn--icon:hover,
+.btn--icon:focus {
+  transform: none;
}
.result-output {
@@ -1002,7 +1310,7 @@ tbody tr:nth-child(even) {
.site-footer {
  background-color: var(--brand);
-  color: var(--color-text-invert);
+  color: var(--color-text-strong);
  margin-top: 3rem;
}
@@ -1027,6 +1335,19 @@ tbody tr:nth-child(even) {
  object-fit: cover;
}
footer p {
margin: 0;
}
footer a {
font-weight: 600;
color: var(--color-text-dark);
text-decoration: underline;
}
footer a:hover,
footer a:focus {
color: var(--color-text-strong);
}
.sidebar-toggle {
  display: none;
  align-items: center;
@@ -1093,10 +1414,62 @@ tbody tr:nth-child(even) {
  transition: opacity 0.25s ease;
}
@media (min-width: 720px) {
.table-responsive .table {
min-width: 100%;
}
}
@media (max-width: 640px) {
.table th,
.table td {
padding: 0.55rem 0.65rem;
font-size: 0.9rem;
white-space: nowrap;
}
.table tbody tr {
border-radius: var(--radius-sm);
}
.metric-card {
padding: 1.25rem;
}
.metric-value {
font-size: 1.75rem;
}
.header-actions {
flex-direction: column;
align-items: stretch;
}
}
@media (min-width: 960px) {
.header-actions {
justify-content: flex-start;
}
.scenario-item {
flex-direction: row;
justify-content: space-between;
align-items: center;
}
.scenario-item__body {
max-width: 70%;
}
}
@media (max-width: 1024px) {
  .app-sidebar {
    width: 240px;
  }
+  .header-actions {
+    justify-content: flex-start;
+  }
}
@media (max-width: 900px) {
@@ -1126,8 +1499,16 @@ tbody tr:nth-child(even) {
    justify-content: center;
  }
+  .sidebar-nav-controls {
+    display: none;
+  }
+  .sidebar-link-block {
+    align-items: center;
+  }
  .sidebar-link {
-    flex: 1 1 140px;
+    flex: 1 1 40px;
    justify-content: center;
  }
@@ -1157,6 +1538,10 @@ tbody tr:nth-child(even) {
    overflow: hidden;
  }
+  body.sidebar-open .app-main {
+    position: relative;
+    z-index: 1;
+  }
  body.sidebar-open .app-sidebar {
    display: block;
    position: fixed;
@@ -1165,7 +1550,7 @@ tbody tr:nth-child(even) {
    width: min(320px, 82vw);
    height: 100vh;
    overflow-y: auto;
-    z-index: 900;
+    z-index: 999;
    box-shadow: 0 12px 30px rgba(8, 14, 25, 0.4);
  }
@@ -1173,9 +1558,4 @@ tbody tr:nth-child(even) {
    opacity: 1;
    pointer-events: auto;
  }
-  body.sidebar-open .app-main {
-    position: relative;
-    z-index: 950;
-  }
}


@@ -1,14 +1,103 @@
-:root {
-  --card-bg: rgba(21, 27, 35, 0.8);
-  --card-border: rgba(255, 255, 255, 0.08);
-  --hover-highlight: rgba(241, 178, 26, 0.12);
+.projects-grid {
+  display: grid;
+  gap: 1.5rem;
+  grid-template-columns: repeat(auto-fit, minmax(320px, 1fr));
+  margin-top: 1.5rem;
}
-.header-actions {
+.project-card {
background: var(--color-surface-overlay);
border: 1px solid var(--color-border);
box-shadow: var(--shadow);
border-radius: var(--radius);
padding: 1.5rem;
display: flex;
flex-direction: column;
gap: 1.25rem;
transition: transform 0.2s ease, box-shadow 0.2s ease;
}
.project-card:hover,
.project-card:focus-within {
transform: translateY(-2px);
box-shadow: 0 22px 45px var(--color-panel-shadow-deep);
}
.project-card__header {
display: flex;
align-items: baseline;
justify-content: space-between;
gap: 1rem;
}
.project-card__title {
margin: 0;
font-size: 1.25rem;
}
.project-card__title a {
color: var(--brand);
text-decoration: none;
}
.project-card__title a:hover,
.project-card__title a:focus {
text-decoration: underline;
}
.project-card__type {
font-size: 0.75rem;
text-transform: uppercase;
letter-spacing: 0.08em;
}
.project-card__description {
margin: 0;
color: var(--color-text-subtle);
min-height: 3rem;
}
.project-card__meta {
display: grid;
gap: 1rem;
grid-template-columns: repeat(auto-fit, minmax(140px, 1fr));
}
.project-card__meta div {
display: flex;
flex-direction: column;
gap: 0.35rem;
}
.project-card__meta dt {
font-size: 0.75rem;
text-transform: uppercase;
color: var(--color-text-muted);
letter-spacing: 0.08em;
}
.project-card__meta dd {
margin: 0;
font-size: 0.95rem;
}
.project-card__footer {
display: flex;
align-items: center;
justify-content: space-between;
gap: 1rem;
flex-wrap: wrap;
}
.project-card__links {
  display: flex;
  gap: 0.75rem;
  flex-wrap: wrap;
-  justify-content: flex-end;
+}
+.project-card__links .btn--link {
+  padding: 3px 4px;
+  border-radius: 8px;
}
.project-metrics {
@@ -18,39 +107,9 @@
  margin-bottom: 2rem;
}
.metric-card {
background: var(--card-bg);
border-radius: var(--radius);
padding: 1.5rem;
box-shadow: var(--shadow);
border: 1px solid var(--card-border);
display: flex;
flex-direction: column;
gap: 0.35rem;
}
.metric-card h2 {
margin: 0;
font-size: 1rem;
color: var(--muted);
text-transform: uppercase;
letter-spacing: 0.08em;
}
.metric-value {
font-size: 2rem;
font-weight: 700;
margin: 0;
}
.metric-caption {
color: var(--color-text-subtle);
font-size: 0.85rem;
}
.project-form {
-  background: var(--card-bg);
-  border: 1px solid var(--card-border);
+  background: var(--color-surface-overlay);
+  border: 1px solid var(--color-border);
  border-radius: var(--radius);
  box-shadow: var(--shadow);
  padding: 1.75rem;
@@ -59,34 +118,43 @@
  gap: 1.5rem;
}
.definition-list {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(240px, 1fr));
gap: 1.25rem 2rem;
}
.definition-list dt {
font-weight: 600;
color: var(--muted);
margin-bottom: 0.2rem;
text-transform: uppercase;
font-size: 0.75rem;
}
.definition-list dd {
margin: 0;
font-size: 1rem;
}
.card {
-  background: var(--card-bg);
-  border: 1px solid var(--card-border);
+  background: var(--color-surface-overlay);
+  border: 1px solid var(--color-border);
  box-shadow: var(--shadow);
  border-radius: var(--radius);
  padding: 1.5rem;
  margin-bottom: 2rem;
}
.project-column {
display: grid;
gap: 1.5rem;
}
.project-actions-card {
display: flex;
flex-direction: column;
gap: 1rem;
}
.project-scenarios-card {
display: flex;
flex-direction: column;
gap: 1.5rem;
}
.project-scenarios-card__header {
display: flex;
flex-wrap: wrap;
justify-content: space-between;
gap: 1rem;
}
.project-scenarios-card__header h2 {
margin: 0;
}
.card-header {
  display: flex;
  align-items: center;
@@ -103,41 +171,6 @@
  gap: 1.5rem;
}
.table-responsive {
overflow-x: auto;
border-radius: var(--table-radius);
}
.table {
width: 100%;
border-collapse: collapse;
border-radius: var(--table-radius);
overflow: hidden;
box-shadow: var(--shadow);
}
.table th,
.table td {
padding: 0.75rem 1rem;
border-bottom: 1px solid var(--card-border);
background: rgba(21, 27, 35, 0.85);
}
.table tbody tr:hover {
background: var(--hover-highlight);
}
.table-link {
color: var(--brand-2);
text-decoration: none;
margin-left: 0.5rem;
}
.table-link:hover,
.table-link:focus {
text-decoration: underline;
}
.text-right {
  text-align: right;
}
@@ -147,42 +180,4 @@
  grid-template-columns: 1.1fr 1.9fr;
  align-items: start;
}
.header-actions {
justify-content: flex-start;
}
}
.form {
display: flex;
flex-direction: column;
gap: 1.25rem;
}
.form-grid {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(220px, 1fr));
gap: 1.25rem;
}
.form-group {
display: flex;
flex-direction: column;
gap: 0.5rem;
}
.form-group input,
.form-group select,
.form-group textarea {
padding: 0.75rem 0.85rem;
border-radius: var(--radius-sm);
border: 1px solid var(--card-border);
background: rgba(8, 12, 19, 0.75);
color: var(--text);
}
.form-actions {
display: flex;
gap: 0.75rem;
justify-content: flex-end;
}
}


@@ -1,49 +1,3 @@
.scenario-meta {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(220px, 1fr));
gap: 1.25rem;
}
.table {
width: 100%;
border-collapse: collapse;
border-radius: var(--table-radius);
overflow: hidden;
box-shadow: var(--shadow);
}
.table th,
.table td {
padding: 0.75rem 1rem;
border-bottom: 1px solid var(--color-border);
background: rgba(21, 27, 35, 0.85);
}
.table tbody tr:hover {
background: rgba(43, 165, 143, 0.12);
}
.breadcrumb {
display: flex;
align-items: center;
gap: 0.5rem;
font-size: 0.9rem;
color: var(--muted);
margin-bottom: 1.2rem;
}
.breadcrumb a {
color: var(--brand-2);
text-decoration: none;
}
.header-actions {
display: flex;
gap: 0.75rem;
flex-wrap: wrap;
justify-content: flex-end;
}
.scenario-metrics {
  display: grid;
  gap: 1.5rem;
@@ -51,36 +5,6 @@
  margin-bottom: 2rem;
}
.metric-card {
background: rgba(21, 27, 35, 0.85);
border-radius: var(--radius);
padding: 1.5rem;
box-shadow: var(--shadow);
border: 1px solid var(--color-border);
display: flex;
flex-direction: column;
gap: 0.35rem;
}
.metric-card h2 {
margin: 0;
font-size: 1rem;
color: var(--muted);
text-transform: uppercase;
letter-spacing: 0.08em;
}
.metric-value {
font-size: 2rem;
font-weight: 700;
margin: 0;
}
.metric-caption {
color: var(--color-text-subtle);
font-size: 0.85rem;
}
.scenario-filters {
  display: grid;
  gap: 0.75rem;
@@ -107,11 +31,13 @@
  border-radius: var(--radius-sm);
  border: 1px solid var(--color-border);
  background: rgba(8, 12, 19, 0.75);
-  color: var(--text);
+  background: color-mix(in srgb, var(--color-bg-elevated) 75%, transparent);
+  color: var(--color-text-primary);
}
.scenario-form {
  background: rgba(21, 27, 35, 0.85);
+  background: var(--color-surface-overlay);
  border: 1px solid var(--color-border);
  border-radius: var(--radius);
  box-shadow: var(--shadow);
@@ -121,25 +47,85 @@
  gap: 1.5rem;
}
-.table-responsive {
-  width: 100%;
-  overflow-x: auto;
-  -webkit-overflow-scrolling: touch;
-  border-radius: var(--table-radius);
+.scenario-form .card {
+  background: rgba(21, 27, 35, 0.9);
+  background: color-mix(in srgb, var(--color-surface-default) 90%, transparent);
+  border: 1px solid var(--color-border);
+  border-radius: var(--radius);
+  padding: 1.5rem;
+  display: flex;
+  flex-direction: column;
+  gap: 1.25rem;
+}
+.scenario-form .card h2 {
  margin: 0;
}
-.table-responsive .table {
-  min-width: 640px;
+.scenario-layout {
+  display: grid;
+  gap: 1.5rem;
}
-.table-responsive::-webkit-scrollbar {
-  height: 6px;
+.scenario-column {
+  display: grid;
+  gap: 1.5rem;
}
-.table-responsive::-webkit-scrollbar-thumb {
-  background: rgba(255, 255, 255, 0.2);
-  border-radius: 999px;
+.quick-actions-card {
+  display: flex;
+  flex-direction: column;
+  gap: 1rem;
}
.scenario-portfolio {
display: flex;
flex-direction: column;
gap: 1.5rem;
}
.scenario-portfolio__header {
display: flex;
flex-wrap: wrap;
justify-content: space-between;
gap: 1rem;
}
.scenario-context-card {
display: flex;
flex-direction: column;
gap: 1rem;
}
.scenario-context-card .definition-list {
margin: 0;
}
.scenario-defaults {
list-style: none;
margin: 0;
padding: 0;
display: grid;
gap: 0.75rem;
}
.scenario-defaults li {
display: flex;
flex-direction: column;
gap: 0.25rem;
}
.scenario-defaults li strong {
font-size: 0.9rem;
letter-spacing: 0.04em;
text-transform: uppercase;
color: var(--color-text-muted);
}
.scenario-layout .table tbody tr:hover,
.scenario-portfolio .table tbody tr:hover {
background: rgba(43, 165, 143, 0.12);
background: color-mix(in srgb, var(--color-accent) 18%, transparent);
}
@media (min-width: 720px) {
@@ -151,10 +137,6 @@
  .scenario-filters .filter-actions {
    justify-content: flex-end;
  }
-  .table-responsive .table {
-    min-width: 100%;
-  }
}
@media (max-width: 640px) {
@@ -162,34 +144,9 @@
flex-wrap: wrap; flex-wrap: wrap;
gap: 0.35rem; gap: 0.35rem;
} }
.table th,
.table td {
padding: 0.55rem 0.65rem;
font-size: 0.9rem;
white-space: nowrap;
}
.table tbody tr {
border-radius: var(--radius-sm);
}
}
.scenario-layout {
display: grid;
gap: 1.5rem;
}
.empty-state {
color: var(--muted);
font-style: italic;
}
}
@media (min-width: 960px) {
.header-actions {
justify-content: flex-start;
}
  .scenario-layout {
    grid-template-columns: 1.1fr 1.9fr;
    align-items: start;


@@ -0,0 +1,72 @@
:root {
/* Neutral surfaces */
--color-bg-base: #0b0f14;
--color-bg-elevated: #0f141b;
--color-surface-default: #151b23;
--color-surface-overlay: rgba(21, 27, 35, 0.7);
--color-border-subtle: rgba(255, 255, 255, 0.08);
--color-border-card: rgba(255, 255, 255, 0.08);
--color-border-strong: rgba(255, 255, 255, 0.12);
--color-highlight: rgba(241, 178, 26, 0.08);
/* Text */
--color-text-primary: #e6edf3;
--color-text-muted: #a9b4c0;
--color-text-subtle: rgba(169, 180, 192, 0.6);
--color-text-invert: #ffffff;
--color-text-dark: #0f172a;
--color-text-strong: #111827;
/* Brand & accent */
--color-brand-base: #f1b21a;
--color-brand-bright: #f6c648;
--color-brand-soft: #f9d475;
--color-accent: #2ba58f;
/* Semantic states */
--color-success: #0c864d;
--color-info: #0b3d88;
--color-warning: #f59e0b;
--color-danger: #7a1721;
/* Shadows & depth */
--shadow: 0 10px 30px rgba(0, 0, 0, 0.35);
--color-panel-shadow: rgba(0, 0, 0, 0.25);
--color-panel-shadow-deep: rgba(0, 0, 0, 0.35);
/* Buttons */
--btn-primary-bg: var(--color-brand-bright);
--btn-primary-color: var(--color-text-dark);
--btn-primary-hover: var(--color-brand-soft);
--btn-secondary-bg: rgba(21, 27, 35, 0.85);
--btn-secondary-hover: rgba(21, 27, 35, 0.95);
--btn-secondary-border: var(--color-border-strong);
--btn-secondary-color: var(--color-text-primary);
--btn-danger-bg: var(--color-danger);
--btn-danger-color: var(--color-text-invert);
--btn-danger-hover: #a21d2b;
--btn-link-color: var(--color-brand-bright);
--btn-link-hover: var(--color-brand-soft);
--btn-ghost-color: var(--color-text-muted);
/* Legacy aliases */
--bg: var(--color-bg-base);
--bg-2: var(--color-bg-elevated);
--card: var(--color-surface-default);
--text: var(--color-text-primary);
--muted: var(--color-text-muted);
--brand: var(--color-brand-base);
--brand-2: var(--color-brand-bright);
--brand-3: var(--color-brand-soft);
--accent: var(--color-accent);
--success: var(--color-success);
--danger: var(--color-danger);
--info: var(--color-info);
--color-border: var(--color-border-subtle);
--card-border: var(--color-border-card);
--color-surface-alt: var(--color-surface-overlay);
}

BIN  static/favicon.ico (new file, 50 KiB)

BIN  static/img/logo.png (new file, 1.0 MiB)

BIN  static/img/logo_128x128.png (new file, 20 KiB)

BIN  (binary image replaced, file not shown: 1.8 MiB before, 831 KiB after)


@@ -1,53 +0,0 @@
// Navigation chevron buttons logic
document.addEventListener("DOMContentLoaded", function () {
const navPrev = document.getElementById("nav-prev");
const navNext = document.getElementById("nav-next");
if (!navPrev || !navNext) return;
// Define the navigation order (main pages)
const navPages = [
"/",
"/projects/ui",
"/imports/ui",
"/ui/simulations",
"/ui/reporting",
"/ui/settings",
];
const currentPath = window.location.pathname;
// Find current index
let currentIndex = -1;
for (let i = 0; i < navPages.length; i++) {
if (currentPath.startsWith(navPages[i])) {
currentIndex = i;
break;
}
}
// If not found, disable both
if (currentIndex === -1) {
navPrev.disabled = true;
navNext.disabled = true;
return;
}
// Set up prev button
if (currentIndex > 0) {
navPrev.addEventListener("click", function () {
window.location.href = navPages[currentIndex - 1];
});
} else {
navPrev.disabled = true;
}
// Set up next button
if (currentIndex < navPages.length - 1) {
navNext.addEventListener("click", function () {
window.location.href = navPages[currentIndex + 1];
});
} else {
navNext.disabled = true;
}
});


@@ -0,0 +1,493 @@
(function () {
const NAV_ENDPOINT = "/navigation/sidebar";
const SIDEBAR_SELECTOR = ".sidebar-nav";
const DATA_SOURCE_ATTR = "navigationSource";
const ROLE_ATTR = "navigationRoles";
const NAV_PREV_ID = "nav-prev";
const NAV_NEXT_ID = "nav-next";
const CACHE_KEY = "calminer:navigation:sidebar";
const CACHE_VERSION = 1;
const CACHE_TTL_MS = 2 * 60 * 1000;
function hasStorage() {
try {
return typeof window.localStorage !== "undefined";
} catch (error) {
return false;
}
}
function loadCacheRoot() {
if (!hasStorage()) {
return null;
}
let raw;
try {
raw = window.localStorage.getItem(CACHE_KEY);
} catch (error) {
return null;
}
if (!raw) {
return { version: CACHE_VERSION, entries: {} };
}
try {
const parsed = JSON.parse(raw);
if (
!parsed ||
typeof parsed !== "object" ||
parsed.version !== CACHE_VERSION ||
typeof parsed.entries !== "object"
) {
return { version: CACHE_VERSION, entries: {} };
}
return parsed;
} catch (error) {
clearCache();
return { version: CACHE_VERSION, entries: {} };
}
}
function persistCache(root) {
if (!hasStorage()) {
return;
}
try {
window.localStorage.setItem(CACHE_KEY, JSON.stringify(root));
} catch (error) {
/* ignore storage write failures */
}
}
function clearCache() {
if (!hasStorage()) {
return;
}
try {
window.localStorage.removeItem(CACHE_KEY);
} catch (error) {
/* ignore */
}
}
function normaliseRoles(roles) {
if (!Array.isArray(roles)) {
return [];
}
const seen = new Set();
const cleaned = [];
for (const value of roles) {
const role = typeof value === "string" ? value.trim() : "";
if (!role || seen.has(role)) {
continue;
}
seen.add(role);
cleaned.push(role);
}
cleaned.sort();
return cleaned;
}
function serialiseRoles(roles) {
const cleaned = normaliseRoles(roles);
if (cleaned.length === 0) {
return "anonymous";
}
return cleaned.join("|");
}
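Taken together, the two helpers above give every role combination a stable cache key: roles are trimmed, de-duplicated, and sorted before joining with "|", and an empty set falls back to "anonymous". A self-contained sketch (the helpers are duplicated here so the snippet runs standalone):

```javascript
// Mirrors normaliseRoles/serialiseRoles from the module above.
function normaliseRoles(roles) {
  if (!Array.isArray(roles)) return [];
  const seen = new Set();
  const cleaned = [];
  for (const value of roles) {
    const role = typeof value === "string" ? value.trim() : "";
    if (!role || seen.has(role)) continue; // drop blanks and duplicates
    seen.add(role);
    cleaned.push(role);
  }
  cleaned.sort(); // order-independent key
  return cleaned;
}

function serialiseRoles(roles) {
  const cleaned = normaliseRoles(roles);
  return cleaned.length === 0 ? "anonymous" : cleaned.join("|");
}

console.log(serialiseRoles([" viewer", "admin", "admin", ""])); // → admin|viewer
console.log(serialiseRoles(null)); // → anonymous
```

Because the list is sorted first, `["admin", "viewer"]` and `["viewer", "admin"]` hit the same cache entry.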
function getCurrentRoles(navContainer) {
const attr = navContainer.dataset[ROLE_ATTR];
if (!attr) {
return null;
}
const roles = attr
.split(",")
.map((role) => role.trim())
.filter(Boolean);
if (roles.length === 0) {
return null;
}
return roles;
}
function readCache(rolesKey) {
if (!rolesKey) {
return null;
}
const root = loadCacheRoot();
if (!root || !root.entries || typeof root.entries !== "object") {
return null;
}
const entry = root.entries[rolesKey];
if (!entry || !entry.payload) {
return null;
}
const cachedAt = typeof entry.cachedAt === "number" ? entry.cachedAt : 0;
const expired = Date.now() - cachedAt > CACHE_TTL_MS;
return { payload: entry.payload, expired };
}
function saveCache(rolesKey, payload) {
if (!rolesKey || !payload) {
return;
}
const root = loadCacheRoot();
if (!root) {
return;
}
if (!root.entries || typeof root.entries !== "object") {
root.entries = {};
}
root.entries[rolesKey] = {
cachedAt: Date.now(),
payload,
};
root.version = CACHE_VERSION;
persistCache(root);
}
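readCache and saveCache keep one entry per role key inside a single versioned localStorage record, and staleness is decided at read time against CACHE_TTL_MS rather than by evicting on write. A condensed in-memory sketch of that round trip (the Map stands in for window.localStorage; saveEntry/readEntry are illustrative names, not part of the module):

```javascript
const CACHE_VERSION = 1;
const CACHE_TTL_MS = 2 * 60 * 1000; // two minutes, as in the module above

function saveEntry(store, rolesKey, payload, now) {
  const raw = store.get("calminer:navigation:sidebar");
  const root = raw ? JSON.parse(raw) : { version: CACHE_VERSION, entries: {} };
  root.entries[rolesKey] = { cachedAt: now, payload };
  store.set("calminer:navigation:sidebar", JSON.stringify(root));
}

function readEntry(store, rolesKey, now) {
  const raw = store.get("calminer:navigation:sidebar");
  if (!raw) return null;
  const root = JSON.parse(raw);
  if (root.version !== CACHE_VERSION) return null; // schema changed, ignore
  const entry = root.entries[rolesKey];
  if (!entry) return null;
  // Stale entries are still returned, flagged so the caller can refetch.
  return { payload: entry.payload, expired: now - entry.cachedAt > CACHE_TTL_MS };
}

const store = new Map();
saveEntry(store, "admin", { groups: [] }, 0);
console.log(readEntry(store, "admin", 60 * 1000).expired); // → false
console.log(readEntry(store, "admin", 3 * 60 * 1000).expired); // → true
```

Returning the payload with an `expired` flag, instead of dropping it, lets the UI render the stale sidebar immediately and refresh in the background.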
function onReady(callback) {
if (document.readyState === "loading") {
document.addEventListener("DOMContentLoaded", callback, { once: true });
} else {
callback();
}
}
function isActivePath(pathname, matchPrefix) {
if (!matchPrefix) {
return false;
}
if (matchPrefix === "/") {
return pathname === "/";
}
return pathname.startsWith(matchPrefix);
}
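isActivePath treats "/" as an exact match but every other prefix as a startsWith test, so nested routes keep their parent link highlighted while the dashboard link only lights up on the root itself. For example:

```javascript
// Copy of isActivePath from the module above.
function isActivePath(pathname, matchPrefix) {
  if (!matchPrefix) return false;
  if (matchPrefix === "/") return pathname === "/"; // root matches only itself
  return pathname.startsWith(matchPrefix); // other prefixes match nested routes
}

console.log(isActivePath("/", "/")); // → true
console.log(isActivePath("/projects/ui", "/")); // → false
console.log(isActivePath("/projects/ui/42", "/projects/ui")); // → true
```

Note the startsWith test is purely lexical: a prefix of "/projects" would also match a hypothetical "/projects-archive" route, which is an accepted trade-off of this scheme.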
function createAnchor({
href,
label,
matchPrefix,
tooltip,
isExternal,
isActive,
className,
}) {
const anchor = document.createElement("a");
anchor.href = href;
anchor.className = className + (isActive ? " is-active" : "");
anchor.dataset.matchPrefix = matchPrefix || href;
if (tooltip) {
anchor.title = tooltip;
}
if (isExternal) {
anchor.target = "_blank";
anchor.rel = "noopener noreferrer";
anchor.classList.add("is-external");
}
anchor.textContent = label;
return anchor;
}
function buildLinkBlock(link, pathname) {
if (!link || !link.href) {
return null;
}
const matchPrefix = link.match_prefix || link.matchPrefix || link.href;
const isActive = isActivePath(pathname, matchPrefix);
const block = document.createElement("div");
block.className = "sidebar-link-block";
if (typeof link.id === "number") {
block.dataset.linkId = String(link.id);
}
const anchor = createAnchor({
href: link.href,
label: link.label,
matchPrefix,
tooltip: link.tooltip,
isExternal: Boolean(link.is_external ?? link.isExternal),
isActive,
className: "sidebar-link",
});
block.appendChild(anchor);
const children = Array.isArray(link.children) ? link.children : [];
if (children.length > 0) {
const container = document.createElement("div");
container.className = "sidebar-sublinks";
for (const child of children) {
if (!child || !child.href) {
continue;
}
const childMatch =
child.match_prefix || child.matchPrefix || child.href;
const childActive = isActivePath(pathname, childMatch);
const childAnchor = createAnchor({
href: child.href,
label: child.label,
matchPrefix: childMatch,
tooltip: child.tooltip,
isExternal: Boolean(child.is_external ?? child.isExternal),
isActive: childActive,
className: "sidebar-sublink",
});
container.appendChild(childAnchor);
}
if (container.children.length > 0) {
block.appendChild(container);
}
}
return block;
}
function buildGroupSection(group, pathname) {
if (!group) {
return null;
}
const links = Array.isArray(group.links) ? group.links : [];
if (links.length === 0) {
return null;
}
  const section = document.createElement("div");
  section.className = "sidebar-section";
  if (typeof group.id === "number") {
    section.dataset.groupId = String(group.id);
  }
  const label = document.createElement("div");
  label.className = "sidebar-section-label";
  label.textContent = group.label;
  section.appendChild(label);
  const linksContainer = document.createElement("div");
  linksContainer.className = "sidebar-section-links";
  for (const link of links) {
    const block = buildLinkBlock(link, pathname);
    if (block) {
      linksContainer.appendChild(block);
    }
  }
  if (linksContainer.children.length === 0) {
    return null;
  }
  section.appendChild(linksContainer);
  return section;
}

function buildEmptyState() {
  const section = document.createElement("div");
  section.className = "sidebar-section sidebar-empty-state";
  const label = document.createElement("div");
  label.className = "sidebar-section-label";
  label.textContent = "Navigation";
  section.appendChild(label);
  const copyWrapper = document.createElement("div");
  copyWrapper.className = "sidebar-section-links";
  const copy = document.createElement("p");
  copy.className = "sidebar-empty-copy";
  copy.textContent = "Navigation is unavailable.";
  copyWrapper.appendChild(copy);
  section.appendChild(copyWrapper);
  return section;
}

function resolvePath(input) {
  if (!input) {
    return null;
  }
  try {
    return new URL(input, window.location.origin).pathname;
  } catch (error) {
    if (input.startsWith("/")) {
      return input;
    }
    return `/${input}`;
  }
}

function flattenNavigation(groups) {
  const sequence = [];
  for (const group of groups) {
    if (!group || !Array.isArray(group.links)) {
      continue;
    }
    for (const link of group.links) {
      if (!link || !link.href) {
        continue;
      }
      const isExternal = Boolean(link.is_external ?? link.isExternal);
      if (!isExternal) {
        sequence.push({
          href: link.href,
          matchPrefix: link.match_prefix || link.matchPrefix || link.href,
        });
      }
      const children = Array.isArray(link.children) ? link.children : [];
      for (const child of children) {
        if (!child || !child.href) {
          continue;
        }
        const childExternal = Boolean(child.is_external ?? child.isExternal);
        if (childExternal) {
          continue;
        }
        sequence.push({
          href: child.href,
          matchPrefix: child.match_prefix || child.matchPrefix || child.href,
        });
      }
    }
  }
  return sequence;
}

function configureChevronButtons(sequence) {
  const prevButton = document.getElementById(NAV_PREV_ID);
  const nextButton = document.getElementById(NAV_NEXT_ID);
  if (!prevButton || !nextButton) {
    return;
  }
  const pathname = window.location.pathname;
  const normalised = sequence
    .map((item) => ({
      href: item.href,
      matchPrefix: item.matchPrefix,
      path: resolvePath(item.matchPrefix || item.href),
    }))
    .filter((item) => Boolean(item.path));
  const currentIndex = normalised.findIndex((item) =>
    isActivePath(pathname, item.matchPrefix || item.path)
  );
  prevButton.disabled = true;
  prevButton.onclick = null;
  nextButton.disabled = true;
  nextButton.onclick = null;
  if (currentIndex === -1) {
    return;
  }
  if (currentIndex > 0) {
    const target = normalised[currentIndex - 1].href;
    prevButton.disabled = false;
    prevButton.onclick = () => {
      window.location.href = target;
    };
  }
  if (currentIndex < normalised.length - 1) {
    const target = normalised[currentIndex + 1].href;
    nextButton.disabled = false;
    nextButton.onclick = () => {
      window.location.href = target;
    };
  }
}

function renderSidebar(navContainer, payload) {
  const pathname = window.location.pathname;
  const groups = Array.isArray(payload?.groups) ? payload.groups : [];
  navContainer.replaceChildren();
  const rendered = [];
  for (const group of groups) {
    const section = buildGroupSection(group, pathname);
    if (section) {
      rendered.push(section);
    }
  }
  if (rendered.length === 0) {
    navContainer.appendChild(buildEmptyState());
    navContainer.dataset[DATA_SOURCE_ATTR] = "client-empty";
    delete navContainer.dataset[ROLE_ATTR];
    configureChevronButtons([]);
    return;
  }
  for (const section of rendered) {
    navContainer.appendChild(section);
  }
  navContainer.dataset[DATA_SOURCE_ATTR] = "client";
  const roles = Array.isArray(payload?.roles) ? payload.roles : [];
  if (roles.length > 0) {
    navContainer.dataset[ROLE_ATTR] = roles.join(",");
  } else {
    delete navContainer.dataset[ROLE_ATTR];
  }
  configureChevronButtons(flattenNavigation(groups));
}

async function hydrateSidebar(navContainer) {
  const roles = getCurrentRoles(navContainer);
  const rolesKey = roles ? serialiseRoles(roles) : null;
  const cached = readCache(rolesKey);
  if (cached && cached.payload) {
    renderSidebar(navContainer, cached.payload);
    if (!cached.expired) {
      return;
    }
  }
  try {
    const response = await fetch(NAV_ENDPOINT, {
      method: "GET",
      credentials: "include",
      headers: {
        Accept: "application/json",
      },
    });
    if (!response.ok) {
      if (!cached || !cached.payload) {
        configureChevronButtons([]);
      }
      if (response.status !== 401 && response.status !== 403) {
        console.warn(
          "Navigation sidebar hydration failed with status",
          response.status
        );
      }
      return;
    }
    const payload = await response.json();
    renderSidebar(navContainer, payload);
    const payloadRoles = Array.isArray(payload?.roles)
      ? payload.roles
      : roles || [];
    saveCache(serialiseRoles(payloadRoles), payload);
  } catch (error) {
    console.warn("Navigation sidebar hydration failed", error);
    if (!cached || !cached.payload) {
      configureChevronButtons([]);
    }
  }
}

onReady(() => {
  const navContainer = document.querySelector(SIDEBAR_SELECTOR);
  if (!navContainer) {
    configureChevronButtons([]);
    return;
  }
  hydrateSidebar(navContainer);
});
})();
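
For readers tracing the hydration flow above, here is a minimal sketch of the payload shape the code appears to expect. The field names are inferred from the property reads in `flattenNavigation` and `renderSidebar`; the exact server schema is an assumption, not something this diff confirms. Since `flattenNavigation` is private to the IIFE, the sketch re-implements the same walk as a standalone `flattenForChevrons` for illustration:

```javascript
// Assumed shape of the navigation payload (inferred, not authoritative).
const samplePayload = {
  roles: ["admin"],
  groups: [
    {
      id: 1,
      label: "Workspace",
      links: [
        { href: "/", label: "Dashboard", match_prefix: "/" },
        {
          href: "/projects/ui",
          label: "Projects",
          match_prefix: "/projects",
          children: [{ href: "/projects/create", label: "New Project" }],
        },
        // External links are excluded from the prev/next sequence.
        { href: "https://example.test/docs", label: "Docs", is_external: true },
      ],
    },
  ],
};

// Standalone re-implementation of the flattening walk shown above:
// internal top-level links first, then their internal children, in order.
function flattenForChevrons(groups) {
  const sequence = [];
  for (const group of groups || []) {
    for (const link of (group && group.links) || []) {
      if (!link || !link.href) continue;
      if (!(link.is_external ?? link.isExternal)) {
        sequence.push({
          href: link.href,
          matchPrefix: link.match_prefix || link.matchPrefix || link.href,
        });
      }
      for (const child of link.children || []) {
        if (child && child.href && !(child.is_external ?? child.isExternal)) {
          sequence.push({
            href: child.href,
            matchPrefix: child.match_prefix || child.matchPrefix || child.href,
          });
        }
      }
    }
  }
  return sequence;
}
```

Because external links never enter the sequence, the nav-prev/nav-next chevrons only ever step between in-app pages.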


@@ -1,14 +1,35 @@
 document.addEventListener("DOMContentLoaded", () => {
-  const table = document.querySelector("[data-project-table]");
-  const rows = table ? Array.from(table.querySelectorAll("tbody tr")) : [];
+  const container = document.querySelector("[data-project-table]");
   const filterInput = document.querySelector("[data-project-filter]");
-  if (table && filterInput) {
+  const resolveFilterItems = () => {
+    if (!container) {
+      return [];
+    }
+    const entries = Array.from(
+      container.querySelectorAll("[data-project-entry]")
+    );
+    if (entries.length) {
+      return entries;
+    }
+    if (container.tagName === "TABLE") {
+      return Array.from(container.querySelectorAll("tbody tr"));
+    }
+    return [];
+  };
+  const filterItems = resolveFilterItems();
+  if (container && filterInput && filterItems.length) {
     filterInput.addEventListener("input", () => {
       const query = filterInput.value.trim().toLowerCase();
-      rows.forEach((row) => {
-        const match = row.textContent.toLowerCase().includes(query);
-        row.style.display = match ? "" : "none";
+      filterItems.forEach((item) => {
+        const match = item.textContent.toLowerCase().includes(query);
+        item.style.display = match ? "" : "none";
       });
     });
   }


@@ -4,7 +4,9 @@
 <meta charset="UTF-8" />
 <meta name="viewport" content="width=device-width, initial-scale=1.0" />
 <title>{% block title %}CalMiner{% endblock %}</title>
-<link rel="stylesheet" href="/static/css/main.css" />
+<link rel="stylesheet" href="/static/css/theme-default.css" />
+<link rel="stylesheet" href="/static/css/main.css" />
+<link rel="stylesheet" href="/static/css/forms.css" />
 <link rel="stylesheet" href="/static/css/imports.css" />
 {% block head_extra %}{% endblock %}
 </head>
@@ -21,11 +23,27 @@
 </div>
 </div>
 {% block scripts %}{% endblock %}
+<script>
+  window.NAVIGATION_URLS = {
+    dashboard:
+      '{{ request.url_for("dashboard.home") if request else "/" }}',
+    projects:
+      '{{ request.url_for("projects.project_list_page") if request else "/projects/ui" }}',
+    imports:
+      '{{ request.url_for("imports.ui") if request else "/imports/ui" }}',
+    simulations:
+      '{{ request.url_for("ui.simulations") if request else "/ui/simulations" }}',
+    reporting:
+      '{{ request.url_for("ui.reporting") if request else "/ui/reporting" }}',
+    settings:
+      '{{ request.url_for("ui.settings") if request else "/ui/settings" }}',
+  };
+</script>
 <script src="/static/js/projects.js" defer></script>
 <script src="/static/js/exports.js" defer></script>
 <script src="/static/js/imports.js" defer></script>
 <script src="/static/js/notifications.js" defer></script>
-<script src="/static/js/navigation.js" defer></script>
+<script src="/static/js/navigation_sidebar.js" defer></script>
 <script src="/static/js/theme.js"></script>
 </body>
 </html>
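
The `window.NAVIGATION_URLS` map injected above hands client scripts server-resolved routes, with hard-coded template fallbacks when `request` is unavailable. A hypothetical consumer helper (not part of this diff; the name `navUrl` is illustrative) might read it defensively like so:

```javascript
// Hypothetical helper: resolve a named route from the injected map,
// falling back to a caller-supplied default when the template did not
// render window.NAVIGATION_URLS (e.g. in tests or non-browser contexts).
function navUrl(key, fallback) {
  const urls =
    (typeof window !== "undefined" && window.NAVIGATION_URLS) || {};
  return urls[key] || fallback;
}
```

Scripts loaded with `defer` run after the inline map is defined, so they can call `navUrl("projects", "/projects/ui")` without ordering hazards.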

templates/currencies.html (new file)

@@ -0,0 +1,31 @@
{% extends "base.html" %}
{% block title %}{{ title }} | CalMiner{% endblock %}
{% block content %}
<div class="page-header">
<div>
<h1>{{ title }}</h1>
<p class="page-subtitle">Manage currency settings and exchange rates for financial calculations.</p>
</div>
</div>
<div class="settings-grid">
<div class="settings-card">
<h2>Currency Configuration</h2>
<p>Define available currencies and their properties.</p>
<p class="settings-card-note">Currency management coming soon</p>
</div>
<div class="settings-card">
<h2>Exchange Rates</h2>
<p>Configure and update currency exchange rates.</p>
<p class="settings-card-note">Exchange rate management coming soon</p>
</div>
<div class="settings-card">
<h2>Default Settings</h2>
<p>Set default currencies for new projects and scenarios.</p>
<p class="settings-card-note">Default currency settings coming soon</p>
</div>
</div>
{% endblock %}


@@ -1,5 +1,4 @@
-{% extends "base.html" %} {% block title %}Dashboard · CalMiner{% endblock %} {%
-block head_extra %}
+{% extends "base.html" %} {% block title %}Dashboard · CalMiner{% endblock %} {% block head_extra %}
 <link rel="stylesheet" href="/static/css/dashboard.css" />
 {% endblock %} {% block content %}
 <section class="page-header dashboard-header">
@@ -165,12 +164,12 @@
 </header>
 <ul class="links-list">
 <li>
-<a href="https://github.com/" target="_blank">CalMiner Repository</a>
+<a href="https://git.allucanget.biz/allucanget/calminer" target="_blank">CalMiner Repository</a>
 </li>
 <li>
-<a href="https://example.com/docs" target="_blank">Documentation</a>
+<a href="https://git.allucanget.biz/allucanget/calminer-docs" target="_blank">Documentation</a>
 </li>
-<li><a href="mailto:support@example.com">Contact Support</a></li>
+<li><a href="mailto:calminer@allucanget.biz">Contact Support</a></li>
 </ul>
 </div>
 </aside>


@@ -9,7 +9,7 @@
 <h5 class="modal-title">Export {{ dataset|capitalize }}</h5>
 <button
 type="button"
-class="btn-close"
+class="btn btn--ghost btn--icon"
 data-dismiss="modal"
 aria-label="Close"
 ></button>
@@ -40,10 +40,10 @@
 >
 </div>
 <div class="modal-footer">
-<button type="button" class="btn btn-secondary" data-dismiss="modal">
+<button type="button" class="btn btn--secondary" data-dismiss="modal">
 Cancel
 </button>
-<button type="submit" class="btn btn-primary">Download</button>
+<button type="submit" class="btn btn--primary">Download</button>
 </div>
 <p class="form-error hidden" data-export-error></p>
 </form>


@@ -24,8 +24,8 @@
 {% include "partials/import_preview_table.html" %}
 <div class="import-actions hidden" data-import-actions>
-<button class="btn primary" data-import-commit disabled>Commit Import</button>
-<button class="btn" data-import-cancel>Cancel</button>
+<button class="btn btn--primary" data-import-commit disabled>Commit Import</button>
+<button class="btn btn--secondary" data-import-cancel>Cancel</button>
 </div>
 </div>
 </section>


@@ -26,7 +26,7 @@
 <label for="password">Password:</label>
 <input type="password" id="password" name="password" required />
 </div>
-<button type="submit" class="btn primary">Login</button>
+<button type="submit" class="btn btn--primary">Login</button>
 </form>
 <p>Don't have an account? <a href="/register">Register here</a></p>
 <p><a href="/forgot-password">Forgot password?</a></p>


@@ -1,7 +1,7 @@
 <footer class="site-footer">
 <div class="container footer-inner">
 <div class="footer-logo">
-<img src="/static/img/logo_big.png" alt="CalMiner Logo" class="footer-logo-img" />
+<img src="/static/img/logo_128x128.png" alt="CalMiner Logo" class="footer-logo-img" />
 </div>
 <p>
 &copy; {{ current_year }} CalMiner by


@@ -1,14 +1,10 @@
 <div class="sidebar-inner">
 <a class="sidebar-brand" href="{{ request.url_for('dashboard.home') }}">
-<img src="/static/img/logo_big.png" alt="CalMiner Logo" class="brand-logo" />
+<img src="/static/img/logo.png" alt="CalMiner Logo" class="brand-logo" />
 <div class="brand-text">
 <span class="brand-title">CalMiner</span>
 <span class="brand-subtitle">Mining Planner</span>
 </div>
 </a>
-<div class="sidebar-nav-controls">
-<button id="nav-prev" class="nav-chevron nav-chevron-prev" aria-label="Previous page">&larr;</button>
-<button id="nav-next" class="nav-chevron nav-chevron-next" aria-label="Next page">&rarr;</button>
-</div>
 {% include "partials/sidebar_nav.html" %}
 </div>


@@ -9,7 +9,7 @@
 <div class="import-upload__dropzone" data-import-dropzone>
 <span class="icon-upload" aria-hidden="true"></span>
 <p>Drag & drop CSV/XLSX files here or</p>
-<label class="btn secondary">
+<label class="btn btn--secondary">
 Browse
 <input type="file" name="import-file" accept=".csv,.xlsx" hidden />
 </label>
@@ -17,8 +17,8 @@
 </div>
 <div class="import-upload__actions">
-<button type="button" class="btn primary" data-import-upload-trigger disabled>Upload & Preview</button>
-<button type="button" class="btn" data-import-reset hidden>Reset</button>
+<button type="button" class="btn btn--primary" data-import-upload-trigger disabled>Upload & Preview</button>
+<button type="button" class="btn btn--secondary" data-import-reset hidden>Reset</button>
 </div>
 {{ feedback("import-upload-feedback", hidden=True, role="alert") }}


@@ -1,98 +1,80 @@
-{% set dashboard_href = request.url_for('dashboard.home') if request else '/' %}
-{% set projects_href = request.url_for('projects.project_list_page') if request else '/projects/ui' %}
-{% set project_create_href = request.url_for('projects.create_project_form') if request else '/projects/create' %}
-{% set auth_session = request.state.auth_session if request else None %}
-{% set is_authenticated = auth_session and auth_session.is_authenticated %}
-{% if is_authenticated %}
-{% set logout_href = request.url_for('auth.logout') if request else '/logout' %}
-{% set account_links = [
-  {"href": logout_href, "label": "Logout", "match_prefix": "/logout"}
-] %}
-{% else %}
-{% set login_href = request.url_for('auth.login_form') if request else '/login' %}
-{% set register_href = request.url_for('auth.register_form') if request else '/register' %}
-{% set forgot_href = request.url_for('auth.password_reset_request_form') if request else '/forgot-password' %}
-{% set account_links = [
-  {"href": login_href, "label": "Login", "match_prefix": "/login"},
-  {"href": register_href, "label": "Register", "match_prefix": "/register"},
-  {"href": forgot_href, "label": "Forgot Password", "match_prefix": "/forgot-password"}
-] %}
-{% endif %}
-{% set nav_groups = [
-  {
-    "label": "Workspace",
-    "links": [
-      {"href": dashboard_href, "label": "Dashboard", "match_prefix": "/"},
-      {"href": projects_href, "label": "Projects", "match_prefix": "/projects"},
-      {"href": project_create_href, "label": "New Project", "match_prefix": "/projects/create"},
-      {"href": "/imports/ui", "label": "Imports", "match_prefix": "/imports"}
-    ]
-  },
-  {
-    "label": "Insights",
-    "links": [
-      {"href": "/ui/simulations", "label": "Simulations"},
-      {"href": "/ui/reporting", "label": "Reporting"}
-    ]
-  },
-  {
-    "label": "Configuration",
-    "links": [
-      {
-        "href": "/ui/settings",
-        "label": "Settings",
-        "children": [
-          {"href": "/theme-settings", "label": "Themes"},
-          {"href": "/ui/currencies", "label": "Currency Management"}
-        ]
-      }
-    ]
-  },
-  {
-    "label": "Account",
-    "links": account_links
-  }
-] %}
-
-<nav class="sidebar-nav" aria-label="Primary navigation">
-  {% set current_path = request.url.path if request else '' %}
-  {% for group in nav_groups %}
-  {% if group.links %}
-  <div class="sidebar-section">
-    <div class="sidebar-section-label">{{ group.label }}</div>
-    <div class="sidebar-section-links">
-      {% for link in group.links %}
-      {% set href = link.href %}
-      {% set match_prefix = link.get('match_prefix', href) %}
-      {% if match_prefix == '/' %}
-      {% set is_active = current_path == '/' %}
-      {% else %}
-      {% set is_active = current_path.startswith(match_prefix) %}
-      {% endif %}
-      <div class="sidebar-link-block">
-        <a href="{{ href }}" class="sidebar-link{% if is_active %} is-active{% endif %}">
-          {{ link.label }}
-        </a>
-        {% if link.children %}
-        <div class="sidebar-sublinks">
-          {% for child in link.children %}
-          {% set child_prefix = child.get('match_prefix', child.href) %}
-          {% if child_prefix == '/' %}
-          {% set child_active = current_path == '/' %}
-          {% else %}
-          {% set child_active = current_path.startswith(child_prefix) %}
-          {% endif %}
-          <a href="{{ child.href }}" class="sidebar-sublink{% if child_active %} is-active{% endif %}">
-            {{ child.label }}
-          </a>
-          {% endfor %}
-        </div>
-        {% endif %}
-      </div>
-      {% endfor %}
-    </div>
-  </div>
-  {% endif %}
-  {% endfor %}
-</nav>
+{% set sidebar_nav = get_sidebar_navigation(request) %}
+{% set nav_roles = sidebar_nav.roles if sidebar_nav and sidebar_nav.roles else [] %}
+{% set nav_groups = sidebar_nav.groups if sidebar_nav else [] %}
+{% set current_path = request.url.path if request else '' %}
+
+<nav
+  class="sidebar-nav"
+  aria-label="Primary navigation"
+  data-navigation-source="{{ 'server' if sidebar_nav else 'fallback' }}"
+  data-navigation-roles="{{ nav_roles | join(',') }}"
+>
+  <div class="sidebar-nav-controls">
+    <button id="nav-prev" class="nav-chevron nav-chevron-prev" aria-label="Previous page"></button>
+    <button id="nav-next" class="nav-chevron nav-chevron-next" aria-label="Next page"></button>
+  </div>
+  {% if nav_groups %}
+  {% for group in nav_groups %}
+  {% if group.links %}
+  <div class="sidebar-section" data-group-id="{{ group.id }}">
+    <div class="sidebar-section-label">{{ group.label }}</div>
+    <div class="sidebar-section-links">
+      {% for link in group.links %}
+      {% set href = link.href %}
+      {% if href %}
+      {% set match_prefix = link.match_prefix or href %}
+      {% if match_prefix == '/' %}
+      {% set is_active = current_path == '/' %}
+      {% else %}
+      {% set is_active = current_path.startswith(match_prefix) %}
+      {% endif %}
+      <div class="sidebar-link-block" data-link-id="{{ link.id }}">
+        <a
+          href="{{ href }}"
+          class="sidebar-link{% if is_active %} is-active{% endif %}{% if link.is_external %} is-external{% endif %}"
+          data-match-prefix="{{ match_prefix }}"
+          {% if link.tooltip %}title="{{ link.tooltip }}"{% endif %}
+          {% if link.is_external %}target="_blank" rel="noopener noreferrer"{% endif %}
+        >
+          {{ link.label }}
+        </a>
+        {% if link.children %}
+        <div class="sidebar-sublinks">
+          {% for child in link.children %}
+          {% set child_href = child.href %}
+          {% if child_href %}
+          {% set child_prefix = child.match_prefix or child_href %}
+          {% if child_prefix == '/' %}
+          {% set child_active = current_path == '/' %}
+          {% else %}
+          {% set child_active = current_path.startswith(child_prefix) %}
+          {% endif %}
+          <a
+            href="{{ child_href }}"
+            class="sidebar-sublink{% if child_active %} is-active{% endif %}{% if child.is_external %} is-external{% endif %}"
+            data-match-prefix="{{ child_prefix }}"
+            {% if child.tooltip %}title="{{ child.tooltip }}"{% endif %}
+            {% if child.is_external %}target="_blank" rel="noopener noreferrer"{% endif %}
+          >
+            {{ child.label }}
+          </a>
+          {% endif %}
+          {% endfor %}
+        </div>
+        {% endif %}
+      </div>
+      {% endif %}
+      {% endfor %}
+    </div>
+  </div>
+  {% endif %}
+  {% endfor %}
+  {% else %}
+  <div class="sidebar-section sidebar-empty-state">
+    <div class="sidebar-section-label">Navigation</div>
+    <div class="sidebar-section-links">
+      <p class="sidebar-empty-copy">Navigation is unavailable.</p>
+    </div>
+  </div>
+  {% endif %}
+</nav>


@@ -17,8 +17,9 @@
 <p class="text-muted">{{ project.operation_type.value.replace('_', ' ') | title }}</p>
 </div>
 <div class="header-actions">
-<a class="btn" href="{{ url_for('projects.edit_project_form', project_id=project.id) }}">Edit Project</a>
-<a class="btn primary" href="{{ url_for('scenarios.create_scenario_form', project_id=project.id) }}">New Scenario</a>
+<a class="btn btn--secondary" href="{{ url_for('scenarios.project_scenario_list', project_id=project.id) }}">Manage Scenarios</a>
+<a class="btn btn--secondary" href="{{ url_for('projects.edit_project_form', project_id=project.id) }}">Edit Project</a>
+<a class="btn btn--primary" href="{{ url_for('scenarios.create_scenario_form', project_id=project.id) }}">New Scenario</a>
 </div>
 </header>
</header> </header>
@@ -46,65 +47,91 @@
 </section>
 <div class="project-layout">
-<section class="card">
-  <h2>Project Overview</h2>
-  <dl class="definition-list">
-    <div>
-      <dt>Location</dt>
-      <dd>{{ project.location or '—' }}</dd>
-    </div>
-    <div>
-      <dt>Description</dt>
-      <dd>{{ project.description or 'No description provided.' }}</dd>
-    </div>
-    <div>
-      <dt>Created</dt>
-      <dd>{{ project.created_at.strftime('%Y-%m-%d %H:%M') }}</dd>
-    </div>
-    <div>
-      <dt>Updated</dt>
-      <dd>{{ project.updated_at.strftime('%Y-%m-%d %H:%M') }}</dd>
-    </div>
-    <div>
-      <dt>Latest Scenario Update</dt>
-      <dd>{{ scenario_stats.latest_update.strftime('%Y-%m-%d %H:%M') if scenario_stats.latest_update else '—' }}</dd>
-    </div>
-  </dl>
-</section>
-<section class="card">
-  <header class="card-header">
-    <h2>Scenarios</h2>
-    <a class="btn" href="{{ url_for('scenarios.create_scenario_form', project_id=project.id) }}">Add Scenario</a>
-  </header>
-  {% if scenarios %}
-  <div class="table-responsive">
-    <table class="table">
-      <thead>
-        <tr>
-          <th>Name</th>
-          <th>Status</th>
-          <th>Currency</th>
-          <th>Primary Resource</th>
-          <th class="text-right">Actions</th>
-        </tr>
-      </thead>
-      <tbody>
-        {% for scenario in scenarios %}
-        <tr>
-          <td>{{ scenario.name }}</td>
-          <td>{{ scenario.status.value.title() }}</td>
-          <td>{{ scenario.currency or '—' }}</td>
-          <td>{{ scenario.primary_resource.value.replace('_', ' ') | title if scenario.primary_resource else '—' }}</td>
-          <td class="text-right">
-            <a class="table-link" href="{{ url_for('scenarios.view_scenario', scenario_id=scenario.id) }}">View</a>
-            <a class="table-link" href="{{ url_for('scenarios.edit_scenario_form', scenario_id=scenario.id) }}">Edit</a>
-          </td>
-        </tr>
-        {% endfor %}
-      </tbody>
-    </table>
-  </div>
+<div class="project-column">
+  <section class="card">
+    <h2>Project Overview</h2>
+    <dl class="definition-list">
+      <div>
+        <dt>Location</dt>
+        <dd>{{ project.location or '—' }}</dd>
+      </div>
+      <div>
+        <dt>Description</dt>
+        <dd>{{ project.description or 'No description provided.' }}</dd>
+      </div>
+      <div>
+        <dt>Created</dt>
+        <dd>{{ project.created_at.strftime('%Y-%m-%d %H:%M') }}</dd>
+      </div>
+      <div>
+        <dt>Updated</dt>
+        <dd>{{ project.updated_at.strftime('%Y-%m-%d %H:%M') }}</dd>
+      </div>
+      <div>
+        <dt>Latest Scenario Update</dt>
+        <dd>{{ scenario_stats.latest_update.strftime('%Y-%m-%d %H:%M') if scenario_stats.latest_update else '—' }}</dd>
+      </div>
+    </dl>
+  </section>
+  <section class="card project-actions-card">
+    <h2>Next Steps</h2>
+    <ul class="quick-link-list">
+      <li>
+        <a href="{{ url_for('scenarios.create_scenario_form', project_id=project.id) }}">Capture a new scenario</a>
+        <p>Create an additional assumption set under this project.</p>
+      </li>
+      <li>
+        <a href="{{ url_for('scenarios.project_scenario_list', project_id=project.id) }}">Review scenario portfolio</a>
+        <p>Compare scenarios and jump into calculators with inherited context.</p>
+      </li>
+      <li>
+        <a href="{{ url_for('projects.edit_project_form', project_id=project.id) }}">Update project details</a>
+        <p>Revise metadata or operation type for reporting.</p>
+      </li>
+    </ul>
+  </section>
+</div>
+<section class="card project-scenarios-card">
+  <header class="project-scenarios-card__header">
+    <div>
+      <h2>Scenarios</h2>
+      <p class="text-muted">Project scenarios inherit pricing and provide entry points to profitability planning.</p>
+    </div>
+    <a class="btn btn--secondary" href="{{ url_for('scenarios.create_scenario_form', project_id=project.id) }}">Add Scenario</a>
+  </header>
+  {% if scenarios %}
+  <ul class="scenario-list">
+    {% for scenario in scenarios %}
+    <li class="scenario-item">
+      <div class="scenario-item__body">
+        <div class="scenario-item__header">
+          <h3><a href="{{ url_for('scenarios.view_scenario', scenario_id=scenario.id) }}">{{ scenario.name }}</a></h3>
+          <span class="status-pill status-pill--{{ scenario.status.value }}">{{ scenario.status.value.title() }}</span>
+        </div>
+        <dl class="scenario-item__meta">
+          <div>
+            <dt>Currency</dt>
+            <dd>{{ scenario.currency or '—' }}</dd>
+          </div>
+          <div>
+            <dt>Primary Resource</dt>
+            <dd>{{ scenario.primary_resource.value.replace('_', ' ') | title if scenario.primary_resource else '—' }}</dd>
+          </div>
+          <div>
+            <dt>Last Updated</dt>
+            <dd>{{ scenario.updated_at.strftime('%Y-%m-%d %H:%M') if scenario.updated_at else '—' }}</dd>
+          </div>
+        </dl>
+      </div>
+      <div class="scenario-item__actions">
+        <a class="btn btn--link" href="{{ url_for('scenarios.view_scenario', scenario_id=scenario.id) }}">View</a>
+        <a class="btn btn--link" href="{{ url_for('scenarios.edit_scenario_form', scenario_id=scenario.id) }}">Edit</a>
+      </div>
+    </li>
+    {% endfor %}
+  </ul>
 {% else %}
 <p class="empty-state">No scenarios yet. <a href="{{ url_for('scenarios.create_scenario_form', project_id=project.id) }}">Create the first scenario.</a></p>
 {% endif %}


@@ -16,26 +16,21 @@
 {% endif %}
 </nav>
-<header class="page-header">
-<div>
-<h1>{% if project %}Edit Project{% else %}Create Project{% endif %}</h1>
-<p class="text-muted">Provide core information about the mining project.</p>
-</div>
-<div class="header-actions">
-<a class="btn" href="{{ cancel_url }}">Cancel</a>
-<button class="btn primary" type="submit">Save Project</button>
-</div>
-</header>
 {% if error %}
 <div class="alert alert-error">{{ error }}</div>
 {% endif %}
-<form class="form project-form" method="post" action="{{ form_action }}">
+<header class="page-header">
+<div>
+<h1>{% if project %}Edit Project{% else %}Create Project{% endif %}</h1>
+<p class="text-muted">Provide core information about the mining project.</p>
+</div>
+<div class="header-actions">
+<a class="btn btn--secondary" href="{{ cancel_url }}">Cancel</a>
+<button class="btn btn--primary" type="submit">Save Project</button>
+</div>
+</header>
+{% if error %}
+<div class="alert alert-error">{{ error }}</div>
+{% endif %}
+<form class="form project-form" method="post" action="{{ form_action }}">
 <div class="form-grid">
 <div class="form-group">
 <label for="name">Name</label>
@@ -63,8 +58,8 @@
 </div>
 <div class="form-actions">
-<a class="btn" href="{{ cancel_url }}">Cancel</a>
-<button class="btn primary" type="submit">Save Project</button>
+<a class="btn btn--secondary" href="{{ cancel_url }}">Cancel</a>
+<button class="btn btn--primary" type="submit">Save Project</button>
 </div>
 </form>
 {% endblock %}


@@ -17,48 +17,61 @@
 class="form-control"
 placeholder="Filter projects..."
 data-project-filter
+aria-label="Filter projects"
 />
-<a class="btn btn-primary" href="{{ url_for('projects.create_project_form') }}">New Project</a>
+<a class="btn btn--primary" href="{{ url_for('projects.create_project_form') }}">New Project</a>
 </div>
 </section>
 {% if projects %}
-<table class="projects-table" data-project-table>
-  <thead>
-    <tr>
-      <th>Name</th>
-      <th>Location</th>
-      <th>Type</th>
-      <th>Scenarios</th>
-      <th></th>
-    </tr>
-  </thead>
-  <tbody>
-    {% for project in projects %}
-    <tr>
-      <td class="table-cell-actions">
-        {{ project.name }}
-        <button
-          class="btn btn-ghost"
-          data-export-trigger
-          data-export-target="projects"
-          title="Export projects dataset"
-        >
-          <span aria-hidden="true"></span>
-          <span class="sr-only">Export</span>
-        </button>
-      </td>
-      <td>{{ project.location or '—' }}</td>
-      <td>{{ project.operation_type.value.replace('_', ' ') | title }}</td>
-      <td>{{ project.scenario_count }}</td>
-      <td class="text-right">
-        <a class="btn btn-link" href="{{ url_for('projects.view_project', project_id=project.id) }}">View</a>
-        <a class="btn btn-link" href="{{ url_for('projects.edit_project_form', project_id=project.id) }}">Edit</a>
-      </td>
-    </tr>
-    {% endfor %}
-  </tbody>
-</table>
+<section class="projects-grid" data-project-table>
+  {% for project in projects %}
+  <article class="project-card" data-project-entry>
+    <header class="project-card__header">
+      <h2 class="project-card__title">
+        <a href="{{ url_for('projects.view_project', project_id=project.id) }}">{{ project.name }}</a>
+      </h2>
+      <span class="project-card__type badge">{{ project.operation_type.value.replace('_', ' ') | title }}</span>
+    </header>
+
+    <p class="project-card__description">
+      {{ project.description or 'No description provided yet.' }}
+    </p>
+
+    <dl class="project-card__meta">
+      <div>
+        <dt>Scenarios</dt>
+        <dd><span class="badge badge-pill">{{ project.scenario_count }}</span></dd>
+      </div>
+      <div>
+        <dt>Location</dt>
+        <dd>{{ project.location or '—' }}</dd>
+      </div>
+      <div>
+        <dt>Updated</dt>
+        <dd>{{ project.updated_at.strftime('%Y-%m-%d') if project.updated_at else '—' }}</dd>
+      </div>
+    </dl>
+
+    <footer class="project-card__footer">
+      <div class="project-card__links">
+        <a class="btn btn--link" href="{{ url_for('projects.view_project', project_id=project.id) }}">View Project</a>
+        <a class="btn btn--link" href="{{ url_for('scenarios.create_scenario_form', project_id=project.id) }}">Add Scenario</a>
+        <a class="btn btn--link" href="{{ url_for('projects.edit_project_form', project_id=project.id) }}">Edit</a>
+      </div>
+      <button
+        class="btn btn--ghost"
+        data-export-trigger
+        data-export-target="projects"
+        title="Export projects dataset"
+      >
+        <span aria-hidden="true"></span>
+        <span class="sr-only">Export</span>
+      </button>
+    </footer>
+  </article>
+  {% endfor %}
+</section>
 {% else %}
 <p>No projects yet. <a href="{{ url_for('projects.create_project_form') }}">Create your first project.</a></p>
 {% endif %}

Some files were not shown because too many files have changed in this diff.