feat: documentation update
- Completed export workflow implementation (query builders, CSV/XLSX serializers, streaming API endpoints, UI modals, automated tests).
- Added export modal UI and client script to trigger downloads directly from dashboard.
- Documented import/export field mapping and usage guidelines in FR-008.
- Updated installation guide with export environment variables, dependencies, and CLI/CI usage instructions.
@@ -39,12 +39,21 @@ Before you begin, ensure that you have the following prerequisites installed on
3. **Access the Application**

Once the containers are up and running, you can access the Calminer application by navigating to `http://localhost:8003` in your web browser.

If you are running the application on a remote server, replace `localhost` with the server's IP address or domain name.

4. **Database Initialization**

The first time you run the application, the database will be initialized automatically. Ensure that the database container is running and accessible.

The application container executes `/app/scripts/docker-entrypoint.sh` before launching the API. This entrypoint runs `python -m scripts.run_migrations`, which applies all Alembic migrations and keeps the schema current on every startup. No additional action is required when using Docker Compose, but you can review the logs to confirm the migrations completed successfully.
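
A quick way to review those logs, assuming the Compose service is named `app` (adjust to match your `docker-compose.yml`), is to filter the container output for Alembic activity:

```bash
# Show migration-related log lines from the application container
docker compose logs app | grep -i -E "alembic|migration"
```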
For local development without Docker, run the same command after setting your environment variables:

```bash
# activate your virtualenv first
python -m scripts.run_migrations
```

The script is idempotent; it will only apply pending migrations.

5. **Seed Default Accounts and Roles**
@@ -65,15 +74,24 @@ Before you begin, ensure that you have the following prerequisites installed on
You can rerun the script safely; it updates existing roles and user details without creating duplicates.

### Export Dependencies

Export and monitoring workflows require the following Python packages in addition to the core dependencies:

- `pandas`
- `openpyxl`
- `prometheus-client`

These libraries are already listed in `requirements.txt`. Ensure they are installed in your virtual environment if you are not using Docker.
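
To confirm they are importable from the active environment, a minimal sanity check (using the import names of the three packages listed above) is:

```bash
# Fails with an ImportError if any export/monitoring dependency is missing
python -c "import pandas, openpyxl, prometheus_client; print('export dependencies OK')"
```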
### Environment Variables for Export Features

While exports reuse the existing database configuration, you may optionally set the following variables to adjust behavior:

- `CALMINER_EXPORT_MAX_ROWS` — override default pagination when generating exports (optional).
- `CALMINER_EXPORT_METADATA` — enable (`true`) or disable (`false`) the metadata sheet in Excel exports by default (UI form still allows per-request overrides).

Set these variables in your `.env` file or compose environment section before launching the stack.
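
For example, a `.env` entry might look like the following sketch; the row limit is an illustrative value, not a recommended default:

```bash
# .env: optional export tuning
CALMINER_EXPORT_MAX_ROWS=50000
CALMINER_EXPORT_METADATA=true
```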
## Docker Configuration
@@ -83,6 +101,45 @@ The `docker-compose.yml` file contains the configuration for the Calminer applic
The application uses environment variables to configure various settings. You can set these variables in a `.env` file in the root directory of the project. Refer to the `docker-compose.yml` file for a list of available environment variables and their default values.

Key variables relevant to import/export workflows:

| Variable | Default | Description |
| ----------------------------- | --------- | ------------------------------------------------------------------------------- |
| `CALMINER_EXPORT_MAX_ROWS` | _(unset)_ | Optional safety guard to limit the number of rows exported in a single request. |
| `CALMINER_EXPORT_METADATA` | `true` | Controls whether metadata sheets are generated by default during Excel exports. |
| `CALMINER_IMPORT_STAGING_TTL` | `300` | Controls how long staged import tokens remain valid before expiration. |
| `CALMINER_IMPORT_MAX_ROWS` | _(unset)_ | Optional guard to prevent excessively large import files. |

### Running Export Workflows Locally

1. Activate your virtual environment and ensure dependencies are installed:

```bash
pip install -r requirements.txt
```

2. Start the FastAPI application (or use `docker compose up`).
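
If you are not using Docker, one common way to serve a FastAPI app locally is uvicorn; the module path `app.main:app` below is an assumption about this project's layout, so substitute your actual ASGI entry point:

```bash
# Run the API on port 8000 to match the curl examples below
uvicorn app.main:app --reload --port 8000
```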
3. Use the `/exports/projects` or `/exports/scenarios` endpoints to request CSV/XLSX downloads:

```bash
curl -X POST http://localhost:8000/exports/projects \
  -H "Content-Type: application/json" \
  -d '{"format": "csv"}' --output projects.csv

curl -X POST http://localhost:8000/exports/projects \
  -H "Content-Type: application/json" \
  -d '{"format": "xlsx"}' --output projects.xlsx
```

4. The Prometheus metrics endpoint is available at `/metrics` once the app is running. Ensure your monitoring stack scrapes it (e.g., Prometheus target `localhost:8000`).
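
To spot-check the endpoint before wiring up a scrape job, curl it directly and filter for the export series (the `calminer_export_*` metrics described in the export operations runbook):

```bash
# Lists calminer_export_* samples once at least one export has been requested
curl -s http://localhost:8000/metrics | grep calminer_export
```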
5. For automated verification in CI pipelines, invoke the dedicated pytest module:

```bash
pytest tests/test_export_routes.py
```

### Volumes

The application uses Docker volumes to persist data. The following volumes are defined in the `docker-compose.yml` file:
@@ -92,6 +149,16 @@ The application uses Docker volumes to persist data. The following volumes are d
Ensure that these volumes are properly configured to avoid data loss during container restarts or removals.

## Stopping the Application

To stop the application, run the following command in the terminal:

```bash
docker compose down
```

This command will stop and remove the containers, networks, and volumes created by Docker Compose.

## Troubleshooting

If you encounter any issues during the installation or deployment process, refer to the following troubleshooting tips:
admin/runbooks/export_operations.md (new file, 81 lines)
@@ -0,0 +1,81 @@
# Export Operations Runbook

## Purpose

This runbook provides step-by-step guidance for operators to execute project and scenario exports, monitor their status, and troubleshoot common issues.

## Prerequisites

- Access to the CalMiner web UI with role `analyst`, `project_manager`, or `admin`.
- Direct API access (curl or HTTP client) if performing scripted exports.
- Environment variables configured per [Installation Guide](installation.md), especially:
  - `CALMINER_EXPORT_MAX_ROWS`
  - `CALMINER_EXPORT_METADATA`

## Success Path

### Export via Web UI

1. Sign in to CalMiner.
2. Navigate to the dashboard and click **Export** next to either _Recent Projects_ or _Scenario Alerts_.
3. In the modal dialog:
   - Choose **CSV** or **Excel (.xlsx)**.
   - Toggle **Include metadata sheet** (Excel only) as needed.
   - Click **Download**.
4. Confirm that the browser downloads a file named `projects-YYYYMMDD-HHMMSS.csv` (or `.xlsx`).

### Export via API (curl)

```bash
# CSV export of projects
curl -X POST https://<host>/exports/projects \
  -H "Content-Type: application/json" \
  -d '{"format": "csv"}' \
  --output projects.csv

# Excel export of scenarios
curl -X POST https://<host>/exports/scenarios \
  -H "Content-Type: application/json" \
  -d '{"format": "xlsx"}' \
  --output scenarios.xlsx
```

Expected response headers:

- `Content-Type: text/csv` or `application/vnd.openxmlformats-officedocument.spreadsheetml.sheet`
- `Content-Disposition: attachment; filename=...`
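
A quick way to verify those headers without inspecting the downloaded file is to dump the response headers and discard the body (replace `<host>` as in the examples above):

```bash
# Print only the response headers for a CSV export request
curl -s -D - -o /dev/null -X POST https://<host>/exports/projects \
  -H "Content-Type: application/json" \
  -d '{"format": "csv"}'
```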
## Troubleshooting

| Symptom | Likely Cause | Resolution |
| ---------------------------------------- | ---------------------------------------------- | --------------------------------------------------------------------------- |
| `403 Forbidden` | User lacks analyst/project_manager/admin role. | Assign appropriate role or escalate to administrator. |
| `400 Bad Request` with validation errors | Unsupported format or malformed filters. | Verify payload matches schema (`format` = `csv` or `xlsx`); review filters. |
| Empty dataset | No matching records for filters. | Validate data exists; adjust filters or check project/scenario status. |
| Large exports time out | Dataset exceeds `CALMINER_EXPORT_MAX_ROWS`. | Increase limit (with caution) or export narrower dataset. |
## Monitoring & Logging

- Success and error events are logged via structured events (`import.preview`, `import.commit`, `export`). Ensure your log sink (e.g., ELK) captures the JSON payloads and indexes fields such as `dataset`, `status`, `row_count`, and `token` for filtering.
- Prometheus endpoint: `GET /metrics`
- Sample scrape config:

  ```yaml
  scrape_configs:
    - job_name: calminer
      static_configs:
        - targets: ["calminer.local:8000"]
  ```

- Key metrics:
  - `calminer_import_total` / `calminer_export_total` — counters labelled by `dataset`, `action`, and `status` (imports) or `dataset`, `status`, and `format` (exports).
  - `calminer_import_duration_seconds` / `calminer_export_duration_seconds` — histograms for measuring operation duration.
- Alerting suggestions (see the sample queries after this list):
  - Trigger when `calminer_export_total{status="failure"}` increases over the last 5 minutes.
  - Trigger when the 95th percentile of `calminer_export_duration_seconds` exceeds your SLA threshold.
- Dashboard recommendations:
  - Plot export/import throughput split by dataset and format.
  - Surface recent failures with `detail` and `error` metadata pulled from the `import_export_logs` table.
  - Combine logs, metrics, and DB audit records to trace user actions end-to-end.
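
As a starting point for those alert conditions, the queries below can be run against the Prometheus HTTP API; the `localhost:9090` address and the conventional `_bucket` series for the duration histogram are assumptions to adapt to your environment:

```bash
# Export failures observed over the last 5 minutes
curl -sG 'http://localhost:9090/api/v1/query' \
  --data-urlencode 'query=increase(calminer_export_total{status="failure"}[5m]) > 0'

# 95th percentile export duration over the last 5 minutes; compare against your SLA threshold
curl -sG 'http://localhost:9090/api/v1/query' \
  --data-urlencode 'query=histogram_quantile(0.95, sum(rate(calminer_export_duration_seconds_bucket[5m])) by (le))'
```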