Compare commits: 2b9877a9d3...main (17 commits)

Commits: 68048ff574, f73ab7ad14, 3c97c47f7e, c35049cd54, f9086d2d04, 25ca7ab196,
0b84ee953e, 8877380f21, 4393f17c45, e10b2ee71c, 1c8adb36fe, c2927f2f60,
090dca29c2, 92d19235d8, a488a385ad, 7b24ab511a, 615b63ba76
.gitignore (vendored) — 3 changed lines

@@ -44,6 +44,3 @@ alembic.ini.timestamp
 # Project specific
 rail_game_backend.egg-info/
 .github/instructions/
-
-# TODO
-TODO.md
README.md — 287 changed lines

@@ -1,146 +1,213 @@
 # Rail Game
 
-A browser-based railway simulation game using real world railway maps from OpenStreetMap.
+A browser-based railway simulation game using real-world railway maps from OpenStreetMap.
+
+## At a glance
+
+- Frontend: React + Vite (TypeScript)
+- Backend: Python (FastAPI, SQLAlchemy)
+- Database: PostgreSQL with PostGIS (spatial types)
+- Mapping: Leaflet + OpenStreetMap
 
 ## Features
 
-- Real world railway maps
-- Interactive Leaflet map preview of the demo network snapshot
-- Build and manage your own railway network
-- Dynamic train schedules
+- Real-world railway maps
+- Interactive Leaflet map preview of a demo network snapshot
+- Build and manage your railway network
+- Dynamic train schedules and simulated trains
 
-## Architecture
+## Current project layout
 
-The project is built using the following technologies:
+This repository contains a full-stack demo app (frontend + backend), supporting scripts, docs and infra. Key folders:
 
-- Frontend: HTML5, CSS3, JavaScript, React
-- Backend: Python, FastAPI, Flask, SQLAlchemy
-- Database: PostgreSQL with PostGIS extension
-- Mapping: Leaflet, OpenStreetMap
+- `backend/` — FastAPI application, models, services, migration scripts and backend tests.
+- `frontend/` — React app (Vite) and frontend tests.
+- `docs/` — Architecture docs and ADRs.
+- `infra/` — Deployment assets (Dockerfiles, compose files, init scripts).
+- `data/` — Fixtures and imported OSM snapshots.
+- `scripts/` — Utility scripts (precommit helpers, setup hooks).
+- `tests/` — End-to-end tests and cross-cutting tests.
 
 ## Project Structure
 
 Planned structure for code and assets (folders created as needed):
 
 ```text
 rail-game/
 |-- backend/
 |   |-- app/
 |   |   |-- api/            # FastAPI/Flask route handlers
 |   |   |-- core/           # Config, startup, shared utilities
 |   |   |-- models/         # SQLAlchemy models and schemas
 |   |   |-- services/       # Domain logic and service layer
 |   |   `-- websocket/      # Real-time communication handlers
 |   |-- tests/              # Backend unit and integration tests
 |   `-- requirements/       # Backend dependency lockfiles
 |-- frontend/
 |   |-- public/             # Static assets served as-is
 |   |-- src/
 |   |   |-- components/     # Reusable React components
 |   |   |-- hooks/          # Custom React hooks
 |   |   |-- pages/          # Top-level routed views
 |   |   |-- state/          # Redux/Context stores and slices
 |   |   |-- styles/         # Global and modular stylesheets
 |   |   `-- utils/          # Frontend helpers and formatters
 |   `-- tests/              # Frontend unit and integration tests
 |-- docs/                   # Architecture docs, ADRs, guides
 |-- infra/                  # Deployment, IaC, Docker, CI workflows
 |-- scripts/                # Tooling for setup, linting, migrations
 |-- data/                   # Seed data, fixtures, import/export tooling
 `-- tests/                  # End-to-end and cross-cutting tests
 ```
 
-Use `infra/` to capture deployment assets (Dockerfiles, compose files, Terraform) and `.github/` for automation. Shared code that crosses layers should live in the respective service directories or dedicated packages under `backend/`.
+Refer to the in-repo `docs/` for architecture decisions and deeper design notes.
 ## Installation
 
-1. Clone the repository:
+Below are concise, verified steps for getting the project running locally. Commands show both PowerShell (Windows) and Bash/macOS/Linux variants where they differ.
 
-```bash
-git clone https://github.com/zwitschi/rail-game.git
-cd rail-game
-```
+## Prerequisites
 
-2. Set up the backend (from the project root):
+- Git
+- Python 3.10+ (3.11 recommended) and pip
+- Node.js 16+ (or the version required by `frontend/package.json`)
+- PostgreSQL with PostGIS if you want to run the full DB-backed stack locally
+- Docker & Docker Compose (optional, for containerized dev)
 
-```bash
-python -m venv .venv
-.\.venv\Scripts\activate
-python -m pip install -e .[dev]
-```
+### Clone repository
 
-3. Set up the frontend:
+PowerShell / Bash
 
-```bash
-cd frontend
-npm install
-cd ..
-```
+git clone https://github.com/zwitschi/rail-game.git
+cd rail-game
 
-4. Copy the sample environment file and adjust the database URLs per environment:
+### Backend: create virtual environment and install
 
-```bash
-copy .env.example .env # PowerShell: Copy-Item .env.example .env
-```
+PowerShell
 
-`DATABASE_URL`, `TEST_DATABASE_URL`, and `ALEMBIC_DATABASE_URL` control the runtime, test, and migration connections respectively.
-5. (Optional) Point Git to the bundled hooks: `pwsh scripts/setup_hooks.ps1`.
-6. Run database migrations to set up the schema:
+python -m venv .venv
+.\.venv\Scripts\Activate.ps1
+python -m pip install -e .[dev]
 
-```bash
-cd backend
-alembic upgrade head
-cd ..
-```
+Bash / macOS / Linux
 
-7. Start the development servers from separate terminals:
+python -m venv .venv
+source .venv/bin/activate
+python -m pip install -e '.[dev]'
 
-- Backend: `uvicorn backend.app.main:app --reload --port 8000`
-- Frontend: `cd frontend && npm run dev`
+### Notes
 
-8. Open your browser: frontend runs at `http://localhost:5173`, backend API at `http://localhost:8000`.
-9. Run quality checks:
+- Installing editable extras (`.[dev]`) installs dev/test tools used by the backend (pytest, black, isort, alembic, etc.).
 
-- Backend unit tests: `pytest`
-- Backend formatters: `black backend/` and `isort backend/`
-- Frontend lint: `cd frontend && npm run lint`
-- Frontend type/build check: `cd frontend && npm run build`
+### Environment file
 
-10. Build for production:
+Copy the sample `.env.example` to `.env` and adjust the database connection strings as needed.
 
-- Frontend bundle: `cd frontend && npm run build`
-- Backend container: `docker build -t rail-game-backend backend/`
+PowerShell
 
-11. Run containers:
-- Backend: `docker run -p 8000:8000 rail-game-backend`
-- Frontend: Serve `frontend/dist` with any static file host.
+Copy-Item .env.example .env
 
-## Database Migrations
+Bash
 
-- Alembic configuration lives in `backend/alembic.ini` with scripts under `backend/migrations/`.
-- Generate new revisions with `alembic revision --autogenerate -m "short description"` (ensure models are imported before running autogenerate).
-- Apply migrations via `alembic upgrade head`; rollback with `alembic downgrade -1` during development.
+cp .env.example .env
 
-## PostgreSQL Configuration
+### Important environment variables
 
-- **Database URLs**: The backend reads connection strings from the `.env` file. Set `DATABASE_URL` (development), `TEST_DATABASE_URL` (pytest/CI), and `ALEMBIC_DATABASE_URL` (migration runner). URLs use the SQLAlchemy format, e.g. `postgresql+psycopg://user:password@host:port/database`.
-- **Required Extensions**: Migrations enable `postgis` for spatial types and `pgcrypto` for UUID generation. Ensure your Postgres instance has these extensions available.
-- **Recommended Databases**: create `railgame_dev` and `railgame_test` (or variants) owned by a dedicated `railgame` role with privileges to create extensions.
-- **Connection Debugging**: Toggle `DATABASE_ECHO=true` in `.env` to log SQL statements during development.
+- `DATABASE_URL` — runtime DB connection for the app
+- `TEST_DATABASE_URL` — database used by pytest in CI/local tests
+- `ALEMBIC_DATABASE_URL` — used when running alembic outside the app process
 
-## API Preview
+### Database (Postgres + PostGIS)
 
-- `GET /api/health` – Lightweight readiness probe.
-- `POST /api/auth/register` – Creates an in-memory demo account and returns a JWT access token.
-- `POST /api/auth/login` – Exchanges credentials for a JWT access token (demo user: `demo` / `railgame123`).
-- `GET /api/auth/me` – Returns the current authenticated user profile.
-- `GET /api/network` – Returns a sample snapshot of stations, tracks, and trains (camelCase fields) generated from shared domain models; requires a valid bearer token.
+If you run Postgres locally, create the dev/test DBs and ensure the `postgis` extension is available. Example (psql):
 
-## Developer Tooling
+-- create DBs (run in psql as a superuser or role with create privileges)
+CREATE DATABASE railgame_dev;
+CREATE DATABASE railgame_test;
 
-- Install backend tooling in editable mode: `python -m pip install -e .[dev]`.
-- Configure git hooks (Git for Windows works with these scripts): `pwsh scripts/setup_hooks.ps1`.
-- Pre-commit hooks run `black`, `isort`, `pytest backend/tests`, and `npm run lint` if frontend dependencies are installed.
-- Run the checks manually any time with `python scripts/precommit.py`.
-- Frontend lint/format commands live in `frontend/package.json` (`npm run lint`, `npm run format`).
-- Continuous integration runs via workflows in `.github/workflows/` covering backend lint/tests and frontend lint/build.
+-- connect to the db and enable extensions
+\c railgame_dev
+CREATE EXTENSION IF NOT EXISTS postgis;
+CREATE EXTENSION IF NOT EXISTS pgcrypto;
+
+Adjust DB names and roles to match your `.env` values.
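The README keys the runtime, test, and migration connections off three environment variables in SQLAlchemy URL format. As a hedged illustration (the helper below is mine, not part of the repo), selecting the right URL per context might look like:

```python
import os

# Hypothetical helper mirroring the README's DATABASE_URL / TEST_DATABASE_URL /
# ALEMBIC_DATABASE_URL convention. SQLAlchemy-style URLs look like
# postgresql+psycopg://user:password@host:port/database
def database_url(context: str = "runtime") -> str:
    env_keys = {
        "runtime": "DATABASE_URL",
        "test": "TEST_DATABASE_URL",
        "migrations": "ALEMBIC_DATABASE_URL",
    }
    key = env_keys[context]
    # Fall back to the runtime URL so tests/migrations still resolve a
    # connection if only DATABASE_URL is configured.
    return os.environ.get(key) or os.environ["DATABASE_URL"]

os.environ["DATABASE_URL"] = (
    "postgresql+psycopg://railgame:railgame@localhost:5432/railgame_dev"
)
print(database_url("migrations"))
```

The fallback behavior here is an assumption for the sketch; the actual backend may treat a missing variable as a hard error instead.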
+### Quick database setup (recommended)
+
+For a streamlined setup, use the included initialization script after configuring your `.env` file:
+
+PowerShell / Bash
+
+python scripts/init_demo_db.py
+
+This script validates your environment, runs migrations, and loads demo OSM data. Use `--dry-run` to preview changes, or `--region` to load specific regions.
+
+**Note**: The script uses `python-dotenv` to load your `.env` file. If not installed, run `pip install python-dotenv`.
+
+### Run migrations
+
+PowerShell / Bash
+
+cd backend
+alembic upgrade head
+cd ..
+
+If you prefer to run alembic with a specific URL without editing `.env`, set `ALEMBIC_DATABASE_URL` in the environment before running the command.
+
+### Load OSM fixtures (optional)
+
+Use the included scripts to refresh stations and tracks from saved OSM fixtures. This step assumes the database is migrated and reachable.
+
+PowerShell / Bash
+
+# dry-run
+python -m backend.scripts.osm_refresh --region all --no-commit
+
+# commit to DB
+python -m backend.scripts.osm_refresh --region all
+
+See `backend/scripts/*.py` for more granular import options (`--skip-*` flags).
+
+### Frontend
+
+Install dependencies and run the dev server from the `frontend/` directory.
+
+PowerShell / Bash
+
+cd frontend
+npm install
+npm run dev
+
+The frontend runs at `http://localhost:5173` by default (Vite). The React app talks to the backend API at the address configured in its environment (see `frontend` README or `vite` config).
+
+### Run backend locally (development)
+
+PowerShell / Bash
+
+# from project root
+uvicorn backend.app.main:app --reload --port 8000
+
+The backend API listens at `http://localhost:8000` by default.
+
+### Tests & linters
+
+Backend
+
+pytest
+black backend/ && isort backend/
+
+Frontend
+
+cd frontend
+npm run lint
+npm run build # type/build check
+
+### Docker / Compose (optional)
+
+Build and run both services with Docker Compose if you prefer containers:
+
+PowerShell / Bash
+
+docker compose up --build
+
+This starts all services (Postgres, Redis, backend, frontend) and automatically initializes the database with demo data on first run. The backend waits for the database to be ready before running migrations and loading OSM fixtures.
+
+**Services:**
+
+- Backend API: `http://localhost:8000`
+- Frontend: `http://localhost:8080`
+- Postgres: `localhost:5432`
+- Redis: `localhost:6379`
+
+This expects a working Docker environment and may require you to set DB URLs to point to the containerized Postgres service if one is defined in `docker-compose.yml`.
+
+## Troubleshooting
+
+- If migrations fail with missing PostGIS functions, ensure `postgis` is installed and enabled in the target database.
+- If alembic autogenerate creates unexpected changes, confirm the models being imported match the app import path used by `alembic` (see `backend/migrations/env.py`).
+- For authentication/debugging, the demo user is `demo` / `railgame123` (used by some integration tests and the demo auth flow).
+- If the frontend dev server fails due to the Node version, check the `engines` field in `frontend/package.json` or use `nvm`/`nvm-windows` to match the recommended Node version.
+
+## API preview
+
+Some useful endpoints for local testing:
+
+- `GET /api/health` — readiness probe
+- `POST /api/auth/register` — demo account creation + JWT
+- `POST /api/auth/login` — exchange credentials for JWT (demo user: `demo` / `railgame123`)
+- `GET /api/auth/me` — current user profile (requires bearer token)
+- `GET /api/network` — sample network snapshot (requires bearer token)
+
+## Contributing
+
+- See `docs/` for architecture and ADRs.
+- Keep tests green and follow formatting rules (black, isort for Python; Prettier/ESLint for frontend).
+- Open issues or PRs for bugs, features, or docs improvements.
TODO.md — 79 changed lines (file deleted)

@@ -1,79 +0,0 @@
-# Development TODO Plan
-
-## Phase 1 – Project Foundations
-
-- [x] Initialize Git hooks, linting, and formatting tooling (ESLint, Prettier, isort, black).
-- [x] Configure `pyproject.toml` or equivalent for backend dependency management.
-- [x] Scaffold FastAPI application entrypoint with health-check endpoint.
-- [x] Bootstrap React app with Vite/CRA, including routing skeleton and global state provider.
-- [x] Define shared TypeScript/Python models for core domain entities (tracks, stations, trains).
-- [x] Set up CI workflow for linting and test automation (GitHub Actions).
-
-## Phase 2 – Core Features
-
-- [x] Implement authentication flow (backend JWT, frontend login/register forms).
-- [x] Build map visualization integrating Leaflet with OSM tiles.
-- [ ] Define geographic bounding boxes and filtering rules for importing real-world stations from OpenStreetMap.
-- [ ] Implement an import script/CLI that pulls OSM station data and normalizes it to the PostGIS schema.
-- [ ] Expose backend CRUD endpoints for stations (create, update, archive) with validation and geometry handling.
-- [ ] Build React map tooling for manual station placement and editing, including form validation.
-- [ ] Define track selection criteria and tagging rules for harvesting OSM rail segments within target regions.
-- [ ] Extend the importer to load track geometries and associate them with existing stations.
-- [ ] Implement backend track-management APIs with length/speed validation and topology checks.
-- [ ] Create a frontend track-drawing workflow (polyline editor, snapping to stations, undo/redo).
-- [ ] Design train connection manager requirements (link trains to operating tracks, manage consist data).
-- [ ] Implement backend services and APIs to attach trains to routes and update assignments.
-- [ ] Add UI flows for managing train connections, including visual feedback on the map.
-- [ ] Establish train scheduling service with validation rules, conflict detection, and persistence APIs.
-- [ ] Provide frontend scheduling tools (timeline or table view) for creating and editing train timetables.
-- [ ] Develop frontend dashboards for resources, schedules, and achievements.
-- [ ] Add real-time simulation updates (WebSocket layer, frontend subscription hooks).
-
-## Phase 3 – Data & Persistence
-
-- [x] Design PostgreSQL/PostGIS schema and migrations (Alembic or similar).
-- [x] Implement data access layer with SQLAlchemy and repository abstractions.
-- [ ] Decide on canonical fixture scope (demo geography, sample trains) and document expected dataset size.
-- [ ] Author fixture generation scripts that export JSON/GeoJSON compatible with the repository layer.
-- [ ] Create ingestion utilities to load fixtures into local and CI databases.
-- [ ] Provision a Redis instance/container for local development.
-- [ ] Add caching abstractions in backend services (e.g., network snapshot, map layers).
-- [ ] Implement cache invalidation hooks tied to repository mutations.
-
-## Phase 4 – Testing & Quality
-
-- [x] Write unit tests for backend services and models.
-- [ ] Configure Jest/RTL testing utilities and shared mocks for Leaflet and network APIs.
-- [ ] Write component tests for map controls, station builder UI, and dashboards.
-- [ ] Add integration tests for custom hooks (network snapshot, scheduling forms).
-- [x] Stand up Playwright/Cypress project structure with authentication helpers.
-- [x] Script login end-to-end flow (Playwright).
-- [ ] Script station creation end-to-end flow.
-- [ ] Script track placement end-to-end flow.
-- [ ] Script scheduling end-to-end flow.
-- [ ] Define load/performance targets (requests per second, simulation latency) and tooling.
-- [ ] Implement performance test harness covering scheduling and real-time updates.
-
-## Phase 5 – Deployment & Ops
-
-- [x] Create Dockerfile for frontend.
-- [x] Create Dockerfile for backend.
-- [x] Create docker-compose for local development with Postgres/Redis dependencies.
-- [ ] Add task runner commands to orchestrate container workflows.
-- [ ] Set up CI/CD pipeline for automated builds, tests, and container publishing.
-- [ ] Provision infrastructure scripts (Terraform/Ansible) targeting initial cloud environment.
-- [ ] Define environment configuration strategy (secrets management, config maps).
-- [ ] Configure observability stack (logging, metrics, tracing).
-- [ ] Integrate tracing/logging exporters into backend services.
-- [ ] Document deployment pipeline and release process.
-
-## Phase 6 – Polish & Expansion
-
-- [ ] Add leaderboards and achievements logic with UI integration.
-- [ ] Design data model changes required for achievements and ranking.
-- [ ] Implement accessibility audit fixes (WCAG compliance).
-- [ ] Conduct accessibility audit (contrast, keyboard navigation, screen reader paths).
-- [ ] Optimize asset loading and introduce lazy loading strategies.
-- [ ] Establish performance budgets for bundle size and render times.
-- [ ] Evaluate multiplayer/coop roadmap and spike POCs where feasible.
-- [ ] Prototype networking approach (WebRTC/WebSocket) for cooperative sessions.
@@ -8,15 +8,26 @@ ENV PYTHONDONTWRITEBYTECODE=1 \
 WORKDIR /app
 
 RUN apt-get update \
-    && apt-get install -y --no-install-recommends build-essential libpq-dev \
+    && apt-get install -y --no-install-recommends build-essential libpq-dev postgresql-client \
     && rm -rf /var/lib/apt/lists/*
 
 COPY backend/requirements/base.txt ./backend/requirements/base.txt
 RUN pip install --upgrade pip \
     && pip install -r backend/requirements/base.txt
 
+COPY scripts ./scripts
+COPY .env.example ./.env.example
+COPY .env* ./
+
 COPY backend ./backend
 
 EXPOSE 8000
 
-CMD ["uvicorn", "backend.app.main:app", "--host", "0.0.0.0", "--port", "8000"]
+# Initialize database with demo data if INIT_DEMO_DB is set
+CMD ["sh", "-c", "\
+    export PYTHONPATH=/app/backend && \
+    echo 'Waiting for database...' && \
+    while ! pg_isready -h db -p 5432 -U railgame >/dev/null 2>&1; do sleep 1; done && \
+    echo 'Database is ready!' && \
+    if [ \"$INIT_DEMO_DB\" = \"true\" ]; then python scripts/init_demo_db.py; fi && \
+    uvicorn backend.app.main:app --host 0.0.0.0 --port 8000"]
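The new container entrypoint polls `pg_isready` until Postgres answers before running migrations and the demo seed. A minimal Python sketch of the same readiness-wait idea, using a plain TCP probe instead of `pg_isready` (the function name and defaults are illustrative, not from the repo):

```python
import socket
import time

def wait_for_port(host: str, port: int, attempts: int = 30, delay: float = 1.0) -> bool:
    """Return True once a TCP connection to host:port succeeds.

    Rough stand-in for the Dockerfile's `pg_isready` loop; note a TCP connect
    only proves the socket is open, not that Postgres accepts logins, which
    is why the image installs postgresql-client and uses pg_isready instead.
    """
    for _ in range(attempts):
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(delay)
    return False
```

Inside Compose, the host would be the service name (`db` in the Dockerfile's loop) rather than `localhost`.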
@@ -1,6 +1,6 @@
 [alembic]
 script_location = migrations
-sqlalchemy.url = postgresql+psycopg://railgame:railgame@localhost:5432/railgame
+sqlalchemy.url = postgresql+psycopg://railgame:railgame@localhost:5432/railgame_dev
 
 [loggers]
 keys = root,sqlalchemy,alembic
@@ -3,8 +3,12 @@ from fastapi import APIRouter
 from backend.app.api.auth import router as auth_router
 from backend.app.api.health import router as health_router
 from backend.app.api.network import router as network_router
+from backend.app.api.stations import router as stations_router
+from backend.app.api.tracks import router as tracks_router
 
 router = APIRouter()
 router.include_router(health_router, tags=["health"])
 router.include_router(auth_router)
 router.include_router(network_router)
+router.include_router(stations_router)
+router.include_router(tracks_router)
backend/app/api/stations.py — 94 lines (new file)

@@ -0,0 +1,94 @@
from __future__ import annotations

from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session

from backend.app.api.deps import get_current_user, get_db
from backend.app.models import StationCreate, StationModel, StationUpdate, UserPublic
from backend.app.services.stations import (
    archive_station,
    create_station,
    get_station,
    list_stations,
    update_station,
)

router = APIRouter(prefix="/stations", tags=["stations"])


@router.get("", response_model=list[StationModel])
def read_stations(
    include_inactive: bool = False,
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> list[StationModel]:
    return list_stations(db, include_inactive=include_inactive)


@router.get("/{station_id}", response_model=StationModel)
def read_station(
    station_id: str,
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> StationModel:
    try:
        return get_station(db, station_id)
    except LookupError as exc:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND, detail=str(exc)
        ) from exc
    except ValueError as exc:
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST, detail=str(exc)
        ) from exc


@router.post("", response_model=StationModel, status_code=status.HTTP_201_CREATED)
def create_station_endpoint(
    payload: StationCreate,
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> StationModel:
    try:
        return create_station(db, payload)
    except ValueError as exc:
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST, detail=str(exc)
        ) from exc


@router.put("/{station_id}", response_model=StationModel)
def update_station_endpoint(
    station_id: str,
    payload: StationUpdate,
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> StationModel:
    try:
        return update_station(db, station_id, payload)
    except LookupError as exc:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND, detail=str(exc)
        ) from exc
    except ValueError as exc:
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST, detail=str(exc)
        ) from exc


@router.post("/{station_id}/archive", response_model=StationModel)
def archive_station_endpoint(
    station_id: str,
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> StationModel:
    try:
        return archive_station(db, station_id)
    except LookupError as exc:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND, detail=str(exc)
        ) from exc
    except ValueError as exc:
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST, detail=str(exc)
        ) from exc
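Every handler in `stations.py` repeats the same translation of service errors (`LookupError` → 404, `ValueError` → 400). One way that repetition could be factored out — purely a sketch, not code from the repo, with a stub `HTTPException` so it runs without FastAPI installed — is a context manager:

```python
from contextlib import contextmanager

# Minimal stand-in so the sketch is self-contained; in the real module this
# class comes from fastapi.
class HTTPException(Exception):
    def __init__(self, status_code: int, detail: str):
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail

@contextmanager
def service_errors():
    """Translate service-layer errors the way each stations endpoint does."""
    try:
        yield
    except LookupError as exc:
        raise HTTPException(status_code=404, detail=str(exc)) from exc
    except ValueError as exc:
        raise HTTPException(status_code=400, detail=str(exc)) from exc
```

An endpoint body would then shrink to `with service_errors(): return get_station(db, station_id)`; whether that reads better than the explicit try/except blocks is a style call.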
backend/app/api/tracks.py — 153 lines (new file)

@@ -0,0 +1,153 @@
from __future__ import annotations

from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session

from backend.app.api.deps import get_current_user, get_db
from backend.app.models import (
    CombinedTrackModel,
    TrackCreate,
    TrackUpdate,
    TrackModel,
    UserPublic,
)
from backend.app.services.combined_tracks import (
    create_combined_track,
    get_combined_track,
    list_combined_tracks,
)
from backend.app.services.tracks import (
    create_track,
    delete_track,
    regenerate_combined_tracks,
    update_track,
    get_track,
    list_tracks,
)

router = APIRouter(prefix="/tracks", tags=["tracks"])


@router.get("", response_model=list[TrackModel])
def read_combined_tracks(
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> list[TrackModel]:
    """Return all base tracks."""
    return list_tracks(db)


@router.get("/combined", response_model=list[CombinedTrackModel])
def read_combined_tracks_combined(
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> list[CombinedTrackModel]:
    return list_combined_tracks(db)


@router.get("/{track_id}", response_model=TrackModel)
def read_track(
    track_id: str,
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> TrackModel:
    track = get_track(db, track_id)
    if track is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Track {track_id} not found",
        )
    return track


@router.get("/combined/{combined_track_id}", response_model=CombinedTrackModel)
def read_combined_track(
    combined_track_id: str,
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> CombinedTrackModel:
    combined_track = get_combined_track(db, combined_track_id)
    if combined_track is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Combined track {combined_track_id} not found",
        )
    return combined_track


@router.post("", response_model=TrackModel, status_code=status.HTTP_201_CREATED)
def create_track_endpoint(
    payload: TrackCreate,
    regenerate: bool = False,
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> TrackModel:
    try:
        track = create_track(db, payload)
    except ValueError as exc:
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST, detail=str(exc)
        ) from exc

    if regenerate:
        regenerate_combined_tracks(
            db, [track.start_station_id, track.end_station_id])

    return track


@router.post(
    "/combined",
    response_model=CombinedTrackModel,
    status_code=status.HTTP_201_CREATED,
    summary="Create a combined track between two stations using pathfinding",
)
def create_combined_track_endpoint(
    start_station_id: str,
    end_station_id: str,
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> CombinedTrackModel:
    combined_track = create_combined_track(
        db, start_station_id, end_station_id)
    if combined_track is None:
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail="Could not create combined track: no path exists between stations or track already exists",
        )
    return combined_track


@router.put("/{track_id}", response_model=TrackModel)
def update_track_endpoint(
    track_id: str,
    payload: TrackUpdate,
    regenerate: bool = False,
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> TrackModel:
    track = update_track(db, track_id, payload)
    if track is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Track {track_id} not found",
        )
    if regenerate:
        regenerate_combined_tracks(
            db, [track.start_station_id, track.end_station_id])
    return track


@router.delete("/{track_id}", status_code=status.HTTP_204_NO_CONTENT)
def delete_track_endpoint(
    track_id: str,
    regenerate: bool = False,
    _: UserPublic = Depends(get_current_user),
    db: Session = Depends(get_db),
) -> None:
    deleted = delete_track(db, track_id, regenerate=regenerate)
    if not deleted:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Track {track_id} not found",
        )
backend/app/core/osm_config.py — 136 lines (new file)

@@ -0,0 +1,136 @@
from __future__ import annotations

"""Geographic presets and tagging rules for OpenStreetMap imports."""

from dataclasses import dataclass
from typing import Iterable, Mapping, Tuple


@dataclass(frozen=True)
class BoundingBox:
    """Geographic bounding box expressed as WGS84 coordinates."""

    name: str
    north: float
    south: float
    east: float
    west: float
    description: str | None = None

    def __post_init__(self) -> None:
        if self.north <= self.south:
            msg = f"north ({self.north}) must be greater than south ({self.south})"
            raise ValueError(msg)
        if self.east <= self.west:
            msg = f"east ({self.east}) must be greater than west ({self.west})"
            raise ValueError(msg)

    def contains(self, latitude: float, longitude: float) -> bool:
        """Return True when the given coordinate lies inside the bounding box."""

        return (
            self.south <= latitude <= self.north and self.west <= longitude <= self.east
        )

    def to_overpass_arg(self) -> str:
        """Return the bbox string used for Overpass API queries."""

        return f"{self.south},{self.west},{self.north},{self.east}"


# Primary metropolitan areas we plan to support.
DEFAULT_REGIONS: Tuple[BoundingBox, ...] = (
    BoundingBox(
        name="berlin_metropolitan",
        north=52.6755,
        south=52.3381,
        east=13.7611,
        west=13.0884,
        description="Berlin and surrounding rapid transit network",
    ),
    BoundingBox(
        name="hamburg_metropolitan",
        north=53.7447,
        south=53.3950,
        east=10.3253,
        west=9.7270,
        description="Hamburg S-Bahn and harbor region",
    ),
    BoundingBox(
        name="munich_metropolitan",
        north=48.2485,
        south=47.9960,
        east=11.7229,
        west=11.3600,
        description="Munich S-Bahn core and airport corridor",
    ),
)


# Tags that identify passenger stations and stops.
STATION_TAG_FILTERS: Mapping[str, Tuple[str, ...]] = {
    "railway": ("station", "halt", "stop"),
    "public_transport": ("station", "stop_position", "platform"),
    "train": ("yes", "regional", "suburban"),
}


# Tags that describe rail infrastructure usable for train routing.
|
||||
TRACK_ALLOWED_RAILWAY_TYPES: Tuple[str, ...] = (
|
||||
"rail",
|
||||
"light_rail",
|
||||
"subway",
|
||||
"tram",
|
||||
"narrow_gauge",
|
||||
"disused",
|
||||
"construction",
|
||||
)
|
||||
|
||||
|
||||
TRACK_TAG_FILTERS: Mapping[str, Tuple[str, ...]] = {
|
||||
"railway": TRACK_ALLOWED_RAILWAY_TYPES,
|
||||
}
|
||||
|
||||
|
||||
# Track ingestion policy
|
||||
TRACK_EXCLUDED_SERVICE_TAGS: Tuple[str, ...] = (
|
||||
"yard",
|
||||
"siding",
|
||||
"spur",
|
||||
"crossover",
|
||||
"industrial",
|
||||
"military",
|
||||
)
|
||||
|
||||
TRACK_EXCLUDED_USAGE_TAGS: Tuple[str, ...] = (
|
||||
"military",
|
||||
"tourism",
|
||||
)
|
||||
|
||||
TRACK_MIN_LENGTH_METERS: float = 75.0
|
||||
|
||||
TRACK_STATION_SNAP_RADIUS_METERS: float = 350.0
|
||||
|
||||
|
||||
def compile_overpass_filters(filters: Mapping[str, Iterable[str]]) -> str:
|
||||
"""Build an Overpass boolean expression that matches the provided filters."""
|
||||
|
||||
parts: list[str] = []
|
||||
for key, values in filters.items():
|
||||
options = "|".join(sorted(set(values)))
|
||||
parts.append(f' ["{key}"~"^({options})$"]')
|
||||
return "\n".join(parts)
|
||||
|
||||
|
||||
__all__ = [
|
||||
"BoundingBox",
|
||||
"DEFAULT_REGIONS",
|
||||
"STATION_TAG_FILTERS",
|
||||
"TRACK_ALLOWED_RAILWAY_TYPES",
|
||||
"TRACK_TAG_FILTERS",
|
||||
"TRACK_EXCLUDED_SERVICE_TAGS",
|
||||
"TRACK_EXCLUDED_USAGE_TAGS",
|
||||
"TRACK_MIN_LENGTH_METERS",
|
||||
"TRACK_STATION_SNAP_RADIUS_METERS",
|
||||
"compile_overpass_filters",
|
||||
]
|
||||
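The config module above composes two pieces: a bbox argument in Overpass's `south,west,north,east` order and a regex-anchored tag filter per key. A self-contained sketch of how an import job might combine them (the classes are re-declared here rather than imported, the leading-space formatting of the filter is dropped, and the final query string is an assumption about usage, not code from this commit):

```python
from dataclasses import dataclass
from typing import Iterable, Mapping


@dataclass(frozen=True)
class BoundingBox:
    """WGS84 bounding box, mirroring backend/app/core/osm_config.py."""
    name: str
    north: float
    south: float
    east: float
    west: float

    def to_overpass_arg(self) -> str:
        # Overpass expects south,west,north,east ordering.
        return f"{self.south},{self.west},{self.north},{self.east}"


def compile_overpass_filters(filters: Mapping[str, Iterable[str]]) -> str:
    # One anchored regex filter per tag key; values sorted for stable output.
    parts: list[str] = []
    for key, values in filters.items():
        options = "|".join(sorted(set(values)))
        parts.append(f'["{key}"~"^({options})$"]')
    return "\n".join(parts)


berlin = BoundingBox("berlin_metropolitan", north=52.6755, south=52.3381,
                     east=13.7611, west=13.0884)
bbox = berlin.to_overpass_arg()
filters = compile_overpass_filters({"railway": ("station", "halt", "stop")})
query = f"node{filters}({bbox});out;"
print(query)
```

Multiple keys in the mapping would AND together, since Overpass chains bracketed tag filters conjunctively.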
@@ -41,11 +41,14 @@ class User(Base, TimestampMixin):
    id: Mapped[uuid.UUID] = mapped_column(
        UUID(as_uuid=True), primary_key=True, default=uuid.uuid4
    )
    username: Mapped[str] = mapped_column(String(64), unique=True, nullable=False)
    email: Mapped[str | None] = mapped_column(String(255), unique=True, nullable=True)
    username: Mapped[str] = mapped_column(
        String(64), unique=True, nullable=False)
    email: Mapped[str | None] = mapped_column(
        String(255), unique=True, nullable=True)
    full_name: Mapped[str | None] = mapped_column(String(128), nullable=True)
    password_hash: Mapped[str] = mapped_column(String(256), nullable=False)
    role: Mapped[str] = mapped_column(String(32), nullable=False, default="player")
    role: Mapped[str] = mapped_column(
        String(32), nullable=False, default="player")
    preferences: Mapped[str | None] = mapped_column(Text, nullable=True)

@@ -62,12 +65,50 @@ class Station(Base, TimestampMixin):
        Geometry(geometry_type="POINT", srid=4326), nullable=False
    )
    elevation_m: Mapped[float | None] = mapped_column(Float, nullable=True)
    is_active: Mapped[bool] = mapped_column(Boolean, nullable=False, default=True)
    is_active: Mapped[bool] = mapped_column(
        Boolean, nullable=False, default=True)


class Track(Base, TimestampMixin):
    __tablename__ = "tracks"

    id: Mapped[uuid.UUID] = mapped_column(
        UUID(as_uuid=True), primary_key=True, default=uuid.uuid4
    )
    osm_id: Mapped[str | None] = mapped_column(String(32), nullable=True)
    name: Mapped[str | None] = mapped_column(String(128), nullable=True)
    start_station_id: Mapped[uuid.UUID] = mapped_column(
        UUID(as_uuid=True),
        ForeignKey("stations.id", ondelete="RESTRICT"),
        nullable=False,
    )
    end_station_id: Mapped[uuid.UUID] = mapped_column(
        UUID(as_uuid=True),
        ForeignKey("stations.id", ondelete="RESTRICT"),
        nullable=False,
    )
    length_meters: Mapped[float | None] = mapped_column(
        Numeric(10, 2), nullable=True)
    max_speed_kph: Mapped[int | None] = mapped_column(Integer, nullable=True)
    is_bidirectional: Mapped[bool] = mapped_column(
        Boolean, nullable=False, default=True
    )
    status: Mapped[str] = mapped_column(
        String(32), nullable=False, default="planned")
    track_geometry: Mapped[str] = mapped_column(
        Geometry(geometry_type="LINESTRING", srid=4326), nullable=False
    )

    __table_args__ = (
        UniqueConstraint(
            "start_station_id", "end_station_id", name="uq_tracks_station_pair"
        ),
    )


class CombinedTrack(Base, TimestampMixin):
    __tablename__ = "combined_tracks"

    id: Mapped[uuid.UUID] = mapped_column(
        UUID(as_uuid=True), primary_key=True, default=uuid.uuid4
    )

@@ -82,19 +123,25 @@ class Track(Base, TimestampMixin):
        ForeignKey("stations.id", ondelete="RESTRICT"),
        nullable=False,
    )
    length_meters: Mapped[float | None] = mapped_column(Numeric(10, 2), nullable=True)
    length_meters: Mapped[float | None] = mapped_column(
        Numeric(10, 2), nullable=True)
    max_speed_kph: Mapped[int | None] = mapped_column(Integer, nullable=True)
    is_bidirectional: Mapped[bool] = mapped_column(
        Boolean, nullable=False, default=True
    )
    status: Mapped[str] = mapped_column(String(32), nullable=False, default="planned")
    track_geometry: Mapped[str] = mapped_column(
    status: Mapped[str] = mapped_column(
        String(32), nullable=False, default="planned")
    combined_geometry: Mapped[str] = mapped_column(
        Geometry(geometry_type="LINESTRING", srid=4326), nullable=False
    )
    # JSON array of constituent track IDs
    constituent_track_ids: Mapped[str] = mapped_column(
        Text, nullable=False
    )

    __table_args__ = (
        UniqueConstraint(
            "start_station_id", "end_station_id", name="uq_tracks_station_pair"
            "start_station_id", "end_station_id", name="uq_combined_tracks_station_pair"
        ),
    )

@@ -105,7 +152,8 @@ class Train(Base, TimestampMixin):
    id: Mapped[uuid.UUID] = mapped_column(
        UUID(as_uuid=True), primary_key=True, default=uuid.uuid4
    )
    designation: Mapped[str] = mapped_column(String(64), nullable=False, unique=True)
    designation: Mapped[str] = mapped_column(
        String(64), nullable=False, unique=True)
    operator_id: Mapped[uuid.UUID | None] = mapped_column(
        UUID(as_uuid=True), ForeignKey("users.id", ondelete="SET NULL")
    )
@@ -8,10 +8,14 @@ from .auth import (
    UserPublic,
)
from .base import (
    CombinedTrackCreate,
    CombinedTrackModel,
    StationCreate,
    StationModel,
    StationUpdate,
    TrackCreate,
    TrackModel,
    TrackUpdate,
    TrainCreate,
    TrainModel,
    TrainScheduleCreate,

@@ -29,11 +33,15 @@ __all__ = [
    "UserPublic",
    "StationCreate",
    "StationModel",
    "StationUpdate",
    "TrackCreate",
    "TrackModel",
    "TrackUpdate",
    "TrainScheduleCreate",
    "TrainCreate",
    "TrainModel",
    "UserCreate",
    "to_camel",
    "CombinedTrackCreate",
    "CombinedTrackModel",
]
@@ -3,7 +3,7 @@ from __future__ import annotations

from datetime import datetime
from typing import Generic, Sequence, TypeVar

from pydantic import BaseModel, ConfigDict
from pydantic import BaseModel, ConfigDict, Field


def to_camel(string: str) -> str:

@@ -42,13 +42,31 @@ class StationModel(IdentifiedModel[str]):
    name: str
    latitude: float
    longitude: float
    code: str | None = None
    osm_id: str | None = None
    elevation_m: float | None = None
    is_active: bool = True


class TrackModel(IdentifiedModel[str]):
    start_station_id: str
    end_station_id: str
    length_meters: float
    max_speed_kph: float
    length_meters: float | None = None
    max_speed_kph: float | None = None
    status: str | None = None
    is_bidirectional: bool = True
    coordinates: list[tuple[float, float]] = Field(default_factory=list)


class CombinedTrackModel(IdentifiedModel[str]):
    start_station_id: str
    end_station_id: str
    length_meters: float | None = None
    max_speed_kph: int | None = None
    status: str | None = None
    is_bidirectional: bool = True
    coordinates: list[tuple[float, float]] = Field(default_factory=list)
    constituent_track_ids: list[str] = Field(default_factory=list)


class TrainModel(IdentifiedModel[str]):

@@ -68,10 +86,45 @@ class StationCreate(CamelModel):
    is_active: bool = True


class StationUpdate(CamelModel):
    name: str | None = None
    latitude: float | None = None
    longitude: float | None = None
    osm_id: str | None = None
    code: str | None = None
    elevation_m: float | None = None
    is_active: bool | None = None


class TrackCreate(CamelModel):
    start_station_id: str
    end_station_id: str
    coordinates: Sequence[tuple[float, float]]
    osm_id: str | None = None
    name: str | None = None
    length_meters: float | None = None
    max_speed_kph: int | None = None
    is_bidirectional: bool = True
    status: str = "planned"


class TrackUpdate(CamelModel):
    start_station_id: str | None = None
    end_station_id: str | None = None
    coordinates: Sequence[tuple[float, float]] | None = None
    osm_id: str | None = None
    name: str | None = None
    length_meters: float | None = None
    max_speed_kph: int | None = None
    is_bidirectional: bool | None = None
    status: str | None = None


class CombinedTrackCreate(CamelModel):
    start_station_id: str
    end_station_id: str
    coordinates: Sequence[tuple[float, float]]
    constituent_track_ids: list[str]
    name: str | None = None
    length_meters: float | None = None
    max_speed_kph: int | None = None
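These schemas use the `to_camel` alias generator (its body lies outside this hunk). A minimal sketch of the usual snake_case-to-camelCase conversion such a helper performs, which Pydantic's `alias_generator` would apply to field names like `start_station_id` so the frontend sees `startStationId` (an assumption about the helper's behavior, not the committed implementation):

```python
def to_camel(string: str) -> str:
    # "start_station_id" -> "startStationId": the first chunk stays
    # lowercase, the remaining chunks are capitalized and joined.
    head, *rest = string.split("_")
    return head + "".join(word.capitalize() for word in rest)


print(to_camel("start_station_id"))       # startStationId
print(to_camel("constituent_track_ids"))  # constituentTrackIds
```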
@@ -2,6 +2,7 @@

from backend.app.repositories.stations import StationRepository
from backend.app.repositories.tracks import TrackRepository
from backend.app.repositories.combined_tracks import CombinedTrackRepository
from backend.app.repositories.train_schedules import TrainScheduleRepository
from backend.app.repositories.trains import TrainRepository
from backend.app.repositories.users import UserRepository

@@ -10,6 +11,7 @@ __all__ = [
    "StationRepository",
    "TrainScheduleRepository",
    "TrackRepository",
    "CombinedTrackRepository",
    "TrainRepository",
    "UserRepository",
]
backend/app/repositories/combined_tracks.py (new file, 73 lines)
@@ -0,0 +1,73 @@
from __future__ import annotations

import json
from uuid import UUID

import sqlalchemy as sa
from geoalchemy2.elements import WKTElement
from sqlalchemy.orm import Session

from backend.app.db.models import CombinedTrack
from backend.app.models import CombinedTrackCreate
from backend.app.repositories.base import BaseRepository


class CombinedTrackRepository(BaseRepository[CombinedTrack]):
    model = CombinedTrack

    def __init__(self, session: Session) -> None:
        super().__init__(session)

    def list_all(self) -> list[CombinedTrack]:
        statement = sa.select(self.model)
        return list(self.session.scalars(statement))

    def exists_between_stations(self, start_station_id: str, end_station_id: str) -> bool:
        """Check if a combined track already exists between two stations."""
        statement = sa.select(sa.exists().where(
            sa.and_(
                self.model.start_station_id == start_station_id,
                self.model.end_station_id == end_station_id
            )
        ))
        return bool(self.session.scalar(statement))

    def get_constituent_track_ids(self, combined_track: CombinedTrack) -> list[str]:
        """Extract constituent track IDs from a combined track."""
        try:
            return json.loads(combined_track.constituent_track_ids)
        except (json.JSONDecodeError, TypeError):
            return []

    @staticmethod
    def _ensure_uuid(value: UUID | str) -> UUID:
        if isinstance(value, UUID):
            return value
        return UUID(str(value))

    @staticmethod
    def _line_string(coordinates: list[tuple[float, float]]) -> WKTElement:
        if len(coordinates) < 2:
            raise ValueError(
                "Combined track geometry requires at least two coordinate pairs")
        parts = [f"{lon} {lat}" for lat, lon in coordinates]
        return WKTElement(f"LINESTRING({', '.join(parts)})", srid=4326)

    def create(self, data: CombinedTrackCreate) -> CombinedTrack:
        coordinates = list(data.coordinates)
        geometry = self._line_string(coordinates)
        constituent_track_ids_json = json.dumps(data.constituent_track_ids)

        combined_track = CombinedTrack(
            name=data.name,
            start_station_id=self._ensure_uuid(data.start_station_id),
            end_station_id=self._ensure_uuid(data.end_station_id),
            length_meters=data.length_meters,
            max_speed_kph=data.max_speed_kph,
            is_bidirectional=data.is_bidirectional,
            status=data.status,
            combined_geometry=geometry,
            constituent_track_ids=constituent_track_ids_json,
        )
        self.session.add(combined_track)
        return combined_track
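`_line_string` silently swaps the incoming `(lat, lon)` pairs into WKT's `lon lat` order, which is the usual gotcha with PostGIS geometries. The conversion in isolation (a plain string instead of a `WKTElement`, no GeoAlchemy dependency; the sample coordinates are illustrative):

```python
def line_string_wkt(coordinates: list[tuple[float, float]]) -> str:
    # Input pairs are (lat, lon); WKT LINESTRING wants "lon lat".
    if len(coordinates) < 2:
        raise ValueError("geometry requires at least two coordinate pairs")
    parts = [f"{lon} {lat}" for lat, lon in coordinates]
    return f"LINESTRING({', '.join(parts)})"


wkt = line_string_wkt([(52.5251, 13.3694), (53.5511, 9.9937)])
print(wkt)  # LINESTRING(13.3694 52.5251, 9.9937 53.5511)
```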
@@ -7,7 +7,7 @@ from geoalchemy2.elements import WKTElement
from sqlalchemy.orm import Session

from backend.app.db.models import Track
from backend.app.models import TrackCreate
from backend.app.models import TrackCreate, TrackUpdate
from backend.app.repositories.base import BaseRepository

@@ -21,6 +21,102 @@ class TrackRepository(BaseRepository[Track]):
        statement = sa.select(self.model)
        return list(self.session.scalars(statement))

    def exists_by_osm_id(self, osm_id: str) -> bool:
        statement = sa.select(sa.exists().where(self.model.osm_id == osm_id))
        return bool(self.session.scalar(statement))

    def find_path_between_stations(self, start_station_id: str, end_station_id: str) -> list[Track] | None:
        """Find the shortest path between two stations using existing tracks.

        Returns a list of tracks that form the path, or None if no path exists.
        """
        # Build adjacency list: station -> list of (neighbor_station, track)
        adjacency = self._build_track_graph()

        if start_station_id not in adjacency or end_station_id not in adjacency:
            return None

        # BFS to find shortest path
        from collections import deque

        # (current_station, path_so_far)
        queue = deque([(start_station_id, [])])
        visited = set([start_station_id])

        while queue:
            current_station, path = queue.popleft()

            if current_station == end_station_id:
                return path

            for neighbor, track in adjacency[current_station]:
                if neighbor not in visited:
                    visited.add(neighbor)
                    queue.append((neighbor, path + [track]))

        return None  # No path found

    def _build_track_graph(self) -> dict[str, list[tuple[str, Track]]]:
        """Build a graph representation of tracks: station -> [(neighbor_station, track), ...]"""
        tracks = self.list_all()
        graph = {}

        for track in tracks:
            start_id = str(track.start_station_id)
            end_id = str(track.end_station_id)

            # Add bidirectional edges (assuming tracks are bidirectional)
            if start_id not in graph:
                graph[start_id] = []
            if end_id not in graph:
                graph[end_id] = []

            graph[start_id].append((end_id, track))
            graph[end_id].append((start_id, track))

        return graph

    def combine_track_geometries(self, tracks: list[Track]) -> list[tuple[float, float]]:
        """Combine the geometries of multiple tracks into a single coordinate sequence.

        Assumes tracks are in order and form a continuous path.
        """
        if not tracks:
            return []

        combined_coords = []

        for i, track in enumerate(tracks):
            # Extract coordinates from track geometry
            coords = self._extract_coordinates_from_track(track)

            if i == 0:
                # First track: add all coordinates
                combined_coords.extend(coords)
            else:
                # Subsequent tracks: skip the first coordinate (shared with previous track)
                combined_coords.extend(coords[1:])

        return combined_coords

    def _extract_coordinates_from_track(self, track: Track) -> list[tuple[float, float]]:
        """Extract coordinate list from a track's geometry."""
        # Convert WKT string to WKTElement, then to shapely geometry
        from geoalchemy2.elements import WKTElement
        from geoalchemy2.shape import to_shape

        try:
            wkt_element = WKTElement(track.track_geometry)
            geom = to_shape(wkt_element)
            if hasattr(geom, 'coords'):
                # For LineString, coords returns [(x, y), ...] where x=lon, y=lat
                # Convert to (lat, lon)
                return [(coord[1], coord[0]) for coord in geom.coords]
        except Exception:
            pass

        return []

    @staticmethod
    def _ensure_uuid(value: UUID | str) -> UUID:
        if isinstance(value, UUID):

@@ -30,7 +126,8 @@ class TrackRepository(BaseRepository[Track]):
    @staticmethod
    def _line_string(coordinates: list[tuple[float, float]]) -> WKTElement:
        if len(coordinates) < 2:
            raise ValueError("Track geometry requires at least two coordinate pairs")
            raise ValueError(
                "Track geometry requires at least two coordinate pairs")
        parts = [f"{lon} {lat}" for lat, lon in coordinates]
        return WKTElement(f"LINESTRING({', '.join(parts)})", srid=4326)

@@ -38,6 +135,7 @@ class TrackRepository(BaseRepository[Track]):
        coordinates = list(data.coordinates)
        geometry = self._line_string(coordinates)
        track = Track(
            osm_id=data.osm_id,
            name=data.name,
            start_station_id=self._ensure_uuid(data.start_station_id),
            end_station_id=self._ensure_uuid(data.end_station_id),

@@ -49,3 +147,26 @@ class TrackRepository(BaseRepository[Track]):
        )
        self.session.add(track)
        return track

    def update(self, track: Track, payload: TrackUpdate) -> Track:
        if payload.start_station_id is not None:
            track.start_station_id = self._ensure_uuid(
                payload.start_station_id)
        if payload.end_station_id is not None:
            track.end_station_id = self._ensure_uuid(payload.end_station_id)
        if payload.coordinates is not None:
            track.track_geometry = self._line_string(
                list(payload.coordinates))  # type: ignore[assignment]
        if payload.osm_id is not None:
            track.osm_id = payload.osm_id
        if payload.name is not None:
            track.name = payload.name
        if payload.length_meters is not None:
            track.length_meters = payload.length_meters
        if payload.max_speed_kph is not None:
            track.max_speed_kph = payload.max_speed_kph
        if payload.is_bidirectional is not None:
            track.is_bidirectional = payload.is_bidirectional
        if payload.status is not None:
            track.status = payload.status
        return track
backend/app/services/combined_tracks.py (new file, 79 lines)
@@ -0,0 +1,79 @@
from __future__ import annotations

"""Application services for combined track operations."""

from sqlalchemy.orm import Session

from backend.app.models import CombinedTrackCreate, CombinedTrackModel
from backend.app.repositories import CombinedTrackRepository, TrackRepository


def create_combined_track(
    session: Session, start_station_id: str, end_station_id: str
) -> CombinedTrackModel | None:
    """Create a combined track between two stations using pathfinding.

    Returns the created combined track, or None if no path exists or
    a combined track already exists between these stations.
    """
    combined_track_repo = CombinedTrackRepository(session)
    track_repo = TrackRepository(session)

    # Check if combined track already exists
    if combined_track_repo.exists_between_stations(start_station_id, end_station_id):
        return None

    # Find path between stations
    path_tracks = track_repo.find_path_between_stations(
        start_station_id, end_station_id)
    if not path_tracks:
        return None

    # Combine geometries
    combined_coords = track_repo.combine_track_geometries(path_tracks)
    if len(combined_coords) < 2:
        return None

    # Calculate total length
    total_length = sum(track.length_meters or 0 for track in path_tracks)

    # Get max speed (use the minimum speed of all tracks)
    max_speeds = [
        track.max_speed_kph for track in path_tracks if track.max_speed_kph]
    max_speed = min(max_speeds) if max_speeds else None

    # Get constituent track IDs
    constituent_track_ids = [str(track.id) for track in path_tracks]

    # Create combined track
    create_data = CombinedTrackCreate(
        start_station_id=start_station_id,
        end_station_id=end_station_id,
        coordinates=combined_coords,
        constituent_track_ids=constituent_track_ids,
        length_meters=total_length if total_length > 0 else None,
        max_speed_kph=max_speed,
        status="operational",
    )

    combined_track = combined_track_repo.create(create_data)
    session.commit()

    return CombinedTrackModel.model_validate(combined_track)


def get_combined_track(session: Session, combined_track_id: str) -> CombinedTrackModel | None:
    """Get a combined track by ID."""
    try:
        combined_track_repo = CombinedTrackRepository(session)
        combined_track = combined_track_repo.get(combined_track_id)
        return CombinedTrackModel.model_validate(combined_track)
    except LookupError:
        return None


def list_combined_tracks(session: Session) -> list[CombinedTrackModel]:
    """List all combined tracks."""
    combined_track_repo = CombinedTrackRepository(session)
    combined_tracks = combined_track_repo.list_all()
    return [CombinedTrackModel.model_validate(ct) for ct in combined_tracks]
@@ -8,9 +8,10 @@ from geoalchemy2.elements import WKBElement, WKTElement
from geoalchemy2.shape import to_shape

try:  # pragma: no cover - optional dependency guard
    from shapely.geometry import Point  # type: ignore
    from shapely.geometry import LineString, Point  # type: ignore
except ImportError:  # pragma: no cover - allow running without shapely at import time
    Point = None  # type: ignore[assignment]
    LineString = None  # type: ignore[assignment]

from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import Session

@@ -51,6 +52,12 @@ def _fallback_snapshot() -> dict[str, list[dict[str, object]]]:
        end_station_id="station-2",
        length_meters=289000.0,
        max_speed_kph=230.0,
        status="operational",
        is_bidirectional=True,
        coordinates=[
            (stations[0].latitude, stations[0].longitude),
            (stations[1].latitude, stations[1].longitude),
        ],
        created_at=now,
        updated_at=now,
    )

@@ -134,6 +141,24 @@ def get_network_snapshot(session: Session) -> dict[str, list[dict[str, object]]]

    track_models: list[TrackModel] = []
    for track in tracks_entities:
        coordinates: list[tuple[float, float]] = []
        geometry = track.track_geometry
        shape = (
            to_shape(cast(WKBElement | WKTElement, geometry))
            if geometry is not None and LineString is not None
            else None
        )
        if (
            LineString is not None
            and shape is not None
            and isinstance(shape, LineString)
        ):
            coords_list: list[tuple[float, float]] = []
            for coord in shape.coords:
                lon = float(coord[0])
                lat = float(coord[1])
                coords_list.append((lat, lon))
            coordinates = coords_list
        track_models.append(
            TrackModel(
                id=str(track.id),

@@ -141,6 +166,9 @@ def get_network_snapshot(session: Session) -> dict[str, list[dict[str, object]]]
                end_station_id=str(track.end_station_id),
                length_meters=_to_float(track.length_meters),
                max_speed_kph=_to_float(track.max_speed_kph),
                status=track.status,
                is_bidirectional=track.is_bidirectional,
                coordinates=coordinates,
                created_at=cast(datetime, track.created_at),
                updated_at=cast(datetime, track.updated_at),
            )
backend/app/services/stations.py (new file, 195 lines)
@@ -0,0 +1,195 @@
from __future__ import annotations

"""Application services for station CRUD operations."""

from datetime import datetime, timezone
from typing import cast
from uuid import UUID

from geoalchemy2.elements import WKBElement, WKTElement
from geoalchemy2.shape import to_shape
from sqlalchemy.orm import Session

from backend.app.db.models import Station
from backend.app.models import StationCreate, StationModel, StationUpdate
from backend.app.repositories import StationRepository

try:  # pragma: no cover - optional dependency guard
    from shapely.geometry import Point  # type: ignore
except ImportError:  # pragma: no cover - allow running without shapely at import time
    Point = None  # type: ignore[assignment]


def list_stations(
    session: Session, include_inactive: bool = False
) -> list[StationModel]:
    repo = StationRepository(session)
    if include_inactive:
        stations = repo.list()
    else:
        stations = repo.list_active()
    return [_to_station_model(station) for station in stations]


def get_station(session: Session, station_id: str) -> StationModel:
    repo = StationRepository(session)
    station = _resolve_station(repo, station_id)
    return _to_station_model(station)


def create_station(session: Session, payload: StationCreate) -> StationModel:
    name = payload.name.strip()
    if not name:
        raise ValueError("Station name must not be empty")
    _validate_coordinates(payload.latitude, payload.longitude)

    repo = StationRepository(session)
    station = repo.create(
        StationCreate(
            name=name,
            latitude=payload.latitude,
            longitude=payload.longitude,
            osm_id=_normalize_optional(payload.osm_id),
            code=_normalize_optional(payload.code),
            elevation_m=payload.elevation_m,
            is_active=payload.is_active,
        )
    )
    session.flush()
    session.refresh(station)
    session.commit()
    return _to_station_model(station)


def update_station(
    session: Session, station_id: str, payload: StationUpdate
) -> StationModel:
    repo = StationRepository(session)
    station = _resolve_station(repo, station_id)

    if payload.name is not None:
        name = payload.name.strip()
        if not name:
            raise ValueError("Station name must not be empty")
        station.name = name

    if payload.latitude is not None or payload.longitude is not None:
        if payload.latitude is None or payload.longitude is None:
            raise ValueError("Both latitude and longitude must be provided together")
        _validate_coordinates(payload.latitude, payload.longitude)
        station.location = repo._point(
            payload.latitude, payload.longitude
        )  # type: ignore[assignment]

    if payload.osm_id is not None:
        station.osm_id = _normalize_optional(payload.osm_id)

    if payload.code is not None:
        station.code = _normalize_optional(payload.code)

    if payload.elevation_m is not None:
        station.elevation_m = payload.elevation_m

    if payload.is_active is not None:
        station.is_active = payload.is_active

    session.flush()
    session.refresh(station)
    session.commit()
    return _to_station_model(station)


def archive_station(session: Session, station_id: str) -> StationModel:
    repo = StationRepository(session)
    station = _resolve_station(repo, station_id)
    if station.is_active:
        station.is_active = False
        session.flush()
        session.refresh(station)
        session.commit()
    return _to_station_model(station)


def _resolve_station(repo: StationRepository, station_id: str) -> Station:
    identifier = _parse_station_id(station_id)
    station = repo.get(identifier)
    if station is None:
        raise LookupError("Station not found")
    return station


def _parse_station_id(station_id: str) -> UUID:
    try:
        return UUID(station_id)
    except (ValueError, TypeError) as exc:  # pragma: no cover - simple validation
        raise ValueError("Invalid station identifier") from exc


def _validate_coordinates(latitude: float, longitude: float) -> None:
    if not (-90.0 <= latitude <= 90.0):
        raise ValueError("Latitude must be between -90 and 90 degrees")
    if not (-180.0 <= longitude <= 180.0):
        raise ValueError("Longitude must be between -180 and 180 degrees")


def _normalize_optional(value: str | None) -> str | None:
    if value is None:
        return None
    normalized = value.strip()
    return normalized or None


def _to_station_model(station: Station) -> StationModel:
    latitude, longitude = _extract_coordinates(station.location)
    created_at = station.created_at or datetime.now(timezone.utc)
    updated_at = station.updated_at or created_at
    return StationModel(
        id=str(station.id),
        name=station.name,
        latitude=latitude,
        longitude=longitude,
        code=station.code,
        osm_id=station.osm_id,
        elevation_m=station.elevation_m,
        is_active=station.is_active,
        created_at=cast(datetime, created_at),
        updated_at=cast(datetime, updated_at),
    )


def _extract_coordinates(location: object) -> tuple[float, float]:
    if location is None:
        raise ValueError("Station location is unavailable")

    # Attempt to leverage GeoAlchemy's shapely integration first.
    try:
        geometry = to_shape(cast(WKBElement | WKTElement, location))
        if Point is not None and isinstance(geometry, Point):
            return float(geometry.y), float(geometry.x)
    except Exception:  # pragma: no cover - fallback handles parsing
        pass

    if isinstance(location, WKTElement):
        return _parse_wkt_point(location.data)

    text = getattr(location, "desc", None)
    if isinstance(text, str):
        return _parse_wkt_point(text)

    raise ValueError("Unable to read station geometry")


def _parse_wkt_point(wkt: str) -> tuple[float, float]:
    marker = "POINT"
    if not wkt.upper().startswith(marker):
|
||||
raise ValueError("Unsupported geometry format")
|
||||
start = wkt.find("(")
|
||||
end = wkt.find(")", start)
|
||||
if start == -1 or end == -1:
|
||||
raise ValueError("Malformed POINT geometry")
|
||||
coordinates = wkt[start + 1 : end].strip().split()
|
||||
if len(coordinates) != 2:
|
||||
raise ValueError("POINT geometry must contain two coordinates")
|
||||
longitude, latitude = map(float, coordinates)
|
||||
_validate_coordinates(latitude, longitude)
|
||||
return latitude, longitude
|
||||
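One detail worth noting in `_parse_wkt_point`: WKT stores coordinates as `POINT(lon lat)`, while the service layer returns `(latitude, longitude)`, so the parser deliberately swaps the axis order. A standalone sketch of the same parsing (illustrative only, not part of the diff):

```python
def parse_wkt_point(wkt: str) -> tuple[float, float]:
    # Mirrors _parse_wkt_point above: WKT text is "POINT(lon lat)",
    # the return value is (latitude, longitude).
    if not wkt.upper().startswith("POINT"):
        raise ValueError("Unsupported geometry format")
    start = wkt.find("(")
    end = wkt.find(")", start)
    if start == -1 or end == -1:
        raise ValueError("Malformed POINT geometry")
    parts = wkt[start + 1 : end].strip().split()
    if len(parts) != 2:
        raise ValueError("POINT geometry must contain two coordinates")
    longitude, latitude = map(float, parts)
    return latitude, longitude

print(parse_wkt_point("POINT(13.4050 52.5200)"))  # (52.52, 13.405)
```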
106 backend/app/services/tracks.py Normal file
@@ -0,0 +1,106 @@
"""Service layer for primary track management operations."""

from __future__ import annotations

from typing import Iterable

from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session

from backend.app.models import CombinedTrackModel, TrackCreate, TrackModel, TrackUpdate
from backend.app.repositories import CombinedTrackRepository, TrackRepository


def list_tracks(session: Session) -> list[TrackModel]:
    repo = TrackRepository(session)
    tracks = repo.list_all()
    return [TrackModel.model_validate(track) for track in tracks]


def get_track(session: Session, track_id: str) -> TrackModel | None:
    repo = TrackRepository(session)
    track = repo.get(track_id)
    if track is None:
        return None
    return TrackModel.model_validate(track)


def create_track(session: Session, payload: TrackCreate) -> TrackModel:
    repo = TrackRepository(session)
    try:
        track = repo.create(payload)
        session.commit()
    except IntegrityError as exc:
        session.rollback()
        raise ValueError(
            "Track with the same station pair already exists") from exc

    return TrackModel.model_validate(track)


def update_track(session: Session, track_id: str, payload: TrackUpdate) -> TrackModel | None:
    repo = TrackRepository(session)
    track = repo.get(track_id)
    if track is None:
        return None

    repo.update(track, payload)
    session.commit()

    return TrackModel.model_validate(track)


def delete_track(session: Session, track_id: str, regenerate: bool = False) -> bool:
    repo = TrackRepository(session)
    track = repo.get(track_id)
    if track is None:
        return False

    start_station_id = str(track.start_station_id)
    end_station_id = str(track.end_station_id)

    session.delete(track)
    session.commit()

    if regenerate:
        regenerate_combined_tracks(session, [start_station_id, end_station_id])

    return True


def regenerate_combined_tracks(session: Session, station_ids: Iterable[str]) -> list[CombinedTrackModel]:
    combined_repo = CombinedTrackRepository(session)

    station_id_set = set(station_ids)
    if not station_id_set:
        return []

    # Remove combined tracks touching these stations
    for combined in combined_repo.list_all():
        if {str(combined.start_station_id), str(combined.end_station_id)} & station_id_set:
            session.delete(combined)

    session.commit()

    # Rebuild combined tracks between affected station pairs
    from backend.app.services.combined_tracks import create_combined_track

    regenerated: list[CombinedTrackModel] = []
    station_list = list(station_id_set)
    for i in range(len(station_list)):
        for j in range(i + 1, len(station_list)):
            result = create_combined_track(
                session, station_list[i], station_list[j])
            if result is not None:
                regenerated.append(result)
    return regenerated


__all__ = [
    "list_tracks",
    "get_track",
    "create_track",
    "update_track",
    "delete_track",
    "regenerate_combined_tracks",
]
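The nested index loops in `regenerate_combined_tracks` visit each unordered station pair exactly once. `itertools.combinations` expresses the same enumeration; a standalone sketch of the equivalence (illustrative, not a proposed change to the diff):

```python
from itertools import combinations

station_ids = ["a", "b", "c"]

# Same pairs as `for i ... for j in range(i + 1, ...)` in the service:
pairs = list(combinations(station_ids, 2))
print(pairs)  # [('a', 'b'), ('a', 'c'), ('b', 'c')]
```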
@@ -0,0 +1,19 @@
"""Add the osm_id column to the tracks table."""

from __future__ import annotations

import sqlalchemy as sa
from alembic import op

revision = '63d02d67b39e'
down_revision = '20251011_01'
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.add_column('tracks', sa.Column(
        'osm_id', sa.String(length=32), nullable=True))


def downgrade() -> None:
    op.drop_column('tracks', 'osm_id')
@@ -0,0 +1,75 @@
"""Create the combined_tracks table."""

from __future__ import annotations

import sqlalchemy as sa
from alembic import op
from geoalchemy2.types import Geometry
from sqlalchemy.dialects import postgresql

revision = 'e7d4bb03da04'
down_revision = '63d02d67b39e'
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.create_table(
        "combined_tracks",
        sa.Column(
            "id",
            postgresql.UUID(as_uuid=True),
            primary_key=True,
            server_default=sa.text("gen_random_uuid()"),
        ),
        sa.Column("name", sa.String(length=128), nullable=True),
        sa.Column("start_station_id", postgresql.UUID(
            as_uuid=True), nullable=False),
        sa.Column("end_station_id", postgresql.UUID(
            as_uuid=True), nullable=False),
        sa.Column("length_meters", sa.Numeric(10, 2), nullable=True),
        sa.Column("max_speed_kph", sa.Integer(), nullable=True),
        sa.Column(
            "is_bidirectional",
            sa.Boolean(),
            nullable=False,
            server_default=sa.text("true"),
        ),
        sa.Column(
            "status", sa.String(length=32), nullable=False, server_default="planned"
        ),
        sa.Column(
            "combined_geometry",
            Geometry(geometry_type="LINESTRING", srid=4326),
            nullable=False,
        ),
        sa.Column("constituent_track_ids", sa.Text(), nullable=False),
        sa.Column(
            "created_at",
            sa.DateTime(timezone=True),
            server_default=sa.text("timezone('utc', now())"),
            nullable=False,
        ),
        sa.Column(
            "updated_at",
            sa.DateTime(timezone=True),
            server_default=sa.text("timezone('utc', now())"),
            nullable=False,
        ),
        sa.ForeignKeyConstraint(
            ["start_station_id"], ["stations.id"], ondelete="RESTRICT"
        ),
        sa.ForeignKeyConstraint(
            ["end_station_id"], ["stations.id"], ondelete="RESTRICT"
        ),
        sa.UniqueConstraint(
            "start_station_id", "end_station_id", name="uq_combined_tracks_station_pair"
        ),
    )
    op.create_index(
        "ix_combined_tracks_geometry", "combined_tracks", ["combined_geometry"], postgresql_using="gist"
    )


def downgrade() -> None:
    op.drop_index("ix_combined_tracks_geometry", table_name="combined_tracks")
    op.drop_table("combined_tracks")
196 backend/scripts/osm_refresh.py Normal file
@@ -0,0 +1,196 @@
"""Orchestrate the OSM station/track import and load pipeline."""

from __future__ import annotations

import argparse
import sys
from dataclasses import dataclass
from pathlib import Path
from typing import Callable, Sequence

from backend.app.core.osm_config import DEFAULT_REGIONS
from backend.scripts import stations_import, stations_load, tracks_import, tracks_load


@dataclass(slots=True)
class Stage:
    label: str
    runner: Callable[[list[str] | None], int]
    args: list[str]
    input_path: Path | None = None
    output_path: Path | None = None


def build_argument_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Run the station and track import/load workflow in sequence.",
    )
    parser.add_argument(
        "--region",
        choices=[region.name for region in DEFAULT_REGIONS] + ["all"],
        default="all",
        help="Region selector forwarded to the import scripts (default: all).",
    )
    parser.add_argument(
        "--output-dir",
        type=Path,
        default=Path("data"),
        help="Directory where intermediate JSON payloads are stored (default: data/).",
    )
    parser.add_argument(
        "--stations-json",
        type=Path,
        help="Existing station JSON file to load; defaults to <output-dir>/osm_stations.json.",
    )
    parser.add_argument(
        "--tracks-json",
        type=Path,
        help="Existing track JSON file to load; defaults to <output-dir>/osm_tracks.json.",
    )
    parser.add_argument(
        "--skip-station-import",
        action="store_true",
        help="Skip the station import step (expects --stations-json to point to data).",
    )
    parser.add_argument(
        "--skip-station-load",
        action="store_true",
        help="Skip loading stations into PostGIS.",
    )
    parser.add_argument(
        "--skip-track-import",
        action="store_true",
        help="Skip the track import step (expects --tracks-json to point to data).",
    )
    parser.add_argument(
        "--skip-track-load",
        action="store_true",
        help="Skip loading tracks into PostGIS.",
    )
    parser.add_argument(
        "--dry-run",
        action="store_true",
        help="Print the planned stages without invoking Overpass or mutating the database.",
    )
    parser.add_argument(
        "--commit",
        dest="commit",
        action="store_true",
        default=True,
        help="Commit database changes produced by the load steps (default).",
    )
    parser.add_argument(
        "--no-commit",
        dest="commit",
        action="store_false",
        help="Roll back database changes after the load steps (dry run).",
    )
    return parser


def _build_stage_plan(args: argparse.Namespace) -> list[Stage]:
    station_json = args.stations_json or args.output_dir / "osm_stations.json"
    track_json = args.tracks_json or args.output_dir / "osm_tracks.json"

    stages: list[Stage] = []

    if not args.skip_station_import:
        stages.append(
            Stage(
                label="Import stations",
                runner=stations_import.main,
                args=["--output", str(station_json), "--region", args.region],
                output_path=station_json,
            )
        )

    if not args.skip_station_load:
        load_args = [str(station_json)]
        if not args.commit:
            load_args.append("--no-commit")
        stages.append(
            Stage(
                label="Load stations",
                runner=stations_load.main,
                args=load_args,
                input_path=station_json,
            )
        )

    if not args.skip_track_import:
        stages.append(
            Stage(
                label="Import tracks",
                runner=tracks_import.main,
                args=["--output", str(track_json), "--region", args.region],
                output_path=track_json,
            )
        )

    if not args.skip_track_load:
        load_args = [str(track_json)]
        if not args.commit:
            load_args.append("--no-commit")
        stages.append(
            Stage(
                label="Load tracks",
                runner=tracks_load.main,
                args=load_args,
                input_path=track_json,
            )
        )

    return stages


def _describe_plan(stages: Sequence[Stage]) -> None:
    if not stages:
        print("No stages selected; nothing to do.")
        return

    print("Selected stages:")
    for stage in stages:
        detail = " ".join(stage.args) if stage.args else "<no args>"
        print(f"  - {stage.label}: {detail}")


def _execute_stage(stage: Stage) -> None:
    print(f"\n>>> {stage.label}")

    if stage.output_path is not None:
        stage.output_path.parent.mkdir(parents=True, exist_ok=True)

    if stage.input_path is not None and not stage.input_path.exists():
        raise RuntimeError(
            f"Expected input file {stage.input_path} for {stage.label}; run the import step first or provide an existing file."
        )

    try:
        exit_code = stage.runner(stage.args)
    except SystemExit as exc:  # argparse.error exits via SystemExit
        exit_code = int(exc.code or 0)

    if exit_code:
        raise RuntimeError(f"{stage.label} failed with exit code {exit_code}.")


def main(argv: list[str] | None = None) -> int:
    parser = build_argument_parser()
    args = parser.parse_args(argv)

    stages = _build_stage_plan(args)

    if args.dry_run:
        print("Dry run: the following stages would run in order.")
        _describe_plan(stages)
        return 0

    for stage in stages:
        _execute_stage(stage)

    print("\nOSM refresh pipeline completed successfully.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
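The `--commit`/`--no-commit` pair used here (and in the load scripts below) maps both flags onto a single boolean destination, so whichever flag appears last on the command line wins. A minimal standalone demonstration of the pattern:

```python
import argparse

parser = argparse.ArgumentParser()
# Both flags share dest="commit"; store_true/store_false toggle one boolean.
parser.add_argument("--commit", dest="commit", action="store_true", default=True)
parser.add_argument("--no-commit", dest="commit", action="store_false")

print(parser.parse_args([]).commit)               # True (the default)
print(parser.parse_args(["--no-commit"]).commit)  # False
```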
169 backend/scripts/stations_import.py Normal file
@@ -0,0 +1,169 @@
"""CLI utility to import station data from OpenStreetMap."""

from __future__ import annotations

import argparse
import json
import sys
from dataclasses import asdict
from pathlib import Path
from typing import Any, Iterable
from urllib.parse import quote_plus

from backend.app.core.osm_config import (
    DEFAULT_REGIONS,
    STATION_TAG_FILTERS,
    compile_overpass_filters,
)

OVERPASS_ENDPOINT = "https://overpass-api.de/api/interpreter"


def build_argument_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Export OSM station nodes for ingestion"
    )
    parser.add_argument(
        "--output",
        type=Path,
        default=Path("data/osm_stations.json"),
        help="Destination file for the exported station nodes (default: data/osm_stations.json)",
    )
    parser.add_argument(
        "--region",
        choices=[region.name for region in DEFAULT_REGIONS] + ["all"],
        default="all",
        help="Region name to export (default: all)",
    )
    parser.add_argument(
        "--dry-run",
        action="store_true",
        help="Do not fetch data; print the Overpass payload only",
    )
    return parser


def build_overpass_query(region_name: str) -> str:
    if region_name == "all":
        regions = DEFAULT_REGIONS
    else:
        regions = tuple(
            region for region in DEFAULT_REGIONS if region.name == region_name
        )
    if not regions:
        msg = f"Unknown region {region_name}. Available regions: {[region.name for region in DEFAULT_REGIONS]}"
        raise ValueError(msg)

    filters = compile_overpass_filters(STATION_TAG_FILTERS)

    parts = ["[out:json][timeout:90];", "("]
    for region in regions:
        parts.append(f" node{filters}\n ({region.to_overpass_arg()});")
    parts.append(")")
    parts.append("; out body; >; out skel qt;")
    return "\n".join(parts)


def perform_request(query: str) -> dict[str, Any]:
    import urllib.request

    payload = f"data={quote_plus(query)}".encode("utf-8")
    request = urllib.request.Request(
        OVERPASS_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(request, timeout=120) as response:
        payload = response.read()
    return json.loads(payload)


def normalize_station_elements(
    elements: Iterable[dict[str, Any]]
) -> list[dict[str, Any]]:
    """Convert raw Overpass nodes into StationCreate-compatible payloads."""

    stations: list[dict[str, Any]] = []
    for element in elements:
        if element.get("type") != "node":
            continue

        latitude = element.get("lat")
        longitude = element.get("lon")
        if latitude is None or longitude is None:
            continue

        tags: dict[str, Any] = element.get("tags", {})
        name = tags.get("name")
        if not name:
            continue

        raw_code = tags.get("ref") or tags.get("railway:ref") or tags.get("local_ref")
        code = str(raw_code) if raw_code is not None else None

        elevation_tag = tags.get("ele") or tags.get("elevation")
        try:
            elevation = float(elevation_tag) if elevation_tag is not None else None
        except (TypeError, ValueError):
            elevation = None

        disused = str(tags.get("disused", "no")).lower() in {"yes", "true"}
        railway_status = str(tags.get("railway", "")).lower()
        abandoned = railway_status in {"abandoned", "disused"}
        is_active = not (disused or abandoned)

        stations.append(
            {
                "osm_id": str(element.get("id")),
                "name": str(name),
                "latitude": float(latitude),
                "longitude": float(longitude),
                "code": code,
                "elevation_m": elevation,
                "is_active": is_active,
            }
        )

    return stations


def main(argv: list[str] | None = None) -> int:
    parser = build_argument_parser()
    args = parser.parse_args(argv)

    query = build_overpass_query(args.region)

    if args.dry_run:
        print(query)
        return 0

    output_path: Path = args.output
    output_path.parent.mkdir(parents=True, exist_ok=True)

    data = perform_request(query)
    raw_elements = data.get("elements", [])
    stations = normalize_station_elements(raw_elements)

    payload = {
        "metadata": {
            "endpoint": OVERPASS_ENDPOINT,
            "region": args.region,
            "filters": STATION_TAG_FILTERS,
            "regions": [asdict(region) for region in DEFAULT_REGIONS],
            "raw_count": len(raw_elements),
            "station_count": len(stations),
        },
        "stations": stations,
    }

    with output_path.open("w", encoding="utf-8") as handle:
        json.dump(payload, handle, indent=2)

    print(
        f"Normalized {len(stations)} stations from {len(raw_elements)} elements into {output_path}"
    )
    return 0


if __name__ == "__main__":
    sys.exit(main())
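The activity heuristic in `normalize_station_elements` treats a node as inactive when it carries `disused=yes`/`true` or when its `railway` tag is `abandoned` or `disused`. Extracted as a standalone sketch for clarity:

```python
def station_is_active(tags: dict) -> bool:
    # Mirrors the is_active derivation in normalize_station_elements above.
    disused = str(tags.get("disused", "no")).lower() in {"yes", "true"}
    abandoned = str(tags.get("railway", "")).lower() in {"abandoned", "disused"}
    return not (disused or abandoned)

print(station_is_active({"railway": "station"}))                    # True
print(station_is_active({"railway": "station", "disused": "yes"}))  # False
print(station_is_active({"railway": "abandoned"}))                  # False
```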
93 backend/scripts/stations_load.py Normal file
@@ -0,0 +1,93 @@
"""CLI for loading normalized station JSON into the database."""

from __future__ import annotations

import argparse
import json
import sys
from pathlib import Path
from typing import Any, Iterable, Mapping

from backend.app.db.session import SessionLocal
from backend.app.models import StationCreate
from backend.app.repositories import StationRepository


def build_argument_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Load normalized station data into PostGIS"
    )
    parser.add_argument(
        "input",
        type=Path,
        help="Path to the normalized station JSON file produced by stations_import.py",
    )
    parser.add_argument(
        "--commit",
        dest="commit",
        action="store_true",
        default=True,
        help="Commit the transaction after loading (default).",
    )
    parser.add_argument(
        "--no-commit",
        dest="commit",
        action="store_false",
        help="Roll back the transaction after loading (useful for dry runs).",
    )
    return parser


def main(argv: list[str] | None = None) -> int:
    parser = build_argument_parser()
    args = parser.parse_args(argv)

    if not args.input.exists():
        parser.error(f"Input file {args.input} does not exist")

    with args.input.open("r", encoding="utf-8") as handle:
        payload = json.load(handle)

    stations_data = payload.get("stations") or []
    if not isinstance(stations_data, list):
        parser.error("Invalid payload: 'stations' must be a list")

    try:
        station_creates = _parse_station_entries(stations_data)
    except ValueError as exc:
        parser.error(str(exc))

    created = load_stations(station_creates, commit=args.commit)

    print(f"Loaded {created} stations from {args.input}")
    return 0


def _parse_station_entries(entries: Iterable[Mapping[str, Any]]) -> list[StationCreate]:
    parsed: list[StationCreate] = []
    for entry in entries:
        try:
            parsed.append(StationCreate(**entry))
        except Exception as exc:  # pragma: no cover - validated in tests
            raise ValueError(f"Invalid station entry {entry}: {exc}") from exc
    return parsed


def load_stations(stations: Iterable[StationCreate], commit: bool = True) -> int:
    created = 0
    with SessionLocal() as session:
        repo = StationRepository(session)

        for create_schema in stations:
            repo.create(create_schema)
            created += 1

        if commit:
            session.commit()
        else:
            session.rollback()
    return created


if __name__ == "__main__":
    sys.exit(main())
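`_parse_station_entries` fails fast: the first invalid entry aborts the whole load with a `ValueError` naming the offending payload, rather than silently skipping it. The wrapping pattern in isolation (with a toy validator standing in for `StationCreate`):

```python
def parse_entries(entries, validate):
    # Mirrors _parse_station_entries: any per-entry failure is re-raised
    # as a single ValueError that identifies the bad entry.
    parsed = []
    for entry in entries:
        try:
            parsed.append(validate(entry))
        except Exception as exc:
            raise ValueError(f"Invalid station entry {entry}: {exc}") from exc
    return parsed

print(parse_entries([{"name": "Alpha"}], lambda e: e["name"]))  # ['Alpha']
```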
262 backend/scripts/tracks_import.py Normal file
@@ -0,0 +1,262 @@
"""CLI utility to export rail track geometries from OpenStreetMap."""

from __future__ import annotations

import argparse
import json
import math
import sys
from dataclasses import asdict
from pathlib import Path
from typing import Any, Iterable, Mapping
from urllib.parse import quote_plus

from backend.app.core.osm_config import (
    DEFAULT_REGIONS,
    TRACK_ALLOWED_RAILWAY_TYPES,
    TRACK_EXCLUDED_SERVICE_TAGS,
    TRACK_EXCLUDED_USAGE_TAGS,
    TRACK_MIN_LENGTH_METERS,
    TRACK_TAG_FILTERS,
    compile_overpass_filters,
)

OVERPASS_ENDPOINT = "https://overpass-api.de/api/interpreter"


def build_argument_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Export OSM rail track ways for ingestion",
    )
    parser.add_argument(
        "--output",
        type=Path,
        default=Path("data/osm_tracks.json"),
        help=(
            "Destination file for the exported track geometries "
            "(default: data/osm_tracks.json)"
        ),
    )
    parser.add_argument(
        "--region",
        choices=[region.name for region in DEFAULT_REGIONS] + ["all"],
        default="all",
        help="Region name to export (default: all)",
    )
    parser.add_argument(
        "--dry-run",
        action="store_true",
        help="Do not fetch data; print the Overpass payload only",
    )
    return parser


def build_overpass_query(region_name: str) -> str:
    if region_name == "all":
        regions = DEFAULT_REGIONS
    else:
        regions = tuple(
            region for region in DEFAULT_REGIONS if region.name == region_name
        )
    if not regions:
        available = ", ".join(region.name for region in DEFAULT_REGIONS)
        msg = f"Unknown region {region_name}. Available regions: [{available}]"
        raise ValueError(msg)

    filters = compile_overpass_filters(TRACK_TAG_FILTERS)

    parts = ["[out:json][timeout:120];", "("]
    for region in regions:
        parts.append(f" way{filters}\n ({region.to_overpass_arg()});")
    parts.append(")")
    parts.append("; out body geom; >; out skel qt;")
    return "\n".join(parts)


def perform_request(query: str) -> dict[str, Any]:
    import urllib.request

    payload = f"data={quote_plus(query)}".encode("utf-8")
    request = urllib.request.Request(
        OVERPASS_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(request, timeout=180) as response:
        payload = response.read()
    return json.loads(payload)


def normalize_track_elements(
    elements: Iterable[dict[str, Any]]
) -> list[dict[str, Any]]:
    """Convert Overpass way elements into TrackCreate-compatible payloads."""

    tracks: list[dict[str, Any]] = []
    for element in elements:
        if element.get("type") != "way":
            continue

        raw_geometry = element.get("geometry") or []
        coordinates: list[list[float]] = []
        for node in raw_geometry:
            lat = node.get("lat")
            lon = node.get("lon")
            if lat is None or lon is None:
                coordinates = []
                break
            coordinates.append([float(lat), float(lon)])

        if len(coordinates) < 2:
            continue

        tags: dict[str, Any] = element.get("tags", {})
        length_meters = _polyline_length(coordinates)
        if not _should_include_track(tags, length_meters):
            continue

        name = tags.get("name")
        maxspeed = _parse_maxspeed(tags.get("maxspeed"))
        status = _derive_status(tags.get("railway"))
        is_bidirectional = not _is_oneway(tags.get("oneway"))

        tracks.append(
            {
                "osmId": str(element.get("id")),
                "name": str(name) if name else None,
                "lengthMeters": length_meters,
                "maxSpeedKph": maxspeed,
                "status": status,
                "isBidirectional": is_bidirectional,
                "coordinates": coordinates,
            }
        )

    return tracks


def _parse_maxspeed(value: Any) -> float | None:
    if value is None:
        return None

    # Overpass may return values such as "80" or "80 km/h" or "signals".
    if isinstance(value, (int, float)):
        return float(value)

    text = str(value).strip()
    number = ""
    for char in text:
        if char.isdigit() or char == ".":
            number += char
        elif number:
            break
    try:
        return float(number) if number else None
    except ValueError:
        return None


def _derive_status(value: Any) -> str:
    tag = str(value or "").lower()
    if tag in {"abandoned", "disused"}:
        return tag
    if tag in {"construction", "proposed"}:
        return "construction"
    return "operational"


def _should_include_track(tags: Mapping[str, Any], length_meters: float) -> bool:
    railway = str(tags.get("railway", "")).lower()
    if railway not in TRACK_ALLOWED_RAILWAY_TYPES:
        return False

    if length_meters < TRACK_MIN_LENGTH_METERS:
        return False

    service = str(tags.get("service", "")).lower()
    if service and service in TRACK_EXCLUDED_SERVICE_TAGS:
        return False

    usage = str(tags.get("usage", "")).lower()
    if usage and usage in TRACK_EXCLUDED_USAGE_TAGS:
        return False

    return True


def _is_oneway(value: Any) -> bool:
    if value is None:
        return False
    normalized = str(value).strip().lower()
    return normalized in {"yes", "true", "1"}


def _polyline_length(points: list[list[float]]) -> float:
    if len(points) < 2:
        return 0.0

    total = 0.0
    for index in range(len(points) - 1):
        total += _haversine(points[index], points[index + 1])
    return total


def _haversine(a: list[float], b: list[float]) -> float:
    """Return distance in meters between two [lat, lon] coordinates."""

    lat1, lon1 = a
    lat2, lon2 = b
    radius = 6_371_000

    phi1 = math.radians(lat1)
    phi2 = math.radians(lat2)
    delta_phi = math.radians(lat2 - lat1)
    delta_lambda = math.radians(lon2 - lon1)

    sin_dphi = math.sin(delta_phi / 2)
    sin_dlambda = math.sin(delta_lambda / 2)
    root = sin_dphi**2 + math.cos(phi1) * math.cos(phi2) * sin_dlambda**2
    distance = 2 * radius * math.atan2(math.sqrt(root), math.sqrt(1 - root))
    return distance


def main(argv: list[str] | None = None) -> int:
    parser = build_argument_parser()
    args = parser.parse_args(argv)

    query = build_overpass_query(args.region)

    if args.dry_run:
        print(query)
        return 0

    output_path: Path = args.output
    output_path.parent.mkdir(parents=True, exist_ok=True)

    data = perform_request(query)
    raw_elements = data.get("elements", [])
    tracks = normalize_track_elements(raw_elements)

    payload = {
        "metadata": {
            "endpoint": OVERPASS_ENDPOINT,
            "region": args.region,
            "filters": TRACK_TAG_FILTERS,
            "regions": [asdict(region) for region in DEFAULT_REGIONS],
            "raw_count": len(raw_elements),
            "track_count": len(tracks),
        },
        "tracks": tracks,
    }

    with output_path.open("w", encoding="utf-8") as handle:
        json.dump(payload, handle, indent=2)

    print(
        f"Normalized {len(tracks)} tracks from {len(raw_elements)} elements into {output_path}"
    )
    return 0


if __name__ == "__main__":
    sys.exit(main())
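`_haversine` computes great-circle distance on a spherical Earth (radius 6,371 km), which is accurate to roughly 0.5 percent, comfortably enough for the minimum-length filter. The same formula as a standalone check:

```python
import math

def haversine(a: list[float], b: list[float]) -> float:
    # Great-circle distance in meters between two [lat, lon] points,
    # same formula as _haversine above (spherical Earth, R = 6,371 km).
    lat1, lon1 = a
    lat2, lon2 = b
    radius = 6_371_000
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    root = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius * math.atan2(math.sqrt(root), math.sqrt(1 - root))

# Berlin Hbf to Hamburg Hbf: roughly 250 km as the crow flies.
print(round(haversine([52.5251, 13.3694], [53.5530, 10.0069]) / 1000))
```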
293 backend/scripts/tracks_load.py Normal file
@@ -0,0 +1,293 @@
from __future__ import annotations

"""CLI for loading normalized track JSON into the database."""

import argparse
import json
import math
import sys
from dataclasses import dataclass
from pathlib import Path
from typing import Any, Iterable, Mapping, Sequence

from geoalchemy2.elements import WKBElement, WKTElement
from geoalchemy2.shape import to_shape

from backend.app.core.osm_config import TRACK_STATION_SNAP_RADIUS_METERS
from backend.app.db.session import SessionLocal
from backend.app.models import TrackCreate
from backend.app.repositories import StationRepository, TrackRepository


@dataclass(slots=True)
class ParsedTrack:
    coordinates: list[tuple[float, float]]
    osm_id: str | None = None
    name: str | None = None
    length_meters: float | None = None
    max_speed_kph: float | None = None
    status: str = "operational"
    is_bidirectional: bool = True


@dataclass(slots=True)
class StationRef:
    id: str
    latitude: float
    longitude: float


def build_argument_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Load normalized track data into PostGIS",
    )
    parser.add_argument(
        "input",
        type=Path,
        help="Path to the normalized track JSON file produced by tracks_import.py",
    )
    parser.add_argument(
        "--commit",
        dest="commit",
        action="store_true",
        default=True,
        help="Commit the transaction after loading (default).",
    )
    parser.add_argument(
        "--no-commit",
        dest="commit",
        action="store_false",
        help="Rollback the transaction after loading (useful for dry runs).",
    )
    return parser
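The `--commit`/`--no-commit` pair above works because both flags write to one shared `dest`; a minimal standalone sketch of the pattern:

```python
import argparse

# Paired boolean flags sharing one destination, defaulting to True,
# mirroring the loader's --commit/--no-commit arguments.
parser = argparse.ArgumentParser()
parser.add_argument("--commit", dest="commit", action="store_true", default=True)
parser.add_argument("--no-commit", dest="commit", action="store_false")

default_args = parser.parse_args([])            # commit stays True
dry_run_args = parser.parse_args(["--no-commit"])  # commit flips to False
```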


def main(argv: list[str] | None = None) -> int:
    parser = build_argument_parser()
    args = parser.parse_args(argv)

    if not args.input.exists():
        parser.error(f"Input file {args.input} does not exist")

    with args.input.open("r", encoding="utf-8") as handle:
        payload = json.load(handle)

    track_entries = payload.get("tracks") or []
    if not isinstance(track_entries, list):
        parser.error("Invalid payload: 'tracks' must be a list")

    try:
        tracks = _parse_track_entries(track_entries)
    except ValueError as exc:
        parser.error(str(exc))

    created = load_tracks(tracks, commit=args.commit)
    print(f"Loaded {created} tracks from {args.input}")
    return 0

def _parse_track_entries(entries: Iterable[Mapping[str, Any]]) -> list[ParsedTrack]:
    parsed: list[ParsedTrack] = []
    for entry in entries:
        coordinates = entry.get("coordinates")
        if not isinstance(coordinates, Sequence) or len(coordinates) < 2:
            raise ValueError(
                "Invalid track entry: 'coordinates' must contain at least two points"
            )

        processed_coordinates: list[tuple[float, float]] = []
        for pair in coordinates:
            if not isinstance(pair, Sequence) or len(pair) != 2:
                raise ValueError(f"Invalid coordinate pair {pair!r} in track entry")
            lat, lon = pair
            processed_coordinates.append((float(lat), float(lon)))

        name = entry.get("name")
        length = _safe_float(entry.get("lengthMeters"))
        max_speed = _safe_float(entry.get("maxSpeedKph"))
        status = entry.get("status", "operational")
        is_bidirectional = entry.get("isBidirectional", True)
        osm_id = entry.get("osmId")

        parsed.append(
            ParsedTrack(
                coordinates=processed_coordinates,
                osm_id=str(osm_id) if osm_id else None,
                name=str(name) if name else None,
                length_meters=length,
                max_speed_kph=max_speed,
                status=str(status) if status else "operational",
                is_bidirectional=bool(is_bidirectional),
            )
        )
    return parsed

def load_tracks(tracks: Iterable[ParsedTrack], commit: bool = True) -> int:
    created = 0
    with SessionLocal() as session:
        station_repo = StationRepository(session)
        track_repo = TrackRepository(session)

        station_index = _build_station_index(station_repo.list_active())
        existing_pairs = {
            (str(track.start_station_id), str(track.end_station_id))
            for track in track_repo.list_all()
        }

        for track_data in tracks:
            # Skip if track with this OSM ID already exists
            if track_data.osm_id and track_repo.exists_by_osm_id(track_data.osm_id):
                print(f"Skipping track {track_data.osm_id} - already exists by OSM ID")
                continue

            start_station = _nearest_station(
                track_data.coordinates[0],
                station_index,
                TRACK_STATION_SNAP_RADIUS_METERS,
            )
            end_station = _nearest_station(
                track_data.coordinates[-1],
                station_index,
                TRACK_STATION_SNAP_RADIUS_METERS,
            )

            if not start_station or not end_station:
                print(f"Skipping track {track_data.osm_id} - no start/end stations found")
                continue

            if start_station.id == end_station.id:
                print(f"Skipping track {track_data.osm_id} - start and end stations are the same")
                continue

            pair = (start_station.id, end_station.id)
            if pair in existing_pairs:
                print(f"Skipping track {track_data.osm_id} - station pair {pair} already exists")
                continue

            length = track_data.length_meters or _polyline_length(track_data.coordinates)
            max_speed = (
                int(round(track_data.max_speed_kph))
                if track_data.max_speed_kph is not None
                else None
            )
            create_schema = TrackCreate(
                osm_id=track_data.osm_id,
                name=track_data.name,
                start_station_id=start_station.id,
                end_station_id=end_station.id,
                coordinates=track_data.coordinates,
                length_meters=length,
                max_speed_kph=max_speed,
                status=track_data.status,
                is_bidirectional=track_data.is_bidirectional,
            )

            track_repo.create(create_schema)
            existing_pairs.add(pair)
            created += 1

        if commit:
            session.commit()
        else:
            session.rollback()

    return created

def _nearest_station(
    coordinate: tuple[float, float],
    stations: Sequence[StationRef],
    max_distance_meters: float,
) -> StationRef | None:
    best_station: StationRef | None = None
    best_distance = math.inf
    for station in stations:
        distance = _haversine(coordinate, (station.latitude, station.longitude))
        if distance < best_distance:
            best_station = station
            best_distance = distance
    if best_distance <= max_distance_meters:
        return best_station
    return None

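`_nearest_station` is a linear nearest-neighbour scan with a radius cutoff. A self-contained sketch of the same shape, using planar distance and made-up station IDs purely for illustration:

```python
import math

# Stations as (id, x, y) on a flat plane for illustration; the real
# script measures haversine distance on [lat, lon] pairs instead.
stations = [("a", 0.0, 0.0), ("b", 10.0, 0.0), ("c", 3.0, 4.0)]


def nearest_within(point, candidates, max_distance):
    # Scan all candidates, keep the closest, then apply the radius cutoff.
    best_id, best_dist = None, math.inf
    for sid, x, y in candidates:
        dist = math.hypot(point[0] - x, point[1] - y)
        if dist < best_dist:
            best_id, best_dist = sid, dist
    return best_id if best_dist <= max_distance else None


snapped = nearest_within((2.0, 2.0), stations, max_distance=5.0)      # "c", ~2.24 away
too_far = nearest_within((100.0, 100.0), stations, max_distance=5.0)  # outside the radius
```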
def _build_station_index(stations: Iterable[Any]) -> list[StationRef]:
    index: list[StationRef] = []
    for station in stations:
        location = getattr(station, "location", None)
        if location is None:
            continue
        point = _to_point(location)
        if point is None:
            continue
        latitude = getattr(point, "y", None)
        longitude = getattr(point, "x", None)
        if latitude is None or longitude is None:
            continue
        index.append(
            StationRef(
                id=str(station.id),
                latitude=float(latitude),
                longitude=float(longitude),
            )
        )
    return index

def _to_point(geometry: WKBElement | WKTElement | Any):
    try:
        point = to_shape(geometry)
        return point if getattr(point, "geom_type", None) == "Point" else None
    except Exception:  # pragma: no cover - defensive, should not happen with valid geometry
        return None

def _polyline_length(points: Sequence[tuple[float, float]]) -> float:
    if len(points) < 2:
        return 0.0

    total = 0.0
    for index in range(len(points) - 1):
        total += _haversine(points[index], points[index + 1])
    return total


def _haversine(a: tuple[float, float], b: tuple[float, float]) -> float:
    lat1, lon1 = a
    lat2, lon2 = b
    radius = 6_371_000

    phi1 = math.radians(lat1)
    phi2 = math.radians(lat2)
    delta_phi = math.radians(lat2 - lat1)
    delta_lambda = math.radians(lon2 - lon1)

    sin_dphi = math.sin(delta_phi / 2)
    sin_dlambda = math.sin(delta_lambda / 2)
    root = sin_dphi**2 + math.cos(phi1) * math.cos(phi2) * sin_dlambda**2
    distance = 2 * radius * math.atan2(math.sqrt(root), math.sqrt(1 - root))
    return distance


def _safe_float(value: Any) -> float | None:
    if value is None or value == "":
        return None
    try:
        return float(value)
    except (TypeError, ValueError):
        return None


if __name__ == "__main__":
    sys.exit(main())
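`_safe_float` treats `None`, empty strings, and unparsable values uniformly; a quick standalone check of that behaviour (the helper is copied here so the snippet runs on its own):

```python
def safe_float(value):
    # Mirrors _safe_float above: coerce to float, returning None on bad input.
    if value is None or value == "":
        return None
    try:
        return float(value)
    except (TypeError, ValueError):
        return None


# Numeric strings and numbers pass through; blanks and junk become None.
results = [safe_float(v) for v in ("120", 80, "", None, "fast")]
```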
166
backend/tests/test_combined_tracks.py
Normal file
@@ -0,0 +1,166 @@
from __future__ import annotations

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, List
from uuid import uuid4

import pytest

from backend.app.models import CombinedTrackModel
from backend.app.repositories.combined_tracks import CombinedTrackRepository
from backend.app.repositories.tracks import TrackRepository
from backend.app.services.combined_tracks import create_combined_track


@dataclass
class DummySession:
    added: List[Any] = field(default_factory=list)
    scalars_result: List[Any] = field(default_factory=list)
    scalar_result: Any = None
    statements: List[Any] = field(default_factory=list)
    committed: bool = False
    rolled_back: bool = False
    closed: bool = False

    def add(self, instance: Any) -> None:
        self.added.append(instance)

    def add_all(self, instances: list[Any]) -> None:
        self.added.extend(instances)

    def scalars(self, statement: Any) -> list[Any]:
        self.statements.append(statement)
        return list(self.scalars_result)

    def scalar(self, statement: Any) -> Any:
        self.statements.append(statement)
        return self.scalar_result

    def flush(
        self, _objects: list[Any] | None = None
    ) -> None:  # pragma: no cover - optional
        return None

    def commit(self) -> None:  # pragma: no cover - optional
        self.committed = True

    def rollback(self) -> None:  # pragma: no cover - optional
        self.rolled_back = True

    def close(self) -> None:  # pragma: no cover - optional
        self.closed = True

def _now() -> datetime:
    return datetime.now(timezone.utc)


def test_combined_track_model_round_trip() -> None:
    timestamp = _now()
    combined_track = CombinedTrackModel(
        id="combined-track-1",
        start_station_id="station-1",
        end_station_id="station-2",
        length_meters=3000.0,
        max_speed_kph=100,
        status="operational",
        is_bidirectional=True,
        coordinates=[(52.52, 13.405), (52.6, 13.5), (52.7, 13.6)],
        constituent_track_ids=["track-1", "track-2"],
        created_at=timestamp,
        updated_at=timestamp,
    )
    assert combined_track.length_meters == 3000.0
    assert combined_track.start_station_id != combined_track.end_station_id
    assert len(combined_track.coordinates) == 3
    assert len(combined_track.constituent_track_ids) == 2


def test_combined_track_repository_create() -> None:
    """Test creating a combined track through the repository."""
    session = DummySession()
    repo = CombinedTrackRepository(session)  # type: ignore[arg-type]

    # Create test data
    from backend.app.models import CombinedTrackCreate

    create_data = CombinedTrackCreate(
        start_station_id="550e8400-e29b-41d4-a716-446655440000",
        end_station_id="550e8400-e29b-41d4-a716-446655440001",
        coordinates=[(52.52, 13.405), (52.6, 13.5)],
        constituent_track_ids=["track-1"],
        length_meters=1500.0,
        max_speed_kph=120,
        status="operational",
    )

    combined_track = repo.create(create_data)

    assert combined_track.start_station_id is not None
    assert combined_track.end_station_id is not None
    assert combined_track.length_meters == 1500.0
    assert combined_track.max_speed_kph == 120
    assert combined_track.status == "operational"
    assert session.added and session.added[0] is combined_track


def test_combined_track_repository_exists_between_stations() -> None:
    """Test checking if combined track exists between stations."""
    session = DummySession()
    repo = CombinedTrackRepository(session)  # type: ignore[arg-type]

    # Initially should not exist (scalar_result is None by default)
    assert not repo.exists_between_stations(
        "550e8400-e29b-41d4-a716-446655440000",
        "550e8400-e29b-41d4-a716-446655440001",
    )

    # Simulate existing combined track
    session.scalar_result = True
    assert repo.exists_between_stations(
        "550e8400-e29b-41d4-a716-446655440000",
        "550e8400-e29b-41d4-a716-446655440001",
    )


def test_combined_track_service_create_no_path() -> None:
    """Test creating combined track when no path exists."""
    # Mock session and repositories
    session = DummySession()

    # Mock TrackRepository to return no path
    class MockTrackRepository:
        def __init__(self, session):
            pass

        def find_path_between_stations(self, start_id, end_id):
            return None

    # Mock CombinedTrackRepository
    class MockCombinedTrackRepository:
        def __init__(self, session):
            pass

        def exists_between_stations(self, start_id, end_id):
            return False

    # Patch the service to use mock repositories
    import backend.app.services.combined_tracks as service_module

    original_track_repo = service_module.TrackRepository
    original_combined_repo = service_module.CombinedTrackRepository

    service_module.TrackRepository = MockTrackRepository
    service_module.CombinedTrackRepository = MockCombinedTrackRepository

    try:
        result = create_combined_track(
            session,  # type: ignore[arg-type]
            "550e8400-e29b-41d4-a716-446655440000",
            "550e8400-e29b-41d4-a716-446655440001",
        )
        assert result is None
    finally:
        # Restore original classes
        service_module.TrackRepository = original_track_repo
        service_module.CombinedTrackRepository = original_combined_repo
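The manual save/patch/restore dance in the last test can also be written with the standard library's `unittest.mock.patch.object`, which restores the attribute automatically when the context exits; a sketch against an invented stand-in module:

```python
import types
from unittest import mock

# A stand-in module whose "main" we want to stub out, the way the tests
# above swap repository classes on the service module.
stage = types.SimpleNamespace(main=lambda args: 1)

with mock.patch.object(stage, "main", lambda args: 0):
    patched_result = stage.main([])   # stub in effect inside the block

restored_result = stage.main([])      # original restored automatically
```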
@@ -29,11 +29,15 @@ def test_track_model_properties() -> None:
        end_station_id="station-2",
        length_meters=1500.0,
        max_speed_kph=120.0,
        status="operational",
        is_bidirectional=True,
        coordinates=[(52.52, 13.405), (52.6, 13.5)],
        created_at=timestamp,
        updated_at=timestamp,
    )
    assert track.length_meters > 0
    assert track.start_station_id != track.end_station_id
    assert len(track.coordinates) == 2


def test_train_model_operating_tracks() -> None:

@@ -26,6 +26,9 @@ def sample_entities() -> dict[str, SimpleNamespace]:
        end_station_id=station.id,
        length_meters=1234.5,
        max_speed_kph=160,
        status="operational",
        is_bidirectional=True,
        track_geometry=None,
        created_at=timestamp,
        updated_at=timestamp,
    )

28
backend/tests/test_osm_config.py
Normal file
@@ -0,0 +1,28 @@
from backend.app.core.osm_config import (
    DEFAULT_REGIONS,
    STATION_TAG_FILTERS,
    BoundingBox,
    compile_overpass_filters,
)


def test_default_regions_are_valid() -> None:
    assert DEFAULT_REGIONS, "Expected at least one region definition"
    for bbox in DEFAULT_REGIONS:
        assert isinstance(bbox, BoundingBox)
        assert bbox.north > bbox.south
        assert bbox.east > bbox.west
        # Berlin coordinates should fall inside Berlin bounding box for sanity
        if bbox.name == "berlin_metropolitan":
            assert bbox.contains(52.5200, 13.4050)


def test_station_tag_filters_compile_to_overpass_snippet() -> None:
    compiled = compile_overpass_filters(STATION_TAG_FILTERS)
    # Ensure each key is present with its values
    for key, values in STATION_TAG_FILTERS.items():
        assert key in compiled
        for value in values:
            assert value in compiled
    # The snippet should be multi-line to preserve readability
    assert "\n" in compiled
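A `contains` check like the one exercised above reduces to two interval tests. A standalone sketch (the dataclass below is a hypothetical stand-in for the project's `BoundingBox`, using the Berlin bounds that the station-import tests assert on):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class BBox:
    # Hypothetical stand-in for the project's BoundingBox class.
    name: str
    south: float
    west: float
    north: float
    east: float

    def contains(self, lat: float, lon: float) -> bool:
        # A point is inside when both coordinates fall in their intervals.
        return self.south <= lat <= self.north and self.west <= lon <= self.east


berlin = BBox("berlin_metropolitan", 52.3381, 13.0884, 52.6755, 13.7611)
inside = berlin.contains(52.5200, 13.4050)   # central Berlin
outside = berlin.contains(48.1374, 11.5755)  # Munich, well outside
```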
161
backend/tests/test_osm_refresh.py
Normal file
@@ -0,0 +1,161 @@
from __future__ import annotations

from argparse import Namespace
from pathlib import Path

import pytest

from backend.scripts import osm_refresh


def _namespace(output_dir: Path, **overrides: object) -> Namespace:
    defaults: dict[str, object] = {
        "region": "all",
        "output_dir": output_dir,
        "stations_json": None,
        "tracks_json": None,
        "skip_station_import": False,
        "skip_station_load": False,
        "skip_track_import": False,
        "skip_track_load": False,
        "dry_run": False,
        "commit": True,
    }
    defaults.update(overrides)
    return Namespace(**defaults)


def test_build_stage_plan_default_sequence(tmp_path: Path) -> None:
    stages = osm_refresh._build_stage_plan(_namespace(tmp_path))

    labels = [stage.label for stage in stages]
    assert labels == [
        "Import stations",
        "Load stations",
        "Import tracks",
        "Load tracks",
    ]

    expected_station_path = tmp_path / "osm_stations.json"
    expected_track_path = tmp_path / "osm_tracks.json"

    assert stages[0].output_path == expected_station_path
    assert stages[1].input_path == expected_station_path
    assert stages[2].output_path == expected_track_path
    assert stages[3].input_path == expected_track_path


def test_build_stage_plan_respects_skip_flags(tmp_path: Path) -> None:
    stages = osm_refresh._build_stage_plan(
        _namespace(
            tmp_path,
            skip_station_import=True,
            skip_track_import=True,
        )
    )

    labels = [stage.label for stage in stages]
    assert labels == ["Load stations", "Load tracks"]


def test_main_dry_run_lists_plan(
    monkeypatch: pytest.MonkeyPatch, tmp_path: Path, capsys: pytest.CaptureFixture[str]
) -> None:
    def fail(_args: list[str] | None) -> int:  # pragma: no cover - defensive
        raise AssertionError("runner should not be invoked during dry run")

    monkeypatch.setattr(osm_refresh.stations_import, "main", fail)
    monkeypatch.setattr(osm_refresh.tracks_import, "main", fail)
    monkeypatch.setattr(osm_refresh.stations_load, "main", fail)
    monkeypatch.setattr(osm_refresh.tracks_load, "main", fail)

    exit_code = osm_refresh.main(["--dry-run", "--output-dir", str(tmp_path)])

    assert exit_code == 0
    captured = capsys.readouterr().out
    assert "Dry run" in captured
    assert "Import stations" in captured
    assert "Load tracks" in captured


def test_main_executes_stages_in_order(
    monkeypatch: pytest.MonkeyPatch, tmp_path: Path
) -> None:
    calls: list[str] = []

    def make_import(name: str):
        def runner(args: list[str] | None) -> int:
            assert args is not None
            calls.append(name)
            output_index = args.index("--output") + 1
            output_path = Path(args[output_index])
            output_path.write_text("{}", encoding="utf-8")
            return 0

        return runner

    def make_load(name: str):
        def runner(args: list[str] | None) -> int:
            assert args is not None
            calls.append(name)
            return 0

        return runner

    monkeypatch.setattr(
        osm_refresh.stations_import, "main", make_import("stations_import")
    )
    monkeypatch.setattr(osm_refresh.tracks_import, "main", make_import("tracks_import"))
    monkeypatch.setattr(osm_refresh.stations_load, "main", make_load("stations_load"))
    monkeypatch.setattr(osm_refresh.tracks_load, "main", make_load("tracks_load"))

    exit_code = osm_refresh.main(["--output-dir", str(tmp_path)])

    assert exit_code == 0
    assert calls == [
        "stations_import",
        "stations_load",
        "tracks_import",
        "tracks_load",
    ]


def test_main_skip_import_flags(
    monkeypatch: pytest.MonkeyPatch, tmp_path: Path
) -> None:
    station_json = tmp_path / "stations.json"
    station_json.write_text("{}", encoding="utf-8")
    track_json = tmp_path / "tracks.json"
    track_json.write_text("{}", encoding="utf-8")

    def fail(_args: list[str] | None) -> int:  # pragma: no cover - defensive
        raise AssertionError("import stage should be skipped")

    calls: list[str] = []

    def record(name: str):
        def runner(args: list[str] | None) -> int:
            assert args is not None
            calls.append(name)
            return 0

        return runner

    monkeypatch.setattr(osm_refresh.stations_import, "main", fail)
    monkeypatch.setattr(osm_refresh.tracks_import, "main", fail)
    monkeypatch.setattr(osm_refresh.stations_load, "main", record("stations_load"))
    monkeypatch.setattr(osm_refresh.tracks_load, "main", record("tracks_load"))

    exit_code = osm_refresh.main(
        [
            "--skip-station-import",
            "--skip-track-import",
            "--stations-json",
            str(station_json),
            "--tracks-json",
            str(track_json),
        ]
    )

    assert exit_code == 0
    assert calls == ["stations_load", "tracks_load"]
137
backend/tests/test_stations_api.py
Normal file
@@ -0,0 +1,137 @@
from __future__ import annotations

from datetime import datetime, timezone
from typing import Any
from uuid import uuid4

import pytest
from fastapi.testclient import TestClient

from backend.app.api import stations as stations_api
from backend.app.main import app
from backend.app.models import StationCreate, StationModel, StationUpdate

AUTH_CREDENTIALS = {"username": "demo", "password": "railgame123"}

client = TestClient(app)


def _authenticate() -> str:
    response = client.post("/api/auth/login", json=AUTH_CREDENTIALS)
    assert response.status_code == 200
    return response.json()["accessToken"]


def _station_payload(**overrides: Any) -> dict[str, Any]:
    payload = {
        "name": "Central",
        "latitude": 52.52,
        "longitude": 13.405,
        "osmId": "123",
        "code": "BER",
        "elevationM": 34.5,
        "isActive": True,
    }
    payload.update(overrides)
    return payload


def _station_model(**overrides: Any) -> StationModel:
    now = datetime.now(timezone.utc)
    base = StationModel(
        id=str(uuid4()),
        name="Central",
        latitude=52.52,
        longitude=13.405,
        code="BER",
        osm_id="123",
        elevation_m=34.5,
        is_active=True,
        created_at=now,
        updated_at=now,
    )
    return base.model_copy(update=overrides)


def test_list_stations_requires_authentication() -> None:
    response = client.get("/api/stations")
    assert response.status_code == 401


def test_list_stations_returns_payload(monkeypatch: pytest.MonkeyPatch) -> None:
    token = _authenticate()

    def fake_list_stations(db, include_inactive: bool) -> list[StationModel]:
        assert include_inactive is True
        return [_station_model()]

    monkeypatch.setattr(stations_api, "list_stations", fake_list_stations)

    response = client.get(
        "/api/stations",
        params={"include_inactive": "true"},
        headers={"Authorization": f"Bearer {token}"},
    )

    assert response.status_code == 200
    payload = response.json()
    assert len(payload) == 1
    assert payload[0]["name"] == "Central"


def test_create_station_delegates_to_service(monkeypatch: pytest.MonkeyPatch) -> None:
    token = _authenticate()
    seen: dict[str, StationCreate] = {}

    def fake_create_station(db, payload: StationCreate) -> StationModel:
        seen["payload"] = payload
        return _station_model()

    monkeypatch.setattr(stations_api, "create_station", fake_create_station)

    response = client.post(
        "/api/stations",
        json=_station_payload(),
        headers={"Authorization": f"Bearer {token}"},
    )

    assert response.status_code == 201
    assert response.json()["name"] == "Central"
    assert seen["payload"].name == "Central"


def test_update_station_not_found_returns_404(monkeypatch: pytest.MonkeyPatch) -> None:
    token = _authenticate()

    def fake_update_station(
        db, station_id: str, payload: StationUpdate
    ) -> StationModel:
        raise LookupError("Station not found")

    monkeypatch.setattr(stations_api, "update_station", fake_update_station)

    response = client.put(
        "/api/stations/123e4567-e89b-12d3-a456-426614174000",
        json={"name": "New Name"},
        headers={"Authorization": f"Bearer {token}"},
    )

    assert response.status_code == 404
    assert response.json()["detail"] == "Station not found"


def test_archive_station_returns_updated_model(monkeypatch: pytest.MonkeyPatch) -> None:
    token = _authenticate()

    def fake_archive_station(db, station_id: str) -> StationModel:
        return _station_model(is_active=False)

    monkeypatch.setattr(stations_api, "archive_station", fake_archive_station)

    response = client.post(
        "/api/stations/123e4567-e89b-12d3-a456-426614174000/archive",
        headers={"Authorization": f"Bearer {token}"},
    )

    assert response.status_code == 200
    assert response.json()["isActive"] is False
67
backend/tests/test_stations_import.py
Normal file
@@ -0,0 +1,67 @@
from backend.scripts.stations_import import (
    build_overpass_query,
    normalize_station_elements,
)


def test_build_overpass_query_single_region() -> None:
    query = build_overpass_query("berlin_metropolitan")

    # The query should reference the Berlin bounding box coordinates.
    assert "52.3381" in query  # south
    assert "52.6755" in query  # north
    assert "13.0884" in query  # west
    assert "13.7611" in query  # east
    assert "node" in query
    assert "out body" in query


def test_build_overpass_query_all_regions_includes_union() -> None:
    query = build_overpass_query("all")

    # Ensure multiple regions are present by checking for repeated bbox parentheses.
    assert query.count("node") >= 3
    assert query.strip().endswith("out skel qt;")


def test_normalize_station_elements_filters_and_transforms() -> None:
    raw_elements = [
        {
            "type": "node",
            "id": 123,
            "lat": 52.5,
            "lon": 13.4,
            "tags": {
                "name": "Sample Station",
                "ref": "XYZ",
                "ele": "35.5",
            },
        },
        {
            "type": "node",
            "id": 999,
            # Missing coordinates should be ignored
            "tags": {"name": "Broken"},
        },
        {
            "type": "node",
            "id": 456,
            "lat": 50.0,
            "lon": 8.0,
            "tags": {
                "name": "Disused Station",
                "disused": "yes",
            },
        },
    ]

    stations = normalize_station_elements(raw_elements)

    assert len(stations) == 2
    primary = stations[0]
    assert primary["osm_id"] == "123"
    assert primary["name"] == "Sample Station"
    assert primary["code"] == "XYZ"
    assert primary["elevation_m"] == 35.5
    disused_station = stations[1]
    assert disused_station["is_active"] is False
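The normalization behaviour these assertions pin down (drop nodes without coordinates, mark `disused=yes` stations inactive) can be sketched standalone. Field names follow the test expectations above; the real `normalize_station_elements` does more:

```python
def normalize(elements):
    # Simplified sketch: keep only nodes with coordinates and flag
    # disused stations as inactive.
    stations = []
    for element in elements:
        if "lat" not in element or "lon" not in element:
            continue  # nodes without coordinates are ignored
        tags = element.get("tags", {})
        stations.append(
            {
                "osm_id": str(element["id"]),
                "name": tags.get("name"),
                "is_active": tags.get("disused") != "yes",
            }
        )
    return stations


sample = [
    {"type": "node", "id": 123, "lat": 52.5, "lon": 13.4, "tags": {"name": "Sample Station"}},
    {"type": "node", "id": 999, "tags": {"name": "Broken"}},
    {"type": "node", "id": 456, "lat": 50.0, "lon": 8.0,
     "tags": {"name": "Disused Station", "disused": "yes"}},
]
normalized = normalize(sample)
```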
142
backend/tests/test_stations_load.py
Normal file
@@ -0,0 +1,142 @@
from __future__ import annotations

from dataclasses import dataclass, field

import pytest

from backend.scripts import stations_load


def test_parse_station_entries_returns_models() -> None:
    entries = [
        {
            "name": "Central",
            "latitude": 52.52,
            "longitude": 13.405,
            "osm_id": "123",
            "code": "BER",
            "elevation_m": 34.5,
            "is_active": True,
        }
    ]

    parsed = stations_load._parse_station_entries(entries)

    assert parsed[0].name == "Central"
    assert parsed[0].latitude == 52.52
    assert parsed[0].osm_id == "123"


def test_parse_station_entries_invalid_raises_value_error() -> None:
    entries = [
        {
            "latitude": 52.52,
            "longitude": 13.405,
            "is_active": True,
        }
    ]

    with pytest.raises(ValueError):
        stations_load._parse_station_entries(entries)


@dataclass
class DummySession:
    committed: bool = False
    rolled_back: bool = False
    closed: bool = False

    def __enter__(self) -> "DummySession":
        return self

    def __exit__(self, exc_type, exc, traceback) -> None:
        self.closed = True

    def commit(self) -> None:
        self.committed = True

    def rollback(self) -> None:
        self.rolled_back = True


@dataclass
class DummyRepository:
    session: DummySession
    created: list = field(default_factory=list)

    def create(self, data) -> None:  # pragma: no cover - simple delegation
        self.created.append(data)


class DummySessionFactory:
    def __call__(self) -> DummySession:
        return DummySession()


def test_load_stations_commits_when_requested(monkeypatch: pytest.MonkeyPatch) -> None:
    repo_instances: list[DummyRepository] = []

    def fake_session_local() -> DummySession:
        return DummySession()

    def fake_repo(session: DummySession) -> DummyRepository:
        repo = DummyRepository(session)
        repo_instances.append(repo)
        return repo

    monkeypatch.setattr(stations_load, "SessionLocal", fake_session_local)
    monkeypatch.setattr(stations_load, "StationRepository", fake_repo)

    stations = stations_load._parse_station_entries(
        [
            {
                "name": "Central",
                "latitude": 52.52,
                "longitude": 13.405,
                "osm_id": "123",
                "is_active": True,
            }
        ]
    )

    created = stations_load.load_stations(stations, commit=True)

    assert created == 1
    assert repo_instances[0].session.committed is True
    assert repo_instances[0].session.rolled_back is False
    assert len(repo_instances[0].created) == 1


def test_load_stations_rolls_back_when_no_commit(
    monkeypatch: pytest.MonkeyPatch,
) -> None:
    repo_instances: list[DummyRepository] = []

    def fake_session_local() -> DummySession:
        return DummySession()

    def fake_repo(session: DummySession) -> DummyRepository:
        repo = DummyRepository(session)
        repo_instances.append(repo)
        return repo

    monkeypatch.setattr(stations_load, "SessionLocal", fake_session_local)
    monkeypatch.setattr(stations_load, "StationRepository", fake_repo)

    stations = stations_load._parse_station_entries(
        [
            {
                "name": "Central",
                "latitude": 52.52,
                "longitude": 13.405,
                "osm_id": "123",
                "is_active": True,
            }
        ]
    )

    created = stations_load.load_stations(stations, commit=False)

    assert created == 1
    assert repo_instances[0].session.committed is False
    assert repo_instances[0].session.rolled_back is True
175
backend/tests/test_stations_service.py
Normal file
@@ -0,0 +1,175 @@
from __future__ import annotations

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List, cast
from uuid import UUID, uuid4

import pytest
from geoalchemy2.elements import WKTElement
from sqlalchemy.orm import Session

from backend.app.models import StationCreate, StationUpdate
from backend.app.services import stations as stations_service


@dataclass
class DummySession:
    flushed: bool = False
    committed: bool = False
    refreshed: List[object] = field(default_factory=list)

    def flush(self) -> None:
        self.flushed = True

    def refresh(self, instance: object) -> None:  # pragma: no cover - simple setter
        self.refreshed.append(instance)

    def commit(self) -> None:
        self.committed = True


@dataclass
class DummyStation:
    id: UUID
    name: str
    location: WKTElement
    osm_id: str | None
    code: str | None
    elevation_m: float | None
    is_active: bool
    created_at: datetime
    updated_at: datetime


class DummyStationRepository:
    _store: Dict[UUID, DummyStation] = {}

    def __init__(self, session: DummySession) -> None:  # pragma: no cover - simple init
        self.session = session

    @staticmethod
    def _point(latitude: float, longitude: float) -> WKTElement:
        return WKTElement(f"POINT({longitude} {latitude})", srid=4326)

    def list(self) -> list[DummyStation]:
        return list(self._store.values())

    def list_active(self) -> list[DummyStation]:
        return [station for station in self._store.values() if station.is_active]

    def get(self, identifier: UUID) -> DummyStation | None:
        return self._store.get(identifier)

    def create(self, payload: StationCreate) -> DummyStation:
        station = DummyStation(
            id=uuid4(),
            name=payload.name,
            location=self._point(payload.latitude, payload.longitude),
            osm_id=payload.osm_id,
            code=payload.code,
            elevation_m=payload.elevation_m,
            is_active=payload.is_active,
            created_at=datetime.now(timezone.utc),
            updated_at=datetime.now(timezone.utc),
        )
        self._store[station.id] = station
        return station


@pytest.fixture(autouse=True)
def reset_store(monkeypatch: pytest.MonkeyPatch) -> None:
    DummyStationRepository._store = {}
    monkeypatch.setattr(stations_service, "StationRepository", DummyStationRepository)


def test_create_station_persists_and_returns_model(
    monkeypatch: pytest.MonkeyPatch,
) -> None:
    session = DummySession()
    payload = StationCreate(
        name="Central",
        latitude=52.52,
        longitude=13.405,
        osm_id="123",
        code="BER",
        elevation_m=34.5,
        is_active=True,
    )

    result = stations_service.create_station(cast(Session, session), payload)

    assert session.flushed is True
    assert session.committed is True
    assert result.name == "Central"
    assert result.latitude == pytest.approx(52.52)
    assert result.longitude == pytest.approx(13.405)
    assert result.osm_id == "123"


def test_update_station_updates_geometry_and_metadata() -> None:
    session = DummySession()
    station_id = uuid4()
    DummyStationRepository._store[station_id] = DummyStation(
        id=station_id,
        name="Old Name",
        location=DummyStationRepository._point(50.0, 8.0),
        osm_id=None,
        code=None,
        elevation_m=None,
        is_active=True,
        created_at=datetime.now(timezone.utc),
        updated_at=datetime.now(timezone.utc),
    )

    payload = StationUpdate(name="New Name", latitude=51.0, longitude=9.0)
    result = stations_service.update_station(
        cast(Session, session), str(station_id), payload
    )

    assert result.name == "New Name"
    assert result.latitude == pytest.approx(51.0)
    assert result.longitude == pytest.approx(9.0)
    assert DummyStationRepository._store[station_id].name == "New Name"


def test_update_station_requires_both_coordinates() -> None:
    session = DummySession()
    station_id = uuid4()
    DummyStationRepository._store[station_id] = DummyStation(
        id=station_id,
        name="Station",
        location=DummyStationRepository._point(50.0, 8.0),
        osm_id=None,
        code=None,
        elevation_m=None,
        is_active=True,
        created_at=datetime.now(timezone.utc),
        updated_at=datetime.now(timezone.utc),
    )

    with pytest.raises(ValueError):
        stations_service.update_station(
            cast(Session, session), str(station_id), StationUpdate(latitude=51.0)
        )


def test_archive_station_marks_inactive() -> None:
    session = DummySession()
    station_id = uuid4()
    DummyStationRepository._store[station_id] = DummyStation(
        id=station_id,
        name="Station",
        location=DummyStationRepository._point(50.0, 8.0),
        osm_id=None,
        code=None,
        elevation_m=None,
        is_active=True,
        created_at=datetime.now(timezone.utc),
        updated_at=datetime.now(timezone.utc),
    )

    result = stations_service.archive_station(cast(Session, session), str(station_id))

    assert result.is_active is False
    assert DummyStationRepository._store[station_id].is_active is False
158
backend/tests/test_tracks_api.py
Normal file
@@ -0,0 +1,158 @@
from __future__ import annotations

from datetime import datetime, timezone
from typing import Any

import pytest
from fastapi.testclient import TestClient

from backend.app.api import tracks as tracks_api
from backend.app.main import app
from backend.app.models import CombinedTrackModel, TrackModel

client = TestClient(app)


def _track_model(track_id: str = "track-1") -> TrackModel:
    now = datetime.now(timezone.utc)
    return TrackModel(
        id=track_id,
        start_station_id="station-a",
        end_station_id="station-b",
        length_meters=None,
        max_speed_kph=None,
        status="planned",
        coordinates=[(52.5, 13.4), (52.6, 13.5)],
        is_bidirectional=True,
        created_at=now,
        updated_at=now,
    )


def _combined_model(track_id: str = "combined-1") -> CombinedTrackModel:
    now = datetime.now(timezone.utc)
    return CombinedTrackModel(
        id=track_id,
        start_station_id="station-a",
        end_station_id="station-b",
        length_meters=1000,
        max_speed_kph=120,
        status="operational",
        coordinates=[(52.5, 13.4), (52.6, 13.5)],
        constituent_track_ids=["track-1", "track-2"],
        is_bidirectional=True,
        created_at=now,
        updated_at=now,
    )


def _authenticate() -> str:
    response = client.post(
        "/api/auth/login",
        json={"username": "demo", "password": "railgame123"},
    )
    assert response.status_code == 200
    return response.json()["accessToken"]


def test_list_tracks(monkeypatch: pytest.MonkeyPatch) -> None:
    token = _authenticate()
    monkeypatch.setattr(tracks_api, "list_tracks", lambda db: [_track_model()])

    response = client.get(
        "/api/tracks",
        headers={"Authorization": f"Bearer {token}"},
    )
    assert response.status_code == 200
    payload = response.json()
    assert isinstance(payload, list)
    assert payload[0]["id"] == "track-1"


def test_get_track_returns_404(monkeypatch: pytest.MonkeyPatch) -> None:
    token = _authenticate()
    monkeypatch.setattr(tracks_api, "get_track", lambda db, track_id: None)

    response = client.get(
        "/api/tracks/not-found",
        headers={"Authorization": f"Bearer {token}"},
    )
    assert response.status_code == 404


def test_create_track_calls_service(monkeypatch: pytest.MonkeyPatch) -> None:
    token = _authenticate()
    captured: dict[str, Any] = {}
    payload = {
        "startStationId": "station-a",
        "endStationId": "station-b",
        "coordinates": [[52.5, 13.4], [52.6, 13.5]],
    }

    def fake_create(db: Any, data: Any) -> TrackModel:
        assert data.start_station_id == "station-a"
        captured["payload"] = data
        return _track_model("track-new")

    monkeypatch.setattr(tracks_api, "create_track", fake_create)

    response = client.post(
        "/api/tracks",
        json=payload,
        headers={"Authorization": f"Bearer {token}"},
    )
    assert response.status_code == 201
    body = response.json()
    assert body["id"] == "track-new"
    assert captured["payload"].end_station_id == "station-b"


def test_delete_track_returns_404(monkeypatch: pytest.MonkeyPatch) -> None:
    token = _authenticate()
    monkeypatch.setattr(
        tracks_api, "delete_track", lambda db, tid, regenerate=False: False
    )

    response = client.delete(
        "/api/tracks/missing",
        headers={"Authorization": f"Bearer {token}"},
    )
    assert response.status_code == 404


def test_delete_track_success(monkeypatch: pytest.MonkeyPatch) -> None:
    token = _authenticate()
    seen: dict[str, Any] = {}

    def fake_delete(db: Any, track_id: str, regenerate: bool = False) -> bool:
        seen["track_id"] = track_id
        seen["regenerate"] = regenerate
        return True

    monkeypatch.setattr(tracks_api, "delete_track", fake_delete)

    response = client.delete(
        "/api/tracks/track-99",
        params={"regenerate": "true"},
        headers={"Authorization": f"Bearer {token}"},
    )

    assert response.status_code == 204
    assert seen["track_id"] == "track-99"
    assert seen["regenerate"] is True


def test_list_combined_tracks(monkeypatch: pytest.MonkeyPatch) -> None:
    token = _authenticate()
    monkeypatch.setattr(
        tracks_api, "list_combined_tracks", lambda db: [_combined_model()]
    )

    response = client.get(
        "/api/tracks/combined",
        headers={"Authorization": f"Bearer {token}"},
    )
    assert response.status_code == 200
    payload = response.json()
    assert len(payload) == 1
    assert payload[0]["id"] == "combined-1"
110
backend/tests/test_tracks_import.py
Normal file
@@ -0,0 +1,110 @@
from __future__ import annotations

from backend.scripts import tracks_import


def test_normalize_track_elements_excludes_invalid_geometries() -> None:
    elements = [
        {
            "type": "way",
            "id": 123,
            "geometry": [
                {"lat": 52.5, "lon": 13.4},
                {"lat": 52.6, "lon": 13.5},
            ],
            "tags": {
                "name": "Main Line",
                "railway": "rail",
                "maxspeed": "120",
            },
        },
        {
            "type": "way",
            "id": 456,
            "geometry": [
                {"lat": 51.0},
            ],
            "tags": {"railway": "rail"},
        },
        {
            "type": "node",
            "id": 789,
        },
    ]

    tracks = tracks_import.normalize_track_elements(elements)

    assert len(tracks) == 1
    track = tracks[0]
    assert track["osmId"] == "123"
    assert track["name"] == "Main Line"
    assert track["maxSpeedKph"] == 120.0
    assert track["status"] == "operational"
    assert track["isBidirectional"] is True
    assert track["coordinates"] == [[52.5, 13.4], [52.6, 13.5]]
    assert track["lengthMeters"] > 0


def test_normalize_track_elements_marks_oneway_and_status() -> None:
    elements = [
        {
            "type": "way",
            "id": 42,
            "geometry": [
                {"lat": 48.1, "lon": 11.5},
                {"lat": 48.2, "lon": 11.6},
            ],
            "tags": {
                "railway": "disused",
                "oneway": "yes",
            },
        }
    ]

    tracks = tracks_import.normalize_track_elements(elements)

    assert len(tracks) == 1
    track = tracks[0]
    assert track["status"] == "disused"
    assert track["isBidirectional"] is False


def test_normalize_track_elements_skips_service_tracks() -> None:
    elements = [
        {
            "type": "way",
            "id": 77,
            "geometry": [
                {"lat": 52.5000, "lon": 13.4000},
                {"lat": 52.5010, "lon": 13.4010},
            ],
            "tags": {
                "railway": "rail",
                "service": "yard",
            },
        }
    ]

    tracks = tracks_import.normalize_track_elements(elements)

    assert tracks == []


def test_normalize_track_elements_skips_short_tracks() -> None:
    elements = [
        {
            "type": "way",
            "id": 81,
            "geometry": [
                {"lat": 52.500000, "lon": 13.400000},
                {"lat": 52.500100, "lon": 13.400050},
            ],
            "tags": {
                "railway": "rail",
            },
        }
    ]

    tracks = tracks_import.normalize_track_elements(elements)

    assert tracks == []
||||
212
backend/tests/test_tracks_load.py
Normal file
@@ -0,0 +1,212 @@
from __future__ import annotations

from dataclasses import dataclass, field
from typing import List

import pytest
from geoalchemy2.shape import from_shape
from shapely.geometry import Point

from backend.scripts import tracks_load


def test_parse_track_entries_returns_models() -> None:
    entries = [
        {
            "name": "Connector",
            "coordinates": [[52.5, 13.4], [52.6, 13.5]],
            "lengthMeters": 1500,
            "maxSpeedKph": 120,
            "status": "operational",
            "isBidirectional": True,
        }
    ]

    parsed = tracks_load._parse_track_entries(entries)

    assert parsed[0].name == "Connector"
    assert parsed[0].coordinates[0] == (52.5, 13.4)
    assert parsed[0].length_meters == 1500
    assert parsed[0].max_speed_kph == 120


def test_parse_track_entries_invalid_raises_value_error() -> None:
    entries = [
        {
            "coordinates": [[52.5, 13.4]],
        }
    ]

    with pytest.raises(ValueError):
        tracks_load._parse_track_entries(entries)


@dataclass
class DummySession:
    committed: bool = False
    rolled_back: bool = False

    def __enter__(self) -> "DummySession":
        return self

    def __exit__(self, exc_type, exc, traceback) -> None:
        pass

    def commit(self) -> None:
        self.committed = True

    def rollback(self) -> None:
        self.rolled_back = True


@dataclass
class DummyStation:
    id: str
    location: object


@dataclass
class DummyStationRepository:
    session: DummySession
    stations: List[DummyStation]

    def list_active(self) -> List[DummyStation]:
        return self.stations


@dataclass
class DummyTrackRepository:
    session: DummySession
    created: list = field(default_factory=list)
    existing: list = field(default_factory=list)

    def list_all(self):
        return self.existing

    def create(self, data):  # pragma: no cover - simple delegation
        self.created.append(data)


def _point(lat: float, lon: float) -> object:
    return from_shape(Point(lon, lat), srid=4326)


def test_load_tracks_creates_entries(monkeypatch: pytest.MonkeyPatch) -> None:
    session_instance = DummySession()
    station_repo_instance = DummyStationRepository(
        session_instance,
        stations=[
            DummyStation(id="station-a", location=_point(52.5, 13.4)),
            DummyStation(id="station-b", location=_point(52.6, 13.5)),
        ],
    )
    track_repo_instance = DummyTrackRepository(session_instance)

    monkeypatch.setattr(tracks_load, "SessionLocal", lambda: session_instance)
    monkeypatch.setattr(
        tracks_load, "StationRepository", lambda session: station_repo_instance
    )
    monkeypatch.setattr(
        tracks_load, "TrackRepository", lambda session: track_repo_instance
    )

    parsed = tracks_load._parse_track_entries(
        [
            {
                "name": "Connector",
                "coordinates": [[52.5, 13.4], [52.6, 13.5]],
            }
        ]
    )

    created = tracks_load.load_tracks(parsed, commit=True)

    assert created == 1
    assert session_instance.committed is True
    assert track_repo_instance.created
    track = track_repo_instance.created[0]
    assert track.start_station_id == "station-a"
    assert track.end_station_id == "station-b"
    assert track.coordinates == [(52.5, 13.4), (52.6, 13.5)]


def test_load_tracks_skips_existing_pairs(monkeypatch: pytest.MonkeyPatch) -> None:
    session_instance = DummySession()
    station_repo_instance = DummyStationRepository(
        session_instance,
        stations=[
            DummyStation(id="station-a", location=_point(52.5, 13.4)),
            DummyStation(id="station-b", location=_point(52.6, 13.5)),
        ],
    )
    existing_track = type(
        "ExistingTrack",
        (),
        {
            "start_station_id": "station-a",
            "end_station_id": "station-b",
        },
    )
    track_repo_instance = DummyTrackRepository(
        session_instance,
        existing=[existing_track],
    )

    monkeypatch.setattr(tracks_load, "SessionLocal", lambda: session_instance)
    monkeypatch.setattr(
        tracks_load, "StationRepository", lambda session: station_repo_instance
    )
    monkeypatch.setattr(
        tracks_load, "TrackRepository", lambda session: track_repo_instance
    )

    parsed = tracks_load._parse_track_entries(
        [
            {
                "name": "Connector",
                "coordinates": [[52.5, 13.4], [52.6, 13.5]],
            }
        ]
    )

    created = tracks_load.load_tracks(parsed, commit=False)

    assert created == 0
    assert session_instance.rolled_back is True
    assert not track_repo_instance.created


def test_load_tracks_skips_when_station_too_far(
    monkeypatch: pytest.MonkeyPatch,
) -> None:
    session_instance = DummySession()
    station_repo_instance = DummyStationRepository(
        session_instance,
        stations=[
            DummyStation(id="remote-station", location=_point(53.5, 14.5)),
        ],
    )
    track_repo_instance = DummyTrackRepository(session_instance)

    monkeypatch.setattr(tracks_load, "SessionLocal", lambda: session_instance)
    monkeypatch.setattr(
        tracks_load, "StationRepository", lambda session: station_repo_instance
    )
    monkeypatch.setattr(
        tracks_load, "TrackRepository", lambda session: track_repo_instance
    )

    parsed = tracks_load._parse_track_entries(
        [
            {
                "name": "Isolated Segment",
                "coordinates": [[52.5, 13.4], [52.51, 13.41]],
            }
        ]
    )

    created = tracks_load.load_tracks(parsed, commit=True)

    assert created == 0
    assert session_instance.committed is True
    assert not track_repo_instance.created
9782
data/osm_stations.json
Normal file
File diff suppressed because it is too large
527625
data/osm_tracks.json
Normal file
File diff suppressed because it is too large
@@ -1,5 +1,3 @@
version: "3.9"

services:
  db:
    build:
@@ -27,6 +25,7 @@ services:
      DATABASE_URL: postgresql+psycopg://railgame:railgame@db:5432/railgame_dev
      TEST_DATABASE_URL: postgresql+psycopg://railgame:railgame@db:5432/railgame_test
      REDIS_URL: redis://redis:6379/0
      INIT_DEMO_DB: "true"
    depends_on:
      - db
      - redis
160
docs/05_Building_Block_View.md
Normal file
@@ -0,0 +1,160 @@
# 5. Building Block View

### 5.1 Whitebox Overall System

The Rail Game system is structured as a client-server architecture with the following top-level building blocks:

- **Frontend Application**: Browser-based React SPA handling user interface and interactions
- **Backend API**: Python-based RESTful API server managing game logic and data access
- **Database**: PostgreSQL with PostGIS for persistent storage and spatial queries
- **External Services**: OpenStreetMap and other third-party APIs for map data and additional features

```mermaid
graph TD
    A[Frontend Application] -->|REST API| B[Backend API]
    B -->|SQL Queries| C[Database]
    B -->|API Calls| D[External Services]
    A -->|Map Tiles| D
```

### 5.2 Level 1 Building Blocks

#### 5.2.1 Frontend Application

**Responsibility**: Provides the user interface for railway network building, management, and visualization.

**Interfaces**:

- User interactions via browser
- RESTful API calls to Backend API
- Integration with Leaflet for map rendering

**Key Components**:

- Map View: Displays railway networks and allows interaction
- Network Builder: Tools for creating and editing railway tracks and stations
- Dashboard: User profile, resources, and game statistics
- Authentication UI: Login, registration, and profile management

#### 5.2.2 Backend API

**Responsibility**: Handles game logic, data processing, and serves as the interface between frontend and database.

**Interfaces**:

- RESTful HTTP endpoints for frontend communication
- Database connections via SQLAlchemy ORM
- Potential WebSocket connections for real-time updates

**Key Components**:

- User Management: Authentication, profiles, and sessions
- Railway Engine: Logic for network building, route calculation, and scheduling
- Game Logic: Resource management, scoring, and achievements
- Data Access Layer: Abstraction for database operations

#### 5.2.3 Database

**Responsibility**: Persistent storage of user data, railway networks, and game state.

**Interfaces**:

- SQL connections from Backend API
- Spatial queries via PostGIS extensions

**Key Components**:

- User Schema: Accounts, profiles, and authentication data
- Railway Schema: Tracks, stations, trains, and schedules
- Game Schema: Resources, achievements, and leaderboards
#### 5.2.4 External Services

**Responsibility**: Provides external data sources and integrations.

**Interfaces**:

- API calls from Backend or Frontend
- Data feeds for map tiles, geographical information, and real-time data

**Key Components**:

- OpenStreetMap: Source of map tiles and railway data
- Authentication Providers: OAuth integrations (e.g., Google, GitHub)
- Analytics Services: User tracking and performance monitoring

### 5.3 Level 2 Building Blocks

#### 5.3.1 Frontend Components

```mermaid
graph TD
    A[Map Component] -->|Leaflet| B[Toolbar Component]
    A -->|Leaflet| C[Modal Components]
    A -->|Redux| D[State Management]
```

- **Map Component**: React Leaflet-based map showing OpenStreetMap tiles with station markers and track polylines drawn from the shared network snapshot models
- **Toolbar Component**: Tools for building tracks, placing stations, and managing trains
- **Modal Components**: Dialogs for settings, confirmations, and detailed views
- **State Management**: Redux store for game state and UI state

#### 5.3.2 Backend Modules

```mermaid
graph TD
    A[API Layer] -->|REST Endpoints| B[Health Router]
    A -->|REST Endpoints| C[Network Router]
    C -->|Domain Models| D[Network Service]
    D -->|Shared Schemas| E[Frontend Data Contracts]
```

- **Health Module**: Lightweight readiness probes used by infrastructure checks.
- **Network Module**: Serves read-only snapshots of stations, tracks, and trains using shared domain models (camelCase aliases for client compatibility).
- **OSM Ingestion CLI**: Script pairings (`stations_import`/`stations_load`, `tracks_import`/`tracks_load`) that harvest OpenStreetMap fixtures and persist normalized station and track geometries into PostGIS.
- **Authentication Module**: JWT-based user registration, authentication, and authorization. The current prototype supports on-the-fly account creation backed by an in-memory user store and issues short-lived access tokens to validate the client flow end-to-end.
- **Railway Calculation Module**: Algorithms for route optimization and scheduling (planned).
- **Resource Management Module**: Logic for game economy and progression (planned).
- **Real-time Module**: WebSocket handlers for live updates (if implemented).
#### 5.3.3 Database Tables

- **Users Table**: User accounts and profile information
- **Railways Table**: User-created railway networks (spatial data)
- **Trains Table**: Train configurations and schedules
- **Stations Table**: Station locations and properties (spatial data)
- **Achievements Table**: User progress and leaderboard data

### 5.4 Project Directory Structure

The repository is organized to mirror the logical architecture and isolate concerns between frontend, backend, infrastructure, and shared assets.

```text
rail-game/
|-- backend/
|   |-- app/
|   |   |-- api/        # FastAPI route handlers and request lifecycles
|   |   |-- core/       # Configuration, startup hooks, cross-cutting utilities
|   |   |-- models/     # SQLAlchemy models, Pydantic schemas, migration helpers
|   |   |-- services/   # Domain services for scheduling, routing, resource logic
|   |   `-- websocket/  # Real-time transport adapters and event handlers
|   |-- tests/          # Backend unit, integration, and contract tests
|   `-- requirements/   # Dependency manifests and lockfiles per environment
|-- frontend/
|   |-- public/         # Static assets served without processing
|   |-- src/
|   |   |-- components/ # Reusable React UI components and widgets
|   |   |-- hooks/      # Custom hooks for map interaction, data fetching, state sync
|   |   |-- pages/      # Route-level views composing feature modules
|   |   |-- state/      # Redux/Context stores, slices, and middleware
|   |   |-- styles/     # Global stylesheets, design tokens, CSS modules
|   |   `-- utils/      # Frontend-only helpers for formatting and calculations
|   `-- tests/          # Component, store, and integration tests (Jest/React Testing Library)
|-- docs/               # Architecture docs, ADRs, onboarding guides
|-- infra/              # Docker, Terraform, CI/CD workflows, deployment manifests
|-- scripts/            # Automation for setup, linting, database tasks, data imports
|-- data/               # Seed datasets, fixtures, export/import scripts (kept out of VCS if large)
`-- tests/              # Cross-cutting end-to-end suites and shared test utilities
```

Shared code that spans application layers should be surfaced through well-defined APIs within `backend/app/services` or exposed via frontend data contracts to keep coupling low. Infrastructure automation and CI/CD assets remain isolated under `infra/` to support multiple deployment targets.
180
docs/06_Runtime_View.md
Normal file
@@ -0,0 +1,180 @@
# 6. Runtime View

### 6.1 Overview

The runtime view illustrates the dynamic behavior of the Rail Game system during typical user interactions. It shows how the building blocks interact to fulfill user requests and maintain system state.

### 6.2 Key Runtime Scenarios

#### 6.2.1 User Authentication

**Scenario**: A user signs up and logs into the game.

**Description**: From the authentication UI the user can either register a new profile or sign in with existing credentials. New registrations are persisted in the prototype's in-memory store. On login the backend verifies the credentials and issues a JWT token for subsequent requests.

```mermaid
sequenceDiagram
    participant U as User
    participant F as Frontend
    participant B as Backend API
    participant D as Database

    U->>F: Submit signup/login form
    alt Register new account
        F->>B: POST /api/auth/register
        B->>B: Persist user (in-memory prototype store)
    end
    F->>B: POST /api/auth/login
    B->>D: Query user credentials
    D-->>B: User data
    B->>B: Validate password
    B-->>F: JWT token
    F-->>U: Redirect to dashboard
```
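The token-issuance step in the sequence above can be sketched with the standard library alone. This is an illustrative HS256 JWT; the secret, claim names, and 15-minute expiry are assumptions, not the project's actual values:

```python
# Sketch: issue a short-lived HS256 JWT after a successful credential check.
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-secret"  # assumption: real deployments load this from config


def _b64(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def issue_token(username: str, ttl_seconds: int = 900) -> str:
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64(
        json.dumps({"sub": username, "exp": int(time.time()) + ttl_seconds}).encode()
    )
    signing_input = f"{header}.{payload}".encode()
    signature = _b64(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    # A compact JWT is three dot-separated base64url segments.
    return f"{header}.{payload}.{signature}"
```

Production code would normally delegate this to a maintained library (e.g. PyJWT) rather than hand-rolling the signing.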
#### 6.2.2 Loading Map and Railway Data

**Scenario**: User opens the game and loads their railway network.

**Description**: The frontend requests map tiles from OpenStreetMap and user-specific railway data from the backend, which retrieves it from the database.

```mermaid
sequenceDiagram
    participant U as User
    participant F as Frontend
    participant B as Backend API
    participant D as Database
    participant OSM as OpenStreetMap

    U->>F: Open game
    F->>OSM: Request map tiles
    OSM-->>F: Map tiles
    F->>B: GET /api/railways/{userId}
    B->>D: Query user railways
    D-->>B: Railway data (spatial)
    B-->>F: Railway network JSON
    F->>F: Render map with railways
```
#### 6.2.3 Fetching Network Snapshot (current implementation)

**Scenario**: The frontend loads a shared snapshot of stations, tracks, and trains using the domain models.

**Description**: After the React client authenticates and stores the issued access token, it calls the FastAPI `/api/network` endpoint with a bearer header. The backend constructs a `NetworkSnapshot` using immutable domain models and returns camelCase JSON for direct consumption by TypeScript interfaces. The frontend hydrates both summary lists and the React Leaflet map overlay with the resulting station and track geometry.

```mermaid
sequenceDiagram
    participant F as Frontend (React)
    participant H as Hook (useNetworkSnapshot)
    participant A as API Router (/api/network)
    participant S as Network Service

    F->>H: Mount component
    H->>A: GET /api/network (Bearer token)
    A->>S: Build snapshot using domain models
    S-->>A: Stations, tracks, trains (camelCase JSON)
    A-->>H: 200 OK + payload
    H-->>F: Update UI state (loading → success)
    F->>F: Render Leaflet map and snapshot summaries
```
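The camelCase contract between the backend models and the TypeScript interfaces can be shown with a small stand-alone sketch. The real backend uses immutable Pydantic models with alias generators; the frozen dataclass and the `Station` fields below are illustrative stand-ins.

```python
from dataclasses import asdict, dataclass

def to_camel(snake: str) -> str:
    """Convert a snake_case field name to the camelCase key the client expects."""
    head, *rest = snake.split("_")
    return head + "".join(part.title() for part in rest)

@dataclass(frozen=True)  # immutable, mirroring the frozen Pydantic domain models
class Station:
    station_id: str
    display_name: str
    latitude: float
    longitude: float

    def to_json_dict(self) -> dict:
        # camelCase keys mirror the TypeScript interface on the client
        return {to_camel(key): value for key, value in asdict(self).items()}

s = Station("st-1", "Central", 52.52, 13.40)
assert s.to_json_dict() == {"stationId": "st-1", "displayName": "Central",
                            "latitude": 52.52, "longitude": 13.40}
```

The point of the contract is that the client never renames fields: the JSON keys it receives are exactly the property names declared in `frontend/src/types/domain.ts`.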
#### 6.2.4 OSM Track Import and Load

**Scenario**: Operator refreshes spatial fixtures by harvesting OSM railways and persisting them to PostGIS.

**Description**: The paired CLI scripts `tracks_import.py` and `tracks_load.py` export candidate track segments from Overpass, associate endpoints with the nearest known stations, and store the resulting LINESTRING geometries. Dry-run flags allow inspection of the generated Overpass payload or database mutations before commit.

```mermaid
sequenceDiagram
    participant Ops as Operator
    participant TI as tracks_import.py
    participant OL as Overpass API
    participant TL as tracks_load.py
    participant DB as PostGIS

    Ops->>TI: Invoke with region + output path
    TI->>OL: POST compiled Overpass query
    OL-->>TI: Return matching railway ways (JSON)
    TI-->>Ops: Write normalized tracks JSON
    Ops->>TL: Invoke with normalized JSON
    TL->>DB: Fetch stations + existing tracks
    TL->>DB: Insert snapped LINESTRING geometries
    TL-->>Ops: Report committed track count
```
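The "compiled Overpass query" step might look roughly like the sketch below. The bounding-box signature, the railway type list, and the function name are assumptions; the actual query in `tracks_import.py` may differ.

```python
RAILWAY_TYPES = ["rail", "light_rail", "subway", "tram", "narrow_gauge"]

def build_overpass_query(south: float, west: float, north: float, east: float) -> str:
    """Compile an Overpass QL query selecting railway ways inside a bounding box."""
    bbox = f"({south},{west},{north},{east})"
    selectors = "".join(f'way["railway"="{t}"]{bbox};' for t in RAILWAY_TYPES)
    # `out geom;` inlines node coordinates so the importer can build LINESTRINGs
    # without a second round-trip for node lookups.
    return f"[out:json][timeout:60];({selectors});out geom;"

query = build_overpass_query(47.0, 8.0, 47.5, 8.5)
assert query.startswith("[out:json]")
assert 'way["railway"="light_rail"](47.0,8.0,47.5,8.5);' in query
```

Posting this string as the `data` field to an Overpass endpoint returns the JSON `way` elements the importer then normalizes.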
#### 6.2.5 Building Railway Network

**Scenario**: User adds a new track segment to their railway network.

**Description**: The user interacts with the map to place a new track. The frontend sends the new track data to the backend, which validates and stores it in the database.

```mermaid
sequenceDiagram
    participant U as User
    participant F as Frontend
    participant B as Backend API
    participant D as Database

    U->>F: Draw new track on map
    F->>F: Validate track placement
    F->>B: POST /api/tracks
    B->>B: Validate track logic
    B->>D: Insert new track (spatial)
    D-->>B: Confirmation
    B-->>F: Success response
    F->>F: Update map display
```
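One plausible piece of the "validate track logic" step is a minimum-length check on the submitted polyline; a minimal sketch, reusing the 75-meter guardrail described in the concepts document (the function names and threshold default are assumptions):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two WGS84 points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

def track_length_m(coords: list[tuple[float, float]]) -> float:
    """Sum the per-segment distances of a (lat, lon) polyline."""
    return sum(haversine_m(*a, *b) for a, b in zip(coords, coords[1:]))

def validate_track(coords: list[tuple[float, float]], min_length_m: float = 75.0) -> bool:
    return len(coords) >= 2 and track_length_m(coords) >= min_length_m

assert not validate_track([(52.5200, 13.4050), (52.5201, 13.4050)])  # roughly 11 m: too short
assert validate_track([(52.5200, 13.4050), (52.5300, 13.4050)])      # roughly 1.1 km
```

In production this check belongs server-side (the frontend's placement check is only a convenience), and PostGIS `ST_Length` on a geography cast would give the same answer without reimplementing the formula.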
#### 6.2.6 Running Train Simulation

**Scenario**: User starts a train simulation on their network.

**Description**: The frontend requests a simulation start; the backend calculates train routes and schedules, updates the database with the simulation state, and streams real-time updates back to the frontend.

```mermaid
sequenceDiagram
    participant U as User
    participant F as Frontend
    participant B as Backend API
    participant D as Database

    U->>F: Click "Start Simulation"
    F->>B: POST /api/simulation/start
    B->>D: Query railway network
    D-->>B: Network data
    B->>B: Calculate routes & schedules
    B->>D: Update train positions
    D-->>B: Confirmation
    B-->>F: Simulation started
    loop Real-time updates
        B->>B: Update train positions
        B->>D: Save positions
        B-->>F: WebSocket position updates
    end
```
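The per-tick "update train positions" step could interpolate each train along its track polyline; a minimal planar sketch (a real implementation would work on geodesic geometry, and the function name is an assumption):

```python
def point_along(polyline: list[tuple[float, float]], fraction: float) -> tuple[float, float]:
    """Return the point at `fraction` (0..1) of the polyline's total length."""
    segments = list(zip(polyline, polyline[1:]))
    lengths = [((bx - ax) ** 2 + (by - ay) ** 2) ** 0.5
               for (ax, ay), (bx, by) in segments]
    remaining = max(0.0, min(1.0, fraction)) * sum(lengths)
    for ((ax, ay), (bx, by)), seg_len in zip(segments, lengths):
        if remaining <= seg_len and seg_len > 0:
            t = remaining / seg_len  # position within this segment
            return (ax + t * (bx - ax), ay + t * (by - ay))
        remaining -= seg_len
    return polyline[-1]

track = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
assert point_along(track, 0.5) == (0.0, 1.0)    # halfway along a 2-unit path
assert point_along(track, 0.75) == (0.5, 1.0)
```

Each tick the simulation would advance `fraction` by `speed * dt / track_length` and push the resulting coordinate over the WebSocket channel.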
#### 6.2.7 Saving Game Progress

**Scenario**: User saves their current game state.

**Description**: The frontend sends the current game state to the backend for persistence, either periodically or on explicit user request.

```mermaid
sequenceDiagram
    participant F as Frontend
    participant B as Backend API
    participant D as Database

    F->>B: POST /api/save
    B->>D: Update user progress
    D-->>B: Confirmation
    B-->>F: Save successful
```
### 6.3 Performance and Scalability Considerations

- **Database Queries**: Spatial queries for railway data are optimized using PostGIS indexes
- **Caching**: Frequently accessed map tiles and user data may be cached
- **Real-time Updates**: WebSocket connections for simulation updates, with fallback to polling
- **Load Balancing**: Backend API can be scaled horizontally for multiple users
- **CDN**: Static assets and map tiles served via CDN for faster loading
136 docs/08_Concepts.md Normal file
@@ -0,0 +1,136 @@
# 8. Concepts

### 8.1 Domain Concepts

#### 8.1.1 Railway Network Model

The core domain concept is the railway network, consisting of:

- **Tracks**: Linear segments connecting geographical points, stored as spatial geometries
- **Stations**: Key points on the network where trains can stop, load/unload passengers or cargo
- **Trains**: Movable entities that follow routes along tracks according to schedules
- **Schedules**: Time-based plans for train movements and operations

Railway networks are user-created and managed, built upon real-world geographical data from OpenStreetMap.

#### 8.1.2 Game Economy

Resource management drives gameplay:

- **Currency**: Earned through network operations and achievements
- **Resources**: Required for building and upgrading railway components
- **Scoring**: Based on network efficiency, passenger satisfaction, and operational success

#### 8.1.3 Simulation Engine

Dynamic simulation of train operations:

- **Route Calculation**: Pathfinding algorithms to determine optimal train routes
- **Schedule Optimization**: Balancing train frequencies with network capacity
- **Real-time Updates**: Live position tracking and status reporting
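The route calculation bullet above can be made concrete with a compact pathfinding sketch: Dijkstra's algorithm over a station graph whose edge weights are track lengths. The station names and distances below are made up for illustration.

```python
import heapq

def shortest_route(graph: dict[str, list[tuple[str, float]]],
                   start: str, goal: str):
    """Dijkstra's algorithm: return (total_distance, station_list) or None."""
    queue = [(0.0, start, [start])]
    visited: set[str] = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (dist + weight, neighbor, path + [neighbor]))
    return None  # goal unreachable from start

network = {
    "Central": [("North", 4.0), ("East", 2.0)],
    "East": [("North", 1.0)],
    "North": [],
}
assert shortest_route(network, "Central", "North") == (3.0, ["Central", "East", "North"])
```

A production engine would likely add turn restrictions, track capacity, and schedule constraints on top of this basic shortest-path core.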
#### 8.1.4 Network Snapshot Contract

- **Shared Models**: The backend uses immutable Pydantic models with camelCase aliases that mirror TypeScript interfaces in `frontend/src/types/domain.ts`.
- **Snapshot Service**: Until persistence exists, a service synthesises demo stations, tracks, and trains to keep the client workflow functional.
- **Client Hook**: `useNetworkSnapshot` orchestrates fetch status (idle/loading/success/error) and pushes data into the React view layer.
### 8.2 Architectural Concepts

#### 8.2.1 Client-Server Architecture

- **Frontend**: Browser-based React SPA handling user interactions and UI rendering
- **Backend**: RESTful API server processing business logic and data operations
- **Separation of Concerns**: Clear boundaries between presentation, business logic, and data layers

#### 8.2.2 Spatial Data Handling

- **PostGIS Integration**: Extension of PostgreSQL for geographical and spatial operations
- **Coordinate Systems**: Use of standard geographical projections (e.g., WGS84)
- **Spatial Queries**: Efficient querying of railway elements within geographical bounds

#### 8.2.3 Real-time Communication

- **WebSocket Protocol**: For live updates during train simulations
- **Fallback Mechanisms**: Polling as alternative when WebSockets unavailable
- **Event-Driven Updates**: Push notifications for game state changes
#### 8.2.4 OSM Track Harvesting Policy

- **Railway Types**: Importer requests `rail`, `light_rail`, `subway`, `tram`, `narrow_gauge`, plus `construction` and `disused` variants to capture build-state metadata.
- **Service Filters**: `service` tags such as `yard`, `siding`, `spur`, `crossover`, `industrial`, or `military` are excluded to focus on mainline traffic.
- **Usage Filters**: Ways flagged with `usage=military` or `usage=tourism` are skipped; unspecified usage defaults to accepted.
- **Geometry Guardrails**: Segments shorter than 75 meters are discarded and track endpoints must snap to an existing station within 350 meters or the segment is ignored during loading.
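The tag filters above can be expressed as a single predicate over an OSM way's tags; a sketch of how the importer might apply them (the function name and the exact handling of `construction`/`disused` companion tags are assumptions):

```python
ACCEPTED = {"rail", "light_rail", "subway", "tram", "narrow_gauge"}
EXCLUDED_SERVICE = {"yard", "siding", "spur", "crossover", "industrial", "military"}
EXCLUDED_USAGE = {"military", "tourism"}

def accepts_way(tags: dict[str, str]) -> bool:
    """Apply the railway-type, service, and usage filters to one OSM way's tags."""
    railway = tags.get("railway", "")
    # construction/disused ways usually name the underlying type in a companion tag,
    # e.g. railway=construction + construction=light_rail
    base = tags.get("construction") or tags.get("disused") or railway
    if base not in ACCEPTED:
        return False
    if tags.get("service") in EXCLUDED_SERVICE:
        return False
    # unspecified usage defaults to accepted
    return tags.get("usage") not in EXCLUDED_USAGE

assert accepts_way({"railway": "rail"})
assert not accepts_way({"railway": "rail", "service": "yard"})
assert not accepts_way({"railway": "rail", "usage": "military"})
assert accepts_way({"railway": "construction", "construction": "light_rail"})
```

The length and station-snapping guardrails run later, during loading, once node geometry is available.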
### 8.3 User Interface Concepts

#### 8.3.1 Component-Based Architecture

- **React Components**: Modular, reusable UI elements
- **State Management**: Centralized state using Redux or Context API
- **Responsive Design**: Adaptive layouts for various screen sizes and devices

#### 8.3.2 Map Interaction

- **Leaflet Integration**: Interactive mapping library for geographical visualization
- **Layer Management**: Overlaying user railways on base OpenStreetMap tiles
- **Gesture Handling**: Mouse, keyboard, and touch interactions for map navigation and editing

#### 8.3.3 Game Interface Patterns

- **Toolbar**: Contextual tools for building and editing railway elements
- **Modal Dialogs**: For configuration, confirmation, and detailed information display
- **Dashboard**: Overview of user progress, resources, and network statistics

### 8.4 Security Concepts

#### 8.4.1 Authentication and Authorization

- **JWT Tokens**: Stateless authentication for API requests
- **OAuth Integration**: Support for third-party authentication providers
- **Role-Based Access**: Differentiated permissions for users and administrators

#### 8.4.2 Data Protection

- **Input Validation**: Sanitization of all user inputs to prevent injection attacks
- **HTTPS Encryption**: Secure communication between client and server
- **Data Privacy**: Compliance with privacy regulations for user data handling
### 8.5 Persistence Concepts

#### 8.5.1 Database Design

- **Schema Overview**: Core tables include `users`, `stations`, `tracks`, `trains`, and `train_schedules`, each backed by UUID primary keys and timestamp metadata for auditing.
- **Users**: Holds account metadata (`username`, `email`, `role`, hashed password) with JSON-ready preference storage and soft defaults for player roles.
- **Stations**: Stores OpenStreetMap references, station codes, and a `POINT` geometry (`SRID 4326`) to support spatial queries; a GiST index accelerates proximity searches.
- **Tracks**: Models line segments between stations using `LINESTRING` geometry plus operational attributes (length, speed limits, status, bidirectionality) and a uniqueness constraint on station pairs.
- **Trains & Schedules**: Captures rolling stock capabilities and their ordered stop plans (`sequence_index`, arrival/departure timestamps, dwell times) with cascading foreign keys for clean deletions.
- **Spatial Extensions**: Alembic migrations provision `postgis` and `pgcrypto` extensions; geometry columns use GeoAlchemy2 bindings for seamless ORM interactions.
#### 8.5.2 Data Access Patterns

- **ORM Layer**: SQLAlchemy 2.0 declarative models (see `backend/app/db/models.py`) expose typed entities that will feed repository and service layers.
- **Session Management**: A centralized engine/session factory (`backend/app/db/session.py`) pulls the database URL from environment-managed settings and keeps pooling under application control.
- **Environment Separation**: `.env` configuration exposes `DATABASE_URL`, `TEST_DATABASE_URL`, and `ALEMBIC_DATABASE_URL`, allowing the runtime, tests, and migration tooling to target different Postgres instances.
- **Schema Evolution**: Alembic configuration (`backend/alembic.ini`, `backend/migrations/`) provides repeatable migrations; the initial revision creates the PostGIS-enabled schema and GiST indexes.
- **Transaction Management**: Service-layer dependencies will acquire short-lived sessions (`SessionLocal`), ensuring explicit commit/rollback boundaries around game operations.
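The transaction boundary described in the last bullet can be sketched as a small context manager. The project's real factory is SQLAlchemy's `SessionLocal`; the generic `session_factory` parameter below is a stand-in so the pattern stands alone.

```python
from contextlib import contextmanager

@contextmanager
def session_scope(session_factory):
    """Open a short-lived session with explicit commit/rollback boundaries."""
    session = session_factory()
    try:
        yield session
        session.commit()      # commit only if the block completed without error
    except Exception:
        session.rollback()    # undo partial work on any failure
        raise
    finally:
        session.close()       # always release the connection back to the pool
```

A service dependency would then wrap each game operation in `with session_scope(SessionLocal) as session: ...`, so no request can leak an open transaction.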
### 8.6 Development and Deployment Concepts

#### 8.6.1 Testing Strategy

- **Unit Testing**: Individual component and function testing
- **Integration Testing**: API endpoint and database interaction validation
- **End-to-End Testing**: Complete user workflow verification across browsers

#### 8.6.2 Deployment Pipeline

- **Containerization**: Docker for consistent environments
- **CI/CD**: Automated testing and deployment workflows
- **Static Hosting**: CDN-based delivery of frontend assets

#### 8.6.3 Performance Optimization

- **Lazy Loading**: On-demand loading of components and data
- **Caching Layers**: Redis for frequently accessed data
- **Asset Optimization**: Minification and compression of static resources
@@ -70,8 +70,8 @@ The system interacts with:

- User registration and authentication
- Railway network building and management
- Train scheduling and simulation
- Map visualization and interaction
- Leaderboards and user profiles

**Out of Scope:**
@@ -100,458 +100,52 @@ The system interacts with:

- Browser-native implementation for broad accessibility
- Spatial database for efficient geographical queries
- Offline-friendly OSM ingestion pipeline that uses dedicated CLI scripts to export/import stations and tracks before seeding the database
- Modular architecture allowing for future extensions (e.g., multiplayer)
## 5. Building Block View

### 5.1 Whitebox Overall System

The Rail Game system is structured as a client-server architecture with the following top-level building blocks:

- **Frontend Application**: Browser-based React SPA handling user interface and interactions
- **Backend API**: Python-based RESTful API server managing game logic and data access
- **Database**: PostgreSQL with PostGIS for persistent storage and spatial queries
- **External Services**: OpenStreetMap and other third-party APIs for map data and additional features

```mermaid
graph TD
    A[Frontend Application] -->|REST API| B[Backend API]
    B -->|SQL Queries| C[Database]
    B -->|API Calls| D[External Services]
    A -->|Map Tiles| D
```
### 5.2 Level 1 Building Blocks

#### 5.2.1 Frontend Application

**Responsibility**: Provides the user interface for railway network building, management, and visualization.

**Interfaces**:

- User interactions via browser
- RESTful API calls to Backend API
- Integration with Leaflet for map rendering

**Key Components**:

- Map View: Displays railway networks and allows interaction
- Network Builder: Tools for creating and editing railway tracks and stations
- Dashboard: User profile, resources, and game statistics
- Authentication UI: Login, registration, and profile management
#### 5.2.2 Backend API

**Responsibility**: Handles game logic, data processing, and serves as the interface between frontend and database.

**Interfaces**:

- RESTful HTTP endpoints for frontend communication
- Database connections via SQLAlchemy ORM
- Potential WebSocket connections for real-time updates

**Key Components**:

- User Management: Authentication, profiles, and sessions
- Railway Engine: Logic for network building, route calculation, and scheduling
- Game Logic: Resource management, scoring, and achievements
- Data Access Layer: Abstraction for database operations
#### 5.2.3 Database

**Responsibility**: Persistent storage of user data, railway networks, and game state.

**Interfaces**:

- SQL connections from Backend API
- Spatial queries via PostGIS extensions

**Key Components**:

- User Schema: Accounts, profiles, and authentication data
- Railway Schema: Tracks, stations, trains, and schedules
- Game Schema: Resources, achievements, and leaderboards
#### 5.2.4 External Services

**Responsibility**: Provides external data sources and integrations.

**Interfaces**:

- API calls from Backend or Frontend
- Data feeds for map tiles, geographical information, and real-time data

**Key Components**:

- OpenStreetMap: Source of map tiles and railway data
- Authentication Providers: OAuth integrations (e.g., Google, GitHub)
- Analytics Services: User tracking and performance monitoring
### 5.3 Level 2 Building Blocks

#### 5.3.1 Frontend Components

```mermaid
graph TD
    A[Map Component] -->|Leaflet| B[Toolbar Component]
    A -->|Leaflet| C[Modal Components]
    A -->|Redux| D[State Management]
```

- **Map Component**: React Leaflet-based map showing OpenStreetMap tiles with station markers and track polylines drawn from the shared network snapshot models
- **Toolbar Component**: Tools for building tracks, placing stations, and managing trains
- **Modal Components**: Dialogs for settings, confirmations, and detailed views
- **State Management**: Redux store for game state and UI state
#### 5.3.2 Backend Modules

```mermaid
graph TD
    A[API Layer] -->|REST Endpoints| B[Health Router]
    A -->|REST Endpoints| C[Network Router]
    C -->|Domain Models| D[Network Service]
    D -->|Shared Schemas| E[Frontend Data Contracts]
```

- **Health Module**: Lightweight readiness probes used by infrastructure checks.
- **Network Module**: Serves read-only snapshots of stations, tracks, and trains using shared domain models (camelCase aliases for client compatibility).
- **Authentication Module**: JWT-based user registration, authentication, and authorization. The current prototype supports on-the-fly account creation backed by an in-memory user store and issues short-lived access tokens to validate the client flow end-to-end.
- **Railway Calculation Module**: Algorithms for route optimization and scheduling (planned).
- **Resource Management Module**: Logic for game economy and progression (planned).
- **Real-time Module**: WebSocket handlers for live updates (if implemented).
#### 5.3.3 Database Tables

- **Users Table**: User accounts and profile information
- **Railways Table**: User-created railway networks (spatial data)
- **Trains Table**: Train configurations and schedules
- **Stations Table**: Station locations and properties (spatial data)
- **Achievements Table**: User progress and leaderboard data
### 5.4 Project Directory Structure

The repository will be organized to mirror the logical architecture and isolate concerns between frontend, backend, infrastructure, and shared assets.

```text
rail-game/
|-- backend/
|   |-- app/
|   |   |-- api/        # FastAPI route handlers and request lifecycles
|   |   |-- core/       # Configuration, startup hooks, cross-cutting utilities
|   |   |-- models/     # SQLAlchemy models, Pydantic schemas, migrations helpers
|   |   |-- services/   # Domain services for scheduling, routing, resource logic
|   |   `-- websocket/  # Real-time transport adapters and event handlers
|   |-- tests/          # Backend unit, integration, and contract tests
|   `-- requirements/   # Dependency manifests and lockfiles per environment
|-- frontend/
|   |-- public/         # Static assets served without processing
|   |-- src/
|   |   |-- components/ # Reusable React UI components and widgets
|   |   |-- hooks/      # Custom hooks for map interaction, data fetching, state sync
|   |   |-- pages/      # Route-level views composing feature modules
|   |   |-- state/      # Redux/Context stores, slices, and middleware
|   |   |-- styles/     # Global stylesheets, design tokens, CSS modules
|   |   `-- utils/      # Frontend-only helpers for formatting and calculations
|   `-- tests/          # Component, store, and integration tests (Jest/React Testing Library)
|-- docs/               # Architecture docs, ADRs, onboarding guides
|-- infra/              # Docker, Terraform, CI/CD workflows, deployment manifests
|-- scripts/            # Automation for setup, linting, database tasks, data imports
|-- data/               # Seed datasets, fixtures, export/import scripts (kept out of VCS if large)
`-- tests/              # Cross-cutting end-to-end suites and shared test utilities
```

Shared code that spans application layers should be surfaced through well-defined APIs within `backend/app/services` or exposed via frontend data contracts to keep coupling low. Infrastructure automation and CI/CD assets remain isolated under `infra/` to support multiple deployment targets.

The detailed building block view now lives in [05_Building_Block_View.md](./05_Building_Block_View.md).
Runtime scenarios, sequence diagrams, and performance considerations are documented in [06_Runtime_View.md](./06_Runtime_View.md).
## 7. Deployment View

### 7.1 Infrastructure Overview

- **Application Containers**: Backend (FastAPI + Uvicorn) and frontend (Vite/Node) each ship with dedicated Dockerfiles under `backend/` and `frontend/`.
- **Data Services**: PostgreSQL with PostGIS and Redis run as managed containers; volumes persist database state between restarts.
- **Reverse Proxy**: An Nginx gateway routes `/api` traffic to the backend service and serves built frontend assets in production deployments.

### 7.2 Local Development Topology

- `docker-compose.yml` orchestrates backend, frontend, Postgres/PostGIS, Redis, and Nginx for an end-to-end sandbox.
- Developers may alternatively run the frontend and backend directly via `npm run dev` and `uvicorn` while relying on the compose-managed data services.
- Environment variables are loaded from the repository root `.env` file (not tracked); a sample configuration lives at `.env.example`.

### 7.3 Continuous Integration & Delivery

- **CI Pipelines**: GitHub Actions lint and format both stacks, execute `pytest`, and run Playwright login flows on every pull request.
- **Build Artifacts**: Successful pipelines publish container images tagged with the commit SHA to the project registry (planned).
- **Promotion Strategy**: Main branch builds deploy to a shared staging environment; tagged releases promote to production once smoke tests pass (planned).

### 7.4 Environment Configuration

- **Secrets Management**: Local development uses `.env` files; higher environments will source secrets from the cloud provider's vault service (e.g., AWS Secrets Manager) with runtime injection.
- **Database Migration**: Alembic migrations execute during deployment rollout to guarantee schema alignment before application start.
- **Feature Flags**: Environment-specific toggles (planned) will allow gradual rollout of simulation and multiplayer features.

### 7.5 Observability and Operations

- **Logging**: Structured JSON logs emitted from FastAPI are shipped to centralized storage (e.g., OpenSearch) via Fluent Bit sidecars (planned).
- **Metrics**: Prometheus exporters for application and database metrics inform Grafana dashboards tracking request rate, latency, and simulation throughput.
- **Alerting**: PagerDuty escalation policies will trigger on error budgets and availability SLO breaches once production traffic begins.
## 8. Concepts

### 8.1 Domain Concepts

#### 8.1.1 Railway Network Model

The core domain concept is the railway network, consisting of:

- **Tracks**: Linear segments connecting geographical points, stored as spatial geometries
- **Stations**: Key points on the network where trains can stop to load and unload passengers or cargo
- **Trains**: Movable entities that follow routes along tracks according to schedules
- **Schedules**: Time-based plans for train movements and operations

Railway networks are user-created and managed, built upon real-world geographical data from OpenStreetMap.

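The entities above can be sketched as plain data types. The names and fields below are illustrative assumptions for this document, not the project's actual backend models:

```python
from dataclasses import dataclass, field

# Illustrative sketch of the core domain entities; field names are
# assumptions and do not mirror the real backend schema.

@dataclass(frozen=True)
class Station:
    id: str
    name: str
    latitude: float
    longitude: float

@dataclass(frozen=True)
class Track:
    id: str
    start_station_id: str
    end_station_id: str
    length_meters: float

@dataclass(frozen=True)
class ScheduleStop:
    station_id: str
    sequence_index: int
    dwell_seconds: int = 0

@dataclass
class Network:
    stations: dict[str, Station] = field(default_factory=dict)
    tracks: list[Track] = field(default_factory=list)

    def add_station(self, station: Station) -> None:
        self.stations[station.id] = station

    def neighbours(self, station_id: str) -> list[str]:
        # Tracks are treated as undirected edges in this sketch.
        out = []
        for t in self.tracks:
            if t.start_station_id == station_id:
                out.append(t.end_station_id)
            elif t.end_station_id == station_id:
                out.append(t.start_station_id)
        return out
```

Keeping the entities immutable mirrors the snapshot-style contract the client consumes: the network container mutates, the records do not.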
#### 8.1.2 Game Economy

Resource management drives gameplay:

- **Currency**: Earned through network operations and achievements
- **Resources**: Required for building and upgrading railway components
- **Scoring**: Based on network efficiency, passenger satisfaction, and operational success

#### 8.1.3 Simulation Engine

Dynamic simulation of train operations:

- **Route Calculation**: Pathfinding algorithms determine optimal train routes
- **Schedule Optimization**: Balancing train frequencies against network capacity
- **Real-time Updates**: Live position tracking and status reporting

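The route-calculation idea can be sketched with a standard shortest-path search. This is a generic Dijkstra over track lengths, not the project's actual algorithm, whose cost function may also weigh speed limits, capacity, or congestion:

```python
import heapq

def shortest_route(adjacency, start, goal):
    """Dijkstra over an adjacency map {station: [(neighbour, length_m), ...]}."""
    frontier = [(0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if cost > best.get(node, float("inf")):
            continue  # stale frontier entry, a shorter path was already found
        for neighbour, length in adjacency.get(node, []):
            new_cost = cost + length
            if new_cost < best.get(neighbour, float("inf")):
                best[neighbour] = new_cost
                heapq.heappush(frontier, (new_cost, neighbour, path + [neighbour]))
    return None  # no path exists
```

With tracks alpha-bravo (1200 m) and bravo-charlie (1500 m) plus a direct alpha-charlie track of 5000 m, the two-hop route (2700 m) wins.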
#### 8.1.4 Network Snapshot Contract

- **Shared Models**: The backend uses immutable Pydantic models with camelCase aliases that mirror the TypeScript interfaces in `frontend/src/types/domain.ts`.
- **Snapshot Service**: Until persistence exists, a service synthesises demo stations, tracks, and trains to keep the client workflow functional.
- **Client Hook**: `useNetworkSnapshot` orchestrates fetch status (idle/loading/success/error) and pushes data into the React view layer.

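The camelCase aliasing that keeps the Python models and the TypeScript interfaces in sync boils down to a simple name mapping. A stdlib-only sketch of that rule (the real backend would delegate this to Pydantic's alias generator rather than hand-roll it):

```python
def to_camel(name: str) -> str:
    # snake_case -> camelCase: the alias rule the snapshot contract relies on.
    head, *rest = name.split("_")
    return head + "".join(part.capitalize() for part in rest)

def serialize(record: dict) -> dict:
    """Rename keys the way a camelCase-aliased model would on export."""
    return {to_camel(key): value for key, value in record.items()}
```

So a backend field like `start_station_id` arrives at the client as `startStationId`, matching the TypeScript type verbatim.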
### 8.2 Architectural Concepts

#### 8.2.1 Client-Server Architecture

- **Frontend**: Browser-based React SPA handling user interactions and UI rendering
- **Backend**: RESTful API server processing business logic and data operations
- **Separation of Concerns**: Clear boundaries between presentation, business logic, and data layers

#### 8.2.2 Spatial Data Handling

- **PostGIS Integration**: Extension of PostgreSQL for geographical and spatial operations
- **Coordinate Systems**: Use of standard geographical projections (e.g., WGS84)
- **Spatial Queries**: Efficient querying of railway elements within geographical bounds

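Behind a spatial query such as "stations within N metres" sits great-circle distance on WGS84 coordinates. PostGIS computes this in the database; the pure-Python sketch below only illustrates the underlying maths:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two WGS84 points."""
    r = 6_371_000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (
        math.sin(dphi / 2) ** 2
        + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    )
    return 2 * r * math.asin(math.sqrt(a))

def stations_within(stations, lat, lon, radius_m):
    # Naive linear scan; a GiST index makes the equivalent SQL query fast.
    return [
        s for s in stations
        if haversine_m(s["lat"], s["lon"], lat, lon) <= radius_m
    ]
```

One degree of longitude at the equator is roughly 111 km, which gives a quick sanity check for the formula.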
#### 8.2.3 Real-time Communication

- **WebSocket Protocol**: For live updates during train simulations
- **Fallback Mechanisms**: Polling as an alternative when WebSockets are unavailable
- **Event-Driven Updates**: Push notifications for game state changes

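The event-driven shape (publish once, let every subscriber drain its own queue) can be sketched in-process. This toy bus stands in for the WebSocket/polling plumbing and is not the project's actual transport code:

```python
from collections import defaultdict, deque

class EventBus:
    """In-process stand-in for the push channel (illustrative only)."""

    def __init__(self) -> None:
        self._queues: dict[str, deque] = defaultdict(deque)

    def subscribe(self, client_id: str) -> None:
        self._queues[client_id]  # touching the key creates an empty queue

    def publish(self, event: dict) -> None:
        # Fan the event out to every subscriber's pending queue.
        for queue in self._queues.values():
            queue.append(event)

    def poll(self, client_id: str) -> list[dict]:
        # Drain pending events: what the polling fallback would fetch.
        queue = self._queues[client_id]
        events = list(queue)
        queue.clear()
        return events
```

The same publish path can feed a WebSocket (push each queued event as it arrives) or the polling endpoint (drain on request), which is what makes the fallback cheap.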
### 8.3 User Interface Concepts

#### 8.3.1 Component-Based Architecture

- **React Components**: Modular, reusable UI elements
- **State Management**: Centralized state using Redux or the Context API
- **Responsive Design**: Adaptive layouts for various screen sizes and devices

#### 8.3.2 Map Interaction

- **Leaflet Integration**: Interactive mapping library for geographical visualization
- **Layer Management**: Overlaying user railways on base OpenStreetMap tiles
- **Gesture Handling**: Mouse, keyboard, and touch interactions for map navigation and editing

#### 8.3.3 Game Interface Patterns

- **Toolbar**: Contextual tools for building and editing railway elements
- **Modal Dialogs**: For configuration, confirmation, and detailed information display
- **Dashboard**: Overview of user progress, resources, and network statistics

### 8.4 Security Concepts

#### 8.4.1 Authentication and Authorization

- **JWT Tokens**: Stateless authentication for API requests
- **OAuth Integration**: Support for third-party authentication providers
- **Role-Based Access**: Differentiated permissions for users and administrators

#### 8.4.2 Data Protection

- **Input Validation**: Sanitization of all user inputs to prevent injection attacks
- **HTTPS Encryption**: Secure communication between client and server
- **Data Privacy**: Compliance with privacy regulations for user data handling

### 8.5 Persistence Concepts

#### 8.5.1 Database Design

- **Schema Overview**: Core tables include `users`, `stations`, `tracks`, `trains`, and `train_schedules`, each backed by UUID primary keys and timestamp metadata for auditing.
- **Users**: Holds account metadata (`username`, `email`, `role`, hashed password) with JSON-ready preference storage and soft defaults for player roles.
- **Stations**: Stores OpenStreetMap references, station codes, and a `POINT` geometry (`SRID 4326`) to support spatial queries; a GiST index accelerates proximity searches.
- **Tracks**: Models line segments between stations using `LINESTRING` geometry plus operational attributes (length, speed limits, status, bidirectionality) and a uniqueness constraint on station pairs.
- **Trains & Schedules**: Captures rolling-stock capabilities and their ordered stop plans (`sequence_index`, arrival/departure timestamps, dwell times) with cascading foreign keys for clean deletions.
- **Spatial Extensions**: Alembic migrations provision the `postgis` and `pgcrypto` extensions; geometry columns use GeoAlchemy2 bindings for seamless ORM interactions.

#### 8.5.2 Data Access Patterns

- **ORM Layer**: SQLAlchemy 2.0 declarative models (see `backend/app/db/models.py`) expose typed entities that will feed repository and service layers.
- **Session Management**: A centralized engine/session factory (`backend/app/db/session.py`) pulls the database URL from environment-managed settings and keeps pooling under application control.
- **Environment Separation**: `.env` configuration exposes `DATABASE_URL`, `TEST_DATABASE_URL`, and `ALEMBIC_DATABASE_URL`, allowing the runtime, tests, and migration tooling to target different Postgres instances.
- **Schema Evolution**: Alembic configuration (`backend/alembic.ini`, `backend/migrations/`) provides repeatable migrations; the initial revision creates the PostGIS-enabled schema and GiST indexes.
- **Transaction Management**: Service-layer dependencies will acquire short-lived sessions (`SessionLocal`), ensuring explicit commit/rollback boundaries around game operations.

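The short-lived-session pattern with explicit commit/rollback boundaries is easy to sketch as a context manager. Here `session_factory` is any callable returning an object with `commit`/`rollback`/`close`; a SQLAlchemy `SessionLocal` fits that shape, and the `FakeSession` exists only to make the sketch self-contained:

```python
from contextlib import contextmanager

@contextmanager
def session_scope(session_factory):
    session = session_factory()
    try:
        yield session
        session.commit()  # explicit commit boundary around the unit of work
    except Exception:
        session.rollback()  # any failure rolls the whole unit back
        raise
    finally:
        session.close()

class FakeSession:
    """Records lifecycle calls so the pattern can be demonstrated."""

    def __init__(self):
        self.events = []

    def commit(self):
        self.events.append("commit")

    def rollback(self):
        self.events.append("rollback")

    def close(self):
        self.events.append("close")
```

A service-layer call then wraps one game operation per scope, so partial writes never leak out of a failed request.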
### 8.6 Development and Deployment Concepts

#### 8.6.1 Testing Strategy

- **Unit Testing**: Individual component and function testing
- **Integration Testing**: API endpoint and database interaction validation
- **End-to-End Testing**: Complete user workflow verification across browsers

#### 8.6.2 Deployment Pipeline

- **Containerization**: Docker for consistent environments
- **CI/CD**: Automated testing and deployment workflows
- **Static Hosting**: CDN-based delivery of frontend assets

#### 8.6.3 Performance Optimization

- **Lazy Loading**: On-demand loading of components and data
- **Caching Layers**: Redis for frequently accessed data
- **Asset Optimization**: Minification and compression of static resources

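The core contract of such a caching layer (values expire after a time-to-live) can be sketched without Redis. The injectable clock below is only there so expiry can be exercised without sleeping; a production cache would also bound its size:

```python
import time

class TTLCache:
    """Minimal time-to-live cache, a stand-in for the Redis layer."""

    def __init__(self, ttl_s: float, clock=time.monotonic):
        self._ttl = ttl_s
        self._clock = clock
        self._store: dict = {}

    def set(self, key, value) -> None:
        self._store[key] = (value, self._clock() + self._ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if self._clock() >= expires_at:
            del self._store[key]  # lazy eviction of expired entries
            return default
        return value
```

Snapshot responses are a natural fit: serve the cached copy for a few seconds, then rebuild it on the next miss.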
Concept catalogs and supporting models are maintained in [08_Concepts.md](./08_Concepts.md).

## 9. Design Decisions

frontend/package-lock.json: 725 lines changed (generated file; diff suppressed because it is too large).
@@ -9,6 +9,7 @@
    "preview": "vite preview",
    "lint": "eslint \"src/**/*.{ts,tsx}\"",
    "format": "prettier --write \"src/**/*.{ts,tsx,css}\"",
    "test": "vitest run",
    "test:e2e": "playwright test"
  },
  "dependencies": {
@@ -34,6 +35,7 @@
    "eslint-plugin-react-hooks": "^4.6.2",
    "prettier": "^3.3.3",
    "typescript": "^5.5.3",
    "vite": "^5.4.0"
    "vite": "^5.4.0",
    "vitest": "^1.6.0"
  }
}
@@ -1,14 +1,135 @@
import './styles/global.css';

import type { LatLngExpression } from 'leaflet';
import { useEffect, useMemo, useState } from 'react';

import { LoginForm } from './components/auth/LoginForm';
import { NetworkMap } from './components/map/NetworkMap';
import { useNetworkSnapshot } from './hooks/useNetworkSnapshot';
import { useAuth } from './state/AuthContext';
import type { Station } from './types/domain';
import { buildTrackAdjacency, computeRoute } from './utils/route';

function App(): JSX.Element {
  const { token, user, status: authStatus, logout } = useAuth();
  const isAuthenticated = authStatus === 'authenticated' && token !== null;
  const { data, status, error } = useNetworkSnapshot(isAuthenticated ? token : null);
  const [focusedStationId, setFocusedStationId] = useState<string | null>(null);
  const [routeSelection, setRouteSelection] = useState<{
    startId: string | null;
    endId: string | null;
  }>({ startId: null, endId: null });
  const [selectedTrackId, setSelectedTrackId] = useState<string | null>(null);

  useEffect(() => {
    if (status !== 'success' || !data?.stations.length) {
      setFocusedStationId(null);
      setRouteSelection({ startId: null, endId: null });
      setSelectedTrackId(null);
      return;
    }

    if (!focusedStationId || !hasStation(data.stations, focusedStationId)) {
      setFocusedStationId(data.stations[0].id);
    }
  }, [status, data, focusedStationId]);

  useEffect(() => {
    if (status !== 'success' || !data) {
      return;
    }

    setRouteSelection((current) => {
      const startExists = current.startId
        ? hasStation(data.stations, current.startId)
        : false;
      const endExists = current.endId
        ? hasStation(data.stations, current.endId)
        : false;

      return {
        startId: startExists ? current.startId : null,
        endId: endExists ? current.endId : null,
      };
    });
  }, [status, data]);

  const stationById = useMemo(() => {
    if (!data) {
      return new Map<string, Station>();
    }
    const lookup = new Map<string, Station>();
    for (const station of data.stations) {
      lookup.set(station.id, station);
    }
    return lookup;
  }, [data]);

  const trackAdjacency = useMemo(
    () => buildTrackAdjacency(data ? data.tracks : []),
    [data]
  );

  const routeComputation = useMemo(
    () =>
      computeRoute({
        startId: routeSelection.startId,
        endId: routeSelection.endId,
        stationById,
        adjacency: trackAdjacency,
      }),
    [routeSelection, stationById, trackAdjacency]
  );

  const routeSegments = useMemo<LatLngExpression[][]>(() => {
    return routeComputation.segments.map((segment) =>
      segment.map((pair) => [pair[0], pair[1]] as LatLngExpression)
    );
  }, [routeComputation.segments]);

  const focusedStation = useMemo(() => {
    if (!data || !focusedStationId) {
      return null;
    }
    return stationById.get(focusedStationId) ?? null;
  }, [data, focusedStationId, stationById]);

  const selectedTrack = useMemo(() => {
    if (!data || !selectedTrackId) {
      return null;
    }
    return data.tracks.find((track) => track.id === selectedTrackId) ?? null;
  }, [data, selectedTrackId]);

  const handleStationSelection = (stationId: string) => {
    setFocusedStationId(stationId);
    setSelectedTrackId(null);
    setRouteSelection((current) => {
      if (!current.startId || (current.startId && current.endId)) {
        return { startId: stationId, endId: null };
      }

      if (current.startId === stationId) {
        return { startId: stationId, endId: null };
      }

      return { startId: current.startId, endId: stationId };
    });
  };

  const clearRouteSelection = () => {
    setRouteSelection({ startId: null, endId: null });
  };

  const handleCreateTrack = () => {
    if (!routeSelection.startId || !routeSelection.endId) {
      return;
    }
    // TODO: Implement track creation API call
    alert(
      `Creating track between ${stationById.get(routeSelection.startId)?.name} and ${stationById.get(routeSelection.endId)?.name}`
    );
  };

  return (
    <div className="app-shell">
@@ -39,7 +160,87 @@ function App(): JSX.Element {
{status === 'success' && data && (
  <div className="snapshot-layout">
    <div className="map-wrapper">
      <NetworkMap snapshot={data} />
      <NetworkMap
        snapshot={data}
        focusedStationId={focusedStationId}
        startStationId={routeSelection.startId}
        endStationId={routeSelection.endId}
        routeSegments={routeSegments}
        selectedTrackId={selectedTrackId}
        onStationClick={handleStationSelection}
        onTrackClick={setSelectedTrackId}
      />
    </div>
    <div className="route-panel">
      <div className="route-panel__header">
        <h3>Route Selection</h3>
        <button
          type="button"
          className="ghost-button"
          onClick={clearRouteSelection}
          disabled={!routeSelection.startId && !routeSelection.endId}
        >
          Clear
        </button>
      </div>
      <p className="route-panel__hint">
        Click a station to set the origin, then click another station to
        preview the rail corridor between them.
      </p>
      <dl className="route-panel__meta">
        <div>
          <dt>Origin</dt>
          <dd>
            {routeSelection.startId
              ? (stationById.get(routeSelection.startId)?.name ??
                'Unknown station')
              : 'Choose a station'}
          </dd>
        </div>
        <div>
          <dt>Destination</dt>
          <dd>
            {routeSelection.endId
              ? (stationById.get(routeSelection.endId)?.name ??
                'Unknown station')
              : 'Choose a station'}
          </dd>
        </div>
        <div>
          <dt>Estimated Length</dt>
          <dd>
            {routeComputation.totalLength !== null
              ? `${(routeComputation.totalLength / 1000).toFixed(2)} km`
              : 'N/A'}
          </dd>
        </div>
      </dl>
      {routeComputation.error && (
        <p className="route-panel__error">{routeComputation.error}</p>
      )}
      {!routeComputation.error && routeComputation.stations && (
        <div className="route-panel__path">
          <span>Path:</span>
          <ol>
            {routeComputation.stations.map((station) => (
              <li key={`route-station-${station.id}`}>{station.name}</li>
            ))}
          </ol>
        </div>
      )}
      {routeSelection.startId &&
        routeSelection.endId &&
        routeComputation.error && (
          <div className="route-panel__actions">
            <button
              type="button"
              className="primary-button"
              onClick={handleCreateTrack}
            >
              Create Track
            </button>
          </div>
        )}
    </div>
    <div className="grid">
      <div>
@@ -47,8 +248,40 @@ function App(): JSX.Element {
<ul>
  {data.stations.map((station) => (
    <li key={station.id}>
      {station.name} ({station.latitude.toFixed(3)},{' '}
      {station.longitude.toFixed(3)})
      <button
        type="button"
        className={`station-list-item${
          station.id === focusedStationId
            ? ' station-list-item--selected'
            : ''
        }${
          station.id === routeSelection.startId
            ? ' station-list-item--start'
            : ''
        }${
          station.id === routeSelection.endId
            ? ' station-list-item--end'
            : ''
        }`}
        aria-pressed={station.id === focusedStationId}
        onClick={() => handleStationSelection(station.id)}
      >
        <span className="station-list-item__name">
          {station.name}
        </span>
        <span className="station-list-item__coords">
          {station.latitude.toFixed(3)},{' '}
          {station.longitude.toFixed(3)}
        </span>
        {station.id === routeSelection.startId && (
          <span className="station-list-item__badge">Origin</span>
        )}
        {station.id === routeSelection.endId && (
          <span className="station-list-item__badge">
            Destination
          </span>
        )}
      </button>
    </li>
  ))}
</ul>
@@ -70,12 +303,99 @@ function App(): JSX.Element {
{data.tracks.map((track) => (
  <li key={track.id}>
    {track.startStationId} → {track.endStationId} ·{' '}
    {(track.lengthMeters / 1000).toFixed(1)} km
    {track.lengthMeters > 0
      ? `${(track.lengthMeters / 1000).toFixed(1)} km`
      : 'N/A'}
  </li>
))}
</ul>
</div>
</div>
{focusedStation && (
  <div className="selected-station">
    <h3>Focused Station</h3>
    <dl>
      <div>
        <dt>Name</dt>
        <dd>{focusedStation.name}</dd>
      </div>
      <div>
        <dt>Coordinates</dt>
        <dd>
          {focusedStation.latitude.toFixed(5)},{' '}
          {focusedStation.longitude.toFixed(5)}
        </dd>
      </div>
      {focusedStation.code && (
        <div>
          <dt>Code</dt>
          <dd>{focusedStation.code}</dd>
        </div>
      )}
      {typeof focusedStation.elevationM === 'number' && (
        <div>
          <dt>Elevation</dt>
          <dd>{focusedStation.elevationM.toFixed(1)} m</dd>
        </div>
      )}
      {focusedStation.osmId && (
        <div>
          <dt>OSM ID</dt>
          <dd>{focusedStation.osmId}</dd>
        </div>
      )}
      <div>
        <dt>Status</dt>
        <dd>
          {(focusedStation.isActive ?? true) ? 'Active' : 'Inactive'}
        </dd>
      </div>
    </dl>
  </div>
)}
{selectedTrack && (
  <div className="selected-track">
    <h3>Selected Track</h3>
    <dl>
      <div>
        <dt>Start Station</dt>
        <dd>
          {stationById.get(selectedTrack.startStationId)?.name ??
            'Unknown'}
        </dd>
      </div>
      <div>
        <dt>End Station</dt>
        <dd>
          {stationById.get(selectedTrack.endStationId)?.name ??
            'Unknown'}
        </dd>
      </div>
      <div>
        <dt>Length</dt>
        <dd>
          {selectedTrack.lengthMeters > 0
            ? `${(selectedTrack.lengthMeters / 1000).toFixed(2)} km`
            : 'N/A'}
        </dd>
      </div>
      <div>
        <dt>Max Speed</dt>
        <dd>{selectedTrack.maxSpeedKph} km/h</dd>
      </div>
      {selectedTrack.status && (
        <div>
          <dt>Status</dt>
          <dd>{selectedTrack.status}</dd>
        </div>
      )}
      <div>
        <dt>Bidirectional</dt>
        <dd>{selectedTrack.isBidirectional ? 'Yes' : 'No'}</dd>
      </div>
    </dl>
  </div>
)}
</div>
)}
</section>
@@ -86,3 +406,7 @@ function App(): JSX.Element {
}

export default App;

function hasStation(stations: Station[], id: string): boolean {
  return stations.some((station) => station.id === id);
}
@@ -1,6 +1,13 @@
import type { LatLngBoundsExpression, LatLngExpression } from 'leaflet';
import { useMemo } from 'react';
import { CircleMarker, MapContainer, Polyline, TileLayer, Tooltip } from 'react-leaflet';
import { useEffect, useMemo } from 'react';
import {
  CircleMarker,
  MapContainer,
  Polyline,
  TileLayer,
  Tooltip,
  useMap,
} from 'react-leaflet';

import type { NetworkSnapshot } from '../../services/api';
@@ -8,6 +15,13 @@ import 'leaflet/dist/leaflet.css';

interface NetworkMapProps {
  readonly snapshot: NetworkSnapshot;
  readonly focusedStationId?: string | null;
  readonly startStationId?: string | null;
  readonly endStationId?: string | null;
  readonly routeSegments?: LatLngExpression[][];
  readonly selectedTrackId?: string | null;
  readonly onStationClick?: (stationId: string) => void;
  readonly onTrackClick?: (trackId: string) => void;
}

interface StationPosition {
@@ -18,7 +32,16 @@ interface StationPosition {

const DEFAULT_CENTER: LatLngExpression = [51.505, -0.09];

export function NetworkMap({ snapshot }: NetworkMapProps): JSX.Element {
export function NetworkMap({
  snapshot,
  focusedStationId,
  startStationId,
  endStationId,
  routeSegments = [],
  selectedTrackId,
  onStationClick,
  onTrackClick,
}: NetworkMapProps): JSX.Element {
  const stationPositions = useMemo<StationPosition[]>(() => {
    return snapshot.stations.map((station) => ({
      id: station.id,
@@ -38,6 +61,12 @@ export function NetworkMap({ snapshot }: NetworkMapProps): JSX.Element {
  const trackSegments = useMemo(() => {
    return snapshot.tracks
      .map((track) => {
        if (track.coordinates && track.coordinates.length >= 2) {
          return track.coordinates.map(
            (pair) => [pair[0], pair[1]] as LatLngExpression
          );
        }

        const start = stationLookup.get(track.startStationId);
        const end = stationLookup.get(track.endStationId);
        if (!start || !end) {
@@ -73,6 +102,13 @@ export function NetworkMap({ snapshot }: NetworkMapProps): JSX.Element {
    ] as LatLngBoundsExpression;
  }, [stationPositions]);

  const focusedPosition = useMemo(() => {
    if (!focusedStationId) {
      return null;
    }
    return stationLookup.get(focusedStationId) ?? null;
  }, [focusedStationId, stationLookup]);

  return (
    <MapContainer
      className="network-map"
@@ -84,19 +120,134 @@ export function NetworkMap({ snapshot }: NetworkMapProps): JSX.Element {
      attribution='© <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
      url="https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png"
    />
    {trackSegments.map((segment, index) => (
      <Polyline key={`track-${index}`} positions={segment} pathOptions={{ color: '#38bdf8', weight: 4 }} />
    {focusedPosition ? <StationFocus position={focusedPosition} /> : null}
    {trackSegments.map((segment, index) => {
      const track = snapshot.tracks[index];
      const isSelected = track.id === selectedTrackId;
      return (
        <Polyline
          key={`track-${track.id}`}
          positions={segment}
          pathOptions={{
            color: isSelected ? '#3b82f6' : '#334155',
            weight: isSelected ? 5 : 3,
            opacity: 0.8,
          }}
          eventHandlers={{
            click: () => {
              onTrackClick?.(track.id);
            },
          }}
        >
          <Tooltip>
            {track.startStationId} → {track.endStationId}
            <br />
            Length:{' '}
            {track.lengthMeters > 0
              ? `${(track.lengthMeters / 1000).toFixed(1)} km`
              : 'N/A'}
            <br />
            Max Speed: {track.maxSpeedKph} km/h
            {track.status && (
              <>
                <br />
                Status: {track.status}
              </>
            )}
          </Tooltip>
        </Polyline>
      );
    })}
    {routeSegments.map((segment, index) => (
      <Polyline
        key={`route-${index}`}
        positions={segment}
        pathOptions={{ color: '#facc15', weight: 6, opacity: 0.9 }}
      />
    ))}
    {stationPositions.map((station) => (
      <CircleMarker
        key={station.id}
        center={station.position}
        radius={6}
        pathOptions={{ color: '#f97316', fillColor: '#fed7aa', fillOpacity: 0.9 }}
        radius={station.id === focusedStationId ? 9 : 6}
        pathOptions={{
          color: resolveMarkerStroke(
            station.id,
            startStationId,
            endStationId,
            focusedStationId
          ),
          fillColor: resolveMarkerFill(
            station.id,
            startStationId,
            endStationId,
            focusedStationId
          ),
          fillOpacity: 0.96,
          weight: station.id === focusedStationId ? 3 : 1,
        }}
        eventHandlers={{
          click: () => {
            onStationClick?.(station.id);
          },
        }}
      >
        <Tooltip direction="top" offset={[0, -8]}>{station.name}</Tooltip>
        <Tooltip
          direction="top"
          offset={[0, -8]}
          permanent={station.id === focusedStationId}
          sticky
        >
          {station.name}
        </Tooltip>
      </CircleMarker>
    ))}
  </MapContainer>
);
}

function StationFocus({ position }: { position: LatLngExpression }): null {
  const map = useMap();

  useEffect(() => {
    map.panTo(position, { animate: true, duration: 0.6 });
  }, [map, position]);

  return null;
}

function resolveMarkerStroke(
  stationId: string,
  startStationId?: string | null,
  endStationId?: string | null,
  focusedStationId?: string | null
): string {
  if (stationId === startStationId) {
    return '#38bdf8';
  }
  if (stationId === endStationId) {
    return '#fb923c';
  }
  if (stationId === focusedStationId) {
    return '#22c55e';
  }
  return '#f97316';
}

function resolveMarkerFill(
  stationId: string,
  startStationId?: string | null,
  endStationId?: string | null,
  focusedStationId?: string | null
): string {
  if (stationId === startStationId) {
    return '#bae6fd';
  }
  if (stationId === endStationId) {
    return '#fed7aa';
  }
  if (stationId === focusedStationId) {
    return '#bbf7d0';
  }
  return '#ffe4c7';
}
@@ -86,6 +86,80 @@ body {
  padding: 0;
}

.station-list-item {
  width: 100%;
  display: flex;
  align-items: center;
  justify-content: space-between;
  gap: 0.75rem;
  border: 1px solid rgba(148, 163, 184, 0.35);
  background: rgba(15, 23, 42, 0.65);
  border-radius: 10px;
  padding: 0.6rem 0.75rem;
  color: rgba(226, 232, 240, 0.9);
  cursor: pointer;
  transition:
    background-color 0.18s ease,
    border-color 0.18s ease,
    transform 0.18s ease;
  flex-wrap: wrap;
}

.station-list-item:hover,
.station-list-item:focus-visible {
  outline: none;
  border-color: rgba(94, 234, 212, 0.7);
  background: rgba(15, 118, 110, 0.3);
  transform: translateY(-1px);
}

.station-list-item--selected {
  border-color: rgba(45, 212, 191, 0.9);
  background: rgba(13, 148, 136, 0.4);
  box-shadow: 0 8px 18px -10px rgba(45, 212, 191, 0.65);
}

.station-list-item--start {
  border-color: rgba(56, 189, 248, 0.8);
  background: rgba(14, 165, 233, 0.2);
}

.station-list-item--end {
  border-color: rgba(249, 115, 22, 0.8);
  background: rgba(234, 88, 12, 0.18);
}

.station-list-item__name {
  font-weight: 600;
}

.station-list-item__coords {
  font-size: 0.85rem;
  color: rgba(226, 232, 240, 0.7);
  font-family: 'Fira Code', 'Source Code Pro', monospace;
}

.station-list-item__badge {
  font-size: 0.75rem;
  font-weight: 600;
  text-transform: uppercase;
  letter-spacing: 0.05em;
  padding: 0.1rem 0.45rem;
  border-radius: 999px;
  background: rgba(148, 163, 184, 0.18);
  color: rgba(226, 232, 240, 0.85);
}

.station-list-item--start .station-list-item__badge {
  background: rgba(56, 189, 248, 0.35);
  color: #0ea5e9;
}

.station-list-item--end .station-list-item__badge {
  background: rgba(249, 115, 22, 0.35);
  color: #f97316;
}

.grid h3 {
  margin-bottom: 0.5rem;
  font-size: 1.1rem;
@@ -109,6 +183,173 @@ body {
|
||||
width: 100%;
|
||||
}
|
||||
|
||||
.route-panel {
|
||||
display: grid;
|
||||
gap: 0.85rem;
|
||||
padding: 1.1rem 1.35rem;
|
||||
border-radius: 12px;
|
||||
border: 1px solid rgba(250, 204, 21, 0.3);
|
||||
background: rgba(161, 98, 7, 0.16);
|
||||
}
|
||||
|
||||
.route-panel__header {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: space-between;
|
||||
gap: 1rem;
|
||||
}
|
||||
|
||||
.route-panel__hint {
|
||||
font-size: 0.9rem;
|
||||
color: rgba(226, 232, 240, 0.78);
|
||||
}
|
||||
|
||||
.route-panel__meta {
|
||||
display: grid;
|
||||
gap: 0.45rem;
|
||||
}
|
||||
|
||||
.route-panel__meta > div {
|
||||
display: flex;
|
||||
justify-content: space-between;
|
||||
align-items: baseline;
|
||||
gap: 0.75rem;
|
||||
}
|
||||
|
||||
.route-panel__meta dt {
|
||||
font-size: 0.8rem;
|
||||
text-transform: uppercase;
|
||||
letter-spacing: 0.06em;
|
||||
color: rgba(226, 232, 240, 0.65);
|
||||
}
|
||||
|
||||
.route-panel__meta dd {
|
||||
font-size: 0.95rem;
|
||||
color: rgba(226, 232, 240, 0.92);
|
||||
}
|
||||
|
||||
.route-panel__error {
|
||||
color: #f87171;
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
.route-panel__path {
|
||||
display: flex;
|
||||
gap: 0.6rem;
|
||||
align-items: baseline;
|
||||
}
|
||||
|
||||
.route-panel__path span {
|
||||
font-size: 0.85rem;
|
||||
color: rgba(226, 232, 240, 0.7);
|
||||
text-transform: uppercase;
|
||||
letter-spacing: 0.06em;
|
||||
}
|
||||
|
||||
.route-panel__path ol {
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
  gap: 0.4rem;
  list-style: none;
  padding: 0;
  margin: 0;
}

.route-panel__path li::after {
  content: '→';
  margin-left: 0.35rem;
  color: rgba(250, 204, 21, 0.75);
}

.route-panel__path li:last-child::after {
  content: '';
  margin: 0;
}

.route-panel__actions {
  margin-top: 1rem;
  display: flex;
  gap: 0.75rem;
}

.selected-station {
  margin-top: 1rem;
  padding: 1rem 1.25rem;
  border-radius: 12px;
  border: 1px solid rgba(45, 212, 191, 0.35);
  background: rgba(13, 148, 136, 0.18);
  display: grid;
  gap: 0.75rem;
}

.selected-station h3 {
  color: rgba(226, 232, 240, 0.9);
  font-size: 1.1rem;
}

.selected-station dl {
  display: grid;
  gap: 0.45rem;
}

.selected-station dl > div {
  display: flex;
  align-items: baseline;
  justify-content: space-between;
  gap: 0.75rem;
}

.selected-station dt {
  font-size: 0.8rem;
  text-transform: uppercase;
  letter-spacing: 0.08em;
  color: rgba(226, 232, 240, 0.6);
}

.selected-station dd {
  font-size: 0.95rem;
  color: rgba(226, 232, 240, 0.92);
}

.selected-track {
  margin-top: 1rem;
  padding: 1rem 1.25rem;
  border-radius: 12px;
  border: 1px solid rgba(59, 130, 246, 0.35);
  background: rgba(37, 99, 235, 0.18);
  display: grid;
  gap: 0.75rem;
}

.selected-track h3 {
  color: rgba(226, 232, 240, 0.9);
  font-size: 1.1rem;
}

.selected-track dl {
  display: grid;
  gap: 0.45rem;
}

.selected-track dl > div {
  display: flex;
  align-items: baseline;
  justify-content: space-between;
  gap: 0.75rem;
}

.selected-track dt {
  font-size: 0.8rem;
  text-transform: uppercase;
  letter-spacing: 0.08em;
  color: rgba(226, 232, 240, 0.6);
}

.selected-track dd {
  font-size: 0.95rem;
  color: rgba(226, 232, 240, 0.92);
}

@media (min-width: 768px) {
  .snapshot-layout {
    gap: 2rem;
@@ -11,6 +11,10 @@ export interface Station extends Identified {
   readonly name: string;
   readonly latitude: number;
   readonly longitude: number;
+  readonly code?: string | null;
+  readonly osmId?: string | null;
+  readonly elevationM?: number | null;
+  readonly isActive?: boolean;
 }
 
 export interface Track extends Identified {
@@ -18,6 +22,9 @@ export interface Track extends Identified {
   readonly endStationId: string;
   readonly lengthMeters: number;
   readonly maxSpeedKph: number;
+  readonly status?: string | null;
+  readonly isBidirectional?: boolean;
+  readonly coordinates: readonly [number, number][];
 }
 
 export interface Train extends Identified {
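The hunks above widen the `Station` and `Track` interfaces with optional metadata fields. A minimal sketch of the widened `Station` shape in use — the `Identified` base and the sample values are illustrative assumptions (the diff does not show `Identified` itself; its `id` + timestamp fields are inferred from the test fixtures in this change set):

```typescript
// Assumed shape of the Identified base (id plus timestamps).
interface Identified {
  readonly id: string;
  readonly createdAt: string;
  readonly updatedAt: string;
}

interface Station extends Identified {
  readonly name: string;
  readonly latitude: number;
  readonly longitude: number;
  readonly code?: string | null;       // new optional field
  readonly osmId?: string | null;      // new optional field
  readonly elevationM?: number | null; // new optional field
  readonly isActive?: boolean;         // new optional field
}

// Hypothetical sample value exercising one of the optional fields.
const sample: Station = {
  id: 'station-kx',
  name: "King's Cross",
  latitude: 51.5308,
  longitude: -0.1238,
  code: 'KGX',
  createdAt: '2024-01-01T00:00:00Z',
  updatedAt: '2024-01-01T00:00:00Z',
};

console.log(sample.code ?? 'no code');
```

Because every new field is optional, existing call sites that construct stations without them continue to type-check.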
frontend/src/utils/route.test.ts (new file, 216 lines)
@@ -0,0 +1,216 @@
import { describe, expect, it } from 'vitest';

import { buildTrackAdjacency, computeRoute } from './route';
import type { Station, Track } from '../types/domain';

const baseTimestamps = {
  createdAt: '2024-01-01T00:00:00Z',
  updatedAt: '2024-01-01T00:00:00Z',
};

describe('route utilities', () => {
  it('finds a multi-hop path across connected tracks', () => {
    const stations: Station[] = [
      {
        id: 'station-a',
        name: 'Alpha',
        latitude: 51.5,
        longitude: -0.1,
        ...baseTimestamps,
      },
      {
        id: 'station-b',
        name: 'Bravo',
        latitude: 51.52,
        longitude: -0.11,
        ...baseTimestamps,
      },
      {
        id: 'station-c',
        name: 'Charlie',
        latitude: 51.54,
        longitude: -0.12,
        ...baseTimestamps,
      },
      {
        id: 'station-d',
        name: 'Delta',
        latitude: 51.55,
        longitude: -0.15,
        ...baseTimestamps,
      },
    ];

    const tracks: Track[] = [
      {
        id: 'track-ab',
        startStationId: 'station-a',
        endStationId: 'station-b',
        lengthMeters: 1200,
        maxSpeedKph: 120,
        coordinates: [
          [51.5, -0.1],
          [51.51, -0.105],
          [51.52, -0.11],
        ],
        ...baseTimestamps,
      },
      {
        id: 'track-bc',
        startStationId: 'station-b',
        endStationId: 'station-c',
        lengthMeters: 1500,
        maxSpeedKph: 110,
        coordinates: [
          [51.52, -0.11],
          [51.53, -0.115],
          [51.54, -0.12],
        ],
        ...baseTimestamps,
      },
      {
        id: 'track-cd',
        startStationId: 'station-c',
        endStationId: 'station-d',
        lengthMeters: 900,
        maxSpeedKph: 115,
        coordinates: [
          [51.54, -0.12],
          [51.545, -0.13],
          [51.55, -0.15],
        ],
        ...baseTimestamps,
      },
    ];

    const stationById = new Map(stations.map((station) => [station.id, station]));
    const adjacency = buildTrackAdjacency(tracks);

    const result = computeRoute({
      startId: 'station-a',
      endId: 'station-d',
      stationById,
      adjacency,
    });

    expect(result.error).toBeNull();
    expect(result.stations?.map((station) => station.id)).toEqual([
      'station-a',
      'station-b',
      'station-c',
      'station-d',
    ]);
    expect(result.tracks.map((track) => track.id)).toEqual([
      'track-ab',
      'track-bc',
      'track-cd',
    ]);
    expect(result.totalLength).toBe(1200 + 1500 + 900);
    expect(result.segments).toHaveLength(3);
    expect(result.segments[0][0]).toEqual([51.5, -0.1]);
    expect(result.segments[2][result.segments[2].length - 1]).toEqual([51.55, -0.15]);
  });

  it('returns an error when no path exists', () => {
    const stations: Station[] = [
      {
        id: 'station-a',
        name: 'Alpha',
        latitude: 51.5,
        longitude: -0.1,
        ...baseTimestamps,
      },
      {
        id: 'station-b',
        name: 'Bravo',
        latitude: 51.6,
        longitude: -0.2,
        ...baseTimestamps,
      },
    ];

    const tracks: Track[] = [
      {
        id: 'track-self',
        startStationId: 'station-a',
        endStationId: 'station-a',
        lengthMeters: 0,
        maxSpeedKph: 80,
        coordinates: [
          [51.5, -0.1],
          [51.5005, -0.1005],
        ],
        ...baseTimestamps,
      },
    ];

    const stationById = new Map(stations.map((station) => [station.id, station]));
    const adjacency = buildTrackAdjacency(tracks);

    const result = computeRoute({
      startId: 'station-a',
      endId: 'station-b',
      stationById,
      adjacency,
    });

    expect(result.stations).toBeNull();
    expect(result.tracks).toHaveLength(0);
    expect(result.error).toBe(
      'No rail connection found between the selected stations.'
    );
    expect(result.segments).toHaveLength(0);
  });

  it('reverses track geometry when traversing in the opposite direction', () => {
    const stations: Station[] = [
      {
        id: 'station-a',
        name: 'Alpha',
        latitude: 51.5,
        longitude: -0.1,
        ...baseTimestamps,
      },
      {
        id: 'station-b',
        name: 'Bravo',
        latitude: 51.52,
        longitude: -0.11,
        ...baseTimestamps,
      },
    ];

    const tracks: Track[] = [
      {
        id: 'track-ab',
        startStationId: 'station-a',
        endStationId: 'station-b',
        lengthMeters: 1200,
        maxSpeedKph: 120,
        coordinates: [
          [51.5, -0.1],
          [51.52, -0.11],
        ],
        ...baseTimestamps,
      },
    ];

    const stationById = new Map(stations.map((station) => [station.id, station]));
    const adjacency = buildTrackAdjacency(tracks);

    const result = computeRoute({
      startId: 'station-b',
      endId: 'station-a',
      stationById,
      adjacency,
    });

    expect(result.error).toBeNull();
    expect(result.segments).toEqual([
      [
        [51.52, -0.11],
        [51.5, -0.1],
      ],
    ]);
  });
});
frontend/src/utils/route.ts (new file, 239 lines)
@@ -0,0 +1,239 @@
import type { Station, Track } from '../types/domain';

export type LatLngTuple = readonly [number, number];

export interface NeighborEdge {
  readonly neighborId: string;
  readonly track: Track;
  readonly isForward: boolean;
}

export type TrackAdjacency = Map<string, NeighborEdge[]>;

export interface ComputeRouteParams {
  readonly startId?: string | null;
  readonly endId?: string | null;
  readonly stationById: Map<string, Station>;
  readonly adjacency: TrackAdjacency;
}

export interface RouteComputation {
  readonly stations: Station[] | null;
  readonly tracks: Track[];
  readonly totalLength: number | null;
  readonly error: string | null;
  readonly segments: LatLngTuple[][];
}

export function buildTrackAdjacency(tracks: readonly Track[]): TrackAdjacency {
  const adjacency: TrackAdjacency = new Map();

  const register = (fromId: string, toId: string, track: Track, isForward: boolean) => {
    if (!adjacency.has(fromId)) {
      adjacency.set(fromId, []);
    }
    adjacency.get(fromId)!.push({ neighborId: toId, track, isForward });
  };

  for (const track of tracks) {
    register(track.startStationId, track.endStationId, track, true);
    register(track.endStationId, track.startStationId, track, false);
  }

  return adjacency;
}

export function computeRoute({
  startId,
  endId,
  stationById,
  adjacency,
}: ComputeRouteParams): RouteComputation {
  if (!startId || !endId) {
    return emptyResult();
  }

  if (!stationById.has(startId) || !stationById.has(endId)) {
    return {
      stations: null,
      tracks: [],
      totalLength: null,
      error: 'Selected stations are no longer available.',
      segments: [],
    };
  }

  if (startId === endId) {
    const station = stationById.get(startId);
    return {
      stations: station ? [station] : null,
      tracks: [],
      totalLength: 0,
      error: null,
      segments: [],
    };
  }

  const visited = new Set<string>();
  const queue: string[] = [];
  const parent = new Map<string, { prev: string | null; edge: NeighborEdge | null }>();

  queue.push(startId);
  visited.add(startId);
  parent.set(startId, { prev: null, edge: null });

  while (queue.length > 0) {
    const current = queue.shift()!;
    if (current === endId) {
      break;
    }

    const neighbors = adjacency.get(current) ?? [];
    for (const edge of neighbors) {
      const { neighborId } = edge;
      if (visited.has(neighborId)) {
        continue;
      }
      visited.add(neighborId);
      parent.set(neighborId, { prev: current, edge });
      queue.push(neighborId);
    }
  }

  if (!parent.has(endId)) {
    return {
      stations: null,
      tracks: [],
      totalLength: null,
      error: 'No rail connection found between the selected stations.',
      segments: [],
    };
  }

  const stationPath: string[] = [];
  const trackSequence: Track[] = [];
  const directions: boolean[] = [];
  let cursor: string | null = endId;

  while (cursor) {
    const details = parent.get(cursor);
    if (!details) {
      break;
    }
    stationPath.push(cursor);
    if (details.edge) {
      trackSequence.push(details.edge.track);
      directions.push(details.edge.isForward);
    }
    cursor = details.prev;
  }

  stationPath.reverse();
  trackSequence.reverse();
  directions.reverse();

  const stations = stationPath
    .map((id) => stationById.get(id))
    .filter((station): station is Station => Boolean(station));

  const segments = buildSegments(trackSequence, directions, stationById);
  const totalLength = computeTotalLength(trackSequence, stations);

  return {
    stations,
    tracks: trackSequence,
    totalLength,
    error: null,
    segments,
  };
}

function buildSegments(
  tracks: Track[],
  directions: boolean[],
  stationById: Map<string, Station>
): LatLngTuple[][] {
  const segments: LatLngTuple[][] = [];

  for (let index = 0; index < tracks.length; index += 1) {
    const track = tracks[index];
    const isForward = directions[index] ?? true;
    const coordinates = extractTrackCoordinates(track, stationById);
    if (coordinates.length < 2) {
      continue;
    }
    segments.push(isForward ? coordinates : [...coordinates].reverse());
  }

  return segments;
}

function extractTrackCoordinates(
  track: Track,
  stationById: Map<string, Station>
): LatLngTuple[] {
  if (Array.isArray(track.coordinates) && track.coordinates.length >= 2) {
    return track.coordinates.map((pair) => [pair[0], pair[1]] as LatLngTuple);
  }

  const start = stationById.get(track.startStationId);
  const end = stationById.get(track.endStationId);
  if (!start || !end) {
    return [];
  }

  return [
    [start.latitude, start.longitude],
    [end.latitude, end.longitude],
  ];
}

function computeTotalLength(tracks: Track[], stations: Station[]): number | null {
  if (tracks.length === 0 && stations.length <= 1) {
    return 0;
  }

  const hasTrackLengths = tracks.every(
    (track) =>
      typeof track.lengthMeters === 'number' && Number.isFinite(track.lengthMeters)
  );

  if (hasTrackLengths) {
    return tracks.reduce((total, track) => total + (track.lengthMeters ?? 0), 0);
  }

  if (stations.length < 2) {
    return null;
  }

  let total = 0;
  for (let index = 0; index < stations.length - 1; index += 1) {
    total += haversineDistance(stations[index], stations[index + 1]);
  }
  return total;
}

function haversineDistance(a: Station, b: Station): number {
  const R = 6371_000;
  const toRad = (value: number) => (value * Math.PI) / 180;
  const dLat = toRad(b.latitude - a.latitude);
  const dLon = toRad(b.longitude - a.longitude);
  const lat1 = toRad(a.latitude);
  const lat2 = toRad(b.latitude);

  const sinDLat = Math.sin(dLat / 2);
  const sinDLon = Math.sin(dLon / 2);
  const root = sinDLat * sinDLat + Math.cos(lat1) * Math.cos(lat2) * sinDLon * sinDLon;
  const c = 2 * Math.atan2(Math.sqrt(root), Math.sqrt(1 - root));
  return R * c;
}

function emptyResult(): RouteComputation {
  return {
    stations: null,
    tracks: [],
    totalLength: null,
    error: null,
    segments: [],
  };
}
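`computeTotalLength` in route.ts falls back to great-circle distance when per-track lengths are missing. The haversine step can be exercised standalone — the `haversineMeters` name and raw-coordinate signature here are ours for illustration (route.ts works on `Station` objects), but the constants match:

```typescript
// Haversine great-circle distance, same mean Earth radius (6,371,000 m)
// as haversineDistance in route.ts.
function haversineMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6_371_000;
  const toRad = (value: number) => (value * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const sinDLat = Math.sin(dLat / 2);
  const sinDLon = Math.sin(dLon / 2);
  const root =
    sinDLat * sinDLat +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * sinDLon * sinDLon;
  return 2 * R * Math.atan2(Math.sqrt(root), Math.sqrt(1 - root));
}

// Alpha → Bravo from the test fixtures: a bit over 2.3 km.
console.log(haversineMeters(51.5, -0.1, 51.52, -0.11));
```

Note that this fallback only triggers when some `lengthMeters` is missing or non-finite; otherwise the stored track lengths are summed directly.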
@@ -15,7 +15,7 @@
     "isolatedModules": true,
     "noEmit": true,
     "jsx": "react-jsx",
-    "types": ["vite/client"]
+    "types": ["vite/client", "vitest"]
   },
   "include": ["src"]
 }
File diff suppressed because one or more lines are too long

frontend/vitest.config.ts (new file, 8 lines)
@@ -0,0 +1,8 @@
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    include: ['src/**/*.test.ts'],
    environment: 'node',
  },
});
scripts/init_demo_db.py (new file, 169 lines)
@@ -0,0 +1,169 @@
#!/usr/bin/env python3
"""
Initialize the database with demo data for the Rail Game.

This script automates the database setup process:
1. Validates environment setup
2. Runs database migrations
3. Loads OSM fixtures for demo data

Usage:
    python scripts/init_demo_db.py [--dry-run] [--region REGION]

Requirements:
- Virtual environment activated
- .env file configured with DATABASE_URL
- PostgreSQL with PostGIS running
"""

import argparse
import os
import subprocess
import sys
from pathlib import Path

try:
    from dotenv import load_dotenv
    load_dotenv()
except ImportError:
    print("WARNING: python-dotenv not installed. .env file will not be loaded automatically.")
    print("Install with: pip install python-dotenv")


def check_virtualenv():
    """Check if we're running in a virtual environment."""
    # Skip virtualenv check in Docker containers
    if os.getenv('INIT_DEMO_DB') == 'true':
        return

    if not hasattr(sys, 'real_prefix') and not (hasattr(sys, 'base_prefix') and sys.base_prefix != sys.prefix):
        print("ERROR: Virtual environment not activated. Run:")
        print("  .venv\\Scripts\\Activate.ps1  (PowerShell)")
        print("  source .venv/bin/activate    (Bash/macOS/Linux)")
        sys.exit(1)


def check_env_file():
    """Check if .env file exists."""
    env_file = Path('.env')
    if not env_file.exists():
        print("ERROR: .env file not found. Copy .env.example to .env and configure:")
        print("  Copy-Item .env.example .env  (PowerShell)")
        print("  cp .env.example .env         (Bash)")
        sys.exit(1)


def check_database_url():
    """Check if DATABASE_URL is set in environment."""
    database_url = os.getenv('DATABASE_URL')
    if not database_url:
        print("ERROR: DATABASE_URL not set. Check your .env file.")
        sys.exit(1)
    print(f"Using database: {database_url}")


def run_command(cmd, cwd=None, description="", env=None):
    """Run a shell command and return the result."""
    print(f"\n>>> {description}")
    print(f"Running: {' '.join(cmd)}")
    try:
        env_vars = os.environ.copy()
        if env:
            env_vars.update(env)
        env_vars.setdefault("PYTHONPATH", "/app")
        result = subprocess.run(
            cmd,
            cwd=cwd,
            check=True,
            capture_output=True,
            text=True,
            env=env_vars,
        )
        if result.stdout:
            print(result.stdout)
        return result
    except subprocess.CalledProcessError as e:
        print(f"ERROR: Command failed with exit code {e.returncode}")
        if e.stdout:
            print(e.stdout)
        if e.stderr:
            print(e.stderr)
        sys.exit(1)


def run_migrations():
    """Run database migrations using alembic."""
    run_command(
        ['alembic', 'upgrade', 'head'],
        cwd='backend',
        description="Running database migrations",
    )


def load_osm_fixtures(region, dry_run=False):
    """Load OSM fixtures for demo data."""
    cmd = ['python', '-m', 'backend.scripts.osm_refresh', '--region', region]
    if dry_run:
        cmd.append('--no-commit')
        description = f"Loading OSM fixtures (dry run) for region: {region}"
    else:
        description = f"Loading OSM fixtures for region: {region}"

    run_command(cmd, description=description)


def main():
    parser = argparse.ArgumentParser(
        description="Initialize database with demo data")
    parser.add_argument(
        '--region',
        default='all',
        help='OSM region to load (default: all)'
    )
    parser.add_argument(
        '--dry-run',
        action='store_true',
        help='Dry run: run migrations and load fixtures without committing'
    )
    parser.add_argument(
        '--skip-migrations',
        action='store_true',
        help='Skip running migrations'
    )
    parser.add_argument(
        '--skip-fixtures',
        action='store_true',
        help='Skip loading OSM fixtures'
    )

    args = parser.parse_args()

    print("Rail Game Database Initialization")
    print("=" * 40)

    # Pre-flight checks
    check_virtualenv()
    check_env_file()
    check_database_url()

    # Run migrations
    if not args.skip_migrations:
        run_migrations()
    else:
        print("Skipping migrations (--skip-migrations)")

    # Load fixtures
    if not args.skip_fixtures:
        load_osm_fixtures(args.region, args.dry_run)
    else:
        print("Skipping fixtures (--skip-fixtures)")

    print("\n✅ Database initialization completed successfully!")
    if args.dry_run:
        print("Note: This was a dry run. No data was committed to the database.")
    else:
        print("Demo data loaded. You can now start the backend server.")


if __name__ == '__main__':
    main()
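The `check_virtualenv` guard in the script above relies on how environment tools rewrite `sys.prefix`. A standalone version of that detection (the `in_virtualenv` name is ours, not the script's):

```python
import sys


def in_virtualenv() -> bool:
    """Mirror the detection used by check_virtualenv in init_demo_db.py."""
    # Legacy virtualenv exposes sys.real_prefix; the stdlib venv module
    # instead points sys.prefix into the environment while sys.base_prefix
    # keeps the original interpreter location.
    legacy = getattr(sys, "real_prefix", None) is not None
    stdlib_venv = hasattr(sys, "base_prefix") and sys.base_prefix != sys.prefix
    return legacy or stdlib_venv


print(in_virtualenv())
```

The script inverts this check to abort early with activation hints, except when `INIT_DEMO_DB=true` signals a container environment where no virtualenv is expected.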