Compare commits


9 Commits

Author SHA1 Message Date
f9086d2d04 feat: initialize database with demo data on first run and update README
Some checks failed
Backend CI / lint-and-test (push) Failing after 1m33s
Frontend CI / lint-and-build (push) Successful in 17s
2025-10-11 21:52:30 +02:00
25ca7ab196 Add OSM Track Harvesting Policy and demo database initialization script
- Updated documentation to include OSM Track Harvesting Policy with details on railway types, service filters, usage filters, and geometry guardrails.
- Introduced a new script `init_demo_db.py` to automate the database setup process, including environment checks, running migrations, and loading OSM fixtures for demo data.
2025-10-11 21:37:25 +02:00
0b84ee953e fix: correct grammar and formatting in README 2025-10-11 21:37:01 +02:00
8877380f21 fix: revert README structure 2025-10-11 21:07:27 +02:00
4393f17c45 refactor: simplify stage plan return type and enhance test coverage for OSM refresh 2025-10-11 20:37:25 +02:00
e10b2ee71c docs: fix formatting 2025-10-11 20:23:08 +02:00
1c8adb36fe feat: Add OSM refresh script and update loading scripts for improved database handling 2025-10-11 20:21:14 +02:00
c2927f2f60 feat: Enhance track model and import functionality
- Added new fields to TrackModel: status, is_bidirectional, and coordinates.
- Updated network service to handle new track attributes and geometry extraction.
- Introduced CLI scripts for importing and loading tracks from OpenStreetMap.
- Implemented normalization of track elements to ensure valid geometries.
- Enhanced tests for track model, network service, and import/load scripts.
- Updated frontend to accommodate new track attributes and improve route computation.
- Documented OSM ingestion process in architecture and runtime views.
2025-10-11 19:54:10 +02:00
090dca29c2 feat: add route selection functionality and improve station handling
- Added `vitest` for testing and created initial tests for route utilities.
- Implemented route selection logic in the App component, allowing users to select start and end stations.
- Updated the NetworkMap component to reflect focused and selected stations, including visual indicators for start and end stations.
- Enhanced the route panel UI to display selected route information and estimated lengths.
- Introduced utility functions for building track adjacency and computing routes based on selected stations.
- Improved styling for route selection and station list items to enhance user experience.
2025-10-11 19:28:35 +02:00
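The route utilities mentioned in this commit live in the frontend (TypeScript); as a language-neutral illustration, the idea — build a station adjacency map from tracks, then search for a path between the selected start and end stations — can be sketched in Python (names are illustrative, not the repo's actual API):

```python
from collections import deque


def build_adjacency(tracks):
    # tracks: iterable of (track_id, start_station_id, end_station_id, bidirectional)
    adj = {}
    for _tid, a, b, bidirectional in tracks:
        adj.setdefault(a, []).append(b)
        if bidirectional:
            adj.setdefault(b, []).append(a)
    return adj


def compute_route(adj, start, end):
    # Breadth-first search over the station graph; returns station ids or None.
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == end:
            return path
        for nxt in adj.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None
```

BFS gives the route with the fewest hops; the real implementation may weight edges by track length instead.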
33 changed files with 540660 additions and 182 deletions

README.md

@@ -1,146 +1,213 @@
# Rail Game

A browser-based railway simulation game using real-world railway maps from OpenStreetMap.
## At a glance
- Frontend: React + Vite (TypeScript)
- Backend: Python (FastAPI, SQLAlchemy)
- Database: PostgreSQL with PostGIS (spatial types)
- Mapping: Leaflet + OpenStreetMap
## Features

- Real-world railway maps
- Interactive Leaflet map preview of a demo network snapshot
- Build and manage your railway network
- Dynamic train schedules and simulated trains
## Current project layout

This repository contains a full-stack demo app (frontend + backend), supporting scripts, docs and infra. Key folders:

- `backend/` — FastAPI application, models, services, migration scripts and backend tests.
- `frontend/` — React app (Vite) and frontend tests.
- `docs/` — Architecture docs and ADRs.
- `infra/` — Deployment assets (Dockerfiles, compose files, init scripts).
- `data/` — Fixtures and imported OSM snapshots.
- `scripts/` — Utility scripts (precommit helpers, setup hooks).
- `tests/` — End-to-end tests and cross-cutting tests.

Refer to the in-repo `docs/` for architecture decisions and deeper design notes.
## Installation

Below are concise, verified steps for getting the project running locally. Commands show both PowerShell (Windows) and Bash/macOS/Linux variants where they differ.

### Prerequisites

- Git
- Python 3.10+ (3.11 recommended) and pip
- Node.js 16+ (or the version required by `frontend/package.json`)
- PostgreSQL with PostGIS if you want to run the full DB-backed stack locally
- Docker & Docker Compose (optional, for containerized dev)

### Clone repository

PowerShell / Bash:

```bash
git clone https://github.com/zwitschi/rail-game.git
cd rail-game
```
### Backend: create virtual environment and install

PowerShell:

```powershell
python -m venv .venv
.\.venv\Scripts\Activate.ps1
python -m pip install -e .[dev]
```

Bash / macOS / Linux:

```bash
python -m venv .venv
source .venv/bin/activate
python -m pip install -e '.[dev]'
```

### Notes
- Installing editable extras (`.[dev]`) installs the dev/test tools used by the backend (pytest, black, isort, alembic, etc.).
### Environment file

Copy the sample `.env.example` to `.env` and adjust the database connection strings as needed.

PowerShell:

```powershell
Copy-Item .env.example .env
```

Bash:

```bash
cp .env.example .env
```
### Important environment variables

- `DATABASE_URL` — runtime DB connection for the app (SQLAlchemy URL format, e.g. `postgresql+psycopg://user:password@host:port/database`)
- `TEST_DATABASE_URL` — database used by pytest in CI/local tests
- `ALEMBIC_DATABASE_URL` — used when running Alembic outside the app process
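SQLAlchemy-style URLs parse like ordinary URLs, which makes quick sanity checks easy. A small illustrative helper (not part of the repo), assuming the `postgresql+psycopg://user:password@host:port/database` format:

```python
from urllib.parse import urlparse


def describe_db_url(url: str) -> dict:
    # The scheme carries the dialect+driver; the path holds the database name.
    parts = urlparse(url)
    return {
        "driver": parts.scheme,
        "host": parts.hostname,
        "port": parts.port,
        "database": parts.path.lstrip("/"),
    }
```

Useful for verifying that `.env` values point at the database you expect before running migrations.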
### Database (Postgres + PostGIS)

If you run Postgres locally, create the dev/test databases and ensure the `postgis` extension is available. Example (psql):

```sql
-- create DBs (run in psql as a superuser or role with create privileges)
CREATE DATABASE railgame_dev;
CREATE DATABASE railgame_test;

-- connect to the db and enable extensions
\c railgame_dev
CREATE EXTENSION IF NOT EXISTS postgis;
CREATE EXTENSION IF NOT EXISTS pgcrypto;
```

Adjust DB names and roles to match your `.env` values.
### Quick database setup (recommended)

For a streamlined setup, use the included initialization script after configuring your `.env` file:

```bash
python scripts/init_demo_db.py
```

This script validates your environment, runs migrations, and loads demo OSM data. Use `--dry-run` to preview changes, or `--region` to load specific regions.

**Note**: The script uses `python-dotenv` to load your `.env` file. If it is not installed, run `pip install python-dotenv`.
### Run migrations

```bash
cd backend
alembic upgrade head
cd ..
```

If you prefer to run Alembic with a specific URL without editing `.env`, set `ALEMBIC_DATABASE_URL` in the environment before running the command.
### Load OSM fixtures (optional)

Use the included scripts to refresh stations and tracks from saved OSM fixtures. This step assumes the database is migrated and reachable.

```bash
# dry run
python -m backend.scripts.osm_refresh --region all --no-commit

# commit to the database
python -m backend.scripts.osm_refresh --region all
```

See `backend/scripts/*.py` for more granular import options (`--skip-*` flags).
### Frontend

Install dependencies and run the dev server from the `frontend/` directory:

```bash
cd frontend
npm install
npm run dev
```

The frontend runs at `http://localhost:5173` by default (Vite). The React app talks to the backend API at the address configured in its environment (see the `frontend` README or the Vite config).
### Run backend locally (development)

```bash
# from the project root
uvicorn backend.app.main:app --reload --port 8000
```

The backend API listens at `http://localhost:8000` by default.
### Tests & linters

Backend:

```bash
pytest
black backend/ && isort backend/
```

Frontend:

```bash
cd frontend
npm run lint
npm run build  # type/build check
```
### Docker / Compose (optional)

Build and run both services with Docker Compose if you prefer containers:

```bash
docker compose up --build
```

This starts all services (Postgres, Redis, backend, frontend) and automatically initializes the database with demo data on first run. The backend waits for the database to be ready before running migrations and loading OSM fixtures.

**Services:**

- Backend API: `http://localhost:8000`
- Frontend: `http://localhost:8080`
- Postgres: `localhost:5432`
- Redis: `localhost:6379`

This expects a working Docker environment and may require you to point the DB URLs at the containerized Postgres service defined in `docker-compose.yml`.
## Troubleshooting
- If migrations fail with missing PostGIS functions, ensure `postgis` is installed and enabled in the target database.
- If alembic autogenerate creates unexpected changes, confirm the models being imported match the app import path used by `alembic` (see `backend/migrations/env.py`).
- For authentication/debugging, the demo user is `demo` / `railgame123` (used by some integration tests and the demo auth flow).
- If frontend dev server fails due to node version, check `frontend/package.json` engines or use `nvm`/`nvm-windows` to match the recommended Node version.
## API preview
Some useful endpoints for local testing:
- `GET /api/health` — readiness probe
- `POST /api/auth/register` — demo account creation + JWT
- `POST /api/auth/login` — exchange credentials for JWT (demo user: `demo` / `railgame123`)
- `GET /api/auth/me` — current user profile (requires bearer token)
- `GET /api/network` — sample network snapshot (requires bearer token)
## Contributing
- See `docs/` for architecture and ADRs.
- Keep tests green and follow formatting rules (black, isort for Python; Prettier/ESLint for frontend).
- Open issues or PRs for bugs, features, or docs improvements.

TODO.md

@@ -16,12 +16,12 @@
- [x] Define geographic bounding boxes and filtering rules for importing real-world stations from OpenStreetMap.
- [x] Implement an import script/CLI that pulls OSM station data and normalizes it to the PostGIS schema.
- [x] Expose backend CRUD endpoints for stations (create, update, archive) with validation and geometry handling.
- [x] Build React map tooling for selecting a station.
- [x] Enhance map UI to support selecting two stations and previewing the rail corridor between them.
- [x] Define track selection criteria and tagging rules for harvesting OSM rail segments within target regions.
- [x] Extend the importer to load track geometries and associate them with existing stations.
- [ ] Implement backend track-management APIs with length/speed validation and topology checks.
- [ ] Implement track path mapping along existing OSM rail segments between chosen stations.
- [ ] Design train connection manager requirements (link trains to operating tracks, manage consist data).
- [ ] Implement backend services and APIs to attach trains to routes and update assignments.
- [ ] Add UI flows for managing train connections, including visual feedback on the map.


@@ -8,15 +8,26 @@ ENV PYTHONDONTWRITEBYTECODE=1 \
WORKDIR /app

RUN apt-get update \
    && apt-get install -y --no-install-recommends build-essential libpq-dev postgresql-client \
    && rm -rf /var/lib/apt/lists/*

COPY backend/requirements/base.txt ./backend/requirements/base.txt
RUN pip install --upgrade pip \
    && pip install -r backend/requirements/base.txt

COPY scripts ./scripts
COPY .env.example ./.env.example
COPY .env* ./
COPY backend ./backend

EXPOSE 8000

# Initialize database with demo data if INIT_DEMO_DB is set
CMD ["sh", "-c", "\
export PYTHONPATH=/app && \
echo 'Waiting for database...' && \
while ! pg_isready -h db -p 5432 -U railgame >/dev/null 2>&1; do sleep 1; done && \
echo 'Database is ready!' && \
if [ \"$INIT_DEMO_DB\" = \"true\" ]; then python scripts/init_demo_db.py; fi && \
uvicorn backend.app.main:app --host 0.0.0.0 --port 8000"]
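The CMD above polls `pg_isready` until Postgres accepts connections before running migrations. The same wait-until-ready pattern, sketched as a generic Python helper (illustrative, not code from the repo):

```python
import time


def wait_for(check, timeout: float = 30.0, interval: float = 1.0) -> bool:
    # Poll `check` until it returns True or the timeout elapses.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if check():
            return True
        time.sleep(interval)
    return False
```

A bounded timeout matters: without one, a misconfigured DB host would make the container hang forever instead of failing visibly.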


@@ -75,6 +75,43 @@ STATION_TAG_FILTERS: Mapping[str, Tuple[str, ...]] = {
}
# Tags that describe rail infrastructure usable for train routing.
TRACK_ALLOWED_RAILWAY_TYPES: Tuple[str, ...] = (
"rail",
"light_rail",
"subway",
"tram",
"narrow_gauge",
"disused",
"construction",
)
TRACK_TAG_FILTERS: Mapping[str, Tuple[str, ...]] = {
"railway": TRACK_ALLOWED_RAILWAY_TYPES,
}
# Track ingestion policy
TRACK_EXCLUDED_SERVICE_TAGS: Tuple[str, ...] = (
"yard",
"siding",
"spur",
"crossover",
"industrial",
"military",
)
TRACK_EXCLUDED_USAGE_TAGS: Tuple[str, ...] = (
"military",
"tourism",
)
TRACK_MIN_LENGTH_METERS: float = 75.0
TRACK_STATION_SNAP_RADIUS_METERS: float = 350.0
def compile_overpass_filters(filters: Mapping[str, Iterable[str]]) -> str:
    """Build an Overpass boolean expression that matches the provided filters."""
@@ -89,5 +126,11 @@ __all__ = [
    "BoundingBox",
    "DEFAULT_REGIONS",
    "STATION_TAG_FILTERS",
    "TRACK_ALLOWED_RAILWAY_TYPES",
    "TRACK_TAG_FILTERS",
    "TRACK_EXCLUDED_SERVICE_TAGS",
    "TRACK_EXCLUDED_USAGE_TAGS",
    "TRACK_MIN_LENGTH_METERS",
    "TRACK_STATION_SNAP_RADIUS_METERS",
    "compile_overpass_filters",
]
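The body of `compile_overpass_filters` is not shown in this diff. Purely as an illustration of what a filter compiler of this shape does, a hypothetical stand-in (not the repo's implementation) would turn the tag map into Overpass regex clauses:

```python
from typing import Iterable, Mapping


def compile_filters(filters: Mapping[str, Iterable[str]]) -> str:
    # One ["key"~"^(v1|v2|...)$"] clause per tag key, concatenated.
    return "".join(
        f'["{key}"~"^({"|".join(values)})$"]' for key, values in filters.items()
    )


TRACK_TAG_FILTERS = {"railway": ("rail", "light_rail", "subway")}
print(compile_filters(TRACK_TAG_FILTERS))
```

The resulting clause is what gets spliced into the `way{filters}` line of the Overpass query in `tracks_import.py` below.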


@@ -3,7 +3,7 @@ from __future__ import annotations
from datetime import datetime
from typing import Generic, Sequence, TypeVar

from pydantic import BaseModel, ConfigDict, Field


def to_camel(string: str) -> str:
@@ -53,6 +53,9 @@ class TrackModel(IdentifiedModel[str]):
    end_station_id: str
    length_meters: float
    max_speed_kph: float
    status: str | None = None
    is_bidirectional: bool = True
    coordinates: list[tuple[float, float]] = Field(default_factory=list)


class TrainModel(IdentifiedModel[str]):
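The body of `to_camel` is elided from this hunk; it is the alias generator that produces the camelCase field names (`isBidirectional`, `lengthMeters`) the frontend consumes. A typical implementation, shown here as an assumption for context:

```python
def to_camel(string: str) -> str:
    # "is_bidirectional" -> "isBidirectional"
    head, *rest = string.split("_")
    return head + "".join(word.capitalize() for word in rest)
```

Wired into pydantic via `ConfigDict(alias_generator=to_camel, populate_by_name=True)`, this keeps Python snake_case internally while serializing camelCase.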


@@ -8,9 +8,10 @@ from geoalchemy2.elements import WKBElement, WKTElement
from geoalchemy2.shape import to_shape

try:  # pragma: no cover - optional dependency guard
    from shapely.geometry import LineString, Point  # type: ignore
except ImportError:  # pragma: no cover - allow running without shapely at import time
    Point = None  # type: ignore[assignment]
    LineString = None  # type: ignore[assignment]

from sqlalchemy.exc import SQLAlchemyError
from sqlalchemy.orm import Session
@@ -51,6 +52,12 @@ def _fallback_snapshot() -> dict[str, list[dict[str, object]]]:
            end_station_id="station-2",
            length_meters=289000.0,
            max_speed_kph=230.0,
            status="operational",
            is_bidirectional=True,
            coordinates=[
                (stations[0].latitude, stations[0].longitude),
                (stations[1].latitude, stations[1].longitude),
            ],
            created_at=now,
            updated_at=now,
        )
@@ -134,6 +141,20 @@ def get_network_snapshot(session: Session) -> dict[str, list[dict[str, object]]]
    track_models: list[TrackModel] = []
    for track in tracks_entities:
        coordinates: list[tuple[float, float]] = []
        geometry = track.track_geometry
        shape = (
            to_shape(cast(WKBElement | WKTElement, geometry))
            if geometry is not None and LineString is not None
            else None
        )
        if LineString is not None and shape is not None and isinstance(shape, LineString):
            coords_list: list[tuple[float, float]] = []
            for coord in shape.coords:
                lon = float(coord[0])
                lat = float(coord[1])
                coords_list.append((lat, lon))
            coordinates = coords_list
        track_models.append(
            TrackModel(
                id=str(track.id),
@@ -141,6 +162,9 @@ def get_network_snapshot(session: Session) -> dict[str, list[dict[str, object]]]
                end_station_id=str(track.end_station_id),
                length_meters=_to_float(track.length_meters),
                max_speed_kph=_to_float(track.max_speed_kph),
                status=track.status,
                is_bidirectional=track.is_bidirectional,
                coordinates=coordinates,
                created_at=cast(datetime, track.created_at),
                updated_at=cast(datetime, track.updated_at),
            )
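The loop above exists because Shapely linestrings yield `(x, y)` pairs, i.e. `(lon, lat)`, while the API (and Leaflet on the frontend) expects `(lat, lon)`. The core transformation, isolated as an illustrative helper:

```python
def swap_to_lat_lon(coords):
    # Shapely yields (x, y) == (lon, lat); Leaflet wants (lat, lon).
    return [(float(lat), float(lon)) for lon, lat in coords]
```

Getting this swap wrong is a classic GIS bug: the track renders mirrored across the 45° meridian-latitude line instead of along the railway.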


@@ -0,0 +1,201 @@
"""Orchestrate the OSM station/track import and load pipeline."""

from __future__ import annotations
import argparse
import sys
from dataclasses import dataclass
from pathlib import Path
from typing import Callable, Sequence

from backend.app.core.osm_config import DEFAULT_REGIONS
from backend.scripts import (
    stations_import,
    stations_load,
    tracks_import,
    tracks_load,
)


@dataclass(slots=True)
class Stage:
    label: str
    runner: Callable[[list[str] | None], int]
    args: list[str]
    input_path: Path | None = None
    output_path: Path | None = None


def build_argument_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Run the station and track import/load workflow in sequence.",
    )
    parser.add_argument(
        "--region",
        choices=[region.name for region in DEFAULT_REGIONS] + ["all"],
        default="all",
        help="Region selector forwarded to the import scripts (default: all).",
    )
    parser.add_argument(
        "--output-dir",
        type=Path,
        default=Path("data"),
        help="Directory where intermediate JSON payloads are stored (default: data/).",
    )
    parser.add_argument(
        "--stations-json",
        type=Path,
        help="Existing station JSON file to load; defaults to <output-dir>/osm_stations.json.",
    )
    parser.add_argument(
        "--tracks-json",
        type=Path,
        help="Existing track JSON file to load; defaults to <output-dir>/osm_tracks.json.",
    )
    parser.add_argument(
        "--skip-station-import",
        action="store_true",
        help="Skip the station import step (expects --stations-json to point to data).",
    )
    parser.add_argument(
        "--skip-station-load",
        action="store_true",
        help="Skip loading stations into PostGIS.",
    )
    parser.add_argument(
        "--skip-track-import",
        action="store_true",
        help="Skip the track import step (expects --tracks-json to point to data).",
    )
    parser.add_argument(
        "--skip-track-load",
        action="store_true",
        help="Skip loading tracks into PostGIS.",
    )
    parser.add_argument(
        "--dry-run",
        action="store_true",
        help="Print the planned stages without invoking Overpass or mutating the database.",
    )
    parser.add_argument(
        "--commit",
        dest="commit",
        action="store_true",
        default=True,
        help="Commit database changes produced by the load steps (default).",
    )
    parser.add_argument(
        "--no-commit",
        dest="commit",
        action="store_false",
        help="Rollback database changes after load steps (dry run).",
    )
    return parser


def _build_stage_plan(args: argparse.Namespace) -> list[Stage]:
    station_json = args.stations_json or args.output_dir / "osm_stations.json"
    track_json = args.tracks_json or args.output_dir / "osm_tracks.json"
    stages: list[Stage] = []
    if not args.skip_station_import:
        stages.append(
            Stage(
                label="Import stations",
                runner=stations_import.main,
                args=["--output", str(station_json), "--region", args.region],
                output_path=station_json,
            )
        )
    if not args.skip_station_load:
        load_args = [str(station_json)]
        if not args.commit:
            load_args.append("--no-commit")
        stages.append(
            Stage(
                label="Load stations",
                runner=stations_load.main,
                args=load_args,
                input_path=station_json,
            )
        )
    if not args.skip_track_import:
        stages.append(
            Stage(
                label="Import tracks",
                runner=tracks_import.main,
                args=["--output", str(track_json), "--region", args.region],
                output_path=track_json,
            )
        )
    if not args.skip_track_load:
        load_args = [str(track_json)]
        if not args.commit:
            load_args.append("--no-commit")
        stages.append(
            Stage(
                label="Load tracks",
                runner=tracks_load.main,
                args=load_args,
                input_path=track_json,
            )
        )
    return stages


def _describe_plan(stages: Sequence[Stage]) -> None:
    if not stages:
        print("No stages selected; nothing to do.")
        return
    print("Selected stages:")
    for stage in stages:
        detail = " ".join(stage.args) if stage.args else "<no args>"
        print(f"  - {stage.label}: {detail}")


def _execute_stage(stage: Stage) -> None:
    print(f"\n>>> {stage.label}")
    if stage.output_path is not None:
        stage.output_path.parent.mkdir(parents=True, exist_ok=True)
    if stage.input_path is not None and not stage.input_path.exists():
        raise RuntimeError(
            f"Expected input file {stage.input_path} for {stage.label}; "
            "run the import step first or provide an existing file."
        )
    try:
        exit_code = stage.runner(stage.args)
    except SystemExit as exc:  # argparse.error exits via SystemExit
        exit_code = int(exc.code or 0)
    if exit_code:
        raise RuntimeError(f"{stage.label} failed with exit code {exit_code}.")


def main(argv: list[str] | None = None) -> int:
    parser = build_argument_parser()
    args = parser.parse_args(argv)
    stages = _build_stage_plan(args)
    if args.dry_run:
        print("Dry run: the following stages would run in order.")
        _describe_plan(stages)
        return 0
    for stage in stages:
        _execute_stage(stage)
    print("\nOSM refresh pipeline completed successfully.")
    return 0


if __name__ == "__main__":
    sys.exit(main())
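The skip/commit plumbing in `_build_stage_plan` can be exercised in isolation. A trimmed-down sketch of the same logic (station stages only; names are illustrative, without the repo's imports):

```python
from dataclasses import dataclass, field


@dataclass
class PlanStage:
    label: str
    args: list[str] = field(default_factory=list)


def build_plan(skip_import: bool = False, skip_load: bool = False, commit: bool = True):
    stages = []
    if not skip_import:
        stages.append(PlanStage("Import stations", ["--output", "data/osm_stations.json"]))
    if not skip_load:
        load_args = ["data/osm_stations.json"]
        if not commit:
            load_args.append("--no-commit")  # mirrors the --no-commit passthrough
        stages.append(PlanStage("Load stations", load_args))
    return stages
```

Keeping the plan a plain list of stage descriptions is what makes `--dry-run` cheap: the pipeline can print the plan without touching Overpass or the database.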


@@ -23,10 +23,17 @@ def build_argument_parser() -> argparse.ArgumentParser:
        help="Path to the normalized station JSON file produced by stations_import.py",
    )
    parser.add_argument(
        "--commit",
        dest="commit",
        action="store_true",
        default=True,
        help="Commit the transaction after loading (default).",
    )
    parser.add_argument(
        "--no-commit",
        dest="commit",
        action="store_false",
        help="Rollback the transaction after loading (useful for dry runs).",
    )
    return parser
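The replaced `"--commit/--no-commit"` string is Click-style syntax, which argparse does not understand; the fix registers two separate flags sharing one `dest`. The pattern in isolation:

```python
import argparse


def make_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser()
    # Two flags, one destination: --commit (the default) and --no-commit.
    parser.add_argument("--commit", dest="commit", action="store_true", default=True)
    parser.add_argument("--no-commit", dest="commit", action="store_false")
    return parser
```

On Python 3.9+ the same effect is available as `argparse.BooleanOptionalAction`, but the explicit pair works everywhere.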


@@ -0,0 +1,259 @@
"""CLI utility to export rail track geometries from OpenStreetMap."""

from __future__ import annotations
import argparse
import json
import math
import sys
from dataclasses import asdict
from pathlib import Path
from typing import Any, Iterable, Mapping
from urllib.parse import quote_plus
from backend.app.core.osm_config import (
DEFAULT_REGIONS,
TRACK_ALLOWED_RAILWAY_TYPES,
TRACK_EXCLUDED_SERVICE_TAGS,
TRACK_EXCLUDED_USAGE_TAGS,
TRACK_MIN_LENGTH_METERS,
TRACK_TAG_FILTERS,
compile_overpass_filters,
)
OVERPASS_ENDPOINT = "https://overpass-api.de/api/interpreter"
def build_argument_parser() -> argparse.ArgumentParser:
parser = argparse.ArgumentParser(
description="Export OSM rail track ways for ingestion",
)
parser.add_argument(
"--output",
type=Path,
default=Path("data/osm_tracks.json"),
help=(
"Destination file for the exported track geometries "
"(default: data/osm_tracks.json)"
),
)
parser.add_argument(
"--region",
choices=[region.name for region in DEFAULT_REGIONS] + ["all"],
default="all",
help="Region name to export (default: all)",
)
parser.add_argument(
"--dry-run",
action="store_true",
help="Do not fetch data; print the Overpass payload only",
)
return parser
def build_overpass_query(region_name: str) -> str:
if region_name == "all":
regions = DEFAULT_REGIONS
else:
regions = tuple(
region for region in DEFAULT_REGIONS if region.name == region_name)
if not regions:
available = ", ".join(region.name for region in DEFAULT_REGIONS)
msg = f"Unknown region {region_name}. Available regions: [{available}]"
raise ValueError(msg)
filters = compile_overpass_filters(TRACK_TAG_FILTERS)
parts = ["[out:json][timeout:120];", "("]
for region in regions:
parts.append(f" way{filters}\n ({region.to_overpass_arg()});")
parts.append(")")
parts.append("; out body geom; >; out skel qt;")
return "\n".join(parts)
def perform_request(query: str) -> dict[str, Any]:
import urllib.request
payload = f"data={quote_plus(query)}".encode("utf-8")
request = urllib.request.Request(
OVERPASS_ENDPOINT,
data=payload,
headers={"Content-Type": "application/x-www-form-urlencoded"},
)
with urllib.request.urlopen(request, timeout=180) as response:
payload = response.read()
return json.loads(payload)
def normalize_track_elements(elements: Iterable[dict[str, Any]]) -> list[dict[str, Any]]:
"""Convert Overpass way elements into TrackCreate-compatible payloads."""
tracks: list[dict[str, Any]] = []
for element in elements:
if element.get("type") != "way":
continue
raw_geometry = element.get("geometry") or []
coordinates: list[list[float]] = []
for node in raw_geometry:
lat = node.get("lat")
lon = node.get("lon")
if lat is None or lon is None:
coordinates = []
break
coordinates.append([float(lat), float(lon)])
if len(coordinates) < 2:
continue
tags: dict[str, Any] = element.get("tags", {})
length_meters = _polyline_length(coordinates)
if not _should_include_track(tags, length_meters):
continue
name = tags.get("name")
maxspeed = _parse_maxspeed(tags.get("maxspeed"))
status = _derive_status(tags.get("railway"))
is_bidirectional = not _is_oneway(tags.get("oneway"))
tracks.append(
{
"osmId": str(element.get("id")),
"name": str(name) if name else None,
"lengthMeters": length_meters,
"maxSpeedKph": maxspeed,
"status": status,
"isBidirectional": is_bidirectional,
"coordinates": coordinates,
}
)
return tracks
def _parse_maxspeed(value: Any) -> float | None:
if value is None:
return None
    # Overpass may return values such as "80", "80 km/h", or "signals".
    # Only the leading number is kept; units (including mph) are ignored.
if isinstance(value, (int, float)):
return float(value)
text = str(value).strip()
number = ""
for char in text:
if char.isdigit() or char == ".":
number += char
elif number:
break
try:
return float(number) if number else None
except ValueError:
return None
def _derive_status(value: Any) -> str:
tag = str(value or "").lower()
if tag in {"abandoned", "disused"}:
return tag
if tag in {"construction", "proposed"}:
return "construction"
return "operational"
def _should_include_track(tags: Mapping[str, Any], length_meters: float) -> bool:
railway = str(tags.get("railway", "")).lower()
if railway not in TRACK_ALLOWED_RAILWAY_TYPES:
return False
if length_meters < TRACK_MIN_LENGTH_METERS:
return False
service = str(tags.get("service", "")).lower()
if service and service in TRACK_EXCLUDED_SERVICE_TAGS:
return False
usage = str(tags.get("usage", "")).lower()
if usage and usage in TRACK_EXCLUDED_USAGE_TAGS:
return False
return True
def _is_oneway(value: Any) -> bool:
if value is None:
return False
normalized = str(value).strip().lower()
return normalized in {"yes", "true", "1"}
def _polyline_length(points: list[list[float]]) -> float:
if len(points) < 2:
return 0.0
total = 0.0
for index in range(len(points) - 1):
total += _haversine(points[index], points[index + 1])
return total
def _haversine(a: list[float], b: list[float]) -> float:
"""Return distance in meters between two [lat, lon] coordinates."""
lat1, lon1 = a
lat2, lon2 = b
radius = 6_371_000
phi1 = math.radians(lat1)
phi2 = math.radians(lat2)
delta_phi = math.radians(lat2 - lat1)
delta_lambda = math.radians(lon2 - lon1)
sin_dphi = math.sin(delta_phi / 2)
sin_dlambda = math.sin(delta_lambda / 2)
root = sin_dphi**2 + math.cos(phi1) * math.cos(phi2) * sin_dlambda**2
distance = 2 * radius * math.atan2(math.sqrt(root), math.sqrt(1 - root))
return distance
def main(argv: list[str] | None = None) -> int:
parser = build_argument_parser()
args = parser.parse_args(argv)
query = build_overpass_query(args.region)
if args.dry_run:
print(query)
return 0
output_path: Path = args.output
output_path.parent.mkdir(parents=True, exist_ok=True)
data = perform_request(query)
raw_elements = data.get("elements", [])
tracks = normalize_track_elements(raw_elements)
payload = {
"metadata": {
"endpoint": OVERPASS_ENDPOINT,
"region": args.region,
"filters": TRACK_TAG_FILTERS,
"regions": [asdict(region) for region in DEFAULT_REGIONS],
"raw_count": len(raw_elements),
"track_count": len(tracks),
},
"tracks": tracks,
}
with output_path.open("w", encoding="utf-8") as handle:
json.dump(payload, handle, indent=2)
print(
f"Normalized {len(tracks)} tracks from {len(raw_elements)} elements into {output_path}"
)
return 0
if __name__ == "__main__":
sys.exit(main())
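For illustration, the digit-scanning approach used by `_parse_maxspeed` above can be exercised standalone. This is a self-contained sketch (a reimplementation, not an import of the script) showing how mixed Overpass values collapse to a number or `None`:

```python
def parse_maxspeed(value):
    """Extract the leading numeric portion of an OSM maxspeed value."""
    if value is None:
        return None
    if isinstance(value, (int, float)):
        return float(value)
    number = ""
    for char in str(value).strip():
        if char.isdigit() or char == ".":
            number += char
        elif number:
            break  # stop at the first non-numeric character after the number
    try:
        return float(number) if number else None
    except ValueError:
        return None

print(parse_maxspeed("80"))       # 80.0
print(parse_maxspeed("80 km/h"))  # 80.0
print(parse_maxspeed("signals"))  # None
```

Note the caveat hinted at in the comment above: a value like "80 mph" would also yield 80.0, silently treated as km/h.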


@@ -0,0 +1,273 @@
"""CLI for loading normalized track JSON into the database."""
from __future__ import annotations
import argparse
import json
import math
import sys
from dataclasses import dataclass
from pathlib import Path
from typing import Any, Iterable, Mapping, Sequence
from geoalchemy2.elements import WKBElement, WKTElement
from geoalchemy2.shape import to_shape
from backend.app.core.osm_config import TRACK_STATION_SNAP_RADIUS_METERS
from backend.app.db.session import SessionLocal
from backend.app.models import TrackCreate
from backend.app.repositories import StationRepository, TrackRepository
@dataclass(slots=True)
class ParsedTrack:
coordinates: list[tuple[float, float]]
name: str | None = None
length_meters: float | None = None
max_speed_kph: float | None = None
status: str = "operational"
is_bidirectional: bool = True
@dataclass(slots=True)
class StationRef:
id: str
latitude: float
longitude: float
def build_argument_parser() -> argparse.ArgumentParser:
parser = argparse.ArgumentParser(
description="Load normalized track data into PostGIS",
)
parser.add_argument(
"input",
type=Path,
help="Path to the normalized track JSON file produced by tracks_import.py",
)
parser.add_argument(
"--commit",
dest="commit",
action="store_true",
default=True,
help="Commit the transaction after loading (default).",
)
parser.add_argument(
"--no-commit",
dest="commit",
action="store_false",
help="Rollback the transaction after loading (useful for dry runs).",
)
return parser
def main(argv: list[str] | None = None) -> int:
parser = build_argument_parser()
args = parser.parse_args(argv)
if not args.input.exists():
parser.error(f"Input file {args.input} does not exist")
with args.input.open("r", encoding="utf-8") as handle:
payload = json.load(handle)
track_entries = payload.get("tracks") or []
if not isinstance(track_entries, list):
parser.error("Invalid payload: 'tracks' must be a list")
try:
tracks = _parse_track_entries(track_entries)
except ValueError as exc:
parser.error(str(exc))
created = load_tracks(tracks, commit=args.commit)
print(f"Loaded {created} tracks from {args.input}")
return 0
def _parse_track_entries(entries: Iterable[Mapping[str, Any]]) -> list[ParsedTrack]:
parsed: list[ParsedTrack] = []
for entry in entries:
coordinates = entry.get("coordinates")
if not isinstance(coordinates, Sequence) or len(coordinates) < 2:
raise ValueError(
"Invalid track entry: 'coordinates' must contain at least two points")
processed_coordinates: list[tuple[float, float]] = []
for pair in coordinates:
if not isinstance(pair, Sequence) or len(pair) != 2:
raise ValueError(
f"Invalid coordinate pair {pair!r} in track entry")
lat, lon = pair
processed_coordinates.append((float(lat), float(lon)))
name = entry.get("name")
length = _safe_float(entry.get("lengthMeters"))
max_speed = _safe_float(entry.get("maxSpeedKph"))
status = entry.get("status", "operational")
is_bidirectional = entry.get("isBidirectional", True)
parsed.append(
ParsedTrack(
coordinates=processed_coordinates,
name=str(name) if name else None,
length_meters=length,
max_speed_kph=max_speed,
status=str(status) if status else "operational",
is_bidirectional=bool(is_bidirectional),
)
)
return parsed
def load_tracks(tracks: Iterable[ParsedTrack], commit: bool = True) -> int:
created = 0
with SessionLocal() as session:
station_repo = StationRepository(session)
track_repo = TrackRepository(session)
station_index = _build_station_index(station_repo.list_active())
existing_pairs = {
(str(track.start_station_id), str(track.end_station_id))
for track in track_repo.list_all()
}
for track_data in tracks:
start_station = _nearest_station(
track_data.coordinates[0],
station_index,
TRACK_STATION_SNAP_RADIUS_METERS,
)
end_station = _nearest_station(
track_data.coordinates[-1],
station_index,
TRACK_STATION_SNAP_RADIUS_METERS,
)
if not start_station or not end_station:
continue
if start_station.id == end_station.id:
continue
pair = (start_station.id, end_station.id)
if pair in existing_pairs:
continue
length = track_data.length_meters or _polyline_length(
track_data.coordinates)
max_speed = (
int(round(track_data.max_speed_kph))
if track_data.max_speed_kph is not None
else None
)
create_schema = TrackCreate(
name=track_data.name,
start_station_id=start_station.id,
end_station_id=end_station.id,
coordinates=track_data.coordinates,
length_meters=length,
max_speed_kph=max_speed,
status=track_data.status,
is_bidirectional=track_data.is_bidirectional,
)
track_repo.create(create_schema)
existing_pairs.add(pair)
created += 1
if commit:
session.commit()
else:
session.rollback()
return created
def _nearest_station(
coordinate: tuple[float, float],
stations: Sequence[StationRef],
max_distance_meters: float,
) -> StationRef | None:
best_station: StationRef | None = None
best_distance = math.inf
for station in stations:
distance = _haversine(
coordinate, (station.latitude, station.longitude))
if distance < best_distance:
best_station = station
best_distance = distance
if best_distance <= max_distance_meters:
return best_station
return None
def _build_station_index(stations: Iterable[Any]) -> list[StationRef]:
index: list[StationRef] = []
for station in stations:
location = getattr(station, "location", None)
if location is None:
continue
point = _to_point(location)
if point is None:
continue
latitude = getattr(point, "y", None)
longitude = getattr(point, "x", None)
if latitude is None or longitude is None:
continue
index.append(
StationRef(
id=str(station.id),
latitude=float(latitude),
longitude=float(longitude),
)
)
return index
def _to_point(geometry: WKBElement | WKTElement | Any):
try:
point = to_shape(geometry)
return point if getattr(point, "geom_type", None) == "Point" else None
except Exception: # pragma: no cover - defensive, should not happen with valid geometry
return None
def _polyline_length(points: Sequence[tuple[float, float]]) -> float:
if len(points) < 2:
return 0.0
total = 0.0
for index in range(len(points) - 1):
total += _haversine(points[index], points[index + 1])
return total
def _haversine(a: tuple[float, float], b: tuple[float, float]) -> float:
lat1, lon1 = a
lat2, lon2 = b
radius = 6_371_000
phi1 = math.radians(lat1)
phi2 = math.radians(lat2)
delta_phi = math.radians(lat2 - lat1)
delta_lambda = math.radians(lon2 - lon1)
sin_dphi = math.sin(delta_phi / 2)
sin_dlambda = math.sin(delta_lambda / 2)
root = sin_dphi**2 + math.cos(phi1) * math.cos(phi2) * sin_dlambda**2
distance = 2 * radius * math.atan2(math.sqrt(root), math.sqrt(1 - root))
return distance
def _safe_float(value: Any) -> float | None:
if value is None or value == "":
return None
try:
return float(value)
except (TypeError, ValueError):
return None
if __name__ == "__main__":
sys.exit(main())
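The station-snapping step in `load_tracks` above reduces to a haversine scan over known stations with a cutoff radius. A minimal self-contained sketch (the station data and the 350 m radius here are illustrative; the real script reads the radius from `TRACK_STATION_SNAP_RADIUS_METERS`):

```python
import math

def haversine(a, b):
    """Great-circle distance in meters between two (lat, lon) pairs."""
    lat1, lon1 = a
    lat2, lon2 = b
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    root = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * 6_371_000 * math.atan2(math.sqrt(root), math.sqrt(1 - root))

def nearest_station(point, stations, max_distance_m=350.0):
    """Return (station_id, distance) for the closest station, or None if all are out of range."""
    best = min(stations, key=lambda s: haversine(point, s[1]))  # assumes a non-empty station list
    distance = haversine(point, best[1])
    return (best[0], distance) if distance <= max_distance_m else None

stations = [("hbf", (52.5251, 13.3694)), ("ostbahnhof", (52.5103, 13.4349))]
match = nearest_station((52.5250, 13.3690), stations)
# match[0] == "hbf"; a track endpoint kilometers from any station yields None
```

Track segments whose endpoints fall outside the radius are silently skipped during loading, which is why the loader can report fewer tracks than the import produced.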


@@ -29,11 +29,15 @@ def test_track_model_properties() -> None:
        end_station_id="station-2",
        length_meters=1500.0,
        max_speed_kph=120.0,
        status="operational",
        is_bidirectional=True,
        coordinates=[(52.52, 13.405), (52.6, 13.5)],
        created_at=timestamp,
        updated_at=timestamp,
    )
    assert track.length_meters > 0
    assert track.start_station_id != track.end_station_id
    assert len(track.coordinates) == 2


def test_train_model_operating_tracks() -> None:


@@ -26,6 +26,9 @@ def sample_entities() -> dict[str, SimpleNamespace]:
        end_station_id=station.id,
        length_meters=1234.5,
        max_speed_kph=160,
        status="operational",
        is_bidirectional=True,
        track_geometry=None,
        created_at=timestamp,
        updated_at=timestamp,
    )
@@ -47,7 +50,8 @@ def test_network_snapshot_prefers_repository_data(
    track = sample_entities["track"]
    train = sample_entities["train"]
    monkeypatch.setattr(StationRepository, "list_active",
                        lambda self: [station])
    monkeypatch.setattr(TrackRepository, "list_all", lambda self: [track])
    monkeypatch.setattr(TrainRepository, "list_all", lambda self: [train])
@@ -55,7 +59,8 @@ def test_network_snapshot_prefers_repository_data(
    assert snapshot["stations"]
    assert snapshot["stations"][0]["name"] == station.name
    assert snapshot["tracks"][0]["lengthMeters"] == pytest.approx(
        track.length_meters)
    assert snapshot["trains"][0]["designation"] == train.designation
    assert snapshot["trains"][0]["operatingTrackIds"] == []
@@ -71,4 +76,5 @@ def test_network_snapshot_falls_back_when_repositories_empty(
    assert snapshot["stations"]
    assert snapshot["trains"]
    assert any(station["name"] ==
               "Central" for station in snapshot["stations"])


@@ -0,0 +1,158 @@
from __future__ import annotations
from argparse import Namespace
from pathlib import Path
import pytest
from backend.scripts import osm_refresh
def _namespace(output_dir: Path, **overrides: object) -> Namespace:
defaults: dict[str, object] = {
"region": "all",
"output_dir": output_dir,
"stations_json": None,
"tracks_json": None,
"skip_station_import": False,
"skip_station_load": False,
"skip_track_import": False,
"skip_track_load": False,
"dry_run": False,
"commit": True,
}
defaults.update(overrides)
return Namespace(**defaults)
def test_build_stage_plan_default_sequence(tmp_path: Path) -> None:
stages = osm_refresh._build_stage_plan(_namespace(tmp_path))
labels = [stage.label for stage in stages]
assert labels == [
"Import stations",
"Load stations",
"Import tracks",
"Load tracks",
]
expected_station_path = tmp_path / "osm_stations.json"
expected_track_path = tmp_path / "osm_tracks.json"
assert stages[0].output_path == expected_station_path
assert stages[1].input_path == expected_station_path
assert stages[2].output_path == expected_track_path
assert stages[3].input_path == expected_track_path
def test_build_stage_plan_respects_skip_flags(tmp_path: Path) -> None:
stages = osm_refresh._build_stage_plan(
_namespace(
tmp_path,
skip_station_import=True,
skip_track_import=True,
)
)
labels = [stage.label for stage in stages]
assert labels == ["Load stations", "Load tracks"]
def test_main_dry_run_lists_plan(monkeypatch: pytest.MonkeyPatch, tmp_path: Path, capsys: pytest.CaptureFixture[str]) -> None:
def fail(_args: list[str] | None) -> int: # pragma: no cover - defensive
raise AssertionError("runner should not be invoked during dry run")
monkeypatch.setattr(osm_refresh.stations_import, "main", fail)
monkeypatch.setattr(osm_refresh.tracks_import, "main", fail)
monkeypatch.setattr(osm_refresh.stations_load, "main", fail)
monkeypatch.setattr(osm_refresh.tracks_load, "main", fail)
exit_code = osm_refresh.main(["--dry-run", "--output-dir", str(tmp_path)])
assert exit_code == 0
captured = capsys.readouterr().out
assert "Dry run" in captured
assert "Import stations" in captured
assert "Load tracks" in captured
def test_main_executes_stages_in_order(monkeypatch: pytest.MonkeyPatch, tmp_path: Path) -> None:
calls: list[str] = []
def make_import(name: str):
def runner(args: list[str] | None) -> int:
assert args is not None
calls.append(name)
output_index = args.index("--output") + 1
output_path = Path(args[output_index])
output_path.write_text("{}", encoding="utf-8")
return 0
return runner
def make_load(name: str):
def runner(args: list[str] | None) -> int:
assert args is not None
calls.append(name)
return 0
return runner
monkeypatch.setattr(osm_refresh.stations_import, "main",
make_import("stations_import"))
monkeypatch.setattr(osm_refresh.tracks_import, "main",
make_import("tracks_import"))
monkeypatch.setattr(osm_refresh.stations_load, "main",
make_load("stations_load"))
monkeypatch.setattr(osm_refresh.tracks_load, "main",
make_load("tracks_load"))
exit_code = osm_refresh.main(["--output-dir", str(tmp_path)])
assert exit_code == 0
assert calls == [
"stations_import",
"stations_load",
"tracks_import",
"tracks_load",
]
def test_main_skip_import_flags(monkeypatch: pytest.MonkeyPatch, tmp_path: Path) -> None:
station_json = tmp_path / "stations.json"
station_json.write_text("{}", encoding="utf-8")
track_json = tmp_path / "tracks.json"
track_json.write_text("{}", encoding="utf-8")
def fail(_args: list[str] | None) -> int: # pragma: no cover - defensive
raise AssertionError("import stage should be skipped")
calls: list[str] = []
def record(name: str):
def runner(args: list[str] | None) -> int:
assert args is not None
calls.append(name)
return 0
return runner
monkeypatch.setattr(osm_refresh.stations_import, "main", fail)
monkeypatch.setattr(osm_refresh.tracks_import, "main", fail)
monkeypatch.setattr(osm_refresh.stations_load,
"main", record("stations_load"))
monkeypatch.setattr(osm_refresh.tracks_load, "main", record("tracks_load"))
exit_code = osm_refresh.main(
[
"--skip-station-import",
"--skip-track-import",
"--stations-json",
str(station_json),
"--tracks-json",
str(track_json),
]
)
assert exit_code == 0
assert calls == ["stations_load", "tracks_load"]


@@ -0,0 +1,110 @@
from __future__ import annotations
from backend.scripts import tracks_import
def test_normalize_track_elements_excludes_invalid_geometries() -> None:
elements = [
{
"type": "way",
"id": 123,
"geometry": [
{"lat": 52.5, "lon": 13.4},
{"lat": 52.6, "lon": 13.5},
],
"tags": {
"name": "Main Line",
"railway": "rail",
"maxspeed": "120",
},
},
{
"type": "way",
"id": 456,
"geometry": [
{"lat": 51.0},
],
"tags": {"railway": "rail"},
},
{
"type": "node",
"id": 789,
},
]
tracks = tracks_import.normalize_track_elements(elements)
assert len(tracks) == 1
track = tracks[0]
assert track["osmId"] == "123"
assert track["name"] == "Main Line"
assert track["maxSpeedKph"] == 120.0
assert track["status"] == "operational"
assert track["isBidirectional"] is True
assert track["coordinates"] == [[52.5, 13.4], [52.6, 13.5]]
assert track["lengthMeters"] > 0
def test_normalize_track_elements_marks_oneway_and_status() -> None:
elements = [
{
"type": "way",
"id": 42,
"geometry": [
{"lat": 48.1, "lon": 11.5},
{"lat": 48.2, "lon": 11.6},
],
"tags": {
"railway": "disused",
"oneway": "yes",
},
}
]
tracks = tracks_import.normalize_track_elements(elements)
assert len(tracks) == 1
track = tracks[0]
assert track["status"] == "disused"
assert track["isBidirectional"] is False
def test_normalize_track_elements_skips_service_tracks() -> None:
elements = [
{
"type": "way",
"id": 77,
"geometry": [
{"lat": 52.5000, "lon": 13.4000},
{"lat": 52.5010, "lon": 13.4010},
],
"tags": {
"railway": "rail",
"service": "yard",
},
}
]
tracks = tracks_import.normalize_track_elements(elements)
assert tracks == []
def test_normalize_track_elements_skips_short_tracks() -> None:
elements = [
{
"type": "way",
"id": 81,
"geometry": [
{"lat": 52.500000, "lon": 13.400000},
{"lat": 52.500100, "lon": 13.400050},
],
"tags": {
"railway": "rail",
},
}
]
tracks = tracks_import.normalize_track_elements(elements)
assert tracks == []


@@ -0,0 +1,200 @@
from __future__ import annotations
from dataclasses import dataclass, field
from typing import List
import pytest
from geoalchemy2.shape import from_shape
from shapely.geometry import Point
from backend.scripts import tracks_load
def test_parse_track_entries_returns_models() -> None:
entries = [
{
"name": "Connector",
"coordinates": [[52.5, 13.4], [52.6, 13.5]],
"lengthMeters": 1500,
"maxSpeedKph": 120,
"status": "operational",
"isBidirectional": True,
}
]
parsed = tracks_load._parse_track_entries(entries)
assert parsed[0].name == "Connector"
assert parsed[0].coordinates[0] == (52.5, 13.4)
assert parsed[0].length_meters == 1500
assert parsed[0].max_speed_kph == 120
def test_parse_track_entries_invalid_raises_value_error() -> None:
entries = [
{
"coordinates": [[52.5, 13.4]],
}
]
with pytest.raises(ValueError):
tracks_load._parse_track_entries(entries)
@dataclass
class DummySession:
committed: bool = False
rolled_back: bool = False
def __enter__(self) -> "DummySession":
return self
def __exit__(self, exc_type, exc, traceback) -> None:
pass
def commit(self) -> None:
self.committed = True
def rollback(self) -> None:
self.rolled_back = True
@dataclass
class DummyStation:
id: str
location: object
@dataclass
class DummyStationRepository:
session: DummySession
stations: List[DummyStation]
def list_active(self) -> List[DummyStation]:
return self.stations
@dataclass
class DummyTrackRepository:
session: DummySession
created: list = field(default_factory=list)
existing: list = field(default_factory=list)
def list_all(self):
return self.existing
def create(self, data): # pragma: no cover - simple delegation
self.created.append(data)
def _point(lat: float, lon: float) -> object:
return from_shape(Point(lon, lat), srid=4326)
def test_load_tracks_creates_entries(monkeypatch: pytest.MonkeyPatch) -> None:
session_instance = DummySession()
station_repo_instance = DummyStationRepository(
session_instance,
stations=[
DummyStation(id="station-a", location=_point(52.5, 13.4)),
DummyStation(id="station-b", location=_point(52.6, 13.5)),
],
)
track_repo_instance = DummyTrackRepository(session_instance)
monkeypatch.setattr(tracks_load, "SessionLocal", lambda: session_instance)
monkeypatch.setattr(tracks_load, "StationRepository",
lambda session: station_repo_instance)
monkeypatch.setattr(tracks_load, "TrackRepository",
lambda session: track_repo_instance)
parsed = tracks_load._parse_track_entries(
[
{
"name": "Connector",
"coordinates": [[52.5, 13.4], [52.6, 13.5]],
}
]
)
created = tracks_load.load_tracks(parsed, commit=True)
assert created == 1
assert session_instance.committed is True
assert track_repo_instance.created
track = track_repo_instance.created[0]
assert track.start_station_id == "station-a"
assert track.end_station_id == "station-b"
assert track.coordinates == [(52.5, 13.4), (52.6, 13.5)]
def test_load_tracks_skips_existing_pairs(monkeypatch: pytest.MonkeyPatch) -> None:
session_instance = DummySession()
station_repo_instance = DummyStationRepository(
session_instance,
stations=[
DummyStation(id="station-a", location=_point(52.5, 13.4)),
DummyStation(id="station-b", location=_point(52.6, 13.5)),
],
)
existing_track = type("ExistingTrack", (), {
"start_station_id": "station-a",
"end_station_id": "station-b",
})
track_repo_instance = DummyTrackRepository(
session_instance,
existing=[existing_track],
)
monkeypatch.setattr(tracks_load, "SessionLocal", lambda: session_instance)
monkeypatch.setattr(tracks_load, "StationRepository",
lambda session: station_repo_instance)
monkeypatch.setattr(tracks_load, "TrackRepository",
lambda session: track_repo_instance)
parsed = tracks_load._parse_track_entries(
[
{
"name": "Connector",
"coordinates": [[52.5, 13.4], [52.6, 13.5]],
}
]
)
created = tracks_load.load_tracks(parsed, commit=False)
assert created == 0
assert session_instance.rolled_back is True
assert not track_repo_instance.created
def test_load_tracks_skips_when_station_too_far(monkeypatch: pytest.MonkeyPatch) -> None:
session_instance = DummySession()
station_repo_instance = DummyStationRepository(
session_instance,
stations=[
DummyStation(id="remote-station", location=_point(53.5, 14.5)),
],
)
track_repo_instance = DummyTrackRepository(session_instance)
monkeypatch.setattr(tracks_load, "SessionLocal", lambda: session_instance)
monkeypatch.setattr(tracks_load, "StationRepository",
lambda session: station_repo_instance)
monkeypatch.setattr(tracks_load, "TrackRepository",
lambda session: track_repo_instance)
parsed = tracks_load._parse_track_entries(
[
{
"name": "Isolated Segment",
"coordinates": [[52.5, 13.4], [52.51, 13.41]],
}
]
)
created = tracks_load.load_tracks(parsed, commit=True)
assert created == 0
assert session_instance.committed is True
assert not track_repo_instance.created

data/osm_stations.json (new file, 9782 lines)

File diff suppressed because it is too large

data/osm_tracks.json (new file, 527625 lines)

File diff suppressed because it is too large


@@ -1,5 +1,3 @@
version: "3.9"
services:
  db:
    build:
@@ -27,6 +25,7 @@ services:
      DATABASE_URL: postgresql+psycopg://railgame:railgame@db:5432/railgame_dev
      TEST_DATABASE_URL: postgresql+psycopg://railgame:railgame@db:5432/railgame_test
      REDIS_URL: redis://redis:6379/0
      INIT_DEMO_DB: "true"
    depends_on:
      - db
      - redis


@@ -111,6 +111,7 @@ graph TD
- **Health Module**: Lightweight readiness probes used by infrastructure checks.
- **Network Module**: Serves read-only snapshots of stations, tracks, and trains using shared domain models (camelCase aliases for client compatibility).
- **OSM Ingestion CLI**: Script pairings (`stations_import`/`stations_load`, `tracks_import`/`tracks_load`) that harvest OpenStreetMap fixtures and persist normalized station and track geometries into PostGIS.
- **Authentication Module**: JWT-based user registration, authentication, and authorization. The current prototype supports on-the-fly account creation backed by an in-memory user store and issues short-lived access tokens to validate the client flow end-to-end.
- **Railway Calculation Module**: Algorithms for route optimization and scheduling (planned).
- **Resource Management Module**: Logic for game economy and progression (planned).
@@ -157,4 +158,3 @@
```
Shared code that spans application layers should be surfaced through well-defined APIs within `backend/app/services` or exposed via frontend data contracts to keep coupling low. Infrastructure automation and CI/CD assets remain isolated under `infra/` to support multiple deployment targets.


@@ -78,7 +78,31 @@ sequenceDiagram
    F->>F: Render Leaflet map and snapshot summaries
```

#### 6.2.4 OSM Track Import and Load

**Scenario**: Operator refreshes spatial fixtures by harvesting OSM railways and persisting them to PostGIS.

**Description**: The paired CLI scripts `tracks_import.py` and `tracks_load.py` fetch candidate track segments from the Overpass API, snap segment endpoints to the nearest known stations, and store the resulting LINESTRING geometries. Dry-run flags allow inspecting the generated Overpass query or the pending database mutations before committing.

```mermaid
sequenceDiagram
    participant Ops as Operator
    participant TI as tracks_import.py
    participant OL as Overpass API
    participant TL as tracks_load.py
    participant DB as PostGIS
    Ops->>TI: Invoke with region + output path
    TI->>OL: POST compiled Overpass query
    OL-->>TI: Return rail way elements (JSON)
    TI-->>Ops: Write normalized tracks JSON
    Ops->>TL: Invoke with normalized JSON
    TL->>DB: Fetch stations + existing tracks
    TL->>DB: Insert snapped LINESTRING geometries
    TL-->>Ops: Report committed track count
```
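The normalization stage shown in the diagram flattens a way's inline geometry and derives status and direction from its tags. A simplified self-contained sketch (the real `tracks_import.py` additionally computes the segment length, applies the harvesting filter policy, and rejects ways containing any incomplete node):

```python
def normalize_way(element):
    """Flatten an Overpass 'way' element into a track payload, or None if unusable."""
    if element.get("type") != "way":
        return None
    # Keep only nodes that carry both coordinates (simplification of the real script).
    coords = [[n["lat"], n["lon"]] for n in element.get("geometry", [])
              if "lat" in n and "lon" in n]
    if len(coords) < 2:
        return None
    tags = element.get("tags", {})
    railway = str(tags.get("railway", "")).lower()
    return {
        "osmId": str(element["id"]),
        "name": tags.get("name"),
        "status": railway if railway in {"abandoned", "disused"} else "operational",
        "isBidirectional": str(tags.get("oneway", "")).lower() not in {"yes", "true", "1"},
        "coordinates": coords,
    }

way = {"type": "way", "id": 123,
       "geometry": [{"lat": 52.5, "lon": 13.4}, {"lat": 52.6, "lon": 13.5}],
       "tags": {"railway": "rail", "name": "Main Line"}}
track = normalize_way(way)
# track["osmId"] == "123"; track["isBidirectional"] is True
```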
#### 6.2.5 Building Railway Network

**Scenario**: User adds a new track segment to their railway network.
@@ -101,7 +125,7 @@
    F->>F: Update map display
```

#### 6.2.6 Running Train Simulation

**Scenario**: User starts a train simulation on their network.
@@ -129,7 +153,7 @@
    end
```

#### 6.2.7 Saving Game Progress

**Scenario**: User saves their current game state.
@@ -154,4 +178,3 @@ sequenceDiagram
- **Real-time Updates**: WebSocket connections for simulation updates, with fallback to polling
- **Load Balancing**: Backend API can be scaled horizontally for multiple users
- **CDN**: Static assets and map tiles served via CDN for faster loading


@@ -55,6 +55,13 @@ Dynamic simulation of train operations:
- **Fallback Mechanisms**: Polling as alternative when WebSockets unavailable
- **Event-Driven Updates**: Push notifications for game state changes

#### 8.2.4 OSM Track Harvesting Policy

- **Railway Types**: The importer requests `rail`, `light_rail`, `subway`, `tram`, `narrow_gauge`, plus `construction` and `disused` variants to capture build-state metadata.
- **Service Filters**: Ways with `service` tags such as `yard`, `siding`, `spur`, `crossover`, `industrial`, or `military` are excluded to focus on mainline traffic.
- **Usage Filters**: Ways flagged with `usage=military` or `usage=tourism` are skipped; ways without a `usage` tag are accepted.
- **Geometry Guardrails**: Segments shorter than 75 meters are discarded. During loading, each track endpoint must snap to an existing station within 350 meters; otherwise the segment is ignored.
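Taken together, the bullets above amount to a single include/exclude predicate. A sketch with the thresholds inlined for readability (the actual importer defines these as module constants such as `TRACK_ALLOWED_RAILWAY_TYPES` and `TRACK_EXCLUDED_SERVICE_TAGS`):

```python
ALLOWED_RAILWAY = {"rail", "light_rail", "subway", "tram", "narrow_gauge",
                   "construction", "disused"}
EXCLUDED_SERVICE = {"yard", "siding", "spur", "crossover", "industrial", "military"}
EXCLUDED_USAGE = {"military", "tourism"}
MIN_LENGTH_M = 75.0

def should_include(tags, length_meters):
    """Apply the harvesting policy to one way's tags and computed length."""
    if str(tags.get("railway", "")).lower() not in ALLOWED_RAILWAY:
        return False  # not a requested railway type
    if length_meters < MIN_LENGTH_M:
        return False  # too short to be a useful segment
    if str(tags.get("service", "")).lower() in EXCLUDED_SERVICE:
        return False  # yard/siding/etc., not mainline traffic
    if str(tags.get("usage", "")).lower() in EXCLUDED_USAGE:
        return False  # military or tourism usage
    return True  # unspecified service/usage tags are accepted

print(should_include({"railway": "rail"}, 500.0))                     # True
print(should_include({"railway": "rail", "service": "yard"}, 500.0))  # False
print(should_include({"railway": "rail"}, 40.0))                      # False
```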
### 8.3 User Interface Concepts

#### 8.3.1 Component-Based Architecture
@@ -127,4 +134,3 @@ Dynamic simulation of train operations:
- **Lazy Loading**: On-demand loading of components and data
- **Caching Layers**: Redis for frequently accessed data
- **Asset Optimization**: Minification and compression of static resources


@@ -100,6 +100,7 @@ The system interacts with:
- Browser-native implementation for broad accessibility
- Spatial database for efficient geographical queries
- Offline-friendly OSM ingestion pipeline that uses dedicated CLI scripts to export/import stations and tracks before seeding the database
- Modular architecture allowing for future extensions (e.g., multiplayer)

## 5. Building Block View


@@ -30,7 +30,8 @@
"eslint-plugin-react-hooks": "^4.6.2", "eslint-plugin-react-hooks": "^4.6.2",
"prettier": "^3.3.3", "prettier": "^3.3.3",
"typescript": "^5.5.3", "typescript": "^5.5.3",
"vite": "^5.4.0" "vite": "^5.4.0",
"vitest": "^1.6.0"
} }
}, },
"node_modules/@babel/code-frame": { "node_modules/@babel/code-frame": {
@@ -885,6 +886,19 @@
"dev": true, "dev": true,
"license": "BSD-3-Clause" "license": "BSD-3-Clause"
}, },
"node_modules/@jest/schemas": {
"version": "29.6.3",
"resolved": "https://registry.npmjs.org/@jest/schemas/-/schemas-29.6.3.tgz",
"integrity": "sha512-mo5j5X+jIZmJQveBKeS/clAueipV7KgiX1vMgCxam1RNYiqE1w62n0/tJJnHtjW8ZHcQco5gY85jA3mi0L+nSA==",
"dev": true,
"license": "MIT",
"dependencies": {
"@sinclair/typebox": "^0.27.8"
},
"engines": {
"node": "^14.15.0 || ^16.10.0 || >=18.0.0"
}
},
"node_modules/@jridgewell/gen-mapping": { "node_modules/@jridgewell/gen-mapping": {
"version": "0.3.13", "version": "0.3.13",
"resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.13.tgz", "resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.13.tgz",
@@ -1322,6 +1336,13 @@
"dev": true,
"license": "MIT"
},
"node_modules/@sinclair/typebox": {
"version": "0.27.8",
"resolved": "https://registry.npmjs.org/@sinclair/typebox/-/typebox-0.27.8.tgz",
"integrity": "sha512-+Fj43pSMwJs4KRrH/938Uf+uAELIgVBmQzg/q1YG10djyfA3TnrU8N8XzqCh/okZdszqBQTZf96idMfE5lnwTA==",
"dev": true,
"license": "MIT"
},
"node_modules/@types/babel__core": {
"version": "7.20.5",
"resolved": "https://registry.npmjs.org/@types/babel__core/-/babel__core-7.20.5.tgz",
@@ -1699,6 +1720,109 @@
"vite": "^4.2.0 || ^5.0.0 || ^6.0.0 || ^7.0.0"
}
},
"node_modules/@vitest/expect": {
"version": "1.6.1",
"resolved": "https://registry.npmjs.org/@vitest/expect/-/expect-1.6.1.tgz",
"integrity": "sha512-jXL+9+ZNIJKruofqXuuTClf44eSpcHlgj3CiuNihUF3Ioujtmc0zIa3UJOW5RjDK1YLBJZnWBlPuqhYycLioog==",
"dev": true,
"license": "MIT",
"dependencies": {
"@vitest/spy": "1.6.1",
"@vitest/utils": "1.6.1",
"chai": "^4.3.10"
},
"funding": {
"url": "https://opencollective.com/vitest"
}
},
"node_modules/@vitest/runner": {
"version": "1.6.1",
"resolved": "https://registry.npmjs.org/@vitest/runner/-/runner-1.6.1.tgz",
"integrity": "sha512-3nSnYXkVkf3mXFfE7vVyPmi3Sazhb/2cfZGGs0JRzFsPFvAMBEcrweV1V1GsrstdXeKCTXlJbvnQwGWgEIHmOA==",
"dev": true,
"license": "MIT",
"dependencies": {
"@vitest/utils": "1.6.1",
"p-limit": "^5.0.0",
"pathe": "^1.1.1"
},
"funding": {
"url": "https://opencollective.com/vitest"
}
},
"node_modules/@vitest/runner/node_modules/p-limit": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/p-limit/-/p-limit-5.0.0.tgz",
"integrity": "sha512-/Eaoq+QyLSiXQ4lyYV23f14mZRQcXnxfHrN0vCai+ak9G0pp9iEQukIIZq5NccEvwRB8PUnZT0KsOoDCINS1qQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"yocto-queue": "^1.0.0"
},
"engines": {
"node": ">=18"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/@vitest/runner/node_modules/yocto-queue": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/yocto-queue/-/yocto-queue-1.2.1.tgz",
"integrity": "sha512-AyeEbWOu/TAXdxlV9wmGcR0+yh2j3vYPGOECcIj2S7MkrLyC7ne+oye2BKTItt0ii2PHk4cDy+95+LshzbXnGg==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12.20"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/@vitest/snapshot": {
"version": "1.6.1",
"resolved": "https://registry.npmjs.org/@vitest/snapshot/-/snapshot-1.6.1.tgz",
"integrity": "sha512-WvidQuWAzU2p95u8GAKlRMqMyN1yOJkGHnx3M1PL9Raf7AQ1kwLKg04ADlCa3+OXUZE7BceOhVZiuWAbzCKcUQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"magic-string": "^0.30.5",
"pathe": "^1.1.1",
"pretty-format": "^29.7.0"
},
"funding": {
"url": "https://opencollective.com/vitest"
}
},
"node_modules/@vitest/spy": {
"version": "1.6.1",
"resolved": "https://registry.npmjs.org/@vitest/spy/-/spy-1.6.1.tgz",
"integrity": "sha512-MGcMmpGkZebsMZhbQKkAf9CX5zGvjkBTqf8Zx3ApYWXr3wG+QvEu2eXWfnIIWYSJExIp4V9FCKDEeygzkYrXMw==",
"dev": true,
"license": "MIT",
"dependencies": {
"tinyspy": "^2.2.0"
},
"funding": {
"url": "https://opencollective.com/vitest"
}
},
"node_modules/@vitest/utils": {
"version": "1.6.1",
"resolved": "https://registry.npmjs.org/@vitest/utils/-/utils-1.6.1.tgz",
"integrity": "sha512-jOrrUvXM4Av9ZWiG1EajNto0u96kWAhJ1LmPmJhXXQx/32MecEKd10pOLYgS2BQx1TgkGhloPU1ArDW2vvaY6g==",
"dev": true,
"license": "MIT",
"dependencies": {
"diff-sequences": "^29.6.3",
"estree-walker": "^3.0.3",
"loupe": "^2.3.7",
"pretty-format": "^29.7.0"
},
"funding": {
"url": "https://opencollective.com/vitest"
}
},
"node_modules/acorn": {
"version": "8.15.0",
"resolved": "https://registry.npmjs.org/acorn/-/acorn-8.15.0.tgz",
@@ -1722,6 +1846,19 @@
"acorn": "^6.0.0 || ^7.0.0 || ^8.0.0"
}
},
"node_modules/acorn-walk": {
"version": "8.3.4",
"resolved": "https://registry.npmjs.org/acorn-walk/-/acorn-walk-8.3.4.tgz",
"integrity": "sha512-ueEepnujpqee2o5aIYnvHU6C0A42MNdsIDeqy5BydrkuC5R1ZuUFnm27EeFJGoEHJQgn3uleRvmTXaJgfXbt4g==",
"dev": true,
"license": "MIT",
"dependencies": {
"acorn": "^8.11.0"
},
"engines": {
"node": ">=0.4.0"
}
},
"node_modules/ajv": {
"version": "6.12.6",
"resolved": "https://registry.npmjs.org/ajv/-/ajv-6.12.6.tgz",
@@ -1942,6 +2079,16 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/assertion-error": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/assertion-error/-/assertion-error-1.1.0.tgz",
"integrity": "sha512-jgsaNduz+ndvGyFt3uSuWqvy4lCnIJiovtouQN5JZHOKCS2QuhEdbcQHFhVksz2N2U9hXJo8odG7ETyWlEeuDw==",
"dev": true,
"license": "MIT",
"engines": {
"node": "*"
}
},
"node_modules/ast-types-flow": {
"version": "0.0.8",
"resolved": "https://registry.npmjs.org/ast-types-flow/-/ast-types-flow-0.0.8.tgz",
@@ -2069,6 +2216,16 @@
"node": "^6 || ^7 || ^8 || ^9 || ^10 || ^11 || ^12 || >=13.7"
}
},
"node_modules/cac": {
"version": "6.7.14",
"resolved": "https://registry.npmjs.org/cac/-/cac-6.7.14.tgz",
"integrity": "sha512-b6Ilus+c3RrdDk+JhLKUAQfzzgLEPy6wcXqS7f/xe1EETvsDP6GORG7SFuOs6cID5YkqchW/LXZbX5bc8j7ZcQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=8"
}
},
"node_modules/call-bind": {
"version": "1.0.8",
"resolved": "https://registry.npmjs.org/call-bind/-/call-bind-1.0.8.tgz",
@@ -2150,6 +2307,25 @@
],
"license": "CC-BY-4.0"
},
"node_modules/chai": {
"version": "4.5.0",
"resolved": "https://registry.npmjs.org/chai/-/chai-4.5.0.tgz",
"integrity": "sha512-RITGBfijLkBddZvnn8jdqoTypxvqbOLYQkGGxXzeFjVHvudaPw0HNFD9x928/eUwYWd2dPCugVqspGALTZZQKw==",
"dev": true,
"license": "MIT",
"dependencies": {
"assertion-error": "^1.1.0",
"check-error": "^1.0.3",
"deep-eql": "^4.1.3",
"get-func-name": "^2.0.2",
"loupe": "^2.3.6",
"pathval": "^1.1.1",
"type-detect": "^4.1.0"
},
"engines": {
"node": ">=4"
}
},
"node_modules/chalk": {
"version": "4.1.2",
"resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz",
@@ -2167,6 +2343,19 @@
"url": "https://github.com/chalk/chalk?sponsor=1"
}
},
"node_modules/check-error": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/check-error/-/check-error-1.0.3.tgz",
"integrity": "sha512-iKEoDYaRmd1mxM90a2OEfWhjsjPpYPuQ+lMYsoxB126+t8fw7ySEO48nmDg5COTjxDI65/Y2OWpeEHk3ZOe8zg==",
"dev": true,
"license": "MIT",
"dependencies": {
"get-func-name": "^2.0.2"
},
"engines": {
"node": "*"
}
},
"node_modules/color-convert": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
@@ -2194,6 +2383,13 @@
"dev": true,
"license": "MIT"
},
"node_modules/confbox": {
"version": "0.1.8",
"resolved": "https://registry.npmjs.org/confbox/-/confbox-0.1.8.tgz",
"integrity": "sha512-RMtmw0iFkeR4YV+fUOSucriAQNb9g8zFR52MWCtl+cCZOFRNL6zeB395vPzFhEjjn4fMxXudmELnl/KF/WrK6w==",
"dev": true,
"license": "MIT"
},
"node_modules/convert-source-map": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/convert-source-map/-/convert-source-map-2.0.0.tgz",
@@ -2302,6 +2498,19 @@
}
}
},
"node_modules/deep-eql": {
"version": "4.1.4",
"resolved": "https://registry.npmjs.org/deep-eql/-/deep-eql-4.1.4.tgz",
"integrity": "sha512-SUwdGfqdKOwxCPeVYjwSyRpJ7Z+fhpwIAtmCUdZIWZ/YP5R9WAsyuSgpLVDi9bjWoN2LXHNss/dk3urXtdQxGg==",
"dev": true,
"license": "MIT",
"dependencies": {
"type-detect": "^4.0.0"
},
"engines": {
"node": ">=6"
}
},
"node_modules/deep-is": {
"version": "0.1.4",
"resolved": "https://registry.npmjs.org/deep-is/-/deep-is-0.1.4.tgz",
@@ -2345,6 +2554,16 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/diff-sequences": {
"version": "29.6.3",
"resolved": "https://registry.npmjs.org/diff-sequences/-/diff-sequences-29.6.3.tgz",
"integrity": "sha512-EjePK1srD3P08o2j4f0ExnylqRs5B9tJjcp9t1krH2qRi8CCdsYfwe9JgSLurFBWwq4uOlipzfk5fHNvwFKr8Q==",
"dev": true,
"license": "MIT",
"engines": {
"node": "^14.15.0 || ^16.10.0 || >=18.0.0"
}
},
"node_modules/doctrine": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/doctrine/-/doctrine-3.0.0.tgz",
@@ -3120,6 +3339,16 @@
"node": ">=4.0"
}
},
"node_modules/estree-walker": {
"version": "3.0.3",
"resolved": "https://registry.npmjs.org/estree-walker/-/estree-walker-3.0.3.tgz",
"integrity": "sha512-7RUKfXgSMMkzt6ZuXmqapOurLGPPfgj6l9uRZ7lRGolvk0y2yocc35LdcxKC5PQZdn2DMqioAQ2NoWcrTKmm6g==",
"dev": true,
"license": "MIT",
"dependencies": {
"@types/estree": "^1.0.0"
}
},
"node_modules/esutils": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/esutils/-/esutils-2.0.3.tgz",
@@ -3130,6 +3359,30 @@
"node": ">=0.10.0"
}
},
"node_modules/execa": {
"version": "8.0.1",
"resolved": "https://registry.npmjs.org/execa/-/execa-8.0.1.tgz",
"integrity": "sha512-VyhnebXciFV2DESc+p6B+y0LjSm0krU4OgJN44qFAhBY0TJ+1V61tYD2+wHusZ6F9n5K+vl8k0sTy7PEfV4qpg==",
"dev": true,
"license": "MIT",
"dependencies": {
"cross-spawn": "^7.0.3",
"get-stream": "^8.0.1",
"human-signals": "^5.0.0",
"is-stream": "^3.0.0",
"merge-stream": "^2.0.0",
"npm-run-path": "^5.1.0",
"onetime": "^6.0.0",
"signal-exit": "^4.1.0",
"strip-final-newline": "^3.0.0"
},
"engines": {
"node": ">=16.17"
},
"funding": {
"url": "https://github.com/sindresorhus/execa?sponsor=1"
}
},
"node_modules/fast-deep-equal": {
"version": "3.1.3",
"resolved": "https://registry.npmjs.org/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz",
@@ -3355,6 +3608,16 @@
"node": ">=6.9.0"
}
},
"node_modules/get-func-name": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/get-func-name/-/get-func-name-2.0.2.tgz",
"integrity": "sha512-8vXOvuE167CtIc3OyItco7N/dpRtBbYOsPsXCz7X/PMnlGjYjSGuZJgM1Y7mmew7BKf9BqvLX2tnOVy1BBUsxQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": "*"
}
},
"node_modules/get-intrinsic": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
@@ -3394,6 +3657,19 @@
"node": ">= 0.4"
}
},
"node_modules/get-stream": {
"version": "8.0.1",
"resolved": "https://registry.npmjs.org/get-stream/-/get-stream-8.0.1.tgz",
"integrity": "sha512-VaUJspBffn/LMCJVoMvSAdmscJyS1auj5Zulnn5UoYcY531UWmdwhRWkcGKnGU93m5HSXP9LP2usOryrBtQowA==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=16"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/get-symbol-description": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/get-symbol-description/-/get-symbol-description-1.1.0.tgz",
@@ -3618,6 +3894,16 @@
"node": ">= 0.4"
}
},
"node_modules/human-signals": {
"version": "5.0.0",
"resolved": "https://registry.npmjs.org/human-signals/-/human-signals-5.0.0.tgz",
"integrity": "sha512-AXcZb6vzzrFAUE61HnN4mpLqd/cSIwNQjtNWR0euPm6y0iqx3G4gOXaIDdtdDwZmhwe82LA6+zinmW4UBWVePQ==",
"dev": true,
"license": "Apache-2.0",
"engines": {
"node": ">=16.17.0"
}
},
"node_modules/ignore": {
"version": "7.0.5",
"resolved": "https://registry.npmjs.org/ignore/-/ignore-7.0.5.tgz",
@@ -3994,6 +4280,19 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/is-stream": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/is-stream/-/is-stream-3.0.0.tgz",
"integrity": "sha512-LnQR4bZ9IADDRSkvpqMGvt/tEJWclzklNgSw48V5EAaAeDd6qGvN8ei6k5p0tvxSR171VmGyHuTiAOfxAbr8kA==",
"dev": true,
"license": "MIT",
"engines": {
"node": "^12.20.0 || ^14.13.1 || >=16.0.0"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/is-string": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/is-string/-/is-string-1.1.1.tgz",
@@ -4255,6 +4554,23 @@
"node": ">= 0.8.0"
}
},
"node_modules/local-pkg": {
"version": "0.5.1",
"resolved": "https://registry.npmjs.org/local-pkg/-/local-pkg-0.5.1.tgz",
"integrity": "sha512-9rrA30MRRP3gBD3HTGnC6cDFpaE1kVDWxWgqWJUN0RvDNAo+Nz/9GxB+nHOH0ifbVFy0hSA1V6vFDvnx54lTEQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"mlly": "^1.7.3",
"pkg-types": "^1.2.1"
},
"engines": {
"node": ">=14"
},
"funding": {
"url": "https://github.com/sponsors/antfu"
}
},
"node_modules/locate-path": {
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/locate-path/-/locate-path-6.0.0.tgz",
@@ -4290,6 +4606,16 @@
"loose-envify": "cli.js"
}
},
"node_modules/loupe": {
"version": "2.3.7",
"resolved": "https://registry.npmjs.org/loupe/-/loupe-2.3.7.tgz",
"integrity": "sha512-zSMINGVYkdpYSOBmLi0D1Uo7JU9nVdQKrHxC8eYlV+9YKK9WePqAlL7lSlorG/U2Fw1w0hTBmaa/jrQ3UbPHtA==",
"dev": true,
"license": "MIT",
"dependencies": {
"get-func-name": "^2.0.1"
}
},
"node_modules/lru-cache": {
"version": "5.1.1",
"resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-5.1.1.tgz",
@@ -4300,6 +4626,16 @@
"yallist": "^3.0.2"
}
},
"node_modules/magic-string": {
"version": "0.30.19",
"resolved": "https://registry.npmjs.org/magic-string/-/magic-string-0.30.19.tgz",
"integrity": "sha512-2N21sPY9Ws53PZvsEpVtNuSW+ScYbQdp4b9qUaL+9QkHUrGFKo56Lg9Emg5s9V/qrtNBmiR01sYhUOwu3H+VOw==",
"dev": true,
"license": "MIT",
"dependencies": {
"@jridgewell/sourcemap-codec": "^1.5.5"
}
},
"node_modules/math-intrinsics": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
@@ -4310,6 +4646,13 @@
"node": ">= 0.4"
}
},
"node_modules/merge-stream": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/merge-stream/-/merge-stream-2.0.0.tgz",
"integrity": "sha512-abv/qOcuPfk3URPfDzmZU1LKmuw8kT+0nIHvKrKgFrwifol/doWcdA4ZqsWQ8ENrFKkd67Mfpo/LovbIUsbt3w==",
"dev": true,
"license": "MIT"
},
"node_modules/merge2": {
"version": "1.4.1",
"resolved": "https://registry.npmjs.org/merge2/-/merge2-1.4.1.tgz",
@@ -4334,6 +4677,19 @@
"node": ">=8.6"
}
},
"node_modules/mimic-fn": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/mimic-fn/-/mimic-fn-4.0.0.tgz",
"integrity": "sha512-vqiC06CuhBTUdZH+RYl8sFrL096vA45Ok5ISO6sE/Mr1jRbGH4Csnhi8f3wKVl7x8mO4Au7Ir9D3Oyv1VYMFJw==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/minimatch": {
"version": "9.0.5",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-9.0.5.tgz",
@@ -4360,6 +4716,26 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/mlly": {
"version": "1.8.0",
"resolved": "https://registry.npmjs.org/mlly/-/mlly-1.8.0.tgz",
"integrity": "sha512-l8D9ODSRWLe2KHJSifWGwBqpTZXIXTeo8mlKjY+E2HAakaTeNpqAyBZ8GSqLzHgw4XmHmC8whvpjJNMbFZN7/g==",
"dev": true,
"license": "MIT",
"dependencies": {
"acorn": "^8.15.0",
"pathe": "^2.0.3",
"pkg-types": "^1.3.1",
"ufo": "^1.6.1"
}
},
"node_modules/mlly/node_modules/pathe": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/pathe/-/pathe-2.0.3.tgz",
"integrity": "sha512-WUjGcAqP1gQacoQe+OBJsFA7Ld4DyXuUIjZ5cc75cLHvJ7dtNsTugphxIADwspS+AraAUePCKrSVtPLFj/F88w==",
"dev": true,
"license": "MIT"
},
"node_modules/ms": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
@@ -4400,6 +4776,35 @@
"dev": true,
"license": "MIT"
},
"node_modules/npm-run-path": {
"version": "5.3.0",
"resolved": "https://registry.npmjs.org/npm-run-path/-/npm-run-path-5.3.0.tgz",
"integrity": "sha512-ppwTtiJZq0O/ai0z7yfudtBpWIoxM8yE6nHi1X47eFR2EWORqfbu6CnPlNsjeN683eT0qG6H/Pyf9fCcvjnnnQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"path-key": "^4.0.0"
},
"engines": {
"node": "^12.20.0 || ^14.13.1 || >=16.0.0"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/npm-run-path/node_modules/path-key": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/path-key/-/path-key-4.0.0.tgz",
"integrity": "sha512-haREypq7xkM7ErfgIyA0z+Bj4AGKlMSdlQE2jvJo6huWD1EdkKYV+G/T4nq0YEF2vgTT8kqMFKo1uHn950r4SQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/object-assign": {
"version": "4.1.1",
"resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz",
@@ -4533,6 +4938,22 @@
"wrappy": "1"
}
},
"node_modules/onetime": {
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/onetime/-/onetime-6.0.0.tgz",
"integrity": "sha512-1FlR+gjXK7X+AsAHso35MnyN5KqGwJRi/31ft6x0M194ht7S+rWAvd7PHss9xSKMzE0asv1pyIHaJYq+BbacAQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"mimic-fn": "^4.0.0"
},
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/optionator": {
"version": "0.9.4",
"resolved": "https://registry.npmjs.org/optionator/-/optionator-0.9.4.tgz",
@@ -4651,6 +5072,23 @@
"dev": true,
"license": "MIT"
},
"node_modules/pathe": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/pathe/-/pathe-1.1.2.tgz",
"integrity": "sha512-whLdWMYL2TwI08hn8/ZqAbrVemu0LNaNNJZX73O6qaIdCTfXutsLhMkjdENX0qhsQ9uIimo4/aQOmXkoon2nDQ==",
"dev": true,
"license": "MIT"
},
"node_modules/pathval": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/pathval/-/pathval-1.1.1.tgz",
"integrity": "sha512-Dp6zGqpTdETdR63lehJYPeIOqpiNBNtc7BpWSLrOje7UaIsE5aY92r/AunQA7rsXvet3lrJ3JnZX29UPTKXyKQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": "*"
}
},
"node_modules/picocolors": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/picocolors/-/picocolors-1.1.1.tgz",
@@ -4671,6 +5109,25 @@
"url": "https://github.com/sponsors/jonschlinkert"
}
},
"node_modules/pkg-types": {
"version": "1.3.1",
"resolved": "https://registry.npmjs.org/pkg-types/-/pkg-types-1.3.1.tgz",
"integrity": "sha512-/Jm5M4RvtBFVkKWRu2BLUTNP8/M2a+UwuAX+ae4770q1qVGtfjG+WTCupoZixokjmHiry8uI+dlY8KXYV5HVVQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"confbox": "^0.1.8",
"mlly": "^1.7.4",
"pathe": "^2.0.1"
}
},
"node_modules/pkg-types/node_modules/pathe": {
"version": "2.0.3",
"resolved": "https://registry.npmjs.org/pathe/-/pathe-2.0.3.tgz",
"integrity": "sha512-WUjGcAqP1gQacoQe+OBJsFA7Ld4DyXuUIjZ5cc75cLHvJ7dtNsTugphxIADwspS+AraAUePCKrSVtPLFj/F88w==",
"dev": true,
"license": "MIT"
},
"node_modules/playwright": {
"version": "1.56.0",
"resolved": "https://registry.npmjs.org/playwright/-/playwright-1.56.0.tgz",
@@ -4783,6 +5240,41 @@
"url": "https://github.com/prettier/prettier?sponsor=1"
}
},
"node_modules/pretty-format": {
"version": "29.7.0",
"resolved": "https://registry.npmjs.org/pretty-format/-/pretty-format-29.7.0.tgz",
"integrity": "sha512-Pdlw/oPxN+aXdmM9R00JVC9WVFoCLTKJvDVLgmJ+qAffBMxsV85l/Lu7sNx4zSzPyoL2euImuEwHhOXdEgNFZQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"@jest/schemas": "^29.6.3",
"ansi-styles": "^5.0.0",
"react-is": "^18.0.0"
},
"engines": {
"node": "^14.15.0 || ^16.10.0 || >=18.0.0"
}
},
"node_modules/pretty-format/node_modules/ansi-styles": {
"version": "5.2.0",
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-5.2.0.tgz",
"integrity": "sha512-Cxwpt2SfTzTtXcfOlzGEee8O+c+MmUgGrNiBcXnuWxuFJHe6a5Hz7qwhwe5OgaSYI0IJvkLqWX1ASG+cJOkEiA==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=10"
},
"funding": {
"url": "https://github.com/chalk/ansi-styles?sponsor=1"
}
},
"node_modules/pretty-format/node_modules/react-is": {
"version": "18.3.1",
"resolved": "https://registry.npmjs.org/react-is/-/react-is-18.3.1.tgz",
"integrity": "sha512-/LLMVyas0ljjAtoYiPqYiL8VWXzUUdThrmU5+n20DZv+a+ClRoevUzw5JxU+Ieh5/c87ytoTBV9G1FiKfNJdmg==",
"dev": true,
"license": "MIT"
},
"node_modules/prop-types": {
"version": "15.8.1",
"resolved": "https://registry.npmjs.org/prop-types/-/prop-types-15.8.1.tgz",
@@ -5276,6 +5768,26 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/siginfo": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/siginfo/-/siginfo-2.0.0.tgz",
"integrity": "sha512-ybx0WO1/8bSBLEWXZvEd7gMW3Sn3JFlW3TvX1nREbDLRNQNaeNN8WK0meBwPdAaOI7TtRRRJn/Es1zhrrCHu7g==",
"dev": true,
"license": "ISC"
},
"node_modules/signal-exit": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/signal-exit/-/signal-exit-4.1.0.tgz",
"integrity": "sha512-bzyZ1e88w9O1iNJbKnOlvYTrWPDl46O1bG0D3XInv+9tkPrxrN8jUUTiFlDkkmKWgn1M6CfIA13SuGqOa9Korw==",
"dev": true,
"license": "ISC",
"engines": {
"node": ">=14"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/source-map-js": {
"version": "1.2.1",
"resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz",
@@ -5286,6 +5798,20 @@
"node": ">=0.10.0"
}
},
"node_modules/stackback": {
"version": "0.0.2",
"resolved": "https://registry.npmjs.org/stackback/-/stackback-0.0.2.tgz",
"integrity": "sha512-1XMJE5fQo1jGH6Y/7ebnwPOBEkIEnT4QF32d5R1+VXdXveM0IBMJt8zfaxX1P3QhVwrYe+576+jkANtSS2mBbw==",
"dev": true,
"license": "MIT"
},
"node_modules/std-env": {
"version": "3.9.0",
"resolved": "https://registry.npmjs.org/std-env/-/std-env-3.9.0.tgz",
"integrity": "sha512-UGvjygr6F6tpH7o2qyqR6QYpwraIjKSdtzyBdyytFOHmPZY917kwdwLG0RbOjWOnKmnm3PeHjaoLLMie7kPLQw==",
"dev": true,
"license": "MIT"
},
"node_modules/stop-iteration-iterator": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/stop-iteration-iterator/-/stop-iteration-iterator-1.1.0.tgz",
@@ -5436,6 +5962,19 @@
"node": ">=4"
}
},
"node_modules/strip-final-newline": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/strip-final-newline/-/strip-final-newline-3.0.0.tgz",
"integrity": "sha512-dOESqjYr96iWYylGObzd39EuNTa5VJxyvVAEm5Jnh7KGo75V43Hk1odPQkNDyXNmUR6k+gEiDVXnjB8HJ3crXw==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=12"
},
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/strip-json-comments": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-3.1.1.tgz",
@@ -5449,6 +5988,26 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/strip-literal": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/strip-literal/-/strip-literal-2.1.1.tgz",
"integrity": "sha512-631UJ6O00eNGfMiWG78ck80dfBab8X6IVFB51jZK5Icd7XAs60Z5y7QdSd/wGIklnWvRbUNloVzhOKKmutxQ6Q==",
"dev": true,
"license": "MIT",
"dependencies": {
"js-tokens": "^9.0.1"
},
"funding": {
"url": "https://github.com/sponsors/antfu"
}
},
"node_modules/strip-literal/node_modules/js-tokens": {
"version": "9.0.1",
"resolved": "https://registry.npmjs.org/js-tokens/-/js-tokens-9.0.1.tgz",
"integrity": "sha512-mxa9E9ITFOt0ban3j6L5MpjwegGz6lBQmM1IJkWeBZGcMxto50+eWdjC/52xDbS2vy0k7vIMK0Fe2wfL9OQSpQ==",
"dev": true,
"license": "MIT"
},
"node_modules/supports-color": {
"version": "7.2.0",
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz",
@@ -5482,6 +6041,33 @@
"dev": true,
"license": "MIT"
},
"node_modules/tinybench": {
"version": "2.9.0",
"resolved": "https://registry.npmjs.org/tinybench/-/tinybench-2.9.0.tgz",
"integrity": "sha512-0+DUvqWMValLmha6lr4kD8iAMK1HzV0/aKnCtWb9v9641TnP/MFb7Pc2bxoxQjTXAErryXVgUOfv2YqNllqGeg==",
"dev": true,
"license": "MIT"
},
"node_modules/tinypool": {
"version": "0.8.4",
"resolved": "https://registry.npmjs.org/tinypool/-/tinypool-0.8.4.tgz",
"integrity": "sha512-i11VH5gS6IFeLY3gMBQ00/MmLncVP7JLXOw1vlgkytLmJK7QnEr7NXf0LBdxfmNPAeyetukOk0bOYrJrFGjYJQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=14.0.0"
}
},
"node_modules/tinyspy": {
"version": "2.2.1",
"resolved": "https://registry.npmjs.org/tinyspy/-/tinyspy-2.2.1.tgz",
"integrity": "sha512-KYad6Vy5VDWV4GH3fjpseMQ/XU2BhIYP7Vzd0LG44qRWm/Yt2WCOTicFdvmgo6gWaqooMQCawTtILVQJupKu7A==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=14.0.0"
}
},
"node_modules/to-regex-range": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz",
@@ -5547,6 +6133,16 @@
"node": ">= 0.8.0"
}
},
"node_modules/type-detect": {
"version": "4.1.0",
"resolved": "https://registry.npmjs.org/type-detect/-/type-detect-4.1.0.tgz",
"integrity": "sha512-Acylog8/luQ8L7il+geoSxhEkazvkslg7PSNKOX59mbB9cOveP5aq9h74Y7YU8yDpJwetzQQrfIwtf4Wp4LKcw==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=4"
}
},
"node_modules/type-fest": {
"version": "0.20.2",
"resolved": "https://registry.npmjs.org/type-fest/-/type-fest-0.20.2.tgz",
@@ -5652,6 +6248,13 @@
"node": ">=14.17"
}
},
"node_modules/ufo": {
"version": "1.6.1",
"resolved": "https://registry.npmjs.org/ufo/-/ufo-1.6.1.tgz",
"integrity": "sha512-9a4/uxlTWJ4+a5i0ooc1rU7C7YOw3wT+UGqdeNNHWnOF9qcMBgLRS+4IYUqbczewFx4mLEig6gawh7X6mFlEkA==",
"dev": true,
"license": "MIT"
},
"node_modules/unbox-primitive": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/unbox-primitive/-/unbox-primitive-1.1.0.tgz",
@@ -5779,6 +6382,95 @@
}
}
},
"node_modules/vite-node": {
"version": "1.6.1",
"resolved": "https://registry.npmjs.org/vite-node/-/vite-node-1.6.1.tgz",
"integrity": "sha512-YAXkfvGtuTzwWbDSACdJSg4A4DZiAqckWe90Zapc/sEX3XvHcw1NdurM/6od8J207tSDqNbSsgdCacBgvJKFuA==",
"dev": true,
"license": "MIT",
"dependencies": {
"cac": "^6.7.14",
"debug": "^4.3.4",
"pathe": "^1.1.1",
"picocolors": "^1.0.0",
"vite": "^5.0.0"
},
"bin": {
"vite-node": "vite-node.mjs"
},
"engines": {
"node": "^18.0.0 || >=20.0.0"
},
"funding": {
"url": "https://opencollective.com/vitest"
}
},
"node_modules/vitest": {
"version": "1.6.1",
"resolved": "https://registry.npmjs.org/vitest/-/vitest-1.6.1.tgz",
"integrity": "sha512-Ljb1cnSJSivGN0LqXd/zmDbWEM0RNNg2t1QW/XUhYl/qPqyu7CsqeWtqQXHVaJsecLPuDoak2oJcZN2QoRIOag==",
"dev": true,
"license": "MIT",
"dependencies": {
"@vitest/expect": "1.6.1",
"@vitest/runner": "1.6.1",
"@vitest/snapshot": "1.6.1",
"@vitest/spy": "1.6.1",
"@vitest/utils": "1.6.1",
"acorn-walk": "^8.3.2",
"chai": "^4.3.10",
"debug": "^4.3.4",
"execa": "^8.0.1",
"local-pkg": "^0.5.0",
"magic-string": "^0.30.5",
"pathe": "^1.1.1",
"picocolors": "^1.0.0",
"std-env": "^3.5.0",
"strip-literal": "^2.0.0",
"tinybench": "^2.5.1",
"tinypool": "^0.8.3",
"vite": "^5.0.0",
"vite-node": "1.6.1",
"why-is-node-running": "^2.2.2"
},
"bin": {
"vitest": "vitest.mjs"
},
"engines": {
"node": "^18.0.0 || >=20.0.0"
},
"funding": {
"url": "https://opencollective.com/vitest"
},
"peerDependencies": {
"@edge-runtime/vm": "*",
"@types/node": "^18.0.0 || >=20.0.0",
"@vitest/browser": "1.6.1",
"@vitest/ui": "1.6.1",
"happy-dom": "*",
"jsdom": "*"
},
"peerDependenciesMeta": {
"@edge-runtime/vm": {
"optional": true
},
"@types/node": {
"optional": true
},
"@vitest/browser": {
"optional": true
},
"@vitest/ui": {
"optional": true
},
"happy-dom": {
"optional": true
},
"jsdom": {
"optional": true
}
}
},
"node_modules/which": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz",
@@ -5884,6 +6576,23 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/why-is-node-running": {
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/why-is-node-running/-/why-is-node-running-2.3.0.tgz",
"integrity": "sha512-hUrmaWBdVDcxvYqnyh09zunKzROWjbZTiNy8dBEjkS7ehEDQibXJ7XvlmtbwuTclUiIyN+CyXQD4Vmko8fNm8w==",
"dev": true,
"license": "MIT",
"dependencies": {
"siginfo": "^2.0.0",
"stackback": "0.0.2"
},
"bin": {
"why-is-node-running": "cli.js"
},
"engines": {
"node": ">=8"
}
},
"node_modules/word-wrap": {
"version": "1.2.5",
"resolved": "https://registry.npmjs.org/word-wrap/-/word-wrap-1.2.5.tgz",

View File

@@ -9,6 +9,7 @@
"preview": "vite preview",
"lint": "eslint \"src/**/*.{ts,tsx}\"",
"format": "prettier --write \"src/**/*.{ts,tsx,css}\"",
"test": "vitest run",
"test:e2e": "playwright test"
},
"dependencies": {
@@ -34,6 +35,7 @@
"eslint-plugin-react-hooks": "^4.6.2",
"prettier": "^3.3.3",
"typescript": "^5.5.3",
"vite": "^5.4.0",
"vitest": "^1.6.0"
}
}

View File

@@ -1,38 +1,115 @@
import './styles/global.css';
import type { LatLngExpression } from 'leaflet';
import { useEffect, useMemo, useState } from 'react';
import { LoginForm } from './components/auth/LoginForm';
import { NetworkMap } from './components/map/NetworkMap';
import { useNetworkSnapshot } from './hooks/useNetworkSnapshot';
import { useAuth } from './state/AuthContext';
import type { Station } from './types/domain';
import { buildTrackAdjacency, computeRoute } from './utils/route';
function App(): JSX.Element {
const { token, user, status: authStatus, logout } = useAuth();
const isAuthenticated = authStatus === 'authenticated' && token !== null;
const { data, status, error } = useNetworkSnapshot(isAuthenticated ? token : null);
const [focusedStationId, setFocusedStationId] = useState<string | null>(null);
const [routeSelection, setRouteSelection] = useState<{
startId: string | null;
endId: string | null;
}>({ startId: null, endId: null });
useEffect(() => {
if (status !== 'success' || !data?.stations.length) {
setFocusedStationId(null);
setRouteSelection({ startId: null, endId: null });
return;
}
if (!focusedStationId || !hasStation(data.stations, focusedStationId)) {
setFocusedStationId(data.stations[0].id);
}
}, [status, data, focusedStationId]);
useEffect(() => {
if (status !== 'success' || !data) {
return;
}
setRouteSelection((current) => {
const startExists = current.startId
? hasStation(data.stations, current.startId)
: false;
const endExists = current.endId
? hasStation(data.stations, current.endId)
: false;
return {
startId: startExists ? current.startId : null,
endId: endExists ? current.endId : null,
};
});
}, [status, data]);
const stationById = useMemo(() => {
if (!data) {
return new Map<string, Station>();
}
const lookup = new Map<string, Station>();
for (const station of data.stations) {
lookup.set(station.id, station);
}
return lookup;
}, [data]);
const trackAdjacency = useMemo(
() => buildTrackAdjacency(data ? data.tracks : []),
[data]
);
const routeComputation = useMemo(
() =>
computeRoute({
startId: routeSelection.startId,
endId: routeSelection.endId,
stationById,
adjacency: trackAdjacency,
}),
[routeSelection, stationById, trackAdjacency]
);
const routeSegments = useMemo<LatLngExpression[][]>(() => {
return routeComputation.segments.map((segment) =>
segment.map((pair) => [pair[0], pair[1]] as LatLngExpression)
);
}, [routeComputation.segments]);
const focusedStation = useMemo(() => {
if (!data || !focusedStationId) {
return null;
}
return stationById.get(focusedStationId) ?? null;
}, [data, focusedStationId, stationById]);
const handleStationSelection = (stationId: string) => {
setFocusedStationId(stationId);
setRouteSelection((current) => {
if (!current.startId || (current.startId && current.endId)) {
return { startId: stationId, endId: null };
}
if (current.startId === stationId) {
return { startId: stationId, endId: null };
}
return { startId: current.startId, endId: stationId };
});
};
const clearRouteSelection = () => {
setRouteSelection({ startId: null, endId: null });
};
return (
<div className="app-shell">
@@ -65,10 +142,71 @@ function App(): JSX.Element {
<div className="map-wrapper">
<NetworkMap
snapshot={data}
focusedStationId={focusedStationId}
startStationId={routeSelection.startId}
endStationId={routeSelection.endId}
routeSegments={routeSegments}
onStationClick={handleStationSelection}
/>
</div>
<div className="route-panel">
<div className="route-panel__header">
<h3>Route Selection</h3>
<button
type="button"
className="ghost-button"
onClick={clearRouteSelection}
disabled={!routeSelection.startId && !routeSelection.endId}
>
Clear
</button>
</div>
<p className="route-panel__hint">
Click a station to set the origin, then click another station to
preview the rail corridor between them.
</p>
<dl className="route-panel__meta">
<div>
<dt>Origin</dt>
<dd>
{routeSelection.startId
? (stationById.get(routeSelection.startId)?.name ??
'Unknown station')
: 'Choose a station'}
</dd>
</div>
<div>
<dt>Destination</dt>
<dd>
{routeSelection.endId
? (stationById.get(routeSelection.endId)?.name ??
'Unknown station')
: 'Choose a station'}
</dd>
</div>
<div>
<dt>Estimated Length</dt>
<dd>
{routeComputation.totalLength !== null
? `${(routeComputation.totalLength / 1000).toFixed(2)} km`
: 'N/A'}
</dd>
</div>
</dl>
{routeComputation.error && (
<p className="route-panel__error">{routeComputation.error}</p>
)}
{!routeComputation.error && routeComputation.stations && (
<div className="route-panel__path">
<span>Path:</span>
<ol>
{routeComputation.stations.map((station) => (
<li key={`route-station-${station.id}`}>{station.name}</li>
))}
</ol>
</div>
)}
</div>
<div className="grid">
<div>
<h3>Stations</h3>
@@ -78,12 +216,20 @@ function App(): JSX.Element {
<button
type="button"
className={`station-list-item${
station.id === focusedStationId
? ' station-list-item--selected'
: ''
}${
station.id === routeSelection.startId
? ' station-list-item--start'
: ''
}${
station.id === routeSelection.endId
? ' station-list-item--end'
: ''
}`}
aria-pressed={station.id === focusedStationId}
onClick={() => handleStationSelection(station.id)}
>
<span className="station-list-item__name">
{station.name}
@@ -92,6 +238,14 @@ function App(): JSX.Element {
{station.latitude.toFixed(3)},{' '}
{station.longitude.toFixed(3)}
</span>
{station.id === routeSelection.startId && (
<span className="station-list-item__badge">Origin</span>
)}
{station.id === routeSelection.endId && (
<span className="station-list-item__badge">
Destination
</span>
)}
</button>
</li>
))}
@@ -114,49 +268,51 @@ function App(): JSX.Element {
{data.tracks.map((track) => (
<li key={track.id}>
{track.startStationId} {track.endStationId} ·{' '}
{track.lengthMeters > 0
? `${(track.lengthMeters / 1000).toFixed(1)} km`
: 'N/A'}
</li>
))}
</ul>
</div>
</div>
{focusedStation && (
<div className="selected-station">
<h3>Focused Station</h3>
<dl>
<div>
<dt>Name</dt>
<dd>{focusedStation.name}</dd>
</div>
<div>
<dt>Coordinates</dt>
<dd>
{focusedStation.latitude.toFixed(5)},{' '}
{focusedStation.longitude.toFixed(5)}
</dd>
</div>
{focusedStation.code && (
<div>
<dt>Code</dt>
<dd>{focusedStation.code}</dd>
</div>
)}
{typeof focusedStation.elevationM === 'number' && (
<div>
<dt>Elevation</dt>
<dd>{focusedStation.elevationM.toFixed(1)} m</dd>
</div>
)}
{focusedStation.osmId && (
<div>
<dt>OSM ID</dt>
<dd>{focusedStation.osmId}</dd>
</div>
)}
<div>
<dt>Status</dt>
<dd>
{(focusedStation.isActive ?? true) ? 'Active' : 'Inactive'}
</dd>
</div>
</dl>
@@ -172,3 +328,7 @@ function App(): JSX.Element {
}
export default App;
function hasStation(stations: Station[], id: string): boolean {
return stations.some((station) => station.id === id);
}
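The `handleStationSelection` callback above is a small two-slot state machine: the first click sets the origin, a second click on a different station sets the destination, and clicking the origin again (or clicking anywhere once both slots are filled) restarts the selection. A standalone sketch of that transition logic, with illustrative names that are not part of the component's API:

```typescript
interface RouteSelection {
  startId: string | null;
  endId: string | null;
}

// Pure transition: given the current selection and the clicked station id,
// return the next selection state (mirrors handleStationSelection above).
function nextSelection(
  current: RouteSelection,
  stationId: string
): RouteSelection {
  // No origin yet, or a completed origin/destination pair: restart.
  if (!current.startId || (current.startId && current.endId)) {
    return { startId: stationId, endId: null };
  }
  // Clicking the origin again clears any pending destination.
  if (current.startId === stationId) {
    return { startId: stationId, endId: null };
  }
  // Otherwise the click completes the pair.
  return { startId: current.startId, endId: stationId };
}

let selection: RouteSelection = { startId: null, endId: null };
selection = nextSelection(selection, 'station-a'); // origin set
selection = nextSelection(selection, 'station-b'); // destination set
console.log(selection); // { startId: 'station-a', endId: 'station-b' }
```

Keeping the transition pure like this is what lets it run inside a `setRouteSelection` updater without reading stale state.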

View File

@@ -15,8 +15,11 @@ import 'leaflet/dist/leaflet.css';
interface NetworkMapProps {
readonly snapshot: NetworkSnapshot;
readonly focusedStationId?: string | null;
readonly startStationId?: string | null;
readonly endStationId?: string | null;
readonly routeSegments?: LatLngExpression[][];
readonly onStationClick?: (stationId: string) => void;
}
interface StationPosition {
@@ -29,8 +32,11 @@ const DEFAULT_CENTER: LatLngExpression = [51.505, -0.09];
export function NetworkMap({
snapshot,
focusedStationId,
startStationId,
endStationId,
routeSegments = [],
onStationClick,
}: NetworkMapProps): JSX.Element {
const stationPositions = useMemo<StationPosition[]>(() => {
return snapshot.stations.map((station) => ({
@@ -51,6 +57,12 @@ export function NetworkMap({
const trackSegments = useMemo(() => {
return snapshot.tracks
.map((track) => {
if (track.coordinates && track.coordinates.length >= 2) {
return track.coordinates.map(
(pair) => [pair[0], pair[1]] as LatLngExpression
);
}
const start = stationLookup.get(track.startStationId);
const end = stationLookup.get(track.endStationId);
if (!start || !end) {
@@ -86,12 +98,12 @@ export function NetworkMap({
] as LatLngBoundsExpression;
}, [stationPositions]);
const focusedPosition = useMemo(() => {
if (!focusedStationId) {
return null;
}
return stationLookup.get(focusedStationId) ?? null;
}, [focusedStationId, stationLookup]);
return (
<MapContainer
@@ -104,35 +116,52 @@ export function NetworkMap({
attribution='&copy; <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors'
url="https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png"
/>
{focusedPosition ? <StationFocus position={focusedPosition} /> : null}
{trackSegments.map((segment, index) => (
<Polyline
key={`track-${index}`}
positions={segment}
pathOptions={{ color: '#334155', weight: 3, opacity: 0.8 }}
/>
))}
{routeSegments.map((segment, index) => (
<Polyline
key={`route-${index}`}
positions={segment}
pathOptions={{ color: '#facc15', weight: 6, opacity: 0.9 }}
/>
))}
{stationPositions.map((station) => (
<CircleMarker
key={station.id}
center={station.position}
radius={station.id === focusedStationId ? 9 : 6}
pathOptions={{
color: resolveMarkerStroke(
station.id,
startStationId,
endStationId,
focusedStationId
),
fillColor: resolveMarkerFill(
station.id,
startStationId,
endStationId,
focusedStationId
),
fillOpacity: 0.96,
weight: station.id === focusedStationId ? 3 : 1,
}}
eventHandlers={{
click: () => {
onStationClick?.(station.id);
},
}}
>
<Tooltip
direction="top"
offset={[0, -8]}
permanent={station.id === focusedStationId}
sticky
>
{station.name}
@@ -152,3 +181,39 @@ function StationFocus({ position }: { position: LatLngExpression }): null {
return null;
}
function resolveMarkerStroke(
stationId: string,
startStationId?: string | null,
endStationId?: string | null,
focusedStationId?: string | null
): string {
if (stationId === startStationId) {
return '#38bdf8';
}
if (stationId === endStationId) {
return '#fb923c';
}
if (stationId === focusedStationId) {
return '#22c55e';
}
return '#f97316';
}
function resolveMarkerFill(
stationId: string,
startStationId?: string | null,
endStationId?: string | null,
focusedStationId?: string | null
): string {
if (stationId === startStationId) {
return '#bae6fd';
}
if (stationId === endStationId) {
return '#fed7aa';
}
if (stationId === focusedStationId) {
return '#bbf7d0';
}
return '#ffe4c7';
}

View File

@@ -102,6 +102,7 @@ body {
background-color 0.18s ease,
border-color 0.18s ease,
transform 0.18s ease;
flex-wrap: wrap;
}
.station-list-item:hover,
@@ -118,6 +119,16 @@ body {
box-shadow: 0 8px 18px -10px rgba(45, 212, 191, 0.65);
}
.station-list-item--start {
border-color: rgba(56, 189, 248, 0.8);
background: rgba(14, 165, 233, 0.2);
}
.station-list-item--end {
border-color: rgba(249, 115, 22, 0.8);
background: rgba(234, 88, 12, 0.18);
}
.station-list-item__name {
font-weight: 600;
}
@@ -128,6 +139,27 @@ body {
font-family: 'Fira Code', 'Source Code Pro', monospace;
}
.station-list-item__badge {
font-size: 0.75rem;
font-weight: 600;
text-transform: uppercase;
letter-spacing: 0.05em;
padding: 0.1rem 0.45rem;
border-radius: 999px;
background: rgba(148, 163, 184, 0.18);
color: rgba(226, 232, 240, 0.85);
}
.station-list-item--start .station-list-item__badge {
background: rgba(56, 189, 248, 0.35);
color: #0ea5e9;
}
.station-list-item--end .station-list-item__badge {
background: rgba(249, 115, 22, 0.35);
color: #f97316;
}
.grid h3 {
margin-bottom: 0.5rem;
font-size: 1.1rem;
@@ -151,6 +183,89 @@ body {
width: 100%;
}
.route-panel {
display: grid;
gap: 0.85rem;
padding: 1.1rem 1.35rem;
border-radius: 12px;
border: 1px solid rgba(250, 204, 21, 0.3);
background: rgba(161, 98, 7, 0.16);
}
.route-panel__header {
display: flex;
align-items: center;
justify-content: space-between;
gap: 1rem;
}
.route-panel__hint {
font-size: 0.9rem;
color: rgba(226, 232, 240, 0.78);
}
.route-panel__meta {
display: grid;
gap: 0.45rem;
}
.route-panel__meta > div {
display: flex;
justify-content: space-between;
align-items: baseline;
gap: 0.75rem;
}
.route-panel__meta dt {
font-size: 0.8rem;
text-transform: uppercase;
letter-spacing: 0.06em;
color: rgba(226, 232, 240, 0.65);
}
.route-panel__meta dd {
font-size: 0.95rem;
color: rgba(226, 232, 240, 0.92);
}
.route-panel__error {
color: #f87171;
font-weight: 600;
}
.route-panel__path {
display: flex;
gap: 0.6rem;
align-items: baseline;
}
.route-panel__path span {
font-size: 0.85rem;
color: rgba(226, 232, 240, 0.7);
text-transform: uppercase;
letter-spacing: 0.06em;
}
.route-panel__path ol {
display: flex;
flex-wrap: wrap;
gap: 0.4rem;
list-style: none;
padding: 0;
margin: 0;
}
.route-panel__path li::after {
content: '→';
margin-left: 0.35rem;
color: rgba(250, 204, 21, 0.75);
}
.route-panel__path li:last-child::after {
content: '';
margin: 0;
}
.selected-station {
margin-top: 1rem;
padding: 1rem 1.25rem;

View File

@@ -22,6 +22,9 @@ export interface Track extends Identified {
readonly endStationId: string;
readonly lengthMeters: number;
readonly maxSpeedKph: number;
readonly status?: string | null;
readonly isBidirectional?: boolean;
readonly coordinates: readonly [number, number][];
}
export interface Train extends Identified {

View File

@@ -0,0 +1,216 @@
import { describe, expect, it } from 'vitest';
import { buildTrackAdjacency, computeRoute } from './route';
import type { Station, Track } from '../types/domain';
const baseTimestamps = {
createdAt: '2024-01-01T00:00:00Z',
updatedAt: '2024-01-01T00:00:00Z',
};
describe('route utilities', () => {
it('finds a multi-hop path across connected tracks', () => {
const stations: Station[] = [
{
id: 'station-a',
name: 'Alpha',
latitude: 51.5,
longitude: -0.1,
...baseTimestamps,
},
{
id: 'station-b',
name: 'Bravo',
latitude: 51.52,
longitude: -0.11,
...baseTimestamps,
},
{
id: 'station-c',
name: 'Charlie',
latitude: 51.54,
longitude: -0.12,
...baseTimestamps,
},
{
id: 'station-d',
name: 'Delta',
latitude: 51.55,
longitude: -0.15,
...baseTimestamps,
},
];
const tracks: Track[] = [
{
id: 'track-ab',
startStationId: 'station-a',
endStationId: 'station-b',
lengthMeters: 1200,
maxSpeedKph: 120,
coordinates: [
[51.5, -0.1],
[51.51, -0.105],
[51.52, -0.11],
],
...baseTimestamps,
},
{
id: 'track-bc',
startStationId: 'station-b',
endStationId: 'station-c',
lengthMeters: 1500,
maxSpeedKph: 110,
coordinates: [
[51.52, -0.11],
[51.53, -0.115],
[51.54, -0.12],
],
...baseTimestamps,
},
{
id: 'track-cd',
startStationId: 'station-c',
endStationId: 'station-d',
lengthMeters: 900,
maxSpeedKph: 115,
coordinates: [
[51.54, -0.12],
[51.545, -0.13],
[51.55, -0.15],
],
...baseTimestamps,
},
];
const stationById = new Map(stations.map((station) => [station.id, station]));
const adjacency = buildTrackAdjacency(tracks);
const result = computeRoute({
startId: 'station-a',
endId: 'station-d',
stationById,
adjacency,
});
expect(result.error).toBeNull();
expect(result.stations?.map((station) => station.id)).toEqual([
'station-a',
'station-b',
'station-c',
'station-d',
]);
expect(result.tracks.map((track) => track.id)).toEqual([
'track-ab',
'track-bc',
'track-cd',
]);
expect(result.totalLength).toBe(1200 + 1500 + 900);
expect(result.segments).toHaveLength(3);
expect(result.segments[0][0]).toEqual([51.5, -0.1]);
expect(result.segments[2][result.segments[2].length - 1]).toEqual([51.55, -0.15]);
});
it('returns an error when no path exists', () => {
const stations: Station[] = [
{
id: 'station-a',
name: 'Alpha',
latitude: 51.5,
longitude: -0.1,
...baseTimestamps,
},
{
id: 'station-b',
name: 'Bravo',
latitude: 51.6,
longitude: -0.2,
...baseTimestamps,
},
];
const tracks: Track[] = [
{
id: 'track-self',
startStationId: 'station-a',
endStationId: 'station-a',
lengthMeters: 0,
maxSpeedKph: 80,
coordinates: [
[51.5, -0.1],
[51.5005, -0.1005],
],
...baseTimestamps,
},
];
const stationById = new Map(stations.map((station) => [station.id, station]));
const adjacency = buildTrackAdjacency(tracks);
const result = computeRoute({
startId: 'station-a',
endId: 'station-b',
stationById,
adjacency,
});
expect(result.stations).toBeNull();
expect(result.tracks).toHaveLength(0);
expect(result.error).toBe(
'No rail connection found between the selected stations.'
);
expect(result.segments).toHaveLength(0);
});
it('reverses track geometry when traversing in the opposite direction', () => {
const stations: Station[] = [
{
id: 'station-a',
name: 'Alpha',
latitude: 51.5,
longitude: -0.1,
...baseTimestamps,
},
{
id: 'station-b',
name: 'Bravo',
latitude: 51.52,
longitude: -0.11,
...baseTimestamps,
},
];
const tracks: Track[] = [
{
id: 'track-ab',
startStationId: 'station-a',
endStationId: 'station-b',
lengthMeters: 1200,
maxSpeedKph: 120,
coordinates: [
[51.5, -0.1],
[51.52, -0.11],
],
...baseTimestamps,
},
];
const stationById = new Map(stations.map((station) => [station.id, station]));
const adjacency = buildTrackAdjacency(tracks);
const result = computeRoute({
startId: 'station-b',
endId: 'station-a',
stationById,
adjacency,
});
expect(result.error).toBeNull();
expect(result.segments).toEqual([
[
[51.52, -0.11],
[51.5, -0.1],
],
]);
});
});

239
frontend/src/utils/route.ts Normal file
View File

@@ -0,0 +1,239 @@
import type { Station, Track } from '../types/domain';
export type LatLngTuple = readonly [number, number];
export interface NeighborEdge {
readonly neighborId: string;
readonly track: Track;
readonly isForward: boolean;
}
export type TrackAdjacency = Map<string, NeighborEdge[]>;
export interface ComputeRouteParams {
readonly startId?: string | null;
readonly endId?: string | null;
readonly stationById: Map<string, Station>;
readonly adjacency: TrackAdjacency;
}
export interface RouteComputation {
readonly stations: Station[] | null;
readonly tracks: Track[];
readonly totalLength: number | null;
readonly error: string | null;
readonly segments: LatLngTuple[][];
}
export function buildTrackAdjacency(tracks: readonly Track[]): TrackAdjacency {
const adjacency: TrackAdjacency = new Map();
const register = (fromId: string, toId: string, track: Track, isForward: boolean) => {
if (!adjacency.has(fromId)) {
adjacency.set(fromId, []);
}
adjacency.get(fromId)!.push({ neighborId: toId, track, isForward });
};
for (const track of tracks) {
register(track.startStationId, track.endStationId, track, true);
register(track.endStationId, track.startStationId, track, false);
}
return adjacency;
}
export function computeRoute({
startId,
endId,
stationById,
adjacency,
}: ComputeRouteParams): RouteComputation {
if (!startId || !endId) {
return emptyResult();
}
if (!stationById.has(startId) || !stationById.has(endId)) {
return {
stations: null,
tracks: [],
totalLength: null,
error: 'Selected stations are no longer available.',
segments: [],
};
}
if (startId === endId) {
const station = stationById.get(startId);
return {
stations: station ? [station] : null,
tracks: [],
totalLength: 0,
error: null,
segments: [],
};
}
const visited = new Set<string>();
const queue: string[] = [];
const parent = new Map<string, { prev: string | null; edge: NeighborEdge | null }>();
queue.push(startId);
visited.add(startId);
parent.set(startId, { prev: null, edge: null });
while (queue.length > 0) {
const current = queue.shift()!;
if (current === endId) {
break;
}
const neighbors = adjacency.get(current) ?? [];
for (const edge of neighbors) {
const { neighborId } = edge;
if (visited.has(neighborId)) {
continue;
}
visited.add(neighborId);
parent.set(neighborId, { prev: current, edge });
queue.push(neighborId);
}
}
if (!parent.has(endId)) {
return {
stations: null,
tracks: [],
totalLength: null,
error: 'No rail connection found between the selected stations.',
segments: [],
};
}
const stationPath: string[] = [];
const trackSequence: Track[] = [];
const directions: boolean[] = [];
let cursor: string | null = endId;
while (cursor) {
const details = parent.get(cursor);
if (!details) {
break;
}
stationPath.push(cursor);
if (details.edge) {
trackSequence.push(details.edge.track);
directions.push(details.edge.isForward);
}
cursor = details.prev;
}
stationPath.reverse();
trackSequence.reverse();
directions.reverse();
const stations = stationPath
.map((id) => stationById.get(id))
.filter((station): station is Station => Boolean(station));
const segments = buildSegments(trackSequence, directions, stationById);
const totalLength = computeTotalLength(trackSequence, stations);
return {
stations,
tracks: trackSequence,
totalLength,
error: null,
segments,
};
}
function buildSegments(
tracks: Track[],
directions: boolean[],
stationById: Map<string, Station>
): LatLngTuple[][] {
const segments: LatLngTuple[][] = [];
for (let index = 0; index < tracks.length; index += 1) {
const track = tracks[index];
const isForward = directions[index] ?? true;
const coordinates = extractTrackCoordinates(track, stationById);
if (coordinates.length < 2) {
continue;
}
segments.push(isForward ? coordinates : [...coordinates].reverse());
}
return segments;
}
function extractTrackCoordinates(
track: Track,
stationById: Map<string, Station>
): LatLngTuple[] {
if (Array.isArray(track.coordinates) && track.coordinates.length >= 2) {
return track.coordinates.map((pair) => [pair[0], pair[1]] as LatLngTuple);
}
const start = stationById.get(track.startStationId);
const end = stationById.get(track.endStationId);
if (!start || !end) {
return [];
}
return [
[start.latitude, start.longitude],
[end.latitude, end.longitude],
];
}
function computeTotalLength(tracks: Track[], stations: Station[]): number | null {
if (tracks.length === 0 && stations.length <= 1) {
return 0;
}
const hasTrackLengths = tracks.every(
(track) =>
typeof track.lengthMeters === 'number' && Number.isFinite(track.lengthMeters)
);
if (hasTrackLengths) {
return tracks.reduce((total, track) => total + (track.lengthMeters ?? 0), 0);
}
if (stations.length < 2) {
return null;
}
let total = 0;
for (let index = 0; index < stations.length - 1; index += 1) {
total += haversineDistance(stations[index], stations[index + 1]);
}
return total;
}
function haversineDistance(a: Station, b: Station): number {
const R = 6371_000;
const toRad = (value: number) => (value * Math.PI) / 180;
const dLat = toRad(b.latitude - a.latitude);
const dLon = toRad(b.longitude - a.longitude);
const lat1 = toRad(a.latitude);
const lat2 = toRad(b.latitude);
const sinDLat = Math.sin(dLat / 2);
const sinDLon = Math.sin(dLon / 2);
const root = sinDLat * sinDLat + Math.cos(lat1) * Math.cos(lat2) * sinDLon * sinDLon;
const c = 2 * Math.atan2(Math.sqrt(root), Math.sqrt(1 - root));
return R * c;
}
function emptyResult(): RouteComputation {
return {
stations: null,
tracks: [],
totalLength: null,
error: null,
segments: [],
};
}
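`computeTotalLength` above falls back to summing great-circle distances between consecutive stations whenever any track lacks a finite `lengthMeters`. A self-contained sanity check of that haversine formula; the coordinates are arbitrary London-area values in the spirit of the test fixtures, not real stations:

```typescript
// Haversine great-circle distance in metres, matching the fallback above.
function haversineMeters(
  aLat: number,
  aLon: number,
  bLat: number,
  bLon: number
): number {
  const R = 6371000; // mean Earth radius in metres
  const toRad = (v: number) => (v * Math.PI) / 180;
  const dLat = toRad(bLat - aLat);
  const dLon = toRad(bLon - aLon);
  const sinDLat = Math.sin(dLat / 2);
  const sinDLon = Math.sin(dLon / 2);
  const root =
    sinDLat * sinDLat +
    Math.cos(toRad(aLat)) * Math.cos(toRad(bLat)) * sinDLon * sinDLon;
  return R * 2 * Math.atan2(Math.sqrt(root), Math.sqrt(1 - root));
}

// Two points ~0.02° of latitude and 0.01° of longitude apart come out
// around 2.3 km, which is the right order of magnitude for adjacent stations.
console.log(Math.round(haversineMeters(51.5, -0.1, 51.52, -0.11)));
```

Note the fallback is only an estimate: it measures the straight line between station pairs, not the geometry of the track between them, so it will understate curved corridors.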

View File

@@ -15,7 +15,7 @@
"isolatedModules": true,
"noEmit": true,
"jsx": "react-jsx",
"types": ["vite/client", "vitest"]
},
"include": ["src"]
}

View File

@@ -0,0 +1,8 @@
import { defineConfig } from 'vitest/config';
export default defineConfig({
test: {
include: ['src/**/*.test.ts'],
environment: 'node',
},
});

159
scripts/init_demo_db.py Normal file
View File

@@ -0,0 +1,159 @@
#!/usr/bin/env python3
"""
Initialize the database with demo data for the Rail Game.
This script automates the database setup process:
1. Validates environment setup
2. Runs database migrations
3. Loads OSM fixtures for demo data
Usage:
python scripts/init_demo_db.py [--dry-run] [--region REGION]
Requirements:
- Virtual environment activated
- .env file configured with DATABASE_URL
- PostgreSQL with PostGIS running
"""
import argparse
import os
import subprocess
import sys
from pathlib import Path
try:
from dotenv import load_dotenv
load_dotenv()
except ImportError:
print("WARNING: python-dotenv not installed. .env file will not be loaded automatically.")
print("Install with: pip install python-dotenv")
def check_virtualenv():
"""Check if we're running in a virtual environment."""
# Skip virtualenv check in Docker containers
if os.getenv('INIT_DEMO_DB') == 'true':
return
if not hasattr(sys, 'real_prefix') and not (hasattr(sys, 'base_prefix') and sys.base_prefix != sys.prefix):
print("ERROR: Virtual environment not activated. Run:")
print(" .venv\\Scripts\\Activate.ps1 (PowerShell)")
print(" source .venv/bin/activate (Bash/macOS/Linux)")
sys.exit(1)
def check_env_file():
"""Check if .env file exists."""
env_file = Path('.env')
if not env_file.exists():
print("ERROR: .env file not found. Copy .env.example to .env and configure:")
print(" Copy-Item .env.example .env (PowerShell)")
print(" cp .env.example .env (Bash)")
sys.exit(1)
def check_database_url():
"""Check if DATABASE_URL is set in environment."""
database_url = os.getenv('DATABASE_URL')
if not database_url:
print("ERROR: DATABASE_URL not set. Check your .env file.")
sys.exit(1)
print(f"Using database: {database_url}")
def run_command(cmd, cwd=None, description=""):
"""Run a shell command and return the result."""
print(f"\n>>> {description}")
print(f"Running: {' '.join(cmd)}")
try:
result = subprocess.run(cmd, cwd=cwd, check=True,
capture_output=True, text=True)
if result.stdout:
print(result.stdout)
return result
except subprocess.CalledProcessError as e:
print(f"ERROR: Command failed with exit code {e.returncode}")
if e.stdout:
print(e.stdout)
if e.stderr:
print(e.stderr)
sys.exit(1)
def run_migrations():
"""Run database migrations using alembic."""
run_command(
['alembic', 'upgrade', 'head'],
cwd='backend',
description="Running database migrations"
)
def load_osm_fixtures(region, dry_run=False):
"""Load OSM fixtures for demo data."""
cmd = ['python', '-m', 'backend.scripts.osm_refresh', '--region', region]
if dry_run:
cmd.append('--no-commit')
description = f"Loading OSM fixtures (dry run) for region: {region}"
else:
description = f"Loading OSM fixtures for region: {region}"
run_command(cmd, description=description)
def main():
parser = argparse.ArgumentParser(
description="Initialize database with demo data")
parser.add_argument(
'--region',
default='all',
help='OSM region to load (default: all)'
)
parser.add_argument(
'--dry-run',
action='store_true',
help='Dry run: run migrations and load fixtures without committing'
)
parser.add_argument(
'--skip-migrations',
action='store_true',
help='Skip running migrations'
)
parser.add_argument(
'--skip-fixtures',
action='store_true',
help='Skip loading OSM fixtures'
)
args = parser.parse_args()
print("Rail Game Database Initialization")
print("=" * 40)
# Pre-flight checks
check_virtualenv()
check_env_file()
check_database_url()
# Run migrations
if not args.skip_migrations:
run_migrations()
else:
print("Skipping migrations (--skip-migrations)")
# Load fixtures
if not args.skip_fixtures:
load_osm_fixtures(args.region, args.dry_run)
else:
print("Skipping fixtures (--skip-fixtures)")
print("\n✅ Database initialization completed successfully!")
if args.dry_run:
print("Note: This was a dry run. No fixture data was committed (migrations were still applied unless skipped).")
else:
print("Demo data loaded. You can now start the backend server.")
if __name__ == '__main__':
main()
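The `check_virtualenv` helper above detects an active environment from the interpreter's prefix convention rather than from an environment variable: inside a venv created by `python -m venv`, `sys.prefix` points at the environment while `sys.base_prefix` still points at the base interpreter, and legacy virtualenv sets `sys.real_prefix` instead. A minimal sketch of the same detection as a reusable predicate (the helper name is illustrative):

```python
import sys


def in_virtualenv() -> bool:
    # Legacy virtualenv exposes sys.real_prefix; modern venvs diverge
    # sys.prefix from sys.base_prefix. Either signal means "activated".
    return hasattr(sys, "real_prefix") or sys.base_prefix != sys.prefix


if __name__ == "__main__":
    print("virtualenv active:", in_virtualenv())
```

This is why the script can skip the check under Docker (`INIT_DEMO_DB=true`): a container typically runs the system interpreter directly, where both prefixes coincide.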