Compare commits


3 Commits

779299ca63  feat: update TODO list with new tasks for station import, track management, and testing enhancements (2025-10-11 17:29:15 +02:00)
  Checks: Backend CI / lint-and-test (push) failing after 36s; Frontend CI / lint-and-build (push) successful in 15s

965de3a0f2  feat: add Dockerfile for backend and frontend services with necessary configurations (2025-10-11 17:25:51 +02:00)

1099a738a3  feat: add Playwright configuration and initial e2e test for authentication (2025-10-11 17:25:38 +02:00)
  - Created a Playwright configuration file to set up the testing environment.
  - Added a new e2e test for user authentication in login.spec.ts.
  - Updated tsconfig.node.json to include playwright.config.ts.
  - Enhanced vite.config.ts to include API proxying for backend integration.
  - Added a placeholder for last-run test results in .last-run.json.
21 changed files with 683 additions and 24 deletions

TODO.md (52 changed lines)

@@ -13,35 +13,67 @@
 - [x] Implement authentication flow (backend JWT, frontend login/register forms).
 - [x] Build map visualization integrating Leaflet with OSM tiles.
 - [ ] Create railway builder tools (track placement, station creation) with backend persistence APIs.
-- [ ] Establish train scheduling service with validation rules and API endpoints.
+- [ ] Define geographic bounding boxes and filtering rules for importing real-world stations from OpenStreetMap.
+- [ ] Implement an import script/CLI that pulls OSM station data and normalizes it to the PostGIS schema.
+- [ ] Expose backend CRUD endpoints for stations (create, update, archive) with validation and geometry handling.
+- [ ] Build React map tooling for manual station placement and editing, including form validation.
+- [ ] Define track selection criteria and tagging rules for harvesting OSM rail segments within target regions.
+- [ ] Extend the importer to load track geometries and associate them with existing stations.
+- [ ] Implement backend track-management APIs with length/speed validation and topology checks.
+- [ ] Create a frontend track-drawing workflow (polyline editor, snapping to stations, undo/redo).
+- [ ] Design train connection manager requirements (link trains to operating tracks, manage consist data).
+- [ ] Implement backend services and APIs to attach trains to routes and update assignments.
+- [ ] Add UI flows for managing train connections, including visual feedback on the map.
+- [ ] Establish train scheduling service with validation rules, conflict detection, and persistence APIs.
+- [ ] Provide frontend scheduling tools (timeline or table view) for creating and editing train timetables.
 - [ ] Develop frontend dashboards for resources, schedules, and achievements.
 - [ ] Add real-time simulation updates (WebSocket layer, frontend subscription hooks).

 ## Phase 3 Data & Persistence
 - [x] Design PostgreSQL/PostGIS schema and migrations (Alembic or similar).
-- [ ] Implement data access layer with SQLAlchemy and repository abstractions.
-- [ ] Seed initial data fixtures for development/demo purposes.
-- [ ] Integrate caching strategy (Redis) for map tiles and frequent queries.
+- [x] Implement data access layer with SQLAlchemy and repository abstractions.
+- [ ] Decide on canonical fixture scope (demo geography, sample trains) and document expected dataset size.
+- [ ] Author fixture generation scripts that export JSON/GeoJSON compatible with the repository layer.
+- [ ] Create ingestion utilities to load fixtures into local and CI databases.
+- [ ] Provision a Redis instance/container for local development.
+- [ ] Add caching abstractions in backend services (e.g., network snapshot, map layers).
+- [ ] Implement cache invalidation hooks tied to repository mutations.

 ## Phase 4 Testing & Quality
-- [ ] Write unit tests for backend services and models.
-- [ ] Add frontend component and hook tests (Jest + React Testing Library).
-- [ ] Implement end-to-end test suite (Playwright/Cypress) covering critical flows.
-- [ ] Set up load/performance testing harness for scheduling and simulation.
+- [x] Write unit tests for backend services and models.
+- [ ] Configure Jest/RTL testing utilities and shared mocks for Leaflet and network APIs.
+- [ ] Write component tests for map controls, station builder UI, and dashboards.
+- [ ] Add integration tests for custom hooks (network snapshot, scheduling forms).
+- [x] Stand up Playwright/Cypress project structure with authentication helpers.
+- [x] Script login end-to-end flow (Playwright).
+- [ ] Script station creation end-to-end flow.
+- [ ] Script track placement end-to-end flow.
+- [ ] Script scheduling end-to-end flow.
+- [ ] Define load/performance targets (requests per second, simulation latency) and tooling.
+- [ ] Implement performance test harness covering scheduling and real-time updates.

 ## Phase 5 Deployment & Ops
-- [ ] Create Dockerfiles for frontend and backend, plus docker-compose for local dev.
+- [x] Create Dockerfile for frontend.
+- [x] Create Dockerfile for backend.
+- [x] Create docker-compose for local development with Postgres/Redis dependencies.
+- [ ] Add task runner commands to orchestrate container workflows.
 - [ ] Set up CI/CD pipeline for automated builds, tests, and container publishing.
 - [ ] Provision infrastructure scripts (Terraform/Ansible) targeting initial cloud environment.
 - [ ] Define environment configuration strategy (secrets management, config maps).
 - [ ] Configure observability stack (logging, metrics, tracing).
+- [ ] Integrate tracing/logging exporters into backend services.
 - [ ] Document deployment pipeline and release process.

 ## Phase 6 Polish & Expansion
 - [ ] Add leaderboards and achievements logic with UI integration.
+- [ ] Design data model changes required for achievements and ranking.
 - [ ] Implement accessibility audit fixes (WCAG compliance).
+- [ ] Conduct accessibility audit (contrast, keyboard navigation, screen reader paths).
 - [ ] Optimize asset loading and introduce lazy loading strategies.
+- [ ] Establish performance budgets for bundle size and render times.
 - [ ] Evaluate multiplayer/coop roadmap and spike POCs where feasible.
+- [ ] Prototype networking approach (WebRTC/WebSocket) for cooperative sessions.

backend/Dockerfile (new file, 22 lines)

@@ -0,0 +1,22 @@
# syntax=docker/dockerfile:1
FROM python:3.11-slim AS base

ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    PIP_NO_CACHE_DIR=1

WORKDIR /app

RUN apt-get update \
    && apt-get install -y --no-install-recommends build-essential libpq-dev \
    && rm -rf /var/lib/apt/lists/*

COPY backend/requirements/base.txt ./backend/requirements/base.txt
RUN pip install --upgrade pip \
    && pip install -r backend/requirements/base.txt

COPY backend ./backend

EXPOSE 8000

CMD ["uvicorn", "backend.app.main:app", "--host", "0.0.0.0", "--port", "8000"]

backend/app/db/unit_of_work.py (new file)

@@ -0,0 +1,62 @@
from __future__ import annotations

from collections.abc import Callable
from typing import Optional

from sqlalchemy.orm import Session

from backend.app.db.session import SessionLocal
from backend.app.repositories import (
    StationRepository,
    TrainRepository,
    TrainScheduleRepository,
    TrackRepository,
    UserRepository,
)


class SqlAlchemyUnitOfWork:
    """Coordinate transactional work across repositories."""

    def __init__(self, session_factory: Callable[[], Session] = SessionLocal) -> None:
        self._session_factory = session_factory
        self.session: Optional[Session] = None
        self._committed = False
        self.users: UserRepository
        self.stations: StationRepository
        self.tracks: TrackRepository
        self.trains: TrainRepository
        self.train_schedules: TrainScheduleRepository

    def __enter__(self) -> "SqlAlchemyUnitOfWork":
        self.session = self._session_factory()
        self.users = UserRepository(self.session)
        self.stations = StationRepository(self.session)
        self.tracks = TrackRepository(self.session)
        self.trains = TrainRepository(self.session)
        self.train_schedules = TrainScheduleRepository(self.session)
        return self

    def __exit__(self, exc_type, exc, _tb) -> None:
        try:
            if exc:
                self.rollback()
            elif not self._committed:
                self.commit()
        finally:
            if self.session is not None:
                self.session.close()
                self.session = None
            self._committed = False

    def commit(self) -> None:
        if self.session is None:
            raise RuntimeError("Unit of work is not active")
        self.session.commit()
        self._committed = True

    def rollback(self) -> None:
        if self.session is None:
            return
        self.session.rollback()
        self._committed = False
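For readers new to the pattern, the contract of `SqlAlchemyUnitOfWork` above — commit on clean exit, roll back on exception, always close — can be demonstrated with a self-contained sketch. `FakeSession` and `UnitOfWork` here are illustrative stand-ins (the real class coordinates repositories as well), mirroring the dummy-session approach this changeset's tests use:

```python
class FakeSession:
    """Stand-in for a SQLAlchemy Session; records lifecycle calls."""
    def __init__(self) -> None:
        self.committed = False
        self.rolled_back = False
        self.closed = False
    def commit(self) -> None:
        self.committed = True
    def rollback(self) -> None:
        self.rolled_back = True
    def close(self) -> None:
        self.closed = True

class UnitOfWork:
    """Minimal sketch of commit-on-success / rollback-on-error."""
    def __init__(self, session_factory) -> None:
        self._session_factory = session_factory
        self.session = None
    def __enter__(self) -> "UnitOfWork":
        self.session = self._session_factory()
        return self
    def __exit__(self, exc_type, exc, tb) -> None:
        # Returning None lets any exception propagate to the caller.
        try:
            if exc:
                self.session.rollback()
            else:
                self.session.commit()
        finally:
            self.session.close()

ok = FakeSession()
with UnitOfWork(lambda: ok):
    pass  # clean exit: session is committed, then closed

failed = FakeSession()
try:
    with UnitOfWork(lambda: failed):
        raise ValueError("boom")  # error: session is rolled back, then closed
except ValueError:
    pass
```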

backend/app/models/__init__.py

@@ -12,8 +12,10 @@ from .base import (
     StationModel,
     TrackCreate,
     TrackModel,
+    TrainScheduleCreate,
     TrainCreate,
     TrainModel,
+    UserCreate,
     to_camel,
 )
@@ -29,7 +31,9 @@ __all__ = [
     "StationModel",
     "TrackCreate",
     "TrackModel",
+    "TrainScheduleCreate",
     "TrainCreate",
     "TrainModel",
+    "UserCreate",
     "to_camel",
 ]

backend/app/models/base.py

@@ -85,3 +85,21 @@ class TrainCreate(CamelModel):
     operator_id: str | None = None
     home_station_id: str | None = None
     consist: str | None = None
+
+
+class TrainScheduleCreate(CamelModel):
+    train_id: str
+    station_id: str
+    sequence_index: int
+    scheduled_arrival: datetime | None = None
+    scheduled_departure: datetime | None = None
+    dwell_seconds: int | None = None
+
+
+class UserCreate(CamelModel):
+    username: str
+    password_hash: str
+    email: str | None = None
+    full_name: str | None = None
+    role: str = "player"
+    preferences: str | None = None

backend/app/repositories/__init__.py

@@ -1,11 +1,15 @@
 """Repository abstractions for database access."""
 
 from backend.app.repositories.stations import StationRepository
+from backend.app.repositories.train_schedules import TrainScheduleRepository
 from backend.app.repositories.tracks import TrackRepository
 from backend.app.repositories.trains import TrainRepository
+from backend.app.repositories.users import UserRepository
 
 __all__ = [
     "StationRepository",
+    "TrainScheduleRepository",
     "TrackRepository",
     "TrainRepository",
+    "UserRepository",
 ]

backend/app/repositories/train_schedules.py (new file)

@@ -0,0 +1,45 @@
from __future__ import annotations

from uuid import UUID

import sqlalchemy as sa
from sqlalchemy.orm import Session

from backend.app.db.models import TrainSchedule
from backend.app.models import TrainScheduleCreate
from backend.app.repositories.base import BaseRepository


class TrainScheduleRepository(BaseRepository[TrainSchedule]):
    """Persistence operations for train timetables."""

    model = TrainSchedule

    def __init__(self, session: Session) -> None:
        super().__init__(session)

    @staticmethod
    def _ensure_uuid(value: UUID | str) -> UUID:
        if isinstance(value, UUID):
            return value
        return UUID(str(value))

    def list_for_train(self, train_id: UUID | str) -> list[TrainSchedule]:
        identifier = self._ensure_uuid(train_id)
        statement = (
            sa.select(self.model)
            .where(self.model.train_id == identifier)
            .order_by(self.model.sequence_index.asc())
        )
        return list(self.session.scalars(statement))

    def create(self, data: TrainScheduleCreate) -> TrainSchedule:
        schedule = TrainSchedule(
            train_id=self._ensure_uuid(data.train_id),
            station_id=self._ensure_uuid(data.station_id),
            sequence_index=data.sequence_index,
            scheduled_arrival=data.scheduled_arrival,
            scheduled_departure=data.scheduled_departure,
            dwell_seconds=data.dwell_seconds,
        )
        self.session.add(schedule)
        return schedule
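The `_ensure_uuid` helper exists because API payloads (`TrainScheduleCreate` declares `train_id: str`) carry identifiers as strings, while the ORM columns are native UUIDs. Its behavior in isolation:

```python
from uuid import UUID, uuid4

def ensure_uuid(value: "UUID | str") -> UUID:
    """Mirror of the repository helper: accept UUIDs or their string form."""
    if isinstance(value, UUID):
        return value
    return UUID(str(value))  # raises ValueError for malformed input

train_id = uuid4()
# Both spellings normalize to the same UUID object.
assert ensure_uuid(train_id) == ensure_uuid(str(train_id))
```

A side effect worth noting: malformed identifiers fail fast with `ValueError` at the repository boundary rather than reaching the database as invalid literals.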

backend/app/repositories/users.py (new file)

@@ -0,0 +1,37 @@
from __future__ import annotations

import sqlalchemy as sa
from sqlalchemy.orm import Session

from backend.app.db.models import User
from backend.app.models import UserCreate
from backend.app.repositories.base import BaseRepository


class UserRepository(BaseRepository[User]):
    """Data access helpers for user accounts."""

    model = User

    def __init__(self, session: Session) -> None:
        super().__init__(session)

    def get_by_username(self, username: str) -> User | None:
        statement = sa.select(self.model).where(
            sa.func.lower(self.model.username) == username.lower()
        )
        return self.session.scalar(statement)

    def list_recent(self, limit: int = 50) -> list[User]:
        statement = sa.select(self.model).order_by(self.model.created_at.desc()).limit(limit)
        return list(self.session.scalars(statement))

    def create(self, data: UserCreate) -> User:
        user = User(
            username=data.username,
            email=data.email,
            full_name=data.full_name,
            password_hash=data.password_hash,
            role=data.role,
            preferences=data.preferences,
        )
        self.session.add(user)
        return user
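`get_by_username` lower-cases both sides of the comparison via `sa.func.lower`, making lookups case-insensitive. The same semantics, sketched against a plain dict so it runs without a database (all names hypothetical):

```python
# Hypothetical in-memory store illustrating the lookup semantics only.
users = {"Demo": {"role": "player"}, "Alice": {"role": "admin"}}

def get_by_username(username: str):
    """Compare both sides lower-cased, like sa.func.lower(col) == username.lower()."""
    wanted = username.lower()
    for name, user in users.items():
        if name.lower() == wanted:
            return user
    return None

assert get_by_username("demo") == {"role": "player"}
assert get_by_username("ALICE") == {"role": "admin"}
assert get_by_username("ghost") is None
```

On large tables, a functional index on `lower(username)` would typically be needed so the SQL form of this comparison stays an index lookup rather than a sequential scan.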


@@ -12,6 +12,7 @@ except ImportError:  # pragma: no cover - allow running without shapely at import
     Point = None  # type: ignore[assignment]
 
 from sqlalchemy.orm import Session
+from sqlalchemy.exc import SQLAlchemyError
 
 from backend.app.models import StationModel, TrackModel, TrainModel
 from backend.app.repositories import StationRepository, TrackRepository, TrainRepository
@@ -94,9 +95,13 @@ def get_network_snapshot(session: Session) -> dict[str, list[dict[str, object]]]:
     track_repo = TrackRepository(session)
     train_repo = TrainRepository(session)
 
-    stations_entities = station_repo.list_active()
-    tracks_entities = track_repo.list_all()
-    trains_entities = train_repo.list_all()
+    try:
+        stations_entities = station_repo.list_active()
+        tracks_entities = track_repo.list_all()
+        trains_entities = train_repo.list_all()
+    except SQLAlchemyError:
+        session.rollback()
+        return _fallback_snapshot()
 
     if not stations_entities and not tracks_entities and not trains_entities:
         return _fallback_snapshot()
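The `try`/`except SQLAlchemyError` added above means a failed query degrades to a static snapshot (after rolling the session back) instead of surfacing a 500. The shape of that pattern, reduced to stdlib Python with `SnapshotError` standing in for `SQLAlchemyError`:

```python
class SnapshotError(Exception):
    """Stand-in for the database error class."""

def fallback_snapshot() -> dict:
    """Static payload served when live data is unavailable."""
    return {"stations": [], "tracks": [], "trains": []}

def load_snapshot(query) -> dict:
    """Serve live data, degrading to the fallback when the query fails."""
    try:
        return {"stations": query()}
    except SnapshotError:
        # Real code also rolls the session back here before falling back.
        return fallback_snapshot()

def broken_query():
    raise SnapshotError("db down")

assert load_snapshot(lambda: ["Central"]) == {"stations": ["Central"]}
assert load_snapshot(broken_query) == fallback_snapshot()
```

Catching the narrow error class matters: programming errors still fail loudly, while infrastructure errors degrade gracefully.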


@@ -1,17 +1,40 @@
 from __future__ import annotations
 
 from dataclasses import dataclass, field
-from typing import Any, List
+from datetime import datetime, timezone
+from typing import Any, List, cast
+from uuid import uuid4
+
+import pytest
 
-from backend.app.models import StationCreate, TrackCreate, TrainCreate
-from backend.app.repositories import StationRepository, TrackRepository, TrainRepository
+from backend.app.db.models import TrainSchedule, User
+from backend.app.db.unit_of_work import SqlAlchemyUnitOfWork
+from backend.app.models import (
+    StationCreate,
+    TrackCreate,
+    TrainCreate,
+    TrainScheduleCreate,
+    UserCreate,
+)
+from backend.app.repositories import (
+    StationRepository,
+    TrackRepository,
+    TrainRepository,
+    TrainScheduleRepository,
+    UserRepository,
+)
+from sqlalchemy.orm import Session
 
 
 @dataclass
 class DummySession:
     added: List[Any] = field(default_factory=list)
+    scalars_result: List[Any] = field(default_factory=list)
+    scalar_result: Any = None
+    statements: List[Any] = field(default_factory=list)
+    committed: bool = False
+    rolled_back: bool = False
+    closed: bool = False
 
     def add(self, instance: Any) -> None:
         self.added.append(instance)
@@ -19,12 +42,26 @@ class DummySession:
     def add_all(self, instances: list[Any]) -> None:
         self.added.extend(instances)
 
-    def scalars(self, _statement: Any) -> list[Any]:  # pragma: no cover - not used here
-        return []
+    def scalars(self, statement: Any) -> list[Any]:
+        self.statements.append(statement)
+        return list(self.scalars_result)
+
+    def scalar(self, statement: Any) -> Any:
+        self.statements.append(statement)
+        return self.scalar_result
 
     def flush(self, _objects: list[Any] | None = None) -> None:  # pragma: no cover - optional
         return None
+
+    def commit(self) -> None:  # pragma: no cover - optional
+        self.committed = True
+
+    def rollback(self) -> None:  # pragma: no cover - optional
+        self.rolled_back = True
+
+    def close(self) -> None:  # pragma: no cover - optional
+        self.closed = True
 
 
 def test_station_repository_create_generates_geometry() -> None:
     session = DummySession()
@@ -104,3 +141,96 @@ def test_train_repository_create_supports_optional_ids() -> None:
     assert train.designation == "ICE 123"
     assert str(train.home_station_id).endswith("1")
     assert train.operator_id is None
+
+
+def test_user_repository_create_persists_user() -> None:
+    session = DummySession()
+    repo = UserRepository(session)  # type: ignore[arg-type]
+
+    user = repo.create(
+        UserCreate(
+            username="demo",
+            password_hash="hashed",
+            email="demo@example.com",
+            full_name="Demo Engineer",
+            role="admin",
+        )
+    )
+
+    assert session.added and session.added[0] is user
+    assert user.username == "demo"
+    assert user.role == "admin"
+
+
+def test_user_repository_get_by_username_is_case_insensitive() -> None:
+    existing = User(username="Demo", password_hash="hashed", role="player")
+    session = DummySession(scalar_result=existing)
+    repo = UserRepository(session)  # type: ignore[arg-type]
+
+    result = repo.get_by_username("demo")
+
+    assert result is existing
+    assert session.statements
+
+
+def test_train_schedule_repository_create_converts_identifiers() -> None:
+    session = DummySession()
+    repo = TrainScheduleRepository(session)  # type: ignore[arg-type]
+    train_id = uuid4()
+    station_id = uuid4()
+
+    schedule = repo.create(
+        TrainScheduleCreate(
+            train_id=str(train_id),
+            station_id=str(station_id),
+            sequence_index=1,
+            scheduled_arrival=datetime.now(timezone.utc),
+            dwell_seconds=90,
+        )
+    )
+
+    assert session.added and session.added[0] is schedule
+    assert schedule.train_id == train_id
+    assert schedule.station_id == station_id
+
+
+def test_train_schedule_repository_list_for_train_orders_results() -> None:
+    train_id = uuid4()
+    schedules = [
+        TrainSchedule(train_id=train_id, station_id=uuid4(), sequence_index=2),
+        TrainSchedule(train_id=train_id, station_id=uuid4(), sequence_index=1),
+    ]
+    session = DummySession(scalars_result=schedules)
+    repo = TrainScheduleRepository(session)  # type: ignore[arg-type]
+
+    result = repo.list_for_train(train_id)
+
+    assert result == schedules
+    statement = session.statements[-1]
+    assert getattr(statement, "_order_by_clauses", ())
+
+
+def test_unit_of_work_commits_and_closes_session() -> None:
+    session = DummySession()
+    uow = SqlAlchemyUnitOfWork(lambda: cast(Session, session))
+
+    with uow as active:
+        active.users.create(UserCreate(username="demo", password_hash="hashed"))
+        active.commit()
+
+    assert session.committed
+    assert session.closed
+
+
+def test_unit_of_work_rolls_back_on_exception() -> None:
+    session = DummySession()
+    uow = SqlAlchemyUnitOfWork(lambda: cast(Session, session))
+
+    with pytest.raises(RuntimeError):
+        with uow:
+            raise RuntimeError("boom")
+
+    assert session.rolled_back
+    assert session.closed

docker-compose.yml (new file, 45 lines)

@@ -0,0 +1,45 @@
version: "3.9"

services:
  db:
    build:
      context: ./infra/postgres
    image: rail-game-postgres
    environment:
      POSTGRES_USER: railgame
      POSTGRES_PASSWORD: railgame
      POSTGRES_DB: railgame_dev
    ports:
      - "5432:5432"
    volumes:
      - postgres-data:/var/lib/postgresql/data

  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"

  backend:
    build:
      context: .
      dockerfile: backend/Dockerfile
    environment:
      DATABASE_URL: postgresql+psycopg://railgame:railgame@db:5432/railgame_dev
      TEST_DATABASE_URL: postgresql+psycopg://railgame:railgame@db:5432/railgame_test
      REDIS_URL: redis://redis:6379/0
    depends_on:
      - db
      - redis
    ports:
      - "8000:8000"

  frontend:
    build:
      context: ./frontend
    depends_on:
      - backend
    ports:
      - "8080:80"

volumes:
  postgres-data:
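One caveat with the compose file above: the short-form `depends_on` only orders container startup; it does not wait for Postgres to accept connections, so the backend can race the database on a cold start. If that bites, a health-gated variant is one option — a sketch only, using the standard `pg_isready` tool shipped in the Postgres image:

```yaml
# Hypothetical extension: gate the backend on a healthy database rather than
# on container start order alone.
services:
  db:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U railgame -d railgame_dev"]
      interval: 5s
      timeout: 3s
      retries: 10
  backend:
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_started
```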

frontend/Dockerfile (new file, 16 lines)

@@ -0,0 +1,16 @@
# syntax=docker/dockerfile:1
FROM node:20-alpine AS build
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci
COPY . .
RUN npm run build
FROM nginx:1.27-alpine
COPY --from=build /app/dist /usr/share/nginx/html
COPY nginx.conf /etc/nginx/conf.d/default.conf
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]

frontend/nginx.conf (new file, 20 lines)

@@ -0,0 +1,20 @@
server {
    listen 80;
    server_name localhost;

    root /usr/share/nginx/html;
    index index.html;

    location / {
        try_files $uri $uri/ /index.html;
    }

    location /api/ {
        proxy_pass http://backend:8000/api/;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
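A nuance in the `/api/` block above: because the `location` prefix and the `proxy_pass` URI are both `/api/`, request paths are forwarded unchanged, so the backend app must serve its routes under `/api` as well. An equivalent, shorter form (assuming the same backend prefix) drops the URI from `proxy_pass` entirely:

```nginx
# Equivalent sketch: with no URI part on proxy_pass, the original request
# path (including the /api/ prefix) is passed through verbatim.
location /api/ {
    proxy_pass http://backend:8000;
}
```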

frontend/package-lock.json

@@ -14,6 +14,7 @@
     "react-leaflet": "^4.2.1"
   },
   "devDependencies": {
+    "@playwright/test": "^1.48.0",
     "@types/leaflet": "^1.9.13",
     "@types/node": "^20.16.5",
     "@types/react": "^18.3.3",
@@ -972,6 +973,22 @@
         "node": ">= 8"
       }
     },
+    "node_modules/@playwright/test": {
+      "version": "1.56.0",
+      "resolved": "https://registry.npmjs.org/@playwright/test/-/test-1.56.0.tgz",
+      "integrity": "sha512-Tzh95Twig7hUwwNe381/K3PggZBZblKUe2wv25oIpzWLr6Z0m4KgV1ZVIjnR6GM9ANEqjZD7XsZEa6JL/7YEgg==",
+      "dev": true,
+      "license": "Apache-2.0",
+      "dependencies": {
+        "playwright": "1.56.0"
+      },
+      "bin": {
+        "playwright": "cli.js"
+      },
+      "engines": {
+        "node": ">=18"
+      }
+    },
     "node_modules/@react-leaflet/core": {
       "version": "2.1.0",
       "resolved": "https://registry.npmjs.org/@react-leaflet/core/-/core-2.1.0.tgz",
@@ -4654,6 +4671,53 @@
         "url": "https://github.com/sponsors/jonschlinkert"
       }
     },
+    "node_modules/playwright": {
+      "version": "1.56.0",
+      "resolved": "https://registry.npmjs.org/playwright/-/playwright-1.56.0.tgz",
+      "integrity": "sha512-X5Q1b8lOdWIE4KAoHpW3SE8HvUB+ZZsUoN64ZhjnN8dOb1UpujxBtENGiZFE+9F/yhzJwYa+ca3u43FeLbboHA==",
+      "dev": true,
+      "license": "Apache-2.0",
+      "dependencies": {
+        "playwright-core": "1.56.0"
+      },
+      "bin": {
+        "playwright": "cli.js"
+      },
+      "engines": {
+        "node": ">=18"
+      },
+      "optionalDependencies": {
+        "fsevents": "2.3.2"
+      }
+    },
+    "node_modules/playwright-core": {
+      "version": "1.56.0",
+      "resolved": "https://registry.npmjs.org/playwright-core/-/playwright-core-1.56.0.tgz",
+      "integrity": "sha512-1SXl7pMfemAMSDn5rkPeZljxOCYAmQnYLBTExuh6E8USHXGSX3dx6lYZN/xPpTz1vimXmPA9CDnILvmJaB8aSQ==",
+      "dev": true,
+      "license": "Apache-2.0",
+      "bin": {
+        "playwright-core": "cli.js"
+      },
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/playwright/node_modules/fsevents": {
+      "version": "2.3.2",
+      "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.2.tgz",
+      "integrity": "sha512-xiqMQR4xAeHTuB9uWm+fFRcIOgKBMiOBP+eXiyT7jsgVCq1bkVygt00oASowB7EdtpOHaaPgKt812P9ab+DDKA==",
+      "dev": true,
+      "hasInstallScript": true,
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "darwin"
+      ],
+      "engines": {
+        "node": "^8.16.0 || ^10.6.0 || >=11.0.0"
+      }
+    },
     "node_modules/possible-typed-array-names": {
       "version": "1.1.0",
       "resolved": "https://registry.npmjs.org/possible-typed-array-names/-/possible-typed-array-names-1.1.0.tgz",

frontend/package.json

@@ -8,7 +8,8 @@
     "build": "tsc --noEmit && tsc --project tsconfig.node.json --noEmit && vite build",
     "preview": "vite preview",
     "lint": "eslint \"src/**/*.{ts,tsx}\"",
-    "format": "prettier --write \"src/**/*.{ts,tsx,css}\""
+    "format": "prettier --write \"src/**/*.{ts,tsx,css}\"",
+    "test:e2e": "playwright test"
   },
   "dependencies": {
     "leaflet": "^1.9.4",
@@ -24,6 +25,7 @@
     "@typescript-eslint/eslint-plugin": "^8.6.0",
     "@typescript-eslint/parser": "^8.6.0",
     "@vitejs/plugin-react": "^4.3.1",
+    "@playwright/test": "^1.48.0",
     "eslint": "^8.57.0",
     "eslint-config-prettier": "^9.1.0",
     "eslint-plugin-import": "^2.31.0",

File diff suppressed because one or more lines are too long

frontend/playwright.config.ts (new file)

@@ -0,0 +1,22 @@
import { defineConfig, devices } from '@playwright/test';

const baseURL = process.env.PLAYWRIGHT_BASE_URL ?? 'http://localhost:8080';

export default defineConfig({
  testDir: './tests/e2e',
  fullyParallel: true,
  retries: process.env.CI ? 2 : 0,
  reporter: [['list'], ['html', { outputFolder: 'playwright-report', open: 'never' }]],
  use: {
    baseURL,
    trace: 'on-first-retry',
    screenshot: 'only-on-failure',
    video: 'retain-on-failure',
  },
  projects: [
    {
      name: 'chromium',
      use: { ...devices['Desktop Chrome'] },
    },
  ],
});

.last-run.json (new file)

@@ -0,0 +1,4 @@
{
  "status": "passed",
  "failedTests": []
}

frontend/tests/e2e/login.spec.ts (new file)

@@ -0,0 +1,26 @@
import { expect, test, type Page } from '@playwright/test';

const demoCredentials = {
  username: process.env.DEMO_USERNAME ?? 'demo',
  password: process.env.DEMO_PASSWORD ?? 'railgame123',
};

test.describe('Authentication', () => {
  test('allows the demo user to sign in and view the network snapshot', async ({
    page,
  }: {
    page: Page;
  }) => {
    await page.goto('/');

    await expect(page.getByRole('heading', { name: 'Sign in' })).toBeVisible();

    await page.getByLabel('Username').fill(demoCredentials.username);
    await page.getByLabel('Password').fill(demoCredentials.password);
    await page.getByRole('button', { name: 'Sign in' }).click();

    await expect(page.getByRole('heading', { name: 'Network Snapshot' })).toBeVisible();
    await expect(page.getByRole('heading', { name: 'Stations' })).toBeVisible();
    await expect(page.getByRole('button', { name: 'Sign out' })).toBeVisible();
  });
});

frontend/tsconfig.node.json

@@ -14,5 +14,5 @@
     "isolatedModules": true,
     "types": ["node"]
   },
-  "include": ["vite.config.ts"]
+  "include": ["vite.config.ts", "playwright.config.ts"]
 }

frontend/vite.config.ts

@@ -1,10 +1,26 @@
-import { defineConfig } from "vite";
-import react from "@vitejs/plugin-react";
+import react from '@vitejs/plugin-react';
+import { defineConfig } from 'vite';
+
+const backendUrl = process.env.VITE_BACKEND_URL ?? 'http://localhost:8000';
 
 export default defineConfig({
   plugins: [react()],
   server: {
     port: 5173,
-    open: true
-  }
+    open: true,
+    proxy: {
+      '/api': {
+        target: backendUrl,
+        changeOrigin: true,
+      },
+    },
+  },
+  preview: {
+    proxy: {
+      '/api': {
+        target: backendUrl,
+        changeOrigin: true,
+      },
+    },
+  },
 });