# Compare commits

196 commits: `nuscore`...`mytischten`

The comparison covers the following commit SHA1s:

```
4e81a1c4a7 b2017b7365 b3bbca3887 0ee9e486b5 00e058a665 e5a0dfdddc
83f4e1c45e f0477b1023 07370bfcef f031485bd4 e22e3257ef 76f1b1a12f
6007e70b9d d7935cc1e2 b470e728ed d09de49018 8892392bf2 26acb588e1
566361e46a 1191636d92 526eca8b97 af6048b289 5605cd6189 84bbcb0f87
f9a63a13ce 2a7694617b 6ff672c5f1 a2e9e5e510 5b0a3baa21 9cb9ff511c
e079fe4827 2c8cad52a7 12184c2f72 1f94c273ae e333a54025 a86c05eb66
c2dbf0a12d a8470145a0 2871b79b04 503ff90dfa 673a3afbb5 10e6d74d93
3fc1760b2c d12b9daf87 75cc2df06b 7454a274a1 380709c29c c6f8b4dd74
02c947b0e3 c3366313d6 b1e184c4c2 3e05bdab51 fde6ba55d2 19410a0ee2
28db204aba 47a815dd71 14dc654145 025ad68cf3 89f30f76f5 85c26bc80d
6cdcbfe0db 7e1b09fa97 18a191f686 e21b50fc38 23caeddf9e 663125670e
515e04d1e3 bf082ea995 67fc5d45e1 30e3f4f321 c4e237cfca fea84e210a
e94a12cd20 438029a3a4 c58491c97a 1d9b9dbc45 dc791dc33d 57fbbff353
b00a35af30 dd0f29124c dc084806ab 4b4c48a50f 65acc9e0d5 13cd55c051
9bf37399d5 047b1801b3 945ec0d48c e83bc250a8 0c28b12978 5aa11151cf
a651113dee bf0d5b0935 6acdcfa5c3 dc2c60cefe bdbbb88be9 e6146b8f5a
f7a799ea7f b74cb30cf6 0d2dfd9a07 61e5efadb8 88d050392f 08b0be78ad
b0e610f3ab 0285c05fa6 5d4f2ebd4b bfa908ac9a 9592459348 47f53ee3fd
c22f4016cc 2458ba2d37 6eb42812fd 938ce4d991 cb6e84945b 8c6be234c6
fe160420c1 167e3ba3ec 9455b5d65a e6627a897e 71fc85427b 76597a4360
4f9761efb0 51e47cf9f9 0525f7908d a4d89374b7 de907df092 b906ac64b3
b7bbb92f86 6896484e9e 9cc9db3a5a 1c99fb30a1 2782661206 d10b663dc1
9baa6bae01 945fd85e39 5b04ed7904 de36a8ce2b 903b036a63 5f3b6200ec
eff211856f a81c3453b5 56c708d3a0 062bddcf52 4f98c782f3 3ea2907d08
ba5d6b14a8 004a94404a 5ddf998672 baf5bda6f2 572de5f7d4 37893474b1
f437747664 22e9750e5d bd95f77131 bbdc923950 3e5ddd8a05 f4e5cf2edb
44dba70aac 7698d87ba0 201d5e9214 c21544d9b6 6167116630 1bb5f61b57
1535c8795b cb2d7d3936 5b4a5ba501 90b5f8d63d 1ff3d9d1a6 df6fb23132
1e86b821e8 5923ef8bba cd8f40aa9d d392ccddd5 4a83e5c159 911c07e522
cd89c68a69 f1321b18bb 54ce09e9a9 7a9e856961 fd4b47327f 3a26f10110
ce2bda37ac 5dda346fd7 28c92b66af d08835e206 3334d76688 d48cc4385f
9b8dcd8561 2b06a8dd10 58e773e51e 8d17cad299 156f4d6921 e27a4d960d
c589c11607 0caa31e3eb fff5d404f5 7aff827711
```
**.gitignore** (vendored, 3 changes)

```
@@ -7,4 +7,5 @@ backend/.env
backend/images/*
backend/backend-debug.log
backend/*.log
backend/.env.local
```
**CHECK_SERVER.md** (new file, +86 lines)

# Server Check: i18n Fixes

## Local Check (already done)

✅ All files are correct locally:

- `TeamManagementView.vue` - all `$t()` calls replaced with `t()`, `t` included in the return statement
- `PermissionsView.vue` - all `$t()` calls replaced with `t()`, `t` included in the return statement
- `LogsView.vue` - all `$t()` calls replaced with `t()`, `t` included in the return statement
- `SeasonSelector.vue` - already correct
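The pattern behind these fixes can be sketched as follows (a sketch with a mocked i18n instance and hypothetical message keys; the real app is assumed to use vue-i18n, where `i18n.global.t` is the programmatic translation function):

```javascript
// Sketch only: stand-in for the app's vue-i18n instance, with hypothetical messages.
const i18n = {
  global: {
    t: (key, params) => {
      const messages = {
        'team.title': 'Teams',
        'team.count': `${params?.n ?? 0} teams`,
      };
      return messages[key] ?? key; // unknown keys fall back to the key itself
    },
  },
};

// The wrapper each fixed component defines:
const t = (key, params) => i18n.global.t(key, params);

// In a component's setup()/data flow, `t` is returned so the template can call
// t('team.title') instead of the $t() global:
//   return { t, /* other bindings */ };
console.log(t('team.title'));
console.log(t('team.count', { n: 3 }));
```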
## Server Check

### 1. Copy the check script to the server

```bash
# From the local machine:
scp check-i18n-fixes.sh rv2756:/var/www/tt-tagebuch.de/
```

### 2. Run it on the server

```bash
# On the server:
cd /var/www/tt-tagebuch.de
chmod +x check-i18n-fixes.sh
./check-i18n-fixes.sh
```

### 3. If the files are not up to date

```bash
# On the server:
cd /var/www/tt-tagebuch.de
git pull origin main
cd backend
npm install  # Builds the frontend automatically (via postinstall script)
```

### 4. Restart the backend (if necessary)

```bash
# If running as a systemd service:
sudo systemctl restart tt-tagebuch

# Or if running as a PM2 process:
pm2 restart tt-tagebuch-backend
```
## Expected Results

The check script should print the following output:

```
1. TeamManagementView.vue:
✓ Enthält 'const t = (key, params) => i18n.global.t'
✓ Enthält keine $t() Aufrufe mehr
✓ 't' ist im return Statement enthalten

2. PermissionsView.vue:
✓ Enthält 'const t = (key, params) => i18n.global.t'
✓ Enthält keine $t() Aufrufe mehr
✓ 't' ist im return Statement enthalten

3. LogsView.vue:
✓ Enthält 'const t = (key, params) => i18n.global.t'
✓ Enthält keine $t() Aufrufe mehr
✓ 't' ist im return Statement enthalten

4. SeasonSelector.vue:
✓ Enthält 'const t = (key, params) => i18n.global.t'
✓ Enthält keine $t() Aufrufe mehr
```
## Commits That Must Be on the Server

The following commits must be present on the server:

- `b0e610f` - Fix: Replace all $t() calls with t() in PermissionsView and LogsView templates
- `0285c05` - Fix: Replace all $t() calls with t() in TeamManagementView template
- `5d4f2eb` - Update localization handling in TeamManagementView

Check with:

```bash
git log --oneline -5
```
**DEPLOYMENT_SOCKET_IO.md** (new file, +191 lines)

# Deployment Guide: Socket.IO with SSL

Socket.IO now runs directly on HTTPS port 3051 (not through the Apache proxy).
## Steps After Deployment

### 1. Open the firewall port

```bash
# UFW (Ubuntu firewall)
sudo ufw allow 3051/tcp
```

### 2. Update the Apache configuration

```bash
sudo cp /var/www/tt-tagebuch.de/apache.conf.example /etc/apache2/sites-available/tt-tagebuch.de-le-ssl.conf
sudo systemctl restart apache2
```

### 3. Configure the systemd service (as www-data)

**IMPORTANT:** The service should run as `www-data`, not as `nobody`!

```bash
# Install the service file
sudo cp /var/www/tt-tagebuch.de/tt-tagebuch.service /etc/systemd/system/
sudo systemctl daemon-reload
```

The service file configures:

- User: `www-data` (the standard web server user)
- Group: `www-data`
- Ports: 3050 (HTTP) and 3051 (HTTPS)

### 4. Set the SSL certificate permissions

**IMPORTANT:** The Node.js process must have access to the SSL certificates!

```bash
cd /var/www/tt-tagebuch.de/backend
chmod +x scripts/fixCertPermissions.sh
sudo ./scripts/fixCertPermissions.sh
```

This script:

- creates the `ssl-cert` group (if it does not exist)
- adds the service user (`www-data`) to the group
- sets the permissions on the certificates
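The three bullets above can be sketched as a script (a sketch only; the real `fixCertPermissions.sh` may differ, and the commands are echoed unless `APPLY=1` is set because they require root):

```shell
#!/bin/sh
# Sketch of what scripts/fixCertPermissions.sh is described to do above.
CERT_ROOT="/etc/letsencrypt"
SERVICE_USER="www-data"

# Runs the command when APPLY=1, otherwise prints it (dry run).
maybe() { if [ "${APPLY:-0}" = "1" ]; then "$@"; else echo "+ $*"; fi; }

# 1) Create the ssl-cert group if it is missing
getent group ssl-cert >/dev/null 2>&1 || maybe groupadd ssl-cert
# 2) Add the service user to the group
maybe usermod -aG ssl-cert "$SERVICE_USER"
# 3) Let the group read the certificates
maybe chgrp -R ssl-cert "$CERT_ROOT/live" "$CERT_ROOT/archive"
maybe chmod -R g+rX "$CERT_ROOT/live" "$CERT_ROOT/archive"
```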
### 5. Restart the backend

**IMPORTANT:** The backend must be restarted so that the HTTPS server starts listening on port 3051!

```bash
# If running as a systemd service:
sudo systemctl restart tt-tagebuch

# Or if running as a PM2 process:
pm2 restart tt-tagebuch-backend
```

### 6. Check that the HTTPS server is running

```bash
# Check whether port 3051 is open
sudo netstat -tlnp | grep 3051
# Or:
sudo ss -tlnp | grep 3051

# Check the backend logs
sudo journalctl -u tt-tagebuch -f
# Or with PM2:
pm2 logs tt-tagebuch-backend
```

You should see the following message:

```
🚀 HTTPS-Server für Socket.IO läuft auf Port 3051
```

### 7. Run the diagnostic script

```bash
cd /var/www/tt-tagebuch.de/backend
node scripts/checkSocketIOServer.js
```

This script checks:

- whether the SSL certificates exist
- whether port 3051 is open
- whether the server is reachable

### 8. Test

In the browser, Socket.IO should now connect directly to `wss://tt-tagebuch.de:3051`.
## Troubleshooting

### Port 3051 is not reachable

1. **Check the firewall:**
   ```bash
   sudo ufw status
   sudo ufw allow 3051/tcp
   ```

2. **Check whether the server is running:**
   ```bash
   sudo netstat -tlnp | grep 3051
   sudo ss -tlnp | grep 3051
   ```

3. **Check the backend logs for errors:**
   ```bash
   sudo journalctl -u tt-tagebuch -n 50
   # Or:
   pm2 logs tt-tagebuch-backend --lines 50
   ```

4. **Check whether the HTTPS server actually started:**
   - Search the logs for: `🚀 HTTPS-Server für Socket.IO läuft auf Port 3051`
   - If it is missing, look for the error: `⚠️ HTTPS-Server konnte nicht gestartet werden`

### SSL certificate errors / permission errors

**Error:** `EACCES: permission denied, open '/etc/letsencrypt/live/tt-tagebuch.de/privkey.pem'`

**Solution:**

```bash
cd /var/www/tt-tagebuch.de/backend
chmod +x scripts/fixCertPermissions.sh
sudo ./scripts/fixCertPermissions.sh
sudo systemctl restart tt-tagebuch
```

Make sure the certificates exist:

```bash
ls -la /etc/letsencrypt/live/tt-tagebuch.de/
```

If the certificates do not exist:

```bash
sudo certbot certonly --standalone -d tt-tagebuch.de
```

### Service runs as "nobody"

**Problem:** The service runs as `nobody`, which is too restricted.

**Solution:**

1. Install the service file (see step 3)
2. Run the permissions script (see step 4)
3. Restart the service

```bash
# Check the current service user
sudo systemctl show -p User tt-tagebuch.service

# Install the service file
sudo cp /var/www/tt-tagebuch.de/tt-tagebuch.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl restart tt-tagebuch

# Check that it now runs as www-data
sudo systemctl show -p User tt-tagebuch.service
```

### Frontend does not connect

1. **Check the browser console for errors**
2. **Check that `import.meta.env.PROD` is set correctly:**
   - In production the Socket.IO URL should be `https://tt-tagebuch.de:3051`
   - In development it should be `http://localhost:3005`
3. **Check that the Socket.IO URL is correct:**
   - Open the browser dev tools → Network
   - Look for WebSocket connections
   - The URL should be `wss://tt-tagebuch.de:3051/socket.io/...`
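The environment-dependent URL choice can be sketched as a small helper (assumption: the real frontend uses `socket.io-client`; only the URL selection is shown here, with the actual `io(...)` call in a comment):

```javascript
// Picks the Socket.IO base URL depending on the Vite build mode.
function socketUrl(isProd) {
  return isProd ? 'https://tt-tagebuch.de:3051' : 'http://localhost:3005';
}

// In the app (Vite exposes import.meta.env.PROD at build time):
//   const socket = io(socketUrl(import.meta.env.PROD), { transports: ['websocket'] });
console.log(socketUrl(true));
console.log(socketUrl(false));
```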
### Server only listens on localhost

The server should listen on `0.0.0.0` (not just on `localhost`).
This is already set in the configuration:

```javascript
httpsServer.listen(httpsPort, '0.0.0.0', () => {
  console.log(`🚀 HTTPS-Server für Socket.IO läuft auf Port ${httpsPort}`);
});
```

If the server is still not reachable, check the backend logs.
**DSGVO_CHECKLIST.md** (new file, +342 lines)

# GDPR Compliance Checklist for the Trainingstagebuch

## Status: ⚠️ REVIEW REQUIRED

This checklist documents the current state of the application's GDPR compliance.

---

## 1. Privacy Policy ✅ / ⚠️

### Status: ⚠️ Partially in place, needs updating

**In place:**

- ✅ Privacy policy available (`/datenschutz`)
- ✅ Legal notice available (`/impressum`)
- ✅ Linked in the footer

**Missing / needs improvement:**

- ⚠️ MyTischtennis integration not mentioned (third-country transfer?)
- ⚠️ Logging of API requests not mentioned
- ⚠️ Encryption of member data not mentioned
- ⚠️ Retention period for logs not specified
- ⚠️ No information about automatic deletion

---

## 2. Consents ⚠️

### Status: ⚠️ Partially in place

**In place:**

- ✅ `picsInInternetAllowed` on members (consent to photos on the internet)
- ✅ MyTischtennis: `savePassword` and `autoUpdateRatings` (consents)

**Missing / needs improvement:**

- ⚠️ No explicit consent to the privacy policy at registration
- ⚠️ No consent to the logging of API requests
- ⚠️ No consent to the data transfer to MyTischtennis.de
- ⚠️ No way to withdraw consents (other than manually)

---
## 3. Right to Erasure (Art. 17 GDPR) ⚠️

### Status: ⚠️ Partially implemented

**In place:**

- ✅ DELETE endpoints for many resources (Member, Tournament, etc.)
- ✅ The MyTischtennis account can be deleted

**Missing / needs improvement:**

- ❌ **CRITICAL:** No endpoint to completely delete a user account
- ❌ **CRITICAL:** No automatic deletion of all associated data (cascade delete)
- ❌ No deletion of logs once the retention period has expired
- ⚠️ No anonymization as an alternative to deletion (in case statutory retention obligations apply)
- ⚠️ No confirmation before deleting critical data

**Recommendation:**

- Implement a `/api/user/delete` endpoint
- Implement automatic deletion of all associated data:
  - UserClub entries
  - MyTischtennis account
  - All logs (after anonymization)
  - All members assigned only to this user
- Implement automatic deletion of logs after 90 days

---

## 4. Right of Access (Art. 15 GDPR) ❌

### Status: ❌ Not implemented

**Missing:**

- ❌ **CRITICAL:** No endpoint providing information about stored data
- ❌ No overview of all personal data of a user
- ❌ No overview of all member data
- ❌ No overview of logs concerning a user

**Recommendation:**

- Implement a `/api/user/data-export` endpoint
- Export all data in a structured format (JSON)
- Include:
  - User data
  - Club memberships
  - Member data (if the user has access)
  - Logs
  - MyTischtennis data

---

## 5. Data Portability (Art. 20 GDPR) ❌

### Status: ❌ Not implemented

**Missing:**

- ❌ **CRITICAL:** No export in a machine-readable format
- ❌ No JSON/XML export function
- ⚠️ A PDF export exists for training days, but not for all data

**Recommendation:**

- Implement `/api/user/data-export` with JSON output
- Implement exports for:
  - All of the user's own data
  - All member data (if authorized)
  - All training data
  - All tournament data
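The recommended export payload could look roughly like this (a sketch: the field names mirror the lists above, not the backend's actual schema, and the sample data is invented):

```javascript
// Builds the Art. 15 / Art. 20 export object from already-loaded records.
function buildUserExport({ user, clubs, members, logs, myTischtennis }) {
  return {
    exportedAt: new Date().toISOString(),
    format: 'json',           // Art. 20: structured, machine-readable
    user,                     // account data
    clubMemberships: clubs,   // UserClub entries
    members,                  // member data the user may access
    activityLogs: logs,       // logs concerning this user
    myTischtennis,            // linked MyTischtennis data
  };
}

const payload = buildUserExport({
  user: { id: 1, username: 'demo' },
  clubs: [], members: [], logs: [], myTischtennis: null,
});
console.log(Object.keys(payload).length); // 7 top-level fields
```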
---
## 6. Encryption ✅ / ⚠️

### Status: ✅ Well implemented

**In place:**

- ✅ AES-256-CBC encryption for member data:
  - firstName, lastName
  - birthDate
  - phone, street, city, postalCode
  - email
  - notes (Participant)
- ✅ Passwords are hashed with bcrypt
- ✅ HTTPS for all connections

**Needs improvement:**

- ⚠️ The encryption key should live in a separate, secure configuration
- ✅ **FIXED:** MyTischtennis data is now fully encrypted (e-mail, access token, refresh token, cookie, user data, club information)
- ⚠️ No encryption for logs (they may contain personal data)
---
## 7. Logging ⚠️

### Status: ⚠️ Needs improvement

**In place:**

- ✅ Activity logging (`log` table) - records important actions
- ✅ Server logs - standard server logs for debugging
- ✅ **REMOVED:** API logging of MyTischtennis requests has been disabled

**Problems:**

- ✅ **FIXED:** API logging of MyTischtennis requests was removed completely (no more personal data in API logs)
- ⚠️ No automatic deletion of activity logs (still to be implemented)
- ✅ **FIXED:** What is logged is documented in the privacy policy

**Recommendation:**

- ⚠️ Implement automatic deletion of activity logs after a reasonable period (still pending)
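The pending retention job could be sketched as follows (assumptions: a 90-day window as suggested in section 3, and a hypothetical ORM call shown only in a comment):

```javascript
const RETENTION_DAYS = 90;

// Returns the timestamp before which activity logs should be deleted.
function retentionCutoff(now = new Date(), days = RETENTION_DAYS) {
  return new Date(now.getTime() - days * 24 * 60 * 60 * 1000);
}

// In the backend this would run on a schedule, e.g. (hypothetical Sequelize call):
//   await Log.destroy({ where: { createdAt: { [Op.lt]: retentionCutoff() } } });
const cutoff = retentionCutoff(new Date('2025-04-01T00:00:00Z'));
console.log(cutoff.toISOString()); // 2025-01-01T00:00:00.000Z
```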
---
## 8. MyTischtennis Integration ⚠️

### Status: ⚠️ Needs improvement

**In place:**

- ✅ Encryption of passwords
- ✅ Consents (`savePassword`, `autoUpdateRatings`)
- ✅ DELETE endpoint for the account

**Problems:**

- ✅ **FIXED:** Third-country transfer is mentioned in the privacy policy
- ⚠️ No explicit consent to the data transfer to MyTischtennis.de
- ✅ **FIXED:** Information about data protection at MyTischtennis.de is in the privacy policy
- ✅ **FIXED:** All MyTischtennis data is now stored encrypted

**Recommendation:**

- Update the privacy policy:
  - Mention the MyTischtennis integration
  - Explain which data is transferred
  - Link to the privacy policy of MyTischtennis.de
  - Explain the legal basis (consent)
- Implement an explicit consent step when the integration is set up
- Also encrypt access tokens

---

## 9. Cookies & Local Storage ✅

### Status: ✅ Compliant

**In place:**

- ✅ Only technically necessary cookies/storage:
  - Session token (session storage)
  - Username, clubs, permissions (local storage)
- ✅ No tracking cookies
- ✅ No advertising cookies
- ✅ Documented in the privacy policy

**Note:**

- Local storage is used for persistent data (clubs, permissions)
- This is technically necessary and GDPR-compliant

---
## 10. Permission System ✅

### Status: ✅ Well implemented

**In place:**

- ✅ Role-based access (admin, trainer, team captain, member)
- ✅ Individual permissions per resource
- ✅ Transparent access control
- ✅ Activity logging

**Note:**

- The permission system is GDPR-compliant
- It enables data minimization (access only to the data that is needed)

---

## 11. Data Minimization ⚠️

### Status: ⚠️ Partially compliant

**In place:**

- ✅ Only necessary data is stored
- ✅ The permission system enables minimal data access

**Needs improvement:**

- ⚠️ Logs may contain too much data (request/response bodies)
- ⚠️ No automatic deletion of old data
- ⚠️ No option to anonymize data instead of deleting it

---

## 12. Technical and Organizational Measures (TOM) ✅ / ⚠️

### Status: ✅ Good, but with room for improvement

**In place:**

- ✅ Encryption of sensitive data
- ✅ HTTPS for all connections
- ✅ Password hashing (bcrypt)
- ✅ Authentication and authorization
- ✅ Permission system

**Needs improvement:**

- ⚠️ The TOMs are not documented
- ⚠️ No documented schedule for security updates
- ⚠️ No documented backup strategy
- ⚠️ No documented contingency plans

---

## 13. Processing on Behalf of a Controller ⚠️

### Status: ⚠️ Not documented

**Missing:**

- ⚠️ No information about the hosting provider
- ⚠️ No information about data processing agreements (DPAs)
- ⚠️ No information about subcontractors

**Recommendation:**

- Document all processors (hosting, etc.)
- Mention in the privacy policy that DPAs have been concluded

---

## 14. Data Subject Rights - Implementation ❌

### Status: ❌ Not fully implemented

**Missing:**

- ❌ **CRITICAL:** No endpoint for access (Art. 15)
- ❌ **CRITICAL:** No endpoint for erasure (Art. 17)
- ❌ **CRITICAL:** No endpoint for data export (Art. 20)
- ❌ No endpoint for rectification (Art. 16) - partially covered by the normal edit endpoints
- ❌ No endpoint for restriction of processing (Art. 18)
- ❌ No endpoint for objection (Art. 21)

**Recommendation:**

- Implement central endpoints for all data subject rights:
  - `GET /api/user/rights/information` - access
  - `DELETE /api/user/rights/deletion` - erasure
  - `GET /api/user/rights/export` - data export
  - `PUT /api/user/rights/restriction` - restriction
  - `POST /api/user/rights/objection` - objection
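The recommended endpoints map to the GDPR articles as a small route table (a sketch: the paths come from the recommendation above, the handler wiring is hypothetical):

```javascript
// One entry per data subject right the checklist asks for.
const rightsRoutes = [
  { method: 'GET',    path: '/api/user/rights/information', article: 'Art. 15' }, // access
  { method: 'DELETE', path: '/api/user/rights/deletion',    article: 'Art. 17' }, // erasure
  { method: 'GET',    path: '/api/user/rights/export',      article: 'Art. 20' }, // portability
  { method: 'PUT',    path: '/api/user/rights/restriction', article: 'Art. 18' }, // restriction
  { method: 'POST',   path: '/api/user/rights/objection',   article: 'Art. 21' }, // objection
];

// e.g. with Express (hypothetical handlerFor):
//   rightsRoutes.forEach(r => app[r.method.toLowerCase()](r.path, handlerFor(r)));
console.log(rightsRoutes.length); // 5 data subject rights
```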
---
## 15. Data Protection Contact ✅

### Status: ✅ In place

**In place:**

- ✅ E-mail address in the privacy policy: tsschulz@tsschulz.de
- ✅ Full postal address in the legal notice

---

## Summary

### ✅ Well implemented:

1. Encryption of sensitive data
2. HTTPS
3. Permission system
4. Cookies/local storage (only technically necessary)
5. Privacy policy in place

### ⚠️ Needs improvement:

1. Update the privacy policy (MyTischtennis, logging)
2. Reduce/anonymize the logging of personal data
3. Implement automatic deletion of logs
4. Mention the MyTischtennis integration in the privacy policy

### ❌ Critical - must be implemented:

1. **Erasure API** (Art. 17 GDPR)
2. **Access API** (Art. 15 GDPR)
3. **Data export API** (Art. 20 GDPR)
4. **Automatic deletion of logs** after the retention period

---

## Priorities

### Immediately (before going live):

1. Update the privacy policy
2. Implement the erasure API
3. Implement the access API
4. Implement the data export API

### Short term (within 1 month):

1. Implement automatic deletion of logs
2. Reduce/anonymize the logging of personal data
3. Document the MyTischtennis integration in the privacy policy

### Medium term (within 3 months):

1. Implement consent management
2. Document the TOMs
3. Document processing on behalf of a controller

---

## Next Steps

1. ✅ Create this checklist
2. ⏳ Update the privacy policy
3. ⏳ Implement the erasure API
4. ⏳ Implement the access API
5. ⏳ Implement the data export API
6. ⏳ Improve logging
**SERVER_NODE_UPGRADE.md** (new file, +69 lines)

# Server Node.js Upgrade Guide

## Problem

The server runs Node.js 20.17.0, but Vite 7.2.4 requires Node.js 20.19+ or 22.12+.
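The requirement can be checked programmatically (a small sketch of the version comparison Vite 7 effectively enforces, per the statement above):

```javascript
// True when the given Node.js version satisfies Vite 7's requirement
// (>= 20.19 in the 20.x line, or >= 22.12).
function viteSevenCompatible(version) {
  const [maj, min] = version.split('.').map(Number);
  return (maj === 20 && min >= 19) || (maj === 22 && min >= 12) || maj > 22;
}

console.log(viteSevenCompatible(process.versions.node));
console.log(viteSevenCompatible('20.17.0')); // false - the server's current version
```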
## Solution 1: Upgrade Node.js on the Server (Recommended)

### Option A: Install Node.js 20.19+

```bash
# On the server:
# With nvm (if installed):
nvm install 20.19.0
nvm use 20.19.0
nvm alias default 20.19.0

# Or with the NodeSource repository:
curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
sudo apt-get install -y nodejs=20.19.0-1nodesource1

# Check the version:
node --version  # Should be 20.19.0 or higher
```

### Option B: Install Node.js 22.12+ (LTS)

```bash
# On the server:
# With nvm:
nvm install 22.12.0
nvm use 22.12.0
nvm alias default 22.12.0

# Or with the NodeSource repository:
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash -
sudo apt-get install -y nodejs

# Check the version:
node --version  # Should be 22.12.0 or higher
```

### After the upgrade

```bash
cd /var/www/tt-tagebuch.de/backend
npm install  # Builds the frontend automatically
sudo systemctl restart tt-tagebuch
```

## Solution 2: Downgrade Vite to Version 6 (Temporary)

If Node.js cannot be upgraded: Vite has already been downgraded to version 6.0.0.

```bash
cd /var/www/tt-tagebuch.de/backend
npm install  # Builds the frontend automatically
sudo systemctl restart tt-tagebuch
```

**Note:** Vite 6 works with Node.js 20.17.0, but Vite 7 offers better performance and features.

## Recommendation

**Upgrading Node.js** is the better solution, because:

- Vite 7 offers better performance
- future updates become easier
- Node.js 20.19+ and 22.12+ are LTS versions
**SITEMAP_ANLEITUNG.md** (new file, +109 lines)

# Submitting the Sitemap to Google Search Console

## Current Sitemap

The sitemap is available at: `https://tt-tagebuch.de/sitemap.xml`

It contains the following public pages:

- `/` (home) - priority: 1.0
- `/register` (registration) - priority: 0.8
- `/login` (sign-in) - priority: 0.7
- `/impressum` (legal notice) - priority: 0.3
- `/datenschutz` (privacy policy) - priority: 0.3
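An entry in `frontend/public/sitemap.xml` follows the standard sitemap-protocol shape; a sketch of the homepage entry (the `lastmod` and `changefreq` values here are illustrative, not taken from the real file):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://tt-tagebuch.de/</loc>
    <lastmod>2025-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- ...one <url> element per page listed above... -->
</urlset>
```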
## Updating the Sitemap

### Automatically (recommended)

```bash
./update-sitemap.sh
```

The script automatically sets the `lastmod` date to today's date.
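What the script does can be sketched as follows (a sketch only; the real `update-sitemap.sh` may differ - here the sitemap path is a parameter and every `<lastmod>` value is rewritten to today's date):

```shell
#!/bin/sh
# Sketch of the lastmod update described above.
update_lastmod() {
  sitemap="$1"
  today=$(date +%Y-%m-%d)
  # Replace every <lastmod>...</lastmod> with today's date; keep a .bak copy.
  sed -i.bak "s|<lastmod>[^<]*</lastmod>|<lastmod>${today}</lastmod>|g" "$sitemap"
}

# Usage in this repository:
#   update_lastmod frontend/public/sitemap.xml
```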
### Manuell
|
||||
Die Sitemap-Datei befindet sich in: `frontend/public/sitemap.xml`
|
||||
|
||||
Nach Änderungen:
|
||||
1. Frontend neu bauen: `cd frontend && npm run build`
|
||||
2. Backend neu starten (falls nötig)
|
||||
|
||||
## Sitemap in Google Search Console einreichen
|
||||
|
||||
### Schritt 1: Google Search Console öffnen
|
||||
1. Gehe zu: https://search.google.com/search-console
|
||||
2. Wähle die Property für `tt-tagebuch.de` aus
|
||||
|
||||
### Schritt 2: Sitemap hinzufügen
|
||||
1. Klicke im linken Menü auf **"Sitemaps"**
|
||||
2. Im Feld **"Neue Sitemap hinzufügen"** eingeben:
|
||||
```
|
||||
sitemap.xml
|
||||
```
|
||||
Oder die vollständige URL:
|
||||
```
|
||||
https://tt-tagebuch.de/sitemap.xml
|
||||
```
|
||||
3. Klicke auf **"Senden"**
|
||||
|
||||
### Schritt 3: Status prüfen
|
||||
- Google wird die Sitemap innerhalb weniger Minuten verarbeiten
|
||||
- Der Status wird angezeigt:
|
||||
- ✅ **Erfolgreich**: Sitemap wurde erfolgreich verarbeitet
|
||||
- ⚠️ **Warnung**: Sitemap wurde verarbeitet, aber es gibt Warnungen
|
||||
- ❌ **Fehler**: Sitemap konnte nicht verarbeitet werden
|
||||
|
||||
### Schritt 4: Indexierung anfordern
|
||||
Nach dem Einreichen der Sitemap kannst du auch einzelne URLs zur Indexierung anfordern:
|
||||
1. Gehe zu **"URL-Prüfung"**
|
||||
2. Gib die URL ein: `https://tt-tagebuch.de/`
|
||||
3. Klicke auf **"Indexierung anfordern"**
|
||||
|
||||
## Sitemap testen
|
||||
|
||||
### Online-Tools
|
||||
- Google Sitemap Tester: https://www.xml-sitemaps.com/validate-xml-sitemap.html
|
||||
- Sitemap Validator: https://validator.w3.org/
|
||||
|
||||
### Per Kommandozeile
|
||||
```bash
|
||||
# Sitemap abrufen
|
||||
curl https://tt-tagebuch.de/sitemap.xml
|
||||
|
||||
# XML-Validierung (falls xmllint installiert ist)
|
||||
curl -s https://tt-tagebuch.de/sitemap.xml | xmllint --noout -
|
||||
```
|
||||
|
||||
## Wichtige Hinweise
|
||||
|
||||
1. **robots.txt**: Die Sitemap ist bereits in der `robots.txt` referenziert:
|
||||
```
|
||||
Sitemap: https://tt-tagebuch.de/sitemap.xml
|
||||
```
|
||||
|
||||
2. **lastmod-Datum**: Wird automatisch beim Ausführen von `update-sitemap.sh` aktualisiert
|
||||
|
||||
3. **Nur öffentliche Seiten**: Die Sitemap enthält nur öffentlich zugängliche Seiten. Geschützte Seiten (die eine Anmeldung erfordern) sind nicht enthalten.
|
||||
|
||||
4. **Prioritäten**:
|
||||
- Homepage: 1.0 (höchste Priorität)
|
||||
- Registrierung/Login: 0.7-0.8 (wichtig für neue Nutzer)
|
||||
- Rechtliche Seiten: 0.3 (niedrige Priorität, ändern sich selten)
|
||||
|
||||
## Troubleshooting

### Sitemap not found

- Check that the sitemap is reachable at `https://tt-tagebuch.de/sitemap.xml`
- Make sure the frontend has been built: `cd frontend && npm run build`
- Check the Apache configuration (it should serve static files from `/var/www/tt-tagebuch.de`)

### Sitemap not indexed

- Wait a few hours or days; Google needs time to crawl
- Check the Search Console for errors
- Make sure the URLs in the sitemap are reachable
- Make sure `robots.txt` does not block the pages

### Sitemap contains errors

- Validate the XML structure with an XML validator
- Check that all URLs are correct (no 404 errors)
- Make sure all URLs use HTTPS (not HTTP)
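The last check — "all URLs use HTTPS" — is easy to automate once the URLs are extracted; a sketch:

```javascript
// Flag any sitemap URL that is not served over HTTPS
const urls = [
  'https://tt-tagebuch.de/',
  'http://tt-tagebuch.de/login' // deliberately broken for the demo
];
const insecure = urls.filter((u) => !u.startsWith('https://'));
console.log(insecure); // these entries should be fixed before resubmitting
```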
apache-http.conf.example (new file, 22 lines)
@@ -0,0 +1,22 @@
# Apache configuration for tt-tagebuch.de - HTTP (port 80)
#
# Copy this file to: /etc/apache2/sites-available/tt-tagebuch.de.conf
# Then enable it with: sudo a2ensite tt-tagebuch.de.conf
# And restart: sudo systemctl restart apache2
#
# IMPORTANT: The following modules must be enabled:
#   sudo a2enmod rewrite
#   sudo systemctl restart apache2

# HTTP: www.tt-tagebuch.de -> HTTPS: tt-tagebuch.de
<VirtualHost *:80>
    ServerName www.tt-tagebuch.de
    Redirect permanent / https://tt-tagebuch.de/
</VirtualHost>

# HTTP: tt-tagebuch.de -> HTTPS: tt-tagebuch.de
<VirtualHost *:80>
    ServerName tt-tagebuch.de
    Redirect permanent / https://tt-tagebuch.de/
</VirtualHost>
apache-https.conf.example (new file, 60 lines)
@@ -0,0 +1,60 @@
# Apache configuration for tt-tagebuch.de - HTTPS (port 443)
#
# Copy this file to: /etc/apache2/sites-available/tt-tagebuch.de-le-ssl.conf
# Then enable it with: sudo a2ensite tt-tagebuch.de-le-ssl.conf
# And restart: sudo systemctl restart apache2
#
# IMPORTANT: The following modules must be enabled:
#   sudo a2enmod proxy
#   sudo a2enmod proxy_http
#   sudo a2enmod proxy_wstunnel
#   sudo a2enmod rewrite
#   sudo a2enmod headers
#   sudo systemctl restart apache2

# HTTPS: www.tt-tagebuch.de -> HTTPS: tt-tagebuch.de (301 redirect)
<VirtualHost *:443>
    ServerName www.tt-tagebuch.de

    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/tt-tagebuch.de/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/tt-tagebuch.de/privkey.pem
    Include /etc/letsencrypt/options-ssl-apache.conf

    Redirect permanent / https://tt-tagebuch.de/
</VirtualHost>

# HTTPS: tt-tagebuch.de - main configuration (non-www)
<VirtualHost *:443>
    ServerName tt-tagebuch.de

    DocumentRoot /var/www/tt-tagebuch.de

    <Directory /var/www/tt-tagebuch.de>
        Options Indexes FollowSymLinks
        AllowOverride All
        Require all granted
    </Directory>

    ErrorLog ${APACHE_LOG_DIR}/tt-tagebuch.de_error.log
    CustomLog ${APACHE_LOG_DIR}/tt-tagebuch.de_access.log combined

    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/tt-tagebuch.de/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/tt-tagebuch.de/privkey.pem
    Include /etc/letsencrypt/options-ssl-apache.conf

    ProxyRequests Off

    # NOTE: Socket.IO now runs directly on HTTPS port 3051 (not through the Apache proxy)
    # See backend/SOCKET_IO_SSL_SETUP.md for details

    # API routes
    ProxyPass /api http://localhost:3050/api
    ProxyPassReverse /api http://localhost:3050/api

    # All other requests go to the backend server (for the frontend)
    ProxyPass / http://localhost:3050/
    ProxyPassReverse / http://localhost:3050/
</VirtualHost>
apache.conf.example (new file, 89 lines)
@@ -0,0 +1,89 @@
# Apache configuration for tt-tagebuch.de
#
# NOTE: This file is a combined reference.
# The actual configuration uses two separate files:
#
# 1. apache-http.conf.example  -> /etc/apache2/sites-available/tt-tagebuch.de.conf
#    (HTTP, port 80 - redirect to HTTPS)
#
# 2. apache-https.conf.example -> /etc/apache2/sites-available/tt-tagebuch.de-le-ssl.conf
#    (HTTPS, port 443 - main configuration)
#
# Alternatively, use the update script: ./update-apache-config.sh
#
# IMPORTANT: The following modules must be enabled:
#   sudo a2enmod proxy
#   sudo a2enmod proxy_http
#   sudo a2enmod proxy_wstunnel
#   sudo a2enmod rewrite
#   sudo a2enmod headers
#   sudo systemctl restart apache2

# ============================================
# HTTP (port 80) - redirect to HTTPS
# ============================================

# HTTP: www.tt-tagebuch.de -> HTTPS: tt-tagebuch.de
<VirtualHost *:80>
    ServerName www.tt-tagebuch.de
    Redirect permanent / https://tt-tagebuch.de/
</VirtualHost>

# HTTP: tt-tagebuch.de -> HTTPS: tt-tagebuch.de
<VirtualHost *:80>
    ServerName tt-tagebuch.de
    Redirect permanent / https://tt-tagebuch.de/
</VirtualHost>

# ============================================
# HTTPS (port 443) - redirect www -> non-www
# ============================================

# HTTPS: www.tt-tagebuch.de -> HTTPS: tt-tagebuch.de (301 redirect)
<VirtualHost *:443>
    ServerName www.tt-tagebuch.de

    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/tt-tagebuch.de/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/tt-tagebuch.de/privkey.pem
    Include /etc/letsencrypt/options-ssl-apache.conf

    Redirect permanent / https://tt-tagebuch.de/
</VirtualHost>

# ============================================
# HTTPS (port 443) - main configuration (non-www)
# ============================================

<VirtualHost *:443>
    ServerName tt-tagebuch.de

    DocumentRoot /var/www/tt-tagebuch.de

    <Directory /var/www/tt-tagebuch.de>
        Options Indexes FollowSymLinks
        AllowOverride All
        Require all granted
    </Directory>

    ErrorLog ${APACHE_LOG_DIR}/tt-tagebuch.de_error.log
    CustomLog ${APACHE_LOG_DIR}/tt-tagebuch.de_access.log combined

    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/tt-tagebuch.de/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/tt-tagebuch.de/privkey.pem
    Include /etc/letsencrypt/options-ssl-apache.conf

    ProxyRequests Off

    # NOTE: Socket.IO now runs directly on HTTPS port 3051 (not through the Apache proxy)
    # See backend/SOCKET_IO_SSL_SETUP.md for details

    # API routes
    ProxyPass /api http://localhost:3050/api
    ProxyPassReverse /api http://localhost:3050/api

    # All other requests go to the backend server (for the frontend)
    ProxyPass / http://localhost:3050/
    ProxyPassReverse / http://localhost:3050/
</VirtualHost>
backend/SOCKET_IO_SSL_SETUP.md (new file, 140 lines)
@@ -0,0 +1,140 @@
# Running Socket.IO with SSL directly (alternative to the Apache proxy)

If the Apache WebSocket proxy configuration does not work, Socket.IO can be run with SSL directly.

## Prerequisites

1. An SSL certificate (e.g. from Let's Encrypt)
2. An open port in the firewall (e.g. 3051)
3. The Socket.IO server configured for HTTPS

## Backend configuration

### 1. Switch Socket.IO to HTTPS

Edit `backend/server.js`:

```javascript
import https from 'https';
import fs from 'fs';

// Load the SSL certificate
const httpsOptions = {
  key: fs.readFileSync('/etc/letsencrypt/live/tt-tagebuch.de/privkey.pem'),
  cert: fs.readFileSync('/etc/letsencrypt/live/tt-tagebuch.de/fullchain.pem')
};

// Create the HTTPS server
const httpsServer = https.createServer(httpsOptions, app);

// Initialize Socket.IO
initializeSocketIO(httpsServer);

// Start the HTTPS server
const httpsPort = process.env.HTTPS_PORT || 3051;
httpsServer.listen(httpsPort, () => {
  console.log(`🚀 HTTPS-Server läuft auf Port ${httpsPort}`);
});

// HTTP server for the API (optional, if the API should keep running over HTTP)
const httpServer = createServer(app);
const httpPort = process.env.PORT || 3005;
httpServer.listen(httpPort, () => {
  console.log(`🚀 HTTP-Server läuft auf Port ${httpPort}`);
});
```

### 2. Frontend configuration

Edit `frontend/src/services/socketService.js`:

```javascript
import { io } from 'socket.io-client';
import { backendBaseUrl } from '../apiClient.js';

let socket = null;

export const connectSocket = (clubId) => {
  // Use the HTTPS URL for Socket.IO
  const socketUrl = backendBaseUrl.replace('http://', 'https://').replace(':3005', ':3051');

  if (socket && socket.connected) {
    // If already connected, leave the old club room and join the new one
    if (socket.currentClubId) {
      socket.emit('leave-club', socket.currentClubId);
    }
  } else {
    // Create a new connection
    socket = io(socketUrl, {
      path: '/socket.io/',
      transports: ['websocket', 'polling'],
      reconnection: true,
      reconnectionDelay: 1000,
      reconnectionAttempts: 5,
      timeout: 20000,
      upgrade: true,
      forceNew: false,
      secure: true // important for HTTPS
    });

    socket.on('connect', () => {
      console.log('Socket.IO verbunden');
      if (socket.currentClubId) {
        socket.emit('join-club', socket.currentClubId);
      }
    });

    socket.on('disconnect', () => {
      console.log('Socket.IO getrennt');
    });

    socket.on('connect_error', (error) => {
      console.error('Socket.IO Verbindungsfehler:', error);
    });
  }

  // Join the club room
  if (clubId) {
    socket.emit('join-club', clubId);
    socket.currentClubId = clubId;
  }

  return socket;
};

export const disconnectSocket = () => {
  if (socket) {
    socket.disconnect();
    socket = null;
  }
};

export const getSocket = () => socket;
```

### 3. Open the firewall port

```bash
# UFW (Ubuntu firewall)
sudo ufw allow 3051/tcp

# Or iptables
sudo iptables -A INPUT -p tcp --dport 3051 -j ACCEPT
```

### 4. Adjust the Apache configuration

Remove the Socket.IO proxy configuration from Apache, since Socket.IO is now reachable directly.

## Advantages

- Simpler to configure
- No Apache proxy problems
- Direct WebSocket connection

## Disadvantages

- A separate port must be open
- Two ports (HTTP for the API, HTTPS for Socket.IO)
- The CORS configuration must be adjusted
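On the last point: with the direct setup, the browser talks to a different origin/port than the page it loaded, so the Socket.IO server needs an explicit CORS origin. A sketch of the server options (names and origin are assumptions, not this project's verified configuration):

```javascript
// Hypothetical CORS options for the Socket.IO server on port 3051.
// The frontend origin must be listed explicitly, since the socket
// now lives on a different port than the page that loads it.
const socketCorsOptions = {
  cors: {
    origin: 'https://tt-tagebuch.de', // hypothetical allowed origin
    methods: ['GET', 'POST'],
    credentials: true
  }
};

// Would be passed along the lines of: new Server(httpsServer, socketCorsOptions)
console.log(socketCorsOptions.cors.origin);
```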
@@ -1,4 +1,5 @@
import axios from 'axios';
import { chromium } from 'playwright';

const BASE_URL = 'https://www.mytischtennis.de';

@@ -17,19 +18,246 @@ class MyTischtennisClient {
    });
  }

  /**
   * Get login page to extract XSRF token and CAPTCHA token
   * @returns {Promise<Object>} Object with xsrfToken, captchaToken, and captchaClicked flag
   */
  async getLoginPage() {
    try {
      const response = await this.client.get('/login?next=%2F');
      const html = typeof response.data === 'string' ? response.data : String(response.data || '');

      const extractFirst = (patterns) => {
        for (const pattern of patterns) {
          const match = html.match(pattern);
          if (match && (match[1] || match[2] || match[3])) {
            return match[1] || match[2] || match[3];
          }
        }
        return null;
      };

      // Parse form action and input fields for frontend login-form endpoint
      const formMatch = html.match(/<form[^>]*action=(?:"([^"]+)"|'([^']+)')[^>]*>([\s\S]*?)<\/form>/i);
      const loginAction = formMatch ? (formMatch[1] || formMatch[2] || '/login') : '/login';
      const formHtml = formMatch ? formMatch[3] : html;
      const fields = [];

      const inputRegex = /<input\b([\s\S]*?)>/gi;
      let inputMatch = null;
      while ((inputMatch = inputRegex.exec(formHtml)) !== null) {
        const rawAttributes = inputMatch[1] || '';
        const attributes = {};

        // Parses key="value", key='value', key=value and boolean attributes.
        const attributeRegex = /([a-zA-Z_:][-a-zA-Z0-9_:.]*)(?:=(?:"([^"]*)"|'([^']*)'|([^\s"'=<>`]+)))?/g;
        let attributeMatch = null;
        while ((attributeMatch = attributeRegex.exec(rawAttributes)) !== null) {
          const key = attributeMatch[1];
          const value = attributeMatch[2] ?? attributeMatch[3] ?? attributeMatch[4] ?? true;
          attributes[key] = value;
        }

        fields.push({
          name: typeof attributes.name === 'string' ? attributes.name : null,
          id: typeof attributes.id === 'string' ? attributes.id : null,
          type: typeof attributes.type === 'string' ? attributes.type : 'text',
          placeholder: typeof attributes.placeholder === 'string' ? attributes.placeholder : null,
          autocomplete: typeof attributes.autocomplete === 'string' ? attributes.autocomplete : null,
          minlength: typeof attributes.minlength === 'string' ? attributes.minlength : null,
          required: attributes.required === true || attributes.required === 'required',
          value: typeof attributes.value === 'string' ? attributes.value : null
        });
      }

      // Fallback: if page is JS-rendered and no input tags are server-rendered, provide usable defaults.
      const hasEmailField = fields.some((f) => f?.name === 'email' || f?.type === 'email');
      const hasPasswordField = fields.some((f) => f?.name === 'password' || f?.type === 'password');
      if (!hasEmailField) {
        fields.push({
          name: 'email',
          id: null,
          type: 'email',
          placeholder: null,
          autocomplete: 'email',
          minlength: null,
          required: true,
          value: null
        });
      }
      if (!hasPasswordField) {
        fields.push({
          name: 'password',
          id: null,
          type: 'password',
          placeholder: null,
          autocomplete: 'current-password',
          minlength: null,
          required: true,
          value: null
        });
      }

      // Extract XSRF token from hidden input
      const xsrfToken = extractFirst([
        /<input[^>]*name=(?:"xsrf"|'xsrf')[^>]*value=(?:"([^"]+)"|'([^']+)')/i,
        /(?:^|[,{])\s*"xsrf"\s*:\s*"([^"]+)"/i
      ]);

      // Extract CAPTCHA token from hidden input (if present)
      const captchaToken = extractFirst([
        /<input[^>]*name=(?:"captcha"|'captcha')[^>]*value=(?:"([^"]+)"|'([^']+)')/i,
        /(?:^|[,{])\s*"captcha"\s*:\s*"([^"]+)"/i
      ]);

      // Check if captcha_clicked is true or false
      const captchaClickedRaw = extractFirst([
        /<input[^>]*name=(?:"captcha_clicked"|'captcha_clicked')[^>]*value=(?:"([^"]+)"|'([^']+)')/i,
        /(?:^|[,{])\s*"captcha_clicked"\s*:\s*"([^"]+)"/i
      ]);
      const captchaClicked = String(captchaClickedRaw || '').toLowerCase() === 'true';

      // Check if CAPTCHA is required (look for private-captcha element or captcha input)
      const requiresCaptcha = html.includes('private-captcha')
        || html.includes('name="captcha"')
        || html.includes("name='captcha'")
        || /captcha/i.test(html);

      // Extract CAPTCHA metadata used by frontend
      const captchaSiteKey = extractFirst([
        /data-sitekey=(?:"([^"]+)"|'([^']+)'|([^\s>]+))/i,
        /(?:^|[,{])\s*"sitekey"\s*:\s*"([^"]+)"/i,
        /(?:^|[,{])\s*"captchaSiteKey"\s*:\s*"([^"]+)"/i
      ]);
      const captchaPuzzleEndpoint = extractFirst([
        /data-puzzle-endpoint=(?:"([^"]+)"|'([^']+)'|([^\s>]+))/i,
        /(?:^|[,{])\s*"puzzle_endpoint"\s*:\s*"([^"]+)"/i,
        /(?:^|[,{])\s*"captchaPuzzleEndpoint"\s*:\s*"([^"]+)"/i
      ]);

      console.log('[myTischtennisClient.getLoginPage]', {
        hasXsrfToken: !!xsrfToken,
        hasCaptchaToken: !!captchaToken,
        captchaClicked,
        requiresCaptcha,
        fieldsCount: fields.length,
        hasCaptchaSiteKey: !!captchaSiteKey,
        hasCaptchaPuzzleEndpoint: !!captchaPuzzleEndpoint
      });

      return {
        success: true,
        loginAction,
        fields,
        xsrfToken,
        captchaToken,
        captchaClicked,
        requiresCaptcha,
        captchaSiteKey,
        captchaPuzzleEndpoint
      };
    } catch (error) {
      console.error('Error fetching login page:', error.message);
      return {
        success: false,
        error: error.message
      };
    }
  }

  /**
   * Login to myTischtennis API
   * @param {string} email - myTischtennis email (not username!)
   * @param {string} password - myTischtennis password
   * @param {string} captchaToken - Optional CAPTCHA token if required
   * @param {string} xsrfToken - Optional XSRF token (will be fetched if not provided)
   * @returns {Promise<Object>} Login response with token and session data
   */
  async login(email, password, captchaToken = null, xsrfToken = null) {
    try {
      let loginPage = null;
      let captchaClicked = false;

      // If XSRF token not provided, fetch login page to get it
      if (!xsrfToken) {
        loginPage = await this.getLoginPage();
        if (!loginPage.success) {
          return {
            success: false,
            error: 'Konnte Login-Seite nicht abrufen: ' + loginPage.error
          };
        }
        xsrfToken = loginPage.xsrfToken;

        // If CAPTCHA token not provided but found in HTML, use it
        if (!captchaToken && loginPage.captchaToken) {
          captchaToken = loginPage.captchaToken;
          captchaClicked = loginPage.captchaClicked;
          console.log('[myTischtennisClient.login] CAPTCHA-Token aus HTML extrahiert, captcha_clicked:', captchaClicked);
        }

        // If CAPTCHA is required but no token found yet, wait and try to get it again.
        // The CAPTCHA system solves the puzzle in the background via JavaScript, so it can take a moment.
        // We have to retry several times, because the token is only generated after that JavaScript has run.
        if (loginPage.requiresCaptcha && !captchaToken) {
          console.log('[myTischtennisClient.login] CAPTCHA erforderlich, aber noch kein Token gefunden. Warte und versuche erneut...');

          // Try up to 5 times to obtain the CAPTCHA token
          let maxRetries = 5;
          let retryCount = 0;
          let foundToken = false;

          while (retryCount < maxRetries && !foundToken) {
            // Wait 2-4 seconds between attempts
            const waitMs = Math.floor(Math.random() * 2000) + 2000; // 2000-4000ms
            console.log(`[myTischtennisClient.login] Versuch ${retryCount + 1}/${maxRetries}: Warte ${waitMs}ms...`);
            await new Promise(resolve => setTimeout(resolve, waitMs));

            // Fetch the login page again to pick up the solved CAPTCHA token
            const retryLoginPage = await this.getLoginPage();
            if (retryLoginPage.success && retryLoginPage.captchaToken) {
              captchaToken = retryLoginPage.captchaToken;
              captchaClicked = retryLoginPage.captchaClicked;
              xsrfToken = retryLoginPage.xsrfToken || xsrfToken; // update the XSRF token if needed
              foundToken = true;
              console.log(`[myTischtennisClient.login] CAPTCHA-Token nach ${retryCount + 1} Versuchen gefunden, captcha_clicked:`, captchaClicked);
            } else {
              retryCount++;
            }
          }

          if (!foundToken) {
            // If no token was found after all attempts, return an error
            console.log('[myTischtennisClient.login] CAPTCHA-Token konnte nach mehreren Versuchen nicht gefunden werden');
            return {
              success: false,
              error: 'CAPTCHA erforderlich. Bitte lösen Sie das CAPTCHA auf der MyTischtennis-Website.',
              requiresCaptcha: true
            };
          }
        }

        // Random 2-5 second delay between loading the form and submitting it.
        // Simulates human behavior and gives the CAPTCHA system time.
        const delayMs = Math.floor(Math.random() * 3000) + 2000; // 2000-5000ms
        console.log(`[myTischtennisClient] Warte ${delayMs}ms vor Login-Request (simuliert menschliches Verhalten)`);
        await new Promise(resolve => setTimeout(resolve, delayMs));
      }

      // Create form data
      const formData = new URLSearchParams();
      formData.append('email', email);
      formData.append('password', password);
      formData.append('intent', 'login');

      if (xsrfToken) {
        formData.append('xsrf', xsrfToken);
      }

      if (captchaToken) {
        formData.append('captcha', captchaToken);
        formData.append('captcha_clicked', captchaClicked ? 'true' : 'false');
      }

      const response = await this.client.post(
        '/login?next=%2F&_data=routes%2F_auth%2B%2Flogin',
@@ -86,15 +314,236 @@ class MyTischtennisClient {
        cookie: authCookie.split(';')[0] // Just the cookie value without attributes
      };
    } catch (error) {
      const statusCode = error.response?.status || 500;
      const responseData = error.response?.data;

      // Check if response contains CAPTCHA error
      let errorMessage = error.response?.data?.message || error.message || 'Login fehlgeschlagen';
      let requiresCaptcha = false;

      // Check for CAPTCHA-related errors in response
      if (typeof responseData === 'string') {
        if (responseData.includes('Captcha') || responseData.includes('CAPTCHA') ||
            responseData.includes('captcha') || responseData.includes('Captcha-Bestätigung')) {
          requiresCaptcha = true;
          errorMessage = 'CAPTCHA erforderlich. Bitte lösen Sie das CAPTCHA auf der MyTischtennis-Website.';
        }
      } else if (responseData && typeof responseData === 'object') {
        // Check for CAPTCHA errors in JSON response or HTML
        const dataString = JSON.stringify(responseData);
        if (dataString.includes('Captcha') || dataString.includes('CAPTCHA') ||
            dataString.includes('captcha') || dataString.includes('Captcha-Bestätigung')) {
          requiresCaptcha = true;
          errorMessage = 'CAPTCHA erforderlich. Bitte lösen Sie das CAPTCHA auf der MyTischtennis-Website.';
        }
      }

      console.error('MyTischtennis login error:', errorMessage, `(Status: ${statusCode})`, requiresCaptcha ? '(CAPTCHA erforderlich)' : '');
      return {
        success: false,
        error: errorMessage,
        status: statusCode,
        requiresCaptcha
      };
    }
  }

  /**
   * Browser-based fallback login for CAPTCHA flows.
   * @param {string} email
   * @param {string} password
   * @returns {Promise<Object>} Login response with token and session data
   */
  async loginWithBrowserAutomation(email, password) {
    let browser = null;
    let context = null;
    try {
      console.log('[myTischtennisClient.playwright] Start browser login flow');
      browser = await chromium.launch({
        headless: true,
        args: ['--no-sandbox', '--disable-dev-shm-usage']
      });
      context = await browser.newContext();
      const page = await context.newPage();
      await page.goto(`${this.baseURL}/login?next=%2F`, { waitUntil: 'domcontentloaded', timeout: 45000 });
      console.log('[myTischtennisClient.playwright] Page loaded');

      // Best-effort: consent/overlay dialogs that can block form interaction.
      const consentSelectors = [
        '#onetrust-accept-btn-handler',
        'button:has-text("Alle akzeptieren")',
        'button:has-text("Akzeptieren")',
        'button:has-text("Einverstanden")'
      ];
      for (const selector of consentSelectors) {
        try {
          const button = page.locator(selector).first();
          if (await button.count()) {
            await button.click({ timeout: 1500 });
            console.log('[myTischtennisClient.playwright] Consent dialog accepted');
            break;
          }
        } catch (_e) {
          // ignore and try next selector
        }
      }

      // Fill credentials
      await page.locator('input[name="email"]').first().fill(email, { timeout: 10000 });
      await page.locator('input[name="password"]').first().fill(password, { timeout: 10000 });
      console.log('[myTischtennisClient.playwright] Credentials filled');

      // Try to interact with private-captcha if present.
      const captchaHost = page.locator('private-captcha').first();
      if (await captchaHost.count()) {
        try {
          await page.waitForTimeout(1200);
          const interaction = await page.evaluate(() => {
            const host = document.querySelector('private-captcha');
            const checkbox = host?.shadowRoot?.querySelector('#pc-checkbox');
            if (!checkbox) {
              return { clicked: false, reason: 'checkbox-missing' };
            }
            checkbox.click();
            checkbox.dispatchEvent(new Event('input', { bubbles: true }));
            checkbox.dispatchEvent(new Event('change', { bubbles: true }));
            return {
              clicked: true,
              viaShadowRoot: true,
              className: checkbox.className || null,
              checked: !!checkbox.checked
            };
          });
          console.log('[myTischtennisClient.playwright] evaluate interaction result:', interaction);

          // Wait until hidden captcha fields are populated by site scripts.
          try {
            await page.waitForFunction(() => {
              const captchaField = document.querySelector('input[name="captcha"]');
              const clickedField = document.querySelector('input[name="captcha_clicked"]');
              const captchaValue = (captchaField && captchaField.value ? captchaField.value.trim() : '');
              const clickedValue = (clickedField && clickedField.value ? clickedField.value.toLowerCase() : '');
              return captchaValue.length > 80 && (clickedValue === 'true' || clickedValue === '1');
            }, { timeout: 15000 });
            const captchaState = await page.evaluate(() => {
              const captchaField = document.querySelector('input[name="captcha"]');
              const clickedField = document.querySelector('input[name="captcha_clicked"]');
              return {
                captchaLen: captchaField?.value?.length || 0,
                captchaClicked: clickedField?.value || null
              };
            });
            console.log('[myTischtennisClient.playwright] Captcha value ready:', captchaState);
          } catch (_waitErr) {
            // Keep going; some flows still succeed without explicit hidden field update.
            console.warn('[myTischtennisClient.playwright] Captcha value not ready in time');
          }
        } catch (captchaError) {
          console.warn('[myTischtennisClient.playwright] Captcha interaction warning:', captchaError?.message || captchaError);
        }
      }

      // Ensure captcha_clicked field is set if available.
      await page.evaluate(() => {
        const clickedField = document.querySelector('input[name="captcha_clicked"]');
        if (clickedField && !clickedField.value) {
          clickedField.value = 'true';
        }
      });

      // Submit form
      const submitButton = page.locator('button[type="submit"], input[type="submit"]').first();
      if (await submitButton.count()) {
        await submitButton.click({ timeout: 15000, noWaitAfter: true });
      } else {
        await page.keyboard.press('Enter');
      }
      console.log('[myTischtennisClient.playwright] Submit clicked');

      // Wait for auth cookie after submit (polling avoids timing races).
      let authCookieObj = null;
      const maxAttempts = 20;
      for (let attempt = 0; attempt < maxAttempts; attempt++) {
        const cookies = await context.cookies();
        authCookieObj = cookies.find((c) => c.name === 'sb-10-auth-token');
        if (authCookieObj?.value) {
          break;
        }
        await page.waitForTimeout(500);
      }
      if (!authCookieObj || !authCookieObj.value) {
        let errorText = null;
        try {
          const textContent = await page.locator('body').innerText({ timeout: 1000 });
          if (textContent?.includes('Captcha-Bestätigung fehlgeschlagen')) {
            errorText = 'Captcha-Bestätigung fehlgeschlagen';
          }
        } catch (_e) {
          // ignore text read errors
        }
        return {
          success: false,
          error: errorText
            ? `Playwright-Login fehlgeschlagen: ${errorText}`
            : 'Playwright-Login fehlgeschlagen: Kein sb-10-auth-token Cookie gefunden'
        };
      }

      // Cookie value is expected as "base64-<tokenData>"
      const tokenMatch = String(authCookieObj.value).match(/^base64-(.+)$/);
      if (!tokenMatch) {
        return {
          success: false,
          error: 'Playwright-Login fehlgeschlagen: Token-Format ungültig'
        };
      }

      let tokenData;
      try {
        tokenData = JSON.parse(Buffer.from(tokenMatch[1], 'base64').toString('utf-8'));
      } catch (decodeError) {
        return {
          success: false,
          error: `Playwright-Login fehlgeschlagen: Token konnte nicht dekodiert werden (${decodeError.message})`
        };
      }

      const cookie = `sb-10-auth-token=${authCookieObj.value}`;
      console.log('[myTischtennisClient.playwright] Browser login successful');
      return {
        success: true,
        accessToken: tokenData.access_token,
        refreshToken: tokenData.refresh_token,
        expiresAt: tokenData.expires_at,
        expiresIn: tokenData.expires_in,
        user: tokenData.user,
        cookie
      };
    } catch (error) {
      console.error('[myTischtennisClient.playwright] Browser login failed:', error?.message || error);
      return {
        success: false,
        error: error?.message || 'Playwright-Login fehlgeschlagen'
      };
    } finally {
      if (context) {
        try {
          await context.close();
        } catch (contextCloseError) {
          console.warn('[myTischtennisClient.playwright] Context close warning:', contextCloseError?.message || contextCloseError);
        }
      }
      if (browser) {
        try {
          await browser.close();
        } catch (browserCloseError) {
          console.warn('[myTischtennisClient.playwright] Browser close warning:', browserCloseError?.message || browserCloseError);
        }
        console.log('[myTischtennisClient.playwright] Browser closed');
      }
    }
  }

  /**
   * Verify login credentials
   * @param {string} email - myTischtennis email
@@ -259,4 +708,3 @@ class MyTischtennisClient {
}

export default new MyTischtennisClient();

backend/constants/ERROR_CODES_USAGE.md (new file, 125 lines)
@@ -0,0 +1,125 @@
# Error code system - usage guide

## Overview

The error code system replaces hardcoded German error messages with structured error codes that are translated in the frontend.

## Backend usage

### 1. Using an error code

```javascript
import HttpError from '../exceptions/HttpError.js';
import { ERROR_CODES, createError } from '../constants/errorCodes.js';

// Simple error code without parameters
throw new HttpError(createError(ERROR_CODES.USER_NOT_FOUND), 404);

// Error code with parameters
throw new HttpError(
  createError(ERROR_CODES.MEMBER_NOT_FOUND, { memberId: 123 }),
  404
);

// Or directly:
throw new HttpError(
  { code: ERROR_CODES.MEMBER_NOT_FOUND, params: { memberId: 123 } },
  404
);
```

### 2. Legacy format (still supported)

```javascript
// The old variant still works:
throw new HttpError('Benutzer nicht gefunden', 404);
```

## Frontend-Verwendung
|
||||
|
||||
### 1. Fehlermeldungen automatisch übersetzen
|
||||
|
||||
Die `getSafeErrorMessage`-Funktion erkennt automatisch Fehlercodes:
|
||||
|
||||
```javascript
|
||||
import { getSafeErrorMessage } from '../utils/errorMessages.js';
|
||||
|
||||
// In einer Vue-Komponente (Options API)
|
||||
try {
|
||||
await apiClient.post('/api/endpoint', data);
|
||||
} catch (error) {
|
||||
const message = getSafeErrorMessage(error, this.$t('errors.ERROR_UNKNOWN_ERROR'), this.$t);
|
||||
await this.showInfo(this.$t('messages.error'), message, '', 'error');
|
||||
}
|
||||
|
||||
// In einer Vue-Komponente (Composition API)
|
||||
import { useI18n } from 'vue-i18n';
|
||||
const { t } = useI18n();
|
||||
|
||||
try {
|
||||
await apiClient.post('/api/endpoint', data);
|
||||
} catch (error) {
|
||||
const message = getSafeErrorMessage(error, t('errors.ERROR_UNKNOWN_ERROR'), t);
|
||||
await showInfo(t('messages.error'), message, '', 'error');
|
||||
}
|
||||
```
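How this automatic detection might work can be sketched as follows. This is a minimal illustration under assumptions: the real `getSafeErrorMessage` in `utils/errorMessages.js` is not shown in this diff, so the function below is a hypothetical stand-in, not the actual implementation:

```javascript
// Hypothetical stand-in for getSafeErrorMessage: prefer a structured error
// code (new format), fall back to a plain message (legacy format).
function getSafeErrorMessageSketch(error, fallback, t) {
  const data = error?.response?.data || error || {};
  // New format: an explicit `code`, or an `error` field that looks like a code.
  const code = data.code
    || (typeof data.error === 'string' && data.error.startsWith('ERROR_') ? data.error : null);
  if (code && t) {
    // Resolve the translation under the `errors` namespace, passing params along.
    const translated = t(`errors.${code}`, data.params || {});
    if (translated && translated !== `errors.${code}`) {
      return translated;
    }
  }
  // Legacy format: use the plain message, otherwise the caller's fallback.
  return data.message || fallback;
}
```

With a translation table containing `errors.ERROR_MEMBER_NOT_FOUND`, a response carrying that code resolves to the translated text; a legacy `message` string passes through unchanged.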

### 2. Dialog utils with translation

```javascript
import { buildInfoConfig, safeErrorMessage } from '../utils/dialogUtils.js';

// With a translation function
this.infoDialog = buildInfoConfig({
  title: this.$t('messages.error'),
  message: safeErrorMessage(error, this.$t('errors.ERROR_UNKNOWN_ERROR'), this.$t),
  type: 'error'
}, this.$t);
```

## API Response Format

### New format (with error code):
```json
{
  "success": false,
  "code": "ERROR_MEMBER_NOT_FOUND",
  "params": {
    "memberId": 123
  },
  "error": "ERROR_MEMBER_NOT_FOUND" // for backwards compatibility
}
```

### Legacy format (still supported):
```json
{
  "success": false,
  "message": "Mitglied nicht gefunden",
  "error": "Mitglied nicht gefunden"
}
```

## Adding Translations

1. **Backend**: define the error code in `backend/constants/errorCodes.js`
2. **Frontend**: add the translation in `frontend/src/i18n/locales/de.json` under `errors`

Example:
```json
{
  "errors": {
    "ERROR_MEMBER_NOT_FOUND": "Mitglied nicht gefunden.",
    "ERROR_MEMBER_NOT_FOUND_WITH_ID": "Mitglied mit ID {memberId} nicht gefunden."
  }
}
```

## Migrating Existing Errors

1. Identify the hardcoded error message
2. Find or create a matching error code in `errorCodes.js`
3. Adjust the backend code: `throw new HttpError(createError(ERROR_CODES.XXX), status)`
4. Add the translation to `de.json`
5. No frontend changes are required (codes are detected automatically)
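Put together, one migration step might look like this. The sketch is self-contained: `HttpError` and `createError` are stand-ins mirroring the modules above, and `loadMember` with its member ID is made up purely for illustration:

```javascript
// Stand-ins so the sketch runs on its own; the real classes live in
// backend/exceptions/HttpError.js and backend/constants/errorCodes.js.
class HttpError extends Error {
  constructor(payload, statusCode) {
    super(typeof payload === 'string' ? payload : payload.code);
    this.payload = payload;
    this.statusCode = statusCode;
  }
}
const createError = (code, params = null) => ({ code, ...(params && { params }) });

// Before the migration: hardcoded German message.
// throw new HttpError('Mitglied nicht gefunden', 404);

// After the migration: a structured code that the frontend translates.
function loadMember(member) {
  if (!member) {
    throw new HttpError(createError('ERROR_MEMBER_NOT_FOUND', { memberId: 123 }), 404);
  }
  return member;
}
```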

121
backend/constants/errorCodes.js
Normal file
@@ -0,0 +1,121 @@
/**
 * Error codes for the API
 * These codes are sent to the frontend and translated there
 *
 * Format: { code: string, params?: object }
 *
 * Examples:
 * - { code: 'ERROR_USER_NOT_FOUND' }
 * - { code: 'ERROR_MEMBER_NOT_FOUND', params: { memberId: 123 } }
 * - { code: 'ERROR_VALIDATION_FAILED', params: { field: 'email', value: 'invalid' } }
 */

export const ERROR_CODES = {
  // General errors
  INTERNAL_SERVER_ERROR: 'ERROR_INTERNAL_SERVER_ERROR',
  UNKNOWN_ERROR: 'ERROR_UNKNOWN_ERROR',
  VALIDATION_FAILED: 'ERROR_VALIDATION_FAILED',
  NOT_FOUND: 'ERROR_NOT_FOUND',
  UNAUTHORIZED: 'ERROR_UNAUTHORIZED',
  FORBIDDEN: 'ERROR_FORBIDDEN',
  BAD_REQUEST: 'ERROR_BAD_REQUEST',

  // Authentication
  USER_NOT_FOUND: 'ERROR_USER_NOT_FOUND',
  INVALID_PASSWORD: 'ERROR_INVALID_PASSWORD',
  LOGIN_FAILED: 'ERROR_LOGIN_FAILED',
  SESSION_EXPIRED: 'ERROR_SESSION_EXPIRED',

  // MyTischtennis
  MYTISCHTENNIS_USER_NOT_FOUND: 'ERROR_MYTISCHTENNIS_USER_NOT_FOUND',
  MYTISCHTENNIS_INVALID_PASSWORD: 'ERROR_MYTISCHTENNIS_INVALID_PASSWORD',
  MYTISCHTENNIS_LOGIN_FAILED: 'ERROR_MYTISCHTENNIS_LOGIN_FAILED',
  MYTISCHTENNIS_ACCOUNT_NOT_LINKED: 'ERROR_MYTISCHTENNIS_ACCOUNT_NOT_LINKED',
  MYTISCHTENNIS_PASSWORD_NOT_SAVED: 'ERROR_MYTISCHTENNIS_PASSWORD_NOT_SAVED',
  MYTISCHTENNIS_SESSION_EXPIRED: 'ERROR_MYTISCHTENNIS_SESSION_EXPIRED',
  MYTISCHTENNIS_NO_PASSWORD_SAVED: 'ERROR_MYTISCHTENNIS_NO_PASSWORD_SAVED',
  MYTISCHTENNIS_CAPTCHA_REQUIRED: 'ERROR_MYTISCHTENNIS_CAPTCHA_REQUIRED',

  // Members
  MEMBER_NOT_FOUND: 'ERROR_MEMBER_NOT_FOUND',
  MEMBER_ALREADY_EXISTS: 'ERROR_MEMBER_ALREADY_EXISTS',
  MEMBER_FIRSTNAME_REQUIRED: 'ERROR_MEMBER_FIRSTNAME_REQUIRED',
  MEMBER_LASTNAME_REQUIRED: 'ERROR_MEMBER_LASTNAME_REQUIRED',

  // Groups
  GROUP_NOT_FOUND: 'ERROR_GROUP_NOT_FOUND',
  GROUP_NAME_REQUIRED: 'ERROR_GROUP_NAME_REQUIRED',
  GROUP_ALREADY_EXISTS: 'ERROR_GROUP_ALREADY_EXISTS',
  GROUP_INVALID_PRESET_TYPE: 'ERROR_GROUP_INVALID_PRESET_TYPE',
  GROUP_CANNOT_RENAME_PRESET: 'ERROR_GROUP_CANNOT_RENAME_PRESET',

  // Tournaments
  TOURNAMENT_NOT_FOUND: 'ERROR_TOURNAMENT_NOT_FOUND',
  TOURNAMENT_NO_DATE: 'ERROR_TOURNAMENT_NO_DATE',
  TOURNAMENT_CLASS_NAME_REQUIRED: 'ERROR_TOURNAMENT_CLASS_NAME_REQUIRED',
  TOURNAMENT_NO_PARTICIPANTS: 'ERROR_TOURNAMENT_NO_PARTICIPANTS',
  TOURNAMENT_NO_VALID_PARTICIPANTS: 'ERROR_TOURNAMENT_NO_VALID_PARTICIPANTS',
  TOURNAMENT_NO_TRAINING_DAY: 'ERROR_TOURNAMENT_NO_TRAINING_DAY',
  TOURNAMENT_PDF_GENERATION_FAILED: 'ERROR_TOURNAMENT_PDF_GENERATION_FAILED',
  TOURNAMENT_SELECT_FIRST: 'ERROR_TOURNAMENT_SELECT_FIRST',

  // Training diary
  DIARY_DATE_NOT_FOUND: 'ERROR_DIARY_DATE_NOT_FOUND',
  DIARY_DATE_UPDATED: 'ERROR_DIARY_DATE_UPDATED',
  DIARY_NO_PARTICIPANTS: 'ERROR_DIARY_NO_PARTICIPANTS',
  DIARY_PDF_GENERATION_FAILED: 'ERROR_DIARY_PDF_GENERATION_FAILED',
  DIARY_IMAGE_LOAD_FAILED: 'ERROR_DIARY_IMAGE_LOAD_FAILED',
  DIARY_STATS_LOAD_FAILED: 'ERROR_DIARY_STATS_LOAD_FAILED',
  DIARY_NO_EXERCISE_DATA: 'ERROR_DIARY_NO_EXERCISE_DATA',
  DIARY_ACTIVITY_PARTICIPANTS_UPDATE_FAILED: 'ERROR_DIARY_ACTIVITY_PARTICIPANTS_UPDATE_FAILED',
  DIARY_GROUP_ASSIGNMENT_UPDATED: 'SUCCESS_DIARY_GROUP_ASSIGNMENT_UPDATED',
  DIARY_GROUP_ASSIGNMENT_UPDATE_FAILED: 'ERROR_DIARY_GROUP_ASSIGNMENT_UPDATE_FAILED',
  DIARY_ASSIGN_ALL_PARTICIPANTS_FAILED: 'ERROR_DIARY_ASSIGN_ALL_PARTICIPANTS_FAILED',
  DIARY_ASSIGN_GROUP_FAILED: 'ERROR_DIARY_ASSIGN_GROUP_FAILED',
  DIARY_PARTICIPANT_ASSIGN_FAILED: 'ERROR_DIARY_PARTICIPANT_ASSIGN_FAILED',
  DIARY_PARTICIPANT_GROUP_ASSIGNMENT_UPDATE_FAILED: 'ERROR_DIARY_PARTICIPANT_GROUP_ASSIGNMENT_UPDATE_FAILED',
  DIARY_MEMBER_CREATED: 'SUCCESS_DIARY_MEMBER_CREATED',
  DIARY_MEMBER_CREATE_FAILED: 'ERROR_DIARY_MEMBER_CREATE_FAILED',

  // Team management
  TEAM_NOT_LINKED_TO_LEAGUE: 'ERROR_TEAM_NOT_LINKED_TO_LEAGUE',
  TEAM_LINK_TO_LEAGUE_REQUIRED: 'ERROR_TEAM_LINK_TO_LEAGUE_REQUIRED',
  TEAM_PDF_LOAD_FAILED: 'ERROR_TEAM_PDF_LOAD_FAILED',
  TEAM_STATS_LOAD_FAILED: 'ERROR_TEAM_STATS_LOAD_FAILED',

  // Activities
  ACTIVITY_IMAGE_DELETE_FAILED: 'ERROR_ACTIVITY_IMAGE_DELETE_FAILED',

  // Official tournaments
  OFFICIAL_TOURNAMENT_PDF_UPLOAD_SUCCESS: 'SUCCESS_OFFICIAL_TOURNAMENT_PDF_UPLOAD',
  OFFICIAL_TOURNAMENT_PDF_UPLOAD_FAILED: 'ERROR_OFFICIAL_TOURNAMENT_PDF_UPLOAD',

  // Clubs
  CLUB_NOT_FOUND: 'ERROR_CLUB_NOT_FOUND',
  CLUB_ALREADY_EXISTS: 'ERROR_CLUB_ALREADY_EXISTS',
  CLUB_NAME_REQUIRED: 'ERROR_CLUB_NAME_REQUIRED',
  CLUB_NAME_TOO_SHORT: 'ERROR_CLUB_NAME_TOO_SHORT',

  // Member transfer
  MEMBER_TRANSFER_BULK_FAILED: 'ERROR_MEMBER_TRANSFER_BULK_FAILED',

  // Training
  TRAINING_STATS_LOAD_FAILED: 'ERROR_TRAINING_STATS_LOAD_FAILED',

  // Logs
  LOG_NOT_FOUND: 'ERROR_LOG_NOT_FOUND',
};

/**
 * Creates an error object with a code and optional parameters
 * @param {string} code - Error code from ERROR_CODES
 * @param {object} params - Optional parameters for the error message
 * @returns {object} Error object with code and params
 */
export function createError(code, params = null) {
  return {
    code,
    ...(params && { params })
  };
}
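The spread-with-guard in `createError` means the `params` key only exists when parameters are actually passed, which keeps the JSON error response minimal. Repeated standalone for illustration:

```javascript
// Same helper as above, restated so the snippet runs on its own.
const createError = (code, params = null) => ({
  code,
  ...(params && { params })
});

// Without params the spread of `null && { params }` adds nothing:
const plain = createError('ERROR_USER_NOT_FOUND');        // { code: 'ERROR_USER_NOT_FOUND' }
const detailed = createError('ERROR_MEMBER_NOT_FOUND', { memberId: 123 });
```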

@@ -1,13 +1,14 @@
import { register, activateUser, login, logout } from '../services/authService.js';
import jwt from 'jsonwebtoken';
import UserToken from '../models/UserToken.js';
import User from '../models/User.js'; // adjust path if necessary
import { register, activateUser, login, logout, requestPasswordReset, resetPassword } from '../services/authService.js';

const registerUser = async (req, res, next) => {
  try {
    const { email, password } = req.body;
    const user = await register(email, password);
    res.status(201).json(user);
    console.log('registerUser', email, password);
    await register(email, password);
    console.log('registerUser done');
    // For security reasons, do NOT return user data (password hash, activation code, ...)
    res.status(201).json({ success: true });
    console.log('registerUser response sent');
  } catch (error) {
    next(error);
  }
@@ -16,8 +17,9 @@ const registerUser = async (req, res, next) => {
const activate = async (req, res, next) => {
  try {
    const { activationCode } = req.params;
    const user = await activateUser(activationCode);
    res.status(200).json(user);
    await activateUser(activationCode);
    // Don't return the full user object on activation either
    res.status(200).json({ success: true });
  } catch (error) {
    next(error);
  }
@@ -43,4 +45,24 @@ const logoutUser = async (req, res, next) => {
  }
};

export { registerUser, activate, loginUser, logoutUser };
const forgotPassword = async (req, res, next) => {
  try {
    const { email } = req.body;
    const result = await requestPasswordReset(email);
    res.status(200).json(result);
  } catch (error) {
    next(error);
  }
};

const resetUserPassword = async (req, res, next) => {
  try {
    const { token, password } = req.body;
    const result = await resetPassword(token, password);
    res.status(200).json(result);
  } catch (error) {
    next(error);
  }
};

export { registerUser, activate, loginUser, logoutUser, forgotPassword, resetUserPassword };
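The new `forgotPassword` and `resetUserPassword` handlers still need to be reachable via routes. The routes file is not part of this hunk, so the wiring below is purely illustrative: the URL paths are assumptions, and a tiny stand-in router replaces `express.Router()` so the sketch runs on its own:

```javascript
// Stand-in router; in the real app this would be an express.Router()
// registered in the auth routes file (not shown in this diff).
const routes = [];
const router = { post: (path, handler) => routes.push({ path, handler }) };

// Hypothetical handler stubs with the same (req, res) shape as the controller.
const forgotPassword = async (req, res) => res.json({ success: true });
const resetUserPassword = async (req, res) => res.json({ success: true });

// Assumed paths - the actual routes may be named differently.
router.post('/auth/forgot-password', forgotPassword);
router.post('/auth/reset-password', resetUserPassword);
```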
@@ -1,7 +1,7 @@
import diaryService from '../services/diaryService.js';
import HttpError from '../exceptions/HttpError.js';

import { devLog } from '../utils/logger.js';
import { emitDiaryDateUpdated, emitDiaryTagAdded, emitDiaryTagRemoved } from '../services/socketService.js';
const getDatesForClub = async (req, res) => {
  try {
    const { clubId } = req.params;
@@ -43,6 +43,10 @@ const updateTrainingTimes = async (req, res) => {
      throw new HttpError('notallfieldsfilled', 400);
    }
    const updatedDate = await diaryService.updateTrainingTimes(userToken, clubId, dateId, trainingStart, trainingEnd);

    // Emit socket event
    emitDiaryDateUpdated(clubId, dateId, { trainingStart, trainingEnd });

    res.status(200).json(updatedDate);
  } catch (error) {
    console.error('[updateTrainingTimes] - Error:', error);
@@ -79,6 +83,14 @@ const addDiaryTag = async (req, res) => {
    const { authcode: userToken } = req.headers;
    const { diaryDateId, tagName } = req.body;
    const tags = await diaryService.addTagToDate(userToken, diaryDateId, tagName);

    // Fetch clubId for the event
    const { DiaryDate } = await import('../models/index.js');
    const diaryDate = await DiaryDate.findByPk(diaryDateId);
    if (diaryDate?.clubId && tags && tags.length > 0) {
      emitDiaryTagAdded(diaryDate.clubId, diaryDateId, tags[tags.length - 1]);
    }

    res.status(201).json(tags);
  } catch (error) {
    console.error('[addDiaryTag] - Error:', error);
@@ -95,6 +107,12 @@ const addTagToDiaryDate = async (req, res) => {
      return res.status(400).json({ message: 'diaryDateId and tagId are required.' });
    }
    const result = await diaryService.addTagToDiaryDate(userToken, clubId, diaryDateId, tagId);

    // Emit socket event
    if (result && result.tag) {
      emitDiaryTagAdded(clubId, diaryDateId, result.tag);
    }

    res.status(200).json(result);
  } catch (error) {
    console.error('[addTagToDiaryDate] - Error:', error);
@@ -106,8 +124,20 @@ const deleteTagFromDiaryDate = async (req, res) => {
  try {
    const { tagId } = req.query;
    const { authcode: userToken } = req.headers;
    const { clubId } = req.params;
    const { clubId } = req.params;

    // Fetch diaryDateId before deleting
    const { DiaryDateTag } = await import('../models/index.js');
    const diaryDateTag = await DiaryDateTag.findByPk(tagId);
    const diaryDateId = diaryDateTag?.diaryDateId;

    await diaryService.removeTagFromDiaryDate(userToken, clubId, tagId);

    // Emit socket event
    if (diaryDateId) {
      emitDiaryTagRemoved(clubId, diaryDateId, tagId);
    }

    res.status(200).json({ message: 'Tag deleted' });
  } catch (error) {
    console.error('[deleteTag] - Error:', error);
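The delete handlers in this controller share one ordering constraint: the related IDs must be resolved before the row is destroyed, because afterwards the foreign keys are gone, and only then is the socket event emitted. Reduced to a runnable sketch with stand-ins for the model and the socket service (names mirror the code above but are simplified):

```javascript
// Stand-ins for the model lookup and the socket service used above.
const rows = new Map([[42, { id: 42, diaryDateId: 7 }]]);
const events = [];
const emitDiaryTagRemoved = (clubId, diaryDateId, tagId) =>
  events.push({ clubId, diaryDateId, tagId });

function deleteTagWithEvent(clubId, tagId) {
  // 1. Resolve diaryDateId BEFORE the delete; afterwards the row is gone.
  const diaryDateId = rows.get(tagId)?.diaryDateId;
  // 2. Delete the row.
  rows.delete(tagId);
  // 3. Emit only when the lookup succeeded, mirroring the guard above.
  if (diaryDateId) {
    emitDiaryTagRemoved(clubId, diaryDateId, tagId);
  }
}
```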
@@ -1,4 +1,6 @@
import diaryDateActivityService from '../services/diaryDateActivityService.js';
import { emitActivityChanged } from '../services/socketService.js';
import DiaryDate from '../models/DiaryDates.js';

import { devLog } from '../utils/logger.js';
export const createDiaryDateActivity = async (req, res) => {
@@ -14,6 +16,13 @@ export const createDiaryDateActivity = async (req, res) => {
      orderId,
      isTimeblock,
    });

    // Emit socket event
    const diaryDate = await DiaryDate.findByPk(diaryDateId);
    if (diaryDate?.clubId) {
      emitActivityChanged(diaryDate.clubId, diaryDateId);
    }

    res.status(201).json(activityItem);
  } catch (error) {
    devLog(error);
@@ -34,6 +43,15 @@ export const updateDiaryDateActivity = async (req, res) => {
      orderId,
      groupId, // Pass groupId to the service
    });

    // Emit socket event
    if (updatedActivity?.diaryDateId) {
      const diaryDate = await DiaryDate.findByPk(updatedActivity.diaryDateId);
      if (diaryDate?.clubId) {
        emitActivityChanged(diaryDate.clubId, updatedActivity.diaryDateId);
      }
    }

    res.status(200).json(updatedActivity);
  } catch (error) {
    res.status(500).json({ error: 'Error updating activity' });
@@ -44,7 +62,22 @@ export const deleteDiaryDateActivity = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, id } = req.params;

    // Fetch diaryDateId before deleting
    const DiaryDateActivity = (await import('../models/DiaryDateActivity.js')).default;
    const activity = await DiaryDateActivity.findByPk(id);
    const diaryDateId = activity?.diaryDateId;

    await diaryDateActivityService.deleteActivity(userToken, clubId, id);

    // Emit socket event
    if (diaryDateId) {
      const diaryDate = await DiaryDate.findByPk(diaryDateId);
      if (diaryDate?.clubId) {
        emitActivityChanged(diaryDate.clubId, diaryDateId);
      }
    }

    res.status(200).json({ message: 'Activity deleted' });
  } catch (error) {
    res.status(500).json({ error: 'Error deleting activity' });
@@ -57,6 +90,15 @@ export const updateDiaryDateActivityOrder = async (req, res) => {
    const { clubId, id } = req.params;
    const { orderId } = req.body;
    const updatedActivity = await diaryDateActivityService.updateActivityOrder(userToken, clubId, id, orderId);

    // Emit socket event
    if (updatedActivity?.diaryDateId) {
      const diaryDate = await DiaryDate.findByPk(updatedActivity.diaryDateId);
      if (diaryDate?.clubId) {
        emitActivityChanged(diaryDate.clubId, updatedActivity.diaryDateId);
      }
    }

    res.status(200).json(updatedActivity);
  } catch (error) {
    devLog(error);
@@ -79,8 +121,15 @@ export const getDiaryDateActivities = async (req, res) => {
export const addGroupActivity = async(req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, diaryDateId, groupId, activity, timeblockId } = req.body;
    const activityItem = await diaryDateActivityService.addGroupActivity(userToken, clubId, diaryDateId, groupId, activity, timeblockId);
    const { clubId, diaryDateId, groupId, activity, predefinedActivityId, timeblockId } = req.body;
    const activityItem = await diaryDateActivityService.addGroupActivity(userToken, clubId, diaryDateId, groupId, activity, predefinedActivityId, timeblockId);

    // Emit socket event
    const diaryDate = await DiaryDate.findByPk(diaryDateId);
    if (diaryDate?.clubId) {
      emitActivityChanged(diaryDate.clubId, diaryDateId);
    }

    res.status(201).json(activityItem);
  } catch (error) {
    devLog(error);
@@ -88,11 +137,61 @@ export const addGroupActivity = async(req, res) => {
  }
}

export const updateGroupActivity = async(req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, groupActivityId } = req.params;
    const { predefinedActivityId } = req.body;
    const activityItem = await diaryDateActivityService.updateGroupActivity(userToken, clubId, groupActivityId, predefinedActivityId);

    // Emit socket event
    const GroupActivity = (await import('../models/GroupActivity.js')).default;
    const DiaryDateActivity = (await import('../models/DiaryDateActivity.js')).default;
    const groupActivity = await GroupActivity.findByPk(groupActivityId);
    let diaryDateId = null;
    if (groupActivity?.diaryDateActivity) {
      const activity = await DiaryDateActivity.findByPk(groupActivity.diaryDateActivity);
      diaryDateId = activity?.diaryDateId;
    }
    if (diaryDateId) {
      const diaryDate = await DiaryDate.findByPk(diaryDateId);
      if (diaryDate?.clubId) {
        emitActivityChanged(diaryDate.clubId, diaryDateId);
      }
    }

    res.status(200).json(activityItem);
  } catch (error) {
    devLog(error);
    res.status(500).json({ error: 'Error updating group activity' });
  }
}

export const deleteGroupActivity = async(req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, groupActivityId } = req.params;

    // Fetch diaryDateId before deleting
    const GroupActivity = (await import('../models/GroupActivity.js')).default;
    const DiaryDateActivity = (await import('../models/DiaryDateActivity.js')).default;
    const groupActivity = await GroupActivity.findByPk(groupActivityId);
    let diaryDateId = null;
    if (groupActivity?.diaryDateActivity) {
      const activity = await DiaryDateActivity.findByPk(groupActivity.diaryDateActivity);
      diaryDateId = activity?.diaryDateId;
    }

    await diaryDateActivityService.deleteGroupActivity(userToken, clubId, groupActivityId);

    // Emit socket event
    if (diaryDateId) {
      const diaryDate = await DiaryDate.findByPk(diaryDateId);
      if (diaryDate?.clubId) {
        emitActivityChanged(diaryDate.clubId, diaryDateId);
      }
    }

    res.status(200).json({ message: 'Group activity deleted' });
  } catch (error) {
    devLog(error);

@@ -1,6 +1,9 @@
import DiaryMemberActivity from '../models/DiaryMemberActivity.js';
import DiaryDateActivity from '../models/DiaryDateActivity.js';
import DiaryDates from '../models/DiaryDates.js';
import Participant from '../models/Participant.js';
import { checkAccess } from '../utils/userUtils.js';
import { emitActivityMemberAdded, emitActivityMemberRemoved } from '../services/socketService.js';

export const getMembersForActivity = async (req, res) => {
  try {
@@ -31,6 +34,13 @@ export const addMembersToActivity = async (req, res) => {

    const validIds = new Set(validParticipants.map(p => p.id));
    const created = [];

    // Fetch clubId and dateId for events (in case they are not available from params)
    const activity = await DiaryDateActivity.findByPk(diaryDateActivityId);
    const diaryDate = activity ? await DiaryDates.findByPk(activity.diaryDateId) : null;
    const eventClubId = diaryDate?.clubId || clubId;
    const dateId = diaryDate?.id || null;

    for (const pid of participantIds) {
      if (!validIds.has(pid)) {
        continue;
@@ -39,6 +49,11 @@ export const addMembersToActivity = async (req, res) => {
      if (!existing) {
        const rec = await DiaryMemberActivity.create({ diaryDateActivityId, participantId: pid });
        created.push(rec);

        // Emit socket event
        if (eventClubId && dateId) {
          emitActivityMemberAdded(eventClubId, diaryDateActivityId, pid, dateId);
        }
      } else {
      }
    }
@@ -54,7 +69,19 @@ export const removeMemberFromActivity = async (req, res) => {
    const { authcode: userToken } = req.headers;
    const { clubId, diaryDateActivityId, participantId } = req.params;
    await checkAccess(userToken, clubId);

    // Fetch dateId for the event
    const activity = await DiaryDateActivity.findByPk(diaryDateActivityId);
    const diaryDate = activity ? await DiaryDates.findByPk(activity.diaryDateId) : null;
    const dateId = diaryDate?.id || null;

    await DiaryMemberActivity.destroy({ where: { diaryDateActivityId, participantId } });

    // Emit socket event
    if (dateId) {
      emitActivityMemberRemoved(clubId, diaryDateActivityId, participantId, dateId);
    }

    res.status(200).json({ ok: true });
  } catch (e) {
    res.status(500).json({ error: 'Error removing member from activity' });

@@ -1,5 +1,6 @@
import { DiaryNote, DiaryTag } from '../models/index.js';
import { DiaryNote, DiaryTag, DiaryDate } from '../models/index.js';
import diaryService from '../services/diaryService.js';
import { emitDiaryNoteAdded, emitDiaryNoteDeleted } from '../services/socketService.js';

export const getNotes = async (req, res) => {
  try {
@@ -26,6 +27,9 @@ export const createNote = async (req, res) => {

    const newNote = await DiaryNote.create({ memberId, diaryDateId, content });

    // Fetch the DiaryDate for its clubId
    const diaryDate = await DiaryDate.findByPk(diaryDateId);

    if (Array.isArray(tags) && tags.length > 0 && typeof newNote.addTags === 'function') {
      const tagInstances = await DiaryTag.findAll({ where: { id: tags } });
      await newNote.addTags(tagInstances);
@@ -34,9 +38,19 @@ export const createNote = async (req, res) => {
        include: [{ model: DiaryTag, as: 'tags', required: false }],
      });

      // Emit socket event
      if (diaryDate?.clubId) {
        emitDiaryNoteAdded(diaryDate.clubId, diaryDateId, noteWithTags ?? newNote);
      }

      return res.status(201).json(noteWithTags ?? newNote);
    }

    // Emit socket event
    if (diaryDate?.clubId) {
      emitDiaryNoteAdded(diaryDate.clubId, diaryDateId, newNote);
    }

    res.status(201).json(newNote);
  } catch (error) {
    console.error('[createNote] - Error:', error);
@@ -47,7 +61,25 @@ export const createNote = async (req, res) => {

export const deleteNote = async (req, res) => {
  try {
    const { noteId } = req.params;

    // Fetch the note for its diaryDateId before deleting
    const note = await DiaryNote.findByPk(noteId);
    const diaryDateId = note?.diaryDateId;

    // Fetch the DiaryDate for its clubId
    let clubId = null;
    if (diaryDateId) {
      const diaryDate = await DiaryDate.findByPk(diaryDateId);
      clubId = diaryDate?.clubId;
    }

    await DiaryNote.destroy({ where: { id: noteId } });

    // Emit socket event
    if (clubId && diaryDateId) {
      emitDiaryNoteDeleted(clubId, diaryDateId, noteId);
    }

    res.status(200).json({ message: 'Note deleted' });
  } catch (error) {
    res.status(500).json({ error: 'Error deleting note' });

@@ -1,5 +1,7 @@
import HttpError from '../exceptions/HttpError.js';
import groupService from '../services/groupService.js';
import { emitActivityChanged, emitGroupChanged } from '../services/socketService.js';
import DiaryDate from '../models/DiaryDates.js';

import { devLog } from '../utils/logger.js';
const addGroup = async(req, res) => {
@@ -7,6 +9,15 @@ const addGroup = async(req, res) => {
    const { authcode: userToken } = req.headers;
    const { clubid: clubId, dateid: dateId, name, lead } = req.body;
    const result = await groupService.addGroup(userToken, clubId, dateId, name, lead);

    // Emit socket event for group changes
    if (dateId) {
      const diaryDate = await DiaryDate.findByPk(dateId);
      if (diaryDate?.clubId) {
        emitGroupChanged(diaryDate.clubId, dateId);
      }
    }

    res.status(201).json(result);
  } catch (error) {
    console.error('[addGroup] - Error:', error);
@@ -33,6 +44,15 @@ const changeGroup = async(req, res) => {
    const { groupId } = req.params;
    const { clubid: clubId, dateid: dateId, name, lead } = req.body;
    const result = await groupService.changeGroup(userToken, groupId, clubId, dateId, name, lead);

    // Emit socket event for group changes
    if (dateId) {
      const diaryDate = await DiaryDate.findByPk(dateId);
      if (diaryDate?.clubId) {
        emitGroupChanged(diaryDate.clubId, dateId);
      }
    }

    res.status(200).json(result);
  } catch (error) {
    console.error('[changeGroup] - Error:', error);
@@ -40,4 +60,27 @@ const changeGroup = async(req, res) => {
  }
}

export { addGroup, getGroups, changeGroup};
const deleteGroup = async(req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { groupId } = req.params;
    const { clubid: clubId, dateid: dateId } = req.body;
    const result = await groupService.deleteGroup(userToken, groupId, clubId, dateId);

    // Emit socket events for group and activity changes (groups are used in activities)
    if (dateId) {
      const diaryDate = await DiaryDate.findByPk(dateId);
      if (diaryDate?.clubId) {
        emitGroupChanged(diaryDate.clubId, dateId);
        emitActivityChanged(diaryDate.clubId, dateId);
      }
    }

    res.status(200).json(result);
  } catch (error) {
    console.error('[deleteGroup] - Error:', error);
    res.status(error.statusCode || 500).json({ error: error.message });
  }
}

export { addGroup, getGroups, changeGroup, deleteGroup};
@@ -1,6 +1,6 @@
import MatchService from '../services/matchService.js';
import fs from 'fs';

import { emitScheduleMatchUpdated } from '../services/socketService.js';
import { devLog } from '../utils/logger.js';
export const uploadCSV = async (req, res) => {
  try {
@@ -116,7 +116,11 @@ export const updateMatchPlayers = async (req, res) => {
      playersPlanned,
      playersPlayed
    );


    if (result.clubId) {
      emitScheduleMatchUpdated(result.clubId, result.id, result.match || null);
    }

    return res.status(200).json({
      message: 'Match players updated successfully',
      data: result
@@ -145,3 +149,21 @@ export const getPlayerMatchStats = async (req, res) => {
    });
  }
};

export const getMatchPlayers = async (req, res) => {
  try {
    const { clubId } = req.params;
    if (!clubId) {
      return res.status(400).json({ error: 'Club-ID fehlt' });
    }
    const Member = (await import('../models/Member.js')).default;
    const members = await Member.findAll({
      where: { clubId: clubId, active: true },
      attributes: ['id', 'firstName', 'lastName', 'gender']
    });
    return res.status(200).json(members);
  } catch (error) {
    console.error('Error retrieving match players:', error);
    return res.status(500).json({ error: 'Failed to retrieve match players' });
  }
};

@@ -49,13 +49,19 @@ export const getMemberActivities = async (req, res) => {
|
||||
|
||||
const participantIds = participants.map(p => p.id);
|
||||
|
||||
// Get all diary member activities for this member
|
||||
const whereClause = {
|
||||
participantId: participantIds
|
||||
};
|
||||
// Sammle alle Gruppen-IDs, zu denen der Member gehört
|
||||
const memberGroupIds = new Set();
|
||||
participants.forEach(p => {
|
||||
if (p.groupId !== null && p.groupId !== undefined) {
|
||||
memberGroupIds.add(p.groupId);
|
||||
}
|
||||
});
|
||||
|
||||
// 1. Get all diary member activities explicitly assigned to this member
|
||||
const memberActivities = await DiaryMemberActivity.findAll({
|
||||
where: whereClause,
|
||||
where: {
|
||||
participantId: participantIds
|
||||
},
|
||||
include: [
|
||||
{
|
||||
model: Participant,
|
||||
@@ -90,47 +96,186 @@ export const getMemberActivities = async (req, res) => {
      order: [[{ model: DiaryDateActivity, as: 'activity' }, { model: DiaryDates, as: 'diaryDate' }, 'date', 'DESC']]
    });

    // Group activities by name and count occurrences, considering group assignment
    // 2. Get all group activities for groups the member belongs to
    const groupActivities = [];
    if (memberGroupIds.size > 0) {
      // Look up GroupActivity entries for the member's groups directly
      const groupActivitiesData = await GroupActivity.findAll({
        where: {
          groupId: {
            [Op.in]: Array.from(memberGroupIds)
          }
        },
        include: [
          {
            model: DiaryDateActivity,
            as: 'activityGroupActivity',
            include: [
              {
                model: DiaryDates,
                as: 'diaryDate',
                where: startDate ? {
                  date: {
                    [Op.gte]: startDate
                  }
                } : {}
              },
              {
                model: PredefinedActivity,
                as: 'predefinedActivity',
                required: false
              }
            ]
          },
          {
            model: PredefinedActivity,
            as: 'groupPredefinedActivity',
            required: false
          }
        ]
      });

      // Create virtual DiaryMemberActivity objects for group activities
      for (const groupActivity of groupActivitiesData) {
        if (!groupActivity.activityGroupActivity || !groupActivity.activityGroupActivity.diaryDate) {
          continue; // Skip if no DiaryDateActivity or no DiaryDate is present
        }

        const activity = groupActivity.activityGroupActivity;
        const diaryDateId = activity.diaryDateId;

        // Find all relevant participants for this DiaryDate
        const relevantParticipants = participants.filter(p =>
          p.diaryDateId === diaryDateId &&
          p.groupId === groupActivity.groupId
        );

        for (const participant of relevantParticipants) {
          // Use the PredefinedActivity from GroupActivity if present,
          // otherwise the one from DiaryDateActivity
          const predefinedActivity = groupActivity.groupPredefinedActivity || activity.predefinedActivity;

          if (predefinedActivity) {
            // Create a modified activity object
            const modifiedActivity = {
              ...activity.toJSON(),
              predefinedActivity: predefinedActivity
            };
            groupActivities.push({
              activity: modifiedActivity,
              participant: participant,
              id: null // Virtual, not in the DB
            });
          }
        }
      }
    }

    // Filter: explicit assignments should only count if
    // - the participant has no group AND the activity has NO group binding, or
    // - the activity has no group binding, or
    // - there is a group binding that matches the participant's group.
    const filteredMemberActivities = memberActivities.filter((ma) => {
      if (!ma?.participant || !ma?.activity) {
        return false;
      }

      const participantGroupId = ma.participant.groupId;
      const groupActivitiesForActivity = ma.activity.groupActivities || [];

      // Participant without a group -> only count activities without a group binding
      if (participantGroupId === null || participantGroupId === undefined) {
        return !groupActivitiesForActivity.length;
      }

      // No group binding -> always count
      if (!groupActivitiesForActivity.length) {
        return true;
      }

      // Group binding present -> only count if the group matches
      return groupActivitiesForActivity.some((ga) => Number(ga.groupId) === Number(participantGroupId));
    });

    // 3. Combine both lists and remove duplicates
    // A duplicate exists when the same activity is already explicitly assigned to the same participant
    const explicitActivityKeys = new Set();
    filteredMemberActivities.forEach(ma => {
      if (ma.activity && ma.activity.id && ma.participant && ma.participant.id) {
        // Build a unique key: activityId-participantId
        const key = `${ma.activity.id}-${ma.participant.id}`;
        explicitActivityKeys.add(key);
      }
    });

    // Filter out group activities that are already explicitly assigned
    const uniqueGroupActivities = groupActivities.filter(ga => {
      if (!ga.activity || !ga.activity.id || !ga.participant || !ga.participant.id) {
        return false;
      }
      const key = `${ga.activity.id}-${ga.participant.id}`;
      return !explicitActivityKeys.has(key);
    });

    // Combine both lists
    const allActivities = [...filteredMemberActivities, ...uniqueGroupActivities];

    // Group activities by name and count occurrences
    // Use one Set per activity to track unique date-activity combinations
    const activityMap = new Map();

    for (const ma of memberActivities) {
    for (const ma of allActivities) {
      if (!ma.activity || !ma.activity.predefinedActivity || !ma.participant) {
        continue;
      }

      // Check group assignment
      const participantGroupId = ma.participant.groupId;
      const activityGroupIds = ma.activity.groupActivities?.map(ga => ga.groupId) || [];

      // Filter: Only count if:
      // 1. Activity has no group assignment (empty activityGroupIds) - activity is for all groups OR
      // 2. Participant's group matches one of the activity's groups
      const shouldCount = activityGroupIds.length === 0 ||
        (participantGroupId !== null && activityGroupIds.includes(participantGroupId));

      if (!shouldCount) {
        continue;
      }

      const activity = ma.activity.predefinedActivity;
      const activityName = activity.name;
      const activityCode = activity.code || activity.name; // Use the code if present, otherwise the name
      const date = ma.activity.diaryDate?.date;

      if (!activityMap.has(activityName)) {
        activityMap.set(activityName, {
          name: activityName,
          count: 0,
      if (!date) {
        continue; // Skip entries without a date
      }

      // Use the code as key if present, otherwise the name
      const key = activityCode;

      if (!activityMap.has(key)) {
        activityMap.set(key, {
          name: activityName, // Full name for the tooltip
          code: activityCode, // Code/abbreviation for display
          uniqueDates: new Set(), // Set of unique dates
          dates: []
        });
      }

      const activityData = activityMap.get(activityName);
      activityData.count++;
      if (date) {
      const activityData = activityMap.get(key);
      // Convert the date to a string for the Set comparison (date only, no time)
      const dateString = date instanceof Date
        ? date.toISOString().split('T')[0]
        : new Date(date).toISOString().split('T')[0];

      // Only add the date if it is not already present
      if (!activityData.uniqueDates.has(dateString)) {
        activityData.uniqueDates.add(dateString);
        activityData.dates.push(date);
      }
    }

    // Convert the Sets to arrays and set count based on the unique dates
    activityMap.forEach((activityData, key) => {
      activityData.count = activityData.uniqueDates.size;
      // Sort dates (newest first)
      activityData.dates.sort((a, b) => {
        const dateA = new Date(a);
        const dateB = new Date(b);
        return dateB - dateA;
      });
      // Remove uniqueDates since it does not need to be sent to the frontend
      delete activityData.uniqueDates;
    });

    // Convert map to array and sort by count
    const activities = Array.from(activityMap.values())
      .sort((a, b) => b.count - a.count);
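The hunk above swaps a plain per-entry counter for date-deduplicated counting: each activity keeps a Set of ISO date strings, and `count` becomes the size of that Set, so the same activity on the same day is only counted once. A minimal standalone sketch of that idea — the flat input shape and the function name are hypothetical, not the controller's actual Sequelize models:

```javascript
// Hypothetical sketch: count each activity once per calendar day,
// keyed by the short activity code, sorted by count descending.
function countUniqueDates(entries) {
  const activityMap = new Map();
  for (const { code, name, date } of entries) {
    if (!date) continue; // skip entries without a date
    if (!activityMap.has(code)) {
      activityMap.set(code, { name, code, uniqueDates: new Set(), dates: [] });
    }
    const data = activityMap.get(code);
    // compare on the date part only, never the time
    const dateString = new Date(date).toISOString().split('T')[0];
    if (!data.uniqueDates.has(dateString)) {
      data.uniqueDates.add(dateString);
      data.dates.push(date);
    }
  }
  // strip the working Set before returning, as the hunk does before responding
  return Array.from(activityMap.values())
    .map(({ uniqueDates, ...rest }) => ({ ...rest, count: uniqueDates.size }))
    .sort((a, b) => b.count - a.count);
}
```

The Set-of-date-strings approach sidesteps `Date` object identity: two `Date` instances for the same day are never equal, but their `YYYY-MM-DD` strings are.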
@@ -162,7 +307,15 @@ export const getMemberLastParticipations = async (req, res) => {

    const participantIds = participants.map(p => p.id);

    // Get last participations for this member
    // Collect all group IDs the member belongs to
    const memberGroupIds = new Set();
    participants.forEach(p => {
      if (p.groupId !== null && p.groupId !== undefined) {
        memberGroupIds.add(p.groupId);
      }
    });

    // 1. Get last participations explicitly assigned to this member
    const memberActivities = await DiaryMemberActivity.findAll({
      where: {
        participantId: participantIds
@@ -196,31 +349,177 @@ export const getMemberLastParticipations = async (req, res) => {
      order: [[{ model: DiaryDateActivity, as: 'activity' }, { model: DiaryDates, as: 'diaryDate' }, 'date', 'DESC']],
      limit: parseInt(limit) * 10 // Get more to filter by group
    });

    // See getMemberActivities(): only count if the group binding matches (or none exists)
    const filteredMemberActivities = memberActivities.filter((ma) => {
      if (!ma?.participant || !ma?.activity) {
        return false;
      }

      const participantGroupId = ma.participant.groupId;
      const groupActivitiesForActivity = ma.activity.groupActivities || [];

      if (!groupActivitiesForActivity.length) {
        return true;
      }

      return groupActivitiesForActivity.some((ga) => Number(ga.groupId) === Number(participantGroupId));
    });

    // Format the results, considering group assignment
    const participations = memberActivities
    // 2. Get all group activities for groups the member belongs to
    const groupActivities = [];
    if (memberGroupIds.size > 0) {
      // Look up GroupActivity entries for the member's groups directly
      const groupActivitiesData = await GroupActivity.findAll({
        where: {
          groupId: {
            [Op.in]: Array.from(memberGroupIds)
          }
        },
        include: [
          {
            model: DiaryDateActivity,
            as: 'activityGroupActivity',
            include: [
              {
                model: DiaryDates,
                as: 'diaryDate'
              },
              {
                model: PredefinedActivity,
                as: 'predefinedActivity',
                required: false
              }
            ]
          },
          {
            model: PredefinedActivity,
            as: 'groupPredefinedActivity',
            required: false
          }
        ],
        order: [[{ model: DiaryDateActivity, as: 'activityGroupActivity' }, { model: DiaryDates, as: 'diaryDate' }, 'date', 'DESC']],
        limit: parseInt(limit) * 10 // Get more to filter
      });

      // Create virtual DiaryMemberActivity objects for group activities
      for (const groupActivity of groupActivitiesData) {
        if (!groupActivity.activityGroupActivity || !groupActivity.activityGroupActivity.diaryDate) {
          continue; // Skip if no DiaryDateActivity or no DiaryDate is present
        }

        const activity = groupActivity.activityGroupActivity;
        const diaryDateId = activity.diaryDateId;

        // Find all relevant participants for this DiaryDate
        const relevantParticipants = participants.filter(p =>
          p.diaryDateId === diaryDateId &&
          p.groupId === groupActivity.groupId
        );

        for (const participant of relevantParticipants) {
          // Use the PredefinedActivity from GroupActivity if present,
          // otherwise the one from DiaryDateActivity
          const predefinedActivity = groupActivity.groupPredefinedActivity || activity.predefinedActivity;

          if (predefinedActivity) {
            // Create a modified activity object
            const modifiedActivity = {
              ...activity.toJSON(),
              predefinedActivity: predefinedActivity
            };
            groupActivities.push({
              activity: modifiedActivity,
              participant: participant,
              id: null // Virtual, not in the DB
            });
          }
        }
      }
    }

    // 3. Combine both lists and remove duplicates
    // A duplicate exists when the same activity is already explicitly assigned to the same participant
    const explicitActivityKeys = new Set();
    filteredMemberActivities.forEach(ma => {
      if (ma.activity && ma.activity.id && ma.participant && ma.participant.id) {
        // Build a unique key: activityId-participantId
        const key = `${ma.activity.id}-${ma.participant.id}`;
        explicitActivityKeys.add(key);
      }
    });

    // Filter out group activities that are already explicitly assigned
    const uniqueGroupActivities = groupActivities.filter(ga => {
      if (!ga.activity || !ga.activity.id || !ga.participant || !ga.participant.id) {
        return false;
      }
      const key = `${ga.activity.id}-${ga.participant.id}`;
      return !explicitActivityKeys.has(key);
    });

    // Combine both lists
    const allActivities = [...filteredMemberActivities, ...uniqueGroupActivities];

    // Group by date
    const participationsByDate = new Map();

    allActivities
      .filter(ma => {
        if (!ma.activity || !ma.activity.predefinedActivity || !ma.activity.diaryDate || !ma.participant) {
          return false;
        }

        // Check group assignment
        const participantGroupId = ma.participant.groupId;
        const activityGroupIds = ma.activity.groupActivities?.map(ga => ga.groupId) || [];

        // Filter: Only count if:
        // 1. Activity has no group assignment (empty activityGroupIds) - activity is for all groups OR
        // 2. Participant's group matches one of the activity's groups
        return activityGroupIds.length === 0 ||
          (participantGroupId !== null && activityGroupIds.includes(participantGroupId));
        return true;
      })
      .slice(0, parseInt(limit)) // Limit after filtering
      .map(ma => ({
        id: ma.id,
        activityName: ma.activity.predefinedActivity.name,
        date: ma.activity.diaryDate.date,
        diaryDateId: ma.activity.diaryDate.id
      }));
      .forEach(ma => {
        const date = ma.activity.diaryDate.date;
        const diaryDateId = ma.activity.diaryDate.id;
        const activity = ma.activity.predefinedActivity;
        const activityName = activity.name;
        const activityCode = activity.code || activity.name;

        if (!participationsByDate.has(date)) {
          participationsByDate.set(date, {
            date: date,
            diaryDateId: diaryDateId,
            activities: []
          });
        }

        const dateEntry = participationsByDate.get(date);
        // Only add the activity if it is not already present (avoid duplicates)
        // Store both code and name
        const activityEntry = {
          code: activityCode,
          name: activityName
        };
        if (!dateEntry.activities.find(a => (a.code || a.name) === activityCode)) {
          dateEntry.activities.push(activityEntry);
        }
      });

    // Sort by date (newest first) and take the last N dates
    const sortedDates = Array.from(participationsByDate.values())
      .sort((a, b) => {
        const dateA = new Date(a.date);
        const dateB = new Date(b.date);
        return dateB - dateA;
      })
      .slice(0, parseInt(limit));

    // Format for the frontend: a flat list with date and activity
    const participations = [];
    sortedDates.forEach(dateEntry => {
      dateEntry.activities.forEach(activity => {
        participations.push({
          id: null, // Virtual
          activityName: activity.code || activity.name, // Code for display
          activityFullName: activity.name, // Full name for the tooltip
          date: dateEntry.date,
          diaryDateId: dateEntry.diaryDateId
        });
      });
    });

    return res.status(200).json(participations);

@@ -1,5 +1,6 @@
import MemberService from "../services/memberService.js";
import MemberTransferService from "../services/memberTransferService.js";
import { emitMemberChanged } from '../services/socketService.js';

import { devLog } from '../utils/logger.js';
const getClubMembers = async(req, res) => {
@@ -32,6 +33,12 @@ const setClubMembers = async (req, res) => {
    const { authcode: userToken } = req.headers;
    const addResult = await MemberService.setClubMember(userToken, clubId, memberId, firstName, lastName, street, city, postalCode, birthdate,
      phone, email, active, testMembership, picsInInternetAllowed, gender, ttr, qttr, memberFormHandedOver, contacts);

    // Emit a socket event when the member was successfully created/updated
    if (addResult.status === 200) {
      emitMemberChanged(clubId);
    }

    res.status(addResult.status || 500).json(addResult.response);
  } catch (error) {
    console.error('[setClubMembers] - Error:', error);
@@ -124,10 +131,14 @@ const generateMemberGallery = async (req, res) => {
    const { authcode: userToken } = req.headers;
    const size = parseInt(req.query.size) || 200; // Default: 200x200
    const format = req.query.format || 'image'; // 'image' or 'json'
    const result = await MemberService.generateMemberGallery(userToken, clubId, size);

    // For format=json no image is created; only the member list is returned
    const createImage = format !== 'json';
    const result = await MemberService.generateMemberGallery(userToken, clubId, size, createImage);

    if (result.status === 200) {
      if (format === 'json') {
        // Return member information for interactive gallery
        // Return member information for interactive gallery (without creating an image)
        return res.status(200).json({
          members: result.galleryEntries.map(entry => ({
            memberId: entry.memberId,

@@ -1,5 +1,52 @@
import myTischtennisService from '../services/myTischtennisService.js';
import HttpError from '../exceptions/HttpError.js';
import axios from 'axios';
import myTischtennisClient from '../clients/myTischtennisClient.js';

const MYTT_ORIGIN = 'https://www.mytischtennis.de';
const MYTT_PROXY_PREFIX = '/api/mytischtennis/proxy';

function rewriteMytischtennisContent(content) {
  if (typeof content !== 'string' || !content) {
    return content;
  }

  let rewritten = content;

  // Load root-relative build/font assets through our same-origin proxy.
  rewritten = rewritten.replace(
    /(["'])\/build\//g,
    `$1${MYTT_PROXY_PREFIX}/build/`
  );
  rewritten = rewritten.replace(
    /(["'])\/fonts\//g,
    `$1${MYTT_PROXY_PREFIX}/fonts/`
  );

  // Point absolute build/font URLs at the proxy as well.
  rewritten = rewritten.replace(
    /https:\/\/www\.mytischtennis\.de\/build\//g,
    `${MYTT_PROXY_PREFIX}/build/`
  );
  rewritten = rewritten.replace(
    /https:\/\/www\.mytischtennis\.de\/fonts\//g,
    `${MYTT_PROXY_PREFIX}/fonts/`
  );

  // CSS url(/fonts/...) cases.
  rewritten = rewritten.replace(
    /url\((["']?)\/fonts\//g,
    `url($1${MYTT_PROXY_PREFIX}/fonts/`
  );

  // The captcha endpoint must also be reachable same-origin via the proxy.
  rewritten = rewritten.replace(
    /(["'])\/api\/private-captcha/g,
    `$1${MYTT_PROXY_PREFIX}/api/private-captcha`
  );

  return rewritten;
}

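The helper above rewrites asset URLs so the proxied page stays same-origin. A reduced sketch of two of those rules — only the quote-prefixed `/build/` rule and the absolute `/fonts/` rule are reproduced, and the sample input is hypothetical:

```javascript
// Reduced sketch of the URL rewriting used by rewriteMytischtennisContent.
const MYTT_PROXY_PREFIX = '/api/mytischtennis/proxy';

function rewriteAssets(content) {
  return content
    // quote-prefixed root-relative build assets: "/build/..." -> "<prefix>/build/..."
    .replace(/(["'])\/build\//g, `$1${MYTT_PROXY_PREFIX}/build/`)
    // absolute font URLs: https://www.mytischtennis.de/fonts/... -> <prefix>/fonts/...
    .replace(/https:\/\/www\.mytischtennis\.de\/fonts\//g, `${MYTT_PROXY_PREFIX}/fonts/`);
}

const sample = '<script src="/build/app.js"></script> https://www.mytischtennis.de/fonts/a.woff2';
// rewriteAssets(sample) → '<script src="/api/mytischtennis/proxy/build/app.js"></script> /api/mytischtennis/proxy/fonts/a.woff2'
```

Capturing the leading quote in `$1` keeps the first rule from touching bare `/build/` occurrences in running text and preserves whichever quote style the page used.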
class MyTischtennisController {
  /**
@@ -35,6 +82,49 @@ class MyTischtennisController {
    }
  }

  /**
   * GET /api/mytischtennis/login-form
   * Parsed login form data from mytischtennis.de
   */
  async getLoginForm(req, res, next) {
    try {
      const myTischtennisClient = (await import('../clients/myTischtennisClient.js')).default;
      const result = await myTischtennisClient.getLoginPage();

      if (!result.success) {
        throw new HttpError('Login-Formular konnte nicht geladen werden', 502);
      }

      const publicFields = (result.fields || [])
        .filter((field) => ['email', 'password'].includes(field.type) || field.name === 'email' || field.name === 'password')
        .map((field) => ({
          name: field.name,
          id: field.id,
          type: field.type,
          placeholder: field.placeholder || null,
          required: !!field.required,
          autocomplete: field.autocomplete || null,
          minlength: field.minlength ? Number(field.minlength) : null
        }));

      res.status(200).json({
        success: true,
        form: {
          action: result.loginAction,
          fields: publicFields
        },
        captcha: {
          required: !!result.requiresCaptcha,
          siteKey: result.captchaSiteKey || null,
          puzzleEndpoint: result.captchaPuzzleEndpoint || null,
          solutionField: result.captchaSolutionField || 'captcha'
        }
      });
    } catch (error) {
      next(error);
    }
  }

  /**
   * POST /api/mytischtennis/account
   * Create or update myTischtennis account
@@ -42,7 +132,9 @@ class MyTischtennisController {
  async upsertAccount(req, res, next) {
    try {
      const userId = req.user.id;
      const { email, password, savePassword, autoUpdateRatings, userPassword } = req.body;
      const { email, password, savePassword, userPassword } = req.body;
      const hasAutoUpdateRatings = Object.prototype.hasOwnProperty.call(req.body, 'autoUpdateRatings');
      const autoUpdateRatings = hasAutoUpdateRatings ? req.body.autoUpdateRatings : undefined;

      if (!email) {
        throw new HttpError('E-Mail-Adresse erforderlich', 400);
@@ -58,7 +150,7 @@ class MyTischtennisController {
        email,
        password,
        savePassword || false,
        autoUpdateRatings || false,
        autoUpdateRatings,
        userPassword
      );

@@ -199,6 +291,444 @@ class MyTischtennisController {
      next(error);
    }
  }

  /**
   * GET /api/mytischtennis/login-page
   * Proxy for the login page (for the iframe)
   * Loads the login page from mytischtennis.de and modifies it so that form submissions go through our proxy
   * Authentication is optional - the token can be passed as a query parameter
   */
  async getLoginPage(req, res, next) {
    try {
      // Try to get the userId from the token (optional)
      let userId = null;
      const token = req.query.token || req.headers['authorization']?.split(' ')[1] || req.headers['authcode'];
      if (token) {
        try {
          const jwt = (await import('jsonwebtoken')).default;
          const decoded = jwt.verify(token, process.env.JWT_SECRET);
          userId = decoded.userId;
        } catch (err) {
          // Invalid token - ignore
        }
      }

      // Store the userId on the request for submitLogin
      req.userId = userId;

      // Load the login page from mytischtennis.de
      const response = await axios.get(`${MYTT_ORIGIN}/login?next=%2F`, {
        headers: {
          'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36',
          'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8',
          'Accept-Language': 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7'
        },
        maxRedirects: 5,
        validateStatus: () => true // Accept all status codes
      });

      // Forward cookies from the response
      const setCookieHeaders = response.headers['set-cookie'];
      if (setCookieHeaders) {
        res.setHeader('Set-Cookie', setCookieHeaders);
      }

      // Modify the HTML: point the form action at our proxy
      let html = response.data;
      if (typeof html === 'string') {
        // Add the token as a hidden input so submitLogin receives the userId
        const tokenInput = userId ? `<input type="hidden" name="__token" value="${token}" />` : '';

        // Replace form action URLs and add the token input
        html = html.replace(
          /(<form[^>]*action="[^"]*\/login[^"]*"[^>]*>)/g,
          `$1${tokenInput}`
        );
        html = html.replace(
          /action="([^"]*\/login[^"]*)"/g,
          'action="/api/mytischtennis/login-submit"'
        );
        // Also replace relative URLs
        html = html.replace(
          /action="\/login/g,
          'action="/api/mytischtennis/login-submit'
        );
        html = rewriteMytischtennisContent(html);

        // MyTischtennis boots a large React app that frequently crashes with runtime
        // errors in the proxy context ("Da ist etwas schiefgelaufen"). For the iframe
        // login the server-side rendered form is enough, so remove the bootstrap scripts.
        html = html.replace(/<script\b[^>]*type=(?:"|')module(?:"|')[^>]*>[\s\S]*?<\/script>/gi, '');
        html = html.replace(/<script\b[^>]*src=(?:"|')[^"']*\/build\/[^"']*(?:"|')[^>]*>\s*<\/script>/gi, '');
        html = html.replace(/<link\b[^>]*rel=(?:"|')modulepreload(?:"|')[^>]*>/gi, '');
      }

      // Set the Content-Type
      res.setHeader('Content-Type', response.headers['content-type'] || 'text/html; charset=utf-8');

      // Send the modified HTML content
      res.status(response.status).send(html);
    } catch (error) {
      console.error('Fehler beim Laden der Login-Seite:', error);
      next(error);
    }
  }

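The script-stripping regexes at the end of `getLoginPage` can be sanity-checked in isolation; a minimal sketch with a hypothetical HTML snippet:

```javascript
// Hypothetical sample page: the module-script bootstrap should be removed,
// while the surrounding server-rendered markup stays intact.
const html = '<p>ok</p><script type="module" src="/build/entry.js">x</script><p>end</p>';
const stripped = html.replace(/<script\b[^>]*type=(?:"|')module(?:"|')[^>]*>[\s\S]*?<\/script>/gi, '');
// stripped → '<p>ok</p><p>end</p>'
```

The lazy `[\s\S]*?` is what keeps the match from spanning to the last `</script>` on a page with several script tags.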
  /**
   * GET /api/mytischtennis/proxy/*
   * Same-origin proxy for mytischtennis build/font/captcha resources
   */
  async proxyRemote(req, res, next) {
    try {
      const proxyPath = req.params[0] || '';
      const queryString = new URLSearchParams(req.query || {}).toString();
      const targetUrl = `${MYTT_ORIGIN}/${proxyPath}${queryString ? `?${queryString}` : ''}`;

      const upstream = await axios.get(targetUrl, {
        responseType: 'arraybuffer',
        headers: {
          'User-Agent': req.headers['user-agent'] || 'Mozilla/5.0',
          'Accept': req.headers.accept || '*/*',
          'Accept-Language': req.headers['accept-language'] || 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7',
          ...(req.headers.cookie ? { 'Cookie': req.headers.cookie } : {})
        },
        validateStatus: () => true
      });

      // Pass through the important headers
      const passthroughHeaders = ['content-type', 'cache-control', 'etag', 'last-modified', 'expires'];
      for (const headerName of passthroughHeaders) {
        const value = upstream.headers[headerName];
        if (value) {
          res.setHeader(headerName, value);
        }
      }
      if (upstream.headers['set-cookie']) {
        res.setHeader('Set-Cookie', upstream.headers['set-cookie']);
      }

      const contentType = String(upstream.headers['content-type'] || '').toLowerCase();
      const isTextLike = /(text\/|javascript|json|xml|svg)/.test(contentType);

      if (isTextLike) {
        const asText = Buffer.from(upstream.data).toString('utf-8');
        const rewritten = rewriteMytischtennisContent(asText);
        return res.status(upstream.status).send(rewritten);
      }

      return res.status(upstream.status).send(upstream.data);
    } catch (error) {
      console.error('Fehler beim Proxy von mytischtennis-Ressourcen:', error.message);
      next(error);
    }
  }

  /**
   * POST /api/mytischtennis/login-submit
   * Proxy for the login form submission
   * Forwards the login request so the cookies stay in the backend context
   * Authentication is optional - the iframe cannot send a token
   */
  async submitLogin(req, res, next) {
    try {
      // Try to get the userId from the token (query parameter or hidden input)
      let userId = null;
      const token = req.query.token || req.body.__token || req.headers['authorization']?.split(' ')[1] || req.headers['authcode'];
      if (token) {
        try {
          const jwt = (await import('jsonwebtoken')).default;
          const decoded = jwt.verify(token, process.env.JWT_SECRET);
          userId = decoded.userId;
        } catch (err) {
          // Invalid token - ignore
        }
      }

      // Remove __token from req.body so it is not sent to mytischtennis.de
      if (req.body.__token) {
        delete req.body.__token;
      }

      // Get the cookies from the request (also needed for the CAPTCHA fallback)
      const cookies = req.headers.cookie || '';

      // Normalize the payload
      const payload = { ...(req.body || {}) };
      const mask = (v) => (typeof v === 'string' && v.length > 12 ? `${v.slice(0, 12)}...(${v.length})` : v);

      // If the captcha was not set in the browser context, try a server-side fallback
      if (!payload.captcha) {
        try {
          const loginPageResponse = await axios.get('https://www.mytischtennis.de/login?next=%2F', {
            headers: {
              'Cookie': cookies,
              'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
              'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
              'Accept-Language': 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7',
              'Referer': 'https://www.mytischtennis.de/'
            },
            validateStatus: () => true
          });

          const html = typeof loginPageResponse.data === 'string' ? loginPageResponse.data : '';
          const siteKeyMatch = html.match(/data-sitekey=(?:"([^"]+)"|'([^']+)')/i);
          const puzzleEndpointMatch = html.match(/data-puzzle-endpoint=(?:"([^"]+)"|'([^']+)')/i);
          const siteKey = siteKeyMatch ? (siteKeyMatch[1] || siteKeyMatch[2]) : null;
          const puzzleEndpoint = puzzleEndpointMatch ? (puzzleEndpointMatch[1] || puzzleEndpointMatch[2]) : null;

          if (siteKey && puzzleEndpoint) {
            const puzzleResponse = await axios.get(`${puzzleEndpoint}?sitekey=${encodeURIComponent(siteKey)}`, {
              headers: {
                'Cookie': cookies,
                'Accept': '*/*',
                'Origin': 'https://www.mytischtennis.de',
                'Referer': 'https://www.mytischtennis.de/'
              },
              validateStatus: () => true
            });

            if (puzzleResponse.status === 200 && typeof puzzleResponse.data === 'string' && puzzleResponse.data.trim()) {
              payload.captcha = puzzleResponse.data.trim();
              payload.captcha_clicked = 'true';
            }
          }
        } catch (captchaFallbackError) {
          console.warn('[submitLogin] CAPTCHA-Fallback fehlgeschlagen:', captchaFallbackError.message);
        }
      }

      // If a captcha is present, mark it as confirmed
      if (payload.captcha && !payload.captcha_clicked) {
        payload.captcha_clicked = 'true';
      }

      console.log('[submitLogin] Incoming payload fields:', {
        keys: Object.keys(payload),
        hasEmail: !!payload.email,
        hasPassword: !!payload.password,
        xsrf: mask(payload.xsrf),
        captchaClicked: payload.captcha_clicked,
        captcha: mask(payload.captcha)
      });

      // Serialize the form data cleanly as x-www-form-urlencoded
      const formData = new URLSearchParams();
      for (const [key, value] of Object.entries(payload)) {
        if (value !== undefined && value !== null) {
          formData.append(key, String(value));
        }
      }

      // Forward the login request to mytischtennis.de
      const response = await axios.post(
        'https://www.mytischtennis.de/login?next=%2F&_data=routes%2F_auth%2B%2Flogin',
        formData.toString(),
        {
          headers: {
            'Cookie': cookies,
            'Content-Type': 'application/x-www-form-urlencoded',
            'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
            'Accept': '*/*',
            'Referer': 'https://www.mytischtennis.de/login?next=%2F'
          },
          maxRedirects: 0,
          validateStatus: () => true
        }
      );

      console.log('[submitLogin] Upstream response:', {
        status: response.status,
        hasSetCookie: Array.isArray(response.headers['set-cookie']) && response.headers['set-cookie'].length > 0,
        bodyPreview: typeof response.data === 'string'
          ? response.data.slice(0, 220)
          : JSON.stringify(response.data || {}).slice(0, 220)
      });

      // If the CAPTCHA confirmation fails in the proxy flow:
      // fall back to a real browser login (Playwright), then store the session directly.
      const upstreamBody = typeof response.data === 'string' ? response.data : JSON.stringify(response.data || {});
      const isCaptchaFailure = response.status === 400
        && (upstreamBody.includes('Captcha-Bestätigung fehlgeschlagen') || upstreamBody.includes('Captcha-Bestätigung ist erforderlich'));

      if (isCaptchaFailure && userId && payload.email && payload.password) {
        console.log('[submitLogin] CAPTCHA-Fehler erkannt, starte Playwright-Fallback...');
        const browserLogin = await myTischtennisClient.loginWithBrowserAutomation(payload.email, payload.password);

        if (browserLogin.success && browserLogin.cookie) {
          await this.saveSessionFromCookie(userId, browserLogin.cookie);
          return res.status(200).send(
            '<!doctype html><html><body><p>Login erfolgreich. Fenster kann geschlossen werden.</p></body></html>'
          );
        }

        console.warn('[submitLogin] Playwright-Fallback fehlgeschlagen:', browserLogin.error);
      }

      // Forward cookies from the response
      const setCookieHeaders = response.headers['set-cookie'];
      if (setCookieHeaders) {
        res.setHeader('Set-Cookie', setCookieHeaders);
      }

      // Set other relevant headers
      if (response.headers['content-type']) {
        res.setHeader('Content-Type', response.headers['content-type']);
      }
      if (response.headers['location']) {
        res.setHeader('Location', response.headers['location']);
      }

      // Check whether the login succeeded (by inspecting the cookies)
      const authCookie = setCookieHeaders?.find(cookie => cookie.startsWith('sb-10-auth-token='));
|
||||
if (authCookie && userId) {
|
||||
// Login erfolgreich - speichere Session (nur wenn userId vorhanden)
|
||||
await this.saveSessionFromCookie(userId, authCookie);
|
||||
}
|
||||
|
||||
// Sende Response weiter
|
||||
res.status(response.status).send(response.data);
|
||||
} catch (error) {
|
||||
console.error('Fehler beim Login-Submit:', error);
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Speichere Session-Daten aus Cookie
|
||||
*/
|
||||
async saveSessionFromCookie(userId, cookieString) {
|
||||
try {
|
||||
const tokenMatch = cookieString.match(/sb-10-auth-token=base64-([^;]+)/);
|
||||
if (!tokenMatch) {
|
||||
throw new Error('Token-Format ungültig');
|
||||
}
|
||||
|
||||
const base64Token = tokenMatch[1];
|
||||
const decodedToken = Buffer.from(base64Token, 'base64').toString('utf-8');
|
||||
const tokenData = JSON.parse(decodedToken);
|
||||
|
||||
const MyTischtennis = (await import('../models/MyTischtennis.js')).default;
|
||||
const myTischtennisAccount = await MyTischtennis.findOne({ where: { userId } });
|
||||
|
||||
if (myTischtennisAccount) {
|
||||
myTischtennisAccount.accessToken = tokenData.access_token;
|
||||
myTischtennisAccount.refreshToken = tokenData.refresh_token;
|
||||
myTischtennisAccount.expiresAt = tokenData.expires_at;
|
||||
myTischtennisAccount.cookie = cookieString.split(';')[0].trim();
|
||||
myTischtennisAccount.userData = tokenData.user;
|
||||
myTischtennisAccount.lastLoginSuccess = new Date();
|
||||
myTischtennisAccount.lastLoginAttempt = new Date();
|
||||
|
||||
// Hole Club-Informationen
|
||||
const myTischtennisClient = (await import('../clients/myTischtennisClient.js')).default;
|
||||
const profileResult = await myTischtennisClient.getUserProfile(myTischtennisAccount.cookie);
|
||||
if (profileResult.success) {
|
||||
myTischtennisAccount.clubId = profileResult.clubId;
|
||||
myTischtennisAccount.clubName = profileResult.clubName;
|
||||
myTischtennisAccount.fedNickname = profileResult.fedNickname;
|
||||
}
|
||||
|
||||
await myTischtennisAccount.save();
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Fehler beim Speichern der Session:', error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* POST /api/mytischtennis/extract-session
|
||||
* Extrahiere Session nach Login im iframe
|
||||
* Versucht, die Session-Daten aus den Cookies zu extrahieren
|
||||
* Authentifizierung ist optional - iframe kann keinen Token mitsenden
|
||||
*/
|
||||
async extractSession(req, res, next) {
|
||||
try {
|
||||
// Versuche, userId aus Token zu bekommen (optional)
|
||||
let userId = req.user?.id;
|
||||
|
||||
// Falls kein Token vorhanden, versuche userId aus Account zu bekommen (falls E-Mail bekannt)
|
||||
if (!userId) {
|
||||
// Kann nicht ohne Authentifizierung arbeiten - Session kann nicht gespeichert werden
|
||||
return res.status(401).json({
|
||||
error: 'Authentifizierung erforderlich zum Speichern der Session'
|
||||
});
|
||||
}
|
||||
|
||||
// Hole die Cookies aus dem Request
|
||||
const cookies = req.headers.cookie || '';
|
||||
|
||||
// Versuche, die Session zu verifizieren, indem wir einen Request mit den Cookies machen
|
||||
const response = await axios.get('https://www.mytischtennis.de/?_data=root', {
|
||||
headers: {
|
||||
'Cookie': cookies,
|
||||
'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
|
||||
'Accept': 'application/json'
|
||||
},
|
||||
validateStatus: () => true
|
||||
});
|
||||
|
||||
// Prüfe, ob wir eingeloggt sind (durch Prüfung der Response)
|
||||
if (response.status === 200 && response.data?.userProfile) {
|
||||
// Session erfolgreich - speichere die Daten
|
||||
const account = await myTischtennisService.getAccount(userId);
|
||||
if (!account) {
|
||||
throw new HttpError('Kein myTischtennis-Account verknüpft', 404);
|
||||
}
|
||||
|
||||
// Extrahiere Cookie-String
|
||||
const cookieString = cookies.split(';').find(c => c.trim().startsWith('sb-10-auth-token='));
|
||||
if (!cookieString) {
|
||||
throw new HttpError('Kein Auth-Token in Cookies gefunden', 400);
|
||||
}
|
||||
|
||||
// Parse Token aus Cookie
|
||||
const tokenMatch = cookieString.match(/sb-10-auth-token=base64-([^;]+)/);
|
||||
if (!tokenMatch) {
|
||||
throw new HttpError('Token-Format ungültig', 400);
|
||||
}
|
||||
|
||||
const base64Token = tokenMatch[1];
|
||||
const decodedToken = Buffer.from(base64Token, 'base64').toString('utf-8');
|
||||
const tokenData = JSON.parse(decodedToken);
|
||||
|
||||
// Aktualisiere Account mit Session-Daten
|
||||
const MyTischtennis = (await import('../models/MyTischtennis.js')).default;
|
||||
const myTischtennisAccount = await MyTischtennis.findOne({ where: { userId } });
|
||||
|
||||
if (myTischtennisAccount) {
|
||||
myTischtennisAccount.accessToken = tokenData.access_token;
|
||||
myTischtennisAccount.refreshToken = tokenData.refresh_token;
|
||||
myTischtennisAccount.expiresAt = tokenData.expires_at;
|
||||
myTischtennisAccount.cookie = cookieString.trim();
|
||||
myTischtennisAccount.userData = tokenData.user;
|
||||
myTischtennisAccount.lastLoginSuccess = new Date();
|
||||
myTischtennisAccount.lastLoginAttempt = new Date();
|
||||
|
||||
// Hole Club-Informationen
|
||||
const myTischtennisClient = (await import('../clients/myTischtennisClient.js')).default;
|
||||
const profileResult = await myTischtennisClient.getUserProfile(cookieString.trim());
|
||||
if (profileResult.success) {
|
||||
myTischtennisAccount.clubId = profileResult.clubId;
|
||||
myTischtennisAccount.clubName = profileResult.clubName;
|
||||
myTischtennisAccount.fedNickname = profileResult.fedNickname;
|
||||
}
|
||||
|
||||
await myTischtennisAccount.save();
|
||||
}
|
||||
|
||||
res.status(200).json({
|
||||
success: true,
|
||||
message: 'Session erfolgreich extrahiert und gespeichert'
|
||||
});
|
||||
} else {
|
||||
throw new HttpError('Nicht eingeloggt oder Session ungültig', 401);
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Fehler beim Extrahieren der Session:', error);
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
export default new MyTischtennisController();
|
||||
|
||||
@@ -233,9 +233,11 @@ export const listOfficialTournaments = async (req, res) => {
     const { clubId } = req.params;
     await checkAccess(userToken, clubId);
     const list = await OfficialTournament.findAll({ where: { clubId } });
-    res.status(200).json(list);
+    res.status(200).json(Array.isArray(list) ? list : []);
   } catch (e) {
-    res.status(500).json({ error: 'Failed to list tournaments' });
+    console.error('[listOfficialTournaments] Error:', e);
+    const errorMessage = e.message || 'Failed to list tournaments';
+    res.status(e.statusCode || 500).json({ error: errorMessage });
   }
 };

@@ -1,6 +1,7 @@
 import Participant from '../models/Participant.js';

 import DiaryDates from '../models/DiaryDates.js';
 import { devLog } from '../utils/logger.js';
+import { emitParticipantAdded, emitParticipantRemoved, emitParticipantUpdated } from '../services/socketService.js';
 export const getParticipants = async (req, res) => {
   try {
     const { dateId } = req.params;
@@ -24,7 +25,12 @@ export const updateParticipantGroup = async (req, res) => {
       where: {
         diaryDateId: dateId,
         memberId: memberId
-      }
+      },
+      include: [{
+        model: DiaryDates,
+        as: 'diaryDate',
+        attributes: ['clubId']
+      }]
     });

     if (!participant) {
@@ -34,7 +40,25 @@ export const updateParticipantGroup = async (req, res) => {
     participant.groupId = groupId || null;
     await participant.save();

-    res.status(200).json(participant);
+    // Reload the participant from the DB to make sure we have the current value
+    const updatedParticipant = await Participant.findOne({
+      where: {
+        diaryDateId: dateId,
+        memberId: memberId
+      },
+      include: [{
+        model: DiaryDates,
+        as: 'diaryDate',
+        attributes: ['clubId']
+      }]
+    });
+
+    // Emit a socket event with the updated participant
+    if (updatedParticipant?.diaryDate?.clubId) {
+      emitParticipantUpdated(updatedParticipant.diaryDate.clubId, dateId, updatedParticipant);
+    }
+
+    res.status(200).json(updatedParticipant || participant);
   } catch (error) {
     devLog(error);
     res.status(500).json({ error: 'Fehler beim Aktualisieren der Teilnehmer-Gruppenzuordnung' });
@@ -45,6 +69,13 @@ export const addParticipant = async (req, res) => {
   try {
     const { diaryDateId, memberId } = req.body;
     const participant = await Participant.create({ diaryDateId, memberId });
+
+    // Fetch the DiaryDate for its clubId
+    const diaryDate = await DiaryDates.findByPk(diaryDateId);
+    if (diaryDate?.clubId) {
+      emitParticipantAdded(diaryDate.clubId, diaryDateId, participant);
+    }
+
     res.status(201).json(participant);
   } catch (error) {
     devLog(error);
@@ -55,7 +86,18 @@ export const addParticipant = async (req, res) => {
 export const removeParticipant = async (req, res) => {
   try {
     const { diaryDateId, memberId } = req.body;
+
+    // Fetch the DiaryDate for its clubId before deleting
+    const diaryDate = await DiaryDates.findByPk(diaryDateId);
+    const clubId = diaryDate?.clubId;
+
     await Participant.destroy({ where: { diaryDateId, memberId } });
+
+    // Emit socket event
+    if (clubId) {
+      emitParticipantRemoved(clubId, diaryDateId, memberId);
+    }
+
     res.status(200).json({ message: 'Teilnehmer entfernt' });
   } catch (error) {
     devLog(error);

@@ -1,15 +1,60 @@
|
||||
// controllers/tournamentController.js
|
||||
import tournamentService from "../services/tournamentService.js";
|
||||
import { emitTournamentChanged } from '../services/socketService.js';
|
||||
import TournamentClass from '../models/TournamentClass.js';
|
||||
import HttpError from '../exceptions/HttpError.js';
|
||||
|
||||
// 1. Alle Turniere eines Vereins
|
||||
// Pools (zusammengelegte Gruppenphasen)
|
||||
export const mergeClassesIntoPool = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId, sourceClassId, targetClassId, strategy, outOfCompetitionForSource } = req.body;
|
||||
try {
|
||||
await tournamentService.mergeClassesIntoPool(
|
||||
token,
|
||||
clubId,
|
||||
tournamentId,
|
||||
sourceClassId,
|
||||
targetClassId,
|
||||
strategy, // 'singleGroup' | 'distribute'
|
||||
!!outOfCompetitionForSource
|
||||
);
|
||||
// Broadcast
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json({ success: true });
|
||||
} catch (error) {
|
||||
console.error('[mergeClassesIntoPool] Error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
};
|
||||
|
||||
export const resetPool = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId, poolId } = req.body;
|
||||
try {
|
||||
await tournamentService.resetPool(token, clubId, tournamentId, poolId);
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json({ success: true });
|
||||
} catch (error) {
|
||||
console.error('[resetPool] Error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
};
|
||||
|
||||
// 1. Alle Turniere eines Vereins (query: type = 'internal' | 'external' | 'mini')
|
||||
export const getTournaments = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId } = req.params;
|
||||
const type = req.query.type || null;
|
||||
try {
|
||||
const tournaments = await tournamentService.getTournaments(token, clubId);
|
||||
const tournaments = await tournamentService.getTournaments(token, clubId, type);
|
||||
res.status(200).json(tournaments);
|
||||
} catch (error) {
|
||||
console.error(error);
|
||||
if (error instanceof HttpError) {
|
||||
res.set('x-debug-tournament-clubid', String(clubId));
|
||||
res.set('x-debug-tournament-clubid-num', String(Number(clubId)));
|
||||
return res.status(error.statusCode || 500).json({ error: error.message });
|
||||
}
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
};
|
||||
@@ -17,36 +62,81 @@ export const getTournaments = async (req, res) => {
|
||||
// 2. Neues Turnier anlegen
|
||||
export const addTournament = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentName, date } = req.body;
|
||||
const { clubId, tournamentName, date, winningSets, allowsExternal } = req.body;
|
||||
try {
|
||||
const tournament = await tournamentService.addTournament(token, clubId, tournamentName, date);
|
||||
const tournament = await tournamentService.addTournament(token, clubId, tournamentName, date, winningSets, allowsExternal);
|
||||
if (clubId && tournament && tournament.id) {
|
||||
emitTournamentChanged(clubId, tournament.id);
|
||||
}
|
||||
res.status(201).json(tournament);
|
||||
} catch (error) {
|
||||
console.error(error);
|
||||
console.error('[addTournament] Error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
};
|
||||
|
||||
// 3. Teilnehmer hinzufügen
|
||||
// Minimeisterschaft anlegen (Turnier + 6 Klassen); Name: "Minimeisterschaften <Jahr> Ortsentscheid <ort>"
|
||||
export const addMiniChampionship = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, ort, date, year, winningSets } = req.body;
|
||||
try {
|
||||
const tournament = await tournamentService.addMiniChampionship(token, clubId, ort, date, year, winningSets);
|
||||
if (clubId && tournament && tournament.id) {
|
||||
emitTournamentChanged(clubId, tournament.id);
|
||||
}
|
||||
res.status(201).json(tournament);
|
||||
} catch (error) {
|
||||
console.error('[addMiniChampionship] Error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
};
|
||||
|
||||
// 3. Teilnehmer hinzufügen - klassengebunden
|
||||
export const addParticipant = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId, participant: participantId } = req.body;
|
||||
const { clubId, classId, participant: participantId, tournamentId } = req.body;
|
||||
try {
|
||||
await tournamentService.addParticipant(token, clubId, tournamentId, participantId);
|
||||
const participants = await tournamentService.getParticipants(token, clubId, tournamentId);
|
||||
// Payloads:
|
||||
// - Mit Klasse (klassengebunden): { clubId, classId, participant }
|
||||
// - Ohne Klasse (turnierweit): { clubId, tournamentId, participant, classId: null }
|
||||
if (!participantId) {
|
||||
return res.status(400).json({ error: 'Teilnehmer-ID ist erforderlich' });
|
||||
}
|
||||
// Allow adding a participant either to a specific class (classId) or to the whole tournament (no class)
|
||||
if (!classId && !tournamentId) {
|
||||
return res.status(400).json({ error: 'Klasse oder tournamentId ist erforderlich' });
|
||||
}
|
||||
|
||||
// Pass through to service. If classId is present it will be used, otherwise the service should add the participant with classId = null for the given tournamentId
|
||||
await tournamentService.addParticipant(token, clubId, classId || null, participantId, tournamentId || null);
|
||||
|
||||
// Determine tournamentId for response and event emission
|
||||
let respTournamentId = tournamentId;
|
||||
if (classId && !respTournamentId) {
|
||||
const tournamentClass = await TournamentClass.findByPk(classId);
|
||||
if (!tournamentClass) {
|
||||
return res.status(404).json({ error: 'Klasse nicht gefunden' });
|
||||
}
|
||||
respTournamentId = tournamentClass.tournamentId;
|
||||
}
|
||||
|
||||
// Fetch updated participants for the (optional) class or whole tournament
|
||||
const participants = await tournamentService.getParticipants(token, clubId, respTournamentId, classId || null);
|
||||
// Emit Socket-Event
|
||||
if (respTournamentId) emitTournamentChanged(clubId, respTournamentId);
|
||||
res.status(200).json(participants);
|
||||
} catch (error) {
|
||||
console.error(error);
|
||||
console.error('[addParticipant] Error:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
};
|
||||
|
||||
// 4. Teilnehmerliste abrufen
|
||||
// 4. Teilnehmerliste abrufen - nach Klasse oder Turnier
|
||||
export const getParticipants = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId } = req.body;
|
||||
const { clubId, tournamentId, classId } = req.body;
|
||||
try {
|
||||
const participants = await tournamentService.getParticipants(token, clubId, tournamentId);
|
||||
const participants = await tournamentService.getParticipants(token, clubId, tournamentId, classId || null);
|
||||
res.status(200).json(participants);
|
||||
} catch (error) {
|
||||
console.error(error);
|
||||
@@ -60,6 +150,8 @@ export const setModus = async (req, res) => {
|
||||
const { clubId, tournamentId, type, numberOfGroups, advancingPerGroup } = req.body;
|
||||
try {
|
||||
await tournamentService.setModus(token, clubId, tournamentId, type, numberOfGroups, advancingPerGroup);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.sendStatus(204);
|
||||
} catch (error) {
|
||||
console.error(error);
|
||||
@@ -70,9 +162,48 @@ export const setModus = async (req, res) => {
|
||||
// 6. Gruppen-Strukturen anlegen (leere Gruppen)
|
||||
export const createGroups = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId } = req.body;
|
||||
const { clubId, tournamentId, numberOfGroups } = req.body;
|
||||
try {
|
||||
await tournamentService.createGroups(token, clubId, tournamentId);
|
||||
// DEBUG: Eingehende Daten sichtbar machen (temporär)
|
||||
console.log('[tournamentController.createGroups] body:', req.body);
|
||||
console.log('[tournamentController.createGroups] types:', {
|
||||
clubId: typeof clubId,
|
||||
tournamentId: typeof tournamentId,
|
||||
numberOfGroups: typeof numberOfGroups,
|
||||
});
|
||||
|
||||
// Turniere ohne Klassen: `numberOfGroups: 0` kommt aus der UI (Default) vor.
|
||||
// Statt „nichts passiert“ normalisieren wir auf mindestens 1 Gruppe.
|
||||
let normalizedNumberOfGroups = numberOfGroups;
|
||||
if (normalizedNumberOfGroups !== undefined && normalizedNumberOfGroups !== null) {
|
||||
const n = Number(normalizedNumberOfGroups);
|
||||
console.log('[tournamentController.createGroups] parsed numberOfGroups:', n);
|
||||
if (!Number.isFinite(n) || !Number.isInteger(n) || n < 0) {
|
||||
return res.status(400).json({ error: 'numberOfGroups muss eine ganze Zahl >= 0 sein' });
|
||||
}
|
||||
normalizedNumberOfGroups = Math.max(1, n);
|
||||
}
|
||||
|
||||
console.log('[tournamentController.createGroups] normalizedNumberOfGroups:', normalizedNumberOfGroups);
|
||||
|
||||
await tournamentService.createGroups(token, clubId, tournamentId, normalizedNumberOfGroups);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.sendStatus(204);
|
||||
} catch (error) {
|
||||
console.error(error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
};
|
||||
|
||||
// 6b. Gruppen-Strukturen pro Klasse anlegen
|
||||
export const createGroupsPerClass = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId, groupsPerClass } = req.body;
|
||||
try {
|
||||
await tournamentService.createGroupsPerClass(token, clubId, tournamentId, groupsPerClass);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.sendStatus(204);
|
||||
} catch (error) {
|
||||
console.error(error);
|
||||
@@ -86,6 +217,8 @@ export const fillGroups = async (req, res) => {
|
||||
const { clubId, tournamentId } = req.body;
|
||||
try {
|
||||
const updatedMembers = await tournamentService.fillGroups(token, clubId, tournamentId);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json(updatedMembers);
|
||||
} catch (error) {
|
||||
console.error(error);
|
||||
@@ -93,6 +226,21 @@ export const fillGroups = async (req, res) => {
|
||||
}
|
||||
};
|
||||
|
||||
// 7b. Gruppenspiele erstellen ohne Gruppenzuordnungen zu ändern
|
||||
export const createGroupMatches = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId, classId } = req.body;
|
||||
try {
|
||||
await tournamentService.createGroupMatches(token, clubId, tournamentId, classId);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.sendStatus(204);
|
||||
} catch (error) {
|
||||
console.error(error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
};
|
||||
|
||||
// 8. Gruppen mit ihren Teilnehmern abfragen
|
||||
export const getGroups = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
@@ -119,6 +267,23 @@ export const getTournament = async (req, res) => {
|
||||
}
|
||||
};
|
||||
|
||||
// Update Turnier
|
||||
export const updateTournament = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId } = req.params;
|
||||
const { name, date, winningSets, numberOfTables } = req.body;
|
||||
try {
|
||||
const tournament = await tournamentService.updateTournament(token, clubId, tournamentId, name, date, winningSets, numberOfTables);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json(tournament);
|
||||
} catch (error) {
|
||||
console.error('[updateTournament] Error:', error);
|
||||
const status = error.message.includes('existiert bereits') ? 400 : 500;
|
||||
res.status(status).json({ error: error.message });
|
||||
}
|
||||
};
|
||||
|
||||
// 10. Alle Spiele eines Turniers abfragen
|
||||
export const getTournamentMatches = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
@@ -138,6 +303,8 @@ export const addMatchResult = async (req, res) => {
|
||||
const { clubId, tournamentId, matchId, set, result } = req.body;
|
||||
try {
|
||||
await tournamentService.addMatchResult(token, clubId, tournamentId, matchId, set, result);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json({ message: "Result added successfully" });
|
||||
} catch (error) {
|
||||
console.error(error);
|
||||
@@ -151,6 +318,8 @@ export const finishMatch = async (req, res) => {
|
||||
const { clubId, tournamentId, matchId } = req.body;
|
||||
try {
|
||||
await tournamentService.finishMatch(token, clubId, tournamentId, matchId);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json({ message: "Match finished successfully" });
|
||||
} catch (error) {
|
||||
console.error(error);
|
||||
@@ -164,6 +333,8 @@ export const startKnockout = async (req, res) => {
|
||||
|
||||
try {
|
||||
await tournamentService.startKnockout(token, clubId, tournamentId);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json({ message: "K.o.-Runde erfolgreich gestartet" });
|
||||
} catch (error) {
|
||||
const status = /Gruppenmodus|Zu wenige Qualifikanten/.test(error.message) ? 400 : 500;
|
||||
@@ -190,6 +361,8 @@ export const manualAssignGroups = async (req, res) => {
|
||||
numberOfGroups, // neu
|
||||
maxGroupSize // neu
|
||||
);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json(groupsWithParts);
|
||||
} catch (error) {
|
||||
console.error('Error in manualAssignGroups:', error);
|
||||
@@ -197,11 +370,35 @@ export const manualAssignGroups = async (req, res) => {
|
||||
}
|
||||
};
|
||||
|
||||
export const assignParticipantToGroup = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId, participantId, groupNumber, isExternal } = req.body;
|
||||
|
||||
try {
|
||||
const groups = await tournamentService.assignParticipantToGroup(
|
||||
token,
|
||||
clubId,
|
||||
tournamentId,
|
||||
participantId,
|
||||
groupNumber,
|
||||
isExternal || false
|
||||
);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json(groups);
|
||||
} catch (error) {
|
||||
console.error('Error in assignParticipantToGroup:', error);
|
||||
res.status(500).json({ error: error.message });
|
||||
}
|
||||
};
|
||||
|
||||
export const resetGroups = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId } = req.body;
|
||||
try {
|
||||
await tournamentService.resetGroups(token, clubId, tournamentId);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.sendStatus(204);
|
||||
} catch (err) {
|
||||
console.error(err);
|
||||
@@ -210,11 +407,26 @@ export const resetGroups = async (req, res) => {
|
||||
};
|
||||
|
||||
export const resetMatches = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId, classId } = req.body;
|
||||
try {
|
||||
await tournamentService.resetMatches(token, clubId, tournamentId, classId || null);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.sendStatus(204);
|
||||
} catch (err) {
|
||||
console.error(err);
|
||||
res.status(500).json({ error: err.message });
|
||||
}
|
||||
};
|
||||
|
||||
export const cleanupOrphanedMatches = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId } = req.body;
|
||||
try {
|
||||
await tournamentService.resetMatches(token, clubId, tournamentId);
|
||||
res.sendStatus(204);
|
||||
const result = await tournamentService.cleanupOrphanedMatches(token, clubId, tournamentId);
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json(result);
|
||||
} catch (err) {
|
||||
console.error(err);
|
||||
res.status(500).json({ error: err.message });
|
||||
@@ -227,6 +439,8 @@ export const removeParticipant = async (req, res) => {
|
||||
try {
|
||||
await tournamentService.removeParticipant(token, clubId, tournamentId, participantId);
|
||||
const participants = await tournamentService.getParticipants(token, clubId, tournamentId);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json(participants);
|
||||
} catch (err) {
|
||||
console.error(err);
|
||||
@@ -234,6 +448,35 @@ export const removeParticipant = async (req, res) => {
|
||||
}
|
||||
};
|
||||
|
||||
export const updateParticipantSeeded = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId, participantId } = req.params;
|
||||
const { seeded } = req.body;
|
||||
try {
|
||||
await tournamentService.updateParticipantSeeded(token, clubId, tournamentId, participantId, seeded);
|
||||
// Emit Socket-Event
|
||||
emitTournamentChanged(clubId, tournamentId);
|
||||
res.status(200).json({ message: 'Gesetzt-Status aktualisiert' });
|
||||
} catch (err) {
|
||||
console.error('[updateParticipantSeeded] Error:', err);
|
||||
res.status(500).json({ error: err.message });
|
||||
}
|
||||
};
|
||||
|
||||
export const setParticipantGaveUp = async (req, res) => {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubId, tournamentId, participantId } = req.params;
|
||||
const { gaveUp } = req.body;
|
||||
try {
|
||||
    await tournamentService.setParticipantGaveUp(token, clubId, tournamentId, participantId, gaveUp);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: 'Aufgabe-Status aktualisiert' });
  } catch (err) {
    console.error('[setParticipantGaveUp] Error:', err);
    res.status(500).json({ error: err.message });
  }
};

export const deleteMatchResult = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, matchId, set } = req.body;
@@ -245,6 +488,8 @@ export const deleteMatchResult = async (req, res) => {
      matchId,
      set
    );
    // Emit socket event
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: 'Einzelsatz gelöscht' });
  } catch (error) {
    console.error('Error in deleteMatchResult:', error);
@@ -258,6 +503,8 @@ export const reopenMatch = async (req, res) => {
  const { clubId, tournamentId, matchId } = req.body;
  try {
    await tournamentService.reopenMatch(token, clubId, tournamentId, matchId);
    // Emit socket event
    emitTournamentChanged(clubId, tournamentId);
    // Optionally return the updated match
    res.status(200).json({ message: "Match reopened" });
  } catch (error) {
@@ -268,13 +515,237 @@ export const reopenMatch = async (req, res) => {

export const deleteKnockoutMatches = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId } = req.body;
  const { clubId, tournamentId, classId } = req.body;
  try {
    await tournamentService.resetKnockout(token, clubId, tournamentId);
    await tournamentService.resetKnockout(token, clubId, tournamentId, classId);
    // Emit socket event
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: "K.o.-Runde gelöscht" });
  } catch (error) {
    console.error("Error in deleteKnockoutMatches:", error);
    res.status(500).json({ error: error.message });
  }
};
export const setMatchActive = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, matchId } = req.params;
  const { isActive } = req.body;
  try {
    await tournamentService.setMatchActive(token, clubId, tournamentId, matchId, isActive);
    // Emit socket event
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: 'Match-Status aktualisiert' });
  } catch (err) {
    console.error('[setMatchActive] Error:', err);
    res.status(500).json({ error: err.message });
  }
};

export const setMatchTableNumber = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, matchId } = req.params;
  const { tableNumber } = req.body;
  try {
    await tournamentService.setMatchTableNumber(token, clubId, tournamentId, matchId, tableNumber);
    // Emit socket event
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: 'Tischnummer aktualisiert' });
  } catch (err) {
    console.error('[setMatchTableNumber] Error:', err);
    res.status(500).json({ error: err.message });
  }
};

// Add external participants
export const addExternalParticipant = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, classId, firstName, lastName, club, birthDate, gender, email, address } = req.body;
  try {
    await tournamentService.addExternalParticipant(token, clubId, classId, firstName, lastName, club, birthDate, gender, email, address);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: 'Externer Teilnehmer hinzugefügt' });
  } catch (error) {
    console.error('[addExternalParticipant] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

// Fetch external participants - by class or tournament
export const getExternalParticipants = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, classId } = req.body;
  try {
    const participants = await tournamentService.getExternalParticipants(token, clubId, tournamentId, classId || null);
    res.status(200).json(participants);
  } catch (error) {
    console.error('[getExternalParticipants] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

// Remove external participants
export const removeExternalParticipant = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, participantId } = req.body;
  try {
    await tournamentService.removeExternalParticipant(token, clubId, tournamentId, participantId);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: 'Externer Teilnehmer entfernt' });
  } catch (error) {
    console.error('[removeExternalParticipant] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

// Update the seeded status for external participants
export const updateExternalParticipantSeeded = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, participantId } = req.params;
  const { seeded } = req.body;
  try {
    await tournamentService.updateExternalParticipantSeeded(token, clubId, tournamentId, participantId, seeded);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: 'Gesetzt-Status aktualisiert' });
  } catch (error) {
    console.error('[updateExternalParticipantSeeded] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

export const setExternalParticipantGaveUp = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, participantId } = req.params;
  const { gaveUp } = req.body;
  try {
    await tournamentService.setExternalParticipantGaveUp(token, clubId, tournamentId, participantId, gaveUp);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: 'Aufgabe-Status aktualisiert' });
  } catch (error) {
    console.error('[setExternalParticipantGaveUp] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

// Tournament Classes
export const getTournamentClasses = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId } = req.params;
  try {
    const classes = await tournamentService.getTournamentClasses(token, clubId, tournamentId);
    res.status(200).json(classes);
  } catch (error) {
    console.error('[getTournamentClasses] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

export const addTournamentClass = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId } = req.params;
  const { name, isDoubles, gender, minBirthYear, maxBirthYear } = req.body;
  try {
    const tournamentClass = await tournamentService.addTournamentClass(token, clubId, tournamentId, name, isDoubles, gender, minBirthYear, maxBirthYear);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json(tournamentClass);
  } catch (error) {
    console.error('[addTournamentClass] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

export const updateTournamentClass = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, classId } = req.params;
  const { name, sortOrder, isDoubles, gender, minBirthYear, maxBirthYear } = req.body;
  try {
    const tournamentClass = await tournamentService.updateTournamentClass(token, clubId, tournamentId, classId, name, sortOrder, isDoubles, gender, minBirthYear, maxBirthYear);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json(tournamentClass);
  } catch (error) {
    console.error('[updateTournamentClass] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

export const deleteTournamentClass = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, classId } = req.params;
  try {
    await tournamentService.deleteTournamentClass(token, clubId, tournamentId, classId);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: 'Klasse gelöscht' });
  } catch (error) {
    console.error('[deleteTournamentClass] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

export const updateParticipantClass = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, participantId } = req.params;
  const { classId, isExternal } = req.body;
  try {
    await tournamentService.updateParticipantClass(token, clubId, tournamentId, participantId, classId, isExternal);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: 'Klasse aktualisiert' });
  } catch (error) {
    console.error('[updateParticipantClass] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

// Tournament Pairings
export const getPairings = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, classId } = req.params;
  try {
    const pairings = await tournamentService.getPairings(token, clubId, tournamentId, classId);
    res.status(200).json(pairings);
  } catch (error) {
    console.error('[getPairings] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

export const createPairing = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, classId } = req.params;
  const { player1Type, player1Id, player2Type, player2Id, seeded, groupId } = req.body;
  try {
    const pairing = await tournamentService.createPairing(token, clubId, tournamentId, classId, player1Type, player1Id, player2Type, player2Id, seeded, groupId);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json(pairing);
  } catch (error) {
    console.error('[createPairing] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

export const updatePairing = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, pairingId } = req.params;
  const { player1Type, player1Id, player2Type, player2Id, seeded, groupId } = req.body;
  try {
    const pairing = await tournamentService.updatePairing(token, clubId, tournamentId, pairingId, player1Type, player1Id, player2Type, player2Id, seeded, groupId);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json(pairing);
  } catch (error) {
    console.error('[updatePairing] Error:', error);
    res.status(500).json({ error: error.message });
  }
};

export const deletePairing = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, pairingId } = req.params;
  try {
    await tournamentService.deletePairing(token, clubId, tournamentId, pairingId);
    emitTournamentChanged(clubId, tournamentId);
    res.status(200).json({ message: 'Paarung gelöscht' });
  } catch (error) {
    console.error('[deletePairing] Error:', error);
    res.status(500).json({ error: error.message });
  }
};
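Every handler in this controller follows the same shape: read the `authcode` header, make one service call, emit the tournament-changed socket event, and map any error to a 500. As a minimal sketch, that shape could be factored into a helper like the following (the names `makeHandler`, the mock `req`/`res`, and the service stub are illustrative assumptions, not part of the repo):

```javascript
// Hypothetical helper capturing the shared handler shape used above:
// token from headers, one service call, socket emit, 500 on error.
function makeHandler(serviceCall, emit, successMessage) {
  return async (req, res) => {
    const { authcode: token } = req.headers;
    try {
      await serviceCall(token, req);
      emit(req);
      res.status(200).json({ message: successMessage });
    } catch (err) {
      console.error(err);
      res.status(500).json({ error: err.message });
    }
  };
}

// Mock req/res to exercise the helper without Express.
const calls = [];
const req = { headers: { authcode: 'abc' }, body: { clubId: 1, tournamentId: 2 } };
const res = {
  statusCode: null,
  payload: null,
  status(c) { this.statusCode = c; return this; },
  json(p) { this.payload = p; return this; },
};
const handler = makeHandler(
  async (token, r) => calls.push([token, r.body.clubId]),
  () => calls.push(['emit']),
  'ok'
);
handler(req, res).then(() => console.log(res.statusCode, res.payload)); // → 200 { message: 'ok' }
```

The real handlers inline this pattern per endpoint, which keeps each one independently readable at the cost of repetition.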
70
backend/controllers/tournamentStagesController.js
Normal file
@@ -0,0 +1,70 @@
import tournamentService from '../services/tournamentService.js';
import HttpError from '../exceptions/HttpError.js';

export const getStages = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId } = req.query;
  try {
    if (clubId == null || tournamentId == null) {
      return res.status(400).json({ error: 'clubId und tournamentId sind erforderlich.' });
    }
    const data = await tournamentService.getTournamentStages(token, Number(clubId), Number(tournamentId));
    res.status(200).json(data);
  } catch (error) {
    console.error(error);
    if (error instanceof HttpError) {
      // Debug aid: shows which IDs actually arrived at the endpoint (no sensitive data)
      res.set('x-debug-stages-clubid', String(clubId));
      res.set('x-debug-stages-tournamentid', String(tournamentId));
      res.set('x-debug-stages-clubid-num', String(Number(clubId)));
      return res.status(error.statusCode || 500).json({ error: error.message });
    }
    res.status(500).json({ error: error.message });
  }
};

export const upsertStages = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, stages, advancement, advancements } = req.body;
  try {
    const data = await tournamentService.upsertTournamentStages(
      token,
      Number(clubId),
      Number(tournamentId),
      stages,
      advancement,
      advancements
    );
    res.status(200).json(data);
  } catch (error) {
    console.error(error);
    if (error instanceof HttpError) {
      res.set('x-debug-stages-clubid', String(clubId));
      res.set('x-debug-stages-tournamentid', String(tournamentId));
      res.set('x-debug-stages-clubid-num', String(Number(clubId)));
      return res.status(error.statusCode || 500).json({ error: error.message });
    }
    res.status(500).json({ error: error.message });
  }
};

export const advanceStage = async (req, res) => {
  const { authcode: token } = req.headers;
  const { clubId, tournamentId, fromStageIndex, toStageIndex } = req.body;
  try {
    const data = await tournamentService.advanceTournamentStage(
      token,
      Number(clubId),
      Number(tournamentId),
      Number(fromStageIndex || 1),
      (toStageIndex == null ? null : Number(toStageIndex))
    );
    res.status(200).json(data);
  } catch (error) {
    console.error(error);
    if (error instanceof HttpError) {
      return res.status(error.statusCode || 500).json({ error: error.message });
    }
    res.status(500).json({ error: error.message });
  }
};
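The index coercion in `advanceStage` is worth a small illustration: `fromStageIndex` defaults to 1 when missing (stage indices appear to be 1-based here), while `toStageIndex` stays `null` when not provided, presumably so the service can pick the next stage itself. A standalone sketch (the function name `coerceStageIndices` is illustrative):

```javascript
// Sketch of the argument coercion in advanceStage above.
function coerceStageIndices(fromStageIndex, toStageIndex) {
  return {
    from: Number(fromStageIndex || 1),                     // missing (or 0) → 1
    to: toStageIndex == null ? null : Number(toStageIndex), // missing → null
  };
}

console.log(coerceStageIndices(undefined, undefined)); // { from: 1, to: null }
console.log(coerceStageIndices('2', '3'));             // { from: 2, to: 3 }
```

Note that `fromStageIndex || 1` also maps an explicit `0` to `1`; with 1-based stage indices that case should not occur in practice.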
128
backend/controllers/trainingGroupController.js
Normal file
@@ -0,0 +1,128 @@
import trainingGroupService from '../services/trainingGroupService.js';
import { getSafeErrorMessage } from '../utils/errorUtils.js';

export const getTrainingGroups = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId } = req.params;
    const groups = await trainingGroupService.getTrainingGroups(userToken, clubId);
    res.status(200).json(groups);
  } catch (error) {
    console.error('[getTrainingGroups] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Laden der Trainingsgruppen');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};

export const createTrainingGroup = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId } = req.params;
    const { name, sortOrder } = req.body;
    const group = await trainingGroupService.createTrainingGroup(userToken, clubId, name, sortOrder);
    res.status(201).json(group);
  } catch (error) {
    console.error('[createTrainingGroup] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Erstellen der Trainingsgruppe');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};

export const updateTrainingGroup = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, groupId } = req.params;
    const { name, sortOrder } = req.body;
    const group = await trainingGroupService.updateTrainingGroup(userToken, clubId, groupId, name, sortOrder);
    res.status(200).json(group);
  } catch (error) {
    console.error('[updateTrainingGroup] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Aktualisieren der Trainingsgruppe');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};

export const deleteTrainingGroup = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, groupId } = req.params;
    await trainingGroupService.deleteTrainingGroup(userToken, clubId, groupId);
    res.status(200).json({ success: true });
  } catch (error) {
    console.error('[deleteTrainingGroup] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Löschen der Trainingsgruppe');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};

export const addMemberToGroup = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, groupId, memberId } = req.params;
    const memberGroup = await trainingGroupService.addMemberToGroup(userToken, clubId, groupId, memberId);
    res.status(201).json(memberGroup);
  } catch (error) {
    console.error('[addMemberToGroup] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Hinzufügen des Mitglieds zur Gruppe');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};

export const removeMemberFromGroup = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, groupId, memberId } = req.params;
    await trainingGroupService.removeMemberFromGroup(userToken, clubId, groupId, memberId);
    res.status(200).json({ success: true });
  } catch (error) {
    console.error('[removeMemberFromGroup] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Entfernen des Mitglieds aus der Gruppe');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};

export const getMemberGroups = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, memberId } = req.params;
    const groups = await trainingGroupService.getMemberGroups(userToken, clubId, memberId);
    res.status(200).json(groups);
  } catch (error) {
    console.error('[getMemberGroups] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Laden der Gruppen des Mitglieds');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};

export const ensurePresetGroups = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId } = req.params;
    const groups = await trainingGroupService.ensurePresetGroups(userToken, clubId);
    res.status(200).json({
      message: 'Preset-Gruppen wurden erstellt/überprüft',
      groups: groups.length
    });
  } catch (error) {
    console.error('[ensurePresetGroups] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Erstellen der Preset-Gruppen');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};

export const enablePresetGroup = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, presetType } = req.params;
    const group = await trainingGroupService.enablePresetGroup(userToken, clubId, presetType);
    res.status(200).json({
      message: 'Preset-Gruppe wurde aktiviert',
      group
    });
  } catch (error) {
    console.error('[enablePresetGroup] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Aktivieren der Preset-Gruppe');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};
80
backend/controllers/trainingTimeController.js
Normal file
@@ -0,0 +1,80 @@
import trainingTimeService from '../services/trainingTimeService.js';
import { getSafeErrorMessage } from '../utils/errorUtils.js';

export const getTrainingTimes = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId } = req.params;
    const groups = await trainingTimeService.getTrainingTimes(userToken, clubId);
    res.status(200).json(groups);
  } catch (error) {
    console.error('[getTrainingTimes] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Laden der Trainingszeiten');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};

export const createTrainingTime = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId } = req.params;
    const { trainingGroupId, weekday, startTime, endTime } = req.body;

    if (!trainingGroupId || weekday === undefined || !startTime || !endTime) {
      return res.status(400).json({ error: 'Alle Felder müssen ausgefüllt sein' });
    }

    const trainingTime = await trainingTimeService.createTrainingTime(
      userToken,
      clubId,
      trainingGroupId,
      weekday,
      startTime,
      endTime
    );

    res.status(201).json(trainingTime);
  } catch (error) {
    console.error('[createTrainingTime] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Erstellen der Trainingszeit');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};

export const updateTrainingTime = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, timeId } = req.params;
    const { weekday, startTime, endTime } = req.body;

    const trainingTime = await trainingTimeService.updateTrainingTime(
      userToken,
      clubId,
      timeId,
      weekday,
      startTime,
      endTime
    );

    res.status(200).json(trainingTime);
  } catch (error) {
    console.error('[updateTrainingTime] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Aktualisieren der Trainingszeit');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};

export const deleteTrainingTime = async (req, res) => {
  try {
    const { authcode: userToken } = req.headers;
    const { clubId, timeId } = req.params;

    const result = await trainingTimeService.deleteTrainingTime(userToken, clubId, timeId);
    res.status(200).json(result);
  } catch (error) {
    console.error('[deleteTrainingTime] - Error:', error);
    const msg = getSafeErrorMessage(error, 'Fehler beim Löschen der Trainingszeit');
    res.status(error.statusCode || 500).json({ error: msg });
  }
};
@@ -1,10 +1,54 @@
/**
 * HttpError with support for error codes
 *
 * Usage:
 * - new HttpError('Error message', 400) - legacy, still supported
 * - new HttpError({ code: 'ERROR_USER_NOT_FOUND' }, 404) - with error code
 * - new HttpError({ code: 'ERROR_MEMBER_NOT_FOUND', params: { memberId: 123 } }, 404) - with parameters
 */
class HttpError extends Error {
  constructor(message, statusCode) {
    super(message);
  constructor(messageOrError, statusCode) {
    // Supports both formats:
    // 1. Legacy: new HttpError('Error message', 400)
    // 2. New: new HttpError({ code: 'ERROR_CODE', params: {...} }, 400)
    if (typeof messageOrError === 'string') {
      // Legacy format
      super(messageOrError);
      this.errorCode = null;
      this.errorParams = null;
    } else if (messageOrError && typeof messageOrError === 'object' && messageOrError.code) {
      // New format with error code
      super(messageOrError.code); // for the stack trace
      this.errorCode = messageOrError.code;
      this.errorParams = messageOrError.params || null;
    } else {
      // Fallback
      super('Unknown error');
      this.errorCode = null;
      this.errorParams = null;
    }

    this.name = this.constructor.name;
    this.statusCode = statusCode;
    this.statusCode = statusCode || 500;
    Error.captureStackTrace(this, this.constructor);
  }

  /**
   * Returns the error object for the API response
   * @returns {object} Error object with code and optional params
   */
  toJSON() {
    if (this.errorCode) {
      return {
        code: this.errorCode,
        ...(this.errorParams && { params: this.errorParams })
      };
    }
    // Legacy: return the message
    return {
      message: this.message
    };
  }
}

export default HttpError;
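To see how the two constructor formats serialize differently, here is a condensed, standalone re-statement of the class above (stack-trace capture omitted for brevity) together with both call styles:

```javascript
// Condensed sketch of the dual-format HttpError above.
class HttpError extends Error {
  constructor(messageOrError, statusCode) {
    if (typeof messageOrError === 'string') {
      super(messageOrError);                    // legacy: plain message
      this.errorCode = null;
      this.errorParams = null;
    } else if (messageOrError && typeof messageOrError === 'object' && messageOrError.code) {
      super(messageOrError.code);               // new: error code (+ optional params)
      this.errorCode = messageOrError.code;
      this.errorParams = messageOrError.params || null;
    } else {
      super('Unknown error');                   // fallback
      this.errorCode = null;
      this.errorParams = null;
    }
    this.statusCode = statusCode || 500;
  }

  toJSON() {
    if (this.errorCode) {
      return { code: this.errorCode, ...(this.errorParams && { params: this.errorParams }) };
    }
    return { message: this.message };           // legacy shape
  }
}

// JSON.stringify picks up toJSON(), so the two formats serialize as:
const legacy = new HttpError('Not found', 404);
console.log(JSON.stringify(legacy)); // {"message":"Not found"}
const coded = new HttpError({ code: 'ERROR_MEMBER_NOT_FOUND', params: { memberId: 123 } }, 404);
console.log(JSON.stringify(coded)); // {"code":"ERROR_MEMBER_NOT_FOUND","params":{"memberId":123}}
```

The spread `...(this.errorParams && { params: ... })` adds the `params` key only when parameters exist, so legacy errors keep their old wire format untouched.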
@@ -1,87 +1,13 @@
import ApiLog from '../models/ApiLog.js';

/**
 * Middleware to log all API requests and responses
 * Should be added early in the middleware chain, but after authentication
 *
 * NOTE: logging has been disabled - API requests are no longer logged
 * (previously only MyTischtennis requests were logged; this has been removed)
 */
export const requestLoggingMiddleware = async (req, res, next) => {
  const startTime = Date.now();
  const originalSend = res.send;

  // Get request body (but limit size for sensitive data)
  let requestBody = null;
  if (req.body && Object.keys(req.body).length > 0) {
    const bodyStr = JSON.stringify(req.body);
    // Truncate very long bodies
    requestBody = bodyStr.length > 10000 ? bodyStr.substring(0, 10000) + '... (truncated)' : bodyStr;
  }

  // Capture response
  let responseBody = null;
  res.send = function(data) {
    // Try to parse response as JSON
    try {
      const parsed = JSON.parse(data);
      const responseStr = JSON.stringify(parsed);
      // Truncate very long responses
      responseBody = responseStr.length > 10000 ? responseStr.substring(0, 10000) + '... (truncated)' : responseStr;
    } catch (e) {
      // Not JSON, just use raw data (truncated)
      responseBody = typeof data === 'string' ? data.substring(0, 1000) : String(data).substring(0, 1000);
    }

    // Restore original send
    res.send = originalSend;
    return res.send.apply(res, arguments);
  };

  // Log after response is sent
  res.on('finish', async () => {
    const executionTime = Date.now() - startTime;
    const ipAddress = req.ip || req.connection.remoteAddress || req.headers['x-forwarded-for'];
    const path = req.path || req.url;

    // Only log myTischtennis requests
    // Skip logging for non-data endpoints (status checks, health checks, etc.)
    // Exclude any endpoint containing 'status' or root paths
    if (
      path.includes('/status') ||
      path === '/' ||
      path === '/health' ||
      path.endsWith('/status') ||
      path.includes('/scheduler-status')
    ) {
      return;
    }

    // Only log myTischtennis endpoints (e.g. /api/mytischtennis/*)
    if (!path.includes('/mytischtennis')) {
      return;
    }

    // Get user ID if available (set by authMiddleware)
    const userId = req.user?.id || null;

    try {
      await ApiLog.create({
        userId,
        method: req.method,
        path: path,
        statusCode: res.statusCode,
        requestBody,
        responseBody,
        executionTime,
        errorMessage: res.statusCode >= 400 ? `HTTP ${res.statusCode}` : null,
        ipAddress,
        userAgent: req.headers['user-agent'],
        logType: 'api_request'
      });
    } catch (error) {
      // Don't let logging errors break the request
      console.error('Error logging API request:', error);
    }
  });

  // Logging has been disabled - API requests are no longer logged
  // (previously only MyTischtennis requests were logged; this has been removed)
  next();
};
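The removed code above relied on temporarily swapping out `res.send` to capture the response body before forwarding it. That interception pattern can be sketched in isolation with a mock response object instead of Express (the names `captureSend` and `mockRes` are illustrative, not from the repo):

```javascript
// Standalone sketch of the res.send interception used by the (now removed)
// logging path, exercised against a mock response object.
function captureSend(res, onBody) {
  const originalSend = res.send;
  res.send = function (data) {
    let body;
    try {
      body = JSON.stringify(JSON.parse(data)); // normalize JSON bodies
    } catch (e) {
      body = String(data).substring(0, 1000);  // non-JSON: keep truncated raw data
    }
    onBody(body);
    res.send = originalSend;                   // restore before forwarding
    return res.send.apply(res, arguments);
  };
}

const sent = [];
const mockRes = { send(data) { sent.push(data); return this; } };
let captured = null;
captureSend(mockRes, (b) => { captured = b; });
mockRes.send('{"ok":true}');
console.log(captured, sent.length); // {"ok":true} 1
```

Restoring the original `send` before forwarding guarantees the wrapper only fires once per response, even if `send` is called again later.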
58
backend/migrations/20251213_add_tournament_stages.sql
Normal file
@@ -0,0 +1,58 @@
-- Adds multi-stage tournaments (rounds) support
-- MariaDB/MySQL compatible migration (manual execution)

-- 1) New table: tournament_stage
CREATE TABLE IF NOT EXISTS tournament_stage (
  id INT NOT NULL AUTO_INCREMENT,
  tournament_id INT NOT NULL,
  stage_index INT NOT NULL,
  name VARCHAR(255) NULL,
  type VARCHAR(32) NOT NULL, -- 'groups' | 'knockout'
  number_of_groups INT NULL,
  advancing_per_group INT NULL,
  max_group_size INT NULL,
  created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (id),
  CONSTRAINT fk_tournament_stage_tournament
    FOREIGN KEY (tournament_id) REFERENCES tournament(id)
    ON DELETE CASCADE
) ENGINE=InnoDB;

CREATE INDEX idx_tournament_stage_tournament_id ON tournament_stage (tournament_id);
CREATE UNIQUE INDEX uq_tournament_stage_tournament_id_index ON tournament_stage (tournament_id, stage_index);

-- 2) New table: tournament_stage_advancement
CREATE TABLE IF NOT EXISTS tournament_stage_advancement (
  id INT NOT NULL AUTO_INCREMENT,
  tournament_id INT NOT NULL,
  from_stage_id INT NOT NULL,
  to_stage_id INT NOT NULL,
  mode VARCHAR(32) NOT NULL DEFAULT 'pools',
  config JSON NOT NULL,
  created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (id),
  CONSTRAINT fk_tournament_stage_adv_tournament
    FOREIGN KEY (tournament_id) REFERENCES tournament(id)
    ON DELETE CASCADE,
  CONSTRAINT fk_tournament_stage_adv_from
    FOREIGN KEY (from_stage_id) REFERENCES tournament_stage(id)
    ON DELETE CASCADE,
  CONSTRAINT fk_tournament_stage_adv_to
    FOREIGN KEY (to_stage_id) REFERENCES tournament_stage(id)
    ON DELETE CASCADE
) ENGINE=InnoDB;

CREATE INDEX idx_tournament_stage_adv_tournament_id ON tournament_stage_advancement (tournament_id);
CREATE INDEX idx_tournament_stage_adv_from_stage_id ON tournament_stage_advancement (from_stage_id);
CREATE INDEX idx_tournament_stage_adv_to_stage_id ON tournament_stage_advancement (to_stage_id);

-- 3) Add stage_id to tournament_group and tournament_match
-- MariaDB has no IF NOT EXISTS for columns; run each ALTER once.
-- If you rerun, comment out the ALTERs or check INFORMATION_SCHEMA first.
ALTER TABLE tournament_group ADD COLUMN stage_id INT NULL;
ALTER TABLE tournament_match ADD COLUMN stage_id INT NULL;

CREATE INDEX idx_tournament_group_tournament_stage ON tournament_group (tournament_id, stage_id);
CREATE INDEX idx_tournament_match_tournament_stage ON tournament_match (tournament_id, stage_id);
@@ -0,0 +1,16 @@
-- Allow NULL placeholders for KO (e.g. "Spiel um Platz 3")
-- MariaDB/MySQL manual migration
--
-- Background: We create placeholder matches with player1_id/player2_id = NULL.
-- Some prod DBs still have NOT NULL on these columns.

-- 1) Make player columns nullable
ALTER TABLE tournament_match MODIFY COLUMN player1_id INT NULL;
ALTER TABLE tournament_match MODIFY COLUMN player2_id INT NULL;

-- 2) (Optional) If you have foreign keys to tournament_member/external participant IDs,
--    ensure they also allow NULL. (Not adding here because not all installations have FKs.)

-- 3) Verify
-- SHOW COLUMNS FROM tournament_match LIKE 'player1_id';
-- SHOW COLUMNS FROM tournament_match LIKE 'player2_id';
@@ -0,0 +1,11 @@
-- Add pool_id to tournament_group for pooled group phases
ALTER TABLE `tournament_group`
  ADD COLUMN `pool_id` INT NULL AFTER `class_id`;

-- Add out_of_competition flags
ALTER TABLE `tournament_member`
  ADD COLUMN `out_of_competition` TINYINT(1) NOT NULL DEFAULT 0 AFTER `class_id`;

ALTER TABLE `external_tournament_participant`
  ADD COLUMN `out_of_competition` TINYINT(1) NOT NULL DEFAULT 0 AFTER `class_id`;
3
backend/migrations/20260107_change_accident_to_text.sql
Normal file
@@ -0,0 +1,3 @@
|
||||
-- Change accident field from VARCHAR to TEXT to allow longer descriptions
|
||||
ALTER TABLE `accident`
|
||||
MODIFY COLUMN `accident` TEXT NOT NULL;
|
||||
@@ -0,0 +1,6 @@
-- E-mail and address for external participants (for forwarding entries)
-- These fields are stored encrypted (see model)

ALTER TABLE `external_tournament_participant`
  ADD COLUMN `email` VARCHAR(500) NULL AFTER `club`,
  ADD COLUMN `address` TEXT NULL AFTER `email`;
@@ -0,0 +1,8 @@
-- Add gave_up (withdrawal) flag to tournament participants
-- If a player gives up: all of their matches count for the opponent (11:0); if both gave up: 0:0, no winner

ALTER TABLE `tournament_member`
  ADD COLUMN `gave_up` TINYINT(1) NOT NULL DEFAULT 0 AFTER `out_of_competition`;

ALTER TABLE `external_tournament_participant`
  ADD COLUMN `gave_up` TINYINT(1) NOT NULL DEFAULT 0 AFTER `out_of_competition`;
@@ -0,0 +1,9 @@
-- Mini championships: tournament year and upper age limit per class
-- tournament.mini_championship_year: year of the mini championship (e.g. 2025); only set for mini championships
-- tournament_class.max_birth_year: born in year X or earlier (<=); for age classes 12/10

ALTER TABLE `tournament`
  ADD COLUMN `mini_championship_year` INT NULL AFTER `allows_external`;

ALTER TABLE `tournament_class`
  ADD COLUMN `max_birth_year` INT NULL AFTER `min_birth_year`;
@@ -0,0 +1,9 @@
-- Number of tables in the tournament
ALTER TABLE tournament
  ADD COLUMN number_of_tables INT NULL DEFAULT NULL
  COMMENT 'Number of tables played on';

-- Table number per match
ALTER TABLE tournament_match
  ADD COLUMN table_number INT NULL DEFAULT NULL
  COMMENT 'Table number where the match takes place';
@@ -0,0 +1,8 @@
-- Fields for the "forgot password" feature
ALTER TABLE user
  ADD COLUMN reset_token VARCHAR(255) NULL DEFAULT NULL
  COMMENT 'Token for password reset';

ALTER TABLE user
  ADD COLUMN reset_token_expires DATETIME NULL DEFAULT NULL
  COMMENT 'Expiry time of the reset token';
77
backend/migrations/TABELLEN_LISTE.md
Normal file
@@ -0,0 +1,77 @@
# List of all tables in the Trainingstagebuch project

## Base tables
1. `user` - Users
2. `user_club` - Link user ↔ club
3. `user_token` - Authentication tokens
4. `clubs` - Clubs
5. `log` - System logs

## Member management
6. `member` - Members
7. `member_contact` - Member contact details (phone, e-mail)
8. `member_image` - Member images
9. `member_notes` - Notes on members
10. `member_transfer_config` - Configuration for member transfers

## Training groups (NEW)
11. `training_group` - Training groups
12. `member_training_group` - Link member ↔ training group
13. `club_disabled_preset_groups` - Disabled preset groups per club
14. `training_times` - Training times per group (NEW)

## Diary
15. `diary_dates` - Training days
16. `participants` - Participants on training days
17. `activities` - Activities
18. `diary_notes` - Notes on training days
19. `diary_tags` - Tags for the diary
20. `member_diary_tags` - Link member ↔ diary tag
21. `diary_date_tags` - Link training day ↔ tag
22. `diary_member_notes` - Notes on members on training days
23. `diary_member_tags` - Tags for members on training days
24. `diary_date_activities` - Activities on training days
25. `diary_member_activities` - Link participant ↔ activity
26. `group` - Groups (for training plans)
27. `group_activity` - Group activities

## Predefined activities
28. `predefined_activities` - Predefined activities
29. `predefined_activity_images` - Images for predefined activities

## Accidents
30. `accident` - Accidents

## Teams & leagues
31. `season` - Seasons
32. `league` - Leagues
33. `team` - Teams
34. `club_team` - Link club ↔ team
35. `team_document` - Documents for teams
36. `match` - Matches
37. `location` - Venues

## Tournaments
38. `tournament` - Tournaments
39. `tournament_class` - Tournament classes
40. `tournament_group` - Tournament groups
41. `tournament_member` - Tournament participants
42. `tournament_match` - Tournament matches
43. `tournament_result` - Results of tournament matches
44. `external_tournament_participant` - External tournament participants

## Official tournaments (myTischtennis)
45. `official_tournaments` - Official tournaments
46. `official_competitions` - Competitions in official tournaments
47. `official_competition_members` - Participants in official competitions

## myTischtennis integration
48. `my_tischtennis` - myTischtennis connections
49. `my_tischtennis_update_history` - Update history
50. `my_tischtennis_fetch_log` - Fetch logs

## API & logging
51. `api_log` - API logs

## Total: 51 tables
22
backend/migrations/add_allows_external_to_tournament.sql
Normal file
@@ -0,0 +1,22 @@
-- Migration: Add 'allows_external' column to tournament table
-- Date: 2025-01-15
-- For MariaDB/MySQL

SET @dbname = DATABASE();
SET @tablename = 'tournament';
SET @columnname = 'allows_external';
SET @preparedStatement = (SELECT IF(
  (
    SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
    WHERE
      (TABLE_SCHEMA = @dbname)
      AND (TABLE_NAME = @tablename)
      AND (COLUMN_NAME = @columnname)
  ) > 0,
  'SELECT 1',
  CONCAT('ALTER TABLE `', @tablename, '` ADD COLUMN `', @columnname, '` TINYINT(1) NOT NULL DEFAULT 0 AFTER `winning_sets`')
));
PREPARE alterIfNotExists FROM @preparedStatement;
EXECUTE alterIfNotExists;
DEALLOCATE PREPARE alterIfNotExists;
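On MariaDB 10.0.2 or newer (but not on MySQL), the prepared-statement dance above can be replaced by the built-in guard; a one-line equivalent, assuming the same column definition:

```sql
-- MariaDB only; MySQL has no IF NOT EXISTS for ADD COLUMN.
ALTER TABLE `tournament`
  ADD COLUMN IF NOT EXISTS `allows_external` TINYINT(1) NOT NULL DEFAULT 0 AFTER `winning_sets`;
```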
@@ -0,0 +1,27 @@
-- Migration: Add 'class_id' column to external_tournament_participant table
-- Date: 2025-01-15
-- For MariaDB/MySQL

SET @dbname = DATABASE();
SET @tablename = 'external_tournament_participant';
SET @columnname = 'class_id';

-- Check if column exists
SET @column_exists = (
  SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
  WHERE
    (TABLE_SCHEMA = @dbname)
    AND (TABLE_NAME = @tablename)
    AND (COLUMN_NAME = @columnname)
);

-- Add column if it doesn't exist
SET @sql = IF(@column_exists = 0,
  'ALTER TABLE `external_tournament_participant` ADD COLUMN `class_id` INT(11) NULL AFTER `seeded`',
  'SELECT 1 AS column_already_exists'
);

PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
27
backend/migrations/add_class_id_to_tournament_group.sql
Normal file
@@ -0,0 +1,27 @@
-- Migration: Add 'class_id' column to tournament_group table
-- Date: 2025-01-15
-- For MariaDB/MySQL

SET @dbname = DATABASE();
SET @tablename = 'tournament_group';
SET @columnname = 'class_id';

-- Check if column exists
SET @column_exists = (
  SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
  WHERE
    (TABLE_SCHEMA = @dbname)
    AND (TABLE_NAME = @tablename)
    AND (COLUMN_NAME = @columnname)
);

-- Add column if it doesn't exist
SET @sql = IF(@column_exists = 0,
  'ALTER TABLE `tournament_group` ADD COLUMN `class_id` INT(11) NULL AFTER `tournament_id`',
  'SELECT 1 AS column_already_exists'
);

PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
27
backend/migrations/add_class_id_to_tournament_match.sql
Normal file
@@ -0,0 +1,27 @@
-- Migration: Add 'class_id' column to tournament_match table
-- Date: 2025-01-16
-- For MariaDB/MySQL

SET @dbname = DATABASE();
SET @tablename = 'tournament_match';
SET @columnname = 'class_id';

-- Check if column exists
SET @column_exists = (
  SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
  WHERE
    (TABLE_SCHEMA = @dbname)
    AND (TABLE_NAME = @tablename)
    AND (COLUMN_NAME = @columnname)
);

-- Add column if it doesn't exist
SET @sql = IF(@column_exists = 0,
  'ALTER TABLE `tournament_match` ADD COLUMN `class_id` INT(11) NULL AFTER `group_id`',
  'SELECT 1 AS column_already_exists'
);

PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
22
backend/migrations/add_class_id_to_tournament_member.sql
Normal file
@@ -0,0 +1,22 @@
-- Migration: Add 'class_id' column to tournament_member table
-- Date: 2025-01-15
-- For MariaDB/MySQL

SET @dbname = DATABASE();
SET @tablename = 'tournament_member';
SET @columnname = 'class_id';
SET @preparedStatement = (SELECT IF(
  (
    SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
    WHERE
      (TABLE_SCHEMA = @dbname)
      AND (TABLE_NAME = @tablename)
      AND (COLUMN_NAME = @columnname)
  ) > 0,
  'SELECT 1',
  CONCAT('ALTER TABLE `', @tablename, '` ADD COLUMN `', @columnname, '` INT(11) NULL AFTER `seeded`')
));
PREPARE alterIfNotExists FROM @preparedStatement;
EXECUTE alterIfNotExists;
DEALLOCATE PREPARE alterIfNotExists;
@@ -0,0 +1,8 @@
-- Migration: Add gender to external tournament participants
-- Date: 2025-01-XX

ALTER TABLE `external_tournament_participant`
  ADD COLUMN `gender` ENUM('male', 'female', 'diverse', 'unknown') NULL DEFAULT 'unknown' AFTER `birth_date`;
8
backend/migrations/add_gender_to_tournament_class.sql
Normal file
@@ -0,0 +1,8 @@
-- Migration: Add gender to tournament classes
-- Date: 2025-01-XX

ALTER TABLE `tournament_class`
  ADD COLUMN `gender` ENUM('male', 'female', 'mixed') NULL DEFAULT NULL AFTER `is_doubles`;
22
backend/migrations/add_is_active_to_tournament_match.sql
Normal file
@@ -0,0 +1,22 @@
-- Migration: Add 'is_active' column to tournament_match table
-- Date: 2025-01-14
-- For MariaDB/MySQL

SET @dbname = DATABASE();
SET @tablename = 'tournament_match';
SET @columnname = 'is_active';
SET @preparedStatement = (SELECT IF(
  (
    SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
    WHERE
      (TABLE_SCHEMA = @dbname)
      AND (TABLE_NAME = @tablename)
      AND (COLUMN_NAME = @columnname)
  ) > 0,
  'SELECT 1',
  CONCAT('ALTER TABLE `', @tablename, '` ADD COLUMN `', @columnname, '` TINYINT(1) NOT NULL DEFAULT 0 AFTER `is_finished`')
));
PREPARE alterIfNotExists FROM @preparedStatement;
EXECUTE alterIfNotExists;
DEALLOCATE PREPARE alterIfNotExists;
@@ -0,0 +1,7 @@
-- Migration: Add is_doubles column to tournament_class table
-- Date: 2025-01-23
-- For MariaDB/MySQL

ALTER TABLE `tournament_class`
  ADD COLUMN `is_doubles` TINYINT(1) NOT NULL DEFAULT 0 AFTER `sort_order`;
@@ -0,0 +1,27 @@
-- Migration: Add birth-year restriction to tournament classes
-- Date: 2025-01-XX
-- Description: adds a max_birth_year field for "born in year X or earlier" (<=)
-- For MariaDB/MySQL

SET @dbname = DATABASE();
SET @tablename = 'tournament_class';
SET @columnname = 'max_birth_year';

-- Check if column exists
SET @column_exists = (
  SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
  WHERE
    (TABLE_SCHEMA = @dbname)
    AND (TABLE_NAME = @tablename)
    AND (COLUMN_NAME = @columnname)
);

-- Add column if it doesn't exist
SET @sql = IF(@column_exists = 0,
  'ALTER TABLE `tournament_class` ADD COLUMN `max_birth_year` INT(11) NULL DEFAULT NULL AFTER `gender`',
  'SELECT 1 AS column_already_exists'
);

PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
29
backend/migrations/add_name_to_tournament.sql
Normal file
@@ -0,0 +1,29 @@
-- Migration: Add name column to tournament table
-- Date: 2025-01-13
-- For MariaDB/MySQL

-- Add name column if it doesn't exist
-- Check if column exists and add it if not
SET @dbname = DATABASE();
SET @tablename = 'tournament';
SET @columnname = 'name';
SET @preparedStatement = (SELECT IF(
  (
    SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
    WHERE
      (TABLE_SCHEMA = @dbname)
      AND (TABLE_NAME = @tablename)
      AND (COLUMN_NAME = @columnname)
  ) > 0,
  'SELECT 1',
  CONCAT('ALTER TABLE `', @tablename, '` ADD COLUMN `', @columnname, '` VARCHAR(255) NOT NULL DEFAULT "" AFTER `id`')
));
PREPARE alterIfNotExists FROM @preparedStatement;
EXECUTE alterIfNotExists;
DEALLOCATE PREPARE alterIfNotExists;

-- Update existing tournaments: set name to formatted date if name is empty
UPDATE `tournament`
SET `name` = DATE_FORMAT(`date`, '%d.%m.%Y')
WHERE `name` = '' OR `name` IS NULL;
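Before running the backfill UPDATE above, the affected rows can be previewed; a read-only sketch:

```sql
-- Which tournaments would receive a date-derived name?
SELECT `id`, `date`, DATE_FORMAT(`date`, '%d.%m.%Y') AS proposed_name
FROM `tournament`
WHERE `name` = '' OR `name` IS NULL;
```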
24
backend/migrations/add_seeded_to_tournament_member.sql
Normal file
@@ -0,0 +1,24 @@
-- Migration: Add seeded column to tournament_member table
-- Date: 2025-01-13
-- For MariaDB/MySQL

-- Add seeded column if it doesn't exist
-- Check if column exists and add it if not
SET @dbname = DATABASE();
SET @tablename = 'tournament_member';
SET @columnname = 'seeded';
SET @preparedStatement = (SELECT IF(
  (
    SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
    WHERE
      (TABLE_SCHEMA = @dbname)
      AND (TABLE_NAME = @tablename)
      AND (COLUMN_NAME = @columnname)
  ) > 0,
  'SELECT 1',
  CONCAT('ALTER TABLE `', @tablename, '` ADD COLUMN `', @columnname, '` TINYINT(1) NOT NULL DEFAULT 0 AFTER `club_member_id`')
));
PREPARE alterIfNotExists FROM @preparedStatement;
EXECUTE alterIfNotExists;
DEALLOCATE PREPARE alterIfNotExists;
22
backend/migrations/add_winning_sets_to_tournament.sql
Normal file
@@ -0,0 +1,22 @@
-- Migration: Add 'winning_sets' column to tournament table
-- Date: 2025-01-14
-- For MariaDB/MySQL

SET @dbname = DATABASE();
SET @tablename = 'tournament';
SET @columnname = 'winning_sets';
SET @preparedStatement = (SELECT IF(
  (
    SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
    WHERE
      (TABLE_SCHEMA = @dbname)
      AND (TABLE_NAME = @tablename)
      AND (COLUMN_NAME = @columnname)
  ) > 0,
  'SELECT 1',
  CONCAT('ALTER TABLE `', @tablename, '` ADD COLUMN `', @columnname, '` INT NOT NULL DEFAULT 3 AFTER `advancing_per_group`')
));
PREPARE alterIfNotExists FROM @preparedStatement;
EXECUTE alterIfNotExists;
DEALLOCATE PREPARE alterIfNotExists;
@@ -0,0 +1,41 @@
-- Migration: Change 'ttr' column to 'birth_date' in external_tournament_participant table
-- Date: 2025-01-15
-- For MariaDB/MySQL

SET @dbname = DATABASE();
SET @tablename = 'external_tournament_participant';
SET @oldcolumnname = 'ttr';
SET @newcolumnname = 'birth_date';

-- Check if old column exists
SET @preparedStatement = (SELECT IF(
  (
    SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
    WHERE
      (TABLE_SCHEMA = @dbname)
      AND (TABLE_NAME = @tablename)
      AND (COLUMN_NAME = @oldcolumnname)
  ) > 0,
  CONCAT('ALTER TABLE `', @tablename, '` CHANGE COLUMN `', @oldcolumnname, '` `', @newcolumnname, '` VARCHAR(255) NULL AFTER `club`'),
  'SELECT 1'
));
PREPARE alterIfExists FROM @preparedStatement;
EXECUTE alterIfExists;
DEALLOCATE PREPARE alterIfExists;

-- If old column didn't exist, check if new column exists and add it if not
SET @preparedStatement = (SELECT IF(
  (
    SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
    WHERE
      (TABLE_SCHEMA = @dbname)
      AND (TABLE_NAME = @tablename)
      AND (COLUMN_NAME = @newcolumnname)
  ) > 0,
  'SELECT 1',
  CONCAT('ALTER TABLE `', @tablename, '` ADD COLUMN `', @newcolumnname, '` VARCHAR(255) NULL AFTER `club`')
));
PREPARE alterIfNotExists FROM @preparedStatement;
EXECUTE alterIfNotExists;
DEALLOCATE PREPARE alterIfNotExists;
62
backend/migrations/check_seasons_and_teams.sql
Normal file
@@ -0,0 +1,62 @@
-- Diagnostic script: check seasons and teams on the server
-- Run these queries on the server to identify the problem

-- 1. Check whether the season table exists and contains data
SELECT '=== SEASONS ===' as info;
SELECT * FROM `season` ORDER BY `id` DESC;

-- 2. Check whether the club_team table exists and which season_id is used
SELECT '=== CLUB_TEAMS ===' as info;
SELECT
  id,
  name,
  club_id,
  season_id,
  league_id,
  created_at,
  updated_at
FROM `club_team`
ORDER BY `id`;

-- 3. Check for teams that reference non-existent seasons
SELECT '=== TEAMS WITH MISSING SEASONS ===' as info;
SELECT
  ct.id,
  ct.name,
  ct.season_id,
  s.season
FROM `club_team` ct
LEFT JOIN `season` s ON ct.season_id = s.id
WHERE s.id IS NULL;

-- 4. Check for teams that have no season_id
SELECT '=== TEAMS WITHOUT SEASON_ID ===' as info;
SELECT
  id,
  name,
  club_id,
  season_id
FROM `club_team`
WHERE season_id IS NULL;

-- 5. Check the structure of the club_team table
SELECT '=== CLUB_TEAM TABLE STRUCTURE ===' as info;
DESCRIBE `club_team`;

-- 6. Check the structure of the season table
SELECT '=== SEASON TABLE STRUCTURE ===' as info;
DESCRIBE `season`;

-- 7. Check foreign key constraints
SELECT '=== FOREIGN KEYS ===' as info;
SELECT
  CONSTRAINT_NAME,
  TABLE_NAME,
  COLUMN_NAME,
  REFERENCED_TABLE_NAME,
  REFERENCED_COLUMN_NAME
FROM INFORMATION_SCHEMA.KEY_COLUMN_USAGE
WHERE TABLE_SCHEMA = DATABASE()
  AND (TABLE_NAME = 'club_team' OR TABLE_NAME = 'season')
  AND REFERENCED_TABLE_NAME IS NOT NULL;
30
backend/migrations/check_seasons_and_teams_simple.sql
Normal file
@@ -0,0 +1,30 @@
-- Simplified diagnostic script: check only the most important points

-- 1. Are there seasons in the database?
SELECT 'SEASONS:' as check_type, COUNT(*) as count FROM `season`;
SELECT * FROM `season` ORDER BY `id` DESC;

-- 2. Are there teams in the database?
SELECT 'CLUB_TEAMS:' as check_type, COUNT(*) as count FROM `club_team`;
SELECT id, name, club_id, season_id, league_id FROM `club_team` ORDER BY `id`;

-- 3. Do all teams have a season_id?
SELECT 'TEAMS WITHOUT SEASON_ID:' as check_type, COUNT(*) as count
FROM `club_team` WHERE season_id IS NULL;

-- 4. Do all teams reference existing seasons?
SELECT 'TEAMS WITH MISSING SEASONS:' as check_type, COUNT(*) as count
FROM `club_team` ct
LEFT JOIN `season` s ON ct.season_id = s.id
WHERE s.id IS NULL;

-- 5. Which season_id do the teams use?
SELECT 'SEASON_ID USAGE:' as check_type, season_id, COUNT(*) as team_count
FROM `club_team`
GROUP BY season_id;

-- 6. Which seasons exist?
SELECT 'EXISTING SEASONS:' as check_type, id, season
FROM `season`
ORDER BY id;
17
backend/migrations/create_club_disabled_preset_groups.sql
Normal file
@@ -0,0 +1,17 @@
-- Migration: Create club_disabled_preset_groups table
-- Date: 2025-01-16
-- For MariaDB/MySQL
-- Stores which preset groups are disabled for each club

CREATE TABLE IF NOT EXISTS `club_disabled_preset_groups` (
  `id` INT(11) NOT NULL AUTO_INCREMENT,
  `club_id` INT(11) NOT NULL,
  `preset_type` ENUM('anfaenger', 'fortgeschrittene', 'erwachsene', 'nachwuchs', 'leistungsgruppe') NOT NULL,
  `created_at` DATETIME NOT NULL,
  `updated_at` DATETIME NOT NULL,
  PRIMARY KEY (`id`),
  UNIQUE KEY `unique_club_preset_type` (`club_id`, `preset_type`),
  KEY `club_id` (`club_id`),
  CONSTRAINT `club_disabled_preset_groups_ibfk_1` FOREIGN KEY (`club_id`) REFERENCES `clubs` (`id`) ON DELETE CASCADE ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
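Since the table only records which presets are *disabled*, toggling a preset is an insert or a delete; a hypothetical example for club id 1 (the id is an assumption):

```sql
-- Disable the 'anfaenger' preset for club 1
-- (INSERT IGNORE keeps reruns safe thanks to the unique key on club_id + preset_type)
INSERT IGNORE INTO `club_disabled_preset_groups`
  (`club_id`, `preset_type`, `created_at`, `updated_at`)
VALUES (1, 'anfaenger', NOW(), NOW());

-- Re-enable it again
DELETE FROM `club_disabled_preset_groups`
WHERE `club_id` = 1 AND `preset_type` = 'anfaenger';
```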
@@ -0,0 +1,22 @@
-- Migration: Create external_tournament_participant table
-- Date: 2025-01-15
-- For MariaDB/MySQL

CREATE TABLE IF NOT EXISTS `external_tournament_participant` (
  `id` INT(11) NOT NULL AUTO_INCREMENT,
  `tournament_id` INT(11) NOT NULL,
  `group_id` INT(11) NULL,
  `first_name` VARCHAR(255) NOT NULL,
  `last_name` VARCHAR(255) NOT NULL,
  `club` VARCHAR(255) NULL,
  `birth_date` VARCHAR(255) NULL,
  `seeded` TINYINT(1) NOT NULL DEFAULT 0,
  `created_at` DATETIME NOT NULL,
  `updated_at` DATETIME NOT NULL,
  PRIMARY KEY (`id`),
  INDEX `idx_tournament_id` (`tournament_id`),
  INDEX `idx_group_id` (`group_id`),
  CONSTRAINT `fk_external_participant_tournament` FOREIGN KEY (`tournament_id`) REFERENCES `tournament` (`id`) ON DELETE CASCADE ON UPDATE CASCADE,
  CONSTRAINT `fk_external_participant_group` FOREIGN KEY (`group_id`) REFERENCES `tournament_group` (`id`) ON DELETE SET NULL ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
16
backend/migrations/create_tournament_class_table.sql
Normal file
@@ -0,0 +1,16 @@
-- Migration: Create tournament_class table
-- Date: 2025-01-15
-- For MariaDB/MySQL

CREATE TABLE IF NOT EXISTS `tournament_class` (
  `id` INT(11) NOT NULL AUTO_INCREMENT,
  `tournament_id` INT(11) NOT NULL,
  `name` VARCHAR(255) NOT NULL,
  `sort_order` INT(11) NOT NULL DEFAULT 0,
  `created_at` DATETIME NOT NULL,
  `updated_at` DATETIME NOT NULL,
  PRIMARY KEY (`id`),
  KEY `tournament_id` (`tournament_id`),
  CONSTRAINT `tournament_class_ibfk_1` FOREIGN KEY (`tournament_id`) REFERENCES `tournament` (`id`) ON DELETE CASCADE ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
33
backend/migrations/create_tournament_pairing_table.sql
Normal file
@@ -0,0 +1,33 @@
-- Migration: Create tournament_pairing table
-- Date: 2025-01-23
-- For MariaDB/MySQL

CREATE TABLE IF NOT EXISTS `tournament_pairing` (
  `id` INT(11) NOT NULL AUTO_INCREMENT,
  `tournament_id` INT(11) NOT NULL,
  `class_id` INT(11) NOT NULL,
  `group_id` INT(11) NULL,
  `member1_id` INT(11) NULL,
  `external1_id` INT(11) NULL,
  `member2_id` INT(11) NULL,
  `external2_id` INT(11) NULL,
  `seeded` TINYINT(1) NOT NULL DEFAULT 0,
  `created_at` DATETIME NOT NULL,
  `updated_at` DATETIME NOT NULL,
  PRIMARY KEY (`id`),
  KEY `tournament_id` (`tournament_id`),
  KEY `class_id` (`class_id`),
  KEY `group_id` (`group_id`),
  KEY `member1_id` (`member1_id`),
  KEY `member2_id` (`member2_id`),
  KEY `external1_id` (`external1_id`),
  KEY `external2_id` (`external2_id`),
  CONSTRAINT `tournament_pairing_ibfk_1` FOREIGN KEY (`tournament_id`) REFERENCES `tournament` (`id`) ON DELETE CASCADE ON UPDATE CASCADE,
  CONSTRAINT `tournament_pairing_ibfk_2` FOREIGN KEY (`class_id`) REFERENCES `tournament_class` (`id`) ON DELETE CASCADE ON UPDATE CASCADE,
  CONSTRAINT `tournament_pairing_ibfk_3` FOREIGN KEY (`group_id`) REFERENCES `tournament_group` (`id`) ON DELETE SET NULL ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
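The schema allows a pairing slot to carry both a member id and an external id at once. If that should be ruled out at the database level, a CHECK constraint is one option; a sketch (enforced only on MariaDB 10.2.1+; the constraint names are made up):

```sql
-- Each slot must reference a member XOR an external participant.
ALTER TABLE `tournament_pairing`
  ADD CONSTRAINT `chk_pairing_slot1` CHECK ((`member1_id` IS NULL) <> (`external1_id` IS NULL)),
  ADD CONSTRAINT `chk_pairing_slot2` CHECK ((`member2_id` IS NULL) <> (`external2_id` IS NULL));
```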
36
backend/migrations/create_training_group_tables.sql
Normal file
@@ -0,0 +1,36 @@
-- Migration: Create training_group and member_training_group tables
-- Date: 2025-01-16
-- For MariaDB/MySQL

-- Create training_group table
CREATE TABLE IF NOT EXISTS `training_group` (
  `id` INT(11) NOT NULL AUTO_INCREMENT,
  `club_id` INT(11) NOT NULL,
  `name` VARCHAR(255) NOT NULL,
  `is_preset` TINYINT(1) NOT NULL DEFAULT 0,
  `preset_type` ENUM('anfaenger', 'fortgeschrittene', 'erwachsene', 'nachwuchs', 'leistungsgruppe') NULL,
  `sort_order` INT(11) NOT NULL DEFAULT 0,
  `created_at` DATETIME NOT NULL,
  `updated_at` DATETIME NOT NULL,
  PRIMARY KEY (`id`),
  KEY `club_id` (`club_id`),
  CONSTRAINT `training_group_ibfk_1` FOREIGN KEY (`club_id`) REFERENCES `clubs` (`id`) ON DELETE CASCADE ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;

-- Create member_training_group junction table
CREATE TABLE IF NOT EXISTS `member_training_group` (
  `id` INT(11) NOT NULL AUTO_INCREMENT,
  `member_id` INT(11) NOT NULL,
  `training_group_id` INT(11) NOT NULL,
  `created_at` DATETIME NOT NULL,
  `updated_at` DATETIME NOT NULL,
  PRIMARY KEY (`id`),
  UNIQUE KEY `unique_member_group` (`member_id`, `training_group_id`),
  KEY `member_id` (`member_id`),
  KEY `training_group_id` (`training_group_id`),
  CONSTRAINT `member_training_group_ibfk_1` FOREIGN KEY (`member_id`) REFERENCES `member` (`id`) ON DELETE CASCADE ON UPDATE CASCADE,
  CONSTRAINT `member_training_group_ibfk_2` FOREIGN KEY (`training_group_id`) REFERENCES `training_group` (`id`) ON DELETE CASCADE ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
19
backend/migrations/create_training_times_table.sql
Normal file
@@ -0,0 +1,19 @@
-- Migration: Create training_times table
-- Date: 2025-01-16
-- For MariaDB/MySQL
-- Stores training times for training groups

CREATE TABLE IF NOT EXISTS `training_times` (
  `id` INT(11) NOT NULL AUTO_INCREMENT,
  `training_group_id` INT(11) NOT NULL,
  `weekday` TINYINT(1) NOT NULL COMMENT '0 = Sunday, 1 = Monday, ..., 6 = Saturday',
  `start_time` TIME NOT NULL,
  `end_time` TIME NOT NULL,
  `sort_order` INT(11) NOT NULL DEFAULT 0 COMMENT 'Order for displaying multiple times on the same weekday',
  `created_at` DATETIME NOT NULL,
  `updated_at` DATETIME NOT NULL,
  PRIMARY KEY (`id`),
  KEY `training_group_id` (`training_group_id`),
  CONSTRAINT `training_times_ibfk_1` FOREIGN KEY (`training_group_id`) REFERENCES `training_group` (`id`) ON DELETE CASCADE ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
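With the weekday encoding above (0 = Sunday … 6 = Saturday) and `sort_order` breaking ties within a day, a weekly schedule can be stored and read back like this; the group id and the times are hypothetical:

```sql
-- Two Monday slots for a hypothetical group 1
INSERT INTO `training_times`
  (`training_group_id`, `weekday`, `start_time`, `end_time`, `sort_order`, `created_at`, `updated_at`)
VALUES
  (1, 1, '17:00:00', '18:30:00', 0, NOW(), NOW()),  -- Monday, first slot
  (1, 1, '18:30:00', '20:00:00', 1, NOW(), NOW());  -- Monday, second slot

-- Schedule in display order
SELECT `weekday`, `start_time`, `end_time`
FROM `training_times`
WHERE `training_group_id` = 1
ORDER BY `weekday`, `sort_order`, `start_time`;
```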
92
backend/migrations/fix_seasons_and_teams.sql
Normal file
92
backend/migrations/fix_seasons_and_teams.sql
Normal file
@@ -0,0 +1,92 @@
|
||||
-- Fix-Skript: Behebt häufige Probleme mit Seasons und Teams
|
||||
-- Führe dieses Skript auf dem Server aus, wenn die Diagnose Probleme zeigt
|
||||
|
||||
-- 1. Stelle sicher, dass die season-Tabelle existiert und die richtige Struktur hat
|
||||
-- (Falls die Tabelle nicht existiert, wird sie erstellt)
|
||||
CREATE TABLE IF NOT EXISTS `season` (
|
||||
`id` INT NOT NULL AUTO_INCREMENT,
|
||||
`season` VARCHAR(255) NOT NULL UNIQUE,
|
||||
`created_at` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
|
||||
`updated_at` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
|
||||
PRIMARY KEY (`id`)
|
||||
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
|
||||
|
||||
-- 2. Stelle sicher, dass die club_team-Tabelle die season_id-Spalte hat
|
||||
-- (Falls die Spalte nicht existiert, wird sie hinzugefügt)
|
||||
ALTER TABLE `club_team`
|
||||
ADD COLUMN IF NOT EXISTS `season_id` INT NULL;
|
||||
|
||||
-- 3. Erstelle die Seasons, falls sie fehlen
|
||||
INSERT IGNORE INTO `season` (`season`) VALUES ('2024/2025');
|
||||
INSERT IGNORE INTO `season` (`season`) VALUES ('2025/2026');
|
||||
|
||||
-- 4. Aktualisiere Teams ohne season_id auf die aktuelle Saison
|
||||
-- (Verwendet die neueste Saison basierend auf dem aktuellen Datum)
|
||||
UPDATE `club_team`
|
||||
SET `season_id` = (
|
||||
SELECT `id` FROM `season`
|
||||
WHERE `season` = (
|
||||
CASE
|
||||
WHEN MONTH(CURDATE()) >= 7 THEN CONCAT(YEAR(CURDATE()), '/', YEAR(CURDATE()) + 1)
|
||||
ELSE CONCAT(YEAR(CURDATE()) - 1, '/', YEAR(CURDATE()))
|
||||
END
|
||||
)
|
||||
LIMIT 1
|
||||
)
|
||||
WHERE `season_id` IS NULL;
|
||||
|
||||
-- 5. Falls keine aktuelle Saison existiert, erstelle sie
|
||||
INSERT IGNORE INTO `season` (`season`) VALUES (
|
||||
CASE
|
||||
WHEN MONTH(CURDATE()) >= 7 THEN CONCAT(YEAR(CURDATE()), '/', YEAR(CURDATE()) + 1)
|
||||
ELSE CONCAT(YEAR(CURDATE()) - 1, '/', YEAR(CURDATE()))
|
||||
END
|
||||
);
|
||||
|
||||
-- 6. Aktualisiere Teams mit ungültigen season_id auf die aktuelle Saison
|
||||
UPDATE `club_team` ct
|
||||
LEFT JOIN `season` s ON ct.season_id = s.id
|
||||
SET ct.season_id = (
|
||||
SELECT `id` FROM `season`
|
||||
WHERE `season` = (
|
||||
CASE
|
||||
WHEN MONTH(CURDATE()) >= 7 THEN CONCAT(YEAR(CURDATE()), '/', YEAR(CURDATE()) + 1)
|
||||
ELSE CONCAT(YEAR(CURDATE()) - 1, '/', YEAR(CURDATE()))
|
||||
END
|
||||
)
|
||||
LIMIT 1
|
||||
)
|
||||
WHERE s.id IS NULL;

-- 7. Add the foreign key constraint if it is missing
-- (Note: MySQL/MariaDB does not support "IF NOT EXISTS" for constraints,
-- so we have to check whether the constraint already exists)
SET @constraint_exists = (
    SELECT COUNT(*)
    FROM INFORMATION_SCHEMA.KEY_COLUMN_USAGE
    WHERE TABLE_SCHEMA = DATABASE()
      AND TABLE_NAME = 'club_team'
      AND CONSTRAINT_NAME = 'club_team_season_id_foreign_idx'
      AND REFERENCED_TABLE_NAME = 'season'
);

SET @sql = IF(@constraint_exists = 0,
    'ALTER TABLE `club_team` ADD CONSTRAINT `club_team_season_id_foreign_idx` FOREIGN KEY (`season_id`) REFERENCES `season` (`id`) ON DELETE CASCADE ON UPDATE CASCADE',
    'SELECT "Foreign key constraint already exists" as message'
);

PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;

-- 8. Show the results
SELECT '=== RESULT ===' as info;
SELECT
    ct.id,
    ct.name,
    ct.season_id,
    s.season
FROM `club_team` ct
LEFT JOIN `season` s ON ct.season_id = s.id
ORDER BY ct.id;

@@ -0,0 +1,41 @@
-- Migration: Rename max_birth_year to min_birth_year
-- Date: 2025-01-XX
-- Description: Changes the logic from "born <= X" to "born >= X"
-- For MariaDB/MySQL

SET @dbname = DATABASE();
SET @tablename = 'tournament_class';
SET @oldcolumnname = 'max_birth_year';
SET @newcolumnname = 'min_birth_year';

-- Check if old column exists
SET @old_column_exists = (
    SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
    WHERE
        (TABLE_SCHEMA = @dbname)
        AND (TABLE_NAME = @tablename)
        AND (COLUMN_NAME = @oldcolumnname)
);

-- Check if new column already exists
SET @new_column_exists = (
    SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
    WHERE
        (TABLE_SCHEMA = @dbname)
        AND (TABLE_NAME = @tablename)
        AND (COLUMN_NAME = @newcolumnname)
);

-- Rename column if old exists and new doesn't
SET @sql = IF(@old_column_exists > 0 AND @new_column_exists = 0,
    CONCAT('ALTER TABLE `', @tablename, '` CHANGE COLUMN `', @oldcolumnname, '` `', @newcolumnname, '` INT(11) NULL DEFAULT NULL AFTER `gender`'),
    IF(@new_column_exists > 0,
        'SELECT 1 AS column_already_renamed',
        'SELECT 1 AS old_column_not_found'
    )
);

PREPARE stmt FROM @sql;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;

@@ -0,0 +1,39 @@
-- Migration: Update my_tischtennis table TEXT fields to LONGTEXT for encrypted data
-- Date: 2025-11-21
-- For MariaDB/MySQL
--
-- Problem: Encrypted data can be very long, and TEXT fields (max 65,535 bytes) are too small
-- Solution: Change all encrypted fields to LONGTEXT (max 4 GB)
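The size problem is easy to reproduce. Assuming the encryption layer returns base64-encoded ciphertext (the exact encoding used by the project's `encryptData` is not shown here), the stored string grows by roughly 4/3 over the plaintext, so payloads that fit in TEXT before encryption overflow it afterwards:

```javascript
// A 60,000-byte plaintext fits in a TEXT column (limit: 65,535 bytes)...
const plaintext = 'x'.repeat(60000);

// ...but its base64 form is ceil(60000 / 3) * 4 = 80,000 characters,
// which no longer fits. Hence the move to LONGTEXT.
const base64 = Buffer.from(plaintext).toString('base64');
console.log(base64.length); // 80000
```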
-- Update user_data to LONGTEXT
ALTER TABLE `my_tischtennis`
    MODIFY COLUMN `user_data` LONGTEXT NULL;

-- Update access_token to LONGTEXT
ALTER TABLE `my_tischtennis`
    MODIFY COLUMN `access_token` LONGTEXT NULL;

-- Update refresh_token to LONGTEXT
ALTER TABLE `my_tischtennis`
    MODIFY COLUMN `refresh_token` LONGTEXT NULL;

-- Update cookie to LONGTEXT
ALTER TABLE `my_tischtennis`
    MODIFY COLUMN `cookie` LONGTEXT NULL;

-- Update encrypted_password to LONGTEXT
ALTER TABLE `my_tischtennis`
    MODIFY COLUMN `encrypted_password` LONGTEXT NULL;

-- Update club_id to LONGTEXT (was VARCHAR, but encrypted data can be longer)
ALTER TABLE `my_tischtennis`
    MODIFY COLUMN `club_id` LONGTEXT NULL;

-- Update club_name to LONGTEXT (was VARCHAR, but encrypted data can be longer)
ALTER TABLE `my_tischtennis`
    MODIFY COLUMN `club_name` LONGTEXT NULL;

-- Update fed_nickname to LONGTEXT (was VARCHAR, but encrypted data can be longer)
ALTER TABLE `my_tischtennis`
    MODIFY COLUMN `fed_nickname` LONGTEXT NULL;

@@ -13,7 +13,7 @@ const Accident = sequelize.define('Accident', {
    allowNull: false,
  },
  accident: {
-   type: DataTypes.STRING,
+   type: DataTypes.TEXT,
    allowNull: false,
    set(value) {
      const encryptedValue = encryptData(value);
33  backend/models/ClubDisabledPresetGroup.js  Normal file
@@ -0,0 +1,33 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';
import Club from './Club.js';

const ClubDisabledPresetGroup = sequelize.define('ClubDisabledPresetGroup', {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: true,
    allowNull: false,
  },
  clubId: {
    type: DataTypes.INTEGER,
    allowNull: false,
    references: {
      model: Club,
      key: 'id',
    },
    onDelete: 'CASCADE',
  },
  presetType: {
    type: DataTypes.ENUM('anfaenger', 'fortgeschrittene', 'erwachsene', 'nachwuchs', 'leistungsgruppe'),
    allowNull: false,
    comment: 'Type of preset group that is disabled for this club'
  }
}, {
  tableName: 'club_disabled_preset_groups',
  underscored: true,
  timestamps: true,
});

export default ClubDisabledPresetGroup;

139  backend/models/ExternalTournamentParticipant.js  Normal file
@@ -0,0 +1,139 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';
import { encryptData, decryptData } from '../utils/encrypt.js';

const ExternalTournamentParticipant = sequelize.define('ExternalTournamentParticipant', {
  tournamentId: {
    type: DataTypes.INTEGER,
    allowNull: false,
  },
  groupId: {
    type: DataTypes.INTEGER,
    autoIncrement: false,
    allowNull: true
  },
  firstName: {
    type: DataTypes.STRING,
    allowNull: false,
    set(value) {
      const encryptedValue = encryptData(value);
      this.setDataValue('firstName', encryptedValue);
    },
    get() {
      const encryptedValue = this.getDataValue('firstName');
      return decryptData(encryptedValue);
    }
  },
  lastName: {
    type: DataTypes.STRING,
    allowNull: false,
    set(value) {
      const encryptedValue = encryptData(value);
      this.setDataValue('lastName', encryptedValue);
    },
    get() {
      const encryptedValue = this.getDataValue('lastName');
      return decryptData(encryptedValue);
    }
  },
  club: {
    type: DataTypes.STRING,
    allowNull: true,
    set(value) {
      if (!value) {
        this.setDataValue('club', null);
        return;
      }
      const encryptedValue = encryptData(value);
      this.setDataValue('club', encryptedValue);
    },
    get() {
      const encryptedValue = this.getDataValue('club');
      if (!encryptedValue) return null;
      return decryptData(encryptedValue);
    }
  },
  email: {
    type: DataTypes.STRING(500),
    allowNull: true,
    set(value) {
      if (!value) {
        this.setDataValue('email', null);
        return;
      }
      const encryptedValue = encryptData(value);
      this.setDataValue('email', encryptedValue);
    },
    get() {
      const encryptedValue = this.getDataValue('email');
      if (!encryptedValue) return null;
      return decryptData(encryptedValue);
    }
  },
  address: {
    type: DataTypes.TEXT,
    allowNull: true,
    set(value) {
      if (!value) {
        this.setDataValue('address', null);
        return;
      }
      const encryptedValue = encryptData(value);
      this.setDataValue('address', encryptedValue);
    },
    get() {
      const encryptedValue = this.getDataValue('address');
      if (!encryptedValue) return null;
      return decryptData(encryptedValue);
    }
  },
  birthDate: {
    type: DataTypes.STRING,
    allowNull: true,
    set(value) {
      if (!value) {
        this.setDataValue('birthDate', null);
        return;
      }
      const encryptedValue = encryptData(value || '');
      this.setDataValue('birthDate', encryptedValue);
    },
    get() {
      const encryptedValue = this.getDataValue('birthDate');
      if (!encryptedValue) return null;
      return decryptData(encryptedValue);
    }
  },
  gender: {
    type: DataTypes.ENUM('male', 'female', 'diverse', 'unknown'),
    allowNull: true,
    defaultValue: 'unknown'
  },
  seeded: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false
  },
  classId: {
    type: DataTypes.INTEGER,
    allowNull: true
  },
  outOfCompetition: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false
  },
  gaveUp: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false,
    field: 'gave_up'
  }
}, {
  underscored: true,
  tableName: 'external_tournament_participant',
  timestamps: true,
});

export default ExternalTournamentParticipant;
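The model above encrypts PII transparently: every `set()` runs the value through `encryptData` before it reaches the database, and every `get()` decrypts on read, so application code only ever sees plaintext. The real `encryptData`/`decryptData` live in `../utils/encrypt.js` and their implementation is not shown in this diff; the sketch below is a hypothetical AES-256-GCM stand-in that demonstrates the same round-trip pattern:

```javascript
import crypto from 'node:crypto';

// Stand-in for ../utils/encrypt.js (assumption: the real key handling and
// wire format may differ). iv + auth tag + ciphertext are stored together.
const key = crypto.randomBytes(32);

function encryptData(plaintext) {
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv('aes-256-gcm', key, iv);
  const encrypted = Buffer.concat([cipher.update(String(plaintext), 'utf8'), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), encrypted]).toString('base64');
}

function decryptData(payload) {
  const buf = Buffer.from(payload, 'base64');
  const iv = buf.subarray(0, 12);
  const tag = buf.subarray(12, 28);
  const ciphertext = buf.subarray(28);
  const decipher = crypto.createDecipheriv('aes-256-gcm', key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString('utf8');
}

// Same shape as the model's setters/getters: encrypt on write, decrypt on read.
const stored = encryptData('Max Mustermann');
console.log(decryptData(stored)); // 'Max Mustermann'
```

One consequence of this design is that encrypted columns cannot be searched or sorted server-side; queries on `firstName`/`lastName` have to decrypt in application code.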
40  backend/models/MemberTrainingGroup.js  Normal file
@@ -0,0 +1,40 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';
import Member from './Member.js';
import TrainingGroup from './TrainingGroup.js';

const MemberTrainingGroup = sequelize.define('MemberTrainingGroup', {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: true,
    allowNull: false,
  },
  memberId: {
    type: DataTypes.INTEGER,
    allowNull: false,
    references: {
      model: Member,
      key: 'id',
    },
    onDelete: 'CASCADE',
  },
  trainingGroupId: {
    type: DataTypes.INTEGER,
    allowNull: false,
    references: {
      model: TrainingGroup,
      key: 'id',
    },
    onDelete: 'CASCADE',
  }
}, {
  tableName: 'member_training_group',
  underscored: true,
  timestamps: true,
});

export default MemberTrainingGroup;

@@ -22,9 +22,17 @@ const MyTischtennis = sequelize.define('MyTischtennis', {
  email: {
    type: DataTypes.STRING,
    allowNull: false,
    set(value) {
      const encryptedValue = encryptData(value);
      this.setDataValue('email', encryptedValue);
    },
    get() {
      const encryptedValue = this.getDataValue('email');
      return decryptData(encryptedValue);
    }
  },
  encryptedPassword: {
-   type: DataTypes.TEXT,
+   type: DataTypes.TEXT('long'), // Use LONGTEXT for encrypted data
    allowNull: true,
    field: 'encrypted_password'
  },
@@ -41,14 +49,40 @@ const MyTischtennis = sequelize.define('MyTischtennis', {
    field: 'auto_update_ratings'
  },
  accessToken: {
-   type: DataTypes.TEXT,
+   type: DataTypes.TEXT('long'), // Use LONGTEXT for encrypted data
    allowNull: true,
-   field: 'access_token'
+   field: 'access_token',
    set(value) {
      if (value === null || value === undefined) {
        this.setDataValue('accessToken', null);
      } else {
        const encryptedValue = encryptData(value);
        this.setDataValue('accessToken', encryptedValue);
      }
    },
    get() {
      const encryptedValue = this.getDataValue('accessToken');
      if (!encryptedValue) return null;
      return decryptData(encryptedValue);
    }
  },
  refreshToken: {
-   type: DataTypes.TEXT,
+   type: DataTypes.TEXT('long'), // Use LONGTEXT for encrypted data
    allowNull: true,
-   field: 'refresh_token'
+   field: 'refresh_token',
    set(value) {
      if (value === null || value === undefined) {
        this.setDataValue('refreshToken', null);
      } else {
        const encryptedValue = encryptData(value);
        this.setDataValue('refreshToken', encryptedValue);
      }
    },
    get() {
      const encryptedValue = this.getDataValue('refreshToken');
      if (!encryptedValue) return null;
      return decryptData(encryptedValue);
    }
  },
  expiresAt: {
    type: DataTypes.BIGINT,
@@ -56,28 +90,100 @@ const MyTischtennis = sequelize.define('MyTischtennis', {
    field: 'expires_at'
  },
  cookie: {
-   type: DataTypes.TEXT,
-   allowNull: true
+   type: DataTypes.TEXT('long'), // Use LONGTEXT for encrypted data
+   allowNull: true,
    set(value) {
      if (value === null || value === undefined) {
        this.setDataValue('cookie', null);
      } else {
        const encryptedValue = encryptData(value);
        this.setDataValue('cookie', encryptedValue);
      }
    },
    get() {
      const encryptedValue = this.getDataValue('cookie');
      if (!encryptedValue) return null;
      return decryptData(encryptedValue);
    }
  },
  userData: {
-   type: DataTypes.JSON,
+   type: DataTypes.TEXT('long'), // Use LONGTEXT to support very long encrypted strings
    allowNull: true,
-   field: 'user_data'
+   field: 'user_data',
    set(value) {
      if (value === null || value === undefined) {
        this.setDataValue('userData', null);
      } else {
        const jsonString = typeof value === 'string' ? value : JSON.stringify(value);
        const encryptedValue = encryptData(jsonString);
        this.setDataValue('userData', encryptedValue);
      }
    },
    get() {
      const encryptedValue = this.getDataValue('userData');
      if (!encryptedValue) return null;
      try {
        const decryptedString = decryptData(encryptedValue);
        return JSON.parse(decryptedString);
      } catch (error) {
        console.error('Error decrypting/parsing userData:', error);
        return null;
      }
    }
  },
  clubId: {
-   type: DataTypes.STRING,
+   type: DataTypes.TEXT('long'), // Use LONGTEXT for encrypted data (can be longer than VARCHAR)
    allowNull: true,
-   field: 'club_id'
+   field: 'club_id',
    set(value) {
      if (value === null || value === undefined) {
        this.setDataValue('clubId', null);
      } else {
        const encryptedValue = encryptData(value);
        this.setDataValue('clubId', encryptedValue);
      }
    },
    get() {
      const encryptedValue = this.getDataValue('clubId');
      if (!encryptedValue) return null;
      return decryptData(encryptedValue);
    }
  },
  clubName: {
-   type: DataTypes.STRING,
+   type: DataTypes.TEXT('long'), // Use LONGTEXT for encrypted data (can be longer than VARCHAR)
    allowNull: true,
-   field: 'club_name'
+   field: 'club_name',
    set(value) {
      if (value === null || value === undefined) {
        this.setDataValue('clubName', null);
      } else {
        const encryptedValue = encryptData(value);
        this.setDataValue('clubName', encryptedValue);
      }
    },
    get() {
      const encryptedValue = this.getDataValue('clubName');
      if (!encryptedValue) return null;
      return decryptData(encryptedValue);
    }
  },
  fedNickname: {
-   type: DataTypes.STRING,
+   type: DataTypes.TEXT('long'), // Use LONGTEXT for encrypted data (can be longer than VARCHAR)
    allowNull: true,
-   field: 'fed_nickname'
+   field: 'fed_nickname',
    set(value) {
      if (value === null || value === undefined) {
        this.setDataValue('fedNickname', null);
      } else {
        const encryptedValue = encryptData(value);
        this.setDataValue('fedNickname', encryptedValue);
      }
    },
    get() {
      const encryptedValue = this.getDataValue('fedNickname');
      if (!encryptedValue) return null;
      return decryptData(encryptedValue);
    }
  },
  lastLoginAttempt: {
    type: DataTypes.DATE,

@@ -17,6 +17,7 @@ const Tournament = sequelize.define('Tournament', {
  advancingPerGroup: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 1,
  },
  numberOfGroups: {
    type: DataTypes.INTEGER,
@@ -28,7 +29,28 @@ const Tournament = sequelize.define('Tournament', {
    allowNull: false,
    defaultValue: 1
  },
  advancingPerGroup: { type: DataTypes.INTEGER, allowNull: false, defaultValue: 1 },
  winningSets: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 3,
  },
  allowsExternal: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false,
  },
  miniChampionshipYear: {
    type: DataTypes.INTEGER,
    allowNull: true,
    field: 'mini_championship_year',
    comment: 'Jahr der Minimeisterschaft; nur gesetzt bei Minimeisterschaften'
  },
  numberOfTables: {
    type: DataTypes.INTEGER,
    allowNull: true,
    defaultValue: null,
    comment: 'Anzahl der Tische, auf denen gespielt wird'
  },
}, {
  underscored: true,
  tableName: 'tournament',

62  backend/models/TournamentClass.js  Normal file
@@ -0,0 +1,62 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';
import Tournament from './Tournament.js';

const TournamentClass = sequelize.define('TournamentClass', {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: true,
    allowNull: false
  },
  tournamentId: {
    type: DataTypes.INTEGER,
    allowNull: false,
    references: {
      model: Tournament,
      key: 'id'
    },
    onDelete: 'CASCADE',
    onUpdate: 'CASCADE'
  },
  name: {
    type: DataTypes.STRING,
    allowNull: false
  },
  sortOrder: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0
  },
  isDoubles: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false
  },
  gender: {
    type: DataTypes.ENUM('male', 'female', 'mixed'),
    allowNull: true,
    defaultValue: null
  },
  minBirthYear: {
    type: DataTypes.INTEGER,
    allowNull: true,
    defaultValue: null,
    field: 'min_birth_year',
    comment: 'Geboren im Jahr X oder später (>=)'
  },
  maxBirthYear: {
    type: DataTypes.INTEGER,
    allowNull: true,
    defaultValue: null,
    field: 'max_birth_year',
    comment: 'Geboren im Jahr X oder früher (<=); für Altersklassen 12/10'
  }
}, {
  underscored: true,
  tableName: 'tournament_class',
  timestamps: true
});

export default TournamentClass;

@@ -8,10 +8,22 @@ const TournamentGroup = sequelize.define('TournamentGroup', {
    autoIncrement: true,
    allowNull: false
  },
  stageId: {
    type: DataTypes.INTEGER,
    allowNull: true,
  },
  tournamentId: {
    type: DataTypes.INTEGER,
    allowNull: false
  },
  classId: {
    type: DataTypes.INTEGER,
    allowNull: true
  },
  poolId: {
    type: DataTypes.INTEGER,
    allowNull: true
  },
}, {
  underscored: true,
  tableName: 'tournament_group',

@@ -5,6 +5,10 @@ import Tournament from './Tournament.js';
import TournamentGroup from './TournamentGroup.js';

const TournamentMatch = sequelize.define('TournamentMatch', {
  stageId: {
    type: DataTypes.INTEGER,
    allowNull: true,
  },
  tournamentId: {
    type: DataTypes.INTEGER,
    allowNull: false,
@@ -25,6 +29,10 @@ const TournamentMatch = sequelize.define('TournamentMatch', {
    onDelete: 'SET NULL',
    onUpdate: 'CASCADE'
  },
  classId: {
    type: DataTypes.INTEGER,
    allowNull: true,
  },
  groupRound: {
    type: DataTypes.INTEGER,
    allowNull: true,
@@ -35,21 +43,32 @@ const TournamentMatch = sequelize.define('TournamentMatch', {
  },
  player1Id: {
    type: DataTypes.INTEGER,
-   allowNull: false,
+   allowNull: true,
  },
  player2Id: {
    type: DataTypes.INTEGER,
-   allowNull: false,
+   allowNull: true,
  },
  isFinished: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false,
  },
  isActive: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false,
  },
  result: {
    type: DataTypes.STRING,
    allowNull: true,
  },
  tableNumber: {
    type: DataTypes.INTEGER,
    allowNull: true,
    defaultValue: null,
    comment: 'Tischnummer, an der das Match stattfindet'
  },
}, {
  underscored: true,
  tableName: 'tournament_match',

@@ -16,6 +16,26 @@ const TournamentMember = sequelize.define('TournamentMember', {
    type: DataTypes.INTEGER,
    autoIncrement: false,
    allowNull: false
  },
  seeded: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false
  },
  classId: {
    type: DataTypes.INTEGER,
    allowNull: true
  },
  outOfCompetition: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false
  },
  gaveUp: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false,
    field: 'gave_up'
  }
}, {
  underscored: true,

71  backend/models/TournamentPairing.js  Normal file
@@ -0,0 +1,71 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';
import Tournament from './Tournament.js';
import TournamentClass from './TournamentClass.js';

const TournamentPairing = sequelize.define('TournamentPairing', {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: true,
    allowNull: false
  },
  tournamentId: {
    type: DataTypes.INTEGER,
    allowNull: false,
    references: {
      model: Tournament,
      key: 'id'
    },
    onDelete: 'CASCADE',
    onUpdate: 'CASCADE'
  },
  classId: {
    type: DataTypes.INTEGER,
    allowNull: false,
    references: {
      model: TournamentClass,
      key: 'id'
    },
    onDelete: 'CASCADE',
    onUpdate: 'CASCADE'
  },
  groupId: {
    type: DataTypes.INTEGER,
    allowNull: true
  },
  // Player 1: either a club member or an external participant
  member1Id: {
    type: DataTypes.INTEGER,
    allowNull: true
  },
  external1Id: {
    type: DataTypes.INTEGER,
    allowNull: true
  },
  // Player 2: either a club member or an external participant
  member2Id: {
    type: DataTypes.INTEGER,
    allowNull: true
  },
  external2Id: {
    type: DataTypes.INTEGER,
    allowNull: true
  },
  seeded: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false
  }
}, {
  underscored: true,
  tableName: 'tournament_pairing',
  timestamps: true
});

export default TournamentPairing;

46  backend/models/TournamentStage.js  Normal file
@@ -0,0 +1,46 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';

const TournamentStage = sequelize.define('TournamentStage', {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: true,
    allowNull: false,
  },
  tournamentId: {
    type: DataTypes.INTEGER,
    allowNull: false,
  },
  index: {
    type: DataTypes.INTEGER,
    allowNull: false,
    field: 'stage_index',
  },
  name: {
    type: DataTypes.STRING,
    allowNull: true,
  },
  type: {
    type: DataTypes.STRING,
    allowNull: false, // 'groups' | 'knockout'
  },
  numberOfGroups: {
    type: DataTypes.INTEGER,
    allowNull: true,
  },
  advancingPerGroup: {
    type: DataTypes.INTEGER,
    allowNull: true,
  },
  maxGroupSize: {
    type: DataTypes.INTEGER,
    allowNull: true,
  },
}, {
  underscored: true,
  tableName: 'tournament_stage',
  timestamps: true,
});

export default TournamentStage;
40  backend/models/TournamentStageAdvancement.js  Normal file
@@ -0,0 +1,40 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';

const TournamentStageAdvancement = sequelize.define('TournamentStageAdvancement', {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: true,
    allowNull: false,
  },
  tournamentId: {
    type: DataTypes.INTEGER,
    allowNull: false,
  },
  fromStageId: {
    type: DataTypes.INTEGER,
    allowNull: false,
  },
  toStageId: {
    type: DataTypes.INTEGER,
    allowNull: false,
  },
  mode: {
    type: DataTypes.STRING,
    allowNull: false,
    defaultValue: 'pools',
  },
  config: {
    // JSON: { pools: [{ fromPlaces:[1,2], target:{ type:'groups', groupCount:2 }}, ...] }
    type: DataTypes.JSON,
    allowNull: false,
    defaultValue: {},
  },
}, {
  underscored: true,
  tableName: 'tournament_stage_advancement',
  timestamps: true,
});

export default TournamentStageAdvancement;
51  backend/models/TrainingGroup.js  Normal file
@@ -0,0 +1,51 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';
import Club from './Club.js';

const TrainingGroup = sequelize.define('TrainingGroup', {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: true,
    allowNull: false,
  },
  clubId: {
    type: DataTypes.INTEGER,
    allowNull: false,
    references: {
      model: Club,
      key: 'id',
    },
    onDelete: 'CASCADE',
  },
  name: {
    type: DataTypes.STRING,
    allowNull: false,
  },
  isPreset: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false,
    comment: 'True if this is a preset group (Anfänger, Fortgeschrittene, etc.)'
  },
  presetType: {
    type: DataTypes.ENUM('anfaenger', 'fortgeschrittene', 'erwachsene', 'nachwuchs', 'leistungsgruppe'),
    allowNull: true,
    comment: 'Type of preset group'
  },
  sortOrder: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
    comment: 'Order for displaying groups'
  }
}, {
  tableName: 'training_group',
  underscored: true,
  timestamps: true,
});

export default TrainingGroup;

47  backend/models/TrainingTime.js  Normal file
@@ -0,0 +1,47 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';
import TrainingGroup from './TrainingGroup.js';

const TrainingTime = sequelize.define('TrainingTime', {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: true,
    allowNull: false,
  },
  trainingGroupId: {
    type: DataTypes.INTEGER,
    allowNull: false,
    references: {
      model: TrainingGroup,
      key: 'id',
    },
    onDelete: 'CASCADE',
  },
  weekday: {
    type: DataTypes.INTEGER,
    allowNull: false,
    comment: '0 = Sunday, 1 = Monday, ..., 6 = Saturday'
  },
  startTime: {
    type: DataTypes.TIME,
    allowNull: false,
  },
  endTime: {
    type: DataTypes.TIME,
    allowNull: false,
  },
  sortOrder: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
    comment: 'Order for displaying multiple times on the same weekday'
  }
}, {
  tableName: 'training_times',
  underscored: true,
  timestamps: true,
});

export default TrainingTime;

@@ -37,6 +37,16 @@ const User = sequelize.define('User', {
  authCode: {
    type: DataTypes.STRING,
    allowNull: true
  },
  resetToken: {
    type: DataTypes.STRING,
    allowNull: true,
    comment: 'Token für Passwort-Reset'
  },
  resetTokenExpires: {
    type: DataTypes.DATE,
    allowNull: true,
    comment: 'Ablaufzeitpunkt des Reset-Tokens'
  }
}, {
  underscored: true,

@@ -27,9 +27,14 @@ import Group from './Group.js';
import GroupActivity from './GroupActivity.js';
import Tournament from './Tournament.js';
import TournamentGroup from './TournamentGroup.js';
import TournamentClass from './TournamentClass.js';
import TournamentMember from './TournamentMember.js';
import TournamentMatch from './TournamentMatch.js';
import TournamentResult from './TournamentResult.js';
import ExternalTournamentParticipant from './ExternalTournamentParticipant.js';
import TournamentPairing from './TournamentPairing.js';
import TournamentStage from './TournamentStage.js';
import TournamentStageAdvancement from './TournamentStageAdvancement.js';
import Accident from './Accident.js';
import UserToken from './UserToken.js';
import OfficialTournament from './OfficialTournament.js';
@@ -42,6 +47,10 @@ import ApiLog from './ApiLog.js';
import MemberTransferConfig from './MemberTransferConfig.js';
import MemberContact from './MemberContact.js';
import MemberImage from './MemberImage.js';
import TrainingGroup from './TrainingGroup.js';
import MemberTrainingGroup from './MemberTrainingGroup.js';
import ClubDisabledPresetGroup from './ClubDisabledPresetGroup.js';
import TrainingTime from './TrainingTime.js';
// Official tournaments relations
OfficialTournament.hasMany(OfficialCompetition, { foreignKey: 'tournamentId', as: 'competitions' });
OfficialCompetition.belongsTo(OfficialTournament, { foreignKey: 'tournamentId', as: 'tournament' });
@@ -185,6 +194,13 @@ Club.hasMany(Tournament, { foreignKey: 'clubId', as: 'tournaments' });
TournamentGroup.belongsTo(Tournament, { foreignKey: 'tournamentId', as: 'tournaments' });
Tournament.hasMany(TournamentGroup, { foreignKey: 'tournamentId', as: 'tournamentGroups' });

// Tournament Stages
TournamentStage.belongsTo(Tournament, { foreignKey: 'tournamentId', as: 'tournament' });
Tournament.hasMany(TournamentStage, { foreignKey: 'tournamentId', as: 'stages' });

TournamentStageAdvancement.belongsTo(Tournament, { foreignKey: 'tournamentId', as: 'tournament' });
Tournament.hasMany(TournamentStageAdvancement, { foreignKey: 'tournamentId', as: 'stageAdvancements' });

TournamentMember.belongsTo(TournamentGroup, {
  foreignKey: 'groupId',
  targetKey: 'id',
@@ -201,6 +217,15 @@ Member.hasMany(TournamentMember, { foreignKey: 'clubMemberId', as: 'tournamentGr

TournamentMember.belongsTo(Tournament, { foreignKey: 'tournamentId', as: 'tournament' });
Tournament.hasMany(TournamentMember, { foreignKey: 'tournamentId', as: 'tournamentMembers' });
TournamentMember.belongsTo(TournamentClass, {
  foreignKey: 'classId',
  as: 'class',
  constraints: false
});
TournamentClass.hasMany(TournamentMember, {
  foreignKey: 'classId',
  as: 'members'
});

TournamentMatch.belongsTo(Tournament, { foreignKey: 'tournamentId', as: 'tournament' });
Tournament.hasMany(TournamentMatch, { foreignKey: 'tournamentId', as: 'tournamentMatches' });
@@ -227,6 +252,68 @@ TournamentMatch.belongsTo(TournamentMember, { foreignKey: 'player2Id', as: 'play
TournamentMember.hasMany(TournamentMatch, { foreignKey: 'player1Id', as: 'player1Matches' });
TournamentMember.hasMany(TournamentMatch, { foreignKey: 'player2Id', as: 'player2Matches' });

// Tournament Classes
TournamentClass.belongsTo(Tournament, { foreignKey: 'tournamentId', as: 'tournament' });
Tournament.hasMany(TournamentClass, { foreignKey: 'tournamentId', as: 'classes' });

// External Tournament Participants
ExternalTournamentParticipant.belongsTo(Tournament, { foreignKey: 'tournamentId', as: 'tournament' });
Tournament.hasMany(ExternalTournamentParticipant, { foreignKey: 'tournamentId', as: 'externalParticipants' });
ExternalTournamentParticipant.belongsTo(TournamentGroup, {
  foreignKey: 'groupId',
  targetKey: 'id',
  as: 'group',
  constraints: false
});
TournamentGroup.hasMany(ExternalTournamentParticipant, {
|
||||
foreignKey: 'groupId',
|
||||
as: 'externalGroupMembers'
|
||||
});
|
||||
ExternalTournamentParticipant.belongsTo(TournamentClass, {
|
||||
foreignKey: 'classId',
|
||||
as: 'class',
|
||||
constraints: false
|
||||
});
|
||||
TournamentClass.hasMany(ExternalTournamentParticipant, {
|
||||
foreignKey: 'classId',
|
||||
as: 'externalParticipants'
|
||||
});
|
||||
|
||||
// Tournament Pairings
|
||||
TournamentPairing.belongsTo(Tournament, { foreignKey: 'tournamentId', as: 'tournament' });
|
||||
Tournament.hasMany(TournamentPairing, { foreignKey: 'tournamentId', as: 'pairings' });
|
||||
TournamentPairing.belongsTo(TournamentClass, { foreignKey: 'classId', as: 'class' });
|
||||
TournamentClass.hasMany(TournamentPairing, { foreignKey: 'classId', as: 'pairings' });
|
||||
TournamentPairing.belongsTo(TournamentGroup, {
|
||||
foreignKey: 'groupId',
|
||||
as: 'group',
|
||||
constraints: false
|
||||
});
|
||||
TournamentGroup.hasMany(TournamentPairing, {
|
||||
foreignKey: 'groupId',
|
||||
as: 'pairings'
|
||||
});
|
||||
TournamentPairing.belongsTo(TournamentMember, {
|
||||
foreignKey: 'member1Id',
|
||||
as: 'member1',
|
||||
constraints: false
|
||||
});
|
||||
TournamentPairing.belongsTo(TournamentMember, {
|
||||
foreignKey: 'member2Id',
|
||||
as: 'member2',
|
||||
constraints: false
|
||||
});
|
||||
TournamentPairing.belongsTo(ExternalTournamentParticipant, {
|
||||
foreignKey: 'external1Id',
|
||||
as: 'external1',
|
||||
constraints: false
|
||||
});
|
||||
TournamentPairing.belongsTo(ExternalTournamentParticipant, {
|
||||
foreignKey: 'external2Id',
|
||||
as: 'external2',
|
||||
constraints: false
|
||||
});
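The pairing associations use `constraints: false` because each side of a pairing can point at either a `TournamentMember` (`member1Id`/`member2Id`) or an `ExternalTournamentParticipant` (`external1Id`/`external2Id`), so no single foreign-key constraint applies. A minimal plain-JavaScript sketch of how a consumer might resolve one side of a pairing after eager loading — the `resolveSide` helper and the sample data are hypothetical illustrations, not part of this diff:

```javascript
// Hypothetical helper: pick whichever participant reference is set on a pairing side.
// Mirrors the member1/external1 and member2/external2 association pairs declared above.
function resolveSide(pairing, side) {
  const member = pairing[`member${side}`];      // eager-loaded TournamentMember, if any
  const external = pairing[`external${side}`];  // eager-loaded ExternalTournamentParticipant, if any
  if (member) return { type: 'member', name: member.name };
  if (external) return { type: 'external', name: external.name };
  return { type: 'bye', name: null };           // neither reference set on this side
}

// Illustrative pairing row as it might look after eager loading both association pairs.
const pairing = {
  member1: { name: 'Alice' },
  external1: null,
  member2: null,
  external2: { name: 'Bob (guest)' },
};

console.log(resolveSide(pairing, 1)); // { type: 'member', name: 'Alice' }
console.log(resolveSide(pairing, 2)); // { type: 'external', name: 'Bob (guest)' }
```

Because `constraints: false` disables referential integrity at the database level, a check like this in application code is what keeps the two mutually exclusive reference pairs consistent.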

Accident.belongsTo(Member, { foreignKey: 'memberId', as: 'members' });
Member.hasMany(Accident, { foreignKey: 'memberId', as: 'accidents' });

@@ -254,6 +341,31 @@ MemberContact.belongsTo(Member, { foreignKey: 'memberId', as: 'member' });
Member.hasMany(MemberImage, { foreignKey: 'memberId', as: 'images' });
MemberImage.belongsTo(Member, { foreignKey: 'memberId', as: 'member' });

// Training Groups
Club.hasMany(TrainingGroup, { foreignKey: 'clubId', as: 'trainingGroups' });
TrainingGroup.belongsTo(Club, { foreignKey: 'clubId', as: 'club' });

Member.belongsToMany(TrainingGroup, {
  through: MemberTrainingGroup,
  foreignKey: 'memberId',
  otherKey: 'trainingGroupId',
  as: 'trainingGroups'
});
TrainingGroup.belongsToMany(Member, {
  through: MemberTrainingGroup,
  foreignKey: 'trainingGroupId',
  otherKey: 'memberId',
  as: 'members'
});

// Club Disabled Preset Groups
Club.hasMany(ClubDisabledPresetGroup, { foreignKey: 'clubId', as: 'disabledPresetGroups' });
ClubDisabledPresetGroup.belongsTo(Club, { foreignKey: 'clubId', as: 'club' });

// Training Times
TrainingGroup.hasMany(TrainingTime, { foreignKey: 'trainingGroupId', as: 'trainingTimes' });
TrainingTime.belongsTo(TrainingGroup, { foreignKey: 'trainingGroupId', as: 'trainingGroup' });

export {
  User,
  Log,
@@ -283,9 +395,12 @@ export {
  GroupActivity,
  Tournament,
  TournamentGroup,
  TournamentClass,
  TournamentMember,
  TournamentMatch,
  TournamentResult,
  ExternalTournamentParticipant,
  TournamentPairing,
  Accident,
  UserToken,
  OfficialTournament,
@@ -298,4 +413,8 @@ export {
  MemberTransferConfig,
  MemberContact,
  MemberImage,
  TrainingGroup,
  MemberTrainingGroup,
  ClubDisabledPresetGroup,
  TrainingTime,
};
backend/node_modules/.bin/color-support (1 line, generated, vendored)
@@ -1 +0,0 @@
../color-support/bin.js

backend/node_modules/.bin/mime (1 line, generated, vendored)
@@ -1 +0,0 @@
../mime/cli.js

backend/node_modules/.bin/mkdirp (1 line, generated, vendored)
@@ -1 +0,0 @@
../mkdirp/bin/cmd.js

backend/node_modules/.bin/node-pre-gyp (1 line, generated, vendored)
@@ -1 +0,0 @@
../@mapbox/node-pre-gyp/bin/node-pre-gyp

backend/node_modules/.bin/nodemon (1 line, generated, vendored)
@@ -1 +0,0 @@
../nodemon/bin/nodemon.js

backend/node_modules/.bin/nodetouch (1 line, generated, vendored)
@@ -1 +0,0 @@
../touch/bin/nodetouch.js

backend/node_modules/.bin/nopt (1 line, generated, vendored)
@@ -1 +0,0 @@
../nopt/bin/nopt.js

backend/node_modules/.bin/rimraf (1 line, generated, vendored)
@@ -1 +0,0 @@
../rimraf/bin.js

backend/node_modules/.bin/semver (1 line, generated, vendored)
@@ -1 +0,0 @@
../semver/bin/semver.js

backend/node_modules/.bin/uuid (1 line, generated, vendored)
@@ -1 +0,0 @@
../uuid/dist/bin/uuid
backend/node_modules/.package-lock.json (5218 lines, generated, vendored)
File diff suppressed because it is too large
backend/node_modules/@mapbox/node-pre-gyp/.github/workflows/codeql.yml (74 lines, generated, vendored)
@@ -1,74 +0,0 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"

on:
  push:
    branches: [ "master" ]
  pull_request:
    # The branches below must be a subset of the branches above
    branches: [ "master" ]
  schedule:
    - cron: '24 5 * * 4'

jobs:
  analyze:
    name: Analyze
    runs-on: ubuntu-latest
    permissions:
      actions: read
      contents: read
      security-events: write

    strategy:
      fail-fast: false
      matrix:
        language: [ 'javascript' ]
        # CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
        # Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support

    steps:
    - name: Checkout repository
      uses: actions/checkout@v3

    # Initializes the CodeQL tools for scanning.
    - name: Initialize CodeQL
      uses: github/codeql-action/init@v2
      with:
        languages: ${{ matrix.language }}
        # If you wish to specify custom queries, you can do so here or in a config file.
        # By default, queries listed here will override any specified in a config file.
        # Prefix the list here with "+" to use these queries and those in the config file.

        # Details on CodeQL's query packs refer to : https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
        # queries: security-extended,security-and-quality

    # Autobuild attempts to build any compiled languages (C/C++, C#, Go, or Java).
    # If this step fails, then you should remove it and run the build manually (see below)
    - name: Autobuild
      uses: github/codeql-action/autobuild@v2

    # ℹ️ Command-line programs to run using the OS shell.
    # 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun

    # If the Autobuild fails above, remove it and uncomment the following three lines.
    # modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.

    # - run: |
    #   echo "Run, Build Application using script"
    #   ./location_of_script_within_repo/buildscript.sh

    - name: Perform CodeQL Analysis
      uses: github/codeql-action/analyze@v2
      with:
        category: "/language:${{matrix.language}}"
backend/node_modules/@mapbox/node-pre-gyp/CHANGELOG.md (510 lines, generated, vendored)
@@ -1,510 +0,0 @@
# node-pre-gyp changelog

## 1.0.11
- Fixes dependabot alert [CVE-2021-44906](https://nvd.nist.gov/vuln/detail/CVE-2021-44906)

## 1.0.10
- Upgraded minimist to 1.2.6 to address dependabot alert [CVE-2021-44906](https://nvd.nist.gov/vuln/detail/CVE-2021-44906)

## 1.0.9
- Upgraded node-fetch to 2.6.7 to address [CVE-2022-0235](https://www.cve.org/CVERecord?id=CVE-2022-0235)
- Upgraded detect-libc to 2.0.0 to use non-blocking NodeJS(>=12) Report API

## 1.0.8
- Downgraded npmlog to maintain node v10 and v8 support (https://github.com/mapbox/node-pre-gyp/pull/624)

## 1.0.7
- Upgraded nyc and npmlog to address https://github.com/advisories/GHSA-93q8-gq69-wqmw

## 1.0.6
- Added node v17 to the internal node releases listing
- Upgraded various dependencies declared in package.json to latest major versions (node-fetch from 2.6.1 to 2.6.5, npmlog from 4.1.2 to 5.01, semver from 7.3.4 to 7.3.5, and tar from 6.1.0 to 6.1.11)
- Fixed bug in `staging_host` parameter (https://github.com/mapbox/node-pre-gyp/pull/590)

## 1.0.5
- Fix circular reference warning with node >= v14

## 1.0.4
- Added node v16 to the internal node releases listing

## 1.0.3
- Improved support configuring s3 uploads (solves https://github.com/mapbox/node-pre-gyp/issues/571)
- New options added in https://github.com/mapbox/node-pre-gyp/pull/576: 'bucket', 'region', and `s3ForcePathStyle`

## 1.0.2
- Fixed regression in proxy support (https://github.com/mapbox/node-pre-gyp/issues/572)

## 1.0.1
- Switched from mkdirp@1.0.4 to make-dir@3.1.0 to avoid this bug: https://github.com/isaacs/node-mkdirp/issues/31

## 1.0.0
- Module is now name-spaced at `@mapbox/node-pre-gyp` and the original `node-pre-gyp` is deprecated.
- New: support for staging and production s3 targets (see README.md)
- BREAKING: no longer supporting `node_pre_gyp_accessKeyId` & `node_pre_gyp_secretAccessKey`, use `AWS_ACCESS_KEY_ID` & `AWS_SECRET_ACCESS_KEY` instead to authenticate against s3 for `info`, `publish`, and `unpublish` commands.
- Dropped node v6 support, added node v14 support
- Switched tests to use mapbox-owned bucket for testing
- Added coverage tracking and linting with eslint
- Added back support for symlinks inside the tarball
- Upgraded all test apps to N-API/node-addon-api
- New: support for staging and production s3 targets (see README.md)
- Added `node_pre_gyp_s3_host` env var which has priority over the `--s3_host` option or default.
- Replaced needle with node-fetch
- Added proxy support for node-fetch
- Upgraded to mkdirp@1.x

## 0.17.0
- Got travis + appveyor green again
- Added support for more node versions

## 0.16.0
- Added Node 15 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/520)

## 0.15.0
- Bump dependency on `mkdirp` from `^0.5.1` to `^0.5.3` (https://github.com/mapbox/node-pre-gyp/pull/492)
- Bump dependency on `needle` from `^2.2.1` to `^2.5.0` (https://github.com/mapbox/node-pre-gyp/pull/502)
- Added Node 14 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/501)

## 0.14.0
- Defer modules requires in napi.js (https://github.com/mapbox/node-pre-gyp/pull/434)
- Bump dependency on `tar` from `^4` to `^4.4.2` (https://github.com/mapbox/node-pre-gyp/pull/454)
- Support extracting compiled binary from local offline mirror (https://github.com/mapbox/node-pre-gyp/pull/459)
- Added Node 13 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/483)

## 0.13.0
- Added Node 12 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/449)

## 0.12.0
- Fixed double-build problem with node v10 (https://github.com/mapbox/node-pre-gyp/pull/428)
- Added node 11 support in the local database (https://github.com/mapbox/node-pre-gyp/pull/422)

## 0.11.0
- Fixed double-install problem with node v10
- Significant N-API improvements (https://github.com/mapbox/node-pre-gyp/pull/405)

## 0.10.3
- Now will use `request` over `needle` if request is installed. By default `needle` is used for `https`. This should unbreak proxy support that regressed in v0.9.0

## 0.10.2
- Fixed rc/deep-extent security vulnerability
- Fixed broken reinstall script do to incorrectly named get_best_napi_version

## 0.10.1
- Fix needle error event (@medns)

## 0.10.0
- Allow for a single-level module path when packing @allenluce (https://github.com/mapbox/node-pre-gyp/pull/371)
- Log warnings instead of errors when falling back @xzyfer (https://github.com/mapbox/node-pre-gyp/pull/366)
- Add Node.js v10 support to tests (https://github.com/mapbox/node-pre-gyp/pull/372)
- Remove retire.js from CI (https://github.com/mapbox/node-pre-gyp/pull/372)
- Remove support for Node.js v4 due to [EOL on April 30th, 2018](https://github.com/nodejs/Release/blob/7dd52354049cae99eed0e9fe01345b0722a86fde/schedule.json#L14)
- Update appveyor tests to install default NPM version instead of NPM v2.x for all Windows builds (https://github.com/mapbox/node-pre-gyp/pull/375)

## 0.9.1
- Fixed regression (in v0.9.0) with support for http redirects @allenluce (https://github.com/mapbox/node-pre-gyp/pull/361)

## 0.9.0
- Switched from using `request` to `needle` to reduce size of module deps (https://github.com/mapbox/node-pre-gyp/pull/350)

## 0.8.0
- N-API support (@inspiredware)

## 0.7.1
- Upgraded to tar v4.x

## 0.7.0
- Updated request and hawk (#347)
- Dropped node v0.10.x support

## 0.6.40
- Improved error reporting if an install fails

## 0.6.39
- Support for node v9
- Support for versioning on `{libc}` to allow binaries to work on non-glic linux systems like alpine linux

## 0.6.38
- Maintaining compatibility (for v0.6.x series) with node v0.10.x

## 0.6.37
- Solved one part of #276: now now deduce the node ABI from the major version for node >= 2 even when not stored in the abi_crosswalk.json
- Fixed docs to avoid mentioning the deprecated and dangerous `prepublish` in package.json (#291)
- Add new node versions to crosswalk
- Ported tests to use tape instead of mocha
- Got appveyor tests passing by downgrading npm and node-gyp

## 0.6.36
- Removed the running of `testbinary` during install. Because this was regressed for so long, it is too dangerous to re-enable by default. Developers needing validation can call `node-pre-gyp testbinary` directory.
- Fixed regression in v0.6.35 for electron installs (now skipping binary validation which is not yet supported for electron)

## 0.6.35
- No longer recommending `npm ls` in `prepublish` (#291)
- Fixed testbinary command (#283) @szdavid92

## 0.6.34
- Added new node versions to crosswalk, including v8
- Upgraded deps to latest versions, started using `^` instead of `~` for all deps.

## 0.6.33
- Improved support for yarn

## 0.6.32
- Honor npm configuration for CA bundles (@heikkipora)
- Add node-pre-gyp and npm versions to user agent (@addaleax)
- Updated various deps
- Add known node version for v7.x

## 0.6.31
- Updated various deps

## 0.6.30
- Update to npmlog@4.x and semver@5.3.x
- Add known node version for v6.5.0

## 0.6.29
- Add known node versions for v0.10.45, v0.12.14, v4.4.4, v5.11.1, and v6.1.0

## 0.6.28
- Now more verbose when remote binaries are not available. This is needed since npm is increasingly more quiet by default and users need to know why builds are falling back to source compiles that might then error out.

## 0.6.27
- Add known node version for node v6
- Stopped bundling dependencies
- Documented method for module authors to avoid bundling node-pre-gyp
- See https://github.com/mapbox/node-pre-gyp/tree/master#configuring for details

## 0.6.26
- Skip validation for nw runtime (https://github.com/mapbox/node-pre-gyp/pull/181) via @fleg

## 0.6.25
- Improved support for auto-detection of electron runtime in `node-pre-gyp.find()`
- Pull request from @enlight - https://github.com/mapbox/node-pre-gyp/pull/187
- Add known node version for 4.4.1 and 5.9.1

## 0.6.24
- Add known node version for 5.8.0, 5.9.0, and 4.4.0.

## 0.6.23
- Add known node version for 0.10.43, 0.12.11, 4.3.2, and 5.7.1.

## 0.6.22
- Add known node version for 4.3.1, and 5.7.0.

## 0.6.21
- Add known node version for 0.10.42, 0.12.10, 4.3.0, and 5.6.0.

## 0.6.20
- Add known node version for 4.2.5, 4.2.6, 5.4.0, 5.4.1, and 5.5.0.

## 0.6.19
- Add known node version for 4.2.4

## 0.6.18
- Add new known node versions for 0.10.x, 0.12.x, 4.x, and 5.x

## 0.6.17
- Re-tagged to fix packaging problem of `Error: Cannot find module 'isarray'`

## 0.6.16
- Added known version in crosswalk for 5.1.0.

## 0.6.15
- Upgraded tar-pack (https://github.com/mapbox/node-pre-gyp/issues/182)
- Support custom binary hosting mirror (https://github.com/mapbox/node-pre-gyp/pull/170)
- Added known version in crosswalk for 4.2.2.

## 0.6.14
- Added node 5.x version

## 0.6.13
- Added more known node 4.x versions

## 0.6.12
- Added support for [Electron](http://electron.atom.io/). Just pass the `--runtime=electron` flag when building/installing. Thanks @zcbenz

## 0.6.11
- Added known node and io.js versions including more 3.x and 4.x versions

## 0.6.10
- Added known node and io.js versions including 3.x and 4.x versions
- Upgraded `tar` dep

## 0.6.9
- Upgraded `rc` dep
- Updated known io.js version: v2.4.0

## 0.6.8
- Upgraded `semver` and `rimraf` deps
- Updated known node and io.js versions

## 0.6.7
- Fixed `node_abi` versions for io.js 1.1.x -> 1.8.x (should be 43, but was stored as 42) (refs https://github.com/iojs/build/issues/94)

## 0.6.6
- Updated with known io.js 2.0.0 version

## 0.6.5
- Now respecting `npm_config_node_gyp` (https://github.com/npm/npm/pull/4887)
- Updated to semver@4.3.2
- Updated known node v0.12.x versions and io.js 1.x versions.

## 0.6.4
- Improved support for `io.js` (@fengmk2)
- Test coverage improvements (@mikemorris)
- Fixed support for `--dist-url` that regressed in 0.6.3

## 0.6.3
- Added support for passing raw options to node-gyp using `--` separator. Flags passed after the `--` to `node-pre-gyp configure` will be passed directly to gyp while flags passed after the `--` will be passed directly to make/visual studio.
- Added `node-pre-gyp configure` command to be able to call `node-gyp configure` directly
- Fix issue with require validation not working on windows 7 (@edgarsilva)

## 0.6.2
- Support for io.js >= v1.0.2
- Deferred require of `request` and `tar` to help speed up command line usage of `node-pre-gyp`.

## 0.6.1
- Fixed bundled `tar` version

## 0.6.0
- BREAKING: node odd releases like v0.11.x now use `major.minor.patch` for `{node_abi}` instead of `NODE_MODULE_VERSION` (#124)
- Added support for `toolset` option in versioning. By default is an empty string but `--toolset` can be passed to publish or install to select alternative binaries that target a custom toolset like C++11. For example to target Visual Studio 2014 modules like node-sqlite3 use `--toolset=v140`.
- Added support for `--no-rollback` option to request that a failed binary test does not remove the binary module leaves it in place.
- Added support for `--update-binary` option to request an existing binary be re-installed and the check for a valid local module be skipped.
- Added support for passing build options from `npm` through `node-pre-gyp` to `node-gyp`: `--nodedir`, `--disturl`, `--python`, and `--msvs_version`

## 0.5.31
- Added support for deducing node_abi for node.js runtime from previous release if the series is even
- Added support for --target=0.10.33

## 0.5.30
- Repackaged with latest bundled deps

## 0.5.29
- Added support for semver `build`.
- Fixed support for downloading from urls that include `+`.

## 0.5.28
- Now reporting unix style paths only in reveal command

## 0.5.27
- Fixed support for auto-detecting s3 bucket name when it contains `.` - @taavo
- Fixed support for installing when path contains a `'` - @halfdan
- Ported tests to mocha

## 0.5.26
- Fix node-webkit support when `--target` option is not provided

## 0.5.25
- Fix bundling of deps

## 0.5.24
- Updated ABI crosswalk to incldue node v0.10.30 and v0.10.31

## 0.5.23
- Added `reveal` command. Pass no options to get all versioning data as json. Pass a second arg to grab a single versioned property value
- Added support for `--silent` (shortcut for `--loglevel=silent`)

## 0.5.22
- Fixed node-webkit versioning name (NOTE: node-webkit support still experimental)

## 0.5.21
- New package to fix `shasum check failed` error with v0.5.20

## 0.5.20
- Now versioning node-webkit binaries based on major.minor.patch - assuming no compatible ABI across versions (#90)

## 0.5.19
- Updated to know about more node-webkit releases

## 0.5.18
- Updated to know about more node-webkit releases

## 0.5.17
- Updated to know about node v0.10.29 release

## 0.5.16
- Now supporting all aws-sdk configuration parameters (http://docs.aws.amazon.com/AWSJavaScriptSDK/guide/node-configuring.html) (#86)

## 0.5.15
- Fixed installation of windows packages sub directories on unix systems (#84)

## 0.5.14
- Finished support for cross building using `--target_platform` option (#82)
- Now skipping binary validation on install if target arch/platform do not match the host.
- Removed multi-arch validing for OS X since it required a FAT node.js binary

## 0.5.13
- Fix problem in 0.5.12 whereby the wrong versions of mkdirp and semver where bundled.

## 0.5.12
- Improved support for node-webkit (@Mithgol)

## 0.5.11
- Updated target versions listing

## 0.5.10
- Fixed handling of `-debug` flag passed directory to node-pre-gyp (#72)
- Added optional second arg to `node_pre_gyp.find` to customize the default versioning options used to locate the runtime binary
- Failed install due to `testbinary` check failure no longer leaves behind binary (#70)

## 0.5.9
- Fixed regression in `testbinary` command causing installs to fail on windows with 0.5.7 (#60)

## 0.5.8
- Started bundling deps

## 0.5.7
- Fixed the `testbinary` check, which is used to determine whether to re-download or source compile, to work even in complex dependency situations (#63)
- Exposed the internal `testbinary` command in node-pre-gyp command line tool
- Fixed minor bug so that `fallback_to_build` option is always respected

## 0.5.6
- Added support for versioning on the `name` value in `package.json` (#57).
- Moved to using streams for reading tarball when publishing (#52)

## 0.5.5
- Improved binary validation that also now works with node-webkit (@Mithgol)
- Upgraded test apps to work with node v0.11.x
- Improved test coverage

## 0.5.4
- No longer depends on external install of node-gyp for compiling builds.

## 0.5.3
- Reverted fix for debian/nodejs since it broke windows (#45)

## 0.5.2
- Support for debian systems where the node binary is named `nodejs` (#45)
- Added `bin/node-pre-gyp.cmd` to be able to run command on windows locally (npm creates an .npm automatically when globally installed)
- Updated abi-crosswalk with node v0.10.26 entry.

## 0.5.1
- Various minor bug fixes, several improving windows support for publishing.

## 0.5.0
- Changed property names in `binary` object: now required are `module_name`, `module_path`, and `host`.
- Now `module_path` supports versioning, which allows developers to opt-in to using a versioned install path (#18).
- Added `remote_path` which also supports versioning.
- Changed `remote_uri` to `host`.

## 0.4.2
- Added support for `--target` flag to request cross-compile against a specific node/node-webkit version.
- Added preliminary support for node-webkit
- Fixed support for `--target_arch` option being respected in all cases.

## 0.4.1
- Fixed exception when only stderr is available in binary test (@bendi / #31)

## 0.4.0
- Enforce only `https:` based remote publishing access.
- Added `node-pre-gyp info` command to display listing of published binaries
- Added support for changing the directory node-pre-gyp should build in with the `-C/--directory` option.
- Added support for S3 prefixes.

## 0.3.1
- Added `unpublish` command.
- Fixed module path construction in tests.
- Added ability to disable falling back to build behavior via `npm install --fallback-to-build=false` which overrides setting in a depedencies package.json `install` target.

## 0.3.0
- Support for packaging all files in `module_path` directory - see `app4` for example
- Added `testpackage` command.
- Changed `clean` command to only delete `.node` not entire `build` directory since node-gyp will handle that.
- `.node` modules must be in a folder of there own since tar-pack will remove everything when it unpacks.
backend/node_modules/@mapbox/node-pre-gyp/LICENSE (27 lines, generated, vendored)
@@ -1,27 +0,0 @@
Copyright (c), Mapbox

All rights reserved.

Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:

* Redistributions of source code must retain the above copyright notice,
  this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.
* Neither the name of node-pre-gyp nor the names of its contributors
  may be used to endorse or promote products derived from this software
  without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
742
backend/node_modules/@mapbox/node-pre-gyp/README.md
generated
vendored
@@ -1,742 +0,0 @@
# @mapbox/node-pre-gyp

#### @mapbox/node-pre-gyp makes it easy to publish and install Node.js C++ addons from binaries

`@mapbox/node-pre-gyp` stands between [npm](https://github.com/npm/npm) and [node-gyp](https://github.com/Tootallnate/node-gyp) and offers a cross-platform method of binary deployment.

### Special note on previous package

On Feb 9th, 2021 `@mapbox/node-pre-gyp@1.0.0` was [released](./CHANGELOG.md). Older, unscoped versions that are not part of the `@mapbox` org are deprecated and only `@mapbox/node-pre-gyp` will see updates going forward. To upgrade to the new package do:

```
npm uninstall node-pre-gyp --save
npm install @mapbox/node-pre-gyp --save
```
### Features

- A command line tool called `node-pre-gyp` that can install your package's C++ module from a binary.
- A variety of developer-targeted commands for packaging, testing, and publishing binaries.
- A JavaScript module that can dynamically require your installed binary: `require('@mapbox/node-pre-gyp').find`

For a hello world example of a module packaged with `node-pre-gyp` see <https://github.com/springmeyer/node-addon-example> and [the wiki](https://github.com/mapbox/node-pre-gyp/wiki/Modules-using-node-pre-gyp) for real world examples.

## Credits

- The module is modeled after [node-gyp](https://github.com/Tootallnate/node-gyp) by [@Tootallnate](https://github.com/Tootallnate)
- Motivation for initial development came from [@ErisDS](https://github.com/ErisDS) and the [Ghost Project](https://github.com/TryGhost/Ghost).
- Development is sponsored by [Mapbox](https://www.mapbox.com/)

## FAQ

See the [Frequently Asked Questions](https://github.com/mapbox/node-pre-gyp/wiki/FAQ).

## Depends

- Node.js >= v8.x

## Install

`node-pre-gyp` is designed to be installed as a local dependency of your Node.js C++ addon and accessed like:

    ./node_modules/.bin/node-pre-gyp --help

But you can also install it globally:

    npm install @mapbox/node-pre-gyp -g
## Usage

### Commands

View all possible commands:

    node-pre-gyp --help

- clean - Remove the entire folder containing the compiled .node module
- install - Install the pre-built binary for the module
- reinstall - Run "clean" and "install" at once
- build - Compile the module by dispatching to node-gyp or nw-gyp
- rebuild - Run "clean" and "build" at once
- package - Pack the binary into a tarball
- testpackage - Test that the staged package is valid
- publish - Publish the pre-built binary
- unpublish - Unpublish the pre-built binary
- info - Fetch info on published binaries

You can also chain commands:

    node-pre-gyp clean build unpublish publish info

### Options

Options include:

- `-C/--directory`: run the command in this directory
- `--build-from-source`: build from source instead of using a pre-built binary
- `--update-binary`: reinstall by replacing the previously installed local binary with the remote binary
- `--runtime=node-webkit`: customize the runtime: `node`, `electron` and `node-webkit` are the valid options
- `--fallback-to-build`: fall back to building from source if a pre-built binary is not available
- `--target=0.4.0`: Pass the target node or node-webkit version to compile against
- `--target_arch=ia32`: Pass the target arch and override the host `arch`. Any value that is [supported by Node.js](https://nodejs.org/api/os.html#osarch) is valid.
- `--target_platform=win32`: Pass the target platform and override the host `platform`. Valid values are `linux`, `darwin`, `win32`, `sunos`, `freebsd`, `openbsd`, and `aix`.

Both `--build-from-source` and `--fallback-to-build` can be passed alone or they can provide values. You can pass `--fallback-to-build=false` to override the option as declared in package.json. In addition to being able to pass `--build-from-source` you can also pass `--build-from-source=myapp` where `myapp` is the name of your module.

For example: `npm install --build-from-source=myapp`. This is useful if:

- `myapp` is referenced in the package.json of a larger app and therefore `myapp` is being installed as a dependency with `npm install`.
- The larger app also depends on other modules installed with `node-pre-gyp`
- You only want to trigger a source compile for `myapp`, not for the other modules.
### Configuring

This is a guide to configuring your module to use node-pre-gyp.

#### 1) Add new entries to your `package.json`

- Add `@mapbox/node-pre-gyp` to `dependencies`
- Add `aws-sdk` as a `devDependency`
- Add a custom `install` script
- Declare a `binary` object

This looks like:

```js
"dependencies" : {
  "@mapbox/node-pre-gyp": "1.x"
},
"devDependencies": {
  "aws-sdk": "2.x"
},
"scripts": {
  "install": "node-pre-gyp install --fallback-to-build"
},
"binary": {
  "module_name": "your_module",
  "module_path": "./lib/binding/",
  "host": "https://your_module.s3-us-west-1.amazonaws.com"
}
```

For a full example see [node-addon-example's package.json](https://github.com/springmeyer/node-addon-example/blob/master/package.json).

Let's break this down:

- Dependencies need to list `node-pre-gyp`
- Your devDependencies should list `aws-sdk` so that you can run `node-pre-gyp publish` locally or on a CI system. We recommend using `devDependencies` only, since `aws-sdk` is large and not needed for `node-pre-gyp install`, which only uses http to fetch binaries.
- Your `scripts` section should override the `install` target with `"install": "node-pre-gyp install --fallback-to-build"`. This allows node-pre-gyp to be used instead of the default npm behavior of always source compiling with `node-gyp` directly.
- Your package.json should contain a `binary` section describing key properties you provide to allow node-pre-gyp to package optimally. They are detailed below.

Note: in the past we recommended putting `@mapbox/node-pre-gyp` in the `bundledDependencies`, but we no longer recommend this. In the past there were npm bugs (with node versions 0.10.x) that could lead to node-pre-gyp not being available at the right time during install (unless we bundled). This should no longer be the case. Also, for a time we recommended using `"preinstall": "npm install @mapbox/node-pre-gyp"` as an alternative method to avoid needing to bundle. But this did not behave predictably across all npm versions - see https://github.com/mapbox/node-pre-gyp/issues/260 for the details. So we do not recommend using `preinstall` to install `@mapbox/node-pre-gyp`. More history on this at https://github.com/strongloop/fsevents/issues/157#issuecomment-265545908.
##### The `binary` object has three required properties

###### module_name

The name of your native node module. This value:

- Must match the name passed to [the NODE_MODULE macro](http://nodejs.org/api/addons.html#addons_hello_world)
- Must be a valid C variable name (e.g. it cannot contain `-`)
- Should not include the `.node` extension.

###### module_path

The location your native module is placed after a build. This should be an empty directory without other JavaScript files. This entire directory will be packaged in the binary tarball. When installing from a remote package this directory will be overwritten with the contents of the tarball.

Note: This property supports variables based on [Versioning](#versioning).

###### host

A url to the remote location where you've published tarball binaries (must be `https` not `http`).

It is highly recommended that you use Amazon S3. The reasons are:

- Various node-pre-gyp commands like `publish` and `info` only work with an S3 host.
- S3 is a very solid hosting platform for distributing large files.
- We provide detailed documentation for using [S3 hosting](#s3-hosting) with node-pre-gyp.

Why then not require S3? Because while some applications using node-pre-gyp need to distribute binaries as large as 20-30 MB, others might have very small binaries and might wish to store them in a GitHub repo. This is not recommended, but if an author really wants to host in a non-S3 location then it should be possible.

It should also be mentioned that there is an optional and entirely separate npm module called [node-pre-gyp-github](https://github.com/bchr02/node-pre-gyp-github) which is intended to complement node-pre-gyp and be installed along with it. It provides the ability to store and publish your binaries within your repository's GitHub Releases if you would rather not use S3 directly. Installation and usage instructions can be found [here](https://github.com/bchr02/node-pre-gyp-github), but the basic premise is that instead of using the `node-pre-gyp publish` command you would use `node-pre-gyp-github publish`.

##### The `binary` object's other optional S3 properties

If you are not using a standard s3 path like `bucket_name.s3(.-)region.amazonaws.com`, you might get an error on `publish` because node-pre-gyp extracts the region and bucket from the `host` url. For example, you may have an on-premises s3-compatible storage server, or may have configured a specific dns redirecting to an s3 endpoint. In these cases, you can explicitly set the `region` and `bucket` properties to tell node-pre-gyp to use these values instead of guessing from the `host` property. The following values can be used in the `binary` section:

###### host

The url to the remote server root location (must be `https` not `http`).

###### bucket

The bucket name where your tarball binaries should be located.

###### region

Your S3 server region.

###### s3ForcePathStyle

Set `s3ForcePathStyle` to true if the endpoint url should not be prefixed with the bucket name. If false (the default), the server endpoint would be constructed as `bucket_name.your_server.com`.
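As a sketch, a `binary` section for a non-standard s3-compatible endpoint that needs explicit settings might look like the following (the host, bucket name, and region are hypothetical placeholders):

```js
"binary": {
  "module_name": "your_module",
  "module_path": "./lib/binding/",
  "host": "https://storage.internal.example.com",
  "bucket": "your-bucket-name",
  "region": "us-east-1",
  "s3ForcePathStyle": true
}
```

With `s3ForcePathStyle` set to true, requests target `storage.internal.example.com/your-bucket-name/...` rather than a `your-bucket-name.`-prefixed hostname.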
##### The `binary` object has optional properties

###### remote_path

It **is recommended** that you customize this property. This is an extra path to use for publishing and finding remote tarballs. The default value for `remote_path` is `""`, meaning that if you do not provide it then all packages will be published at the base of the `host`. It is recommended to provide a value like `./{name}/v{version}` to help organize remote packages in the case that you choose to publish multiple node addons to the same `host`.

Note: This property supports variables based on [Versioning](#versioning).

###### package_name

It is **not recommended** to override this property unless you are also overriding the `remote_path`. This is the versioned name of the remote tarball containing the binary `.node` module and any supporting files you've placed inside the `module_path` directory. Unless you specify `package_name` in your `package.json` it defaults to `{module_name}-v{version}-{node_abi}-{platform}-{arch}.tar.gz`, which allows your binary to work across node versions, platforms, and architectures. If you are using a `remote_path` that is also versioned by `./{module_name}/v{version}` then you could remove these variables from the `package_name` and just use: `{node_abi}-{platform}-{arch}.tar.gz`. Then your remote tarball will be looked up at, for example, `https://example.com/your-module/v0.1.0/node-v11-linux-x64.tar.gz`.

Avoiding the version of your module in the `package_name` and instead only embedding it in a directory name can be useful when you want to make a quick tag of your module that does not change any C++ code. In this case you can just copy binaries to the new version behind the scenes like:

```sh
aws s3 sync --acl public-read s3://mapbox-node-binary/sqlite3/v3.0.3/ s3://mapbox-node-binary/sqlite3/v3.0.4/
```

Note: This property supports variables based on [Versioning](#versioning).
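To make the `{variable}` substitution concrete, here is a minimal sketch of how the default `package_name` template expands. This is not node-pre-gyp's actual implementation, and `expandTemplate` with its sample values is a hypothetical helper for illustration only:

```javascript
// Hypothetical sketch of {variable} expansion in "package_name".
// Each {key} placeholder is replaced with the matching value from vars;
// unknown placeholders are left untouched.
function expandTemplate(template, vars) {
  return template.replace(/\{(\w+)\}/g, (match, key) =>
    key in vars ? String(vars[key]) : match
  );
}

// Expanding the default package_name template with sample values.
const packageName = expandTemplate(
  '{module_name}-v{version}-{node_abi}-{platform}-{arch}.tar.gz',
  {
    module_name: 'your_module',
    version: '0.1.0',
    node_abi: 'node-v11',
    platform: 'linux',
    arch: 'x64'
  }
);
console.log(packageName); // your_module-v0.1.0-node-v11-linux-x64.tar.gz
```

The same substitution idea applies to `remote_path` and `module_path`, which is why moving `{version}` out of `package_name` and into `remote_path` changes only where the tarball lives, not how it is named.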
#### 2) Add a new target to binding.gyp

`node-pre-gyp` calls out to `node-gyp` to compile the module and passes variables along like [module_name](#module_name) and [module_path](#module_path).

A new target must be added to `binding.gyp` that moves the compiled `.node` module from `./build/Release/module_name.node` into the directory specified by `module_path`.

Add a target like this at the end of your `targets` list:

```js
{
  "target_name": "action_after_build",
  "type": "none",
  "dependencies": [ "<(module_name)" ],
  "copies": [
    {
      "files": [ "<(PRODUCT_DIR)/<(module_name).node" ],
      "destination": "<(module_path)"
    }
  ]
}
```

For a full example see [node-addon-example's binding.gyp](https://github.com/springmeyer/node-addon-example/blob/2ff60a8ded7f042864ad21db00c3a5a06cf47075/binding.gyp).

#### 3) Dynamically require your `.node`

Inside the main js file that requires your addon module you are likely currently doing:

```js
var binding = require('../build/Release/binding.node');
```

or:

```js
var bindings = require('./bindings')
```

Change those lines to:

```js
var binary = require('@mapbox/node-pre-gyp');
var path = require('path');
var binding_path = binary.find(path.resolve(path.join(__dirname,'./package.json')));
var binding = require(binding_path);
```

For a full example see [node-addon-example's index.js](https://github.com/springmeyer/node-addon-example/blob/2ff60a8ded7f042864ad21db00c3a5a06cf47075/index.js#L1-L4).

#### 4) Build and package your app

Now build your module from source:

    npm install --build-from-source

The `--build-from-source` flag tells `node-pre-gyp` to not look for a remote package and instead dispatch to node-gyp to build.

`node-pre-gyp` should now also be installed as a local dependency, so the command line tool it offers can be found at `./node_modules/.bin/node-pre-gyp`.
#### 5) Test

Now `npm test` should work just as it did before.

#### 6) Publish the tarball

Then package your app:

    ./node_modules/.bin/node-pre-gyp package

Once packaged, you can publish:

    ./node_modules/.bin/node-pre-gyp publish

Currently the `publish` command pushes your binary to S3. This requires:

- You have installed `aws-sdk` with `npm install aws-sdk`
- You have created a bucket already.
- The `host` points to an S3 http or https endpoint.
- You have configured node-pre-gyp to read your S3 credentials (see [S3 hosting](#s3-hosting) for details).

You can also host your binaries elsewhere. This requires:

- You manually publish the binary created by the `package` command to an `https` endpoint
- You ensure that the `host` value points to your custom `https` endpoint.

#### 7) Automate builds

Now you need to publish builds for all the platforms and node versions you wish to support. This is best automated.

- See [Appveyor Automation](#appveyor-automation) for how to auto-publish builds on Windows.
- See [Travis Automation](#travis-automation) for how to auto-publish builds on OS X and Linux.

#### 8) You're done!

Now publish your module to the npm registry. Users will now be able to install your module from a binary.

What will happen is this:

1. `npm install <your package>` will pull from the npm registry
2. npm will run the `install` script which will call out to `node-pre-gyp`
3. `node-pre-gyp` will fetch the binary `.node` module and unpack it in the right place
4. Assuming that all worked, you are done

If a binary was not available for a given platform and `--fallback-to-build` was used then `node-gyp rebuild` will be called to try to source compile the module.
#### 9) One more option

It may be that you want to work with two s3 buckets, one for staging and one for production; this arrangement makes it less likely to accidentally overwrite a production binary. It also allows the production environment to have more restrictive permissions than staging while still enabling publishing when developing and testing.

The binary.host property can be set at execution time. In order to do so all of the following conditions must be true:

- binary.host is falsy or not present
- binary.staging_host is not empty
- binary.production_host is not empty

If any of these checks fail then the operation will not perform execution-time determination of the s3 target.

If the command being executed is either "publish" or "unpublish" then the default is set to `binary.staging_host`. In all other cases the default is `binary.production_host`.

The command-line options `--s3_host=staging` or `--s3_host=production` override the default. If `s3_host` is present and not `staging` or `production` an exception is thrown.

This allows installing from staging by specifying `--s3_host=staging`. And it requires specifying `--s3_host=production` in order to publish to, or unpublish from, production, making accidental errors less likely.
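A minimal sketch of such a two-bucket `binary` section, following the conditions above (the bucket URLs are hypothetical placeholders):

```js
"binary": {
  "module_name": "your_module",
  "module_path": "./lib/binding/",
  "staging_host": "https://your-module-staging.s3-us-west-1.amazonaws.com",
  "production_host": "https://your-module-production.s3-us-west-1.amazonaws.com"
}
```

Note that `host` itself is intentionally absent, so the execution-time selection of staging versus production described above can take effect.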
## Node-API Considerations

[Node-API](https://nodejs.org/api/n-api.html#n_api_node_api), which was previously known as N-API, is an ABI-stable alternative to previous technologies such as [nan](https://github.com/nodejs/nan) which are tied to a specific Node runtime engine. Node-API is Node runtime engine agnostic and guarantees modules created today will continue to run, without changes, into the future.

Using `node-pre-gyp` with Node-API projects requires a handful of additional configuration values and imposes some additional requirements.

The most significant difference is that a Node-API module can be coded to target multiple Node-API versions. Therefore, a Node-API module must declare in its `package.json` file which Node-API versions the module is designed to run against. In addition, since multiple builds may be required for a single module, path and file names must be specified in a way that avoids naming conflicts.

### The `napi_versions` array property

A Node-API module must declare in its `package.json` file the Node-API versions the module is intended to support. This is accomplished by including a `napi_versions` array property in the `binary` object. For example:

```js
"binary": {
  "module_name": "your_module",
  "module_path": "your_module_path",
  "host": "https://your_bucket.s3-us-west-1.amazonaws.com",
  "napi_versions": [1,3]
}
```

If the `napi_versions` array property is *not* present, `node-pre-gyp` operates as it always has. Including the `napi_versions` array property instructs `node-pre-gyp` that this is a Node-API module build.

When the `napi_versions` array property is present, `node-pre-gyp` fires off multiple operations, one for each of the Node-API versions in the array. In the example above, two operations are initiated, one for Node-API version 1 and a second for Node-API version 3. How this version number is communicated is described next.

### The `napi_build_version` value

For each of the Node-API module operations `node-pre-gyp` initiates, it ensures that the `napi_build_version` is set appropriately.

This value is of importance in two areas:

1. The C/C++ code, which needs to know against which Node-API version it should compile.
2. `node-pre-gyp` itself, which must assign appropriate path and file names to avoid collisions.

### Defining `NAPI_VERSION` for the C/C++ code

The `napi_build_version` value is communicated to the C/C++ code by adding this code to the `binding.gyp` file:

```
"defines": [
    "NAPI_VERSION=<(napi_build_version)",
]
```

This ensures that `NAPI_VERSION`, an integer value, is declared appropriately to the C/C++ code for each build.

> Note that earlier versions of this document recommended defining the symbol `NAPI_BUILD_VERSION`. `NAPI_VERSION` is preferred because it is used by the Node-API C/C++ headers to configure the specific Node-API versions being requested.
### Path and file naming requirements in `package.json`

Since `node-pre-gyp` fires off multiple operations for each request, it is essential that path and file names be created in such a way as to avoid collisions. This is accomplished by imposing additional path and file naming requirements.

Specifically, when performing Node-API builds, the `{napi_build_version}` text configuration value *must* be present in the `module_path` property. In addition, the `{napi_build_version}` text configuration value *must* be present in either the `remote_path` or `package_name` property. (No problem if it's in both.)

Here's an example:

```js
"binary": {
  "module_name": "your_module",
  "module_path": "./lib/binding/napi-v{napi_build_version}",
  "remote_path": "./{module_name}/v{version}/{configuration}/",
  "package_name": "{platform}-{arch}-napi-v{napi_build_version}.tar.gz",
  "host": "https://your_bucket.s3-us-west-1.amazonaws.com",
  "napi_versions": [1,3]
}
```

## Supporting both Node-API and NAN builds

You may have a legacy native add-on that you wish to continue supporting for those versions of Node that do not support Node-API, as you add Node-API support for later Node versions. This can be accomplished by specifying the `node_napi_label` configuration value in the package.json `binary.package_name` property.

Placing the configuration value `node_napi_label` in the package.json `binary.package_name` property instructs `node-pre-gyp` to build all viable Node-API binaries supported by the current Node instance. If the current Node instance does not support Node-API, `node-pre-gyp` will request a traditional, non-Node-API build.

The configuration value `node_napi_label` is set by `node-pre-gyp` to the type of build created, `napi` or `node`, and the version number. For Node-API builds, the string contains the Node-API version and has values like `napi-v3`. For traditional, non-Node-API builds, the string contains the ABI version with values like `node-v46`.

Here's how the `binary` configuration above might be changed to support both Node-API and NAN builds:

```js
"binary": {
  "module_name": "your_module",
  "module_path": "./lib/binding/{node_napi_label}",
  "remote_path": "./{module_name}/v{version}/{configuration}/",
  "package_name": "{platform}-{arch}-{node_napi_label}.tar.gz",
  "host": "https://your_bucket.s3-us-west-1.amazonaws.com",
  "napi_versions": [1,3]
}
```

The C/C++ symbol `NAPI_VERSION` can be used to distinguish Node-API and non-Node-API builds. The value of `NAPI_VERSION` is set to the integer Node-API version for Node-API builds and is set to `0` for non-Node-API builds.

For example:

```C
#if NAPI_VERSION
// Node-API code goes here
#else
// NAN code goes here
#endif
```

### Two additional configuration values

The following two configuration values, which were implemented in previous versions of `node-pre-gyp`, continue to exist, but have been replaced by the `node_napi_label` configuration value described above.

1. `napi_version` - If Node-API is supported by the currently executing Node instance, this value is the Node-API version number supported by Node. If Node-API is not supported, this value is an empty string.
2. `node_abi_napi` - If the value returned for `napi_version` is non-empty, this value is `'napi'`. If the value returned for `napi_version` is empty, this value is the value returned for `node_abi`.

These values are present for use in the `binding.gyp` file and may be used as `{napi_version}` and `{node_abi_napi}` for text substitution in the `binary` properties of the `package.json` file.
## S3 Hosting

You can host wherever you choose but S3 is cheap, `node-pre-gyp publish` expects it, and S3 can be integrated well with [Travis.ci](http://travis-ci.org) to automate builds for OS X and Ubuntu, and with [Appveyor](http://appveyor.com) to automate builds for Windows. Here is an approach to do this:

First, get set up locally and test the workflow:

#### 1) Create an S3 bucket

And have your **key** and **secret key** ready for writing to the bucket.

It is recommended to create an IAM user with a policy that only gives permissions to the specific bucket you plan to publish to. This can be done in the [IAM console](https://console.aws.amazon.com/iam/) by: 1) adding a new user, 2) choosing `Attach User Policy`, 3) using the `Policy Generator`, 4) selecting `Amazon S3` for the service, 5) adding the actions: `DeleteObject`, `GetObject`, `GetObjectAcl`, `ListBucket`, `HeadBucket`, `PutObject`, `PutObjectAcl`, 6) adding an ARN of `arn:aws:s3:::bucket/*` (replacing `bucket` with your bucket name), and finally 7) clicking `Add Statement` and saving the policy. It should generate a policy like:

```js
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "objects",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObjectAcl",
        "s3:GetObject",
        "s3:DeleteObject",
        "s3:PutObjectAcl"
      ],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    },
    {
      "Sid": "bucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::your-bucket-name"
    },
    {
      "Sid": "buckets",
      "Effect": "Allow",
      "Action": "s3:HeadBucket",
      "Resource": "*"
    }
  ]
}
```
#### 2) Install node-pre-gyp

Either install it globally:

    npm install node-pre-gyp -g

Or put the local version on your PATH:

    export PATH=`pwd`/node_modules/.bin/:$PATH

#### 3) Configure AWS credentials

It is recommended to configure the AWS JS SDK v2 used internally by `node-pre-gyp` by setting these environment variables:

- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY

You can also use the `Shared Config File` mentioned [in the AWS JS SDK v2 docs](https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/configuring-the-jssdk.html).

#### 4) Package and publish your build

Install the `aws-sdk`:

    npm install aws-sdk

Then publish:

    node-pre-gyp package publish

Note: if you hit an error like `Hostname/IP doesn't match certificate's altnames` it may mean that you need to provide the `region` option in your config.
## Appveyor Automation

[Appveyor](http://www.appveyor.com/) can build binaries and publish the results per commit and supports:

- Windows Visual Studio 2013 and related compilers
- Both 64 bit (x64) and 32 bit (x86) build configurations
- Multiple Node.js versions

For an example of doing this see [node-sqlite3's appveyor.yml](https://github.com/mapbox/node-sqlite3/blob/master/appveyor.yml).

Below is a guide to getting set up:

#### 1) Create a free Appveyor account

Go to https://ci.appveyor.com/signup/free and sign in with your GitHub account.

#### 2) Create a new project

Go to https://ci.appveyor.com/projects/new and select the GitHub repo for your module.

#### 3) Add appveyor.yml and push it

Once you have committed an `appveyor.yml` ([appveyor.yml reference](http://www.appveyor.com/docs/appveyor-yml)) to your GitHub repo and pushed it, AppVeyor should automatically start building your project.

#### 4) Create secure variables

Encrypt your S3 AWS keys by going to <https://ci.appveyor.com/tools/encrypt> and hitting the `encrypt` button.

Then paste the result into your `appveyor.yml`:

```yml
environment:
  AWS_ACCESS_KEY_ID:
    secure: Dn9HKdLNYvDgPdQOzRq/DqZ/MPhjknRHB1o+/lVU8MA=
  AWS_SECRET_ACCESS_KEY:
    secure: W1rwNoSnOku1r+28gnoufO8UA8iWADmL1LiiwH9IOkIVhDTNGdGPJqAlLjNqwLnL
```

NOTE: keys are per account but not per repo (this is different from Travis, where keys are per repo but not related to the account used to encrypt them).

#### 5) Hook up publishing

Just put `node-pre-gyp package publish` in your `appveyor.yml` after `npm install`.

#### 6) Publish when you want

You might wish to publish binaries only on a specific commit. To do this you could borrow from the [Travis CI idea of commit keywords](http://about.travis-ci.org/docs/user/how-to-skip-a-build/) and add special handling for commit messages with `[publish binary]`:

    SET CM=%APPVEYOR_REPO_COMMIT_MESSAGE%
    if not "%CM%" == "%CM:[publish binary]=%" node-pre-gyp --msvs_version=2013 publish

If your commit message contains special characters (e.g. `&`) this method might fail. An alternative is to use PowerShell, which gives you additional possibilities, like ignoring case by using `ToLower()`:

    ps: if($env:APPVEYOR_REPO_COMMIT_MESSAGE.ToLower().Contains('[publish binary]')) { node-pre-gyp --msvs_version=2013 publish }

Remember this publishing is not the same as `npm publish`. We're just talking about the binary module here and not your entire npm package.
## Travis Automation
|
||||
|
||||
[Travis](https://travis-ci.org/) can push to S3 after a successful build and supports both:
|
||||
|
||||
- Ubuntu Precise and OS X (64 bit)
|
||||
- Multiple Node.js versions
|
||||
|
||||
For an example of doing this see [node-add-example's .travis.yml](https://github.com/springmeyer/node-addon-example/blob/2ff60a8ded7f042864ad21db00c3a5a06cf47075/.travis.yml).
|
||||
|
||||
Note: if you need 32 bit binaries, this can be done from a 64 bit Travis machine. See [the node-sqlite3 scripts for an example of doing this](https://github.com/mapbox/node-sqlite3/blob/bae122aa6a2b8a45f6b717fab24e207740e32b5d/scripts/build_against_node.sh#L54-L74).
|
||||
|
||||
Below is a guide to getting set up:
|
||||
|
||||
#### 1) Install the Travis gem
|
||||
|
||||
gem install travis
|
||||
|
||||
#### 2) Create secure variables
|
||||
|
||||
Make sure you run this command from within the directory of your module.
|
||||
|
||||
Use `travis-encrypt` like:
|
||||
|
||||
travis encrypt AWS_ACCESS_KEY_ID=${node_pre_gyp_accessKeyId}
|
||||
travis encrypt AWS_SECRET_ACCESS_KEY=${node_pre_gyp_secretAccessKey}
|
||||
|
||||
Then put those values in your `.travis.yml` like:
|
||||
|
||||
```yaml
|
||||
env:
|
||||
global:
|
||||
- secure: F+sEL/v56CzHqmCSSES4pEyC9NeQlkoR0Gs/ZuZxX1ytrj8SKtp3MKqBj7zhIclSdXBz4Ev966Da5ctmcTd410p0b240MV6BVOkLUtkjZJyErMBOkeb8n8yVfSoeMx8RiIhBmIvEn+rlQq+bSFis61/JkE9rxsjkGRZi14hHr4M=
|
||||
- secure: o2nkUQIiABD139XS6L8pxq3XO5gch27hvm/gOdV+dzNKc/s2KomVPWcOyXNxtJGhtecAkABzaW8KHDDi5QL1kNEFx6BxFVMLO8rjFPsMVaBG9Ks6JiDQkkmrGNcnVdxI/6EKTLHTH5WLsz8+J7caDBzvKbEfTux5EamEhxIWgrI=
|
||||
```
|
||||
|
||||
More details on Travis encryption at http://about.travis-ci.org/docs/user/encryption-keys/.
|
||||
|
||||
#### 3) Hook up publishing
|
||||
|
||||
Just put `node-pre-gyp package publish` in your `.travis.yml` after `npm install`.
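A minimal `.travis.yml` sketch, assuming a standard npm flow (putting the publish step at the end of `script` is only one possible arrangement):

```yaml
install:
  - npm install
script:
  - npm test
  # package the compiled binary and upload it to your S3 bucket
  - node-pre-gyp package publish
```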
##### OS X publishing

If you want binaries for OS X in addition to Linux you can enable [multi-os for Travis](http://docs.travis-ci.com/user/multi-os/#Setting-.travis.yml).

Use a configuration like:

```yml
language: cpp

os:
  - linux
  - osx

env:
  matrix:
    - NODE_VERSION="4"
    - NODE_VERSION="6"

before_install:
  - rm -rf ~/.nvm/ && git clone --depth 1 https://github.com/creationix/nvm.git ~/.nvm
  - source ~/.nvm/nvm.sh
  - nvm install $NODE_VERSION
  - nvm use $NODE_VERSION
```

See [Travis OS X Gotchas](#travis-os-x-gotchas) for why we replace `language: node_js` and `node_js:` sections with `language: cpp` and a custom matrix.

Also create platform-specific sections for any deps that need installing. For example, if you need libpng:

```yml
- if [ $(uname -s) == 'Linux' ]; then apt-get install libpng-dev; fi;
- if [ $(uname -s) == 'Darwin' ]; then brew install libpng; fi;
```

For detailed multi-OS examples see [node-mapnik](https://github.com/mapnik/node-mapnik/blob/master/.travis.yml) and [node-sqlite3](https://github.com/mapbox/node-sqlite3/blob/master/.travis.yml).

##### Travis OS X Gotchas

First, unlike the Travis Linux machines, the OS X machines do not put `node-pre-gyp` on PATH by default. To do so you will need to:

```sh
export PATH=$(pwd)/node_modules/.bin:${PATH}
```

Second, the OS X machines do not support using a matrix for installing different Node.js versions. So you need to bootstrap the installation of Node.js in a cross-platform way.

By doing:

```yml
env:
  matrix:
    - NODE_VERSION="4"
    - NODE_VERSION="6"

before_install:
  - rm -rf ~/.nvm/ && git clone --depth 1 https://github.com/creationix/nvm.git ~/.nvm
  - source ~/.nvm/nvm.sh
  - nvm install $NODE_VERSION
  - nvm use $NODE_VERSION
```

You can easily recreate the previous behavior of this matrix:

```yml
node_js:
  - "4"
  - "6"
```
#### 4) Publish when you want

You might wish to publish binaries only on a specific commit. To do this you could borrow from the [Travis CI idea of commit keywords](http://about.travis-ci.org/docs/user/how-to-skip-a-build/) and add special handling for commit messages with `[publish binary]`:

```sh
COMMIT_MESSAGE=$(git log --format=%B --no-merges -n 1 | tr -d '\n')
if [[ ${COMMIT_MESSAGE} =~ "[publish binary]" ]]; then node-pre-gyp publish; fi;
```

Then you can trigger new binaries to be built like:

```sh
git commit -a -m "[publish binary]"
```

Or, if you don't have any changes to make, simply run:

```sh
git commit --allow-empty -m "[publish binary]"
```

WARNING: if you are working in a pull request and publishing binaries from there, you will want to avoid double publishing when Travis CI builds both the `push` and `pr` jobs. You only want to run the publish on the `push` commit. See https://github.com/Project-OSRM/node-osrm/blob/8eb837abe2e2e30e595093d16e5354bc5c573575/scripts/is_pr_merge.sh which is called from https://github.com/Project-OSRM/node-osrm/blob/8eb837abe2e2e30e595093d16e5354bc5c573575/scripts/publish.sh for an example of how to do this.

Remember this publishing is not the same as `npm publish`. We're only talking about the binary module here, not your entire npm package. To automate publishing your entire package to npm on Travis, see http://about.travis-ci.org/docs/user/deployment/npm/

# Versioning

The `binary` properties `module_path`, `remote_path`, and `package_name` support variable substitution. The strings are evaluated by `node-pre-gyp` depending on your system and any custom build flags you passed.

- `node_abi`: The node C++ `ABI` number. This value is available in JavaScript as `process.versions.modules` as of [`>= v0.10.4 >= v0.11.7`](https://github.com/joyent/node/commit/ccabd4a6fa8a6eb79d29bc3bbe9fe2b6531c2d8e) and in C++ as the `NODE_MODULE_VERSION` define much earlier. For versions of Node before this was available we fall back to the V8 major and minor version.
- `platform` matches node's `process.platform` like `linux`, `darwin`, and `win32` unless the user passed the `--target_platform` option to override.
- `arch` matches node's `process.arch` like `x64` or `ia32` unless the user passes the `--target_arch` option to override.
- `libc` matches `require('detect-libc').family` like `glibc` or `musl` unless the user passes the `--target_libc` option to override.
- `configuration` - Either 'Release' or 'Debug' depending on whether `--debug` is passed during the build.
- `module_name` - the `binary.module_name` attribute from `package.json`.
- `version` - the semver `version` value for your module from `package.json` (NOTE: ignores the `semver.build` property).
- `major`, `minor`, `patch`, and `prerelease` match the individual semver values for your module's `version`.
- `build` - the semver `build` value. For example it would be `this.that` if your package.json `version` was `v1.0.0+this.that`.
- `prerelease` - the semver `prerelease` value. For example it would be `alpha.beta` if your package.json `version` was `v1.0.0-alpha.beta`.

The options are visible in the code at <https://github.com/mapbox/node-pre-gyp/blob/612b7bca2604508d881e1187614870ba19a7f0c5/lib/util/versioning.js#L114-L127>
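The substitution itself can be sketched as follows. Note this is a simplified illustration: `evalTemplate` is a hypothetical helper written for this example, not the actual node-pre-gyp API (the real logic lives in `lib/util/versioning.js`):

```javascript
// Simplified sketch of how node-pre-gyp evaluates templated binary paths.
// evalTemplate is a hypothetical helper for illustration only.
function evalTemplate(template, opts) {
  return template.replace(/{([^}]+)}/g, (match, key) =>
    key in opts ? String(opts[key]) : match // unknown tokens are left untouched
  );
}

// e.g. a remote_path template like you might put in package.json's binary config:
const remotePath = evalTemplate('./{module_name}/v{version}/{configuration}/', {
  module_name: 'my-module',
  version: '1.0.0',
  configuration: 'Release'
});
console.log(remotePath); // → ./my-module/v1.0.0/Release/
```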
# Download binary files from a mirror

S3 is blocked in China, so users there may need to download binary files through a mirror.

Using the `npm` config argument `--{module_name}_binary_host_mirror` you can download binary files through a mirror; any `-` in `module_name` is replaced with `_`.

e.g.: Install [v8-profiler](https://www.npmjs.com/package/v8-profiler) from `npm`:

```bash
$ npm install v8-profiler --profiler_binary_host_mirror=https://npm.taobao.org/mirrors/node-inspector/
```

e.g.: Install [canvas-prebuilt](https://www.npmjs.com/package/canvas-prebuilt) from `npm`:

```bash
$ npm install canvas-prebuilt --canvas_prebuilt_binary_host_mirror=https://npm.taobao.org/mirrors/canvas-prebuilt/
```