Compare commits
42 Commits
activitypa
...
mytischten
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
36bf99c013 | ||
|
|
7549fb5730 | ||
|
|
1517d83f6c | ||
|
|
993e12d4a5 | ||
|
|
806cb527d4 | ||
|
|
7e9d2d2c4f | ||
|
|
ec9b92000e | ||
|
|
d110900e85 | ||
|
|
cd3c3502f6 | ||
|
|
ccce9bffac | ||
|
|
f1ba25f9f5 | ||
|
|
548f51ac54 | ||
|
|
946e4fce1e | ||
|
|
40dcd0e54c | ||
|
|
bd338b86df | ||
|
|
1d4aa43b02 | ||
|
|
cc08f4ba43 | ||
|
|
d0ccaa9e54 | ||
|
|
dc0eff4e4c | ||
|
|
db9e404372 | ||
|
|
60ac89636e | ||
|
|
2b1365339e | ||
|
|
0cf2351c79 | ||
|
|
5c32fad34e | ||
|
|
7f0b681e88 | ||
|
|
cc964da9cf | ||
|
|
dbede48d4f | ||
|
|
6cd3c3a020 | ||
|
|
7ecbef806d | ||
|
|
1c70ca97bb | ||
|
|
a6493990d3 | ||
|
|
f8f4d23c4e | ||
|
|
1ef1711eea | ||
|
|
85981a880d | ||
|
|
84503b6404 | ||
|
|
bcc3ce036d | ||
|
|
0fe0514660 | ||
|
|
431ec861ba | ||
|
|
648b608036 | ||
|
|
4ac71d967f | ||
|
|
75d304ec6d | ||
|
|
144034a305 |
0
.cursor/commands/oldfkchecks.md
Normal file
0
.cursor/commands/oldfkchecks.md
Normal file
212
backend/MYTISCHTENNIS_AUTO_FETCH_README.md
Normal file
212
backend/MYTISCHTENNIS_AUTO_FETCH_README.md
Normal file
@@ -0,0 +1,212 @@
|
||||
# MyTischtennis Automatischer Datenabruf
|
||||
|
||||
## Übersicht
|
||||
|
||||
Dieses System ermöglicht den automatischen Abruf von Spielergebnissen und Statistiken von myTischtennis.de.
|
||||
|
||||
## Scheduler
|
||||
|
||||
### 6:00 Uhr - Rating Updates
|
||||
- **Service:** `autoUpdateRatingsService.js`
|
||||
- **Funktion:** Aktualisiert TTR/QTTR-Werte für Spieler
|
||||
- **TODO:** Implementierung der eigentlichen Rating-Update-Logik
|
||||
|
||||
### 6:30 Uhr - Spielergebnisse
|
||||
- **Service:** `autoFetchMatchResultsService.js`
|
||||
- **Funktion:** Ruft Spielerbilanzen für konfigurierte Teams ab
|
||||
- **Status:** ✅ Grundlegende Implementierung fertig
|
||||
|
||||
## Benötigte Konfiguration
|
||||
|
||||
### 1. MyTischtennis-Account
|
||||
- Account muss in den MyTischtennis-Settings verknüpft sein
|
||||
- Checkbox "Automatische Updates" aktivieren
|
||||
- Passwort speichern (erforderlich für automatische Re-Authentifizierung)
|
||||
|
||||
### 2. League-Konfiguration
|
||||
|
||||
Für jede Liga müssen folgende Felder ausgefüllt werden:
|
||||
|
||||
```sql
|
||||
UPDATE league SET
|
||||
my_tischtennis_group_id = '504417', -- Group ID von myTischtennis
|
||||
association = 'HeTTV', -- Verband (z.B. HeTTV, DTTB)
|
||||
groupname = '1.Kreisklasse' -- Gruppenname für URL
|
||||
WHERE id = 1;
|
||||
```
|
||||
|
||||
**Beispiel-URL:**
|
||||
```
|
||||
https://www.mytischtennis.de/click-tt/HeTTV/25--26/ligen/1.Kreisklasse/gruppe/504417/...
|
||||
^^^^^ ^^^^^^^^^^^^^^ ^^^^^^
|
||||
association groupname group_id
|
||||
```
|
||||
|
||||
### 3. Team-Konfiguration
|
||||
|
||||
Für jedes Team muss die myTischtennis Team-ID gesetzt werden:
|
||||
|
||||
```sql
|
||||
UPDATE club_team SET
|
||||
my_tischtennis_team_id = '2995094' -- Team ID von myTischtennis
|
||||
WHERE id = 1;
|
||||
```
|
||||
|
||||
**Beispiel-URL:**
|
||||
```
|
||||
.../mannschaft/2995094/Harheimer_TC_(J11)/spielerbilanzen/gesamt
|
||||
^^^^^^^
|
||||
team_id
|
||||
```
|
||||
|
||||
### 4. Spieler-Zuordnung (Optional)
|
||||
|
||||
Spieler werden automatisch anhand des Namens zugeordnet. Für genauere Zuordnung kann die myTischtennis Player-ID gesetzt werden:
|
||||
|
||||
```sql
|
||||
UPDATE member SET
|
||||
my_tischtennis_player_id = 'NU2705037' -- Player ID von myTischtennis
|
||||
WHERE id = 1;
|
||||
```
|
||||
|
||||
## Migrationen
|
||||
|
||||
Folgende Migrationen müssen ausgeführt werden:
|
||||
|
||||
```bash
|
||||
# 1. MyTischtennis Auto-Update-Felder
|
||||
mysql -u root -p trainingstagebuch < backend/migrations/add_auto_update_ratings_to_my_tischtennis.sql
|
||||
|
||||
# 2. MyTischtennis Update-History-Tabelle
|
||||
mysql -u root -p trainingstagebuch < backend/migrations/create_my_tischtennis_update_history.sql
|
||||
|
||||
# 3. League MyTischtennis-Felder
|
||||
mysql -u root -p trainingstagebuch < backend/migrations/add_mytischtennis_fields_to_league.sql
|
||||
|
||||
# 4. Team MyTischtennis-ID
|
||||
mysql -u root -p trainingstagebuch < backend/migrations/add_mytischtennis_team_id_to_club_team.sql
|
||||
|
||||
# 5. Member MyTischtennis Player-ID
|
||||
mysql -u root -p trainingstagebuch < backend/migrations/add_mytischtennis_player_id_to_member.sql
|
||||
|
||||
# 6. Match Result-Felder
|
||||
mysql -u root -p trainingstagebuch < backend/migrations/add_match_result_fields.sql
|
||||
```
|
||||
|
||||
## Abgerufene Daten
|
||||
|
||||
Von der myTischtennis API werden folgende Daten abgerufen:
|
||||
|
||||
### Einzelstatistiken
|
||||
- Player ID, Vorname, Nachname
|
||||
- Gewonnene/Verlorene Punkte
|
||||
- Anzahl Spiele
|
||||
- Detaillierte Statistiken nach Gegner-Position
|
||||
|
||||
### Doppelstatistiken
|
||||
- Player IDs, Namen der beiden Spieler
|
||||
- Gewonnene/Verlorene Punkte
|
||||
- Anzahl Spiele
|
||||
|
||||
### Team-Informationen
|
||||
- Teamname, Liga, Saison
|
||||
- Gesamtpunkte (gewonnen/verloren)
|
||||
- Doppel- und Einzelpunkte
|
||||
|
||||
## Implementierungsdetails
|
||||
|
||||
### Datenfluss
|
||||
|
||||
1. **Scheduler** (6:30 Uhr):
|
||||
- `schedulerService.js` triggert `autoFetchMatchResultsService.executeAutomaticFetch()`
|
||||
|
||||
2. **Account-Verarbeitung**:
|
||||
- Lädt alle MyTischtennis-Accounts mit `autoUpdateRatings = true`
|
||||
- Prüft Session-Gültigkeit
|
||||
- Re-Authentifizierung bei abgelaufener Session
|
||||
|
||||
3. **Team-Abfrage**:
|
||||
- Lädt alle Teams mit konfigurierten myTischtennis-IDs
|
||||
- Baut API-URL dynamisch zusammen
|
||||
- Führt authentifizierten GET-Request durch
|
||||
|
||||
4. **Datenverarbeitung**:
|
||||
- Parst JSON-Response
|
||||
- Matched Spieler anhand von ID oder Name
|
||||
- Speichert myTischtennis Player-ID bei Mitgliedern
|
||||
- Loggt Statistiken
|
||||
|
||||
### Player-Matching-Algorithmus
|
||||
|
||||
```javascript
|
||||
1. Suche nach myTischtennis Player-ID (exakte Übereinstimmung)
|
||||
2. Falls nicht gefunden: Suche nach Name (case-insensitive)
|
||||
3. Falls gefunden: Speichere myTischtennis Player-ID für zukünftige Abfragen
|
||||
```
|
||||
|
||||
**Hinweis:** Da Namen verschlüsselt gespeichert werden, müssen für den Namens-Abgleich alle Members geladen und entschlüsselt werden. Dies ist bei großen Datenbanken ineffizient.
|
||||
|
||||
## TODO / Offene Punkte
|
||||
|
||||
### Noch zu implementieren:
|
||||
|
||||
1. **TTR/QTTR Updates** (6:00 Uhr Job):
|
||||
- Endpoint für TTR/QTTR-Daten identifizieren
|
||||
- Daten abrufen und in Member-Tabelle speichern
|
||||
|
||||
2. **Spielergebnis-Details**:
|
||||
- Einzelne Matches mit Satzständen speichern
|
||||
- Tabelle für Match-Historie erstellen
|
||||
|
||||
3. **History-Tabelle für Spielergebnis-Abrufe** (optional):
|
||||
- Ähnlich zu `my_tischtennis_update_history`
|
||||
- Speichert Erfolg/Fehler der Abrufe
|
||||
|
||||
4. **Benachrichtigungen** (optional):
|
||||
- Email/Push bei neuen Ergebnissen
|
||||
- Highlights für besondere Siege
|
||||
|
||||
5. **Performance-Optimierung**:
|
||||
- Caching für Player-Matches
|
||||
- Incremental Updates (nur neue Daten)
|
||||
|
||||
## Manueller Test
|
||||
|
||||
```javascript
|
||||
// Im Node-Backend-Code oder über API-Endpoint:
|
||||
import schedulerService from './services/schedulerService.js';
|
||||
|
||||
// Rating Updates manuell triggern
|
||||
await schedulerService.triggerRatingUpdates();
|
||||
|
||||
// Spielergebnisse manuell abrufen
|
||||
await schedulerService.triggerMatchResultsFetch();
|
||||
```
|
||||
|
||||
## API-Dokumentation
|
||||
|
||||
### MyTischtennis Spielerbilanzen-Endpoint
|
||||
|
||||
**URL-Format:**
|
||||
```
|
||||
https://www.mytischtennis.de/click-tt/{association}/{season}/ligen/{groupname}/gruppe/{groupId}/mannschaft/{teamId}/{teamname}/spielerbilanzen/gesamt?_data=routes%2Fclick-tt%2B%2F%24association%2B%2F%24season%2B%2F%24type%2B%2F%28%24groupname%29.gruppe.%24urlid_.mannschaft.%24teamid.%24teamname%2B%2Fspielerbilanzen.%24filter
|
||||
```
|
||||
|
||||
**Parameter:**
|
||||
- `{association}`: Verband (z.B. "HeTTV")
|
||||
- `{season}`: Saison im Format "25--26"
|
||||
- `{groupname}`: Gruppenname URL-encoded (z.B. "1.Kreisklasse")
|
||||
- `{groupId}`: Gruppen-ID (numerisch, z.B. "504417")
|
||||
- `{teamId}`: Team-ID (numerisch, z.B. "2995094")
|
||||
- `{teamname}`: Teamname URL-encoded mit Underscores (z.B. "Harheimer_TC_(J11)")
|
||||
|
||||
**Response:** JSON mit `data.balancesheet` Array
|
||||
|
||||
## Sicherheit
|
||||
|
||||
- ✅ Automatische Session-Verwaltung
|
||||
- ✅ Re-Authentifizierung bei abgelaufenen Sessions
|
||||
- ✅ Passwörter verschlüsselt gespeichert
|
||||
- ✅ Fehlerbehandlung und Logging
|
||||
- ✅ Graceful Degradation (einzelne Team-Fehler stoppen nicht den gesamten Prozess)
|
||||
|
||||
328
backend/MYTISCHTENNIS_URL_PARSER_README.md
Normal file
328
backend/MYTISCHTENNIS_URL_PARSER_README.md
Normal file
@@ -0,0 +1,328 @@
|
||||
# MyTischtennis URL Parser
|
||||
|
||||
## Übersicht
|
||||
|
||||
Der URL-Parser ermöglicht es, myTischtennis-Team-URLs automatisch zu parsen und die Konfiguration für automatische Datenabrufe vorzunehmen.
|
||||
|
||||
## Verwendung
|
||||
|
||||
### 1. URL Parsen
|
||||
|
||||
**Endpoint:** `POST /api/mytischtennis/parse-url`
|
||||
|
||||
**Request:**
|
||||
```json
|
||||
{
|
||||
"url": "https://www.mytischtennis.de/click-tt/HeTTV/25--26/ligen/1.Kreisklasse/gruppe/504417/mannschaft/2995094/Harheimer_TC_(J11)/spielerbilanzen/gesamt"
|
||||
}
|
||||
```
|
||||
|
||||
**Response:**
|
||||
```json
|
||||
{
|
||||
"success": true,
|
||||
"data": {
|
||||
"association": "HeTTV",
|
||||
"season": "25/26",
|
||||
"type": "ligen",
|
||||
"groupname": "1.Kreisklasse",
|
||||
"groupId": "504417",
|
||||
"teamId": "2995094",
|
||||
"teamname": "Harheimer TC (J11)",
|
||||
"originalUrl": "https://www.mytischtennis.de/click-tt/...",
|
||||
"clubId": "43030",
|
||||
"clubName": "Harheimer TC",
|
||||
"teamName": "Jugend 11",
|
||||
"leagueName": "Jugend 13 1. Kreisklasse",
|
||||
"region": "Frankfurt",
|
||||
"tableRank": 8,
|
||||
"matchesWon": 0,
|
||||
"matchesLost": 3
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### 2. Team Automatisch Konfigurieren
|
||||
|
||||
**Endpoint:** `POST /api/mytischtennis/configure-team`
|
||||
|
||||
**Request:**
|
||||
```json
|
||||
{
|
||||
"url": "https://www.mytischtennis.de/click-tt/HeTTV/25--26/ligen/1.Kreisklasse/gruppe/504417/mannschaft/2995094/Harheimer_TC_(J11)/spielerbilanzen/gesamt",
|
||||
"clubTeamId": 1,
|
||||
"createLeague": false,
|
||||
"createSeason": false
|
||||
}
|
||||
```
|
||||
|
||||
**Parameter:**
|
||||
- `url` (required): Die myTischtennis-URL
|
||||
- `clubTeamId` (required): Die ID des lokalen Club-Teams
|
||||
- `createLeague` (optional): Wenn `true`, wird eine neue League erstellt
|
||||
- `createSeason` (optional): Wenn `true`, wird eine neue Season erstellt
|
||||
|
||||
**Response:**
|
||||
```json
|
||||
{
|
||||
"success": true,
|
||||
"message": "Team configured successfully",
|
||||
"data": {
|
||||
"team": {
|
||||
"id": 1,
|
||||
"name": "Jugend 11",
|
||||
"myTischtennisTeamId": "2995094"
|
||||
},
|
||||
"league": {
|
||||
"id": 5,
|
||||
"name": "Jugend 13 1. Kreisklasse",
|
||||
"myTischtennisGroupId": "504417",
|
||||
"association": "HeTTV",
|
||||
"groupname": "1.Kreisklasse"
|
||||
},
|
||||
"season": {
|
||||
"id": 2,
|
||||
"name": "25/26"
|
||||
},
|
||||
"parsedData": { ... }
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### 3. URL für Team Abrufen
|
||||
|
||||
**Endpoint:** `GET /api/mytischtennis/team-url/:teamId`
|
||||
|
||||
**Response:**
|
||||
```json
|
||||
{
|
||||
"success": true,
|
||||
"url": "https://www.mytischtennis.de/click-tt/HeTTV/25--26/ligen/1.Kreisklasse/gruppe/504417/mannschaft/2995094/Harheimer%20TC%20%28J11%29/spielerbilanzen/gesamt"
|
||||
}
|
||||
```
|
||||
|
||||
## URL-Format
|
||||
|
||||
### Unterstützte URL-Muster
|
||||
|
||||
```
|
||||
https://www.mytischtennis.de/click-tt/{association}/{season}/{type}/{groupname}/gruppe/{groupId}/mannschaft/{teamId}/{teamname}/...
|
||||
```
|
||||
|
||||
**Komponenten:**
|
||||
- `{association}`: Verband (z.B. "HeTTV", "DTTB", "WestD")
|
||||
- `{season}`: Saison im Format "YY--YY" (z.B. "25--26" für 2025/2026)
|
||||
- `{type}`: Typ (meist "ligen")
|
||||
- `{groupname}`: Gruppenname URL-encoded (z.B. "1.Kreisklasse", "Kreisliga")
|
||||
- `{groupId}`: Numerische Gruppen-ID (z.B. "504417")
|
||||
- `{teamId}`: Numerische Team-ID (z.B. "2995094")
|
||||
- `{teamname}`: Teamname URL-encoded mit Underscores (z.B. "Harheimer_TC_(J11)")
|
||||
|
||||
### Beispiel-URLs
|
||||
|
||||
**Spielerbilanzen:**
|
||||
```
|
||||
https://www.mytischtennis.de/click-tt/HeTTV/25--26/ligen/1.Kreisklasse/gruppe/504417/mannschaft/2995094/Harheimer_TC_(J11)/spielerbilanzen/gesamt
|
||||
```
|
||||
|
||||
**Spielplan:**
|
||||
```
|
||||
https://www.mytischtennis.de/click-tt/HeTTV/25--26/ligen/1.Kreisklasse/gruppe/504417/mannschaft/2995094/Harheimer_TC_(J11)/spielplan
|
||||
```
|
||||
|
||||
**Tabelle:**
|
||||
```
|
||||
https://www.mytischtennis.de/click-tt/HeTTV/25--26/ligen/1.Kreisklasse/gruppe/504417/mannschaft/2995094/Harheimer_TC_(J11)/tabelle
|
||||
```
|
||||
|
||||
## Datenfluss
|
||||
|
||||
### Ohne MyTischtennis-Login
|
||||
|
||||
1. URL wird geparst
|
||||
2. Nur URL-Komponenten werden extrahiert
|
||||
3. Zusätzliche Daten (clubName, leagueName, etc.) sind nicht verfügbar
|
||||
|
||||
### Mit MyTischtennis-Login
|
||||
|
||||
1. URL wird geparst
|
||||
2. API-Request an myTischtennis mit Authentication
|
||||
3. Vollständige Team-Daten werden abgerufen
|
||||
4. Alle Felder sind verfügbar
|
||||
|
||||
## Frontend-Integration
|
||||
|
||||
### Vue.js Beispiel
|
||||
|
||||
```javascript
|
||||
<template>
|
||||
<div>
|
||||
<input
|
||||
v-model="myTischtennisUrl"
|
||||
placeholder="MyTischtennis URL einfügen..."
|
||||
@blur="parseUrl"
|
||||
/>
|
||||
|
||||
<div v-if="parsedData">
|
||||
<h3>{{ parsedData.teamname }}</h3>
|
||||
<p>Liga: {{ parsedData.leagueName }}</p>
|
||||
<p>Verband: {{ parsedData.association }}</p>
|
||||
<p>Tabelle: Platz {{ parsedData.tableRank }}</p>
|
||||
|
||||
<button @click="configureTeam">Team konfigurieren</button>
|
||||
</div>
|
||||
</div>
|
||||
</template>
|
||||
|
||||
<script>
|
||||
export default {
|
||||
data() {
|
||||
return {
|
||||
myTischtennisUrl: '',
|
||||
parsedData: null
|
||||
};
|
||||
},
|
||||
methods: {
|
||||
async parseUrl() {
|
||||
if (!this.myTischtennisUrl) return;
|
||||
|
||||
try {
|
||||
const response = await fetch('/api/mytischtennis/parse-url', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
'userid': this.userId,
|
||||
'authcode': this.authCode
|
||||
},
|
||||
body: JSON.stringify({
|
||||
url: this.myTischtennisUrl
|
||||
})
|
||||
});
|
||||
|
||||
const result = await response.json();
|
||||
this.parsedData = result.data;
|
||||
} catch (error) {
|
||||
console.error('Fehler beim Parsen:', error);
|
||||
alert('URL konnte nicht geparst werden');
|
||||
}
|
||||
},
|
||||
|
||||
async configureTeam() {
|
||||
try {
|
||||
const response = await fetch('/api/mytischtennis/configure-team', {
|
||||
method: 'POST',
|
||||
headers: {
|
||||
'Content-Type': 'application/json',
|
||||
'userid': this.userId,
|
||||
'authcode': this.authCode
|
||||
},
|
||||
body: JSON.stringify({
|
||||
url: this.myTischtennisUrl,
|
||||
clubTeamId: this.selectedTeamId,
|
||||
createLeague: false,
|
||||
createSeason: true
|
||||
})
|
||||
});
|
||||
|
||||
const result = await response.json();
|
||||
|
||||
if (result.success) {
|
||||
alert('Team erfolgreich konfiguriert!');
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Fehler bei Konfiguration:', error);
|
||||
alert('Team konnte nicht konfiguriert werden');
|
||||
}
|
||||
}
|
||||
}
|
||||
};
|
||||
</script>
|
||||
```
|
||||
|
||||
## Workflow
|
||||
|
||||
### Empfohlener Workflow für Benutzer
|
||||
|
||||
1. **MyTischtennis-URL kopieren:**
|
||||
- Auf myTischtennis.de zum Team navigieren
|
||||
- URL aus Adresszeile kopieren
|
||||
|
||||
2. **URL in Trainingstagebuch einfügen:**
|
||||
- Zu Team-Verwaltung navigieren
|
||||
- URL einfügen
|
||||
- Automatisches Parsen
|
||||
|
||||
3. **Konfiguration überprüfen:**
|
||||
- Geparste Daten werden angezeigt
|
||||
- Benutzer kann Daten überprüfen und bei Bedarf anpassen
|
||||
|
||||
4. **Team konfigurieren:**
|
||||
- Auf "Konfigurieren" klicken
|
||||
- System speichert alle benötigten IDs
|
||||
- Automatischer Datenabruf ist ab sofort aktiv
|
||||
|
||||
## Fehlerbehandlung
|
||||
|
||||
### Häufige Fehler
|
||||
|
||||
**"Invalid myTischtennis URL format"**
|
||||
- URL entspricht nicht dem erwarteten Format
|
||||
- Lösung: Vollständige URL von der Spielerbilanzen-Seite kopieren
|
||||
|
||||
**"Season not found"**
|
||||
- Saison existiert noch nicht in der Datenbank
|
||||
- Lösung: `createSeason: true` setzen
|
||||
|
||||
**"Team has no league assigned"**
|
||||
- Team hat keine verknüpfte Liga
|
||||
- Lösung: `createLeague: true` setzen oder Liga manuell zuweisen
|
||||
|
||||
**"HTTP 401: Unauthorized"**
|
||||
- MyTischtennis-Login abgelaufen oder nicht vorhanden
|
||||
- Lösung: In MyTischtennis-Settings erneut anmelden
|
||||
|
||||
## Sicherheit
|
||||
|
||||
- ✅ Alle Endpoints erfordern Authentifizierung
|
||||
- ✅ UserID wird aus Header-Parameter gelesen
|
||||
- ✅ MyTischtennis-Credentials werden sicher gespeichert
|
||||
- ✅ Keine sensiblen Daten in URLs
|
||||
|
||||
## Technische Details
|
||||
|
||||
### Service: `myTischtennisUrlParserService`
|
||||
|
||||
**Methoden:**
|
||||
- `parseUrl(url)` - Parst URL und extrahiert Komponenten
|
||||
- `fetchTeamData(parsedUrl, cookie, accessToken)` - Ruft zusätzliche Daten ab
|
||||
- `getCompleteConfig(url, cookie, accessToken)` - Kombination aus Parsen + Abrufen
|
||||
- `isValidTeamUrl(url)` - Validiert URL-Format
|
||||
- `buildUrl(config)` - Baut URL aus Komponenten
|
||||
|
||||
### Controller: `myTischtennisUrlController`
|
||||
|
||||
**Endpoints:**
|
||||
- `POST /api/mytischtennis/parse-url` - URL parsen
|
||||
- `POST /api/mytischtennis/configure-team` - Team konfigurieren
|
||||
- `GET /api/mytischtennis/team-url/:teamId` - URL abrufen
|
||||
|
||||
## Zukünftige Erweiterungen
|
||||
|
||||
### Geplante Features
|
||||
|
||||
1. **Bulk-Import:**
|
||||
- Mehrere URLs gleichzeitig importieren
|
||||
- Alle Teams einer Liga auf einmal konfigurieren
|
||||
|
||||
2. **Auto-Discovery:**
|
||||
- Automatisches Finden aller Teams eines Vereins
|
||||
- Vorschläge für ähnliche Teams
|
||||
|
||||
3. **Validierung:**
|
||||
- Prüfung, ob Team bereits konfiguriert ist
|
||||
- Warnung bei Duplikaten
|
||||
|
||||
4. **History:**
|
||||
- Speichern der URL-Konfigurationen
|
||||
- Versionierung bei Änderungen
|
||||
|
||||
273
backend/clients/myTischtennisClient.js
Normal file
273
backend/clients/myTischtennisClient.js
Normal file
@@ -0,0 +1,273 @@
|
||||
import axios from 'axios';
|
||||
|
||||
const BASE_URL = 'https://www.mytischtennis.de';
|
||||
|
||||
class MyTischtennisClient {
|
||||
constructor() {
|
||||
this.baseURL = BASE_URL;
|
||||
this.client = axios.create({
|
||||
baseURL: this.baseURL,
|
||||
timeout: 10000,
|
||||
headers: {
|
||||
'Content-Type': 'application/x-www-form-urlencoded',
|
||||
'Accept': '*/*'
|
||||
},
|
||||
maxRedirects: 0, // Don't follow redirects automatically
|
||||
validateStatus: (status) => status >= 200 && status < 400 // Accept 3xx as success
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Login to myTischtennis API
|
||||
* @param {string} email - myTischtennis email (not username!)
|
||||
* @param {string} password - myTischtennis password
|
||||
* @returns {Promise<Object>} Login response with token and session data
|
||||
*/
|
||||
async login(email, password) {
|
||||
try {
|
||||
// Create form data
|
||||
const formData = new URLSearchParams();
|
||||
formData.append('email', email);
|
||||
formData.append('password', password);
|
||||
formData.append('intent', 'login');
|
||||
|
||||
const response = await this.client.post(
|
||||
'/login?next=%2F&_data=routes%2F_auth%2B%2Flogin',
|
||||
formData.toString()
|
||||
);
|
||||
|
||||
// Extract the cookie from response headers
|
||||
const setCookie = response.headers['set-cookie'];
|
||||
if (!setCookie || !Array.isArray(setCookie)) {
|
||||
return {
|
||||
success: false,
|
||||
error: 'Keine Session-Cookie erhalten'
|
||||
};
|
||||
}
|
||||
|
||||
// Find the sb-10-auth-token cookie
|
||||
const authCookie = setCookie.find(cookie => cookie.startsWith('sb-10-auth-token='));
|
||||
if (!authCookie) {
|
||||
return {
|
||||
success: false,
|
||||
error: 'Kein Auth-Token in Response gefunden'
|
||||
};
|
||||
}
|
||||
|
||||
// Extract and decode the token
|
||||
const tokenMatch = authCookie.match(/sb-10-auth-token=base64-([^;]+)/);
|
||||
if (!tokenMatch) {
|
||||
return {
|
||||
success: false,
|
||||
error: 'Token-Format ungültig'
|
||||
};
|
||||
}
|
||||
|
||||
const base64Token = tokenMatch[1];
|
||||
let tokenData;
|
||||
try {
|
||||
const decodedToken = Buffer.from(base64Token, 'base64').toString('utf-8');
|
||||
tokenData = JSON.parse(decodedToken);
|
||||
} catch (decodeError) {
|
||||
console.error('Error decoding token:', decodeError);
|
||||
return {
|
||||
success: false,
|
||||
error: 'Token konnte nicht dekodiert werden'
|
||||
};
|
||||
}
|
||||
|
||||
return {
|
||||
success: true,
|
||||
accessToken: tokenData.access_token,
|
||||
refreshToken: tokenData.refresh_token,
|
||||
expiresAt: tokenData.expires_at,
|
||||
expiresIn: tokenData.expires_in,
|
||||
user: tokenData.user,
|
||||
cookie: authCookie.split(';')[0] // Just the cookie value without attributes
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('MyTischtennis login error:', error.message);
|
||||
return {
|
||||
success: false,
|
||||
error: error.response?.data?.message || 'Login fehlgeschlagen',
|
||||
status: error.response?.status || 500
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Verify login credentials
|
||||
* @param {string} email - myTischtennis email
|
||||
* @param {string} password - myTischtennis password
|
||||
* @returns {Promise<boolean>} True if credentials are valid
|
||||
*/
|
||||
async verifyCredentials(email, password) {
|
||||
const result = await this.login(email, password);
|
||||
return result.success;
|
||||
}
|
||||
|
||||
/**
|
||||
* Make an authenticated request
|
||||
* @param {string} endpoint - API endpoint
|
||||
* @param {string} cookie - Authentication cookie (sb-10-auth-token)
|
||||
* @param {Object} options - Additional axios options
|
||||
* @returns {Promise<Object>} API response
|
||||
*/
|
||||
async authenticatedRequest(endpoint, cookie, options = {}) {
|
||||
try {
|
||||
const response = await this.client.request({
|
||||
url: endpoint,
|
||||
...options,
|
||||
headers: {
|
||||
...options.headers,
|
||||
'Cookie': cookie,
|
||||
'Accept': '*/*',
|
||||
'Accept-Language': 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7',
|
||||
'Referer': 'https://www.mytischtennis.de/',
|
||||
'sec-fetch-dest': 'empty',
|
||||
'sec-fetch-mode': 'cors',
|
||||
'sec-fetch-site': 'same-origin'
|
||||
}
|
||||
});
|
||||
return {
|
||||
success: true,
|
||||
data: response.data
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('MyTischtennis API error:', error.message);
|
||||
return {
|
||||
success: false,
|
||||
error: error.response?.data?.message || 'API-Anfrage fehlgeschlagen',
|
||||
status: error.response?.status || 500
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get user profile and club information
|
||||
* @param {string} cookie - Authentication cookie (sb-10-auth-token)
|
||||
* @returns {Promise<Object>} User profile with club info
|
||||
*/
|
||||
async getUserProfile(cookie) {
|
||||
|
||||
const result = await this.authenticatedRequest('/?_data=root', cookie, {
|
||||
method: 'GET'
|
||||
});
|
||||
|
||||
|
||||
if (result.success) {
|
||||
console.log('[getUserProfile] - Response structure:', {
|
||||
hasUserProfile: !!result.data?.userProfile,
|
||||
hasClub: !!result.data?.userProfile?.club,
|
||||
hasOrganization: !!result.data?.userProfile?.organization,
|
||||
clubnr: result.data?.userProfile?.club?.clubnr,
|
||||
clubName: result.data?.userProfile?.club?.name,
|
||||
orgShort: result.data?.userProfile?.organization?.short,
|
||||
ttr: result.data?.userProfile?.ttr,
|
||||
qttr: result.data?.userProfile?.qttr
|
||||
});
|
||||
|
||||
|
||||
return {
|
||||
success: true,
|
||||
clubId: result.data?.userProfile?.club?.clubnr || null,
|
||||
clubName: result.data?.userProfile?.club?.name || null,
|
||||
fedNickname: result.data?.userProfile?.organization?.short || null,
|
||||
ttr: result.data?.userProfile?.ttr || null,
|
||||
qttr: result.data?.userProfile?.qttr || null,
|
||||
userProfile: result.data?.userProfile || null
|
||||
};
|
||||
}
|
||||
|
||||
console.error('[getUserProfile] - Failed:', result.error);
|
||||
return result;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get club rankings (andro-Rangliste)
|
||||
* @param {string} cookie - Authentication cookie
|
||||
* @param {string} clubId - Club number (e.g., "43030")
|
||||
* @param {string} fedNickname - Federation nickname (e.g., "HeTTV")
|
||||
* @returns {Promise<Object>} Rankings with player entries (all pages)
|
||||
*/
|
||||
async getClubRankings(cookie, clubId, fedNickname) {
|
||||
const allEntries = [];
|
||||
let currentPage = 0;
|
||||
let hasMorePages = true;
|
||||
|
||||
|
||||
while (hasMorePages) {
|
||||
const endpoint = `/rankings/andro-rangliste?all-players=on&clubnr=${clubId}&fednickname=${fedNickname}&results-per-page=100&page=${currentPage}&_data=routes%2F%24`;
|
||||
|
||||
|
||||
const result = await this.authenticatedRequest(endpoint, cookie, {
|
||||
method: 'GET'
|
||||
});
|
||||
|
||||
if (!result.success) {
|
||||
console.error(`[getClubRankings] - Failed to fetch page ${currentPage}:`, result.error);
|
||||
return result;
|
||||
}
|
||||
|
||||
// Find the dynamic key that contains entries
|
||||
const blockLoaderData = result.data?.pageContent?.blockLoaderData;
|
||||
if (!blockLoaderData) {
|
||||
console.error('[getClubRankings] - No blockLoaderData found');
|
||||
return {
|
||||
success: false,
|
||||
error: 'Keine blockLoaderData gefunden'
|
||||
};
|
||||
}
|
||||
|
||||
// Finde den Schlüssel, der entries enthält
|
||||
let entries = null;
|
||||
let rankingData = null;
|
||||
|
||||
for (const key in blockLoaderData) {
|
||||
if (blockLoaderData[key]?.entries) {
|
||||
entries = blockLoaderData[key].entries;
|
||||
rankingData = blockLoaderData[key];
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (!entries) {
|
||||
console.error('[getClubRankings] - No entries found in blockLoaderData');
|
||||
return {
|
||||
success: false,
|
||||
error: 'Keine entries in blockLoaderData gefunden'
|
||||
};
|
||||
}
|
||||
|
||||
|
||||
// Füge Entries hinzu
|
||||
allEntries.push(...entries);
|
||||
|
||||
// Prüfe ob es weitere Seiten gibt
|
||||
// Wenn die aktuelle Seite weniger Einträge hat als das Limit, sind wir am Ende
|
||||
// Oder wenn wir alle erwarteten Einträge haben
|
||||
if (entries.length === 0) {
|
||||
hasMorePages = false;
|
||||
} else if (rankingData.numberOfPages && currentPage >= rankingData.numberOfPages - 1) {
|
||||
hasMorePages = false;
|
||||
} else if (allEntries.length >= rankingData.resultLength) {
|
||||
hasMorePages = false;
|
||||
} else {
|
||||
currentPage++;
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
return {
|
||||
success: true,
|
||||
entries: allEntries,
|
||||
metadata: {
|
||||
totalEntries: allEntries.length,
|
||||
pagesFetched: currentPage + 1
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
export default new MyTischtennisClient();
|
||||
|
||||
128
backend/controllers/clubTeamController.js
Normal file
128
backend/controllers/clubTeamController.js
Normal file
@@ -0,0 +1,128 @@
|
||||
import ClubTeamService from '../services/clubTeamService.js';
|
||||
import { getUserByToken } from '../utils/userUtils.js';
|
||||
import { devLog } from '../utils/logger.js';
|
||||
|
||||
export const getClubTeams = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubid: clubId } = req.params;
|
||||
const { seasonid: seasonId } = req.query;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
// Check if user has access to this club
|
||||
const clubTeams = await ClubTeamService.getAllClubTeamsByClub(clubId, seasonId);
|
||||
|
||||
res.status(200).json(clubTeams);
|
||||
} catch (error) {
|
||||
console.error('[getClubTeams] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const getClubTeam = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubteamid: clubTeamId } = req.params;
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const clubTeam = await ClubTeamService.getClubTeamById(clubTeamId);
|
||||
if (!clubTeam) {
|
||||
return res.status(404).json({ error: "notfound" });
|
||||
}
|
||||
|
||||
res.status(200).json(clubTeam);
|
||||
} catch (error) {
|
||||
console.error('[getClubTeam] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const createClubTeam = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubid: clubId } = req.params;
|
||||
const { name, leagueId, seasonId } = req.body;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
if (!name) {
|
||||
return res.status(400).json({ error: "missingname" });
|
||||
}
|
||||
|
||||
const clubTeamData = {
|
||||
name,
|
||||
clubId: parseInt(clubId),
|
||||
leagueId: leagueId ? parseInt(leagueId) : null,
|
||||
seasonId: seasonId ? parseInt(seasonId) : null
|
||||
};
|
||||
|
||||
const newClubTeam = await ClubTeamService.createClubTeam(clubTeamData);
|
||||
|
||||
res.status(201).json(newClubTeam);
|
||||
} catch (error) {
|
||||
console.error('[createClubTeam] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const updateClubTeam = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubteamid: clubTeamId } = req.params;
|
||||
const { name, leagueId, seasonId } = req.body;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const updateData = {};
|
||||
if (name !== undefined) updateData.name = name;
|
||||
if (leagueId !== undefined) updateData.leagueId = leagueId ? parseInt(leagueId) : null;
|
||||
if (seasonId !== undefined) updateData.seasonId = seasonId ? parseInt(seasonId) : null;
|
||||
|
||||
const success = await ClubTeamService.updateClubTeam(clubTeamId, updateData);
|
||||
if (!success) {
|
||||
return res.status(404).json({ error: "notfound" });
|
||||
}
|
||||
|
||||
const updatedClubTeam = await ClubTeamService.getClubTeamById(clubTeamId);
|
||||
res.status(200).json(updatedClubTeam);
|
||||
} catch (error) {
|
||||
console.error('[updateClubTeam] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const deleteClubTeam = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubteamid: clubTeamId } = req.params;
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const success = await ClubTeamService.deleteClubTeam(clubTeamId);
|
||||
if (!success) {
|
||||
return res.status(404).json({ error: "notfound" });
|
||||
}
|
||||
|
||||
res.status(200).json({ message: "Club team deleted successfully" });
|
||||
} catch (error) {
|
||||
console.error('[deleteClubTeam] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const getLeagues = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubid: clubId } = req.params;
|
||||
const { seasonid: seasonId } = req.query;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const leagues = await ClubTeamService.getLeaguesByClub(clubId, seasonId);
|
||||
|
||||
res.status(200).json(leagues);
|
||||
} catch (error) {
|
||||
console.error('[getLeagues] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
@@ -1,77 +1,57 @@
|
||||
import ClubService from '../services/clubService.js';
|
||||
import { getUserByToken } from '../utils/userUtils.js';
|
||||
import { devLog } from '../utils/logger.js';
|
||||
|
||||
export const getClubs = async (req, res) => {
|
||||
try {
|
||||
console.log('[getClubs] - get clubs');
|
||||
const clubs = await ClubService.getAllClubs();
|
||||
console.log('[getClubs] - prepare response');
|
||||
res.status(200).json(clubs);
|
||||
console.log('[getClubs] - done');
|
||||
} catch (error) {
|
||||
console.log('[getClubs] - error');
|
||||
console.log(error);
|
||||
console.error('[getClubs] - error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const addClub = async (req, res) => {
|
||||
console.log('[addClub] - Read out parameters');
|
||||
const { authcode: token } = req.headers;
|
||||
const { name: clubName } = req.body;
|
||||
|
||||
try {
|
||||
console.log('[addClub] - find club by name');
|
||||
const club = await ClubService.findClubByName(clubName);
|
||||
console.log('[addClub] - get user');
|
||||
const user = await getUserByToken(token);
|
||||
console.log('[addClub] - check if club already exists');
|
||||
if (club) {
|
||||
res.status(409).json({ error: "alreadyexists" });
|
||||
return;
|
||||
}
|
||||
|
||||
console.log('[addClub] - create club');
|
||||
const newClub = await ClubService.createClub(clubName);
|
||||
console.log('[addClub] - add user to new club');
|
||||
await ClubService.addUserToClub(user.id, newClub.id);
|
||||
console.log('[addClub] - prepare response');
|
||||
res.status(200).json(newClub);
|
||||
console.log('[addClub] - done');
|
||||
} catch (error) {
|
||||
console.log('[addClub] - error');
|
||||
console.log(error);
|
||||
console.error('[addClub] - error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const getClub = async (req, res) => {
|
||||
console.log('[getClub] - start');
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubid: clubId } = req.params;
|
||||
console.log('[getClub] - get user');
|
||||
const user = await getUserByToken(token);
|
||||
console.log('[getClub] - get users club');
|
||||
const access = await ClubService.getUserClubAccess(user.id, clubId);
|
||||
console.log('[getClub] - check access');
|
||||
if (access.length === 0 || !access[0].approved) {
|
||||
res.status(403).json({ error: "noaccess", status: access.length === 0 ? "notrequested" : "requested" });
|
||||
return;
|
||||
}
|
||||
|
||||
console.log('[getClub] - get club');
|
||||
const club = await ClubService.findClubById(clubId);
|
||||
console.log('[getClub] - check club exists');
|
||||
if (!club) {
|
||||
return res.status(404).json({ message: 'Club not found' });
|
||||
}
|
||||
|
||||
console.log('[getClub] - set response');
|
||||
res.status(200).json(club);
|
||||
console.log('[getClub] - done');
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
console.error('[getClub] - error:', error);
|
||||
res.status(500).json({ message: 'Server error' });
|
||||
}
|
||||
};
|
||||
@@ -82,7 +62,6 @@ export const requestClubAccess = async (req, res) => {
|
||||
|
||||
try {
|
||||
const user = await getUserByToken(token);
|
||||
console.log(user);
|
||||
|
||||
await ClubService.requestAccessToClub(user.id, clubId);
|
||||
res.status(200).json({});
|
||||
@@ -92,6 +71,7 @@ export const requestClubAccess = async (req, res) => {
|
||||
} else if (error.message === 'clubnotfound') {
|
||||
res.status(404).json({ err: "clubnotfound" });
|
||||
} else {
|
||||
console.error('[requestClubAccess] - error:', error);
|
||||
res.status(500).json({ err: "internalerror" });
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
import diaryService from '../services/diaryService.js';
|
||||
import HttpError from '../exceptions/HttpError.js';
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
const getDatesForClub = async (req, res) => {
|
||||
try {
|
||||
const { clubId } = req.params;
|
||||
@@ -38,7 +39,7 @@ const updateTrainingTimes = async (req, res) => {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { dateId, trainingStart, trainingEnd } = req.body;
|
||||
if (!dateId || !trainingStart) {
|
||||
console.log(dateId, trainingStart, trainingEnd);
|
||||
devLog(dateId, trainingStart, trainingEnd);
|
||||
throw new HttpError('notallfieldsfilled', 400);
|
||||
}
|
||||
const updatedDate = await diaryService.updateTrainingTimes(userToken, clubId, dateId, trainingStart, trainingEnd);
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
import diaryDateActivityService from '../services/diaryDateActivityService.js';
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
export const createDiaryDateActivity = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
@@ -15,7 +16,7 @@ export const createDiaryDateActivity = async (req, res) => {
|
||||
});
|
||||
res.status(201).json(activityItem);
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
devLog(error);
|
||||
res.status(500).json({ error: 'Error creating activity' });
|
||||
}
|
||||
};
|
||||
@@ -58,7 +59,7 @@ export const updateDiaryDateActivityOrder = async (req, res) => {
|
||||
const updatedActivity = await diaryDateActivityService.updateActivityOrder(userToken, clubId, id, orderId);
|
||||
res.status(200).json(updatedActivity);
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
devLog(error);
|
||||
res.status(500).json({ error: 'Error updating activity order' });
|
||||
}
|
||||
};
|
||||
@@ -70,7 +71,7 @@ export const getDiaryDateActivities = async (req, res) => {
|
||||
const activities = await diaryDateActivityService.getActivities(userToken, clubId, diaryDateId);
|
||||
res.status(200).json(activities);
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
devLog(error);
|
||||
res.status(500).json({ error: 'Error getting activities' });
|
||||
}
|
||||
}
|
||||
@@ -82,7 +83,7 @@ export const addGroupActivity = async(req, res) => {
|
||||
const activityItem = await diaryDateActivityService.addGroupActivity(userToken, clubId, diaryDateId, groupId, activity);
|
||||
res.status(201).json(activityItem);
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
devLog(error);
|
||||
res.status(500).json({ error: 'Error adding group activity' });
|
||||
}
|
||||
}
|
||||
@@ -1,7 +1,8 @@
|
||||
import diaryDateTagService from "../services/diaryDateTagService.js"
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
export const getDiaryDateMemberTags = async (req, res) => {
|
||||
console.log("getDiaryDateMemberTags");
|
||||
devLog("getDiaryDateMemberTags");
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId, memberId } = req.params;
|
||||
@@ -14,7 +15,7 @@ export const getDiaryDateMemberTags = async (req, res) => {
|
||||
}
|
||||
|
||||
export const addDiaryDateTag = async (req, res) => {
|
||||
console.log("addDiaryDateTag");
|
||||
devLog("addDiaryDateTag");
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId } = req.params;
|
||||
|
||||
@@ -1,11 +1,12 @@
|
||||
import DiaryMemberService from '../services/diaryMemberService.js';
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
const getMemberTags = async (req, res) => {
|
||||
try {
|
||||
const { diaryDateId, memberId } = req.query;
|
||||
const { clubId } = req.params;
|
||||
const { authcode: userToken } = req.headers;
|
||||
console.log(diaryDateId, memberId, clubId);
|
||||
devLog(diaryDateId, memberId, clubId);
|
||||
const tags = await DiaryMemberService.getTagsForMemberAndDate(userToken, clubId, diaryDateId, memberId);
|
||||
res.status(200).json(tags);
|
||||
} catch (error) {
|
||||
@@ -19,7 +20,7 @@ const getMemberNotes = async (req, res) => {
|
||||
const { diaryDateId, memberId } = req.query;
|
||||
const { clubId } = req.params;
|
||||
const { authcode: userToken } = req.headers;
|
||||
console.log('---------->', userToken, clubId);
|
||||
devLog('---------->', userToken, clubId);
|
||||
const notes = await DiaryMemberService.getNotesForMember(userToken, clubId, diaryDateId, memberId);
|
||||
res.status(200).json(notes);
|
||||
} catch (error) {
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
import { DiaryTag, DiaryDateTag } from '../models/index.js';
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
export const getTags = async (req, res) => {
|
||||
try {
|
||||
const tags = await DiaryTag.findAll();
|
||||
@@ -12,11 +13,10 @@ export const getTags = async (req, res) => {
|
||||
export const createTag = async (req, res) => {
|
||||
try {
|
||||
const { name } = req.body;
|
||||
console.log(name);
|
||||
devLog(name);
|
||||
const newTag = await DiaryTag.findOrCreate({ where: { name }, defaults: { name } });
|
||||
res.status(201).json(newTag);
|
||||
} catch (error) {
|
||||
console.log('[createTag] - Error:', error);
|
||||
res.status(500).json({ error: 'Error creating tag' });
|
||||
}
|
||||
};
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
import HttpError from '../exceptions/HttpError.js';
|
||||
import groupService from '../services/groupService.js';
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
const addGroup = async(req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
@@ -9,7 +10,7 @@ const addGroup = async(req, res) => {
|
||||
res.status(201).json(result);
|
||||
} catch (error) {
|
||||
console.error('[addGroup] - Error:', error);
|
||||
console.log(req.params, req.headers, req.body)
|
||||
devLog(req.params, req.headers, req.body)
|
||||
res.status(error.statusCode || 500).json({ error: error.message });
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
import MatchService from '../services/matchService.js';
|
||||
import fs from 'fs';
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
export const uploadCSV = async (req, res) => {
|
||||
try {
|
||||
const { clubId } = req.body;
|
||||
@@ -21,10 +22,11 @@ export const uploadCSV = async (req, res) => {
|
||||
|
||||
export const getLeaguesForCurrentSeason = async (req, res) => {
|
||||
try {
|
||||
console.log(req.headers, req.params);
|
||||
devLog(req.headers, req.params);
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId } = req.params;
|
||||
const leagues = await MatchService.getLeaguesForCurrentSeason(userToken, clubId);
|
||||
const { seasonid: seasonId } = req.query;
|
||||
const leagues = await MatchService.getLeaguesForCurrentSeason(userToken, clubId, seasonId);
|
||||
return res.status(200).json(leagues);
|
||||
} catch (error) {
|
||||
console.error('Error retrieving leagues:', error);
|
||||
@@ -36,7 +38,8 @@ export const getMatchesForLeagues = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId } = req.params;
|
||||
const matches = await MatchService.getMatchesForLeagues(userToken, clubId);
|
||||
const { seasonid: seasonId } = req.query;
|
||||
const matches = await MatchService.getMatchesForLeagues(userToken, clubId, seasonId);
|
||||
return res.status(200).json(matches);
|
||||
} catch (error) {
|
||||
console.error('Error retrieving matches:', error);
|
||||
@@ -55,3 +58,47 @@ export const getMatchesForLeague = async (req, res) => {
|
||||
return res.status(500).json({ error: 'Failed to retrieve matches' });
|
||||
}
|
||||
};
|
||||
|
||||
export const getLeagueTable = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId, leagueId } = req.params;
|
||||
const table = await MatchService.getLeagueTable(userToken, clubId, leagueId);
|
||||
return res.status(200).json(table);
|
||||
} catch (error) {
|
||||
console.error('Error retrieving league table:', error);
|
||||
return res.status(500).json({ error: 'Failed to retrieve league table' });
|
||||
}
|
||||
};
|
||||
|
||||
export const fetchLeagueTableFromMyTischtennis = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId, leagueId } = req.params;
|
||||
const { userid: userIdOrEmail } = req.headers;
|
||||
|
||||
// Convert email to userId if needed
|
||||
let userId = userIdOrEmail;
|
||||
if (isNaN(userIdOrEmail)) {
|
||||
const User = (await import('../models/User.js')).default;
|
||||
const user = await User.findOne({ where: { email: userIdOrEmail } });
|
||||
if (!user) {
|
||||
return res.status(404).json({ error: 'User not found' });
|
||||
}
|
||||
userId = user.id;
|
||||
}
|
||||
|
||||
const autoFetchService = (await import('../services/autoFetchMatchResultsService.js')).default;
|
||||
await autoFetchService.fetchAndUpdateLeagueTable(userId, leagueId);
|
||||
|
||||
// Return updated table data
|
||||
const table = await MatchService.getLeagueTable(userToken, clubId, leagueId);
|
||||
return res.status(200).json({
|
||||
message: 'League table updated from MyTischtennis',
|
||||
data: table
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error fetching league table from MyTischtennis:', error);
|
||||
return res.status(500).json({ error: 'Failed to fetch league table from MyTischtennis' });
|
||||
}
|
||||
};
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
import MemberService from "../services/memberService.js";
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
const getClubMembers = async(req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
@@ -9,24 +10,17 @@ const getClubMembers = async(req, res) => {
|
||||
}
|
||||
res.status(200).json(await MemberService.getClubMembers(userToken, clubId, showAll));
|
||||
} catch(error) {
|
||||
console.log('[getClubMembers] - Error: ', error);
|
||||
res.status(500).json({ error: 'systemerror' });
|
||||
}
|
||||
}
|
||||
|
||||
const getWaitingApprovals = async(req, res) => {
|
||||
try {
|
||||
console.log('[getWaitingApprovals] - Start');
|
||||
const { id: clubId } = req.params;
|
||||
console.log('[getWaitingApprovals] - get token');
|
||||
const { authcode: userToken } = req.headers;
|
||||
console.log('[getWaitingApprovals] - load for waiting approvals');
|
||||
const waitingApprovals = await MemberService.getApprovalRequests(userToken, clubId);
|
||||
console.log('[getWaitingApprovals] - set response');
|
||||
res.status(200).json(waitingApprovals);
|
||||
console.log('[getWaitingApprovals] - done');
|
||||
} catch(error) {
|
||||
console.log('[getWaitingApprovals] - Error: ', error);
|
||||
res.status(403).json({ error: error });
|
||||
}
|
||||
}
|
||||
@@ -34,11 +28,11 @@ const getWaitingApprovals = async(req, res) => {
|
||||
const setClubMembers = async (req, res) => {
|
||||
try {
|
||||
const { id: memberId, firstname: firstName, lastname: lastName, street, city, birthdate, phone, email, active,
|
||||
testMembership, picsInInternetAllowed, gender } = req.body;
|
||||
testMembership, picsInInternetAllowed, gender, ttr, qttr } = req.body;
|
||||
const { id: clubId } = req.params;
|
||||
const { authcode: userToken } = req.headers;
|
||||
const addResult = await MemberService.setClubMember(userToken, clubId, memberId, firstName, lastName, street, city, birthdate,
|
||||
phone, email, active, testMembership, picsInInternetAllowed, gender);
|
||||
phone, email, active, testMembership, picsInInternetAllowed, gender, ttr, qttr);
|
||||
res.status(addResult.status || 500).json(addResult.response);
|
||||
} catch (error) {
|
||||
console.error('[setClubMembers] - Error:', error);
|
||||
@@ -59,7 +53,6 @@ const uploadMemberImage = async (req, res) => {
|
||||
};
|
||||
|
||||
const getMemberImage = async (req, res) => {
|
||||
console.log('[getMemberImage]');
|
||||
try {
|
||||
const { clubId, memberId } = req.params;
|
||||
const { authcode: userToken } = req.headers;
|
||||
@@ -75,4 +68,37 @@ const getMemberImage = async (req, res) => {
|
||||
}
|
||||
};
|
||||
|
||||
export { getClubMembers, getWaitingApprovals, setClubMembers, uploadMemberImage, getMemberImage };
|
||||
const updateRatingsFromMyTischtennis = async (req, res) => {
|
||||
try {
|
||||
const { id: clubId } = req.params;
|
||||
const { authcode: userToken } = req.headers;
|
||||
const result = await MemberService.updateRatingsFromMyTischtennis(userToken, clubId);
|
||||
res.status(result.status).json(result.response);
|
||||
} catch (error) {
|
||||
console.error('[updateRatingsFromMyTischtennis] - Error:', error);
|
||||
res.status(500).json({ error: 'Failed to update ratings' });
|
||||
}
|
||||
};
|
||||
|
||||
const rotateMemberImage = async (req, res) => {
|
||||
try {
|
||||
const { clubId, memberId } = req.params;
|
||||
const { direction } = req.body;
|
||||
const { authcode: userToken } = req.headers;
|
||||
|
||||
if (!direction || !['left', 'right'].includes(direction)) {
|
||||
return res.status(400).json({
|
||||
success: false,
|
||||
error: 'Ungültige Drehrichtung. Verwenden Sie "left" oder "right".'
|
||||
});
|
||||
}
|
||||
|
||||
const result = await MemberService.rotateMemberImage(userToken, clubId, memberId, direction);
|
||||
res.status(result.status).json(result.response);
|
||||
} catch (error) {
|
||||
console.error('[rotateMemberImage] - Error:', error);
|
||||
res.status(500).json({ success: false, error: 'Failed to rotate image' });
|
||||
}
|
||||
};
|
||||
|
||||
export { getClubMembers, getWaitingApprovals, setClubMembers, uploadMemberImage, getMemberImage, updateRatingsFromMyTischtennis, rotateMemberImage };
|
||||
@@ -1,15 +1,14 @@
|
||||
import MemberNoteService from "../services/memberNoteService.js";
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
const getMemberNotes = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { memberId } = req.params;
|
||||
const { clubId } = req.query;
|
||||
console.log('[getMemberNotes]', userToken, memberId, clubId);
|
||||
const notes = await MemberNoteService.getNotesForMember(userToken, clubId, memberId);
|
||||
res.status(200).json(notes);
|
||||
} catch (error) {
|
||||
console.log('[getMemberNotes] - Error: ', error);
|
||||
res.status(500).json({ error: 'systemerror' });
|
||||
}
|
||||
};
|
||||
@@ -18,12 +17,10 @@ const addMemberNote = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { memberId, content, clubId } = req.body;
|
||||
console.log('[addMemberNote]', userToken, memberId, content, clubId);
|
||||
await MemberNoteService.addNoteToMember(userToken, clubId, memberId, content);
|
||||
const notes = await MemberNoteService.getNotesForMember(userToken, clubId, memberId);
|
||||
res.status(201).json(notes);
|
||||
} catch (error) {
|
||||
console.log('[addMemberNote] - Error: ', error);
|
||||
res.status(500).json({ error: 'systemerror' });
|
||||
}
|
||||
};
|
||||
@@ -33,13 +30,11 @@ const deleteMemberNote = async (req, res) => {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { noteId } = req.params;
|
||||
const { clubId } = req.body;
|
||||
console.log('[deleteMemberNote]', userToken, noteId, clubId);
|
||||
const memberId = await MemberNoteService.getMemberIdForNote(noteId); // Member ID ermitteln
|
||||
await MemberNoteService.deleteNoteForMember(userToken, clubId, noteId);
|
||||
const notes = await MemberNoteService.getNotesForMember(userToken, clubId, memberId);
|
||||
res.status(200).json(notes);
|
||||
} catch (error) {
|
||||
console.log('[deleteMemberNote] - Error: ', error);
|
||||
res.status(500).json({ error: 'systemerror' });
|
||||
}
|
||||
};
|
||||
|
||||
205
backend/controllers/myTischtennisController.js
Normal file
205
backend/controllers/myTischtennisController.js
Normal file
@@ -0,0 +1,205 @@
|
||||
import myTischtennisService from '../services/myTischtennisService.js';
|
||||
import HttpError from '../exceptions/HttpError.js';
|
||||
|
||||
class MyTischtennisController {
|
||||
/**
|
||||
* GET /api/mytischtennis/account
|
||||
* Get current user's myTischtennis account
|
||||
*/
|
||||
async getAccount(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const account = await myTischtennisService.getAccount(userId);
|
||||
|
||||
if (!account) {
|
||||
return res.status(200).json({ account: null });
|
||||
}
|
||||
|
||||
res.status(200).json({ account });
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* GET /api/mytischtennis/status
|
||||
* Check account configuration status
|
||||
*/
|
||||
async getStatus(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const status = await myTischtennisService.checkAccountStatus(userId);
|
||||
res.status(200).json(status);
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* POST /api/mytischtennis/account
|
||||
* Create or update myTischtennis account
|
||||
*/
|
||||
async upsertAccount(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const { email, password, savePassword, autoUpdateRatings, userPassword } = req.body;
|
||||
|
||||
if (!email) {
|
||||
throw new HttpError(400, 'E-Mail-Adresse erforderlich');
|
||||
}
|
||||
|
||||
// Wenn ein Passwort gesetzt wird, muss das App-Passwort angegeben werden
|
||||
if (password && !userPassword) {
|
||||
throw new HttpError(400, 'App-Passwort erforderlich zum Setzen des myTischtennis-Passworts');
|
||||
}
|
||||
|
||||
const account = await myTischtennisService.upsertAccount(
|
||||
userId,
|
||||
email,
|
||||
password,
|
||||
savePassword || false,
|
||||
autoUpdateRatings || false,
|
||||
userPassword
|
||||
);
|
||||
|
||||
res.status(200).json({
|
||||
message: 'myTischtennis-Account erfolgreich gespeichert',
|
||||
account
|
||||
});
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* DELETE /api/mytischtennis/account
|
||||
* Delete myTischtennis account
|
||||
*/
|
||||
async deleteAccount(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const deleted = await myTischtennisService.deleteAccount(userId);
|
||||
|
||||
if (!deleted) {
|
||||
throw new HttpError(404, 'Kein myTischtennis-Account gefunden');
|
||||
}
|
||||
|
||||
res.status(200).json({ message: 'myTischtennis-Account gelöscht' });
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* POST /api/mytischtennis/verify
|
||||
* Verify login credentials
|
||||
*/
|
||||
async verifyLogin(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const { password } = req.body;
|
||||
|
||||
const result = await myTischtennisService.verifyLogin(userId, password);
|
||||
|
||||
res.status(200).json({
|
||||
message: 'Login erfolgreich',
|
||||
success: true,
|
||||
accessToken: result.accessToken,
|
||||
expiresAt: result.expiresAt,
|
||||
clubId: result.clubId,
|
||||
clubName: result.clubName
|
||||
});
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* GET /api/mytischtennis/session
|
||||
* Get stored session data for authenticated requests
|
||||
*/
|
||||
async getSession(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const session = await myTischtennisService.getSession(userId);
|
||||
|
||||
res.status(200).json({ session });
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* GET /api/mytischtennis/update-history
|
||||
* Get update ratings history
|
||||
*/
|
||||
async getUpdateHistory(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const history = await myTischtennisService.getUpdateHistory(userId);
|
||||
res.status(200).json({ history });
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get fetch logs for current user
|
||||
*/
|
||||
async getFetchLogs(req, res, next) {
|
||||
try {
|
||||
const { userid: userIdOrEmail } = req.headers;
|
||||
|
||||
// Convert email to userId if needed
|
||||
let userId = userIdOrEmail;
|
||||
if (isNaN(userIdOrEmail)) {
|
||||
const User = (await import('../models/User.js')).default;
|
||||
const user = await User.findOne({ where: { email: userIdOrEmail } });
|
||||
if (!user) {
|
||||
return res.status(404).json({ error: 'User not found' });
|
||||
}
|
||||
userId = user.id;
|
||||
}
|
||||
|
||||
const fetchLogService = (await import('../services/myTischtennisFetchLogService.js')).default;
|
||||
const logs = await fetchLogService.getFetchLogs(userId, {
|
||||
limit: req.query.limit ? parseInt(req.query.limit) : 50,
|
||||
fetchType: req.query.type
|
||||
});
|
||||
|
||||
res.status(200).json({ logs });
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get latest successful fetches for each type
|
||||
*/
|
||||
async getLatestFetches(req, res, next) {
|
||||
try {
|
||||
const { userid: userIdOrEmail } = req.headers;
|
||||
|
||||
// Convert email to userId if needed
|
||||
let userId = userIdOrEmail;
|
||||
if (isNaN(userIdOrEmail)) {
|
||||
const User = (await import('../models/User.js')).default;
|
||||
const user = await User.findOne({ where: { email: userIdOrEmail } });
|
||||
if (!user) {
|
||||
return res.status(404).json({ error: 'User not found' });
|
||||
}
|
||||
userId = user.id;
|
||||
}
|
||||
|
||||
const fetchLogService = (await import('../services/myTischtennisFetchLogService.js')).default;
|
||||
const latestFetches = await fetchLogService.getLatestSuccessfulFetches(userId);
|
||||
|
||||
res.status(200).json({ latestFetches });
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
export default new MyTischtennisController();
|
||||
|
||||
477
backend/controllers/myTischtennisUrlController.js
Normal file
@@ -0,0 +1,477 @@
|
||||
import myTischtennisUrlParserService from '../services/myTischtennisUrlParserService.js';
|
||||
import myTischtennisService from '../services/myTischtennisService.js';
|
||||
import autoFetchMatchResultsService from '../services/autoFetchMatchResultsService.js';
|
||||
import ClubTeam from '../models/ClubTeam.js';
|
||||
import League from '../models/League.js';
|
||||
import Season from '../models/Season.js';
|
||||
import User from '../models/User.js';
|
||||
import HttpError from '../exceptions/HttpError.js';
|
||||
import { devLog } from '../utils/logger.js';
|
||||
|
||||
class MyTischtennisUrlController {
|
||||
/**
|
||||
* Parse myTischtennis URL and return configuration data
|
||||
* POST /api/mytischtennis/parse-url
|
||||
* Body: { url: string }
|
||||
*/
|
||||
async parseUrl(req, res, next) {
|
||||
try {
|
||||
const { url } = req.body;
|
||||
|
||||
if (!url) {
|
||||
throw new HttpError(400, 'URL is required');
|
||||
}
|
||||
|
||||
// Validate URL
|
||||
if (!myTischtennisUrlParserService.isValidTeamUrl(url)) {
|
||||
throw new HttpError(400, 'Invalid myTischtennis URL format');
|
||||
}
|
||||
|
||||
// Parse URL
|
||||
const parsedData = myTischtennisUrlParserService.parseUrl(url);
|
||||
|
||||
// Try to fetch additional data if user is authenticated
|
||||
const userIdOrEmail = req.headers.userid;
|
||||
let completeData = parsedData;
|
||||
|
||||
if (userIdOrEmail) {
|
||||
// Get actual user ID
|
||||
let userId = userIdOrEmail;
|
||||
if (isNaN(userIdOrEmail)) {
|
||||
const user = await User.findOne({ where: { email: userIdOrEmail } });
|
||||
if (user) userId = user.id;
|
||||
}
|
||||
|
||||
try {
|
||||
const account = await myTischtennisService.getAccount(userId);
|
||||
|
||||
if (account && account.accessToken) {
|
||||
completeData = await myTischtennisUrlParserService.fetchTeamData(
|
||||
parsedData,
|
||||
account.cookie,
|
||||
account.accessToken
|
||||
);
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Error fetching additional team data:', error);
|
||||
// Continue with parsed data only
|
||||
}
|
||||
}
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
data: completeData
|
||||
});
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Configure team from myTischtennis URL
|
||||
* POST /api/mytischtennis/configure-team
|
||||
* Body: { url: string, clubTeamId: number, createLeague?: boolean, createSeason?: boolean }
|
||||
*/
|
||||
async configureTeam(req, res, next) {
|
||||
try {
|
||||
const { url, clubTeamId, createLeague, createSeason } = req.body;
|
||||
const userIdOrEmail = req.headers.userid;
|
||||
|
||||
if (!url || !clubTeamId) {
|
||||
throw new HttpError(400, 'URL and clubTeamId are required');
|
||||
}
|
||||
|
||||
// Get actual user ID
|
||||
let userId = userIdOrEmail;
|
||||
if (isNaN(userIdOrEmail)) {
|
||||
const user = await User.findOne({ where: { email: userIdOrEmail } });
|
||||
if (!user) {
|
||||
throw new HttpError(404, 'User not found');
|
||||
}
|
||||
userId = user.id;
|
||||
}
|
||||
|
||||
// Parse URL
|
||||
const parsedData = myTischtennisUrlParserService.parseUrl(url);
|
||||
|
||||
// Try to fetch additional data
|
||||
let completeData = parsedData;
|
||||
const account = await myTischtennisService.getAccount(userId);
|
||||
|
||||
if (account && account.accessToken) {
|
||||
try {
|
||||
completeData = await myTischtennisUrlParserService.fetchTeamData(
|
||||
parsedData,
|
||||
account.cookie,
|
||||
account.accessToken
|
||||
);
|
||||
} catch (error) {
|
||||
console.error('Error fetching team data:', error);
|
||||
}
|
||||
}
|
||||
|
||||
// Find or create season
|
||||
let season = await Season.findOne({
|
||||
where: { season: completeData.season }
|
||||
});
|
||||
|
||||
if (!season && createSeason) {
|
||||
season = await Season.create({
|
||||
season: completeData.season
|
||||
});
|
||||
}
|
||||
|
||||
if (!season) {
|
||||
throw new HttpError(404, `Season ${completeData.season} not found. Set createSeason=true to create it.`);
|
||||
}
|
||||
|
||||
// Find or create league
|
||||
const team = await ClubTeam.findByPk(clubTeamId);
|
||||
if (!team) {
|
||||
throw new HttpError(404, 'Club team not found');
|
||||
}
|
||||
|
||||
let league;
|
||||
|
||||
// First, try to find existing league by name and season
|
||||
const leagueName = completeData.leagueName || completeData.groupname;
|
||||
league = await League.findOne({
|
||||
where: {
|
||||
name: leagueName,
|
||||
seasonId: season.id,
|
||||
clubId: team.clubId
|
||||
}
|
||||
});
|
||||
|
||||
if (league) {
|
||||
devLog(`Found existing league: ${league.name} (ID: ${league.id})`);
|
||||
// Update myTischtennis fields
|
||||
await league.update({
|
||||
myTischtennisGroupId: completeData.groupId,
|
||||
association: completeData.association,
|
||||
groupname: completeData.groupname
|
||||
});
|
||||
} else if (team.leagueId) {
|
||||
// Team has a league assigned, update it
|
||||
league = await League.findByPk(team.leagueId);
|
||||
|
||||
if (league) {
|
||||
devLog(`Updating team's existing league: ${league.name} (ID: ${league.id})`);
|
||||
await league.update({
|
||||
name: leagueName,
|
||||
myTischtennisGroupId: completeData.groupId,
|
||||
association: completeData.association,
|
||||
groupname: completeData.groupname
|
||||
});
|
||||
}
|
||||
} else if (createLeague) {
|
||||
// Create new league
|
||||
devLog(`Creating new league: ${leagueName}`);
|
||||
league = await League.create({
|
||||
name: leagueName,
|
||||
seasonId: season.id,
|
||||
clubId: team.clubId,
|
||||
myTischtennisGroupId: completeData.groupId,
|
||||
association: completeData.association,
|
||||
groupname: completeData.groupname
|
||||
});
|
||||
} else {
|
||||
throw new HttpError(400, 'League not found and team has no league assigned. Set createLeague=true to create one.');
|
||||
}
|
||||
|
||||
// Update team
|
||||
await team.update({
|
||||
myTischtennisTeamId: completeData.teamId,
|
||||
leagueId: league.id,
|
||||
seasonId: season.id
|
||||
});
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
message: 'Team configured successfully',
|
||||
data: {
|
||||
team: {
|
||||
id: team.id,
|
||||
name: team.name,
|
||||
myTischtennisTeamId: completeData.teamId
|
||||
},
|
||||
league: {
|
||||
id: league.id,
|
||||
name: league.name,
|
||||
myTischtennisGroupId: completeData.groupId,
|
||||
association: completeData.association,
|
||||
groupname: completeData.groupname
|
||||
},
|
||||
season: {
|
||||
id: season.id,
|
||||
name: season.season
|
||||
},
|
||||
parsedData: completeData
|
||||
}
|
||||
});
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Manually fetch team data from myTischtennis
|
||||
* POST /api/mytischtennis/fetch-team-data
|
||||
* Body: { clubTeamId: number }
|
||||
*/
|
||||
async fetchTeamData(req, res, next) {
|
||||
try {
|
||||
const { clubTeamId } = req.body;
|
||||
const userIdOrEmail = req.headers.userid;
|
||||
|
||||
if (!clubTeamId) {
|
||||
throw new HttpError(400, 'clubTeamId is required');
|
||||
}
|
||||
|
||||
// Get actual user ID (userid header might be email address)
|
||||
let userId = userIdOrEmail;
|
||||
if (isNaN(userIdOrEmail)) {
|
||||
// It's an email, find the user
|
||||
const user = await User.findOne({ where: { email: userIdOrEmail } });
|
||||
if (!user) {
|
||||
throw new HttpError(404, 'User not found');
|
||||
}
|
||||
userId = user.id;
|
||||
}
|
||||
|
||||
// Get myTischtennis session (similar to memberService.updateRatingsFromMyTischtennis)
|
||||
console.log('Fetching session for userId:', userId, '(from header:', userIdOrEmail, ')');
|
||||
let session;
|
||||
|
||||
try {
|
||||
session = await myTischtennisService.getSession(userId);
|
||||
console.log('Session found:', !!session);
|
||||
} catch (sessionError) {
|
||||
console.log('Session invalid, attempting login...', sessionError.message);
|
||||
|
||||
// Versuche automatischen Login mit gespeicherten Credentials
|
||||
try {
|
||||
await myTischtennisService.verifyLogin(userId);
|
||||
session = await myTischtennisService.getSession(userId);
|
||||
console.log('Automatic login successful');
|
||||
} catch (loginError) {
|
||||
console.error('Automatic login failed:', loginError.message);
|
||||
throw new HttpError(401, 'MyTischtennis-Session abgelaufen und automatischer Login fehlgeschlagen. Bitte melden Sie sich in den MyTischtennis-Einstellungen an.');
|
||||
}
|
||||
}
|
||||
|
||||
// Get account data (for clubId, etc.)
|
||||
const account = await myTischtennisService.getAccount(userId);
|
||||
|
||||
if (!account) {
|
||||
throw new HttpError(404, 'MyTischtennis-Account nicht verknüpft. Bitte verknüpfen Sie Ihren Account in den MyTischtennis-Einstellungen.');
|
||||
}
|
||||
|
||||
console.log('Using session:', {
|
||||
email: account.email,
|
||||
hasCookie: !!session.cookie,
|
||||
hasAccessToken: !!session.accessToken,
|
||||
expiresAt: new Date(session.expiresAt * 1000)
|
||||
});
|
||||
|
||||
// Get team with league and season
|
||||
const team = await ClubTeam.findByPk(clubTeamId, {
|
||||
include: [
|
||||
{
|
||||
model: League,
|
||||
as: 'league',
|
||||
include: [
|
||||
{
|
||||
model: Season,
|
||||
as: 'season'
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
});
|
||||
|
||||
if (!team) {
|
||||
throw new HttpError(404, 'Team not found');
|
||||
}
|
||||
|
||||
console.log('Team data:', {
|
||||
id: team.id,
|
||||
name: team.name,
|
||||
myTischtennisTeamId: team.myTischtennisTeamId,
|
||||
hasLeague: !!team.league,
|
||||
leagueData: team.league ? {
|
||||
id: team.league.id,
|
||||
name: team.league.name,
|
||||
myTischtennisGroupId: team.league.myTischtennisGroupId,
|
||||
association: team.league.association,
|
||||
groupname: team.league.groupname,
|
||||
hasSeason: !!team.league.season
|
||||
} : null
|
||||
});
|
||||
|
||||
if (!team.myTischtennisTeamId || !team.league || !team.league.myTischtennisGroupId) {
|
||||
throw new HttpError(400, 'Team is not configured for myTischtennis');
|
||||
}
|
||||
|
||||
// Fetch data for this specific team
|
||||
const startTime = Date.now();
|
||||
let matchResultsSuccess = false;
|
||||
let tableUpdateSuccess = false;
|
||||
let matchResultsCount = 0;
|
||||
let tableUpdateCount = 0;
|
||||
|
||||
try {
|
||||
const result = await autoFetchMatchResultsService.fetchTeamResults(
|
||||
{
|
||||
userId: account.userId,
|
||||
email: account.email,
|
||||
cookie: session.cookie,
|
||||
accessToken: session.accessToken,
|
||||
expiresAt: session.expiresAt,
|
||||
getPassword: () => null // Not needed for manual fetch
|
||||
},
|
||||
team
|
||||
);
|
||||
|
||||
matchResultsSuccess = true;
|
||||
matchResultsCount = result.fetchedCount || 0;
|
||||
|
||||
// Log match results fetch
|
||||
const fetchLogService = (await import('../services/myTischtennisFetchLogService.js')).default;
|
||||
await fetchLogService.logFetch(
|
||||
account.userId,
|
||||
'match_results',
|
||||
true,
|
||||
`${matchResultsCount} Spielergebnisse erfolgreich abgerufen`,
|
||||
{
|
||||
recordsProcessed: matchResultsCount,
|
||||
executionTime: Date.now() - startTime,
|
||||
isAutomatic: false
|
||||
}
|
||||
);
|
||||
} catch (error) {
|
||||
console.error('Error fetching match results:', error);
|
||||
const fetchLogService = (await import('../services/myTischtennisFetchLogService.js')).default;
|
||||
await fetchLogService.logFetch(
|
||||
account.userId,
|
||||
'match_results',
|
||||
false,
|
||||
'Fehler beim Abrufen der Spielergebnisse',
|
||||
{
|
||||
errorDetails: error.message,
|
||||
executionTime: Date.now() - startTime,
|
||||
isAutomatic: false
|
||||
}
|
||||
);
|
||||
}
|
||||
|
||||
// Also fetch and update league table data
|
||||
let tableUpdateResult = null;
|
||||
const tableStartTime = Date.now();
|
||||
try {
|
||||
await autoFetchMatchResultsService.fetchAndUpdateLeagueTable(account.userId, team.league.id);
|
||||
tableUpdateResult = 'League table updated successfully';
|
||||
tableUpdateSuccess = true;
|
||||
tableUpdateCount = 1; // One table updated
|
||||
console.log('✓ League table updated for league:', team.league.id);
|
||||
|
||||
// Log league table fetch
|
||||
const fetchLogService = (await import('../services/myTischtennisFetchLogService.js')).default;
|
||||
await fetchLogService.logFetch(
|
||||
account.userId,
|
||||
'league_table',
|
||||
true,
|
||||
'Ligatabelle erfolgreich aktualisiert',
|
||||
{
|
||||
recordsProcessed: tableUpdateCount,
|
||||
executionTime: Date.now() - tableStartTime,
|
||||
isAutomatic: false
|
||||
}
|
||||
);
|
||||
} catch (error) {
|
||||
console.error('Error fetching league table data:', error);
|
||||
tableUpdateResult = 'League table update failed: ' + error.message;
|
||||
|
||||
// Log league table fetch failure
|
||||
const fetchLogService = (await import('../services/myTischtennisFetchLogService.js')).default;
|
||||
await fetchLogService.logFetch(
|
||||
account.userId,
|
||||
'league_table',
|
||||
false,
|
||||
'Fehler beim Aktualisieren der Ligatabelle',
|
||||
{
|
||||
errorDetails: error.message,
|
||||
executionTime: Date.now() - tableStartTime,
|
||||
isAutomatic: false
|
||||
}
|
||||
);
|
||||
// Don't fail the entire request if table update fails
|
||||
}
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
message: `${matchResultsCount} Datensätze abgerufen und verarbeitet`,
|
||||
data: {
|
||||
fetchedCount: matchResultsCount,
|
||||
teamName: team.name,
|
||||
tableUpdate: tableUpdateResult
|
||||
}
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Error in fetchTeamData:', error);
|
||||
console.error('Error stack:', error.stack);
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get myTischtennis URL for a team
|
||||
* GET /api/mytischtennis/team-url/:teamId
|
||||
*/
|
||||
async getTeamUrl(req, res, next) {
|
||||
try {
|
||||
const { teamId } = req.params;
|
||||
|
||||
const team = await ClubTeam.findByPk(teamId, {
|
||||
include: [
|
||||
{
|
||||
model: League,
|
||||
as: 'league',
|
||||
include: [
|
||||
{
|
||||
model: Season,
|
||||
as: 'season'
|
||||
}
|
||||
]
|
||||
}
|
||||
]
|
||||
});
|
||||
|
||||
if (!team) {
|
||||
throw new HttpError(404, 'Team not found');
|
||||
}
|
||||
|
||||
if (!team.myTischtennisTeamId || !team.league || !team.league.myTischtennisGroupId) {
|
||||
throw new HttpError(400, 'Team is not configured for myTischtennis');
|
||||
}
|
||||
|
||||
const url = myTischtennisUrlParserService.buildUrl({
|
||||
association: team.league.association,
|
||||
season: team.league.season.name,
|
||||
groupname: team.league.groupname,
|
||||
groupId: team.league.myTischtennisGroupId,
|
||||
teamId: team.myTischtennisTeamId,
|
||||
teamname: team.name
|
||||
});
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
url
|
||||
});
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
export default new MyTischtennisUrlController();
|
||||
@@ -1,12 +1,13 @@
 import Participant from '../models/Participant.js';
 
+import { devLog } from '../utils/logger.js';
 export const getParticipants = async (req, res) => {
   try {
     const { dateId } = req.params;
     const participants = await Participant.findAll({ where: { diaryDateId: dateId } });
     res.status(200).json(participants);
   } catch (error) {
-    console.log(error);
+    devLog(error);
     res.status(500).json({ error: 'Fehler beim Abrufen der Teilnehmer' });
   }
 };
@@ -17,7 +18,7 @@ export const addParticipant = async (req, res) => {
     const participant = await Participant.create({ diaryDateId, memberId });
     res.status(201).json(participant);
   } catch (error) {
-    console.log(error);
+    devLog(error);
     res.status(500).json({ error: 'Fehler beim Hinzufügen des Teilnehmers' });
   }
 };
@@ -28,7 +29,7 @@ export const removeParticipant = async (req, res) => {
     await Participant.destroy({ where: { diaryDateId, memberId } });
     res.status(200).json({ message: 'Teilnehmer entfernt' });
   } catch (error) {
-    console.log(error);
+    devLog(error);
     res.status(500).json({ error: 'Fehler beim Entfernen des Teilnehmers' });
   }
 };

@@ -5,6 +5,7 @@ import path from 'path';
 import fs from 'fs';
 import sharp from 'sharp';
 
+import { devLog } from '../utils/logger.js';
 export const uploadPredefinedActivityImage = async (req, res) => {
   try {
     const { id } = req.params; // predefinedActivityId
@@ -35,7 +36,6 @@ export const uploadPredefinedActivityImage = async (req, res) => {
 
     // Extrahiere Zeichnungsdaten aus dem Request
     const drawingData = req.body.drawingData ? JSON.parse(req.body.drawingData) : null;
-    console.log('[uploadPredefinedActivityImage] - drawingData:', drawingData);
 
     const imageRecord = await PredefinedActivityImage.create({
       predefinedActivityId: id,

103
backend/controllers/seasonController.js
Normal file
@@ -0,0 +1,103 @@
|
||||
import SeasonService from '../services/seasonService.js';
|
||||
import { getUserByToken } from '../utils/userUtils.js';
|
||||
import { devLog } from '../utils/logger.js';
|
||||
|
||||
export const getSeasons = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const seasons = await SeasonService.getAllSeasons();
|
||||
|
||||
res.status(200).json(seasons);
|
||||
} catch (error) {
|
||||
console.error('[getSeasons] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const getCurrentSeason = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const season = await SeasonService.getOrCreateCurrentSeason();
|
||||
|
||||
res.status(200).json(season);
|
||||
} catch (error) {
|
||||
console.error('[getCurrentSeason] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const createSeason = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { season } = req.body;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
if (!season) {
|
||||
return res.status(400).json({ error: "missingseason" });
|
||||
}
|
||||
|
||||
// Validiere Saison-Format (z.B. "2023/2024")
|
||||
const seasonRegex = /^\d{4}\/\d{4}$/;
|
||||
if (!seasonRegex.test(season)) {
|
||||
return res.status(400).json({ error: "invalidseasonformat" });
|
||||
}
|
||||
|
||||
const newSeason = await SeasonService.createSeason(season);
|
||||
|
||||
res.status(201).json(newSeason);
|
||||
} catch (error) {
|
||||
console.error('[createSeason] - Error:', error);
|
||||
if (error.message === 'Season already exists') {
|
||||
res.status(409).json({ error: "alreadyexists" });
|
||||
} else {
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
export const getSeason = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { seasonid: seasonId } = req.params;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const season = await SeasonService.getSeasonById(seasonId);
|
||||
if (!season) {
|
||||
return res.status(404).json({ error: "notfound" });
|
||||
}
|
||||
|
||||
res.status(200).json(season);
|
||||
} catch (error) {
|
||||
console.error('[getSeason] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const deleteSeason = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { seasonid: seasonId } = req.params;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const success = await SeasonService.deleteSeason(seasonId);
|
||||
if (!success) {
|
||||
return res.status(404).json({ error: "notfound" });
|
||||
}
|
||||
|
||||
res.status(200).json({ message: "deleted" });
|
||||
} catch (error) {
|
||||
console.error('[deleteSeason] - Error:', error);
|
||||
if (error.message === 'Season is used by teams' || error.message === 'Season is used by leagues') {
|
||||
res.status(409).json({ error: "seasoninuse" });
|
||||
} else {
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
}
|
||||
};
|
||||
130
backend/controllers/teamController.js
Normal file
@@ -0,0 +1,130 @@
|
||||
import TeamService from '../services/teamService.js';
|
||||
import { getUserByToken } from '../utils/userUtils.js';
|
||||
import { devLog } from '../utils/logger.js';
|
||||
|
||||
export const getTeams = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubid: clubId } = req.params;
|
||||
const { seasonid: seasonId } = req.query;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
// Check if user has access to this club
|
||||
const teams = await TeamService.getAllTeamsByClub(clubId, seasonId);
|
||||
|
||||
res.status(200).json(teams);
|
||||
} catch (error) {
|
||||
console.error('[getTeams] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const getTeam = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { teamid: teamId } = req.params;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const team = await TeamService.getTeamById(teamId);
|
||||
if (!team) {
|
||||
return res.status(404).json({ error: "notfound" });
|
||||
}
|
||||
|
||||
res.status(200).json(team);
|
||||
} catch (error) {
|
||||
console.error('[getTeam] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const createTeam = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubid: clubId } = req.params;
|
||||
const { name, leagueId, seasonId } = req.body;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
if (!name) {
|
||||
return res.status(400).json({ error: "missingname" });
|
||||
}
|
||||
|
||||
const teamData = {
|
||||
name,
|
||||
clubId: parseInt(clubId),
|
||||
leagueId: leagueId ? parseInt(leagueId) : null,
|
||||
seasonId: seasonId ? parseInt(seasonId) : null
|
||||
};
|
||||
|
||||
const newTeam = await TeamService.createTeam(teamData);
|
||||
|
||||
res.status(201).json(newTeam);
|
||||
} catch (error) {
|
||||
console.error('[createTeam] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const updateTeam = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { teamid: teamId } = req.params;
|
||||
const { name, leagueId, seasonId } = req.body;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const updateData = {};
|
||||
if (name !== undefined) updateData.name = name;
|
||||
if (leagueId !== undefined) updateData.leagueId = leagueId ? parseInt(leagueId) : null;
|
||||
if (seasonId !== undefined) updateData.seasonId = seasonId ? parseInt(seasonId) : null;
|
||||
|
||||
const success = await TeamService.updateTeam(teamId, updateData);
|
||||
if (!success) {
|
||||
return res.status(404).json({ error: "notfound" });
|
||||
}
|
||||
|
||||
const updatedTeam = await TeamService.getTeamById(teamId);
|
||||
res.status(200).json(updatedTeam);
|
||||
} catch (error) {
|
||||
console.error('[updateTeam] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const deleteTeam = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { teamid: teamId } = req.params;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const success = await TeamService.deleteTeam(teamId);
|
||||
if (!success) {
|
||||
return res.status(404).json({ error: "notfound" });
|
||||
}
|
||||
|
||||
res.status(200).json({ message: "deleted" });
|
||||
} catch (error) {
|
||||
console.error('[deleteTeam] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const getLeagues = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubid: clubId } = req.params;
|
||||
const { seasonid: seasonId } = req.query;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const leagues = await TeamService.getLeaguesByClub(clubId, seasonId);
|
||||
|
||||
res.status(200).json(leagues);
|
||||
} catch (error) {
|
||||
console.error('[getLeagues] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
215
backend/controllers/teamDocumentController.js
Normal file
@@ -0,0 +1,215 @@
|
||||
import multer from 'multer';
|
||||
import path from 'path';
|
||||
import TeamDocumentService from '../services/teamDocumentService.js';
|
||||
import PDFParserService from '../services/pdfParserService.js';
|
||||
import { getUserByToken } from '../utils/userUtils.js';
|
||||
import { devLog } from '../utils/logger.js';
|
||||
|
||||
// Multer-Konfiguration für Datei-Uploads
|
||||
const storage = multer.diskStorage({
|
||||
destination: (req, file, cb) => {
|
||||
cb(null, 'uploads/temp/');
|
||||
},
|
||||
filename: (req, file, cb) => {
|
||||
const uniqueSuffix = Date.now() + '-' + Math.round(Math.random() * 1E9);
|
||||
cb(null, file.fieldname + '-' + uniqueSuffix + path.extname(file.originalname));
|
||||
}
|
||||
});
|
||||
|
||||
const upload = multer({
|
||||
storage: storage,
|
||||
limits: {
|
||||
fileSize: 10 * 1024 * 1024 // 10MB Limit
|
||||
},
|
||||
fileFilter: (req, file, cb) => {
|
||||
// Erlaube nur PDF, DOC, DOCX, TXT, CSV Dateien
|
||||
const allowedTypes = /pdf|doc|docx|txt|csv/;
|
||||
const extname = allowedTypes.test(path.extname(file.originalname).toLowerCase());
|
||||
const mimetype = allowedTypes.test(file.mimetype);
|
||||
|
||||
if (mimetype && extname) {
|
||||
return cb(null, true);
|
||||
} else {
|
||||
cb(new Error('Nur PDF, DOC, DOCX, TXT und CSV Dateien sind erlaubt!'));
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
export const uploadMiddleware = upload.single('document');
|
||||
|
||||
export const uploadDocument = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubteamid: clubTeamId } = req.params;
|
||||
const { documentType } = req.body;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
if (!req.file) {
|
||||
return res.status(400).json({ error: "nofile" });
|
||||
}
|
||||
|
||||
if (!documentType || !['code_list', 'pin_list'].includes(documentType)) {
|
||||
return res.status(400).json({ error: "invaliddocumenttype" });
|
||||
}
|
||||
|
||||
const document = await TeamDocumentService.uploadDocument(req.file, clubTeamId, documentType);
|
||||
|
||||
res.status(201).json(document);
|
||||
} catch (error) {
|
||||
console.error('[uploadDocument] - Error:', error);
|
||||
|
||||
// Lösche temporäre Datei bei Fehler
|
||||
if (req.file && req.file.path) {
|
||||
try {
|
||||
const fs = await import('fs');
|
||||
fs.unlinkSync(req.file.path);
|
||||
} catch (cleanupError) {
|
||||
console.error('Fehler beim Löschen der temporären Datei:', cleanupError);
|
||||
}
|
||||
}
|
||||
|
||||
if (error.message === 'Club-Team nicht gefunden') {
|
||||
return res.status(404).json({ error: "clubteamnotfound" });
|
||||
}
|
||||
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const getDocuments = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubteamid: clubTeamId } = req.params;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const documents = await TeamDocumentService.getDocumentsByClubTeam(clubTeamId);
|
||||
|
||||
res.status(200).json(documents);
|
||||
} catch (error) {
|
||||
console.error('[getDocuments] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const getDocument = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { documentid: documentId } = req.params;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const document = await TeamDocumentService.getDocumentById(documentId);
|
||||
if (!document) {
|
||||
return res.status(404).json({ error: "notfound" });
|
||||
}
|
||||
|
||||
res.status(200).json(document);
|
||||
} catch (error) {
|
||||
console.error('[getDocument] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const downloadDocument = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { documentid: documentId } = req.params;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const document = await TeamDocumentService.getDocumentById(documentId);
|
||||
if (!document) {
|
||||
return res.status(404).json({ error: "notfound" });
|
||||
}
|
||||
|
||||
const filePath = await TeamDocumentService.getDocumentPath(documentId);
|
||||
if (!filePath) {
|
||||
return res.status(404).json({ error: "filenotfound" });
|
||||
}
|
||||
|
||||
// Prüfe ob Datei existiert
|
||||
const fs = await import('fs');
|
||||
if (!fs.existsSync(filePath)) {
|
||||
return res.status(404).json({ error: "filenotfound" });
|
||||
}
|
||||
|
||||
// Setze Headers für Inline-Anzeige (PDF-Viewer)
|
||||
res.setHeader('Content-Disposition', `inline; filename="${document.originalFileName}"`);
|
||||
res.setHeader('Content-Type', document.mimeType);
|
||||
|
||||
// Sende die Datei
|
||||
res.sendFile(filePath);
|
||||
} catch (error) {
|
||||
console.error('[downloadDocument] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const deleteDocument = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { documentid: documentId } = req.params;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
const success = await TeamDocumentService.deleteDocument(documentId);
|
||||
if (!success) {
|
||||
return res.status(404).json({ error: "notfound" });
|
||||
}
|
||||
|
||||
res.status(200).json({ message: "Document deleted successfully" });
|
||||
} catch (error) {
|
||||
console.error('[deleteDocument] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const parsePDF = async (req, res) => {
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { documentid: documentId } = req.params;
|
||||
const { leagueid: leagueId } = req.query;
|
||||
|
||||
const user = await getUserByToken(token);
|
||||
|
||||
if (!leagueId) {
|
||||
return res.status(400).json({ error: "missingleagueid" });
|
||||
}
|
||||
|
||||
// Hole Dokument-Informationen
|
||||
const document = await TeamDocumentService.getDocumentById(documentId);
|
||||
if (!document) {
|
||||
return res.status(404).json({ error: "documentnotfound" });
|
||||
}
|
||||
|
||||
// Prüfe ob es eine PDF- oder TXT-Datei ist
|
||||
if (!document.mimeType.includes('pdf') && !document.mimeType.includes('text/plain')) {
|
||||
return res.status(400).json({ error: "notapdfortxt" });
|
||||
}
|
||||
|
||||
// Parse PDF
|
||||
const parseResult = await PDFParserService.parsePDF(document.filePath, document.clubTeam.clubId);
|
||||
|
||||
// Speichere Matches in Datenbank
|
||||
const saveResult = await PDFParserService.saveMatchesToDatabase(parseResult.matches, parseInt(leagueId));
|
||||
|
||||
res.status(200).json({
|
||||
parseResult: {
|
||||
matchesFound: parseResult.matches.length,
|
||||
debugInfo: parseResult.debugInfo,
|
||||
allLines: parseResult.allLines,
|
||||
rawText: parseResult.rawText
|
||||
},
|
||||
saveResult: {
|
||||
created: saveResult.created,
|
||||
updated: saveResult.updated,
|
||||
errors: saveResult.errors
|
||||
}
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('[parsePDF] - Error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
@@ -19,6 +19,25 @@ class TrainingStatsController {
|
||||
}
|
||||
});
|
||||
|
||||
// Anzahl der Trainings im jeweiligen Zeitraum berechnen
|
||||
const trainingsCount12Months = await DiaryDate.count({
|
||||
where: {
|
||||
clubId: parseInt(clubId),
|
||||
date: {
|
||||
[Op.gte]: twelveMonthsAgo
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
const trainingsCount3Months = await DiaryDate.count({
|
||||
where: {
|
||||
clubId: parseInt(clubId),
|
||||
date: {
|
||||
[Op.gte]: threeMonthsAgo
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
const stats = [];
|
||||
|
||||
for (const member of members) {
|
||||
@@ -116,7 +135,36 @@ class TrainingStatsController {
|
||||
// Nach Gesamtteilnahme absteigend sortieren
|
||||
stats.sort((a, b) => b.participationTotal - a.participationTotal);
|
||||
|
||||
res.json(stats);
|
||||
// Trainingstage mit Teilnehmerzahlen abrufen (letzte 12 Monate, absteigend sortiert)
|
||||
const trainingDays = await DiaryDate.findAll({
|
||||
where: {
|
||||
clubId: parseInt(clubId),
|
||||
date: {
|
||||
[Op.gte]: twelveMonthsAgo
|
||||
}
|
||||
},
|
||||
include: [{
|
||||
model: Participant,
|
||||
as: 'participantList',
|
||||
attributes: ['id']
|
||||
}],
|
||||
order: [['date', 'DESC']]
|
||||
});
|
||||
|
||||
// Formatiere Trainingstage mit Teilnehmerzahl
|
||||
const formattedTrainingDays = trainingDays.map(day => ({
|
||||
id: day.id,
|
||||
date: day.date,
|
||||
participantCount: day.participantList ? day.participantList.length : 0
|
||||
}));
|
||||
|
||||
// Zusätzliche Metadaten mit Trainingsanzahl zurückgeben
|
||||
res.json({
|
||||
members: stats,
|
||||
trainingsCount12Months,
|
||||
trainingsCount3Months,
|
||||
trainingDays: formattedTrainingDays
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
console.error('Fehler beim Laden der Trainings-Statistik:', error);
|
||||
|
||||
@@ -0,0 +1,13 @@
-- Migration: Add auto update ratings fields to my_tischtennis table
-- Date: 2025-01-27

-- Add auto_update_ratings column
ALTER TABLE my_tischtennis
ADD COLUMN auto_update_ratings BOOLEAN NOT NULL DEFAULT FALSE;

-- Add last_update_ratings column
ALTER TABLE my_tischtennis
ADD COLUMN last_update_ratings TIMESTAMP NULL;

-- Create index for auto_update_ratings for efficient querying
CREATE INDEX idx_my_tischtennis_auto_update_ratings ON my_tischtennis(auto_update_ratings);

28
backend/migrations/add_match_result_fields.sql
Normal file
@@ -0,0 +1,28 @@
-- Migration: Add match result fields to match table
-- Date: 2025-01-27
-- For MariaDB

-- Add myTischtennis meeting ID
ALTER TABLE `match`
ADD COLUMN my_tischtennis_meeting_id VARCHAR(255) NULL UNIQUE COMMENT 'Meeting ID from myTischtennis (e.g. 15440488)';

-- Add home match points
ALTER TABLE `match`
ADD COLUMN home_match_points INT DEFAULT 0 NULL COMMENT 'Match points won by home team';

-- Add guest match points
ALTER TABLE `match`
ADD COLUMN guest_match_points INT DEFAULT 0 NULL COMMENT 'Match points won by guest team';

-- Add is_completed flag
ALTER TABLE `match`
ADD COLUMN is_completed BOOLEAN NOT NULL DEFAULT FALSE COMMENT 'Whether the match is completed';

-- Add PDF URL
ALTER TABLE `match`
ADD COLUMN pdf_url VARCHAR(512) NULL COMMENT 'PDF URL from myTischtennis';

-- Create indexes
CREATE INDEX idx_match_my_tischtennis_meeting_id ON `match`(my_tischtennis_meeting_id);
CREATE INDEX idx_match_is_completed ON `match`(is_completed);

4
backend/migrations/add_matches_tied_to_team.sql
Normal file
@@ -0,0 +1,4 @@
-- Add matches_tied column to team table
ALTER TABLE team
ADD COLUMN matches_tied INTEGER NOT NULL DEFAULT 0 AFTER matches_lost;

19
backend/migrations/add_mytischtennis_fields_to_league.sql
Normal file
@@ -0,0 +1,19 @@
-- Migration: Add myTischtennis fields to league table
-- Date: 2025-01-27
-- For MariaDB

-- Add my_tischtennis_group_id column
ALTER TABLE league
ADD COLUMN my_tischtennis_group_id VARCHAR(255) NULL COMMENT 'Group ID from myTischtennis (e.g. 504417)';

-- Add association column
ALTER TABLE league
ADD COLUMN association VARCHAR(255) NULL COMMENT 'Association/Verband (e.g. HeTTV)';

-- Add groupname column
ALTER TABLE league
ADD COLUMN groupname VARCHAR(255) NULL COMMENT 'Group name for URL (e.g. 1.Kreisklasse)';

-- Create index for efficient querying
CREATE INDEX idx_league_my_tischtennis_group_id ON league(my_tischtennis_group_id);

11
backend/migrations/add_mytischtennis_player_id_to_member.sql
Normal file
@@ -0,0 +1,11 @@
-- Migration: Add myTischtennis player ID to member table
-- Date: 2025-01-27
-- For MariaDB

-- Add my_tischtennis_player_id column
ALTER TABLE member
ADD COLUMN my_tischtennis_player_id VARCHAR(255) NULL COMMENT 'Player ID from myTischtennis (e.g. NU2705037)';

-- Create index for efficient querying
CREATE INDEX idx_member_my_tischtennis_player_id ON member(my_tischtennis_player_id);

@@ -0,0 +1,11 @@
-- Migration: Add myTischtennis team ID to club_team table
-- Date: 2025-01-27
-- For MariaDB

-- Add my_tischtennis_team_id column
ALTER TABLE club_team
ADD COLUMN my_tischtennis_team_id VARCHAR(255) NULL COMMENT 'Team ID from myTischtennis (e.g. 2995094)';

-- Create index for efficient querying
CREATE INDEX idx_club_team_my_tischtennis_team_id ON club_team(my_tischtennis_team_id);

44
backend/migrations/add_season_to_teams.sql
Normal file
@@ -0,0 +1,44 @@
-- Migration: Add season_id to teams table
-- First, add the column as nullable
ALTER TABLE `team` ADD COLUMN `season_id` INT NULL;

-- Get or create current season
SET @current_season_id = (
  SELECT id FROM `season`
  WHERE season = (
    CASE
      WHEN MONTH(CURDATE()) >= 7 THEN CONCAT(YEAR(CURDATE()), '/', YEAR(CURDATE()) + 1)
      ELSE CONCAT(YEAR(CURDATE()) - 1, '/', YEAR(CURDATE()))
    END
  )
  LIMIT 1
);

-- If no season exists, create it
INSERT IGNORE INTO `season` (season) VALUES (
  CASE
    WHEN MONTH(CURDATE()) >= 7 THEN CONCAT(YEAR(CURDATE()), '/', YEAR(CURDATE()) + 1)
    ELSE CONCAT(YEAR(CURDATE()) - 1, '/', YEAR(CURDATE()))
  END
);

-- Get the season ID again (in case we just created it)
SET @current_season_id = (
  SELECT id FROM `season`
  WHERE season = (
    CASE
      WHEN MONTH(CURDATE()) >= 7 THEN CONCAT(YEAR(CURDATE()), '/', YEAR(CURDATE()) + 1)
      ELSE CONCAT(YEAR(CURDATE()) - 1, '/', YEAR(CURDATE()))
    END
  )
  LIMIT 1
);

-- Update all existing teams to use the current season
UPDATE `team` SET `season_id` = @current_season_id WHERE `season_id` IS NULL;

-- Now make the column NOT NULL and add the foreign key constraint
ALTER TABLE `team` MODIFY COLUMN `season_id` INT NOT NULL;
ALTER TABLE `team` ADD CONSTRAINT `team_season_id_foreign_idx`
  FOREIGN KEY (`season_id`) REFERENCES `season` (`id`)
  ON DELETE CASCADE ON UPDATE CASCADE;

11
backend/migrations/add_table_fields_to_team.sql
Normal file
@@ -0,0 +1,11 @@
-- Migration: Add table fields to team table
-- Add fields for league table calculations

ALTER TABLE team ADD COLUMN matches_played INT NOT NULL DEFAULT 0;
ALTER TABLE team ADD COLUMN matches_won INT NOT NULL DEFAULT 0;
ALTER TABLE team ADD COLUMN matches_lost INT NOT NULL DEFAULT 0;
ALTER TABLE team ADD COLUMN sets_won INT NOT NULL DEFAULT 0;
ALTER TABLE team ADD COLUMN sets_lost INT NOT NULL DEFAULT 0;
ALTER TABLE team ADD COLUMN points_won INT NOT NULL DEFAULT 0;
ALTER TABLE team ADD COLUMN points_lost INT NOT NULL DEFAULT 0;
ALTER TABLE team ADD COLUMN table_points INT NOT NULL DEFAULT 0;

5
backend/migrations/add_table_points_won_lost_to_team.sql
Normal file
@@ -0,0 +1,5 @@
-- Add table_points_won and table_points_lost columns to team table
ALTER TABLE team
ADD COLUMN table_points_won INTEGER NOT NULL DEFAULT 0 AFTER table_points,
ADD COLUMN table_points_lost INTEGER NOT NULL DEFAULT 0 AFTER table_points_won;

20
backend/migrations/create_my_tischtennis_fetch_log.sql
Normal file
@@ -0,0 +1,20 @@
-- Create my_tischtennis_fetch_log table for tracking data fetches
CREATE TABLE IF NOT EXISTS my_tischtennis_fetch_log (
  id INT AUTO_INCREMENT PRIMARY KEY,
  user_id INT NOT NULL,
  fetch_type ENUM('ratings', 'match_results', 'league_table') NOT NULL COMMENT 'Type of data fetch',
  success BOOLEAN NOT NULL DEFAULT FALSE,
  message TEXT,
  error_details TEXT,
  records_processed INT NOT NULL DEFAULT 0 COMMENT 'Number of records processed',
  execution_time INT COMMENT 'Execution time in milliseconds',
  is_automatic BOOLEAN NOT NULL DEFAULT FALSE COMMENT 'Automatic or manual fetch',
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,

  FOREIGN KEY (user_id) REFERENCES user(id) ON DELETE CASCADE ON UPDATE CASCADE,

  INDEX idx_user_fetch_type_created (user_id, fetch_type, created_at),
  INDEX idx_created_at (created_at)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;

23
backend/migrations/create_my_tischtennis_update_history.sql
Normal file
@@ -0,0 +1,23 @@
-- Migration: Create my_tischtennis_update_history table
-- Date: 2025-01-27
-- For MariaDB

CREATE TABLE IF NOT EXISTS my_tischtennis_update_history (
  id INT AUTO_INCREMENT PRIMARY KEY,
  user_id INT NOT NULL,
  success BOOLEAN NOT NULL DEFAULT FALSE,
  message TEXT,
  error_details TEXT,
  updated_count INT DEFAULT 0,
  execution_time INT,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,

  CONSTRAINT fk_my_tischtennis_update_history_user_id
    FOREIGN KEY (user_id) REFERENCES user(id) ON DELETE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;

-- Create indexes for efficient querying
CREATE INDEX idx_my_tischtennis_update_history_user_id ON my_tischtennis_update_history(user_id);
CREATE INDEX idx_my_tischtennis_update_history_created_at ON my_tischtennis_update_history(created_at);
CREATE INDEX idx_my_tischtennis_update_history_success ON my_tischtennis_update_history(success);

8
backend/migrations/make_location_optional_in_match.sql
Normal file
@@ -0,0 +1,8 @@
-- Migration: Make locationId optional in match table
-- Date: 2025-01-27
-- For MariaDB

-- Modify locationId to allow NULL
ALTER TABLE `match`
MODIFY COLUMN location_id INT NULL;

60
backend/models/ClubTeam.js
Normal file
@@ -0,0 +1,60 @@
|
||||
import { DataTypes } from 'sequelize';
|
||||
import sequelize from '../database.js';
|
||||
import Club from './Club.js';
|
||||
import League from './League.js';
|
||||
import Season from './Season.js';
|
||||
|
||||
const ClubTeam = sequelize.define('ClubTeam', {
|
||||
id: {
|
||||
type: DataTypes.INTEGER,
|
||||
primaryKey: true,
|
||||
autoIncrement: true,
|
||||
allowNull: false,
|
||||
},
|
||||
name: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: false,
|
||||
},
|
||||
clubId: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: false,
|
||||
references: {
|
||||
model: Club,
|
||||
key: 'id',
|
||||
},
|
||||
onDelete: 'CASCADE',
|
||||
onUpdate: 'CASCADE',
|
||||
},
|
||||
leagueId: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: true,
|
||||
references: {
|
||||
model: League,
|
||||
key: 'id',
|
||||
},
|
||||
onDelete: 'SET NULL',
|
||||
onUpdate: 'CASCADE',
|
||||
},
|
||||
seasonId: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: true,
|
||||
references: {
|
||||
model: Season,
|
||||
key: 'id',
|
||||
},
|
||||
onDelete: 'CASCADE',
|
||||
onUpdate: 'CASCADE',
|
||||
},
|
||||
myTischtennisTeamId: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
comment: 'Team ID from myTischtennis (e.g. 2995094)',
|
||||
field: 'my_tischtennis_team_id'
|
||||
},
|
||||
}, {
|
||||
underscored: true,
|
||||
tableName: 'club_team',
|
||||
timestamps: true,
|
||||
});
|
||||
|
||||
export default ClubTeam;
|
||||
@@ -34,6 +34,22 @@ const League = sequelize.define('League', {
|
||||
onDelete: 'CASCADE',
|
||||
onUpdate: 'CASCADE',
|
||||
},
|
||||
myTischtennisGroupId: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
comment: 'Group ID from myTischtennis (e.g. 504417)',
|
||||
field: 'my_tischtennis_group_id'
|
||||
},
|
||||
association: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
comment: 'Association/Verband (e.g. HeTTV)',
|
||||
},
|
||||
groupname: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
comment: 'Group name for URL (e.g. 1.Kreisklasse)',
|
||||
},
|
||||
}, {
|
||||
underscored: true,
|
||||
tableName: 'league',
|
||||
|
||||
@@ -3,7 +3,6 @@ import sequelize from '../database.js';
 import Club from './Club.js';
 import League from './League.js';
 import Team from './Team.js';
-import Season from './Season.js';
 import Location from './Location.js';
 
 const Match = sequelize.define('Match', {
@@ -21,21 +20,13 @@ const Match = sequelize.define('Match', {
     type: DataTypes.TIME,
     allowNull: true,
   },
-  seasonId: {
-    type: DataTypes.INTEGER,
-    references: {
-      model: Season,
-      key: 'id',
-    },
-    allowNull: false,
-  },
   locationId: {
     type: DataTypes.INTEGER,
     references: {
       model: Location,
       key: 'id',
     },
-    allowNull: false,
+    allowNull: true,
   },
   homeTeamId: {
     type: DataTypes.INTEGER,
@@ -69,6 +60,55 @@ const Match = sequelize.define('Match', {
|
||||
},
|
||||
allowNull: false,
|
||||
},
|
||||
code: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
comment: 'Spiel-Code aus PDF-Parsing'
|
||||
},
|
||||
homePin: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
comment: 'Pin-Code für Heimteam aus PDF-Parsing'
|
||||
},
|
||||
guestPin: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
comment: 'Pin-Code für Gastteam aus PDF-Parsing'
|
||||
},
|
||||
myTischtennisMeetingId: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
unique: true,
|
||||
comment: 'Meeting ID from myTischtennis (e.g. 15440488)',
|
||||
field: 'my_tischtennis_meeting_id'
|
||||
},
|
||||
homeMatchPoints: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: true,
|
||||
defaultValue: 0,
|
||||
comment: 'Match points won by home team',
|
||||
field: 'home_match_points'
|
||||
},
|
||||
guestMatchPoints: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: true,
|
||||
defaultValue: 0,
|
||||
comment: 'Match points won by guest team',
|
||||
field: 'guest_match_points'
|
||||
},
|
||||
isCompleted: {
|
||||
type: DataTypes.BOOLEAN,
|
||||
allowNull: false,
|
||||
defaultValue: false,
|
||||
comment: 'Whether the match is completed',
|
||||
field: 'is_completed'
|
||||
},
|
||||
pdfUrl: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
comment: 'PDF URL from myTischtennis',
|
||||
field: 'pdf_url'
|
||||
},
|
||||
}, {
|
||||
underscored: true,
|
||||
tableName: 'match',
|
||||
|
||||
@@ -45,9 +45,9 @@ const Member = sequelize.define('Member', {
   },
   birthDate: {
     type: DataTypes.STRING,
-    allowNull: false,
+    allowNull: true,
     set(value) {
-      const encryptedValue = encryptData(value);
+      const encryptedValue = encryptData(value || '');
       this.setDataValue('birthDate', encryptedValue);
     },
     get() {
@@ -127,6 +127,22 @@ const Member = sequelize.define('Member', {
|
||||
type: DataTypes.ENUM('male','female','diverse','unknown'),
|
||||
allowNull: true,
|
||||
defaultValue: 'unknown'
|
||||
},
|
||||
ttr: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: true,
|
||||
defaultValue: null
|
||||
},
|
||||
qttr: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: true,
|
||||
defaultValue: null
|
||||
},
|
||||
myTischtennisPlayerId: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
comment: 'Player ID from myTischtennis (e.g. NU2705037)',
|
||||
field: 'my_tischtennis_player_id'
|
||||
}
|
||||
}, {
|
||||
underscored: true,
|
||||
|
||||
133
backend/models/MyTischtennis.js
Normal file
@@ -0,0 +1,133 @@
|
||||
import { DataTypes } from 'sequelize';
|
||||
import sequelize from '../database.js';
|
||||
import { encryptData, decryptData } from '../utils/encrypt.js';
|
||||
|
||||
const MyTischtennis = sequelize.define('MyTischtennis', {
|
||||
id: {
|
||||
type: DataTypes.INTEGER,
|
||||
primaryKey: true,
|
||||
autoIncrement: true,
|
||||
allowNull: false
|
||||
},
|
||||
userId: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: false,
|
||||
unique: true,
|
||||
references: {
|
||||
model: 'user',
|
||||
key: 'id'
|
||||
},
|
||||
onDelete: 'CASCADE'
|
||||
},
|
||||
email: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: false,
|
||||
},
|
||||
encryptedPassword: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
field: 'encrypted_password'
|
||||
},
|
||||
savePassword: {
|
||||
type: DataTypes.BOOLEAN,
|
||||
defaultValue: false,
|
||||
allowNull: false,
|
||||
field: 'save_password'
|
||||
},
|
||||
autoUpdateRatings: {
|
||||
type: DataTypes.BOOLEAN,
|
||||
defaultValue: false,
|
||||
allowNull: false,
|
||||
field: 'auto_update_ratings'
|
||||
},
|
||||
accessToken: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
field: 'access_token'
|
||||
},
|
||||
refreshToken: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
field: 'refresh_token'
|
||||
},
|
||||
expiresAt: {
|
||||
type: DataTypes.BIGINT,
|
||||
allowNull: true,
|
||||
field: 'expires_at'
|
||||
},
|
||||
cookie: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true
|
||||
},
|
||||
userData: {
|
||||
type: DataTypes.JSON,
|
||||
allowNull: true,
|
||||
field: 'user_data'
|
||||
},
|
||||
clubId: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
field: 'club_id'
|
||||
},
|
||||
clubName: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
field: 'club_name'
|
||||
},
|
||||
fedNickname: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
field: 'fed_nickname'
|
||||
},
|
||||
lastLoginAttempt: {
|
||||
type: DataTypes.DATE,
|
||||
allowNull: true,
|
||||
field: 'last_login_attempt'
|
||||
},
|
||||
lastLoginSuccess: {
|
||||
type: DataTypes.DATE,
|
||||
allowNull: true,
|
||||
field: 'last_login_success'
|
||||
},
|
||||
lastUpdateRatings: {
|
||||
type: DataTypes.DATE,
|
||||
allowNull: true,
|
||||
field: 'last_update_ratings'
|
||||
}
|
||||
}, {
|
||||
underscored: true,
|
||||
tableName: 'my_tischtennis',
|
||||
timestamps: true,
|
||||
hooks: {
|
||||
beforeSave: async (instance) => {
|
||||
// Wenn savePassword false ist, password auf null setzen
|
||||
if (!instance.savePassword) {
|
||||
instance.encryptedPassword = null;
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// Virtuelle Felder für password handling
|
||||
MyTischtennis.prototype.setPassword = function(password) {
|
||||
if (password && this.savePassword) {
|
||||
this.encryptedPassword = encryptData(password);
|
||||
} else {
|
||||
this.encryptedPassword = null;
|
||||
}
|
||||
};
|
||||
|
||||
MyTischtennis.prototype.getPassword = function() {
|
||||
if (this.encryptedPassword) {
|
||||
try {
|
||||
return decryptData(this.encryptedPassword);
|
||||
} catch (error) {
|
||||
console.error('Error decrypting myTischtennis password:', error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
return null;
|
||||
};
|
||||
|
||||
export default MyTischtennis;
|
||||
|
||||
72
backend/models/MyTischtennisFetchLog.js
Normal file
@@ -0,0 +1,72 @@
|
||||
import { DataTypes } from 'sequelize';
|
||||
import sequelize from '../database.js';
|
||||
import User from './User.js';
|
||||
|
||||
const MyTischtennisFetchLog = sequelize.define('MyTischtennisFetchLog', {
|
||||
id: {
|
||||
type: DataTypes.INTEGER,
|
||||
primaryKey: true,
|
||||
autoIncrement: true,
|
||||
allowNull: false,
|
||||
},
|
||||
userId: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: false,
|
||||
references: {
|
||||
model: User,
|
||||
key: 'id',
|
||||
},
|
||||
onDelete: 'CASCADE',
|
||||
onUpdate: 'CASCADE',
|
||||
},
|
||||
fetchType: {
|
||||
type: DataTypes.ENUM('ratings', 'match_results', 'league_table'),
|
||||
allowNull: false,
|
||||
comment: 'Type of data fetch: ratings, match_results, or league_table'
|
||||
},
|
||||
success: {
|
||||
type: DataTypes.BOOLEAN,
|
||||
allowNull: false,
|
||||
defaultValue: false,
|
||||
},
|
||||
message: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
},
|
||||
errorDetails: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
},
|
||||
recordsProcessed: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: false,
|
||||
defaultValue: 0,
|
||||
comment: 'Number of records processed (e.g., players updated, matches fetched)'
|
||||
},
|
||||
executionTime: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: true,
|
||||
comment: 'Execution time in milliseconds'
|
||||
},
|
||||
isAutomatic: {
|
||||
type: DataTypes.BOOLEAN,
|
||||
allowNull: false,
|
||||
defaultValue: false,
|
||||
comment: 'Whether this was an automatic or manual fetch'
|
||||
},
|
||||
}, {
|
||||
underscored: true,
|
||||
tableName: 'my_tischtennis_fetch_log',
|
||||
timestamps: true,
|
||||
indexes: [
|
||||
{
|
||||
fields: ['user_id', 'fetch_type', 'created_at']
|
||||
},
|
||||
{
|
||||
fields: ['created_at']
|
||||
}
|
||||
]
|
||||
});
|
||||
|
||||
export default MyTischtennisFetchLog;
|
||||
|
||||
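A hedged sketch of how a fetch service could record each run in this log; the wrapper function and the shape of `runFn` are assumptions, while the `create` payload uses the columns defined above.

```js
// Illustrative logging wrapper for a fetch run (function names are assumptions).
import { MyTischtennisFetchLog } from '../models/index.js';

export async function logFetchRun(userId, fetchType, isAutomatic, runFn) {
  const startedAt = Date.now();
  try {
    const recordsProcessed = await runFn(); // assumed to resolve with a count
    return await MyTischtennisFetchLog.create({
      userId,
      fetchType,            // 'ratings' | 'match_results' | 'league_table'
      success: true,
      recordsProcessed,
      executionTime: Date.now() - startedAt,
      isAutomatic,
    });
  } catch (error) {
    return await MyTischtennisFetchLog.create({
      userId,
      fetchType,
      success: false,
      message: 'Fetch failed',
      errorDetails: error.stack ?? String(error),
      executionTime: Date.now() - startedAt,
      isAutomatic,
    });
  }
}
```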
63 backend/models/MyTischtennisUpdateHistory.js  Normal file
@@ -0,0 +1,63 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';

const MyTischtennisUpdateHistory = sequelize.define('MyTischtennisUpdateHistory', {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: true,
    allowNull: false
  },
  userId: {
    type: DataTypes.INTEGER,
    allowNull: false,
    references: {
      model: 'user',
      key: 'id'
    },
    onDelete: 'CASCADE'
  },
  success: {
    type: DataTypes.BOOLEAN,
    allowNull: false,
    defaultValue: false
  },
  message: {
    type: DataTypes.TEXT,
    allowNull: true
  },
  errorDetails: {
    type: DataTypes.TEXT,
    allowNull: true,
    field: 'error_details'
  },
  updatedCount: {
    type: DataTypes.INTEGER,
    allowNull: true,
    defaultValue: 0,
    field: 'updated_count'
  },
  executionTime: {
    type: DataTypes.INTEGER,
    allowNull: true,
    comment: 'Execution time in milliseconds',
    field: 'execution_time'
  }
}, {
  underscored: true,
  tableName: 'my_tischtennis_update_history',
  timestamps: true,
  indexes: [
    {
      fields: ['user_id']
    },
    {
      fields: ['created_at']
    },
    {
      fields: ['success']
    }
  ]
});

export default MyTischtennisUpdateHistory;
@@ -1,6 +1,8 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';
import Club from './Club.js';
import League from './League.js';
import Season from './Season.js';

const Team = sequelize.define('Team', {
  id: {
@@ -23,6 +25,82 @@ const Team = sequelize.define('Team', {
    onDelete: 'CASCADE',
    onUpdate: 'CASCADE',
  },
  leagueId: {
    type: DataTypes.INTEGER,
    allowNull: true,
    references: {
      model: League,
      key: 'id',
    },
    onDelete: 'SET NULL',
    onUpdate: 'CASCADE',
  },
  seasonId: {
    type: DataTypes.INTEGER,
    allowNull: true,
    references: {
      model: Season,
      key: 'id',
    },
    onDelete: 'CASCADE',
    onUpdate: 'CASCADE',
  },
  // League table fields
  matchesPlayed: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
  },
  matchesWon: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
  },
  matchesLost: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
  },
  matchesTied: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
  },
  setsWon: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
  },
  setsLost: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
  },
  pointsWon: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
  },
  pointsLost: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
  },
  tablePoints: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
  },
  tablePointsWon: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
  },
  tablePointsLost: {
    type: DataTypes.INTEGER,
    allowNull: false,
    defaultValue: 0,
  },
}, {
  underscored: true,
  tableName: 'team',
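The new columns mirror one row of a myTischtennis league table. A sketch of how a fetched row might be written back; the shape of `row` is an assumption about the scraper output and is not defined in this diff, only the Team fields match the model above.

```js
// Illustrative update of a Team's standings from one parsed league-table row.
import { Team } from '../models/index.js';

export async function applyTableRow(teamId, row) {
  await Team.update({
    matchesPlayed: row.played,
    matchesWon: row.won,
    matchesTied: row.tied,
    matchesLost: row.lost,
    setsWon: row.setsWon,
    setsLost: row.setsLost,
    pointsWon: row.pointsWon,
    pointsLost: row.pointsLost,
    tablePoints: row.tablePoints,         // how this is derived is an assumption
    tablePointsWon: row.tablePointsWon,
    tablePointsLost: row.tablePointsLost,
  }, { where: { id: teamId } });
}
```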
52 backend/models/TeamDocument.js  Normal file
@@ -0,0 +1,52 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';
import ClubTeam from './ClubTeam.js';

const TeamDocument = sequelize.define('TeamDocument', {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: true,
    allowNull: false,
  },
  fileName: {
    type: DataTypes.STRING,
    allowNull: false,
  },
  originalFileName: {
    type: DataTypes.STRING,
    allowNull: false,
  },
  filePath: {
    type: DataTypes.STRING,
    allowNull: false,
  },
  fileSize: {
    type: DataTypes.INTEGER,
    allowNull: false,
  },
  mimeType: {
    type: DataTypes.STRING,
    allowNull: false,
  },
  documentType: {
    type: DataTypes.ENUM('code_list', 'pin_list'),
    allowNull: false,
  },
  clubTeamId: {
    type: DataTypes.INTEGER,
    allowNull: false,
    references: {
      model: ClubTeam,
      key: 'id',
    },
    onDelete: 'CASCADE',
    onUpdate: 'CASCADE',
  },
}, {
  underscored: true,
  tableName: 'team_document',
  timestamps: true,
});

export default TeamDocument;
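The columns map one-to-one onto a typical file upload. A hedged sketch, assuming an Express route with multer (multer is not shown anywhere in this diff; the route and field names are illustrative):

```js
// Hypothetical upload handler; only the TeamDocument.create payload reflects the model above.
import multer from 'multer';
import { TeamDocument } from '../models/index.js';

const upload = multer({ dest: 'uploads/team-documents/' }); // destination is an assumption

export const uploadTeamDocument = [
  upload.single('document'),
  async (req, res) => {
    const doc = await TeamDocument.create({
      fileName: req.file.filename,
      originalFileName: req.file.originalname,
      filePath: req.file.path,
      fileSize: req.file.size,
      mimeType: req.file.mimetype,
      documentType: req.body.documentType, // 'code_list' or 'pin_list'
      clubTeamId: req.params.clubTeamId,
    });
    res.status(201).json(doc);
  },
];
```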
@@ -19,6 +19,8 @@ import DiaryDateActivity from './DiaryDateActivity.js';
import Match from './Match.js';
import League from './League.js';
import Team from './Team.js';
import ClubTeam from './ClubTeam.js';
import TeamDocument from './TeamDocument.js';
import Season from './Season.js';
import Location from './Location.js';
import Group from './Group.js';
@@ -33,6 +35,9 @@ import UserToken from './UserToken.js';
import OfficialTournament from './OfficialTournament.js';
import OfficialCompetition from './OfficialCompetition.js';
import OfficialCompetitionMember from './OfficialCompetitionMember.js';
import MyTischtennis from './MyTischtennis.js';
import MyTischtennisUpdateHistory from './MyTischtennisUpdateHistory.js';
import MyTischtennisFetchLog from './MyTischtennisFetchLog.js';
// Official tournaments relations
OfficialTournament.hasMany(OfficialCompetition, { foreignKey: 'tournamentId', as: 'competitions' });
OfficialCompetition.belongsTo(OfficialTournament, { foreignKey: 'tournamentId', as: 'tournament' });
@@ -117,8 +122,28 @@ Team.belongsTo(Club, { foreignKey: 'clubId', as: 'club' });
Club.hasMany(League, { foreignKey: 'clubId', as: 'leagues' });
League.belongsTo(Club, { foreignKey: 'clubId', as: 'club' });

Match.belongsTo(Season, { foreignKey: 'seasonId', as: 'season' });
Season.hasMany(Match, { foreignKey: 'seasonId', as: 'matches' });
Season.hasMany(League, { foreignKey: 'seasonId', as: 'leagues' });
League.belongsTo(Season, { foreignKey: 'seasonId', as: 'season' });

League.hasMany(Team, { foreignKey: 'leagueId', as: 'teams' });
Team.belongsTo(League, { foreignKey: 'leagueId', as: 'league' });

Season.hasMany(Team, { foreignKey: 'seasonId', as: 'teams' });
Team.belongsTo(Season, { foreignKey: 'seasonId', as: 'season' });

// ClubTeam relationships
Club.hasMany(ClubTeam, { foreignKey: 'clubId', as: 'clubTeams' });
ClubTeam.belongsTo(Club, { foreignKey: 'clubId', as: 'club' });

League.hasMany(ClubTeam, { foreignKey: 'leagueId', as: 'clubTeams' });
ClubTeam.belongsTo(League, { foreignKey: 'leagueId', as: 'league' });

Season.hasMany(ClubTeam, { foreignKey: 'seasonId', as: 'clubTeams' });
ClubTeam.belongsTo(Season, { foreignKey: 'seasonId', as: 'season' });

// TeamDocument relationships
ClubTeam.hasMany(TeamDocument, { foreignKey: 'clubTeamId', as: 'documents' });
TeamDocument.belongsTo(ClubTeam, { foreignKey: 'clubTeamId', as: 'clubTeam' });

Match.belongsTo(Location, { foreignKey: 'locationId', as: 'location' });
Location.hasMany(Match, { foreignKey: 'locationId', as: 'matches' });
@@ -204,6 +229,15 @@ Member.hasMany(Accident, { foreignKey: 'memberId', as: 'accidents' });
Accident.belongsTo(DiaryDate, { foreignKey: 'diaryDateId', as: 'diaryDates' });
DiaryDate.hasMany(Accident, { foreignKey: 'diaryDateId', as: 'accidents' });

User.hasOne(MyTischtennis, { foreignKey: 'userId', as: 'myTischtennis' });
MyTischtennis.belongsTo(User, { foreignKey: 'userId', as: 'user' });

User.hasMany(MyTischtennisUpdateHistory, { foreignKey: 'userId', as: 'updateHistory' });
MyTischtennisUpdateHistory.belongsTo(User, { foreignKey: 'userId', as: 'user' });

User.hasMany(MyTischtennisFetchLog, { foreignKey: 'userId', as: 'fetchLogs' });
MyTischtennisFetchLog.belongsTo(User, { foreignKey: 'userId', as: 'user' });

export {
  User,
  Log,
@@ -227,6 +261,8 @@ export {
  Match,
  League,
  Team,
  ClubTeam,
  TeamDocument,
  Group,
  GroupActivity,
  Tournament,
@@ -239,4 +275,7 @@ export {
  OfficialTournament,
  OfficialCompetition,
  OfficialCompetitionMember,
  MyTischtennis,
  MyTischtennisUpdateHistory,
  MyTischtennisFetchLog,
};
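With the new associations registered, a scheduler can select all accounts that opted into automatic updates in one query. A sketch using the aliases defined above (`myTischtennis`, `fetchLogs`); the function name and filter are illustrative:

```js
// Illustrative eager-loading query for the 6:00/6:30 jobs.
import { User, MyTischtennis, MyTischtennisFetchLog } from './models/index.js';

export async function usersWithAutoUpdate() {
  return User.findAll({
    include: [
      {
        model: MyTischtennis,
        as: 'myTischtennis',
        where: { autoUpdateRatings: true }, // flag from MyTischtennis.js
        required: true,
      },
      {
        model: MyTischtennisFetchLog,
        as: 'fetchLogs',
        separate: true,                     // keep the limit per user
        limit: 5,
        order: [['createdAt', 'DESC']],
      },
    ],
  });
}
```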
252
backend/node_modules/.package-lock.json
generated
vendored
252
backend/node_modules/.package-lock.json
generated
vendored
@@ -4,6 +4,15 @@
|
||||
"lockfileVersion": 3,
|
||||
"requires": true,
|
||||
"packages": {
|
||||
"node_modules/@babel/runtime": {
|
||||
"version": "7.28.4",
|
||||
"resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.28.4.tgz",
|
||||
"integrity": "sha512-Q/N6JNWvIvPnLDvjlE1OUBLPQHH6l3CltCEsHIujp45zQUSSh8K+gHnaEX45yAT1nyngnINhvWtzN+Nb9D8RAQ==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=6.9.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@emnapi/runtime": {
|
||||
"version": "1.2.0",
|
||||
"resolved": "https://registry.npmjs.org/@emnapi/runtime/-/runtime-1.2.0.tgz",
|
||||
@@ -810,6 +819,12 @@
|
||||
"resolved": "https://registry.npmjs.org/array-flatten/-/array-flatten-1.1.1.tgz",
|
||||
"integrity": "sha512-PCVAQswWemu6UdxsDFFX/+gVeYqKAod3D3UVm91jHwynguOwAvYPhx8nNlM++NqRcK6CxxpUafjmhIdKiHibqg=="
|
||||
},
|
||||
"node_modules/asynckit": {
|
||||
"version": "0.4.0",
|
||||
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
|
||||
"integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/aws-ssl-profiles": {
|
||||
"version": "1.1.1",
|
||||
"resolved": "https://registry.npmjs.org/aws-ssl-profiles/-/aws-ssl-profiles-1.1.1.tgz",
|
||||
@@ -818,6 +833,17 @@
|
||||
"node": ">= 6.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/axios": {
|
||||
"version": "1.12.2",
|
||||
"resolved": "https://registry.npmjs.org/axios/-/axios-1.12.2.tgz",
|
||||
"integrity": "sha512-vMJzPewAlRyOgxV2dU0Cuz2O8zzzx9VYtbJOaBgXFeLc4IV/Eg50n4LowmehOOR61S8ZMpc2K5Sa7g6A4jfkUw==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"follow-redirects": "^1.15.6",
|
||||
"form-data": "^4.0.4",
|
||||
"proxy-from-env": "^1.1.0"
|
||||
}
|
||||
},
|
||||
"node_modules/balanced-match": {
|
||||
"version": "1.0.2",
|
||||
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
|
||||
@@ -955,6 +981,19 @@
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/call-bind-apply-helpers": {
|
||||
"version": "1.0.2",
|
||||
"resolved": "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz",
|
||||
"integrity": "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"es-errors": "^1.3.0",
|
||||
"function-bind": "^1.1.2"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/callsites": {
|
||||
"version": "3.1.0",
|
||||
"resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz",
|
||||
@@ -1084,6 +1123,18 @@
|
||||
"color-support": "bin.js"
|
||||
}
|
||||
},
|
||||
"node_modules/combined-stream": {
|
||||
"version": "1.0.8",
|
||||
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
|
||||
"integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"delayed-stream": "~1.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.8"
|
||||
}
|
||||
},
|
||||
"node_modules/concat-map": {
|
||||
"version": "0.0.1",
|
||||
"resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
|
||||
@@ -1227,6 +1278,22 @@
|
||||
"node": ">= 10"
|
||||
}
|
||||
},
|
||||
"node_modules/date-fns": {
|
||||
"version": "2.30.0",
|
||||
"resolved": "https://registry.npmjs.org/date-fns/-/date-fns-2.30.0.tgz",
|
||||
"integrity": "sha512-fnULvOpxnC5/Vg3NCiWelDsLiUc9bRwAPs/+LfTLNvetFCtCTN+yQz15C/fs4AwX1R9K5GLtLfn8QW+dWisaAw==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@babel/runtime": "^7.21.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=0.11"
|
||||
},
|
||||
"funding": {
|
||||
"type": "opencollective",
|
||||
"url": "https://opencollective.com/date-fns"
|
||||
}
|
||||
},
|
||||
"node_modules/debug": {
|
||||
"version": "2.6.9",
|
||||
"resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
|
||||
@@ -1259,6 +1326,15 @@
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/delayed-stream": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
|
||||
"integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=0.4.0"
|
||||
}
|
||||
},
|
||||
"node_modules/delegates": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/delegates/-/delegates-1.0.0.tgz",
|
||||
@@ -1315,6 +1391,20 @@
|
||||
"resolved": "https://registry.npmjs.org/dottie/-/dottie-2.0.6.tgz",
|
||||
"integrity": "sha512-iGCHkfUc5kFekGiqhe8B/mdaurD+lakO9txNnTvKtA6PISrw86LgqHvRzWYPyoE2Ph5aMIrCw9/uko6XHTKCwA=="
|
||||
},
|
||||
"node_modules/dunder-proto": {
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
|
||||
"integrity": "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"call-bind-apply-helpers": "^1.0.1",
|
||||
"es-errors": "^1.3.0",
|
||||
"gopd": "^1.2.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/ecdsa-sig-formatter": {
|
||||
"version": "1.0.11",
|
||||
"resolved": "https://registry.npmjs.org/ecdsa-sig-formatter/-/ecdsa-sig-formatter-1.0.11.tgz",
|
||||
@@ -1342,13 +1432,10 @@
|
||||
}
|
||||
},
|
||||
"node_modules/es-define-property": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.0.tgz",
|
||||
"integrity": "sha512-jxayLKShrEqqzJ0eumQbVhTYQM27CfT1T35+gCgDFoL82JLsXqTJ76zv6A0YLOgEnLUMvLzsDsGIrl8NFpT2gQ==",
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz",
|
||||
"integrity": "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"get-intrinsic": "^1.2.4"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
@@ -1362,6 +1449,33 @@
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/es-object-atoms": {
|
||||
"version": "1.1.1",
|
||||
"resolved": "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz",
|
||||
"integrity": "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"es-errors": "^1.3.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/es-set-tostringtag": {
|
||||
"version": "2.1.0",
|
||||
"resolved": "https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz",
|
||||
"integrity": "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"es-errors": "^1.3.0",
|
||||
"get-intrinsic": "^1.2.6",
|
||||
"has-tostringtag": "^1.0.2",
|
||||
"hasown": "^2.0.2"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/escape-html": {
|
||||
"version": "1.0.3",
|
||||
"resolved": "https://registry.npmjs.org/escape-html/-/escape-html-1.0.3.tgz",
|
||||
@@ -1752,6 +1866,42 @@
|
||||
"dev": true,
|
||||
"peer": true
|
||||
},
|
||||
"node_modules/follow-redirects": {
|
||||
"version": "1.15.11",
|
||||
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.11.tgz",
|
||||
"integrity": "sha512-deG2P0JfjrTxl50XGCDyfI97ZGVCxIpfKYmfyrQ54n5FO/0gfIES8C/Psl6kWVDolizcaaxZJnTS0QSMxvnsBQ==",
|
||||
"funding": [
|
||||
{
|
||||
"type": "individual",
|
||||
"url": "https://github.com/sponsors/RubenVerborgh"
|
||||
}
|
||||
],
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=4.0"
|
||||
},
|
||||
"peerDependenciesMeta": {
|
||||
"debug": {
|
||||
"optional": true
|
||||
}
|
||||
}
|
||||
},
|
||||
"node_modules/form-data": {
|
||||
"version": "4.0.4",
|
||||
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.4.tgz",
|
||||
"integrity": "sha512-KrGhL9Q4zjj0kiUt5OO4Mr/A/jlI2jDYs5eHBpYHPcBEVSiipAvn2Ko2HnPe20rmcuuvMHNdZFp+4IlGTMF0Ow==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"asynckit": "^0.4.0",
|
||||
"combined-stream": "^1.0.8",
|
||||
"es-set-tostringtag": "^2.1.0",
|
||||
"hasown": "^2.0.2",
|
||||
"mime-types": "^2.1.12"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 6"
|
||||
}
|
||||
},
|
||||
"node_modules/forwarded": {
|
||||
"version": "0.2.0",
|
||||
"resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
|
||||
@@ -1849,16 +1999,21 @@
|
||||
}
|
||||
},
|
||||
"node_modules/get-intrinsic": {
|
||||
"version": "1.2.4",
|
||||
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.2.4.tgz",
|
||||
"integrity": "sha512-5uYhsJH8VJBTv7oslg4BznJYhDoRI6waYCxMmCdnTrcCrHA/fCFKoTFz2JKKE0HdDFUF7/oQuhzumXJK7paBRQ==",
|
||||
"version": "1.3.0",
|
||||
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
|
||||
"integrity": "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"call-bind-apply-helpers": "^1.0.2",
|
||||
"es-define-property": "^1.0.1",
|
||||
"es-errors": "^1.3.0",
|
||||
"es-object-atoms": "^1.1.1",
|
||||
"function-bind": "^1.1.2",
|
||||
"has-proto": "^1.0.1",
|
||||
"has-symbols": "^1.0.3",
|
||||
"hasown": "^2.0.0"
|
||||
"get-proto": "^1.0.1",
|
||||
"gopd": "^1.2.0",
|
||||
"has-symbols": "^1.1.0",
|
||||
"hasown": "^2.0.2",
|
||||
"math-intrinsics": "^1.1.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
@@ -1867,6 +2022,19 @@
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/get-proto": {
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz",
|
||||
"integrity": "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"dunder-proto": "^1.0.1",
|
||||
"es-object-atoms": "^1.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/glob": {
|
||||
"version": "7.2.3",
|
||||
"resolved": "https://registry.npmjs.org/glob/-/glob-7.2.3.tgz",
|
||||
@@ -1913,12 +2081,12 @@
|
||||
}
|
||||
},
|
||||
"node_modules/gopd": {
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/gopd/-/gopd-1.0.1.tgz",
|
||||
"integrity": "sha512-d65bNlIadxvpb/A2abVdlqKqV563juRnZ1Wtk6s1sIR8uNsXR70xqIzVqxVf1eTqDunwT2MkczEeaezCKTZhwA==",
|
||||
"version": "1.2.0",
|
||||
"resolved": "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz",
|
||||
"integrity": "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"get-intrinsic": "^1.1.3"
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
@@ -1945,10 +2113,10 @@
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/has-proto": {
|
||||
"version": "1.0.3",
|
||||
"resolved": "https://registry.npmjs.org/has-proto/-/has-proto-1.0.3.tgz",
|
||||
"integrity": "sha512-SJ1amZAJUiZS+PhsVLf5tGydlaVB8EdFpaSO4gmiUKUOxk8qzn5AIy4ZeJUmh22znIdk/uMAUT2pl3FxzVUH+Q==",
|
||||
"node_modules/has-symbols": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz",
|
||||
"integrity": "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
@@ -1957,11 +2125,14 @@
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/has-symbols": {
|
||||
"version": "1.0.3",
|
||||
"resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.0.3.tgz",
|
||||
"integrity": "sha512-l3LCuF6MgDNwTDKkdYGEihYjt5pRPbEg46rtlmnSPlUbgmB8LOIrKJbYYFBSbnPaJexMKtiPO8hmeRjRz2Td+A==",
|
||||
"node_modules/has-tostringtag": {
|
||||
"version": "1.0.2",
|
||||
"resolved": "https://registry.npmjs.org/has-tostringtag/-/has-tostringtag-1.0.2.tgz",
|
||||
"integrity": "sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"has-symbols": "^1.0.3"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
},
|
||||
@@ -2405,6 +2576,15 @@
|
||||
"semver": "bin/semver.js"
|
||||
}
|
||||
},
|
||||
"node_modules/math-intrinsics": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
|
||||
"integrity": "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/media-typer": {
|
||||
"version": "0.3.0",
|
||||
"resolved": "https://registry.npmjs.org/media-typer/-/media-typer-0.3.0.tgz",
|
||||
@@ -2636,6 +2816,15 @@
|
||||
"resolved": "https://registry.npmjs.org/node-addon-api/-/node-addon-api-5.1.0.tgz",
|
||||
"integrity": "sha512-eh0GgfEkpnoWDq+VY8OyvYhFEzBk6jIYbRKdIlyTiAXIVJ8PyBaKb0rp7oDtoddbdoHWhq8wwr+XZ81F1rpNdA=="
|
||||
},
|
||||
"node_modules/node-cron": {
|
||||
"version": "4.2.1",
|
||||
"resolved": "https://registry.npmjs.org/node-cron/-/node-cron-4.2.1.tgz",
|
||||
"integrity": "sha512-lgimEHPE/QDgFlywTd8yTR61ptugX3Qer29efeyWw2rv259HtGBNn1vZVmp8lB9uo9wC0t/AT4iGqXxia+CJFg==",
|
||||
"license": "ISC",
|
||||
"engines": {
|
||||
"node": ">=6.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/node-ensure": {
|
||||
"version": "0.0.0",
|
||||
"resolved": "https://registry.npmjs.org/node-ensure/-/node-ensure-0.0.0.tgz",
|
||||
@@ -2662,9 +2851,10 @@
|
||||
}
|
||||
},
|
||||
"node_modules/nodemailer": {
|
||||
"version": "6.9.14",
|
||||
"resolved": "https://registry.npmjs.org/nodemailer/-/nodemailer-6.9.14.tgz",
|
||||
"integrity": "sha512-Dobp/ebDKBvz91sbtRKhcznLThrKxKt97GI2FAlAyy+fk19j73Uz3sBXolVtmcXjaorivqsbbbjDY+Jkt4/bQA==",
|
||||
"version": "7.0.9",
|
||||
"resolved": "https://registry.npmjs.org/nodemailer/-/nodemailer-7.0.9.tgz",
|
||||
"integrity": "sha512-9/Qm0qXIByEP8lEV2qOqcAW7bRpL8CR9jcTwk3NBnHJNmP9fIJ86g2fgmIXqHY+nj55ZEMwWqYAT2QTDpRUYiQ==",
|
||||
"license": "MIT-0",
|
||||
"engines": {
|
||||
"node": ">=6.0.0"
|
||||
}
|
||||
@@ -2972,6 +3162,12 @@
|
||||
"node": ">= 0.10"
|
||||
}
|
||||
},
|
||||
"node_modules/proxy-from-env": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
|
||||
"integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/pstree.remy": {
|
||||
"version": "1.1.8",
|
||||
"resolved": "https://registry.npmjs.org/pstree.remy/-/pstree.remy-1.1.8.tgz",
|
||||
|
||||
14
backend/node_modules/es-define-property/CHANGELOG.md
generated
vendored
14
backend/node_modules/es-define-property/CHANGELOG.md
generated
vendored
@@ -5,6 +5,20 @@ All notable changes to this project will be documented in this file.
|
||||
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
|
||||
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
|
||||
|
||||
## [v1.0.1](https://github.com/ljharb/es-define-property/compare/v1.0.0...v1.0.1) - 2024-12-06
|
||||
|
||||
### Commits
|
||||
|
||||
- [types] use shared tsconfig [`954a663`](https://github.com/ljharb/es-define-property/commit/954a66360326e508a0e5daa4b07493d58f5e110e)
|
||||
- [actions] split out node 10-20, and 20+ [`3a8e84b`](https://github.com/ljharb/es-define-property/commit/3a8e84b23883f26ff37b3e82ff283834228e18c6)
|
||||
- [Dev Deps] update `@ljharb/eslint-config`, `@ljharb/tsconfig`, `@types/get-intrinsic`, `@types/tape`, `auto-changelog`, `gopd`, `tape` [`86ae27b`](https://github.com/ljharb/es-define-property/commit/86ae27bb8cc857b23885136fad9cbe965ae36612)
|
||||
- [Refactor] avoid using `get-intrinsic` [`02480c0`](https://github.com/ljharb/es-define-property/commit/02480c0353ef6118965282977c3864aff53d98b1)
|
||||
- [Tests] replace `aud` with `npm audit` [`f6093ff`](https://github.com/ljharb/es-define-property/commit/f6093ff74ab51c98015c2592cd393bd42478e773)
|
||||
- [Tests] configure testling [`7139e66`](https://github.com/ljharb/es-define-property/commit/7139e66959247a56086d9977359caef27c6849e7)
|
||||
- [Dev Deps] update `tape` [`b901b51`](https://github.com/ljharb/es-define-property/commit/b901b511a75e001a40ce1a59fef7d9ffcfc87482)
|
||||
- [Tests] fix types in tests [`469d269`](https://github.com/ljharb/es-define-property/commit/469d269fd141b1e773ec053a9fa35843493583e0)
|
||||
- [Dev Deps] add missing peer dep [`733acfb`](https://github.com/ljharb/es-define-property/commit/733acfb0c4c96edf337e470b89a25a5b3724c352)
|
||||
|
||||
## v1.0.0 - 2024-02-12
|
||||
|
||||
### Commits
|
||||
|
||||
4
backend/node_modules/es-define-property/index.js
generated
vendored
4
backend/node_modules/es-define-property/index.js
generated
vendored
@@ -1,9 +1,7 @@
|
||||
'use strict';
|
||||
|
||||
var GetIntrinsic = require('get-intrinsic');
|
||||
|
||||
/** @type {import('.')} */
|
||||
var $defineProperty = GetIntrinsic('%Object.defineProperty%', true) || false;
|
||||
var $defineProperty = Object.defineProperty || false;
|
||||
if ($defineProperty) {
|
||||
try {
|
||||
$defineProperty({}, 'a', { value: 1 });
|
||||
|
||||
24
backend/node_modules/es-define-property/package.json
generated
vendored
24
backend/node_modules/es-define-property/package.json
generated
vendored
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "es-define-property",
|
||||
"version": "1.0.0",
|
||||
"version": "1.0.1",
|
||||
"description": "`Object.defineProperty`, but not IE 8's broken one.",
|
||||
"main": "index.js",
|
||||
"types": "./index.d.ts",
|
||||
@@ -19,7 +19,7 @@
|
||||
"pretest": "npm run lint",
|
||||
"tests-only": "nyc tape 'test/**/*.js'",
|
||||
"test": "npm run tests-only",
|
||||
"posttest": "aud --production",
|
||||
"posttest": "npx npm@'>= 10.2' audit --production",
|
||||
"version": "auto-changelog && git add CHANGELOG.md",
|
||||
"postversion": "auto-changelog && git add CHANGELOG.md && git commit --no-edit --amend && git tag -f \"v$(node -e \"console.log(require('./package.json').version)\")\""
|
||||
},
|
||||
@@ -42,29 +42,29 @@
|
||||
"url": "https://github.com/ljharb/es-define-property/issues"
|
||||
},
|
||||
"homepage": "https://github.com/ljharb/es-define-property#readme",
|
||||
"dependencies": {
|
||||
"get-intrinsic": "^1.2.4"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@ljharb/eslint-config": "^21.1.0",
|
||||
"@types/get-intrinsic": "^1.2.2",
|
||||
"@ljharb/eslint-config": "^21.1.1",
|
||||
"@ljharb/tsconfig": "^0.2.2",
|
||||
"@types/gopd": "^1.0.3",
|
||||
"@types/tape": "^5.6.4",
|
||||
"aud": "^2.0.4",
|
||||
"auto-changelog": "^2.4.0",
|
||||
"@types/tape": "^5.6.5",
|
||||
"auto-changelog": "^2.5.0",
|
||||
"encoding": "^0.1.13",
|
||||
"eslint": "^8.8.0",
|
||||
"evalmd": "^0.0.19",
|
||||
"gopd": "^1.0.1",
|
||||
"gopd": "^1.2.0",
|
||||
"in-publish": "^2.0.1",
|
||||
"npmignore": "^0.3.1",
|
||||
"nyc": "^10.3.2",
|
||||
"safe-publish-latest": "^2.0.0",
|
||||
"tape": "^5.7.4",
|
||||
"tape": "^5.9.0",
|
||||
"typescript": "next"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
},
|
||||
"testling": {
|
||||
"files": "test/index.js"
|
||||
},
|
||||
"auto-changelog": {
|
||||
"output": "CHANGELOG.md",
|
||||
"template": "keepachangelog",
|
||||
|
||||
1
backend/node_modules/es-define-property/test/index.js
generated
vendored
1
backend/node_modules/es-define-property/test/index.js
generated
vendored
@@ -10,6 +10,7 @@ test('defineProperty: supported', { skip: !$defineProperty }, function (t) {
|
||||
|
||||
t.equal(typeof $defineProperty, 'function', 'defineProperty is supported');
|
||||
if ($defineProperty && gOPD) { // this `if` check is just to shut TS up
|
||||
/** @type {{ a: number, b?: number, c?: number }} */
|
||||
var o = { a: 1 };
|
||||
|
||||
$defineProperty(o, 'b', { enumerable: true, value: 2 });
|
||||
|
||||
44
backend/node_modules/es-define-property/tsconfig.json
generated
vendored
44
backend/node_modules/es-define-property/tsconfig.json
generated
vendored
@@ -1,47 +1,7 @@
|
||||
{
|
||||
"extends": "@ljharb/tsconfig",
|
||||
"compilerOptions": {
|
||||
/* Visit https://aka.ms/tsconfig.json to read more about this file */
|
||||
|
||||
/* Projects */
|
||||
|
||||
/* Language and Environment */
|
||||
"target": "es2022", /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */
|
||||
// "lib": [], /* Specify a set of bundled library declaration files that describe the target runtime environment. */
|
||||
// "noLib": true, /* Disable including any library files, including the default lib.d.ts. */
|
||||
"useDefineForClassFields": true, /* Emit ECMAScript-standard-compliant class fields. */
|
||||
// "moduleDetection": "auto", /* Control what method is used to detect module-format JS files. */
|
||||
|
||||
/* Modules */
|
||||
"module": "commonjs", /* Specify what module code is generated. */
|
||||
// "rootDir": "./", /* Specify the root folder within your source files. */
|
||||
// "moduleResolution": "node", /* Specify how TypeScript looks up a file from a given module specifier. */
|
||||
// "baseUrl": "./", /* Specify the base directory to resolve non-relative module names. */
|
||||
// "paths": {}, /* Specify a set of entries that re-map imports to additional lookup locations. */
|
||||
// "rootDirs": [], /* Allow multiple folders to be treated as one when resolving modules. */
|
||||
// "typeRoots": ["types"], /* Specify multiple folders that act like `./node_modules/@types`. */
|
||||
"resolveJsonModule": true, /* Enable importing .json files. */
|
||||
// "allowArbitraryExtensions": true, /* Enable importing files with any extension, provided a declaration file is present. */
|
||||
|
||||
/* JavaScript Support */
|
||||
"allowJs": true, /* Allow JavaScript files to be a part of your program. Use the `checkJS` option to get errors from these files. */
|
||||
"checkJs": true, /* Enable error reporting in type-checked JavaScript files. */
|
||||
"maxNodeModuleJsDepth": 1, /* Specify the maximum folder depth used for checking JavaScript files from `node_modules`. Only applicable with `allowJs`. */
|
||||
|
||||
/* Emit */
|
||||
"declaration": true, /* Generate .d.ts files from TypeScript and JavaScript files in your project. */
|
||||
"declarationMap": true, /* Create sourcemaps for d.ts files. */
|
||||
"noEmit": true, /* Disable emitting files from a compilation. */
|
||||
|
||||
/* Interop Constraints */
|
||||
"allowSyntheticDefaultImports": true, /* Allow `import x from y` when a module doesn't have a default export. */
|
||||
"esModuleInterop": true, /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables `allowSyntheticDefaultImports` for type compatibility. */
|
||||
"forceConsistentCasingInFileNames": true, /* Ensure that casing is correct in imports. */
|
||||
|
||||
/* Type Checking */
|
||||
"strict": true, /* Enable all strict type-checking options. */
|
||||
|
||||
/* Completeness */
|
||||
// "skipLibCheck": true /* Skip type checking all .d.ts files. */
|
||||
"target": "es2022",
|
||||
},
|
||||
"exclude": [
|
||||
"coverage",
|
||||
|
||||
4
backend/node_modules/get-intrinsic/.eslintrc
generated
vendored
4
backend/node_modules/get-intrinsic/.eslintrc
generated
vendored
@@ -11,6 +11,10 @@
|
||||
"es2022": true,
|
||||
},
|
||||
|
||||
"globals": {
|
||||
"Float16Array": false,
|
||||
},
|
||||
|
||||
"rules": {
|
||||
"array-bracket-newline": 0,
|
||||
"complexity": 0,
|
||||
|
||||
43
backend/node_modules/get-intrinsic/CHANGELOG.md
generated
vendored
43
backend/node_modules/get-intrinsic/CHANGELOG.md
generated
vendored
@@ -5,6 +5,49 @@ All notable changes to this project will be documented in this file.
|
||||
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
|
||||
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
|
||||
|
||||
## [v1.3.0](https://github.com/ljharb/get-intrinsic/compare/v1.2.7...v1.3.0) - 2025-02-22
|
||||
|
||||
### Commits
|
||||
|
||||
- [Dev Deps] update `es-abstract`, `es-value-fixtures`, `for-each`, `object-inspect` [`9b61553`](https://github.com/ljharb/get-intrinsic/commit/9b61553c587f1c1edbd435597e88c7d387da97dd)
|
||||
- [Deps] update `call-bind-apply-helpers`, `es-object-atoms`, `get-proto` [`a341fee`](https://github.com/ljharb/get-intrinsic/commit/a341fee0f39a403b0f0069e82c97642d5eb11043)
|
||||
- [New] add `Float16Array` [`de22116`](https://github.com/ljharb/get-intrinsic/commit/de22116b492fb989a0341bceb6e573abfaed73dc)
|
||||
|
||||
## [v1.2.7](https://github.com/ljharb/get-intrinsic/compare/v1.2.6...v1.2.7) - 2025-01-02
|
||||
|
||||
### Commits
|
||||
|
||||
- [Refactor] use `get-proto` directly [`00ab955`](https://github.com/ljharb/get-intrinsic/commit/00ab95546a0980c8ad42a84253daaa8d2adcedf9)
|
||||
- [Deps] update `math-intrinsics` [`c716cdd`](https://github.com/ljharb/get-intrinsic/commit/c716cdd6bbe36b438057025561b8bb5a879ac8a0)
|
||||
- [Dev Deps] update `call-bound`, `es-abstract` [`dc648a6`](https://github.com/ljharb/get-intrinsic/commit/dc648a67eb359037dff8d8619bfa71d86debccb1)
|
||||
|
||||
## [v1.2.6](https://github.com/ljharb/get-intrinsic/compare/v1.2.5...v1.2.6) - 2024-12-11
|
||||
|
||||
### Commits
|
||||
|
||||
- [Refactor] use `math-intrinsics` [`841be86`](https://github.com/ljharb/get-intrinsic/commit/841be8641a9254c4c75483b30c8871b5d5065926)
|
||||
- [Refactor] use `es-object-atoms` [`42057df`](https://github.com/ljharb/get-intrinsic/commit/42057dfa16f66f64787e66482af381cc6f31d2c1)
|
||||
- [Deps] update `call-bind-apply-helpers` [`45afa24`](https://github.com/ljharb/get-intrinsic/commit/45afa24a9ee4d6d3c172db1f555b16cb27843ef4)
|
||||
- [Dev Deps] update `call-bound` [`9cba9c6`](https://github.com/ljharb/get-intrinsic/commit/9cba9c6e70212bc163b7a5529cb25df46071646f)
|
||||
|
||||
## [v1.2.5](https://github.com/ljharb/get-intrinsic/compare/v1.2.4...v1.2.5) - 2024-12-06
|
||||
|
||||
### Commits
|
||||
|
||||
- [actions] split out node 10-20, and 20+ [`6e2b9dd`](https://github.com/ljharb/get-intrinsic/commit/6e2b9dd23902665681ebe453256ccfe21d7966f0)
|
||||
- [Refactor] use `dunder-proto` and `call-bind-apply-helpers` instead of `has-proto` [`c095d17`](https://github.com/ljharb/get-intrinsic/commit/c095d179ad0f4fbfff20c8a3e0cb4fe668018998)
|
||||
- [Refactor] use `gopd` [`9841d5b`](https://github.com/ljharb/get-intrinsic/commit/9841d5b35f7ab4fd2d193f0c741a50a077920e90)
|
||||
- [Dev Deps] update `@ljharb/eslint-config`, `auto-changelog`, `es-abstract`, `es-value-fixtures`, `gopd`, `mock-property`, `object-inspect`, `tape` [`2d07e01`](https://github.com/ljharb/get-intrinsic/commit/2d07e01310cee2cbaedfead6903df128b1f5d425)
|
||||
- [Deps] update `gopd`, `has-proto`, `has-symbols`, `hasown` [`974d8bf`](https://github.com/ljharb/get-intrinsic/commit/974d8bf5baad7939eef35c25cc1dd88c10a30fa6)
|
||||
- [Dev Deps] update `call-bind`, `es-abstract`, `tape` [`df9dde1`](https://github.com/ljharb/get-intrinsic/commit/df9dde178186631ab8a3165ede056549918ce4bc)
|
||||
- [Refactor] cache `es-define-property` as well [`43ef543`](https://github.com/ljharb/get-intrinsic/commit/43ef543cb02194401420e3a914a4ca9168691926)
|
||||
- [Deps] update `has-proto`, `has-symbols`, `hasown` [`ad4949d`](https://github.com/ljharb/get-intrinsic/commit/ad4949d5467316505aad89bf75f9417ed782f7af)
|
||||
- [Tests] use `call-bound` directly [`ad5c406`](https://github.com/ljharb/get-intrinsic/commit/ad5c4069774bfe90e520a35eead5fe5ca9d69e80)
|
||||
- [Deps] update `has-proto`, `hasown` [`45414ca`](https://github.com/ljharb/get-intrinsic/commit/45414caa312333a2798953682c68f85c550627dd)
|
||||
- [Tests] replace `aud` with `npm audit` [`18d3509`](https://github.com/ljharb/get-intrinsic/commit/18d3509f79460e7924da70409ee81e5053087523)
|
||||
- [Deps] update `es-define-property` [`aadaa3b`](https://github.com/ljharb/get-intrinsic/commit/aadaa3b2188d77ad9bff394ce5d4249c49eb21f5)
|
||||
- [Dev Deps] add missing peer dep [`c296a16`](https://github.com/ljharb/get-intrinsic/commit/c296a16246d0c9a5981944f4cc5cf61fbda0cf6a)
|
||||
|
||||
## [v1.2.4](https://github.com/ljharb/get-intrinsic/compare/v1.2.3...v1.2.4) - 2024-02-05
|
||||
|
||||
### Commits
|
||||
|
||||
61
backend/node_modules/get-intrinsic/index.js
generated
vendored
61
backend/node_modules/get-intrinsic/index.js
generated
vendored
@@ -2,6 +2,8 @@
|
||||
|
||||
var undefined;
|
||||
|
||||
var $Object = require('es-object-atoms');
|
||||
|
||||
var $Error = require('es-errors');
|
||||
var $EvalError = require('es-errors/eval');
|
||||
var $RangeError = require('es-errors/range');
|
||||
@@ -10,6 +12,14 @@ var $SyntaxError = require('es-errors/syntax');
|
||||
var $TypeError = require('es-errors/type');
|
||||
var $URIError = require('es-errors/uri');
|
||||
|
||||
var abs = require('math-intrinsics/abs');
|
||||
var floor = require('math-intrinsics/floor');
|
||||
var max = require('math-intrinsics/max');
|
||||
var min = require('math-intrinsics/min');
|
||||
var pow = require('math-intrinsics/pow');
|
||||
var round = require('math-intrinsics/round');
|
||||
var sign = require('math-intrinsics/sign');
|
||||
|
||||
var $Function = Function;
|
||||
|
||||
// eslint-disable-next-line consistent-return
|
||||
@@ -19,14 +29,8 @@ var getEvalledConstructor = function (expressionSyntax) {
|
||||
} catch (e) {}
|
||||
};
|
||||
|
||||
var $gOPD = Object.getOwnPropertyDescriptor;
|
||||
if ($gOPD) {
|
||||
try {
|
||||
$gOPD({}, '');
|
||||
} catch (e) {
|
||||
$gOPD = null; // this is IE 8, which has a broken gOPD
|
||||
}
|
||||
}
|
||||
var $gOPD = require('gopd');
|
||||
var $defineProperty = require('es-define-property');
|
||||
|
||||
var throwTypeError = function () {
|
||||
throw new $TypeError();
|
||||
@@ -49,13 +53,13 @@ var ThrowTypeError = $gOPD
|
||||
: throwTypeError;
|
||||
|
||||
var hasSymbols = require('has-symbols')();
|
||||
var hasProto = require('has-proto')();
|
||||
|
||||
var getProto = Object.getPrototypeOf || (
|
||||
hasProto
|
||||
? function (x) { return x.__proto__; } // eslint-disable-line no-proto
|
||||
: null
|
||||
);
|
||||
var getProto = require('get-proto');
|
||||
var $ObjectGPO = require('get-proto/Object.getPrototypeOf');
|
||||
var $ReflectGPO = require('get-proto/Reflect.getPrototypeOf');
|
||||
|
||||
var $apply = require('call-bind-apply-helpers/functionApply');
|
||||
var $call = require('call-bind-apply-helpers/functionCall');
|
||||
|
||||
var needsEval = {};
|
||||
|
||||
@@ -86,6 +90,7 @@ var INTRINSICS = {
|
||||
'%Error%': $Error,
|
||||
'%eval%': eval, // eslint-disable-line no-eval
|
||||
'%EvalError%': $EvalError,
|
||||
'%Float16Array%': typeof Float16Array === 'undefined' ? undefined : Float16Array,
|
||||
'%Float32Array%': typeof Float32Array === 'undefined' ? undefined : Float32Array,
|
||||
'%Float64Array%': typeof Float64Array === 'undefined' ? undefined : Float64Array,
|
||||
'%FinalizationRegistry%': typeof FinalizationRegistry === 'undefined' ? undefined : FinalizationRegistry,
|
||||
@@ -102,7 +107,8 @@ var INTRINSICS = {
|
||||
'%MapIteratorPrototype%': typeof Map === 'undefined' || !hasSymbols || !getProto ? undefined : getProto(new Map()[Symbol.iterator]()),
|
||||
'%Math%': Math,
|
||||
'%Number%': Number,
|
||||
'%Object%': Object,
|
||||
'%Object%': $Object,
|
||||
'%Object.getOwnPropertyDescriptor%': $gOPD,
|
||||
'%parseFloat%': parseFloat,
|
||||
'%parseInt%': parseInt,
|
||||
'%Promise%': typeof Promise === 'undefined' ? undefined : Promise,
|
||||
@@ -128,7 +134,20 @@ var INTRINSICS = {
|
||||
'%URIError%': $URIError,
|
||||
'%WeakMap%': typeof WeakMap === 'undefined' ? undefined : WeakMap,
|
||||
'%WeakRef%': typeof WeakRef === 'undefined' ? undefined : WeakRef,
|
||||
'%WeakSet%': typeof WeakSet === 'undefined' ? undefined : WeakSet
|
||||
'%WeakSet%': typeof WeakSet === 'undefined' ? undefined : WeakSet,
|
||||
|
||||
'%Function.prototype.call%': $call,
|
||||
'%Function.prototype.apply%': $apply,
|
||||
'%Object.defineProperty%': $defineProperty,
|
||||
'%Object.getPrototypeOf%': $ObjectGPO,
|
||||
'%Math.abs%': abs,
|
||||
'%Math.floor%': floor,
|
||||
'%Math.max%': max,
|
||||
'%Math.min%': min,
|
||||
'%Math.pow%': pow,
|
||||
'%Math.round%': round,
|
||||
'%Math.sign%': sign,
|
||||
'%Reflect.getPrototypeOf%': $ReflectGPO
|
||||
};
|
||||
|
||||
if (getProto) {
|
||||
@@ -223,11 +242,11 @@ var LEGACY_ALIASES = {
|
||||
|
||||
var bind = require('function-bind');
|
||||
var hasOwn = require('hasown');
|
||||
var $concat = bind.call(Function.call, Array.prototype.concat);
|
||||
var $spliceApply = bind.call(Function.apply, Array.prototype.splice);
|
||||
var $replace = bind.call(Function.call, String.prototype.replace);
|
||||
var $strSlice = bind.call(Function.call, String.prototype.slice);
|
||||
var $exec = bind.call(Function.call, RegExp.prototype.exec);
|
||||
var $concat = bind.call($call, Array.prototype.concat);
|
||||
var $spliceApply = bind.call($apply, Array.prototype.splice);
|
||||
var $replace = bind.call($call, String.prototype.replace);
|
||||
var $strSlice = bind.call($call, String.prototype.slice);
|
||||
var $exec = bind.call($call, RegExp.prototype.exec);
|
||||
|
||||
/* adapted from https://github.com/lodash/lodash/blob/4.17.15/dist/lodash.js#L6735-L6744 */
|
||||
var rePropName = /[^%.[\]]+|\[(?:(-?\d+(?:\.\d+)?)|(["'])((?:(?!\2)[^\\]|\\.)*?)\2)\]|(?=(?:\.|\[\])(?:\.|\[\]|%$))/g;
|
||||
|
||||
44
backend/node_modules/get-intrinsic/package.json
generated
vendored
44
backend/node_modules/get-intrinsic/package.json
generated
vendored
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "get-intrinsic",
|
||||
"version": "1.2.4",
|
||||
"version": "1.3.0",
|
||||
"description": "Get and robustly cache all JS language-level intrinsics at first require time",
|
||||
"main": "index.js",
|
||||
"exports": {
|
||||
@@ -17,7 +17,7 @@
|
||||
"pretest": "npm run lint",
|
||||
"tests-only": "nyc tape 'test/**/*.js'",
|
||||
"test": "npm run tests-only",
|
||||
"posttest": "aud --production",
|
||||
"posttest": "npx npm@'>= 10.2' audit --production",
|
||||
"version": "auto-changelog && git add CHANGELOG.md",
|
||||
"postversion": "auto-changelog && git add CHANGELOG.md && git commit --no-edit --amend && git tag -f \"v$(node -e \"console.log(require('./package.json').version)\")\""
|
||||
},
|
||||
@@ -43,26 +43,37 @@
|
||||
"url": "https://github.com/ljharb/get-intrinsic/issues"
|
||||
},
|
||||
"homepage": "https://github.com/ljharb/get-intrinsic#readme",
|
||||
"dependencies": {
|
||||
"call-bind-apply-helpers": "^1.0.2",
|
||||
"es-define-property": "^1.0.1",
|
||||
"es-errors": "^1.3.0",
|
||||
"es-object-atoms": "^1.1.1",
|
||||
"function-bind": "^1.1.2",
|
||||
"get-proto": "^1.0.1",
|
||||
"gopd": "^1.2.0",
|
||||
"has-symbols": "^1.1.0",
|
||||
"hasown": "^2.0.2",
|
||||
"math-intrinsics": "^1.1.0"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@ljharb/eslint-config": "^21.1.0",
|
||||
"aud": "^2.0.4",
|
||||
"auto-changelog": "^2.4.0",
|
||||
"call-bind": "^1.0.5",
|
||||
"es-abstract": "^1.22.3",
|
||||
"es-value-fixtures": "^1.4.2",
|
||||
"@ljharb/eslint-config": "^21.1.1",
|
||||
"auto-changelog": "^2.5.0",
|
||||
"call-bound": "^1.0.3",
|
||||
"encoding": "^0.1.13",
|
||||
"es-abstract": "^1.23.9",
|
||||
"es-value-fixtures": "^1.7.1",
|
||||
"eslint": "=8.8.0",
|
||||
"evalmd": "^0.0.19",
|
||||
"for-each": "^0.3.3",
|
||||
"gopd": "^1.0.1",
|
||||
"for-each": "^0.3.5",
|
||||
"make-async-function": "^1.0.0",
|
||||
"make-async-generator-function": "^1.0.0",
|
||||
"make-generator-function": "^2.0.0",
|
||||
"mock-property": "^1.0.3",
|
||||
"mock-property": "^1.1.0",
|
||||
"npmignore": "^0.3.1",
|
||||
"nyc": "^10.3.2",
|
||||
"object-inspect": "^1.13.1",
|
||||
"object-inspect": "^1.13.4",
|
||||
"safe-publish-latest": "^2.0.0",
|
||||
"tape": "^5.7.4"
|
||||
"tape": "^5.9.0"
|
||||
},
|
||||
"auto-changelog": {
|
||||
"output": "CHANGELOG.md",
|
||||
@@ -72,13 +83,6 @@
|
||||
"backfillLimit": false,
|
||||
"hideCredit": true
|
||||
},
|
||||
"dependencies": {
|
||||
"es-errors": "^1.3.0",
|
||||
"function-bind": "^1.1.2",
|
||||
"has-proto": "^1.0.1",
|
||||
"has-symbols": "^1.0.3",
|
||||
"hasown": "^2.0.0"
|
||||
},
|
||||
"testling": {
|
||||
"files": "test/GetIntrinsic.js"
|
||||
},
|
||||
|
||||
4
backend/node_modules/get-intrinsic/test/GetIntrinsic.js
generated
vendored
4
backend/node_modules/get-intrinsic/test/GetIntrinsic.js
generated
vendored
@@ -10,10 +10,10 @@ var asyncFns = require('make-async-function').list();
|
||||
var asyncGenFns = require('make-async-generator-function')();
|
||||
var mockProperty = require('mock-property');
|
||||
|
||||
var callBound = require('call-bind/callBound');
|
||||
var callBound = require('call-bound');
|
||||
var v = require('es-value-fixtures');
|
||||
var $gOPD = require('gopd');
|
||||
var DefinePropertyOrThrow = require('es-abstract/2021/DefinePropertyOrThrow');
|
||||
var DefinePropertyOrThrow = require('es-abstract/2023/DefinePropertyOrThrow');
|
||||
|
||||
var $isProto = callBound('%Object.prototype.isPrototypeOf%');
|
||||
|
||||
|
||||
20
backend/node_modules/gopd/CHANGELOG.md
generated
vendored
20
backend/node_modules/gopd/CHANGELOG.md
generated
vendored
@@ -5,6 +5,26 @@ All notable changes to this project will be documented in this file.
|
||||
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
|
||||
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
|
||||
|
||||
## [v1.2.0](https://github.com/ljharb/gopd/compare/v1.1.0...v1.2.0) - 2024-12-03
|
||||
|
||||
### Commits
|
||||
|
||||
- [New] add `gOPD` entry point; remove `get-intrinsic` [`5b61232`](https://github.com/ljharb/gopd/commit/5b61232dedea4591a314bcf16101b1961cee024e)
|
||||
|
||||
## [v1.1.0](https://github.com/ljharb/gopd/compare/v1.0.1...v1.1.0) - 2024-11-29
|
||||
|
||||
### Commits
|
||||
|
||||
- [New] add types [`f585e39`](https://github.com/ljharb/gopd/commit/f585e397886d270e4ba84e53d226e4f9ca2eb0e6)
|
||||
- [Dev Deps] update `@ljharb/eslint-config`, `auto-changelog`, `tape` [`0b8e4fd`](https://github.com/ljharb/gopd/commit/0b8e4fded64397a7726a9daa144a6cc9a5e2edfa)
|
||||
- [Dev Deps] update `aud`, `npmignore`, `tape` [`48378b2`](https://github.com/ljharb/gopd/commit/48378b2443f09a4f7efbd0fb6c3ee845a6cabcf3)
|
||||
- [Dev Deps] update `@ljharb/eslint-config`, `aud`, `tape` [`78099ee`](https://github.com/ljharb/gopd/commit/78099eeed41bfdc134c912280483689cc8861c31)
|
||||
- [Tests] replace `aud` with `npm audit` [`4e0d0ac`](https://github.com/ljharb/gopd/commit/4e0d0ac47619d24a75318a8e1f543ee04b2a2632)
|
||||
- [meta] add missing `engines.node` [`1443316`](https://github.com/ljharb/gopd/commit/14433165d07835c680155b3dfd62d9217d735eca)
|
||||
- [Deps] update `get-intrinsic` [`eee5f51`](https://github.com/ljharb/gopd/commit/eee5f51769f3dbaf578b70e2a3199116b01aa670)
|
||||
- [Deps] update `get-intrinsic` [`550c378`](https://github.com/ljharb/gopd/commit/550c3780e3a9c77b62565712a001b4ed64ea61f5)
|
||||
- [Dev Deps] add missing peer dep [`8c2ecf8`](https://github.com/ljharb/gopd/commit/8c2ecf848122e4e30abfc5b5086fb48b390dce75)
|
||||
|
||||
## [v1.0.1](https://github.com/ljharb/gopd/compare/v1.0.0...v1.0.1) - 2022-11-01
|
||||
|
||||
### Commits
|
||||
|
||||
5
backend/node_modules/gopd/index.js
generated
vendored
5
backend/node_modules/gopd/index.js
generated
vendored
@@ -1,8 +1,7 @@
|
||||
'use strict';
|
||||
|
||||
var GetIntrinsic = require('get-intrinsic');
|
||||
|
||||
var $gOPD = GetIntrinsic('%Object.getOwnPropertyDescriptor%', true);
|
||||
/** @type {import('.')} */
|
||||
var $gOPD = require('./gOPD');
|
||||
|
||||
if ($gOPD) {
|
||||
try {
|
||||
|
||||
26
backend/node_modules/gopd/package.json
generated
vendored
26
backend/node_modules/gopd/package.json
generated
vendored
@@ -1,10 +1,11 @@
|
||||
{
|
||||
"name": "gopd",
|
||||
"version": "1.0.1",
|
||||
"version": "1.2.0",
|
||||
"description": "`Object.getOwnPropertyDescriptor`, but accounts for IE's broken implementation.",
|
||||
"main": "index.js",
|
||||
"exports": {
|
||||
".": "./index.js",
|
||||
"./gOPD": "./gOPD.js",
|
||||
"./package.json": "./package.json"
|
||||
},
|
||||
"sideEffects": false,
|
||||
@@ -12,12 +13,13 @@
|
||||
"prepack": "npmignore --auto --commentLines=autogenerated",
|
||||
"prepublishOnly": "safe-publish-latest",
|
||||
"prepublish": "not-in-publish || npm run prepublishOnly",
|
||||
"prelint": "tsc -p . && attw -P",
|
||||
"lint": "eslint --ext=js,mjs .",
|
||||
"postlint": "evalmd README.md",
|
||||
"pretest": "npm run lint",
|
||||
"tests-only": "tape 'test/**/*.js'",
|
||||
"test": "npm run tests-only",
|
||||
"posttest": "aud --production",
|
||||
"posttest": "npx npm@'>=10.2' audit --production",
|
||||
"version": "auto-changelog && git add CHANGELOG.md",
|
||||
"postversion": "auto-changelog && git add CHANGELOG.md && git commit --no-edit --amend && git tag -f \"v$(node -e \"console.log(require('./package.json').version)\")\""
|
||||
},
|
||||
@@ -41,19 +43,20 @@
|
||||
"url": "https://github.com/ljharb/gopd/issues"
|
||||
},
|
||||
"homepage": "https://github.com/ljharb/gopd#readme",
|
||||
"dependencies": {
|
||||
"get-intrinsic": "^1.1.3"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@ljharb/eslint-config": "^21.0.0",
|
||||
"aud": "^2.0.1",
|
||||
"auto-changelog": "^2.4.0",
|
||||
"@arethetypeswrong/cli": "^0.17.0",
|
||||
"@ljharb/eslint-config": "^21.1.1",
|
||||
"@ljharb/tsconfig": "^0.2.0",
|
||||
"@types/tape": "^5.6.5",
|
||||
"auto-changelog": "^2.5.0",
|
||||
"encoding": "^0.1.13",
|
||||
"eslint": "=8.8.0",
|
||||
"evalmd": "^0.0.19",
|
||||
"in-publish": "^2.0.1",
|
||||
"npmignore": "^0.3.0",
|
||||
"npmignore": "^0.3.1",
|
||||
"safe-publish-latest": "^2.0.0",
|
||||
"tape": "^5.6.1"
|
||||
"tape": "^5.9.0",
|
||||
"typescript": "next"
|
||||
},
|
||||
"auto-changelog": {
|
||||
"output": "CHANGELOG.md",
|
||||
@@ -67,5 +70,8 @@
|
||||
"ignore": [
|
||||
".github/workflows"
|
||||
]
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
}
|
||||
|
||||
3
backend/node_modules/gopd/test/index.js
generated
vendored
3
backend/node_modules/gopd/test/index.js
generated
vendored
@@ -10,6 +10,7 @@ test('gOPD', function (t) {
|
||||
var obj = { x: 1 };
|
||||
st.ok('x' in obj, 'property exists');
|
||||
|
||||
// @ts-expect-error TS can't figure out narrowing from `skip`
|
||||
var desc = gOPD(obj, 'x');
|
||||
st.deepEqual(
|
||||
desc,
|
||||
@@ -25,7 +26,7 @@ test('gOPD', function (t) {
|
||||
st.end();
|
||||
});
|
||||
|
||||
t.test('not supported', { skip: gOPD }, function (st) {
|
||||
t.test('not supported', { skip: !!gOPD }, function (st) {
|
||||
st.notOk(gOPD, 'is falsy');
|
||||
|
||||
st.end();
|
||||
|
||||
5
backend/node_modules/has-proto/.eslintrc
generated
vendored
5
backend/node_modules/has-proto/.eslintrc
generated
vendored
@@ -1,5 +0,0 @@
|
||||
{
|
||||
"root": true,
|
||||
|
||||
"extends": "@ljharb",
|
||||
}
|
||||
12
backend/node_modules/has-proto/.github/FUNDING.yml
generated
vendored
12
backend/node_modules/has-proto/.github/FUNDING.yml
generated
vendored
@@ -1,12 +0,0 @@
|
||||
# These are supported funding model platforms
|
||||
|
||||
github: [ljharb]
|
||||
patreon: # Replace with a single Patreon username
|
||||
open_collective: # Replace with a single Open Collective username
|
||||
ko_fi: # Replace with a single Ko-fi username
|
||||
tidelift: npm/has-proto
|
||||
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
|
||||
liberapay: # Replace with a single Liberapay username
|
||||
issuehunt: # Replace with a single IssueHunt username
|
||||
otechie: # Replace with a single Otechie username
|
||||
custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']
|
||||
backend/node_modules/has-proto/CHANGELOG.md (generated, vendored): file deleted (38 lines)
backend/node_modules/has-proto/LICENSE (generated, vendored): file deleted (21 lines)
backend/node_modules/has-proto/README.md (generated, vendored): file deleted (38 lines)
backend/node_modules/has-proto/index.d.ts (generated, vendored): file deleted (3 lines)
backend/node_modules/has-proto/index.js (generated, vendored): file deleted (15 lines)
backend/node_modules/has-proto/package.json (generated, vendored): file deleted (78 lines)
backend/node_modules/has-proto/test/index.js (generated, vendored): file deleted (19 lines)
backend/node_modules/has-proto/tsconfig.json (generated, vendored): file deleted (49 lines)
backend/node_modules/has-symbols/CHANGELOG.md (generated, vendored): 16 lines added (v1.1.0 changelog entry)
backend/node_modules/has-symbols/index.js (generated, vendored): 1 line changed
backend/node_modules/has-symbols/package.json (generated, vendored): 28 lines changed (version 1.0.3 → 1.1.0)
backend/node_modules/has-symbols/shams.js (generated, vendored): 7 lines changed
backend/node_modules/has-symbols/test/shams/core-js.js (generated, vendored): 1 line changed
backend/node_modules/has-symbols/test/shams/get-own-property-symbols.js (generated, vendored): 1 line changed
backend/node_modules/has-symbols/test/tests.js (generated, vendored): 6 lines changed
backend/node_modules/nodemailer/.ncurc.js (generated, vendored): 7 lines changed
backend/node_modules/nodemailer/.prettierrc.js (generated, vendored): 2 lines changed
backend/node_modules/nodemailer/CHANGELOG.md (generated, vendored): 655 lines changed; diff not shown (too large)
backend/node_modules/nodemailer/CODE_OF_CONDUCT.md (generated, vendored): 26 lines changed
backend/node_modules/nodemailer/README.md (generated, vendored): 24 lines changed
backend/node_modules/nodemailer/lib/addressparser/index.js (generated, vendored): 84 lines changed (security fix: addresses are no longer extracted from quoted strings; nested address groups are flattened)
backend/node_modules/nodemailer/lib/base64/index.js (generated, vendored): 25 lines changed
backend/node_modules/nodemailer/lib/dkim/index.js (generated, vendored): 6 lines changed
backend/node_modules/nodemailer/lib/dkim/sign.js (generated, vendored): 2 lines changed
backend/node_modules/nodemailer/lib/fetch/index.js (generated, vendored): 8 lines changed
backend/node_modules/nodemailer/lib/mail-composer/index.js (generated, vendored): 72 lines changed (data: URL handling hardened against ReDoS and oversized inputs)
backend/node_modules/nodemailer/lib/mailer/index.js (generated, vendored): 14 lines changed
backend/node_modules/nodemailer/lib/mailer/mail-message.js (generated, vendored): 3 lines changed
backend/node_modules/nodemailer/lib/mime-funcs/index.js (generated, vendored): 4 lines changed
backend/node_modules/nodemailer/lib/mime-funcs/mime-types.js (generated, vendored): 17 lines changed (adds the geojson mapping)
backend/node_modules/nodemailer/lib/mime-node/index.js (generated, vendored): 20 lines changed
backend/node_modules/nodemailer/lib/nodemailer.js (generated, vendored): 7 lines changed (rejects legacy SES configuration, expects @aws-sdk/client-sesv2)
backend/node_modules/nodemailer/lib/qp/index.js (generated, vendored): 12 lines changed
221
backend/node_modules/nodemailer/lib/ses-transport/index.js
generated
vendored
221
backend/node_modules/nodemailer/lib/ses-transport/index.js
generated
vendored
@@ -4,15 +4,11 @@ const EventEmitter = require('events');
const packageData = require('../../package.json');
const shared = require('../shared');
const LeWindows = require('../mime-node/le-windows');
const MimeNode = require('../mime-node');

/**
* Generates a Transport object for AWS SES
*
* Possible options can be the following:
*
* * **sendingRate** optional Number specifying how many messages per second should be delivered to SES
* * **maxConnections** optional Number specifying max number of parallel connections to SES
*
* @constructor
* @param {Object} optional config parameter
*/

@@ -30,119 +26,17 @@ class SESTransport extends EventEmitter {
this.logger = shared.getLogger(this.options, {
component: this.options.component || 'ses-transport'
});
// parallel sending connections
this.maxConnections = Number(this.options.maxConnections) || Infinity;
this.connections = 0;
// max messages per second
this.sendingRate = Number(this.options.sendingRate) || Infinity;
this.sendingRateTTL = null;
this.rateInterval = 1000; // milliseconds
this.rateMessages = [];
this.pending = [];
this.idling = true;
setImmediate(() => {
if (this.idling) {
this.emit('idle');
}
});
}
/**
* Schedules a sending of a message
*
* @param {Object} emailMessage MailComposer object
* @param {Function} callback Callback function to run when the sending is completed
*/
send(mail, callback) {
if (this.connections >= this.maxConnections) {
this.idling = false;
return this.pending.push({
mail,
callback
});
getRegion(cb) {
if (this.ses.sesClient.config && typeof this.ses.sesClient.config.region === 'function') {
// promise
return this.ses.sesClient.config
.region()
.then(region => cb(null, region))
.catch(err => cb(err));
}
if (!this._checkSendingRate()) {
this.idling = false;
return this.pending.push({
mail,
callback
});
}
this._send(mail, (...args) => {
setImmediate(() => callback(...args));
this._sent();
});
}
_checkRatedQueue() {
if (this.connections >= this.maxConnections || !this._checkSendingRate()) {
return;
}
if (!this.pending.length) {
if (!this.idling) {
this.idling = true;
this.emit('idle');
}
return;
}
let next = this.pending.shift();
this._send(next.mail, (...args) => {
setImmediate(() => next.callback(...args));
this._sent();
});
}
_checkSendingRate() {
clearTimeout(this.sendingRateTTL);
let now = Date.now();
let oldest = false;
// delete older messages
for (let i = this.rateMessages.length - 1; i >= 0; i--) {
if (this.rateMessages[i].ts >= now - this.rateInterval && (!oldest || this.rateMessages[i].ts < oldest)) {
oldest = this.rateMessages[i].ts;
}
if (this.rateMessages[i].ts < now - this.rateInterval && !this.rateMessages[i].pending) {
this.rateMessages.splice(i, 1);
}
}
if (this.rateMessages.length < this.sendingRate) {
return true;
}
let delay = Math.max(oldest + 1001, now + 20);
this.sendingRateTTL = setTimeout(() => this._checkRatedQueue(), now - delay);
try {
this.sendingRateTTL.unref();
} catch (E) {
// Ignore. Happens on envs with non-node timer implementation
}
return false;
}
_sent() {
this.connections--;
this._checkRatedQueue();
}
/**
* Returns true if there are free slots in the queue
*/
isIdle() {
return this.idling;
return cb(null, false);
}
/**

@@ -151,13 +45,17 @@ class SESTransport extends EventEmitter {
* @param {Object} emailMessage MailComposer object
* @param {Function} callback Callback function to run when the sending is completed
*/
_send(mail, callback) {
send(mail, callback) {
let statObject = {
ts: Date.now(),
pending: true
};
this.connections++;
this.rateMessages.push(statObject);
let fromHeader = mail.message._headers.find(header => /^from$/i.test(header.key));
if (fromHeader) {
let mimeNode = new MimeNode('text/plain');
fromHeader = mimeNode._convertAddresses(mimeNode._parseAddresses(fromHeader.value));
}
let envelope = mail.data.envelope || mail.message.getEnvelope();
let messageId = mail.message.messageId();

@@ -227,45 +125,29 @@ class SESTransport extends EventEmitter {
}
let sesMessage = {
RawMessage: {
// required
Data: raw // required
Content: {
Raw: {
// required
Data: raw // required
}
},
Source: envelope.from,
Destinations: envelope.to
FromEmailAddress: fromHeader ? fromHeader : envelope.from,
Destination: {
ToAddresses: envelope.to
}
};
Object.keys(mail.data.ses || {}).forEach(key => {
sesMessage[key] = mail.data.ses[key];
});
let ses = (this.ses.aws ? this.ses.ses : this.ses) || {};
let aws = this.ses.aws || {};
let getRegion = cb => {
if (ses.config && typeof ses.config.region === 'function') {
// promise
return ses.config
.region()
.then(region => cb(null, region))
.catch(err => cb(err));
}
return cb(null, (ses.config && ses.config.region) || 'us-east-1');
};
getRegion((err, region) => {
this.getRegion((err, region) => {
if (err || !region) {
region = 'us-east-1';
}
let sendPromise;
if (typeof ses.send === 'function' && aws.SendRawEmailCommand) {
// v3 API
sendPromise = ses.send(new aws.SendRawEmailCommand(sesMessage));
} else {
// v2 API
sendPromise = ses.sendRawEmail(sesMessage).promise();
}
const command = new this.ses.SendEmailCommand(sesMessage);
const sendPromise = this.ses.sesClient.send(command);
sendPromise
.then(data => {

@@ -273,7 +155,7 @@ class SESTransport extends EventEmitter {
region = 'email';
}
statObject.pending = false;
statObject.pending = true;
callback(null, {
envelope: {
from: envelope.from,

@@ -309,38 +191,41 @@ class SESTransport extends EventEmitter {
*/
verify(callback) {
let promise;
let ses = (this.ses.aws ? this.ses.ses : this.ses) || {};
let aws = this.ses.aws || {};
const sesMessage = {
RawMessage: {
// required
Data: 'From: invalid@invalid\r\nTo: invalid@invalid\r\n Subject: Invalid\r\n\r\nInvalid'
},
Source: 'invalid@invalid',
Destinations: ['invalid@invalid']
};
if (!callback) {
promise = new Promise((resolve, reject) => {
callback = shared.callbackPromise(resolve, reject);
});
}
const cb = err => {
if (err && (err.code || err.Code) !== 'InvalidParameterValue') {
if (err && !['InvalidParameterValue', 'MessageRejected'].includes(err.code || err.Code || err.name)) {
return callback(err);
}
return callback(null, true);
};
if (typeof ses.send === 'function' && aws.SendRawEmailCommand) {
// v3 API
sesMessage.RawMessage.Data = Buffer.from(sesMessage.RawMessage.Data);
ses.send(new aws.SendRawEmailCommand(sesMessage), cb);
} else {
// v2 API
ses.sendRawEmail(sesMessage, cb);
}
const sesMessage = {
Content: {
Raw: {
Data: Buffer.from('From: <invalid@invalid>\r\nTo: <invalid@invalid>\r\n Subject: Invalid\r\n\r\nInvalid')
}
},
FromEmailAddress: 'invalid@invalid',
Destination: {
ToAddresses: ['invalid@invalid']
}
};
this.getRegion((err, region) => {
if (err || !region) {
region = 'us-east-1';
}
const command = new this.ses.SendEmailCommand(sesMessage);
const sendPromise = this.ses.sesClient.send(command);
sendPromise.then(data => cb(null, data)).catch(err => cb(err));
});
return promise;
}
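The rewritten transport above no longer branches between the v2 `sendRawEmail` and v3 `SendRawEmailCommand` paths; it reads `this.ses.sesClient` and `this.ses.SendEmailCommand` directly. A minimal sketch of how a transporter could be wired to that shape, assuming the `@aws-sdk/client-sesv2` package supplies the client and command (the region and addresses are illustrative, not from this repository):

```js
// Sketch only: wiring the rewritten SES transport to an SESv2 client.
const nodemailer = require('nodemailer');
const { SESv2Client, SendEmailCommand } = require('@aws-sdk/client-sesv2');

// The option object mirrors the fields the new code reads: sesClient + SendEmailCommand.
const sesClient = new SESv2Client({ region: 'eu-central-1' });

const transporter = nodemailer.createTransport({
    SES: { sesClient, SendEmailCommand }
});

// sendMail() then ends up as a SendEmailCommand with Content.Raw.Data,
// FromEmailAddress and Destination.ToAddresses, as in the hunks above.
transporter.sendMail(
    {
        from: 'noreply@example.com',
        to: 'someone@example.com',
        subject: 'Test',
        text: 'Hello'
    },
    (err, info) => {
        if (err) {
            console.error(err);
        } else {
            console.log(info.messageId);
        }
    }
);
```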
backend/node_modules/nodemailer/lib/shared/index.js (154 changed lines, generated, vendored)
@@ -11,11 +11,19 @@ const net = require('net');
const os = require('os');

const DNS_TTL = 5 * 60 * 1000;
const CACHE_CLEANUP_INTERVAL = 30 * 1000; // Minimum 30 seconds between cleanups
const MAX_CACHE_SIZE = 1000; // Maximum number of entries in cache

let lastCacheCleanup = 0;
module.exports._lastCacheCleanup = () => lastCacheCleanup;
module.exports._resetCacheCleanup = () => {
lastCacheCleanup = 0;
};

let networkInterfaces;
try {
networkInterfaces = os.networkInterfaces();
} catch (err) {
} catch (_err) {
// fails on some systems
}

@@ -81,8 +89,8 @@ const formatDNSValue = (value, extra) => {
!value.addresses || !value.addresses.length
? null
: value.addresses.length === 1
? value.addresses[0]
: value.addresses[Math.floor(Math.random() * value.addresses.length)]
? value.addresses[0]
: value.addresses[Math.floor(Math.random() * value.addresses.length)]
},
extra || {}
);

@@ -113,7 +121,27 @@ module.exports.resolveHostname = (options, callback) => {
if (dnsCache.has(options.host)) {
cached = dnsCache.get(options.host);
if (!cached.expires || cached.expires >= Date.now()) {
// Lazy cleanup with time throttling
const now = Date.now();
if (now - lastCacheCleanup > CACHE_CLEANUP_INTERVAL) {
lastCacheCleanup = now;
// Clean up expired entries
for (const [host, entry] of dnsCache.entries()) {
if (entry.expires && entry.expires < now) {
dnsCache.delete(host);
}
}
// If cache is still too large, remove oldest entries
if (dnsCache.size > MAX_CACHE_SIZE) {
const toDelete = Math.floor(MAX_CACHE_SIZE * 0.1); // Remove 10% of entries
const keys = Array.from(dnsCache.keys()).slice(0, toDelete);
keys.forEach(key => dnsCache.delete(key));
}
}
if (!cached.expires || cached.expires >= now) {
return callback(
null,
formatDNSValue(cached.value, {

@@ -126,7 +154,11 @@ module.exports.resolveHostname = (options, callback) => {
resolver(4, options.host, options, (err, addresses) => {
if (err) {
if (cached) {
// ignore error, use expired value
dnsCache.set(options.host, {
value: cached.value,
expires: Date.now() + (options.dnsTtl || DNS_TTL)
});
return callback(
null,
formatDNSValue(cached.value, {

@@ -160,7 +192,11 @@ module.exports.resolveHostname = (options, callback) => {
resolver(6, options.host, options, (err, addresses) => {
if (err) {
if (cached) {
// ignore error, use expired value
dnsCache.set(options.host, {
value: cached.value,
expires: Date.now() + (options.dnsTtl || DNS_TTL)
});
return callback(
null,
formatDNSValue(cached.value, {

@@ -195,7 +231,11 @@ module.exports.resolveHostname = (options, callback) => {
dns.lookup(options.host, { all: true }, (err, addresses) => {
if (err) {
if (cached) {
// ignore error, use expired value
dnsCache.set(options.host, {
value: cached.value,
expires: Date.now() + (options.dnsTtl || DNS_TTL)
});
return callback(
null,
formatDNSValue(cached.value, {

@@ -246,9 +286,13 @@ module.exports.resolveHostname = (options, callback) => {
})
);
});
} catch (err) {
} catch (_err) {
if (cached) {
// ignore error, use expired value
dnsCache.set(options.host, {
value: cached.value,
expires: Date.now() + (options.dnsTtl || DNS_TTL)
});
return callback(
null,
formatDNSValue(cached.value, {

@@ -419,52 +463,74 @@ module.exports.callbackPromise = (resolve, reject) =>
};

module.exports.parseDataURI = uri => {
let input = uri;
let commaPos = input.indexOf(',');
if (!commaPos) {
return uri;
if (typeof uri !== 'string') {
return null;
}
let data = input.substring(commaPos + 1);
let metaStr = input.substring('data:'.length, commaPos);
// Early return for non-data URIs to avoid unnecessary processing
if (!uri.startsWith('data:')) {
return null;
}
// Find the first comma safely - this prevents ReDoS
const commaPos = uri.indexOf(',');
if (commaPos === -1) {
return null;
}
const data = uri.substring(commaPos + 1);
const metaStr = uri.substring('data:'.length, commaPos);
let encoding;
const metaEntries = metaStr.split(';');
let metaEntries = metaStr.split(';');
let lastMetaEntry = metaEntries.length > 1 ? metaEntries[metaEntries.length - 1] : false;
if (lastMetaEntry && lastMetaEntry.indexOf('=') < 0) {
encoding = lastMetaEntry.toLowerCase();
metaEntries.pop();
}
let contentType = metaEntries.shift() || 'application/octet-stream';
let params = {};
for (let entry of metaEntries) {
let sep = entry.indexOf('=');
if (sep >= 0) {
let key = entry.substring(0, sep);
let value = entry.substring(sep + 1);
params[key] = value;
if (metaEntries.length > 0) {
const lastEntry = metaEntries[metaEntries.length - 1].toLowerCase().trim();
// Only recognize valid encoding types to prevent manipulation
if (['base64', 'utf8', 'utf-8'].includes(lastEntry) && lastEntry.indexOf('=') === -1) {
encoding = lastEntry;
metaEntries.pop();
}
}
switch (encoding) {
case 'base64':
data = Buffer.from(data, 'base64');
break;
case 'utf8':
data = Buffer.from(data);
break;
default:
try {
data = Buffer.from(decodeURIComponent(data));
} catch (err) {
data = Buffer.from(data);
const contentType = metaEntries.length > 0 ? metaEntries.shift() : 'application/octet-stream';
const params = {};
for (let i = 0; i < metaEntries.length; i++) {
const entry = metaEntries[i];
const sepPos = entry.indexOf('=');
if (sepPos > 0) {
// Ensure there's a key before the '='
const key = entry.substring(0, sepPos).trim();
const value = entry.substring(sepPos + 1).trim();
if (key) {
params[key] = value;
}
data = Buffer.from(data);
}
}
return { data, encoding, contentType, params };
// Decode data based on encoding with proper error handling
let bufferData;
try {
if (encoding === 'base64') {
bufferData = Buffer.from(data, 'base64');
} else {
try {
bufferData = Buffer.from(decodeURIComponent(data));
} catch (_decodeError) {
bufferData = Buffer.from(data);
}
}
} catch (_bufferError) {
bufferData = Buffer.alloc(0);
}
return {
data: bufferData,
encoding: encoding || null,
contentType: contentType || 'application/octet-stream',
params
};
};

/**
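For reference, the stricter `parseDataURI` above (an internal helper, not public nodemailer API) now returns `null` for non-`data:` input instead of echoing the string back. A small usage sketch, assuming deep requires into `nodemailer/lib/shared` are not blocked by the package:

```js
// Behaviour inferred from the new parseDataURI code above; sketch only.
const shared = require('nodemailer/lib/shared');

// base64-encoded "Hello"
const parsed = shared.parseDataURI('data:text/plain;base64,SGVsbG8=');
// -> { data: Buffer, encoding: 'base64', contentType: 'text/plain', params: {} }
console.log(parsed.data.toString()); // "Hello"

// anything that does not start with "data:" (or has no comma) is rejected
console.log(shared.parseDataURI('not-a-data-uri')); // null
```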
backend/node_modules/nodemailer/lib/smtp-connection/http-proxy-client.js (4 changed lines, generated, vendored)
@@ -51,7 +51,7 @@ function httpProxyClient(proxyUrl, destinationPort, destinationHost, callback) {
finished = true;
try {
socket.destroy();
} catch (E) {
} catch (_E) {
// ignore
}
callback(err);

@@ -118,7 +118,7 @@ function httpProxyClient(proxyUrl, destinationPort, destinationHost, callback) {
if (!match || (match[1] || '').charAt(0) !== '2') {
try {
socket.destroy();
} catch (E) {
} catch (_E) {
// ignore
}
return callback(new Error('Invalid response from proxy' + ((match && ': ' + match[1]) || '')));
backend/node_modules/nodemailer/lib/smtp-connection/index.js (34 changed lines, generated, vendored)
@@ -124,7 +124,7 @@ class SMTPConnection extends EventEmitter {

/**
* The socket connecting to the server
* @publick
* @public
*/
this._socket = false;

@@ -243,6 +243,8 @@ class SMTPConnection extends EventEmitter {
if (this.options.connection) {
// connection is already opened
this._socket = this.options.connection;
setupConnectionHandlers();
if (this.secureConnection && !this.alreadySecured) {
setImmediate(() =>
this._upgradeConnection(err => {

@@ -412,8 +414,8 @@ class SMTPConnection extends EventEmitter {
if (socket && !socket.destroyed) {
try {
this._socket[closeMethod]();
} catch (E) {
socket[closeMethod]();
} catch (_E) {
// just ignore
}
}

@@ -628,6 +630,15 @@ class SMTPConnection extends EventEmitter {
let startTime = Date.now();
this._setEnvelope(envelope, (err, info) => {
if (err) {
// create passthrough stream to consume to prevent OOM
let stream = new PassThrough();
if (typeof message.pipe === 'function') {
message.pipe(stream);
} else {
stream.write(message);
stream.end();
}
return callback(err);
}
let envelopeTime = Date.now();

@@ -1283,7 +1294,12 @@ class SMTPConnection extends EventEmitter {
if (str.charAt(0) !== '2') {
if (this.options.requireTLS) {
this._onError(new Error('EHLO failed but HELO does not support required STARTTLS. response=' + str), 'ECONNECTION', str, 'EHLO');
this._onError(
new Error('EHLO failed but HELO does not support required STARTTLS. response=' + str),
'ECONNECTION',
str,
'EHLO'
);
return;
}

@@ -1465,7 +1481,9 @@ class SMTPConnection extends EventEmitter {
let challengeString = '';
if (!challengeMatch) {
return callback(this._formatError('Invalid login sequence while waiting for server challenge string', 'EAUTH', str, 'AUTH CRAM-MD5'));
return callback(
this._formatError('Invalid login sequence while waiting for server challenge string', 'EAUTH', str, 'AUTH CRAM-MD5')
);
} else {
challengeString = challengeMatch[1];
}

@@ -1608,7 +1626,7 @@ class SMTPConnection extends EventEmitter {
}
if (!this._envelope.rcptQueue.length) {
return callback(this._formatError('Can\x27t send mail - no recipients defined', 'EENVELOPE', false, 'API'));
return callback(this._formatError("Can't send mail - no recipients defined", 'EENVELOPE', false, 'API'));
} else {
this._recipientQueue = [];

@@ -1664,7 +1682,7 @@ class SMTPConnection extends EventEmitter {
});
this._sendCommand('DATA');
} else {
err = this._formatError('Can\x27t send mail - all recipients were rejected', 'EENVELOPE', str, 'RCPT TO');
err = this._formatError("Can't send mail - all recipients were rejected", 'EENVELOPE', str, 'RCPT TO');
err.rejected = this._envelope.rejected;
err.rejectedErrors = this._envelope.rejectedErrors;
return callback(err);

@@ -1803,7 +1821,7 @@ class SMTPConnection extends EventEmitter {
let defaultHostname;
try {
defaultHostname = os.hostname() || '';
} catch (err) {
} catch (_err) {
// fails on windows 7
defaultHostname = 'localhost';
}
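The `@@ -628,6 +630,15 @@` hunk above drains the message into a `PassThrough` when setting the envelope fails, so a paused source stream cannot keep buffering in memory. The same pattern in isolation, as a generic sketch (not code from this repository):

```js
// Generic sketch of the "consume to prevent OOM" pattern used above.
const { PassThrough } = require('stream');

function discardMessage(message) {
    // Route the data into a sink that keeps reading and throws it away,
    // so the producer is never left paused with buffered chunks.
    const sink = new PassThrough();
    sink.resume(); // actively drain

    if (typeof message.pipe === 'function') {
        message.pipe(sink);
    } else {
        sink.write(message);
        sink.end();
    }
}
```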
backend/node_modules/nodemailer/lib/smtp-pool/index.js (4 changed lines, generated, vendored)
@@ -406,6 +406,10 @@ class SMTPPool extends EventEmitter {
this._continueProcessing();
}, 50);
} else {
if (!this._closed && this.idling && !this._connections.length) {
this.emit('clear');
}
this._continueProcessing();
}
});
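The added lines make the pool emit `clear` once it is idling with no open connections and not closed. A hedged usage sketch; the event name is taken from the hunk above, but whether the top-level transporter re-emits it (rather than only the internal pool object) depends on the nodemailer version, and the connection settings are placeholders:

```js
// Sketch: reacting to the 'clear' event added in the hunk above.
const nodemailer = require('nodemailer');

const transporter = nodemailer.createTransport({
    pool: true,
    host: 'smtp.example.com', // placeholder
    port: 587,
    auth: { user: 'user', pass: 'pass' } // placeholder credentials
});

// Assumption: the pooled transport surfaces the pool's 'clear' event.
transporter.on('clear', () => {
    // all pooled connections are closed and nothing is queued
    console.log('SMTP pool drained, safe to shut down');
    transporter.close();
});
```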
Some files were not shown because too many files have changed in this diff.