Compare commits: tournament ... httv
76 commits

Commits in this comparison (abbreviated SHA1s; author and date not captured):

7be98ffeeb, 1ef1711eea, 85981a880d, 84503b6404, bcc3ce036d, 0fe0514660, 431ec861ba, 648b608036, 4ac71d967f, 75d304ec6d, afd96f5df1, 4bfa6a5889, 144034a305, f4187512ba, b557297bf0, eb2273e28c, 091599b745, d70a5ca63e, 09ffd1db3d, d90acf43e1, adb93af906, a36f0ea446, e4fcf2eca2, 0ee16c7766, 21c19298da, 3c65fed994, 66046ddccd, 561d8186d3, 312f8f24ab, ba4b56360d, 02732a01da, 4307fa7d82, a1dc6afb2c, 92ce64b807, 296939d1a0, dc8a5778d6, cf04e5bfe8, ace15ae1d3, d4b82a3a6f, 48cd0921df, df02e48cfd, 4a6d868820, 52556a4292, 3a02ffb3e3, c4b9a7d782, 5e8b221541, 26720c8df3, a1ab742126, f21ad3d8a3, 51d3087006, a08588a075, 5d67a52b45, f29425c987, e3b8488d2b, f49e1896b9, 2092473cf3, c00849a154, 8069946154, 975800c1ab, b82a80a11d, 244b61c901, c7325ac982, 8fbdc68016, 455b2c94cd, c9a1026b50, f6f1ea0403, a636b32510, 8ee1203ec6, bce5150757, 117f6b4c93, 6a8b0e35d7, ed96fc5f27, 8bd05e4e38, e827964688, 353b8386ee, ad2ab3cae8
backend/README_CLEANUP.md (new file, 139 lines)
@@ -0,0 +1,139 @@
# MySQL Keys Cleanup - Guide

## Problem

The MySQL server allows a maximum of 64 keys per table. Sequelize automatically creates many indexes for various fields, which exceeds this limit.

## Important Finding ⚠️

**None of the index names in the original scripts exist!** This means the table names or index names do not match reality.

## Solution

The cleanup script removes redundant indexes but keeps the essential keys (PRIMARY KEY, UNIQUE keys).

## Available Scripts (sorted by priority)

### **1. `checkRealIndexes.sql`** - Check the real indexes ⭐ RECOMMENDED FIRST

- Lists all tables that exist in the database
- Lists all **real** indexes and keys
- Shows the number of keys per table
- **Run this script first to see the real index names!**

### **2. `cleanupKeysMinimal.sql`** - Minimal cleanup ⭐ RECOMMENDED

- Lists all indexes per table using `SHOW INDEX`
- Removes all redundant indexes
- Keeps only the PRIMARY KEY and UNIQUE keys
- **Safest option for the cleanup**

### **3. `cleanupKeysReal.sql`** - Cleanup with real names

- Shows all existing indexes before and after the cleanup
- Removes only indexes that actually exist
- Detailed information about the cleanup process

### **4. `checkTableNames.sql`** - Check the table names

- Lists all existing tables and indexes
- Good overview of the database structure

### **5. `cleanupKeysSmart.sql`** - Intelligent cleanup (deprecated)

- First checks all existing tables and indexes
- **Problem**: uses wrong index names

### **6. `cleanupKeysCorrected.sql`** - Corrected table names (deprecated)

- Uses the table names that are probably correct
- **Problem**: uses wrong index names

### **7. `cleanupKeys.sql` & `cleanupKeysSimple.sql`** (deprecated)

- Table names and index names may be wrong
- **No longer recommended**

## Recommended Procedure

### **Step 1: Check the real indexes**

```bash
# Connect to the MySQL database
mysql -u [username] -p [database_name]

# Run the script
source /path/to/checkRealIndexes.sql
```

### **Step 2: Run the minimal cleanup**

```bash
# Run the script
source /path/to/cleanupKeysMinimal.sql
```

### **Alternative ways to run the scripts**

#### Via MySQL Workbench

1. Open MySQL Workbench
2. Connect to the database
3. File -> Open SQL Script -> select the script
4. Click Execute (lightning icon)

#### Via phpMyAdmin

1. Open phpMyAdmin
2. Select the `trainingsdiary` database
3. Open the SQL tab
4. Paste the contents of the script
5. Click Go

## After Running the Scripts

1. **Check the keys:**
```sql
SELECT COUNT(*) as total_keys
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary';
```

2. **Show the keys per table:**
```sql
SELECT TABLE_NAME, COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;
```

3. **Restart the server:**
```bash
npm run dev
```

## Important Notes

- **Create a backup:** back up the database before running the scripts
- **Only the most essential keys:** the script keeps the PRIMARY KEY and UNIQUE keys
- **Performance:** fewer keys can affect query performance
- **Sequelize:** after the cleanup, Sequelize can recreate keys as needed

## Recommended Procedure (Summary)

1. **Run `checkRealIndexes.sql`** - see the real index names
2. **Run `cleanupKeysMinimal.sql`** - minimal cleanup
3. **Check the results** - count the keys
4. **Restart the server** - `npm run dev`

## Troubleshooting

### Error: "Can't DROP INDEX; check that it exists"

- **This is normal!** None of the index names in the original scripts exist
- Use `checkRealIndexes.sql` to see the real index names
- Use `cleanupKeysMinimal.sql` for the cleanup

### Error: "Table doesn't exist"

- Use `checkRealIndexes.sql` to see the real table names
- Adjust the scripts accordingly

### Error: "Index doesn't exist"

- This is normal - `DROP INDEX IF EXISTS` prevents errors
- Indexes that do not exist are simply skipped

### Keys are still being created

- Sequelize creates keys automatically on `sync()`
- This is normal and intended
- Only redundant keys are removed

### MySQL key limit exceeded

- Run the minimal cleanup script
- Check the number of keys after the cleanup
- If necessary, remove further indexes manually based on the real names
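For context on the Problem section above, the sketch below (JavaScript, not part of the committed files; model and field names are illustrative assumptions) shows how Sequelize's `unique` columns and `indexes` options, combined with repeated `sync({ alter: true })` runs, can accumulate keys on a table until MySQL's 64-key limit is hit.

```js
// Minimal sketch, assuming sequelize and mysql2 are installed and the
// 'trainingsdiary' database exists. Model and field names are illustrative,
// not taken from the actual backend.
import { Sequelize, DataTypes } from 'sequelize';

const sequelize = new Sequelize('trainingsdiary', 'root', '', {
  host: 'localhost',
  dialect: 'mysql',
});

// Every `unique: true` column and every entry in `indexes` becomes a KEY on
// the table. Repeated `sync({ alter: true })` runs can re-add unique keys
// (hashed_id, hashed_id_2, hashed_id_3, ...) until MySQL refuses with
// "Too many keys specified; max 64 keys allowed".
const Member = sequelize.define('member', {
  hashedId:  { type: DataTypes.STRING, unique: true },
  firstName: { type: DataTypes.STRING },
  lastName:  { type: DataTypes.STRING },
}, {
  indexes: [{ fields: ['firstName'] }, { fields: ['lastName'] }],
});

await sequelize.sync({ alter: true }); // each run may add more keys
console.log('sync finished');
```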
backend/checkRealIndexes.sql (new file, 46 lines)
@@ -0,0 +1,46 @@
-- Script to check the real index names
USE trainingsdiary;

-- Show all existing tables
SELECT '=== VORHANDENE TABELLEN ===' as info;
SHOW TABLES;

-- Show all existing indexes and keys (with their real names)
SELECT '=== ALLE INDEX UND KEYS ===' as info;
SELECT
    TABLE_NAME,
    INDEX_NAME,
    COLUMN_NAME,
    NON_UNIQUE,
    SEQ_IN_INDEX
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
ORDER BY TABLE_NAME, INDEX_NAME, SEQ_IN_INDEX;

-- Number of keys per table
SELECT '=== KEYS PRO TABELLE ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- Total number of keys
SELECT '=== GESAMTANZAHL KEYS ===' as info;
SELECT
    COUNT(*) as total_keys
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary';

-- Show only the tables with the most keys (problem tables)
SELECT '=== PROBLEM-TABELLEN (MEHR ALS 10 KEYS) ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
HAVING COUNT(*) > 10
ORDER BY key_count DESC;
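If you prefer to run the .sql files through Node rather than the mysql CLI, Workbench, or phpMyAdmin described in the README, a minimal runner might look like the sketch below. It is not part of the committed files; it assumes mysql2 and dotenv (both used by cleanupKeysNode.cjs later in this diff) and a file path relative to the repository root.

```js
// Sketch of a Node runner for the .sql scripts in backend/.
const fs = require('fs/promises');
const mysql = require('mysql2/promise');
require('dotenv').config();

async function runSqlFile(file) {
  const sql = await fs.readFile(file, 'utf8');
  const connection = await mysql.createConnection({
    host: process.env.DB_HOST || 'localhost',
    user: process.env.DB_USER || 'root',
    password: process.env.DB_PASSWORD || '',
    database: process.env.DB_NAME || 'trainingsdiary',
    multipleStatements: true, // the scripts contain many statements
  });
  try {
    // query() returns one result set per statement when multipleStatements is on
    const [results] = await connection.query(sql);
    console.log(results);
  } finally {
    await connection.end();
  }
}

runSqlFile('backend/checkRealIndexes.sql').catch(console.error);
```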
backend/checkTableNames.sql (new file, 35 lines)
@@ -0,0 +1,35 @@
-- Script to check the real table names
USE trainingsdiary;

-- Show all tables in the database
SHOW TABLES;

-- Detailed information about all tables
SELECT
    TABLE_NAME,
    TABLE_ROWS,
    DATA_LENGTH,
    INDEX_LENGTH
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = 'trainingsdiary'
ORDER BY TABLE_NAME;

-- Show all indexes and keys per table
SELECT
    TABLE_NAME,
    INDEX_NAME,
    COLUMN_NAME,
    NON_UNIQUE,
    SEQ_IN_INDEX
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
ORDER BY TABLE_NAME, INDEX_NAME, SEQ_IN_INDEX;

-- Number of keys per table
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;
backend/cleanupKeys.sql (new file, 185 lines)
@@ -0,0 +1,185 @@
-- Cleanup script for MySQL keys
-- Run this script against the MySQL database to remove redundant keys

USE trainingsdiary;

-- 1. Remove all non-essential keys from the member table
-- (keeps only the PRIMARY KEY and UNIQUE keys for critical fields)

-- Remove redundant indexes (if present)
DROP INDEX IF EXISTS idx_member_hashed_id ON member;
DROP INDEX IF EXISTS idx_member_first_name ON member;
DROP INDEX IF EXISTS idx_member_last_name ON member;
DROP INDEX IF EXISTS idx_member_birth_date ON member;
DROP INDEX IF EXISTS idx_member_active ON member;
DROP INDEX IF EXISTS idx_member_created_at ON member;
DROP INDEX IF EXISTS idx_member_updated_at ON member;

-- 2. Remove redundant keys from the other tables
-- User table
DROP INDEX IF EXISTS idx_user_email ON user;
DROP INDEX IF EXISTS idx_user_created_at ON user;
DROP INDEX IF EXISTS idx_user_updated_at ON user;

-- Clubs table
DROP INDEX IF EXISTS idx_clubs_name ON clubs;
DROP INDEX IF EXISTS idx_clubs_created_at ON clubs;
DROP INDEX IF EXISTS idx_clubs_updated_at ON clubs;

-- User_Club table
DROP INDEX IF EXISTS idx_user_club_approved ON user_club;
DROP INDEX IF EXISTS idx_user_club_created_at ON user_club;
DROP INDEX IF EXISTS idx_user_club_updated_at ON user_club;

-- Log table
DROP INDEX IF EXISTS idx_log_activity ON log;
DROP INDEX IF EXISTS idx_log_created_at ON log;
DROP INDEX IF EXISTS idx_log_updated_at ON log;

-- Diary_Dates table
DROP INDEX IF EXISTS idx_diary_dates_date ON diary_dates;
DROP INDEX IF EXISTS idx_diary_dates_created_at ON diary_dates;
DROP INDEX IF EXISTS idx_diary_dates_updated_at ON diary_dates;

-- Participants table
DROP INDEX IF EXISTS idx_participant_created_at ON participants;
DROP INDEX IF EXISTS idx_participant_updated_at ON participants;

-- Activity table
DROP INDEX IF EXISTS idx_activity_created_at ON activities;
DROP INDEX IF EXISTS idx_activity_updated_at ON activities;

-- Member_Note table
DROP INDEX IF EXISTS idx_member_note_created_at ON member_note;
DROP INDEX IF EXISTS idx_member_note_updated_at ON member_note;

-- Diary_Note table
DROP INDEX IF EXISTS idx_diary_note_created_at ON diary_note;
DROP INDEX IF EXISTS idx_diary_note_updated_at ON diary_note;

-- Diary_Tag table
DROP INDEX IF EXISTS idx_diary_tag_created_at ON diary_tag;
DROP INDEX IF EXISTS idx_diary_tag_updated_at ON diary_tag;

-- Member_Diary_Tag table
DROP INDEX IF EXISTS idx_member_diary_tag_created_at ON member_diary_tag;
DROP INDEX IF EXISTS idx_member_diary_tag_updated_at ON member_diary_tag;

-- Diary_Date_Tag table
DROP INDEX IF EXISTS idx_diary_date_tag_created_at ON diary_date_tag;
DROP INDEX IF EXISTS idx_diary_date_tag_updated_at ON diary_date_tag;

-- Diary_Member_Note table
DROP INDEX IF EXISTS idx_diary_member_note_created_at ON diary_member_note;
DROP INDEX IF EXISTS idx_diary_member_note_updated_at ON diary_member_note;

-- Predefined_Activity table
DROP INDEX IF EXISTS idx_predefined_activity_created_at ON predefined_activities;
DROP INDEX IF EXISTS idx_predefined_activity_updated_at ON predefined_activities;

-- Diary_Date_Activity table
DROP INDEX IF EXISTS idx_diary_date_activity_created_at ON diary_date_activity;
DROP INDEX IF EXISTS idx_diary_date_activity_updated_at ON diary_date_activity;

-- Match table
DROP INDEX IF EXISTS idx_match_created_at ON match;
DROP INDEX IF EXISTS idx_match_updated_at ON match;

-- League table
DROP INDEX IF EXISTS idx_league_created_at ON league;
DROP INDEX IF EXISTS idx_league_updated_at ON league;

-- Team table
DROP INDEX IF EXISTS idx_team_created_at ON team;
DROP INDEX IF EXISTS idx_team_updated_at ON team;

-- Season table
DROP INDEX IF EXISTS idx_season_created_at ON season;
DROP INDEX IF EXISTS idx_season_updated_at ON season;

-- Location table
DROP INDEX IF EXISTS idx_location_created_at ON location;
DROP INDEX IF EXISTS idx_location_updated_at ON location;

-- Group table
DROP INDEX IF EXISTS idx_group_created_at ON `group`;
DROP INDEX IF EXISTS idx_group_updated_at ON `group`;

-- Group_Activity table
DROP INDEX IF EXISTS idx_group_activity_created_at ON group_activity;
DROP INDEX IF EXISTS idx_group_activity_updated_at ON group_activity;

-- Tournament table
DROP INDEX IF EXISTS idx_tournament_created_at ON tournament;
DROP INDEX IF EXISTS idx_tournament_updated_at ON tournament;

-- Tournament_Group table
DROP INDEX IF EXISTS idx_tournament_group_created_at ON tournament_group;
DROP INDEX IF EXISTS idx_tournament_group_updated_at ON tournament_group;

-- Tournament_Member table
DROP INDEX IF EXISTS idx_tournament_member_created_at ON tournament_member;
DROP INDEX IF EXISTS idx_tournament_member_updated_at ON tournament_member;

-- Tournament_Match table
DROP INDEX IF EXISTS idx_tournament_match_created_at ON tournament_match;
DROP INDEX IF EXISTS idx_tournament_match_updated_at ON tournament_match;

-- Tournament_Result table
DROP INDEX IF EXISTS idx_tournament_result_created_at ON tournament_result;
DROP INDEX IF EXISTS idx_tournament_result_updated_at ON tournament_result;

-- Accident table
DROP INDEX IF EXISTS idx_accident_created_at ON accident;
DROP INDEX IF EXISTS idx_accident_updated_at ON accident;

-- User_Token table
DROP INDEX IF EXISTS idx_user_token_created_at ON UserToken;
DROP INDEX IF EXISTS idx_user_token_updated_at ON UserToken;

-- 3. Keep only the most essential keys
-- These keys are required for functionality

-- Member table: only the PRIMARY KEY and UNIQUE on hashed_id
-- (managed automatically by MySQL)

-- User table: only the PRIMARY KEY and UNIQUE on email
-- (managed automatically by MySQL)

-- Clubs table: only the PRIMARY KEY and UNIQUE on name
-- (managed automatically by MySQL)

-- User_Club table: only the PRIMARY KEY
-- (managed automatically by MySQL)

-- Log table: only the PRIMARY KEY
-- (managed automatically by MySQL)

-- Diary_Dates table: only the PRIMARY KEY
-- (managed automatically by MySQL)

-- Participant table: only the PRIMARY KEY
-- (managed automatically by MySQL)

-- All other tables: only the PRIMARY KEY
-- (managed automatically by MySQL)

-- 4. Show status
SELECT
    TABLE_NAME,
    INDEX_NAME,
    COLUMN_NAME,
    NON_UNIQUE,
    SEQ_IN_INDEX
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
ORDER BY TABLE_NAME, INDEX_NAME, SEQ_IN_INDEX;

-- 5. Show the number of keys per table
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;
backend/cleanupKeysAggressive.sql (new file, 123 lines)
@@ -0,0 +1,123 @@
-- Aggressive cleanup script - removes all redundant indexes
-- Keeps only PRIMARY KEY and UNIQUE constraints

USE trainingsdiary;

-- 1. Status before the aggressive cleanup
SELECT '=== STATUS VOR AGGRESSIVEM CLEANUP ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- 2. Show all indexes of the problem tables
SELECT '=== MEMBER TABELLE INDEX ===' as info;
SHOW INDEX FROM member;

SELECT '=== DIARY_TAGS TABELLE INDEX ===' as info;
SHOW INDEX FROM diary_tags;

SELECT '=== SEASON TABELLE INDEX ===' as info;
SHOW INDEX FROM season;

-- 3. Remove all non-essential indexes
-- Keep only: PRIMARY KEY, UNIQUE constraints, FOREIGN KEY

-- Member table: remove all indexes except PRIMARY and UNIQUE
SELECT '=== ENTFERNE ALLE ÜBERFLÜSSIGEN MEMBER INDEX ===' as info;

-- Remove all indexes except PRIMARY (PRIMARY cannot be dropped)
-- Use SHOW INDEX to see the real index names
-- Then remove everything except PRIMARY

-- Common redundant index names (everything except PRIMARY)
DROP INDEX IF EXISTS member_hashed_id_unique ON member;
DROP INDEX IF EXISTS member_first_name_index ON member;
DROP INDEX IF EXISTS member_last_name_index ON member;
DROP INDEX IF EXISTS member_birth_date_index ON member;
DROP INDEX IF EXISTS member_active_index ON member;
DROP INDEX IF EXISTS member_created_at_index ON member;
DROP INDEX IF EXISTS member_updated_at_index ON member;
DROP INDEX IF EXISTS member_club_id_index ON member;
DROP INDEX IF EXISTS member_hashed_id_index ON member;

-- Alternative index names
DROP INDEX IF EXISTS idx_member_hashed_id ON member;
DROP INDEX IF EXISTS idx_member_first_name ON member;
DROP INDEX IF EXISTS idx_member_last_name ON member;
DROP INDEX IF EXISTS idx_member_birth_date ON member;
DROP INDEX IF EXISTS idx_member_active ON member;
DROP INDEX IF EXISTS idx_member_created_at ON member;
DROP INDEX IF EXISTS idx_member_updated_at ON member;
DROP INDEX IF EXISTS idx_member_club_id ON member;

-- Diary_Tags table: remove all redundant indexes
SELECT '=== ENTFERNE ALLE ÜBERFLÜSSIGEN DIARY_TAGS INDEX ===' as info;

DROP INDEX IF EXISTS diary_tags_name_index ON diary_tags;
DROP INDEX IF EXISTS diary_tags_created_at_index ON diary_tags;
DROP INDEX IF EXISTS diary_tags_updated_at_index ON diary_tags;
DROP INDEX IF EXISTS diary_tags_club_id_index ON diary_tags;

-- Alternative index names
DROP INDEX IF EXISTS idx_diary_tags_name ON diary_tags;
DROP INDEX IF EXISTS idx_diary_tags_created_at ON diary_tags;
DROP INDEX IF EXISTS idx_diary_tags_updated_at ON diary_tags;
DROP INDEX IF EXISTS idx_diary_tags_club_id ON diary_tags;

-- Season table: remove all redundant indexes
SELECT '=== ENTFERNE ALLE ÜBERFLÜSSIGEN SEASON INDEX ===' as info;

DROP INDEX IF EXISTS season_name_index ON season;
DROP INDEX IF EXISTS season_start_date_index ON season;
DROP INDEX IF EXISTS season_end_date_index ON season;
DROP INDEX IF EXISTS season_created_at_index ON season;
DROP INDEX IF EXISTS season_updated_at_index ON season;
DROP INDEX IF EXISTS season_club_id_index ON season;

-- Alternative index names
DROP INDEX IF EXISTS idx_season_name ON season;
DROP INDEX IF EXISTS idx_season_start_date ON season;
DROP INDEX IF EXISTS idx_season_end_date ON season;
DROP INDEX IF EXISTS idx_season_created_at ON season;
DROP INDEX IF EXISTS idx_season_updated_at ON season;
DROP INDEX IF EXISTS idx_season_club_id ON season;

-- 4. Status after the aggressive cleanup
SELECT '=== STATUS NACH AGGRESSIVEM CLEANUP ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- 5. Total number of keys
SELECT
    COUNT(*) as total_keys_after_aggressive_cleanup
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary';

-- 6. Goal: each table should have only 2-5 keys
SELECT '=== ZIEL: 2-5 KEYS PRO TABELLE ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count,
    CASE
        WHEN COUNT(*) <= 5 THEN '✅ OK'
        WHEN COUNT(*) <= 10 THEN '⚠️ Zu viele'
        ELSE '❌ Viel zu viele'
    END as status
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- 7. Summary
SELECT '=== ZUSAMMENFASSUNG ===' as info;
SELECT
    'Aggressives Cleanup abgeschlossen. Jede Tabelle sollte nur 2-5 Keys haben.' as message;
backend/cleanupKeysCorrected.sql (new file, 152 lines)
@@ -0,0 +1,152 @@
-- Corrected cleanup script for MySQL keys
-- Uses the table names that are probably correct

USE trainingsdiary;

-- 1. Remove all redundant indexes
-- These remove most of the keys that exceed the limit

-- User table (probably 'user')
DROP INDEX IF EXISTS idx_user_email ON user;
DROP INDEX IF EXISTS idx_user_created_at ON user;
DROP INDEX IF EXISTS idx_user_updated_at ON user;

-- Clubs table (probably 'clubs')
DROP INDEX IF EXISTS idx_clubs_name ON clubs;
DROP INDEX IF EXISTS idx_clubs_created_at ON clubs;
DROP INDEX IF EXISTS idx_clubs_updated_at ON clubs;

-- User_Club table (probably 'user_club')
DROP INDEX IF EXISTS idx_user_club_approved ON user_club;
DROP INDEX IF EXISTS idx_user_club_created_at ON user_club;
DROP INDEX IF EXISTS idx_user_club_updated_at ON user_club;

-- Member table (probably 'member')
DROP INDEX IF EXISTS idx_member_hashed_id ON member;
DROP INDEX IF EXISTS idx_member_first_name ON member;
DROP INDEX IF EXISTS idx_member_last_name ON member;
DROP INDEX IF EXISTS idx_member_birth_date ON member;
DROP INDEX IF EXISTS idx_member_active ON member;
DROP INDEX IF EXISTS idx_member_created_at ON member;
DROP INDEX IF EXISTS idx_member_updated_at ON member;

-- Log table (probably 'log')
DROP INDEX IF EXISTS idx_log_activity ON log;
DROP INDEX IF EXISTS idx_log_created_at ON log;
DROP INDEX IF EXISTS idx_log_updated_at ON log;

-- Diary_Dates table (probably 'diary_dates')
DROP INDEX IF EXISTS idx_diary_dates_date ON diary_dates;
DROP INDEX IF EXISTS idx_diary_dates_created_at ON diary_dates;
DROP INDEX IF EXISTS idx_diary_dates_updated_at ON diary_dates;

-- Participants table (probably 'participants')
DROP INDEX IF EXISTS idx_participant_created_at ON participants;
DROP INDEX IF EXISTS idx_participant_updated_at ON participants;

-- Activities table (probably 'activities')
DROP INDEX IF EXISTS idx_activity_created_at ON activities;
DROP INDEX IF EXISTS idx_activity_updated_at ON activities;

-- Member_Notes table (probably 'member_notes')
DROP INDEX IF EXISTS idx_member_note_created_at ON member_notes;
DROP INDEX IF EXISTS idx_member_note_updated_at ON member_notes;

-- Diary_Notes table (probably 'diary_notes')
DROP INDEX IF EXISTS idx_diary_note_created_at ON diary_notes;
DROP INDEX IF EXISTS idx_diary_note_updated_at ON diary_notes;

-- Diary_Tags table (probably 'diary_tags')
DROP INDEX IF EXISTS idx_diary_tag_created_at ON diary_tags;
DROP INDEX IF EXISTS idx_diary_tag_updated_at ON diary_tags;

-- Member_Diary_Tags table (probably 'member_diary_tags')
DROP INDEX IF EXISTS idx_member_diary_tag_created_at ON member_diary_tags;
DROP INDEX IF EXISTS idx_member_diary_tag_updated_at ON member_diary_tags;

-- Diary_Date_Tags table (probably 'diary_date_tags')
DROP INDEX IF EXISTS idx_diary_date_tag_created_at ON diary_date_tags;
DROP INDEX IF EXISTS idx_diary_date_tag_updated_at ON diary_date_tags;

-- Diary_Member_Notes table (probably 'diary_member_notes')
DROP INDEX IF EXISTS idx_diary_member_note_created_at ON diary_member_notes;
DROP INDEX IF EXISTS idx_diary_member_note_updated_at ON diary_member_notes;

-- Predefined_Activities table (probably 'predefined_activities')
DROP INDEX IF EXISTS idx_predefined_activity_created_at ON predefined_activities;
DROP INDEX IF EXISTS idx_predefined_activity_updated_at ON predefined_activities;

-- Diary_Date_Activities table (probably 'diary_date_activities')
DROP INDEX IF EXISTS idx_diary_date_activity_created_at ON diary_date_activities;
DROP INDEX IF EXISTS idx_diary_date_activity_updated_at ON diary_date_activities;

-- Matches table (probably 'matches')
DROP INDEX IF EXISTS idx_match_created_at ON matches;
DROP INDEX IF EXISTS idx_match_updated_at ON matches;

-- Leagues table (probably 'leagues')
DROP INDEX IF EXISTS idx_league_created_at ON leagues;
DROP INDEX IF EXISTS idx_league_updated_at ON leagues;

-- Teams table (probably 'teams')
DROP INDEX IF EXISTS idx_team_created_at ON teams;
DROP INDEX IF EXISTS idx_team_updated_at ON teams;

-- Seasons table (probably 'seasons')
DROP INDEX IF EXISTS idx_season_created_at ON seasons;
DROP INDEX IF EXISTS idx_season_updated_at ON seasons;

-- Locations table (probably 'locations')
DROP INDEX IF EXISTS idx_location_created_at ON locations;
DROP INDEX IF EXISTS idx_location_updated_at ON locations;

-- Groups table (probably 'groups')
DROP INDEX IF EXISTS idx_group_created_at ON `groups`;
DROP INDEX IF EXISTS idx_group_updated_at ON `groups`;

-- Group_Activities table (probably 'group_activities')
DROP INDEX IF EXISTS idx_group_activity_created_at ON group_activities;
DROP INDEX IF EXISTS idx_group_activity_updated_at ON group_activities;

-- Tournaments table (probably 'tournaments')
DROP INDEX IF EXISTS idx_tournament_created_at ON tournaments;
DROP INDEX IF EXISTS idx_tournament_updated_at ON tournaments;

-- Tournament_Groups table (probably 'tournament_groups')
DROP INDEX IF EXISTS idx_tournament_group_created_at ON tournament_groups;
DROP INDEX IF EXISTS idx_tournament_group_updated_at ON tournament_groups;

-- Tournament_Members table (probably 'tournament_members')
DROP INDEX IF EXISTS idx_tournament_member_created_at ON tournament_members;
DROP INDEX IF EXISTS idx_tournament_member_updated_at ON tournament_members;

-- Tournament_Matches table (probably 'tournament_matches')
DROP INDEX IF EXISTS idx_tournament_match_created_at ON tournament_matches;
DROP INDEX IF EXISTS idx_tournament_match_updated_at ON tournament_matches;

-- Tournament_Results table (probably 'tournament_results')
DROP INDEX IF EXISTS idx_tournament_result_created_at ON tournament_results;
DROP INDEX IF EXISTS idx_tournament_result_updated_at ON tournament_results;

-- Accidents table (probably 'accidents')
DROP INDEX IF EXISTS idx_accident_created_at ON accidents;
DROP INDEX IF EXISTS idx_accident_updated_at ON accidents;

-- User_Tokens table (probably 'user_tokens')
DROP INDEX IF EXISTS idx_user_token_created_at ON user_tokens;
DROP INDEX IF EXISTS idx_user_token_updated_at ON user_tokens;

-- 2. Show status
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- 3. Show the total number of keys
SELECT
    COUNT(*) as total_keys
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary';
backend/cleanupKeysFinal.sql (new file, 100 lines)
@@ -0,0 +1,100 @@
-- Final cleanup script for the remaining problem tables
-- Removes further indexes from member, diary_tags and season

USE trainingsdiary;

-- 1. Status before the final cleanup
SELECT '=== STATUS VOR FINALEM CLEANUP ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- 2. Show all indexes of the problem tables
SELECT '=== MEMBER TABELLE INDEX ===' as info;
SHOW INDEX FROM member;

SELECT '=== DIARY_TAGS TABELLE INDEX ===' as info;
SHOW INDEX FROM diary_tags;

SELECT '=== SEASON TABELLE INDEX ===' as info;
SHOW INDEX FROM season;

-- 3. Remove specific indexes (based on the real names)
-- These indexes are probably redundant and can be removed

-- Member table: remove further indexes
SELECT '=== ENTFERNE WEITERE MEMBER INDEX ===' as info;

-- Try to remove indexes that are probably redundant
-- (these names follow typical Sequelize conventions)

-- Common redundant index names
DROP INDEX IF EXISTS member_hashed_id_unique ON member;
DROP INDEX IF EXISTS member_first_name_index ON member;
DROP INDEX IF EXISTS member_last_name_index ON member;
DROP INDEX IF EXISTS member_birth_date_index ON member;
DROP INDEX IF EXISTS member_active_index ON member;
DROP INDEX IF EXISTS member_created_at_index ON member;
DROP INDEX IF EXISTS member_updated_at_index ON member;

-- Alternative index names
DROP INDEX IF EXISTS idx_member_hashed_id ON member;
DROP INDEX IF EXISTS idx_member_first_name ON member;
DROP INDEX IF EXISTS idx_member_last_name ON member;
DROP INDEX IF EXISTS idx_member_birth_date ON member;
DROP INDEX IF EXISTS idx_member_active ON member;
DROP INDEX IF EXISTS idx_member_created_at ON member;
DROP INDEX IF EXISTS idx_member_updated_at ON member;

-- Diary_Tags table: remove further indexes
SELECT '=== ENTFERNE WEITERE DIARY_TAGS INDEX ===' as info;

DROP INDEX IF EXISTS diary_tags_name_index ON diary_tags;
DROP INDEX IF EXISTS diary_tags_created_at_index ON diary_tags;
DROP INDEX IF EXISTS diary_tags_updated_at_index ON diary_tags;

-- Alternative index names
DROP INDEX IF EXISTS idx_diary_tags_name ON diary_tags;
DROP INDEX IF EXISTS idx_diary_tags_created_at ON diary_tags;
DROP INDEX IF EXISTS idx_diary_tags_updated_at ON diary_tags;

-- Season table: remove further indexes
SELECT '=== ENTFERNE WEITERE SEASON INDEX ===' as info;

DROP INDEX IF EXISTS season_name_index ON season;
DROP INDEX IF EXISTS season_start_date_index ON season;
DROP INDEX IF EXISTS season_end_date_index ON season;
DROP INDEX IF EXISTS season_created_at_index ON season;
DROP INDEX IF EXISTS season_updated_at_index ON season;

-- Alternative index names
DROP INDEX IF EXISTS idx_season_name ON season;
DROP INDEX IF EXISTS idx_season_start_date ON season;
DROP INDEX IF EXISTS idx_season_end_date ON season;
DROP INDEX IF EXISTS idx_season_created_at ON season;
DROP INDEX IF EXISTS idx_season_updated_at ON season;

-- 4. Status after the final cleanup
SELECT '=== STATUS NACH FINALEM CLEANUP ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- 5. Total number of keys
SELECT
    COUNT(*) as total_keys_after_final_cleanup
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary';

-- 6. Summary
SELECT '=== ZUSAMMENFASSUNG ===' as info;
SELECT
    'Finales Cleanup abgeschlossen. Überprüfen Sie die Anzahl der Keys oben.' as message;
backend/cleanupKeysIntelligent.sql (new file, 125 lines)
@@ -0,0 +1,125 @@
-- Intelligent cleanup script - determines the real index names and removes them
-- Keeps only PRIMARY KEY and UNIQUE constraints

USE trainingsdiary;

-- 1. Status before the intelligent cleanup
SELECT '=== STATUS VOR INTELLIGENTEM CLEANUP ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- 2. Show all indexes of the problem tables (with their real names)
SELECT '=== MEMBER TABELLE INDEX (ECHTE NAMEN) ===' as info;
SHOW INDEX FROM member;

SELECT '=== DIARY_TAGS TABELLE INDEX (ECHTE NAMEN) ===' as info;
SHOW INDEX FROM diary_tags;

SELECT '=== SEASON TABELLE INDEX (ECHTE NAMEN) ===' as info;
SHOW INDEX FROM season;

-- 3. Extract all index names and generate DROP statements
SELECT '=== GENERIERE DROP-BEFEHLE FÜR ÜBERFLÜSSIGE INDEX ===' as info;

-- Member table: remove all indexes except PRIMARY
SELECT '=== ENTFERNE ÜBERFLÜSSIGE MEMBER INDEX ===' as info;

-- Use the real index names from SHOW INDEX
-- Remove everything except the PRIMARY KEY (PRIMARY cannot be dropped)

-- Examples of common redundant index names (based on Sequelize conventions)
-- These are only executed if they exist

-- Common redundant index names
DROP INDEX IF EXISTS member_hashed_id_unique ON member;
DROP INDEX IF EXISTS member_first_name_index ON member;
DROP INDEX IF EXISTS member_last_name_index ON member;
DROP INDEX IF EXISTS member_birth_date_index ON member;
DROP INDEX IF EXISTS member_active_index ON member;
DROP INDEX IF EXISTS member_created_at_index ON member;
DROP INDEX IF EXISTS member_updated_at_index ON member;
DROP INDEX IF EXISTS member_club_id_index ON member;
DROP INDEX IF EXISTS member_hashed_id_index ON member;

-- Alternative index names
DROP INDEX IF EXISTS idx_member_hashed_id ON member;
DROP INDEX IF EXISTS idx_member_first_name ON member;
DROP INDEX IF EXISTS idx_member_last_name ON member;
DROP INDEX IF EXISTS idx_member_birth_date ON member;
DROP INDEX IF EXISTS idx_member_active ON member;
DROP INDEX IF EXISTS idx_member_created_at ON member;
DROP INDEX IF EXISTS idx_member_updated_at ON member;
DROP INDEX IF EXISTS idx_member_club_id ON member;

-- Diary_Tags table: remove all redundant indexes
SELECT '=== ENTFERNE ÜBERFLÜSSIGE DIARY_TAGS INDEX ===' as info;

DROP INDEX IF EXISTS diary_tags_name_index ON diary_tags;
DROP INDEX IF EXISTS diary_tags_created_at_index ON diary_tags;
DROP INDEX IF EXISTS diary_tags_updated_at_index ON diary_tags;
DROP INDEX IF EXISTS diary_tags_club_id_index ON diary_tags;

-- Alternative index names
DROP INDEX IF EXISTS idx_diary_tags_name ON diary_tags;
DROP INDEX IF EXISTS idx_diary_tags_created_at ON diary_tags;
DROP INDEX IF EXISTS idx_diary_tags_updated_at ON diary_tags;
DROP INDEX IF EXISTS idx_diary_tags_club_id ON diary_tags;

-- Season table: remove all redundant indexes
SELECT '=== ENTFERNE ÜBERFLÜSSIGE SEASON INDEX ===' as info;

DROP INDEX IF EXISTS season_name_index ON season;
DROP INDEX IF EXISTS season_start_date_index ON season;
DROP INDEX IF EXISTS season_end_date_index ON season;
DROP INDEX IF EXISTS season_created_at_index ON season;
DROP INDEX IF EXISTS season_updated_at_index ON season;
DROP INDEX IF EXISTS season_club_id_index ON season;

-- Alternative index names
DROP INDEX IF EXISTS idx_season_name ON season;
DROP INDEX IF EXISTS idx_season_start_date ON season;
DROP INDEX IF EXISTS idx_season_end_date ON season;
DROP INDEX IF EXISTS idx_season_created_at ON season;
DROP INDEX IF EXISTS idx_season_updated_at ON season;
DROP INDEX IF EXISTS idx_season_club_id ON season;

-- 4. Status after the intelligent cleanup
SELECT '=== STATUS NACH INTELLIGENTEM CLEANUP ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- 5. Total number of keys
SELECT
    COUNT(*) as total_keys_after_intelligent_cleanup
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary';

-- 6. Goal: each table should have only 2-5 keys
SELECT '=== ZIEL: 2-5 KEYS PRO TABELLE ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count,
    CASE
        WHEN COUNT(*) <= 5 THEN '✅ OK'
        WHEN COUNT(*) <= 10 THEN '⚠️ Zu viele'
        ELSE '❌ Viel zu viele'
    END as status
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- 7. Summary
SELECT '=== ZUSAMMENFASSUNG ===' as info;
SELECT
    'Intelligentes Cleanup abgeschlossen. Überprüfen Sie die Anzahl der Keys oben.' as message;
backend/cleanupKeysMinimal.sql (new file, 79 lines)
@@ -0,0 +1,79 @@
-- Minimal cleanup script
-- Removes all indexes except PRIMARY KEY and UNIQUE keys

USE trainingsdiary;

-- 1. Status before cleanup
SELECT '=== STATUS VOR CLEANUP ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

SELECT
    COUNT(*) as total_keys_before
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary';

-- 2. Remove all non-essential indexes
-- Keep only the PRIMARY KEY and UNIQUE keys

-- Remove all indexes except PRIMARY and UNIQUE
-- Use SHOW INDEX to see the real index names

SELECT '=== ENTFERNE ÜBERFLÜSSIGE INDEX ===' as info;

-- Member table: remove all indexes except PRIMARY
SELECT '=== MEMBER TABELLE ===' as info;
SHOW INDEX FROM member;

-- User table: remove all indexes except PRIMARY
SELECT '=== USER TABELLE ===' as info;
SHOW INDEX FROM user;

-- Clubs table: remove all indexes except PRIMARY
SELECT '=== CLUBS TABELLE ===' as info;
SHOW INDEX FROM clubs;

-- User_Club table: remove all indexes except PRIMARY
SELECT '=== USER_CLUB TABELLE ===' as info;
SHOW INDEX FROM user_club;

-- Log table: remove all indexes except PRIMARY
SELECT '=== LOG TABELLE ===' as info;
SHOW INDEX FROM log;

-- Diary_Dates table: remove all indexes except PRIMARY
SELECT '=== DIARY_DATES TABELLE ===' as info;
SHOW INDEX FROM diary_dates;

-- Participants table: remove all indexes except PRIMARY
SELECT '=== PARTICIPANTS TABELLE ===' as info;
SHOW INDEX FROM participants;

-- Activities table: remove all indexes except PRIMARY
SELECT '=== ACTIVITIES TABELLE ===' as info;
SHOW INDEX FROM activities;

-- 3. Status after cleanup
SELECT '=== STATUS NACH CLEANUP ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

SELECT
    COUNT(*) as total_keys_after
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary';

-- 4. Summary
SELECT '=== ZUSAMMENFASSUNG ===' as info;
SELECT
    'Minimales Cleanup abgeschlossen. Überprüfen Sie die Anzahl der Keys oben.' as message;
backend/cleanupKeysNode.cjs (new file, 143 lines)
@@ -0,0 +1,143 @@
const mysql = require('mysql2/promise');
require('dotenv').config();

// Database connection
const dbConfig = {
  host: process.env.DB_HOST || 'localhost',
  user: process.env.DB_USER || 'root',
  password: process.env.DB_PASSWORD || '',
  database: process.env.DB_NAME || 'trainingsdiary'
};

async function cleanupKeys() {
  let connection;

  try {
    console.log('🔌 Verbinde mit der Datenbank...');
    connection = await mysql.createConnection(dbConfig);

    // 1. Status before the cleanup
    console.log('\n📊 STATUS VOR DEM CLEANUP:');
    const [tablesBefore] = await connection.execute(`
      SELECT
        TABLE_NAME,
        COUNT(*) as key_count
      FROM INFORMATION_SCHEMA.STATISTICS
      WHERE TABLE_SCHEMA = ?
      GROUP BY TABLE_NAME
      ORDER BY key_count DESC
    `, [dbConfig.database]);

    tablesBefore.forEach(table => {
      console.log(`  ${table.TABLE_NAME}: ${table.key_count} Keys`);
    });

    // 2. Show all indexes of the problem tables
    const problemTables = ['member', 'diary_tags', 'season'];

    for (const tableName of problemTables) {
      console.log(`\n🔍 INDEX für Tabelle '${tableName}':`);

      try {
        const [indexes] = await connection.execute(`SHOW INDEX FROM \`${tableName}\``);

        if (indexes.length === 0) {
          console.log(`  Keine INDEX gefunden für Tabelle '${tableName}'`);
          continue;
        }

        indexes.forEach(index => {
          console.log(`  - ${index.Key_name} (${index.Column_name}) - ${index.Non_unique === 0 ? 'UNIQUE' : 'NON-UNIQUE'}`);
        });

        // 3. Remove redundant indexes (everything except PRIMARY and UNIQUE)
        console.log(`\n🗑️ Entferne überflüssige INDEX aus '${tableName}':`);

        for (const index of indexes) {
          // Keep the PRIMARY KEY and UNIQUE constraints
          if (index.Key_name === 'PRIMARY' || index.Non_unique === 0) {
            console.log(`  ✅ Behalte: ${index.Key_name} (${index.Column_name})`);
            continue;
          }

          // Remove all other indexes
          try {
            await connection.execute(`DROP INDEX \`${index.Key_name}\` ON \`${tableName}\``);
            console.log(`  ❌ Entfernt: ${index.Key_name} (${index.Column_name})`);
          } catch (error) {
            if (error.code === 'ER_CANT_DROP_FIELD_OR_KEY') {
              console.log(`  ⚠️ Kann nicht entfernen: ${index.Key_name} (${index.Column_name}) - ${error.message}`);
            } else {
              console.log(`  ❌ Fehler beim Entfernen von ${index.Key_name}: ${error.message}`);
            }
          }
        }

      } catch (error) {
        console.log(`  ⚠️ Fehler beim Zugriff auf Tabelle '${tableName}': ${error.message}`);
      }
    }

    // 4. Status after the cleanup
    console.log('\n📊 STATUS NACH DEM CLEANUP:');
    const [tablesAfter] = await connection.execute(`
      SELECT
        TABLE_NAME,
        COUNT(*) as key_count
      FROM INFORMATION_SCHEMA.STATISTICS
      WHERE TABLE_SCHEMA = ?
      GROUP BY TABLE_NAME
      ORDER BY key_count DESC
    `, [dbConfig.database]);

    tablesAfter.forEach(table => {
      const before = tablesBefore.find(t => t.TABLE_NAME === table.TABLE_NAME);
      const beforeCount = before ? before.key_count : 0;
      const diff = beforeCount - table.key_count;
      const status = table.key_count <= 5 ? '✅' : table.key_count <= 10 ? '⚠️' : '❌';

      console.log(`  ${status} ${table.TABLE_NAME}: ${table.key_count} Keys (${diff > 0 ? `-${diff}` : `+${Math.abs(diff)}`})`);
    });

    // 5. Total number of keys
    const [totalKeys] = await connection.execute(`
      SELECT COUNT(*) as total_keys
      FROM INFORMATION_SCHEMA.STATISTICS
      WHERE TABLE_SCHEMA = ?
    `, [dbConfig.database]);

    console.log(`\n📈 GESAMTANZAHL KEYS: ${totalKeys[0].total_keys}`);

    // 6. Summary
    console.log('\n🎯 ZUSAMMENFASSUNG:');
    const problemTablesAfter = tablesAfter.filter(t => t.key_count > 10);

    if (problemTablesAfter.length === 0) {
      console.log('  ✅ Alle Tabellen haben jetzt weniger als 10 Keys!');
    } else {
      console.log('  ⚠️ Folgende Tabellen haben immer noch zu viele Keys:');
      problemTablesAfter.forEach(table => {
        console.log(`    - ${table.TABLE_NAME}: ${table.key_count} Keys`);
      });
    }

  } catch (error) {
    console.error('❌ Fehler beim Cleanup:', error);
  } finally {
    if (connection) {
      await connection.end();
      console.log('\n🔌 Datenbankverbindung geschlossen.');
    }
  }
}

// Run the script
console.log('🚀 Starte intelligentes INDEX-Cleanup...\n');
cleanupKeys().then(() => {
  console.log('\n✨ Cleanup abgeschlossen!');
  process.exit(0);
}).catch(error => {
  console.error('\n💥 Fehler beim Cleanup:', error);
  process.exit(1);
});
backend/cleanupKeysNode.js (new file, 166 lines)
@@ -0,0 +1,166 @@
import mysql from 'mysql2/promise';
import dotenv from 'dotenv';
import path from 'path';
import { fileURLToPath } from 'url';

// __dirname for ES modules
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);

// Load environment variables from the root directory
//const envPath = path.join(__dirname, '..', '.env');
//console.log('🔍 Lade .env-Datei von:', envPath);
dotenv.config();

// Debug: show the loaded environment variables
console.log('🔍 Geladene Umgebungsvariablen:');
console.log('  DB_HOST:', process.env.DB_HOST);
console.log('  DB_USER:', process.env.DB_USER);
console.log('  DB_NAME:', process.env.DB_NAME);
console.log('  DB_PASSWORD:', process.env.DB_PASSWORD ? '***gesetzt***' : 'nicht gesetzt');

// Database connection
const dbConfig = {
  host: process.env.DB_HOST || 'localhost',
  user: process.env.DB_USER || 'root',
  password: process.env.DB_PASSWORD || '',
  database: process.env.DB_NAME || 'trainingsdiary'
};

console.log('🔍 Datenbankverbindung:');
console.log('  Host:', dbConfig.host);
console.log('  User:', dbConfig.user);
console.log('  Database:', dbConfig.database);
console.log('  Password:', dbConfig.password ? '***gesetzt***' : 'nicht gesetzt');

async function cleanupKeys() {
  let connection;

  try {
    console.log('🔌 Verbinde mit der Datenbank...');
    connection = await mysql.createConnection(dbConfig);

    // 1. Status before the cleanup
    console.log('\n📊 STATUS VOR DEM CLEANUP:');
    const [tablesBefore] = await connection.execute(`
      SELECT
        TABLE_NAME,
        COUNT(*) as key_count
      FROM INFORMATION_SCHEMA.STATISTICS
      WHERE TABLE_SCHEMA = ?
      GROUP BY TABLE_NAME
      ORDER BY key_count DESC
    `, [dbConfig.database]);

    tablesBefore.forEach(table => {
      console.log(`  ${table.TABLE_NAME}: ${table.key_count} Keys`);
    });

    // 2. Show all indexes of the problem tables
    const problemTables = ['member', 'diary_tags', 'season'];

    for (const tableName of problemTables) {
      console.log(`\n🔍 INDEX für Tabelle '${tableName}':`);

      try {
        const [indexes] = await connection.execute(`SHOW INDEX FROM \`${tableName}\``);

        if (indexes.length === 0) {
          console.log(`  Keine INDEX gefunden für Tabelle '${tableName}'`);
          continue;
        }

        indexes.forEach(index => {
          console.log(`  - ${index.Key_name} (${index.Column_name}) - ${index.Non_unique === 0 ? 'UNIQUE' : 'NON-UNIQUE'}`);
        });

        // 3. Remove redundant indexes (everything except PRIMARY and UNIQUE)
        console.log(`\n🗑️ Entferne überflüssige INDEX aus '${tableName}':`);

        for (const index of indexes) {
          // Keep the PRIMARY KEY and UNIQUE constraints
          if (index.Key_name === 'PRIMARY' || index.Non_unique === 0) {
            console.log(`  ✅ Behalte: ${index.Key_name} (${index.Column_name})`);
            continue;
          }

          // Remove all other indexes
          try {
            await connection.execute(`DROP INDEX \`${index.Key_name}\` ON \`${tableName}\``);
            console.log(`  ❌ Entfernt: ${index.Key_name} (${index.Column_name})`);
          } catch (error) {
            if (error.code === 'ER_CANT_DROP_FIELD_OR_KEY') {
              console.log(`  ⚠️ Kann nicht entfernen: ${index.Key_name} (${index.Column_name}) - ${error.message}`);
            } else {
              console.log(`  ❌ Fehler beim Entfernen von ${index.Key_name}: ${error.message}`);
            }
          }
        }

      } catch (error) {
        console.log(`  ⚠️ Fehler beim Zugriff auf Tabelle '${tableName}': ${error.message}`);
      }
    }

    // 4. Status after the cleanup
    console.log('\n📊 STATUS NACH DEM CLEANUP:');
    const [tablesAfter] = await connection.execute(`
      SELECT
        TABLE_NAME,
        COUNT(*) as key_count
      FROM INFORMATION_SCHEMA.STATISTICS
      WHERE TABLE_SCHEMA = ?
      GROUP BY TABLE_NAME
      ORDER BY key_count DESC
    `, [dbConfig.database]);

    tablesAfter.forEach(table => {
      const before = tablesBefore.find(t => t.TABLE_NAME === table.TABLE_NAME);
      const beforeCount = before ? before.key_count : 0;
      const diff = beforeCount - table.key_count;
      const status = table.key_count <= 5 ? '✅' : table.key_count <= 10 ? '⚠️' : '❌';

      console.log(`  ${status} ${table.TABLE_NAME}: ${table.key_count} Keys (${diff > 0 ? `-${diff}` : `+${Math.abs(diff)}`})`);
    });

    // 5. Total number of keys
    const [totalKeys] = await connection.execute(`
      SELECT COUNT(*) as total_keys
      FROM INFORMATION_SCHEMA.STATISTICS
      WHERE TABLE_SCHEMA = ?
    `, [dbConfig.database]);

    console.log(`\n📈 GESAMTANZAHL KEYS: ${totalKeys[0].total_keys}`);

    // 6. Summary
    console.log('\n🎯 ZUSAMMENFASSUNG:');
    const problemTablesAfter = tablesAfter.filter(t => t.key_count > 10);

    if (problemTablesAfter.length === 0) {
      console.log('  ✅ Alle Tabellen haben jetzt weniger als 10 Keys!');
    } else {
      console.log('  ⚠️ Folgende Tabellen haben immer noch zu viele Keys:');
      problemTablesAfter.forEach(table => {
        console.log(`    - ${table.TABLE_NAME}: ${table.key_count} Keys`);
      });
    }

  } catch (error) {
    console.error('❌ Fehler beim Cleanup:', error);
  } finally {
    if (connection) {
      await connection.end();
      console.log('\n🔌 Datenbankverbindung geschlossen.');
    }
  }
}

// Run the script
console.log('🚀 Starte intelligentes INDEX-Cleanup...\n');
cleanupKeys().then(() => {
  console.log('\n✨ Cleanup abgeschlossen!');
  process.exit(0);
}).catch(error => {
  console.error('\n💥 Fehler beim Cleanup:', error);
  process.exit(1);
});
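After running either Node script, one hedged way to confirm the key limit is no longer hit (beyond the README's `npm run dev`) is a small sync smoke test. This is only a sketch; the import path of the app's Sequelize instance is an assumption, not taken from this diff.

```js
// Hypothetical smoke test: if the cleanup worked, a fresh sync should no
// longer fail with ER_TOO_MANY_KEYS. './config/database.js' is an assumed
// location for the app's Sequelize instance.
import { sequelize } from './config/database.js';

try {
  await sequelize.authenticate();
  await sequelize.sync(); // without { alter: true }, existing keys are left alone
  console.log('sync OK - key limit not exceeded');
} catch (err) {
  console.error('sync failed:', err.message);
} finally {
  await sequelize.close();
}
```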
backend/cleanupKeysReal.sql (new file, 149 lines)
@@ -0,0 +1,149 @@
-- Cleanup script with real index names
-- First shows all existing indexes, then removes only the redundant ones

USE trainingsdiary;

-- 1. Show all existing indexes
SELECT '=== VORHANDENE INDEX VOR CLEANUP ===' as info;
SELECT
    TABLE_NAME,
    INDEX_NAME,
    COLUMN_NAME,
    NON_UNIQUE,
    SEQ_IN_INDEX
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
ORDER BY TABLE_NAME, INDEX_NAME, SEQ_IN_INDEX;

-- 2. Number of keys per table before cleanup
SELECT '=== KEYS PRO TABELLE VOR CLEANUP ===' as info;
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- 3. Total number of keys before cleanup
SELECT '=== GESAMTANZAHL KEYS VOR CLEANUP ===' as info;
SELECT
    COUNT(*) as total_keys_before
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary';

-- 4. Cleanup: remove only indexes that actually exist
-- Use DROP INDEX IF EXISTS for all possible indexes

-- Member table
SELECT '=== ENTFERNE MEMBER INDEX ===' as info;
DROP INDEX IF EXISTS idx_member_hashed_id ON member;
DROP INDEX IF EXISTS idx_member_first_name ON member;
DROP INDEX IF EXISTS idx_member_last_name ON member;
DROP INDEX IF EXISTS idx_member_birth_date ON member;
DROP INDEX IF EXISTS idx_member_active ON member;
DROP INDEX IF EXISTS idx_member_created_at ON member;
DROP INDEX IF EXISTS idx_member_updated_at ON member;

-- User table
SELECT '=== ENTFERNE USER INDEX ===' as info;
DROP INDEX IF EXISTS idx_user_email ON user;
DROP INDEX IF EXISTS idx_user_created_at ON user;
DROP INDEX IF EXISTS idx_user_updated_at ON user;

-- Clubs table
SELECT '=== ENTFERNE CLUBS INDEX ===' as info;
DROP INDEX IF EXISTS idx_clubs_name ON clubs;
DROP INDEX IF EXISTS idx_clubs_created_at ON clubs;
DROP INDEX IF EXISTS idx_clubs_updated_at ON clubs;

-- User_Club table
SELECT '=== ENTFERNE USER_CLUB INDEX ===' as info;
DROP INDEX IF EXISTS idx_user_club_approved ON user_club;
DROP INDEX IF EXISTS idx_user_club_created_at ON user_club;
DROP INDEX IF EXISTS idx_user_club_updated_at ON user_club;

-- Log table
SELECT '=== ENTFERNE LOG INDEX ===' as info;
DROP INDEX IF EXISTS idx_log_activity ON log;
DROP INDEX IF EXISTS idx_log_created_at ON log;
DROP INDEX IF EXISTS idx_log_updated_at ON log;

-- Diary_Dates table
SELECT '=== ENTFERNE DIARY_DATES INDEX ===' as info;
DROP INDEX IF EXISTS idx_diary_dates_date ON diary_dates;
DROP INDEX IF EXISTS idx_diary_dates_created_at ON diary_dates;
DROP INDEX IF EXISTS idx_diary_dates_updated_at ON diary_dates;

-- Participants table
SELECT '=== ENTFERNE PARTICIPANTS INDEX ===' as info;
DROP INDEX IF EXISTS idx_participant_created_at ON participants;
DROP INDEX IF EXISTS idx_participant_updated_at ON participants;

-- Activities table
SELECT '=== ENTFERNE ACTIVITIES INDEX ===' as info;
DROP INDEX IF EXISTS idx_activity_created_at ON activities;
DROP INDEX IF EXISTS idx_activity_updated_at ON activities;

-- Member_Notes table
SELECT '=== ENTFERNE MEMBER_NOTES INDEX ===' as info;
DROP INDEX IF EXISTS idx_member_note_created_at ON member_notes;
DROP INDEX IF EXISTS idx_member_note_updated_at ON member_notes;

-- Diary_Notes table
SELECT '=== ENTFERNE DIARY_NOTES INDEX ===' as info;
DROP INDEX IF EXISTS idx_diary_note_created_at ON diary_notes;
DROP INDEX IF EXISTS idx_diary_note_updated_at ON diary_notes;

-- Diary_Tags table
SELECT '=== ENTFERNE DIARY_TAGS INDEX ===' as info;
DROP INDEX IF EXISTS idx_diary_tag_created_at ON diary_tags;
DROP INDEX IF EXISTS idx_diary_tag_updated_at ON diary_tags;

-- Member_Diary_Tags table
SELECT '=== ENTFERNE MEMBER_DIARY_TAGS INDEX ===' as info;
DROP INDEX IF EXISTS idx_member_diary_tag_created_at ON member_diary_tags;
DROP INDEX IF EXISTS idx_member_diary_tag_updated_at ON member_diary_tags;

-- Diary_Date_Tags table
SELECT '=== ENTFERNE DIARY_DATE_TAGS INDEX ===' as info;
DROP INDEX IF EXISTS idx_diary_date_tag_created_at ON diary_date_tags;
DROP INDEX IF EXISTS idx_diary_date_tag_updated_at ON diary_date_tags;

-- Diary_Member_Notes table
SELECT '=== ENTFERNE DIARY_MEMBER_NOTES INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_diary_member_note_created_at ON diary_member_notes;
|
||||
DROP INDEX IF EXISTS idx_diary_member_note_updated_at ON diary_member_notes;
|
||||
|
||||
-- Predefined_Activities-Tabelle
|
||||
SELECT '=== ENTFERNE PREDEFINED_ACTIVITIES INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_predefined_activity_created_at ON predefined_activities;
|
||||
DROP INDEX IF EXISTS idx_predefined_activity_updated_at ON predefined_activities;
|
||||
|
||||
-- Diary_Date_Activities-Tabelle
|
||||
SELECT '=== ENTFERNE DIARY_DATE_ACTIVITIES INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_diary_date_activity_created_at ON diary_date_activities;
|
||||
DROP INDEX IF EXISTS idx_diary_date_activity_updated_at ON diary_date_activities;
|
||||
|
||||
-- 5. Nach der Bereinigung: Status anzeigen
|
||||
SELECT '=== STATUS NACH BEREINIGUNG ===' as info;
|
||||
|
||||
-- Anzahl der Keys pro Tabelle nach der Bereinigung
|
||||
SELECT
|
||||
TABLE_NAME,
|
||||
COUNT(*) as key_count
|
||||
FROM INFORMATION_SCHEMA.STATISTICS
|
||||
WHERE TABLE_SCHEMA = 'trainingsdiary'
|
||||
GROUP BY TABLE_NAME
|
||||
ORDER BY key_count DESC;
|
||||
|
||||
-- Gesamtanzahl der Keys nach der Bereinigung
|
||||
SELECT
|
||||
COUNT(*) as total_keys_after
|
||||
FROM INFORMATION_SCHEMA.STATISTICS
|
||||
WHERE TABLE_SCHEMA = 'trainingsdiary';
|
||||
|
||||
-- 6. Zusammenfassung der Änderungen
|
||||
SELECT '=== ZUSAMMENFASSUNG ===' as info;
|
||||
SELECT
|
||||
'Cleanup abgeschlossen. Überprüfen Sie die Anzahl der Keys oben.' as message;
|
||||
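One caveat for the script above: `DROP INDEX IF EXISTS ... ON table` is MariaDB syntax; stock MySQL (including 8.x) rejects the `IF EXISTS` clause on `DROP INDEX`, so on a plain MySQL server each drop would have to be guarded manually. A hypothetical helper for that case is sketched below; the function name and the `mysql2` usage are assumptions, not part of the repository.

```js
// Guarded drop for servers without DROP INDEX IF EXISTS (plain MySQL).
// Hypothetical helper, assuming a mysql2 promise connection.
async function dropIndexIfExists(connection, schema, table, indexName) {
  const [rows] = await connection.execute(`
    SELECT 1
    FROM INFORMATION_SCHEMA.STATISTICS
    WHERE TABLE_SCHEMA = ? AND TABLE_NAME = ? AND INDEX_NAME = ?
    LIMIT 1
  `, [schema, table, indexName]);

  if (rows.length > 0) {
    await connection.query(`DROP INDEX \`${indexName}\` ON \`${table}\``);
  }
}

// Example: await dropIndexIfExists(connection, 'trainingsdiary', 'member', 'idx_member_active');
```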
41
backend/cleanupKeysSimple.sql
Normal file
@@ -0,0 +1,41 @@
-- Vereinfachtes Cleanup-Script für MySQL Keys
-- Entfernt nur die problematischsten Keys

USE trainingsdiary;

-- 1. Alle überflüssigen INDEX entfernen (die meisten werden von Sequelize automatisch erstellt)
-- Diese entfernen die meisten Keys, die das Limit überschreiten

-- Member-Tabelle (Hauptproblem)
DROP INDEX IF EXISTS idx_member_hashed_id ON member;
DROP INDEX IF EXISTS idx_member_first_name ON member;
DROP INDEX IF EXISTS idx_member_last_name ON member;
DROP INDEX IF EXISTS idx_member_birth_date ON member;
DROP INDEX IF EXISTS idx_member_active ON member;
DROP INDEX IF EXISTS idx_member_created_at ON member;
DROP INDEX IF EXISTS idx_member_updated_at ON member;

-- User-Tabelle
DROP INDEX IF EXISTS idx_user_email ON user;
DROP INDEX IF EXISTS idx_user_created_at ON user;
DROP INDEX IF EXISTS idx_user_updated_at ON user;

-- Clubs-Tabelle
DROP INDEX IF EXISTS idx_clubs_name ON clubs;
DROP INDEX IF EXISTS idx_clubs_created_at ON clubs;
DROP INDEX IF EXISTS idx_clubs_updated_at ON clubs;

-- 2. Status anzeigen
SELECT
    TABLE_NAME,
    COUNT(*) as key_count
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary'
GROUP BY TABLE_NAME
ORDER BY key_count DESC;

-- 3. Gesamtanzahl der Keys anzeigen
SELECT
    COUNT(*) as total_keys
FROM INFORMATION_SCHEMA.STATISTICS
WHERE TABLE_SCHEMA = 'trainingsdiary';
|
||||
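Dropping indexes only helps for as long as nothing recreates them: if the Sequelize models keep declaring per-column indexes, or the app repeatedly syncs with `alter`, the key count can climb back up. One preventive option is to declare only the indexes that are really needed on the model itself. The sketch below is purely illustrative; the model, field names, and import path are assumptions and do not claim to match the repository's actual `Member` model.

```js
import { DataTypes } from 'sequelize';
import sequelize from '../db.js'; // assumed path to the Sequelize instance

// Illustrative model definition: indexes are declared explicitly and sparingly,
// instead of marking many individual fields as unique or indexed.
const Member = sequelize.define('member', {
  hashedId: { type: DataTypes.STRING, allowNull: false },
  firstName: { type: DataTypes.STRING },
  lastName: { type: DataTypes.STRING },
  active: { type: DataTypes.BOOLEAN, defaultValue: true }
}, {
  tableName: 'member',
  indexes: [
    // one composite index instead of one index per column
    { fields: ['lastName', 'firstName'] },
    { unique: true, fields: ['hashedId'] }
  ]
});

export default Member;
```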
153
backend/cleanupKeysSmart.sql
Normal file
@@ -0,0 +1,153 @@
|
||||
-- Intelligentes Cleanup-Script für MySQL Keys
|
||||
-- Überprüft zuerst die echten Tabellennamen und entfernt nur vorhandene INDEX
|
||||
|
||||
USE trainingsdiary;
|
||||
|
||||
-- 1. Alle vorhandenen Tabellen anzeigen
|
||||
SELECT '=== VORHANDENE TABELLEN ===' as info;
|
||||
SHOW TABLES;
|
||||
|
||||
-- 2. Alle vorhandenen INDEX anzeigen
|
||||
SELECT '=== VORHANDENE INDEX ===' as info;
|
||||
SELECT
|
||||
TABLE_NAME,
|
||||
INDEX_NAME,
|
||||
COLUMN_NAME,
|
||||
NON_UNIQUE,
|
||||
SEQ_IN_INDEX
|
||||
FROM INFORMATION_SCHEMA.STATISTICS
|
||||
WHERE TABLE_SCHEMA = 'trainingsdiary'
|
||||
ORDER BY TABLE_NAME, INDEX_NAME, SEQ_IN_INDEX;
|
||||
|
||||
-- 3. Anzahl der Keys pro Tabelle anzeigen
|
||||
SELECT '=== KEYS PRO TABELLE ===' as info;
|
||||
SELECT
|
||||
TABLE_NAME,
|
||||
COUNT(*) as key_count
|
||||
FROM INFORMATION_SCHEMA.STATISTICS
|
||||
WHERE TABLE_SCHEMA = 'trainingsdiary'
|
||||
GROUP BY TABLE_NAME
|
||||
ORDER BY key_count DESC;
|
||||
|
||||
-- 4. Gesamtanzahl der Keys anzeigen
|
||||
SELECT '=== GESAMTANZAHL KEYS ===' as info;
|
||||
SELECT
|
||||
COUNT(*) as total_keys
|
||||
FROM INFORMATION_SCHEMA.STATISTICS
|
||||
WHERE TABLE_SCHEMA = 'trainingsdiary';
|
||||
|
||||
-- 5. Intelligente INDEX-Entfernung basierend auf vorhandenen Tabellen
|
||||
-- Nur INDEX entfernen, die tatsächlich existieren
|
||||
|
||||
-- Member-Tabelle (Hauptproblem)
|
||||
SELECT '=== ENTFERNE MEMBER INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_member_hashed_id ON member;
|
||||
DROP INDEX IF EXISTS idx_member_first_name ON member;
|
||||
DROP INDEX IF EXISTS idx_member_last_name ON member;
|
||||
DROP INDEX IF EXISTS idx_member_birth_date ON member;
|
||||
DROP INDEX IF EXISTS idx_member_active ON member;
|
||||
DROP INDEX IF EXISTS idx_member_created_at ON member;
|
||||
DROP INDEX IF EXISTS idx_member_updated_at ON member;
|
||||
|
||||
-- User-Tabelle
|
||||
SELECT '=== ENTFERNE USER INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_user_email ON user;
|
||||
DROP INDEX IF EXISTS idx_user_created_at ON user;
|
||||
DROP INDEX IF EXISTS idx_user_updated_at ON user;
|
||||
|
||||
-- Clubs-Tabelle
|
||||
SELECT '=== ENTFERNE CLUBS INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_clubs_name ON clubs;
|
||||
DROP INDEX IF EXISTS idx_clubs_created_at ON clubs;
|
||||
DROP INDEX IF EXISTS idx_clubs_updated_at ON clubs;
|
||||
|
||||
-- User_Club-Tabelle
|
||||
SELECT '=== ENTFERNE USER_CLUB INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_user_club_approved ON user_club;
|
||||
DROP INDEX IF EXISTS idx_user_club_created_at ON user_club;
|
||||
DROP INDEX IF EXISTS idx_user_club_updated_at ON user_club;
|
||||
|
||||
-- Log-Tabelle
|
||||
SELECT '=== ENTFERNE LOG INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_log_activity ON log;
|
||||
DROP INDEX IF EXISTS idx_log_created_at ON log;
|
||||
DROP INDEX IF EXISTS idx_log_updated_at ON log;
|
||||
|
||||
-- Diary_Dates-Tabelle
|
||||
SELECT '=== ENTFERNE DIARY_DATES INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_diary_dates_date ON diary_dates;
|
||||
DROP INDEX IF EXISTS idx_diary_dates_created_at ON diary_dates;
|
||||
DROP INDEX IF EXISTS idx_diary_dates_updated_at ON diary_dates;
|
||||
|
||||
-- Participants-Tabelle
|
||||
SELECT '=== ENTFERNE PARTICIPANTS INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_participant_created_at ON participants;
|
||||
DROP INDEX IF EXISTS idx_participant_updated_at ON participants;
|
||||
|
||||
-- Activities-Tabelle
|
||||
SELECT '=== ENTFERNE ACTIVITIES INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_activity_created_at ON activities;
|
||||
DROP INDEX IF EXISTS idx_activity_updated_at ON activities;
|
||||
|
||||
-- Member_Notes-Tabelle
|
||||
SELECT '=== ENTFERNE MEMBER_NOTES INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_member_note_created_at ON member_notes;
|
||||
DROP INDEX IF EXISTS idx_member_note_updated_at ON member_notes;
|
||||
|
||||
-- Diary_Notes-Tabelle
|
||||
SELECT '=== ENTFERNE DIARY_NOTES INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_diary_note_created_at ON diary_notes;
|
||||
DROP INDEX IF EXISTS idx_diary_note_updated_at ON diary_notes;
|
||||
|
||||
-- Diary_Tags-Tabelle
|
||||
SELECT '=== ENTFERNE DIARY_TAGS INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_diary_tag_created_at ON diary_tags;
|
||||
DROP INDEX IF EXISTS idx_diary_tag_updated_at ON diary_tags;
|
||||
|
||||
-- Member_Diary_Tags-Tabelle
|
||||
SELECT '=== ENTFERNE MEMBER_DIARY_TAGS INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_member_diary_tag_created_at ON member_diary_tags;
|
||||
DROP INDEX IF EXISTS idx_member_diary_tag_updated_at ON member_diary_tags;
|
||||
|
||||
-- Diary_Date_Tags-Tabelle
|
||||
SELECT '=== ENTFERNE DIARY_DATE_TAGS INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_diary_date_tag_created_at ON diary_date_tags;
|
||||
DROP INDEX IF EXISTS idx_diary_date_tag_updated_at ON diary_date_tags;
|
||||
|
||||
-- Diary_Member_Notes-Tabelle
|
||||
SELECT '=== ENTFERNE DIARY_MEMBER_NOTES INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_diary_member_note_created_at ON diary_member_notes;
|
||||
DROP INDEX IF EXISTS idx_diary_member_note_updated_at ON diary_member_notes;
|
||||
|
||||
-- Predefined_Activities-Tabelle
|
||||
SELECT '=== ENTFERNE PREDEFINED_ACTIVITIES INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_predefined_activity_created_at ON predefined_activities;
|
||||
DROP INDEX IF EXISTS idx_predefined_activity_updated_at ON predefined_activities;
|
||||
|
||||
-- Diary_Date_Activities-Tabelle
|
||||
SELECT '=== ENTFERNE DIARY_DATE_ACTIVITIES INDEX ===' as info;
|
||||
DROP INDEX IF EXISTS idx_diary_date_activity_created_at ON diary_date_activities;
|
||||
DROP INDEX IF EXISTS idx_diary_date_activity_updated_at ON diary_date_activities;
|
||||
|
||||
-- 6. Nach der Bereinigung: Status anzeigen
|
||||
SELECT '=== STATUS NACH BEREINIGUNG ===' as info;
|
||||
|
||||
-- Anzahl der Keys pro Tabelle nach der Bereinigung
|
||||
SELECT
|
||||
TABLE_NAME,
|
||||
COUNT(*) as key_count
|
||||
FROM INFORMATION_SCHEMA.STATISTICS
|
||||
WHERE TABLE_SCHEMA = 'trainingsdiary'
|
||||
GROUP BY TABLE_NAME
|
||||
ORDER BY key_count DESC;
|
||||
|
||||
-- Gesamtanzahl der Keys nach der Bereinigung
|
||||
SELECT
|
||||
COUNT(*) as total_keys_after_cleanup
|
||||
FROM INFORMATION_SCHEMA.STATISTICS
|
||||
WHERE TABLE_SCHEMA = 'trainingsdiary';
|
||||
|
||||
-- 7. Zusammenfassung
|
||||
SELECT '=== ZUSAMMENFASSUNG ===' as info;
|
||||
SELECT
|
||||
'Cleanup abgeschlossen. Überprüfen Sie die Anzahl der Keys oben.' as message;
|
||||
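A note on the counts reported by all of these scripts: `INFORMATION_SCHEMA.STATISTICS` contains one row per indexed column, so `COUNT(*)` over-counts composite indexes. When checking against InnoDB's limit of 64 secondary indexes per table, counting distinct index names is the more accurate measure. A small helper sketch, assuming the same `mysql2` connection as in the Node script further up:

```js
// More accurate per-table count: distinct index names instead of indexed columns.
async function countIndexesPerTable(connection, schema) {
  const [rows] = await connection.execute(`
    SELECT TABLE_NAME, COUNT(DISTINCT INDEX_NAME) AS index_count
    FROM INFORMATION_SCHEMA.STATISTICS
    WHERE TABLE_SCHEMA = ?
    GROUP BY TABLE_NAME
    ORDER BY index_count DESC
  `, [schema]);
  return rows;
}
```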
603
backend/clients/hettvClient.js
Normal file
@@ -0,0 +1,603 @@
|
||||
import axios from 'axios';
|
||||
import fs from 'fs';
|
||||
import path from 'path';
|
||||
|
||||
const BASE_URL = 'https://ttde-id.liga.nu';
|
||||
const CLICK_TT_BASE = 'https://httv.click-tt.de';
|
||||
|
||||
class HettvClient {
|
||||
constructor() {
|
||||
this.baseURL = BASE_URL;
|
||||
this.client = axios.create({
|
||||
baseURL: this.baseURL,
|
||||
timeout: 15000,
|
||||
headers: {
|
||||
'Content-Type': 'application/x-www-form-urlencoded',
|
||||
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
|
||||
'Accept-Language': 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7'
|
||||
},
|
||||
maxRedirects: 5, // Folge den OAuth2-Redirects
|
||||
validateStatus: (status) => status >= 200 && status < 400
|
||||
});
|
||||
|
||||
// Einfache Cookie-Jar nach Host -> { name: value }
|
||||
this.cookieJar = new Map();
|
||||
this.defaultHeaders = {
|
||||
'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64; rv:141.0) Gecko/20100101 Firefox/141.0',
|
||||
'Accept-Language': 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7'
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Login to HeTTV via OAuth2
|
||||
* @param {string} username - HeTTV username (email)
|
||||
* @param {string} password - HeTTV password
|
||||
* @returns {Promise<Object>} Login response with session data
|
||||
*/
|
||||
async login(username, password) {
|
||||
try {
|
||||
console.log('[HettvClient] - Starting login for:', username);
|
||||
|
||||
// Schritt 1: OAuth2-Authorization-Endpoint aufrufen - das sollte zur Login-Seite weiterleiten
|
||||
const oauthParams = new URLSearchParams({
|
||||
'scope': 'nuLiga',
|
||||
'response_type': 'code',
|
||||
'redirect_uri': 'https://httv.click-tt.de/cgi-bin/WebObjects/nuLigaTTDE.woa/wa/oAuthLogin',
|
||||
'state': 'nonce=' + Math.random().toString(36).substring(2, 15),
|
||||
'client_id': 'XtVpGjXKAhz3BZuu'
|
||||
});
|
||||
|
||||
// OAuth2 Start
|
||||
|
||||
// Der OAuth2-Endpoint sollte direkt zur Login-Seite weiterleiten
|
||||
const loginPageResponse = await this.client.get(`/oauth2/authz/ttde?${oauthParams.toString()}`, {
|
||||
maxRedirects: 5, // Folge den Redirects zur Login-Seite
|
||||
validateStatus: (status) => status >= 200 && status < 400,
|
||||
headers: {
|
||||
...this.defaultHeaders
|
||||
}
|
||||
});
|
||||
|
||||
// Login-Seite erreicht
|
||||
|
||||
// Session-Cookie aus der Login-Seite extrahieren
|
||||
const setCookies = loginPageResponse.headers['set-cookie'];
|
||||
if (!setCookies || !Array.isArray(setCookies)) {
|
||||
console.error('[HettvClient] - No cookies from login page');
|
||||
return {
|
||||
success: false,
|
||||
error: 'Keine Session-Cookie von Login-Seite erhalten'
|
||||
};
|
||||
}
|
||||
|
||||
const sessionCookie = setCookies.find(cookie => cookie.startsWith('nusportingress='));
|
||||
if (!sessionCookie) {
|
||||
console.error('[HettvClient] - No nusportingress cookie from login page');
|
||||
return {
|
||||
success: false,
|
||||
error: 'Keine nusportingress Session von Login-Seite erhalten'
|
||||
};
|
||||
}
|
||||
|
||||
// Extrahiere t:formdata aus dem HTML der Login-Seite
|
||||
const htmlContent = loginPageResponse.data;
|
||||
// HTML erhalten
|
||||
|
||||
// Suche nach t:formdata im HTML - verschiedene mögliche Formate
|
||||
let formDataMatch = htmlContent.match(/name="t:formdata"\s+value="([^"]+)"/);
|
||||
|
||||
if (!formDataMatch) {
|
||||
// Versuche andere Formate
|
||||
formDataMatch = htmlContent.match(/name='t:formdata'\s+value='([^']+)'/);
|
||||
}
|
||||
|
||||
if (!formDataMatch) {
|
||||
// Suche nach hidden input mit t:formdata (value vor name)
|
||||
formDataMatch = htmlContent.match(/<input[^>]*value="([^"]+)"[^>]*name="t:formdata"/);
|
||||
}
|
||||
|
||||
if (!formDataMatch) {
|
||||
// Suche nach hidden input mit t:formdata (name vor value)
|
||||
formDataMatch = htmlContent.match(/<input[^>]*name="t:formdata"[^>]*value="([^"]+)"/);
|
||||
}
|
||||
|
||||
if (!formDataMatch) {
|
||||
// Suche nach t:formdata ohne Anführungszeichen
|
||||
formDataMatch = htmlContent.match(/name=t:formdata\s+value=([^\s>]+)/);
|
||||
}
|
||||
|
||||
if (!formDataMatch) {
|
||||
console.error('[HettvClient] - No t:formdata found in login page');
|
||||
console.log('[HettvClient] - HTML snippet:', htmlContent.substring(0, 2000));
|
||||
|
||||
// Debug: Suche nach allen hidden inputs
|
||||
const hiddenInputs = htmlContent.match(/<input[^>]*type="hidden"[^>]*>/g);
|
||||
console.log('[HettvClient] - Hidden inputs found:', hiddenInputs);
|
||||
|
||||
return {
|
||||
success: false,
|
||||
error: 'Keine t:formdata von Login-Seite erhalten'
|
||||
};
|
||||
}
|
||||
|
||||
const tFormData = formDataMatch[1];
|
||||
// CSRF-Token gefunden
|
||||
|
||||
// Schritt 2: Login mit den korrekten Daten durchführen
|
||||
// Verwende die Session-Cookie für den Login-Request
|
||||
const formData = new URLSearchParams();
|
||||
formData.append('t:submit', '["submit_0","submit_0"]');
|
||||
formData.append('t:ac', 'ttde');
|
||||
formData.append('t:formdata', tFormData);
|
||||
formData.append('username', username);
|
||||
formData.append('password', password);
|
||||
|
||||
const loginResponse = await this.client.post('/oauth2/login.loginform', formData.toString(), {
|
||||
headers: {
|
||||
'Cookie': sessionCookie.split(';')[0],
|
||||
'Content-Type': 'application/x-www-form-urlencoded',
|
||||
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
|
||||
...this.defaultHeaders,
|
||||
'Referer': `${BASE_URL}/oauth2/login.loginform`
|
||||
},
|
||||
maxRedirects: 5,
|
||||
validateStatus: (status) => status >= 200 && status < 400
|
||||
});
|
||||
|
||||
// Login-Antwort erhalten
|
||||
|
||||
// Prüfe ob wir erfolgreich eingeloggt sind
|
||||
// Login-Response geprüft
|
||||
|
||||
// Prüfe den Response-Inhalt um zu sehen ob wir noch auf der Login-Seite sind
|
||||
const responseContent = loginResponse.data;
|
||||
const isLoginPage = responseContent.includes('click-TT ID') &&
|
||||
responseContent.includes('Username') &&
|
||||
responseContent.includes('Password');
|
||||
|
||||
// Login-Page-Erkennung durchgeführt
|
||||
|
||||
if (isLoginPage) {
|
||||
console.log('[HettvClient] - Still on login page, login failed');
|
||||
console.log('[HettvClient] - Response snippet:', responseContent.substring(0, 500));
|
||||
return {
|
||||
success: false,
|
||||
error: 'Login fehlgeschlagen - ungültige Zugangsdaten'
|
||||
};
|
||||
}
|
||||
|
||||
// Prüfe auf OAuth2-Redirect oder Erfolg
|
||||
const hasOAuthRedirect = responseContent.includes('oauth2') ||
|
||||
responseContent.includes('redirect') ||
|
||||
loginResponse.status >= 300;
|
||||
|
||||
// OAuth Redirect erkannt
|
||||
|
||||
// Extrahiere die finale Session-Cookie
|
||||
const finalCookies = loginResponse.headers['set-cookie'];
|
||||
const finalSessionCookie = finalCookies?.find(cookie => cookie.startsWith('nusportingress='));
|
||||
|
||||
const sessionId = (finalSessionCookie || sessionCookie).match(/nusportingress=([^;]+)/)?.[1];
|
||||
|
||||
console.log('[HettvClient] - Login erfolgreich (HeTTV).');
|
||||
|
||||
// Versuche die finale OAuth-Weiterleitung zu httv.click-tt.de aufzurufen, um PHPSESSID zu erhalten
|
||||
let finalUrl = loginResponse.request?.res?.responseUrl;
|
||||
console.log('[HettvClient] - Login finalUrl:', finalUrl);
|
||||
let phpSessIdCookie = null;
|
||||
let finalHtml = null;
|
||||
try {
|
||||
if (finalUrl && finalUrl.includes('oAuthLogin')) {
|
||||
const clickTTClient = axios.create({
|
||||
timeout: 15000,
|
||||
maxRedirects: 0,
|
||||
validateStatus: (status) => status >= 200 && status < 400
|
||||
});
|
||||
|
||||
// Folge der Redirect-Kette manuell, übernehme Cookies
|
||||
let currentUrl = finalUrl;
|
||||
let lastResp = null;
|
||||
let hop = 0;
|
||||
const maxHops = 10;
|
||||
while (hop++ < maxHops && currentUrl) {
|
||||
lastResp = await clickTTClient.get(currentUrl, {
|
||||
headers: {
|
||||
'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
|
||||
...this.defaultHeaders,
|
||||
'Referer': hop === 1 ? `${BASE_URL}/oauth2/login.loginform` : (lastResp?.request?.res?.responseUrl || currentUrl),
|
||||
'Cookie': this._cookieHeaderForUrl(currentUrl)
|
||||
}
|
||||
});
|
||||
|
||||
this._ingestSetCookiesFromResponse(currentUrl, lastResp.headers['set-cookie']);
|
||||
|
||||
const loc = lastResp.headers['location'];
|
||||
if (loc) {
|
||||
// Absolut vs relativ
|
||||
if (/^https?:\/\//i.test(loc)) {
|
||||
currentUrl = loc;
|
||||
} else {
|
||||
const u = new URL(currentUrl);
|
||||
currentUrl = `${u.origin}${loc}`;
|
||||
}
|
||||
continue;
|
||||
}
|
||||
break; // keine weitere Location => final
|
||||
}
|
||||
|
||||
const clickTTResp = lastResp;
|
||||
finalHtml = typeof clickTTResp.data === 'string' ? clickTTResp.data : '';
|
||||
const ctSetCookies = clickTTResp.headers['set-cookie'];
|
||||
if (Array.isArray(ctSetCookies)) {
|
||||
phpSessIdCookie = ctSetCookies.find(c => c.startsWith('PHPSESSID='))?.split(';')[0] || null;
|
||||
}
|
||||
// Finale click-TT URL ermittelt
|
||||
}
|
||||
} catch (e) {
|
||||
// Finale click-TT Seite konnte nicht geladen werden
|
||||
}
|
||||
|
||||
// Baue kombinierte Cookie-Kette (falls PHPSESSID vorhanden)
|
||||
const baseCookie = (finalSessionCookie || sessionCookie).split(';')[0];
|
||||
const combinedCookie = phpSessIdCookie ? `${baseCookie}; ${phpSessIdCookie}` : baseCookie;
|
||||
|
||||
return {
|
||||
success: true,
|
||||
sessionId: sessionId,
|
||||
cookie: combinedCookie,
|
||||
accessToken: null,
|
||||
refreshToken: null,
|
||||
expiresAt: null,
|
||||
user: {
|
||||
finalUrl: finalUrl || null,
|
||||
htmlSnippet: finalHtml ? finalHtml.substring(0, 2000) : null
|
||||
}
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('HeTTV login error:', error.message);
|
||||
console.error('Error details:', error.response?.status, error.response?.statusText);
|
||||
return {
|
||||
success: false,
|
||||
error: error.response?.data?.message || 'Login fehlgeschlagen',
|
||||
status: error.response?.status || 500
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Verify login credentials
|
||||
* @param {string} username - HeTTV username
|
||||
* @param {string} password - HeTTV password
|
||||
* @returns {Promise<boolean>} True if credentials are valid
|
||||
*/
|
||||
async verifyCredentials(username, password) {
|
||||
const result = await this.login(username, password);
|
||||
return result.success;
|
||||
}
|
||||
|
||||
/**
|
||||
* Make an authenticated request to click-TT
|
||||
* @param {string} endpoint - API endpoint
|
||||
* @param {string} cookie - JSESSIONID cookie
|
||||
* @param {Object} options - Additional axios options
|
||||
* @returns {Promise<Object>} API response
|
||||
*/
|
||||
async authenticatedRequest(endpoint, cookie, options = {}, finalUrl = null) {
|
||||
try {
|
||||
// Bestimme Basis-URL dynamisch aus finalUrl, falls vorhanden
|
||||
let baseURL = CLICK_TT_BASE;
|
||||
if (finalUrl) {
|
||||
try {
|
||||
const url = new URL(finalUrl);
|
||||
baseURL = url.origin;
|
||||
} catch (_) {}
|
||||
}
|
||||
|
||||
const isAbsolute = /^https?:\/\//i.test(endpoint);
|
||||
const client = axios.create({
|
||||
baseURL: isAbsolute ? undefined : baseURL,
|
||||
timeout: 15000,
|
||||
maxRedirects: 0,
|
||||
validateStatus: (status) => status >= 200 && status < 400
|
||||
});
|
||||
|
||||
// Manuelles Redirect-Following inkl. Cookies/Referer
|
||||
let currentUrl = isAbsolute ? endpoint : `${baseURL}${endpoint.startsWith('/') ? '' : '/'}${endpoint}`;
|
||||
let lastResp = null;
|
||||
const trace = [];
|
||||
let hop = 0;
|
||||
const maxHops = 10;
|
||||
|
||||
console.log(`[HettvClient] - Starting redirect chain from: ${currentUrl}`);
|
||||
|
||||
while (hop++ < maxHops && currentUrl) {
|
||||
console.log(`[HettvClient] - Redirect ${hop}: GET ${currentUrl}`);
|
||||
|
||||
lastResp = await client.request({
|
||||
method: options.method || 'GET',
|
||||
url: currentUrl,
|
||||
data: options.data,
|
||||
headers: {
|
||||
...this.defaultHeaders,
|
||||
...(options.headers || {}),
|
||||
'Cookie': this._mergeCookieHeader(cookie, this._cookieHeaderForUrl(currentUrl)),
|
||||
'Referer': hop === 1 ? (finalUrl || baseURL) : (lastResp?.request?.res?.responseUrl || currentUrl)
|
||||
}
|
||||
});
|
||||
|
||||
this._ingestSetCookiesFromResponse(currentUrl, lastResp.headers['set-cookie']);
|
||||
const loc = lastResp.headers['location'];
|
||||
|
||||
console.log(`[HettvClient] - Response: ${lastResp.status} ${lastResp.statusText}`);
|
||||
console.log(`[HettvClient] - Location header: ${loc || 'none'}`);
|
||||
console.log(`[HettvClient] - Set-Cookie header: ${lastResp.headers['set-cookie'] ? 'present' : 'none'}`);
|
||||
console.log(`[HettvClient] - Content-Type: ${lastResp.headers['content-type'] || 'none'}`);
|
||||
|
||||
// Speichere jede Seite zur Analyse
|
||||
try {
|
||||
const dir = path.resolve(process.cwd(), 'backend', 'uploads');
|
||||
if (!fs.existsSync(dir)) {
|
||||
fs.mkdirSync(dir, { recursive: true });
|
||||
}
|
||||
const filename = `hettv_redirect_${hop}_${Date.now()}.html`;
|
||||
const filePath = path.join(dir, filename);
|
||||
const content = typeof lastResp.data === 'string' ? lastResp.data : JSON.stringify(lastResp.data, null, 2);
|
||||
fs.writeFileSync(filePath, content, 'utf8');
|
||||
console.log(`[HettvClient] - Saved page to: ${filename}`);
|
||||
} catch (e) {
|
||||
console.log(`[HettvClient] - Could not save page ${hop}:`, e.message);
|
||||
}
|
||||
|
||||
trace.push({
|
||||
url: currentUrl,
|
||||
status: lastResp.status,
|
||||
location: loc || null
|
||||
});
|
||||
|
||||
if (loc) {
|
||||
const newUrl = /^https?:\/\//i.test(loc) ? loc : `${new URL(currentUrl).origin}${loc}`;
|
||||
console.log(`[HettvClient] - Following redirect to: ${newUrl}`);
|
||||
currentUrl = newUrl;
|
||||
continue;
|
||||
}
|
||||
|
||||
console.log(`[HettvClient] - Final response: ${lastResp.status} (no more redirects)`);
|
||||
break;
|
||||
}
|
||||
|
||||
return {
|
||||
success: true,
|
||||
data: lastResp?.data,
|
||||
trace
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('HeTTV API error:', error.message);
|
||||
return {
|
||||
success: false,
|
||||
error: error.response?.data?.message || 'API-Anfrage fehlgeschlagen',
|
||||
status: error.response?.status || 500
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Navigate to main HeTTV page and find Downloads menu
|
||||
* @param {string} cookie - Session cookie
|
||||
* @returns {Promise<Object>} Response with main page content and download links
|
||||
*/
|
||||
async getMainPageWithDownloads(cookie, finalUrl = null) {
|
||||
try {
|
||||
console.log('[HettvClient] - Loading main HeTTV page...');
|
||||
|
||||
// Kandidaten für Einstiegs-URL bestimmen
|
||||
let origin = CLICK_TT_BASE;
|
||||
if (finalUrl) {
|
||||
try { origin = new URL(finalUrl).origin; } catch (_) {}
|
||||
}
|
||||
|
||||
const candidates = [];
|
||||
// Direkt zu HeTTV navigieren
|
||||
candidates.push('http://httv.click-tt.de/');
|
||||
candidates.push('http://httv.click-tt.de/wa/');
|
||||
candidates.push('http://httv.click-tt.de/cgi-bin/WebObjects/nuLigaTTDE.woa/wa/');
|
||||
|
||||
// Wenn wir eine finalUrl haben, verwende diese auch
|
||||
if (finalUrl) {
|
||||
candidates.push(finalUrl);
|
||||
}
|
||||
|
||||
console.log('[HettvClient] - URL candidates:', candidates);
|
||||
|
||||
let mainPageResponse = null;
|
||||
let mainTrace = [];
|
||||
let lastError = null;
|
||||
for (const candidate of candidates) {
|
||||
const resp = await this.authenticatedRequest(candidate, cookie, {}, finalUrl);
|
||||
if (resp.success && typeof resp.data === 'string' && resp.data.length > 0) {
|
||||
mainPageResponse = resp;
|
||||
mainTrace = resp.trace || [];
|
||||
break;
|
||||
}
|
||||
lastError = resp;
|
||||
}
|
||||
|
||||
if (!mainPageResponse) {
|
||||
return lastError || { success: false, error: 'HeTTV Einstiegsseite nicht erreichbar', status: 404 };
|
||||
}
|
||||
|
||||
const htmlContent = mainPageResponse.data;
|
||||
console.log('[HettvClient] - Main page loaded, HTML length:', htmlContent.length);
|
||||
|
||||
// Erkenne Fehlerseite (Session ungültig)
|
||||
if (/click-TT\s*-\s*Fehlerseite/i.test(htmlContent) || /ungültige oder nicht mehr gültige URL/i.test(htmlContent)) {
|
||||
return {
|
||||
success: false,
|
||||
error: 'Session ungültig oder abgelaufen',
|
||||
status: 401,
|
||||
data: { htmlSnippet: htmlContent.substring(0, 1000) }
|
||||
};
|
||||
}
|
||||
|
||||
// Speichere HTML zur Analyse
|
||||
let savedFile = null;
|
||||
try {
|
||||
const dir = path.resolve(process.cwd(), 'backend', 'uploads');
|
||||
if (!fs.existsSync(dir)) {
|
||||
fs.mkdirSync(dir, { recursive: true });
|
||||
}
|
||||
const filename = `hettv_main_${Date.now()}.html`;
|
||||
const filePath = path.join(dir, filename);
|
||||
fs.writeFileSync(filePath, htmlContent, 'utf8');
|
||||
savedFile = filePath;
|
||||
} catch (e) {
|
||||
// Ignoriere Speicherfehler still, nur für Debug
|
||||
}
|
||||
|
||||
// Suche nach Downloads-Links im HTML
|
||||
const downloadLinks = [];
|
||||
|
||||
// 1) URL-Heuristiken
|
||||
const urlPatterns = [
|
||||
/href="([^"]*download[^"]*)"/gi,
|
||||
/href="([^"]*downloads[^"]*)"/gi,
|
||||
/href="([^"]*Download[^"]*)"/gi,
|
||||
/href="([^"]*Downloads[^"]*)"/gi
|
||||
];
|
||||
|
||||
urlPatterns.forEach(pattern => {
|
||||
let match;
|
||||
while ((match = pattern.exec(htmlContent)) !== null) {
|
||||
const link = match[1];
|
||||
if (link && !downloadLinks.includes(link)) {
|
||||
downloadLinks.push(link);
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// 2) Linktext-Heuristik: <a ...>Downloads</a>
|
||||
const anchorPattern = /<a[^>]*href="([^"]+)"[^>]*>([\s\S]*?)<\/a>/gi;
|
||||
let aMatch;
|
||||
while ((aMatch = anchorPattern.exec(htmlContent)) !== null) {
|
||||
const href = aMatch[1];
|
||||
const text = aMatch[2].replace(/<[^>]*>/g, ' ').replace(/\s+/g, ' ').trim();
|
||||
if (/\bdownloads?\b/i.test(text)) {
|
||||
if (href && !downloadLinks.includes(href)) {
|
||||
downloadLinks.push(href);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// 3) Fallback: Menüpunkte in Navigationen (role="navigation" etc.)
|
||||
if (downloadLinks.length === 0) {
|
||||
const navSectionRegex = /<nav[\s\S]*?<\/nav>/gi;
|
||||
let nav;
|
||||
while ((nav = navSectionRegex.exec(htmlContent)) !== null) {
|
||||
const section = nav[0];
|
||||
let m;
|
||||
anchorPattern.lastIndex = 0;
|
||||
while ((m = anchorPattern.exec(section)) !== null) {
|
||||
const href = m[1];
|
||||
const text = m[2].replace(/<[^>]*>/g, ' ').replace(/\s+/g, ' ').trim();
|
||||
if (/\bdownloads?\b/i.test(text)) {
|
||||
if (href && !downloadLinks.includes(href)) {
|
||||
downloadLinks.push(href);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
console.log('[HettvClient] - Found download links:', downloadLinks);
|
||||
|
||||
return {
|
||||
success: true,
|
||||
data: {
|
||||
htmlContent: htmlContent,
|
||||
downloadLinks: downloadLinks,
|
||||
htmlSnippet: htmlContent.substring(0, 2000), // Erste 2000 Zeichen für Analyse
|
||||
savedFile,
|
||||
trace: mainTrace,
|
||||
lastUrl: mainTrace.length ? mainTrace[mainTrace.length - 1].url : null,
|
||||
lastStatus: mainTrace.length ? mainTrace[mainTrace.length - 1].status : null
|
||||
}
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('HeTTV main page error:', error.message);
|
||||
return {
|
||||
success: false,
|
||||
error: error.message || 'Fehler beim Laden der Hauptseite',
|
||||
status: 500
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Load a specific download page
|
||||
* @param {string} downloadUrl - URL to the download page
|
||||
* @param {string} cookie - Session cookie
|
||||
* @returns {Promise<Object>} Response with download page content
|
||||
*/
|
||||
async loadDownloadPage(downloadUrl, cookie, finalUrl = null) {
|
||||
try {
|
||||
console.log('[HettvClient] - Loading download page:', downloadUrl);
|
||||
|
||||
const response = await this.authenticatedRequest(downloadUrl, cookie, {}, finalUrl);
|
||||
if (!response.success) {
|
||||
return response;
|
||||
}
|
||||
|
||||
const htmlContent = response.data;
|
||||
console.log('[HettvClient] - Download page loaded, HTML length:', htmlContent.length);
|
||||
|
||||
return {
|
||||
success: true,
|
||||
data: {
|
||||
url: downloadUrl,
|
||||
htmlContent: htmlContent,
|
||||
htmlSnippet: htmlContent.substring(0, 3000) // Erste 3000 Zeichen für Analyse
|
||||
}
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('HeTTV download page error:', error.message);
|
||||
return {
|
||||
success: false,
|
||||
error: error.message || 'Fehler beim Laden der Download-Seite',
|
||||
status: 500
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
// --- Cookie-Helfer ---
|
||||
_ingestSetCookiesFromResponse(currentUrl, setCookies) {
|
||||
if (!Array.isArray(setCookies) || setCookies.length === 0) return;
|
||||
const { host } = new URL(currentUrl);
|
||||
if (!this.cookieJar.has(host)) this.cookieJar.set(host, new Map());
|
||||
const jar = this.cookieJar.get(host);
|
||||
setCookies.forEach((cookieStr) => {
|
||||
const pair = cookieStr.split(';')[0];
|
||||
const eq = pair.indexOf('=');
|
||||
if (eq > 0) {
|
||||
const name = pair.substring(0, eq).trim();
|
||||
const value = pair.substring(eq + 1).trim();
|
||||
jar.set(name, value);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
_cookieHeaderForUrl(currentUrl) {
|
||||
const { host } = new URL(currentUrl);
|
||||
const jar = this.cookieJar.get(host);
|
||||
if (!jar || jar.size === 0) return '';
|
||||
return Array.from(jar.entries()).map(([k, v]) => `${k}=${v}`).join('; ');
|
||||
}
|
||||
|
||||
_mergeCookieHeader(primary, secondary) {
|
||||
const items = [];
|
||||
if (primary) items.push(primary);
|
||||
if (secondary) items.push(secondary);
|
||||
return items.filter(Boolean).join('; ');
|
||||
}
|
||||
}
|
||||
|
||||
export default new HettvClient();
|
||||
|
||||
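For orientation, a rough end-to-end use of the client above: log in, then reuse the combined cookie (and, if present, the final click-TT URL) for the follow-up requests. The import path and the environment variables are placeholders, and error handling is reduced to a minimum.

```js
import hettvClient from './clients/hettvClient.js'; // adjust the path to the caller's location

// Placeholder credentials; in the backend these come from the user's stored account.
const login = await hettvClient.login(process.env.HETTV_USER, process.env.HETTV_PASSWORD);
if (!login.success) {
  throw new Error(`HeTTV login failed: ${login.error}`);
}

// The cookie may combine nusportingress and PHPSESSID; finalUrl helps pick the right origin.
const main = await hettvClient.getMainPageWithDownloads(login.cookie, login.user?.finalUrl);
if (main.success) {
  console.log('Download links found:', main.data.downloadLinks);
  if (main.data.downloadLinks.length > 0) {
    const page = await hettvClient.loadDownloadPage(
      main.data.downloadLinks[0], login.cookie, login.user?.finalUrl
    );
    console.log('First download page length:',
      page.success ? page.data.htmlContent.length : page.error);
  }
}
```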
284
backend/clients/myTischtennisClient.js
Normal file
@@ -0,0 +1,284 @@
|
||||
import axios from 'axios';
|
||||
|
||||
const BASE_URL = 'https://www.mytischtennis.de';
|
||||
|
||||
class MyTischtennisClient {
|
||||
constructor() {
|
||||
this.baseURL = BASE_URL;
|
||||
this.client = axios.create({
|
||||
baseURL: this.baseURL,
|
||||
timeout: 10000,
|
||||
headers: {
|
||||
'Content-Type': 'application/x-www-form-urlencoded',
|
||||
'Accept': '*/*'
|
||||
},
|
||||
maxRedirects: 0, // Don't follow redirects automatically
|
||||
validateStatus: (status) => status >= 200 && status < 400 // Accept 3xx as success
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Login to myTischtennis API
|
||||
* @param {string} email - myTischtennis email (not username!)
|
||||
* @param {string} password - myTischtennis password
|
||||
* @returns {Promise<Object>} Login response with token and session data
|
||||
*/
|
||||
async login(email, password) {
|
||||
try {
|
||||
// Create form data
|
||||
const formData = new URLSearchParams();
|
||||
formData.append('email', email);
|
||||
formData.append('password', password);
|
||||
formData.append('intent', 'login');
|
||||
|
||||
const response = await this.client.post(
|
||||
'/login?next=%2F&_data=routes%2F_auth%2B%2Flogin',
|
||||
formData.toString()
|
||||
);
|
||||
|
||||
// Extract the cookie from response headers
|
||||
const setCookie = response.headers['set-cookie'];
|
||||
if (!setCookie || !Array.isArray(setCookie)) {
|
||||
return {
|
||||
success: false,
|
||||
error: 'Keine Session-Cookie erhalten'
|
||||
};
|
||||
}
|
||||
|
||||
// Find the sb-10-auth-token cookie
|
||||
const authCookie = setCookie.find(cookie => cookie.startsWith('sb-10-auth-token='));
|
||||
if (!authCookie) {
|
||||
return {
|
||||
success: false,
|
||||
error: 'Kein Auth-Token in Response gefunden'
|
||||
};
|
||||
}
|
||||
|
||||
// Extract and decode the token
|
||||
const tokenMatch = authCookie.match(/sb-10-auth-token=base64-([^;]+)/);
|
||||
if (!tokenMatch) {
|
||||
return {
|
||||
success: false,
|
||||
error: 'Token-Format ungültig'
|
||||
};
|
||||
}
|
||||
|
||||
const base64Token = tokenMatch[1];
|
||||
let tokenData;
|
||||
try {
|
||||
const decodedToken = Buffer.from(base64Token, 'base64').toString('utf-8');
|
||||
tokenData = JSON.parse(decodedToken);
|
||||
} catch (decodeError) {
|
||||
console.error('Error decoding token:', decodeError);
|
||||
return {
|
||||
success: false,
|
||||
error: 'Token konnte nicht dekodiert werden'
|
||||
};
|
||||
}
|
||||
|
||||
return {
|
||||
success: true,
|
||||
accessToken: tokenData.access_token,
|
||||
refreshToken: tokenData.refresh_token,
|
||||
expiresAt: tokenData.expires_at,
|
||||
expiresIn: tokenData.expires_in,
|
||||
user: tokenData.user,
|
||||
cookie: authCookie.split(';')[0] // Just the cookie value without attributes
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('MyTischtennis login error:', error.message);
|
||||
return {
|
||||
success: false,
|
||||
error: error.response?.data?.message || 'Login fehlgeschlagen',
|
||||
status: error.response?.status || 500
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Verify login credentials
|
||||
* @param {string} email - myTischtennis email
|
||||
* @param {string} password - myTischtennis password
|
||||
* @returns {Promise<boolean>} True if credentials are valid
|
||||
*/
|
||||
async verifyCredentials(email, password) {
|
||||
const result = await this.login(email, password);
|
||||
return result.success;
|
||||
}
|
||||
|
||||
/**
|
||||
* Make an authenticated request
|
||||
* @param {string} endpoint - API endpoint
|
||||
* @param {string} cookie - Authentication cookie (sb-10-auth-token)
|
||||
* @param {Object} options - Additional axios options
|
||||
* @returns {Promise<Object>} API response
|
||||
*/
|
||||
async authenticatedRequest(endpoint, cookie, options = {}) {
|
||||
try {
|
||||
const response = await this.client.request({
|
||||
url: endpoint,
|
||||
...options,
|
||||
headers: {
|
||||
...options.headers,
|
||||
'Cookie': cookie,
|
||||
'Accept': '*/*',
|
||||
'Accept-Language': 'de-DE,de;q=0.9,en-US;q=0.8,en;q=0.7',
|
||||
'Referer': 'https://www.mytischtennis.de/',
|
||||
'sec-fetch-dest': 'empty',
|
||||
'sec-fetch-mode': 'cors',
|
||||
'sec-fetch-site': 'same-origin'
|
||||
}
|
||||
});
|
||||
return {
|
||||
success: true,
|
||||
data: response.data
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('MyTischtennis API error:', error.message);
|
||||
return {
|
||||
success: false,
|
||||
error: error.response?.data?.message || 'API-Anfrage fehlgeschlagen',
|
||||
status: error.response?.status || 500
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get user profile and club information
|
||||
* @param {string} cookie - Authentication cookie (sb-10-auth-token)
|
||||
* @returns {Promise<Object>} User profile with club info
|
||||
*/
|
||||
async getUserProfile(cookie) {
|
||||
console.log('[getUserProfile] - Calling /?_data=root with cookie:', cookie?.substring(0, 50) + '...');
|
||||
|
||||
const result = await this.authenticatedRequest('/?_data=root', cookie, {
|
||||
method: 'GET'
|
||||
});
|
||||
|
||||
console.log('[getUserProfile] - Result success:', result.success);
|
||||
|
||||
if (result.success) {
|
||||
console.log('[getUserProfile] - Response structure:', {
|
||||
hasUserProfile: !!result.data?.userProfile,
|
||||
hasClub: !!result.data?.userProfile?.club,
|
||||
hasOrganization: !!result.data?.userProfile?.organization,
|
||||
clubnr: result.data?.userProfile?.club?.clubnr,
|
||||
clubName: result.data?.userProfile?.club?.name,
|
||||
orgShort: result.data?.userProfile?.organization?.short,
|
||||
ttr: result.data?.userProfile?.ttr,
|
||||
qttr: result.data?.userProfile?.qttr
|
||||
});
|
||||
|
||||
console.log('[getUserProfile] - Full userProfile.club:', result.data?.userProfile?.club);
|
||||
console.log('[getUserProfile] - Full userProfile.organization:', result.data?.userProfile?.organization);
|
||||
|
||||
return {
|
||||
success: true,
|
||||
clubId: result.data?.userProfile?.club?.clubnr || null,
|
||||
clubName: result.data?.userProfile?.club?.name || null,
|
||||
fedNickname: result.data?.userProfile?.organization?.short || null,
|
||||
ttr: result.data?.userProfile?.ttr || null,
|
||||
qttr: result.data?.userProfile?.qttr || null,
|
||||
userProfile: result.data?.userProfile || null
|
||||
};
|
||||
}
|
||||
|
||||
console.error('[getUserProfile] - Failed:', result.error);
|
||||
return result;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get club rankings (andro-Rangliste)
|
||||
* @param {string} cookie - Authentication cookie
|
||||
* @param {string} clubId - Club number (e.g., "43030")
|
||||
* @param {string} fedNickname - Federation nickname (e.g., "HeTTV")
|
||||
* @returns {Promise<Object>} Rankings with player entries (all pages)
|
||||
*/
|
||||
async getClubRankings(cookie, clubId, fedNickname) {
|
||||
const allEntries = [];
|
||||
let currentPage = 0;
|
||||
let hasMorePages = true;
|
||||
|
||||
console.log('[getClubRankings] - Starting to fetch rankings for club', clubId);
|
||||
|
||||
while (hasMorePages) {
|
||||
const endpoint = `/rankings/andro-rangliste?all-players=on&clubnr=${clubId}&fednickname=${fedNickname}&results-per-page=100&page=${currentPage}&_data=routes%2F%24`;
|
||||
|
||||
console.log(`[getClubRankings] - Fetching page ${currentPage}...`);
|
||||
|
||||
const result = await this.authenticatedRequest(endpoint, cookie, {
|
||||
method: 'GET'
|
||||
});
|
||||
|
||||
if (!result.success) {
|
||||
console.error(`[getClubRankings] - Failed to fetch page ${currentPage}:`, result.error);
|
||||
return result;
|
||||
}
|
||||
|
||||
// Find the dynamic key that contains entries
|
||||
const blockLoaderData = result.data?.pageContent?.blockLoaderData;
|
||||
if (!blockLoaderData) {
|
||||
console.error('[getClubRankings] - No blockLoaderData found');
|
||||
return {
|
||||
success: false,
|
||||
error: 'Keine blockLoaderData gefunden'
|
||||
};
|
||||
}
|
||||
|
||||
// Finde den Schlüssel, der entries enthält
|
||||
let entries = null;
|
||||
let rankingData = null;
|
||||
|
||||
for (const key in blockLoaderData) {
|
||||
if (blockLoaderData[key]?.entries) {
|
||||
entries = blockLoaderData[key].entries;
|
||||
rankingData = blockLoaderData[key];
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (!entries) {
|
||||
console.error('[getClubRankings] - No entries found in blockLoaderData');
|
||||
return {
|
||||
success: false,
|
||||
error: 'Keine entries in blockLoaderData gefunden'
|
||||
};
|
||||
}
|
||||
|
||||
console.log(`[getClubRankings] - Page ${currentPage}: Found ${entries.length} entries`);
|
||||
|
||||
// Füge Entries hinzu
|
||||
allEntries.push(...entries);
|
||||
|
||||
// Prüfe ob es weitere Seiten gibt
|
||||
// Wenn die aktuelle Seite weniger Einträge hat als das Limit, sind wir am Ende
|
||||
// Oder wenn wir alle erwarteten Einträge haben
|
||||
if (entries.length === 0) {
|
||||
hasMorePages = false;
|
||||
console.log('[getClubRankings] - No more entries, stopping');
|
||||
} else if (rankingData.numberOfPages && currentPage >= rankingData.numberOfPages - 1) {
|
||||
hasMorePages = false;
|
||||
console.log(`[getClubRankings] - Reached last page (${rankingData.numberOfPages})`);
|
||||
} else if (allEntries.length >= rankingData.resultLength) {
|
||||
hasMorePages = false;
|
||||
console.log(`[getClubRankings] - Got all entries (${allEntries.length}/${rankingData.resultLength})`);
|
||||
} else {
|
||||
currentPage++;
|
||||
}
|
||||
}
|
||||
|
||||
console.log(`[getClubRankings] - Total entries fetched: ${allEntries.length}`);
|
||||
|
||||
return {
|
||||
success: true,
|
||||
entries: allEntries,
|
||||
metadata: {
|
||||
totalEntries: allEntries.length,
|
||||
pagesFetched: currentPage + 1
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
export default new MyTischtennisClient();
|
||||
|
||||
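The myTischtennis flow is considerably simpler than the HeTTV one: a single form login yields a base64-encoded token cookie that is then reused for the JSON endpoints. A hedged usage sketch (credentials and import path are placeholders):

```js
import myTischtennisClient from './clients/myTischtennisClient.js'; // adjust path as needed

// Placeholder credentials; note that the login expects the e-mail address, not a username.
const session = await myTischtennisClient.login(process.env.MYTT_EMAIL, process.env.MYTT_PASSWORD);
if (!session.success) {
  throw new Error(`myTischtennis login failed: ${session.error}`);
}

const profile = await myTischtennisClient.getUserProfile(session.cookie);
if (profile.success && profile.clubId) {
  const rankings = await myTischtennisClient.getClubRankings(
    session.cookie, profile.clubId, profile.fedNickname
  );
  console.log(`Fetched ${rankings.entries?.length ?? 0} ranking entries for club ${profile.clubName}`);
}
```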
@@ -1,4 +1,7 @@
|
||||
import { register, activateUser, login, logout } from '../services/authService.js';
|
||||
import jwt from 'jsonwebtoken';
|
||||
import UserToken from '../models/UserToken.js';
|
||||
import User from '../models/User.js'; // adjust path if necessary
|
||||
|
||||
const registerUser = async (req, res, next) => {
|
||||
try {
|
||||
@@ -30,14 +33,14 @@ const loginUser = async (req, res, next) => {
|
||||
}
|
||||
};
|
||||
|
||||
const logoutUser = async(req, res) => {
|
||||
const { userid: userId, authtoken: authToken } = req.headers;
|
||||
const logoutUser = async (req, res, next) => {
|
||||
try {
|
||||
logout(userId, authToken);
|
||||
const token = req.headers['authorization']?.split(' ')[1];
|
||||
const result = await logout(token);
|
||||
res.status(200).json(result);
|
||||
} catch (error) {
|
||||
res.status(401).json({ msg: 'not found' });
|
||||
next(error);
|
||||
}
|
||||
res.status(200).json({ msg: 'loggedout' });
|
||||
}
|
||||
};
|
||||
|
||||
export { registerUser, activate, loginUser, logoutUser };
|
||||
|
||||
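The reworked `logoutUser` reads the token from a standard `Authorization: Bearer ...` header instead of the old `userid`/`authtoken` headers, and delegates error handling to `next(error)`. A client call would look roughly like the sketch below; the `/api/auth/logout` path is an assumption, since the route file is not part of this hunk.

```js
// Hypothetical client-side call of the reworked logout endpoint.
// Only the Authorization header format follows from the controller; the path is assumed.
async function logoutClient(authToken) {
  const response = await fetch('/api/auth/logout', {
    method: 'POST',
    headers: { 'Authorization': `Bearer ${authToken}` }
  });
  return response.json();
}
```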
@@ -1,77 +1,76 @@
|
||||
import ClubService from '../services/clubService.js';
|
||||
import { getUserByToken } from '../utils/userUtils.js';
|
||||
import { devLog } from '../utils/logger.js';
|
||||
|
||||
export const getClubs = async (req, res) => {
|
||||
try {
|
||||
console.log('[getClubs] - get clubs');
|
||||
devLog('[getClubs] - get clubs');
|
||||
const clubs = await ClubService.getAllClubs();
|
||||
console.log('[getClubs] - prepare response');
|
||||
devLog('[getClubs] - prepare response');
|
||||
res.status(200).json(clubs);
|
||||
console.log('[getClubs] - done');
|
||||
devLog('[getClubs] - done');
|
||||
} catch (error) {
|
||||
console.log('[getClubs] - error');
|
||||
console.log(error);
|
||||
console.error('[getClubs] - error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const addClub = async (req, res) => {
|
||||
console.log('[addClub] - Read out parameters');
|
||||
devLog('[addClub] - Read out parameters');
|
||||
const { authcode: token } = req.headers;
|
||||
const { name: clubName } = req.body;
|
||||
|
||||
try {
|
||||
console.log('[addClub] - find club by name');
|
||||
devLog('[addClub] - find club by name');
|
||||
const club = await ClubService.findClubByName(clubName);
|
||||
console.log('[addClub] - get user');
|
||||
devLog('[addClub] - get user');
|
||||
const user = await getUserByToken(token);
|
||||
console.log('[addClub] - check if club already exists');
|
||||
devLog('[addClub] - check if club already exists');
|
||||
if (club) {
|
||||
res.status(409).json({ error: "alreadyexists" });
|
||||
return;
|
||||
}
|
||||
|
||||
console.log('[addClub] - create club');
|
||||
devLog('[addClub] - create club');
|
||||
const newClub = await ClubService.createClub(clubName);
|
||||
console.log('[addClub] - add user to new club');
|
||||
devLog('[addClub] - add user to new club');
|
||||
await ClubService.addUserToClub(user.id, newClub.id);
|
||||
console.log('[addClub] - prepare response');
|
||||
devLog('[addClub] - prepare response');
|
||||
res.status(200).json(newClub);
|
||||
console.log('[addClub] - done');
|
||||
devLog('[addClub] - done');
|
||||
} catch (error) {
|
||||
console.log('[addClub] - error');
|
||||
console.log(error);
|
||||
console.error('[addClub] - error:', error);
|
||||
res.status(500).json({ error: "internalerror" });
|
||||
}
|
||||
};
|
||||
|
||||
export const getClub = async (req, res) => {
|
||||
console.log('[getClub] - start');
|
||||
devLog('[getClub] - start');
|
||||
try {
|
||||
const { authcode: token } = req.headers;
|
||||
const { clubid: clubId } = req.params;
|
||||
console.log('[getClub] - get user');
|
||||
devLog('[getClub] - get user');
|
||||
const user = await getUserByToken(token);
|
||||
console.log('[getClub] - get users club');
|
||||
devLog('[getClub] - get users club');
|
||||
const access = await ClubService.getUserClubAccess(user.id, clubId);
|
||||
console.log('[getClub] - check access');
|
||||
devLog('[getClub] - check access');
|
||||
if (access.length === 0 || !access[0].approved) {
|
||||
res.status(403).json({ error: "noaccess", status: access.length === 0 ? "notrequested" : "requested" });
|
||||
return;
|
||||
}
|
||||
|
||||
console.log('[getClub] - get club');
|
||||
devLog('[getClub] - get club');
|
||||
const club = await ClubService.findClubById(clubId);
|
||||
console.log('[getClub] - check club exists');
|
||||
devLog('[getClub] - check club exists');
|
||||
if (!club) {
|
||||
return res.status(404).json({ message: 'Club not found' });
|
||||
}
|
||||
|
||||
console.log('[getClub] - set response');
|
||||
devLog('[getClub] - set response');
|
||||
res.status(200).json(club);
|
||||
console.log('[getClub] - done');
|
||||
devLog('[getClub] - done');
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
console.error('[getClub] - error:', error);
|
||||
res.status(500).json({ message: 'Server error' });
|
||||
}
|
||||
};
|
||||
@@ -82,7 +81,7 @@ export const requestClubAccess = async (req, res) => {
|
||||
|
||||
try {
|
||||
const user = await getUserByToken(token);
|
||||
console.log(user);
|
||||
devLog('[requestClubAccess] - user:', user);
|
||||
|
||||
await ClubService.requestAccessToClub(user.id, clubId);
|
||||
res.status(200).json({});
|
||||
@@ -92,6 +91,7 @@ export const requestClubAccess = async (req, res) => {
|
||||
} else if (error.message === 'clubnotfound') {
|
||||
res.status(404).json({ err: "clubnotfound" });
|
||||
} else {
|
||||
console.error('[requestClubAccess] - error:', error);
|
||||
res.status(500).json({ err: "internalerror" });
|
||||
}
|
||||
}
|
||||
|
||||
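The controllers in this and the following hunks swap ad-hoc `console.log` calls for `devLog` from `../utils/logger.js`, while real errors stay on `console.error`. The logger itself is not shown in this diff; a minimal sketch of what such a helper typically looks like, assuming it simply gates output on the environment:

```js
// backend/utils/logger.js - minimal sketch, not the actual implementation.
// Logs only outside of production so request handlers stay quiet in prod.
export const devLog = (...args) => {
  if (process.env.NODE_ENV !== 'production') {
    console.log(...args);
  }
};
```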
@@ -1,6 +1,7 @@
|
||||
import diaryService from '../services/diaryService.js';
|
||||
import HttpError from '../exceptions/HttpError.js';
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
const getDatesForClub = async (req, res) => {
|
||||
try {
|
||||
const { clubId } = req.params;
|
||||
@@ -38,7 +39,7 @@ const updateTrainingTimes = async (req, res) => {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { dateId, trainingStart, trainingEnd } = req.body;
|
||||
if (!dateId || !trainingStart) {
|
||||
console.log(dateId, trainingStart, trainingEnd);
|
||||
devLog(dateId, trainingStart, trainingEnd);
|
||||
throw new HttpError('notallfieldsfilled', 400);
|
||||
}
|
||||
const updatedDate = await diaryService.updateTrainingTimes(userToken, clubId, dateId, trainingStart, trainingEnd);
|
||||
@@ -116,3 +117,15 @@ const deleteTagFromDiaryDate = async (req, res) => {
|
||||
|
||||
export { getDatesForClub, createDateForClub, updateTrainingTimes, addDiaryNote, deleteDiaryNote, addDiaryTag,
|
||||
addTagToDiaryDate, deleteTagFromDiaryDate };
|
||||
|
||||
export const deleteDateForClub = async (req, res) => {
|
||||
try {
|
||||
const { clubId, dateId } = req.params;
|
||||
const { authcode: userToken } = req.headers;
|
||||
const result = await diaryService.removeDateForClub(userToken, clubId, dateId);
|
||||
res.status(200).json(result);
|
||||
} catch (error) {
|
||||
console.error('[deleteDateForClub] - Error:', error);
|
||||
res.status(error.statusCode || 500).json({ error: error.message || 'systemerror' });
|
||||
}
|
||||
};
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
import diaryDateActivityService from '../services/diaryDateActivityService.js';
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
export const createDiaryDateActivity = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
@@ -15,7 +16,7 @@ export const createDiaryDateActivity = async (req, res) => {
|
||||
});
|
||||
res.status(201).json(activityItem);
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
devLog(error);
|
||||
res.status(500).json({ error: 'Error creating activity' });
|
||||
}
|
||||
};
|
||||
@@ -58,7 +59,7 @@ export const updateDiaryDateActivityOrder = async (req, res) => {
|
||||
const updatedActivity = await diaryDateActivityService.updateActivityOrder(userToken, clubId, id, orderId);
|
||||
res.status(200).json(updatedActivity);
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
devLog(error);
|
||||
res.status(500).json({ error: 'Error updating activity order' });
|
||||
}
|
||||
};
|
||||
@@ -70,7 +71,7 @@ export const getDiaryDateActivities = async (req, res) => {
|
||||
const activities = await diaryDateActivityService.getActivities(userToken, clubId, diaryDateId);
|
||||
res.status(200).json(activities);
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
devLog(error);
|
||||
res.status(500).json({ error: 'Error getting activities' });
|
||||
}
|
||||
}
|
||||
@@ -82,7 +83,7 @@ export const addGroupActivity = async(req, res) => {
|
||||
const activityItem = await diaryDateActivityService.addGroupActivity(userToken, clubId, diaryDateId, groupId, activity);
|
||||
res.status(201).json(activityItem);
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
devLog(error);
|
||||
res.status(500).json({ error: 'Error adding group activity' });
|
||||
}
|
||||
}
|
||||
@@ -1,7 +1,8 @@
|
||||
import diaryDateTagService from "../services/diaryDateTagService.js"
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
export const getDiaryDateMemberTags = async (req, res) => {
|
||||
console.log("getDiaryDateMemberTags");
|
||||
devLog("getDiaryDateMemberTags");
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId, memberId } = req.params;
|
||||
@@ -14,7 +15,7 @@ export const getDiaryDateMemberTags = async (req, res) => {
|
||||
}
|
||||
|
||||
export const addDiaryDateTag = async (req, res) => {
|
||||
console.log("addDiaryDateTag");
|
||||
devLog("addDiaryDateTag");
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId } = req.params;
|
||||
|
||||
52
backend/controllers/diaryMemberActivityController.js
Normal file
@@ -0,0 +1,52 @@
import DiaryMemberActivity from '../models/DiaryMemberActivity.js';
import Participant from '../models/Participant.js';
import { checkAccess } from '../utils/userUtils.js';

export const getMembersForActivity = async (req, res) => {
    try {
        const { authcode: userToken } = req.headers;
        const { clubId, diaryDateActivityId } = req.params;
        await checkAccess(userToken, clubId);
        const list = await DiaryMemberActivity.findAll({ where: { diaryDateActivityId } });
        res.status(200).json(list);
    } catch (e) {
        res.status(500).json({ error: 'Error fetching members for activity' });
    }
};

export const addMembersToActivity = async (req, res) => {
    try {
        const { authcode: userToken } = req.headers;
        const { clubId, diaryDateActivityId } = req.params;
        const { participantIds } = req.body; // array of participant ids
        await checkAccess(userToken, clubId);
        const validParticipants = await Participant.findAll({ where: { id: participantIds } });
        const validIds = new Set(validParticipants.map(p => p.id));
        const created = [];
        for (const pid of participantIds) {
            if (!validIds.has(pid)) continue;
            const existing = await DiaryMemberActivity.findOne({ where: { diaryDateActivityId, participantId: pid } });
            if (!existing) {
                const rec = await DiaryMemberActivity.create({ diaryDateActivityId, participantId: pid });
                created.push(rec);
            }
        }
        res.status(201).json(created);
    } catch (e) {
        res.status(500).json({ error: 'Error adding members to activity' });
    }
};

export const removeMemberFromActivity = async (req, res) => {
    try {
        const { authcode: userToken } = req.headers;
        const { clubId, diaryDateActivityId, participantId } = req.params;
        await checkAccess(userToken, clubId);
        await DiaryMemberActivity.destroy({ where: { diaryDateActivityId, participantId } });
        res.status(200).json({ ok: true });
    } catch (e) {
        res.status(500).json({ error: 'Error removing member from activity' });
    }
};

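How these handlers are mounted is not visible in this excerpt; a hedged sketch of the corresponding Express routes (the path segments are assumptions inferred from the `req.params` names used above) might look like:

```js
// routes/diaryMemberActivityRoutes.js — illustrative sketch; actual paths may differ.
import { Router } from 'express';
import {
  getMembersForActivity,
  addMembersToActivity,
  removeMemberFromActivity,
} from '../controllers/diaryMemberActivityController.js';

const router = Router();

// Parameter names mirror those read via req.params in the controller.
router.get('/clubs/:clubId/diary-date-activities/:diaryDateActivityId/members', getMembersForActivity);
router.post('/clubs/:clubId/diary-date-activities/:diaryDateActivityId/members', addMembersToActivity);
router.delete('/clubs/:clubId/diary-date-activities/:diaryDateActivityId/members/:participantId', removeMemberFromActivity);

export default router;
```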
@@ -1,11 +1,12 @@
|
||||
import DiaryMemberService from '../services/diaryMemberService.js';
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
const getMemberTags = async (req, res) => {
|
||||
try {
|
||||
const { diaryDateId, memberId } = req.query;
|
||||
const { clubId } = req.params;
|
||||
const { authcode: userToken } = req.headers;
|
||||
console.log(diaryDateId, memberId, clubId);
|
||||
devLog(diaryDateId, memberId, clubId);
|
||||
const tags = await DiaryMemberService.getTagsForMemberAndDate(userToken, clubId, diaryDateId, memberId);
|
||||
res.status(200).json(tags);
|
||||
} catch (error) {
|
||||
@@ -19,7 +20,7 @@ const getMemberNotes = async (req, res) => {
|
||||
const { diaryDateId, memberId } = req.query;
|
||||
const { clubId } = req.params;
|
||||
const { authcode: userToken } = req.headers;
|
||||
console.log('---------->', userToken, clubId);
|
||||
devLog('---------->', userToken, clubId);
|
||||
const notes = await DiaryMemberService.getNotesForMember(userToken, clubId, diaryDateId, memberId);
|
||||
res.status(200).json(notes);
|
||||
} catch (error) {
|
||||
|
||||
@@ -1,5 +1,6 @@
|
||||
import { DiaryTag, DiaryDateTag } from '../models/index.js';
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
export const getTags = async (req, res) => {
|
||||
try {
|
||||
const tags = await DiaryTag.findAll();
|
||||
@@ -12,11 +13,11 @@ export const getTags = async (req, res) => {
|
||||
export const createTag = async (req, res) => {
|
||||
try {
|
||||
const { name } = req.body;
|
||||
console.log(name);
|
||||
devLog(name);
|
||||
const newTag = await DiaryTag.findOrCreate({ where: { name }, defaults: { name } });
|
||||
res.status(201).json(newTag);
|
||||
} catch (error) {
|
||||
console.log('[createTag] - Error:', error);
|
||||
devLog('[createTag] - Error:', error);
|
||||
res.status(500).json({ error: 'Error creating tag' });
|
||||
}
|
||||
};
|
||||
|
||||
172
backend/controllers/externalServiceController.js
Normal file
@@ -0,0 +1,172 @@
|
||||
import externalServiceService from '../services/externalServiceService.js';
|
||||
import HttpError from '../exceptions/HttpError.js';
|
||||
|
||||
class ExternalServiceController {
|
||||
/**
|
||||
* GET /api/mytischtennis/account?service=mytischtennis
|
||||
* Get current user's external service account
|
||||
*/
|
||||
async getAccount(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const service = req.query.service || 'mytischtennis';
|
||||
const account = await externalServiceService.getAccount(userId, service);
|
||||
|
||||
if (!account) {
|
||||
return res.status(200).json({ account: null });
|
||||
}
|
||||
|
||||
res.status(200).json({ account });
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* GET /api/mytischtennis/status?service=mytischtennis
|
||||
* Check account configuration status
|
||||
*/
|
||||
async getStatus(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const service = req.query.service || 'mytischtennis';
|
||||
const status = await externalServiceService.checkAccountStatus(userId, service);
|
||||
res.status(200).json(status);
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* POST /api/mytischtennis/account
|
||||
* Create or update external service account
|
||||
*/
|
||||
async upsertAccount(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const { email, password, savePassword, userPassword, service = 'mytischtennis' } = req.body;
|
||||
|
||||
if (!email) {
|
||||
throw new HttpError(400, 'E-Mail-Adresse erforderlich');
|
||||
}
|
||||
|
||||
// Wenn ein Passwort gesetzt wird, muss das App-Passwort angegeben werden
|
||||
if (password && !userPassword) {
|
||||
throw new HttpError(400, 'App-Passwort erforderlich zum Setzen des myTischtennis-Passworts');
|
||||
}
|
||||
|
||||
const account = await externalServiceService.upsertAccount(
|
||||
userId,
|
||||
email,
|
||||
password,
|
||||
savePassword || false,
|
||||
userPassword,
|
||||
service
|
||||
);
|
||||
|
||||
res.status(200).json({
|
||||
message: `${service}-Account erfolgreich gespeichert`,
|
||||
account
|
||||
});
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* DELETE /api/mytischtennis/account?service=mytischtennis
|
||||
* Delete external service account
|
||||
*/
|
||||
async deleteAccount(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const service = req.query.service || 'mytischtennis';
|
||||
const deleted = await externalServiceService.deleteAccount(userId, service);
|
||||
|
||||
if (!deleted) {
|
||||
throw new HttpError(404, `Kein ${service}-Account gefunden`);
|
||||
}
|
||||
|
||||
res.status(200).json({ message: `${service}-Account gelöscht` });
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* POST /api/mytischtennis/verify
|
||||
* Verify login credentials
|
||||
*/
|
||||
async verifyLogin(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const { password, service = 'mytischtennis' } = req.body;
|
||||
|
||||
const result = await externalServiceService.verifyLogin(userId, password, service);
|
||||
|
||||
res.status(200).json({
|
||||
message: 'Login erfolgreich',
|
||||
success: true,
|
||||
accessToken: result.accessToken,
|
||||
expiresAt: result.expiresAt,
|
||||
clubId: result.clubId,
|
||||
clubName: result.clubName
|
||||
});
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* GET /api/mytischtennis/session?service=mytischtennis
|
||||
* Get stored session data for authenticated requests
|
||||
*/
|
||||
async getSession(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const service = req.query.service || 'mytischtennis';
|
||||
const session = await externalServiceService.getSession(userId, service);
|
||||
|
||||
res.status(200).json({ session });
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* GET /api/external-service/hettv/main-page
|
||||
* Load HeTTV main page and find download links
|
||||
*/
|
||||
async loadHettvMainPage(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const result = await externalServiceService.loadHettvMainPage(userId);
|
||||
res.status(200).json(result);
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* POST /api/external-service/hettv/download-page
|
||||
* Load specific HeTTV download page
|
||||
*/
|
||||
async loadHettvDownloadPage(req, res, next) {
|
||||
try {
|
||||
const userId = req.user.id;
|
||||
const { downloadUrl } = req.body;
|
||||
|
||||
if (!downloadUrl) {
|
||||
throw new HttpError(400, 'Download-URL ist erforderlich');
|
||||
}
|
||||
|
||||
const result = await externalServiceService.loadHettvDownloadPage(userId, downloadUrl);
|
||||
res.status(200).json(result);
|
||||
} catch (error) {
|
||||
next(error);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
export default new ExternalServiceController();
|
||||
|
||||
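A hedged example of how a client could call the account upsert endpoint documented above; the token header is an assumption based on the new authenticate middleware further down, but the body fields match what `upsertAccount` destructures from `req.body`:

```js
// Illustrative client call — endpoint shape taken from the controller comments above.
const saveMyTischtennisAccount = async (token) => {
  const response = await fetch('/api/mytischtennis/account', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // The middleware also accepts an 'authcode' header; Bearer is used here for illustration.
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({
      email: 'player@example.com',
      password: 'myTischtennis-password', // optional
      savePassword: true,
      userPassword: 'app-password',       // required whenever password is set
      service: 'mytischtennis',
    }),
  });
  if (!response.ok) throw new Error(`Request failed: ${response.status}`);
  return response.json(); // { message, account }
};
```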
@@ -1,6 +1,7 @@
import HttpError from '../exceptions/HttpError.js';
import groupService from '../services/groupService.js';

import { devLog } from '../utils/logger.js';
const addGroup = async(req, res) => {
try {
const { authcode: userToken } = req.headers;
@@ -9,7 +10,7 @@ const addGroup = async(req, res) => {
res.status(201).json(result);
} catch (error) {
console.error('[addGroup] - Error:', error);
console.log(req.params, req.headers, req.body)
devLog(req.params, req.headers, req.body)
res.status(error.statusCode || 500).json({ error: error.message });
}
}

@@ -1,6 +1,7 @@
import MatchService from '../services/matchService.js';
import fs from 'fs';

import { devLog } from '../utils/logger.js';
export const uploadCSV = async (req, res) => {
try {
const { clubId } = req.body;
@@ -21,7 +22,7 @@ export const uploadCSV = async (req, res) => {

export const getLeaguesForCurrentSeason = async (req, res) => {
try {
console.log(req.headers, req.params);
devLog(req.headers, req.params);
const { authcode: userToken } = req.headers;
const { clubId } = req.params;
const leagues = await MatchService.getLeaguesForCurrentSeason(userToken, clubId);

@@ -1,5 +1,6 @@
|
||||
import MemberService from "../services/memberService.js";
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
const getClubMembers = async(req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
@@ -9,24 +10,24 @@ const getClubMembers = async(req, res) => {
|
||||
}
|
||||
res.status(200).json(await MemberService.getClubMembers(userToken, clubId, showAll));
|
||||
} catch(error) {
|
||||
console.log('[getClubMembers] - Error: ', error);
|
||||
devLog('[getClubMembers] - Error: ', error);
|
||||
res.status(500).json({ error: 'systemerror' });
|
||||
}
|
||||
}
|
||||
|
||||
const getWaitingApprovals = async(req, res) => {
|
||||
try {
|
||||
console.log('[getWaitingApprovals] - Start');
|
||||
devLog('[getWaitingApprovals] - Start');
|
||||
const { id: clubId } = req.params;
|
||||
console.log('[getWaitingApprovals] - get token');
|
||||
devLog('[getWaitingApprovals] - get token');
|
||||
const { authcode: userToken } = req.headers;
|
||||
console.log('[getWaitingApprovals] - load for waiting approvals');
|
||||
devLog('[getWaitingApprovals] - load for waiting approvals');
|
||||
const waitingApprovals = await MemberService.getApprovalRequests(userToken, clubId);
|
||||
console.log('[getWaitingApprovals] - set response');
|
||||
devLog('[getWaitingApprovals] - set response');
|
||||
res.status(200).json(waitingApprovals);
|
||||
console.log('[getWaitingApprovals] - done');
|
||||
devLog('[getWaitingApprovals] - done');
|
||||
} catch(error) {
|
||||
console.log('[getWaitingApprovals] - Error: ', error);
|
||||
devLog('[getWaitingApprovals] - Error: ', error);
|
||||
res.status(403).json({ error: error });
|
||||
}
|
||||
}
|
||||
@@ -34,11 +35,11 @@ const getWaitingApprovals = async(req, res) => {
|
||||
const setClubMembers = async (req, res) => {
|
||||
try {
|
||||
const { id: memberId, firstname: firstName, lastname: lastName, street, city, birthdate, phone, email, active,
|
||||
testMembership, picsInInternetAllowed } = req.body;
|
||||
testMembership, picsInInternetAllowed, gender, ttr, qttr } = req.body;
|
||||
const { id: clubId } = req.params;
|
||||
const { authcode: userToken } = req.headers;
|
||||
const addResult = await MemberService.setClubMember(userToken, clubId, memberId, firstName, lastName, street, city, birthdate,
|
||||
phone, email, active, testMembership, picsInInternetAllowed);
|
||||
phone, email, active, testMembership, picsInInternetAllowed, gender, ttr, qttr);
|
||||
res.status(addResult.status || 500).json(addResult.response);
|
||||
} catch (error) {
|
||||
console.error('[setClubMembers] - Error:', error);
|
||||
@@ -59,7 +60,7 @@ const uploadMemberImage = async (req, res) => {
|
||||
};
|
||||
|
||||
const getMemberImage = async (req, res) => {
|
||||
console.log('[getMemberImage]');
|
||||
devLog('[getMemberImage]');
|
||||
try {
|
||||
const { clubId, memberId } = req.params;
|
||||
const { authcode: userToken } = req.headers;
|
||||
@@ -75,4 +76,17 @@ const getMemberImage = async (req, res) => {
|
||||
}
|
||||
};
|
||||
|
||||
export { getClubMembers, getWaitingApprovals, setClubMembers, uploadMemberImage, getMemberImage };
|
||||
const updateRatingsFromMyTischtennis = async (req, res) => {
|
||||
devLog('[updateRatingsFromMyTischtennis]');
|
||||
try {
|
||||
const { id: clubId } = req.params;
|
||||
const { authcode: userToken } = req.headers;
|
||||
const result = await MemberService.updateRatingsFromMyTischtennis(userToken, clubId);
|
||||
res.status(result.status).json(result.response);
|
||||
} catch (error) {
|
||||
console.error('[updateRatingsFromMyTischtennis] - Error:', error);
|
||||
res.status(500).json({ error: 'Failed to update ratings' });
|
||||
}
|
||||
};
|
||||
|
||||
export { getClubMembers, getWaitingApprovals, setClubMembers, uploadMemberImage, getMemberImage, updateRatingsFromMyTischtennis };
|
||||
@@ -1,15 +1,16 @@
|
||||
import MemberNoteService from "../services/memberNoteService.js";
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
const getMemberNotes = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { memberId } = req.params;
|
||||
const { clubId } = req.query;
|
||||
console.log('[getMemberNotes]', userToken, memberId, clubId);
|
||||
devLog('[getMemberNotes]', userToken, memberId, clubId);
|
||||
const notes = await MemberNoteService.getNotesForMember(userToken, clubId, memberId);
|
||||
res.status(200).json(notes);
|
||||
} catch (error) {
|
||||
console.log('[getMemberNotes] - Error: ', error);
|
||||
devLog('[getMemberNotes] - Error: ', error);
|
||||
res.status(500).json({ error: 'systemerror' });
|
||||
}
|
||||
};
|
||||
@@ -18,12 +19,12 @@ const addMemberNote = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { memberId, content, clubId } = req.body;
|
||||
console.log('[addMemberNote]', userToken, memberId, content, clubId);
|
||||
devLog('[addMemberNote]', userToken, memberId, content, clubId);
|
||||
await MemberNoteService.addNoteToMember(userToken, clubId, memberId, content);
|
||||
const notes = await MemberNoteService.getNotesForMember(userToken, clubId, memberId);
|
||||
res.status(201).json(notes);
|
||||
} catch (error) {
|
||||
console.log('[addMemberNote] - Error: ', error);
|
||||
devLog('[addMemberNote] - Error: ', error);
|
||||
res.status(500).json({ error: 'systemerror' });
|
||||
}
|
||||
};
|
||||
@@ -33,13 +34,13 @@ const deleteMemberNote = async (req, res) => {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { noteId } = req.params;
|
||||
const { clubId } = req.body;
|
||||
console.log('[deleteMemberNote]', userToken, noteId, clubId);
|
||||
devLog('[deleteMemberNote]', userToken, noteId, clubId);
|
||||
const memberId = await MemberNoteService.getMemberIdForNote(noteId); // Member ID ermitteln
|
||||
await MemberNoteService.deleteNoteForMember(userToken, clubId, noteId);
|
||||
const notes = await MemberNoteService.getNotesForMember(userToken, clubId, memberId);
|
||||
res.status(200).json(notes);
|
||||
} catch (error) {
|
||||
console.log('[deleteMemberNote] - Error: ', error);
|
||||
devLog('[deleteMemberNote] - Error: ', error);
|
||||
res.status(500).json({ error: 'systemerror' });
|
||||
}
|
||||
};
|
||||
|
||||
619
backend/controllers/officialTournamentController.js
Normal file
@@ -0,0 +1,619 @@
|
||||
import { createRequire } from 'module';
|
||||
const require = createRequire(import.meta.url);
|
||||
const pdfParse = require('pdf-parse/lib/pdf-parse.js');
|
||||
import { checkAccess } from '../utils/userUtils.js';
|
||||
import OfficialTournament from '../models/OfficialTournament.js';
|
||||
import OfficialCompetition from '../models/OfficialCompetition.js';
|
||||
import OfficialCompetitionMember from '../models/OfficialCompetitionMember.js';
|
||||
import Member from '../models/Member.js';
|
||||
import { Op } from 'sequelize';
|
||||
|
||||
// In-Memory Store (einfacher Start); später DB-Modell
|
||||
const parsedTournaments = new Map(); // key: id, value: { id, clubId, rawText, parsedData }
|
||||
let seq = 1;
|
||||
|
||||
export const uploadTournamentPdf = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId } = req.params;
|
||||
await checkAccess(userToken, clubId);
|
||||
if (!req.file || !req.file.buffer) return res.status(400).json({ error: 'No pdf provided' });
|
||||
const data = await pdfParse(req.file.buffer);
|
||||
const parsed = parseTournamentText(data.text);
|
||||
const t = await OfficialTournament.create({
|
||||
clubId,
|
||||
title: parsed.title || null,
|
||||
eventDate: parsed.termin || null,
|
||||
organizer: null,
|
||||
host: null,
|
||||
venues: JSON.stringify(parsed.austragungsorte || []),
|
||||
competitionTypes: JSON.stringify(parsed.konkurrenztypen || []),
|
||||
registrationDeadlines: JSON.stringify(parsed.meldeschluesse || []),
|
||||
entryFees: JSON.stringify(parsed.entryFees || {}),
|
||||
});
|
||||
// competitions persistieren
|
||||
for (const c of parsed.competitions || []) {
|
||||
// Korrigiere Fehlzuordnung: Wenn die Zeile mit "Stichtag" fälschlich in performanceClass steht
|
||||
let performanceClass = c.leistungsklasse || c.performanceClass || null;
|
||||
let cutoffDate = c.stichtag || c.cutoffDate || null;
|
||||
if (performanceClass && /^stichtag\b/i.test(performanceClass)) {
|
||||
cutoffDate = performanceClass.replace(/^stichtag\s*:?\s*/i, '').trim();
|
||||
performanceClass = null;
|
||||
}
|
||||
await OfficialCompetition.create({
|
||||
tournamentId: t.id,
|
||||
ageClassCompetition: c.altersklasseWettbewerb || c.ageClassCompetition || null,
|
||||
performanceClass,
|
||||
startTime: c.startzeit || c.startTime || null,
|
||||
registrationDeadlineDate: c.meldeschlussDatum || c.registrationDeadlineDate || null,
|
||||
registrationDeadlineOnline: c.meldeschlussOnline || c.registrationDeadlineOnline || null,
|
||||
cutoffDate,
|
||||
ttrRelevant: c.ttrRelevant || null,
|
||||
openTo: c.offenFuer || c.openTo || null,
|
||||
preliminaryRound: c.vorrunde || c.preliminaryRound || null,
|
||||
finalRound: c.endrunde || c.finalRound || null,
|
||||
maxParticipants: c.maxTeilnehmer || c.maxParticipants || null,
|
||||
entryFee: c.startgeld || c.entryFee || null,
|
||||
});
|
||||
}
|
||||
res.status(201).json({ id: String(t.id) });
|
||||
} catch (e) {
|
||||
console.error('[uploadTournamentPdf] Error:', e);
|
||||
res.status(500).json({ error: 'Failed to parse pdf' });
|
||||
}
|
||||
};
|
||||
|
||||
export const getParsedTournament = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId, id } = req.params;
|
||||
await checkAccess(userToken, clubId);
|
||||
const t = await OfficialTournament.findOne({ where: { id, clubId } });
|
||||
if (!t) return res.status(404).json({ error: 'not found' });
|
||||
const comps = await OfficialCompetition.findAll({ where: { tournamentId: id } });
|
||||
const entries = await OfficialCompetitionMember.findAll({ where: { tournamentId: id } });
|
||||
const competitions = comps.map((c) => {
|
||||
const j = c.toJSON();
|
||||
return {
|
||||
id: j.id,
|
||||
tournamentId: j.tournamentId,
|
||||
ageClassCompetition: j.ageClassCompetition || null,
|
||||
performanceClass: j.performanceClass || null,
|
||||
startTime: j.startTime || null,
|
||||
registrationDeadlineDate: j.registrationDeadlineDate || null,
|
||||
registrationDeadlineOnline: j.registrationDeadlineOnline || null,
|
||||
cutoffDate: j.cutoffDate || null,
|
||||
ttrRelevant: j.ttrRelevant || null,
|
||||
openTo: j.openTo || null,
|
||||
preliminaryRound: j.preliminaryRound || null,
|
||||
finalRound: j.finalRound || null,
|
||||
maxParticipants: j.maxParticipants || null,
|
||||
entryFee: j.entryFee || null,
|
||||
// Legacy Felder zusätzlich, falls Frontend sie noch nutzt
|
||||
altersklasseWettbewerb: j.ageClassCompetition || null,
|
||||
leistungsklasse: j.performanceClass || null,
|
||||
startzeit: j.startTime || null,
|
||||
meldeschlussDatum: j.registrationDeadlineDate || null,
|
||||
meldeschlussOnline: j.registrationDeadlineOnline || null,
|
||||
stichtag: j.cutoffDate || null,
|
||||
offenFuer: j.openTo || null,
|
||||
vorrunde: j.preliminaryRound || null,
|
||||
endrunde: j.finalRound || null,
|
||||
maxTeilnehmer: j.maxParticipants || null,
|
||||
startgeld: j.entryFee || null,
|
||||
};
|
||||
});
|
||||
res.status(200).json({
|
||||
id: String(t.id),
|
||||
clubId: String(t.clubId),
|
||||
parsedData: {
|
||||
title: t.title,
|
||||
termin: t.eventDate,
|
||||
austragungsorte: JSON.parse(t.venues || '[]'),
|
||||
konkurrenztypen: JSON.parse(t.competitionTypes || '[]'),
|
||||
meldeschluesse: JSON.parse(t.registrationDeadlines || '[]'),
|
||||
entryFees: JSON.parse(t.entryFees || '{}'),
|
||||
competitions,
|
||||
},
|
||||
participation: entries.map(e => ({
|
||||
id: e.id,
|
||||
tournamentId: e.tournamentId,
|
||||
competitionId: e.competitionId,
|
||||
memberId: e.memberId,
|
||||
wants: !!e.wants,
|
||||
registered: !!e.registered,
|
||||
participated: !!e.participated,
|
||||
placement: e.placement || null,
|
||||
})),
|
||||
});
|
||||
} catch (e) {
|
||||
res.status(500).json({ error: 'Failed to fetch parsed tournament' });
|
||||
}
|
||||
};
|
||||
|
||||
export const upsertCompetitionMember = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId, id } = req.params; // id = tournamentId
|
||||
await checkAccess(userToken, clubId);
|
||||
const { competitionId, memberId, wants, registered, participated, placement } = req.body;
|
||||
if (!competitionId || !memberId) return res.status(400).json({ error: 'competitionId and memberId required' });
|
||||
const [row] = await OfficialCompetitionMember.findOrCreate({
|
||||
where: { competitionId, memberId },
|
||||
defaults: {
|
||||
tournamentId: id,
|
||||
competitionId,
|
||||
memberId,
|
||||
wants: !!wants,
|
||||
registered: !!registered,
|
||||
participated: !!participated,
|
||||
placement: placement || null,
|
||||
}
|
||||
});
|
||||
row.wants = wants !== undefined ? !!wants : row.wants;
|
||||
row.registered = registered !== undefined ? !!registered : row.registered;
|
||||
row.participated = participated !== undefined ? !!participated : row.participated;
|
||||
if (placement !== undefined) row.placement = placement;
|
||||
await row.save();
|
||||
return res.status(200).json({ success: true, id: row.id });
|
||||
} catch (e) {
|
||||
console.error('[upsertCompetitionMember] Error:', e);
|
||||
res.status(500).json({ error: 'Failed to save participation' });
|
||||
}
|
||||
};
|
||||
|
||||
export const updateParticipantStatus = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId, id } = req.params; // id = tournamentId
|
||||
await checkAccess(userToken, clubId);
|
||||
const { competitionId, memberId, action } = req.body;
|
||||
|
||||
if (!competitionId || !memberId || !action) {
|
||||
return res.status(400).json({ error: 'competitionId, memberId and action required' });
|
||||
}
|
||||
|
||||
const [row] = await OfficialCompetitionMember.findOrCreate({
|
||||
where: { competitionId, memberId },
|
||||
defaults: {
|
||||
tournamentId: id,
|
||||
competitionId,
|
||||
memberId,
|
||||
wants: false,
|
||||
registered: false,
|
||||
participated: false,
|
||||
placement: null,
|
||||
}
|
||||
});
|
||||
|
||||
// Status-Update basierend auf Aktion
|
||||
switch (action) {
|
||||
case 'register':
|
||||
// Von "möchte teilnehmen" zu "angemeldet"
|
||||
row.wants = true;
|
||||
row.registered = true;
|
||||
row.participated = false;
|
||||
break;
|
||||
case 'participate':
|
||||
// Von "angemeldet" zu "hat gespielt"
|
||||
row.wants = true;
|
||||
row.registered = true;
|
||||
row.participated = true;
|
||||
break;
|
||||
case 'reset':
|
||||
// Zurück zu "möchte teilnehmen"
|
||||
row.wants = true;
|
||||
row.registered = false;
|
||||
row.participated = false;
|
||||
break;
|
||||
default:
|
||||
return res.status(400).json({ error: 'Invalid action. Use: register, participate, or reset' });
|
||||
}
|
||||
|
||||
await row.save();
|
||||
return res.status(200).json({
|
||||
success: true,
|
||||
id: row.id,
|
||||
status: {
|
||||
wants: row.wants,
|
||||
registered: row.registered,
|
||||
participated: row.participated,
|
||||
placement: row.placement
|
||||
}
|
||||
});
|
||||
} catch (e) {
|
||||
console.error('[updateParticipantStatus] Error:', e);
|
||||
res.status(500).json({ error: 'Failed to update participant status' });
|
||||
}
|
||||
};
|
||||
|
||||
export const listOfficialTournaments = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId } = req.params;
|
||||
await checkAccess(userToken, clubId);
|
||||
const list = await OfficialTournament.findAll({ where: { clubId } });
|
||||
res.status(200).json(list);
|
||||
} catch (e) {
|
||||
res.status(500).json({ error: 'Failed to list tournaments' });
|
||||
}
|
||||
};
|
||||
|
||||
export const listClubParticipations = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId } = req.params;
|
||||
await checkAccess(userToken, clubId);
|
||||
const tournaments = await OfficialTournament.findAll({ where: { clubId } });
|
||||
if (!tournaments || tournaments.length === 0) return res.status(200).json([]);
|
||||
const tournamentIds = tournaments.map(t => t.id);
|
||||
|
||||
const rows = await OfficialCompetitionMember.findAll({
|
||||
where: { tournamentId: { [Op.in]: tournamentIds }, participated: true },
|
||||
include: [
|
||||
{ model: OfficialCompetition, as: 'competition', attributes: ['id', 'tournamentId', 'ageClassCompetition', 'startTime'] },
|
||||
{ model: OfficialTournament, as: 'tournament', attributes: ['id', 'title', 'eventDate'] },
|
||||
{ model: Member, as: 'member', attributes: ['id', 'firstName', 'lastName'] },
|
||||
]
|
||||
});
|
||||
|
||||
const parseDmy = (s) => {
|
||||
if (!s) return null;
|
||||
const m = String(s).match(/(\d{1,2})\.(\d{1,2})\.(\d{4})/);
|
||||
if (!m) return null;
|
||||
const d = new Date(Number(m[3]), Number(m[2]) - 1, Number(m[1]));
|
||||
return isNaN(d.getTime()) ? null : d;
|
||||
};
|
||||
const fmtDmy = (d) => {
|
||||
const dd = String(d.getDate()).padStart(2, '0');
|
||||
const mm = String(d.getMonth() + 1).padStart(2, '0');
|
||||
const yyyy = d.getFullYear();
|
||||
return `${dd}.${mm}.${yyyy}`;
|
||||
};
|
||||
|
||||
const byTournament = new Map();
|
||||
for (const r of rows) {
|
||||
const t = r.tournament;
|
||||
const c = r.competition;
|
||||
const m = r.member;
|
||||
if (!t || !c || !m) continue;
|
||||
if (!byTournament.has(t.id)) {
|
||||
byTournament.set(t.id, {
|
||||
tournamentId: String(t.id),
|
||||
title: t.title || null,
|
||||
startDate: null,
|
||||
endDate: null,
|
||||
entries: [],
|
||||
_dates: [],
|
||||
_eventDate: t.eventDate || null,
|
||||
});
|
||||
}
|
||||
const bucket = byTournament.get(t.id);
|
||||
const compDate = parseDmy(c.startTime || '') || null;
|
||||
if (compDate) bucket._dates.push(compDate);
|
||||
bucket.entries.push({
|
||||
memberId: m.id,
|
||||
memberName: `${m.firstName || ''} ${m.lastName || ''}`.trim(),
|
||||
competitionId: c.id,
|
||||
competitionName: c.ageClassCompetition || '',
|
||||
placement: r.placement || null,
|
||||
date: compDate ? fmtDmy(compDate) : null,
|
||||
});
|
||||
}
|
||||
|
||||
const out = [];
|
||||
for (const t of tournaments) {
|
||||
const bucket = byTournament.get(t.id) || {
|
||||
tournamentId: String(t.id),
|
||||
title: t.title || null,
|
||||
startDate: null,
|
||||
endDate: null,
|
||||
entries: [],
|
||||
_dates: [],
|
||||
_eventDate: t.eventDate || null,
|
||||
};
|
||||
// Ableiten Start/Ende
|
||||
if (bucket._dates.length) {
|
||||
bucket._dates.sort((a, b) => a - b);
|
||||
bucket.startDate = fmtDmy(bucket._dates[0]);
|
||||
bucket.endDate = fmtDmy(bucket._dates[bucket._dates.length - 1]);
|
||||
} else if (bucket._eventDate) {
|
||||
const all = String(bucket._eventDate).match(/(\d{1,2}\.\d{1,2}\.\d{4})/g) || [];
|
||||
if (all.length >= 1) {
|
||||
const d1 = parseDmy(all[0]);
|
||||
const d2 = all.length >= 2 ? parseDmy(all[1]) : d1;
|
||||
if (d1) bucket.startDate = fmtDmy(d1);
|
||||
if (d2) bucket.endDate = fmtDmy(d2);
|
||||
}
|
||||
}
|
||||
// Sort entries: Mitglied, dann Konkurrenz
|
||||
bucket.entries.sort((a, b) => {
|
||||
const mcmp = (a.memberName || '').localeCompare(b.memberName || '', 'de', { sensitivity: 'base' });
|
||||
if (mcmp !== 0) return mcmp;
|
||||
return (a.competitionName || '').localeCompare(b.competitionName || '', 'de', { sensitivity: 'base' });
|
||||
});
|
||||
delete bucket._dates;
|
||||
delete bucket._eventDate;
|
||||
out.push(bucket);
|
||||
}
|
||||
|
||||
res.status(200).json(out);
|
||||
} catch (e) {
|
||||
res.status(500).json({ error: 'Failed to list club participations' });
|
||||
}
|
||||
};
|
||||
|
||||
export const deleteOfficialTournament = async (req, res) => {
|
||||
try {
|
||||
const { authcode: userToken } = req.headers;
|
||||
const { clubId, id } = req.params;
|
||||
await checkAccess(userToken, clubId);
|
||||
const t = await OfficialTournament.findOne({ where: { id, clubId } });
|
||||
if (!t) return res.status(404).json({ error: 'not found' });
|
||||
await OfficialCompetition.destroy({ where: { tournamentId: id } });
|
||||
await OfficialTournament.destroy({ where: { id } });
|
||||
res.status(204).send();
|
||||
} catch (e) {
|
||||
res.status(500).json({ error: 'Failed to delete tournament' });
|
||||
}
|
||||
};
|
||||
|
||||
function parseTournamentText(text) {
|
||||
const lines = text.split(/\r?\n/);
|
||||
const normLines = lines.map(l => l.replace(/\s+/g, ' ').trim());
|
||||
|
||||
const findTitle = () => {
|
||||
const idx = normLines.findIndex(l => /Kreiseinzelmeisterschaften/i.test(l));
|
||||
return idx >= 0 ? normLines[idx] : null;
|
||||
};
|
||||
|
||||
// Neue Funktion: Teilnahmegebühren pro Spielklasse extrahieren
|
||||
const extractEntryFees = () => {
|
||||
const entryFees = {};
|
||||
|
||||
// Verschiedene Patterns für Teilnahmegebühren suchen
|
||||
const feePatterns = [
|
||||
// Pattern 1: "Startgeld: U12: 5€, U14: 7€, U16: 10€"
|
||||
/startgeld\s*:?\s*(.+)/i,
|
||||
// Pattern 2: "Teilnahmegebühr: U12: 5€, U14: 7€"
|
||||
/teilnahmegebühr\s*:?\s*(.+)/i,
|
||||
// Pattern 3: "Gebühr: U12: 5€, U14: 7€"
|
||||
/gebühr\s*:?\s*(.+)/i,
|
||||
// Pattern 4: "Einschreibegebühr: U12: 5€, U14: 7€"
|
||||
/einschreibegebühr\s*:?\s*(.+)/i,
|
||||
// Pattern 5: "Anmeldegebühr: U12: 5€, U14: 7€"
|
||||
/anmeldegebühr\s*:?\s*(.+)/i
|
||||
];
|
||||
|
||||
for (const pattern of feePatterns) {
|
||||
for (let i = 0; i < normLines.length; i++) {
|
||||
const line = normLines[i];
|
||||
const match = line.match(pattern);
|
||||
if (match) {
|
||||
const feeText = match[1];
|
||||
|
||||
// Extrahiere Gebühren aus dem Text
|
||||
// Unterstützt verschiedene Formate:
|
||||
// "U12: 5€, U14: 7€, U16: 10€"
|
||||
// "U12: 5 Euro, U14: 7 Euro"
|
||||
// "U12 5€, U14 7€"
|
||||
// "U12: 5,00€, U14: 7,00€"
|
||||
const feeMatches = feeText.matchAll(/(U\d+|AK\s*\d+)\s*:?\s*(\d+(?:[,.]\d+)?)\s*(?:€|Euro|EUR)?/gi);
|
||||
|
||||
for (const feeMatch of feeMatches) {
|
||||
const ageClass = feeMatch[1].toUpperCase().replace(/\s+/g, '');
|
||||
const amount = feeMatch[2].replace(',', '.');
|
||||
const numericAmount = parseFloat(amount);
|
||||
|
||||
if (!isNaN(numericAmount)) {
|
||||
entryFees[ageClass] = {
|
||||
amount: numericAmount,
|
||||
currency: '€',
|
||||
rawText: feeMatch[0]
|
||||
};
|
||||
}
|
||||
}
|
||||
|
||||
// Wenn wir Gebühren gefunden haben, brechen wir ab
|
||||
if (Object.keys(entryFees).length > 0) {
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
if (Object.keys(entryFees).length > 0) {
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
return entryFees;
|
||||
};
|
||||
|
||||
const extractBlockAfter = (labels, multiline = false) => {
|
||||
const idx = normLines.findIndex(l => labels.some(lb => l.toLowerCase().startsWith(lb)));
|
||||
if (idx === -1) return multiline ? [] : null;
|
||||
const line = normLines[idx];
|
||||
const afterColon = line.includes(':') ? line.split(':').slice(1).join(':').trim() : '';
|
||||
if (!multiline) {
|
||||
if (afterColon) return afterColon;
|
||||
// sonst nächste nicht-leere Zeile
|
||||
for (let i = idx + 1; i < normLines.length; i++) {
|
||||
if (normLines[i]) return normLines[i];
|
||||
}
|
||||
return null;
|
||||
}
|
||||
// multiline bis zur nächsten Leerzeile oder nächsten bekannten Section
|
||||
const out = [];
|
||||
if (afterColon) out.push(afterColon);
|
||||
for (let i = idx + 1; i < normLines.length; i++) {
|
||||
const ln = normLines[i];
|
||||
if (!ln) break;
|
||||
if (/^(termin|austragungsort|austragungsorte|konkurrenz|konkurrenzen|konkurrenztypen|meldeschluss|altersklassen|startzeiten)/i.test(ln)) break;
|
||||
out.push(ln);
|
||||
}
|
||||
return out;
|
||||
};
|
||||
|
||||
const extractAllMatches = (regex) => {
|
||||
const results = [];
|
||||
for (const l of normLines) {
|
||||
const m = l.match(regex);
|
||||
if (m) results.push(m);
|
||||
}
|
||||
return results;
|
||||
};
|
||||
|
||||
const title = findTitle();
|
||||
const termin = extractBlockAfter(['termin', 'termin '], false);
|
||||
const austragungsorte = extractBlockAfter(['austragungsort', 'austragungsorte'], true);
|
||||
let konkurrenzRaw = extractBlockAfter(['konkurrenz', 'konkurrenzen', 'konkurrenztypen'], true);
|
||||
if (konkurrenzRaw && !Array.isArray(konkurrenzRaw)) konkurrenzRaw = [konkurrenzRaw];
|
||||
const konkurrenztypen = (konkurrenzRaw || []).flatMap(l => l.split(/[;,]/)).map(s => s.trim()).filter(Boolean);
|
||||
|
||||
// Meldeschlüsse mit Position und Zuordnung zu AK ermitteln
|
||||
const meldeschluesseRaw = [];
|
||||
for (let i = 0; i < normLines.length; i++) {
|
||||
const l = normLines[i];
|
||||
const m = l.match(/meldeschluss\s*:?\s*(.+)$/i);
|
||||
if (m) meldeschluesseRaw.push({ line: i, value: m[1].trim() });
|
||||
}
|
||||
|
||||
let altersRaw = extractBlockAfter(['altersklassen', 'altersklasse'], true);
|
||||
if (altersRaw && !Array.isArray(altersRaw)) altersRaw = [altersRaw];
|
||||
const altersklassen = (altersRaw || []).flatMap(l => l.split(/[;,]/)).map(s => s.trim()).filter(Boolean);
|
||||
|
||||
// Wettbewerbe/Konkurrenzen parsen (Block ab "3. Konkurrenzen")
|
||||
const competitions = [];
|
||||
const konkIdx = normLines.findIndex(l => /^\s*3\.?\s+Konkurrenzen/i.test(l) || /^Konkurrenzen\b/i.test(l));
|
||||
// Bestimme Start-Sektionsnummer (z. B. 3 bei "3. Konkurrenzen"), fallback 3
|
||||
const startSectionNum = (() => {
|
||||
if (konkIdx === -1) return 3;
|
||||
const m = normLines[konkIdx].match(/^\s*(\d+)\./);
|
||||
return m ? parseInt(m[1], 10) : 3;
|
||||
})();
|
||||
const nextSectionIdx = () => {
|
||||
for (let i = konkIdx + 1; i < normLines.length; i++) {
|
||||
const m = normLines[i].match(/^\s*(\d+)\.\s+/);
|
||||
if (m) {
|
||||
const num = parseInt(m[1], 10);
|
||||
if (!Number.isNaN(num) && num > startSectionNum) return i;
|
||||
}
|
||||
// Hinweis: Seitenfußzeilen wie "nu.Dokument ..." ignorieren wir, damit mehrseitige Blöcke nicht abbrechen
|
||||
}
|
||||
return normLines.length;
|
||||
};
|
||||
if (konkIdx !== -1) {
|
||||
const endIdx = nextSectionIdx();
|
||||
let i = konkIdx + 1;
|
||||
while (i < endIdx) {
|
||||
const line = normLines[i];
|
||||
if (/^Altersklasse\/Wettbewerb\s*:/i.test(line)) {
|
||||
const comp = {};
|
||||
comp.altersklasseWettbewerb = line.split(':').slice(1).join(':').trim();
|
||||
i++;
|
||||
while (i < endIdx && !/^Altersklasse\/Wettbewerb\s*:/i.test(normLines[i])) {
|
||||
const ln = normLines[i];
|
||||
const m = ln.match(/^([^:]+):\s*(.*)$/);
|
||||
if (m) {
|
||||
const key = m[1].trim().toLowerCase();
|
||||
const val = m[2].trim();
|
||||
if (key.startsWith('leistungsklasse')) comp.leistungsklasse = val;
|
||||
else if (key === 'startzeit') {
|
||||
// Erwartet: 20.09.2025 13:30 Uhr -> wir extrahieren Datum+Zeit
|
||||
const sm = val.match(/(\d{2}\.\d{2}\.\d{4})\s+(\d{1,2}:\d{2})/);
|
||||
comp.startzeit = sm ? `${sm[1]} ${sm[2]}` : val;
|
||||
}
|
||||
else if (key.startsWith('meldeschluss datum')) comp.meldeschlussDatum = val;
|
||||
else if (key.startsWith('meldeschluss online')) comp.meldeschlussOnline = val;
|
||||
else if (key === 'stichtag') comp.stichtag = val;
|
||||
else if (key === 'ttr-relevant') comp.ttrRelevant = val;
|
||||
else if (key === 'offen für') comp.offenFuer = val;
|
||||
else if (key.startsWith('austragungssys. vorrunde')) comp.vorrunde = val;
|
||||
else if (key.startsWith('austragungssys. endrunde')) comp.endrunde = val;
|
||||
else if (key.startsWith('max. teilnehmerzahl')) comp.maxTeilnehmer = val;
|
||||
else if (key === 'startgeld') {
|
||||
comp.startgeld = val;
|
||||
// Versuche auch spezifische Gebühren für diese Altersklasse zu extrahieren
|
||||
const ageClassMatch = comp.altersklasseWettbewerb?.match(/(U\d+|AK\s*\d+)/i);
|
||||
if (ageClassMatch) {
|
||||
const ageClass = ageClassMatch[1].toUpperCase().replace(/\s+/g, '');
|
||||
const feeMatch = val.match(/(\d+(?:[,.]\d+)?)\s*(?:€|Euro|EUR)?/);
|
||||
if (feeMatch) {
|
||||
const amount = feeMatch[1].replace(',', '.');
|
||||
const numericAmount = parseFloat(amount);
|
||||
if (!isNaN(numericAmount)) {
|
||||
comp.entryFeeDetails = {
|
||||
amount: numericAmount,
|
||||
currency: '€',
|
||||
ageClass: ageClass
|
||||
};
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
i++;
|
||||
}
|
||||
competitions.push(comp);
|
||||
continue; // schon auf nächster Zeile
|
||||
}
|
||||
i++;
|
||||
}
|
||||
}
|
||||
|
||||
// Altersklassen-Positionen im Text (zur Zuordnung von Meldeschlüssen)
|
||||
const akPositions = [];
|
||||
for (let i = 0; i < normLines.length; i++) {
|
||||
const l = normLines[i];
|
||||
const m = l.match(/\b(U\d+|AK\s*\d+)\b/i);
|
||||
if (m) akPositions.push({ line: i, ak: m[1].toUpperCase().replace(/\s+/g, '') });
|
||||
}
|
||||
|
||||
const meldeschluesseByAk = {};
|
||||
for (const ms of meldeschluesseRaw) {
|
||||
// Nächste AK im Umkreis von 3 Zeilen suchen
|
||||
let best = null;
|
||||
let bestDist = Infinity;
|
||||
for (const ak of akPositions) {
|
||||
const dist = Math.abs(ak.line - ms.line);
|
||||
if (dist < bestDist && dist <= 3) { best = ak; bestDist = dist; }
|
||||
}
|
||||
if (best) {
|
||||
if (!meldeschluesseByAk[best.ak]) meldeschluesseByAk[best.ak] = new Set();
|
||||
meldeschluesseByAk[best.ak].add(ms.value);
|
||||
}
|
||||
}
|
||||
|
||||
// Dedup global
|
||||
const meldeschluesse = Array.from(new Set(meldeschluesseRaw.map(x => x.value)));
|
||||
// Sets zu Arrays
|
||||
const meldeschluesseByAkOut = Object.fromEntries(Object.entries(meldeschluesseByAk).map(([k,v]) => [k, Array.from(v)]));
|
||||
|
||||
// Vorhandene einfache Personenerkennung (optional, zu Analysezwecken)
|
||||
const entries = [];
|
||||
for (const l of normLines) {
|
||||
const m = l.match(/^([A-Za-zÄÖÜäöüß\-\s']{3,})(?:\s+\((m|w|d)\))?$/i);
|
||||
if (m && /\s/.test(m[1])) {
|
||||
entries.push({ name: m[1].trim(), genderHint: m[2] || null });
|
||||
}
|
||||
}
|
||||
|
||||
// Extrahiere Teilnahmegebühren
|
||||
const entryFees = extractEntryFees();
|
||||
|
||||
return {
|
||||
title,
|
||||
termin,
|
||||
austragungsorte,
|
||||
konkurrenztypen,
|
||||
meldeschluesse,
|
||||
meldeschluesseByAk: meldeschluesseByAkOut,
|
||||
altersklassen,
|
||||
startzeiten: {},
|
||||
competitions,
|
||||
entries,
|
||||
entryFees, // Neue: Teilnahmegebühren pro Spielklasse
|
||||
debug: { normLines },
|
||||
};
|
||||
}
|
||||
|
||||
|
||||
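To make the fee extraction above concrete, here is the same age-class/amount pattern applied to an invented sample line (the input text is hypothetical; the regex is the one used inside `extractEntryFees`):

```js
// Standalone illustration of the entry-fee pattern used in parseTournamentText().
const feeText = 'U12: 5€, U14: 7,50 Euro, AK 40: 10€'; // hypothetical sample line
const feeRegex = /(U\d+|AK\s*\d+)\s*:?\s*(\d+(?:[,.]\d+)?)\s*(?:€|Euro|EUR)?/gi;

const entryFees = {};
for (const m of feeText.matchAll(feeRegex)) {
  const ageClass = m[1].toUpperCase().replace(/\s+/g, ''); // "U12", "U14", "AK40"
  const amount = parseFloat(m[2].replace(',', '.'));       // 5, 7.5, 10
  if (!Number.isNaN(amount)) {
    entryFees[ageClass] = { amount, currency: '€', rawText: m[0] };
  }
}
console.log(entryFees);
// => { U12: {...amount: 5}, U14: {...amount: 7.5}, AK40: {...amount: 10} }
```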
@@ -1,12 +1,13 @@
import Participant from '../models/Participant.js';

import { devLog } from '../utils/logger.js';
export const getParticipants = async (req, res) => {
try {
const { dateId } = req.params;
const participants = await Participant.findAll({ where: { diaryDateId: dateId } });
res.status(200).json(participants);
} catch (error) {
console.log(error);
devLog(error);
res.status(500).json({ error: 'Fehler beim Abrufen der Teilnehmer' });
}
};
@@ -17,7 +18,7 @@ export const addParticipant = async (req, res) => {
const participant = await Participant.create({ diaryDateId, memberId });
res.status(201).json(participant);
} catch (error) {
console.log(error);
devLog(error);
res.status(500).json({ error: 'Fehler beim Hinzufügen des Teilnehmers' });
}
};
@@ -28,7 +29,7 @@ export const removeParticipant = async (req, res) => {
await Participant.destroy({ where: { diaryDateId, memberId } });
res.status(200).json({ message: 'Teilnehmer entfernt' });
} catch (error) {
console.log(error);
devLog(error);
res.status(500).json({ error: 'Fehler beim Entfernen des Teilnehmers' });
}
};

@@ -1,9 +1,12 @@
|
||||
import predefinedActivityService from '../services/predefinedActivityService.js';
|
||||
import PredefinedActivityImage from '../models/PredefinedActivityImage.js';
|
||||
import path from 'path';
|
||||
import fs from 'fs';
|
||||
|
||||
export const createPredefinedActivity = async (req, res) => {
|
||||
try {
|
||||
const { name, description, durationText, duration } = req.body;
|
||||
const predefinedActivity = await predefinedActivityService.createPredefinedActivity({ name, description, durationText, duration });
|
||||
const { name, code, description, durationText, duration, imageLink, drawingData } = req.body;
|
||||
const predefinedActivity = await predefinedActivityService.createPredefinedActivity({ name, code, description, durationText, duration, imageLink, drawingData });
|
||||
res.status(201).json(predefinedActivity);
|
||||
} catch (error) {
|
||||
console.error('[createPredefinedActivity] - Error:', error);
|
||||
@@ -25,10 +28,11 @@ export const getPredefinedActivityById = async (req, res) => {
|
||||
try {
|
||||
const { id } = req.params;
|
||||
const predefinedActivity = await predefinedActivityService.getPredefinedActivityById(id);
|
||||
const images = await PredefinedActivityImage.findAll({ where: { predefinedActivityId: id } });
|
||||
if (!predefinedActivity) {
|
||||
return res.status(404).json({ error: 'Predefined activity not found' });
|
||||
}
|
||||
res.status(200).json(predefinedActivity);
|
||||
res.status(200).json({ ...predefinedActivity.toJSON(), images });
|
||||
} catch (error) {
|
||||
console.error('[getPredefinedActivityById] - Error:', error);
|
||||
res.status(500).json({ error: 'Error fetching predefined activity' });
|
||||
@@ -38,11 +42,43 @@ export const getPredefinedActivityById = async (req, res) => {
|
||||
export const updatePredefinedActivity = async (req, res) => {
|
||||
try {
|
||||
const { id } = req.params;
|
||||
const { name, description, durationText, duration } = req.body;
|
||||
const updatedActivity = await predefinedActivityService.updatePredefinedActivity(id, { name, description, durationText, duration });
|
||||
const { name, code, description, durationText, duration, imageLink, drawingData } = req.body;
|
||||
const updatedActivity = await predefinedActivityService.updatePredefinedActivity(id, { name, code, description, durationText, duration, imageLink, drawingData });
|
||||
res.status(200).json(updatedActivity);
|
||||
} catch (error) {
|
||||
console.error('[updatePredefinedActivity] - Error:', error);
|
||||
res.status(500).json({ error: 'Error updating predefined activity' });
|
||||
}
|
||||
};
|
||||
|
||||
export const searchPredefinedActivities = async (req, res) => {
|
||||
try {
|
||||
const { q, limit } = req.query;
|
||||
const result = await predefinedActivityService.searchPredefinedActivities(q, limit);
|
||||
res.status(200).json(result);
|
||||
} catch (error) {
|
||||
console.error('[searchPredefinedActivities] - Error:', error);
|
||||
res.status(500).json({ error: 'Error searching predefined activities' });
|
||||
}
|
||||
};
|
||||
|
||||
export const mergePredefinedActivities = async (req, res) => {
|
||||
try {
|
||||
const { sourceId, targetId } = req.body;
|
||||
await predefinedActivityService.mergeActivities(sourceId, targetId);
|
||||
res.status(200).json({ ok: true });
|
||||
} catch (error) {
|
||||
console.error('[mergePredefinedActivities] - Error:', error);
|
||||
res.status(500).json({ error: 'Error merging predefined activities' });
|
||||
}
|
||||
};
|
||||
|
||||
export const deduplicatePredefinedActivities = async (req, res) => {
|
||||
try {
|
||||
const result = await predefinedActivityService.deduplicateActivities();
|
||||
res.status(200).json(result);
|
||||
} catch (error) {
|
||||
console.error('[deduplicatePredefinedActivities] - Error:', error);
|
||||
res.status(500).json({ error: 'Error deduplicating predefined activities' });
|
||||
}
|
||||
};
|
||||
|
||||
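The new search, merge, and deduplicate endpoints delegate to `predefinedActivityService`, which is not part of this excerpt. A minimal sketch of what the search lookup might do, assuming a simple name match with an optional result limit (the default limit of 20 is an assumption), could be:

```js
// Illustrative sketch only — the real predefinedActivityService is not shown in this diff.
import { Op } from 'sequelize';
import PredefinedActivity from '../models/PredefinedActivity.js';

export const searchPredefinedActivities = async (q, limit) => {
  return PredefinedActivity.findAll({
    where: q ? { name: { [Op.like]: `%${q}%` } } : {},
    limit: limit ? parseInt(limit, 10) : 20,
    order: [['name', 'ASC']],
  });
};
```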
98
backend/controllers/predefinedActivityImageController.js
Normal file
@@ -0,0 +1,98 @@
|
||||
import PredefinedActivity from '../models/PredefinedActivity.js';
|
||||
import PredefinedActivityImage from '../models/PredefinedActivityImage.js';
|
||||
import { checkGlobalAccess } from '../utils/userUtils.js';
|
||||
import path from 'path';
|
||||
import fs from 'fs';
|
||||
import sharp from 'sharp';
|
||||
|
||||
import { devLog } from '../utils/logger.js';
|
||||
export const uploadPredefinedActivityImage = async (req, res) => {
|
||||
try {
|
||||
const { id } = req.params; // predefinedActivityId
|
||||
const { authcode: userToken } = req.headers;
|
||||
await checkGlobalAccess(userToken); // Predefined Activities sind global, keine Club-Zugriffskontrolle nötig
|
||||
|
||||
const activity = await PredefinedActivity.findByPk(id);
|
||||
if (!activity) {
|
||||
return res.status(404).json({ error: 'Predefined activity not found' });
|
||||
}
|
||||
|
||||
if (!req.file || !req.file.buffer) {
|
||||
return res.status(400).json({ error: 'No image uploaded' });
|
||||
}
|
||||
|
||||
const imagesDir = path.join('images', 'predefined');
|
||||
if (!fs.existsSync(imagesDir)) {
|
||||
fs.mkdirSync(imagesDir, { recursive: true });
|
||||
}
|
||||
|
||||
const fileName = `${id}-${Date.now()}.jpg`;
|
||||
const filePath = path.join(imagesDir, fileName);
|
||||
|
||||
await sharp(req.file.buffer)
|
||||
.resize(800, 800, { fit: 'inside' })
|
||||
.jpeg({ quality: 85 })
|
||||
.toFile(filePath);
|
||||
|
||||
// Extrahiere Zeichnungsdaten aus dem Request
|
||||
const drawingData = req.body.drawingData ? JSON.parse(req.body.drawingData) : null;
|
||||
devLog('[uploadPredefinedActivityImage] - drawingData:', drawingData);
|
||||
|
||||
const imageRecord = await PredefinedActivityImage.create({
|
||||
predefinedActivityId: id,
|
||||
imagePath: filePath,
|
||||
mimeType: 'image/jpeg',
|
||||
drawingData: drawingData ? JSON.stringify(drawingData) : null,
|
||||
});
|
||||
|
||||
// Optional: als imageLink am Activity-Datensatz setzen
|
||||
activity.imageLink = `/api/predefined-activities/${id}/image/${imageRecord.id}`;
|
||||
await activity.save();
|
||||
|
||||
res.status(201).json({ id: imageRecord.id, imageLink: activity.imageLink });
|
||||
} catch (error) {
|
||||
console.error('[uploadPredefinedActivityImage] - Error:', error);
|
||||
res.status(500).json({ error: 'Failed to upload image' });
|
||||
}
|
||||
};
|
||||
|
||||
export const deletePredefinedActivityImage = async (req, res) => {
|
||||
try {
|
||||
const { id, imageId } = req.params; // predefinedActivityId, imageId
|
||||
const { authcode: userToken } = req.headers;
|
||||
await checkGlobalAccess(userToken);
|
||||
|
||||
const activity = await PredefinedActivity.findByPk(id);
|
||||
if (!activity) {
|
||||
return res.status(404).json({ error: 'Predefined activity not found' });
|
||||
}
|
||||
|
||||
const image = await PredefinedActivityImage.findOne({
|
||||
where: { id: imageId, predefinedActivityId: id }
|
||||
});
|
||||
if (!image) {
|
||||
return res.status(404).json({ error: 'Image not found' });
|
||||
}
|
||||
|
||||
// Datei vom Dateisystem löschen
|
||||
if (fs.existsSync(image.imagePath)) {
|
||||
fs.unlinkSync(image.imagePath);
|
||||
}
|
||||
|
||||
// Datensatz aus der Datenbank löschen
|
||||
await image.destroy();
|
||||
|
||||
// Falls das gelöschte Bild der aktuelle imageLink war, diesen zurücksetzen
|
||||
if (activity.imageLink === `/api/predefined-activities/${id}/image/${imageId}`) {
|
||||
activity.imageLink = null;
|
||||
await activity.save();
|
||||
}
|
||||
|
||||
res.status(200).json({ message: 'Image deleted successfully' });
|
||||
} catch (error) {
|
||||
console.error('[deletePredefinedActivityImage] - Error:', error);
|
||||
res.status(500).json({ error: 'Failed to delete image' });
|
||||
}
|
||||
};
|
||||
|
||||
|
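`uploadPredefinedActivityImage` reads `req.file.buffer`, so the route presumably uses an in-memory upload middleware. A hedged sketch of that wiring (route paths and the form field name are assumptions):

```js
// routes/predefinedActivityImageRoutes.js — illustrative sketch; actual paths/field names may differ.
import { Router } from 'express';
import multer from 'multer';
import {
  uploadPredefinedActivityImage,
  deletePredefinedActivityImage,
} from '../controllers/predefinedActivityImageController.js';

const upload = multer({ storage: multer.memoryStorage() }); // keeps the file in req.file.buffer

const router = Router();
router.post('/predefined-activities/:id/image', upload.single('image'), uploadPredefinedActivityImage);
router.delete('/predefined-activities/:id/image/:imageId', deletePredefinedActivityImage);

export default router;
```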
||||
176
backend/controllers/trainingStatsController.js
Normal file
176
backend/controllers/trainingStatsController.js
Normal file
@@ -0,0 +1,176 @@
|
||||
import { DiaryDate, Member, Participant } from '../models/index.js';
|
||||
import { Op } from 'sequelize';
|
||||
|
||||
class TrainingStatsController {
|
||||
async getTrainingStats(req, res) {
|
||||
try {
|
||||
const { clubId } = req.params;
|
||||
|
||||
// Aktuelle Datum für Berechnungen
|
||||
const now = new Date();
|
||||
const twelveMonthsAgo = new Date(now.getFullYear() - 1, now.getMonth(), now.getDate());
|
||||
const threeMonthsAgo = new Date(now.getFullYear(), now.getMonth() - 3, now.getDate());
|
||||
|
||||
// Alle aktiven Mitglieder des spezifischen Vereins laden
|
||||
const members = await Member.findAll({
|
||||
where: {
|
||||
active: true,
|
||||
clubId: parseInt(clubId)
|
||||
}
|
||||
});
|
||||
|
||||
// Anzahl der Trainings im jeweiligen Zeitraum berechnen
|
||||
const trainingsCount12Months = await DiaryDate.count({
|
||||
where: {
|
||||
clubId: parseInt(clubId),
|
||||
date: {
|
||||
[Op.gte]: twelveMonthsAgo
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
const trainingsCount3Months = await DiaryDate.count({
|
||||
where: {
|
||||
clubId: parseInt(clubId),
|
||||
date: {
|
||||
[Op.gte]: threeMonthsAgo
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
const stats = [];
|
||||
|
||||
for (const member of members) {
|
||||
// Trainingsteilnahmen der letzten 12 Monate über Participant-Model
|
||||
const participation12Months = await Participant.count({
|
||||
include: [{
|
||||
model: DiaryDate,
|
||||
as: 'diaryDate',
|
||||
where: {
|
||||
clubId: parseInt(clubId),
|
||||
date: {
|
||||
[Op.gte]: twelveMonthsAgo
|
||||
}
|
||||
}
|
||||
}],
|
||||
where: {
|
||||
memberId: member.id
|
||||
}
|
||||
});
|
||||
|
||||
// Trainingsteilnahmen der letzten 3 Monate über Participant-Model
|
||||
const participation3Months = await Participant.count({
|
||||
include: [{
|
||||
model: DiaryDate,
|
||||
as: 'diaryDate',
|
||||
where: {
|
||||
clubId: parseInt(clubId),
|
||||
date: {
|
||||
[Op.gte]: threeMonthsAgo
|
||||
}
|
||||
}
|
||||
}],
|
||||
where: {
|
||||
memberId: member.id
|
||||
}
|
||||
});
|
||||
|
||||
// Trainingsteilnahmen insgesamt über Participant-Model
|
||||
const participationTotal = await Participant.count({
|
||||
include: [{
|
||||
model: DiaryDate,
|
||||
as: 'diaryDate',
|
||||
where: {
|
||||
clubId: parseInt(clubId)
|
||||
}
|
||||
}],
|
||||
where: {
|
||||
memberId: member.id
|
||||
}
|
||||
});
|
||||
|
||||
// Detaillierte Trainingsdaten (absteigend sortiert) über Participant-Model
|
||||
const trainingDetails = await Participant.findAll({
|
||||
include: [{
|
||||
model: DiaryDate,
|
||||
as: 'diaryDate',
|
||||
where: {
|
||||
clubId: parseInt(clubId)
|
||||
}
|
||||
}],
|
||||
where: {
|
||||
memberId: member.id
|
||||
},
|
||||
order: [['diaryDate', 'date', 'DESC']],
|
||||
limit: 50 // Begrenzen auf die letzten 50 Trainingseinheiten
|
||||
});
|
||||
|
||||
// Trainingsteilnahmen für den Member formatieren
|
||||
const formattedTrainingDetails = trainingDetails.map(participation => ({
|
||||
id: participation.id,
|
||||
date: participation.diaryDate.date,
|
||||
activityName: 'Training',
|
||||
startTime: '--:--',
|
||||
endTime: '--:--'
|
||||
}));
|
||||
|
||||
// Letztes Training
|
||||
const lastTrainingDate = trainingDetails.length ? trainingDetails[0].diaryDate.date : null;
|
||||
const lastTrainingTs = lastTrainingDate ? new Date(lastTrainingDate).getTime() : 0;
|
||||
|
||||
stats.push({
|
||||
id: member.id,
|
||||
firstName: member.firstName,
|
||||
lastName: member.lastName,
|
||||
birthDate: member.birthDate,
|
||||
participation12Months,
|
||||
participation3Months,
|
||||
participationTotal,
|
||||
lastTraining: lastTrainingDate,
|
||||
lastTrainingTs,
|
||||
trainingDetails: formattedTrainingDetails
|
||||
});
|
||||
}
|
||||
|
||||
// Nach Gesamtteilnahme absteigend sortieren
|
||||
stats.sort((a, b) => b.participationTotal - a.participationTotal);
|
||||
|
||||
// Trainingstage mit Teilnehmerzahlen abrufen (letzte 12 Monate, absteigend sortiert)
|
||||
const trainingDays = await DiaryDate.findAll({
|
||||
where: {
|
||||
clubId: parseInt(clubId),
|
||||
date: {
|
||||
[Op.gte]: twelveMonthsAgo
|
||||
}
|
||||
},
|
||||
include: [{
|
||||
model: Participant,
|
||||
as: 'participantList',
|
||||
attributes: ['id']
|
||||
}],
|
||||
order: [['date', 'DESC']]
|
||||
});
|
||||
|
||||
// Formatiere Trainingstage mit Teilnehmerzahl
|
||||
const formattedTrainingDays = trainingDays.map(day => ({
|
||||
id: day.id,
|
||||
date: day.date,
|
||||
participantCount: day.participantList ? day.participantList.length : 0
|
||||
}));
|
||||
|
||||
// Zusätzliche Metadaten mit Trainingsanzahl zurückgeben
|
||||
res.json({
|
||||
members: stats,
|
||||
trainingsCount12Months,
|
||||
trainingsCount3Months,
|
||||
trainingDays: formattedTrainingDays
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
console.error('Fehler beim Laden der Trainings-Statistik:', error);
|
||||
res.status(500).json({ error: 'Fehler beim Laden der Trainings-Statistik' });
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
export default new TrainingStatsController();
|
||||
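The `include` clauses above rely on association aliases (`'diaryDate'` on Participant, `'participantList'` on DiaryDate) that are defined elsewhere, presumably in `models/index.js`, which is not shown here. For orientation, the aliases imply roughly the following setup; treat it as an assumption, not the actual model code:

```js
// Assumed association setup implied by the aliases used in trainingStatsController (paths assumed).
import DiaryDate from './DiaryDate.js';
import Participant from './Participant.js';

Participant.belongsTo(DiaryDate, { foreignKey: 'diaryDateId', as: 'diaryDate' });
DiaryDate.hasMany(Participant, { foreignKey: 'diaryDateId', as: 'participantList' });
```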
@@ -1,14 +1,16 @@
import { Sequelize } from 'sequelize';
import { development } from './config.js';


const sequelize = new Sequelize(
development.database,
development.database,
development.username,
development.password,
{
host: development.host,
dialect: development.dialect,
define: development.define,
logging: false, // SQL-Logging deaktivieren
}
);

@@ -1,18 +1,23 @@
import User from '../models/User.js';
import jwt from 'jsonwebtoken';
import UserToken from '../models/UserToken.js';

export const authenticate = async (req, res, next) => {
  try {
    const { userid: userId, authcode: authCode } = req.headers;
    if (!userId || !authCode) {
      return res.status(401).json({ error: 'Unauthorized: Missing credentials' });
    }
    const user = await User.findOne({ where: { email: userId, authCode: authCode } });
    if (!user) {
      return res.status(401).json({ error: 'Unauthorized: Invalid credentials' });
    }
    next();
  } catch(error) {
    console.log(error);
    return res.status(500).json({ error: 'Internal Server Error at auth' });
  let token = req.headers['authorization']?.split(' ')[1];
  if (!token) {
    token = req.headers['authcode'];
  }
  if (!token) {
    return res.status(401).json({ error: 'Unauthorized: Token fehlt' });
  }
  try {
    const decoded = jwt.verify(token, process.env.JWT_SECRET);
    const tokenRecord = await UserToken.findOne({ where: { token } });
    if (!tokenRecord || tokenRecord.expiresAt < new Date()) {
      return res.status(401).json({ error: 'Unauthorized: Invalid credentials' });
    }
    req.user = { id: decoded.userId };
    next();
  } catch (err) {
    return res.status(401).json({ error: 'Unauthorized: Invalid credentials' });
  }
};
|
||||
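A usage sketch for the token-based middleware above (route path and file location are assumptions): when verification succeeds, `req.user.id` is available to the handler.

```js
// Protect a route with the JWT-based authenticate middleware.
import express from 'express';
import { authenticate } from './middleware/auth.js';   // assumed file location

const router = express.Router();

router.get('/me', authenticate, (req, res) => {
  res.json({ userId: req.user.id });   // set by authenticate after jwt.verify
});

export default router;
```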
@@ -0,0 +1,11 @@
-- Migration: Add drawing_data column to predefined_activity_images table
-- Date: 2025-09-22
-- Description: Adds drawing_data column to store Court Drawing Tool metadata

ALTER TABLE `predefined_activity_images`
  ADD COLUMN `drawing_data` TEXT NULL
  COMMENT 'JSON string containing drawing metadata for Court Drawing Tool'
  AFTER `mime_type`;

-- Verify the column was added
DESCRIBE `predefined_activity_images`;
|
||||
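A sketch of applying this migration through the shared Sequelize instance instead of the mysql CLI (the migration-runner context is an assumption; the SQL is copied verbatim from the script above).

```js
// One-off migration step: add drawing_data via raw SQL.
import sequelize from '../database.js';

await sequelize.query(
  'ALTER TABLE `predefined_activity_images` ' +
  'ADD COLUMN `drawing_data` TEXT NULL ' +
  "COMMENT 'JSON string containing drawing metadata for Court Drawing Tool' " +
  'AFTER `mime_type`;'
);
```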
26
backend/models/DiaryMemberActivity.js
Normal file
@@ -0,0 +1,26 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js';

const DiaryMemberActivity = sequelize.define('DiaryMemberActivity', {
  id: {
    type: DataTypes.INTEGER,
    primaryKey: true,
    autoIncrement: true,
  },
  diaryDateActivityId: {
    type: DataTypes.INTEGER,
    allowNull: false,
  },
  participantId: {
    type: DataTypes.INTEGER,
    allowNull: false,
  },
}, {
  tableName: 'diary_member_activities',
  timestamps: true,
  underscored: true,
});

export default DiaryMemberActivity;
|
||||
|
||||
|
||||
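A usage sketch for the new join model (ids are placeholders): `findOrCreate` avoids linking the same participant to the same activity twice, since the table itself declares no unique index.

```js
// Link a participant to one planned activity of a diary date.
import DiaryMemberActivity from './models/DiaryMemberActivity.js';

const [link, created] = await DiaryMemberActivity.findOrCreate({
  where: { diaryDateActivityId: 17, participantId: 42 },   // ids are placeholders
});
console.log(created ? 'linked' : 'was already linked', link.id);
```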
132
backend/models/ExternalServiceAccount.js
Normal file
@@ -0,0 +1,132 @@
|
||||
import { DataTypes } from 'sequelize';
|
||||
import sequelize from '../database.js';
|
||||
import { encryptData, decryptData } from '../utils/encrypt.js';
|
||||
|
||||
const ExternalServiceAccount = sequelize.define('ExternalServiceAccount', {
|
||||
id: {
|
||||
type: DataTypes.INTEGER,
|
||||
primaryKey: true,
|
||||
autoIncrement: true,
|
||||
allowNull: false
|
||||
},
|
||||
userId: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: false,
|
||||
references: {
|
||||
model: 'user',
|
||||
key: 'id'
|
||||
},
|
||||
onDelete: 'CASCADE'
|
||||
},
|
||||
service: {
|
||||
type: DataTypes.STRING(50),
|
||||
allowNull: false,
|
||||
defaultValue: 'mytischtennis'
|
||||
},
|
||||
email: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: false,
|
||||
},
|
||||
encryptedPassword: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
field: 'encrypted_password'
|
||||
},
|
||||
savePassword: {
|
||||
type: DataTypes.BOOLEAN,
|
||||
defaultValue: false,
|
||||
allowNull: false,
|
||||
field: 'save_password'
|
||||
},
|
||||
accessToken: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
field: 'access_token'
|
||||
},
|
||||
refreshToken: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
field: 'refresh_token'
|
||||
},
|
||||
expiresAt: {
|
||||
type: DataTypes.BIGINT,
|
||||
allowNull: true,
|
||||
field: 'expires_at'
|
||||
},
|
||||
cookie: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true
|
||||
},
|
||||
userData: {
|
||||
type: DataTypes.JSON,
|
||||
allowNull: true,
|
||||
field: 'user_data'
|
||||
},
|
||||
clubId: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
field: 'club_id'
|
||||
},
|
||||
clubName: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
field: 'club_name'
|
||||
},
|
||||
fedNickname: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
field: 'fed_nickname'
|
||||
},
|
||||
lastLoginAttempt: {
|
||||
type: DataTypes.DATE,
|
||||
allowNull: true,
|
||||
field: 'last_login_attempt'
|
||||
},
|
||||
lastLoginSuccess: {
|
||||
type: DataTypes.DATE,
|
||||
allowNull: true,
|
||||
field: 'last_login_success'
|
||||
}
|
||||
}, {
|
||||
underscored: true,
|
||||
tableName: 'external_service_account',
|
||||
timestamps: true,
|
||||
indexes: [
|
||||
{
|
||||
unique: true,
|
||||
fields: ['user_id', 'service']
|
||||
}
|
||||
],
|
||||
hooks: {
|
||||
beforeSave: async (instance) => {
|
||||
// Wenn savePassword false ist, password auf null setzen
|
||||
if (!instance.savePassword) {
|
||||
instance.encryptedPassword = null;
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// Virtuelle Felder für password handling
|
||||
ExternalServiceAccount.prototype.setPassword = function(password) {
|
||||
if (password && this.savePassword) {
|
||||
this.encryptedPassword = encryptData(password);
|
||||
} else {
|
||||
this.encryptedPassword = null;
|
||||
}
|
||||
};
|
||||
|
||||
ExternalServiceAccount.prototype.getPassword = function() {
|
||||
if (this.encryptedPassword) {
|
||||
try {
|
||||
return decryptData(this.encryptedPassword);
|
||||
} catch (error) {
|
||||
console.error('Error decrypting password:', error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
return null;
|
||||
};
|
||||
|
||||
export default ExternalServiceAccount;
|
||||
|
||||
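A sketch of the intended flow for the model above (ids, email and password are placeholders): the unique `(user_id, service)` index keeps one row per user and service, and the `setPassword`/`getPassword` helpers plus the `beforeSave` hook handle encryption.

```js
// Create or reuse the single mytischtennis account row for a user.
import ExternalServiceAccount from './models/ExternalServiceAccount.js';

const [account] = await ExternalServiceAccount.findOrCreate({
  where: { userId: 1, service: 'mytischtennis' },          // ids are placeholders
  defaults: { email: 'player@example.org', savePassword: true },
});

account.setPassword('secret');        // encrypts only while savePassword is true
await account.save();                 // beforeSave nulls the password otherwise
console.log(account.getPassword());   // decrypts, or returns null
```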
@@ -122,6 +122,22 @@ const Member = sequelize.define('Member', {
    allowNull: false,
    default: false,
  },
  gender: {
    type: DataTypes.ENUM('male','female','diverse','unknown'),
    allowNull: true,
    defaultValue: 'unknown'
  },
  ttr: {
    type: DataTypes.INTEGER,
    allowNull: true,
    defaultValue: null
  },
  qttr: {
    type: DataTypes.INTEGER,
    allowNull: true,
    defaultValue: null
  }
}, {
  underscored: true,
  sequelize,
|
||||
|
||||
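A query sketch for the new columns (the 1500 threshold is an arbitrary example): `ttr` and `qttr` are plain nullable integers, so standard Sequelize operators apply.

```js
// List members above a TTR threshold, strongest first; NULL ttr rows drop out.
import { Op } from 'sequelize';
import Member from './models/Member.js';

const ranked = await Member.findAll({
  where: { ttr: { [Op.gte]: 1500 } },
  order: [['ttr', 'DESC']],
  attributes: ['id', 'firstName', 'lastName', 'gender', 'ttr', 'qttr'],
});
```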
132
backend/models/MyTischtennis.js
Normal file
@@ -0,0 +1,132 @@
|
||||
import { DataTypes } from 'sequelize';
|
||||
import sequelize from '../database.js';
|
||||
import { encryptData, decryptData } from '../utils/encrypt.js';
|
||||
|
||||
const ExternalServiceAccount = sequelize.define('ExternalServiceAccount', {
|
||||
id: {
|
||||
type: DataTypes.INTEGER,
|
||||
primaryKey: true,
|
||||
autoIncrement: true,
|
||||
allowNull: false
|
||||
},
|
||||
userId: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: false,
|
||||
references: {
|
||||
model: 'user',
|
||||
key: 'id'
|
||||
},
|
||||
onDelete: 'CASCADE'
|
||||
},
|
||||
service: {
|
||||
type: DataTypes.STRING(50),
|
||||
allowNull: false,
|
||||
defaultValue: 'mytischtennis'
|
||||
},
|
||||
email: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: false,
|
||||
},
|
||||
encryptedPassword: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
field: 'encrypted_password'
|
||||
},
|
||||
savePassword: {
|
||||
type: DataTypes.BOOLEAN,
|
||||
defaultValue: false,
|
||||
allowNull: false,
|
||||
field: 'save_password'
|
||||
},
|
||||
accessToken: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
field: 'access_token'
|
||||
},
|
||||
refreshToken: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
field: 'refresh_token'
|
||||
},
|
||||
expiresAt: {
|
||||
type: DataTypes.BIGINT,
|
||||
allowNull: true,
|
||||
field: 'expires_at'
|
||||
},
|
||||
cookie: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true
|
||||
},
|
||||
userData: {
|
||||
type: DataTypes.JSON,
|
||||
allowNull: true,
|
||||
field: 'user_data'
|
||||
},
|
||||
clubId: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
field: 'club_id'
|
||||
},
|
||||
clubName: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
field: 'club_name'
|
||||
},
|
||||
fedNickname: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
field: 'fed_nickname'
|
||||
},
|
||||
lastLoginAttempt: {
|
||||
type: DataTypes.DATE,
|
||||
allowNull: true,
|
||||
field: 'last_login_attempt'
|
||||
},
|
||||
lastLoginSuccess: {
|
||||
type: DataTypes.DATE,
|
||||
allowNull: true,
|
||||
field: 'last_login_success'
|
||||
}
|
||||
}, {
|
||||
underscored: true,
|
||||
tableName: 'external_service_account',
|
||||
timestamps: true,
|
||||
indexes: [
|
||||
{
|
||||
unique: true,
|
||||
fields: ['user_id', 'service']
|
||||
}
|
||||
],
|
||||
hooks: {
|
||||
beforeSave: async (instance) => {
|
||||
// Wenn savePassword false ist, password auf null setzen
|
||||
if (!instance.savePassword) {
|
||||
instance.encryptedPassword = null;
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// Virtuelle Felder für password handling
|
||||
ExternalServiceAccount.prototype.setPassword = function(password) {
|
||||
if (password && this.savePassword) {
|
||||
this.encryptedPassword = encryptData(password);
|
||||
} else {
|
||||
this.encryptedPassword = null;
|
||||
}
|
||||
};
|
||||
|
||||
ExternalServiceAccount.prototype.getPassword = function() {
|
||||
if (this.encryptedPassword) {
|
||||
try {
|
||||
return decryptData(this.encryptedPassword);
|
||||
} catch (error) {
|
||||
console.error('Error decrypting password:', error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
return null;
|
||||
};
|
||||
|
||||
export default ExternalServiceAccount;
|
||||
|
||||
28
backend/models/OfficialCompetition.js
Normal file
@@ -0,0 +1,28 @@
|
||||
import { DataTypes } from 'sequelize';
|
||||
import sequelize from '../database.js';
|
||||
|
||||
const OfficialCompetition = sequelize.define('OfficialCompetition', {
|
||||
id: { type: DataTypes.INTEGER, primaryKey: true, autoIncrement: true },
|
||||
tournamentId: { type: DataTypes.INTEGER, allowNull: false },
|
||||
// Englische Attributnamen, gemappt auf bestehende DB-Spalten
|
||||
ageClassCompetition: { type: DataTypes.STRING, allowNull: true, field: 'age_class_competition' },
|
||||
performanceClass: { type: DataTypes.STRING, allowNull: true, field: 'performance_class' },
|
||||
startTime: { type: DataTypes.STRING, allowNull: true, field: 'start_time' },
|
||||
registrationDeadlineDate: { type: DataTypes.STRING, allowNull: true, field: 'registration_deadline_date' },
|
||||
registrationDeadlineOnline: { type: DataTypes.STRING, allowNull: true, field: 'registration_deadline_online' },
|
||||
cutoffDate: { type: DataTypes.STRING, allowNull: true, field: 'cutoff_date' },
|
||||
ttrRelevant: { type: DataTypes.STRING, allowNull: true },
|
||||
openTo: { type: DataTypes.STRING, allowNull: true, field: 'open_to' },
|
||||
preliminaryRound: { type: DataTypes.STRING, allowNull: true, field: 'preliminary_round' },
|
||||
finalRound: { type: DataTypes.STRING, allowNull: true, field: 'final_round' },
|
||||
maxParticipants: { type: DataTypes.STRING, allowNull: true, field: 'max_participants' },
|
||||
entryFee: { type: DataTypes.STRING, allowNull: true, field: 'entry_fee' },
|
||||
}, {
|
||||
tableName: 'official_competitions',
|
||||
timestamps: true,
|
||||
underscored: true,
|
||||
});
|
||||
|
||||
export default OfficialCompetition;
|
||||
|
||||
|
||||
25
backend/models/OfficialCompetitionMember.js
Normal file
@@ -0,0 +1,25 @@
|
||||
import { DataTypes } from 'sequelize';
|
||||
import sequelize from '../database.js';
|
||||
|
||||
const OfficialCompetitionMember = sequelize.define('OfficialCompetitionMember', {
|
||||
id: { type: DataTypes.INTEGER, primaryKey: true, autoIncrement: true },
|
||||
tournamentId: { type: DataTypes.INTEGER, allowNull: false },
|
||||
competitionId: { type: DataTypes.INTEGER, allowNull: false },
|
||||
memberId: { type: DataTypes.INTEGER, allowNull: false },
|
||||
wants: { type: DataTypes.BOOLEAN, allowNull: false, defaultValue: false },
|
||||
registered: { type: DataTypes.BOOLEAN, allowNull: false, defaultValue: false },
|
||||
participated: { type: DataTypes.BOOLEAN, allowNull: false, defaultValue: false },
|
||||
placement: { type: DataTypes.STRING, allowNull: true },
|
||||
}, {
|
||||
tableName: 'official_competition_members',
|
||||
timestamps: true,
|
||||
underscored: true,
|
||||
indexes: [
|
||||
{ unique: true, fields: ['competition_id', 'member_id'] },
|
||||
{ fields: ['tournament_id'] },
|
||||
],
|
||||
});
|
||||
|
||||
export default OfficialCompetitionMember;
|
||||
|
||||
|
||||
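A registration sketch (ids are placeholders): the unique `(competition_id, member_id)` index makes `findOrCreate` a safe way to create or reuse a member's entry before flipping its flags.

```js
// Enter a member for one competition of an official tournament.
import OfficialCompetitionMember from './models/OfficialCompetitionMember.js';

const [entry] = await OfficialCompetitionMember.findOrCreate({
  where: { competitionId: 5, memberId: 42 },   // ids are placeholders
  defaults: { tournamentId: 3, wants: true },
});
entry.registered = true;
await entry.save();
```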
23
backend/models/OfficialTournament.js
Normal file
@@ -0,0 +1,23 @@
|
||||
import { DataTypes } from 'sequelize';
|
||||
import sequelize from '../database.js';
|
||||
|
||||
const OfficialTournament = sequelize.define('OfficialTournament', {
|
||||
id: { type: DataTypes.INTEGER, primaryKey: true, autoIncrement: true },
|
||||
clubId: { type: DataTypes.INTEGER, allowNull: false },
|
||||
title: { type: DataTypes.STRING, allowNull: true },
|
||||
eventDate: { type: DataTypes.STRING, allowNull: true },
|
||||
organizer: { type: DataTypes.STRING, allowNull: true },
|
||||
host: { type: DataTypes.STRING, allowNull: true },
|
||||
venues: { type: DataTypes.TEXT, allowNull: true }, // JSON.stringify(Array)
|
||||
competitionTypes: { type: DataTypes.TEXT, allowNull: true }, // JSON.stringify(Array)
|
||||
registrationDeadlines: { type: DataTypes.TEXT, allowNull: true }, // JSON.stringify(Array)
|
||||
entryFees: { type: DataTypes.TEXT, allowNull: true }, // JSON.stringify(Object) - Teilnahmegebühren pro Spielklasse
|
||||
}, {
|
||||
tableName: 'official_tournaments',
|
||||
timestamps: true,
|
||||
underscored: true,
|
||||
});
|
||||
|
||||
export default OfficialTournament;
|
||||
|
||||
|
||||
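A sketch of the read/write convention noted in the comments above: `venues`, `competitionTypes`, `registrationDeadlines` and `entryFees` are TEXT columns holding `JSON.stringify` output, so callers serialize on write and parse on read (sample values are placeholders).

```js
// Store and read back the JSON-encoded list columns of an official tournament.
import OfficialTournament from './models/OfficialTournament.js';

const tournament = await OfficialTournament.create({
  clubId: 1,                                   // id is a placeholder
  title: 'Bezirksmeisterschaften',
  venues: JSON.stringify(['Sporthalle Nord', 'Sporthalle Süd']),
});

const venues = tournament.venues ? JSON.parse(tournament.venues) : [];
```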
@@ -11,10 +11,19 @@ const PredefinedActivity = sequelize.define('PredefinedActivity', {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: false,
|
||||
},
|
||||
code: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
},
|
||||
description: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
},
|
||||
drawingData: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
comment: 'JSON string with metadata for Court Drawing Tool'
|
||||
},
|
||||
durationText: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
@@ -23,6 +32,10 @@ const PredefinedActivity = sequelize.define('PredefinedActivity', {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: true,
|
||||
},
|
||||
imageLink: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
},
|
||||
}, {
|
||||
tableName: 'predefined_activities',
|
||||
timestamps: true,
|
||||
|
||||
35
backend/models/PredefinedActivityImage.js
Normal file
@@ -0,0 +1,35 @@
|
||||
import { DataTypes } from 'sequelize';
|
||||
import sequelize from '../database.js';
|
||||
|
||||
const PredefinedActivityImage = sequelize.define('PredefinedActivityImage', {
|
||||
id: {
|
||||
type: DataTypes.INTEGER,
|
||||
primaryKey: true,
|
||||
autoIncrement: true,
|
||||
},
|
||||
predefinedActivityId: {
|
||||
type: DataTypes.INTEGER,
|
||||
allowNull: false,
|
||||
},
|
||||
imagePath: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: false,
|
||||
},
|
||||
mimeType: {
|
||||
type: DataTypes.STRING,
|
||||
allowNull: true,
|
||||
},
|
||||
drawingData: {
|
||||
type: DataTypes.TEXT,
|
||||
allowNull: true,
|
||||
comment: 'JSON string containing drawing metadata for Court Drawing Tool'
|
||||
},
|
||||
}, {
|
||||
tableName: 'predefined_activity_images',
|
||||
timestamps: true,
|
||||
underscored: true,
|
||||
});
|
||||
|
||||
export default PredefinedActivityImage;
|
||||
|
||||
|
||||
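A sketch mirroring the `drawing_data` migration above (ids, paths and the metadata shape are placeholders): `drawingData` is stored as a JSON string and parsed on read.

```js
// Attach an image with Court Drawing Tool metadata to a predefined activity.
import PredefinedActivityImage from './models/PredefinedActivityImage.js';

const image = await PredefinedActivityImage.create({
  predefinedActivityId: 7,                        // id is a placeholder
  imagePath: '/uploads/activities/7/court.png',   // path is a placeholder
  mimeType: 'image/png',
  drawingData: JSON.stringify({ tool: 'court', shapes: [] }),   // assumed shape
});

const drawing = image.drawingData ? JSON.parse(image.drawingData) : null;
```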
@@ -63,4 +63,8 @@ const User = sequelize.define('User', {
  },
});

User.prototype.validatePassword = function(password) {
  return bcrypt.compare(password, this.password);
};

export default User;
|
||||
|
||||
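A login-check sketch built on the new instance method: `bcrypt.compare` resolves to a boolean, so `validatePassword` must be awaited.

```js
// Return the user on a correct email/password pair, otherwise null.
import User from './models/User.js';

async function checkLogin(email, password) {
  const user = await User.findOne({ where: { email } });
  if (!user) return null;
  const ok = await user.validatePassword(password);
  return ok ? user : null;
}
```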
20
backend/models/UserToken.js
Normal file
@@ -0,0 +1,20 @@
import { DataTypes } from 'sequelize';
import sequelize from '../database.js'; // Korrigierter Pfad

const UserToken = sequelize.define('UserToken', {
  userId: {
    type: DataTypes.INTEGER,
    allowNull: false,
  },
  token: {
    type: DataTypes.STRING,
    allowNull: false,
    unique: true,
  },
  expiresAt: {
    type: DataTypes.DATE,
    allowNull: false,
  },
});

export default UserToken;
|
||||
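A sketch of issuing a token that the `authenticate` middleware can later verify (the 7-day lifetime is an assumption): sign a JWT and persist it with its expiry through the UserToken model above.

```js
// Issue a JWT and record it so the middleware can check token + expiresAt.
import jwt from 'jsonwebtoken';
import UserToken from './models/UserToken.js';

async function issueToken(userId) {
  const token = jwt.sign({ userId }, process.env.JWT_SECRET, { expiresIn: '7d' });
  await UserToken.create({
    userId,
    token,
    expiresAt: new Date(Date.now() + 7 * 24 * 60 * 60 * 1000),
  });
  return token;
}
```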
@@ -13,6 +13,8 @@ import DiaryDateTag from './DiaryDateTag.js';
|
||||
import DiaryMemberNote from './DiaryMemberNote.js';
|
||||
import DiaryMemberTag from './DiaryMemberTag.js';
|
||||
import PredefinedActivity from './PredefinedActivity.js';
|
||||
import DiaryMemberActivity from './DiaryMemberActivity.js';
|
||||
import PredefinedActivityImage from './PredefinedActivityImage.js';
|
||||
import DiaryDateActivity from './DiaryDateActivity.js';
|
||||
import Match from './Match.js';
|
||||
import League from './League.js';
|
||||
@@ -27,6 +29,21 @@ import TournamentMember from './TournamentMember.js';
|
||||
import TournamentMatch from './TournamentMatch.js';
|
||||
import TournamentResult from './TournamentResult.js';
|
||||
import Accident from './Accident.js';
|
||||
import UserToken from './UserToken.js';
|
||||
import OfficialTournament from './OfficialTournament.js';
|
||||
import OfficialCompetition from './OfficialCompetition.js';
|
||||
import OfficialCompetitionMember from './OfficialCompetitionMember.js';
|
||||
import ExternalServiceAccount from './ExternalServiceAccount.js';
|
||||
// Official tournaments relations
|
||||
OfficialTournament.hasMany(OfficialCompetition, { foreignKey: 'tournamentId', as: 'competitions' });
|
||||
OfficialCompetition.belongsTo(OfficialTournament, { foreignKey: 'tournamentId', as: 'tournament' });
|
||||
// Official competition participations
|
||||
OfficialCompetition.hasMany(OfficialCompetitionMember, { foreignKey: 'competitionId', as: 'members' });
|
||||
OfficialCompetitionMember.belongsTo(OfficialCompetition, { foreignKey: 'competitionId', as: 'competition' });
|
||||
OfficialTournament.hasMany(OfficialCompetitionMember, { foreignKey: 'tournamentId', as: 'competitionMembers' });
|
||||
OfficialCompetitionMember.belongsTo(OfficialTournament, { foreignKey: 'tournamentId', as: 'tournament' });
|
||||
Member.hasMany(OfficialCompetitionMember, { foreignKey: 'memberId', as: 'officialCompetitionEntries' });
|
||||
OfficialCompetitionMember.belongsTo(Member, { foreignKey: 'memberId', as: 'member' });
|
||||
|
||||
User.hasMany(Log, { foreignKey: 'userId' });
|
||||
Log.belongsTo(User, { foreignKey: 'userId' });
|
||||
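A query sketch using the aliases declared above (the tournament id is a placeholder, and it assumes `Member` is also exported from `models/index.js`): one eager-loaded call walks from tournament to competitions to the members entered in each.

```js
// Load an official tournament with its competitions and their member entries.
import {
  OfficialTournament,
  OfficialCompetition,
  OfficialCompetitionMember,
  Member,                      // assumed to be part of the export list
} from './models/index.js';

const tournament = await OfficialTournament.findByPk(1, {
  include: [{
    model: OfficialCompetition,
    as: 'competitions',
    include: [{
      model: OfficialCompetitionMember,
      as: 'members',
      include: [{ model: Member, as: 'member' }],
    }],
  }],
});
```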
@@ -40,6 +57,12 @@ Club.hasMany(DiaryDate, { foreignKey: 'clubId' });
|
||||
DiaryDate.belongsToMany(Member, { through: Participant, as: 'participants', foreignKey: 'diaryDateId' });
|
||||
Member.belongsToMany(DiaryDate, { through: Participant, as: 'diaryDates', foreignKey: 'memberId' });
|
||||
|
||||
// Explizite Assoziationen für Participant
|
||||
Participant.belongsTo(DiaryDate, { foreignKey: 'diaryDateId', as: 'diaryDate' });
|
||||
Participant.belongsTo(Member, { foreignKey: 'memberId', as: 'member' });
|
||||
DiaryDate.hasMany(Participant, { foreignKey: 'diaryDateId', as: 'participantList' });
|
||||
Member.hasMany(Participant, { foreignKey: 'memberId', as: 'participantList' });
|
||||
|
||||
DiaryDate.hasMany(Activity, { as: 'activities', foreignKey: 'diaryDateId' });
|
||||
Activity.belongsTo(DiaryDate, { as: 'diaryDate', foreignKey: 'diaryDateId' });
|
||||
|
||||
@@ -69,6 +92,14 @@ DiaryDateActivity.belongsTo(DiaryDate, { foreignKey: 'diaryDateId', as: 'diaryDa
|
||||
|
||||
PredefinedActivity.hasMany(DiaryDateActivity, { foreignKey: 'predefinedActivityId', as: 'predefinedActivities' });
|
||||
DiaryDateActivity.belongsTo(PredefinedActivity, { foreignKey: 'predefinedActivityId', as: 'predefinedActivity' });
|
||||
// DiaryMemberActivity links a Participant to a DiaryDateActivity
|
||||
DiaryMemberActivity.belongsTo(DiaryDateActivity, { foreignKey: 'diaryDateActivityId', as: 'activity' });
|
||||
DiaryDateActivity.hasMany(DiaryMemberActivity, { foreignKey: 'diaryDateActivityId', as: 'activityMembers' });
|
||||
DiaryMemberActivity.belongsTo(Participant, { foreignKey: 'participantId', as: 'participant' });
|
||||
Participant.hasMany(DiaryMemberActivity, { foreignKey: 'participantId', as: 'memberActivities' });
|
||||
// PredefinedActivity Images
|
||||
PredefinedActivity.hasMany(PredefinedActivityImage, { foreignKey: 'predefinedActivityId', as: 'images' });
|
||||
PredefinedActivityImage.belongsTo(PredefinedActivity, { foreignKey: 'predefinedActivityId', as: 'predefinedActivity' });
|
||||
|
||||
Club.hasMany(Match, { foreignKey: 'clubId', as: 'matches' });
|
||||
Match.belongsTo(Club, { foreignKey: 'clubId', as: 'club' });
|
||||
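A sketch following the new activity-participation aliases (the activity id is a placeholder; the named exports are assumed to be available from `models/index.js`): resolve which members took part in one planned activity.

```js
// Walk DiaryDateActivity -> activityMembers -> participant -> member.
import {
  DiaryDateActivity,
  DiaryMemberActivity,
  Participant,                 // assumed to be part of the export list
  Member,                      // assumed to be part of the export list
} from './models/index.js';

const activity = await DiaryDateActivity.findByPk(10, {
  include: [{
    model: DiaryMemberActivity,
    as: 'activityMembers',
    include: [{
      model: Participant,
      as: 'participant',
      include: [{ model: Member, as: 'member' }],
    }],
  }],
});

const names = activity.activityMembers.map(am => am.participant?.member?.lastName);
```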
@@ -174,6 +205,9 @@ Member.hasMany(Accident, { foreignKey: 'memberId', as: 'accidents' });
|
||||
Accident.belongsTo(DiaryDate, { foreignKey: 'diaryDateId', as: 'diaryDates' });
|
||||
DiaryDate.hasMany(Accident, { foreignKey: 'diaryDateId', as: 'accidents' });
|
||||
|
||||
User.hasMany(ExternalServiceAccount, { foreignKey: 'userId', as: 'externalServiceAccounts' });
|
||||
ExternalServiceAccount.belongsTo(User, { foreignKey: 'userId', as: 'user' });
|
||||
|
||||
export {
|
||||
User,
|
||||
Log,
|
||||
@@ -191,6 +225,8 @@ export {
|
||||
DiaryMemberNote,
|
||||
DiaryMemberTag,
|
||||
PredefinedActivity,
|
||||
DiaryMemberActivity,
|
||||
PredefinedActivityImage,
|
||||
DiaryDateActivity,
|
||||
Match,
|
||||
League,
|
||||
@@ -203,4 +239,9 @@ export {
|
||||
TournamentMatch,
|
||||
TournamentResult,
|
||||
Accident,
|
||||
UserToken,
|
||||
OfficialTournament,
|
||||
OfficialCompetition,
|
||||
OfficialCompetitionMember,
|
||||
ExternalServiceAccount,
|
||||
};
|
||||
|
||||
646
backend/node_modules/.package-lock.json
generated
vendored
@@ -4,6 +4,26 @@
|
||||
"lockfileVersion": 3,
|
||||
"requires": true,
|
||||
"packages": {
|
||||
"node_modules/@babel/runtime": {
|
||||
"version": "7.28.4",
|
||||
"resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.28.4.tgz",
|
||||
"integrity": "sha512-Q/N6JNWvIvPnLDvjlE1OUBLPQHH6l3CltCEsHIujp45zQUSSh8K+gHnaEX45yAT1nyngnINhvWtzN+Nb9D8RAQ==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=6.9.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@emnapi/runtime": {
|
||||
"version": "1.2.0",
|
||||
"resolved": "https://registry.npmjs.org/@emnapi/runtime/-/runtime-1.2.0.tgz",
|
||||
"integrity": "sha512-bV21/9LQmcQeCPEg3BDFtvwL6cwiTMksYNWQQ4KOxCZikEGalWtenoZ0wCiukJINlGCIi2KXx01g4FoH/LxpzQ==",
|
||||
"ideallyInert": true,
|
||||
"license": "MIT",
|
||||
"optional": true,
|
||||
"dependencies": {
|
||||
"tslib": "^2.4.0"
|
||||
}
|
||||
},
|
||||
"node_modules/@eslint-community/eslint-utils": {
|
||||
"version": "4.4.0",
|
||||
"resolved": "https://registry.npmjs.org/@eslint-community/eslint-utils/-/eslint-utils-4.4.0.tgz",
|
||||
@@ -180,6 +200,137 @@
|
||||
"url": "https://github.com/sponsors/nzakas"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-darwin-arm64": {
|
||||
"version": "0.33.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-darwin-arm64/-/sharp-darwin-arm64-0.33.5.tgz",
|
||||
"integrity": "sha512-UT4p+iz/2H4twwAoLCqfA9UH5pI6DggwKEGuaPy7nCVQ8ZsiY5PIcrRvD1DzuY3qYL07NtIQcWnBSY/heikIFQ==",
|
||||
"cpu": [
|
||||
"arm64"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "Apache-2.0",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"darwin"
|
||||
],
|
||||
"engines": {
|
||||
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
},
|
||||
"optionalDependencies": {
|
||||
"@img/sharp-libvips-darwin-arm64": "1.0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-darwin-x64": {
|
||||
"version": "0.33.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-darwin-x64/-/sharp-darwin-x64-0.33.5.tgz",
|
||||
"integrity": "sha512-fyHac4jIc1ANYGRDxtiqelIbdWkIuQaI84Mv45KvGRRxSAa7o7d1ZKAOBaYbnepLC1WqxfpimdeWfvqqSGwR2Q==",
|
||||
"cpu": [
|
||||
"x64"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "Apache-2.0",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"darwin"
|
||||
],
|
||||
"engines": {
|
||||
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
},
|
||||
"optionalDependencies": {
|
||||
"@img/sharp-libvips-darwin-x64": "1.0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-libvips-darwin-arm64": {
|
||||
"version": "1.0.4",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-arm64/-/sharp-libvips-darwin-arm64-1.0.4.tgz",
|
||||
"integrity": "sha512-XblONe153h0O2zuFfTAbQYAX2JhYmDHeWikp1LM9Hul9gVPjFY427k6dFEcOL72O01QxQsWi761svJ/ev9xEDg==",
|
||||
"cpu": [
|
||||
"arm64"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "LGPL-3.0-or-later",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"darwin"
|
||||
],
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-libvips-darwin-x64": {
|
||||
"version": "1.0.4",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-darwin-x64/-/sharp-libvips-darwin-x64-1.0.4.tgz",
|
||||
"integrity": "sha512-xnGR8YuZYfJGmWPvmlunFaWJsb9T/AO2ykoP3Fz/0X5XV2aoYBPkX6xqCQvUTKKiLddarLaxpzNe+b1hjeWHAQ==",
|
||||
"cpu": [
|
||||
"x64"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "LGPL-3.0-or-later",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"darwin"
|
||||
],
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-libvips-linux-arm": {
|
||||
"version": "1.0.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm/-/sharp-libvips-linux-arm-1.0.5.tgz",
|
||||
"integrity": "sha512-gvcC4ACAOPRNATg/ov8/MnbxFDJqf/pDePbBnuBDcjsI8PssmjoKMAz4LtLaVi+OnSb5FK/yIOamqDwGmXW32g==",
|
||||
"cpu": [
|
||||
"arm"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "LGPL-3.0-or-later",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"linux"
|
||||
],
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-libvips-linux-arm64": {
|
||||
"version": "1.0.4",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-arm64/-/sharp-libvips-linux-arm64-1.0.4.tgz",
|
||||
"integrity": "sha512-9B+taZ8DlyyqzZQnoeIvDVR/2F4EbMepXMc/NdVbkzsJbzkUjhXv/70GQJ7tdLA4YJgNP25zukcxpX2/SueNrA==",
|
||||
"cpu": [
|
||||
"arm64"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "LGPL-3.0-or-later",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"linux"
|
||||
],
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-libvips-linux-s390x": {
|
||||
"version": "1.0.4",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-s390x/-/sharp-libvips-linux-s390x-1.0.4.tgz",
|
||||
"integrity": "sha512-u7Wz6ntiSSgGSGcjZ55im6uvTrOxSIS8/dgoVMoiGE9I6JAfU50yH5BoDlYA1tcuGS7g/QNtetJnxA6QEsCVTA==",
|
||||
"cpu": [
|
||||
"s390x"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "LGPL-3.0-or-later",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"linux"
|
||||
],
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-libvips-linux-x64": {
|
||||
"version": "1.0.4",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-linux-x64/-/sharp-libvips-linux-x64-1.0.4.tgz",
|
||||
@@ -196,6 +347,23 @@
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-libvips-linuxmusl-arm64": {
|
||||
"version": "1.0.4",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-arm64/-/sharp-libvips-linuxmusl-arm64-1.0.4.tgz",
|
||||
"integrity": "sha512-9Ti+BbTYDcsbp4wfYib8Ctm1ilkugkA/uscUn6UXK1ldpC1JjiXbLfFZtRlBhjPZ5o1NCLiDbg8fhUPKStHoTA==",
|
||||
"cpu": [
|
||||
"arm64"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "LGPL-3.0-or-later",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"linux"
|
||||
],
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-libvips-linuxmusl-x64": {
|
||||
"version": "1.0.4",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-libvips-linuxmusl-x64/-/sharp-libvips-linuxmusl-x64-1.0.4.tgz",
|
||||
@@ -212,6 +380,75 @@
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-linux-arm": {
|
||||
"version": "0.33.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-linux-arm/-/sharp-linux-arm-0.33.5.tgz",
|
||||
"integrity": "sha512-JTS1eldqZbJxjvKaAkxhZmBqPRGmxgu+qFKSInv8moZ2AmT5Yib3EQ1c6gp493HvrvV8QgdOXdyaIBrhvFhBMQ==",
|
||||
"cpu": [
|
||||
"arm"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "Apache-2.0",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"linux"
|
||||
],
|
||||
"engines": {
|
||||
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
},
|
||||
"optionalDependencies": {
|
||||
"@img/sharp-libvips-linux-arm": "1.0.5"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-linux-arm64": {
|
||||
"version": "0.33.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-linux-arm64/-/sharp-linux-arm64-0.33.5.tgz",
|
||||
"integrity": "sha512-JMVv+AMRyGOHtO1RFBiJy/MBsgz0x4AWrT6QoEVVTyh1E39TrCUpTRI7mx9VksGX4awWASxqCYLCV4wBZHAYxA==",
|
||||
"cpu": [
|
||||
"arm64"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "Apache-2.0",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"linux"
|
||||
],
|
||||
"engines": {
|
||||
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
},
|
||||
"optionalDependencies": {
|
||||
"@img/sharp-libvips-linux-arm64": "1.0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-linux-s390x": {
|
||||
"version": "0.33.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-linux-s390x/-/sharp-linux-s390x-0.33.5.tgz",
|
||||
"integrity": "sha512-y/5PCd+mP4CA/sPDKl2961b+C9d+vPAveS33s6Z3zfASk2j5upL6fXVPZi7ztePZ5CuH+1kW8JtvxgbuXHRa4Q==",
|
||||
"cpu": [
|
||||
"s390x"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "Apache-2.0",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"linux"
|
||||
],
|
||||
"engines": {
|
||||
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
},
|
||||
"optionalDependencies": {
|
||||
"@img/sharp-libvips-linux-s390x": "1.0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-linux-x64": {
|
||||
"version": "0.33.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-linux-x64/-/sharp-linux-x64-0.33.5.tgz",
|
||||
@@ -234,6 +471,29 @@
|
||||
"@img/sharp-libvips-linux-x64": "1.0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-linuxmusl-arm64": {
|
||||
"version": "0.33.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-arm64/-/sharp-linuxmusl-arm64-0.33.5.tgz",
|
||||
"integrity": "sha512-XrHMZwGQGvJg2V/oRSUfSAfjfPxO+4DkiRh6p2AFjLQztWUuY/o8Mq0eMQVIY7HJ1CDQUJlxGGZRw1a5bqmd1g==",
|
||||
"cpu": [
|
||||
"arm64"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "Apache-2.0",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"linux"
|
||||
],
|
||||
"engines": {
|
||||
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
},
|
||||
"optionalDependencies": {
|
||||
"@img/sharp-libvips-linuxmusl-arm64": "1.0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-linuxmusl-x64": {
|
||||
"version": "0.33.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-linuxmusl-x64/-/sharp-linuxmusl-x64-0.33.5.tgz",
|
||||
@@ -256,6 +516,66 @@
|
||||
"@img/sharp-libvips-linuxmusl-x64": "1.0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-wasm32": {
|
||||
"version": "0.33.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-wasm32/-/sharp-wasm32-0.33.5.tgz",
|
||||
"integrity": "sha512-ykUW4LVGaMcU9lu9thv85CbRMAwfeadCJHRsg2GmeRa/cJxsVY9Rbd57JcMxBkKHag5U/x7TSBpScF4U8ElVzg==",
|
||||
"cpu": [
|
||||
"wasm32"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "Apache-2.0 AND LGPL-3.0-or-later AND MIT",
|
||||
"optional": true,
|
||||
"dependencies": {
|
||||
"@emnapi/runtime": "^1.2.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-win32-ia32": {
|
||||
"version": "0.33.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-win32-ia32/-/sharp-win32-ia32-0.33.5.tgz",
|
||||
"integrity": "sha512-T36PblLaTwuVJ/zw/LaH0PdZkRz5rd3SmMHX8GSmR7vtNSP5Z6bQkExdSK7xGWyxLw4sUknBuugTelgw2faBbQ==",
|
||||
"cpu": [
|
||||
"ia32"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "Apache-2.0 AND LGPL-3.0-or-later",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"win32"
|
||||
],
|
||||
"engines": {
|
||||
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@img/sharp-win32-x64": {
|
||||
"version": "0.33.5",
|
||||
"resolved": "https://registry.npmjs.org/@img/sharp-win32-x64/-/sharp-win32-x64-0.33.5.tgz",
|
||||
"integrity": "sha512-MpY/o8/8kj+EcnxwvrP4aTJSWw/aZ7JIGR4aBeZkZw5B7/Jn+tY9/VNwtcoGmdT7GfggGIU4kygOMSbYnOrAbg==",
|
||||
"cpu": [
|
||||
"x64"
|
||||
],
|
||||
"ideallyInert": true,
|
||||
"license": "Apache-2.0 AND LGPL-3.0-or-later",
|
||||
"optional": true,
|
||||
"os": [
|
||||
"win32"
|
||||
],
|
||||
"engines": {
|
||||
"node": "^18.17.0 || ^20.3.0 || >=21.0.0"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://opencollective.com/libvips"
|
||||
}
|
||||
},
|
||||
"node_modules/@mapbox/node-pre-gyp": {
|
||||
"version": "1.0.11",
|
||||
"resolved": "https://registry.npmjs.org/@mapbox/node-pre-gyp/-/node-pre-gyp-1.0.11.tgz",
|
||||
@@ -499,6 +819,12 @@
|
||||
"resolved": "https://registry.npmjs.org/array-flatten/-/array-flatten-1.1.1.tgz",
|
||||
"integrity": "sha512-PCVAQswWemu6UdxsDFFX/+gVeYqKAod3D3UVm91jHwynguOwAvYPhx8nNlM++NqRcK6CxxpUafjmhIdKiHibqg=="
|
||||
},
|
||||
"node_modules/asynckit": {
|
||||
"version": "0.4.0",
|
||||
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
|
||||
"integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/aws-ssl-profiles": {
|
||||
"version": "1.1.1",
|
||||
"resolved": "https://registry.npmjs.org/aws-ssl-profiles/-/aws-ssl-profiles-1.1.1.tgz",
|
||||
@@ -507,6 +833,17 @@
|
||||
"node": ">= 6.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/axios": {
|
||||
"version": "1.12.2",
|
||||
"resolved": "https://registry.npmjs.org/axios/-/axios-1.12.2.tgz",
|
||||
"integrity": "sha512-vMJzPewAlRyOgxV2dU0Cuz2O8zzzx9VYtbJOaBgXFeLc4IV/Eg50n4LowmehOOR61S8ZMpc2K5Sa7g6A4jfkUw==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"follow-redirects": "^1.15.6",
|
||||
"form-data": "^4.0.4",
|
||||
"proxy-from-env": "^1.1.0"
|
||||
}
|
||||
},
|
||||
"node_modules/balanced-match": {
|
||||
"version": "1.0.2",
|
||||
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
|
||||
@@ -573,9 +910,10 @@
|
||||
}
|
||||
},
|
||||
"node_modules/brace-expansion": {
|
||||
"version": "1.1.11",
|
||||
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.11.tgz",
|
||||
"integrity": "sha512-iCuPHDFgrHX7H2vEI/5xpz07zSHB00TpugqhmYtVmMO6518mCuRMoOYFldEBl0g187ufozdaHgWKcYFb61qGiA==",
|
||||
"version": "1.1.12",
|
||||
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz",
|
||||
"integrity": "sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"balanced-match": "^1.0.0",
|
||||
"concat-map": "0.0.1"
|
||||
@@ -643,6 +981,19 @@
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/call-bind-apply-helpers": {
|
||||
"version": "1.0.2",
|
||||
"resolved": "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz",
|
||||
"integrity": "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"es-errors": "^1.3.0",
|
||||
"function-bind": "^1.1.2"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/callsites": {
|
||||
"version": "3.1.0",
|
||||
"resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz",
|
||||
@@ -772,6 +1123,18 @@
|
||||
"color-support": "bin.js"
|
||||
}
|
||||
},
|
||||
"node_modules/combined-stream": {
|
||||
"version": "1.0.8",
|
||||
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
|
||||
"integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"delayed-stream": "~1.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.8"
|
||||
}
|
||||
},
|
||||
"node_modules/concat-map": {
|
||||
"version": "0.0.1",
|
||||
"resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
|
||||
@@ -848,9 +1211,10 @@
|
||||
}
|
||||
},
|
||||
"node_modules/cookie": {
|
||||
"version": "0.6.0",
|
||||
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.6.0.tgz",
|
||||
"integrity": "sha512-U71cyTamuh1CRNCfpGY6to28lxvNwPG4Guz/EVjgf3Jmzv0vlDp1atT9eS5dDjMYHucpHbWns6Lwf3BKz6svdw==",
|
||||
"version": "0.7.1",
|
||||
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.1.tgz",
|
||||
"integrity": "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 0.6"
|
||||
}
|
||||
@@ -879,10 +1243,11 @@
|
||||
}
|
||||
},
|
||||
"node_modules/cross-spawn": {
|
||||
"version": "7.0.3",
|
||||
"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.3.tgz",
|
||||
"integrity": "sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==",
|
||||
"version": "7.0.6",
|
||||
"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz",
|
||||
"integrity": "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==",
|
||||
"dev": true,
|
||||
"license": "MIT",
|
||||
"peer": true,
|
||||
"dependencies": {
|
||||
"path-key": "^3.1.0",
|
||||
@@ -913,6 +1278,22 @@
|
||||
"node": ">= 10"
|
||||
}
|
||||
},
|
||||
"node_modules/date-fns": {
|
||||
"version": "2.30.0",
|
||||
"resolved": "https://registry.npmjs.org/date-fns/-/date-fns-2.30.0.tgz",
|
||||
"integrity": "sha512-fnULvOpxnC5/Vg3NCiWelDsLiUc9bRwAPs/+LfTLNvetFCtCTN+yQz15C/fs4AwX1R9K5GLtLfn8QW+dWisaAw==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"@babel/runtime": "^7.21.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=0.11"
|
||||
},
|
||||
"funding": {
|
||||
"type": "opencollective",
|
||||
"url": "https://opencollective.com/date-fns"
|
||||
}
|
||||
},
|
||||
"node_modules/debug": {
|
||||
"version": "2.6.9",
|
||||
"resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
|
||||
@@ -945,6 +1326,15 @@
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/delayed-stream": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
|
||||
"integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=0.4.0"
|
||||
}
|
||||
},
|
||||
"node_modules/delegates": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/delegates/-/delegates-1.0.0.tgz",
|
||||
@@ -1001,6 +1391,20 @@
|
||||
"resolved": "https://registry.npmjs.org/dottie/-/dottie-2.0.6.tgz",
|
||||
"integrity": "sha512-iGCHkfUc5kFekGiqhe8B/mdaurD+lakO9txNnTvKtA6PISrw86LgqHvRzWYPyoE2Ph5aMIrCw9/uko6XHTKCwA=="
|
||||
},
|
||||
"node_modules/dunder-proto": {
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
|
||||
"integrity": "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"call-bind-apply-helpers": "^1.0.1",
|
||||
"es-errors": "^1.3.0",
|
||||
"gopd": "^1.2.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/ecdsa-sig-formatter": {
|
||||
"version": "1.0.11",
|
||||
"resolved": "https://registry.npmjs.org/ecdsa-sig-formatter/-/ecdsa-sig-formatter-1.0.11.tgz",
|
||||
@@ -1028,13 +1432,10 @@
|
||||
}
|
||||
},
|
||||
"node_modules/es-define-property": {
|
||||
"version": "1.0.0",
|
||||
"resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.0.tgz",
|
||||
"integrity": "sha512-jxayLKShrEqqzJ0eumQbVhTYQM27CfT1T35+gCgDFoL82JLsXqTJ76zv6A0YLOgEnLUMvLzsDsGIrl8NFpT2gQ==",
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz",
|
||||
"integrity": "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"get-intrinsic": "^1.2.4"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
@@ -1048,6 +1449,33 @@
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/es-object-atoms": {
|
||||
"version": "1.1.1",
|
||||
"resolved": "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz",
|
||||
"integrity": "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"es-errors": "^1.3.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/es-set-tostringtag": {
|
||||
"version": "2.1.0",
|
||||
"resolved": "https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz",
|
||||
"integrity": "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"es-errors": "^1.3.0",
|
||||
"get-intrinsic": "^1.2.6",
|
||||
"has-tostringtag": "^1.0.2",
|
||||
"hasown": "^2.0.2"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/escape-html": {
|
||||
"version": "1.0.3",
|
||||
"resolved": "https://registry.npmjs.org/escape-html/-/escape-html-1.0.3.tgz",
|
||||
@@ -1265,16 +1693,17 @@
|
||||
}
|
||||
},
|
||||
"node_modules/express": {
|
||||
"version": "4.21.0",
|
||||
"resolved": "https://registry.npmjs.org/express/-/express-4.21.0.tgz",
|
||||
"integrity": "sha512-VqcNGcj/Id5ZT1LZ/cfihi3ttTn+NJmkli2eZADigjq29qTlWi/hAQ43t/VLPq8+UX06FCEx3ByOYet6ZFblng==",
|
||||
"version": "4.21.2",
|
||||
"resolved": "https://registry.npmjs.org/express/-/express-4.21.2.tgz",
|
||||
"integrity": "sha512-28HqgMZAmih1Czt9ny7qr6ek2qddF4FclbMzwhCREB6OFfH+rXAnuNCwo1/wFvrtbgsQDb4kSbX9de9lFbrXnA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"accepts": "~1.3.8",
|
||||
"array-flatten": "1.1.1",
|
||||
"body-parser": "1.20.3",
|
||||
"content-disposition": "0.5.4",
|
||||
"content-type": "~1.0.4",
|
||||
"cookie": "0.6.0",
|
||||
"cookie": "0.7.1",
|
||||
"cookie-signature": "1.0.6",
|
||||
"debug": "2.6.9",
|
||||
"depd": "2.0.0",
|
||||
@@ -1288,7 +1717,7 @@
|
||||
"methods": "~1.1.2",
|
||||
"on-finished": "2.4.1",
|
||||
"parseurl": "~1.3.3",
|
||||
"path-to-regexp": "0.1.10",
|
||||
"path-to-regexp": "0.1.12",
|
||||
"proxy-addr": "~2.0.7",
|
||||
"qs": "6.13.0",
|
||||
"range-parser": "~1.2.1",
|
||||
@@ -1303,6 +1732,10 @@
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.10.0"
|
||||
},
|
||||
"funding": {
|
||||
"type": "opencollective",
|
||||
"url": "https://opencollective.com/express"
|
||||
}
|
||||
},
|
||||
"node_modules/express/node_modules/encodeurl": {
|
||||
@@ -1433,6 +1866,42 @@
|
||||
"dev": true,
|
||||
"peer": true
|
||||
},
|
||||
"node_modules/follow-redirects": {
|
||||
"version": "1.15.11",
|
||||
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.11.tgz",
|
||||
"integrity": "sha512-deG2P0JfjrTxl50XGCDyfI97ZGVCxIpfKYmfyrQ54n5FO/0gfIES8C/Psl6kWVDolizcaaxZJnTS0QSMxvnsBQ==",
|
||||
"funding": [
|
||||
{
|
||||
"type": "individual",
|
||||
"url": "https://github.com/sponsors/RubenVerborgh"
|
||||
}
|
||||
],
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=4.0"
|
||||
},
|
||||
"peerDependenciesMeta": {
|
||||
"debug": {
|
||||
"optional": true
|
||||
}
|
||||
}
|
||||
},
|
||||
"node_modules/form-data": {
|
||||
"version": "4.0.4",
|
||||
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.4.tgz",
|
||||
"integrity": "sha512-KrGhL9Q4zjj0kiUt5OO4Mr/A/jlI2jDYs5eHBpYHPcBEVSiipAvn2Ko2HnPe20rmcuuvMHNdZFp+4IlGTMF0Ow==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"asynckit": "^0.4.0",
|
||||
"combined-stream": "^1.0.8",
|
||||
"es-set-tostringtag": "^2.1.0",
|
||||
"hasown": "^2.0.2",
|
||||
"mime-types": "^2.1.12"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 6"
|
||||
}
|
||||
},
|
||||
"node_modules/forwarded": {
|
||||
"version": "0.2.0",
|
||||
"resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
|
||||
@@ -1477,6 +1946,21 @@
|
||||
"resolved": "https://registry.npmjs.org/fs.realpath/-/fs.realpath-1.0.0.tgz",
|
||||
"integrity": "sha512-OO0pH2lK6a0hZnAdau5ItzHPI6pUlvI7jMVnxUQRtw4owF2wk8lOSabtGDCTP4Ggrg2MbGnWO9X8K1t4+fGMDw=="
|
||||
},
|
||||
"node_modules/fsevents": {
|
||||
"version": "2.3.3",
|
||||
"resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.3.tgz",
|
||||
"integrity": "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==",
|
||||
"dev": true,
|
||||
"hasInstallScript": true,
|
||||
"ideallyInert": true,
|
||||
"optional": true,
|
||||
"os": [
|
||||
"darwin"
|
||||
],
|
||||
"engines": {
|
||||
"node": "^8.16.0 || ^10.6.0 || >=11.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/function-bind": {
|
||||
"version": "1.1.2",
|
||||
"resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
|
||||
@@ -1515,16 +1999,21 @@
|
||||
}
|
||||
},
|
||||
"node_modules/get-intrinsic": {
|
||||
"version": "1.2.4",
|
||||
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.2.4.tgz",
|
||||
"integrity": "sha512-5uYhsJH8VJBTv7oslg4BznJYhDoRI6waYCxMmCdnTrcCrHA/fCFKoTFz2JKKE0HdDFUF7/oQuhzumXJK7paBRQ==",
|
||||
"version": "1.3.0",
|
||||
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
|
||||
"integrity": "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"call-bind-apply-helpers": "^1.0.2",
|
||||
"es-define-property": "^1.0.1",
|
||||
"es-errors": "^1.3.0",
|
||||
"es-object-atoms": "^1.1.1",
|
||||
"function-bind": "^1.1.2",
|
||||
"has-proto": "^1.0.1",
|
||||
"has-symbols": "^1.0.3",
|
||||
"hasown": "^2.0.0"
|
||||
"get-proto": "^1.0.1",
|
||||
"gopd": "^1.2.0",
|
||||
"has-symbols": "^1.1.0",
|
||||
"hasown": "^2.0.2",
|
||||
"math-intrinsics": "^1.1.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
@@ -1533,6 +2022,19 @@
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/get-proto": {
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz",
|
||||
"integrity": "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"dunder-proto": "^1.0.1",
|
||||
"es-object-atoms": "^1.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/glob": {
|
||||
"version": "7.2.3",
|
||||
"resolved": "https://registry.npmjs.org/glob/-/glob-7.2.3.tgz",
|
||||
@@ -1579,12 +2081,12 @@
|
||||
}
|
||||
},
|
||||
"node_modules/gopd": {
|
||||
"version": "1.0.1",
|
||||
"resolved": "https://registry.npmjs.org/gopd/-/gopd-1.0.1.tgz",
|
||||
"integrity": "sha512-d65bNlIadxvpb/A2abVdlqKqV563juRnZ1Wtk6s1sIR8uNsXR70xqIzVqxVf1eTqDunwT2MkczEeaezCKTZhwA==",
|
||||
"version": "1.2.0",
|
||||
"resolved": "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz",
|
||||
"integrity": "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"get-intrinsic": "^1.1.3"
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
@@ -1611,10 +2113,10 @@
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/has-proto": {
|
||||
"version": "1.0.3",
|
||||
"resolved": "https://registry.npmjs.org/has-proto/-/has-proto-1.0.3.tgz",
|
||||
"integrity": "sha512-SJ1amZAJUiZS+PhsVLf5tGydlaVB8EdFpaSO4gmiUKUOxk8qzn5AIy4ZeJUmh22znIdk/uMAUT2pl3FxzVUH+Q==",
|
||||
"node_modules/has-symbols": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz",
|
||||
"integrity": "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
@@ -1623,11 +2125,14 @@
|
||||
"url": "https://github.com/sponsors/ljharb"
|
||||
}
|
||||
},
|
||||
"node_modules/has-symbols": {
|
||||
"version": "1.0.3",
|
||||
"resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.0.3.tgz",
|
||||
"integrity": "sha512-l3LCuF6MgDNwTDKkdYGEihYjt5pRPbEg46rtlmnSPlUbgmB8LOIrKJbYYFBSbnPaJexMKtiPO8hmeRjRz2Td+A==",
|
||||
"node_modules/has-tostringtag": {
|
||||
"version": "1.0.2",
|
||||
"resolved": "https://registry.npmjs.org/has-tostringtag/-/has-tostringtag-1.0.2.tgz",
|
||||
"integrity": "sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"has-symbols": "^1.0.3"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
},
|
||||
@@ -2071,6 +2576,15 @@
|
||||
"semver": "bin/semver.js"
|
||||
}
|
||||
},
|
||||
"node_modules/math-intrinsics": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
|
||||
"integrity": "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 0.4"
|
||||
}
|
||||
},
|
||||
"node_modules/media-typer": {
|
||||
"version": "0.3.0",
|
||||
"resolved": "https://registry.npmjs.org/media-typer/-/media-typer-0.3.0.tgz",
|
||||
@@ -2302,6 +2816,12 @@
|
||||
"resolved": "https://registry.npmjs.org/node-addon-api/-/node-addon-api-5.1.0.tgz",
|
||||
"integrity": "sha512-eh0GgfEkpnoWDq+VY8OyvYhFEzBk6jIYbRKdIlyTiAXIVJ8PyBaKb0rp7oDtoddbdoHWhq8wwr+XZ81F1rpNdA=="
|
||||
},
|
||||
"node_modules/node-ensure": {
|
||||
"version": "0.0.0",
|
||||
"resolved": "https://registry.npmjs.org/node-ensure/-/node-ensure-0.0.0.tgz",
|
||||
"integrity": "sha512-DRI60hzo2oKN1ma0ckc6nQWlHU69RH6xN0sjQTjMpChPfTYvKZdcQFfdYK2RWbJcKyUizSIy/l8OTGxMAM1QDw==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/node-fetch": {
|
||||
"version": "2.7.0",
|
||||
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.7.0.tgz",
|
||||
@@ -2554,9 +3074,37 @@
|
||||
}
|
||||
},
|
||||
"node_modules/path-to-regexp": {
|
||||
"version": "0.1.10",
|
||||
"resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.10.tgz",
|
||||
"integrity": "sha512-7lf7qcQidTku0Gu3YDPc8DJ1q7OOucfa/BSsIwjuh56VU7katFvuM8hULfkwB3Fns/rsVF7PwPKVw1sl5KQS9w==",
|
||||
"version": "0.1.12",
|
||||
"resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.12.tgz",
|
||||
"integrity": "sha512-RA1GjUVMnvYFxuqovrEqZoxxW5NUZqbwKtYz/Tt7nXerk0LbLblQmrsgdeOxV5SFHf0UDggjS/bSeOZwt1pmEQ==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/pdf-parse": {
|
||||
"version": "1.1.1",
|
||||
"resolved": "https://registry.npmjs.org/pdf-parse/-/pdf-parse-1.1.1.tgz",
|
||||
"integrity": "sha512-v6ZJ/efsBpGrGGknjtq9J/oC8tZWq0KWL5vQrk2GlzLEQPUDB1ex+13Rmidl1neNN358Jn9EHZw5y07FFtaC7A==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"debug": "^3.1.0",
|
||||
"node-ensure": "^0.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=6.8.1"
|
||||
}
|
||||
},
|
||||
"node_modules/pdf-parse/node_modules/debug": {
|
||||
"version": "3.2.7",
|
||||
"resolved": "https://registry.npmjs.org/debug/-/debug-3.2.7.tgz",
|
||||
"integrity": "sha512-CFjzYYAi4ThfiQvizrFQevTTXHtnCqWfe7x1AhgEscTz6ZbLbfoLRLPugTQyBth6f8ZERVUSyWHFD/7Wu4t1XQ==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"ms": "^2.1.1"
|
||||
}
|
||||
},
|
||||
"node_modules/pdf-parse/node_modules/ms": {
|
||||
"version": "2.1.3",
|
||||
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
|
||||
"integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/pg-connection-string": {
|
||||
@@ -2604,6 +3152,12 @@
|
||||
"node": ">= 0.10"
|
||||
}
|
||||
},
|
||||
"node_modules/proxy-from-env": {
|
||||
"version": "1.1.0",
|
||||
"resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
|
||||
"integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/pstree.remy": {
|
||||
"version": "1.1.8",
|
||||
"resolved": "https://registry.npmjs.org/pstree.remy/-/pstree.remy-1.1.8.tgz",
|
||||
@@ -3240,6 +3794,14 @@
|
||||
"resolved": "https://registry.npmjs.org/tr46/-/tr46-0.0.3.tgz",
|
||||
"integrity": "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw=="
|
||||
},
|
||||
"node_modules/tslib": {
|
||||
"version": "2.7.0",
|
||||
"resolved": "https://registry.npmjs.org/tslib/-/tslib-2.7.0.tgz",
|
||||
"integrity": "sha512-gLXCKdN1/j47AiHiOkJN69hJmcbGTHI0ImLmbYLHykhgeN0jVGola9yVjFgzCUklsZQMW55o+dW7IXv3RCXDzA==",
|
||||
"ideallyInert": true,
|
||||
"license": "0BSD",
|
||||
"optional": true
|
||||
},
|
||||
"node_modules/type-check": {
|
||||
"version": "0.4.0",
|
||||
"resolved": "https://registry.npmjs.org/type-check/-/type-check-0.4.0.tgz",
|
||||
|
||||
2
backend/node_modules/brace-expansion/index.js
generated
vendored
@@ -109,7 +109,7 @@ function expand(str, isTop) {
|
||||
var isOptions = m.body.indexOf(',') >= 0;
|
||||
if (!isSequence && !isOptions) {
|
||||
// {a},b}
|
||||
if (m.post.match(/,.*\}/)) {
|
||||
if (m.post.match(/,(?!,).*\}/)) {
|
||||
str = m.pre + '{' + m.body + escClose + m.post;
|
||||
return expand(str);
|
||||
}
|
||||
|
||||
5
backend/node_modules/brace-expansion/package.json
generated
vendored
@@ -1,7 +1,7 @@
|
||||
{
|
||||
"name": "brace-expansion",
|
||||
"description": "Brace expansion as known from sh/bash",
|
||||
"version": "1.1.11",
|
||||
"version": "1.1.12",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "git://github.com/juliangruber/brace-expansion.git"
|
||||
@@ -43,5 +43,8 @@
|
||||
"iphone/6.0..latest",
|
||||
"android-browser/4.2..latest"
|
||||
]
|
||||
},
|
||||
"publishConfig": {
|
||||
"tag": "1.x"
|
||||
}
|
||||
}
|
||||
|
||||
147
backend/node_modules/cookie/HISTORY.md
generated
vendored
@@ -1,147 +0,0 @@
|
||||
0.6.0 / 2023-11-06
|
||||
==================
|
||||
|
||||
* Add `partitioned` option
|
||||
|
||||
0.5.0 / 2022-04-11
|
||||
==================
|
||||
|
||||
* Add `priority` option
|
||||
* Fix `expires` option to reject invalid dates
|
||||
* perf: improve default decode speed
|
||||
* perf: remove slow string split in parse
|
||||
|
||||
0.4.2 / 2022-02-02
|
||||
==================
|
||||
|
||||
* perf: read value only when assigning in parse
|
||||
* perf: remove unnecessary regexp in parse
|
||||
|
||||
0.4.1 / 2020-04-21
|
||||
==================
|
||||
|
||||
* Fix `maxAge` option to reject invalid values
|
||||
|
||||
0.4.0 / 2019-05-15
|
||||
==================
|
||||
|
||||
* Add `SameSite=None` support
|
||||
|
||||
0.3.1 / 2016-05-26
|
||||
==================
|
||||
|
||||
* Fix `sameSite: true` to work with draft-7 clients
|
||||
- `true` now sends `SameSite=Strict` instead of `SameSite`
|
||||
|
||||
0.3.0 / 2016-05-26
|
||||
==================
|
||||
|
||||
* Add `sameSite` option
|
||||
- Replaces `firstPartyOnly` option, never implemented by browsers
|
||||
* Improve error message when `encode` is not a function
|
||||
* Improve error message when `expires` is not a `Date`
|
||||
|
||||
0.2.4 / 2016-05-20
|
||||
==================
|
||||
|
||||
* perf: enable strict mode
|
||||
* perf: use for loop in parse
|
||||
* perf: use string concatenation for serialization
|
||||
|
||||
0.2.3 / 2015-10-25
|
||||
==================
|
||||
|
||||
* Fix cookie `Max-Age` to never be a floating point number
|
||||
|
||||
0.2.2 / 2015-09-17
|
||||
==================
|
||||
|
||||
* Fix regression when setting empty cookie value
|
||||
- Ease the new restriction, which is just basic header-level validation
|
||||
* Fix typo in invalid value errors
|
||||
|
||||
0.2.1 / 2015-09-17
|
||||
==================
|
||||
|
||||
* Throw on invalid values provided to `serialize`
|
||||
- Ensures the resulting string is a valid HTTP header value
|
||||
|
||||
0.2.0 / 2015-08-13
|
||||
==================
|
||||
|
||||
* Add `firstPartyOnly` option
|
||||
* Throw better error for invalid argument to parse
|
||||
* perf: hoist regular expression
|
||||
|
||||
0.1.5 / 2015-09-17
|
||||
==================
|
||||
|
||||
* Fix regression when setting empty cookie value
|
||||
- Ease the new restriction, which is just basic header-level validation
|
||||
* Fix typo in invalid value errors
|
||||
|
||||
0.1.4 / 2015-09-17
|
||||
==================
|
||||
|
||||
* Throw better error for invalid argument to parse
|
||||
* Throw on invalid values provided to `serialize`
|
||||
- Ensures the resulting string is a valid HTTP header value
|
||||
|
||||
0.1.3 / 2015-05-19
|
||||
==================
|
||||
|
||||
* Reduce the scope of try-catch deopt
|
||||
* Remove argument reassignments
|
||||
|
||||
0.1.2 / 2014-04-16
|
||||
==================
|
||||
|
||||
* Remove unnecessary files from npm package
|
||||
|
||||
0.1.1 / 2014-02-23
|
||||
==================
|
||||
|
||||
* Fix bad parse when cookie value contained a comma
|
||||
* Fix support for `maxAge` of `0`
|
||||
|
||||
0.1.0 / 2013-05-01
|
||||
==================
|
||||
|
||||
* Add `decode` option
|
||||
* Add `encode` option
|
||||
|
||||
0.0.6 / 2013-04-08
|
||||
==================
|
||||
|
||||
* Ignore cookie parts missing `=`
|
||||
|
||||
0.0.5 / 2012-10-29
|
||||
==================
|
||||
|
||||
* Return raw cookie value if value unescape errors
|
||||
|
||||
0.0.4 / 2012-06-21
|
||||
==================
|
||||
|
||||
* Use encode/decodeURIComponent for cookie encoding/decoding
|
||||
- Improve server/client interoperability
|
||||
|
||||
0.0.3 / 2012-06-06
|
||||
==================
|
||||
|
||||
* Only escape special characters per the cookie RFC
|
||||
|
||||
0.0.2 / 2012-06-01
|
||||
==================
|
||||
|
||||
* Fix `maxAge` option to not throw error
|
||||
|
||||
0.0.1 / 2012-05-28
|
||||
==================
|
||||
|
||||
* Add more tests
|
||||
|
||||
0.0.0 / 2012-05-28
|
||||
==================
|
||||
|
||||
* Initial release
|
||||
174 backend/node_modules/cookie/index.js (generated, vendored)
Vendored rewrite of parse() and serialize(). The single field-content regexp is replaced by dedicated RFC 6265 regexps for cookie-name, cookie-value, domain-value and path-value. parse() now walks the string with startIndex()/endIndex() helpers instead of String.prototype.trim and assigns each key only once via hasOwnProperty. serialize() validates the name against cookieNameRegExp and the encoded value against cookieValueRegExp, floors maxAge with Math.floor and rejects non-finite values, and checks domain and path with the new regexps; the internal encode() wrapper is dropped in favour of encodeURIComponent and isDate() no longer falls back to instanceof Date.
6 backend/node_modules/cookie/package.json (generated, vendored)
Vendored update: version 0.6.0 -> 0.7.1; adds "main": "index.js"; removes the "version" script that regenerated HISTORY.md.
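A minimal sketch of the behaviour change visible in the cookie diff above, assuming the vendored 0.7.x build: names are now checked against the RFC 6265 token grammar, so a name the old field-content check tolerated is rejected. The inputs are illustrative.

```js
// Sketch of the stricter serialize() validation in the vendored cookie 0.7.x
// (inferred from the diff above; inputs are illustrative).
var cookie = require('cookie');

// Valid token name and cookie-octet value: serialized as before.
console.log(cookie.serialize('session', 'abc123', { maxAge: 60.9 }));
// maxAge is floored, so the header carries "Max-Age=60".

// '=' is not a token character, so cookieNameRegExp rejects the name.
try {
  cookie.serialize('bad=name', 'value');
} catch (err) {
  console.log(err.message); // "argument name is invalid"
}
```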
14 backend/node_modules/es-define-property/CHANGELOG.md (generated, vendored)
Vendored update: adds the v1.0.1 (2024-12-06) changelog entry (shared tsconfig, refactor away from get-intrinsic, aud replaced by npm audit, dev-dependency bumps).
4 backend/node_modules/es-define-property/index.js (generated, vendored)
Vendored update: drops the get-intrinsic require; $defineProperty now comes from Object.defineProperty || false directly, with the try/catch probe of $defineProperty({}, 'a', { value: 1 }) unchanged.
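A minimal consumption sketch for this shim (not part of the diff): it exports Object.defineProperty when it works and false otherwise, so callers guard before using it, mirroring the pattern in the package's own test further below.

```js
// es-define-property exports a working Object.defineProperty, or false.
var $defineProperty = require('es-define-property');

var target = { a: 1 };
if ($defineProperty) {
  // Safe to define non-default property descriptors.
  $defineProperty(target, 'b', { enumerable: true, value: 2 });
}
console.log(target.b); // 2 on any modern engine
```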
24 backend/node_modules/es-define-property/package.json (generated, vendored)
Vendored update: version 1.0.0 -> 1.0.1; the get-intrinsic dependency block is removed; posttest switches from aud to "npx npm@'>= 10.2' audit --production"; dev dependencies are reshuffled (TypeScript tooling added, aud dropped, several versions bumped).
1 backend/node_modules/es-define-property/test/index.js (generated, vendored)
Vendored update: adds an `if ($defineProperty && gOPD)` guard around the defineProperty assertions (source comment: "this `if` check is just to shut TS up").
44 backend/node_modules/es-define-property/tsconfig.json (generated, vendored)
Vendored update: the long inline compiler configuration is replaced by "extends": "@ljharb/tsconfig", keeping only "target": "es2022" in compilerOptions; the "exclude" list is unchanged.
14 backend/node_modules/express/History.md (generated, vendored)
Vendored update: adds changelog entries for 4.21.2 / 2024-11-06 (path-to-regexp 0.1.12 and 0.1.11: backtracking-protection fix, error on invalid path values) and 4.21.1 / 2024-10-08 (backported fix for CVE-2024-47764).
10 backend/node_modules/express/package.json (generated, vendored)
Vendored update: version 4.21.0 -> 4.21.2; adds the opencollective "funding" block; bumps cookie from 0.6.0 to 0.7.1 and path-to-regexp from 0.1.10 to 0.1.12.
4 backend/node_modules/get-intrinsic/.eslintrc (generated, vendored)
Vendored update: adds "globals": { "Float16Array": false } to the ESLint config.
43 backend/node_modules/get-intrinsic/CHANGELOG.md (generated, vendored)
Vendored update: adds changelog entries v1.2.5 (2024-12-06), v1.2.6 (2024-12-11), v1.2.7 (2025-01-02) and v1.3.0 (2025-02-22), covering the move to call-bind-apply-helpers, math-intrinsics, es-object-atoms and get-proto plus the new Float16Array intrinsic.
61 backend/node_modules/get-intrinsic/index.js (generated, vendored)
Vendored update: the hand-rolled helpers are replaced by small packages (es-object-atoms for %Object%, gopd for getOwnPropertyDescriptor, es-define-property, get-proto, call-bind-apply-helpers, math-intrinsics). The INTRINSICS table gains %Float16Array%, %Function.prototype.call%, %Function.prototype.apply%, %Object.defineProperty%, %Object.getPrototypeOf%, %Reflect.getPrototypeOf% and the %Math.abs/floor/max/min/pow/round/sign% entries, and the bound $concat/$spliceApply/$replace/$strSlice/$exec helpers now reuse $call and $apply.
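A minimal sketch of how the intrinsics added above are looked up; GetIntrinsic(name, allowMissing) is the package's existing public API, and the names used here come from the INTRINSICS additions in this diff.

```js
// Looking up intrinsics that this update adds to the INTRINSICS table.
var GetIntrinsic = require('get-intrinsic');

var $floor = GetIntrinsic('%Math.floor%'); // now backed by math-intrinsics
console.log($floor(2.9)); // 2

// %Float16Array% may not exist on older engines; pass allowMissing = true
// so GetIntrinsic returns undefined instead of throwing.
var $Float16Array = GetIntrinsic('%Float16Array%', true);
console.log(typeof $Float16Array); // 'function' or 'undefined'
```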
44 backend/node_modules/get-intrinsic/package.json (generated, vendored)
Vendored update: version 1.2.4 -> 1.3.0; the dependency list is rebuilt (adds call-bind-apply-helpers, es-define-property, es-object-atoms, get-proto, gopd and math-intrinsics; keeps es-errors, function-bind, has-symbols and hasown; drops has-proto); posttest switches from aud to npm audit; dev dependencies are bumped.
4 backend/node_modules/get-intrinsic/test/GetIntrinsic.js (generated, vendored)
Vendored update: the test now requires call-bound instead of call-bind/callBound and es-abstract/2023/DefinePropertyOrThrow instead of the 2021 edition.
20 backend/node_modules/gopd/CHANGELOG.md (generated, vendored)
Vendored update: adds changelog entries v1.1.0 (2024-11-29, types added) and v1.2.0 (2024-12-03, new gOPD entry point, get-intrinsic removed).
5 backend/node_modules/gopd/index.js (generated, vendored)
Vendored update: drops the get-intrinsic require; $gOPD is now loaded from the package's own ./gOPD module, keeping the existing try/catch feature test.
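A minimal consumption sketch for the updated gopd entry points: the main export is unchanged for callers, while gopd/gOPD is the raw entry point added in this release (its behaviour is assumed from the "./gOPD" export added in package.json below).

```js
// Main entry: Object.getOwnPropertyDescriptor, or a falsy value where it
// is broken, now wired up without going through get-intrinsic.
var gOPD = require('gopd');

if (gOPD) {
  console.log(gOPD({ answer: 42 }, 'answer'));
  // { value: 42, writable: true, enumerable: true, configurable: true }
}

// New "./gOPD" entry point (see the export added in package.json below):
// assumed to be the descriptor getter without the IE 8 feature test.
var rawGOPD = require('gopd/gOPD');
console.log(typeof rawGOPD); // 'function' on any modern engine
```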
26 backend/node_modules/gopd/package.json (generated, vendored)
Vendored update: version 1.0.1 -> 1.2.0; adds the "./gOPD" export; removes the get-intrinsic dependency; adds a prelint "tsc -p . && attw -P" step; posttest switches from aud to npm audit; TypeScript-related dev dependencies are added and others bumped; adds "engines": { "node": ">= 0.4" }.
3 backend/node_modules/gopd/test/index.js (generated, vendored)
Vendored update: adds a @ts-expect-error comment before gOPD(obj, 'x') and changes the "not supported" skip condition from `skip: gOPD` to `skip: !!gOPD`.
5 backend/node_modules/has-proto/.eslintrc (generated, vendored): file removed
12 backend/node_modules/has-proto/.github/FUNDING.yml (generated, vendored): file removed
38 backend/node_modules/has-proto/CHANGELOG.md (generated, vendored): file removed
21 backend/node_modules/has-proto/LICENSE (generated, vendored): file removed
38 backend/node_modules/has-proto/README.md (generated, vendored): file removed
3 backend/node_modules/has-proto/index.d.ts (generated, vendored): file removed
15 backend/node_modules/has-proto/index.js (generated, vendored): file removed
78 backend/node_modules/has-proto/package.json (generated, vendored): file removed
19 backend/node_modules/has-proto/test/index.js (generated, vendored): file removed
49 backend/node_modules/has-proto/tsconfig.json (generated, vendored): file removed
The entire vendored has-proto package (v1.0.3) is deleted; the updated get-intrinsic above no longer lists it as a dependency.
16 backend/node_modules/has-symbols/CHANGELOG.md generated vendored
@@ -5,6 +5,22 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/)
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [v1.1.0](https://github.com/inspect-js/has-symbols/compare/v1.0.3...v1.1.0) - 2024-12-02

### Commits

- [actions] update workflows [`548c0bf`](https://github.com/inspect-js/has-symbols/commit/548c0bf8c9b1235458df7a1c0490b0064647a282)
- [actions] further shard; update action deps [`bec56bb`](https://github.com/inspect-js/has-symbols/commit/bec56bb0fb44b43a786686b944875a3175cf3ff3)
- [meta] use `npmignore` to autogenerate an npmignore file [`ac81032`](https://github.com/inspect-js/has-symbols/commit/ac81032809157e0a079e5264e9ce9b6f1275777e)
- [New] add types [`6469cbf`](https://github.com/inspect-js/has-symbols/commit/6469cbff1866cfe367b2b3d181d9296ec14b2a3d)
- [actions] update rebase action to use reusable workflow [`9c9d4d0`](https://github.com/inspect-js/has-symbols/commit/9c9d4d0d8938e4b267acdf8e421f4e92d1716d72)
- [Dev Deps] update `eslint`, `@ljharb/eslint-config`, `aud`, `tape` [`adb5887`](https://github.com/inspect-js/has-symbols/commit/adb5887ca9444849b08beb5caaa9e1d42320cdfb)
- [Dev Deps] update `@ljharb/eslint-config`, `aud`, `tape` [`13ec198`](https://github.com/inspect-js/has-symbols/commit/13ec198ec80f1993a87710af1606a1970b22c7cb)
- [Dev Deps] update `auto-changelog`, `core-js`, `tape` [`941be52`](https://github.com/inspect-js/has-symbols/commit/941be5248387cab1da72509b22acf3fdb223f057)
- [Tests] replace `aud` with `npm audit` [`74f49e9`](https://github.com/inspect-js/has-symbols/commit/74f49e9a9d17a443020784234a1c53ce765b3559)
- [Dev Deps] update `npmignore` [`9c0ac04`](https://github.com/inspect-js/has-symbols/commit/9c0ac0452a834f4c2a4b54044f2d6a89f17e9a70)
- [Dev Deps] add missing peer dep [`52337a5`](https://github.com/inspect-js/has-symbols/commit/52337a5621cced61f846f2afdab7707a8132cc12)

## [v1.0.3](https://github.com/inspect-js/has-symbols/compare/v1.0.2...v1.0.3) - 2022-03-01

### Commits

1 backend/node_modules/has-symbols/index.js generated vendored
@@ -3,6 +3,7 @@
var origSymbol = typeof Symbol !== 'undefined' && Symbol;
var hasSymbolSham = require('./shams');

/** @type {import('.')} */
module.exports = function hasNativeSymbols() {
if (typeof origSymbol !== 'function') { return false; }
if (typeof Symbol !== 'function') { return false; }

28 backend/node_modules/has-symbols/package.json generated vendored
@@ -1,21 +1,23 @@
{
"name": "has-symbols",
"version": "1.0.3",
"version": "1.1.0",
"description": "Determine if the JS environment has Symbol support. Supports spec, or shams.",
"main": "index.js",
"scripts": {
"prepack": "npmignore --auto --commentLines=autogenerated",
"prepublishOnly": "safe-publish-latest",
"prepublish": "not-in-publish || npm run prepublishOnly",
"pretest": "npm run --silent lint",
"test": "npm run tests-only",
"posttest": "aud --production",
"tests-only": "npm run test:stock && npm run test:staging && npm run test:shams",
"posttest": "npx npm@'>=10.2' audit --production",
"tests-only": "npm run test:stock && npm run test:shams",
"test:stock": "nyc node test",
"test:staging": "nyc node --harmony --es-staging test",
"test:shams": "npm run --silent test:shams:getownpropertysymbols && npm run --silent test:shams:corejs",
"test:shams:corejs": "nyc node test/shams/core-js.js",
"test:shams:getownpropertysymbols": "nyc node test/shams/get-own-property-symbols.js",
"lint": "eslint --ext=js,mjs .",
"postlint": "tsc -p . && attw -P",
"version": "auto-changelog && git add CHANGELOG.md",
"postversion": "auto-changelog && git add CHANGELOG.md && git commit --no-edit --amend && git tag -f \"v$(node -e \"console.log(require('./package.json').version)\")\""
},
@@ -54,15 +56,22 @@
},
"homepage": "https://github.com/ljharb/has-symbols#readme",
"devDependencies": {
"@ljharb/eslint-config": "^20.2.3",
"aud": "^2.0.0",
"auto-changelog": "^2.4.0",
"@arethetypeswrong/cli": "^0.17.0",
"@ljharb/eslint-config": "^21.1.1",
"@ljharb/tsconfig": "^0.2.0",
"@types/core-js": "^2.5.8",
"@types/tape": "^5.6.5",
"auto-changelog": "^2.5.0",
"core-js": "^2.6.12",
"encoding": "^0.1.13",
"eslint": "=8.8.0",
"get-own-property-symbols": "^0.9.5",
"in-publish": "^2.0.1",
"npmignore": "^0.3.1",
"nyc": "^10.3.2",
"safe-publish-latest": "^2.0.0",
"tape": "^5.5.2"
"tape": "^5.9.0",
"typescript": "next"
},
"testling": {
"files": "test/index.js",
@@ -93,9 +102,10 @@
"backfillLimit": false,
"hideCredit": true
},
"greenkeeper": {
"publishConfig": {
"ignore": [
"core-js"
".github/workflows",
"types"
]
}
}

7 backend/node_modules/has-symbols/shams.js generated vendored
@@ -1,10 +1,12 @@
'use strict';

/** @type {import('./shams')} */
/* eslint complexity: [2, 18], max-statements: [2, 33] */
module.exports = function hasSymbols() {
if (typeof Symbol !== 'function' || typeof Object.getOwnPropertySymbols !== 'function') { return false; }
if (typeof Symbol.iterator === 'symbol') { return true; }

/** @type {{ [k in symbol]?: unknown }} */
var obj = {};
var sym = Symbol('test');
var symObj = Object(sym);
@@ -23,7 +25,7 @@ module.exports = function hasSymbols() {

var symVal = 42;
obj[sym] = symVal;
for (sym in obj) { return false; } // eslint-disable-line no-restricted-syntax, no-unreachable-loop
for (var _ in obj) { return false; } // eslint-disable-line no-restricted-syntax, no-unreachable-loop
if (typeof Object.keys === 'function' && Object.keys(obj).length !== 0) { return false; }

if (typeof Object.getOwnPropertyNames === 'function' && Object.getOwnPropertyNames(obj).length !== 0) { return false; }
@@ -34,7 +36,8 @@ module.exports = function hasSymbols() {
if (!Object.prototype.propertyIsEnumerable.call(obj, sym)) { return false; }

if (typeof Object.getOwnPropertyDescriptor === 'function') {
var descriptor = Object.getOwnPropertyDescriptor(obj, sym);
// eslint-disable-next-line no-extra-parens
var descriptor = /** @type {PropertyDescriptor} */ (Object.getOwnPropertyDescriptor(obj, sym));
if (descriptor.value !== symVal || descriptor.enumerable !== true) { return false; }
}

1 backend/node_modules/has-symbols/test/shams/core-js.js generated vendored
@@ -8,6 +8,7 @@ if (typeof Symbol === 'function' && typeof Symbol() === 'symbol') {
t.equal(typeof Symbol(), 'symbol');
t.end();
});
// @ts-expect-error TS is stupid and doesn't know about top level return
return;
}

1 backend/node_modules/has-symbols/test/shams/get-own-property-symbols.js generated vendored
@@ -8,6 +8,7 @@ if (typeof Symbol === 'function' && typeof Symbol() === 'symbol') {
t.equal(typeof Symbol(), 'symbol');
t.end();
});
// @ts-expect-error TS is stupid and doesn't know about top level return
return;
}

6 backend/node_modules/has-symbols/test/tests.js generated vendored
@@ -1,5 +1,6 @@
'use strict';

/** @type {(t: import('tape').Test) => false | void} */
// eslint-disable-next-line consistent-return
module.exports = function runSymbolTests(t) {
t.equal(typeof Symbol, 'function', 'global Symbol is a function');
@@ -31,6 +32,7 @@ module.exports = function runSymbolTests(t) {

t.equal(typeof Object.getOwnPropertySymbols, 'function', 'Object.getOwnPropertySymbols is a function');

/** @type {{ [k in symbol]?: unknown }} */
var obj = {};
var sym = Symbol('test');
var symObj = Object(sym);
@@ -40,8 +42,8 @@ module.exports = function runSymbolTests(t) {

var symVal = 42;
obj[sym] = symVal;
// eslint-disable-next-line no-restricted-syntax
for (sym in obj) { t.fail('symbol property key was found in for..in of object'); }
// eslint-disable-next-line no-restricted-syntax, no-unused-vars
for (var _ in obj) { t.fail('symbol property key was found in for..in of object'); }

t.deepEqual(Object.keys(obj), [], 'no enumerable own keys on symbol-valued object');
t.deepEqual(Object.getOwnPropertyNames(obj), [], 'no own names on symbol-valued object');

16 backend/node_modules/path-to-regexp/index.js generated vendored
@@ -65,23 +65,33 @@ function pathToRegexp(path, keys, options) {
return new RegExp(path.join('|'), flags);
}

if (typeof path !== 'string') {
throw new TypeError('path must be a string, array of strings, or regular expression');
}

path = path.replace(
/\\.|(\/)?(\.)?:(\w+)(\(.*?\))?(\*)?(\?)?|[.*]|\/\(/g,
function (match, slash, format, key, capture, star, optional, offset) {
pos = offset + match.length;

if (match[0] === '\\') {
backtrack += match;
pos += 2;
return match;
}

if (match === '.') {
backtrack += '\\.';
extraOffset += 1;
pos += 1;
return '\\.';
}

backtrack = slash || format ? '' : path.slice(pos, offset);
if (slash || format) {
backtrack = '';
} else {
backtrack += path.slice(pos, offset);
}

pos = offset + match.length;

if (match === '*') {
extraOffset += 3;

2 backend/node_modules/path-to-regexp/package.json generated vendored
@@ -1,7 +1,7 @@
{
"name": "path-to-regexp",
"description": "Express style path to RegExp utility",
"version": "0.1.10",
"version": "0.1.12",
"files": [
"index.js",
"LICENSE"

315 backend/package-lock.json generated
@@ -10,10 +10,12 @@
"hasInstallScript": true,
"license": "ISC",
"dependencies": {
"axios": "^1.12.2",
"bcrypt": "^5.1.1",
"cors": "^2.8.5",
"crypto": "^1.0.1",
"csv-parser": "^3.0.0",
"date-fns": "^2.30.0",
"dotenv": "^16.4.5",
"express": "^4.19.2",
"iconv-lite": "^0.6.3",
@@ -21,6 +23,7 @@
"multer": "^1.4.5-lts.1",
"mysql2": "^3.10.3",
"nodemailer": "^6.9.14",
"pdf-parse": "^1.1.1",
"sequelize": "^6.37.3",
"sharp": "^0.33.5"
},
@@ -29,6 +32,15 @@
"vue-eslint-parser": "9.4.3"
}
},
"node_modules/@babel/runtime": {
"version": "7.28.4",
"resolved": "https://registry.npmjs.org/@babel/runtime/-/runtime-7.28.4.tgz",
"integrity": "sha512-Q/N6JNWvIvPnLDvjlE1OUBLPQHH6l3CltCEsHIujp45zQUSSh8K+gHnaEX45yAT1nyngnINhvWtzN+Nb9D8RAQ==",
"license": "MIT",
"engines": {
"node": ">=6.9.0"
}
},
"node_modules/@emnapi/runtime": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/@emnapi/runtime/-/runtime-1.2.0.tgz",
@@ -819,6 +831,12 @@
"resolved": "https://registry.npmjs.org/array-flatten/-/array-flatten-1.1.1.tgz",
"integrity": "sha512-PCVAQswWemu6UdxsDFFX/+gVeYqKAod3D3UVm91jHwynguOwAvYPhx8nNlM++NqRcK6CxxpUafjmhIdKiHibqg=="
},
"node_modules/asynckit": {
"version": "0.4.0",
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
"integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==",
"license": "MIT"
},
"node_modules/aws-ssl-profiles": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/aws-ssl-profiles/-/aws-ssl-profiles-1.1.1.tgz",
@@ -827,6 +845,17 @@
"node": ">= 6.0.0"
}
},
"node_modules/axios": {
"version": "1.12.2",
"resolved": "https://registry.npmjs.org/axios/-/axios-1.12.2.tgz",
"integrity": "sha512-vMJzPewAlRyOgxV2dU0Cuz2O8zzzx9VYtbJOaBgXFeLc4IV/Eg50n4LowmehOOR61S8ZMpc2K5Sa7g6A4jfkUw==",
"license": "MIT",
"dependencies": {
"follow-redirects": "^1.15.6",
"form-data": "^4.0.4",
"proxy-from-env": "^1.1.0"
}
},
"node_modules/balanced-match": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
@@ -893,9 +922,10 @@
}
},
"node_modules/brace-expansion": {
"version": "1.1.11",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.11.tgz",
"integrity": "sha512-iCuPHDFgrHX7H2vEI/5xpz07zSHB00TpugqhmYtVmMO6518mCuRMoOYFldEBl0g187ufozdaHgWKcYFb61qGiA==",
"version": "1.1.12",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz",
"integrity": "sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==",
"license": "MIT",
"dependencies": {
"balanced-match": "^1.0.0",
"concat-map": "0.0.1"
@@ -963,6 +993,19 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/call-bind-apply-helpers": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz",
"integrity": "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==",
"license": "MIT",
"dependencies": {
"es-errors": "^1.3.0",
"function-bind": "^1.1.2"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/callsites": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz",
@@ -1092,6 +1135,18 @@
"color-support": "bin.js"
}
},
"node_modules/combined-stream": {
"version": "1.0.8",
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
"integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
"license": "MIT",
"dependencies": {
"delayed-stream": "~1.0.0"
},
"engines": {
"node": ">= 0.8"
}
},
"node_modules/concat-map": {
"version": "0.0.1",
"resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
@@ -1168,9 +1223,10 @@
}
},
"node_modules/cookie": {
"version": "0.6.0",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.6.0.tgz",
"integrity": "sha512-U71cyTamuh1CRNCfpGY6to28lxvNwPG4Guz/EVjgf3Jmzv0vlDp1atT9eS5dDjMYHucpHbWns6Lwf3BKz6svdw==",
"version": "0.7.1",
"resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.1.tgz",
"integrity": "sha512-6DnInpx7SJ2AK3+CTUE/ZM0vWTUboZCegxhC2xiIydHR9jNuTAASBrfEpHhiGOZw/nX51bHt6YQl8jsGo4y/0w==",
"license": "MIT",
"engines": {
"node": ">= 0.6"
}
@@ -1199,10 +1255,11 @@
}
},
"node_modules/cross-spawn": {
"version": "7.0.3",
"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.3.tgz",
"integrity": "sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==",
"version": "7.0.6",
"resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz",
"integrity": "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"path-key": "^3.1.0",
@@ -1233,6 +1290,22 @@
"node": ">= 10"
}
},
"node_modules/date-fns": {
"version": "2.30.0",
"resolved": "https://registry.npmjs.org/date-fns/-/date-fns-2.30.0.tgz",
"integrity": "sha512-fnULvOpxnC5/Vg3NCiWelDsLiUc9bRwAPs/+LfTLNvetFCtCTN+yQz15C/fs4AwX1R9K5GLtLfn8QW+dWisaAw==",
"license": "MIT",
"dependencies": {
"@babel/runtime": "^7.21.0"
},
"engines": {
"node": ">=0.11"
},
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/date-fns"
}
},
"node_modules/debug": {
"version": "2.6.9",
"resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
@@ -1265,6 +1338,15 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/delayed-stream": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
"integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==",
"license": "MIT",
"engines": {
"node": ">=0.4.0"
}
},
"node_modules/delegates": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/delegates/-/delegates-1.0.0.tgz",
@@ -1321,6 +1403,20 @@
"resolved": "https://registry.npmjs.org/dottie/-/dottie-2.0.6.tgz",
"integrity": "sha512-iGCHkfUc5kFekGiqhe8B/mdaurD+lakO9txNnTvKtA6PISrw86LgqHvRzWYPyoE2Ph5aMIrCw9/uko6XHTKCwA=="
},
"node_modules/dunder-proto": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
"integrity": "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==",
"license": "MIT",
"dependencies": {
"call-bind-apply-helpers": "^1.0.1",
"es-errors": "^1.3.0",
"gopd": "^1.2.0"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/ecdsa-sig-formatter": {
"version": "1.0.11",
"resolved": "https://registry.npmjs.org/ecdsa-sig-formatter/-/ecdsa-sig-formatter-1.0.11.tgz",
@@ -1348,13 +1444,10 @@
}
},
"node_modules/es-define-property": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.0.tgz",
"integrity": "sha512-jxayLKShrEqqzJ0eumQbVhTYQM27CfT1T35+gCgDFoL82JLsXqTJ76zv6A0YLOgEnLUMvLzsDsGIrl8NFpT2gQ==",
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz",
"integrity": "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==",
"license": "MIT",
"dependencies": {
"get-intrinsic": "^1.2.4"
},
"engines": {
"node": ">= 0.4"
}
@@ -1368,6 +1461,33 @@
"node": ">= 0.4"
}
},
"node_modules/es-object-atoms": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz",
"integrity": "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==",
"license": "MIT",
"dependencies": {
"es-errors": "^1.3.0"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/es-set-tostringtag": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz",
"integrity": "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==",
"license": "MIT",
"dependencies": {
"es-errors": "^1.3.0",
"get-intrinsic": "^1.2.6",
"has-tostringtag": "^1.0.2",
"hasown": "^2.0.2"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/escape-html": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/escape-html/-/escape-html-1.0.3.tgz",
@@ -1585,16 +1705,17 @@
}
},
"node_modules/express": {
"version": "4.21.0",
"resolved": "https://registry.npmjs.org/express/-/express-4.21.0.tgz",
"integrity": "sha512-VqcNGcj/Id5ZT1LZ/cfihi3ttTn+NJmkli2eZADigjq29qTlWi/hAQ43t/VLPq8+UX06FCEx3ByOYet6ZFblng==",
"version": "4.21.2",
"resolved": "https://registry.npmjs.org/express/-/express-4.21.2.tgz",
"integrity": "sha512-28HqgMZAmih1Czt9ny7qr6ek2qddF4FclbMzwhCREB6OFfH+rXAnuNCwo1/wFvrtbgsQDb4kSbX9de9lFbrXnA==",
"license": "MIT",
"dependencies": {
"accepts": "~1.3.8",
"array-flatten": "1.1.1",
"body-parser": "1.20.3",
"content-disposition": "0.5.4",
"content-type": "~1.0.4",
"cookie": "0.6.0",
"cookie": "0.7.1",
"cookie-signature": "1.0.6",
"debug": "2.6.9",
"depd": "2.0.0",
@@ -1608,7 +1729,7 @@
"methods": "~1.1.2",
"on-finished": "2.4.1",
"parseurl": "~1.3.3",
"path-to-regexp": "0.1.10",
"path-to-regexp": "0.1.12",
"proxy-addr": "~2.0.7",
"qs": "6.13.0",
"range-parser": "~1.2.1",
@@ -1623,6 +1744,10 @@
},
"engines": {
"node": ">= 0.10.0"
},
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/express"
}
},
"node_modules/express/node_modules/encodeurl": {
@@ -1753,6 +1878,42 @@
"dev": true,
"peer": true
},
"node_modules/follow-redirects": {
"version": "1.15.11",
"resolved": "https://registry.npmjs.org/follow-redirects/-/follow-redirects-1.15.11.tgz",
"integrity": "sha512-deG2P0JfjrTxl50XGCDyfI97ZGVCxIpfKYmfyrQ54n5FO/0gfIES8C/Psl6kWVDolizcaaxZJnTS0QSMxvnsBQ==",
"funding": [
{
"type": "individual",
"url": "https://github.com/sponsors/RubenVerborgh"
}
],
"license": "MIT",
"engines": {
"node": ">=4.0"
},
"peerDependenciesMeta": {
"debug": {
"optional": true
}
}
},
"node_modules/form-data": {
"version": "4.0.4",
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.4.tgz",
"integrity": "sha512-KrGhL9Q4zjj0kiUt5OO4Mr/A/jlI2jDYs5eHBpYHPcBEVSiipAvn2Ko2HnPe20rmcuuvMHNdZFp+4IlGTMF0Ow==",
"license": "MIT",
"dependencies": {
"asynckit": "^0.4.0",
"combined-stream": "^1.0.8",
"es-set-tostringtag": "^2.1.0",
"hasown": "^2.0.2",
"mime-types": "^2.1.12"
},
"engines": {
"node": ">= 6"
}
},
"node_modules/forwarded": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
@@ -1849,16 +2010,21 @@
}
},
"node_modules/get-intrinsic": {
"version": "1.2.4",
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.2.4.tgz",
"integrity": "sha512-5uYhsJH8VJBTv7oslg4BznJYhDoRI6waYCxMmCdnTrcCrHA/fCFKoTFz2JKKE0HdDFUF7/oQuhzumXJK7paBRQ==",
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
"integrity": "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==",
"license": "MIT",
"dependencies": {
"call-bind-apply-helpers": "^1.0.2",
"es-define-property": "^1.0.1",
"es-errors": "^1.3.0",
"es-object-atoms": "^1.1.1",
"function-bind": "^1.1.2",
"has-proto": "^1.0.1",
"has-symbols": "^1.0.3",
"hasown": "^2.0.0"
"get-proto": "^1.0.1",
"gopd": "^1.2.0",
"has-symbols": "^1.1.0",
"hasown": "^2.0.2",
"math-intrinsics": "^1.1.0"
},
"engines": {
"node": ">= 0.4"
@@ -1867,6 +2033,19 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/get-proto": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz",
"integrity": "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==",
"license": "MIT",
"dependencies": {
"dunder-proto": "^1.0.1",
"es-object-atoms": "^1.0.0"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/glob": {
"version": "7.2.3",
"resolved": "https://registry.npmjs.org/glob/-/glob-7.2.3.tgz",
@@ -1913,12 +2092,12 @@
}
},
"node_modules/gopd": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/gopd/-/gopd-1.0.1.tgz",
"integrity": "sha512-d65bNlIadxvpb/A2abVdlqKqV563juRnZ1Wtk6s1sIR8uNsXR70xqIzVqxVf1eTqDunwT2MkczEeaezCKTZhwA==",
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz",
"integrity": "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==",
"license": "MIT",
"dependencies": {
"get-intrinsic": "^1.1.3"
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
@@ -1945,10 +2124,10 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/has-proto": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/has-proto/-/has-proto-1.0.3.tgz",
"integrity": "sha512-SJ1amZAJUiZS+PhsVLf5tGydlaVB8EdFpaSO4gmiUKUOxk8qzn5AIy4ZeJUmh22znIdk/uMAUT2pl3FxzVUH+Q==",
"node_modules/has-symbols": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz",
"integrity": "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==",
"license": "MIT",
"engines": {
"node": ">= 0.4"
@@ -1957,11 +2136,14 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/has-symbols": {
"version": "1.0.3",
"resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.0.3.tgz",
"integrity": "sha512-l3LCuF6MgDNwTDKkdYGEihYjt5pRPbEg46rtlmnSPlUbgmB8LOIrKJbYYFBSbnPaJexMKtiPO8hmeRjRz2Td+A==",
"node_modules/has-tostringtag": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/has-tostringtag/-/has-tostringtag-1.0.2.tgz",
"integrity": "sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==",
"license": "MIT",
"dependencies": {
"has-symbols": "^1.0.3"
},
"engines": {
"node": ">= 0.4"
},
@@ -2405,6 +2587,15 @@
"semver": "bin/semver.js"
}
},
"node_modules/math-intrinsics": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
"integrity": "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==",
"license": "MIT",
"engines": {
"node": ">= 0.4"
}
},
"node_modules/media-typer": {
"version": "0.3.0",
"resolved": "https://registry.npmjs.org/media-typer/-/media-typer-0.3.0.tgz",
@@ -2636,6 +2827,12 @@
"resolved": "https://registry.npmjs.org/node-addon-api/-/node-addon-api-5.1.0.tgz",
"integrity": "sha512-eh0GgfEkpnoWDq+VY8OyvYhFEzBk6jIYbRKdIlyTiAXIVJ8PyBaKb0rp7oDtoddbdoHWhq8wwr+XZ81F1rpNdA=="
},
"node_modules/node-ensure": {
"version": "0.0.0",
"resolved": "https://registry.npmjs.org/node-ensure/-/node-ensure-0.0.0.tgz",
"integrity": "sha512-DRI60hzo2oKN1ma0ckc6nQWlHU69RH6xN0sjQTjMpChPfTYvKZdcQFfdYK2RWbJcKyUizSIy/l8OTGxMAM1QDw==",
"license": "MIT"
},
"node_modules/node-fetch": {
"version": "2.7.0",
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.7.0.tgz",
@@ -2888,9 +3085,37 @@
}
},
"node_modules/path-to-regexp": {
"version": "0.1.10",
"resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.10.tgz",
"integrity": "sha512-7lf7qcQidTku0Gu3YDPc8DJ1q7OOucfa/BSsIwjuh56VU7katFvuM8hULfkwB3Fns/rsVF7PwPKVw1sl5KQS9w==",
"version": "0.1.12",
"resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.12.tgz",
"integrity": "sha512-RA1GjUVMnvYFxuqovrEqZoxxW5NUZqbwKtYz/Tt7nXerk0LbLblQmrsgdeOxV5SFHf0UDggjS/bSeOZwt1pmEQ==",
"license": "MIT"
},
"node_modules/pdf-parse": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/pdf-parse/-/pdf-parse-1.1.1.tgz",
"integrity": "sha512-v6ZJ/efsBpGrGGknjtq9J/oC8tZWq0KWL5vQrk2GlzLEQPUDB1ex+13Rmidl1neNN358Jn9EHZw5y07FFtaC7A==",
"license": "MIT",
"dependencies": {
"debug": "^3.1.0",
"node-ensure": "^0.0.0"
},
"engines": {
"node": ">=6.8.1"
}
},
"node_modules/pdf-parse/node_modules/debug": {
"version": "3.2.7",
"resolved": "https://registry.npmjs.org/debug/-/debug-3.2.7.tgz",
"integrity": "sha512-CFjzYYAi4ThfiQvizrFQevTTXHtnCqWfe7x1AhgEscTz6ZbLbfoLRLPugTQyBth6f8ZERVUSyWHFD/7Wu4t1XQ==",
"license": "MIT",
"dependencies": {
"ms": "^2.1.1"
}
},
"node_modules/pdf-parse/node_modules/ms": {
"version": "2.1.3",
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
"integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
"license": "MIT"
},
"node_modules/pg-connection-string": {
@@ -2938,6 +3163,12 @@
"node": ">= 0.10"
}
},
"node_modules/proxy-from-env": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/proxy-from-env/-/proxy-from-env-1.1.0.tgz",
"integrity": "sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==",
"license": "MIT"
},
"node_modules/pstree.remy": {
"version": "1.1.8",
"resolved": "https://registry.npmjs.org/pstree.remy/-/pstree.remy-1.1.8.tgz",

@@ -5,17 +5,21 @@
"type": "module",
"scripts": {
"postinstall": "cd ../frontend && npm install && npm run build",
"dev": "nodemon server.js"
"dev": "nodemon server.js",
"cleanup:usertoken": "node ./scripts/cleanupUserTokenKeys.js",
"cleanup:indexes": "node ./scripts/cleanupAllIndexes.js"
},
"keywords": [],
"author": "",
"license": "ISC",
"description": "",
"dependencies": {
"axios": "^1.12.2",
"bcrypt": "^5.1.1",
"cors": "^2.8.5",
"crypto": "^1.0.1",
"csv-parser": "^3.0.0",
"date-fns": "^2.30.0",
"dotenv": "^16.4.5",
"express": "^4.19.2",
"iconv-lite": "^0.6.3",
@@ -23,6 +27,7 @@
"multer": "^1.4.5-lts.1",
"mysql2": "^3.10.3",
"nodemailer": "^6.9.14",
"pdf-parse": "^1.1.1",
"sequelize": "^6.37.3",
"sharp": "^0.33.5"
},

@@ -6,6 +6,6 @@ const router = express.Router();
router.post('/register', registerUser);
router.get('/activate/:activationCode', activate);
router.post('/login', loginUser);
router.get('/logout', logoutUser);
router.post('/logout', logoutUser); // Change GET to POST

export default router;

15 backend/routes/diaryMemberActivityRoutes.js Normal file
@@ -0,0 +1,15 @@
import express from 'express';
import { authenticate } from '../middleware/authMiddleware.js';
import { addMembersToActivity, removeMemberFromActivity, getMembersForActivity } from '../controllers/diaryMemberActivityController.js';

const router = express.Router();

router.use(authenticate);

router.get('/:clubId/:diaryDateActivityId', getMembersForActivity);
router.post('/:clubId/:diaryDateActivityId', addMembersToActivity);
router.delete('/:clubId/:diaryDateActivityId/:participantId', removeMemberFromActivity);

export default router;

@@ -8,7 +8,8 @@ import {
  deleteDiaryNote,
  addDiaryTag,
  addTagToDiaryDate,
  deleteTagFromDiaryDate
  deleteTagFromDiaryDate,
  deleteDateForClub,
} from '../controllers/diaryController.js';

const router = express.Router();
@@ -21,5 +22,6 @@ router.delete('/:clubId/tag', authenticate, deleteTagFromDiaryDate);
router.get('/:clubId', authenticate, getDatesForClub);
router.post('/:clubId', authenticate, createDateForClub);
router.put('/:clubId', authenticate, updateTrainingTimes);
router.delete('/:clubId/:dateId', authenticate, deleteDateForClub);

export default router;

36 backend/routes/externalServiceRoutes.js Normal file
@@ -0,0 +1,36 @@
import express from 'express';
import externalServiceController from '../controllers/externalServiceController.js';
import { authenticate } from '../middleware/authMiddleware.js';

const router = express.Router();

// All routes require authentication
router.use(authenticate);

// GET /api/external-service/account?service=mytischtennis - Get account
router.get('/account', externalServiceController.getAccount);

// GET /api/external-service/status?service=mytischtennis - Check status
router.get('/status', externalServiceController.getStatus);

// POST /api/external-service/account - Create or update account
router.post('/account', externalServiceController.upsertAccount);

// DELETE /api/external-service/account?service=mytischtennis - Delete account
router.delete('/account', externalServiceController.deleteAccount);

// POST /api/external-service/verify - Verify login
router.post('/verify', externalServiceController.verifyLogin);

// GET /api/external-service/session?service=mytischtennis - Get stored session
router.get('/session', externalServiceController.getSession);

// HeTTV specific routes
// GET /api/external-service/hettv/main-page - Load HeTTV main page and find downloads
router.get('/hettv/main-page', externalServiceController.loadHettvMainPage);

// POST /api/external-service/hettv/download-page - Load specific HeTTV download page
router.post('/hettv/download-page', externalServiceController.loadHettvDownloadPage);

export default router;

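Not part of the diff itself, but as orientation: the router above amounts to a small credentials/session API for external federation portals. A minimal client-side sketch, assuming the router is mounted at /api/external-service (as the route comments suggest), cookie-based auth (withCredentials), and hypothetical response field names:

// Illustrative sketch only — the mount point, auth transport and response field names are assumptions.
import axios from 'axios';

const api = axios.create({ baseURL: '/api/external-service', withCredentials: true });

async function ensureMyTischtennisAccount(username, password) {
  // Ask the backend whether credentials for the service are already stored and usable
  const { data: status } = await api.get('/status', { params: { service: 'mytischtennis' } });
  if (status && status.valid) return status;

  // Store (or update) the credentials, then verify the login once
  await api.post('/account', { service: 'mytischtennis', username, password });
  const { data: verification } = await api.post('/verify', { service: 'mytischtennis' });
  return verification;
}
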
@@ -1,4 +1,4 @@
import { getClubMembers, getWaitingApprovals, setClubMembers, uploadMemberImage, getMemberImage } from '../controllers/memberController.js';
import { getClubMembers, getWaitingApprovals, setClubMembers, uploadMemberImage, getMemberImage, updateRatingsFromMyTischtennis } from '../controllers/memberController.js';
import express from 'express';
import { authenticate } from '../middleware/authMiddleware.js';
import multer from 'multer';
@@ -13,5 +13,6 @@ router.get('/image/:clubId/:memberId', authenticate, getMemberImage);
router.get('/get/:id/:showAll', authenticate, getClubMembers);
router.post('/set/:id', authenticate, setClubMembers);
router.get('/notapproved/:id', authenticate, getWaitingApprovals);
router.post('/update-ratings/:id', authenticate, updateRatingsFromMyTischtennis);

export default router;

21 backend/routes/officialTournamentRoutes.js Normal file
@@ -0,0 +1,21 @@
import express from 'express';
import multer from 'multer';
import { authenticate } from '../middleware/authMiddleware.js';
import { uploadTournamentPdf, getParsedTournament, listOfficialTournaments, deleteOfficialTournament, upsertCompetitionMember, listClubParticipations, updateParticipantStatus } from '../controllers/officialTournamentController.js';

const router = express.Router();
const upload = multer({ storage: multer.memoryStorage() });

router.use(authenticate);

router.get('/:clubId', listOfficialTournaments);
router.get('/:clubId/participations/summary', listClubParticipations);
router.post('/:clubId/upload', upload.single('pdf'), uploadTournamentPdf);
router.get('/:clubId/:id', getParsedTournament);
router.delete('/:clubId/:id', deleteOfficialTournament);
router.post('/:clubId/:id/participation', upsertCompetitionMember);
router.post('/:clubId/:id/status', updateParticipantStatus);

export default router;

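Again outside the diff: the upload route above uses multer's memory storage and expects the tournament PDF in a multipart field named 'pdf'. A hedged browser-side sketch, assuming the router is mounted at /api/official-tournaments and cookie-based auth (both assumptions; only the 'pdf' field name and the ':clubId/upload' path come from the router):

// Illustrative sketch only.
import axios from 'axios';

async function uploadTournamentPdf(clubId, file) {
  const form = new FormData();
  form.append('pdf', file); // must match upload.single('pdf') on the server

  const { data } = await axios.post(`/api/official-tournaments/${clubId}/upload`, form, {
    withCredentials: true,
  });
  return data; // presumably the parsed tournament created from the PDF
}
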
@@ -4,13 +4,41 @@ import {
  getAllPredefinedActivities,
  getPredefinedActivityById,
  updatePredefinedActivity,
  searchPredefinedActivities,
  mergePredefinedActivities,
  deduplicatePredefinedActivities,
} from '../controllers/predefinedActivityController.js';
import multer from 'multer';
import { authenticate } from '../middleware/authMiddleware.js';
import { uploadPredefinedActivityImage, deletePredefinedActivityImage } from '../controllers/predefinedActivityImageController.js';
import PredefinedActivityImage from '../models/PredefinedActivityImage.js';
import path from 'path';
import fs from 'fs';

const router = express.Router();
const upload = multer({ storage: multer.memoryStorage() });

router.post('/', createPredefinedActivity);
router.get('/', getAllPredefinedActivities);
router.get('/:id', getPredefinedActivityById);
router.put('/:id', updatePredefinedActivity);
router.post('/', authenticate, createPredefinedActivity);
router.get('/', authenticate, getAllPredefinedActivities);
router.get('/:id', authenticate, getPredefinedActivityById);
router.put('/:id', authenticate, updatePredefinedActivity);
router.post('/:id/image', authenticate, upload.single('image'), uploadPredefinedActivityImage);
router.put('/:id/image', authenticate, upload.single('image'), uploadPredefinedActivityImage);
router.delete('/:id/image/:imageId', authenticate, deletePredefinedActivityImage);
router.get('/search/query', authenticate, searchPredefinedActivities);
router.post('/merge', authenticate, mergePredefinedActivities);
router.post('/deduplicate', authenticate, deduplicatePredefinedActivities);
router.get('/:id/image/:imageId', async (req, res) => {
  try {
    const { id, imageId } = req.params;
    const image = await PredefinedActivityImage.findOne({ where: { id: imageId, predefinedActivityId: id } });
    if (!image) return res.status(404).json({ error: 'Image not found' });
    if (!fs.existsSync(image.imagePath)) return res.status(404).json({ error: 'Image file missing' });
    res.sendFile(path.resolve(image.imagePath));
  } catch (e) {
    console.error('[getPredefinedActivityImage] - Error:', e);
    res.status(500).json({ error: 'Failed to fetch image' });
  }
});

export default router;

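One detail worth noting in the router above: the GET /:id/image/:imageId handler is registered without the authenticate middleware, so a stored activity image can be referenced directly, e.g. as an <img> source. A tiny hypothetical helper, assuming a /api/predefined-activities mount point (an assumption, not part of the diff):

// Hypothetical helper — only the path shape comes from the router above.
function predefinedActivityImageUrl(activityId, imageId) {
  return `/api/predefined-activities/${activityId}/image/${imageId}`;
}
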
10 backend/routes/trainingStatsRoutes.js Normal file
@@ -0,0 +1,10 @@
import express from 'express';
import trainingStatsController from '../controllers/trainingStatsController.js';
import { authenticate } from '../middleware/authMiddleware.js';
const router = express.Router();

router.use(authenticate);

router.get('/:clubId', trainingStatsController.getTrainingStats);

export default router;

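For completeness, a hedged sketch of consuming the new stats endpoint, assuming a /api/training-stats mount point and cookie-based auth (both assumptions); only the GET /:clubId shape comes from the router above:

// Illustrative sketch only.
import axios from 'axios';

async function fetchTrainingStats(clubId) {
  const { data } = await axios.get(`/api/training-stats/${clubId}`, { withCredentials: true });
  return data; // shape depends on trainingStatsController.getTrainingStats
}
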
Some files were not shown because too many files have changed in this diff.