23 Commits

Author SHA1 Message Date
Torsten Schulz (local)
fddde56076 Update package version to 1.1.6 and enhance deploy script with dependency checks
Some checks failed
Code Analysis and Production Deploy / analyze (push) Has been skipped
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 2m29s
Code Analysis and Production Deploy / analyze (pull_request) Successful in 3m7s
Code Analysis and Production Deploy / deploy-production (pull_request) Has been skipped
Code Analysis and Production Deploy / deploy-test (pull_request) Has been skipped
Require Package Version Change / check (pull_request) Failing after 11s
- Bumped the package version to 1.1.6 in package.json.
- Added `install_dependencies_if_needed` function in deploy-test.sh to conditionally install dependencies based on the state of package-lock.json.
- Improved logging of build output to a file for better error tracking during deployment.
- Updated Nuxt cache handling to retain .nuxt directory for faster builds unless explicitly cleaned.
2026-05-06 16:03:16 +02:00
Torsten Schulz (local)
c385df4a0c Add dependency installation check and logging to deploy script
All checks were successful
Code Analysis and Production Deploy / analyze (push) Has been skipped
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 2m54s
- Introduced `install_dependencies_if_needed` function to conditionally install dependencies based on the presence and changes in `package-lock.json`.
- Updated the deployment process to log build output to a file for better error tracking.
- Modified Nuxt configuration to disable source maps in production and prevent reporting of compressed sizes in Vite builds.
2026-05-06 15:58:02 +02:00
Torsten Schulz (local)
e44d3c5c74 Update package version to 1.1.5 in package.json
All checks were successful
Code Analysis and Production Deploy / analyze (push) Has been skipped
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 2m5s
Code Analysis and Production Deploy / analyze (pull_request) Successful in 3m35s
Code Analysis and Production Deploy / deploy-production (pull_request) Has been skipped
Code Analysis and Production Deploy / deploy-test (pull_request) Has been skipped
Require Package Version Change / check (pull_request) Successful in 11s
2026-05-05 15:14:38 +02:00
Torsten Schulz (local)
c409fa6d4b Update candidate paths for CSV file retrieval in mannschaften.get.js
Some checks failed
Code Analysis and Production Deploy / analyze (push) Has been skipped
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Has been cancelled
- Adjusted the logic to prioritize the new CMS write target for public data.
- Updated comments to clarify the order of candidate paths for file retrieval.
2026-05-05 15:13:22 +02:00
Torsten Schulz (local)
0fa19493c5 Refactor readPackageVersion function to support multiple candidate paths for package.json
Some checks failed
Code Analysis and Production Deploy / analyze (push) Has been skipped
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 1m58s
Code Analysis and Production Deploy / analyze (pull_request) Successful in 2m47s
Code Analysis and Production Deploy / deploy-production (pull_request) Has been skipped
Code Analysis and Production Deploy / deploy-test (pull_request) Has been skipped
Require Package Version Change / check (pull_request) Failing after 9s
- Updated the logic to read the package version from either the current directory or the parent directory.
- Added error handling to continue searching through candidate paths if the first read fails.
2026-04-27 16:52:12 +02:00
Torsten Schulz (local)
c145a723ed Update package version to 1.1.4 in package.json and package-lock.json
All checks were successful
Code Analysis and Production Deploy / analyze (push) Has been skipped
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 2m2s
2026-04-27 16:47:57 +02:00
Torsten Schulz (local)
d0b15f3e83 Update package version to 1.1.3 and postcss dependency to 8.5.12 in package.json and package-lock.json
Some checks failed
Code Analysis and Production Deploy / analyze (push) Has been skipped
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 2m4s
Code Analysis and Production Deploy / analyze (pull_request) Successful in 3m25s
Code Analysis and Production Deploy / deploy-production (pull_request) Has been skipped
Code Analysis and Production Deploy / deploy-test (pull_request) Has been skipped
Require Package Version Change / check (pull_request) Failing after 9s
2026-04-27 16:40:33 +02:00
Torsten Schulz (local)
e60c0f4481 Add logic to include active trainers as newsletter recipients
Some checks failed
Code Analysis and Production Deploy / analyze (push) Has been skipped
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 2m1s
Code Analysis and Production Deploy / analyze (pull_request) Failing after 3m16s
Code Analysis and Production Deploy / deploy-production (pull_request) Has been skipped
Code Analysis and Production Deploy / deploy-test (pull_request) Has been skipped
Require Package Version Change / check (pull_request) Successful in 11s
- Enhanced the getRecipientsByGroup function to filter and add active trainers from users.json to the newsletter recipients list.
- Ensured that duplicate emails are not added to the recipients array.
2026-04-27 15:10:57 +02:00
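The recipient logic described above can be sketched like this. It is a minimal illustration, not the actual `getRecipientsByGroup` implementation; the field names (`isTrainer`, `active`, `email`) and data shapes are assumptions.

```javascript
// Sketch: append active trainers to a recipient list without
// introducing duplicate email addresses (case-insensitive).
function addActiveTrainers (recipients, users) {
  const seen = new Set(recipients.map(email => email.toLowerCase()))
  for (const user of users) {
    if (!user.isTrainer || !user.active || !user.email) continue
    const key = user.email.toLowerCase()
    if (seen.has(key)) continue // already a recipient, skip
    seen.add(key)
    recipients.push(user.email)
  }
  return recipients
}
```

Tracking seen addresses in a `Set` keeps the duplicate check O(1) per user instead of rescanning the recipients array.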
Torsten Schulz (local)
27a096546f Implement user sorting feature in Benutzer.vue
All checks were successful
Code Analysis and Production Deploy / analyze (push) Has been skipped
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 2m19s
- Added a dropdown for sorting active users by first name or last name.
- Updated the display of active users to reflect the selected sorting order.
- Introduced helper functions to split names and format display names accordingly.
2026-04-27 15:04:41 +02:00
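The name-splitting and sorting helpers this commit mentions could look roughly like this. A hypothetical sketch only: the helper names, the "first last" name format, and the German collation locale are assumptions, not taken from Benutzer.vue.

```javascript
// Sketch: split a display name into first/last parts and sort a list
// of full names by either part.
function splitName (fullName) {
  const parts = fullName.trim().split(/\s+/)
  return { first: parts[0] || '', last: parts.slice(1).join(' ') || '' }
}

function sortUsers (users, key = 'first') {
  // localeCompare with 'de' handles umlauts sensibly for a German club site.
  return [...users].sort((a, b) =>
    splitName(a)[key].localeCompare(splitName(b)[key], 'de'))
}
```

Copying the array before sorting keeps the original (reactive) list untouched, which matters when the source array is Vue state.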
Torsten Schulz (local)
20a1cdd7f2 Update package version to 1.1.2 in package.json and modify code-analysis.yml to trigger analysis only on pull requests.
Some checks failed
Code Analysis and Production Deploy / analyze (push) Has been skipped
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 1m49s
Code Analysis and Production Deploy / analyze (pull_request) Successful in 2m47s
Code Analysis and Production Deploy / deploy-production (pull_request) Has been skipped
Code Analysis and Production Deploy / deploy-test (pull_request) Has been skipped
Require Package Version Change / check (pull_request) Failing after 7s
2026-04-16 13:53:31 +02:00
Torsten Schulz (local)
e3825ad217 Update package version to 1.1.1 in package.json for the Harheimer Tischtennis Club website.
Some checks failed
Code Analysis and Production Deploy / analyze (push) Successful in 2m48s
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Has been cancelled
2026-04-16 13:45:02 +02:00
Torsten Schulz (local)
a12f1f7815 Remove package version change requirement for main PRs in code-analysis.yml to streamline workflow.
Some checks failed
Code Analysis and Production Deploy / analyze (push) Successful in 2m44s
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 1m54s
Code Analysis and Production Deploy / analyze (pull_request) Successful in 2m49s
Require Package Version Change / check (pull_request) Failing after 8s
Code Analysis and Production Deploy / deploy-production (pull_request) Has been skipped
Code Analysis and Production Deploy / deploy-test (pull_request) Has been skipped
2026-04-16 13:36:45 +02:00
Torsten Schulz (local)
6fea2749e0 Add app version display in Footer and implement version API endpoint
Some checks failed
Code Analysis and Production Deploy / analyze (push) Successful in 2m49s
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 1m54s
Code Analysis and Production Deploy / analyze (pull_request) Failing after 11s
Code Analysis and Production Deploy / deploy-production (pull_request) Has been skipped
Code Analysis and Production Deploy / deploy-test (pull_request) Has been skipped
- Updated Footer.vue to show the application version for logged-in users.
- Added a new API endpoint to return the application version from package.json.
- Enhanced code-analysis.yml to require package version changes for main PRs.
2026-04-16 13:16:53 +02:00
Torsten Schulz (local)
18da725567 Refactor deployment scripts to use git fetch and reset for pulling latest changes. Update deploy-production.sh and deploy-test.sh to ensure a clean state before deployment. Modify code-analysis.yml to reflect these changes in deployment commands.
All checks were successful
Code Analysis and Production Deploy / analyze (push) Successful in 2m58s
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Successful in 2m0s
2026-04-16 13:11:23 +02:00
Torsten Schulz (local)
4d5fb43ebc Enhance deploy-test.sh with functions for Node.js version management, dependency installation, and public document synchronization. Implement checks for Node.js version requirements and improve error handling for document syncing. Update environment configuration in harheimertc.test.config.cjs to support development and test environments. Modify email recipient logic in contact and email service APIs to prevent notifications in test environments. Add tests to verify behavior in test conditions.
Some checks failed
Code Analysis and Production Deploy / analyze (push) Successful in 2m52s
Code Analysis and Production Deploy / deploy-production (push) Has been skipped
Code Analysis and Production Deploy / deploy-test (push) Failing after 12s
2026-04-16 13:06:14 +02:00
Torsten Schulz (local)
986b2056cd Enhance deploy-production.sh with new functions for Node.js version management and public document synchronization. Added checks for Node.js version requirements and improved error handling for document syncing. Updated package.json and package-lock.json to specify Node.js and npm engine requirements.
All checks were successful
Code Analysis and Production Deploy / analyze (push) Successful in 2m45s
Code Analysis and Production Deploy / deploy-production (push) Successful in 1m57s
2026-04-15 21:51:08 +02:00
Torsten Schulz (local)
337c172d07 Refactor dependency installation in deploy-production.sh to use a dedicated function. This improves error handling for missing package-lock.json and ensures consistent installation behavior. Removed obsolete public-data restoration logic for cleaner script execution.
All checks were successful
Code Analysis and Production Deploy / analyze (push) Successful in 3m2s
Code Analysis and Production Deploy / deploy-production (push) Successful in 2m0s
2026-04-15 21:43:32 +02:00
Torsten Schulz (local)
15b8f3c4c1 Update version in package.json from 1.0.0 to 1.1.0 for the Harheimer Tischtennis Club website.
Some checks failed
Code Analysis and Production Deploy / analyze (push) Successful in 3m49s
Code Analysis and Production Deploy / deploy-production (push) Failing after 0s
2026-04-15 21:36:37 +02:00
Torsten Schulz (local)
510cfd39f9 Update code-analysis workflow to include production deployment steps and rename workflow for clarity. Add SSH setup and connection testing for secure deployment to production environment.
Some checks failed
Code Analysis and Production Deploy / analyze (push) Successful in 2m44s
Code Analysis and Production Deploy / deploy-production (push) Failing after 1s
2026-04-15 21:30:09 +02:00
Torsten Schulz (local)
e0bad51764 Update commander dependency to version 13.1.0 in package.json and package-lock.json for improved functionality and compatibility.
All checks were successful
Code Analysis (JS/Vue) / analyze (push) Successful in 3m10s
2026-04-15 21:18:26 +02:00
Torsten Schulz (local)
c1de0c1671 Enhance deploy-production.sh with error handling for git pull failures. Provide user guidance for SSH key setup and switching to HTTPS if necessary. Update code-analysis.yml to include Node.js setup with caching for improved workflow efficiency.
Some checks failed
Code Analysis (JS/Vue) / analyze (push) Failing after 1m25s
2026-04-15 21:09:04 +02:00
Torsten Schulz (local)
2bedbee08d Upgrade nodemailer to latest major for audit compliance.
Some checks failed
Code Analysis (JS/Vue) / analyze (push) Failing after 10s
This removes the remaining SMTP command injection advisories by moving to nodemailer 8.0.5 and refreshing the lockfile accordingly.

Made-with: Cursor
2026-04-15 21:00:43 +02:00
Torsten Schulz (local)
9c54b6907e Apply non-major audit updates and harden path handling for Semgrep.
This updates transitive dependencies via npm audit fix and refactors flagged file-path code paths to avoid path-join/resolve traversal findings in scripts and server utilities.

Made-with: Cursor
2026-04-15 21:00:28 +02:00
30 changed files with 2910 additions and 1152 deletions

.codex Normal file
@@ -1,17 +1,24 @@
-name: Code Analysis (JS/Vue)
+name: Code Analysis and Production Deploy
 on:
   pull_request:
   push:
-    branches: [ main ]
+    branches: [ main, dev ]
 jobs:
   analyze:
     runs-on: ubuntu-latest
+    if: github.event_name == 'pull_request'
     steps:
       - name: Checkout
         uses: actions/checkout@v4
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: 22
+          cache: npm
       - name: Workspace sanity check
         run: |
           echo "PWD: $(pwd)"
@@ -82,3 +89,63 @@ jobs:
           ./osv-scanner --version
           test -f ./package-lock.json
           ./osv-scanner --lockfile ./package-lock.json
+  deploy-production:
+    runs-on: ubuntu-latest
+    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
+    steps:
+      - name: Prepare SSH
+        run: |
+          set -euo pipefail
+          mkdir -p ~/.ssh
+          printf "%s" "${{ secrets.PROD_SSH_KEY }}" > ~/.ssh/id_ed25519
+          chmod 600 ~/.ssh/id_ed25519
+          ssh-keyscan -p "${{ vars.PROD_PORT }}" "${{ vars.PROD_HOST }}" >> ~/.ssh/known_hosts
+      - name: Test SSH connection
+        run: |
+          ssh -i ~/.ssh/id_ed25519 \
+            -o StrictHostKeyChecking=no \
+            -o BatchMode=yes \
+            -p "${{ vars.PROD_PORT }}" \
+            "${{ vars.PROD_USER }}@${{ vars.PROD_HOST }}" \
+            "echo SSH OK"
+      - name: Run production deployment script
+        run: |
+          ssh -i ~/.ssh/id_ed25519 \
+            -o BatchMode=yes \
+            -p "${{ vars.PROD_PORT }}" \
+            "${{ vars.PROD_USER }}@${{ vars.PROD_HOST }}" \
+            "bash -lc 'cd /var/www/harheimertc && git fetch origin main && git checkout -B main origin/main && git reset --hard origin/main && ./deploy-production.sh'"
+  deploy-test:
+    runs-on: ubuntu-latest
+    if: github.event_name == 'push' && github.ref == 'refs/heads/dev'
+    steps:
+      - name: Prepare SSH
+        run: |
+          set -euo pipefail
+          mkdir -p ~/.ssh
+          printf "%s" "${{ secrets.PROD_SSH_KEY }}" > ~/.ssh/id_ed25519
+          chmod 600 ~/.ssh/id_ed25519
+          ssh-keyscan -p "${{ vars.PROD_PORT }}" "${{ vars.PROD_HOST }}" >> ~/.ssh/known_hosts
+      - name: Test SSH connection
+        run: |
+          ssh -i ~/.ssh/id_ed25519 \
+            -o StrictHostKeyChecking=no \
+            -o BatchMode=yes \
+            -p "${{ vars.PROD_PORT }}" \
+            "${{ vars.PROD_USER }}@${{ vars.PROD_HOST }}" \
+            "echo SSH OK"
+      - name: Run test deployment script
+        run: |
+          ssh -i ~/.ssh/id_ed25519 \
+            -o BatchMode=yes \
+            -p "${{ vars.PROD_PORT }}" \
+            "${{ vars.PROD_USER }}@${{ vars.PROD_HOST }}" \
+            "bash -lc 'cd /var/www/harheimertc.test && git fetch origin dev && git checkout -B dev origin/dev && git reset --hard origin/dev && ./deploy-test.sh'"

@@ -0,0 +1,20 @@
+name: Require Package Version Change
+on:
+  pull_request:
+    branches: [ main ]
+jobs:
+  check:
+    runs-on: ubuntu-latest
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v4
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: 22
+      - name: Check package.json version changed
+        run: scripts/check-package-version-changed.sh origin/main

.nvmrc Normal file

@@ -0,0 +1 @@
22


@@ -6,6 +6,13 @@
         © {{ currentYear }} Harheimer TC 1954 e.V.
       </p>
       <div class="flex items-center space-x-6 text-sm relative">
+        <span
+          v-if="isLoggedIn && appVersion"
+          class="text-xs text-gray-600"
+          title="Version"
+        >
+          v{{ appVersion }}
+        </span>
         <NuxtLink
           to="/impressum"
           class="text-gray-400 hover:text-primary-400 transition-colors"
@@ -89,7 +96,7 @@
 </template>
 <script setup>
-import { ref, computed, onMounted, onUnmounted } from 'vue'
+import { ref, computed, onMounted, onUnmounted, watch } from 'vue'
 import { useRouter } from 'vue-router'
 import { User, ChevronUp } from 'lucide-vue-next'
@@ -97,11 +104,26 @@ const router = useRouter()
 const authStore = useAuthStore()
 const currentYear = new Date().getFullYear()
 const isMemberMenuOpen = ref(false)
+const appVersion = ref('')
 // Reactive auth state from store
 const isLoggedIn = computed(() => authStore.isLoggedIn)
 // const isAdmin = computed(() => authStore.isAdmin)
+const loadAppVersion = async () => {
+  if (!isLoggedIn.value) {
+    appVersion.value = ''
+    return
+  }
+  try {
+    const response = await $fetch('/api/app/version')
+    appVersion.value = response.version || ''
+  } catch (_error) {
+    appVersion.value = ''
+  }
+}
 const toggleMemberMenu = () => {
   isMemberMenuOpen.value = !isMemberMenuOpen.value
 }
@@ -116,6 +138,10 @@ onMounted(() => {
   authStore.checkAuth()
 })
+watch(isLoggedIn, () => {
+  loadAppVersion()
+}, { immediate: true })
 // Close menu when clicking outside
 const handleClickOutside = (event) => {
   if (!event.target.closest('.relative')) {


@@ -54,6 +54,108 @@ has_tracked_files_under() {
   git ls-files "$prefix" | head -n 1 | grep -q .
 }
+install_dependencies() {
+  if [ -f "package-lock.json" ]; then
+    echo "   Running: npm ci"
+    npm ci
+  else
+    echo "   WARNING: package-lock.json fehlt. Führe npm install aus..."
+    npm install
+  fi
+}
+install_dependencies_if_needed() {
+  local cache_dir=".deploy-cache"
+  local lock_hash_file="$cache_dir/package-lock.sha256"
+  local current_lock_hash=""
+  local previous_lock_hash=""
+  if [ ! -f "package-lock.json" ]; then
+    echo "   package-lock.json fehlt, führe npm install aus..."
+    install_dependencies
+    return 0
+  fi
+  mkdir -p "$cache_dir"
+  current_lock_hash="$(sha256sum package-lock.json | awk '{print $1}')"
+  if [ -f "$lock_hash_file" ]; then
+    previous_lock_hash="$(cat "$lock_hash_file" 2>/dev/null || true)"
+  fi
+  if [ ! -d "node_modules" ]; then
+    echo "   node_modules fehlt, installiere Dependencies..."
+    install_dependencies
+  elif [ "$current_lock_hash" != "$previous_lock_hash" ]; then
+    echo "   package-lock.json geändert, führe npm ci aus..."
+    install_dependencies
+  else
+    echo "   package-lock.json unverändert, überspringe npm ci"
+  fi
+  printf '%s\n' "$current_lock_hash" > "$lock_hash_file"
+}
+use_project_node() {
+  export NVM_DIR="${NVM_DIR:-$HOME/.nvm}"
+  if [ -s "$NVM_DIR/nvm.sh" ]; then
+    # shellcheck disable=SC1090
+    . "$NVM_DIR/nvm.sh"
+    if [ -f ".nvmrc" ]; then
+      echo "   Using Node version from .nvmrc..."
+      nvm use
+    fi
+  fi
+}
+ensure_node_version() {
+  if ! command -v node >/dev/null 2>&1; then
+    echo "ERROR: Node.js ist nicht im PATH."
+    exit 1
+  fi
+  local node_version
+  node_version="$(node -p 'process.versions.node')"
+  if ! node -e 'const [major, minor] = process.versions.node.split(".").map(Number); process.exit(major > 22 || (major === 22 && minor >= 12) ? 0 : 1)' >/dev/null 2>&1; then
+    echo "ERROR: Node.js >= 22.12.0 wird benötigt, aktuell ist $node_version aktiv."
+    echo "Bitte Node 22 installieren/aktivieren, z.B.:"
+    echo "  nvm install 22"
+    echo "  nvm alias default 22"
+    exit 1
+  fi
+  echo "   Node.js $node_version"
+}
+sync_public_documents_to_build() {
+  if [ ! -d "public/documents" ]; then
+    echo "   No public/documents directory to sync"
+    return 0
+  fi
+  if [ ! -d ".output/public" ]; then
+    echo "ERROR: .output/public fehlt, kann public/documents nicht synchronisieren."
+    exit 1
+  fi
+  mkdir -p ".output/public/documents"
+  cp -a "public/documents/." ".output/public/documents/"
+  echo "   ✓ public/documents -> .output/public/documents synchronisiert"
+  local template_pdf="beitrittserklärung_template.pdf"
+  if [ -f "public/documents/$template_pdf" ]; then
+    local source_size output_size
+    source_size=$(stat -f%z "public/documents/$template_pdf" 2>/dev/null || stat -c%s "public/documents/$template_pdf" 2>/dev/null || echo "0")
+    output_size=$(stat -f%z ".output/public/documents/$template_pdf" 2>/dev/null || stat -c%s ".output/public/documents/$template_pdf" 2>/dev/null || echo "0")
+    if [ "$source_size" != "$output_size" ] || [ "$source_size" = "0" ]; then
+      echo "ERROR: .output/public/documents/$template_pdf stimmt nicht mit public/documents überein (Source: $source_size, Output: $output_size)."
+      exit 1
+    fi
+    echo "   $template_pdf im Build verifiziert ($output_size bytes)"
+  fi
+}
 echo "0. Ensuring persistent data directories (recommended)..."
 # IMPORTANT: Only symlink server/data if it's not tracked by git.
 if has_tracked_files_under "server/data"; then
@@ -137,7 +239,20 @@ git clean -fd \
 # Pull latest changes
 echo "   Pulling latest changes..."
-git pull
+git fetch origin main
+git checkout -B main origin/main
+if ! git reset --hard origin/main; then
+  echo "ERROR: git pull fehlgeschlagen."
+  echo ""
+  echo "Häufige Ursache: SSH-Key für den aktuellen User fehlt."
+  echo "Prüfen:"
+  echo "  ssh -T git@tsschulz.de"
+  echo ""
+  echo "Optional auf HTTPS wechseln:"
+  echo "  git remote set-url origin https://tsschulz.de/<owner>/<repo>.git"
+  echo "Oder SSH-Key für User $(id -un) hinterlegen."
+  exit 1
+fi
 # Reset any accidental changes from stash restore (should be none now)
 git reset --hard HEAD >/dev/null 2>&1
@@ -153,7 +268,9 @@ fi
 # 3. Install dependencies
 echo ""
 echo "3. Installing dependencies..."
-npm install
+use_project_node
+ensure_node_version
+install_dependencies_if_needed
 # 4. Remove old build (but keep data!)
 echo ""
@@ -170,56 +287,29 @@ if [ -d ".output" ]; then
   if [ -d ".output" ]; then
     echo "ERROR: .output konnte auch nach erneutem Versuch nicht gelöscht werden!"
     echo "Bitte manuell prüfen und löschen: rm -rf .output"
-    if ls "$BACKUP_DIR/public-data"/*.csv >/dev/null 2>&1; then
-      # Restore into internal storage (server/data/public-data)
-      mkdir -p server/data/public-data
-      for csv_file in "$BACKUP_DIR/public-data"/*.csv; do
-        filename=$(basename "$csv_file")
-        cp -f "$csv_file" "server/data/public-data/$filename"
-        if [ -f "server/data/public-data/$filename" ]; then
-          backup_size=$(stat -f%z "$csv_file" 2>/dev/null || stat -c%s "$csv_file" 2>/dev/null || echo "0")
-          restored_size=$(stat -f%z "server/data/public-data/$filename" 2>/dev/null || stat -c%s "server/data/public-data/$filename" 2>/dev/null || echo "0")
-          if [ "$backup_size" = "$restored_size" ] && [ "$backup_size" != "0" ]; then
-            echo "   ✓ Restored server/data/public-data/$filename from backup ($backup_size bytes)"
-          else
-            echo "   ⚠ WARNING: server/data/public-data/$filename size mismatch (Backup: $backup_size, Restored: $restored_size)"
-          fi
-        else
-          echo "   ❌ ERROR: Konnte server/data/public-data/$filename nicht wiederherstellen!"
-        fi
-      done
-      echo "   ✓ All public-data files restored into server/data/public-data ($BACKUP_DIR/public-data)."
-      # Optional: synchronize internal public-data into public/data for legacy builds
-      # This uses the project's sync script and forces overwrite in public/data.
-      if command -v node >/dev/null 2>&1 && [ -f scripts/sync-public-data.js ]; then
-        echo "   Synchronizing server/data/public-data -> public/data (using scripts/sync-public-data.js --force)"
-        node scripts/sync-public-data.js --force || echo "   WARNING: sync script failed"
-      else
-        echo "   Note: To publish CSVs to public/data run: node scripts/sync-public-data.js --force"
-      fi
-    else
-      echo "   No public CSVs to restore"
-    fi
     exit 1
   fi
 fi
 echo "   ✓ .output gelöscht"
 fi
-# Auch .nuxt Cache löschen für sauberen Build
-if [ -d ".nuxt" ]; then
-  echo "   Removing .nuxt cache..."
-  rm -rf .nuxt
-  echo "   ✓ .nuxt gelöscht"
-fi
+# .nuxt standardmäßig behalten (beschleunigt Folge-Builds deutlich).
+# Für erzwungenen Clean-Build: CLEAN_NUXT_CACHE=1 ./deploy-production.sh
+if [ "${CLEAN_NUXT_CACHE:-0}" = "1" ]; then
+  if [ -d ".nuxt" ]; then
+    echo "   CLEAN_NUXT_CACHE=1 gesetzt: entferne .nuxt cache..."
+    rm -rf .nuxt
+    echo "   ✓ .nuxt gelöscht"
+  fi
+else
+  echo "   Behalte .nuxt cache für schnelleren Build (CLEAN_NUXT_CACHE=1 für Clean-Build)"
+fi
 # Prüfe, ob node_modules vorhanden ist (für npm run build)
 if [ ! -d "node_modules" ]; then
   echo ""
   echo "WARNING: node_modules fehlt. Installiere Dependencies..."
-  npm install
+  install_dependencies
 fi
 # 5. Build
@@ -232,28 +322,36 @@ echo "   (This may take a few minutes...)"
 echo "   Checking dependencies..."
 if [ ! -f "node_modules/.package-lock.json" ] && [ ! -f "package-lock.json" ]; then
   echo "   WARNING: package-lock.json fehlt. Führe npm install aus..."
-  npm install
+  install_dependencies
 fi
-# Build mit expliziter Fehlerbehandlung und Output-Capture
-BUILD_OUTPUT=$(npm run build 2>&1)
-BUILD_EXIT_CODE=$?
-# Zeige Build-Output
-echo "$BUILD_OUTPUT"
+# Build mit expliziter Fehlerbehandlung und gleichzeitiger Log-Datei
+BUILD_LOG_FILE=".deploy-cache/build-$(date +%Y%m%d-%H%M%S).log"
+mkdir -p ".deploy-cache"
+if npm run build 2>&1 | tee "$BUILD_LOG_FILE"; then
+  BUILD_EXIT_CODE=0
+else
+  BUILD_EXIT_CODE=$?
+fi
 if [ "$BUILD_EXIT_CODE" -ne 0 ]; then
   echo ""
   echo "ERROR: Build fehlgeschlagen mit Exit-Code $BUILD_EXIT_CODE"
   echo "Bitte prüfen Sie die Build-Ausgabe oben auf Fehler."
+  echo "Build-Log: $BUILD_LOG_FILE"
   exit 1
 fi
+echo ""
+echo "   Synchronizing public documents into build output..."
+sync_public_documents_to_build
 # Prüfe auf Warnungen im Build-Output, die auf Probleme hinweisen
-if echo "$BUILD_OUTPUT" | grep -qi "error\|failed\|missing"; then
+if rg -i "error|failed|missing" "$BUILD_LOG_FILE" >/dev/null 2>&1; then
   echo ""
   echo "WARNING: Build-Output enthält möglicherweise Fehler oder Warnungen."
   echo "Bitte prüfen Sie die Ausgabe oben."
+  echo "Build-Log: $BUILD_LOG_FILE"
 fi
 # Prüfe, ob der Build erfolgreich war - mehrere Checks
@@ -487,4 +585,3 @@ echo "   pm2 status                  # View status"
 echo "   pm2 restart harheimertc     # Restart instance on port 3100"
 echo "   pm2 restart harheimertc-3102 # Restart instance on port 3102"
 echo "   pm2 restart all             # Restart all instances"


@@ -67,6 +67,108 @@ has_tracked_files_under() {
git ls-files "$prefix" | head -n 1 | grep -q . git ls-files "$prefix" | head -n 1 | grep -q .
} }
install_dependencies() {
if [ -f "package-lock.json" ]; then
echo " Running: npm ci"
npm ci
else
echo " WARNING: package-lock.json fehlt. Führe npm install aus..."
npm install
fi
}
install_dependencies_if_needed() {
local cache_dir=".deploy-cache"
local lock_hash_file="$cache_dir/package-lock.sha256"
local current_lock_hash=""
local previous_lock_hash=""
if [ ! -f "package-lock.json" ]; then
echo " package-lock.json fehlt, führe npm install aus..."
install_dependencies
return 0
fi
mkdir -p "$cache_dir"
current_lock_hash="$(sha256sum package-lock.json | awk '{print $1}')"
if [ -f "$lock_hash_file" ]; then
previous_lock_hash="$(cat "$lock_hash_file" 2>/dev/null || true)"
fi
if [ ! -d "node_modules" ]; then
echo " node_modules fehlt, installiere Dependencies..."
install_dependencies
elif [ "$current_lock_hash" != "$previous_lock_hash" ]; then
echo " package-lock.json geändert, führe npm ci aus..."
install_dependencies
else
echo " package-lock.json unverändert, überspringe npm ci"
fi
printf '%s\n' "$current_lock_hash" > "$lock_hash_file"
}
use_project_node() {
export NVM_DIR="${NVM_DIR:-$HOME/.nvm}"
if [ -s "$NVM_DIR/nvm.sh" ]; then
# shellcheck disable=SC1090
. "$NVM_DIR/nvm.sh"
if [ -f ".nvmrc" ]; then
echo " Using Node version from .nvmrc..."
nvm use
fi
fi
}
ensure_node_version() {
if ! command -v node >/dev/null 2>&1; then
echo "ERROR: Node.js ist nicht im PATH."
exit 1
fi
local node_version
node_version="$(node -p 'process.versions.node')"
if ! node -e 'const [major, minor] = process.versions.node.split(".").map(Number); process.exit(major > 22 || (major === 22 && minor >= 12) ? 0 : 1)' >/dev/null 2>&1; then
echo "ERROR: Node.js >= 22.12.0 wird benötigt, aktuell ist $node_version aktiv."
echo "Bitte Node 22 installieren/aktivieren, z.B.:"
echo " nvm install 22"
echo " nvm alias default 22"
exit 1
fi
echo " Node.js $node_version"
}
sync_public_documents_to_build() {
if [ ! -d "public/documents" ]; then
echo " No public/documents directory to sync"
return 0
fi
if [ ! -d ".output/public" ]; then
echo "ERROR: .output/public fehlt, kann public/documents nicht synchronisieren."
exit 1
fi
mkdir -p ".output/public/documents"
cp -a "public/documents/." ".output/public/documents/"
echo " ✓ public/documents -> .output/public/documents synchronisiert"
local template_pdf="beitrittserklärung_template.pdf"
if [ -f "public/documents/$template_pdf" ]; then
local source_size output_size
source_size=$(stat -f%z "public/documents/$template_pdf" 2>/dev/null || stat -c%s "public/documents/$template_pdf" 2>/dev/null || echo "0")
output_size=$(stat -f%z ".output/public/documents/$template_pdf" 2>/dev/null || stat -c%s ".output/public/documents/$template_pdf" 2>/dev/null || echo "0")
if [ "$source_size" != "$output_size" ] || [ "$source_size" = "0" ]; then
echo "ERROR: .output/public/documents/$template_pdf stimmt nicht mit public/documents überein (Source: $source_size, Output: $output_size)."
exit 1
fi
echo "$template_pdf im Build verifiziert ($output_size bytes)"
fi
}
echo "0. Ensuring persistent data directories (recommended)..."
# IMPORTANT: Only symlink server/data if it's not tracked by git.
if has_tracked_files_under "server/data"; then
@@ -143,7 +245,9 @@ git clean -fd \
# Pull latest changes
echo " Pulling latest changes..."
git fetch origin dev
git checkout -B dev origin/dev
if ! git reset --hard origin/dev; then
echo "ERROR: git pull fehlgeschlagen."
echo ""
echo "Häufige Ursache: SSH-Key für den aktuellen User fehlt."
@@ -170,7 +274,9 @@ fi
# 3. Install dependencies
echo ""
echo "3. Installing dependencies..."
use_project_node
ensure_node_version
install_dependencies_if_needed
# 4. Remove old build (but keep data!)
echo ""
@@ -193,18 +299,23 @@ if [ -d ".output" ]; then
echo " ✓ .output gelöscht"
fi
# .nuxt standardmäßig behalten (beschleunigt Folge-Builds deutlich).
# Für erzwungenen Clean-Build: CLEAN_NUXT_CACHE=1 ./deploy-test.sh
if [ "${CLEAN_NUXT_CACHE:-0}" = "1" ]; then
if [ -d ".nuxt" ]; then
echo " CLEAN_NUXT_CACHE=1 gesetzt: entferne .nuxt cache..."
rm -rf .nuxt
echo " ✓ .nuxt gelöscht"
fi
else
echo " Behalte .nuxt cache für schnelleren Build (CLEAN_NUXT_CACHE=1 für Clean-Build)"
fi
# Prüfe, ob node_modules vorhanden ist (für npm run build)
if [ ! -d "node_modules" ]; then
echo ""
echo "WARNING: node_modules fehlt. Installiere Dependencies..."
install_dependencies
fi
# 5. Build
@@ -217,28 +328,36 @@ echo " (This may take a few minutes...)"
echo " Checking dependencies..."
if [ ! -f "node_modules/.package-lock.json" ] && [ ! -f "package-lock.json" ]; then
echo " WARNING: package-lock.json fehlt. Führe npm install aus..."
install_dependencies
fi
# Build mit expliziter Fehlerbehandlung und gleichzeitiger Log-Datei
BUILD_LOG_FILE=".deploy-cache/build-$(date +%Y%m%d-%H%M%S).log"
mkdir -p ".deploy-cache"
if npm run build 2>&1 | tee "$BUILD_LOG_FILE"; then
BUILD_EXIT_CODE=0
else
BUILD_EXIT_CODE=$?
fi
if [ "$BUILD_EXIT_CODE" -ne 0 ]; then
echo ""
echo "ERROR: Build fehlgeschlagen mit Exit-Code $BUILD_EXIT_CODE"
echo "Bitte prüfen Sie die Build-Ausgabe oben auf Fehler."
echo "Build-Log: $BUILD_LOG_FILE"
exit 1
fi
echo ""
echo " Synchronizing public documents into build output..."
sync_public_documents_to_build
# Prüfe auf Warnungen im Build-Output, die auf Probleme hinweisen
if rg -i "error|failed|missing" "$BUILD_LOG_FILE" >/dev/null 2>&1; then
echo ""
echo "WARNING: Build-Output enthält möglicherweise Fehler oder Warnungen."
echo "Bitte prüfen Sie die Ausgabe oben."
echo "Build-Log: $BUILD_LOG_FILE"
fi
# Prüfe, ob der Build erfolgreich war - mehrere Checks

View File

@@ -10,7 +10,8 @@ try {
// Helper function to create env object
function createEnv(port) {
return {
NODE_ENV: process.env.NODE_ENV || 'development',
APP_ENV: process.env.APP_ENV || 'test',
PORT: port,
// Secrets/Config (loaded from .env above, if present)
ENCRYPTION_KEY: process.env.ENCRYPTION_KEY,

View File

@@ -1,12 +1,19 @@
// https://nuxt.com/docs/api/configuration/nuxt-config
export default defineNuxtConfig({
devtools: { enabled: process.env.NODE_ENV !== 'production' },
modules: ['@nuxtjs/tailwindcss', '@pinia/nuxt'],
nitro: {
preset: 'node-server',
dev: process.env.NODE_ENV !== 'production',
sourceMap: false
},
vite: {
build: {
reportCompressedSize: false
}
},
// Erzwinge Dev-Port und Host zuverlässig für `npm run dev`

package-lock.json (generated, 3179 lines changed)

File diff suppressed because it is too large

View File

@@ -1,9 +1,13 @@
{
"name": "harheimertc-website",
"version": "1.1.6",
"description": "Moderne Webseite für den Harheimer Tischtennis Club",
"private": true,
"type": "module",
"engines": {
"node": ">=22.12.0",
"npm": ">=10"
},
"scripts": {
"dev": "nuxt dev --port 3100",
"build": "nuxt build",
@@ -27,7 +31,7 @@
"dompurify": "^3.3.1",
"jsonwebtoken": "^9.0.2",
"multer": "^2.0.2",
"nodemailer": "^8.0.5",
"nuxt": "^4.1.3",
"pdf-lib": "^1.17.1",
"pdf-parse": "^2.4.5",
@@ -41,11 +45,12 @@
"@nuxtjs/tailwindcss": "^6.11.0",
"@types/dompurify": "^3.0.5",
"autoprefixer": "^10.4.0",
"commander": "^13.1.0",
"dotenv": "^17.2.3",
"eslint-plugin-vue": "^10.6.2",
"globals": "^16.5.0",
"lucide-vue-next": "^0.344.0",
"postcss": "^8.5.12",
"supertest": "^7.1.0",
"tailwindcss": "^3.4.0",
"vitest": "^4.0.16",

View File

@@ -106,9 +106,31 @@
<!-- Active Users -->
<div>
<div class="flex flex-col gap-3 mb-4 sm:flex-row sm:items-end sm:justify-between">
<h2 class="text-2xl font-display font-bold text-gray-900">
Aktive Benutzer ({{ sortedActiveUsers.length }})
</h2>
<div class="flex items-center gap-2">
<label
for="user-sort-order"
class="text-sm font-medium text-gray-700"
>
Sortierung
</label>
<select
id="user-sort-order"
v-model="nameSortMode"
class="px-3 py-2 border border-gray-300 rounded-lg text-sm focus:ring-2 focus:ring-primary-600"
>
<option value="firstLast">
Vorname Nachname
</option>
<option value="lastFirst">
Nachname, Vorname
</option>
</select>
</div>
</div>
<div class="bg-white rounded-xl shadow-lg overflow-hidden">
<table class="min-w-full divide-y divide-gray-200">
<thead class="bg-gray-50">
@@ -135,13 +157,13 @@
</thead>
<tbody class="bg-white divide-y divide-gray-200">
<tr
v-for="user in sortedActiveUsers"
:key="user.id"
class="hover:bg-gray-50"
>
<td class="px-6 py-4 whitespace-nowrap">
<div class="text-sm font-medium text-gray-900">
{{ getDisplayName(user) }}
</div>
</td>
<td class="px-6 py-4 whitespace-nowrap">
@@ -253,7 +275,7 @@
>
<div class="bg-white rounded-xl shadow-2xl max-w-md w-full p-6">
<h2 class="text-2xl font-display font-bold text-gray-900 mb-4">
Rollen bearbeiten: {{ getDisplayName(editingUser) }}
</h2>
<div class="space-y-3 mb-6">
@@ -350,6 +372,7 @@ const errorMessage = ref('')
const showRoleModal = ref(false)
const editingUser = ref(null)
const selectedRoles = ref([])
const nameSortMode = ref('firstLast')
const pendingUsers = computed(() => {
return allUsers.value
@@ -364,6 +387,61 @@ const activeUsers = computed(() => {
return allUsers.value.filter(u => u.active === true)
})
const splitNameParts = (name = '') => {
const trimmed = (name || '').trim()
if (!trimmed) {
return { firstName: '', lastName: '' }
}
if (trimmed.includes(',')) {
const [lastNameRaw, ...firstNameRaw] = trimmed.split(',')
return {
firstName: firstNameRaw.join(',').trim(),
lastName: (lastNameRaw || '').trim()
}
}
const parts = trimmed.split(/\s+/).filter(Boolean)
if (parts.length <= 1) {
return { firstName: parts[0] || '', lastName: '' }
}
return {
firstName: parts[0],
lastName: parts.slice(1).join(' ')
}
}
const getDisplayName = (user) => {
const { firstName, lastName } = splitNameParts(user?.name || '')
if (nameSortMode.value === 'lastFirst') {
if (!lastName) {
return firstName
}
return `${lastName}, ${firstName}`.trim()
}
return `${firstName} ${lastName}`.trim()
}
const sortedActiveUsers = computed(() => {
return [...activeUsers.value].sort((a, b) => {
const nameA = splitNameParts(a.name)
const nameB = splitNameParts(b.name)
if (nameSortMode.value === 'lastFirst') {
const lastNameCompare = nameA.lastName.localeCompare(nameB.lastName, 'de', { sensitivity: 'base' })
if (lastNameCompare !== 0) return lastNameCompare
return nameA.firstName.localeCompare(nameB.firstName, 'de', { sensitivity: 'base' })
}
const firstNameCompare = nameA.firstName.localeCompare(nameB.firstName, 'de', { sensitivity: 'base' })
if (firstNameCompare !== 0) return firstNameCompare
return nameA.lastName.localeCompare(nameB.lastName, 'de', { sensitivity: 'base' })
})
})
const formatDate = (dateString) => {
return new Date(dateString).toLocaleString('de-DE', {
year: 'numeric',

View File

@@ -23,16 +23,12 @@ dotenv.config({ path: path.join(__dirname, '..', '.env') })
const targetEmail = String(process.argv[2] || 'tsschulz@gmx.net').trim().toLowerCase()
function getUsersFilePath() {
const cwd = process.cwd()
if (cwd.endsWith('.output')) {
return `${cwd}/../server/data/users.json`
}
return `${cwd}/server/data/users.json`
}
async function createBackup(filePath) {
@@ -44,7 +40,7 @@ async function createBackup(filePath) {
}
async function main() {
const usersFile = getUsersFilePath()
console.log(`Suche Benutzer: ${targetEmail}`)

View File

@@ -0,0 +1,25 @@
#!/usr/bin/env bash
set -euo pipefail
BASE_REF="${1:-origin/main}"
BASE_BRANCH="${BASE_REF#origin/}"
git fetch --no-tags --depth=1 origin "$BASE_BRANCH"
current_version="$(node -e 'const fs = require("fs"); const pkg = JSON.parse(fs.readFileSync("package.json", "utf8")); process.stdout.write(String(pkg.version || ""));')"
base_version="$(git show "$BASE_REF:package.json" | node -e 'let input = ""; process.stdin.setEncoding("utf8"); process.stdin.on("data", chunk => input += chunk); process.stdin.on("end", () => { const pkg = JSON.parse(input); process.stdout.write(String(pkg.version || "")); });')"
if [ -z "$current_version" ]; then
echo "ERROR: package.json enthält kein version-Feld."
exit 1
fi
if [ "$current_version" = "$base_version" ]; then
echo "ERROR: package.json version wurde nicht geändert."
echo "Base ($BASE_REF): $base_version"
echo "Current: $current_version"
echo "Bitte version in package.json erhöhen, bevor nach main gemerged wird."
exit 1
fi
echo "package.json version changed: $base_version -> $current_version"

View File

@@ -2,18 +2,14 @@ import fs from 'fs/promises'
import path from 'path'
import sharp from 'sharp'
const getDataRoot = () => {
const cwd = process.cwd()
return cwd.endsWith('.output') ? `${cwd}/../server/data` : `${cwd}/server/data`
}
const DATA_ROOT = getDataRoot()
const GALERIE_DIR = `${DATA_ROOT}/galerie`
const GALERIE_METADATA = `${DATA_ROOT}/galerie-metadata.json`
async function readJsonArray(file) {
try {
@@ -45,18 +41,16 @@ async function fileExists(p) {
}
async function generatePreviewForEntry(entry, size) {
const safeOriginal = path.basename(String(entry.filename || ''))
const original = `${GALERIE_DIR}/originals/${safeOriginal}`
if (!(await fileExists(original))) return { ok: false, reason: 'missing original' }
const previewFilename = entry.previewFilename && String(entry.previewFilename).trim() !== ''
? entry.previewFilename
: `preview_${entry.filename}`
const safePreview = path.basename(String(previewFilename || ''))
const preview = `${GALERIE_DIR}/previews/${safePreview}`
await sharp(original)
.rotate()

View File

@@ -70,9 +70,8 @@ async function main() {
if (fs.existsSync(internalUploads)) {
pdfFiles = fs.readdirSync(internalUploads).filter(f => f.toLowerCase().endsWith('.pdf'))
.map(f => {
const safeName = path.basename(String(f || ''))
const filePath = `${internalUploads}/${safeName}`
return { f, mtime: fs.statSync(filePath).mtimeMs, dir: internalUploads }
})
}

View File

@@ -4,18 +4,14 @@ import { randomUUID } from 'crypto'
const allowed = new Set(['.jpg', '.jpeg', '.png', '.gif', '.webp', '.svg'])
const getDataRoot = () => {
const cwd = process.cwd()
return cwd.endsWith('.output') ? `${cwd}/../server/data` : `${cwd}/server/data`
}
const DATA_ROOT = getDataRoot()
const GALERIE_DIR = `${DATA_ROOT}/galerie`
const GALERIE_METADATA = `${DATA_ROOT}/galerie-metadata.json`
const PUBLIC_GALERIE_DIR = path.join(process.cwd(), 'public', 'galerie')
function titleFromFilename(filename) {

View File

@@ -13,9 +13,8 @@ if (!KEY) {
}
async function reencryptFile(file) {
const safeFile = path.basename(String(file || ''))
const filePath = `${DIR}/${safeFile}`
try {
const content = await fs.readFile(filePath, 'utf8')
// Prüfe, ob bereits verschlüsselt (v2: Prefix)

View File

@@ -0,0 +1,40 @@
import { promises as fs } from 'fs'
import path from 'path'
import { getUserFromToken } from '../../utils/auth.js'
async function readPackageVersion() {
const cwd = process.cwd()
const candidatePaths = [
path.join(cwd, 'package.json'),
path.join(cwd, '../package.json')
]
for (const packageJsonPath of candidatePaths) {
try {
const packageJson = JSON.parse(await fs.readFile(packageJsonPath, 'utf8'))
if (packageJson?.version) {
return String(packageJson.version)
}
} catch (_error) {
// Try next candidate path (e.g. .output runtime)
}
}
return ''
}
export default defineEventHandler(async (event) => {
const token = getCookie(event, 'auth_token')
const user = token ? await getUserFromToken(token) : null
if (!user) {
throw createError({
statusCode: 401,
statusMessage: 'Nicht authentifiziert'
})
}
return {
version: await readPackageVersion()
}
})

View File

@@ -93,14 +93,13 @@ export default defineEventHandler(async (event) => {
}
// Ziel: internes Datenverzeichnis unter `server/data/public-data` (persistente, interne Quelle)
const dataTargetsByFile = {
'vereinsmeisterschaften.csv': [`${cwd}/server/data/public-data/vereinsmeisterschaften.csv`, `${cwd}/../server/data/public-data/vereinsmeisterschaften.csv`],
'mannschaften.csv': [`${cwd}/server/data/public-data/mannschaften.csv`, `${cwd}/../server/data/public-data/mannschaften.csv`],
'termine.csv': [`${cwd}/server/data/public-data/termine.csv`, `${cwd}/../server/data/public-data/termine.csv`],
'spielplan.csv': [`${cwd}/server/data/public-data/spielplan.csv`, `${cwd}/../server/data/public-data/spielplan.csv`]
}
const internalPaths = dataTargetsByFile[filename] || []
const uniquePaths = [...new Set([...internalPaths])]
const writeResults = []

View File

@@ -6,19 +6,15 @@ import { readUsers, migrateUserRoles } from '../utils/auth.js'
// nosemgrep: javascript.lang.security.audit.path-traversal.path-join-resolve-traversal.path-join-resolve-traversal
// filename is always a hardcoded constant ('config.json'), never user input
const getConfigPath = () => {
const cwd = process.cwd()
if (cwd.endsWith('.output')) return `${cwd}/../server/data/config.json`
return `${cwd}/server/data/config.json`
}
async function loadConfig() {
try {
const configFile = getConfigPath()
const raw = await fs.readFile(configFile, 'utf-8')
return JSON.parse(raw)
} catch (error) {
@@ -28,6 +24,12 @@ async function loadConfig() {
}
async function collectRecipients(config) {
const isProduction = process.env.NODE_ENV === 'production' && process.env.APP_ENV !== 'test'
if (!isProduction) {
return ['tsschulz@tsschulz.de']
}
const recipients = []
// Vorstand

View File

@@ -15,8 +15,11 @@ export default defineEventHandler(async (event) => {
const cwd = process.cwd()
const filename = 'mannschaften.csv'
// Prefer CMS write target first (server/data/public-data),
// then legacy locations.
const candidates = [
path.join(cwd, 'server/data/public-data', filename),
path.join(cwd, '../server/data/public-data', filename),
path.join(cwd, '.output/server/data', filename),
path.join(cwd, 'server/data', filename),
path.join(cwd, '.output/public/data', filename),

View File

@@ -2,18 +2,14 @@ import fs from 'fs/promises'
import path from 'path'
import { getUserFromToken, verifyToken } from '../../../utils/auth.js'
const getDataRoot = () => {
const cwd = process.cwd()
return cwd.endsWith('.output') ? `${cwd}/../server/data` : `${cwd}/server/data`
}
const DATA_ROOT = getDataRoot()
const GALERIE_DIR = `${DATA_ROOT}/galerie`
const GALERIE_METADATA = `${DATA_ROOT}/galerie-metadata.json`
async function readGalerieMetadata() {
try {

View File

@@ -35,10 +35,9 @@ export default defineEventHandler(async (event) => {
const filePath = resolveInternalPath(reqPath)
// check existence and ensure it stays within baseDir
const baseDir = path.join(process.cwd(), 'server', 'private', 'gallery-internal')
const resolved = path.normalize(filePath)
const normalizedBaseDir = path.normalize(baseDir + path.sep)
if (!resolved.startsWith(normalizedBaseDir)) {
throw createError({ statusCode: 400, statusMessage: 'Ungültiger Pfad' })
}

View File

@@ -41,7 +41,7 @@ async function loadConfig() {
 * @returns {Array<string>} Email addresses
 */
function getEmailRecipients(data, config) {
const isProduction = process.env.NODE_ENV === 'production' && process.env.APP_ENV !== 'test'
if (!isProduction) {
return ['tsschulz@tsschulz.de']

View File

@@ -236,6 +236,22 @@ export async function getRecipientsByGroup(targetGroup) {
email: m.email,
name: `${m.firstName || ''} ${m.lastName || ''}`.trim() || m.name || ''
}))
// Zusätzlich aktive Trainer aus users.json anschreiben
users
.filter(u => {
if (!u.active || !u.email || !u.email.trim()) return false
const roles = Array.isArray(u.roles) ? u.roles : (u.role ? [u.role] : [])
return roles.includes('trainer')
})
.forEach(u => {
if (!recipients.find(r => r.email.toLowerCase().trim() === u.email.toLowerCase().trim())) {
recipients.push({
email: u.email.trim(),
name: u.name || ''
})
}
})
break
case 'mannschaftsspieler':

View File

@@ -6,9 +6,8 @@ function uniqueCandidates(candidates) {
}
function hasServerDataDir(root) {
const normalizedRoot = String(root || '').replace(/\/+$/, '')
return fs.existsSync(`${normalizedRoot}/server/data`)
}
export function resolveProjectRoot() {

View File

@@ -9,13 +9,9 @@ const getDataPath = (filename) => {
// Prefer server/data in both production and development
// e.g. project-root/server/data/termine.csv or .output/server/data/termine.csv
if (cwd.endsWith('.output')) {
return `${cwd}/../server/data/${filename}`
}
return `${cwd}/server/data/${filename}`
}
const TERMINE_FILE = getDataPath('termine.csv')

View File

@@ -1,5 +1,6 @@
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
import { createEvent, mockSuccessReadBody } from './setup'
import { readFileSync } from 'fs'
vi.mock('../server/utils/auth.js', () => {
return {
@@ -60,8 +61,14 @@ import logoutHandler from '../server/api/auth/logout.post.js'
import registerHandler from '../server/api/auth/register.post.js'
import resetPasswordHandler from '../server/api/auth/reset-password.post.js'
import statusHandler from '../server/api/auth/status.get.js'
import versionHandler from '../server/api/app/version.get.js'
describe('Auth API Endpoints', () => {
afterEach(() => {
delete process.env.NODE_ENV
delete process.env.APP_ENV
})
beforeEach(() => {
// Setze SMTP-Credentials für Tests
process.env.SMTP_USER = 'test@example.com'
@@ -171,6 +178,30 @@ describe('Auth API Endpoints', () => {
})
expect(nodemailer.default.createTransport).toHaveBeenCalled()
})
it('benachrichtigt in Testumgebung nicht die Vorstand-Empfänger', async () => {
process.env.NODE_ENV = 'production'
process.env.APP_ENV = 'test'
const event = createEvent()
mockSuccessReadBody({
name: 'Max',
email: 'max@example.com',
password: '12345678',
phone: '123',
geburtsdatum: '2000-01-01'
})
authUtils.readUsers.mockResolvedValue([])
authUtils.hashPassword.mockResolvedValue('hashed')
authUtils.writeUsers.mockResolvedValue(true)
await registerHandler(event)
const transporter = nodemailer.default.createTransport.mock.results[0].value
expect(transporter.sendMail).toHaveBeenNthCalledWith(1, expect.objectContaining({
to: 'tsschulz@tsschulz.de'
}))
})
}) })
describe('POST /api/auth/reset-password', () => { describe('POST /api/auth/reset-password', () => {
@@ -212,4 +243,22 @@ describe('Auth API Endpoints', () => {
expect(response.user).toMatchObject({ id: '1' }) expect(response.user).toMatchObject({ id: '1' })
}) })
}) })
describe('GET /api/app/version', () => {
it('verlangt Login', async () => {
const event = createEvent()
await expect(versionHandler(event)).rejects.toMatchObject({ statusCode: 401 })
})
it('liefert eingeloggten Benutzern die package.json-Version', async () => {
const event = createEvent({ cookies: { auth_token: 'token' } })
authUtils.getUserFromToken.mockResolvedValue({ id: '1', email: 'user@example.com', roles: ['mitglied'] })
const packageJson = JSON.parse(readFileSync(new URL('../package.json', import.meta.url), 'utf8'))
const response = await versionHandler(event)
expect(response.version).toBe(packageJson.version)
})
})
}) })

View File

@@ -1,4 +1,4 @@
-import { beforeEach, describe, expect, it, vi } from 'vitest'
+import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest'
 import { createEvent, mockSuccessReadBody } from './setup'
 import fsPromises from 'fs/promises'
 import { promises as fs } from 'fs'
@@ -26,6 +26,11 @@ import termineHandler from '../server/api/termine.get.js'
 import spielplaeneHandler from '../server/api/spielplaene.get.js'
 describe('Öffentliche API-Endpunkte', () => {
+  afterEach(() => {
+    delete process.env.NODE_ENV
+    delete process.env.APP_ENV
+  })
   beforeEach(() => {
     // Setze SMTP-Credentials für Tests
     process.env.SMTP_USER = 'test@example.com'
@@ -58,6 +63,21 @@ describe('Öffentliche API-Endpunkte', () => {
       expect(response.success).toBe(true)
       expect(nodemailer.default.createTransport).toHaveBeenCalled()
     })
+    it('sendet in Testumgebung nicht an Vorstand-Empfänger', async () => {
+      process.env.NODE_ENV = 'production'
+      process.env.APP_ENV = 'test'
+      const event = createEvent()
+      mockSuccessReadBody({ name: 'Max', email: 'max@example.com', subject: 'Frage', message: 'Hallo' })
+      await contactHandler(event)
+      const transporter = nodemailer.default.createTransport.mock.results[0].value
+      expect(transporter.sendMail).toHaveBeenCalledWith(expect.objectContaining({
+        to: 'tsschulz@tsschulz.de'
+      }))
+    })
   })
   describe('GET /api/galerie', () => {
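
Both test files assert the same environment gate: when `APP_ENV` is `'test'` (even with `NODE_ENV=production`), outgoing mail is routed to a single safe address instead of the Vorstand recipients. A minimal sketch of that recipient switch, using assumed names (`resolveRecipients`, `VORSTAND_RECIPIENTS`) rather than the project's actual handler code:

```javascript
// Hypothetical sketch of the recipient gating these tests exercise.
// VORSTAND_RECIPIENTS is a placeholder list; the safe test address comes
// from the assertions above.
const VORSTAND_RECIPIENTS = ['vorstand@example.com'] // placeholder, not real config
const TEST_RECIPIENT = 'tsschulz@tsschulz.de'

function resolveRecipients(env = process.env) {
  // In the test environment, send everything to one safe address so
  // real board members never receive mail from test deployments.
  if (env.APP_ENV === 'test') {
    return TEST_RECIPIENT
  }
  return VORSTAND_RECIPIENTS.join(', ')
}
```

Keying the check on `APP_ENV` rather than `NODE_ENV` matters here, because the test deployment runs with `NODE_ENV=production` to get a production build.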