refactor(exercises): standardize answer language handling across exercise scripts
Deploy to production / deploy (push) Successful in 2m48s

- Introduced a mechanism to infer each exercise's answer language from its question phrasing, making the exercise data consistent across scripts.
- Updated question wording to clarify the intent of each exercise, improving user understanding and engagement.
- Streamlined the exercise-generation code for better maintainability and clarity.
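The inference described above could look like the following sketch. The function name `inferAnswerLanguage` and the fallback behavior are assumptions for illustration; the phrasing patterns are taken from the question templates in the diff below, where "Wie sagt man … auf Bisaya?" expects a target-language answer and "Was bedeutet …?" expects a native-language answer.

```javascript
// Hypothetical helper: derive the answerLanguage field from the
// question template, matching the two phrasings used in the diff.
function inferAnswerLanguage(question) {
  // "Wie sagt man '…' auf Bisaya?" asks for the Bisaya (target) word.
  if (/auf Bisaya\?$/.test(question)) return 'target';
  // "Was bedeutet '…'?" asks for the native-language meaning.
  if (/^Was bedeutet /.test(question)) return 'native';
  // Assumed fallback: default to the target language.
  return 'target';
}
```

Centralizing the mapping this way keeps each exercise script from hard-coding the field separately, which is the consistency this commit is after.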
This commit is contained in:
Torsten Schulz (local)
2026-04-07 14:32:44 +02:00
parent 160c9dafb2
commit ebb2283646
7 changed files with 107 additions and 6 deletions

@@ -582,6 +582,7 @@ async function updateFoodCareExercises() {
instruction: 'Wähle die richtige Übersetzung.',
questionData: JSON.stringify({
type: 'multiple_choice',
answerLanguage: 'target',
question: `Wie sagt man "${conv.native}" auf Bisaya?`,
options: [
conv.bisaya,
@@ -608,6 +609,7 @@ async function updateFoodCareExercises() {
instruction: 'Wähle die richtige Übersetzung.',
questionData: JSON.stringify({
type: 'multiple_choice',
answerLanguage: 'native',
question: `Was bedeutet "${conv.bisaya}"?`,
options: [
conv.native,