
chore: migrate live session data to live quiz table #4363

Merged · 12 commits · Nov 10, 2024

Conversation


@sjschlapbach (Member) commented Nov 6, 2024

To test the scripts, dump the current production database, load it into the local setup, apply the latest migration and then run the two scripts. Note that they can be executed independently and out of order.

Migrate status attributes

pnpm run script src/scripts/2024-11-07_migrate_unified_enums.ts

Migrate live session to live quiz and corresponding child components

pnpm run script src/scripts/2024-11-07_migrate_live_session_to_live_quiz.ts

Summary by CodeRabbit

Release Notes

  • New Features

    • Introduced a migration process for transitioning existing live sessions to a new live quiz format.
    • Added support for new unified enum types in the database for better data categorization.
  • Database Changes

    • Updated the schema to accommodate new fields and enums, enhancing data structure and integrity.
    • Implemented a new ElementBlock table to support the live quiz format.
    • Updated handling of Redis keys to align with new naming conventions for live quizzes.
  • Documentation

    • Added comments indicating future migration plans and deprecated fields for clarity.


aviator-app bot commented Nov 6, 2024

This PR was merged manually (without Aviator).


coderabbitai bot commented Nov 6, 2024


Walkthrough

This pull request introduces scripts to migrate existing live sessions and enum types to a new database schema for live quizzes. The changes include the transformation of data structures, the addition of new fields and enums, and the adjustment of existing models in the Prisma schema. Key functions are implemented to handle the migration of data from old formats to new ones, ensuring compatibility and integrity during the transition.
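The enum migration described here boils down to a one-way mapping from old to new enum values. A minimal standalone sketch of that step, using string unions in place of the generated Prisma enums (the union types are an assumption; the value names appear in the migration script reviewed below):

```typescript
// Sketch of the ElementStackType -> ElementStackTypeNew mapping. Unknown
// legacy values map to undefined so they can be flagged for manual review.
type ElementStackTypeNew = 'GROUP_ACTIVITY' | 'MICROLEARNING' | 'PRACTICE_QUIZ'

function mapStackType(oldType: string): ElementStackTypeNew | undefined {
  switch (oldType) {
    case 'GROUP_ACTIVITY':
      return 'GROUP_ACTIVITY'
    case 'MICROLEARNING':
      return 'MICROLEARNING'
    case 'PRACTICE_QUIZ':
      return 'PRACTICE_QUIZ'
    default:
      return undefined
  }
}

console.log(mapStackType('PRACTICE_QUIZ')) // PRACTICE_QUIZ
```

In the real script, the resulting value is then written back to a parallel `typeNEW` column rather than overwriting the old field, which keeps the migration reversible until cleanup.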

Changes

File changes:

  • packages/graphql/src/scripts/2024-11-07_migrate_live_session_to_live_quiz.ts: Added functions for migrating live sessions to the live quiz format, including data transformation and error handling.
  • packages/graphql/src/scripts/2024-11-07_migrate_unified_enums.ts: Implemented migration of old enum types to new unified enums in the database.
  • packages/prisma/src/prisma/migrations/20241107142334_live_quiz_blocks/migration.sql: Significant schema changes, including new columns, dropped old columns, and new tables and constraints.
  • packages/prisma/src/prisma/schema/element.prisma: Updated the Element schema with new enums and fields to accommodate new functionality.
  • packages/prisma/src/prisma/schema/quiz.prisma: Added a new GroupActivityStatus enum and updated the GroupActivity model to reflect the new status handling.
  • apps/backend-docker/scripts/checkRedisConsistency.ts: Modified the Redis key pattern used for consistency checks.
  • apps/backend-docker/scripts/fixPointsInconsistency.ts: Updated Redis key prefixes for leaderboard data retrieval.
  • apps/func-response-processor/src/index.ts: Altered session key construction and enhanced error handling in the serviceBusTrigger function.
  • packages/graphql/src/services/liveQuizzes.ts: Changed Redis key prefixes for various live quiz operations.


Suggested reviewers

  • rschlaefli


@sjschlapbach changed the base branch from v3 to v3-new-live-quiz on November 6, 2024, 15:35

cypress bot commented Nov 6, 2024

klicker-uzh · Run #3624

Run status: Passed · Duration: 11m 27s
Commit: 7a818b7554 (Merge b80c20c977e790a7654511b0306eb04a550c4182 into dbbfe4154f2b144d9b310d104d0c...)
Project: klicker-uzh · Branch: live-quiz-migration · Committer: Julius Schlapbach

Test results: Failures 0 · Flaky 0 · Pending 0 · Skipped 0 · Passing 140

@sjschlapbach changed the title from "chore: migrate live session to live quiz" to "chore: migrate live session data to live quiz table" on Nov 6, 2024
@sjschlapbach marked this pull request as ready for review on November 7, 2024, 15:28

@coderabbitai bot left a comment


Caution

Inline review comments failed to post. This is likely due to GitHub's limits when posting large numbers of comments.

Actionable comments posted: 8

🧹 Outside diff range and nitpick comments (12)
packages/graphql/src/scripts/2024-11-07_migrate_unified_enums.ts (2)

1-11: Enhance script documentation with additional details.

While the current comments explain the purpose, consider adding:

  • Expected execution duration
  • Whether the script is idempotent
  • Any prerequisites or post-migration verification steps

74-74: Improve script execution handling.

The current implementation lacks proper script initialization and configuration options.

Consider this improvement:

-await run()
+if (require.main === module) {
+  run()
+    .then(() => {
+      console.log('Migration completed successfully')
+      process.exit(0)
+    })
+    .catch((error) => {
+      console.error('Migration failed:', error)
+      process.exit(1)
+    })
+}

Consider adding command-line arguments for:

  • Batch size
  • Dry run mode
  • Specific table migration
packages/prisma/src/prisma/migrations/20241107142334_live_quiz_blocks/migration.sql (1)

66-75: Consider indexing strategy for ElementBlock

The table design is solid, but consider the following improvements:

  1. The originalId column might benefit from an index if it's used for lookups
  2. The execution column with a default of 0 suggests a counter - consider adding a check constraint to ensure non-negative values

Consider adding:

-- Add index if originalId is used for lookups
CREATE INDEX "ElementBlock_originalId_idx" ON "ElementBlock"("originalId");

-- Add check constraint for execution counter
ALTER TABLE "ElementBlock" ADD CONSTRAINT "ElementBlock_execution_check" 
  CHECK (execution >= 0);
packages/prisma/src/prisma/schema/element.prisma (3)

Line range hint 6-14: Enhance documentation for new element types.

While the comment indicates these are "new types", it would be helpful to document the intended use cases for CONTENT and FLASHCARD types.


162-164: Ensure migration handles null typeNEW values.

The typeNEW field is optional during migration but will be required later. Consider:

  1. Adding a database constraint or validation to ensure all records have typeNEW populated before removing the old type field
  2. Documenting the migration steps in a separate migration guide
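A minimal sketch of the pre-cleanup check from point 1, with the Prisma query replaced by a plain array filter so it runs standalone (the record shape is illustrative; the real script would use a Prisma count on `{ where: { typeNEW: null } }`):

```typescript
// Before the old `type` field is dropped, verify that no record still has a
// null `typeNEW`. An in-memory array stands in for the database table here.
interface StackRecord {
  id: number
  typeNEW: string | null
}

function findUnmigrated(records: StackRecord[]): number[] {
  return records.filter((r) => r.typeNEW === null).map((r) => r.id)
}

const sample: StackRecord[] = [
  { id: 1, typeNEW: 'PRACTICE_QUIZ' },
  { id: 2, typeNEW: null },
]

const unmigrated = findUnmigrated(sample)
if (unmigrated.length > 0) {
  console.log(`Cannot drop the old field yet, unmigrated ids: ${unmigrated.join(', ')}`)
}
```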

Line range hint 1-214: Migration schema changes look well-structured.

The schema changes support a gradual migration strategy with:

  1. Parallel support for old and new structures
  2. Clear TODO markers for post-migration cleanup
  3. Proper field optionality during transition

However, consider:

  1. Creating a migration checklist document
  2. Adding validation steps before cleanup
  3. Including rollback procedures
packages/prisma/src/prisma/schema/quiz.prisma (1)

287-294: Track cleanup of temporary migration enum.

The GroupActivityStatus enum is marked as temporary for migration. To ensure this technical debt is addressed, consider creating a follow-up task to remove this enum once the migration is complete.

Would you like me to create a GitHub issue to track the cleanup of this temporary enum?

packages/graphql/src/scripts/2024-11-07_migrate_live_session_to_live_quiz.ts (5)

252-252: Address the TODO: Initialize Redis cache when content is updated

There's a TODO comment indicating that Redis cache initialization needs to be implemented when its content should be updated. Implementing this ensures cache consistency during the migration process.

Would you like assistance in implementing the Redis cache initialization, or should I open a new GitHub issue to track this task?


519-521: Address the TODO: Apply cache updates and cleanup

The code contains a TODO comment about applying cache updates and cleaning up old live session cache data. Implementing this is important to prevent data inconsistencies and potential memory leaks.

Would you like assistance in uncommenting and adjusting the cache update and cleanup logic, or should I open a new GitHub issue to track this task?


28-33: Make logging flags configurable via environment variables

Currently, the logging flags (logFakedElement, logQuestionDataConversion, etc.) are hardcoded. To improve flexibility and maintainability, consider making these flags configurable through environment variables.

Example implementation:

- const logFakedElement = false
- const logQuestionDataConversion = false
- const logResultsConversion = false
- const logInstanceConversion = false
+ const logFakedElement = process.env.LOG_FAKED_ELEMENT === 'true'
+ const logQuestionDataConversion = process.env.LOG_QUESTION_DATA_CONVERSION === 'true'
+ const logResultsConversion = process.env.LOG_RESULTS_CONVERSION === 'true'
+ const logInstanceConversion = process.env.LOG_INSTANCE_CONVERSION === 'true'

165-244: Refactor convertOldResults to reduce code duplication

The handling of ElementType.NUMERICAL and ElementType.FREE_TEXT in convertOldResults shares similar logic. Refactoring this code can reduce duplication and enhance maintainability.

Consider abstracting the common logic into a helper function that processes open-ended question types.
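One possible shape for that helper, sketched with illustrative types (the actual result format used by convertOldResults may differ):

```typescript
// Both open-ended branches (NUMERICAL and FREE_TEXT) tally free-form answers
// into value/count pairs, so the shared logic can live in a single helper.
interface OpenResponseResult {
  value: string
  count: number
}

function tallyOpenResponses(answers: string[]): OpenResponseResult[] {
  const counts = new Map<string, number>()
  for (const answer of answers) {
    counts.set(answer, (counts.get(answer) ?? 0) + 1)
  }
  return [...counts.entries()].map(([value, count]) => ({ value, count }))
}

// each ElementType branch then reduces to a single call
const numericalResults = tallyOpenResponses(['42', '42', '7'])
const freeTextResults = tallyOpenResponses(['yes', 'no', 'yes'])
```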


25-27: Update or remove outdated comments

The comments describing the migration (liveSession -> liveQuiz, sessionBlock -> elementBlock, etc.) might be outdated or no longer necessary. Updating or removing them can improve code clarity.

Consider revising the comments to accurately reflect the current migration process or removing them if they're redundant.

🛑 Comments failed to post (8)
packages/graphql/src/scripts/2024-11-07_migrate_unified_enums.ts (4)

12-14: 🛠️ Refactor suggestion

Add proper error handling and Prisma client lifecycle management.

The current implementation lacks proper error handling and client cleanup.

Consider this improvement:

 async function run() {
   const prisma = new PrismaClient()
+  try {
+    // existing migration logic
+  } catch (error) {
+    console.error('Migration failed:', error)
+    throw error
+  } finally {
+    await prisma.$disconnect()
+  }
 }

Committable suggestion skipped: line range outside the PR's diff.


41-71: 🛠️ Refactor suggestion

⚠️ Potential issue

Reconsider default status and improve GroupActivity migration performance.

Two concerns:

  1. Setting DRAFT as default status might not be appropriate for all cases
  2. Same performance issues as ElementStack migration (bulk fetching, single updates)

For the default status:

-    let newType: PublicationStatus = PublicationStatus.DRAFT
+    if (!oldType) {
+      console.warn(`GroupActivity ${groupActivity.id} has no status, skipping`)
+      continue
+    }
+    let newType: PublicationStatus

Apply the same batch processing improvements as suggested for ElementStack migration.

Committable suggestion skipped: line range outside the PR's diff.


15-39: 🛠️ Refactor suggestion

Improve migration performance and reliability for ElementStack updates.

The current implementation has several potential issues:

  1. Fetches all records at once, which could cause memory issues with large datasets
  2. Updates records one by one, which is inefficient
  3. Lacks progress tracking and error recovery

Consider this improved approach:

-  const elementStacks = await prisma.elementStack.findMany()
-  for (const elementStack of elementStacks) {
+  let processed = 0
+  const batchSize = 1000
+  
+  while (true) {
+    const elementStacks = await prisma.elementStack.findMany({
+      take: batchSize,
+      skip: processed,
+      where: { typeNEW: null }, // Only process unprocessed records
+    })
+    
+    if (elementStacks.length === 0) break
+    
+    await prisma.$transaction(
+      elementStacks.map((elementStack) => {
         const oldType = elementStack.type
         let newType: ElementStackTypeNew | undefined
 
         switch (oldType) {
           // ... existing switch cases ...
         }
 
-        await prisma.elementStack.update({
+        return prisma.elementStack.update({
           where: { id: elementStack.id },
           data: { typeNEW: newType },
         })
-      }
+      })
+    )
+    
+    processed += elementStacks.length
+    console.log(`Processed ${processed} ElementStack records`)
+  }
Committable suggestion (review and test carefully before committing):
  // element stack type to new column
  let processed = 0
  const batchSize = 1000
  
  while (true) {
    const elementStacks = await prisma.elementStack.findMany({
      take: batchSize,
      skip: processed,
      where: { typeNEW: null }, // Only process unprocessed records
    })
    
    if (elementStacks.length === 0) break
    
    await prisma.$transaction(
      elementStacks.map((elementStack) => {
        const oldType = elementStack.type
        let newType: ElementStackTypeNew | undefined

        switch (oldType) {
          case ElementStackType.GROUP_ACTIVITY:
            newType = ElementStackTypeNew.GROUP_ACTIVITY
            break
          case ElementStackType.MICROLEARNING:
            newType = ElementStackTypeNew.MICROLEARNING
            break
          case ElementStackType.PRACTICE_QUIZ:
            newType = ElementStackTypeNew.PRACTICE_QUIZ
            break
          default:
            break
        }

        return prisma.elementStack.update({
          where: { id: elementStack.id },
          data: { typeNEW: newType },
        })
      })
    )
    
    processed += elementStacks.length
    console.log(`Processed ${processed} ElementStack records`)
  }

1-74: 💡 Codebase verification

Add data validation, logging, and transaction safety to the migration script

Based on the analysis of similar migration scripts in the codebase, the current enum migration script needs several improvements for safer data migration:

  1. Add logging to track migration progress and identify potential issues
  2. Implement data validation before and after migration
  3. Use transactions to ensure atomic updates
  4. Add error handling with rollback capabilities

Example implementation:

// Before migration validation
console.log(`Starting migration. Found ${elementStacks.length} element stacks and ${groupActivities.length} group activities`)

// Wrap operations in transaction
await prisma.$transaction(async (tx) => {
  // Existing migration logic with added logging
  for (const elementStack of elementStacks) {
    console.log(`Migrating ElementStack ${elementStack.id}`)
    // ... migration logic
  }
  
  // Verify migration
  const unmigrated = await tx.elementStack.count({
    where: { typeNEW: null }
  })
  if (unmigrated > 0) throw new Error(`Found ${unmigrated} unmigrated records`)
})
🔗 Analysis chain

Verify data integrity before and after migration.

The script should include data verification steps to ensure safe migration.

Run these commands to check data consistency:

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Check for any records that might be missed in the migration
echo "Checking for ElementStack records without new type..."
ast-grep --pattern 'await prisma.elementStack.findMany({ where: { typeNEW: null } })'

echo "Checking for GroupActivity records without new status..."
ast-grep --pattern 'await prisma.groupActivity.findMany({ where: { statusNEW: null } })'

# Look for any error handling in similar migration scripts
echo "Checking for error handling patterns in other migration scripts..."
rg -l "catch.*error" "src/scripts/*migrate*.ts"

Length of output: 676


Script:

#!/bin/bash
# Let's try a different approach to analyze migration scripts and error handling patterns

# Find all migration scripts in the repository
echo "Looking for migration scripts..."
fd migrate

# Check for any error handling patterns in TypeScript files
echo "Checking error handling patterns in TypeScript files..."
rg "try.*catch" -A 2 --type ts

# Look for similar enum migrations
echo "Looking for enum-related migrations..."
rg -l "enum.*migration|migrate.*enum" --type ts

# Check the schema for these enums
echo "Checking Prisma schema for relevant enums..."
fd "schema.prisma" --exec cat {}

Length of output: 1321


Script:

#!/bin/bash
# Let's examine similar migration scripts for patterns and check the specific enums

# Check content of similar migration scripts
echo "Checking content of other migration scripts..."
cat "packages/prisma/src/scripts/2024-02-06_migrate_group_activity.ts"
cat "packages/prisma/src/scripts/2024-02-18_migrate_microlearning.ts"

# Look for transaction patterns in migration scripts
echo "Checking for transaction patterns..."
rg "prisma.\$transaction" --type ts

# Check for validation or logging in migration scripts
echo "Looking for validation or logging patterns..."
rg "console\.(log|error)" "packages/graphql/src/scripts/2024-11-07_migrate_unified_enums.ts"
rg "console\.(log|error)" "packages/prisma/src/scripts/*migrate*.ts"

Length of output: 13375

packages/prisma/src/prisma/migrations/20241107142334_live_quiz_blocks/migration.sql (1)

15-16: ⚠️ Potential issue

Address PostgreSQL 11 compatibility warning

The migration adds multiple enum values to PublicationStatus which, as noted in the comments, is not possible in a single migration for PostgreSQL 11 and earlier versions. This should be split into multiple migrations for backward compatibility.

Split the enum modifications into separate migrations:

-- Migration 1
ALTER TYPE "PublicationStatus" ADD VALUE 'ENDED';

-- Migration 2
ALTER TYPE "PublicationStatus" ADD VALUE 'GRADED';

Also applies to: 22-25

packages/graphql/src/scripts/2024-11-07_migrate_live_session_to_live_quiz.ts (3)

69-69: ⚠️ Potential issue

Fix typo in variable name elemnetId

The variable elemnetId appears to be a typo. It should be elementId for clarity and consistency.

Apply this diff to correct the typo:

- const elemnetId = questionData.id.split('-')[0]
+ const elementId = questionData.id.split('-')[0]

And update the reference:

- id: elemnetId,
+ id: elementId,

Also applies to: 74-74


64-66: ⚠️ Potential issue

Avoid logging potentially sensitive questionData

Logging questionData may expose sensitive information in the logs. It's advisable to remove or sanitize these logs to prevent potential security risks.

Apply this diff to remove the logging statements:

- console.log('QUESTION DATA')
- console.log(questionData)
Committable suggestion (review and test carefully before committing):

    throw new Error('Missing required question data properties')

69-70: ⚠️ Potential issue

Ensure questionData.id has the expected format before splitting

When splitting questionData.id, the code assumes it contains a '-' character. If questionData.id does not contain '-', this will cause an error. Consider adding validation or handling cases where the expected format is not met.

Apply this diff to add validation:

+ if (!questionData.id.includes('-')) {
+   throw new Error(`Invalid questionData.id format: ${questionData.id}`)
+ }
  const elementId = questionData.id.split('-')[0]
  const elementVersion = Number(questionData.id.split('-')[1].slice(1))
Committable suggestion (review and test carefully before committing):

  if (!questionData.id.includes('-')) {
    throw new Error(`Invalid questionData.id format: ${questionData.id}`)
  }
  const elemnetId = questionData.id.split('-')[0]
  const elementVersion = Number(questionData.id.split('-')[1].slice(1))


@coderabbitai bot left a comment


Caution

Inline review comments failed to post. This is likely due to GitHub's limits when posting large numbers of comments.

Actionable comments posted: 2

🧹 Outside diff range and nitpick comments (8)
apps/backend-docker/scripts/checkRedisConsistency.ts (1)

Line range hint 1-30: Consider enhancing error handling and cleanup capabilities.

The script could benefit from the following improvements:

  1. Error handling for Redis and Prisma operations
  2. Option to clean up invalid keys
  3. More detailed logging of affected quiz IDs

Consider applying these enhancements:

 async function run() {
   const prisma = new PrismaClient()
   const redisExec = new Redis({
     family: 4,
     host: process.env.REDIS_HOST ?? 'localhost',
     password: process.env.REDIS_PASS ?? '',
     port: Number(process.env.REDIS_PORT) ?? 6379,
     tls: process.env.REDIS_TLS ? {} : undefined,
   })
+  try {

     const quizzes = await prisma.liveQuiz.findMany({
       where: {
         status: {
           not: PublicationStatus.PUBLISHED,
         },
       },
     })

     let count = 0
+    const affectedQuizzes = [];

     for (const quiz of quizzes) {
       const invalidKeys = await redisExec.keys(`lq:${quiz.id}:*`)

       if (invalidKeys.length > 0) {
         count += invalidKeys.length
+        affectedQuizzes.push({
+          quizId: quiz.id,
+          keyCount: invalidKeys.length,
+          keys: invalidKeys
+        });
+        
+        // Optionally clean up invalid keys
+        if (process.env.CLEANUP === 'true') {
+          await redisExec.del(...invalidKeys);
+          console.log(`Cleaned up ${invalidKeys.length} keys for quiz ${quiz.id}`);
+        }
       }
     }

-    console.log(count)
+    console.log(`Found ${count} invalid keys across ${affectedQuizzes.length} quizzes`);
+    if (affectedQuizzes.length > 0) {
+      console.log('Affected quizzes:', JSON.stringify(affectedQuizzes, null, 2));
+    }
+  } catch (error) {
+    console.error('Error during consistency check:', error);
+    process.exit(1);
+  } finally {
+    await prisma.$disconnect();
+    await redisExec.quit();
+  }
 }
apps/backend-docker/scripts/fixPointsInconsistency.ts (1)

Line range hint 1-55: Add safety measures to prevent unintended data modifications.

Several critical safety issues need to be addressed in this script:

  1. Empty COURSE_ID and QUIZ_ID constants could lead to unintended modifications
  2. No validation of input parameters
  3. Missing backup mechanism before performing updates
  4. Commented-out update operation suggests incomplete testing

Consider implementing these safety measures:

 const FAILURES = 1

-const COURSE_ID = ''
-const QUIZ_ID = ''
+// Require environment variables for sensitive operations
+const COURSE_ID = process.env.COURSE_ID
+const QUIZ_ID = process.env.QUIZ_ID
+
+if (!COURSE_ID || !QUIZ_ID) {
+  console.error('COURSE_ID and QUIZ_ID environment variables are required')
+  process.exit(1)
+}
+
+// Backup existing data before modifications
+async function backupExistingData() {
+  const entries = await prisma.leaderboardEntry.findMany({
+    where: {
+      courseId: COURSE_ID,
+      type: 'COURSE',
+    },
+  })
+  
+  // Store backup in a separate file
+  const timestamp = new Date().toISOString()
+  await fs.writeFile(
+    `backup_leaderboard_${timestamp}.json`,
+    JSON.stringify(entries, null, 2)
+  )
+}
+
+// Add validation for Redis data
+if (!Object.keys(quizLB).length) {
+  console.error('No leaderboard entries found in Redis')
+  process.exit(1)
+}

Also, consider adding a dry-run mode to verify changes before applying them.

Would you like me to provide a complete implementation with all these safety measures?
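A sketch of what such a dry-run mode could look like (the names and update shape are illustrative, not taken from the script):

```typescript
// Gate every write behind a dryRun flag: in dry-run mode the script only
// reports what it would change, so the output can be reviewed before a
// second, real run.
interface PlannedUpdate {
  entryId: string
  oldScore: number
  newScore: number
}

function applyUpdates(
  updates: PlannedUpdate[],
  write: (u: PlannedUpdate) => void,
  dryRun = true
): number {
  let applied = 0
  for (const u of updates) {
    if (dryRun) {
      console.log(`[dry-run] would update ${u.entryId}: ${u.oldScore} -> ${u.newScore}`)
    } else {
      write(u)
      applied += 1
    }
  }
  return applied
}

const planned: PlannedUpdate[] = [{ entryId: 'lb-1', oldScore: 100, newScore: 120 }]
console.log(applyUpdates(planned, () => {})) // 0: nothing written in dry-run mode
```

Defaulting `dryRun` to true makes the destructive path opt-in, which fits the safety concerns raised above.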

apps/func-response-processor/src/index.ts (4)

Line range hint 4-14: Consider extracting grading logic into a separate service.

The file imports multiple grading-related functions. Consider extracting these into a dedicated grading service to improve maintainability and testability.


Line range hint 17-19: Address TODOs regarding participant restrictions.

These TODOs highlight important edge cases that need to be addressed:

  1. Handling participants not part of the course
  2. Prepopulating leaderboard with all participations
  3. Handling participants joining during a session
  4. Filtering zero-point participants

Would you like me to help create GitHub issues to track these requirements?


Line range hint 42-43: Document rationale for using pipeline over transaction.

The code uses Redis pipeline instead of transaction, as indicated by the comment. Please document the rationale for this choice, as it affects atomicity guarantees.


Line range hint 291-303: Consider structuring error handling more robustly.

The error handling could be improved by:

  1. Using a custom error class for different types of failures
  2. Adding more context to error messages
  3. Implementing retry logic for transient Redis failures

Example implementation:

class ResponseProcessingError extends Error {
  constructor(
    message: string,
    public readonly code: 'REDIS_ERROR' | 'PROCESSING_ERROR',
    public readonly context: any
  ) {
    super(message);
    this.name = 'ResponseProcessingError';
  }
}

// Usage
throw new ResponseProcessingError(
  'Redis transaction failed',
  'REDIS_ERROR',
  { queueItem, error: e }
);
packages/graphql/src/services/liveQuizzes.ts (2)

62-63: Refactor Redis key generation to use helper functions or constants

Throughout the code, the Redis key prefixes 'lq:' are hardcoded in multiple places. To enhance maintainability and reduce the risk of errors, consider creating helper functions or constants to generate Redis keys. This approach centralizes key generation, making future changes easier and minimizing duplication.

Apply this refactor by introducing helper functions:

// At the top of the file or in a utility module
function liveQuizKey(...parts: (string | number)[]) {
  return `lq:${parts.join(':')}`;
}

function liveQuizInstanceKey(quizId: string, instanceId: number, suffix: string) {
  return liveQuizKey(quizId, 'i', instanceId, suffix);
}

// Then update Redis key usages accordingly
// Example:
redisMulti.hgetall(liveQuizKey(quizId, 'lb'));
redisMulti.hgetall(liveQuizKey(quizId, 'b', blockId, 'lb'));
redisMulti.hgetall(liveQuizInstanceKey(quizId, instanceId, 'info'));

Also applies to: 65-68, 258-258, 260-263, 717-717, 820-820, 969-969, 979-979, 987-987, 991-991, 998-998, 1002-1002, 1242-1243, 1494-1494, 1714-1714


1494-1494: Refactor duplicated code for Redis key deletion

The logic for deleting Redis keys is duplicated in the endLiveQuiz and cancelLiveQuiz functions. To enhance code maintainability and reduce duplication, consider extracting this logic into a shared utility function.

Create a helper function:

async function deleteLiveQuizKeys(ctx: Context, quizId: string) {
  const keys = await ctx.redisExec.keys(`lq:${quizId}:*`);
  const pipe = ctx.redisExec.multi();
  for (const key of keys) {
    pipe.unlink(key);
  }
  await pipe.exec();
}

// Then use it in both functions
await deleteLiveQuizKeys(ctx, id);

Also applies to: 1714-1714

🛑 Comments failed to post (2)
packages/graphql/src/services/liveQuizzes.ts (2)

1494-1494: ⚠️ Potential issue

Avoid using Redis KEYS command in production

The use of ctx.redisExec.keys(...) can lead to performance degradation in production environments, as the KEYS command blocks Redis during execution and can cause latency issues. It is advisable to use the SCAN command or maintain a list of keys to be deleted.

Consider refactoring as follows:

- const keys = await ctx.redisExec.keys(`lq:${id}:*`)
- const pipe = ctx.redisExec.multi()
- for (const key of keys) {
-   pipe.unlink(key)
- }
- await pipe.exec()

+ // Maintain a Set or List of keys when they are created
+ // Use SCAN or a stored list to retrieve and delete keys without blocking Redis

Also applies to: 1714-1714
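To make the SCAN-based alternative concrete, here is a standalone sketch of the cursor semantics. In production this would map onto ioredis `scanStream` with a `match` pattern, unlinking each batch as it arrives; the in-memory key list below stands in for the Redis keyspace:

```typescript
// SCAN walks the keyspace in small batches via a cursor instead of returning
// everything at once, so Redis is never blocked for long. The loop below
// mimics that contract: a returned cursor of 0 means iteration is complete.
function scanBatch(
  keys: string[],
  cursor: number,
  count: number
): { next: number; batch: string[] } {
  const batch = keys.slice(cursor, cursor + count)
  const next = cursor + batch.length >= keys.length ? 0 : cursor + batch.length
  return { next, batch }
}

function collectMatching(keys: string[], prefix: string, count = 2): string[] {
  const matched: string[] = []
  let cursor = 0
  do {
    const { next, batch } = scanBatch(keys, cursor, count)
    // in the real implementation, each matching batch would be unlinked here
    matched.push(...batch.filter((k) => k.startsWith(prefix)))
    cursor = next
  } while (cursor !== 0)
  return matched
}

const quizKeys = ['lq:abc:lb', 'lq:abc:i:1:info', 'lq:xyz:lb']
console.log(collectMatching(quizKeys, 'lq:abc:')) // matches lq:abc:lb and lq:abc:i:1:info
```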


979-979: 🛠️ Refactor suggestion

Update deprecated hmset to hset in Redis commands

The HMSET command has been deprecated since Redis 4.0. It is recommended to use HSET for setting multiple fields in a hash. Updating to hset ensures compatibility with newer Redis versions.

Apply this change to comply with the latest Redis standards:

- redisMulti.hmset(`lq:${quiz.id}:i:${instance.id}:info`, { /* fields */ });
- redisMulti.hmset(`lq:${quiz.id}:i:${instance.id}:results`, { /* fields */ });
+ redisMulti.hset(`lq:${quiz.id}:i:${instance.id}:info`, { /* fields */ });
+ redisMulti.hset(`lq:${quiz.id}:i:${instance.id}:results`, { /* fields */ });

Also applies to: 987-987, 991-991, 998-998, 1002-1002


sonarcloud bot commented Nov 8, 2024


cypress bot commented Nov 8, 2024

klicker-uzh · Run #3622

Run status: Passed · Duration: 11m 32s
Commit: dbbfe4154f (Merge branch 'v3' of https://github.com/uzh-bf/klicker-uzh into v3-new-live-quiz)
Project: klicker-uzh · Branch: v3-new-live-quiz · Committer: sjschlapbach

Test results: Failures 0 · Flaky 0 · Pending 0 · Skipped 0 · Passing 140
