chore: migrate live session data to live quiz table #4363
Conversation
Current Aviator status
This PR was merged manually (without Aviator). Merging manually can negatively impact the performance of the queue. Consider using Aviator next time.
📝 Walkthrough

This pull request introduces scripts to migrate existing live sessions and enum types to a new database schema for live quizzes. The changes include the transformation of data structures, the addition of new fields and enums, and the adjustment of existing models in the Prisma schema. Key functions are implemented to handle the migration of data from old formats to new ones, ensuring compatibility and integrity during the transition.
klicker-uzh Run #3624

Run Properties:

| Property | Value |
|---|---|
| Project | klicker-uzh |
| Branch Review | live-quiz-migration |
| Run status | Passed #3624 |
| Run duration | 11m 27s |
| Commit | 7a818b7554 ℹ️: Merge b80c20c977e790a7654511b0306eb04a550c4182 into dbbfe4154f2b144d9b310d104d0c... |
| Committer | Julius Schlapbach |

View all properties for this run ↗︎

| Test results | Count |
|---|---|
| Failures | 0 |
| Flaky | 0 |
| Pending | 0 |
| Skipped | 0 |
| Passing | 140 |

View all changes introduced in this branch ↗︎
Caution
Inline review comments failed to post. This is likely due to GitHub's limits when posting large numbers of comments.
Actionable comments posted: 8
🧹 Outside diff range and nitpick comments (12)
packages/graphql/src/scripts/2024-11-07_migrate_unified_enums.ts (2)
`1-11`: Enhance script documentation with additional details.

While the current comments explain the purpose, consider adding (a doc-comment sketch follows the list):
- Expected execution duration
- Whether the script is idempotent
- Any prerequisites or post-migration verification steps
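A minimal doc-comment sketch covering these points, assuming the script touches `elementStack` and `groupActivity` as discussed elsewhere in this review; the concrete claims about duration and prerequisites are placeholders to be verified, not facts from the PR:

```ts
/**
 * 2024-11-07 - migrate unified enums (ElementStack.typeNEW, GroupActivity.statusNEW).
 *
 * Idempotency: intended to be safe to re-run if only rows with a NULL new
 * column are processed (verify against the actual queries below).
 * Expected duration: roughly linear in the number of ElementStack and
 * GroupActivity rows; measure on a production-sized dump first.
 * Prerequisites: apply the 20241107142334_live_quiz_blocks Prisma migration.
 * Verification: afterwards, no rows should remain with typeNEW/statusNEW = NULL.
 */
```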
`74-74`: Improve script execution handling.

The current implementation lacks proper script initialization and configuration options.
Consider this improvement:
```diff
-await run()
+if (require.main === module) {
+  run()
+    .then(() => {
+      console.log('Migration completed successfully')
+      process.exit(0)
+    })
+    .catch((error) => {
+      console.error('Migration failed:', error)
+      process.exit(1)
+    })
+}
```

Consider adding command-line arguments for (a sketch follows the list):
- Batch size
- Dry run mode
- Specific table migration
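A minimal sketch of such flags using Node's built-in `util.parseArgs` (available since Node 18.3); the flag names are illustrative, not part of the PR:

```ts
import { parseArgs } from 'node:util'

// Hypothetical CLI flags for the migration script.
const { values } = parseArgs({
  options: {
    'batch-size': { type: 'string', default: '1000' },
    'dry-run': { type: 'boolean', default: false },
    table: { type: 'string' }, // e.g. 'elementStack' or 'groupActivity'
  },
})

const batchSize = Number(values['batch-size'])
const dryRun = values['dry-run']
const onlyTable = values.table // undefined = migrate everything
```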
packages/prisma/src/prisma/migrations/20241107142334_live_quiz_blocks/migration.sql (1)
`66-75`: Consider indexing strategy for ElementBlock.

The table design is solid, but consider the following improvements:

- The `originalId` column might benefit from an index if it's used for lookups
- The `execution` column with a default of 0 suggests a counter; consider adding a check constraint to ensure non-negative values

Consider adding:

```sql
-- Add index if originalId is used for lookups
CREATE INDEX "ElementBlock_originalId_idx" ON "ElementBlock"("originalId");

-- Add check constraint for execution counter
ALTER TABLE "ElementBlock" ADD CONSTRAINT "ElementBlock_execution_check" CHECK (execution >= 0);
```

packages/prisma/src/prisma/schema/element.prisma (3)
Line range hint `6-14`: Enhance documentation for new element types.

While the comment indicates these are "new types", it would be helpful to document the intended use cases for the `CONTENT` and `FLASHCARD` types.
`162-164`: Ensure migration handles null typeNEW values.

The `typeNEW` field is optional during migration but will be required later. Consider (a verification sketch follows the list):

- Adding a database constraint or validation to ensure all records have `typeNEW` populated before removing the old `type` field
- Documenting the migration steps in a separate migration guide
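A minimal pre-cleanup check, assuming a Prisma model named `element` with a nullable `typeNEW` column during the transition; the model and client package names are assumptions, not taken from the PR:

```ts
import { PrismaClient } from '@prisma/client' // or the repo's generated client package

const prisma = new PrismaClient()

// Abort the cleanup migration if any rows still lack the new type.
async function assertTypeNewPopulated() {
  const missing = await prisma.element.count({ where: { typeNEW: null } })
  if (missing > 0) {
    throw new Error(`${missing} Element rows still have typeNEW = NULL, aborting cleanup`)
  }
  console.log('All Element rows have typeNEW populated; safe to drop the old type column.')
}

await assertTypeNewPopulated().finally(() => prisma.$disconnect())
```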
Line range hint `1-214`: Migration schema changes look well-structured.

The schema changes support a gradual migration strategy with:
- Parallel support for old and new structures
- Clear TODO markers for post-migration cleanup
- Proper field optionality during transition
However, consider:
- Creating a migration checklist document
- Adding validation steps before cleanup
- Including rollback procedures
packages/prisma/src/prisma/schema/quiz.prisma (1)
`287-294`: Track cleanup of temporary migration enum.

The `GroupActivityStatus` enum is marked as temporary for migration. To ensure this technical debt is addressed, consider creating a follow-up task to remove this enum once the migration is complete.

Would you like me to create a GitHub issue to track the cleanup of this temporary enum?
packages/graphql/src/scripts/2024-11-07_migrate_live_session_to_live_quiz.ts (5)
`252-252`: Address the TODO: initialize Redis cache when content is updated.

There's a TODO comment indicating that Redis cache initialization needs to be implemented when its content should be updated. Implementing this ensures cache consistency during the migration process.
Would you like assistance in implementing the Redis cache initialization, or should I open a new GitHub issue to track this task?
`519-521`: Address the TODO: apply cache updates and cleanup.

The code contains a TODO comment about applying cache updates and cleaning up old live session cache data. Implementing this is important to prevent data inconsistencies and potential memory leaks.
Would you like assistance in uncommenting and adjusting the cache update and cleanup logic, or should I open a new GitHub issue to track this task?
`28-33`: Make logging flags configurable via environment variables.

Currently, the logging flags (`logFakedElement`, `logQuestionDataConversion`, etc.) are hardcoded. To improve flexibility and maintainability, consider making these flags configurable through environment variables.

Example implementation:

```diff
- const logFakedElement = false
- const logQuestionDataConversion = false
- const logResultsConversion = false
- const logInstanceConversion = false
+ const logFakedElement = process.env.LOG_FAKED_ELEMENT === 'true'
+ const logQuestionDataConversion = process.env.LOG_QUESTION_DATA_CONVERSION === 'true'
+ const logResultsConversion = process.env.LOG_RESULTS_CONVERSION === 'true'
+ const logInstanceConversion = process.env.LOG_INSTANCE_CONVERSION === 'true'
```
`165-244`: Refactor `convertOldResults` to reduce code duplication.

The handling of `ElementType.NUMERICAL` and `ElementType.FREE_TEXT` in `convertOldResults` shares similar logic. Refactoring this code can reduce duplication and enhance maintainability.

Consider abstracting the common logic into a helper function that processes open-ended question types, as sketched below.
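A minimal sketch of such a helper; the result shapes and names are hypothetical, since the actual types live in the migration script:

```ts
// Hypothetical result shape - the real one comes from the migration script.
type OpenEndedResults = Record<string, { value: string; count: number }>

// Shared conversion for open-ended element types; type-specific handling
// (e.g. normalizing numerical values) is injected via `normalize`.
function convertOpenEndedResults(
  oldResults: OpenEndedResults,
  normalize: (value: string) => string = (v) => v
): OpenEndedResults {
  const converted: OpenEndedResults = {}
  for (const [hash, entry] of Object.entries(oldResults)) {
    converted[hash] = { value: normalize(entry.value), count: entry.count }
  }
  return converted
}

// Usage inside convertOldResults (sketch):
// case ElementType.NUMERICAL:
//   return convertOpenEndedResults(results, (v) => String(Number(v)))
// case ElementType.FREE_TEXT:
//   return convertOpenEndedResults(results)
```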
`25-27`: Update or remove outdated comments.

The comments describing the migration (`liveSession -> liveQuiz`, `sessionBlock -> elementBlock`, etc.) might be outdated or no longer necessary. Updating or removing them can improve code clarity.

Consider revising the comments to accurately reflect the current migration process or removing them if they're redundant.
🛑 Comments failed to post (8)
packages/graphql/src/scripts/2024-11-07_migrate_unified_enums.ts (4)
12-14: 🛠️ Refactor suggestion
Add proper error handling and Prisma client lifecycle management.
The current implementation lacks proper error handling and client cleanup.
Consider this improvement:
```diff
 async function run() {
   const prisma = new PrismaClient()
+  try {
+    // existing migration logic
+  } catch (error) {
+    console.error('Migration failed:', error)
+    throw error
+  } finally {
+    await prisma.$disconnect()
+  }
 }
```

Committable suggestion skipped: line range outside the PR's diff.
41-71: 🛠️ Refactor suggestion
⚠️ Potential issue

Reconsider default status and improve GroupActivity migration performance.
Two concerns:
- Setting `DRAFT` as default status might not be appropriate for all cases
- Same performance issues as ElementStack migration (bulk fetching, single updates)
For the default status:
```diff
- let newType: PublicationStatus = PublicationStatus.DRAFT
+ if (!oldType) {
+   console.warn(`GroupActivity ${groupActivity.id} has no status, skipping`)
+   continue
+ }
+ let newType: PublicationStatus
```

Apply the same batch processing improvements as suggested for the ElementStack migration.
Committable suggestion skipped: line range outside the PR's diff.
15-39: 🛠️ Refactor suggestion
Improve migration performance and reliability for ElementStack updates.
The current implementation has several potential issues:
- Fetches all records at once, which could cause memory issues with large datasets
- Updates records one by one, which is inefficient
- Lacks progress tracking and error recovery
Consider this improved approach:
```diff
- const elementStacks = await prisma.elementStack.findMany()
- for (const elementStack of elementStacks) {
+ let processed = 0
+ const batchSize = 1000
+
+ while (true) {
+   const elementStacks = await prisma.elementStack.findMany({
+     take: batchSize,
+     skip: processed,
+     where: { typeNEW: null }, // Only process unprocessed records
+   })
+
+   if (elementStacks.length === 0) break
+
+   await prisma.$transaction(
+     elementStacks.map((elementStack) => {
        const oldType = elementStack.type
        let newType: ElementStackTypeNew | undefined
        switch (oldType) {
          // ... existing switch cases ...
        }
-       await prisma.elementStack.update({
+       return prisma.elementStack.update({
          where: { id: elementStack.id },
          data: { typeNEW: newType },
        })
-     }
+     })
+   )
+
+   processed += elementStacks.length
+   console.log(`Processed ${processed} ElementStack records`)
+ }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```ts
// element stack type to new column
let processed = 0
const batchSize = 1000

while (true) {
  const elementStacks = await prisma.elementStack.findMany({
    take: batchSize,
    skip: processed,
    where: { typeNEW: null }, // Only process unprocessed records
  })

  if (elementStacks.length === 0) break

  await prisma.$transaction(
    elementStacks.map((elementStack) => {
      const oldType = elementStack.type
      let newType: ElementStackTypeNew | undefined
      switch (oldType) {
        case ElementStackType.GROUP_ACTIVITY:
          newType = ElementStackTypeNew.GROUP_ACTIVITY
          break
        case ElementStackType.MICROLEARNING:
          newType = ElementStackTypeNew.MICROLEARNING
          break
        case ElementStackType.PRACTICE_QUIZ:
          newType = ElementStackTypeNew.PRACTICE_QUIZ
          break
        default:
          break
      }
      return prisma.elementStack.update({
        where: { id: elementStack.id },
        data: { typeNEW: newType },
      })
    })
  )

  processed += elementStacks.length
  console.log(`Processed ${processed} ElementStack records`)
}
```
1-74: 💡 Codebase verification
Add data validation, logging, and transaction safety to the migration script
Based on the analysis of similar migration scripts in the codebase, the current enum migration script needs several improvements for safer data migration:
- Add logging to track migration progress and identify potential issues
- Implement data validation before and after migration
- Use transactions to ensure atomic updates
- Add error handling with rollback capabilities
Example implementation:
```ts
// Before migration validation
console.log(`Starting migration. Found ${elementStacks.length} element stacks and ${groupActivities.length} group activities`)

// Wrap operations in transaction
await prisma.$transaction(async (tx) => {
  // Existing migration logic with added logging
  for (const elementStack of elementStacks) {
    console.log(`Migrating ElementStack ${elementStack.id}`)
    // ... migration logic
  }

  // Verify migration
  const unmigrated = await tx.elementStack.count({ where: { typeNEW: null } })
  if (unmigrated > 0) throw new Error(`Found ${unmigrated} unmigrated records`)
})
```

🔗 Analysis chain
Verify data integrity before and after migration.
The script should include data verification steps to ensure safe migration.
Run these commands to check data consistency:
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
```bash
#!/bin/bash
# Check for any records that might be missed in the migration
echo "Checking for ElementStack records without new type..."
ast-grep --pattern 'await prisma.elementStack.findMany({ where: { typeNEW: null } })'

echo "Checking for GroupActivity records without new status..."
ast-grep --pattern 'await prisma.groupActivity.findMany({ where: { statusNEW: null } })'

# Look for any error handling in similar migration scripts
echo "Checking for error handling patterns in other migration scripts..."
rg -l "catch.*error" "src/scripts/*migrate*.ts"
```

Length of output: 676
Script:
```bash
#!/bin/bash
# Let's try a different approach to analyze migration scripts and error handling patterns

# Find all migration scripts in the repository
echo "Looking for migration scripts..."
fd migrate

# Check for any error handling patterns in TypeScript files
echo "Checking error handling patterns in TypeScript files..."
rg "try.*catch" -A 2 --type ts

# Look for similar enum migrations
echo "Looking for enum-related migrations..."
rg -l "enum.*migration|migrate.*enum" --type ts

# Check the schema for these enums
echo "Checking Prisma schema for relevant enums..."
fd "schema.prisma" --exec cat {}
```

Length of output: 1321
Script:
```bash
#!/bin/bash
# Let's examine similar migration scripts for patterns and check the specific enums

# Check content of similar migration scripts
echo "Checking content of other migration scripts..."
cat "packages/prisma/src/scripts/2024-02-06_migrate_group_activity.ts"
cat "packages/prisma/src/scripts/2024-02-18_migrate_microlearning.ts"

# Look for transaction patterns in migration scripts
echo "Checking for transaction patterns..."
rg "prisma.\$transaction" --type ts

# Check for validation or logging in migration scripts
echo "Looking for validation or logging patterns..."
rg "console\.(log|error)" "packages/graphql/src/scripts/2024-11-07_migrate_unified_enums.ts"
rg "console\.(log|error)" "packages/prisma/src/scripts/*migrate*.ts"
```

Length of output: 13375
packages/prisma/src/prisma/migrations/20241107142334_live_quiz_blocks/migration.sql (1)
15-16:
⚠️ Potential issue

Address PostgreSQL 11 compatibility warning.

The migration adds multiple enum values to `PublicationStatus` which, as noted in the comments, is not possible in a single migration for PostgreSQL 11 and earlier versions. This should be split into multiple migrations for backward compatibility.

Split the enum modifications into separate migrations:

```sql
-- Migration 1
ALTER TYPE "PublicationStatus" ADD VALUE 'ENDED';

-- Migration 2
ALTER TYPE "PublicationStatus" ADD VALUE 'GRADED';
```

Also applies to: 22-25
packages/graphql/src/scripts/2024-11-07_migrate_live_session_to_live_quiz.ts (3)
69-69:
⚠️ Potential issue

Fix typo in variable name `elemnetId`.

The variable `elemnetId` appears to be a typo. It should be `elementId` for clarity and consistency.

Apply this diff to correct the typo:

```diff
- const elemnetId = questionData.id.split('-')[0]
+ const elementId = questionData.id.split('-')[0]
```

And update the reference:

```diff
- id: elemnetId,
+ id: elementId,
```

Also applies to: 74-74
64-66:
⚠️ Potential issue

Avoid logging potentially sensitive `questionData`.

Logging `questionData` may expose sensitive information in the logs. It's advisable to remove or sanitize these logs to prevent potential security risks.

Apply this diff to remove the logging statements:

```diff
- console.log('QUESTION DATA')
- console.log(questionData)
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```ts
throw new Error('Missing required question data properties')
```
69-70:
⚠️ Potential issue

Ensure `questionData.id` has the expected format before splitting.

When splitting `questionData.id`, the code assumes it contains a `'-'` character. If `questionData.id` does not contain `'-'`, this will cause an error. Consider adding validation or handling cases where the expected format is not met.

Apply this diff to add validation:

```diff
+ if (!questionData.id.includes('-')) {
+   throw new Error(`Invalid questionData.id format: ${questionData.id}`)
+ }
  const elementId = questionData.id.split('-')[0]
  const elementVersion = Number(questionData.id.split('-')[1].slice(1))
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```ts
if (!questionData.id.includes('-')) {
  throw new Error(`Invalid questionData.id format: ${questionData.id}`)
}
const elemnetId = questionData.id.split('-')[0]
const elementVersion = Number(questionData.id.split('-')[1].slice(1))
```
Caution
Inline review comments failed to post. This is likely due to GitHub's limits when posting large numbers of comments.
Actionable comments posted: 2
🧹 Outside diff range and nitpick comments (8)
apps/backend-docker/scripts/checkRedisConsistency.ts (1)
Line range hint `1-30`: Consider enhancing error handling and cleanup capabilities.

The script could benefit from the following improvements:
- Error handling for Redis and Prisma operations
- Option to clean up invalid keys
- More detailed logging of affected quiz IDs
Consider applying these enhancements:
```diff
 async function run() {
   const prisma = new PrismaClient()
+  try {
   const redisExec = new Redis({
     family: 4,
     host: process.env.REDIS_HOST ?? 'localhost',
     password: process.env.REDIS_PASS ?? '',
     port: Number(process.env.REDIS_PORT) ?? 6379,
     tls: process.env.REDIS_TLS ? {} : undefined,
   })

   const quizzes = await prisma.liveQuiz.findMany({
     where: {
       status: {
         not: PublicationStatus.PUBLISHED,
       },
     },
   })

   let count = 0
+  const affectedQuizzes = [];

   for (const quiz of quizzes) {
     const invalidKeys = await redisExec.keys(`lq:${quiz.id}:*`)
     if (invalidKeys.length > 0) {
       count += invalidKeys.length
+      affectedQuizzes.push({
+        quizId: quiz.id,
+        keyCount: invalidKeys.length,
+        keys: invalidKeys
+      });
+
+      // Optionally clean up invalid keys
+      if (process.env.CLEANUP === 'true') {
+        await redisExec.del(...invalidKeys);
+        console.log(`Cleaned up ${invalidKeys.length} keys for quiz ${quiz.id}`);
+      }
     }
   }

-  console.log(count)
+  console.log(`Found ${count} invalid keys across ${affectedQuizzes.length} quizzes`);
+  if (affectedQuizzes.length > 0) {
+    console.log('Affected quizzes:', JSON.stringify(affectedQuizzes, null, 2));
+  }
+  } catch (error) {
+    console.error('Error during consistency check:', error);
+    process.exit(1);
+  } finally {
+    await prisma.$disconnect();
+    await redisExec.quit();
+  }
 }
```

apps/backend-docker/scripts/fixPointsInconsistency.ts (1)
Line range hint `1-55`: Add safety measures to prevent unintended data modifications.

Several critical safety issues need to be addressed in this script:
- Empty `COURSE_ID` and `QUIZ_ID` constants could lead to unintended modifications
- No validation of input parameters
- Missing backup mechanism before performing updates
- Commented-out update operation suggests incomplete testing
Consider implementing these safety measures:
```diff
 const FAILURES = 1
-const COURSE_ID = ''
-const QUIZ_ID = ''
+// Require environment variables for sensitive operations
+const COURSE_ID = process.env.COURSE_ID
+const QUIZ_ID = process.env.QUIZ_ID
+
+if (!COURSE_ID || !QUIZ_ID) {
+  console.error('COURSE_ID and QUIZ_ID environment variables are required')
+  process.exit(1)
+}
+
+// Backup existing data before modifications
+async function backupExistingData() {
+  const entries = await prisma.leaderboardEntry.findMany({
+    where: {
+      courseId: COURSE_ID,
+      type: 'COURSE',
+    },
+  })
+
+  // Store backup in a separate file
+  const timestamp = new Date().toISOString()
+  await fs.writeFile(
+    `backup_leaderboard_${timestamp}.json`,
+    JSON.stringify(entries, null, 2)
+  )
+}
+
+// Add validation for Redis data
+if (!Object.keys(quizLB).length) {
+  console.error('No leaderboard entries found in Redis')
+  process.exit(1)
+}
```

Also, consider adding a dry-run mode to verify changes before applying them.
Would you like me to provide a complete implementation with all these safety measures?
apps/func-response-processor/src/index.ts (4)
Line range hint `4-14`: Consider extracting grading logic into a separate service.

The file imports multiple grading-related functions. Consider extracting these into a dedicated grading service to improve maintainability and testability; a possible shape is sketched below.
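A rough sketch of what such a service boundary could look like; the interface and names are hypothetical, not taken from the PR:

```ts
// Hypothetical grading service boundary - names and shapes are illustrative.
interface GradingService {
  gradeQuestionResponse(response: unknown, solution: unknown): number
  computePointsMultiplier(answerStreak: number): number
}

// A concrete implementation would wrap the existing grading helpers, so the
// response processor depends on one interface instead of many imports.
export function createGradingService(): GradingService {
  return {
    gradeQuestionResponse(response, solution) {
      // delegate to the existing grading functions here
      return 0 // placeholder
    },
    computePointsMultiplier(answerStreak) {
      return 1 + Math.min(answerStreak, 5) * 0.1 // placeholder policy
    },
  }
}
```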
Line range hint `17-19`: Address TODOs regarding participant restrictions.

These TODOs highlight important edge cases that need to be addressed:
- Handling participants not part of the course
- Prepopulating leaderboard with all participations
- Handling participants joining during a session
- Filtering zero-point participants
Would you like me to help create GitHub issues to track these requirements?
Line range hint `42-43`: Document rationale for using pipeline over transaction.

The code uses a Redis pipeline instead of a transaction, as indicated by the comment. Please document the rationale for this choice, as it affects atomicity guarantees; the sketch below illustrates the difference.
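For reference, a short sketch of the trade-off in ioredis (assuming ioredis is the client, as in the other scripts in this PR; the key names are illustrative):

```ts
import Redis from 'ioredis'

const redis = new Redis()

// pipeline(): one network round trip, but NOT atomic - commands from other
// clients may interleave between the queued commands.
await redis
  .pipeline()
  .hincrby('lq:quiz1:lb', 'participant1', 10)
  .zincrby('lq:quiz1:ranking', 10, 'participant1')
  .exec()

// multi(): wrapped in MULTI/EXEC, so the queued commands apply atomically,
// at the cost of some additional overhead.
await redis
  .multi()
  .hincrby('lq:quiz1:lb', 'participant1', 10)
  .zincrby('lq:quiz1:ranking', 10, 'participant1')
  .exec()
```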
Line range hint `291-303`: Consider structuring error handling more robustly.

The error handling could be improved by:
- Using a custom error class for different types of failures
- Adding more context to error messages
- Implementing retry logic for transient Redis failures
Example implementation:
```ts
class ResponseProcessingError extends Error {
  constructor(
    message: string,
    public readonly code: 'REDIS_ERROR' | 'PROCESSING_ERROR',
    public readonly context: any
  ) {
    super(message);
    this.name = 'ResponseProcessingError';
  }
}

// Usage
throw new ResponseProcessingError(
  'Redis transaction failed',
  'REDIS_ERROR',
  { queueItem, error: e }
);
```

packages/graphql/src/services/liveQuizzes.ts (2)
`62-63`: Refactor Redis key generation to use helper functions or constants.

Throughout the code, the Redis key prefix `'lq:'` is hardcoded in multiple places. To enhance maintainability and reduce the risk of errors, consider creating helper functions or constants to generate Redis keys. This approach centralizes key generation, making future changes easier and minimizing duplication.

Apply this refactor by introducing helper functions:

```ts
// At the top of the file or in a utility module
function liveQuizKey(...parts: (string | number)[]) {
  return `lq:${parts.join(':')}`;
}

function liveQuizInstanceKey(quizId: string, instanceId: number, suffix: string) {
  return liveQuizKey(quizId, 'i', instanceId, suffix);
}

// Then update Redis key usages accordingly
// Example:
redisMulti.hgetall(liveQuizKey(quizId, 'lb'));
redisMulti.hgetall(liveQuizKey(quizId, 'b', blockId, 'lb'));
redisMulti.hgetall(liveQuizInstanceKey(quizId, instanceId, 'info'));
```

Also applies to: 65-68, 258-258, 260-263, 717-717, 820-820, 969-969, 979-979, 987-987, 991-991, 998-998, 1002-1002, 1242-1243, 1494-1494, 1714-1714
`1494-1494`: Refactor duplicated code for Redis key deletion.

The logic for deleting Redis keys is duplicated in the `endLiveQuiz` and `cancelLiveQuiz` functions. To enhance code maintainability and reduce duplication, consider extracting this logic into a shared utility function.

Create a helper function:

```ts
async function deleteLiveQuizKeys(ctx: Context, quizId: string) {
  const keys = await ctx.redisExec.keys(`lq:${quizId}:*`);
  const pipe = ctx.redisExec.multi();
  for (const key of keys) {
    pipe.unlink(key);
  }
  await pipe.exec();
}

// Then use it in both functions
await deleteLiveQuizKeys(ctx, id);
```

Also applies to: 1714-1714
🛑 Comments failed to post (2)
packages/graphql/src/services/liveQuizzes.ts (2)
1494-1494:
⚠️ Potential issue

Avoid using the Redis `KEYS` command in production.

The use of `ctx.redisExec.keys(...)` can lead to performance degradation in production environments, as the `KEYS` command blocks Redis during execution and can cause latency issues. It is advisable to use the `SCAN` command or maintain a list of keys to be deleted; a `SCAN`-based sketch follows the diff.

Consider refactoring as follows:

```diff
- const keys = await ctx.redisExec.keys(`lq:${id}:*`)
- const pipe = ctx.redisExec.multi()
- for (const key of keys) {
-   pipe.unlink(key)
- }
- await pipe.exec()
+ // Maintain a Set or List of keys when they are created
+ // Use SCAN or a stored list to retrieve and delete keys without blocking Redis
```

Also applies to: 1714-1714
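A non-blocking alternative, sketched with ioredis's `scanStream` helper (assuming ioredis, as used elsewhere in this repo):

```ts
import Redis from 'ioredis'

// Delete all keys for a quiz incrementally via SCAN instead of a blocking KEYS.
async function deleteLiveQuizKeysWithScan(redis: Redis, quizId: string) {
  const stream = redis.scanStream({ match: `lq:${quizId}:*`, count: 100 })

  // scanStream emits batches of matching keys; Readable streams are async-iterable.
  for await (const keys of stream) {
    if ((keys as string[]).length > 0) {
      // UNLINK frees the memory asynchronously on the server side
      await redis.unlink(...(keys as string[]))
    }
  }
}
```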
979-979: 🛠️ Refactor suggestion
Update deprecated `hmset` to `hset` in Redis commands.

The `HMSET` command has been deprecated since Redis 4.0. It is recommended to use `HSET` for setting multiple fields in a hash. Updating to `hset` ensures compatibility with newer Redis versions.

Apply this change to comply with the latest Redis standards:

```diff
- redisMulti.hmset(`lq:${quiz.id}:i:${instance.id}:info`, { /* fields */ });
- redisMulti.hmset(`lq:${quiz.id}:i:${instance.id}:results`, { /* fields */ });
+ redisMulti.hset(`lq:${quiz.id}:i:${instance.id}:info`, { /* fields */ });
+ redisMulti.hset(`lq:${quiz.id}:i:${instance.id}:results`, { /* fields */ });
```

Also applies to: 987-987, 991-991, 998-998, 1002-1002
Quality Gate passed
klicker-uzh Run #3622

Run Properties:

| Property | Value |
|---|---|
| Project | klicker-uzh |
| Branch Review | v3-new-live-quiz |
| Run status | Passed #3622 |
| Run duration | 11m 32s |
| Commit | dbbfe4154f: Merge branch 'v3' of https://github.com/uzh-bf/klicker-uzh into v3-new-live-quiz |
| Committer | sjschlapbach |

View all properties for this run ↗︎

| Test results | Count |
|---|---|
| Failures | 0 |
| Flaky | 0 |
| Pending | 0 |
| Skipped | 0 |
| Passing | 140 |

View all changes introduced in this branch ↗︎
To test the scripts, dump the current production database, load it into the local setup, apply the latest migration, and then run the two scripts. Note that they can be executed independently and out of order:

- Migrate status attributes
- Migrate live session to live quiz and corresponding child components
Summary by CodeRabbit

Release Notes

- New Features
- Database Changes
  - `ElementBlock` table to support the live quiz format.
- Documentation