
chore(rethinkdb): TeamMember: Phase 2 #9993

Merged (17 commits) on Jul 23, 2024

Conversation

@mattkrick (Member) commented Jul 17, 2024

Description

Migrates existing R data to PG

Summary by CodeRabbit

  • New Features

    • Improved handling of user names with a default value of 'Unknown' for undefined preferred names in new meetings.
  • Bug Fixes

    • Enhanced reliability of the user meeting functionality.
  • Refactor

    • Updated database operations to use DataLoader for efficient data fetching.
    • Migrated from RethinkDB to PostgreSQL queries in several mutations.
    • Simplified and streamlined logic in multiple GraphQL mutations for better performance and readability.
  • Chores

    • Cleared DataLoader caches post operations for data consistency.

Signed-off-by: Matt Krick <matt.krick@gmail.com>

This PR exceeds the recommended size of 1000 lines. Please make sure you are NOT addressing multiple issues with one PR. Note this PR will be delayed and might be rejected due to its size.


coderabbitai bot commented Jul 17, 2024

Walkthrough

The recent updates enhance data handling and retrieval across the client and server packages. Key improvements include default values for user names, newly nullable database fields, and optimised data loading through DataLoader. Additionally, a significant transition from RethinkDB to PostgreSQL for several operations aims to boost reliability, performance, and maintainability. Overall, these changes streamline data access and ensure consistent defaults.
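To make the recurring shape of these changes concrete, here is a rough sketch of a migrated mutation helper. It is illustrative only: getKysely, the Kysely query-builder calls, and the dataLoader API mirror the snippets reviewed below, but this exact function, its import path, and the 'teamMembers' loader key are assumptions rather than code from the PR.

import getKysely from '../postgres/getKysely' // illustrative path, not the real one

const toggleDrawerSketch = async (teamMemberId: string, dataLoader: any) => {
  // dataLoader is the request-scoped RootDataLoader from the GraphQL context
  const pg = getKysely()

  // 1. Read through the DataLoader instead of querying RethinkDB directly
  //    (previously something like r.table('TeamMember').get(teamMemberId).run())
  const teamMember = await dataLoader.get('teamMembers').load(teamMemberId)
  if (!teamMember) return

  // 2. Write to PostgreSQL via Kysely instead of a RethinkDB update
  await pg
    .updateTable('TeamMember')
    .set({openDrawer: 'manageTeam'})
    .where('id', '=', teamMemberId)
    .execute()

  // 3. Clear the affected loader cache so later reads in the same request see the new row
  dataLoader.clearAll('teamMembers')
}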

Changes

Files/Paths and change summaries:

  • packages/client/hooks/useSnacksForNewMeetings.ts: Improved handling of preferredName by providing a default value of 'Unknown'.
  • packages/server/database/types/Meeting.ts: Made the createdBy property optional, allowing it to be null.
  • packages/server/dataloader/RootDataLoader.ts: Enhanced the clearAll method to accept both single and array inputs; adjusted the dependsOn function accordingly (see the sketch after this list).
  • packages/server/dataloader/rethinkForeignKeyLoader.ts: Changed the function signature and logic to work with RootDataLoader and rethinkPrimaryKeyLoaderMakers.
  • packages/server/graphql/mutations/addTeam.ts: Switched from getTeamsByOrgIds to dataLoader.get('teamsByOrgIds').load(orgId) for validation.
  • packages/server/graphql/mutations/createTask.ts: Refactored to use dataLoader for fetching teamMembers instead of direct database queries.
  • packages/server/graphql/mutations/deleteTask.ts: Adjusted retrieval of subscribedUserIds by fetching teamMembers first and extracting userId from each member.
  • packages/server/graphql/mutations/endCheckIn.ts: Replaced a direct query with dataLoader.get('teamMembersByTeamId').load(teamId) for efficient team member retrieval.
  • packages/server/graphql/mutations/helpers/createTeamAndLeader.ts: Added imports and new operations for user updates and team creation; cleared specific data loaders post-operation.
  • packages/server/graphql/mutations/helpers/removeFromOrg.ts: Replaced RethinkDB queries with PostgreSQL queries and DataLoader for efficient data loading.
  • packages/server/graphql/mutations/helpers/removeTeamMember.ts: Migrated database operations from RethinkDB to PostgreSQL using getKysely.
  • packages/server/graphql/mutations/helpers/safeEndRetrospective.ts: Used DataLoader to fetch team members and extract the team lead user ID.
  • packages/server/graphql/mutations/helpers/softDeleteUser.ts: Refactored to use DataLoader for fetching team members and their IDs.
  • packages/server/graphql/mutations/moveTeamToOrg.ts: Switched to dataLoader.get('teams').load(teamId) for team retrieval and added dataLoader.clearAll('teams').
  • packages/server/graphql/mutations/promoteToTeamLead.ts: Refactored team lead assignment using dataLoader and getKysely for database operations.
  • packages/server/graphql/mutations/setPokerSpectate.ts: Updated to use getKysely for database operations and clear the data loader cache.
  • packages/server/graphql/mutations/toggleTeamDrawer.ts: Added a database update operation using getKysely for the TeamMember table.
  • packages/server/graphql/mutations/updateTask.ts: Modified to fetch teamMembers using DataLoader and adjusted related database operations.
  • packages/server/graphql/mutations/updateTeamName.ts: Used dataLoader to fetch team and viewer info and added a call to clear cached team data post-update.
  • packages/server/graphql/private/mutations/backupOrganization.ts: Replaced RethinkDB operations with PostgreSQL via getKysely for streamlined data backup processes.
  • packages/server/postgres/migrations/1721356124871_TeamMember-phase2.ts: Implemented the migration for the TeamMember table, updating and batching entries while handling foreign key constraints.
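The RootDataLoader.clearAll change noted above (accepting either a single loader name or an array of names) can be pictured with a small, simplified stand-in. The class below is hypothetical and only demonstrates the single-or-array behaviour, not the real RootDataLoader internals such as dependsOn.

import DataLoader from 'dataloader'

// Hypothetical, stripped-down stand-in for RootDataLoader
class RootDataLoaderSketch {
  private loaders = new Map<string, DataLoader<string, unknown>>()

  get(name: string) {
    let loader = this.loaders.get(name)
    if (!loader) {
      // Dummy batch function; the real loaders query PostgreSQL/RethinkDB
      loader = new DataLoader<string, unknown>(async (keys) => keys.map(() => null))
      this.loaders.set(name, loader)
    }
    return loader
  }

  // Accepts clearAll('teams') as well as clearAll(['users', 'teamMembers'])
  clearAll(loaderNames: string | string[]) {
    const names = Array.isArray(loaderNames) ? loaderNames : [loaderNames]
    names.forEach((name) => this.get(name).clearAll())
  }
}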

Sequence Diagram(s)

The changes predominantly focus on data handling and enhancements across various modules. As these updates do not introduce new control flows or features requiring visual representation, no sequence diagrams are generated.


Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL

Commits

Files that changed from the base of the PR and between commits f563d14 and f21004e.

Files selected for processing (4)
  • packages/server/__tests__/globalTeardown.ts (1 hunks)
  • packages/server/dataloader/__tests__/isOrgVerified.test.ts (1 hunks)
  • packages/server/dataloader/__tests__/usersCustomRedisQueries.test.ts (1 hunks)
  • packages/server/utils/__tests__/isRequestToJoinDomainAllowed.test.ts (1 hunks)
Files skipped from review due to trivial changes (4)
  • packages/server/__tests__/globalTeardown.ts
  • packages/server/dataloader/__tests__/isOrgVerified.test.ts
  • packages/server/dataloader/__tests__/usersCustomRedisQueries.test.ts
  • packages/server/utils/__tests__/isRequestToJoinDomainAllowed.test.ts




@coderabbitai bot left a comment


Actionable comments posted: 8

Outside diff range, codebase verification and nitpick comments (5)
packages/server/utils/authorization.ts (1)

Line range hint 15-24:
Add error handling for database queries.

The function should handle potential errors from the database queries to ensure robustness.

export const isTeamLead = async (userId: string, teamId: string, dataLoader: DataLoaderWorker) => {
  try {
    const r = await getRethink()
    const teamMemberId = toTeamMemberId(teamId, userId)
    if (await r.table('TeamMember').get(teamMemberId)('isLead').default(false).run()) {
      return true
    }

    const team = await dataLoader.get('teams').loadNonNull(teamId)
    const organizationUser = await dataLoader
      .get('organizationUsersByUserIdOrgId')
      .load({userId, orgId: team.orgId})
    return organizationUser?.role === 'ORG_ADMIN'
  } catch (error) {
    console.error('Error checking if user is team lead:', error)
    return false
  }
}
packages/server/graphql/public/mutations/updateUserProfile.ts (1)

53-70: Inconsistent usage of PostgreSQL and RethinkDB in application logic

The codebase shows extensive usage of RethinkDB in the application logic, while PostgreSQL usage is limited to configuration files. This inconsistency may lead to potential issues in data management and synchronisation.

  • Ensure that PostgreSQL is integrated consistently in the application logic where required.
  • Review the database operations to maintain consistency and avoid potential data issues.
Analysis chain

PostgreSQL query execution and data update logic looks good.

The logic correctly updates the preferredName in both PostgreSQL and RethinkDB. Ensure consistency in using both databases.

Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the usage of both PostgreSQL and RethinkDB in the codebase.

# Test: Search for the usage of both databases. Expect: Consistent usage.
rg --type python -A 5 $'getKysely'
rg --type python -A 5 $'getRethink'

Length of output: 130


Script:

#!/bin/bash
# Description: Verify the usage of both PostgreSQL and RethinkDB in the codebase.

# Test: Search for the usage of both databases. Expect: Consistent usage.
rg --type js -A 5 $'getKysely'
rg --type js -A 5 $'getRethink'

Length of output: 1398


Script:

#!/bin/bash
# Description: Verify the usage of both PostgreSQL and RethinkDB in the codebase.

# Test: Search for the usage of both databases. Expect: Consistent usage.
rg --type js -A 5 $'pg'
rg --type js -A 5 $'r.table'

Length of output: 62525


Script:

#!/bin/bash
# Description: Verify the usage of PostgreSQL in the application logic.

# Test: Search for the usage of PostgreSQL in TypeScript/JavaScript files, excluding configuration and migration scripts.
rg --type js -g '!scripts/**' -g '!packages/server/database/migrations/**' -A 5 $'pg'

Length of output: 1509

packages/server/graphql/resolvers.ts (2)

Line range hint 79-83:
Ensure proper type checking for teamId and team.

The function resolveTeam should include type checks for teamId and team to ensure they are valid.

export const resolveTeam = (
  {team, teamId}: {teamId?: string; team?: TeamSource},
  _args: unknown,
  {dataLoader}: GQLContext
) => {
  if (teamId && typeof teamId !== 'string') {
    throw new Error('Invalid teamId')
  }
  if (team && typeof team !== 'object') {
    throw new Error('Invalid team')
  }
  // TODO figure out how to lock this down without using the tms, since the mutation may have invalidated it
  return teamId ? dataLoader.get('teams').load(teamId) : team
  // const teamDoc = teamId ? await dataLoader.get('teams').load(teamId) : team;
  // const {tms} = authToken;
  // return tms.includes(teamDoc.id) ? teamDoc : null;
}

Line range hint 91-95:
Ensure proper type checking for teamIds and teams.

The function resolveTeams should include type checks for teamIds and teams to ensure they are valid.

export const resolveTeams = (
  {teamIds, teams}: {teamIds: string; teams: TeamSource[]},
  _args: unknown,
  {dataLoader}: GQLContext
) => {
  if (teamIds && !Array.isArray(teamIds)) {
    throw new Error('Invalid teamIds')
  }
  if (teams && !Array.isArray(teams)) {
    throw new Error('Invalid teams')
  }
  // TODO figure out how to lock this down without using the tms, since the mutation may have invalidated it
  return teamIds && teamIds.length > 0 ? dataLoader.get('teams').loadMany(teamIds) : teams
  // const {tms} = authToken;
  // const teamDocs = (teamIds && teamIds.length > 0) ? await dataLoader.get('teams').loadMany(teamIds) : teams;
  // return Array.isArray(teamDocs) ? teamDocs.filter((team) => tms.includes(team.id)) : null;
}
packages/server/graphql/private/typeDefs/_legacy.graphql (1)

Issues found due to the removal of the NewMeeting interface.

The NewMeeting interface is extensively used across the codebase, including in GraphQL type definitions, mutations, queries, and various other files. Its removal will cause numerous broken references and errors.

  • File: packages/server/graphql/private/typeDefs/_legacy.graphql
  • Lines: 756-761

Please review and address the impacted areas to ensure the codebase remains functional.

Analysis chain

Line range hint 1-1:
Verify the impact of removing the NewMeeting interface.

The removal of the NewMeeting interface may impact other parts of the codebase that depend on this interface.

Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the impact of removing the `NewMeeting` interface in the codebase.

# Test: Search for the usage of `NewMeeting` interface. Expect: No broken references in the codebase.
rg --type graphql 'NewMeeting'

Length of output: 62


Script:

#!/bin/bash
# Description: Verify the impact of removing the `NewMeeting` interface in the codebase.

# Test: Search for the usage of `NewMeeting` interface across all files. Expect: No broken references in the codebase.
rg 'NewMeeting'

Length of output: 107587

Comment on lines 35 to 40
  .forEach((row: any) => {
    return r
      .table('TeamMember')
      .get(row('id'))
      .update({updatedAt: row('updatedAt')})
  })

Handle potential errors in the forEach loop.

The forEach loop should handle potential errors that may occur during the update operation.

- await r(dateDates)
-   .forEach((row: any) => {
-     return r
-       .table('TeamMember')
-       .get(row('id'))
-       .update({updatedAt: row('updatedAt')})
-   })
-   .run()
+ await Promise.all(
+   dateDates.map(async (row: any) => {
+     try {
+       await r
+         .table('TeamMember')
+         .get(row.id)
+         .update({updatedAt: row.updatedAt})
+         .run()
+     } catch (error) {
+       console.error(`Failed to update date for row ${row.id}:`, error)
+     }
+   })
+ )

Comment on lines 14 to 15
// add a dummy date for nulls
const parabolEpoch = new Date('2016-06-01')

Consider making the dummy date configurable.

Hardcoding the dummy date can lead to issues if the date needs to be changed in the future.

- const parabolEpoch = new Date('2016-06-01')
+ const PARABOL_EPOCH = process.env.PARABOL_EPOCH || '2016-06-01'
+ const parabolEpoch = new Date(PARABOL_EPOCH)

Comment on lines 103 to 127
try {
  await pg
    .insertInto('TeamMember')
    .values(rowsToInsert)
    .onConflict((oc) => oc.doNothing())
    .execute()
} catch (e) {
  await Promise.all(
    rowsToInsert.map(async (row) => {
      try {
        await pg
          .insertInto('TeamMember')
          .values(row)
          .onConflict((oc) => oc.doNothing())
          .execute()
      } catch (e) {
        if (e.constraint === 'fk_userId' || e.constraint === 'fk_teamId') {
          console.log(`Skipping ${row.id} because it has no user/team`)
          return
        }
        console.log(e, row)
      }
    })
  )
}

Improve error handling for batch insert operations.

The batch insert operation should handle potential errors more gracefully and provide more detailed logging.

- try {
-   await pg
-     .insertInto('TeamMember')
-     .values(rowsToInsert)
-     .onConflict((oc) => oc.doNothing())
-     .execute()
- } catch (e) {
-   await Promise.all(
-     rowsToInsert.map(async (row) => {
-       try {
-         await pg
-           .insertInto('TeamMember')
-           .values(row)
-           .onConflict((oc) => oc.doNothing())
-           .execute()
-       } catch (e) {
-         if (e.constraint === 'fk_userId' || e.constraint === 'fk_teamId') {
-           console.log(`Skipping ${row.id} because it has no user/team`)
-           return
-         }
-         console.log(e, row)
-       }
-     })
-   )
- }
+ try {
+   await pg
+     .insertInto('TeamMember')
+     .values(rowsToInsert)
+     .onConflict((oc) => oc.doNothing())
+     .execute()
+ } catch (error) {
+   console.error('Batch insert failed:', error)
+   await Promise.all(
+     rowsToInsert.map(async (row) => {
+       try {
+         await pg
+           .insertInto('TeamMember')
+           .values(row)
+           .onConflict((oc) => oc.doNothing())
+           .execute()
+       } catch (e) {
+         if (e.constraint === 'fk_userId' || e.constraint === 'fk_teamId') {
+           console.log(`Skipping ${row.id} because it has no user/team`)
+           return
+         }
+         console.error(`Failed to insert row ${row.id}:`, e)
+       }
+     })
+   )
+ }

Comment on lines 13 to 41
const setFacilitatedUserIdOrDelete = async (
  userIdToDelete: string,
  teamIds: string[],
  dataLoader: DataLoaderInstance
) => {
  const r = await getRethink()
  const facilitatedMeetings = await r
    .table('NewMeeting')
    .getAll(r.args(teamIds), {index: 'teamId'})
    .filter((row: RValue) => row('createdBy').eq(userIdToDelete))
    .run()
  facilitatedMeetings.map(async (meeting) => {
    const {id: meetingId} = meeting
    const meetingMembers = await dataLoader.get('meetingMembersByMeetingId').load(meetingId)
    const otherMember = meetingMembers.find(({userId}) => userId !== userIdToDelete)
    if (otherMember) {
      await r
        .table('NewMeeting')
        .get(meetingId)
        .update({
          facilitatorUserId: otherMember.userId
        })
        .run()
    } else {
      // single-person meeting must be deleted because facilitatorUserId must be non-null
      await r.table('NewMeeting').get(meetingId).delete().run()
    }
  })
}

Ensure proper error handling for asynchronous operations.

The function setFacilitatedUserIdOrDelete performs multiple asynchronous operations without error handling. Consider adding try-catch blocks to handle potential errors.

const setFacilitatedUserIdOrDelete = async (
  userIdToDelete: string,
  teamIds: string[],
  dataLoader: DataLoaderInstance
) => {
  try {
    const r = await getRethink()
    const facilitatedMeetings = await r
      .table('NewMeeting')
      .getAll(r.args(teamIds), {index: 'teamId'})
      .filter((row: RValue) => row('createdBy').eq(userIdToDelete))
      .run()
    facilitatedMeetings.map(async (meeting) => {
      const {id: meetingId} = meeting
      const meetingMembers = await dataLoader.get('meetingMembersByMeetingId').load(meetingId)
      const otherMember = meetingMembers.find(({userId}) => userId !== userIdToDelete)
      if (otherMember) {
        await r
          .table('NewMeeting')
          .get(meetingId)
          .update({
            facilitatorUserId: otherMember.userId
          })
          .run()
      } else {
        // single-person meeting must be deleted because facilitatorUserId must be non-null
        await r.table('NewMeeting').get(meetingId).delete().run()
      }
    })
  } catch (error) {
    console.error('Error in setFacilitatedUserIdOrDelete:', error)
    throw error
  }
}

Comment on lines 82 to 89
await setFacilitatedUserIdOrDelete(userIdToDelete, teamIds, dataLoader)
await r({
  nullifyCreatedBy: r
    .table('NewMeeting')
    .getAll(r.args(teamIds), {index: 'teamId'})
    .filter((row: RValue) => row('createdBy').eq(userIdToDelete))
    .update({createdBy: null})
    .run(),

Ensure proper error handling for asynchronous operations.

The call to setFacilitatedUserIdOrDelete and subsequent RethinkDB operations lack error handling. Consider adding try-catch blocks to handle potential errors.

await setFacilitatedUserIdOrDelete(userIdToDelete, teamIds, dataLoader)
try {
  await r({
    nullifyCreatedBy: r
      .table('NewMeeting')
      .getAll(r.args(teamIds), {index: 'teamId'})
      .filter((row: RValue) => row('createdBy').eq(userIdToDelete))
      .update({createdBy: null})
      .run(),
    teamMember: r.table('TeamMember').getAll(userIdToDelete, {index: 'userId'}).delete(),
    meetingMember: r.table('MeetingMember').getAll(userIdToDelete, {index: 'userId'}).delete(),
    notification: r.table('Notification').getAll(userIdToDelete, {index: 'userId'}).delete(),
    suggestedAction: r.table('SuggestedAction').getAll(userIdToDelete, {index: 'userId'}).delete(),
    createdTasks: r
      .table('Task')
      .getAll(r.args(teamIds), {index: 'teamId'})
      .filter((row: RValue) => row('createdBy').eq(userIdToDelete))
      .delete(),
    agendaItem: r
      .table('AgendaItem')
      .getAll(r.args(teamIds), {index: 'teamId'})
      .filter((row: RValue) => r(teamMemberIds).contains(row('teamMemberId')))
      .delete(),
    pushInvitation: r.table('PushInvitation').getAll(userIdToDelete, {index: 'userId'}).delete(),
    slackNotification: r
      .table('SlackNotification')
      .getAll(userIdToDelete, {index: 'userId'})
      .delete(),
    invitedByTeamInvitation: r
      .table('TeamInvitation')
      .getAll(r.args(teamIds), {index: 'teamId'})
      .filter((row: RValue) => row('invitedBy').eq(userIdToDelete))
      .delete(),
    createdByTeamInvitations: r
      .table('TeamInvitation')
      .getAll(r.args(teamIds), {index: 'teamId'})
      .filter((row: RValue) => row('acceptedBy').eq(userIdToDelete))
      .update({acceptedBy: ''}),
    comment: r
      .table('Comment')
      .getAll(r.args(teamDiscussionIds), {index: 'discussionId'})
      .filter((row: RValue) => row('createdBy').eq(userIdToDelete))
      .update({
        createdBy: tombstoneId,
        isAnonymous: true
      })
  }).run()
} catch (error) {
  console.error('Error in hardDeleteUser RethinkDB operations:', error)
  throw error
}

@@ -27,10 +26,11 @@ const removeTeamMember = async (
) => {
const {evictorUserId} = options
const r = await getRethink()
const pg = getKysely()

Ensure proper error handling for asynchronous operations.

The function removeTeamMember performs multiple asynchronous operations without error handling. Consider adding try-catch blocks to handle potential errors.

const removeTeamMember = async (
  teamMemberId: string,
  options: Options,
  dataLoader: DataLoaderWorker
) => {
  const {evictorUserId} = options
  const r = await getRethink()
  const pg = getKysely()
  const now = new Date()
  const {userId, teamId} = fromTeamMemberId(teamMemberId)
  try {
    // see if they were a leader, make a new guy leader so later we can reassign tasks
    const activeTeamMembers = await dataLoader.get('teamMembersByTeamId').load(teamId)
    const teamMember = activeTeamMembers.find((t) => t.id === teamMemberId)
    const {isLead, isNotRemoved} = teamMember ?? {}
    // if the guy being removed is the leader & not the last, pick a new one. else, use him
    const teamLeader = activeTeamMembers.find((t) => t.isLead === !isLead) || teamMember
    if (!isNotRemoved || !teamMember || !teamLeader) {
      throw new Error('Team member already removed')
    }

    if (activeTeamMembers.length === 1) {
      await Promise.all([
        // archive single-person teams
        pg.updateTable('Team').set({isArchived: true}).where('id', '=', teamId).execute(),
        // delete all tasks belonging to a 1-person team
        r.table('Task').getAll(teamId, {index: 'teamId'}).delete()
      ])
    } else if (isLead) {
      await pg
        .updateTable('TeamMember')
        .set(({not}) => ({isLead: not('isLead')}))
        .where('id', 'in', [teamMemberId, teamLeader.id])
        .execute()
      // assign new leader, remove old leader flag
      await r({
        newTeamLead: r.table('TeamMember').get(teamLeader.id).update({
          isLead: true
        }),
        oldTeamLead: r.table('TeamMember').get(teamMemberId).update({isLead: false})
      }).run()
    }

    await pg
      .updateTable('TeamMember')
      .set({isNotRemoved: false})
      .where('id', '=', teamMemberId)
      .execute()
    // assign active tasks to the team lead
    const {integratedTasksToArchive, reassignedTasks} = await r({
      teamMember: r.table('TeamMember').get(teamMemberId).update({
        isNotRemoved: false,
        updatedAt: now
      }),
      integratedTasksToArchive: r
        .table('Task')
        .getAll(userId, {index: 'userId'})
        .filter({teamId})
        .filter((task: RDatum) => {
          return r.and(
            task('tags').contains('archived').not(),
            task('integrations').default(null).ne(null)
          )
        })
        .coerceTo('array') as unknown as Task[],
      reassignedTasks: r
        .table('Task')
        .getAll(userId, {index: 'userId'})
        .filter({teamId})
        .filter((task: RDatum) =>
          r.and(task('tags').contains('archived').not(), task('integrations').default(null).eq(null))
        )
        .update(
          {
            userId: teamLeader.userId
          },
          {returnChanges: true}
        )('changes')('new_val')
        .default([]) as unknown as Task[]
    }).run()

    await pg
      .updateTable('User')
      .set(({fn, ref, val}) => ({tms: fn('ARRAY_REMOVE', [ref('tms'), val(teamId)])}))
      .where('id', '=', userId)
      .execute()
    dataLoader.clearAll(['users', 'teamMembers'])
    const user = await dataLoader.get('users').load(userId)

    let notificationId: string | undefined
    if (evictorUserId) {
      const notification = new NotificationKickedOut({teamId, userId, evictorUserId})
      notificationId = notification.id
      await r.table('Notification').insert(notification).run()
    }

    const archivedTasks = await archiveTasksForDB(integratedTasksToArchive)
    const archivedTaskIds = archivedTasks.map(({id}) => id)
    const agendaItemIds = await r
      .table('AgendaItem')
      .getAll(teamId, {index: 'teamId'})
      .filter((row: RDatum) => row('teamMemberId').eq(teamMemberId))
      .getField('id')
      .run()

    // if a new meeting was currently running, remove them from it
    const filterFn = (stage: CheckInStage | UpdatesStage | EstimateStage | AgendaItemsStage) =>
      (stage as CheckInStage | UpdatesStage).teamMemberId === teamMemberId ||
      agendaItemIds.includes((stage as AgendaItemsStage).agendaItemId)
    removeSlackAuths(userId, teamId)
    await removeStagesFromMeetings(filterFn, teamId, dataLoader)
    await removeUserFromMeetingStages(userId, teamId, dataLoader)
    // TODO should probably just inactivate the meeting member
    const activeMeetings = await dataLoader.get('activeMeetingsByTeamId').load(teamId)
    const meetingIds = activeMeetings.map(({id}) => id)
    await r
      .table('MeetingMember')
      .getAll(r.args(meetingIds), {index: 'meetingId'})
      .filter({userId})
      .delete()
      .run()

    // Reassign facilitator for meetings this user is facilitating.
    const facilitatingMeetings = await r
      .table('NewMeeting')
      .getAll(r.args(meetingIds), {index: 'id'})
      .filter({
        facilitatorUserId: userId
      })
      .run()

    const newMeetingFacilitators = (
      await dataLoader
        .get('meetingMembersByMeetingId')
        .loadMany(facilitatingMeetings.map((meeting) => meeting.id))
    )
      .filter(errorFilter)
      .map((members) => members[0])
      .filter((member) => !!member)

    Promise.allSettled(
      newMeetingFacilitators.map(async (newFacilitator) => {
        if (!newFacilitator) {
          // This user is the only meeting member, so do nothing.
          // :TODO: (jmtaber129): Consider closing meetings where this user is the only meeting
          // member.
          return
        }
        await r
          .table('NewMeeting')
          .get(newFacilitator.meetingId)
          .update({
            facilitatorUserId: newFacilitator.userId,
            updatedAt: now
          })
          .run()
      })
    )

    return {
      user,
      notificationId,
      archivedTaskIds,
      reassignedTaskIds: reassignedTasks.map(({id}) => id)
    }
  } catch (error) {
    console.error('Error in removeTeamMember:', error)
    throw error
  }
}

Comment on lines 48 to 66
  .with('TeamInsert', (qc) => qc.insertInto('Team').values(verifiedTeam))
  .with('UserUpdate', (qc) =>
    qc
      .updateTable('User')
      .set({tms: sql`arr_append_uniq("tms", ${teamId})`})
      .where('id', '=', userId)
  )
  .with('TeamMemberInsert', (qc) =>
    qc.insertInto('TeamMember').values({
      id: TeamMemberId.join(teamId, userId),
      teamId,
      userId,
      picture,
      preferredName,
      email,
      isLead: true,
      openDrawer: 'manageTeam'
    })
  )

Add error handling for database operations.

The function should handle potential errors from the database operations to ensure robustness.

export default async function createTeamAndLeader(
  user: IUser,
  newTeam: ValidNewTeam,
  dataLoader: DataLoaderInstance
) {
  try {
    const r = await getRethink()
    const {id: userId, picture, preferredName, email} = user
    const {id: teamId, orgId} = newTeam
    const organization = await dataLoader.get('organizations').loadNonNull(orgId)
    const {tier, trialStartDate} = organization
    const verifiedTeam = new Team({...newTeam, createdBy: userId, tier, trialStartDate})
    const meetingSettings = [
      new MeetingSettingsRetrospective({teamId}),
      new MeetingSettingsAction({teamId}),
      new MeetingSettingsPoker({teamId})
    ]
    const timelineEvent = new TimelineEventCreatedTeam({
      createdAt: new Date(Date.now() + 5),
      userId,
      teamId,
      orgId
    })

    const pg = getKysely()
    await Promise.all([
      pg
        .with('TeamInsert', (qc) => qc.insertInto('Team').values(verifiedTeam))
        .with('UserUpdate', (qc) =>
          qc
            .updateTable('User')
            .set({tms: sql`arr_append_uniq("tms", ${teamId})`})
            .where('id', '=', userId)
        )
        .with('TeamMemberInsert', (qc) =>
          qc.insertInto('TeamMember').values({
            id: TeamMemberId.join(teamId, userId),
            teamId,
            userId,
            picture,
            preferredName,
            email,
            isLead: true,
            openDrawer: 'manageTeam'
          })
        )
        .insertInto('TimelineEvent')
        .values(timelineEvent)
        .execute(),
      // add meeting settings
      r.table('MeetingSettings').insert(meetingSettings).run(),
      // denormalize common fields to team member
      insertNewTeamMember(user, teamId, dataLoader)
    ])
    dataLoader.clearAll(['teams', 'users', 'teamMembers', 'timelineEvents', 'meetingSettings'])
  } catch (error) {
    console.error('Error creating team and leader:', error)
  }
}

Comment on lines 51 to 57
const orgTeams = (await dataLoader.get('teamsByOrgIds').loadMany(orgIds)).filter(isValid).flat()
const teamIds = orgTeams.map(({id}) => id)
const teamMembers = (await dataLoader.get('teamMembersByTeamId').loadMany(teamIds))
  .filter(isValid)
  .flat()
const leadTeamMembers = teamMembers.filter(({isLead}) => isLead)
const leadUserIds = [...new Set(leadTeamMembers.map(({userId}) => userId))]

Add error handling for database operations.

The function should handle potential errors from the database operations to ensure robustness.

const requestToJoinDomain: MutationResolvers['requestToJoinDomain'] = async (
  _source,
  {},
  {authToken, dataLoader}
) => {
  try {
    const r = await getRethink()
    const operationId = dataLoader.share()
    const subOptions = {operationId}
    const pg = getKysely()
    const viewerId = getUserId(authToken)
    const viewer = await dataLoader.get('users').loadNonNull(viewerId)
    const domain = getDomainFromEmail(viewer.email)
    const now = new Date()

    const orgIds = await getEligibleOrgIdsByDomain(domain, viewerId, dataLoader)

    if (!orgIds.length) {
      return standardError(new Error('No relevant organizations in the domain were found'))
    }

    const insertResult = await pg
      .insertInto('DomainJoinRequest')
      .values({
        createdBy: viewerId,
        domain,
        expiresAt: new Date(now.getTime() + ms(`${REQUEST_EXPIRATION_DAYS}d`))
      })
      .onConflict((oc) => oc.columns(['createdBy', 'domain']).doNothing())
      .returning('id')
      .executeTakeFirst()

    if (!insertResult) {
      // Request is already exists, lets return success without sending duplicate notifications
      return {success: true}
    }

    const orgTeams = (await dataLoader.get('teamsByOrgIds').loadMany(orgIds)).filter(isValid).flat()
    const teamIds = orgTeams.map(({id}) => id)
    const teamMembers = (await dataLoader.get('teamMembersByTeamId').loadMany(teamIds))
      .filter(isValid)
      .flat()
    const leadTeamMembers = teamMembers.filter(({isLead}) => isLead)
    const leadUserIds = [...new Set(leadTeamMembers.map(({userId}) => userId))]

    const notificationsToInsert = leadUserIds.map((userId) => {
      return new NotificationRequestToJoinOrg({
        userId,
        email: viewer.email,
        name: viewer.preferredName,
        picture: viewer.picture,
        requestCreatedBy: viewerId,
        domainJoinRequestId: insertResult.id
      })
    })

    await r.table('Notification').insert(notificationsToInsert).run()

    notificationsToInsert.forEach((notification) => {
      publishNotification(notification, subOptions)
    })

    return {success: true}
  } catch (error) {
    console.error('Error requesting to join domain:', error)
    return {success: false}
  }
}

Signed-off-by: Matt Krick <matt.krick@gmail.com>
@github-actions bot added size/m and removed size/xl labels Jul 19, 2024

@coderabbitai bot left a comment


Actionable comments posted: 9

Outside diff range, codebase verification and nitpick comments (2)
packages/server/postgres/migrations/1721356124871_TeamMember-phase2.ts (2)

1-4: Consider importing only necessary modules.

Importing the entire rethinkdb-ts package might be unnecessary if only specific functions are used.

- import {r} from 'rethinkdb-ts'
+ import {r, connect} from 'rethinkdb-ts'

54-54: Remove the unnecessary await on the console.log statement.

Awaiting console.log is redundant; the await can be dropped.

- await console.log('Adding index complete')
+ console.log('Adding index complete')

Comment on lines +33 to +41
// some dates are
await r(dateDates)
  .forEach((row: any) => {
    return r
      .table('TeamMember')
      .get(row('id'))
      .update({updatedAt: row('updatedAt')})
  })
  .run()

Handle potential errors during date conversion in RethinkDB.

Converting dates in RethinkDB should include error handling to manage potential issues.

- await r(dateDates)
-   .forEach((row: any) => {
-     return r
-       .table('TeamMember')
-       .get(row('id'))
-       .update({updatedAt: row('updatedAt')})
-   })
-   .run()
+ try {
+   await r(dateDates)
+     .forEach((row: any) => {
+       return r
+         .table('TeamMember')
+         .get(row('id'))
+         .update({updatedAt: row('updatedAt')})
+     })
+     .run()
+ } catch (error) {
+   console.error('Failed to convert dates in RethinkDB:', error)
+   throw error
+ }

Comment on lines +131 to +137
export async function down() {
  await connectRethinkDB()
  try {
    await r.table('TeamMember').indexDrop('updatedAtId').run()
  } catch {
    // index already dropped
  }

Avoid catching errors just to log and rethrow them.

Let the errors propagate naturally to be handled by the calling function or global error handler.

- try {
-   await r.table('TeamMember').indexDrop('updatedAtId').run()
- } catch {
-   // index already dropped
- }
+ await r.table('TeamMember').indexDrop('updatedAtId').run()

Comment on lines +8 to +12
const pg = new Kysely<any>({
  dialect: new PostgresDialect({
    pool: getPg()
  })
})

Ensure PostgreSQL connection problems surface clearly.

Constructing a Kysely instance does not open a connection by itself; the pool acquires connections lazily, so misconfiguration only surfaces when the first query runs. Verify connectivity (or handle the first query's error) before starting the copy.

- const pg = new Kysely<any>({
-   dialect: new PostgresDialect({
-     pool: getPg()
-   })
- })
+ let pg
+ try {
+   pg = new Kysely<any>({
+     dialect: new PostgresDialect({
+       pool: getPg()
+     })
+   })
+ } catch (error) {
+   console.error('Failed to connect to PostgreSQL:', error)
+   throw error
+ }
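
A small connectivity probe right after construction is one way to fail fast; a sketch, assuming the sql helper exported by kysely and the pg instance above:

import {sql} from 'kysely'

// Probe the pool once so misconfiguration fails here rather than mid-migration
try {
  await sql`select 1`.execute(pg)
} catch (error) {
  console.error('Failed to connect to PostgreSQL:', error)
  throw error
}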

Comment on lines +6 to +7
export async function up() {
await connectRethinkDB()

Ensure proper error handling for database connections.

The connection to RethinkDB should include error handling to manage potential connection issues.

- await connectRethinkDB()
+ try {
+   await connectRethinkDB()
+ } catch (error) {
+   console.error('Failed to connect to RethinkDB:', error)
+   throw error
+ }
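
Beyond wrapping the connect call, a try/finally around the migration body keeps the RethinkDB pool from leaking on failure; a sketch, assuming rethinkdb-ts's pool API:

try {
  await connectRethinkDB()
  // ... migration work ...
} finally {
  // Drain the connection pool so the migration process can exit even if a step throws
  await r.getPoolMaster()?.drain()
}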

Comment on lines +23 to +27
const strDates = await r
.table('TeamMember')
.filter((row) => row('updatedAt').typeOf().eq('STRING'))
.pluck('updatedAt', 'id', 'createdAt')
.run()

Handle potential errors during date filtering in RethinkDB.

Filtering dates in RethinkDB should include error handling to manage potential issues.

- const strDates = await r
-   .table('TeamMember')
-   .filter((row) => row('updatedAt').typeOf().eq('STRING'))
-   .pluck('updatedAt', 'id', 'createdAt')
-   .run()
+ let strDates
+ try {
+   strDates = await r
+     .table('TeamMember')
+     .filter((row) => row('updatedAt').typeOf().eq('STRING'))
+     .pluck('updatedAt', 'id', 'createdAt')
+     .run()
+ } catch (error) {
+   console.error('Failed to filter dates in RethinkDB:', error)
+   throw error
+ }
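
For context, these rows presumably feed the dateDates update shown earlier; a hypothetical conversion step (names assumed, not taken from the diff) might look like:

// Hypothetical: parse the string timestamps into Date objects before writing them back
const dateDates = strDates.map(({id, updatedAt, createdAt}: any) => ({
  id,
  updatedAt: new Date(updatedAt),
  createdAt: new Date(createdAt)
}))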

Comment on lines +138 to +143
const pg = new Kysely<any>({
dialect: new PostgresDialect({
pool: getPg()
})
})
await pg.deleteFrom('TeamMember').execute()

Ensure failures in the PostgreSQL teardown are handled deliberately.

As in up(), constructing Kysely is lazy, so any connection problem surfaces when deleteFrom('TeamMember') executes; handle or propagate that error intentionally rather than relying on the constructor throwing.

- const pg = new Kysely<any>({
-   dialect: new PostgresDialect({
-     pool: getPg()
-   })
- })
+ let pg
+ try {
+   pg = new Kysely<any>({
+     dialect: new PostgresDialect({
+       pool: getPg()
+     })
+   })
+ } catch (error) {
+   console.error('Failed to connect to PostgreSQL:', error)
+   throw error
+ }

Comment on lines +43 to +52
try {
console.log('Adding index')
await r
.table('TeamMember')
.indexCreate('updatedAtId', (row: any) => [row('updatedAt'), row('id')])
.run()
await r.table('TeamMember').indexWait().run()
} catch {
// index already exists
}

Avoid a bare catch that silently swallows every error.

If the index may already exist, check for it explicitly (for example via indexList()) or let the error propagate to the calling function or global error handler, so that real failures aren't hidden alongside the expected one.

- try {
-   console.log('Adding index')
-   await r
-     .table('TeamMember')
-     .indexCreate('updatedAtId', (row: any) => [row('updatedAt'), row('id')])
-     .run()
-   await r.table('TeamMember').indexWait().run()
- } catch {
-   // index already exists
- }
+ console.log('Adding index')
+ await r
+   .table('TeamMember')
+   .indexCreate('updatedAtId', (row: any) => [row('updatedAt'), row('id')])
+   .run()
+ await r.table('TeamMember').indexWait().run()
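
A sketch of a guarded form, assuming the same index name and shared r instance used in this migration:

// Create the index only when it is missing, then wait for it to be ready
const indexes = await r.table('TeamMember').indexList().run()
if (!indexes.includes('updatedAtId')) {
  console.log('Adding index')
  await r
    .table('TeamMember')
    .indexCreate('updatedAtId', (row: any) => [row('updatedAt'), row('id')])
    .run()
}
await r.table('TeamMember').indexWait('updatedAtId').run()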

Comment on lines +14 to +22
// add a dummy date for nulls
const parabolEpoch = new Date('2016-06-01')
await r
.table('TeamMember')
.update((row) => ({
updatedAt: row('updatedAt').default(parabolEpoch),
createdAt: row('createdAt').default(parabolEpoch)
}))
.run()

Handle potential errors during date updates in RethinkDB.

Updating dates in RethinkDB should include error handling to manage potential issues.

- await r
-   .table('TeamMember')
-   .update((row) => ({
-     updatedAt: row('updatedAt').default(parabolEpoch),
-     createdAt: row('createdAt').default(parabolEpoch)
-   }))
-   .run()
+ try {
+   await r
+     .table('TeamMember')
+     .update((row) => ({
+       updatedAt: row('updatedAt').default(parabolEpoch),
+       createdAt: row('createdAt').default(parabolEpoch)
+     }))
+     .run()
+ } catch (error) {
+   console.error('Failed to update dates in RethinkDB:', error)
+   throw error
+ }

Comment on lines +75 to +128
let curUpdatedAt = r.minval
let curId = r.minval
for (let i = 0; i < 1e6; i++) {
console.log('inserting row', i * BATCH_SIZE, String(curUpdatedAt), String(curId))
const rawRowsToInsert = (await r
.table('TeamMember')
.between([curUpdatedAt, curId], [r.maxval, r.maxval], {
index: 'updatedAtId',
leftBound: 'open',
rightBound: 'closed'
})
.orderBy({index: 'updatedAtId'})
.limit(BATCH_SIZE)
.pluck(...PG_COLS)
.run()) as TeamMember[]

const rowsToInsert = rawRowsToInsert.map((row) => {
const {preferredName, picture, ...rest} = row as any
return {
...rest,
preferredName: preferredName.slice(0, 100),
picture: picture.slice(0, 2056)
}
})
if (rowsToInsert.length === 0) break
const lastRow = rowsToInsert[rowsToInsert.length - 1]
curUpdatedAt = lastRow.updatedAt
curId = lastRow.id
try {
await pg
.insertInto('TeamMember')
.values(rowsToInsert)
.onConflict((oc) => oc.doNothing())
.execute()
} catch (e) {
await Promise.all(
rowsToInsert.map(async (row) => {
try {
await pg
.insertInto('TeamMember')
.values(row)
.onConflict((oc) => oc.doNothing())
.execute()
} catch (e) {
if (e.constraint === 'fk_userId' || e.constraint === 'fk_teamId') {
console.log(`Skipping ${row.id} because it has no user/team`)
return
}
console.log(e, row)
}
})
)
}
}

Optimise the data migration loop's logging.

Logging on every batch is noisy for large tables; logging every 100 batches keeps progress visible with far less output. The suggested change also drops the arbitrary 1e6 iteration cap and relies on the empty-batch break to terminate.

- for (let i = 0; i < 1e6; i++) {
-   console.log('inserting row', i * BATCH_SIZE, String(curUpdatedAt), String(curId))
+ for (let i = 0; ; i++) {
+   if (i % 100 === 0) {
+     console.log('inserting row', i * BATCH_SIZE, String(curUpdatedAt), String(curId))
+   }
    const rawRowsToInsert = (await r
      .table('TeamMember')
      .between([curUpdatedAt, curId], [r.maxval, r.maxval], {
        index: 'updatedAtId',
        leftBound: 'open',
        rightBound: 'closed'
      })
      .orderBy({index: 'updatedAtId'})
      .limit(BATCH_SIZE)
      .pluck(...PG_COLS)
      .run()) as TeamMember[]

    const rowsToInsert = rawRowsToInsert.map((row) => {
      const {preferredName, picture, ...rest} = row as any
      return {
        ...rest,
        preferredName: preferredName.slice(0, 100),
        picture: picture.slice(0, 2056)
      }
    })
    if (rowsToInsert.length === 0) break
    const lastRow = rowsToInsert[rowsToInsert.length - 1]
    curUpdatedAt = lastRow.updatedAt
    curId = lastRow.id
    try {
      await pg
        .insertInto('TeamMember')
        .values(rowsToInsert)
        .onConflict((oc) => oc.doNothing())
        .execute()
    } catch (e) {
      await Promise.all(
        rowsToInsert.map(async (row) => {
          try {
            await pg
              .insertInto('TeamMember')
              .values(row)
              .onConflict((oc) => oc.doNothing())
              .execute()
          } catch (e) {
            if (e.constraint === 'fk_userId' || e.constraint === 'fk_teamId') {
              console.log(`Skipping ${row.id} because it has no user/team`)
              return
            }
            console.log(e, row)
          }
        })
      )
    }
  }
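
If the fallback path grows further, one possible refactor (a sketch, not the PR's actual code) is to extract the batch insert with a sequential per-row retry, which also puts less pressure on the connection pool than Promise.all:

const insertBatchWithFallback = async (rows: Record<string, any>[]) => {
  try {
    // Fast path: insert the whole batch, ignoring conflicts
    await pg
      .insertInto('TeamMember')
      .values(rows)
      .onConflict((oc) => oc.doNothing())
      .execute()
  } catch {
    // Slow path: retry one row at a time so a single bad row cannot block the batch
    for (const row of rows) {
      try {
        await pg
          .insertInto('TeamMember')
          .values(row)
          .onConflict((oc) => oc.doNothing())
          .execute()
      } catch (e: any) {
        if (e.constraint === 'fk_userId' || e.constraint === 'fk_teamId') {
          console.log(`Skipping ${row.id} because it has no user/team`)
          continue
        }
        console.error(e, row)
      }
    }
  }
}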

@mattkrick mattkrick merged commit 90de32f into master Jul 23, 2024
7 checks passed
@mattkrick mattkrick deleted the chore/teamMember-phase2 branch July 23, 2024 23:26
@github-actions github-actions bot mentioned this pull request Jul 23, 2024
24 tasks