
feat(server): Import or Export Project Functionality #1141

Merged
merged 7 commits from feat/project-import-export on Sep 18, 2024

Conversation

@hexaforce (Contributor) commented Sep 12, 2024

Overview

Import or Export Project Functionality

About Export

The JSON format for export is as follows. It is broadly divided into three sections: plugins, project, and scene. The scene section is the sceneJSON generated by the Builder.

{
  "plugins": [
    { ... }
  ],
  "project": {
    ...
  },
  "scene": {
    ...
    "widgets": [
      { ... }
    ],
    "widgetAlignSystem": {
      ...
    },
    "story": {
      ...
      "pages": [
        {
          ...
          "blocks": [
            { ... }
          ],
        }
      ]
    },
    "nlsLayers": [
      { ... }
    ],
    "layerStyles": [
      { ... }
    ]
  }
}

Importing a property requires its type, but the scene JSON does not include type information in its properties, so we added an export flag to the Builder instance.

type Builder struct {
   ...
+  exportType bool
}

Without the flag, the property output looks like this:

"property": {
  "default": {
    "text": "hogehoge"
  }
}

With the export flag set, the type is included and the output becomes:

"property": {
  "default": {
    "text": {
      "type": "string",
      "value": "hogehoge"
    }
  }
}
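
As a concrete illustration, here is a minimal, self-contained sketch of what the flag does. The names (fieldValue, encodeField) are purely illustrative and are not the actual Builder or property code, which routes this through the sealed property's Interface method:

package main

import (
	"encoding/json"
	"fmt"
)

// fieldValue stands in for a single sealed property field.
type fieldValue struct {
	typ   string
	value any
}

// interfaceValue mirrors the plain output: just the raw value.
func (f fieldValue) interfaceValue() any { return f.value }

// interfaceWithType mirrors the export output: the value together with its type.
func (f fieldValue) interfaceWithType() any {
	return map[string]any{"type": f.typ, "value": f.value}
}

// encodeField switches between the two representations based on the export flag.
func encodeField(f fieldValue, export bool) any {
	if export {
		return f.interfaceWithType()
	}
	return f.interfaceValue()
}

func main() {
	f := fieldValue{typ: "string", value: "hogehoge"}
	for _, export := range []bool{false, true} {
		out, _ := json.MarshalIndent(map[string]any{
			"default": map[string]any{"text": encodeField(f, export)},
		}, "", "  ")
		fmt.Println(string(out))
	}
}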

About Import

Each object is imported with its original ID, so objects that already exist in the database will be overwritten. However, new IDs are generated for the associated properties.
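
To make the ID policy concrete, here is an illustrative-only sketch (stand-in types and an in-memory map instead of the real repositories): the object keeps the ID from the file, so a re-import overwrites the stored record, while its property is always re-created under a newly generated ID.

package main

import "fmt"

type object struct {
	ID         string // taken from the imported JSON as-is
	PropertyID string // always newly generated on import
}

// repo is a stand-in for a DB repository keyed by object ID.
var repo = map[string]object{}

var propSeq int

func newPropertyID() string {
	propSeq++
	return fmt.Sprintf("property-%d", propSeq)
}

// importObject upserts by the imported ID but assigns a fresh property ID.
func importObject(importedID string) object {
	o := object{ID: importedID, PropertyID: newPropertyID()}
	repo[importedID] = o // same ID: any existing record is overwritten
	return o
}

func main() {
	fmt.Println(importObject("widget-01")) // first import
	fmt.Println(importObject("widget-01")) // re-import overwrites the record, property ID changes
}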

Currently, the plugins, layers, tags, and clusters sections of the scene are not supported:

"scene": {
  ...
  "plugins": {}, // not supported
  "layers": [], // not supported
  "tags": [], // not supported
  "clusters": [], // not supported
}

About GraphQL

We have added ExportProject and ImportProject mutations to the Project GraphQL schema.

ExportProject returns the raw JSON data. Downloading it as a .reearth file is expected to be handled on the browser side.

ImportProject is designed to accept an uploaded .reearth file.
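
For reference, a minimal client-side sketch of calling ExportProject over HTTP; the endpoint path and the query string mirror the e2e test further down, while the host, auth header, project ID, and token are placeholders:

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// GraphQL request body for the ExportProject mutation.
	body, _ := json.Marshal(map[string]any{
		"operationName": "ExportProject",
		"query":         "mutation ExportProject($projectId: ID!) { exportProject(input: {projectId: $projectId}) { projectData } }",
		"variables":     map[string]any{"projectId": "<project-id>"}, // placeholder
	})

	req, err := http.NewRequest(http.MethodPost, "http://localhost:8080/api/graphql", bytes.NewReader(body)) // assumed local server
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer <token>") // assumed auth scheme

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out map[string]any
	_ = json.NewDecoder(resp.Body).Decode(&out)
	fmt.Println(out["data"]) // data.exportProject.projectData holds the raw JSON to be saved as a .reearth file
}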

What I've done

The uploaded file is now imported into the respective DB repositories (see the sketch after this list for how the file maps onto them):

  • ImportProject() // => project
  • ImportPlugins() // => plugin
  • ImportScene() // => storytelling, property
  • ImportNLSLayers() // => nlsLayer
  • ImportStyles() // => style
  • ImportStoryTelling() // => storytelling, property
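
The sketch below shows how the uploaded file could be split into the inputs these importers expect. The file path and the dispatch are assumptions for illustration: ImportPlugins receives the plugins array, ImportProject the project object, and the scene-level importers all receive the scene object.

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

func main() {
	raw, err := os.ReadFile("project.reearth") // hypothetical path of the uploaded file
	if err != nil {
		panic(err)
	}

	var doc map[string]any
	if err := json.Unmarshal(raw, &doc); err != nil {
		panic(err)
	}

	// The three top-level sections of the export format shown above.
	plugins, _ := doc["plugins"].([]any)              // would feed ImportPlugins
	projectData, _ := doc["project"].(map[string]any) // would feed ImportProject
	sceneData, _ := doc["scene"].(map[string]any)     // would feed ImportScene, ImportNLSLayers, ImportStyles, ImportStoryTelling

	fmt.Printf("plugins: %d, project fields: %d, scene fields: %d\n", len(plugins), len(projectData), len(sceneData))
}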

I have implemented the corresponding tests:

  • TestImportProject()
  • TestImportPlugins()
  • TestImportScene()
  • TestImportNLSLayers()
  • TestImportStyles()
  • TestImportStoryTelling()

I have also implemented end-to-end tests for GraphQL:

  • TestCallExportProject()
  • TestCallImportProject()

I created a common function getWidgePlugin() in server/internal/usecase/interactor/scene.go to retrieve Widget plugins and refactored the code accordingly.

I created a common function ParseSceneJSON() for parsing the scene JSON. Since the sceneJSON type is private, I added this function to the builder package.

Additionally, I created the following functions:

  • ParserWidgetAlignSystem() for parsing the complex WidgetAlignSystem
  • AddItemFromPropertyJSON() for parsing and adding properties

All of these functions are located in server/pkg/scene/builder/decoder.go.

What I haven't done

Regarding Plugins

Plugins only retrieve data from the database; plugin files are not supported.

How I tested

  • Testing with test code
  • Verifying functionality through web calls

Which point I want you to review particularly

Memo

reference
https://www.notion.so/eukarya/Import-or-Export-Project-Functionality-728f84108ad54d8bb0d15b21fc3fa68e

Summary by CodeRabbit

Release Notes

  • New Features

    • Added GraphQL mutations for importing and exporting project data, enhancing data management capabilities.
    • Implemented end-to-end tests for GraphQL import/export functionalities.
    • Introduced a method for importing plugins from external JSON data, improving plugin management.
    • Enhanced scene builder functionality with export mode support for better scene management.
    • Added functionality to import storytelling data, expanding storytelling capabilities.
  • Bug Fixes

    • Improved error handling in various import methods to ensure robustness.
  • Tests

    • Expanded test coverage for new import/export functionalities, including tests for project and storytelling import functionality.

@hexaforce hexaforce self-assigned this Sep 12, 2024

coderabbitai bot commented Sep 12, 2024

Walkthrough

The changes introduce new functionalities for importing and exporting projects and plugins through GraphQL, along with comprehensive end-to-end tests to validate these operations. New input and payload types are defined in the GraphQL schema, and the mutation resolver interface is updated to accommodate these new features. Additionally, methods for importing plugins and storytelling data from JSON structures are added, enhancing the overall capabilities of the server codebase.

Changes

  • server/e2e/gql_import_export_test.go: Introduced end-to-end tests for GraphQL project import and export functionalities.
  • server/internal/adapter/gql/generated.go: Updated to include new input types, payload types, and mutation methods for exporting and importing projects.
  • server/internal/usecase/interactor/plugin.go, server/internal/usecase/interactor/storytelling.go: Added methods for importing plugins and storytelling data from JSON structures.
  • server/internal/usecase/interactor/project.go: Introduced methods for exporting and importing project data, enhancing project management capabilities.
  • server/internal/usecase/interfaces/project.go: Updated interface to include new methods for exporting and importing project data.
  • server/pkg/property/sealed.go, server/pkg/scene/builder/builder.go, server/pkg/scene/builder/scene.go: Modified methods to support new export functionality with conditional checks based on export type.
  • server/pkg/scene/builder/decoder.go: Introduced functionality for parsing JSON representations of scenes and their widget configurations.
  • server/pkg/scene/builder/builder_test.go: Enhanced tests to validate scene builder functionality in both normal and export modes.

Poem

🐰 In the garden where code does bloom,
New features sprout, dispelling gloom.
Projects can now import with grace,
Plugins and styles find their place.
With tests to guard each new delight,
Our codebase shines, oh what a sight! 🌼



netlify bot commented Sep 12, 2024

Deploy Preview for reearth-web canceled.

🔨 Latest commit: c3fe185
🔍 Latest deploy log: https://app.netlify.com/sites/reearth-web/deploys/66ea3bb7f536850008ee01d3

@hexaforce hexaforce force-pushed the feat/project-import-export branch from e90b1de to 07cb2b9 on September 12, 2024 at 09:43
@hexaforce hexaforce requested a review from airslice September 12, 2024 12:04
@hexaforce hexaforce marked this pull request as ready for review September 12, 2024 12:05
@hexaforce hexaforce requested a review from pyshx as a code owner September 12, 2024 12:05

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 7

Outside diff range and nitpick comments (16)
server/internal/usecase/interfaces/plugin.go (1)

24-24: LGTM! Consider adding documentation for the input parameter.

The addition of the ImportPlugins method to the Plugin interface looks good. It expands the functionality to support importing plugins from various sources.

The method signature is well-defined, with the context.Context parameter for request-scoped values and cancellation, and appropriate return values for the imported plugins and errors.

Suggestion:

Consider adding documentation or comments to describe the expected format of the []interface{} input parameter. This will make it clearer for developers to understand what data structure should be passed when calling this method.

server/internal/usecase/interfaces/project.go (1)

71-71: LGTM, but consider adding documentation comments.

The ImportProject method signature looks good:

  • It follows the existing naming conventions in the interface.
  • The parameter type is appropriate for importing project data.
  • The return types are suitable for returning the imported project and any errors.

To improve the readability and maintainability of the interface, consider adding documentation comments for the new ExportProject and ImportProject methods. The comments should describe the purpose, parameters, and return values of each method, providing clear guidance for future developers who may need to implement or use these methods.

server/internal/adapter/gql/gqlmodel/convert_project.go (1)

71-81: Consider returning the error instead of nil.

The function provides a convenient way to convert a JSON representation of a project into a Project struct and handles errors gracefully by returning nil if either the marshaling or unmarshaling fails.

Consider returning the error instead of nil to provide more context about the failure. Apply this diff:

func ToProjectFromJSON(data map[string]any) (*Project, error) {
	var p Project
	bytes, err := json.MarshalIndent(data, "", "  ")
	if err != nil {
-		return nil
+		return nil, err
	}
	if err := json.Unmarshal(bytes, &p); err != nil {
-		return nil
+		return nil, err
	}
-	return &p
+	return &p, nil
}
server/internal/usecase/interactor/plugin.go (1)

55-101: LGTM! Consider adding logging and handling potential data conflicts.

The ImportPlugins function is well-structured and correctly handles the importation of plugins from JSON data. It follows a clear flow of converting the input data, extracting plugin information, creating extension instances, building plugin instances, and saving them to the repository. Error handling is implemented throughout the function to handle potential issues during the import process.

To further enhance the functionality, consider the following suggestions:

  1. Add logging statements at key points in the function to track the progress and status of the import process. This can help with debugging and monitoring the import operation.
  2. Evaluate the impact of importing plugins on existing data in the system. Consider scenarios where imported plugins may conflict with or duplicate existing plugins, and implement appropriate handling mechanisms to resolve such conflicts gracefully.
server/internal/usecase/interfaces/nlslayer.go (1)

89-89: LGTM! Consider using a structured input type and adding unit tests.

The addition of the ImportNLSLayers method to the NLSLayer interface looks good. It provides a way to import NLS layers from a map of key-value pairs, which can be useful for data integration scenarios.

However, consider the following suggestions to improve the method:

  1. Use a more structured input type, such as a custom struct, instead of map[string]interface{}. This will provide better type safety, documentation, and maintainability. It will also make it easier to validate the input and handle errors.

  2. Add unit tests for the method implementation to ensure it handles various input scenarios correctly, such as missing keys, invalid value types, and edge cases. This will help catch bugs early and improve the reliability of the code.

server/internal/adapter/gql/gqlmodel/convert_plugin.go (1)

53-63: Return the actual error for better error handling.

The ToPluginsFromJSON function is a useful addition that provides a mechanism to convert JSON data directly into the plugin model. However, consider improving the error handling by returning the actual error instead of nil. This will help with debugging and provide more context about the failure.

Apply this diff to improve the error handling:

func ToPluginsFromJSON(data []interface{}) ([]*Plugin, error) {
	var plgs []*Plugin
	bytes, err := json.MarshalIndent(data, "", "  ")
	if err != nil {
-		return nil
+		return nil, err
	}
	if err := json.Unmarshal(bytes, &plgs); err != nil {
-		return nil
+		return nil, err
	}
-	return plgs
+	return plgs, nil
}
server/pkg/scene/builder/decoder.go (3)

14-24: Remove the unused ctx parameter and simplify the JSON parsing.

The ctx parameter is unused in the function body and can be removed. Additionally, the indentation produced by json.MarshalIndent is unnecessary since the output is immediately unmarshaled; json.Marshal is sufficient here.

Apply this diff to remove the ctx parameter and simplify the JSON parsing:

-func ParseSceneJSON(ctx context.Context, sceneJSONData map[string]interface{}) (*sceneJSON, error) {
-	sceneBytes, err := json.MarshalIndent(sceneJSONData, "", "  ")
+func ParseSceneJSON(sceneJSONData map[string]interface{}) (*sceneJSON, error) {
+	sceneBytes, err := json.Marshal(sceneJSONData)
 	if err != nil {
 		return nil, err
 	}
 	var result sceneJSON
 	if err := json.Unmarshal(sceneBytes, &result); err != nil {
 		return nil, err
 	}
 	return &result, nil
 }

26-38: Correct the function name.

The function name has a typo: "Parser" instead of "Parse". Consider renaming the function to ParseWidgetAlignSystem for consistency with other parsing functions.

Apply this diff to correct the function name:

-func ParserWidgetAlignSystem(widgetAlignSystemJSON *widgetAlignSystemJSON) *scene.WidgetAlignSystem {
+func ParseWidgetAlignSystem(widgetAlignSystemJSON *widgetAlignSystemJSON) *scene.WidgetAlignSystem {
 	if widgetAlignSystemJSON == nil {
 		return nil
 	}
 	was := scene.NewWidgetAlignSystem()
 	if widgetAlignSystemJSON.Inner != nil {
 		parseWidgetZone(was.Zone(scene.WidgetZoneInner), widgetAlignSystemJSON.Inner)
 	}
 	if widgetAlignSystemJSON.Outer != nil {
 		parseWidgetZone(was.Zone(scene.WidgetZoneOuter), widgetAlignSystemJSON.Outer)
 	}
 	return was
 }

108-126: Consider simplifying the function to improve readability and maintainability.

The function is quite complex and heavily nested, which may affect readability and maintainability. Consider simplifying the function by extracting some of the logic into separate functions or using more descriptive variable names.

Here are a few suggestions to improve the function:

  1. Extract the logic for parsing the schema group ID and field ID into separate functions.
  2. Use more descriptive variable names instead of value1, iVal, value2, etc.
  3. Consider using a separate function for updating the prop with the parsed value to reduce the nesting level.

Example:

func AddItemFromPropertyJSON(prop *property.Property, ps *property.Schema, pj propertyJSON) (*property.Property, error) {
	for sgKey, sgValue := range pj {
		schemaGroupID := parseSchemaGroupID(sgKey)
		if fieldValues, ok := sgValue.(map[string]interface{}); ok {
			for fKey, fValue := range fieldValues {
				fieldID := parseFieldID(fKey)
				ptr := property.NewPointer(schemaGroupID, nil, fieldID)
				if err

server/pkg/scene/builder/builder.go (1)

41-48: Use a more descriptive name for the new parameter.

The changes look good and are consistent with the addition of the new export field. However, consider renaming the parameter exp to something more descriptive, such as isExport, to improve code readability.

server/internal/usecase/interactor/style.go (1)

203-237: LGTM! Consider adding a brief comment to explain the function's behavior.

The ImportStyles function is well-implemented and follows good practices:

  • It correctly handles errors during JSON parsing, style creation, and saving to the repository.
  • It uses the scene.NewStyle() builder pattern to create styles, which enhances readability and maintainability.
  • It saves the imported styles to the repository and returns them as a scene.StyleList.

To further improve the code, consider adding a brief comment explaining the function's behavior and parameters. This will make it easier for other developers to understand its purpose and usage.


Add a brief comment to explain the function's behavior and parameters:

// ImportStyles imports styles from a given scene represented as a JSON object.
// It parses the scene JSON, retrieves the scene ID, and iterates over the layer styles to create new `scene.Style` objects.
// It saves the imported styles to the repository and returns the list of imported styles.
func (i *Style) ImportStyles(ctx context.Context, sceneData map[string]interface{}) (scene.StyleList, error) {
	// ...
}
server/internal/usecase/interactor/storytelling_test.go (1)

22-202: Comprehensive test coverage for the ImportStoryTelling function.

The test function TestImportStoryTelling effectively covers the import functionality of the Storytelling interactor. It sets up the necessary context, mocks dependencies, prepares input data, invokes the function, and asserts the result against the expected JSON. The removal of dynamic fields from the result JSON ensures a focused comparison of the essential data.

Consider the following suggestions to enhance the test:

  1. Extract the expected JSON into a separate file or use a more flexible comparison approach to make the test more resilient to changes in the structure or values.
  2. Break down the test function into smaller, more focused test functions or use helper functions to improve readability and maintainability.
server/internal/usecase/interactor/scene_test.go (1)

21-335: LGTM! The test function is well-structured and covers the basic functionality of the ImportScene function.

The test function sets up the necessary dependencies, prepares test data, invokes the ImportScene function, and verifies the result using JSON equality assertion. It ensures that the function can import a scene with widgets and widget alignment system correctly.

Suggestions for potential improvements:

  1. Consider splitting the test function into smaller, more focused test cases to improve readability and maintainability.
  2. Add more test cases to cover different scenarios and edge cases, such as importing scenes with different types of widgets, layers, and properties.
  3. Extract the test setup logic into a separate helper function to reduce duplication and improve reusability.
server/internal/usecase/interactor/scene.go (1)

584-693: LGTM! Consider breaking down the function for better readability.

The ImportScene function is well-structured and comprehensive in its handling of scene importation. It follows a clear sequence of steps, from parsing the JSON data to creating the necessary entities and saving them to the repositories. The error handling is thorough, covering various potential issues that could arise during the import process.

The function also integrates well with the existing codebase, leveraging established packages and interfaces for parsing, visualization, and data persistence.

To further enhance the readability and maintainability of this function, consider breaking it down into smaller sub-functions that handle specific tasks, such as:

  1. Parsing the scene JSON data
  2. Creating widgets and their properties
  3. Creating clusters and their properties
  4. Building the scene object
  5. Saving the imported data to repositories

This refactoring would make the code more modular and easier to understand at a glance, without affecting the current functionality.

server/internal/usecase/interactor/nlslayer.go (1)

835-877: LGTM with suggestions!

The ImportNLSLayers function is well-structured and follows a clear logic flow for importing NLS layers from a given scene JSON. The error handling is appropriate, and the usage of the nlslayer.New() builder pattern enhances code readability and maintainability.

Consider handling the case when nlsLayerJSON.Config is nil to avoid passing a nil pointer to (*nlslayer.Config)(nlsLayerJSON.Config). You can check if nlsLayerJSON.Config is nil and set a default value or skip the Config() call in the builder chain.

To prevent potential performance issues or resource exhaustion, it would be beneficial to add a check for the maximum number of NLS layers allowed per scene. If the imported layers exceed the defined limit, you can return an appropriate error or truncate the list of layers to fit within the limit.

server/internal/usecase/interactor/storytelling.go (1)

999-1128: LGTM! Consider extracting block and page creation logic.

The ImportStoryTelling function is well-structured and follows a logical flow for importing storytelling data from a JSON structure. It handles the parsing, construction, and saving of various components such as blocks, pages, and stories, with appropriate error handling throughout the process.

To further improve readability and maintainability, consider extracting the block and page creation logic into separate helper functions. This will make the main function more concise and easier to understand at a high level. For example:

func createBlockFromJSON(ctx context.Context, blockJSON *builder.BlockJSON, sceneID id.SceneID, i *Storytelling) (*storytelling.Block, error) {
    // ... block creation logic ...
}

func createPageFromJSON(ctx context.Context, pageJSON *builder.PageJSON, sceneID id.SceneID, i *Storytelling) (*storytelling.Page, error) {
    // ... page creation logic ...
}

Then, you can simplify the main function to:

func (i *Storytelling) ImportStoryTelling(ctx context.Context, sceneData map[string]interface{}) (*storytelling.Story, error) {
    // ... parsing and error handling ...

    pages := []*storytelling.Page{}
    for _, pageJSON := range storyJSON.Pages {
        page, err := createPageFromJSON(ctx, pageJSON, sceneID, i)
        if err != nil {
            return nil, err
        }
        pages = append(pages, page)
    }

    // ... story creation and saving ...
}
Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

Commits

Files that changed from the base of the PR and between c18275f and 1c9469e.

Files selected for processing (39)
  • server/Makefile (3 hunks)
  • server/e2e/gql_import_export_test.go (1 hunks)
  • server/e2e/project.json (1 hunks)
  • server/gql/project.graphql (4 hunks)
  • server/internal/adapter/gql/generated.go (27 hunks)
  • server/internal/adapter/gql/gqlmodel/convert_plugin.go (3 hunks)
  • server/internal/adapter/gql/gqlmodel/convert_project.go (4 hunks)
  • server/internal/adapter/gql/gqlmodel/convert_scene.go (1 hunks)
  • server/internal/adapter/gql/gqlmodel/convert_storytelling.go (1 hunks)
  • server/internal/adapter/gql/gqlmodel/models_gen.go (2 hunks)
  • server/internal/adapter/gql/resolver_mutation_project.go (2 hunks)
  • server/internal/usecase/interactor/nlslayer.go (2 hunks)
  • server/internal/usecase/interactor/nlslayer_test.go (2 hunks)
  • server/internal/usecase/interactor/plugin.go (2 hunks)
  • server/internal/usecase/interactor/plugin_test.go (1 hunks)
  • server/internal/usecase/interactor/project.go (6 hunks)
  • server/internal/usecase/interactor/project_test.go (3 hunks)
  • server/internal/usecase/interactor/scene.go (5 hunks)
  • server/internal/usecase/interactor/scene_test.go (1 hunks)
  • server/internal/usecase/interactor/storytelling.go (3 hunks)
  • server/internal/usecase/interactor/storytelling_test.go (1 hunks)
  • server/internal/usecase/interactor/style.go (2 hunks)
  • server/internal/usecase/interactor/style_test.go (1 hunks)
  • server/internal/usecase/interfaces/nlslayer.go (1 hunks)
  • server/internal/usecase/interfaces/plugin.go (1 hunks)
  • server/internal/usecase/interfaces/project.go (2 hunks)
  • server/internal/usecase/interfaces/scene.go (2 hunks)
  • server/internal/usecase/interfaces/story.go (1 hunks)
  • server/internal/usecase/interfaces/style.go (1 hunks)
  • server/pkg/property/sealed.go (3 hunks)
  • server/pkg/property/sealed_test.go (1 hunks)
  • server/pkg/property/value.go (1 hunks)
  • server/pkg/scene/builder/builder.go (3 hunks)
  • server/pkg/scene/builder/builder_test.go (2 hunks)
  • server/pkg/scene/builder/decoder.go (1 hunks)
  • server/pkg/scene/builder/encoder.go (2 hunks)
  • server/pkg/scene/builder/scene.go (2 hunks)
  • server/pkg/scene/scene.go (2 hunks)
  • server/pkg/scene/widget.go (1 hunks)
Files skipped from review due to trivial changes (1)
  • server/pkg/scene/widget.go
Additional comments not posted (49)
server/Makefile (2)

1-2: LGTM!

The TEST_DIR variable is defined correctly with a default value that allows running tests recursively in all subdirectories. The use of the ?= operator is appropriate as it allows overriding the value from the command line if needed.


10-10: LGTM!

The addition of the failcheck target is a great enhancement to the testing capabilities. The use of the -failfast and -p 1 flags is appropriate for quickly identifying failures and running tests sequentially. The TEST_DIR variable provides flexibility in specifying the directory for running tests. The update to the help section is also helpful for users to understand the purpose of the new target.

Also applies to: 23-24

server/internal/usecase/interfaces/style.go (1)

30-30: LGTM!

The addition of the ImportStyles method to the Style interface is a welcome enhancement. The method signature is well-defined, follows the existing convention, and clearly conveys its purpose. This new functionality will facilitate the bulk import of styles, potentially streamlining the management of styles within the application.

The method parameters and return values are consistent with the rest of the interface, ensuring a cohesive design. The use of map[string]interface{} suggests flexibility in the input format, which could support various import scenarios.

Overall, this is a solid addition to the interface, expanding its capabilities without introducing any apparent issues or inconsistencies.

server/internal/adapter/gql/gqlmodel/convert_scene.go (1)

67-68: LGTM!

The addition of the SceneID field to the Style struct is a good change that provides more context about the association between a style and a scene. The implementation is consistent with the existing code and follows the same pattern of using the IDFrom function to convert IDs.

This change may impact how styles are processed and utilized within the application, so ensure that any code that consumes the Style struct is updated to handle the new SceneID field appropriately.

server/pkg/scene/scene.go (1)

81-90: LGTM!

The PluginIds method is a great addition to the Scene struct. It provides a convenient way to retrieve the identifiers of the plugins associated with a scene, which can be useful in various scenarios.

The implementation is straightforward and handles the case when the Scene instance is nil by returning nil. The use of a slice to store the plugin IDs is appropriate, and the method name follows the naming convention of the other methods in the struct.

Overall, this method enhances the functionality of the Scene struct and improves the interaction with the plugin system.

server/internal/usecase/interfaces/project.go (1)

70-70: LGTM!

The ExportProject method signature looks good:

  • It follows the existing naming conventions in the interface.
  • The parameters are appropriate for exporting a project.
  • The return types are suitable for returning the exported project data, including the project itself, a map of interface data, and a slice of plugins.
server/internal/usecase/interfaces/scene.go (1)

34-34: LGTM!

The addition of the ImportScene method to the Scene interface looks good:

  • The method signature is well-defined and follows the existing convention in the interface.
  • The map[string]interface{} parameter allows for passing the scene data as a generic map, which can be parsed and validated inside the method implementation.
  • Returning (*scene.Scene, error) is appropriate as it allows the method to return the imported scene or an error if the import fails.
  • This addition aligns with the PR objective of introducing import functionality for scenes.
server/internal/adapter/gql/gqlmodel/convert_project.go (2)

19-19: LGTM!

The change improves the robustness of the status conversion function by returning the string representation of the PublishmentStatus enum instead of an empty string when the input does not match any known status.


31-31: LGTM!

The change improves the robustness of the status conversion function by returning the PublishmentStatus enum representation of the input string instead of an empty enum when the input does not match any known status.

server/internal/usecase/interactor/style_test.go (1)

20-93: LGTM!

The test function TestImportStyles is well-structured and provides good coverage for the ImportStyles function. It follows a clear Arrange-Act-Assert pattern and sets up the test scenario correctly. The JSON strings used for the test data and expected result are properly formatted and contain relevant data. The comparison between the actual and expected JSON strings is done using assert.JSONEq, which is appropriate for comparing JSON data.

Overall, the test function looks good and thoroughly tests the functionality of the ImportStyles function.

server/gql/project.graphql (4)

100-102: LGTM!

The ExportProjectInput input type is correctly defined and aligns with the PR objective of introducing project export functionality. The required projectId field is appropriate for identifying the project to be exported.


104-106: LGTM!

The ImportProjectInput input type is correctly defined and aligns with the PR objective of introducing project import functionality. The file field of type Upload is appropriate for facilitating the upload of a project file.


118-120: LGTM!

The ExportProjectPayload payload type is correctly defined and aligns with the PR objective of returning exported project data. The projectData field of type JSON is appropriate for representing the exported project data.


122-124: LGTM!

The ImportProjectPayload payload type is correctly defined and aligns with the PR objective of returning imported project data. The projectData field of type JSON is appropriate for representing the imported project data.

server/internal/adapter/gql/gqlmodel/convert_storytelling.go (1)

23-23: LGTM!

The addition of the SceneID field to the Story struct enhances the ToStory function by providing a more comprehensive mapping of the storytelling.Story data. This change is consistent with the existing code structure and follows the pattern of assigning fields using the respective methods of the storytelling.Story object. The inclusion of the scene ID allows for better tracking or categorization of stories based on their associated scenes.

server/internal/adapter/gql/gqlmodel/convert_plugin.go (2)

47-51: LGTM!

The ToPlugins function is a nice addition that provides a convenient way to convert multiple plugins at once. The use of the lo.Map function is appropriate and helps keep the code concise.


Line range hint 66-88: LGTM!

The modification to the ToPluginExtensionType function improves type handling by returning a string representation of the plugin.ExtensionType when the type does not match any known cases. This change ensures that the function returns a more meaningful value in such scenarios, which can be helpful for logging or debugging purposes.

server/pkg/scene/builder/decoder.go (4)

40-53: LGTM!

The function correctly parses a widgetZoneJSON struct into a scene.WidgetZone. It handles the case when the input zone or widgetZoneJSON is nil and uses the setWidgetSection function to parse the left, center, and right widget sections.


55-62: LGTM!

The function correctly sets the areas of a scene.WidgetSection based on a widgetSectionJSON struct. It handles the case when the input section or widgetSectionJSON is nil and uses the parseWidgetArea function to parse the top, middle, and bottom areas.


83-94: LGTM!

The function correctly parses a string representation of a widget alignment into a scene.WidgetAlignType. It uses a switch statement to map the input string to the corresponding constant and returns scene.WidgetAlignStart as the default alignment if the input string doesn't match any of the cases.


96-106: LGTM!

The function correctly parses a widgetAreaPaddingJSON struct into a scene.WidgetAreaPadding. It handles the case when the input paddingJSON is nil and creates a new scene.WidgetAreaPadding with the values from the input JSON.

server/pkg/property/value.go (1)

116-124: LGTM!

The InterfaceWithType method is a useful addition to the Value struct. It provides a convenient way to retrieve both the value and its type in a single call, improving the usability of the API. The implementation correctly handles the case when the Value instance is nil and leverages the existing Interface and Type methods to retrieve the necessary information. The returned map provides a structured format for accessing the value and type, making it easy for users to work with the data.

Furthermore, the new method complements the existing Interface method without modifying its behavior, ensuring backward compatibility. This is a good practice as it enhances the functionality of the struct while preserving the existing API.

Overall, the code changes are well-implemented and improve the usability and flexibility of the Value struct.

server/internal/usecase/interfaces/story.go (1)

152-153: LGTM!

The ImportStoryTelling method is a great addition to the Storytelling interface. It aligns with the PR objective of introducing import functionality for storytelling data.

The method signature looks good with appropriate parameters and return values. The map[string]interface{} parameter suggests that the method will parse a generic JSON structure to import storytelling data, which is a flexible approach.

The actual implementation of this method will be provided by a concrete type that satisfies the Storytelling interface. It's good to see the interface being extended to support the new import functionality.

server/pkg/scene/builder/encoder.go (2)

112-112: Provide more context about the Interface method change.

The Interface method of the property.Sealed type now accepts a boolean parameter, and the property method is passing false to it. Can you please provide more information about:

  1. What is the purpose of the boolean parameter in the Interface method?
  2. How does passing false impact the behavior of the Interface method and the returned propertyJSON object?
  3. Are there any other parts of the codebase that rely on the Interface method and might be affected by this change?

This additional context will help in understanding the full impact of this modification.


186-186: LGTM! Clarify if there are any related changes.

The addition of the Enabled field to the widgetJSON struct looks good. It introduces the capability to represent the enabled/disabled state of a widget in the JSON format.

Just to confirm:

  1. Are there any other parts of the codebase that need to be updated to set or handle this new Enabled field?
  2. Are there any specific considerations or constraints for determining the enabled/disabled state of a widget?

Providing this additional information will help ensure a smooth integration of this new feature.

server/pkg/scene/builder/builder.go (1)

6-6: LGTM!

The import statement for the "errors" package is correctly formatted and the package is used appropriately within the BuildResult method.

server/pkg/property/sealed.go (3)

Line range hint 120-134: LGTM!

The changes to the Interface method of the Sealed struct introduce a new parameter export to control the level of detail in the interface representation. When export is true, the method likely includes type information in the output, and when export is false, it defaults to the previous behavior. This enhances the flexibility of the interface representation, allowing for different levels of detail based on the context.


Line range hint 136-154: LGTM!

The changes to the Interface method of the SealedItem struct introduce a new parameter export to control the level of detail in the interface representation. The export parameter is propagated to the sealedFieldsInterface function call, maintaining consistency with the updates made to the Interface method of the Sealed struct. The changes preserve the existing behavior of handling groups and fields separately.


Line range hint 156-168: LGTM!

The changes to the sealedFieldsInterface function introduce a new parameter export to control the level of detail in the interface representation of the sealed fields. When export is true, the function calls InterfaceWithType on the field value, likely including type information in the output, and when export is false, it calls Interface on the field value, defaulting to the previous behavior without type information. This enhances the flexibility of the interface representation, allowing for different levels of detail based on the context.

server/internal/adapter/gql/resolver_mutation_project.go (1)

113-131: LGTM!

The ExportProject mutation resolver function looks good:

  • It correctly converts the GraphQL ProjectID input to an internal id.Project type and handles any conversion errors.
  • It calls the Project.ExportProject usecase with the converted project ID and operator context to retrieve the project, plugins, and export data.
  • The returned data is appropriately converted to GraphQL types and encapsulated in an ExportProjectPayload before being returned.
  • Error handling is implemented to propagate any errors encountered during the process.
server/pkg/scene/builder/scene.go (3)

72-72: LGTM! The change provides flexibility in widget inclusion based on the export context.

The added condition !b.export && !w.Enabled() ensures that when b.export is false, disabled widgets will be excluded from the result. This change allows for selective inclusion of widgets based on the export state, providing flexibility in the widget selection process.


145-145: Great catch! The added nil check improves the reliability of the findProperty function.

The nil check p != nil ensures that only valid property pointers are considered when searching for a property by its ID. This change enhances the robustness of the code by preventing potential nil dereference errors and improves the error handling within the findProperty function.


140-140: Verify the usage of the b.export flag in property.SealProperty.

The change suggests that the handling of properties depends on whether the builder is in export mode or not. Passing the b.export flag to property.SealProperty likely affects how properties are sealed based on the export context.

To fully understand the impact of this change, please verify how the b.export flag is used within the property.SealProperty function and ensure that the property sealing behavior aligns with the expected export functionality.

Run the following script to verify the usage of b.export flag:

Verification successful

The usage of the b.export flag is appropriate and consistent with its intended purpose.

The b.export flag is used to control export-related behavior in the Builder struct. It is not directly used within the SealProperty function, but it influences the behavior of the Interface method. The change in the code aligns with the expected functionality for handling export scenarios. No issues were found with this implementation.

  • server/pkg/scene/builder/builder.go: Usage in export logic.
  • server/pkg/scene/builder/scene.go: Conditional logic and method argument.
Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the usage of `b.export` flag in `property.SealProperty`.

# Test: Search for the function definition.
rg --type go -A 10 $'func.*property.SealProperty'

Length of output: 50


Script:

#!/bin/bash
# Description: Use ast-grep to search for the SealProperty function definition in the codebase.

# Search for the SealProperty function definition in Go files.
ast-grep --lang go --pattern 'func SealProperty($_, $_) $_'

Length of output: 482


Script:

#!/bin/bash
# Description: Search for the usage of the `b.export` flag in the Builder struct or related methods.

# Search for the `b.export` flag usage in Go files.
rg --type go 'b\.export'

Length of output: 299

server/internal/usecase/interactor/plugin_test.go (1)

17-175: LGTM!

The test function TestImportPlugins is well-structured and follows the AAA (Arrange-Act-Assert) pattern. It thoroughly tests the ImportPlugins function of the Plugin interactor by setting up the necessary dependencies, preparing the input data, invoking the function, and comparing the actual result with the expected result using JSON equality assertion.

The test function covers the important aspects of the ImportPlugins function, including the handling of multiple extensions within a plugin and the conversion of the result to the appropriate GraphQL model.

Overall, the test function provides good coverage and ensures the correctness of the ImportPlugins functionality.

server/internal/usecase/interactor/project_test.go (1)

146-260: LGTM! The test function is well-structured and thoroughly tests the project import functionality.

The test function follows a clear sequence of steps to set up the necessary dependencies, simulate the import process using predefined JSON data, and validate the correctness of the imported project data. It checks for errors and verifies the expected behavior of the ImportProject method.

Suggestions for improvement:

  1. Consider adding more test cases to cover different scenarios, such as importing projects with missing or invalid data, to ensure the robustness of the import functionality.
  2. You can extract the JSON data used for testing into separate files or constants to improve readability and maintainability of the test code.
  3. Consider adding assertions to verify the state of the database after the import process to ensure that the project data is persisted correctly.

Overall, the test function is a valuable addition to the test suite and enhances the coverage of the project import functionality.

server/internal/usecase/interactor/nlslayer_test.go (1)

264-343: LGTM!

The TestImportNLSLayers function is a well-structured and comprehensive test for the NLS layer import functionality. It follows a clear setup, execution, and assertion pattern, making it easy to understand and maintain.

The test effectively validates the import process by providing a representative JSON input and comparing the output against the expected structure. The use of assert.JSONEq for comparing the actual and expected JSON representations is appropriate and ensures that the imported NLS layers are correctly processed and structured.

Overall, this test enhances the test coverage for the NLS layer functionality and ensures that any future changes to the import logic can be verified against the established behavior.

server/e2e/project.json (3)

2-64: LGTM!

The plugins section has the correct structure and all the required properties. The extension objects within the plugin also look good.


101-340: Verify the layers property and consider moving some properties to the project section.

The overall structure of the scene section looks good. The properties, widgets, layers, story, and styles have the expected structure and data. However, please note the following:

The layers property is null. Please verify if this is intentional or if there should be layer data here.

Run the following script to verify the layers property in scene objects:

The coreSupport, enableGa, and trackingId properties at the end of the scene object seem more relevant to the project than the scene. Consider moving them to the project section.

Apply this diff to move the properties:

@@ -294,10 +294,6 @@
     "position": "right",
     "bgColor": "#b2efd8ff"
   },
-  "coreSupport": true,
-  "enableGa": false,
-  "trackingId": "",
   "nlsLayers": [
     {
       "id": "01j7g9gwj6qbv286pcwwmwq5ds",
@@ -334,6 +330,9 @@
         }
       }
     ],
+    "coreSupport": true,
+    "enableGa": false,
+    "trackingId": ""
   }
 }

65-100: Verify the imageUrl property structure.

The overall structure of the project section and most of its properties look good. However, the imageUrl property is an object instead of a string URL.

Please verify if this is the intended structure for the imageUrl property. Typically, I would expect it to be a string URL.

Run the following script to verify the imageUrl structure:

server/internal/usecase/interactor/project.go (2)

496-554: LGTM!

The ExportProject function correctly retrieves all the necessary data for exporting a project, including the project, scene, plugins, NLS layers, layer styles, and storytelling data. It filters out the "reearth" plugin and uses the builder package to construct the scene JSON, which is a good separation of concerns. The function returns a complete representation of the project for export.


556-609: LGTM!

The ImportProject function correctly imports a project from a JSON representation. It converts the JSON data to a jsonmodel.Project struct for easier access to the fields and converts the project and workspace IDs from strings to their respective types for type safety. The function uses the project.New() builder to create a new project and sets various fields based on the imported data, ensuring that all relevant data is preserved during the import process. Finally, it saves the imported project to the repository, completing the import process.

server/internal/usecase/interactor/scene.go (1)

701-717: Excellent refactoring! The new function improves code reusability and maintainability.

The introduction of the getWidgePlugin function is a great refactoring that addresses code duplication and enhances the maintainability of the codebase. By centralizing the logic for retrieving a widget's plugin and extension, it ensures consistency and reduces the likelihood of errors stemming from duplicated code.

The function's error handling is comprehensive, covering various scenarios that could lead to issues, such as missing plugins or extensions, or mismatched extension types. This helps to catch potential problems early and handle them gracefully.

Moreover, the function integrates seamlessly with the existing codebase, leveraging the established repository interfaces and error types, making it a cohesive addition to the project.

server/pkg/scene/builder/builder_test.go (1)

772-818: LGTM!

The changes to the TestSceneBuilder function look good:

  • The addition of the export flag allows testing the SceneBuilder in both normal and export modes.
  • The test assertions have been updated correctly to match the expected output structure in export mode, which includes type information for properties and widgets.
  • The test setup remains comprehensive, covering various scene elements and scenarios.

Great job on enhancing the test coverage! The changes are safe to merge.

server/internal/usecase/interactor/storytelling.go (1)

43-44: Looks good!

The addition of the propertySchemaRepo field to the Storytelling struct and the corresponding initialization in the constructor are consistent with the usage of the new repository in the ImportStoryTelling function. The changes enable the management of property schemas related to storytelling.

Also applies to: 64-64

server/internal/adapter/gql/gqlmodel/models_gen.go (4)

561-563: LGTM!

The ExportProjectInput struct is defined correctly with the required ProjectID field to identify the project for export.


565-567: LGTM!

The ExportProjectPayload struct is defined correctly with the ProjectData field to hold the exported project data in JSON format.


617-619: LGTM!

The ImportProjectInput struct is defined correctly with the File field of type graphql.Upload to handle file uploads for importing a project.


621-623: LGTM!

The ImportProjectPayload struct is defined correctly with the ProjectData field to hold the imported project data in JSON format.

server/internal/adapter/gql/generated.go (1)

Line range hint 1-1: LGTM!

The code changes in this file are adding new functionality for project export and import operations within the GraphQL schema. The changes include:

  • New input types ExportProjectInput and ImportProjectInput
  • New payload types ExportProjectPayload and ImportProjectPayload
  • Updates to the MutationResolver interface to include ExportProject and ImportProject methods
  • Complexity functions for the new mutations
  • Modifications to the executable schema to handle the new mutations and their arguments
  • Resolver functions for the new mutations

The code changes are implemented correctly without any issues. The new types, methods, and functions follow the existing patterns and conventions of the codebase.

As an observation, this file seems to contain generated GraphQL code. It's important to keep in mind that any manual changes made to this file may be overwritten if the code is regenerated from the GraphQL schema. The actual implementation of the export and import functionality would be present in other files.

Comment on lines 12 to 57
func TestCallExportProject(t *testing.T) {
e := StartServer(t, &config.Config{
Origins: []string{"https://example.com"},
AuthSrv: config.AuthSrvConfig{
Disabled: true,
},
}, true, baseSeeder)

requestBody := GraphQLRequest{
OperationName: "ExportProject",
Query: "mutation ExportProject($projectId: ID!) { exportProject(input: {projectId: $projectId}) { projectData __typename } }",
Variables: map[string]any{
"projectId": pID.String(),
},
}

response := e.POST("/api/graphql").
WithHeader("Origin", "https://example.com").
WithHeader("authorization", "Bearer test").
WithHeader("X-Reearth-Debug-User", uID.String()).
WithHeader("Content-Type", "application/json").
WithJSON(requestBody).
Expect()

response.Status(http.StatusOK)

// response.JSON().
// Object().
// Value("data").Object().
// Value("exportProject").Object().
// Value("projectData").Object()

}

@coderabbitai coderabbitai bot Sep 12, 2024


Add assertions for the response body.

The test setup and GraphQL request look good. Checking for a successful response status is a good start.

However, to ensure the ExportProject mutation is working as expected, please uncomment the response body assertions and add checks for the expected project data structure. This will help verify that the exported data contains the correct information in the right format.

@hexaforce (Contributor, Author) replied:

Disabled due to problems with graphql

coderabbitai bot replied:

@hexaforce, thank you for the clarification. If you need any assistance resolving the GraphQL issues in the future, feel free to reach out.

(_/)?
(•ㅅ•)
⏭️


Learnings added
Learnt from: hexaforce
PR: reearth/reearth-visualizer#1141
File: server/e2e/gql_import_export_test.go:12-44
Timestamp: 2024-09-13T01:19:05.702Z
Learning: The response body assertions in `TestCallExportProject` are disabled due to existing problems with GraphQL.

Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure review instructions. You can manage existing learnings in the app.

Comment on lines 48 to 471
func TestCallImportProject(t *testing.T) {
e := StartServer(t, &config.Config{
Origins: []string{"https://example.com"},
AuthSrv: config.AuthSrvConfig{
Disabled: true,
},
}, true, baseSeeder)

testFilePath := "project.json"

response := e.POST("/api/graphql").
WithHeader("Origin", "https://example.com").
WithHeader("authorization", "Bearer test").
WithHeader("X-Reearth-Debug-User", uID.String()).
WithMultipart().
WithFormField("operations", `{
"operationName": "ImportProject",
"variables": {"file": null},
"query": "mutation ImportProject($file: Upload!) { importProject(input: {file: $file}) { projectData __typename }}"
}`).
WithFormField("map", `{"1":["variables.file"]}`).
WithFile("1", testFilePath).
Expect()

response.Status(http.StatusOK)

// response.JSON().
// Object().
// Value("data").Object().
// Value("exportProject").Object().
// Value("projectData").Object()

}

@coderabbitai coderabbitai bot Sep 12, 2024


Add assertions for the response body and clarify the test fixture.

The test setup and GraphQL request with file upload look good. Checking for a successful response status is a good start.

However, similar to the TestCallExportProject function, please uncomment the response body assertions and add checks for the expected imported project data. This will help verify that the import process is working correctly and the imported data matches the expected structure.

Also, please clarify where the "project.json" file used for the import is located. If it's not already part of the test fixtures, consider adding it to ensure the test is self-contained and reproducible.
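As a reference point, the assertions might look like the sketch below once the GraphQL issue is resolved. Note that the data field for this mutation should be importProject rather than exportProject; the nested keys are assumptions about the payload shape.

importedData := response.JSON().
	Object().
	Value("data").Object().
	Value("importProject").Object().
	Value("projectData").Object()

importedData.ContainsKey("project")
importedData.Value("project").Object().Value("id").String().NotEmpty()

For the fixture, committing a small project.json next to the test (for example under server/e2e/) would keep the test self-contained; the exact location is an assumption here.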

Contributor Author

Disabled due to problems with GraphQL.


@hexaforce, understood. Thank you for the clarification.

Learnings added
Learnt from: hexaforce
PR: reearth/reearth-visualizer#1141
File: server/e2e/gql_import_export_test.go:48-80
Timestamp: 2024-09-13T01:19:11.108Z
Learning: In `server/e2e/gql_import_export_test.go`, test functions like `TestCallImportProject` may have disabled assertions due to problems with GraphQL.


Comment on lines 64 to 84
func parseWidgetArea(widgetAreaJSON *widgetAreaJSON) *scene.WidgetArea {
if widgetAreaJSON == nil {
return nil
}
var widgetIDs []idx.ID[id.Widget]
for _, widgetID := range widgetAreaJSON.WidgetIDs {
id, _ := gqlmodel.ToID[id.Widget](gqlmodel.ID(widgetID))
widgetIDs = append(widgetIDs, id)
}
return scene.NewWidgetArea(
widgetIDs,
parseWidgetAlign(widgetAreaJSON.Align),
parseWidgetAreaPadding(widgetAreaJSON.Padding),
widgetAreaJSON.Gap,
widgetAreaJSON.Centered,
widgetAreaJSON.Background,
)
}
@coderabbitai coderabbitai bot Sep 12, 2024

Handle the error returned by gqlmodel.ToID.

The function uses the gqlmodel.ToID function to convert the widget IDs from gqlmodel.ID to idx.ID[id.Widget]. However, it ignores the error returned by gqlmodel.ToID using the blank identifier _. Consider handling the error to ensure that invalid widget IDs are properly handled.

Apply this diff to handle the error returned by gqlmodel.ToID:

 	var widgetIDs []idx.ID[id.Widget]
 	for _, widgetID := range widgetAreaJSON.WidgetIDs {
-		id, _ := gqlmodel.ToID[id.Widget](gqlmodel.ID(widgetID))
+		id, err := gqlmodel.ToID[id.Widget](gqlmodel.ID(widgetID))
+		if err != nil {
+			// Handle the error, e.g., log it or return an error
+			continue
+		}
 		widgetIDs = append(widgetIDs, id)
 	}
 	return scene.NewWidgetArea(
 		widgetIDs,
 		parseWidgetAlign(widgetAreaJSON.Align),
 		parseWidgetAreaPadding(widgetAreaJSON.Padding),
 		widgetAreaJSON.Gap,
 		widgetAreaJSON.Centered,
 		widgetAreaJSON.Background,
 	)

Contributor Author

ok


@hexaforce, glad to hear that you've addressed the issue!


Comment on lines 37 to 38

export bool
@coderabbitai coderabbitai bot Sep 12, 2024

Rename the field and clarify its purpose.

  • The field name export is not very descriptive and is easy to confuse with Go's notion of exported identifiers. Consider renaming it to something clearer, such as isExport, to avoid confusion.
  • Please provide a comment explaining the purpose of this field and how it is intended to be used within the Builder struct.

Contributor Author

Fixed: renamed to exportType.


@hexaforce Thank you for addressing the issue by renaming the field to exportType. It helps clarify the purpose of the field.


Comment on lines +132 to +167
func (b *Builder) BuildResult(ctx context.Context, publishedAt time.Time, coreSupport bool, enableGa bool, trackingId string) (*sceneJSON, error) {
if b == nil || b.scene == nil {
return nil, errors.New("invalid builder state")
}

sceneData, err := b.buildScene(ctx, publishedAt, coreSupport, enableGa, trackingId)
if err != nil {
return nil, err
}

if b.story != nil {
story, err := b.buildStory(ctx)
if err != nil {
return nil, err
}
sceneData.Story = story
}

if b.nlsLayer != nil {
nlsLayers, err := b.buildNLSLayers(ctx)
if err != nil {
return nil, err
}
sceneData.NLSLayers = nlsLayers
}

if b.layerStyles != nil {
layerStyles, err := b.buildLayerStyles(ctx)
if err != nil {
return nil, err
}
sceneData.LayerStyles = layerStyles
}

return sceneData, nil
}

LGTM! Consider extracting error checks for readability.

The implementation of the BuildResult method looks good! It follows a clear and logical structure, appropriately handles errors, and conditionally builds the various components based on their initialization state.

As a potential refactoring opportunity, consider extracting the repeated error checks and nil returns into a separate helper function. This could improve code readability and maintainability by reducing duplication and keeping the main method focused on its core logic.
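One possible shape for that refactor is sketched below; the closure-based approach and names are illustrative only, not part of the PR. Each optional section becomes a step that reports its own error, so the error-check boilerplate appears once:

	// Inside BuildResult, after buildScene succeeds.
	steps := []func() error{
		func() error {
			if b.story == nil {
				return nil
			}
			story, err := b.buildStory(ctx)
			if err != nil {
				return err
			}
			sceneData.Story = story
			return nil
		},
		// ... analogous closures for b.nlsLayer and b.layerStyles
	}
	for _, step := range steps {
		if err := step(); err != nil {
			return nil, err
		}
	}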

Comment on lines 133 to 196
func (r *mutationResolver) ImportProject(ctx context.Context, input gqlmodel.ImportProjectInput) (*gqlmodel.ImportProjectPayload, error) {

fileBytes, err := io.ReadAll(input.File.File)
if err != nil {
return nil, err
}

var jsonData map[string]interface{}
if err := json.Unmarshal(fileBytes, &jsonData); err != nil {
return nil, err
}

projectData, _ := jsonData["project"].(map[string]interface{})
prj, err := usecases(ctx).Project.ImportProject(ctx, projectData)
if err != nil {
return nil, err
}

pluginsData, _ := jsonData["plugins"].([]interface{})
_, err = usecases(ctx).Plugin.ImportPlugins(ctx, pluginsData)
if err != nil {
return nil, err
}

sceneData, _ := jsonData["scene"].(map[string]interface{})
_, err = usecases(ctx).Scene.ImportScene(ctx, prj, sceneData)
if err != nil {
return nil, err
}

_, err = usecases(ctx).NLSLayer.ImportNLSLayers(ctx, sceneData)
if err != nil {
return nil, err
}

_, err = usecases(ctx).Style.ImportStyles(ctx, sceneData)
if err != nil {
return nil, err
}

_, err = usecases(ctx).StoryTelling.ImportStoryTelling(ctx, sceneData)
if err != nil {
return nil, err
}

prj, res, plgs, err := usecases(ctx).Project.ExportProject(ctx, prj.ID(), getOperator(ctx))
if err != nil {
return nil, err
}
res["project"] = gqlmodel.ToProject(prj)
res["plugins"] = gqlmodel.ToPlugins(plgs)

return &gqlmodel.ImportProjectPayload{
ProjectData: res,
}, nil

}
@coderabbitai coderabbitai bot Sep 12, 2024

Looks good! Consider adding a transaction to ensure data consistency.

The ImportProject mutation resolver function is implemented well:

  • It correctly reads the file bytes from the GraphQL file upload input and unmarshals the JSON data.
  • It calls the appropriate usecase functions to import the project, plugins, scene, NLS layers, styles, and storytelling, passing the respective data from the JSON.
  • After importing, it retrieves the newly created project data and associated plugins using the Project.ExportProject usecase.
  • The exported data is correctly converted to GraphQL types and returned in an ImportProjectPayload.
  • Error handling is implemented after each usecase call to propagate any errors encountered during the import process.

To ensure data consistency in case of partial imports due to errors, consider wrapping the import process in a transaction. This way, if an error occurs during the import of any component (project, plugins, scene, etc.), the transaction can be rolled back to prevent incomplete data from being persisted in the database.

You can use the Transaction function from the usecases(ctx) to create a new transaction and wrap the import process within it. Here's a simplified example:

err := usecases(ctx).Transaction(func(tx *usecases(ctx).Tx) error {
    // Perform the import process using the transaction
    _, err := tx.Project.ImportProject(ctx, projectData)
    if err != nil {
        return err
    }
    
    // ... import other components using the transaction
    
    return nil
})
if err != nil {
    return nil, err
}

By using a transaction, you can ensure that the import process is atomic and maintain data integrity in case of any errors.

Contributor Author

Add transaction processing


@hexaforce, glad to hear that you're adding transaction processing! Let me know if you need any assistance.


@@ -463,7 +463,7 @@ func TestSealed_Interface(t *testing.T) {
tc := tc
t.Run(tc.Name, func(t *testing.T) {
t.Parallel()
-			res := tc.S.Interface()
+			res := tc.S.Interface(false)

Codebase Verification

Method signature update not found.

The Interface method signature with a boolean parameter was not found in the codebase. Please verify if the method signature update was intended and ensure that the test case modification aligns with the actual method implementation. If the update was overlooked, consider revisiting the changes to ensure consistency.

  • The Interface method is invoked without a boolean parameter in the codebase.
Analysis chain

Verify the method signature change and alignment with expected behavior.

The Interface method invocation has been updated to include a boolean argument set to false. Please ensure that:

  1. The Interface method signature has been updated to accept this boolean flag.
  2. The false value passed to the method aligns with the expected behavior and doesn't alter the test case's expected outcome.

To verify the changes, you can run the following script:

Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Verify the `Interface` method signature and behavior.

# Test 1: Search for the method declaration. Expect: Declaration with a boolean parameter.
ast-grep --lang go --pattern $'func (s *Sealed) Interface(flag bool) map[string]interface{}'

# Test 2: Search for method invocations. Expect: Only invocations with a boolean argument.
rg --type go -A 5 $'\.Interface\('

Length of output: 6732

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

Outside diff range and nitpick comments (1)
server/internal/usecase/interactor/storytelling.go (1)

1000-1129: New ImportStoryTelling method looks good!

The new ImportStoryTelling method is well-structured and follows good practices:

  • The method signature is clear and follows the existing convention.
  • The parsing logic is broken down into clear steps for handling different components of the storytelling data.
  • Error handling is thorough, with errors being returned at each stage where they can occur.
  • The method makes good use of the newly added propertySchemaRepo field.

Consider breaking down the method into smaller sub-methods, each responsible for parsing a specific component (e.g., parseBlocks, parsePages, etc.). This could improve readability and maintainability further. However, the current structure is also acceptable.
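As a rough sketch of that decomposition (the helper names are hypothetical, and each helper would simply wrap the corresponding loop body shown above), the main method could delegate like this:

	pages := make([]*storytelling.Page, 0, len(storyJSON.Pages))
	for _, pageJSON := range storyJSON.Pages {
		// importBlocks: build each block and its property, return []*storytelling.Block
		blocks, err := i.importBlocks(ctx, sceneID, pageJSON.Blocks)
		if err != nil {
			return nil, err
		}
		// importPage: create the page property and assemble the storytelling.Page
		page, err := i.importPage(ctx, sceneID, pageJSON, blocks)
		if err != nil {
			return nil, err
		}
		pages = append(pages, page)
	}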

Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 1c9469e and eede788.

Files selected for processing (11)
  • server/internal/adapter/gql/gqlmodel/convert_plugin.go (2 hunks)
  • server/internal/adapter/gql/gqlmodel/convert_project.go (2 hunks)
  • server/internal/adapter/gql/resolver_mutation_project.go (2 hunks)
  • server/internal/usecase/interactor/project.go (6 hunks)
  • server/internal/usecase/interactor/storytelling.go (4 hunks)
  • server/internal/usecase/interfaces/project.go (2 hunks)
  • server/pkg/property/sealed.go (3 hunks)
  • server/pkg/scene/builder/builder.go (3 hunks)
  • server/pkg/scene/builder/builder_test.go (2 hunks)
  • server/pkg/scene/builder/decoder.go (1 hunks)
  • server/pkg/scene/builder/scene.go (2 hunks)
Files skipped from review as they are similar to previous changes (8)
  • server/internal/adapter/gql/gqlmodel/convert_plugin.go
  • server/internal/adapter/gql/gqlmodel/convert_project.go
  • server/internal/usecase/interactor/project.go
  • server/internal/usecase/interfaces/project.go
  • server/pkg/property/sealed.go
  • server/pkg/scene/builder/builder.go
  • server/pkg/scene/builder/decoder.go
  • server/pkg/scene/builder/scene.go
Additional comments not posted (7)
server/internal/adapter/gql/resolver_mutation_project.go (2)

113-131: LGTM!

The ExportProject mutation resolver function is implemented well:

  • It correctly retrieves the project ID from the GraphQL input and calls the Project.ExportProject usecase to export the project data, plugins, and the project itself.
  • The exported data is correctly converted to GraphQL types and returned in an ExportProjectPayload.
  • Error handling is implemented to propagate any errors encountered during the export process.

133-196: Looks good overall!

The ImportProject mutation resolver function is implemented well:

  • It correctly reads the file bytes from the GraphQL file upload input and unmarshals the JSON data.
  • It calls the appropriate usecase functions to import the project, plugins, scene, NLS layers, styles, and storytelling, passing the respective data from the JSON.
  • After importing, it retrieves the newly created project data and associated plugins using the Project.ExportProject usecase.
  • The exported data is correctly converted to GraphQL types and returned in an ImportProjectPayload.
  • Error handling is implemented after each usecase call to propagate any errors encountered during the import process.
server/pkg/scene/builder/builder_test.go (4)

14-14: LGTM!

The import statement for the value package is necessary and correctly added.


772-773: LGTM!

The exportType variable is correctly initialized to false to test the normal mode of the scene builder.


775-780: LGTM!

The test code correctly instantiates the scene builder in normal mode, calls the buildScene method, and validates the result using assertions.


781-818: LGTM!

The test code correctly updates the exportType to true, modifies the expected output structure for the export mode, calls the buildScene method, and validates the result using assertions. The changes to the expected output, such as adding the type and value fields, are appropriate for the export mode.

server/internal/usecase/interactor/storytelling.go (1)

44-44: LGTM!

The addition of the propertySchemaRepo field to the Storytelling struct and its initialization in the constructor looks good. It follows the existing conventions in the code.

Also applies to: 65-65

Comment on lines 146 to 183
prj, tx, err := usecases(ctx).Project.ImportProject(ctx, projectData)
if err != nil {
return nil, err
}
defer func() {
if err2 := tx.End(ctx); err == nil && err2 != nil {
err = err2
}
}()

pluginsData, _ := jsonData["plugins"].([]interface{})
_, err = usecases(ctx).Plugin.ImportPlugins(ctx, pluginsData)
if err != nil {
return nil, err
}

sceneData, _ := jsonData["scene"].(map[string]interface{})
_, err = usecases(ctx).Scene.ImportScene(ctx, prj, sceneData)
if err != nil {
return nil, err
}

_, err = usecases(ctx).NLSLayer.ImportNLSLayers(ctx, sceneData)
if err != nil {
return nil, err
}

_, err = usecases(ctx).Style.ImportStyles(ctx, sceneData)
if err != nil {
return nil, err
}

_, err = usecases(ctx).StoryTelling.ImportStoryTelling(ctx, sceneData)
if err != nil {
return nil, err
}

tx.Commit()

Consider adding a transaction to ensure data consistency.

To ensure data consistency in case of partial imports due to errors, consider wrapping the import process in a transaction. This way, if an error occurs during the import of any component (project, plugins, scene, etc.), the transaction can be rolled back to prevent incomplete data from being persisted in the database.

You can use the Transaction function from the usecases(ctx) to create a new transaction and wrap the import process within it. Here's a simplified example:

err := usecases(ctx).Transaction(func(tx *usecases(ctx).Tx) error {
    // Perform the import process using the transaction
    _, err := tx.Project.ImportProject(ctx, projectData)
    if err != nil {
        return err
    }
    
    // ... import other components using the transaction
    
    return nil
})
if err != nil {
    return nil, err
}

By using a transaction, you can ensure that the import process is atomic and maintain data integrity in case of any errors.

@hexaforce hexaforce force-pushed the feat/project-import-export branch from 75cc7ac to 3a479b0 on September 17, 2024 03:54
Comment on lines 70 to 93
case PluginExtensionTypePrimitive:
return plugin.ExtensionTypePrimitive
case PluginExtensionTypeWidget:
return plugin.ExtensionTypeWidget
case PluginExtensionTypeBlock:
return plugin.ExtensionTypeBlock
case PluginExtensionTypeVisualizer:
return plugin.ExtensionTypeVisualizer
case PluginExtensionTypeInfobox:
return plugin.ExtensionTypeInfobox
case PluginExtensionTypeInfoboxBlock:
return plugin.ExtensionTypeInfoboxBlock
case PluginExtensionTypeCluster:
return plugin.ExtensionTypeCluster
case PluginExtensionTypeStory:
return plugin.ExtensionTypeStory
case PluginExtensionTypeStoryPage:
return plugin.ExtensionTypeStoryPage
case PluginExtensionTypeStoryBlock:
return plugin.ExtensionTypeStoryBlock
default:
return plugin.ExtensionType("")
}
}
Contributor

Please arrange them in alphabetical order.

@@ -992,6 +997,137 @@ func (i *Storytelling) MoveBlock(ctx context.Context, inp interfaces.MoveBlockPa
return story, page, &inp.BlockID, inp.Index, nil
}

func (i *Storytelling) ImportStoryTelling(ctx context.Context, sceneData map[string]interface{}) (*storytelling.Story, error) {
Contributor

Rename this to ImportStory

Contributor Author

yes

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 15

Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

Commits

Files that changed from the base of the PR and between 75cc7ac and a798a4e.

Files selected for processing (15)
  • server/e2e/gql_import_export_test.go (1 hunks)
  • server/internal/adapter/gql/gqlmodel/convert.go (2 hunks)
  • server/internal/adapter/gql/gqlmodel/convert_plugin.go (2 hunks)
  • server/internal/adapter/gql/gqlmodel/convert_project.go (2 hunks)
  • server/internal/adapter/gql/resolver_mutation_project.go (2 hunks)
  • server/internal/usecase/interactor/plugin.go (2 hunks)
  • server/internal/usecase/interactor/project.go (6 hunks)
  • server/internal/usecase/interactor/project_test.go (3 hunks)
  • server/internal/usecase/interactor/storytelling.go (4 hunks)
  • server/internal/usecase/interfaces/project.go (2 hunks)
  • server/pkg/property/sealed.go (3 hunks)
  • server/pkg/scene/builder/builder.go (3 hunks)
  • server/pkg/scene/builder/builder_test.go (2 hunks)
  • server/pkg/scene/builder/decoder.go (1 hunks)
  • server/pkg/scene/builder/scene.go (2 hunks)
Files skipped from review as they are similar to previous changes (13)
  • server/e2e/gql_import_export_test.go
  • server/internal/adapter/gql/gqlmodel/convert.go
  • server/internal/adapter/gql/gqlmodel/convert_plugin.go
  • server/internal/adapter/gql/gqlmodel/convert_project.go
  • server/internal/adapter/gql/resolver_mutation_project.go
  • server/internal/usecase/interactor/plugin.go
  • server/internal/usecase/interactor/project_test.go
  • server/internal/usecase/interfaces/project.go
  • server/pkg/property/sealed.go
  • server/pkg/scene/builder/builder.go
  • server/pkg/scene/builder/builder_test.go
  • server/pkg/scene/builder/decoder.go
  • server/pkg/scene/builder/scene.go
Additional comments not posted (6)
server/internal/usecase/interactor/project.go (3)

32-32: Initialize New Repository Fields in the Struct

The addition of storytellingRepo and pluginRepo to the Project struct is appropriate for the new functionalities. Ensure that these repositories are correctly initialized in the constructor and used consistently throughout the code.

Also applies to: 46-46


54-54: Include New Repositories in Constructor Initialization

Great job updating the NewProject constructor to initialize the new repositories storytellingRepo and pluginRepo. This ensures that all required dependencies are correctly set up.

Also applies to: 68-68


563-592: Validate Imported Project Data Before Saving

When importing project data, it's crucial to validate all fields to ensure they meet the required constraints and do not introduce inconsistencies.

Please run the following script to validate the imported project data:

server/internal/usecase/interactor/storytelling.go (3)

43-44: Initialize propertySchemaRepo in Struct and Constructor

You've added propertySchemaRepo to the Storytelling struct but also ensure it's properly initialized in the constructor NewStorytelling.


1000-1129: Potential Infinite Recursion in ImportStoryTelling Function

At first glance, the ImportStoryTelling function seems to recursively call itself when processing nested properties. Ensure that there is no unintended recursion that could lead to a stack overflow.

Please verify the function implementation to ensure there is no unintended recursion. If recursion is necessary, ensure proper base cases are defined.


43-44: [Duplicate Comment]: Initialization of propertySchemaRepo

This comment has already been addressed in a previous review comment regarding the initialization of propertySchemaRepo.


prjBuilder = prjBuilder.Visualizer(visualizer.Visualizer(p.Visualizer))
prjBuilder = prjBuilder.PublishmentStatus(gqlmodel.FromPublishmentStatus(p.PublishmentStatus))


Handle Potential Nil Value When Converting Publishment Status

When converting PublishmentStatus from the imported data, gqlmodel.FromPublishmentStatus(p.PublishmentStatus) might return a zero value if p.PublishmentStatus is nil or invalid.

Ensure that the PublishmentStatus is valid and handle possible nil or invalid values:

pubStatus := gqlmodel.FromPublishmentStatus(p.PublishmentStatus)
if pubStatus == project.PublishmentStatus("") {
    pubStatus = project.PublishmentStatusPrivate // default value
}
prjBuilder = prjBuilder.PublishmentStatus(pubStatus)

Comment on lines +496 to +553
func (i *Project) ExportProject(ctx context.Context, projectID id.ProjectID, operator *usecase.Operator) (*project.Project, map[string]interface{}, []*plugin.Plugin, error) {

prj, err := i.projectRepo.FindByID(ctx, projectID)
if err != nil {
return nil, nil, nil, err
}
sce, err := i.sceneRepo.FindByProject(ctx, prj.ID())
if err != nil {
return nil, nil, nil, err
}

pluginIDs := sce.PluginIds()
var filteredPluginIDs []id.PluginID
for _, pid := range pluginIDs {
if pid.String() != "reearth" {
filteredPluginIDs = append(filteredPluginIDs, pid)
}
}
plgs, err := i.pluginRepo.FindByIDs(ctx, filteredPluginIDs)
if err != nil {
return nil, nil, nil, err
}

sceneID := sce.ID()
nlsLayers, err := i.nlsLayerRepo.FindByScene(ctx, sceneID)
if err != nil {
return nil, nil, nil, err
}
layerStyles, err := i.layerStyles.FindByScene(ctx, sceneID)
if err != nil {
return nil, nil, nil, err
}
storyList, err := i.storytellingRepo.FindByScene(ctx, sceneID)
if err != nil {
return nil, nil, nil, err
}
sceneJSON, err := builder.New(
repo.LayerLoaderFrom(i.layerRepo),
repo.PropertyLoaderFrom(i.propertyRepo),
repo.DatasetGraphLoaderFrom(i.datasetRepo),
repo.TagLoaderFrom(i.tagRepo),
repo.TagSceneLoaderFrom(i.tagRepo, []id.SceneID{sceneID}),
repo.NLSLayerLoaderFrom(i.nlsLayerRepo),
true,
).ForScene(sce).WithNLSLayers(&nlsLayers).WithLayerStyle(layerStyles).WithStory((*storyList)[0]).BuildResult(
ctx,
time.Now(),
prj.CoreSupport(),
prj.EnableGA(),
prj.TrackingID(),
)
if err != nil {
return nil, nil, nil, err
}

res := make(map[string]interface{})
res["scene"] = sceneJSON
return prj, res, plgs, nil

Avoid Using map[string]interface{} for Typed Data

The ExportProject function returns map[string]interface{} as part of its return values. Using untyped maps can lead to runtime errors and makes the code harder to maintain. It's better to define a structured type that represents the exported data.

Consider defining a struct for the export data:

type ExportedProjectData struct {
    Scene map[string]interface{} `json:"scene"`
}

...

func (i *Project) ExportProject(...) (*project.Project, *ExportedProjectData, []*plugin.Plugin, error) {
    ...
    res := &ExportedProjectData{
        Scene: sceneJSON,
    }
    return prj, res, plgs, nil
}

Handle Potential Panic When Accessing storyList

In the ExportProject function, you are accessing (*storyList)[0] at line 540 without checking if storyList is non-empty. If storyList is empty or nil, this will cause a panic due to an out-of-bounds slice access. Please add a check to ensure that storyList contains at least one story before accessing it.

Apply this diff to add the necessary check:

+var story *storytelling.Story
+if storyList != nil && len(*storyList) > 0 {
+    story = (*storyList)[0]
+}

-).ForScene(sce).WithNLSLayers(&nlsLayers).WithLayerStyle(layerStyles).WithStory((*storyList)[0]).BuildResult(
+).ForScene(sce).WithNLSLayers(&nlsLayers).WithLayerStyle(layerStyles).WithStory(story).BuildResult(

Ensure that the BuildResult method can handle a nil story appropriately.


Comment on lines +498 to +501
prj, err := i.projectRepo.FindByID(ctx, projectID)
if err != nil {
return nil, nil, nil, err
}

Handle Errors Appropriately when Fetching Project

When calling i.projectRepo.FindByID, if an error occurs, it's important to distinguish between a not found error and other types of errors. Currently, all errors are treated the same way.

Consider handling the rerror.ErrNotFound error separately to provide a more informative message:

prj, err := i.projectRepo.FindByID(ctx, projectID)
if err != nil {
    if errors.Is(err, rerror.ErrNotFound) {
        return nil, nil, nil, fmt.Errorf("project with ID %s not found", projectID)
    }
    return nil, nil, nil, err
}

Comment on lines +557 to +615

tx, err := i.transaction.Begin(ctx)
if err != nil {
return nil, nil, err
}

var p = jsonmodel.ToProjectFromJSON(projectData)

projectID, err := id.ProjectIDFrom(string(p.ID))
if err != nil {
return nil, nil, err
}
workspaceID, err := accountdomain.WorkspaceIDFrom(string(p.TeamID))
if err != nil {
return nil, nil, err
}

prjBuilder := project.New().
ID(projectID).
Workspace(workspaceID).
IsArchived(p.IsArchived).
IsBasicAuthActive(p.IsBasicAuthActive).
BasicAuthUsername(p.BasicAuthUsername).
BasicAuthPassword(p.BasicAuthPassword).
Name(p.Name).
Description(p.Description).
Alias(p.Alias).
PublicTitle(p.PublicTitle).
PublicDescription(p.PublicDescription).
PublicImage(p.PublicImage).
PublicNoIndex(p.PublicNoIndex).
CoreSupport(p.CoreSupport).
EnableGA(p.EnableGa).
TrackingID(p.TrackingID).
Starred(p.Starred)

if !p.CreatedAt.IsZero() {
prjBuilder = prjBuilder.UpdatedAt(p.CreatedAt)
}
if p.PublishedAt != nil {
prjBuilder = prjBuilder.PublishedAt(*p.PublishedAt)
}

if p.ImageURL != nil {
prjBuilder = prjBuilder.ImageURL(p.ImageURL)
}

prjBuilder = prjBuilder.Visualizer(visualizer.Visualizer(p.Visualizer))
prjBuilder = prjBuilder.PublishmentStatus(gqlmodel.FromPublishmentStatus(p.PublishmentStatus))

prj, err := prjBuilder.Build()
if err != nil {
return nil, nil, err
}
if err := i.projectRepo.Save(ctx, prj); err != nil {
return nil, nil, err
}
return prj, tx, nil
}

Ensure Atomicity and Consistency in ImportProject Transaction

In the ImportProject function, you begin a transaction but only commit it after saving the project. However, if any of the subsequent operations (like importing scenes, properties, etc.) fail, the transaction will not be rolled back properly, potentially leaving the database in an inconsistent state.

Ensure that all related import operations are included within the transaction's scope, and handle commits and rollbacks appropriately.

func (i *Project) ImportProject(...) (*project.Project, error) {
    tx, err := i.transaction.Begin(ctx)
    if err != nil {
        return nil, err
    }
    ctx = tx.Context()
    defer func() {
        if err2 := tx.End(ctx); err == nil && err2 != nil {
            err = err2
        }
    }()

    // ... import operations ...

    tx.Commit()
    return prj, nil
}

Comment on lines +565 to +572
projectID, err := id.ProjectIDFrom(string(p.ID))
if err != nil {
return nil, nil, err
}
workspaceID, err := accountdomain.WorkspaceIDFrom(string(p.TeamID))
if err != nil {
return nil, nil, err
}
Copy link

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Handle Errors When Parsing IDs

In lines 565-572, errors when parsing projectID and workspaceID are handled, but the error messages could be more descriptive. This would aid in debugging if invalid IDs are provided.

Improve error messages for better clarity:

projectID, err := id.ProjectIDFrom(string(p.ID))
if err != nil {
    return nil, nil, fmt.Errorf("invalid project ID '%s': %w", p.ID, err)
}
workspaceID, err := accountdomain.WorkspaceIDFrom(string(p.TeamID))
if err != nil {
    return nil, nil, fmt.Errorf("invalid workspace ID '%s': %w", p.TeamID, err)
}

Comment on lines +1011 to +1092
pages := []*storytelling.Page{}
for _, pageJSON := range storyJSON.Pages {

blocks := []*storytelling.Block{}
for _, blockJSON := range pageJSON.Blocks {
blockID, err := id.BlockIDFrom(blockJSON.ID)
if err != nil {
return nil, err
}
pluginID, err := id.PluginIDFrom(blockJSON.PluginId)
if err != nil {
return nil, err
}
extensionID := id.PluginExtensionID(blockJSON.ExtensionId)
_, extension, err := i.getPlugin(ctx, &pluginID, &extensionID)
if err != nil {
return nil, err
}
if extension.Type() != plugin.ExtensionTypeStoryBlock {
return nil, interfaces.ErrExtensionTypeMustBeStoryBlock
}
prop, err := property.New().NewID().Schema(extension.Schema()).Scene(sceneID).Build()
if err != nil {
return nil, err
}
ps, err := i.propertySchemaRepo.FindByID(ctx, extension.Schema())
if err != nil {
return nil, err
}
prop, err = builder.AddItemFromPropertyJSON(prop, ps, blockJSON.Property)
if err != nil {
return nil, err
}
// Save property
if err = i.propertyRepo.Save(ctx, prop); err != nil {
return nil, err
}
block, err := storytelling.NewBlock().
ID(blockID).
Property(prop.ID()).
Plugin(pluginID).
Extension(extensionID).
Build()
if err != nil {
return nil, err
}
blocks = append(blocks, block)
}

pageID, err := id.PageIDFrom(pageJSON.ID)
if err != nil {
return nil, err
}
schema := builtin.GetPropertySchema(builtin.PropertySchemaIDStoryPage)
prop, err := property.New().NewID().Schema(schema.ID()).Scene(sceneID).Build()
if err != nil {
return nil, err
}
ps, err := i.propertySchemaRepo.FindByID(ctx, schema.ID())
if err != nil {
return nil, err
}
prop, err = builder.AddItemFromPropertyJSON(prop, ps, pageJSON.Property)
if err != nil {
return nil, err
}
// Save property
if err = i.propertyRepo.Save(ctx, prop); err != nil {
return nil, err
}
page, err := storytelling.NewPage().
ID(pageID).
Property(prop.ID()).
Title(pageJSON.Title).
Swipeable(pageJSON.Swipeable).
Blocks(blocks).
Build()
if err != nil {
return nil, err
}
pages = append(pages, page)
}

Check for Nil Pointers and Empty Lists

When iterating over storyJSON.Pages and pageJSON.Blocks, ensure that the code handles cases where these might be nil or empty to prevent nil pointer dereferences.

Add nil checks before iterating:

if storyJSON.Pages != nil {
    for _, pageJSON := range storyJSON.Pages {
        // ...
        if pageJSON.Blocks != nil {
            for _, blockJSON := range pageJSON.Blocks {
                // ...
            }
        }
    }
}

Comment on lines +1032 to +1047
prop, err := property.New().NewID().Schema(extension.Schema()).Scene(sceneID).Build()
if err != nil {
return nil, err
}
ps, err := i.propertySchemaRepo.FindByID(ctx, extension.Schema())
if err != nil {
return nil, err
}
prop, err = builder.AddItemFromPropertyJSON(prop, ps, blockJSON.Property)
if err != nil {
return nil, err
}
// Save property
if err = i.propertyRepo.Save(ctx, prop); err != nil {
return nil, err
}

Avoid Duplicated Code when Handling Properties

The code segments for creating and saving properties for blocks, pages, and stories are very similar. This duplication can be refactored to a helper function to improve maintainability.

Create a helper function:

func (i *Storytelling) createAndSaveProperty(ctx context.Context, schemaID id.PropertySchemaID, sceneID id.SceneID, propertyJSON map[string]interface{}) (*property.Property, error) {
    prop, err := property.New().NewID().Schema(schemaID).Scene(sceneID).Build()
    if err != nil {
        return nil, err
    }
    ps, err := i.propertySchemaRepo.FindByID(ctx, schemaID)
    if err != nil {
        return nil, err
    }
    prop, err = builder.AddItemFromPropertyJSON(prop, ps, propertyJSON)
    if err != nil {
        return nil, err
    }
    if err = i.propertyRepo.Save(ctx, prop); err != nil {
        return nil, err
    }
    return prop, nil
}

Then use it in your code:

prop, err := i.createAndSaveProperty(ctx, extension.Schema(), sceneID, blockJSON.Property)
// Repeat for pages and stories...

Also applies to: 1065-1080, 1097-1112

Comment on lines +1001 to +1004
sceneJSON, err := builder.ParseSceneJSON(ctx, sceneData)
if err != nil {
return nil, err
}

Check for Errors from ParseSceneJSON

Ensure that builder.ParseSceneJSON returns an error when the input is invalid, and handle the error appropriately. Also, consider validating the structure of sceneData before parsing.

Add validation or error handling for invalid input:

sceneJSON, err := builder.ParseSceneJSON(ctx, sceneData)
if err != nil {
    return nil, fmt.Errorf("failed to parse scene JSON: %w", err)
}

Comment on lines +1094 to +1129
storyID, _ := id.StoryIDFrom(storyJSON.ID)

schema := builtin.GetPropertySchema(builtin.PropertySchemaIDStory)
prop, err := property.New().NewID().Schema(schema.ID()).Scene(sceneID).Build()
if err != nil {
return nil, err
}
ps, err := i.propertySchemaRepo.FindByID(ctx, schema.ID())
if err != nil {
return nil, err
}
prop, err = builder.AddItemFromPropertyJSON(prop, ps, storyJSON.Property)
if err != nil {
return nil, err
}
// Save property
if err = i.propertyRepo.Save(ctx, prop); err != nil {
return nil, err
}
story, err := storytelling.NewStory().
ID(storyID).
Property(prop.ID()).
Scene(sceneID).
PanelPosition(storytelling.Position(storyJSON.PanelPosition)).
BgColor(storyJSON.BgColor).
Pages(storytelling.NewPageList(pages)).
Build()
if err != nil {
return nil, err
}
if err := i.storytellingRepo.Save(ctx, *story); err != nil {
return nil, err
}

return story, nil
}

Handle Missing or Invalid Story Data

When importing the story, ensure that all required fields are present and valid. If storyJSON is nil or missing critical fields, the function could fail unexpectedly.

Add checks for required fields:

if storyJSON == nil {
    return nil, errors.New("missing story data in scene JSON")
}
if storyJSON.ID == "" {
    return nil, errors.New("missing story ID")
}
// Continue with additional field checks if necessary

Comment on lines 1000 to 1129
return nil, interfaces.ErrExtensionTypeMustBeStoryBlock
}
prop, err := property.New().NewID().Schema(extension.Schema()).Scene(sceneID).Build()
if err != nil {
return nil, err
}
ps, err := i.propertySchemaRepo.FindByID(ctx, extension.Schema())
if err != nil {
return nil, err
}
prop, err = builder.AddItemFromPropertyJSON(prop, ps, blockJSON.Property)
if err != nil {
return nil, err
}
// Save property
if err = i.propertyRepo.Save(ctx, prop); err != nil {
return nil, err
}
block, err := storytelling.NewBlock().
ID(blockID).
Property(prop.ID()).
Plugin(pluginID).
Extension(extensionID).
Build()
if err != nil {
return nil, err
}
blocks = append(blocks, block)
}

pageID, err := id.PageIDFrom(pageJSON.ID)
if err != nil {
return nil, err
}
schema := builtin.GetPropertySchema(builtin.PropertySchemaIDStoryPage)
prop, err := property.New().NewID().Schema(schema.ID()).Scene(sceneID).Build()
if err != nil {
return nil, err
}
ps, err := i.propertySchemaRepo.FindByID(ctx, schema.ID())
if err != nil {
return nil, err
}
prop, err = builder.AddItemFromPropertyJSON(prop, ps, pageJSON.Property)
if err != nil {
return nil, err
}
// Save property
if err = i.propertyRepo.Save(ctx, prop); err != nil {
return nil, err
}
page, err := storytelling.NewPage().
ID(pageID).
Property(prop.ID()).
Title(pageJSON.Title).
Swipeable(pageJSON.Swipeable).
Blocks(blocks).
Build()
if err != nil {
return nil, err
}
pages = append(pages, page)
}

storyID, _ := id.StoryIDFrom(storyJSON.ID)

schema := builtin.GetPropertySchema(builtin.PropertySchemaIDStory)
prop, err := property.New().NewID().Schema(schema.ID()).Scene(sceneID).Build()
if err != nil {
return nil, err
}
ps, err := i.propertySchemaRepo.FindByID(ctx, schema.ID())
if err != nil {
return nil, err
}
prop, err = builder.AddItemFromPropertyJSON(prop, ps, storyJSON.Property)
if err != nil {
return nil, err
}
// Save property
if err = i.propertyRepo.Save(ctx, prop); err != nil {
return nil, err
}
story, err := storytelling.NewStory().
ID(storyID).
Property(prop.ID()).
Scene(sceneID).
PanelPosition(storytelling.Position(storyJSON.PanelPosition)).
BgColor(storyJSON.BgColor).
Pages(storytelling.NewPageList(pages)).
Build()
if err != nil {
return nil, err
}
if err := i.storytellingRepo.Save(ctx, *story); err != nil {
return nil, err
}

return story, nil
}

Error Handling and Transaction Management

The ImportStoryTelling function involves multiple database operations (e.g., saving properties and stories). Consider wrapping these operations in a transaction to ensure atomicity. If any step fails, you can roll back the changes to maintain data integrity.

Begin a transaction at the start of the function:

tx, err := i.transaction.Begin(ctx)
if err != nil {
    return nil, err
}
ctx = tx.Context()
defer func() {
    if err2 := tx.End(ctx); err == nil && err2 != nil {
        err = err2
    }
}()

Commit the transaction upon successful completion:

// After all operations
tx.Commit()

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL

Commits

Files that changed from the base of the PR and between a798a4e and c3fe185.

Files selected for processing (5)
  • server/internal/adapter/gql/gqlmodel/convert.go (2 hunks)
  • server/internal/adapter/gql/resolver_mutation_project.go (2 hunks)
  • server/internal/usecase/interactor/storytelling.go (4 hunks)
  • server/internal/usecase/interactor/storytelling_test.go (1 hunks)
  • server/internal/usecase/interfaces/story.go (1 hunks)
Files skipped from review as they are similar to previous changes (3)
  • server/internal/adapter/gql/gqlmodel/convert.go
  • server/internal/usecase/interactor/storytelling_test.go
  • server/internal/usecase/interfaces/story.go
Additional comments not posted (1)
server/internal/adapter/gql/resolver_mutation_project.go (1)

113-131: ExportProject Function Implemented Correctly

The ExportProject resolver function correctly retrieves the project by ID, exports the project data along with associated plugins, and returns the data encapsulated in an ExportProjectPayload. Error handling is properly implemented.

Comment on lines +133 to +196
func (r *mutationResolver) ImportProject(ctx context.Context, input gqlmodel.ImportProjectInput) (*gqlmodel.ImportProjectPayload, error) {

fileBytes, err := io.ReadAll(input.File.File)
if err != nil {
return nil, err
}

var jsonData map[string]interface{}
if err := json.Unmarshal(fileBytes, &jsonData); err != nil {
return nil, err
}

projectData, _ := jsonData["project"].(map[string]interface{})
prj, tx, err := usecases(ctx).Project.ImportProject(ctx, projectData)
if err != nil {
return nil, err
}
defer func() {
if err2 := tx.End(ctx); err == nil && err2 != nil {
err = err2
}
}()

pluginsData, _ := jsonData["plugins"].([]interface{})
_, err = usecases(ctx).Plugin.ImportPlugins(ctx, pluginsData)
if err != nil {
return nil, err
}

sceneData, _ := jsonData["scene"].(map[string]interface{})
_, err = usecases(ctx).Scene.ImportScene(ctx, prj, sceneData)
if err != nil {
return nil, err
}

_, err = usecases(ctx).NLSLayer.ImportNLSLayers(ctx, sceneData)
if err != nil {
return nil, err
}

_, err = usecases(ctx).Style.ImportStyles(ctx, sceneData)
if err != nil {
return nil, err
}

_, err = usecases(ctx).StoryTelling.ImportStory(ctx, sceneData)
if err != nil {
return nil, err
}

tx.Commit()

prj, res, plgs, err := usecases(ctx).Project.ExportProject(ctx, prj.ID(), getOperator(ctx))
if err != nil {
return nil, err
}
res["project"] = gqlmodel.ToProject(prj)
res["plugins"] = gqlmodel.ToPlugins(plgs)

return &gqlmodel.ImportProjectPayload{
ProjectData: res,
}, nil

}

Ensure All Import Operations Are Within the Same Transaction

While you've added transaction processing in ImportProject, the transaction initiated may not encompass all subsequent import operations like importing plugins, scenes, NLS layers, styles, and storytelling. To maintain data consistency, consider wrapping the entire import process within a single transaction, ensuring that all operations are committed or rolled back atomically.

You can achieve this by starting the transaction at the beginning of the function and passing the transaction context to all subsequent use case calls. For example:

func (r *mutationResolver) ImportProject(ctx context.Context, input gqlmodel.ImportProjectInput) (*gqlmodel.ImportProjectPayload, error) {
    txCtx, err := usecases(ctx).Transaction(func(tx usecase.Transaction) error {
        // Perform imports using tx
        // Example:
        prj, err := tx.Project.ImportProject(ctx, projectData)
        if err != nil {
            return err
        }
        // ... import other components using tx.Plugin, tx.Scene, etc.
        return nil
    })
    if err != nil {
        return nil, err
    }

    // Retrieve and return the imported project data
    prj, res, plgs, err := usecases(ctx).Project.ExportProject(ctx, prj.ID(), getOperator(ctx))
    if err != nil {
        return nil, err
    }
    res["project"] = gqlmodel.ToProject(prj)
    res["plugins"] = gqlmodel.ToPlugins(plgs)

    return &gqlmodel.ImportProjectPayload{
        ProjectData: res,
    }, nil
}

Ensure that all your use case methods accept the transaction context or transaction interface so they operate within the same transaction.

Comment on lines +1000 to +1129
return nil, interfaces.ErrExtensionTypeMustBeStoryBlock
}
prop, err := property.New().NewID().Schema(extension.Schema()).Scene(sceneID).Build()
if err != nil {
return nil, err
}
ps, err := i.propertySchemaRepo.FindByID(ctx, extension.Schema())
if err != nil {
return nil, err
}
prop, err = builder.AddItemFromPropertyJSON(prop, ps, blockJSON.Property)
if err != nil {
return nil, err
}
// Save property
if err = i.propertyRepo.Save(ctx, prop); err != nil {
return nil, err
}
block, err := storytelling.NewBlock().
ID(blockID).
Property(prop.ID()).
Plugin(pluginID).
Extension(extensionID).
Build()
if err != nil {
return nil, err
}
blocks = append(blocks, block)
}

pageID, err := id.PageIDFrom(pageJSON.ID)
if err != nil {
return nil, err
}
schema := builtin.GetPropertySchema(builtin.PropertySchemaIDStoryPage)
prop, err := property.New().NewID().Schema(schema.ID()).Scene(sceneID).Build()
if err != nil {
return nil, err
}
ps, err := i.propertySchemaRepo.FindByID(ctx, schema.ID())
if err != nil {
return nil, err
}
prop, err = builder.AddItemFromPropertyJSON(prop, ps, pageJSON.Property)
if err != nil {
return nil, err
}
// Save property
if err = i.propertyRepo.Save(ctx, prop); err != nil {
return nil, err
}
page, err := storytelling.NewPage().
ID(pageID).
Property(prop.ID()).
Title(pageJSON.Title).
Swipeable(pageJSON.Swipeable).
Blocks(blocks).
Build()
if err != nil {
return nil, err
}
pages = append(pages, page)
}

storyID, _ := id.StoryIDFrom(storyJSON.ID)

schema := builtin.GetPropertySchema(builtin.PropertySchemaIDStory)
prop, err := property.New().NewID().Schema(schema.ID()).Scene(sceneID).Build()
if err != nil {
return nil, err
}
ps, err := i.propertySchemaRepo.FindByID(ctx, schema.ID())
if err != nil {
return nil, err
}
prop, err = builder.AddItemFromPropertyJSON(prop, ps, storyJSON.Property)
if err != nil {
return nil, err
}
// Save property
if err = i.propertyRepo.Save(ctx, prop); err != nil {
return nil, err
}
story, err := storytelling.NewStory().
ID(storyID).
Property(prop.ID()).
Scene(sceneID).
PanelPosition(storytelling.Position(storyJSON.PanelPosition)).
BgColor(storyJSON.BgColor).
Pages(storytelling.NewPageList(pages)).
Build()
if err != nil {
return nil, err
}
if err := i.storytellingRepo.Save(ctx, *story); err != nil {
return nil, err
}

return story, nil
}

Add Unit Tests for ImportStory Function

The new ImportStory function introduces significant functionality for importing storytelling data. To ensure its correctness and robustness, please add unit tests covering various scenarios, including:

  • Valid data imports with expected structures.
  • Handling of invalid or corrupted data.
  • Edge cases, such as empty pages or blocks.
  • Error conditions, like missing required fields.

This will help prevent regressions and ensure that the function behaves as expected under different conditions.
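A table-driven skeleton like the one below could serve as a starting point; the interactor construction, fixture shapes, and testify-style assertions are assumptions and should follow the existing tests in this package:

func TestImportStory(t *testing.T) {
	ctx := context.Background()
	i := NewStorytelling( /* in-memory repos and gateways, as in the other tests in this file */ )

	tests := []struct {
		name      string
		sceneData map[string]interface{}
		wantErr   bool
	}{
		// Add fixtures here for: a valid import, corrupted data,
		// empty pages/blocks, and missing required fields (story ID,
		// plugin or extension references, property data).
	}

	for _, tt := range tests {
		tt := tt
		t.Run(tt.name, func(t *testing.T) {
			_, err := i.ImportStory(ctx, tt.sceneData)
			if tt.wantErr {
				assert.Error(t, err)
			} else {
				assert.NoError(t, err)
			}
		})
	}
}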

@hexaforce hexaforce enabled auto-merge (squash) September 18, 2024 02:36
@hexaforce hexaforce merged commit bce51c1 into main Sep 18, 2024
16 checks passed
@hexaforce hexaforce deleted the feat/project-import-export branch September 18, 2024 02:40