feat: add import bulk/resume commands #1091

Merged · 25 commits · Oct 18, 2024
f154d3e  feat: add `import bulk/resume` commands (cristiand391, Oct 14, 2024)
d58b593  feat: add `--line-ending` for `import bulk` (cristiand391, Oct 14, 2024)
5fe987c  fix: failedRecords counter only on API data (cristiand391, Oct 14, 2024)
414776c  fix: default wait to 5min (cristiand391, Oct 14, 2024)
c342422  test: add import resume NUTs (cristiand391, Oct 15, 2024)
b233685  test: export bulk NUTs use import bulk (cristiand391, Oct 15, 2024)
dcc8db6  test: use bin/dev (cristiand391, Oct 15, 2024)
db0e194  test: off-by-one (cristiand391, Oct 15, 2024)
3dc1720  chore: remove `api-version` flag from `resume` (cristiand391, Oct 15, 2024)
2477652  chore: update messages (cristiand391, Oct 15, 2024)
d72aca0  fix: exclusive flags (cristiand391, Oct 15, 2024)
3916416  fix: exactlyOne instead of exclusive, we need 1 id (cristiand391, Oct 15, 2024)
ad86f63  test: update import NUTs (cristiand391, Oct 15, 2024)
cde122d  test: better failure (cristiand391, Oct 16, 2024)
be3e668  fix: capitalize MSO stages (cristiand391, Oct 16, 2024)
97de819  fix: properly detect JSON mode (cristiand391, Oct 16, 2024)
def1280  chore: use ms.error() method (cristiand391, Oct 16, 2024)
7d2fe3f  chore: refactor (cristiand391, Oct 16, 2024)
e221a58  fix: set correct baseUrl (cristiand391, Oct 16, 2024)
32caec2  fix: add return type (cristiand391, Oct 17, 2024)
afb21ce  fix: refactor bulk import cache resolver (cristiand391, Oct 17, 2024)
98e13f9  fix: add fallback for terminal-link (cristiand391, Oct 17, 2024)
de841e0  chore: ci-rerun (cristiand391, Oct 17, 2024)
86d7f1f  fix: do not stop MSO on `error` event (cristiand391, Oct 17, 2024)
9d9532f  fix: edit messages for new "data import bulk|resume" commands (#1093) (jshackell-sfdc, Oct 18, 2024)
16 changes: 16 additions & 0 deletions command-snapshot.json
@@ -137,6 +137,14 @@
],
"plugin": "@salesforce/plugin-data"
},
{
"alias": [],
"command": "data:import:bulk",
"flagAliases": [],
"flagChars": ["a", "f", "o", "s", "w"],
"flags": ["api-version", "async", "file", "flags-dir", "json", "line-ending", "sobject", "target-org", "wait"],
"plugin": "@salesforce/plugin-data"
},
{
"alias": [],
"command": "data:import:legacy:tree",
@@ -155,6 +163,14 @@
],
"plugin": "@salesforce/plugin-data"
},
{
"alias": [],
"command": "data:import:resume",
"flagAliases": [],
"flagChars": ["i", "w"],
"flags": ["flags-dir", "job-id", "json", "use-most-recent", "wait"],
"plugin": "@salesforce/plugin-data"
},
{
"alias": ["force:data:tree:import", "data:import:beta:tree"],
"command": "data:import:tree",
74 changes: 74 additions & 0 deletions messages/data.import.bulk.md
@@ -0,0 +1,74 @@
# summary

Bulk import records to an org from a CSV file. Uses Bulk API 2.0.

# description

You can use this command to import millions of records to an org from a CSV file.

All records in the CSV file must be for the same Salesforce object; specify that object with the `--sobject` flag.

More info about how to prepare CSV files:
https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/datafiles_prepare_csv.htm

# examples

- Import Account records from a CSV-formatted file into an org.

<%= config.bin %> <%= command.id %> --file accounts.csv --sobject Account --wait 10 --target-org my-scratch

- Import asynchronously; the command immediately returns a job ID that you then pass to the "sf data import resume" command:

<%= config.bin %> <%= command.id %> --file accounts.csv --sobject Account --async --target-org my-scratch

# flags.async.summary

Run the command asynchronously.

# flags.file.summary

CSV file that contains the fields of the object to import.

# flags.sobject.summary

API name of the Salesforce object, either standard or custom, that you want to import to the org.

# flags.wait.summary

Time to wait for the command to finish, in minutes.

# flags.line-ending.summary

Line ending used in the CSV file. Default value on Windows is `CRLF`; on macOS and Linux it's `LF`.

# export.resume

Run "sf data import resume --job-id %s" to resume the operation.

# error.timeout

The operation timed out after %s minutes.

Run "sf data import resume --job-id %s" to resume it.

# error.failedRecordDetails

The job finished processing, but failed to import %s records.

To review the details of this job, run:
sf org open --target-org %s --path "/lightning/setup/AsyncApiJobStatus/page?address=%2F%s"

# error.jobFailed

Job failed to be processed due to:
%s

To review the details of this job, run:
sf org open --target-org %s --path "/lightning/setup/AsyncApiJobStatus/page?address=%2F%s"

# error.jobAborted

Job has been aborted.

To review the details of this job, run:
sf org open --target-org %s --path "/lightning/setup/AsyncApiJobStatus/page?address=%2F%s"
57 changes: 57 additions & 0 deletions messages/data.import.resume.md
@@ -0,0 +1,57 @@
# summary

Resume a bulk import job that you previously started. Uses Bulk API 2.0.

# description

The command uses the job ID returned by the "sf data import bulk" command, or the job ID of the most recently run bulk import job.

# examples

- Resume a bulk import job from your default org using an ID:

<%= config.bin %> <%= command.id %> --job-id 750xx000000005sAAA

- Resume the most recently run bulk import job for an org with alias my-scratch:

<%= config.bin %> <%= command.id %> --use-most-recent --target-org my-scratch

# flags.use-most-recent.summary

Use the job ID of the bulk import job that was most recently run.

# flags.job-id.summary

Job ID of the bulk import.

# flags.wait.summary

Time to wait for the command to finish, in minutes.

# error.failedRecordDetails

The job finished processing, but failed to import %s records.

To review the details of this job, run:
sf org open --target-org %s --path "/lightning/setup/AsyncApiJobStatus/page?address=%2F%s"

# error.timeout

The operation timed out after %s minutes.

Try re-running "sf data import resume --job-id %s" with a bigger wait time.

# error.jobFailed

Job failed to be processed due to:
%s

To review the details of this job, run:
sf org open --target-org %s --path "/lightning/setup/AsyncApiJobStatus/page?address=%2F%s"

# error.jobAborted

Job has been aborted.

To review the details of this job, run:
sf org open --target-org %s --path "/lightning/setup/AsyncApiJobStatus/page?address=%2F%s"
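The `--wait`/timeout behavior described in the two message files above follows a common poll-until-deadline pattern. A dependency-free sketch follows; the function and state names are assumptions for illustration, not the plugin's actual API.

```typescript
type JobState = 'InProgress' | 'JobComplete' | 'Failed' | 'Aborted';

// Poll `check` until the job leaves 'InProgress' or `waitMinutes` elapses.
// On timeout, the caller would surface the "resume with --job-id" hint.
async function waitForJob(
  check: () => Promise<JobState>,
  waitMinutes: number,
  pollMs = 10
): Promise<JobState> {
  const deadline = Date.now() + waitMinutes * 60_000;
  for (;;) {
    const state = await check();
    if (state !== 'InProgress') return state;
    if (Date.now() >= deadline) {
      throw new Error('Timed out; run "sf data import resume --job-id <id>"');
    }
    await new Promise((resolve) => setTimeout(resolve, pollMs));
  }
}

// Simulated job that completes on its third status check.
void (async () => {
  let calls = 0;
  const result = await waitForJob(
    async () => (++calls < 3 ? 'InProgress' : 'JobComplete'),
    1
  );
  console.log(result); // "JobComplete"
})();
```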
3 changes: 2 additions & 1 deletion package.json
@@ -60,7 +60,8 @@
"description": "Get a single record."
},
"import": {
"description": "Import data to your org."
"description": "Import data to your org.",
"external": true
},
"query": {
"description": "Query records."
22 changes: 22 additions & 0 deletions src/bulkDataRequestCache.ts
@@ -188,6 +188,28 @@ export class BulkUpsertRequestCache extends BulkDataRequestCache {
}
}

export class BulkImportRequestCache extends BulkDataRequestCache {
public static getDefaultOptions(): TTLConfig.Options {
return {
isGlobal: true,
isState: true,
filename: BulkImportRequestCache.getFileName(),
stateFolder: Global.SF_STATE_FOLDER,
ttl: Duration.days(7),
};
}

public static getFileName(): string {
return 'bulk-data-import-cache.json';
}

public static async unset(key: string): Promise<void> {
const cache = await BulkImportRequestCache.create();
cache.unset(key);
await cache.write();
}
}
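The new cache above stores bulk import job state with a 7-day TTL via `TTLConfig` (from `@salesforce/kit`), which is what lets `data import resume --use-most-recent` locate the latest job. A dependency-free sketch of that lookup pattern; the class and method names here are illustrative, not the plugin's.

```typescript
type CacheEntry = { jobId: string; createdAt: number };

class TtlJobCache {
  private entries = new Map<string, CacheEntry>();

  // ttlMs mirrors the Duration.days(7) in getDefaultOptions().
  public constructor(private readonly ttlMs: number) {}

  public set(key: string, jobId: string): void {
    this.entries.set(key, { jobId, createdAt: Date.now() });
  }

  public unset(key: string): void {
    this.entries.delete(key);
  }

  // What --use-most-recent needs: the newest entry that hasn't expired.
  public latest(): string | undefined {
    const now = Date.now();
    return [...this.entries.values()]
      .filter((e) => now - e.createdAt < this.ttlMs)
      .sort((a, b) => b.createdAt - a.createdAt)[0]?.jobId;
  }
}

const cache = new TtlJobCache(7 * 24 * 60 * 60 * 1000);
cache.set('job-1', '750xx000000005sAAA');
console.log(cache.latest()); // "750xx000000005sAAA"
```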

export class BulkExportRequestCache extends TTLConfig<TTLConfig.Options, BulkExportCacheConfig> {
public static getDefaultOptions(): TTLConfig.Options {
return {