If you set the source and sink to the same settings, the container will be wiped. This should never happen, since the job has effectively destroyed its own input before it begins.
What is your goal in using the same source and sink? Data transfers happen asynchronously via streaming, but container re-creation is a one-time synchronous operation that must happen before the first write, so it isn't possible to read directly from a container and write back into a re-created, empty copy of it.
First, I'd suggest not using the RecreateContainer setting for this scenario, since that is what controls deleting and recreating the container. Beyond that, the other settings depend on the intended outcome. For example, if you want to duplicate every record in the container, you could use a query that leaves out "id" so that a new ID is auto-generated for each inserted record.
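As a sketch of that duplicate-in-place idea, a settings file might look like the following (this assumes the tool's usual migrationsettings.json layout; the database, container, partition-key, and query values are all placeholders, and the query has to enumerate the properties to keep, since Cosmos SQL has no "select all except id"):

```json
{
  "Source": "cosmos-nosql",
  "Sink": "cosmos-nosql",
  "SourceSettings": {
    "ConnectionString": "<connection-string>",
    "Database": "mydb",
    "Container": "mycontainer",
    "Query": "SELECT c.name, c.value FROM c"
  },
  "SinkSettings": {
    "ConnectionString": "<connection-string>",
    "Database": "mydb",
    "Container": "mycontainer",
    "PartitionKeyPath": "/name",
    "RecreateContainer": false
  }
}
```

Because the query projects out "id" and RecreateContainer is off, each record read back in should land as a new document with a fresh auto-generated ID rather than overwriting the original.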
If you're hoping to read all of the data out of a container, delete it, and then repopulate a new instance of that container (I'm not sure why), you would need an intermediate step to store the data elsewhere, such as a local JSON file that you write to and then read back from. I haven't tried this, but you may be able to do it in a single settings file with multiple Operations, since they run sequentially.
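The safest version of that round trip is two separate settings files run one after the other. A sketch of the first (export) step, again assuming the standard layout with placeholder names, could be:

```json
{
  "Source": "cosmos-nosql",
  "Sink": "json",
  "SourceSettings": {
    "ConnectionString": "<connection-string>",
    "Database": "mydb",
    "Container": "mycontainer"
  },
  "SinkSettings": {
    "FilePath": "backup.json"
  }
}
```

The second settings file would mirror this one: Source set to "json" reading the same FilePath, Sink set to "cosmos-nosql" with RecreateContainer enabled. Running them as two invocations guarantees the export has fully completed before the container is dropped.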