
Unable to change file storage size from 10TB to 13TB #2158

Closed
dzmitry-alchanau opened this issue Jan 12, 2021 · 13 comments
@dzmitry-alchanau

Hi there,

Thank you for opening an issue. Please note that we try to keep the Terraform issue tracker reserved for bug reports and feature requests. For general usage questions, please see: https://www.terraform.io/community.html.

### Terraform Version

Terraform v0.13.5

### Affected Resource(s)

  • ibm_storage_file

### Terraform Configuration Files

terraform {
  required_version = ">= 0.13"
  required_providers {
    ibm = {
      source  = "IBM-Cloud/ibm"
      version = "1.17.0"
    }
  }
}

resource "ibm_storage_file" "nfs01" {
  type                      = "Endurance"
  datacenter                = "wdc06"
  capacity                  = 13000 # capacity was changed from 10000 to 13000
  iops                      = 2
  allowed_virtual_guest_ids = [.....]
  allowed_hardware_ids      = [.....]
  hourly_billing            = false
  notes                     = "[Managed by terraform]"
}


### Expected Behavior
The size of the NFS volume should change from 10 TB to 13 TB.

### Actual Behavior
After running `terraform apply`, the following error is returned:
Error: Error updating storage: Could not find price for endurance storage space

### Steps to Reproduce
1. `terraform apply`

Also, I was able to resize the volume using the ibmcloud CLI:
ibmcloud sl file volume-modify #volume-id-here --new-size 13000
@KyungmoIBM

Hello team, this issue is related to CS2108767. Please check the case and join the Slack channel.

@VaishnaviGopal
Collaborator

VaishnaviGopal commented Jan 15, 2021

Hi @dzmitry-alchanau. We tried to reproduce the issue. With tracing enabled, we found that there is a restriction cap on minimum and maximum capacity with respect to IOPS:

STORAGE_SPACE_FOR_2_IOPS_PER_GB","units":"GBs","capacityMaximum":"12000","capacityMinimum":"20"

So, according to the message above, the maximum capacity is 12000 GB when the tier is 2 IOPS per GB.

@dzmitry-alchanau
Author

Hi @VaishnaviGopal, but I was able to resize this volume using the ibmcloud CLI tool without changing IOPS. If I understand correctly, that means this limitation exists on the Terraform provider side and not on the IBM Cloud API side.

@VaishnaviGopal
Collaborator

That is strange, because this is the response I got from the CLI, which clearly states 12 TB as the maximum capacity:

vaishnavi@vaishnavis-MBP terraform-provider-ibm % ic sl file volume-options                                                             
name                 value   
Storage Type         performance,endurance   
Size (GB)            20,40,80,100,250,500,1000,2000-3000,4000-7000,8000-9000,10000-12000   
IOPS                 Size (GB)   20     40     80     100    250    500             1000            2000-3000       4000-7000       8000-9000       10000-12000      
                     Min IOPS    100    100    100    100    100    100             100             200             300             500             1000      
                     Max IOPS    1000   2000   4000   6000   6000   6000 or 10000   6000 or 20000   6000 or 40000   6000 or 48000   6000 or 48000   6000 or 48000      
                        
Tier                 0.25,2,4,10   
Location             ams01,ams03,che01,dal05,dal06,dal07,dal09,dal10,dal12,dal13,fra02,fra04,fra05,hkg02,hou02,lon02,lon04,lon05,lon06,mel01,mex01,mil01,mon01,osa21,osa22,osa23,osl01,par01,par04,par05,par06,sao01,sao04,sao05,sea01,seo01,sjc01,sjc03,sjc04,sng01,syd01,syd04,syd05,tok02,tok04,tok05,tor01,tor04,tor05,wdc01,wdc04,wdc06,wdc07   
Snapshot Size (GB)   Storage Size (GB)   Available Snapshot Size (GB)      
                     20                  0,5,10,20      
                     40                  0,5,10,20,40      
                     80                  0,5,10,20,40,60,80      
                     100                 0,5,10,20,40,60,80,100      
                     250                 0,5,10,20,40,60,80,100,150,200,250      
                     500                 0,5,10,20,40,60,80,100,150,200,250,300,350,400,450,500      
                     1000                0,5,10,20,40,60,80,100,150,200,250,300,350,400,450,500,600,700,1000      
                     2000-3000           0,5,10,20,40,60,80,100,150,200,250,300,350,400,450,500,600,700,1000,2000      
                     4000-7000           0,5,10,20,40,60,80,100,150,200,250,300,350,400,450,500,600,700,1000,2000,4000      
                     8000-9000           0,5,10,20,40,60,80,100,150,200,250,300,350,400,450,500,600,700,1000,2000,4000      
                     10000-12000         0,5,10,20,40,60,80,100,150,200,250,300,350,400,450,500,600,700,1000,2000,4000      

Can you show the CLI logs where you updated the file size from 10 TB to 13 TB?

@dzmitry-alchanau
Author

dzmitry-alchanau commented Jan 15, 2021

~$ ibmcloud sl file volume-modify #volume-id --new-size 13000
This action will incur charges on your account. Continue?> yes
Order #order-id was placed successfully!.

Storage as a Service

13000 GBs

2 IOPS per GB

You may run 'ibmcloud sl file volume-list --order #order-id' to find this file volume after it is ready.

@VaishnaviGopal
Collaborator

Can you try running ibmcloud sl file volume-list after the update and paste the output here? Even in the UI, I see there is a limit on the storage size.

[screenshot attached]

@dzmitry-alchanau
Author

[screenshot attached]

@VaishnaviGopal
Collaborator

VaishnaviGopal commented Jan 19, 2021

Hi @dzmitry-alchanau, we have made a dev release of the provider with a fix to handle the beta_access keyName.
Here is the link to download.

Since our account doesn't have permission to test this scenario, we are requesting you to test it. Follow the steps below to check whether you still get the error when using a capacity of 13 TB.

  1. Change the provider version to use the latest:
terraform {
  required_version = ">= 0.13"
  required_providers {
    ibm = {
      source  = "IBM-Cloud/ibm"
      version = "1.19.0"
    }
  }
}
  2. Create the following directory structure (pick the one matching your OS):

mkdir -p ~/.terraform.d/plugin-cache/registry.terraform.io/ibm-cloud/ibm/1.19.0/darwin_amd64   # macOS
mkdir -p ~/.terraform.d/plugin-cache/registry.terraform.io/ibm-cloud/ibm/1.19.0/linux_amd64    # Linux
mkdir -p ~/.terraform.d/plugin-cache/registry.terraform.io/ibm-cloud/ibm/1.19.0/windows_amd64  # Windows

  3. Place the downloaded dev release binary in the above path.
  4. Export the environment variable TF_PLUGIN_CACHE_DIR so Terraform uses the dev release binary from the plugin-cache directory rather than fetching from the HashiCorp registry (since this is a dev release, the binary is not yet published to the registry):

export TF_PLUGIN_CACHE_DIR="/<your_home_dir_where_terraform.d_plugin_folder_is_placed>/.terraform.d/plugin-cache"

  5. Run terraform init.
  6. Run terraform apply. (On macOS, first allow the binary to run: System Preferences > Security & Privacy > terraform-provider-ibm > Allow Anyway.)
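The steps above can be condensed into a short shell sketch. This is a minimal sketch assuming macOS (swap darwin_amd64 for linux_amd64 or windows_amd64 on other platforms); the binary file name and download location in the comments are assumptions, so substitute the actual name of the dev release you downloaded:

```shell
# Create the plugin-cache path Terraform expects for the dev provider build.
PLUGIN_DIR="$HOME/.terraform.d/plugin-cache/registry.terraform.io/ibm-cloud/ibm/1.19.0/darwin_amd64"
mkdir -p "$PLUGIN_DIR"

# Copy the downloaded dev binary into place and make it executable
# (file name and download path are assumptions):
# cp ~/Downloads/terraform-provider-ibm_v1.19.0 "$PLUGIN_DIR/"
# chmod +x "$PLUGIN_DIR/terraform-provider-ibm_v1.19.0"

# Point Terraform at the local cache instead of the HashiCorp registry.
export TF_PLUGIN_CACHE_DIR="$HOME/.terraform.d/plugin-cache"

# Then re-initialize and apply:
# terraform init && terraform apply
```

The versioned registry-style path matters: Terraform only picks a cached binary whose path matches the `source` and `version` constraints in the `required_providers` block.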

@dzmitry-alchanau
Author

Hi, thank you for the quick help, but since we apply Terraform changes through a Jenkins pipeline and Terraform itself runs in a Docker container, could you tell me when this fix will be released and available to download from the HashiCorp registry?

VaishnaviGopal added a commit to VaishnaviGopal/terraform-provider-ibm that referenced this issue Jan 20, 2021
hkantare pushed a commit that referenced this issue Jan 21, 2021
@hkantare
Collaborator

hkantare commented Feb 8, 2021

Closing the issue.

@hkantare hkantare closed this as completed Feb 8, 2021
@surajsub
Contributor

@VaishnaviGopal - Every account has different privileges and access restrictions. In our account we can provision up to 18 TB via the cloud CLI, but the Terraform provider does not let us do that. Does your fix address that issue?

@hkantare
Collaborator

@surajsub yes, that fix addresses that issue. Can you please test and let us know?
