
parted: ansible detects a change only in dry-run #183

Closed
huguesgr opened this issue Apr 15, 2020 · 5 comments · Fixed by #247
Labels
bug (This issue/PR relates to a bug), has_pr, module

Comments

@huguesgr

SUMMARY

Once the partition is initially created, subsequent runs give:

  • "ok" without dry-run
  • "changed" with dry-run (--check)

The only difference between the two runs is the following line:
    "script": "unit KiB set 1 lvm on"
ISSUE TYPE
  • Bug Report
COMPONENT NAME

parted

ANSIBLE VERSION
ansible 2.8.5
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/home/ansible/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python2.7/dist-packages/ansible-2.8.5-py2.7.egg/ansible
  executable location = /usr/local/bin/ansible
  python version = 2.7.12 (default, Oct  8 2019, 14:14:10) [GCC 5.4.0 20160609]
CONFIGURATION

OS / ENVIRONMENT

RHEL 7.4. Using SAN storage with multipath.

STEPS TO REPRODUCE
- name: create partition
  parted:
    device: '/dev/mapper/s_server01_backup_vg.1'
    number: 1
    state: present
    flags: [ lvm ]
    part_end: '100%'
EXPECTED RESULTS

Return "ok", which is what is returned without dry-run.
Here is the output without dry-run:

ok: [server01] => (item={u'device': u'/dev/mapper/s_server01_backup_vg.1', u'part_end': u'100%', u'number': 1}) => {
    "ansible_loop_var": "item", 
    "changed": false, 
    "disk": {
        "dev": "/dev/mapper/s_server01_backup_vg.1", 
        "logical_block": 512, 
        "model": "Linux device-mapper (multipath)", 
        "physical_block": 512, 
        "size": 209715200.0, 
        "table": "msdos", 
        "unit": "kib"
    }, 
    "invocation": {
        "module_args": {
            "align": "optimal", 
            "device": "/dev/mapper/s_server01_backup_vg.1", 
            "flags": [
                "lvm"
            ], 
            "label": "msdos", 
            "name": null, 
            "number": 1, 
            "part_end": "100%", 
            "part_start": "0%", 
            "part_type": "primary", 
            "state": "present", 
            "unit": "KiB"
        }
    }, 
    "item": {
        "device": "/dev/mapper/s_server01_backup_vg.1", 
        "number": 1, 
        "part_end": "100%"
    }, 
    "partitions": [
        {
            "begin": 16384.0, 
            "end": 209715200.0, 
            "flags": [
                "lvm"
            ], 
            "fstype": "", 
            "name": "", 
            "num": 1, 
            "size": 209698816.0, 
            "unit": "kib"
        }
    ], 
    "script": ""
}
ACTUAL RESULTS

Returns "changed" only in dry-run:

changed: [server01] => (item={u'device': u'/dev/mapper/s_server01_backup_vg.1', u'part_end': u'100%', u'number': 1}) => {
    "ansible_loop_var": "item", 
    "changed": true, 
    "disk": {
        "dev": "/dev/mapper/s_server01_backup_vg.1", 
        "logical_block": 512, 
        "model": "Linux device-mapper (multipath)", 
        "physical_block": 512, 
        "size": 209715200.0, 
        "table": "msdos", 
        "unit": "kib"
    }, 
    "invocation": {
        "module_args": {
            "align": "optimal", 
            "device": "/dev/mapper/s_server01_backup_vg.1", 
            "flags": [
                "lvm"
            ], 
            "label": "msdos", 
            "name": null, 
            "number": 1, 
            "part_end": "100%", 
            "part_start": "0%", 
            "part_type": "primary", 
            "state": "present", 
            "unit": "KiB"
        }
    }, 
    "item": {
        "device": "/dev/mapper/s_server01_backup_vg.1", 
        "number": 1, 
        "part_end": "100%"
    }, 
    "partitions": [
        {
            "begin": 16384.0, 
            "end": 209715200.0, 
            "flags": [
                "lvm"
            ], 
            "fstype": "", 
            "name": "", 
            "num": 1, 
            "size": 209698816.0, 
            "unit": "kib"
        }
    ], 
    "script": "unit KiB set 1 lvm on"
}
@ansibullbot
Collaborator

@ansibullbot ansibullbot added the labels affects_2.10, bug (This issue/PR relates to a bug), module on Apr 16, 2020
@rosowiecki
Contributor

rosowiecki commented Apr 28, 2020

Dear @Wohlraj

Your ansible version is 2.8.5, and the 2.8 branch is currently accepting only critical bugfixes.

Could you repeat your test with the current version (the 2.10 prerelease from the devel branch)? From what I can read, the current parted module is more consistent: it returns changed=True whenever a parted command (other than print) is run, or would be run if check mode weren't enabled, just like the script/command modules do.

(Keep in mind: I'm not a project maintainer, just a contributor interested in parted module)

@rosowiecki
Contributor

A hint not related to your Ansible issue: you can use a whole device as an LVM physical volume without creating any partitions on it. I prefer this method, but I understand it can be confusing, because someone may not recognize that the device is already in use, which can have terrible consequences.

@rosowiecki
Contributor

rosowiecki commented Apr 28, 2020

My previous comments were not accurate, so I have edited them. The parted module tries to detect whether a change in flags is needed, but it doesn't handle check mode well. I'll make a PR soon.
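To illustrate the consistent behavior described above, here is a minimal, hypothetical sketch of check-mode-aware change reporting (the function names and structure are illustrative, not the parted module's real internals): changed should be derived from whether a reconciliation script is actually needed, regardless of check mode.

```python
def flags_script(current_flags, wanted_flags, partition_number=1):
    """Return the parted sub-commands needed to reconcile flags, or ''."""
    commands = []
    for flag in sorted(set(wanted_flags) - set(current_flags)):
        commands.append("set %d %s on" % (partition_number, flag))
    for flag in sorted(set(current_flags) - set(wanted_flags)):
        commands.append("set %d %s off" % (partition_number, flag))
    return " ".join(commands)


def run(current_flags, wanted_flags, check_mode=False):
    """Report changed consistently in both normal and check mode."""
    script = flags_script(current_flags, wanted_flags)
    if script and not check_mode:
        pass  # here a real module would execute: parted ... <script>
    return {"changed": bool(script), "script": script}


# The bug reported above: the flags already match, so neither a normal
# run nor a --check run should report a change.
print(run(["lvm"], ["lvm"], check_mode=True))  # changed: False
print(run([], ["lvm"], check_mode=True))       # changed: True, nothing executed
```

In the buggy behavior, the check-mode path unconditionally emitted "unit KiB set 1 lvm on" and reported changed=True; deriving changed from the diff, as sketched here, makes both modes agree.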
