
[BUG] Tons of AppArmor denied messages #2528

Closed
bk201 opened this issue Jul 19, 2022 · 8 comments
Labels
area/os Harvester OS related (ex: SLE Micro) kind/bug Issues that are defects reported by users or that we know have reached a real release priority/0 Must be fixed in this release reproduce/always Reproducible 100% of the time severity/4 Function working but has a minor issue (a minor incident with low impact)

@bk201
Member

bk201 commented Jul 19, 2022

Describe the bug

We can see tons of AppArmor denied messages in the system log:

Jul 06 08:29:43 rancher-lab-node1 kernel: audit: type=1400 audit(1657096183.376:47589): apparmor="DENIED" operation="ptrace" profile="cri-containerd.apparmor.d" pid=44468 comm="virtlogd" requested_mask="readby" denied_mask="readby" peer="cri-containerd.apparmor.d"

It makes debugging harder.
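
For reference, a quick way to gauge the volume of these denials on a node is to count them in the kernel log for the current boot (plain journalctl/grep usage, not part of the original report):

# Count the AppArmor "DENIED" audit lines logged since boot
journalctl -b -k | grep -c 'apparmor="DENIED"'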

To Reproduce
Steps to reproduce the behavior:

  1. Create VMs.
  2. journalctl -b -f

Expected behavior

Either do not log these messages, or allow the operation (after checking whether it should in fact be permitted).

Support bundle

Environment

  • Harvester ISO version: v1.0.2
  • Underlying Infrastructure (e.g. Baremetal with Dell PowerEdge R630):


@bk201 bk201 added kind/bug Issues that are defects reported by users or that we know have reached a real release area/os Harvester OS related (ex: SLE Micro) reproduce/needed Reminder to add a reproduce label and to remove this one severity/needed Reminder to add a severity label and to remove this one labels Jul 19, 2022
@bk201 bk201 added this to the v1.0.3 milestone Jul 19, 2022
@harvesterhci-io-github-bot

Pre Ready-For-Testing Checklist

  • If labeled: require/HEP Has the Harvester Enhancement Proposal PR submitted?
    The HEP PR is at:

  • Where is the reproduce steps/test steps documented?
    The reproduce steps/test steps are at:

  • Is there a workaround for the issue? If so, where is it documented?
    The workaround is at:

  • Has the backend code been merged (harvester, harvester-installer, etc.) (including backport-needed/*)?
    The PR is at:

    • Does the PR include the explanation for the fix or the feature?

    • Does the PR include deployment change (YAML/Chart)? If so, where are the PRs for both YAML file and Chart?
      The PR for the YAML change is at:
      The PR for the chart change is at:

  • If labeled: area/ui Has the UI issue filed or ready to be merged?
    The UI issue/PR is at:

  • If labeled: require/doc, require/knowledge-base Has the necessary document PR submitted or merged?
    The documentation/KB PR is at:

  • If NOT labeled: not-require/test-plan Has the e2e test plan been merged? Have QAs agreed on the automation test case? If only test case skeleton w/o implementation, have you created an implementation issue?

    • The automation skeleton PR is at:
    • The automation test case PR is at:
  • If the fix introduces the code for backward compatibility Has a separate issue been filed with the label release/obsolete-compatibility?
    The compatibility issue is filed at:

@harvesterhci-io-github-bot

Automation e2e test issue: harvester/tests#411

@bk201 bk201 added severity/4 Function working but has a minor issue (a minor incident with low impact) reproduce/always Reproducible 100% of the time and removed reproduce/needed Reminder to add a reproduce label and to remove this one severity/needed Reminder to add a severity label and to remove this one labels Jul 19, 2022
@guangbochen guangbochen added the priority/0 Must be fixed in this release label Jul 19, 2022
@tjjh89017
Contributor

Analysis

Processes in the container could not be read (via ptrace) by virtlogd, so AppArmor logs a denial for every attempt.

Version

RKE2 1.21.x
containerd 1.4

Root Cause

containerd applies a default AppArmor profile with its own set of rules (cri-containerd.apparmor.d) to every container process that does not specify a profile of its own.
https://github.com/k3s-io/containerd/blob/k3s-release/1.4/contrib/apparmor/template.go#L44-L91
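
A quick way to confirm which profile actually confines virtlogd on a node is to read the process's AppArmor attribute from /proc (standard Linux interfaces; the pgrep invocation is only an illustrative assumption):

# Print the AppArmor confinement of the oldest virtlogd process on the node
sudo cat /proc/$(pgrep -o virtlogd)/attr/current
# Expected to print something like: cri-containerd.apparmor.d (enforce)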

The cri-containerd.apparmor.d profile includes abstractions/base, which contains the basic rules.
https://github.com/k3s-io/containerd/blob/k3s-release/1.4/contrib/apparmor/template.go#L123-L125
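
To check whether that generated profile is loaded on a node, aa-status can be used, assuming the AppArmor userspace tools are available there (an assumption, not something stated in the original analysis):

# List loaded AppArmor profiles and look for the containerd default profile
sudo aa-status | grep cri-containerd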

abstractions/base is located at /etc/apparmor.d/abstractions/base and contains a ptrace (readby) rule that, by default, allows a process to be read by others (/proc access, perf tracing, etc.).

SLE Micro for Rancher does not ship this file by default; the apparmor-abstractions package needs to be installed via zypper.
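
A minimal sketch of that manual check and fix (this is what the team decided not to recommend as a workaround below, since it requires rebooting every node):

# Check whether the abstraction file is present on the node
ls -l /etc/apparmor.d/abstractions/base

# If it is missing, install the package that provides it, then reboot the node.
# On SLE Micro the root filesystem is read-only, so a transactional update
# (e.g. transactional-update pkg install apparmor-abstractions) may be needed
# instead of a plain zypper install.
sudo zypper install apparmor-abstractions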

Fix

Harvester BaseOS will ship with apparmor-abstractions by default starting with v1.0.3.

Workaround

We will not provide a workaround for this issue.
Applying the fix manually would require rebooting every node, which would have a significant impact on production environments.

Reproduce

  1. Install Harvester v1.0.2.
  2. Run a VM.
  3. Execute journalctl -b -f.
  4. Denial messages similar to the following will appear:
Jul 06 08:29:43 rancher-lab-node1 kernel: audit: type=1400 audit(1657096183.376:47589): apparmor="DENIED" operation="ptrace" profile="cri-containerd.apparmor.d" pid=44468 comm="virtlogd" requested_mask="readby" denied_mask="readby" peer="cri-containerd.apparmor.d"

Test plan

  1. Install Harvester with the fix.
  2. Run a VM.
  3. Execute journalctl -b -f.
  4. Execute tail -f /var/log/audit/audit.log (if this file does not exist, skip this step).
  5. Steps 3 and 4 should not show similar denial messages (a grep check is sketched below).
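
For step 5, a grep over both log sources makes the check explicit (ordinary journalctl/grep/tail usage, sketched here as an assumption rather than taken from the test plan):

# Both commands should produce no output on a node running the fixed image
journalctl -b -k | grep 'apparmor="DENIED"'
sudo tail -n 1000 /var/log/audit/audit.log 2>/dev/null | grep 'apparmor="DENIED"'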

@bk201
Member Author

bk201 commented Jul 19, 2022

@irishgordo
Contributor

irishgordo commented Jul 25, 2022

Hi @tjjh89017 ,
I was digging into trying to reproduce this with Harvester v1.0.2.

My Current Setup

Harvester v1.0.2 running KVM/QEMU:

  • 12vCPUs
  • 24531 MiB

The VM Setup

#cloud-config
password: ubuntupw
chpasswd:
  expire: false
ssh_pwauth: true
package_update: true
packages:
  - apparmor
  - apparmor-utils
  - apparmor-profiles
  - qemu-guest-agent
runcmd:
  - - systemctl
    - enable
    - '--now'
    - apparmor.service
  - - systemctl
    - enable
    - '--now'
    - qemu-guest-agent.service

During Reproduction Attempt

  • Launching the test VM, logging in via the serial console from the Virtual Machines dashboard, then running sudo su followed by journalctl -b -f yields:
ubuntu-test-machine login: ubuntu
Password: 
Welcome to Ubuntu 20.04.4 LTS (GNU/Linux 5.4.0-1071-kvm x86_64)

 * Documentation:  https://help.ubuntu.com
 * Management:     https://landscape.canonical.com
 * Support:        https://ubuntu.com/advantage

  System information as of Mon Jul 25 22:14:18 UTC 2022

  System load:  0.44              Processes:               113
  Usage of /:   3.0% of 48.27GB   Users logged in:         0
  Memory usage: 2%                IPv4 address for enp1s0: 10.0.2.2
  Swap usage:   0%

3 updates can be applied immediately.
2 of these updates are standard security updates.
To see these additional updates run: apt list --upgradable



The programs included with the Ubuntu system are free software;
the exact distribution terms for each program are described in the
individual files in /usr/share/doc/*/copyright.

Ubuntu comes with ABSOLUTELY NO WARRANTY, to the extent permitted by
applicable law.

To run a command as administrator (user "root"), use "sudo <command>".
See "man sudo_root" for details.

ubuntu@ubuntu-test-machine:~$ sudo su
root@ubuntu-test-machine:/home/ubuntu# journalctl -b -f
-- Logs begin at Mon 2022-07-25 22:12:31 UTC. --
Jul 25 22:14:20 ubuntu-test-machine systemd[2248]: Reached target Sockets.
Jul 25 22:14:20 ubuntu-test-machine systemd[2248]: Reached target Basic System.
Jul 25 22:14:20 ubuntu-test-machine systemd[2248]: Reached target Main User Target.
Jul 25 22:14:20 ubuntu-test-machine systemd[2248]: Startup finished in 140ms.
Jul 25 22:14:20 ubuntu-test-machine systemd[1]: Started User Manager for UID 1000.
Jul 25 22:14:20 ubuntu-test-machine systemd[1]: Started Session 1 of user ubuntu.
Jul 25 22:14:29 ubuntu-test-machine sudo[2270]:   ubuntu : TTY=ttyS0 ; PWD=/home/ubuntu ; USER=root ; COMMAND=/usr/bin/su
Jul 25 22:14:29 ubuntu-test-machine sudo[2270]: pam_unix(sudo:session): session opened for user root by ubuntu(uid=0)
Jul 25 22:14:29 ubuntu-test-machine su[2271]: (to root) ubuntu on ttyS0
Jul 25 22:14:29 ubuntu-test-machine su[2271]: pam_unix(su:session): session opened for user root by ubuntu(uid=0)

Nor am I able to tail the audit log:

root@ubuntu-test-machine:/home/ubuntu# tail -f /var/log/audit/audit.log
tail: cannot open '/var/log/audit/audit.log' for reading: No such file or directory
tail: no files remaining

I'm currently unable to see any logs related to apparmor from v1.0.2.
Should I perhaps be using a different cloud-init configuration or a different VM image?

@tjjh89017
Contributor

@irishgordo You should execute the journalctl -b -f command on the Harvester node.

@irishgordo
Contributor

@tjjh89017 Oh!
Thank you for that clarification, I thought it was meant to be executed in a VM instead 😅 - I'll go through and test this now.

@irishgordo
Contributor

irishgordo commented Jul 26, 2022

This looks great 😄 👍 @tjjh89017 !

  • closing this out as it passes validation.

With the same setup as above running on v1.0.2 & v1.0.3-rc1, I was able to validate that the AppArmor denial logs are no longer there in v1.0.3-rc1 😄
[Screenshot from 2022-07-25 17-51-14]
You can see the v1.0.2 KVM on the left, per the virtual machine window title, and its VIP is shown in the window at the bottom right. The v1.0.3-rc1 KVM is on the right, and its VIP is shown in the browser at the top right.

The hardware backing the KVM/QEMU test setup was a SuperMicro X9DRD-IT+ with 2 x Intel Xeon E5-2600 CPUs, 64 GB DDR3 RAM, and a 1 TB SSD.
