
cleanup(build): Use a single cmake module for driver_config.h #1188

Open
wants to merge 2 commits into base: master
Conversation


@gnosek gnosek commented Jul 5, 2023

Instead of manually generating driver_config.h when needed (approximately, since the libscap engines do not generate it, even though they rely on it), encapsulate all the logic in a single cmake module.

Note: this removes the driver_config directory from LIBSCAP_INCLUDE_DIRS. If you actually need driver_config.h, add this to your CMakeLists.txt:

    include(driver_config)
    include_directories(${DRIVER_CONFIG_OUTPUT_DIR})
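For context, a minimal sketch of what such a cmake module could look like. This is not taken from the PR: the module filename, the template name (driver_config.h.in), and everything except the DRIVER_CONFIG_OUTPUT_DIR variable are illustrative assumptions.

```cmake
# Hypothetical cmake/modules/driver_config.cmake (sketch, not the PR's code).

# Include guard: repeated include(driver_config) calls from multiple
# consumers must all resolve to the same single output location.
if(DEFINED DRIVER_CONFIG_OUTPUT_DIR)
	return()
endif()

# One well-defined place in the build tree, never the source tree.
set(DRIVER_CONFIG_OUTPUT_DIR "${CMAKE_BINARY_DIR}/driver_config"
	CACHE PATH "Directory containing the generated driver_config.h")

# Generate the header once from a template, substituting @VAR@ placeholders.
configure_file(
	"${CMAKE_CURRENT_LIST_DIR}/driver_config.h.in"
	"${DRIVER_CONFIG_OUTPUT_DIR}/driver_config.h"
	@ONLY
)
```

A consumer would then opt in with the two lines above (include the module, add ${DRIVER_CONFIG_OUTPUT_DIR} to its include path) instead of relying on the header being implicitly present in ${LIBSCAP_INCLUDE_DIRS}.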

What type of PR is this?

/kind cleanup

Any specific area of the project related to this PR?

/area build

/area driver-kmod

/area driver-bpf

/area driver-modern-bpf

What this PR does / why we need it:

driver_config.h is generated in three different places, in different ways, into two target locations (including the source directory). Clean this up and use just one well-defined place we can refer to later.

Also, as it turned out to be a prerequisite for the fix, the BPF build no longer touches the source directory and does not depend on any files outside the (generated) source directory.

Does this PR introduce a user-facing change?:

(I'm not sure about user-facing, but it is a change for libscap consumers, even though driver_config.h is generally not needed)

action required: driver_config.h is no longer available in ${LIBSCAP_INCLUDE_DIRS}


poiana commented Jul 5, 2023

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: gnosek

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment


github-actions bot commented Jul 5, 2023

Please double check driver/API_VERSION file. See versioning.

/hold


FedeDP commented Jul 6, 2023

falcosecurity/driverkit#283 fixes driverkit for this PR.


FedeDP commented Jul 7, 2023

NOTE: since driverkit jobs are not required checks, we can go on with this one.


FedeDP commented Jul 7, 2023

Let's try for:
/milestone 0.12.0

@poiana poiana added this to the 0.12.0 milestone Jul 7, 2023

FedeDP commented Jul 7, 2023

@gnosek can you fix the other tests?


FedeDP commented Jul 17, 2023

/milestone 0.13.0

@poiana poiana modified the milestones: 0.12.0, 0.13.0 Jul 17, 2023

leogr commented Aug 23, 2023

Moving to
/milestone 0.14.0
since I believe it's too late to make this work by the end of this month.

@poiana poiana modified the milestones: 0.13.0, 0.14.0 Aug 23, 2023
@Andreagit97 Andreagit97 modified the milestones: 0.14.0, TBD Sep 4, 2023
The BPF build now no longer touches the source directory
and does not depend on any files outside the (generated)
source directory.

Signed-off-by: Grzegorz Nosek <grzegorz.nosek@sysdig.com>
Instead of manually generating driver_config.h when needed
(approximately, since the libscap engines do not generate it, even
though they rely on it), encapsulate all the logic in a single
cmake module.

Note: this removes the driver_config directory from
LIBSCAP_INCLUDE_DIRS. If you actually need driver_config.h,
add this to your CMakeLists.txt:

    include(driver_config)
    include_directories(${DRIVER_CONFIG_OUTPUT_DIR})

Signed-off-by: Grzegorz Nosek <grzegorz.nosek@sysdig.com>

incertum commented Feb 6, 2024

Hi @gnosek just kindly checking in on the status of this PR. When do you all think we should work on it? Thank you!


poiana commented May 6, 2024

Issues go stale after 90d of inactivity.

Mark the issue as fresh with /remove-lifecycle stale.

Stale issues rot after an additional 30d of inactivity and eventually close.

If this issue is safe to close now please do so with /close.

Provide feedback via https://github.com/falcosecurity/community.

/lifecycle stale


leogr commented May 8, 2024

@gnosek any update on this?

/remove-lifecycle stale


poiana commented Aug 6, 2024

Issues go stale after 90d of inactivity.

Mark the issue as fresh with /remove-lifecycle stale.

Stale issues rot after an additional 30d of inactivity and eventually close.

If this issue is safe to close now please do so with /close.

Provide feedback via https://github.com/falcosecurity/community.

/lifecycle stale


leogr commented Aug 20, 2024

@gnosek any update on this?

do we still want to merge this PR? 🤔


gnosek commented Aug 20, 2024

As you might imagine, this has fallen behind other things in my stack of yaks to shave. Would be nice to have it but I guess I'll redo it from scratch some day.


poiana commented Sep 19, 2024

Stale issues rot after 30d of inactivity.

Mark the issue as fresh with /remove-lifecycle rotten.

Rotten issues close after an additional 30d of inactivity.

If this issue is safe to close now please do so with /close.

Provide feedback via https://github.com/falcosecurity/community.

/lifecycle rotten


leogr commented Sep 20, 2024

/remove-lifecycle rotten

6 participants