Upgrading Firefox Sync Server to Syncstorage-rs #5939

Closed
mreid-tt opened this issue Nov 25, 2023 · 20 comments · Fixed by #5942
Comments

mreid-tt commented Nov 25, 2023

Background

Looking at the Firefox Sync Server source repo, I see it is no longer maintained and has since been rewritten in Rust under a new repo, Syncstorage-rs.

Given the significant changes, I am considering treating this as a new package (subject to discussion).

Manual deployment test

I've documented a complete install from scratch on Ubuntu 22.04 as a proof of concept. I will need assistance translating some of these ideas into a SynoCommunity package.

Prerequisite 1 - MySQL

Install and secure MySQL

Source: https://www.digitalocean.com/community/tutorials/how-to-install-mysql-on-ubuntu-20-04

sudo apt update
sudo apt install mysql-server
sudo systemctl start mysql.service

Set root database password

sudo mysql
ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY 'password';
exit

Secure MySQL deployment

sudo mysql_secure_installation
  1. Change default root password and set other options

Test MySQL service

systemctl status mysql.service

Prerequisite 2 - Rust

Install Rust

Source: https://rustup.rs/

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

Install dependencies

Source: https://github.com/mozilla-services/syncstorage-rs#system-requirements

sudo apt install gcc
sudo apt install libcurl4-openssl-dev
sudo apt install pkg-config
sudo apt install libmysqlclient-dev

Install Python virtual environment

sudo apt install python3-virtualenv

Install diesel

cargo install diesel_cli --no-default-features --features 'mysql'

Clone and build

cd Downloads
git clone https://github.com/mozilla-services/syncstorage-rs
cd syncstorage-rs
cargo install --path ./syncserver --no-default-features --features=syncstorage-db/mysql --locked

Setup Python virtual environment

virtualenv venv
source venv/bin/activate
pip3 install -r requirements.txt
pip3 install -r tools/tokenserver/requirements.txt

Initialize the Database

SYNCSTORAGE_PW="$(cat /dev/urandom | base32 | head -c64)"
printf 'Use this for the syncstorage user password: %s\n' "$SYNCSTORAGE_PW"

# login as root sql user using whatever creds you set up for that
# this sets up a user for sync storage and sets up the databases
mysql -u root -p <<EOF
CREATE USER "syncstorage"@"localhost" IDENTIFIED BY "$SYNCSTORAGE_PW";
CREATE DATABASE syncstorage_rs;
CREATE DATABASE tokenserver_rs;
GRANT ALL PRIVILEGES on syncstorage_rs.* to syncstorage@localhost;
GRANT ALL PRIVILEGES on tokenserver_rs.* to syncstorage@localhost;
EOF

Run initial database migrations

# syncstorage db
$HOME/.cargo/bin/diesel --database-url "mysql://syncstorage:${SYNCSTORAGE_PW}@localhost/syncstorage_rs" migration --migration-dir syncstorage-mysql/migrations run

# tokenserver db
$HOME/.cargo/bin/diesel --database-url "mysql://syncstorage:${SYNCSTORAGE_PW}@localhost/tokenserver_rs" migration --migration-dir tokenserver-db/migrations run

Add sync endpoint to database

mysql -u syncstorage -p"$SYNCSTORAGE_PW" <<EOF
USE tokenserver_rs
INSERT INTO services (id, service, pattern) VALUES
    (1, "sync-1.5", "{node}/1.5/{uid}");
EOF

Add syncserver node

# the 10 is the user capacity.
SYNC_TOKENSERVER__DATABASE_URL="mysql://syncstorage:${SYNCSTORAGE_PW}@localhost/tokenserver_rs" \
    python3 tools/tokenserver/add_node.py \
    http://localhost:8000 10

Setup syncserver config file

MASTER_SECRET="$(cat /dev/urandom | base32 | head -c64)"
METRICS_HASH_SECRET="$(cat /dev/urandom | base32 | head -c64)"
cat > config/local.toml <<EOF
master_secret = "${MASTER_SECRET}"

# removing this line will default to moz_json formatted logs
human_logs = 1

host = "localhost" # default
port = 8000 # default

syncstorage.database_url = "mysql://syncstorage:${SYNCSTORAGE_PW}@localhost/syncstorage_rs"
syncstorage.enable_quota = 0
syncstorage.enabled = true
syncstorage.limits.max_total_records = 1666 # See issues #298/#333

# token
tokenserver.database_url = "mysql://syncstorage:${SYNCSTORAGE_PW}@localhost/tokenserver_rs"
tokenserver.enabled = true 
tokenserver.fxa_email_domain = "api.accounts.firefox.com"
tokenserver.fxa_metrics_hash_secret = "${METRICS_HASH_SECRET}"
tokenserver.fxa_oauth_server_url = "https://oauth.accounts.firefox.com"
tokenserver.fxa_browserid_audience = "https://token.services.mozilla.com"
tokenserver.fxa_browserid_issuer = "https://api.accounts.firefox.com"
tokenserver.fxa_browserid_server_url = "https://verifier.accounts.firefox.com/v2"
EOF

Run the server

~/.cargo/bin/syncserver --config=config/local.toml

Configure Firefox

  1. Set identity.sync.tokenserver.uri to http://localhost:8000/1.0/sync/1.5
  2. Restart Firefox
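
Before pointing Firefox at the server, it can help to confirm the service is actually reachable. A minimal sketch, assuming syncstorage-rs exposes the usual Mozilla Dockerflow heartbeat endpoint (adjust the path if your build differs):

# quick reachability check before reconfiguring Firefox (the endpoint path is an assumption)
curl -fsS http://localhost:8000/__heartbeat__ && echo "sync server is reachable"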

References

Main source: https://artemis.sh/2023/03/27/firefox-syncstorage-rs.html
Additional: https://gitlab.com/agile-penguin/syncstorage-rs_rhel/-/blob/main/HowTo_syncstorage_rhel_EN.md

mreid-tt added the "update request to update existing package" label Nov 25, 2023
mreid-tt self-assigned this Nov 25, 2023
mreid-tt commented:

@th0ma7, I was reading the Wiki page Using wheels to distribute Python packages but was not clear on whether this project needed wheels or not. Given my research above, what are your thoughts?

hgy59 commented Nov 26, 2023

@mreid-tt my last try to build syncstorage-rs was almost 2 years ago (with version 0.10.2).

The challenge will be to have python, rust and go available at the same time to build the package.
IMO the python/rust/go mk files are not yet designed to work together.

(I recently ran into similar issues when trying to build homeassistant with prebuilt python and prebuilt ffmpeg - and some time ago I was trying to create a package that needs native/nodejs and native/go to build, and AFAICR it was tricky to set the PATH for those tools to be accessible...)

Hopefully a syncserver-rs package will not be available for x64 only, because on x64 you can already use the docker solution.

mreid-tt commented Nov 26, 2023

@hgy59, thanks for the feedback. This is an area I'm not familiar with at all, so I was seeking advice on how to start. I gather that the combination of Rust and Python is not something we currently cater for, but I was hoping we could use this as a test case to maybe support it.

Hopefully a syncserver-rs package will not only be available for x64, because there you can use the docker solution.

I believe it should be available on whichever platforms support Python and Rust. The proof of concept on Ubuntu was done on an ARM CPU, as shown below:

$ uname -p
aarch64

hgy59 commented Nov 26, 2023

the combination of Rust and Python

This combination is working, at least to build wheels that depend on rust (like cryptography).
Due to issues with 32-bit PPC for rust, the qoriq arch is not (yet?) supported...

mreid-tt commented Nov 26, 2023

Hey @hgy59, I've been exploring examples in the Wiki and our packages regarding building with Rust in spksrc, but haven't found many useful ones. Reflecting on my manual build experience, I'd like to share my thoughts on what a potential package approach could entail. I'd appreciate your feedback to ensure I'm heading in the right direction.

  1. Establish a standard wheels package with install_python_virtualenv and install_python_wheels setup commands.
  2. Manage dependencies (gcc, libcurl4-openssl-dev, pkg-config, libmysqlclient-dev) using DEPENDS commands for cross/pkg-config, cross/libstdc++, cross/curl, cross/openssl3, and cross/mysql-connector-c.
  3. Create a new package at cross/diesel_cli to serve as a dependency.
  4. Configure the main Makefile to employ cargo for building the package with specific syncserver arguments.
  5. Generate a requirements-crossenv.txt file, consolidating contents from requirements.txt and tools/tokenserver/requirements.txt for practicality.
  6. In the service script's service_postinst function, initialize the database and run migrations using the included diesel_cli (see the sketch after this list).
  7. Integrate additional commands in the service script to finalize server configuration.
  8. Wrap the server startup in a daemon call using the service_prestart function.
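
To make step 6 a bit more concrete, below is a rough sketch of what that could look like in the package's service-setup.sh. This is only an illustration: the diesel binary location, the migration directories, and the password variable are assumptions about how the package might be laid out, mirroring the manual commands above.

# hypothetical service-setup.sh fragment (paths and variables are assumptions)
service_postinst ()
{
    # run the schema migrations with the diesel_cli bundled in the package,
    # using the same invocation as the manual walkthrough above
    ${SYNOPKG_PKGDEST}/bin/diesel --database-url "mysql://syncstorage:${SYNCSTORAGE_PW}@localhost/syncstorage_rs" \
        migration --migration-dir ${SYNOPKG_PKGDEST}/share/syncstorage-mysql/migrations run
    ${SYNOPKG_PKGDEST}/bin/diesel --database-url "mysql://syncstorage:${SYNCSTORAGE_PW}@localhost/tokenserver_rs" \
        migration --migration-dir ${SYNOPKG_PKGDEST}/share/tokenserver-db/migrations run
}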

hgy59 commented Nov 26, 2023

Hey @hgy59, I've been exploring examples in the Wiki and our packages regarding building with Rust in spksrc, but haven't found many useful ones. Reflecting on my manual build experience, I'd like to share my thoughts on what a potential package approach could entail. I'd appreciate your feedback to ensure I'm heading in the right direction.

I am a little bit confused about the order of your list 😄 (namely the first point).

  1. Establish a standard wheels package with install_python_virtualenv and install_python_wheels setup commands.

those are installer functions that can be used in service-setup.sh (in service_postinst).

  2. Manage dependencies (gcc, libcurl4-openssl-dev, pkg-config, libmysqlclient-dev) using DEPENDS commands for cross/pkg-config, cross/libstdc++, cross/curl, cross/openssl3, and cross/mysql-connector-c.
  • gcc is part of the arch-specific toolchain and pkg-config is in the development environment (spksrc image). No need for a cross/pkg-config package (you don't want to compile under DSM, do you?)
  • cross/libstdc++ is only needed when DSM provides a too old version (for sure not in DSM >= 7; in DSM 6 only a few packages like dotnet-runtime need it)
  • you can omit openssl3 in the dependency list since cross/curl builds with openssl3 by default
  • does syncserver-rs support a MySQL db only? If so, you need mysql-connector-c; otherwise you could probably use sqlite, as you did in the owncloud package.
  3. Create a new package at cross/diesel_cli to serve as a dependency.

It looks like this is a regular dependency (rust crate) of syncserver-rs and does not need to be built separately (it is covered by the next point).

  4. Configure the main Makefile to employ cargo for building the package with specific syncserver arguments.

The regular use case is to build a cross/syncserver-rs package with
include ../../mk/spksrc.cross-rust.mk

  5. Generate a requirements-crossenv.txt file, consolidating contents from requirements.txt and tools/tokenserver/requirements.txt for practicality.

To find the pure-python and crossenv requirements, I have created a script (generate_requirements.sh). I normally use it in homeassistant (with some hundreds of requirements) like this, with a consolidated requirements_all.txt file as the source:

mkdir src/requirements
cd src/requirements 
generate_requirements.sh "-r requirements_all.txt" 311

This will temporarily create a virtual env with native/python311 and generate three files:

  • generated_requirements_all.txt
  • generated_requirements_crossenv.txt
  • generated_requirements_pure.txt

The crossenv and pure files can be taken as initial requirements-crossenv.txt and requirements-pure.txt for the package. The result is not bulletproof: it needs manual validation, and you will probably also need a requirements-abi.txt file. For validation I normally use the download files section on PyPI. Pure-python packages have a "none-any" file only, others have cp311 or abi files (and some have both - pure and cross - mostly for performance of compiled code).

This script (attached below) has saved me a lot of time; before that, I did this within the venv of the installed packages under DSM...

  6. In the service script's service_postinst function, initialize the database and run migrations using the included diesel_cli.

  7. Integrate additional commands in the service script to finalize server configuration.

  8. Wrap the server startup in a daemon call using the service_prestart function.

You should not need service_prestart. A SERVICE_COMMAND should be sufficient, or you could create a start-script called in SERVICE_COMMAND if you need more than a one-liner.
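
As an illustration of that last point, SERVICE_COMMAND in service-setup.sh could be a single line. This is only a sketch; the binary and config file locations are assumptions about where the package would install them:

# hypothetical one-liner; adjust paths to wherever the package installs the binary and config
SERVICE_COMMAND="${SYNOPKG_PKGDEST}/bin/syncserver --config=${SYNOPKG_PKGVAR}/local.toml"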

generate_requirements.zip

Just some of my experience...
@th0ma7 will have more...

hgy59 commented Nov 26, 2023

@mreid-tt
found that sqlite is not an option (mozilla-services/syncstorage-rs#12)

th0ma7 commented Nov 26, 2023

@hgy59 I believe you've pretty much covered it all. I do agree to disagree with you regarding openssl3, as I prefer to always include mandatory requirements... but that's purely aesthetic.

Regarding your script, it is rather nice and we would benefit from having it within the spksrc framework directly, perhaps under a tools or misc directory. Also, I wonder if jq commands could be used to parse the online status and capture whether downloads are available for any platform or are arch-specific (similarly to spksrc.wheel.mk around line 57).

Personally, I'm used to manually looking at https://pypi.org/: searching for the package needed and checking the download section, you'll find whether it provides "any" or arch-specific downloads, which defines whether it needs to be in pure (for "any") vs crossenv (when arch-specific versions are available). @hgy59 this is where I believe a jq query could be performed to automate this in your script... my two cents.
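
For instance, something along these lines could be a starting point (just a sketch; cryptography 3.4.8 is used as an example release, and the jq filter may need refining):

# list the distribution files PyPI publishes for a given release:
# only "none-any" wheels => pure python; cp3xx/abi3 wheels => needs crossenv
curl -s https://pypi.org/pypi/cryptography/3.4.8/json | jq -r '.urls[].filename'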

But @mreid-tt, including wheels requires a bit of trial and error. Having a non-x86_64 NAS helps to fully test things, as it usually won't find any pre-built match online and thus defaults to using the local file for installation (whereas x86_64 will always download the online version). I've made a few shortcuts to help rebuild just the wheel portions, such as make wheelclean, wheelcleancache and wheelcleanall, which allow you to rebuild the package without having to reprocess the entire build. Have a look at mk/spksrc.spk.mk and you'll probably get the gist of it. But pay attention to the pip output messages depending on what you require: they will tell you whether it is re-using a cached pre-built file or fully rebuilding it. Doing a final wheelcleancache then rebuild before your last commit is a good safeguard.

mreid-tt commented Nov 27, 2023

@hgy59, @th0ma7 thanks for the advice. I must admit that some of this is over my head, but I'm happy to continue learning. I've made an initial update PR #5942 to incorporate some of these ideas. The first build failed, but I didn't expect it to work right out of the box. Do let me know if you have any thoughts on my PR.

hgy59 commented Nov 27, 2023

@mreid-tt regarding the (python) requirements, I forgot to mention that in spk/python311/Makefile and spk/python311/src/requirements-crossenv.txt you will find commented instructions for dedicated wheels,
i.e. mysqlclient and cryptography must be cross-compiled:

# [mysqlclient]
# Depends: mysql-connector-c, mariadb-connector-c
# Require environment variables
#  MYSQLCLIENT_CFLAGS
#  MYSQLCLIENT_LDFLAGS
mysqlclient==2.2.0

and related definitions in Makefile:

# [mysqlclient]
DEPENDS += cross/mysql-connector-c cross/mariadb-connector-c
ENV += MYSQLCLIENT_CFLAGS="$(CFLAGS) -I$(STAGING_INSTALL_PREFIX)/include/mysql -I$(STAGING_INSTALL_PREFIX)/include/mariadb -I$(STAGING_INSTALL_PREFIX)/$(PYTHON_INC_DIR)"
ENV += MYSQLCLIENT_LDFLAGS="$(LDFLAGS)"

For cryptography you have to find a solution, since only newer versions with the Rust build are documented.

So syncstorage-rs will need a requirements-crossenv.txt and a requirements-pure.txt file. You cannot merge all together into one requirements file since only the wheels defined in requirements-crossenv.txt are cross compiled and included in the package.

And you have to add all requirements (including indirect dependencies), not only the python modules that are in the requirements files of the syncstorage-rs source code.
You can only omit the python modules that are already included in the python311 package (like cffi and SQLAlchemy) - you have to test whether the required versions of such modules must match the versions included in the python311 package.

While looking at python311 now, I am unsure whether mysqlclient is already delivered with the python311 package or not (@th0ma7 is it?).

hgy59 commented Nov 27, 2023

@mreid-tt another hint.

If you replace
include ../../mk/spksrc.spk.mk
by
include ../../mk/spksrc.python.mk

and define some variables, you can benefit from the latest improvements by @th0ma7 to depend on the prebuilt python in spk/python311. If you build spk/python311 for each arch you develop for, then you don't have to rebuild python after a make clean in spk/ffsync. This is a huge speed-up of the development cycle.

You find related code in the Makefile of tvheadend, homeassistant and other packages.

th0ma7 commented Nov 27, 2023

While looking at python311 now, I am unsure whether mysqlclient is already delivered with the python311 package or not (@th0ma7 is it?).

@hgy59 you'll notice in the python311 makefile that I added a variable WHEELS_TEST_ALL that I set to 1 only when doing the final testing of all wheels, to confirm that they still work OK with the update (which often also includes updates to the wheel versions themselves). Once testing is over and all lights are green, I set it back to 0 prior to publishing, as only a minimal set of mandatory pure-python wheels is included (e.g. setuptools, wheel, certifi, virtualenv, etc.)

If you replace
include ../../mk/spksrc.spk.mk
by
include ../../mk/spksrc.python.mk

Indeed, thanks for bringing that up (brain is still stuck in covid glue...) - I believe @hgy59 explained it well, but just to be clear, @mreid-tt: to use this optimally you must build python311 (i.e. make all-supported) prior to building your project, as this will create a direct dependency on it. This will significantly reduce your build time, since python won't need to be rebuilt for subsequent testing and rebuilds of your project. Note that this is exactly the behavior of the online github-action as well, which explains why you always find a python package included as part of the resulting per-arch zip builds on the summary page view.

For it to work properly, in your project you'll also need to define which python package you're referring to with:

PYTHON_PACKAGE = python311
SPK_DEPENDS = "$(PYTHON_PACKAGE)"

Then the later call to include ../../mk/spksrc.python.mk should make it all work out.

A really simple example with no dependencies, as it is python-only, is mercurial. You can also look at a slightly more complex version with duplicity, then move to the real thing with @hgy59's homeassistant 😄

th0ma7 commented Nov 27, 2023

For cryptography you have to find a solution, since only newer versions with the Rust build are documented.

@hgy59 what's the issue you are referring to with cryptography, besides the one known for the ppc qoriq arch?

hgy59 commented Nov 27, 2023

For cryptography you have to find a solution, since only newer versions with the Rust build are documented.

@hgy59 what's the issue you are referring to with cryptography, besides the one known for the ppc qoriq arch?

Reading the requirements.txt, it says that a rather old cryptography==3.4.8 must be used, which is not built with rust/maturin and probably needs the CRYPTOGRAPHY_DONT_BUILD_RUST env variable to be defined.

This will probably not work with cross/cryptography or needs a different version...

and there are some ENV variables (OPENSSL_*), but those are not required for cryptography < 40

th0ma7 commented Nov 27, 2023

Hmm, an interesting one indeed. Digging into the py310 crossenv file history, we were stuck at version 3.3.2 of cryptography for a hell of a long time until we sorted out rust builds. So indeed, version 3.4.8 may be a little challenging, although I believe it only needs setting up the following environment variables in your Makefile, like all other <40 builds used to:

# [bcrypt] and [cryptography]
# Mandatory for rustc wheel building
ENV += PYO3_CROSS_LIB_DIR=$(STAGING_INSTALL_PREFIX)/lib/
ENV += PYO3_CROSS_INCLUDE_DIR=$(STAGING_INSTALL_PREFIX)/include/

Question really is, is it even compatible with py311 in the first place?

mreid-tt commented Nov 27, 2023

@hgy59, @th0ma7, thank you both for your detailed feedback. It will take me a bit to go through this, as I am not very familiar with these types of builds (but I do want to learn).

Meanwhile, I've observed a common trend in the builds currently encountering failures, primarily stemming from the following error:

error: found a virtual manifest at `/github/workspace/spk/ffsync/work-[build-type]/syncstorage-rs-0.14.1/Cargo.toml` instead of a package manifest

Looking online I found rust-lang/cargo#7599 (comment), which suggests that the way to address this is:

TLDR fix: specify the path to the executable package with --path

The challenge is that I don't think we currently have a way to pass arguments to the build. Looking at the codebase I see this example:

CARGO_BUILD_ARGS += --features 'pcre2'

The problem is that within the mk/spksrc.cross-rust.mk there is no code to accept and process the CARGO_BUILD_ARGS variable. Moreover, the --path argument seems to already be hard-coded with:

# Set the cargo install parameters
CARGO_INSTALL_ARGS += --path $(RUST_SRC_DIR)

This suggests to me that without the ability to set a path in the build configuration, syncstorage-rs will never get built. Am I interpreting this correctly? If so, do any of the tips you've kindly provided above assist with this? If not, should I include some framework fixes to process the CARGO_BUILD_ARGS argument?

th0ma7 commented Nov 27, 2023

First thing that comes to mind: you have a few options. You could first play with the actual mk files and temporarily hard-code your changes, and/or you can augment existing variables when applicable (using +=). Also, there is the RUSTFLAGS variable that can be used directly as an alternative to the cargo variables, which you could define using ENV += RUSTFLAGS="bla".

mreid-tt commented Nov 28, 2023

@th0ma7, thanks for the recommendation. I opted to first explore the route of using RUSTFLAGS. I was pleased to learn that the RUST_SRC_DIR variable in spksrc was something I could pass in as an argument to set the path, so I didn't need to do much there. The other arguments would need RUSTFLAGS. Looking first at the codebase, I saw this example:

ENV += RUSTFLAGS="-Clink-arg=-Wl,--allow-multiple-definition"

Trying this comma-separated syntax, however, didn't work. Looking at the documentation for the environment variables Cargo reads, we note that:

RUSTFLAGS — A space-separated list of custom flags to pass to all compiler invocations that Cargo performs.

So that example would not work. Moreover, based on the build.rustflags documentation, the use of extra flags is mutually exclusive with other sources. I found this to be the case: the --path argument seemed to be omitted when I passed RUSTFLAGS. Given that this practice of passing low-level flags was cautioned as not always backwards compatible, I opted to try another approach by amending spksrc.

For this, I added a handler to process CARGO_BUILD_ARGS and built the arguments up as a variable (rather than passing them as a whole string, as that caused issues - I suspect because it lacked a unit separator). With this implementation the compilation moved forward with the rust build.

The rust compilation however did not complete as it is now complaining of:

error: failed to run custom build command for `pyo3 v0.14.5`

Caused by:
  process didn't exit successfully: `/github/workspace/spk/ffsync/work-[build-type]/syncstorage-rs-0.14.1/target/release/build/pyo3-0a9fa2667963a478/build-script-build` (exit status: 1)
  --- stdout
  cargo:rerun-if-env-changed=PYO3_CROSS
  cargo:rerun-if-env-changed=PYO3_CROSS_LIB_DIR
  cargo:rerun-if-env-changed=PYO3_CROSS_PYTHON_VERSION
  cargo:rerun-if-env-changed=_PYTHON_SYSCONFIGDATA_NAME

  --- stderr
  error: Could not find either libpython.so or _sysconfigdata*.py in /github/workspace/spk/ffsync/work-[build-type]/install/var/packages/ffsync/target/lib/

I suspect that the fixes above for python will likely contribute to resolving this compile error. I'll work on this some more tomorrow.

th0ma7 commented Nov 28, 2023

You may be missing the

# Mandatory for rustc wheel building
ENV += PYO3_CROSS_LIB_DIR=$(STAGING_INSTALL_PREFIX)/lib/
ENV += PYO3_CROSS_INCLUDE_DIR=$(STAGING_INSTALL_PREFIX)/include/

mreid-tt commented Nov 28, 2023

So syncstorage-rs will need a requirements-crossenv.txt and a requirements-pure.txt file. You cannot merge all together into one requirements file since only the wheels defined in requirements-crossenv.txt are cross compiled and included in the package.

@hgy59, so I broke up the requirements into 'pure' and 'crossenv' using manual lookups at https://pypi.org/. One project (pyfxa) didn't have any built distributions (so I assumed 'pure') and another (sqlalchemy) had both (so I left it in both 'pure' and 'crossenv'). Let me know if the Makefile is correct for their reference. Regarding also adding these to syncstorage-rs: that package lives in the cross path, and based on the Available Variables in cross/ Makefiles, I don't see a WHEELS variable available there. I also don't see any other packages in the cross directory with requirements-[crossenv|pure].txt files.

You may be missing the

# Mandatory for rustc wheel building
ENV += PYO3_CROSS_LIB_DIR=$(STAGING_INSTALL_PREFIX)/lib/
ENV += PYO3_CROSS_INCLUDE_DIR=$(STAGING_INSTALL_PREFIX)/include/

@th0ma7, so I added this to both the cross/syncstorage-rs/Makefile and the spk/ffsync/Makefile but the error message has not changed.

EDIT: Further discussion on the implementation will be continued under the PR #5942.
