
merge main to feature branch #7

Merged: 106 commits, Nov 10, 2022
Commits
- e66438f Fix broken doc links (#658) (xiaoyongzhu, Sep 12, 2022)
- bbdcc50 Added _scproxy necessary for MacOS (#651) (ahlag, Sep 13, 2022)
- 6de1d60 Add docs for consuming features in online environment (#609) (xiaoyongzhu, Sep 14, 2022)
- a54180b Clean up after moving to LFAI (#665) (xiaoyongzhu, Sep 14, 2022)
- 9cb35e6 Updating docker version in ARM template to use latest release tagged … (jainr, Sep 14, 2022)
- 0f15cf0 Added prettier documentation (#672) (ahlag, Sep 15, 2022)
- bb2a45d UI: Add data source detail page (#620) (ahlag, Sep 15, 2022)
- 7b2ea61 Add aerospike sink (#632) (YihuiGuo, Sep 19, 2022)
- b89c045 Remove reference to aerospike in sbt (#680) (YihuiGuo, Sep 19, 2022)
- fe40166 Extend RBAC to support project id as input (#673) (Yuqing-cat, Sep 19, 2022)
- b299379 Local Spark Provider which supports to submit feature join job in loc… (Yuqing-cat, Sep 20, 2022)
- d9ae81e Fixing issue with docker image on demo apps not getting updated (#686) (jainr, Sep 20, 2022)
- f7e0c8b Lock python dependency versions (#690) (xiaoyongzhu, Sep 21, 2022)
- 5382cd0 Apply 'aggregation_features' parameter to merge dataframes (#667) (enya-yx, Sep 21, 2022)
- de64cea Fix data source detail page in rbac registry (#698) (Yuqing-cat, Sep 22, 2022)
- fc407e2 Fix multi-keyed feature in anchor (direct purview) (#676) (YihuiGuo, Sep 22, 2022)
- 6035f04 Fix path with #LATEST (#684) (jaymo001, Sep 23, 2022)
- 25aa097 Fix Feature value adaptor and UDF adaptor on Spark executors (#660) (jaymo001, Sep 23, 2022)
- ebcc81c Enhance SQL Registry Error Messages (#674) (windoze, Sep 23, 2022)
- ef3e130 bump version to 0.8.0 (#694) (Yuqing-cat, Sep 23, 2022)
- 82607ac Fix feature type bug where inferred feature type might not be honored… (jaymo001, Sep 23, 2022)
- f3c1a27 Update setup.py (#702) (xiaoyongzhu, Sep 24, 2022)
- 2cf23a5 fix rbac+purview web app issue (#700) (Yuqing-cat, Sep 25, 2022)
- dc0ca12 Remove hard coded resources in docs (#696) (enya-yx, Sep 25, 2022)
- 49d5a5d Add e2e test for purview registry and rbac registry (#689) (blrchen, Sep 26, 2022)
- e8a738b Update test use runtime jar from maven for spark submission to cover … (blrchen, Sep 26, 2022)
- 5e7edf6 Enhance databricks submission error message (#710) (enya-yx, Sep 27, 2022)
- 059f2b4 Enhance purview registry error messages (#709) (blrchen, Sep 28, 2022)
- a6da6fe hot fix databricks es dependency issue (#713) (Yuqing-cat, Sep 29, 2022)
- 7b88372 Fix materialize to sql e2e test failure (#717) (blrchen, Sep 29, 2022)
- 461ba01 Add Data Models in Feathr (#659) (hyingyang-linkedin, Sep 29, 2022)
- d187ae2 Revert "Enhance purview registry error messages (#709)" (#720) (blrchen, Sep 30, 2022)
- 6d1e7a6 Improve Avro GenericRecord and SpecificRecord based row-level extract… (jaymo001, Oct 5, 2022)
- 5fc3730 Save lookup feature definition to HOCON files (#732) (jaymo001, Oct 8, 2022)
- 356f74b Fix function string parsing (#725) (loomlike, Oct 8, 2022)
- b433039 Apply a same credential within each sample (#718) (enya-yx, Oct 10, 2022)
- a325597 Enable incremental for HDFS sink (#695) (enya-yx, Oct 11, 2022)
- bb67939 #492 fix, fail only if different sources have same name (#733) (windoze, Oct 11, 2022)
- 4f76e19 Remove unused credentials and deprecated purview settings (#708) (enya-yx, Oct 12, 2022)
- 18d776d Revoke token submitted by mistaken (#730) (blrchen, Oct 12, 2022)
- 9f446bf Update product_recommendation_demo.ipynb (hangfei, Oct 12, 2022)
- 39c14ca Fix synapse errors not print out issue (#734) (enya-yx, Oct 12, 2022)
- c075dc2 Spark config passing bug fix for local spark submission (#729) (loomlike, Oct 12, 2022)
- d771c3c Fix direct purview client missing transformation (#736) (YihuiGuo, Oct 13, 2022)
- f677a17 Revert "Derived feature bugfix (#121)" (#731) (jaymo001, Oct 13, 2022)
- 616d76e Support SWA with groupBy to 1d tensor conversion (#748) (jaymo001, Oct 13, 2022)
- 8d7d412 Rijai/armfix (#742) (jainr, Oct 14, 2022)
- 4bdcd57 bump version to 0.8.2 (#722) (Yuqing-cat, Oct 14, 2022)
- 3c407c3 Added latest deltalake version (#735) (ahlag, Oct 14, 2022)
- 1465f64 #474 Disable local mode (#738) (windoze, Oct 15, 2022)
- d59ea4b Allow recreating entities for PurView registry (#691) (windoze, Oct 15, 2022)
- b6cff14 Adding DevSkim linter to Github actions (#657) (jainr, Oct 17, 2022)
- 3d12944 Fix icons in UI cannot auto scale (#737) (#744) (Fendoe, Oct 17, 2022)
- 3070a86 Expose 'timePartitionPattern' in Python API [ WIP ] (#714) (enya-yx, Oct 17, 2022)
- 83b79c9 Setting up component governance pipeline (#655) (jainr, Oct 17, 2022)
- b036898 Add docs to explain on feature materialization behavior (#688) (xiaoyongzhu, Oct 18, 2022)
- 5030eee Fix protobuf version (#711) (enya-yx, Oct 18, 2022)
- aad580d Add some notes based on on-call issues (#753) (enya-yx, Oct 18, 2022)
- 4b9b494 Refine spark runtime error message (#755) (Yuqing-cat, Oct 19, 2022)
- b8e3b27 Serialization bug due to version incompatibility between azure-core a… (jainr, Oct 19, 2022)
- fa10e72 Unify Python SDK Build Version and decouple Feathr Maven Version (#746) (Yuqing-cat, Oct 19, 2022)
- c0e8bc8 replace hard code string in notebook and align with others (#765) (Yuqing-cat, Oct 19, 2022)
- 143ff89 Add flag to enable generation non-agg features (#719) (windoze, Oct 20, 2022)
- 59e4ccf rollback 0.8.2 version bump PR (#771) (Yuqing-cat, Oct 24, 2022)
- 6a3a044 Refactor Product Recommendation sample notebook (#743) (jainr, Oct 24, 2022)
- eb6b9b8 Update role-management page in UI (#751) (#764) (Fendoe, Oct 25, 2022)
- 5c17dee Create Feature less module in UI code and import alias (#768) (Fendoe, Oct 25, 2022)
- 84dfe3b Add dev and notebook dependencies. Add extra dependency installation … (loomlike, Oct 25, 2022)
- 07f6033 Fix Windows compatibility issues (#776) (xiaoyongzhu, Oct 25, 2022)
- 5a7c8e2 UI: Replace logo icon (#778) (Fendoe, Oct 26, 2022)
- 2f7e1fd Refine example notebooks (#756) (loomlike, Oct 27, 2022)
- 9728e94 UI: Display version (#779) (Fendoe, Oct 27, 2022)
- 58395a8 Add nightly Notification to PR Test GitHub Action (#783) (Yuqing-cat, Oct 27, 2022)
- 70618c4 fix broken links for #743 (#789) (Yuqing-cat, Oct 28, 2022)
- 3b64c8e Update notebook image links for github rendering (#787) (loomlike, Oct 29, 2022)
- ff438f5 Revert 756 (#798) (blrchen, Oct 31, 2022)
- 2f868ef remove unnecessary spark job from registry test (#790) (Yuqing-cat, Oct 31, 2022)
- 04c417e Revert "Expose 'timePartitionPattern' in Python API [ WIP ] (#714)" (… (blrchen, Oct 31, 2022)
- 602b08f Update CONTRIBUTING.md (#793) (hangfei, Oct 31, 2022)
- 1c95868 Fix test_azure_spark_maven_e2e ci test error (#800) (blrchen, Oct 31, 2022)
- 87baf06 Add failure warning and run link to daily notification (#802) (Yuqing-cat, Oct 31, 2022)
- a8a88d9 Minor documentation update to add info about maven automated workflow… (jainr, Oct 31, 2022)
- 1cb4a6f Update test_azure_spark_e2e.py (blrchen, Nov 1, 2022)
- 7ba8847 Fix doc dead links (#805) (blrchen, Nov 1, 2022)
- 8f6428d Fix more dead links (#807) (blrchen, Nov 1, 2022)
- 3025bc1 Improve UI experience and clean up ui code warnings (#801) (Fendoe, Nov 1, 2022)
- 02e3643 Add release instructions for Release Candidate (#809) (blrchen, Nov 1, 2022)
- 244f127 Bump version to 0.9.0-rc1 (#810) (blrchen, Nov 1, 2022)
- ded4cae Fix bug in empty array dense tensor default value (#806) (bozhonghu, Nov 1, 2022)
- edd00f6 Fix sql-based derived feature (#812) (jaymo001, Nov 1, 2022)
- f83e8f5 Replacing webapp-deploy action with workflow-webhook action. (#813) (jainr, Nov 2, 2022)
- 6b5cd00 Fix passthrough feature reference in sql-based derived feature (#815) (jaymo001, Nov 2, 2022)
- 8946b4f Revert databricks example notebook until fixing issues (#814) (loomlike, Nov 2, 2022)
- f36b6a5 add retry logic for purview project-ids logic (#821) (Yuqing-cat, Nov 3, 2022)
- c89f26d Bump version to 0.9.0-rc2 (#822) (blrchen, Nov 3, 2022)
- 512f309 Fix SideMenu display Manageemnt item issue. (#826) (Fendoe, Nov 3, 2022)
- e33d251 Update text and link (#828) (Fendoe, Nov 3, 2022)
- c203d69 fix sample issues due to derived feature engine change (#829) (xiaoyongzhu, Nov 4, 2022)
- ca9aa21 Add exception if materialize features defined on 'INPUT_CONTEXT' (#785) (enya-yx, Nov 6, 2022)
- b03932d Fix only first Key will show even if multiple keys are added (#837) (Fendoe, Nov 7, 2022)
- a0f91eb Move the version information to the bottom of the sidemenu. (#832) (Fendoe, Nov 8, 2022)
- 07f473a Fix key cannot read properties of undefined (reading 'map') (#841) (Fendoe, Nov 8, 2022)
- be6fa17 Model (#769) (hyingyang-linkedin, Nov 9, 2022)
- 461f587 Bump loader-utils from 2.0.2 to 2.0.3 in /ui (#846) (dependabot[bot], Nov 9, 2022)
- 2e32e88 Maven Package Version Configuration Fix (#845) (Yuqing-cat, Nov 10, 2022)
- b19480d Copy/paste typo (#849) (windoze, Nov 10, 2022)
2 changes: 1 addition & 1 deletion .github/pull_request_template.md
@@ -1,6 +1,6 @@
## Description
<!--
-Hey! Thank you for the contribution! Please go through https://github.com/linkedin/feathr/blob/main/docs/dev_guide/pull_request_guideline.md for more information.
+Hey! Thank you for the contribution! Please go through https://github.com/feathr-ai/feathr/blob/main/docs/dev_guide/pull_request_guideline.md for more information.

Describe what changes to make and why you are making these changes.

37 changes: 37 additions & 0 deletions .github/workflows/devskim-security-linter.yml
@@ -0,0 +1,37 @@
# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party (Microsoft) and are governed by
# separate terms of service, privacy policy, and support
# documentation.
# For more details about Devskim, visit https://github.com/marketplace/actions/devskim

name: DevSkim

on:
push:
branches: [ "main" ]
pull_request:
branches: [ "main" ]
schedule:
- cron: '00 4 * * *'

jobs:
lint:
name: DevSkim
runs-on: ubuntu-20.04
permissions:
actions: read
contents: read
security-events: write
steps:
- name: Checkout code
uses: actions/checkout@v3

- name: Run DevSkim scanner
uses: microsoft/DevSkim-Action@v1
with:
ignore-globs: "**/.git/**,**/test/**"

- name: Upload DevSkim scan results to GitHub Security tab
uses: github/codeql-action/upload-sarif@v2
with:
sarif_file: devskim-results.sarif
31 changes: 12 additions & 19 deletions .github/workflows/docker-publish.yml
@@ -52,27 +52,20 @@ jobs:


steps:
-- name: Deploy to Feathr SQL Registry Azure Web App
-id: deploy-to-sql-webapp
-uses: azure/webapps-deploy@v2
-with:
-app-name: 'feathr-sql-registry'
-publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE_FEATHR_SQL_REGISTRY }}
-images: 'index.docker.io/feathrfeaturestore/feathr-registry:nightly'

- name: Deploy to Feathr Purview Registry Azure Web App
id: deploy-to-purview-webapp
-uses: azure/webapps-deploy@v2
-with:
-app-name: 'feathr-purview-registry'
-publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE_FEATHR_PURVIEW_REGISTRY }}
-images: 'index.docker.io/feathrfeaturestore/feathr-registry:nightly'
+uses: distributhor/workflow-webhook@v3.0.1
+env:
+webhook_url: ${{ secrets.AZURE_WEBAPP_FEATHR_PURVIEW_REGISTRY_WEBHOOK }}

- name: Deploy to Feathr RBAC Registry Azure Web App
id: deploy-to-rbac-webapp
-uses: azure/webapps-deploy@v2
-with:
-app-name: 'feathr-rbac-registry'
-publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE_FEATHR_RBAC_REGISTRY }}
-images: 'index.docker.io/feathrfeaturestore/feathr-registry:nightly'
+uses: distributhor/workflow-webhook@v3.0.1
+env:
+webhook_url: ${{ secrets.AZURE_WEBAPP_FEATHR_RBAC_REGISTRY_WEBHOOK }}

+- name: Deploy to Feathr SQL Registry Azure Web App
+id: deploy-to-sql-webapp
+uses: distributhor/workflow-webhook@v3.0.1
+env:
+webhook_url: ${{ secrets.AZURE_WEBAPP_FEATHR_SQL_REGISTRY_WEBHOOK }}
5 changes: 4 additions & 1 deletion .github/workflows/document-scan.yml
@@ -1,6 +1,9 @@
name: Feathr Documents' Broken Link Check

-on: [push]
+on:
+push:
+branches: [main]

jobs:
check-links:
runs-on: ubuntu-latest
99 changes: 90 additions & 9 deletions .github/workflows/pull_request_push_test.yml
@@ -22,11 +22,15 @@ on:
- "docs/**"
- "ui/**"
- "**/README.md"

+schedule:
+# Runs daily at 1 PM UTC (9 PM CST), will send notification to TEAMS_WEBHOOK
+- cron: '00 13 * * *'

jobs:
sbt_test:
runs-on: ubuntu-latest
-if: github.event_name == 'push' || github.event_name == 'pull_request' || (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'safe to test'))
+if: github.event_name == 'schedule' || github.event_name == 'push' || github.event_name == 'pull_request' || (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'safe to test'))
steps:
- uses: actions/checkout@v2
with:
@@ -41,7 +45,7 @@ jobs:

python_lint:
runs-on: ubuntu-latest
-if: github.event_name == 'push' || github.event_name == 'pull_request' || (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'safe to test'))
+if: github.event_name == 'schedule' || github.event_name == 'push' || github.event_name == 'pull_request' || (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'safe to test'))
steps:
- name: Set up Python 3.8
uses: actions/setup-python@v2
@@ -61,7 +65,7 @@

databricks_test:
runs-on: ubuntu-latest
-if: github.event_name == 'push' || github.event_name == 'pull_request' || (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'safe to test'))
+if: github.event_name == 'schedule' || github.event_name == 'push' || github.event_name == 'pull_request' || (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'safe to test'))
steps:
- uses: actions/checkout@v2
with:
@@ -87,8 +91,7 @@ jobs:
- name: Install Feathr Package
run: |
python -m pip install --upgrade pip
-python -m pip install pytest pytest-xdist databricks-cli
-python -m pip install -e ./feathr_project/
+python -m pip install -e ./feathr_project/[all]
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
- name: Set env variable and upload jars
env:
@@ -123,15 +126,14 @@ jobs:
COSMOS1_KEY: ${{secrets.COSMOS1_KEY}}
SQL1_USER: ${{secrets.SQL1_USER}}
SQL1_PASSWORD: ${{secrets.SQL1_PASSWORD}}

run: |
# run only test with databricks. run in 4 parallel jobs
pytest -n 6 feathr_project/test/

azure_synapse_test:
# might be a bit duplication to setup both the azure_synapse test and databricks test, but for now we will keep those to accelerate the test speed
runs-on: ubuntu-latest
-if: github.event_name == 'push' || github.event_name == 'pull_request' || (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'safe to test'))
+if: github.event_name == 'schedule' || github.event_name == 'push' || github.event_name == 'pull_request' || (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'safe to test'))
steps:
- uses: actions/checkout@v2
with:
@@ -165,8 +167,7 @@
- name: Install Feathr Package
run: |
python -m pip install --upgrade pip
-python -m pip install pytest pytest-xdist
-python -m pip install -e ./feathr_project/
+python -m pip install -e ./feathr_project/[all]
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
- name: Run Feathr with Azure Synapse
env:
@@ -197,3 +198,83 @@ jobs:
# skip databricks related test as we just ran the test; also seperate databricks and synapse test to make sure there's no write conflict
# run in 4 parallel jobs to make the time shorter
pytest -n 6 feathr_project/test/

local_spark_test:
runs-on: ubuntu-latest
if: github.event_name == 'schedule' || github.event_name == 'push' || github.event_name == 'pull_request' || (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'safe to test'))
steps:
- uses: actions/checkout@v2
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Set up JDK 8
uses: actions/setup-java@v2
with:
java-version: "8"
distribution: "temurin"
- name: Build JAR
run: |
sbt assembly
# remote folder for CI upload
echo "CI_SPARK_REMOTE_JAR_FOLDER=feathr_jar_github_action_$(date +"%H_%M_%S")" >> $GITHUB_ENV
# get local jar name without paths so version change won't affect it
echo "FEATHR_LOCAL_JAR_NAME=$(ls target/scala-2.12/*.jar| xargs -n 1 basename)" >> $GITHUB_ENV
# get local jar name without path
echo "FEATHR_LOCAL_JAR_FULL_NAME_PATH=$(ls target/scala-2.12/*.jar)" >> $GITHUB_ENV
- name: Set up Python 3.8
uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install Feathr Package
run: |
python -m pip install --upgrade pip
python -m pip install -e ./feathr_project/[all]
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
- name: Run Feathr with Local Spark
env:
PROJECT_CONFIG__PROJECT_NAME: "feathr_github_ci_local"
SPARK_CONFIG__SPARK_CLUSTER: local
REDIS_PASSWORD: ${{secrets.REDIS_PASSWORD}}
AZURE_CLIENT_ID: ${{secrets.AZURE_CLIENT_ID}}
AZURE_TENANT_ID: ${{secrets.AZURE_TENANT_ID}}
AZURE_CLIENT_SECRET: ${{secrets.AZURE_CLIENT_SECRET}}
S3_ACCESS_KEY: ${{secrets.S3_ACCESS_KEY}}
S3_SECRET_KEY: ${{secrets.S3_SECRET_KEY}}
ADLS_ACCOUNT: ${{secrets.ADLS_ACCOUNT}}
ADLS_KEY: ${{secrets.ADLS_KEY}}
BLOB_ACCOUNT: ${{secrets.BLOB_ACCOUNT}}
BLOB_KEY: ${{secrets.BLOB_KEY}}
JDBC_TABLE: ${{secrets.JDBC_TABLE}}
JDBC_USER: ${{secrets.JDBC_USER}}
JDBC_PASSWORD: ${{secrets.JDBC_PASSWORD}}
JDBC_DRIVER: ${{secrets.JDBC_DRIVER}}
JDBC_SF_PASSWORD: ${{secrets.JDBC_SF_PASSWORD}}
KAFKA_SASL_JAAS_CONFIG: ${{secrets.KAFKA_SASL_JAAS_CONFIG}}
MONITORING_DATABASE_SQL_PASSWORD: ${{secrets.MONITORING_DATABASE_SQL_PASSWORD}}
COSMOS1_KEY: ${{secrets.COSMOS1_KEY}}
SQL1_USER: ${{secrets.SQL1_USER}}
SQL1_PASSWORD: ${{secrets.SQL1_PASSWORD}}
run: |
# skip cloud related tests
pytest feathr_project/test/test_local_spark_e2e.py

failure_notification:
# If any failure, warning message will be sent
needs: [sbt_test, python_lint, databricks_test, azure_synapse_test, local_spark_test]
runs-on: ubuntu-latest
if: failure() && github.event_name == 'schedule'
steps:
- name: Warning
run: |
curl -H 'Content-Type: application/json' -d '{"text": "[WARNING] Daily CI has failure, please check: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}"}' ${{ secrets.TEAMS_WEBHOOK }}

notification:
# Final Daily Report with all job status
needs: [sbt_test, python_lint, databricks_test, azure_synapse_test, local_spark_test]
runs-on: ubuntu-latest
if: always() && github.event_name == 'schedule'
steps:
- name: Get Date
run: echo "NOW=$(date +'%Y-%m-%d')" >> $GITHUB_ENV
- name: Notification
run: |
curl -H 'Content-Type: application/json' -d '{"text": "${{env.NOW}} Daily Report: 1. SBT Test ${{needs.sbt_test.result}}, 2. Python Lint Test ${{needs.python_lint.result}}, 3. Databricks Test ${{needs.databricks_test.result}}, 4. Synapse Test ${{needs.azure_synapse_test.result}} , 5. LOCAL SPARK TEST ${{needs.local_spark_test.result}}. Link: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}"}' ${{ secrets.TEAMS_WEBHOOK }}
8 changes: 6 additions & 2 deletions CONTRIBUTING.md
@@ -4,7 +4,7 @@ As a contributor, you represent that the code you submit is your original work o

# Responsible Disclosure of Security Vulnerabilities

-Please do not file reports on Github for security issues. Please review the guidelines on at (link to more info). Reports should be encrypted using PGP (link to PGP key) and sent to security@linkedin.com preferably with the title "Github linkedin/ - ".
+Please do not file reports on Github for security issues. Please review the guidelines on at (link to more info). Reports should be encrypted using PGP (link to PGP key) and sent to feathr-security@lists.lfaidata.foundation.

# Contribution Process

@@ -14,7 +14,7 @@ The Feathr community welcome everyone, and encourage a friendly and positive env

Please read existing Github issues or development work that is in progress or in the backlog to avoid duplication. If you are interested in those existing ones, you can leave a comment in the Github issues and the community will try to involve you. If you are not sure if it's duplicated, just create a Github issue and ask!

-If it's a simple bug fix(less than 20 lines) or documentation change, you can just submit your pull request(PR) without Github issues. For any other PRs, a Github issue is required.
+If it's a simple bug fix (less than 20 lines) or documentation change, you can just submit your pull request(PR) without Github issues. For any other PRs, a Github issue is required.

If you want to contribute something new and it's not tracked in existing Github issues, please create a new Github issue and the community will help review the idea. Please state `why` in your Github issue. If you already have a short design in mind, you can provide a one pager in the Github issue. If the idea in general make sense, then we can proceed to the design or development work. If the change is not small, an [RFC](https://en.wikipedia.org/wiki/Request_for_Comments) should be reviewed and approved by the team.

@@ -40,7 +40,11 @@ Our open source community strives to:
- **Be respectful**: We are a world-wide community of professionals, and we conduct ourselves professionally. Disagreement is no excuse for poor behavior and poor manners.
- **Understand disagreements**: Disagreements, both social and technical, are useful learning opportunities. Seek to understand the other viewpoints and resolve differences constructively.
- **Remember that we’re different**. The strength of our community comes from its diversity, people from a wide range of backgrounds. Different people have different perspectives on issues. Being unable to understand why someone holds a viewpoint doesn’t mean that they’re wrong. Focus on helping to resolve issues and learning from mistakes.
--
+
+## Attribution & Acknowledgements
+
+This code of conduct is based on the Open Code of Conduct from the [TODOGroup](https://todogroup.org/blog/open-code-of-conduct/).

# Committers
Benjamin Le, David Stein, Edwin Cheung, Hangfei Lin, Jimmy Guo, Jinghui Mo, Li Lu, Rama Ramani, Ray Zhang, Xiaoyong Zhu
15 changes: 15 additions & 0 deletions azure-pipelines.yml
@@ -0,0 +1,15 @@
# Component Governance Pipeline
# Runs the Feathr code through Component Governance Detection tool and publishes the result under compliance tab.

trigger:
- main

pool:
vmImage: ubuntu-latest

steps:
- task: ComponentGovernanceComponentDetection@0
inputs:
scanType: 'Register'
verbosity: 'Verbose'
alertWarningLevel: 'High'
9 changes: 6 additions & 3 deletions build.sbt
@@ -1,10 +1,14 @@
import sbt.Keys.publishLocalConfiguration

ThisBuild / resolvers += Resolver.mavenLocal
ThisBuild / scalaVersion := "2.12.15"
-ThisBuild / version := "0.7.2"
+ThisBuild / version := "0.9.0-rc2"
ThisBuild / organization := "com.linkedin.feathr"
ThisBuild / organizationName := "linkedin"
val sparkVersion = "3.1.3"

publishLocalConfiguration := publishLocalConfiguration.value.withOverwrite(true)

val localAndCloudDiffDependencies = Seq(
"org.apache.spark" %% "spark-avro" % sparkVersion,
"org.apache.spark" %% "spark-sql" % sparkVersion,
@@ -46,7 +50,6 @@ val localAndCloudCommonDependencies = Seq(
"org.xerial" % "sqlite-jdbc" % "3.36.0.3",
"com.github.changvvb" %% "jackson-module-caseclass" % "1.1.1",
"com.azure.cosmos.spark" % "azure-cosmos-spark_3-1_2-12" % "4.11.1",
-"org.elasticsearch" % "elasticsearch-spark-30_2.12" % "7.15.2",
"org.eclipse.jetty" % "jetty-util" % "9.3.24.v20180605"
) // Common deps

@@ -101,4 +104,4 @@ assembly / assemblyMergeStrategy := {
// Some systems(like Hadoop) use different versinos of protobuf(like v2) so we have to shade it.
assemblyShadeRules in assembly := Seq(
ShadeRule.rename("com.google.protobuf.**" -> "shade.protobuf.@1").inAll,
)
)