Fix broken links in contributor guide #3956

Merged 1 commit on Oct 25, 2022

18 changes: 8 additions & 10 deletions docs/source/contributor-guide/index.md
@@ -83,19 +83,19 @@ Tests for the code in an individual module are defined in the same source file w

### Rust Integration Tests

-There are several tests of the public interface of the DataFusion library in the [tests](https://github.com/apache/arrow-datafusion/tree/master/datafusion/core/tests) directory.
+There are several tests of the public interface of the DataFusion library in the [tests](../../../datafusion/core/tests) directory.

You can run these tests individually using a command such as

```shell
cargo test -p datafusion --test sql_integration
```

-One very important test is the [sql_integration](https://github.com/apache/arrow-datafusion/blob/master/datafusion/core/tests/sql_integration.rs) test which validates DataFusion's ability to run a large assortment of SQL queries against an assortment of data setups.
+One very important test is the [sql_integration](../../../datafusion/core/tests/sql_integration.rs) test which validates DataFusion's ability to run a large assortment of SQL queries against an assortment of data setups.

### SQL / Postgres Integration Tests

-The [integration-tests](https://github.com/apache/arrow-datafusion/blob/master/datafusion/integration-tests) directory contains a harness that runs certain queries against both postgres and datafusion and compares results
+The [integration-tests](../../../integration-tests) directory contains a harness that runs certain queries against both postgres and datafusion and compares results

#### setup environment

@@ -154,7 +154,7 @@ Criterion integrates with Cargo's built-in [benchmark support](https://doc.rust-
cargo bench --bench BENCHMARK_NAME
```

-A full list of benchmarks can be found [here](../../../datafusion/benches).
+A full list of benchmarks can be found [here](../../../datafusion/core/benches).
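
For example, to run just the SQL planner benchmark (assuming a benchmark target named `sql_planner` exists in that directory; substitute any name from the list above):

```shell
# `sql_planner` is an assumed benchmark name; use any benchmark from the list above
cargo bench --bench sql_planner
```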

_[cargo-criterion](https://github.com/bheisler/cargo-criterion) may also be used for more advanced reporting._

@@ -187,7 +187,7 @@ Below is a checklist of what you need to do to add a new scalar function to Data
  - [here](../../../datafusion/physical-expr/src/math_expressions.rs) for math functions
  - [here](../../../datafusion/physical-expr/src/datetime_expressions.rs) for datetime functions
  - create a new module [here](../../../datafusion/physical-expr/src) for other functions
-- In [core/src/physical_plan](../../../datafusion/core/src/physical_plan/functions.rs), add:
+- In [physical-expr/src](../../../datafusion/physical-expr/src/functions.rs), add:
  - a new variant to `BuiltinScalarFunction`
  - a new entry to `FromStr` with the name of the function as called by SQL
  - a new line in `return_type` with the expected return type of the function, given an incoming type
@@ -197,8 +197,6 @@ Below is a checklist of what you need to do to add a new scalar function to Data
- In [core/tests/sql](../../../datafusion/core/tests/sql), add a new test where the function is called through SQL against well known data and returns the expected result.
- In [expr/src/expr_fn.rs](../../../datafusion/expr/src/expr_fn.rs), add:
  - a new entry of the `unary_scalar_expr!` macro for the new function.
-- In [core/src/logical_plan/mod](../../../datafusion/core/src/logical_plan/mod.rs), add:
-  - a new entry in the `pub use expr::{}` set.
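
As a rough illustration of the `BuiltinScalarFunction` and `FromStr` entries in the checklist above, the wiring might look along these lines (a simplified, self-contained sketch: `my_func` is a hypothetical function name and the types are stand-ins, not the actual DataFusion definitions):

```rust
use std::str::FromStr;

// Simplified stand-in for the real enum in physical-expr/src/functions.rs.
#[derive(Debug, PartialEq)]
enum BuiltinScalarFunction {
    Sqrt,
    // New variant for the hypothetical scalar function `my_func`.
    MyFunc,
}

impl FromStr for BuiltinScalarFunction {
    type Err = String;

    // Maps the name used in SQL to the new variant.
    fn from_str(name: &str) -> Result<Self, Self::Err> {
        match name {
            "sqrt" => Ok(BuiltinScalarFunction::Sqrt),
            "my_func" => Ok(BuiltinScalarFunction::MyFunc),
            other => Err(format!("unknown scalar function: {other}")),
        }
    }
}

fn main() {
    assert_eq!("my_func".parse(), Ok(BuiltinScalarFunction::MyFunc));
}
```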

## How to add a new aggregate function

@@ -221,7 +219,7 @@ Below is a checklist of what you need to do to add a new aggregate function to D
## How to display plans graphically

The query plans represented by `LogicalPlan` nodes can be graphically
-rendered using [Graphviz](http://www.graphviz.org/).
+rendered using [Graphviz](https://www.graphviz.org/).

To do so, save the output of the `display_graphviz` function to a file.

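For example, if the Graphviz output was saved to `/tmp/plan.dot` (an illustrative path), it could be rendered to a PDF with the `dot` tool:

```shell
# /tmp/plan.dot and /tmp/plan.pdf are illustrative paths
dot -Tpdf /tmp/plan.dot -o /tmp/plan.pdf
```
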
@@ -250,8 +248,8 @@ new specifications as you see fit.

Here is the list of currently active specifications:

-- [Output field name semantic](https://arrow.apache.org/datafusion/specification/output-field-name-semantic.html)
-- [Invariants](https://arrow.apache.org/datafusion/specification/invariants.html)
+- [Output field name semantic](https://arrow.apache.org/datafusion/contributor-guide/specification/output-field-name-semantic.html)
+- [Invariants](https://arrow.apache.org/datafusion/contributor-guide/specification/invariants.html)

All specifications are stored in the `docs/source/specification` folder.
