Examples Clean-Up #1408

Merged — 40 commits, Sep 15, 2024

Commits (40)
6f69ec8
Split simulation to be per-language, since support will generally var…
gerth2 Sep 5, 2024
cc7af9f
Strategies rework and streamline
gerth2 Sep 5, 2024
e0aac4f
WIP cleaning up sim estimates
gerth2 Sep 5, 2024
3157588
Basic streamlining to robot example. Looks Functional.
gerth2 Sep 5, 2024
e819980
wip adding GamepieceLauncher
gerth2 Sep 5, 2024
1b4e840
Added gamepiece launcher logic
gerth2 Sep 5, 2024
10712fe
Aim-at-target simple strategy added.
gerth2 Sep 5, 2024
aec81dc
Aim and Range working too
gerth2 Sep 5, 2024
942b884
Even more cleanup
gerth2 Sep 6, 2024
731d8b5
Moar Clean
gerth2 Sep 6, 2024
96d34e7
Code Block cleanup/commonization
gerth2 Sep 6, 2024
d6c37ea
RLI's are still todo, but build didn't like my toto strategy
gerth2 Sep 6, 2024
db9cc4b
another attempt
gerth2 Sep 6, 2024
d79c8d5
RLI cleanup
gerth2 Sep 6, 2024
385154b
moar cleanup
gerth2 Sep 6, 2024
306a7ff
WIP C++example rework. poseest working decently now.
gerth2 Sep 6, 2024
40200dc
aim at target example base WIP
gerth2 Sep 6, 2024
8c0b01f
AimAtTarget working
gerth2 Sep 6, 2024
d01b35b
Aim And Range working too
gerth2 Sep 6, 2024
5847be8
what the heck is an explicit falcon
gerth2 Sep 6, 2024
bbaf120
whip python
gerth2 Sep 6, 2024
601d2e7
looks like python can run examples
gerth2 Sep 6, 2024
63c27a3
more WIP python swerve
gerth2 Sep 6, 2024
d8b6b7c
wip python logging
gerth2 Sep 7, 2024
137fa36
More Python WIP
gerth2 Sep 7, 2024
bda58ad
Swerve simulation functional
gerth2 Sep 8, 2024
1da3519
Python examples fixed up and added
gerth2 Sep 8, 2024
9dee7a2
whippyformat and copyright stuff
gerth2 Sep 8, 2024
56513ce
PoseEst sample code added. Streamlined other examples.
gerth2 Sep 8, 2024
adb3098
RLI's updated
gerth2 Sep 8, 2024
5d4f588
moar RLI tweaks
gerth2 Sep 8, 2024
ab27254
i am very bad at software
gerth2 Sep 8, 2024
be8b25b
Removed broken/unneded bat file
gerth2 Sep 8, 2024
5a1c491
Some review comments
gerth2 Sep 8, 2024
cb76743
more review comments
gerth2 Sep 8, 2024
21ae9eb
Moto's review comments pt 1
gerth2 Sep 13, 2024
4b82755
spppppppoooooooooooooooooooooooootttttttttttttttttttttttttlllllllllll…
gerth2 Sep 13, 2024
ed4556c
Merge branch 'master' into example-docs-cleanup
gerth2 Sep 13, 2024
b3aee41
Pose Estimation Punchline
gerth2 Sep 13, 2024
ea9c9ad
bigger GIf
gerth2 Sep 13, 2024
8 changes: 4 additions & 4 deletions docs/source/docs/additional-resources/best-practices.md
@@ -15,9 +15,9 @@
- Make sure you take advantage of the field calibration time given at the start of the event:
- Bring your robot to the field at the allotted time.
- Turn on your robot and pull up the dashboard on your driver station.
- Point your robot at the target(s) and ensure you get a consistent tracking (you hold one target consistently, the ceiling lights aren't detected, etc.).
- If you have problems with your pipeline, go to the pipeline tuning section and retune the pipeline using the guide there. You want to make your exposure as low as possible with a tight hue value to ensure no extra targets are detected.
- Move the robot close, far, angled, and around the field to ensure no extra targets are found anywhere when looking for a target.
- Point your robot at the AprilTag(s) and ensure you get consistent tracking (you hold one AprilTag consistently, the ceiling lights aren't detected, etc.).
- If you have problems with your pipeline, go to the pipeline tuning section and retune the pipeline using the guide there.
- Move the robot close, far, angled, and around the field to ensure no extra AprilTags are found.
- Go to a practice match to ensure everything is working correctly.
- After field calibration, use the "Export Settings" button in the "Settings" page to create a backup.
- Do this for each coprocessor on your robot that runs PhotonVision, and name your exports with meaningful names.
@@ -26,4 +26,4 @@
- This effectively works as a snapshot of your PhotonVision data that can be restored at any point.
- Before every match, check the ethernet connection going into your coprocessor and that it is seated fully.
- Ensure that exposure is as low as possible and that you don't have the dashboard up when you don't need it to reduce bandwidth.
- Stream at as low of a resolution as possible while still detecting targets to stay within bandwidth limits.
- Stream at as low of a resolution as possible while still detecting AprilTags to stay within field bandwidth limits.
7 changes: 6 additions & 1 deletion docs/source/docs/apriltag-pipelines/multitag.md
@@ -24,7 +24,7 @@ This multi-target pose estimate can be accessed using PhotonLib. We suggest usin
```{eval-rst}
.. tab-set-code::

.. code-block:: java
.. code-block:: Java

var result = camera.getLatestResult();
if (result.getMultiTagResult().estimatedPose.isPresent) {
@@ -38,6 +38,11 @@ This multi-target pose estimate can be accessed using PhotonLib. We suggest usin
if (result.MultiTagResult().result.isPresent) {
frc::Transform3d fieldToCamera = result.MultiTagResult().result.best;
}

.. code-block:: Python

# Coming Soon!

```
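To turn that field-to-camera transform into a robot pose on the field, you also need the camera's mounting transform on the robot. The sketch below is only an illustration of that composition, not part of the official example: the camera name and `robotToCamera` values are made-up placeholders, and the result accessors mirror the Java snippet above.

```java
import edu.wpi.first.math.geometry.Pose3d;
import edu.wpi.first.math.geometry.Rotation3d;
import edu.wpi.first.math.geometry.Transform3d;
import edu.wpi.first.math.geometry.Translation3d;
import org.photonvision.PhotonCamera;

class MultiTagPoseSketch {
  // Placeholder camera name — use the name configured in the PhotonVision UI.
  PhotonCamera camera = new PhotonCamera("YOUR_CAMERA_NAME");

  // Placeholder mounting transform: camera 0.3 m forward and 0.2 m up, facing forward.
  // Measure this on your own robot.
  Transform3d robotToCamera =
      new Transform3d(new Translation3d(0.3, 0.0, 0.2), new Rotation3d());

  /** Returns the robot's field pose from the multi-tag result, or null if none is available. */
  Pose3d robotPoseFromMultiTag() {
    var result = camera.getLatestResult();
    if (!result.getMultiTagResult().estimatedPose.isPresent) {
      return null;
    }
    Transform3d fieldToCamera = result.getMultiTagResult().estimatedPose.best;
    // Compose field -> camera with camera -> robot (the inverse of robot -> camera).
    return new Pose3d().plus(fieldToCamera).plus(robotToCamera.inverse());
  }
}
```

In a real robot project this pose would typically be fed into a WPILib pose estimator rather than used directly.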

:::{note}
38 changes: 27 additions & 11 deletions docs/source/docs/contributing/building-photon.md
@@ -243,19 +243,11 @@ The program will wait for the VSCode debugger to attach before proceeding.

### Running examples

You can run one of the many built in examples straight from the command line, too! They contain a fully featured robot project, and some include simulation support. The projects can be found inside the photonlib-java-examples and photonlib-cpp-examples subdirectories, respectively. The projects currently available include:
You can run one of the many built-in examples straight from the command line, too! Each contains a fully featured robot project, and some include simulation support. The projects can be found inside the photonlib-*-examples subdirectories for each language.

- photonlib-java-examples:
- aimandrange:simulateJava
- aimattarget:simulateJava
- getinrange:simulateJava
- simaimandrange:simulateJava
- simposeest:simulateJava
- photonlib-cpp-examples:
- aimandrange:simulateNative
- getinrange:simulateNative
#### Running C++/Java

To run them, use the commands listed below. PhotonLib must first be published to your local maven repository, then the copy PhotonLib task will copy the generated vendordep json file into each example. After that, the simulateJava/simulateNative task can be used like a normal robot project. Robot simulation with attached debugger is technically possible by using simulateExternalJava and modifying the launch script it exports, though unsupported.
PhotonLib must first be published to your local Maven repository; then the copyPhotonlib task will copy the generated vendordep JSON file into each example. After that, the simulateJava/simulateNative task can be used like a normal robot project. Robot simulation with an attached debugger is technically possible by using simulateExternalJava and modifying the launch script it exports, though not yet supported.

```
~/photonvision$ ./gradlew publishToMavenLocal
@@ -268,3 +260,27 @@ To run them, use the commands listed below. PhotonLib must first be published to
~/photonvision/photonlib-cpp-examples$ ./gradlew copyPhotonlib
~/photonvision/photonlib-cpp-examples$ ./gradlew <example-name>:simulateNative
```

#### Running Python

PhotonLibPy must first be built into a wheel.

```
> cd photon-lib/py
> buildAndTest.bat
```

Then, you must enable using the development wheels. robotpy will use pip behind the scenes, and the bat file below tells pip about your development artifacts.

Note: This is best done in a virtual environment.

```
> enableUsingDevBuilds.bat
```

Then, run the examples:

```
> cd photonlib-python-examples
> run.bat <example name>
```
2 changes: 1 addition & 1 deletion docs/source/docs/description.md
@@ -3,7 +3,7 @@
## Description

PhotonVision is a free, fast, and easy-to-use vision processing solution for the *FIRST* Robotics Competition. PhotonVision is designed to get vision working on your robot *quickly*, without the significant cost of other similar solutions.
Using PhotonVision, teams can go from setting up a camera and coprocessor to detecting and tracking targets by simply tuning sliders. With an easy to use interface, comprehensive documentation, and a feature rich vendor dependency, no experience is necessary to use PhotonVision. No matter your resources, using PhotonVision is easy compared to its alternatives.
Using PhotonVision, teams can go from setting up a camera and coprocessor to detecting and tracking AprilTags and other targets by simply tuning sliders. With an easy to use interface, comprehensive documentation, and a feature rich vendor dependency, no experience is necessary to use PhotonVision. No matter your resources, using PhotonVision is easy compared to its alternatives.

## Advantages

37 changes: 24 additions & 13 deletions docs/source/docs/examples/aimandrange.md
@@ -4,36 +4,47 @@ The following example is from the PhotonLib example repository ([Java](https://g

## Knowledge and Equipment Needed

- Everything required in {ref}`Aiming at a Target <docs/examples/aimingatatarget:Knowledge and Equipment Needed>` and {ref}`Getting in Range of the Target <docs/examples/gettinginrangeofthetarget:Knowledge and Equipment Needed>`.
- Everything required in {ref}`Aiming at a Target <docs/examples/aimingatatarget:Knowledge and Equipment Needed>`.

## Code

Now that you know how to both aim and get in range of the target, it is time to combine them both at the same time. This example will take the previous two code examples and make them into one function using the same tools as before. With this example, you now have all the knowledge you need to use PhotonVision on your robot in any game.
Now that you know how to aim toward the AprilTag, let's also drive to the correct distance from the AprilTag.

To do this, we'll use the *pitch* of the target in the camera image and trigonometry to figure out how far away the robot is from the AprilTag. Then, like before, we'll use the P term of a PID controller to drive the robot to the correct distance.
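
As a rough sketch of that idea only — the camera name, mounting heights, and gains below are assumed placeholders, not the values from the full examples in the tabs below — the range estimate and the two P terms might look something like this:

```java
import edu.wpi.first.math.controller.PIDController;
import edu.wpi.first.math.util.Units;
import org.photonvision.PhotonCamera;
import org.photonvision.PhotonUtils;

class AimAndRangeSketch {
  // Placeholder measurements — take these from your own robot and field element.
  static final double CAMERA_HEIGHT_METERS = Units.inchesToMeters(24);
  static final double TARGET_HEIGHT_METERS = Units.inchesToMeters(57);
  static final double CAMERA_PITCH_RADIANS = Units.degreesToRadians(0);
  static final double GOAL_RANGE_METERS = Units.feetToMeters(3);

  PhotonCamera camera = new PhotonCamera("YOUR_CAMERA_NAME");
  PIDController forwardController = new PIDController(0.1, 0, 0); // P-only, gain assumed
  PIDController turnController = new PIDController(0.02, 0, 0);   // P-only, gain assumed

  /** Returns {forwardSpeed, rotationSpeed} for an arcade drive; zeros if no target is seen. */
  double[] calculate() {
    var result = camera.getLatestResult();
    if (!result.hasTargets()) {
      return new double[] {0.0, 0.0};
    }
    // Trigonometry: known heights + camera pitch + observed target pitch -> range to the tag.
    double rangeMeters = PhotonUtils.calculateDistanceToTargetMeters(
        CAMERA_HEIGHT_METERS,
        TARGET_HEIGHT_METERS,
        CAMERA_PITCH_RADIANS,
        Units.degreesToRadians(result.getBestTarget().getPitch()));
    // P terms: drive range toward the goal range and yaw toward zero.
    // Flip the signs if your drivetrain moves the wrong way.
    double forwardSpeed = -forwardController.calculate(rangeMeters, GOAL_RANGE_METERS);
    double rotationSpeed = -turnController.calculate(result.getBestTarget().getYaw(), 0.0);
    return new double[] {forwardSpeed, rotationSpeed};
  }
}
```

The full examples below wire this same idea into a complete robot project.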

```{eval-rst}
.. tab-set::

.. tab-item:: Java

.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-java-examples/aimandrange/src/main/java/frc/robot/Robot.java
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-java-examples/aimandrange/src/main/java/frc/robot/Robot.java
:language: java
:lines: 42-111
:lines: 84-131
:linenos:
:lineno-start: 42
:lineno-start: 84

.. tab-item:: C++ (Header)

.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-cpp-examples/aimandrange/src/main/include/Robot.h
:language: cpp
:lines: 27-71
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-cpp-examples/aimandrange/src/main/include/Robot.h
:language: c++
:lines: 25-63
:linenos:
:lineno-start: 27
:lineno-start: 25

.. tab-item:: C++ (Source)

.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-cpp-examples/aimandrange/src/main/cpp/Robot.cpp
:language: cpp
:lines: 25-67
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-cpp-examples/aimandrange/src/main/cpp/Robot.cpp
:language: c++
:lines: 58-107
:linenos:
:lineno-start: 25
:lineno-start: 58

.. tab-item:: Python

.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-python-examples/aimandrange/robot.py
:language: python
:lines: 44-98
:linenos:
:lineno-start: 44

```
42 changes: 26 additions & 16 deletions docs/source/docs/examples/aimingatatarget.md
@@ -1,45 +1,55 @@
# Aiming at a Target

The following example is from the PhotonLib example repository ([Java](https://github.com/PhotonVision/photonvision/tree/master/photonlib-java-examples/aimattarget)/[C++](https://github.com/PhotonVision/photonvision/tree/master/photonlib-cpp-examples/aimattarget)).
The following example is from the PhotonLib example repository ([Java](https://github.com/PhotonVision/photonvision/tree/master/photonlib-java-examples/aimattarget)).

## Knowledge and Equipment Needed

- Robot with a vision system running PhotonVision
- Target
- Ability to track a target by properly tuning a pipeline
- A robot
- A camera mounted rigidly to the robot's frame, centered and pointed forward.
- A coprocessor running PhotonVision with an AprilTag or Aruco 2D Pipeline.
- [A printout of Apriltag 7](https://firstfrc.blob.core.windows.net/frc2024/FieldAssets/Apriltag_Images_and_User_Guide.pdf), mounted on a rigid and flat surface.

## Code

Now that you have properly set up your vision system and have tuned a pipeline, you can now aim your robot/turret at the target using the data from PhotonVision. This data is reported over NetworkTables and includes: latency, whether there is a target detected or not, pitch, yaw, area, skew, and target pose relative to the robot. This data will be used/manipulated by our vendor dependency, PhotonLib. The documentation for the Network Tables API can be found {ref}`here <docs/additional-resources/nt-api:Getting Target Information>` and the documentation for PhotonLib {ref}`here <docs/programming/photonlib/adding-vendordep:What is PhotonLib?>`.
Now that you have properly set up your vision system and tuned a pipeline, you can aim your robot at an AprilTag using the data from PhotonVision. The *yaw* of the target is the critical piece of data that will be needed first.

For this simple example, only yaw is needed.
Yaw is reported to the roboRIO over Network Tables. PhotonLib, our vendor dependency, is the easiest way to access this data. The documentation for the Network Tables API can be found {ref}`here <docs/additional-resources/nt-api:Getting Target Information>` and the documentation for PhotonLib {ref}`here <docs/programming/photonlib/adding-vendordep:What is PhotonLib?>`.

In this example, while the operator holds a button down, the robot will turn towards the goal using the P term of a PID loop. To learn more about how PID loops work, how WPILib implements them, and more, visit [Advanced Controls (PID)](https://docs.wpilib.org/en/stable/docs/software/advanced-control/introduction/index.html) and [PID Control in WPILib](https://docs.wpilib.org/en/stable/docs/software/advanced-controls/controllers/pidcontroller.html#pid-control-in-wpilib).
In this example, while the operator holds a button down, the robot will turn towards the AprilTag using the P term of a PID loop. To learn more about how PID loops work, how WPILib implements them, and more, visit [Advanced Controls (PID)](https://docs.wpilib.org/en/stable/docs/software/advanced-control/introduction/index.html) and [PID Control in WPILib](https://docs.wpilib.org/en/stable/docs/software/advanced-controls/controllers/pidcontroller.html#pid-control-in-wpilib).
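
To make the shape of that loop concrete, here is a minimal sketch — the camera name and gain are illustrative assumptions, not the values used in the full example below:

```java
import edu.wpi.first.math.MathUtil;
import org.photonvision.PhotonCamera;

class AimAtTargetSketch {
  static final double ANGULAR_P = 0.02; // assumed proportional gain on yaw error

  // Placeholder camera name — use the name configured in the PhotonVision UI.
  PhotonCamera camera = new PhotonCamera("YOUR_CAMERA_NAME");

  /** Returns a turn command in [-1, 1]; zero unless the button is held and a tag is visible. */
  double rotationCommand(boolean aimButtonHeld) {
    if (!aimButtonHeld) {
      return 0.0;
    }
    var result = camera.getLatestResult();
    if (!result.hasTargets()) {
      return 0.0;
    }
    // Yaw is reported in degrees; a simple P controller turns until the yaw error is zero.
    // Flip the sign if your drivetrain turns away from the tag instead of toward it.
    double yawDegrees = result.getBestTarget().getYaw();
    return MathUtil.clamp(-ANGULAR_P * yawDegrees, -1.0, 1.0);
  }
}
```

The full example in the tabs below wires this same idea into a complete robot project.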

```{eval-rst}
.. tab-set::

.. tab-item:: Java

.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-java-examples/aimattarget/src/main/java/frc/robot/Robot.java
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-java-examples/aimattarget/src/main/java/frc/robot/Robot.java
:language: java
:lines: 41-98
:lines: 77-117
:linenos:
:lineno-start: 41
:lineno-start: 77

.. tab-item:: C++ (Header)

.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-cpp-examples/aimattarget/src/main/include/Robot.h
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-cpp-examples/aimattarget/src/main/include/Robot.h
:language: c++
:lines: 27-53
:lines: 25-60
:linenos:
:lineno-start: 27
:lineno-start: 25

.. tab-item:: C++ (Source)

.. rli:: https://raw.githubusercontent.com/PhotonVision/photonvision/ebef19af3d926cf87292177c9a16d01b71219306/photonlib-cpp-examples/aimattarget/src/main/cpp/Robot.cpp
.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-cpp-examples/aimattarget/src/main/cpp/Robot.cpp
:language: c++
:lines: 25-52
:lines: 56-96
:linenos:
:lineno-start: 25
:lineno-start: 56

.. tab-item:: Python

.. rli:: https://raw.githubusercontent.com/gerth2/photonvision/adb3098fbe0cdbc1a378c6d5a41126dd1d6d6955/photonlib-python-examples/aimattarget/robot.py
:language: python
:lines: 46-70
:linenos:
:lineno-start: 46

```
58 changes: 0 additions & 58 deletions docs/source/docs/examples/gettinginrangeofthetarget.md

This file was deleted.

Binary file added docs/source/docs/examples/images/poseest_demo.gif
4 changes: 1 addition & 3 deletions docs/source/docs/examples/index.md
@@ -4,8 +4,6 @@
:maxdepth: 1

aimingatatarget
gettinginrangeofthetarget
aimandrange
simaimandrange
simposeest
poseest
```