
Releases: lasarobotics/FTCVision

Release 1.0

19 Apr 22:04

Now stable and ready for production! This is a giant release containing all updates requested by teams and more - just before worlds! It is strongly recommended to update now to ensure the best Vision code possible on your robot.

A few of the new features and improvements:

  • Significant improvements to FAST detection. FAST now analyzes beacon buttons, compensates for phone orientation, and includes optimized code for beacon detection and analysis. The FAST method now runs at a nominal 8 FPS.
  • New analysis method: REALTIME. REALTIME is the previous FAST method with significant improvements that make analysis at near-realtime rates possible! This method runs at a nominal 13-15 FPS, where 15 is the maximum achievable while still using preview frames (which use less processing power, so your phone doesn't shut off after a minute or two the way it can with full-sized frames).
  • New confidence algorithm for FAST and REALTIME methods. Previously, >50% confidence could be acceptable. Now, 90-99.99% confidence is generally an acceptable range.
  • Changes to ellipse detection behavior. Buttons, as well as beacon centers, are adjusted in all methods so that the X-axis is the axis parallel to the two beacon buttons and the Y-axis is perpendicular. This makes using the center and button data much easier.
  • Performance enhancements to detection, now caching some internal data to improve overall performance without cost to detection accuracy.
  • Improvements to the CameraTester app, which now displays additional debug info when beacon.enableDebug() is turned on. Debug mode is disabled by default, which increases analysis speed (a configuration sketch follows this list).
  • Better error handling: if you don't have OpenCV installed or something blows up, the OpenCV initialization routines will let you know via logcat and telemetry.
  • Fail-proof* OpenCV initializer. It probably* won't ever fail no matter how much you abuse the start/stop buttons, but it might force you to restart your app. Either way, FTCVision will let you know! 😃
  • Improved rotation extension that gives users increased control over the screen rotation (to ensure that Vision works in any orientation without additional setup).
  • Added a color sensitivity parameter to beacon detection for better color detection.
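For teams wiring these options up, here is a minimal sketch of what the configuration might look like in an opmode. Only beacon.enableDebug() is taken from the notes above; the import paths and the other method names (setAnalysisMethod, setColorToleranceRed/Blue, getAnalysis) are assumptions that may differ from the actual API, so check the updated samples for the exact calls.

```java
// Hedged sketch: only beacon.enableDebug() appears in these release notes; the imports
// and the other method names below are assumptions about the FTCVision API.
import org.lasarobotics.vision.ftc.resq.Beacon;       // assumed package path
import org.lasarobotics.vision.opmode.VisionOpMode;   // assumed package path

public class BeaconConfigOpMode extends VisionOpMode {
    @Override
    public void init() {
        super.init();

        // Pick the near-realtime analysis method (nominal 13-15 FPS on preview frames).
        beacon.setAnalysisMethod(Beacon.AnalysisMethod.REALTIME);   // assumed setter

        // Debug overlays are off by default; enabling them trades some analysis speed
        // for extra info in the CameraTester app.
        beacon.enableDebug();

        // Assumed setters for the new color sensitivity parameter.
        beacon.setColorToleranceRed(0.0);
        beacon.setColorToleranceBlue(0.0);
    }

    @Override
    public void loop() {
        super.loop();
        // With the new confidence algorithm, roughly 90-99.99% indicates a solid match.
        telemetry.addData("Beacon", beacon.getAnalysis().toString());   // assumed getter
    }
}
```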


Beta Release 2

25 Jan 23:10

MAJOR release containing a lot of improvements and bugfixes.

Please note that some changes between 0.8.x and 0.9.0 are breaking; that is, teams will need to reinstall ftc-visionlib as well as update any opmodes to the newest versions. See the new samples in ftc-robotcontroller for examples.

Here are the major updates:

  • Improved beacon detection significantly with the addition of the FAST and COMPLEX analysis methods. FAST is 1.2-3x faster, but lacks some features and only works well at close range (<1.5 feet). COMPLEX is slower and takes more battery power, but enables additional features and works well at a further range (1-4 feet).
  • Added Vision extensions, which simplify code development. All you have to do is use a VisionOpMode (or its linear / testable variant) and enable extensions such as the beacon analysis extension or the screen rotation compensation extension. Two are included out of the box (see the sketch after this list).
  • Added Linear OpMode support with the LinearVisionOpMode.
  • Added an easy way to test your opmodes with the TestableVisionOpMode. It's exactly the same as a VisionOpMode, but functions differently under the hood. This new opmode "spoofs" a robot controller while allowing you to view what Vision sees.
  • Added complete documentation for over 65% of the library.
  • Fixed several gaping bugs and addressed several teams' concerns.
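As a rough illustration of the extension mechanism, here is a hedged sketch of what an extension-enabled opmode might look like. The enableExtension() call and the Extensions constants are assumed names based on the description above, not confirmed API; refer to the new samples in ftc-robotcontroller for the exact usage.

```java
// Hedged sketch of the extension mechanism; enableExtension() and the Extensions
// constants are assumed names, not confirmed API.
import org.lasarobotics.vision.opmode.VisionOpMode;              // assumed package path
import org.lasarobotics.vision.opmode.extensions.Extensions;     // hypothetical class

public class ExtensionDemoOpMode extends VisionOpMode {
    @Override
    public void init() {
        super.init();
        // Enable the two extensions shipped out of the box:
        // beacon analysis and screen-rotation compensation.
        enableExtension(Extensions.BEACON);
        enableExtension(Extensions.ROTATION);
    }

    @Override
    public void loop() {
        super.loop();
        telemetry.addData("Beacon", beacon.getAnalysis().toString());   // assumed getter
    }
}
```

The same pattern should carry over to LinearVisionOpMode and TestableVisionOpMode: the lifecycle changes (a linear run method, or a spoofed robot controller for desk testing), but the extension calls stay the same.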

Beta Release 1

25 Nov 05:15

This is the beta release of FTCVision!

It is highly recommended that teams adopt vision during this beta. To assist, instructions are provided in the readme, and teams can post issues on GitHub with questions and bug reports.

Overall status:

  • Teams can detect the color of a beacon from 0.5-2.5 feet away.

Major changes in this version:

  • The robot controller now supports (and we recommend) using the stock FTC Robot Controller app. Only a few modifications are needed, as detailed in the readme.
  • VisionOpModes now allow programming vision code inside a standard OpMode.
  • Smart scoring algorithm significantly improves detection of the beacon. It needs a bit of tweaking when the beacon is placed at distances greater than 3 feet, but this is a beta after all. We're working on it. :)
  • Upgraded to OpenCV 3.0, allowing the app to run on virtually any Android device! This results in one minor bug with ellipse drawing, which will be fixed by the 1.0 release.
  • Vision extensions now allow users to simply call beaconColor.getSomething() in their OpMode to get the current beacon state (sketched after this list).
  • Overall performance improvements allow the app to run at 3x the FPS of the previous version.
  • Confidence intervals are now provided to assess the strength of a match. These are provided to the OpMode.
  • Color detection improved!
  • Ellipse and other primitive detection significantly improved! (It even works from 10 feet away!)
  • And more...
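To make the accessor pattern concrete, here is a hedged sketch of reading the beacon state inside a VisionOpMode loop. The getter names below (getStateLeft, getStateRight, getConfidence) are hypothetical stand-ins for whatever beaconColor.getSomething() resolves to in this version, so treat them as placeholders rather than real method names.

```java
// Hedged sketch: beaconColor comes from the Vision extension described above, but the
// getter names here are hypothetical placeholders, not confirmed API.
import org.lasarobotics.vision.opmode.VisionOpMode;   // assumed package path

public class BeaconReadoutOpMode extends VisionOpMode {
    @Override
    public void loop() {
        super.loop();
        // Report which color was detected on each half of the beacon.
        telemetry.addData("Left half",  beaconColor.getStateLeft().toString());
        telemetry.addData("Right half", beaconColor.getStateRight().toString());
        // Confidence interval from the smart scoring algorithm; higher means a stronger match.
        telemetry.addData("Confidence", beaconColor.getConfidence());
    }
}
```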

Coming soon:

  • Distance detection
  • QR Code scanning
  • Detection improvements, especially at distances from 3-8 feet.
  • And more...

Basic beacon detection and small object detection

12 Oct 23:32

New features:

  • Keypoint detection (used to locate small objects)
  • Color blob detection (used to locate colors on the beacon, as well as flags, etc.) - a minimal sketch of the idea follows this list
  • Beacon detection (currently bare-bones yet functional, and will be improved rapidly)
  • Camera testing app
  • All in JAVA! (So it's legal. No C++ required.)
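To illustrate the color blob idea (this is not FTCVision's internal code, just the general technique in plain OpenCV Java), a frame can be thresholded in HSV and the contours of the mask treated as blobs:

```java
// Illustration only: plain OpenCV Java, not FTCVision's internal implementation.
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

import java.util.ArrayList;
import java.util.List;

public class ColorBlobSketch {
    /** Returns contours of roughly-red regions in a BGR frame. */
    public static List<MatOfPoint> findRedBlobs(Mat bgrFrame) {
        // Convert to HSV so the color range is easier to express.
        Mat hsv = new Mat();
        Imgproc.cvtColor(bgrFrame, hsv, Imgproc.COLOR_BGR2HSV);

        // Keep pixels in a low-hue red band; a real detector would also handle the
        // hue wrap-around near 180 and tune the saturation/value thresholds.
        Mat mask = new Mat();
        Core.inRange(hsv, new Scalar(0, 100, 100), new Scalar(10, 255, 255), mask);

        // Each external contour of the mask approximates one colored blob.
        List<MatOfPoint> contours = new ArrayList<>();
        Imgproc.findContours(mask, contours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);
        return contours;
    }
}
```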

The following features are still in progress and may not be usable:

  • Implementation for robot controller (the modified robot controller app, with live camera view, has not been tested)
  • Autonomous package (so that the robot controller can access the code without a live camera view)

If you wish to test the library, you will need to install the OpenCV Manager app from the Google Play Store.
And, as always, if you wish to contribute, just send smo-key a note.