diff --git a/content/project/2021-hake-ML/featured.jpg b/content/project/2021-hake-ML/featured.jpg
deleted file mode 100644
index fa8f9f9f2da..00000000000
Binary files a/content/project/2021-hake-ML/featured.jpg and /dev/null differ
diff --git a/content/project/2021-hake-ML/featured.png b/content/project/2021-hake-ML/featured.png
new file mode 100644
index 00000000000..ca85cfff180
Binary files /dev/null and b/content/project/2021-hake-ML/featured.png differ
diff --git a/content/project/2021-hake-ML/index.md b/content/project/2021-hake-ML/index.md
index 30baa54ab0e..7930b7e924f 100644
--- a/content/project/2021-hake-ML/index.md
+++ b/content/project/2021-hake-ML/index.md
@@ -34,9 +34,9 @@ image:
Active acoustic data collected with scientific echosounders during acoustic surveys have become essential for stock assessment in fisheries resources management. Over the past three decades, a wide suite of physics-based acoustic scattering models has been developed to translate acoustic observations into biological quantities, such as biomass and abundance, for different marine organisms. In parallel, quantitative scientific echosounders have progressed from specialized equipment to one of the standard instruments on fisheries survey vessels.
-
+To take full advantage of these large and complex new datasets, in this project we aim to combine the development of machine learning methodology with a cloud-based workflow to accelerate the extraction of biological information from fisheries acoustic data. Our group has developed and uses [Echopype](https://echopype.readthedocs.io/en/stable/), a Python package for parsing raw sonar backscatter data, and [Echoregions](https://echoregions.readthedocs.io/en/latest/), a Python package for parsing Echoview annotation data. Converting data from proprietary echosounder formats and Echoview annotations into Python data products enables seamless integration with the rich ecosystem of open-source scientific computing tools, which in turn lets us train deep learning models to predict regions of interest in echograms (as sketched below).
-To take full advantage of these large and complex new datasets, in this project we aim to combine the development of machine learning methodology with a cloud-based workflow to accelerate the extraction of biological information from fisheries acoustic data. Our group has developed and used [Echopype](https://echopype.readthedocs.io/en/stable/), a Raw Sonar Backscatter data parsing software, and [Echoregions](https://echoregions.readthedocs.io/en/latest/), an Echoview annotation data parsing software, to drive the workflow of this project.
+
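+As a rough sketch of this workflow (the file names and the EK60 sonar model below are placeholders, not references to this project's actual data), Echopype parses a raw echosounder file into a standardized dataset and computes volume backscattering strength for echograms, while Echoregions reads the matching Echoview region annotations that can serve as labels:
+
+```python
+import echopype as ep
+import echoregions as er
+
+# Parse a raw echosounder file into a standardized EchoData object
+# ("hake_survey.raw" and the EK60 sonar model are illustrative only)
+ed = ep.open_raw("hake_survey.raw", sonar_model="EK60")
+
+# Compute volume backscattering strength (Sv), from which echograms are formed
+ds_Sv = ep.calibrate.compute_Sv(ed)
+
+# Read Echoview region annotations (.evr) that mark regions of interest
+# (file name is again a placeholder)
+regions = er.read_evr("hake_regions.evr")
+```
+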
This project is in close collaboration with the [Fisheries Engineering and Acoustics Technology (FEAT) team](https://www.fisheries.noaa.gov/west-coast/sustainable-fisheries/fisheries-engineering-and-acoustic-technologies-team) at the NOAA Fisheries [Northwest Fisheries Science Center (NWFSC)](https://www.fisheries.noaa.gov/about/northwest-fisheries-science-center) and uses data collected over the past 20 years off the west coast of the U.S. from the [Joint U.S.-Canada Integrated Ecosystem and Pacific Hake Acoustic-Trawl Survey](https://www.fisheries.noaa.gov/west-coast/science-data/joint-us-canada-integrated-ecosystem-and-pacific-hake-acoustic-trawl-survey) (aka the "Hake survey").