[Doc] Fix broken references in Ray Tune documentation (#45233)
This PR fixes broken references found in the Tune documentation.

Signed-off-by: pdmurray <peynmurray@gmail.com>
peytondmurray authored Jun 11, 2024
1 parent 56565a3 commit 76f5cd4
Showing 8 changed files with 17 additions and 12 deletions.
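Most of the fixes below swap a fully spelled-out internal module path for a reference to the public re-export, using Sphinx's ``~`` shorthand, which renders only the final component of the target as link text. A minimal sketch of the recurring before/after pattern (targets taken from this diff):

```rst
.. Broken: the explicit target spells out an internal module path that
.. Sphinx cannot resolve against the documented public API.

:meth:`Tuner.restore() <ray.tune.tuner.Tuner.restore>`

.. Fixed: reference the public re-export; the ``~`` prefix renders only
.. the final component, so the link text is simply ``restore()``.

:meth:`~ray.tune.Tuner.restore`
```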
1 change: 1 addition & 0 deletions doc/source/tune/api/execution.rst
@@ -54,3 +54,4 @@ tune.run_experiments

run_experiments
Experiment
+TuneError
12 changes: 8 additions & 4 deletions doc/source/tune/api/logging.rst
@@ -59,8 +59,10 @@ See the :doc:`tutorial here </tune/examples/tune-mlflow>`.

.. autosummary::
:nosignatures:
+:toctree: doc/

-tune.logger.mlflow.MLflowLoggerCallback
+~air.integrations.mlflow.MLflowLoggerCallback
+~air.integrations.mlflow.setup_mlflow

Wandb Integration
-----------------
@@ -71,8 +73,10 @@ See the :doc:`tutorial here </tune/examples/tune-wandb>`.

.. autosummary::
:nosignatures:
+:toctree: doc/

-tune.logger.wandb.WandbLoggerCallback
+~air.integrations.wandb.WandbLoggerCallback
+~air.integrations.wandb.setup_wandb


Comet Integration
@@ -84,8 +88,9 @@ See the :doc:`tutorial here </tune/examples/tune-comet>`.

.. autosummary::
:nosignatures:
+:toctree: doc/

-tune.logger.comet.CometLoggerCallback
+~air.integrations.comet.CometLoggerCallback

Aim Integration
---------------
@@ -119,4 +124,3 @@ The non-relevant metrics (like timing stats) can be disabled on the left to show
relevant ones (like accuracy, loss, etc.).

.. image:: ../images/ray-tune-viskit.png
-
2 changes: 1 addition & 1 deletion doc/source/tune/examples/tune_analyze_results.ipynb
@@ -479,7 +479,7 @@
"id": "184bd3ee",
"metadata": {},
"source": [
-"The last reported metrics might not contain the best accuracy each trial achieved. If we want to get maximum accuracy that each trial reported throughout its training, we can do so by using {meth}`ResultGrid.get_dataframe <ray.tune.result_grid.ResultGrid.get_dataframe>` specifying a metric and mode used to filter each trial's training history."
+"The last reported metrics might not contain the best accuracy each trial achieved. If we want to get maximum accuracy that each trial reported throughout its training, we can do so by using {meth}`~ray.tune.ResultGrid.get_dataframe` specifying a metric and mode used to filter each trial's training history."
]
},
{
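For context on the reference being fixed: the notebook text describes passing a metric and mode to ``get_dataframe`` so that each trial contributes its best row rather than its last reported row. A rough standalone analogy of that filtering in pandas (not Tune's implementation; the column names and data are illustrative):

```python
import pandas as pd

# Flattened per-trial training history
# (columns are illustrative, not Tune's actual schema).
history = pd.DataFrame({
    "trial_id": ["a", "a", "a", "b", "b"],
    "step":     [1,   2,   3,   1,   2],
    "accuracy": [0.70, 0.90, 0.85, 0.60, 0.80],
})

# metric="accuracy", mode="max": for each trial, keep the row where
# accuracy peaked instead of the last row it reported.
best_rows = history.loc[history.groupby("trial_id")["accuracy"].idxmax()]
print(best_rows)
```

Trial ``a`` is represented by its step-2 row (accuracy 0.90) rather than its final step-3 row, which is exactly the distinction the notebook passage draws.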
2 changes: 1 addition & 1 deletion doc/source/tune/faq.rst
@@ -634,7 +634,7 @@ You can configure this by setting the `RAY_CHDIR_TO_TRIAL_DIR=0` environment var
This explicitly tells Tune to not change the working directory
to the trial directory, giving access to paths relative to the original working directory.
One caveat is that the working directory is now shared between workers, so the
-:meth:`train.get_context().get_trial_dir() <ray.train.context.TrainContext.get_.get_trial_dir>`
+:meth:`train.get_context().get_trial_dir() <ray.train.context.TrainContext.get_trial_dir>`
API should be used to get the path for saving trial-specific outputs.

.. literalinclude:: doc_code/faq.py
2 changes: 1 addition & 1 deletion doc/source/tune/tutorials/tune-distributed.rst
@@ -237,7 +237,7 @@ even after failure.
Recovering From Failures
~~~~~~~~~~~~~~~~~~~~~~~~

-Tune automatically persists the progress of your entire experiment (a ``Tuner.fit()`` session), so if an experiment crashes or is otherwise cancelled, it can be resumed through :meth:`Tuner.restore() <ray.tune.tuner.Tuner.restore>`.
+Tune automatically persists the progress of your entire experiment (a ``Tuner.fit()`` session), so if an experiment crashes or is otherwise cancelled, it can be resumed through :meth:`~ray.tune.Tuner.restore`.

.. _tune-distributed-common:

4 changes: 2 additions & 2 deletions doc/source/tune/tutorials/tune-storage.rst
@@ -212,11 +212,11 @@ you can resume it any time starting from the experiment state saved in the cloud
There are a few options for restoring an experiment:
``resume_unfinished``, ``resume_errored`` and ``restart_errored``.
Please see the documentation of
-:meth:`Tuner.restore() <ray.tune.tuner.Tuner.restore>` for more details.
+:meth:`~ray.tune.Tuner.restore` for more details.


Advanced configuration
----------------------

See :ref:`Ray Train's section on advanced storage configuration <train-storage-advanced>`.
-All of the configurations also apply to Ray Tune.
+All of the configurations also apply to Ray Tune.
4 changes: 2 additions & 2 deletions python/ray/tune/experiment/trial.py
@@ -98,7 +98,7 @@ def validate(formats):
"""Validates formats.
Raises:
-ValueError if the format is unknown.
+ValueError: if the format is unknown.
"""
for i in range(len(formats)):
formats[i] = formats[i].strip().lower()
@@ -660,7 +660,7 @@ def update_resources(self, resources: Union[dict, PlacementGroupFactory]):
Should only be called when the trial is not running.
Raises:
-ValueError if trial status is running.
+ValueError: if trial status is running.
"""
if self.status is Trial.RUNNING:
raise ValueError("Cannot update resources while Trial is running.")
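The two ``Raises:`` fixes above follow Google docstring style, where the exception type must be followed by a colon (``ExceptionType: description``) for Sphinx's Napoleon extension to render the section correctly. A minimal self-contained sketch of the corrected pattern (the ``known`` format set here is illustrative, not Tune's actual validator logic):

```python
def validate(formats):
    """Validate and normalize a list of export formats.

    Raises:
        ValueError: If a format is unknown.
    """
    # Illustrative set only; Tune's real validate() checks its own
    # supported export formats.
    known = {"csv", "json"}
    cleaned = [f.strip().lower() for f in formats]
    for fmt in cleaned:
        if fmt not in known:
            raise ValueError(f"Unknown format: {fmt}")
    return cleaned
```

Without the colon, Napoleon treats the whole line as the exception type, producing a malformed "Raises" entry in the rendered API docs.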
2 changes: 1 addition & 1 deletion python/ray/tune/tuner.py
@@ -391,7 +391,7 @@ def fit(self) -> ResultGrid:
def get_results(self) -> ResultGrid:
"""Get results of a hyperparameter tuning run.
-This method returns the same results as :meth:`fit() <ray.tune.tuner.Tuner.fit>`
+This method returns the same results as :meth:`~ray.tune.Tuner.fit`
and can be used to retrieve the results after restoring a tuner without
calling ``fit()`` again.
