[Docathon] Fix NO.1-NO.5 API label #58848

Merged · 3 commits · Nov 10, 2023
python/paddle/base/data_feed_desc.py (1 addition, 1 deletion)

@@ -93,7 +93,7 @@ def __init__(self, proto_file):
 
     def set_batch_size(self, batch_size):
         r"""
-        Set :attr:`batch_size` in :ref:`api_base_DataFeedDesc` . :attr:`batch_size` can be changed during training.
+        Set :attr:`batch_size` in ``paddle.base.DataFeedDesc`` . :attr:`batch_size` can be changed during training.
 
         Examples:
             .. code-block:: python
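
For context on the docstring touched above, a minimal sketch of how `set_batch_size` is used, assuming a `DataFeedDesc` protobuf text file named `data.proto` already exists (the file name and feed configuration are illustrative, not part of this PR):

```python
import paddle.base as base

# Illustrative only: assumes 'data.proto' is an existing DataFeedDesc
# protobuf text file describing the data source.
data_feed = base.DataFeedDesc('data.proto')

# batch_size is not fixed at construction time; it can be changed
# again later, including during training.
data_feed.set_batch_size(128)
```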
python/paddle/base/framework.py (1 addition, 1 deletion)

@@ -2112,7 +2112,7 @@ def dtype(self):
     @property
     def lod_level(self):
         """
-        Indicating ``LoD`` info of current Variable, please refer to :ref:`api_base_LoDTensor_en` to check the meaning
+        Indicating ``LoD`` info of current Variable, please refer to :ref:`api_paddle_Tensor` to check the meaning
         of ``LoD``
 
         **Notes**:
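
As a quick illustration of the `lod_level` property documented above, a minimal static-graph sketch (variable name and shape are illustrative):

```python
import paddle

paddle.enable_static()

# A static-graph Variable with one level of LoD (variable-length) nesting,
# e.g. a batch of sentences with different lengths.
x = paddle.static.data(name='x', shape=[-1, 1], dtype='int64', lod_level=1)
print(x.lod_level)  # 1
```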
python/paddle/distributed/fleet/layers/mpu/mp_layers.py (1 addition, 1 deletion)

@@ -52,7 +52,7 @@ class VocabParallelEmbedding(paddle.nn.Layer):
         num_embeddings(int): One element which indicate the size of the dictionary of embeddings.
         embedding_dim(int): One element which indicate the size of each embedding vector respectively.
         weight_attr(ParamAttr|None): To specify the weight parameter property. Default: None, which means the
-            default weight parameter property is used. See usage for details in :ref:`api_ParamAttr` . In addition,
+            default weight parameter property is used. See usage for details in :ref:`api_paddle_ParamAttr` . In addition,
             user-defined or pre-trained word vectors can be loaded with the :attr:`param_attr` parameter.
             The local word vector needs to be transformed into numpy format, and the shape of local word
             vector should be consistent with :attr:`num_embeddings` . Then :ref:`api_paddle_nn_initializer_Assign`
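
The docstring above describes loading user-defined or pre-trained word vectors through `weight_attr`. A minimal sketch of that step only (sizes are illustrative; constructing `VocabParallelEmbedding` itself additionally requires fleet model-parallel initialization, which is omitted here):

```python
import numpy as np
import paddle

# In practice these would be pre-trained word vectors loaded from disk;
# random values are used here so the sketch runs standalone.
# The array shape must match (num_embeddings, embedding_dim).
pretrained = np.random.rand(10000, 64).astype('float32')

weight_attr = paddle.ParamAttr(
    initializer=paddle.nn.initializer.Assign(pretrained),
    trainable=True,
)
# weight_attr can then be passed to the embedding layer's weight_attr argument.
```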
python/paddle/nn/layer/common.py (1 addition, 1 deletion)

@@ -1401,7 +1401,7 @@ class Embedding(Layer):
             such as :ref:`api_paddle_optimizer_adadelta_Adadelta` , :ref:`api_paddle_optimizer_adamax_Adamax` , :ref:`api_paddle_optimizer_lamb_Lamb`.
             In these case, sparse must be False. Default: False.
         weight_attr(ParamAttr, optional): To specify the weight parameter property. Default: None, which means the
-            default weight parameter property is used. See usage for details in :ref:`api_ParamAttr` . In addition,
+            default weight parameter property is used. See usage for details in :ref:`api_paddle_ParamAttr` . In addition,
             user-defined or pre-trained word vectors can be loaded with the :attr:`param_attr` parameter.
             The local word vector needs to be transformed into numpy format, and the shape of local word
             vector should be consistent with :attr:`num_embeddings` . Then :ref:`api_paddle_nn_initializer_Assign`
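
To illustrate the sparse/optimizer interaction mentioned in the docstring above, a minimal runnable sketch using `sparse=False` so the embedding can be updated by an optimizer such as Adadelta (sizes and data are illustrative):

```python
import paddle

# Dense-gradient embedding: sparse=False is required when using optimizers
# such as Adadelta, Adamax, or Lamb.
emb = paddle.nn.Embedding(num_embeddings=10, embedding_dim=4, sparse=False)

x = paddle.to_tensor([[0, 2], [5, 9]], dtype='int64')
out = emb(x)  # shape: [2, 2, 4]

opt = paddle.optimizer.Adadelta(learning_rate=0.01, parameters=emb.parameters())
out.mean().backward()
opt.step()
opt.clear_grad()
```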