add docstring to DescrptSeA #1017

Merged · 2 commits · Aug 23, 2021
2 changes: 1 addition & 1 deletion deepmd/descriptor/loc_frame.py
@@ -292,7 +292,7 @@ def prod_force_virial(self,
natoms[i]: 2 <= i < Ntypes+2, number of type i atoms

Returns
------
-------
force
The force on atoms
virial
60 changes: 51 additions & 9 deletions deepmd/descriptor/se_a.py
@@ -16,23 +16,58 @@
from deepmd.utils.graph import load_graph_def, get_tensor_by_name_from_graph

class DescrptSeA ():
"""DeepPot-SE constructed from all information (both angular and radial) of
atomic configurations.

The embedding takes the distance between atoms as input.
r"""DeepPot-SE constructed from all information (both angular and radial) of
atomic configurations. The embedding takes the distance between atoms as input.

The descriptor :math:`\mathcal{D}^i \in \mathbb{R}^{M_1 \times M_2}` is given by [1]_

.. math::
\mathcal{D}^i = (\mathcal{G}^i)^T \mathcal{R}^i (\mathcal{R}^i)^T \mathcal{G}^i_<

where :math:`\mathcal{R}^i \in \mathbb{R}^{N \times 4}` is the coordinate
matrix, and each row of :math:`\mathcal{R}^i` can be constructed as follows

.. math::
(\mathcal{R}^i)_j = [
\begin{array}{cccc}
s(r_{ji}) & x_{ji} & y_{ji} & z_{ji}
\end{array}
]

where :math:`\mathbf{R}_{ji}=\mathbf{R}_j-\mathbf{R}_i = (x_{ji}, y_{ji}, z_{ji})` is
the relative coordinate and :math:`r_{ji}=\lVert \mathbf{R}_{ji} \rVert` is its norm.
The switching function :math:`s(r)` is defined as:

.. math::
s(r)=
\begin{cases}
\frac{1}{r}, & r<r_s \\
\frac{1}{r} \{ {(\frac{r - r_s}{ r_c - r_s})}^3 (-6 {(\frac{r - r_s}{ r_c - r_s})}^2 +15 \frac{r - r_s}{ r_c - r_s} -10) +1 \}, & r_s \leq r<r_c \\
0, & r \geq r_c
\end{cases}
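
For intuition, the switching function above can be sketched in NumPy. This is a hypothetical standalone helper, not part of the deepmd API; `r_s` and `r_c` correspond to the `rcut_smth` and `rcut` parameters documented below:

```python
import numpy as np

def switch_fn(r, r_s, r_c):
    """Sketch of the switching function s(r): 1/r for r < r_s, a smooth
    fifth-order polynomial decay on [r_s, r_c), and exactly zero for r >= r_c."""
    r = np.asarray(r, dtype=float)
    u = (r - r_s) / (r_c - r_s)                    # normalized position in the window
    poly = (u**3 * (-6 * u**2 + 15 * u - 10) + 1) / r
    return np.where(r < r_s, 1.0 / r, np.where(r < r_c, poly, 0.0))
```

At `r = r_s` the polynomial factor equals 1 and at `r = r_c` it vanishes, so `s(r)` joins both branches continuously.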

Each row of the embedding matrix :math:`\mathcal{G}^i \in \mathbb{R}^{N \times M_1}` consists of outputs
of an embedding network :math:`\mathcal{N}` evaluated at :math:`s(r_{ji})`:

.. math::
(\mathcal{G}^i)_j = \mathcal{N}(s(r_{ji}))

:math:`\mathcal{G}^i_< \in \mathbb{R}^{N \times M_2}` takes the first :math:`M_2` columns of
:math:`\mathcal{G}^i`. The equation of the embedding network :math:`\mathcal{N}` can be found at
:meth:`deepmd.utils.network.embedding_net`.
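
Put together, the descriptor for one atom reduces to plain matrix products. The sketch below is a hypothetical illustration only; it assumes the coordinate matrix `R` and embedding matrix `G` have already been computed:

```python
import numpy as np

def descriptor(R, G, M2):
    """Assemble D^i = (G^i)^T R^i (R^i)^T G^i_< for a single atom i.

    R  : (N, 4)  coordinate matrix, row j is [s(r_ji), x_ji, y_ji, z_ji]
    G  : (N, M1) embedding matrix, row j is N(s(r_ji))
    M2 : number of leading columns of G kept as G^i_<
    """
    G_lt = G[:, :M2]             # G^i_<: first M2 columns of G^i
    return G.T @ R @ R.T @ G_lt  # shape (M1, M2)
```

The result has shape `(M1, M2)`, matching :math:`\mathcal{D}^i \in \mathbb{R}^{M_1 \times M_2}`.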

Parameters
----------
rcut
The cut-off radius
The cut-off radius :math:`r_c`
rcut_smth
From where the environment matrix should be smoothed
From where the environment matrix should be smoothed :math:`r_s`
sel : list[str]
sel[i] specifies the maximum number of type i atoms in the cut-off radius
neuron : list[int]
Number of neurons in each hidden layers of the embedding net
Number of neurons in each hidden layer of the embedding net :math:`\mathcal{N}`
axis_neuron
Number of the axis neuron (number of columns of the sub-matrix of the embedding matrix)
Number of the axis neuron :math:`M_2` (number of columns of the sub-matrix of the embedding matrix)
resnet_dt
Time-step `dt` in the resnet construction:
y = x + dt * \phi (Wx + b)
@@ -53,6 +88,13 @@ class DescrptSeA ():
The precision of the embedding net parameters. Supported options are {1}
uniform_seed
Only for the purpose of backward compatibility, retrieves the old behavior of using the random seed

References
----------
.. [1] Linfeng Zhang, Jiequn Han, Han Wang, Wissam A. Saidi, Roberto Car, and E. Weinan. 2018.
End-to-end symmetry preserving inter-atomic potential energy model for finite and extended
systems. In Proceedings of the 32nd International Conference on Neural Information Processing
Systems (NIPS'18). Curran Associates Inc., Red Hook, NY, USA, 4441–4451.
"""
@docstring_parameter(list_to_doc(ACTIVATION_FN_DICT.keys()), list_to_doc(PRECISION_DICT.keys()))
def __init__ (self,
@@ -488,7 +530,7 @@ def prod_force_virial(self,
natoms[i]: 2 <= i < Ntypes+2, number of type i atoms

Returns
------
-------
force
The force on atoms
virial
2 changes: 1 addition & 1 deletion deepmd/descriptor/se_a_ef.py
@@ -253,7 +253,7 @@ def prod_force_virial(self,
natoms[i]: 2 <= i < Ntypes+2, number of type i atoms

Returns
------
-------
force
The force on atoms
virial
2 changes: 1 addition & 1 deletion deepmd/descriptor/se_r.py
@@ -343,7 +343,7 @@ def prod_force_virial(self,
natoms[i]: 2 <= i < Ntypes+2, number of type i atoms

Returns
------
-------
force
The force on atoms
virial
2 changes: 1 addition & 1 deletion deepmd/descriptor/se_t.py
@@ -341,7 +341,7 @@ def prod_force_virial(self,
natoms[i]: 2 <= i < Ntypes+2, number of type i atoms

Returns
------
-------
force
The force on atoms
virial
4 changes: 2 additions & 2 deletions deepmd/infer/deep_eval.py
@@ -50,12 +50,12 @@ def model_type(self) -> str:

@property
def model_version(self) -> str:
"""Get type of model.
"""Get version of model.

Returns
-------
str
type of model
version of model
"""
if not self._model_version:
try:
39 changes: 35 additions & 4 deletions deepmd/utils/network.py
@@ -93,17 +93,42 @@ def embedding_net(xx,
seed = None,
trainable = True,
uniform_seed = False):
"""
r"""The embedding network.

The embedding network function :math:`\mathcal{N}` is the composition of
multiple layers :math:`\mathcal{L}^{(i)}`:

.. math::
\mathcal{N} = \mathcal{L}^{(n)} \circ \mathcal{L}^{(n-1)}
\circ \cdots \circ \mathcal{L}^{(1)}

A layer :math:`\mathcal{L}` is given by one of the following forms,
depending on the number of nodes: [1]_

.. math::
\mathbf{y}=\mathcal{L}(\mathbf{x};\mathbf{w},\mathbf{b})=
\begin{cases}
\boldsymbol{\phi}(\mathbf{x}^T\mathbf{w}+\mathbf{b}) + \mathbf{x}, & N_2=N_1 \\
\boldsymbol{\phi}(\mathbf{x}^T\mathbf{w}+\mathbf{b}) + (\mathbf{x}, \mathbf{x}), & N_2 = 2N_1\\
\boldsymbol{\phi}(\mathbf{x}^T\mathbf{w}+\mathbf{b}), & \text{otherwise} \\
\end{cases}

where :math:`\mathbf{x} \in \mathbb{R}^{N_1}` is the input vector and :math:`\mathbf{y} \in \mathbb{R}^{N_2}`
is the output vector. :math:`\mathbf{w} \in \mathbb{R}^{N_1 \times N_2}` and
:math:`\mathbf{b} \in \mathbb{R}^{N_2}` are weights and biases, respectively,
both of which are trainable if `trainable` is `True`. :math:`\boldsymbol{\phi}`
is the activation function.
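
A single layer with the residual shortcuts above can be sketched as follows. This is a hypothetical standalone version for illustration; the real implementation lives in `deepmd.utils.network`:

```python
import numpy as np

def res_layer(x, w, b, phi=np.tanh):
    """One embedding-net layer: y = phi(x^T w + b), plus a shortcut
    when the output width equals or doubles the input width."""
    n1, n2 = w.shape
    y = phi(x @ w + b)
    if n2 == n1:
        return y + x                       # identity shortcut
    if n2 == 2 * n1:
        return y + np.concatenate([x, x])  # duplicated shortcut (x, x)
    return y                               # plain layer otherwise
```

With the `resnet_dt` option, the non-shortcut term in the residual cases is additionally scaled by a trainable time step `dt`, as described for the `resnet_dt` parameter.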

Parameters
----------
xx : Tensor
Input tensor of shape [-1,1]
Input tensor :math:`\mathbf{x}` of shape [-1,1]
network_size: list of int
Size of the embedding network. For example [16,32,64]
precision:
Precision of network weights. For example, tf.float64
activation_fn:
Activation function
Activation function :math:`\boldsymbol{\phi}`
resnet_dt: boolean
Using time-step in the ResNet construction
name_suffix: str
@@ -115,7 +140,13 @@ def embedding_net(xx,
seed: int
Random seed for initializing network parameters
trainable: boolean
If the netowk is trainable
If the network is trainable

References
----------
.. [1] Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Identity mappings
in deep residual networks. In Computer Vision – ECCV 2016, pages 630–645. Springer
International Publishing, 2016.
"""
input_shape = xx.get_shape().as_list()
outputs_size = [input_shape[1]] + network_size