Struct GRUOptions#

Struct Documentation#

struct GRUOptions#

Options for the GRU module.

Example:

GRU model(GRUOptions(2, 4).num_layers(3).batch_first(false).bidirectional(true));
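
A fuller usage sketch for the module constructed above (the forward call returns the full output sequence and the final hidden state; the shapes in the comments follow from the options chosen here):

#include <torch/torch.h>
using namespace torch::nn;

GRU model(GRUOptions(2, 4).num_layers(3).batch_first(false).bidirectional(true));
auto input = torch::randn({5, 3, 2});        // (seq_len, batch, input_size)
auto [output, h_n] = model->forward(input);
// output: (seq_len, batch, 2 * hidden_size)     = (5, 3, 8)
// h_n:    (num_layers * 2, batch, hidden_size)  = (6, 3, 4)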

Public Functions

GRUOptions(int64_t input_size, int64_t hidden_size)#
inline auto input_size(const int64_t &new_input_size) -> decltype(*this)#

The number of expected features in the input x.

inline auto input_size(int64_t &&new_input_size) -> decltype(*this)#
inline const int64_t &input_size() const noexcept#
inline int64_t &input_size() noexcept#
inline auto hidden_size(const int64_t &new_hidden_size) -> decltype(*this)#

The number of features in the hidden state h.

inline auto hidden_size(int64_t &&new_hidden_size) -> decltype(*this)#
inline const int64_t &hidden_size() const noexcept#
inline int64_t &hidden_size() noexcept#
inline auto num_layers(const int64_t &new_num_layers) -> decltype(*this)#

Number of recurrent layers.

E.g., setting num_layers=2 would mean stacking two GRUs together to form a stacked GRU, with the second GRU taking in outputs of the first GRU and computing the final results. Default: 1

inline auto num_layers(int64_t &&new_num_layers) -> decltype(*this)#
inline const int64_t &num_layers() const noexcept#
inline int64_t &num_layers() noexcept#
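
A minimal sketch of the stacking behaviour (sizes here are arbitrary): the second layer consumes the output sequence of the first, and the final hidden state holds one slice per layer.

#include <torch/torch.h>
using namespace torch::nn;

GRU stacked(GRUOptions(/*input_size=*/10, /*hidden_size=*/20).num_layers(2));
auto x = torch::randn({7, 3, 10});           // (seq_len, batch, input_size)
auto [out, h_n] = stacked->forward(x);
// out: outputs of the last layer           -> (7, 3, 20)
// h_n: final hidden state of every layer   -> (num_layers, batch, hidden_size) = (2, 3, 20)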
inline auto bias(const bool &new_bias) -> decltype(*this)#

If false, then the layer does not use bias weights b_ih and b_hh.

Default: true

inline auto bias(bool &&new_bias) -> decltype(*this)#
inline const bool &bias() const noexcept#
inline bool &bias() noexcept#
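
A short sketch of the effect; the parameter names in the comments follow the usual per-layer suffix convention and should be treated as an assumption here.

#include <torch/torch.h>
using namespace torch::nn;

GRU no_bias(GRUOptions(/*input_size=*/10, /*hidden_size=*/20).bias(false));
// With bias(false) only the weight tensors (e.g. weight_ih_l0, weight_hh_l0) are
// registered; the default bias(true) would also register bias_ih_l0 and bias_hh_l0.
auto n_params = no_bias->named_parameters().size();   // 2 instead of 4 for a single layer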
inline auto batch_first(const bool &new_batch_first) -> decltype(*this)#

If true, then the input and output tensors are provided as (batch, seq, feature).

Default: false

inline auto batch_first(bool &&new_batch_first) -> decltype(*this)#
inline const bool &batch_first() const noexcept#
inline bool &batch_first() noexcept#
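
A sketch of the shape convention with batch_first(true); note that the final hidden state keeps its (num_layers * num_directions, batch, hidden_size) layout regardless of this option.

#include <torch/torch.h>
using namespace torch::nn;

GRU gru(GRUOptions(/*input_size=*/10, /*hidden_size=*/20).batch_first(true));
auto x = torch::randn({3, 5, 10});           // (batch, seq, feature)
auto [out, h_n] = gru->forward(x);
// out: (batch, seq, hidden_size)            = (3, 5, 20)
// h_n: (num_layers, batch, hidden_size)     = (1, 3, 20)  (not affected by batch_first)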
inline auto dropout(const double &new_dropout) -> decltype(*this)#

If non-zero, introduces a Dropout layer on the outputs of each GRU layer except the last layer, with dropout probability equal to dropout.

Default: 0

inline auto dropout(double &&new_dropout) -> decltype(*this)#
inline const double &dropout() const noexcept#
inline double &dropout() noexcept#
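
A sketch of combining dropout with stacking; dropout is applied between stacked layers only, so it has no effect with the default num_layers(1).

#include <torch/torch.h>
using namespace torch::nn;

GRU gru(GRUOptions(/*input_size=*/10, /*hidden_size=*/20).num_layers(2).dropout(0.2));
gru->train();   // dropout is active in training mode
gru->eval();    // and disabled in evaluation mode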
inline auto bidirectional(const bool &new_bidirectional) -> decltype(*this)#

If true, becomes a bidirectional GRU. Default: false

inline auto bidirectional(bool &&new_bidirectional) -> decltype(*this)#
inline const bool &bidirectional() const noexcept#
inline bool &bidirectional() noexcept#
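
A sketch of the bidirectional case: the forward and backward passes are concatenated along the feature dimension of the output, and the final hidden state gains one slice per direction.

#include <torch/torch.h>
using namespace torch::nn;

GRU bigru(GRUOptions(/*input_size=*/10, /*hidden_size=*/20).bidirectional(true));
auto x = torch::randn({5, 3, 10});           // (seq_len, batch, input_size)
auto [out, h_n] = bigru->forward(x);
// out: (seq_len, batch, 2 * hidden_size)    = (5, 3, 40)
// h_n: (num_layers * 2, batch, hidden_size) = (2, 3, 20)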