
Class GRUImpl#

Inheritance Relationships#

Base Type#

public torch::nn::detail::RNNImplBase<GRUImpl>

Class Documentation#

class GRUImpl : public torch::nn::detail::RNNImplBase<GRUImpl>#

A multi-layer gated recurrent unit (GRU) module.

See https://pytorch.org/docs/main/generated/torch.nn.GRU.html to learn about the exact behavior of this module.

See the documentation for torch::nn::GRUOptions class to learn what constructor arguments are supported for this module.

Example:

GRU model(GRUOptions(2, 4).num_layers(3).batch_first(false).bidirectional(true));

Public Functions

inline GRUImpl(int64_t input_size, int64_t hidden_size)#
explicit GRUImpl(const GRUOptions &options_)#
std::tuple<Tensor, Tensor> forward(const Tensor &input, Tensor hx = {})#
std::tuple<torch::nn::utils::rnn::PackedSequence, Tensor> forward_with_packed_input(const torch::nn::utils::rnn::PackedSequence &packed_input, Tensor hx = {})#

Public Members

GRUOptions options#

Protected Functions

inline virtual bool _forward_has_default_args() override#

The following three functions allow a module with default arguments in its forward method to be used in a Sequential module.

You should NEVER override these functions manually. Instead, you should use the FORWARD_HAS_DEFAULT_ARGS macro.

inline virtual unsigned int _forward_num_required_args() override#
inline std::vector<torch::nn::AnyValue> _forward_populate_default_args(std::vector<torch::nn::AnyValue> &&arguments) override#
std::tuple<Tensor, Tensor> forward_helper(const Tensor &input, const Tensor &batch_sizes, const Tensor &sorted_indices, int64_t max_batch_size, Tensor hx)#

Friends

friend struct torch::nn::AnyModuleHolder