
Does a Dense Layer in Keras Flatten the Input Tensor?



Understanding Keras's Dense Layer Behavior

In Keras, the Dense layer performs a dot product between its weights (kernel) and the input tensor. The documentation states that if the input has a rank greater than 2, it is flattened prior to this dot product. Contrary to that statement, however, the Dense layer is actually applied along the last axis of the input tensor only.

Let's clarify with an example. Assume a Dense layer with m units is applied to an input tensor of shape (n_dim1, n_dim2, ..., n_dimk). The output shape is (n_dim1, n_dim2, ..., n_dim(k-1), m): every dimension except the last is preserved, and the last dimension becomes m.
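As a quick check, the following minimal sketch (assuming TensorFlow 2.x with its bundled Keras; the shapes are purely illustrative) confirms that only the last axis is transformed:

<code class="python">import tensorflow as tf

# Batch of 4 samples, each with input shape (20, 5)
x = tf.random.normal((4, 20, 5))

# Dense layer with m = 10 units, applied along the last axis
layer = tf.keras.layers.Dense(10)
y = layer(x)

print(y.shape)  # (4, 20, 10) -- only the last dimension changes from 5 to 10</code>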

This observation implies that TimeDistributed(Dense(...)) and Dense(...) are functionally equivalent. A further consequence is that the layer's weights are shared across all axes except the last one, which has an interesting effect on the parameter count. Consider the following toy network:

<code class="python">from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(10, input_shape=(20, 5)))

model.summary()</code>

The model summary below shows only 60 trainable parameters:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 20, 10)            60        
=================================================================
Total params: 60
Trainable params: 60
Non-trainable params: 0
_________________________________________________________________

This occurs because each unit in the Dense layer connects to all five elements of each row of the input with the same weights. As a result, only 10 × 5 (kernel weights) + 10 (one bias per unit) = 60 parameters are required.
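To see the claimed equivalence with TimeDistributed in action, the sketch below (assuming the same Keras imports as the toy network above) wraps the layer in TimeDistributed and yields an identical output shape and parameter count:

<code class="python">from keras.models import Sequential
from keras.layers import Dense, TimeDistributed

td_model = Sequential()
td_model.add(TimeDistributed(Dense(10), input_shape=(20, 5)))

# Output shape (None, 20, 10) and Total params: 60, just like the plain Dense layer
td_model.summary()</code>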

To further illustrate this behavior, consider the following visual representation:

[Image of Dense layer application on an input with two or more dimensions in Keras]

In this image, the Dense layer (shown in red) is applied to a three-dimensional input tensor. The output tensor is also three-dimensional, with each column in the input tensor independently mapped to a column in the output tensor.
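The shared-weights behavior can also be verified numerically. The following minimal sketch (again assuming TensorFlow 2.x; the variable names are only illustrative) applies the layer's single kernel and bias to every row of the input manually and compares the result with the layer's own output:

<code class="python">import numpy as np
import tensorflow as tf

x = tf.random.normal((1, 20, 5))
layer = tf.keras.layers.Dense(10)
y = layer(x)

# The layer holds a single (5, 10) kernel and a single (10,) bias,
# which are broadcast over all 20 rows of the input.
kernel, bias = layer.get_weights()
manual = x.numpy() @ kernel + bias

print(np.allclose(y.numpy(), manual, atol=1e-5))  # True</code>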

