Why does nn.Conv1d work on 2d feature [b, c, h, w]?

Question:

I am wondering why conv1d works on a 2D feature map of shape (batch, channel, height, width).

An nn.Conv1d(channel, channel, kernel_size=(1,1)) works when I feed it a 2D feature map, but it gives a different result from nn.Conv2d(channel, channel, kernel_size=1).

I want to know why conv1d works here and what a 2D kernel size means in a 1D convolution.
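For reference, a minimal sketch of what I am doing (the channel count and input shape here are arbitrary, chosen just for illustration):

import torch
from torch import nn

b, c, h, w = 2, 4, 8, 8
x = torch.rand(b, c, h, w)

conv1d = nn.Conv1d(c, c, kernel_size=(1, 1))  # a 2-tuple kernel in a "1D" conv
conv2d = nn.Conv2d(c, c, kernel_size=1)

print(conv1d(x).shape)  # torch.Size([2, 4, 8, 8])
print(conv2d(x).shape)  # torch.Size([2, 4, 8, 8])
# both run, but the outputs differ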

Asked By: Joanna Hong


Answers:

"I want to know why conv1d works and what it mean by 2d kernel size in 1d convolution"

There’s no reason for it not to work. Under the hood, all a "convolution" really means is a dot product, and that can be taken between a matrix and a vector, two matrices, two vectors, and so on. Simply put, the real distinction between 1D and 2D convolution is the freedom the kernel has to move along the spatial dimensions of the input. A 1D convolution can move along one direction only: the temporal dimension of the input (note that the kernel itself can be a vector, a matrix, whatever; that doesn’t matter here). A 2D convolution, on the other hand, is free to move along two dimensions of the input, height and width, i.e. the spatial dimensions. If it still seems confusing, have a look at the gifs below.

1D Convolution in action:

Note: it’s a 1D convolution with a 3x3 kernel; notice how it only moves down the input, i.e. along the temporal dimension.
[gif: 1D convolution]

2D Convolution in action:

Note: it’s a 2D convolution with a 3x3 kernel; notice how it moves along both the width and the height of the input, i.e. the spatial dimensions.
[gif: 2D convolution]

I think it’s now clear what the actual difference between 1D and 2D convolution is, and why they would produce different results for the same input.
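For the standard case (an int kernel size, or a tuple whose length matches the layer), a quick shape check illustrates this; the channel counts and sizes below are just for illustration:

import torch
from torch import nn

x1 = torch.rand(1, 3, 10)      # (batch, channels, length)
x2 = torch.rand(1, 3, 10, 10)  # (batch, channels, height, width)

# the 1D kernel slides along the single length dimension
print(nn.Conv1d(3, 8, kernel_size=3)(x1).shape)  # torch.Size([1, 8, 8])
# the 2D kernel slides along both height and width
print(nn.Conv2d(3, 8, kernel_size=3)(x2).shape)  # torch.Size([1, 8, 8, 8])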

Answered By: Khalid Saifullah

The currently accepted answer is incorrect, so I am writing this one.

In the example the asker gives, the two convolutions are the same, up to the random initialization of their parameters.
This is because both use the same underlying implementation and just pass different parameters, such as the kernel size. nn.Conv1d, nn.Conv2d and nn.Conv3d only interpret their arguments differently, e.g. kernel_size=3 becomes (3, 3) for nn.Conv2d but (3,) for nn.Conv1d.
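You can see this interpretation directly in the resulting weight shapes (the layer sizes here are just for illustration):

from torch import nn

print(nn.Conv1d(1, 1, 3).weight.shape)  # torch.Size([1, 1, 3])
print(nn.Conv2d(1, 1, 3).weight.shape)  # torch.Size([1, 1, 3, 3])
print(nn.Conv3d(1, 1, 3).weight.shape)  # torch.Size([1, 1, 3, 3, 3])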

However, you can force these parameters into the shape another layer type would use.
Note that stride and dilation then need to be specified explicitly in some of the instances below:

import torch
from torch import nn

# 1D input: forcing 1-tuple kernel/stride/dilation makes all three layers behave as 1D convs
conv1d = nn.Conv1d(1, 1, 3, padding='same', bias=False)
conv2d = nn.Conv2d(1, 1, (3,), stride=(1,), dilation=(1,), padding='same', bias=False)
conv3d = nn.Conv3d(1, 1, (3,), stride=(1,), dilation=(1,), padding='same', bias=False)
# identical all-ones weights, so the outputs can be compared exactly
conv1d.weight.data.fill_(1)
conv2d.weight.data.fill_(1)
conv3d.weight.data.fill_(1)
x = torch.rand(1, 1, 100)
assert (conv1d(x) == conv2d(x)).all() and (conv1d(x) == conv3d(x)).all()

# 2D input: forcing 2-tuple parameters makes all three layers behave as 2D convs
conv1d = nn.Conv1d(1, 1, (3,3), padding='same', bias=False)
conv2d = nn.Conv2d(1, 1, 3, padding='same', bias=False)
conv3d = nn.Conv3d(1, 1, (3,3), stride=(1,1), dilation=(1,1), padding='same', bias=False)
conv1d.weight.data.fill_(1)
conv2d.weight.data.fill_(1)
conv3d.weight.data.fill_(1)
x = torch.rand(1, 1, 100, 100)
assert (conv1d(x) == conv2d(x)).all() and (conv1d(x) == conv3d(x)).all()

# 3D input: forcing 3-tuple parameters makes all three layers behave as 3D convs
conv1d = nn.Conv1d(1, 1, (3,3,3), stride=(1,1,1), dilation=(1,1,1), padding='same', bias=False)
conv2d = nn.Conv2d(1, 1, (3,3,3), stride=(1,1,1), dilation=(1,1,1), padding='same', bias=False)
conv3d = nn.Conv3d(1, 1, 3, padding='same', bias=False)
conv1d.weight.data.fill_(1)
conv2d.weight.data.fill_(1)
conv3d.weight.data.fill_(1)
x = torch.rand(1, 1, 100, 100, 100)
assert (conv1d(x) == conv2d(x)).all() and (conv1d(x) == conv3d(x)).all()

This equality would not hold if, as the currently accepted answer states, nn.Conv1d could "move along one direction only": both spatial dimensions are much larger than the kernel size, so nn.Conv1d could not have generated the full 100×100 output if it were locked to a single direction.
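If you want to convince yourself, the same trick as above gives a quick shape check: a "1D" convolution with a 2D kernel produces a full 2D output.

import torch
from torch import nn

conv1d = nn.Conv1d(1, 1, (3, 3), padding='same', bias=False)
print(conv1d(torch.rand(1, 1, 100, 100)).shape)  # torch.Size([1, 1, 100, 100])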

You can read more at https://discuss.pytorch.org/t/conv1d-kernel-size-explained/84323/4, as pointed out by @trialNerror in a comment on the question.

Answered By: Yuval Sieradzki