PyTorch: how to stack tensors from a for loop

Question:

I want to concatenate tensors generated in a for loop and get a 2-D tensor.
In standard Python I would do something like this:

li = []
for i in range(0, len(items)):
    # calc something
    li.append(calc_result)

In my case, each iteration of the for loop generates a tensor of shape torch.Size([768]), and I want to end up with a tensor of shape torch.Size([len(items), 768]).
How can I do this?

Asked By: rootpetit


Answers:

You can use torch.stack:

torch.stack(li, dim=0)

after the for loop will give you a torch.Tensor of that size.
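A minimal, self-contained sketch of this approach, using torch.randn as a stand-in for the per-iteration computation (the real loop body from the question is not shown, so the names here are illustrative):

```python
import torch

items = range(4)  # stand-in for the question's items

li = []
for i in items:
    calc_result = torch.randn(768)  # placeholder for "calc something"
    li.append(calc_result)

stacked = torch.stack(li, dim=0)
print(stacked.shape)  # torch.Size([4, 768])
```

torch.stack creates a new dimension at `dim`, so a list of 1-D `[768]` tensors becomes a single `[len(items), 768]` tensor.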

Note that if you know in advance the size of the final tensor, you can allocate an empty tensor beforehand and fill it in the for loop:

x = torch.empty(size=(len(items), 768))
for i in range(len(items)):
    # calc something producing a [768] tensor
    x[i] = calc_result

This is usually faster than stacking a list after the loop.
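The preallocation variant can be sketched like this, again with torch.randn standing in for the real computation:

```python
import torch

items = range(4)  # stand-in for the question's items

# Allocate the output buffer up front and fill it row by row.
x = torch.empty(size=(len(items), 768))
for i in range(len(items)):
    calc_result = torch.randn(768)  # placeholder computation
    x[i] = calc_result

print(x.shape)  # torch.Size([4, 768])
```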

Answered By: iacolippo

A note on the accepted answer: torch.stack does insert an additional dimension, but because each element of li is a 1-D tensor of shape [768], stacking along dim=0 already produces the desired shape [len(items), 768], not [1, len(items), 768].

torch.vstack is an equivalent alternative for this case (note that it takes no dim argument):

torch.vstack(li)

which also returns a tensor of shape [len(items), 768].
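A quick check on dummy [768] tensors (torch.randn stands in for the real per-iteration result) shows what each call returns for this input shape:

```python
import torch

li = [torch.randn(768) for _ in range(4)]

a = torch.stack(li, dim=0)  # new dim 0: [4, 768]
b = torch.vstack(li)        # 1-D inputs become rows: [4, 768]
print(a.shape, b.shape)  # torch.Size([4, 768]) torch.Size([4, 768])
```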

Answered By: James Hirschorn