
How do I fill a pytorch tensor with a non-scalar value?

For example, let's say I want to fill a pytorch tensor X of shape (n_samples, n_classes) with a 1D pytorch vector a of shape (n_classes,). Ideally, I'd like to be able to write:

X = torch.full((n_samples, n_classes), a)

where the vector a is the fill_value in torch.full. However, torch.full only accepts a scalar as the fill_value (see the torch.full documentation), so this code won't work.

I have two questions:

  1. If I can't use torch.full, what is a fast way to fill X with n_samples copies of a?
  2. Why does torch.full only accept scalar fill values? Is there a good reason for why the torch.full implementation cannot accept tensor fill values?

Regarding question 1., I am thinking about simply writing:

X = torch.ones((n_samples, n_classes)) * a

However, is there a faster/more efficient way to do this?
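As a quick sketch (my own comparison, with placeholder sizes), here are a few equivalent ways to build X, from "allocates and multiplies" to "no copy at all":

```python
import torch

n_samples, n_classes = 4, 5
a = torch.arange(n_classes, dtype=torch.float32)

# 1. Broadcasting: allocates a ones tensor, then writes the product.
X_mul = torch.ones((n_samples, n_classes)) * a

# 2. repeat(): allocates once and copies a's data n_samples times.
X_rep = a.repeat(n_samples, 1)

# 3. expand(): returns a *view* with stride 0 along dim 0 -- no data is copied.
X_view = a.expand(n_samples, n_classes)

# All three hold the same values.
assert torch.equal(X_mul, X_rep)
assert torch.equal(X_mul, X_view)
```

If you only ever read from X, option 3 avoids the copy entirely; if you need an independent writable tensor, option 2 (or `.expand(...).clone()`) is the direct way to materialize it.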

For reference, I've already checked out several related Stack Overflow posts, but none of them directly answers my question.

Thanks!

E. Turok
    have you tried `X = a.expand(n_samples, -1).clone()` ? – Phoenix Apr 09 '23 at 21:12
    +1 for a solution using `.expand()` which provides a tensor view rather than a new tensor with copied data, if your problem is of significant size `expand` will be much more memory-efficient – DerekG Apr 10 '23 at 13:29
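To illustrate the point made in these comments (a small sketch of my own): `expand` produces a zero-copy view sharing a's storage, and `.clone()` is what materializes an independent, writable copy.

```python
import torch

a = torch.arange(5)
X = a.expand(3, -1)  # view over a: stride along dim 0 is 0, no data copied
assert X.stride() == (0, 1)
assert X.data_ptr() == a.data_ptr()  # same underlying storage as a

# clone() materializes an independent tensor that is safe to write to.
X_copy = X.clone()
X_copy[0, 0] = 99
assert a[0].item() == 0  # the original vector is untouched
```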

1 Answer


I think torch.tile is what you want. The code would be

X = torch.tile(a, (n_samples, 1))

Example

>>> a = torch.arange(5)
>>> a
tensor([0, 1, 2, 3, 4])
>>> n_samples = 3
>>> X = torch.tile(a, (n_samples, 1))
>>> X
tensor([[0, 1, 2, 3, 4],
        [0, 1, 2, 3, 4],
        [0, 1, 2, 3, 4]])
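One caveat worth noting (my addition, not part of the original answer): unlike `expand`, `tile` allocates new storage and copies the data, so the result is independent of `a` and safe to modify in place.

```python
import torch

a = torch.arange(5)
X = torch.tile(a, (3, 1))

# tile copies: X has its own storage, separate from a.
assert X.data_ptr() != a.data_ptr()

# So writing to X does not affect a.
X[0, 0] = 99
assert a[0].item() == 0
```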
vluzko