
In PyTorch, slicing creates a view, i.e. the data is not copied into a new tensor; the slice acts as an alias:

 b = a[3:10, 2:5]
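
For instance (a small check of the aliasing; the tensor shape and values here are made up for illustration, just large enough for the slice to be valid), writing through b is visible in a:

 import torch

 a = torch.zeros(12, 6)
 b = a[3:10, 2:5]
 b[0, 0] = 1.0   # write through the view
 a[3, 2]         # the original tensor sees the change
 >>> tensor(1.)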

My understanding is that this is not the case for indexed slicing, e.g.

 b = a[[1, 2, 3]][:, [5, 11]]

Is this correct?

And second: is there a module that mimics a view, i.e. internally holds the indexes but accesses the original tensor, acting as a sort of proxy?

Something like this, but more general:

class IXView:
    def __init__(self, ixs, ten):
        self.ixs = ixs   # row indexes that define the "view"
        self.ten = ten   # reference to the original tensor (no copy)

    def __getitem__(self, rows):
        # translate the requested rows through the stored indexes,
        # then index the original tensor at access time
        return self.ten[self.ixs[rows], :]
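
It could be used like this (a minimal usage sketch; the tensor shape and index values are made up for illustration):

 import torch

 a = torch.rand(20, 4)
 v = IXView(torch.tensor([1, 3, 5, 7]), a)
 v[0:2]   # reads rows 1 and 3 of a at access time; the proxy itself stores no data

Note that every access still pays the cost of an advanced-indexing copy; the proxy only avoids keeping one around permanently.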
Comment by Ivan (Feb 15 '23 at 09:21): This is not an operation that results in contiguous tensor data, so it's not possible.

1 Answer


You are correct that indexing a tensor with lists of indices (advanced indexing) does not create a view but rather a new copy in memory. An arbitrary selection of indices cannot, in general, be described by a fixed offset and strides into the original storage, so the result has to be materialized; in practice, operations whose output cannot be expressed as a strided view effectively call output.contiguous() under the hood. The one exception seems to be torch.view, which never copies and instead raises an error when the requested shape is incompatible with the tensor's layout. More on this here. You can see this for yourself by calling is_contiguous(), or <tensor>.storage().data_ptr() to inspect the memory address of the underlying storage.

import torch

a = torch.rand([10, 10, 10])
a.is_contiguous()
>>> True

a.storage().data_ptr()
>>> 93837543268480   # will be different for you

### Normal torch slicing
b = a[3:4, 2:8, 5:6]
b.is_contiguous()
>>> False

b.storage().data_ptr()
>>> 93837543268480  # same as for a, because b is a view of the data in a

### List (advanced) indexing of a tensor
c = a[[1, 2, 3], [2, 3, 4], :]
c.is_contiguous()
>>> True

c.storage().data_ptr()
>>> 93839531853056  # different from a: the data was copied
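
And to illustrate the torch.view exception mentioned above (a hedged sketch: Tensor.view never falls back to a copy and errors out instead; the exact error text may vary between versions):

### view never copies: it raises on incompatible layouts
d = a.transpose(0, 1)   # a non-contiguous view of a
d.is_contiguous()
>>> False

d.view(-1)
>>> RuntimeError: view size is not compatible with input tensor's size and stride ...

d.reshape(-1).is_contiguous()   # reshape falls back to a copy when a view is impossible
>>> True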