- 2020-07-25 03:20
- Tensor
- pytorch
- machine learning
- Python

Concatenating tensors: torch.cat(), torch.stack()

* torch.cat(tensors, dim=0) → Tensor

Concatenates the given sequence of tensors along the given dimension.

For example:

>>> import torch
>>> x = torch.randn(2, 3)
>>> x
tensor([[-0.1997, -0.6900,  0.7039],
        [ 0.0268, -1.0140, -2.9764]])
>>> torch.cat((x, x, x), 0)  # concatenate along dimension 0 (vertically)
tensor([[-0.1997, -0.6900,  0.7039],
        [ 0.0268, -1.0140, -2.9764],
        [-0.1997, -0.6900,  0.7039],
        [ 0.0268, -1.0140, -2.9764],
        [-0.1997, -0.6900,  0.7039],
        [ 0.0268, -1.0140, -2.9764]])
>>> torch.cat((x, x, x), 1)  # concatenate along dimension 1 (horizontally)
tensor([[-0.1997, -0.6900,  0.7039, -0.1997, -0.6900,  0.7039, -0.1997, -0.6900,  0.7039],
        [ 0.0268, -1.0140, -2.9764,  0.0268, -1.0140, -2.9764,  0.0268, -1.0140, -2.9764]])
>>> y1 = torch.randn(5, 3, 6)
>>> y2 = torch.randn(5, 3, 6)
>>> torch.cat([y1, y2], 2).size()
torch.Size([5, 3, 12])
>>> torch.cat([y1, y2], 1).size()
torch.Size([5, 6, 6])

For tensors to be concatenated, the number of dimensions must be the same. Their sizes along the concatenation dimension may differ, but all other dimensions must match in size.
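A minimal sketch of this shape rule (tensors `a` and `b` are illustrative, not from the session above): the non-concatenation dimensions must agree, while the concatenation dimension may differ.

```python
import torch

a = torch.randn(2, 3)
b = torch.randn(4, 3)  # differs from a only in dimension 0

# Concatenating along dimension 0 works: the dimension-1 sizes agree.
print(torch.cat((a, b), 0).size())  # torch.Size([6, 3])

# Concatenating along dimension 1 fails: dimension-0 sizes (2 vs 4) differ.
try:
    torch.cat((a, b), 1)
except RuntimeError as e:
    print("RuntimeError:", e)
```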

* torch.stack(tensors, dim=0)

Concatenates a sequence of tensors along a new dimension. All tensors in the sequence must have the same shape.

For example:

>>> x1 = torch.randn(2, 3)
>>> x2 = torch.randn(2, 3)
>>> torch.stack((x1, x2), 0).size()  # insert a new dimension at position 0: tensors kept apart
torch.Size([2, 2, 3])
>>> torch.stack((x1, x2), 1).size()  # insert a new dimension at position 1: rows interleaved
torch.Size([2, 2, 3])
>>> torch.stack((x1, x2), 2).size()
torch.Size([2, 3, 2])
>>> torch.stack((x1, x2), 0)
tensor([[[-0.3499, -0.6124,  1.4332],
         [ 0.1516, -1.5439, -0.1758]],

        [[-0.4678, -1.1430, -0.5279],
         [-0.4917, -0.6504,  2.2512]]])
>>> torch.stack((x1, x2), 1)
tensor([[[-0.3499, -0.6124,  1.4332],
         [-0.4678, -1.1430, -0.5279]],

        [[ 0.1516, -1.5439, -0.1758],
         [-0.4917, -0.6504,  2.2512]]])
>>> torch.stack((x1, x2), 2)
tensor([[[-0.3499, -0.4678],
         [-0.6124, -1.1430],
         [ 1.4332, -0.5279]],

        [[ 0.1516, -0.4917],
         [-1.5439, -0.6504],
         [-0.1758,  2.2512]]])

torch.stack combines tensors of the same shape by inserting a new dimension at the given position and arranging the data accordingly. In the code above, stacking along dimension 0 and along dimension 1 produces tensors of equal shape, but the data are laid out differently.
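One way to see what stack does: it is equivalent to unsqueezing each tensor at the new dimension and then concatenating with torch.cat. A small sketch (fresh random tensors, not the values above):

```python
import torch

x1 = torch.randn(2, 3)
x2 = torch.randn(2, 3)

# stack inserts a new dimension at position 1...
stacked = torch.stack((x1, x2), 1)                       # shape (2, 2, 3)
# ...which is the same as unsqueezing each tensor there and concatenating.
via_cat = torch.cat((x1.unsqueeze(1), x2.unsqueeze(1)), 1)

print(torch.equal(stacked, via_cat))  # True
print(stacked.size())                 # torch.Size([2, 2, 3])
```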

Splitting tensors: torch.split(), torch.chunk()

* torch.split(tensor, split_size, dim=0)

Splits the input tensor into equally shaped chunks (if divisible). If the tensor's size along the given dimension is not divisible by split_size, the last chunk will be smaller than the others.

For example:

>>> x = torch.randn(3, 10, 6)
>>> a, b, c = x.split(1, 0)  # split along dimension 0 into chunks of size 1
>>> a.size(), b.size(), c.size()
(torch.Size([1, 10, 6]), torch.Size([1, 10, 6]), torch.Size([1, 10, 6]))
>>> d, e = x.split(2, 0)  # split along dimension 0 into chunks of size 2
>>> d.size(), e.size()
(torch.Size([2, 10, 6]), torch.Size([1, 10, 6]))

Splitting the tensor along dimension 0 with chunk size 1: since x has size 3 along dimension 0, it is split into 3 parts.

Splitting along dimension 0 with chunk size 2 yields only 2 parts: the first part takes a full chunk of size 2, and the remaining slice of size 1, being smaller than 2, becomes the last chunk.
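Besides an integer chunk size, torch.split also accepts a list of section sizes (the parameter is named split_size_or_sections in current PyTorch); the sizes must sum to the dimension's length. A short sketch:

```python
import torch

x = torch.randn(3, 10, 6)

# Explicit sections of size 1 and 2 along dimension 0 (must sum to 3).
p, q = x.split([1, 2], 0)
print(p.size(), q.size())  # torch.Size([1, 10, 6]) torch.Size([2, 10, 6])
```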

* torch.chunk(tensor, chunks, dim=0)

Splits the input tensor into the given number of chunks along the given dimension (axis).

Using the same tensor x as above:

>>> l, m, n = x.chunk(3, 0)  # split dimension 0 into 3 chunks
>>> l.size(), m.size(), n.size()
(torch.Size([1, 10, 6]), torch.Size([1, 10, 6]), torch.Size([1, 10, 6]))
>>> u, v = x.chunk(2, 0)  # split dimension 0 into 2 chunks
>>> u.size(), v.size()
(torch.Size([2, 10, 6]), torch.Size([1, 10, 6]))

When the tensor is split into 3 chunks along dimension 0, the size (3) divides evenly, so every chunk has the same size, 1.

When the tensor is split into 2 chunks along dimension 0, the division is not even. Each full chunk gets ceil(3 / 2) = 2 elements along that dimension, and the last chunk receives whatever remains (here, 1).

If the requested number of chunks is larger than the size of the dimension, torch.chunk may return fewer chunks than requested (each of size 1) rather than raising an error.
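The practical difference between the two functions: chunk takes the *number* of pieces, split takes the *size* of each piece. A small comparison on one tensor (a 5×4 example, not from the session above):

```python
import torch

x = torch.randn(5, 4)

# chunk(2): each full chunk is ceil(5/2) = 3 rows, the remainder is 2.
chunks = torch.chunk(x, 2, 0)
# split(2): pieces of size 2, with a final leftover piece of size 1.
splits = torch.split(x, 2, 0)

print([t.size(0) for t in chunks])  # [3, 2]
print([t.size(0) for t in splits])  # [2, 2, 1]
```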


©2019-2020 Toolsou All rights reserved,
