Broadcasting expands tensors without copying data.
In-place semantics
An in-place operation does not allow the in-place tensor to change shape as a result of broadcasting.
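A minimal sketch of this rule: the other operand may broadcast up to the in-place tensor's shape, but the in-place tensor itself can never grow.

```python
import torch

x = torch.ones(4, 4)
y = torch.ones(4)

x.add_(y)  # OK: y broadcasts to (4, 4); x's shape stays (4, 4)

try:
    y.add_(x)  # fails: the in-place tensor y would have to grow to (4, 4)
except RuntimeError as e:
    print(e)
```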

Only dimensions of size 1 are broadcast; a dimension of size 2 or more must match the other tensor's size exactly, which is why only size-1 dimensions can be expanded.
When one tensor has fewer dimensions, shapes are aligned starting from the trailing (rightmost) dimension, and the missing leading dimensions are treated as size 1.

That is, in the first case above, the 3 cannot be placed in a middle dimension; alignment is strictly from the back. A sketch of this below, reusing the (5, 3, 4, 1) / (3, 1, 1) shapes from the PyTorch docs example.
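```python
import torch

# Shapes are compared from the trailing (rightmost) dimension:
# (5, 3, 4, 1)
#    (3, 1, 1)  -> missing leading dims count as 1 -> result (5, 3, 4, 1)
x = torch.empty(5, 3, 4, 1)
y = torch.empty(3, 1, 1)
print((x + y).shape)  # torch.Size([5, 3, 4, 1])

# Moving the 3 so it no longer lines up from the back fails:
# trailing dims are (4, 1) vs (3, 1) -> 4 != 3 and neither is 1
z = torch.empty(3, 1)
try:
    x + z
except RuntimeError as e:
    print(e)
```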
Backwards compatibility
How does PyTorch broadcasting work?
torch.add(torch.ones(4,1), torch.randn(4)) produces a Tensor with size torch.Size([4, 4]).
Can someone explain the logic behind this?
https://stackoverflow.com/questions/51371070/how-does-pytorch-broadcasting-work
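Walking through that example: (4, 1) and (4,) are aligned from the trailing dimension, so (4,) is treated as (1, 4); each size-1 dimension then expands to match the other operand, giving (4, 4). A sketch of the step-by-step:

```python
import torch

a = torch.ones(4, 1)  # shape (4, 1)
b = torch.randn(4)    # shape (4,) -> treated as (1, 4)

# dim -1: 1 vs 4 -> 4;  dim -2: 4 vs 1 -> 4  => result (4, 4)
out = torch.add(a, b)
print(out.shape)  # torch.Size([4, 4])

# equivalent to expanding both operands explicitly
same = a.expand(4, 4) + b.expand(4, 4)
print(torch.equal(out, same))  # True
```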
pytorch broadcasting
Many PyTorch operations support NumPy's broadcasting semantics. What is broadcasting? The official explanation is in the NumPy docs.
https://velog.io/@optjyy/pytorch-broad-casting


Seonglae Cho