How does PyTorch broadcasting work?

Q. torch.add(torch.ones(4,1), torch.randn(4)) produces a Tensor with size: torch.Size([4,4]). Can someone provide the logic behind this?
https://stackoverflow.com/questions/51371070/how-does-pytorch-broadcasting-work

PyTorch broadcasting

Many PyTorch operations support NumPy's broadcasting semantics. What is broadcasting? The official explanation is in the NumPy docs under "broadcasting".

The rules:
- Shapes are aligned starting from the trailing (rightmost) dimension; if one tensor has fewer dimensions, the missing dimensions are prepended. In other words, a dimension cannot be slotted into the middle: a trailing dimension of size 3 must line up with the other tensor's last dimension, never a middle one.
- Two aligned dimensions are compatible when they are equal or when one of them is 1. A size-1 dimension is broadcast (repeated) to match the other side; a dimension of size 2 or more cannot be stretched, which is why only size-1 dimensions broadcast.

So in the question above, (4, 1) and (4,) align as (4, 1) against (1, 4); each size-1 dimension is broadcast against the other tensor's size 4, giving (4, 4).

In-place semantics

Broadcasting works without copying data, so an in-place operation must not change the shape of the in-place tensor as a result of broadcasting.

Backwards compatibility

Reference
https://pytorc
https://velog.io/@optjyy/pytorch-broad-casting
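The behaviors described in the note can be checked directly; a minimal sketch (tensor shapes are illustrative choices, not from the original):

```python
import torch

# (4, 1) and (4,) align from the trailing dimension:
# (4,) is treated as (1, 4), so the result broadcasts to (4, 4).
a = torch.ones(4, 1)
b = torch.randn(4)
out = torch.add(a, b)
print(out.shape)  # torch.Size([4, 4])

# In-place semantics: broadcasting is allowed in an in-place op
# only if the in-place tensor's own shape does not change.
c = torch.ones(4, 4)
c.add_(b)  # fine: (4, 4) stays (4, 4), b is broadcast over c

try:
    a.add_(b)  # a would have to grow from (4, 1) to (4, 4)
except RuntimeError as e:
    print("in-place broadcast refused:", type(e).__name__)
```

The `try`/`except` makes the in-place rule concrete: the out-of-place `torch.add` happily returns a new (4, 4) tensor, while the in-place `add_` raises because it cannot reshape its target without a data copy.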