#17013: Update add_bw doc
mcw-anasuya committed Jan 30, 2025
1 parent 68f9add commit 85c5ff4
Showing 1 changed file with 4 additions and 1 deletion.
@@ -747,6 +747,8 @@ void bind_binary_bw(
 bfloat8_b/bfloat4_b is only supported on TILE_LAYOUT
 {4}
 Example:
+>>> grad_tensor = ttnn.from_torch(torch.tensor([[1, 2], [3, 4]], dtype=torch.bfloat16), layout=ttnn.TILE_LAYOUT, device=device)
+>>> tensor1 = ttnn.from_torch(torch.tensor([[1, 2], [3, 4]], dtype=torch.bfloat16, requires_grad=True), layout=ttnn.TILE_LAYOUT, device=device)
@@ -1183,7 +1185,8 @@ void py_module(py::module& module) {
 module,
 ttnn::add_bw,
 R"doc(Performs backward operations for add of :attr:`input_tensor_a` and :attr:`input_tensor_b` or :attr:`scalar` with given :attr:`grad_tensor`.)doc",
-R"doc(BFLOAT16, BFLOAT8_B)doc");
+R"doc(BFLOAT16, BFLOAT8_B)doc",
+R"doc(Tensors of type `ttnn.Tensor` do not support sharded memory configuration.)doc");

 detail::bind_binary_bw(
 module,
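For reference, the two added docstring lines presumably lead into a full doctest like the sketch below, in the same style the docstring template already uses. Here `tensor2`, the `ttnn.add_bw` call shape (gradient first, then the two inputs), and the returned list of per-input gradients are assumptions for illustration; the collapsed portion of the diff is not shown here.

>>> # assumed continuation of the example above; not shown in this diff
>>> tensor2 = ttnn.from_torch(torch.tensor([[5, 6], [7, 8]], dtype=torch.bfloat16, requires_grad=True), layout=ttnn.TILE_LAYOUT, device=device)
>>> output = ttnn.add_bw(grad_tensor, tensor1, tensor2)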
