Norms#
- class ivy.data_classes.container.norms._ContainerWithNorms(dict_in=None, queues=None, queue_load_sizes=None, container_combine_method='list_join', queue_timeout=None, print_limit=10, key_length_limit=None, print_indent=4, print_line_spacing=0, ivyh=None, default_key_color='green', keyword_color_dict=None, rebuild_child_containers=False, types_to_iteratively_nest=None, alphabetical_keys=True, dynamic_backend=None, build_callable=False, **kwargs)[source]#
Bases: ContainerBase
- layer_norm(normalized_idxs, /, *, scale=None, offset=None, eps=1e-05, new_std=1.0, out=None)[source]#
ivy.Container instance method variant of ivy.layer_norm. This method simply wraps the function, and so the docstring for ivy.layer_norm also applies to this method with minimal changes.
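Because this method only wraps the functional API, calling it on a container should match calling ivy.layer_norm on that container directly. The snippet below is a minimal sketch of that equivalence; the container values are assumptions chosen for illustration, not taken from the docstring:

>>> import ivy
>>> x = ivy.Container(a=ivy.array([1., 2., 3.]),
...                   b=ivy.array([4., 5., 6.]))
>>> method_out = x.layer_norm([0])
>>> function_out = ivy.layer_norm(x, [0])

Both method_out and function_out should hold the same normalized values for every leaf in the container.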
- Parameters:
  - self (Union[Array, NativeArray, Container]) – Input container.
  - normalized_idxs (List[Union[int, Container]]) – Indices to apply the normalization to.
  - scale (Optional[Union[Array, NativeArray, Container]], default: None) – Learnable gamma variables for elementwise post-multiplication, default is None.
  - offset (Optional[Union[Array, NativeArray, Container]], default: None) – Learnable beta variables for elementwise post-addition, default is None.
  - eps (Union[float, Container], default: 1e-05) – Small constant to add to the denominator. Default is 1e-05.
  - new_std (Union[float, Container], default: 1.0) – The standard deviation of the new normalized values. Default is 1.
  - out (Optional[Union[Array, Container]], default: None) – Optional output container, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type: Container
- Returns: ret – The layer after applying layer normalization.
Examples

With one ivy.Container input:

>>> x = ivy.Container({'a': ivy.array([7., 10., 12.]),
...                    'b': ivy.array([[1., 2., 3.], [4., 5., 6.]])})
>>> normalized_idxs = [0]
>>> norm = x.layer_norm(normalized_idxs, eps=1.25, scale=0.3)
>>> print(norm)
{
    a: ivy.array([-0.34198591, 0.04274819, 0.29923761]),
    b: ivy.array([[-0.24053511, -0.24053511, -0.24053511],
                  [0.24053511, 0.24053511, 0.24053511]])
}

With multiple ivy.Container inputs:

>>> x = ivy.Container({'a': ivy.array([7., 10., 12.]),
...                    'b': ivy.array([[1., 2., 3.], [4., 5., 6.]])})
>>> normalized_idxs = ivy.Container({'a': [0], 'b': [1]})
>>> new_std = ivy.Container({'a': 1.25, 'b': 1.5})
>>> bias = ivy.Container({'a': [0.2, 0.5, 0.7], 'b': 0.3})
>>> norm = x.layer_norm(normalized_idxs, new_std=new_std, offset=1)
>>> print(norm)
{
    a: ivy.array([-1.62221265, 0.20277636, 1.41943574]),
    b: ivy.array([[-1.83710337, 0., 1.83710337],
                  [-1.83710337, 0., 1.83710337]])
}
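For intuition, layer normalization follows the standard formula: subtract the mean over normalized_idxs, divide by the square root of the variance plus eps, then rescale by new_std, scale, and offset. The helper below (layer_norm_sketch is a hypothetical name, not part of Ivy) is a rough sketch of that formula written with generic Ivy ops; the actual Ivy implementation may handle broadcasting and argument defaults differently.

import ivy

def layer_norm_sketch(x, normalized_idxs, *, scale=None, offset=None,
                      eps=1e-05, new_std=1.0):
    # Hypothetical illustration only: normalize over the given axes,
    # then rescale, roughly mirroring what layer_norm computes.
    mean = ivy.mean(x, axis=normalized_idxs, keepdims=True)
    var = ivy.var(x, axis=normalized_idxs, keepdims=True)
    ret = (x - mean) / ivy.sqrt(var + eps) * new_std
    if scale is not None:
        ret = ret * scale
    if offset is not None:
        ret = ret + offset
    return ret

Applied to an ivy.Array (or to each leaf of a container with integer normalized_idxs), this sketch should produce values close to the method above, up to the exact eps and new_std handling.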
This should hopefully have given you an overview of the norms submodule. If you have any questions, please feel free to reach out on our Discord!