Basic Deep Learning Concepts | Tensors
In TensorFlow, tensors are defined as follows:
>>> import tensorflow as tf
>>> rank_0_tensor = tf.constant(4)
>>> rank_1_tensor = tf.constant([2.0, 3.0, 4.0])
>>> rank_2_tensor = tf.constant([[1, 2],[3, 4],[5, 6]], dtype=tf.float16)
>>> rank_3_tensor = tf.constant([[[0, 1, 2, 3, 4],[5, 6, 7, 8, 9]],[[10, 11, 12, 13, 14],[15, 16, 17, 18, 19]],[[20, 21, 22, 23, 24],[25, 26, 27, 28, 29]],])
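When no dtype is given, tf.constant infers one from the Python literals: integer literals become int32 and float literals become float32, while an explicit dtype argument overrides the inference. A minimal sketch (assuming TensorFlow 2.x) to verify this:

```python
import tensorflow as tf

# Integer literal -> int32 by default.
rank_0_tensor = tf.constant(4)
# Float literals -> float32 by default.
rank_1_tensor = tf.constant([2.0, 3.0, 4.0])
# Explicit dtype overrides the default inference.
rank_2_tensor = tf.constant([[1, 2], [3, 4], [5, 6]], dtype=tf.float16)

print(rank_0_tensor.dtype)  # int32
print(rank_1_tensor.dtype)  # float32
print(rank_2_tensor.dtype)  # float16
```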
A tensor's structure can be visualized in several ways. Take the rank-3 tensor defined above as an example:
>>> rank_3_tensor
<tf.Tensor: shape=(3, 2, 5), dtype=int32, numpy=
array([[[ 0,  1,  2,  3,  4],
        [ 5,  6,  7,  8,  9]],

       [[10, 11, 12, 13, 14],
        [15, 16, 17, 18, 19]],

       [[20, 21, 22, 23, 24],
        [25, 26, 27, 28, 29]]])>
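One way to read a shape of (3, 2, 5) is as 3 blocks, each a 2x5 matrix. A small sketch that converts the tensor to a NumPy array and slices out one block (assuming TensorFlow 2.x, where eager tensors expose a .numpy() method):

```python
import tensorflow as tf

rank_3_tensor = tf.constant([[[0, 1, 2, 3, 4], [5, 6, 7, 8, 9]],
                             [[10, 11, 12, 13, 14], [15, 16, 17, 18, 19]],
                             [[20, 21, 22, 23, 24], [25, 26, 27, 28, 29]]])

# Convert to a NumPy array for inspection.
arr = rank_3_tensor.numpy()
print(arr.shape)  # (3, 2, 5)

# Indexing along axis 0 picks one of the 3 blocks, each a 2x5 matrix.
print(arr[0])  # [[0 1 2 3 4], [5 6 7 8 9]]
```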
A tensor has the following basic attributes:
1. shape: the number of elements along each axis
2. rank: the total number of axes (dimensions)
3. axis: one specific dimension of the tensor
>>> rank_4_tensor = tf.zeros([3, 2, 4, 5])
>>> rank_4_tensor.shape
TensorShape([3, 2, 4, 5])
# Rank of the tensor
>>> rank_4_tensor.ndim
4
# Number of elements along axis 0
>>> rank_4_tensor.shape[0]
3
# Number of elements along axis 1
>>> rank_4_tensor.shape[1]
2
# Number of elements along axis 2
>>> rank_4_tensor.shape[2]
4
# Number of elements along axis 3
>>> rank_4_tensor.shape[3]
5
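Besides the .ndim and .shape attributes, TensorFlow also offers the tf.rank and tf.size ops, which return the same information as tensors. A sketch (assuming TensorFlow 2.x):

```python
import tensorflow as tf

rank_4_tensor = tf.zeros([3, 2, 4, 5])

# tf.rank and tf.size return scalar tensors; .numpy() extracts the value.
print(tf.rank(rank_4_tensor).numpy())  # 4
# Total element count is the product of the shape: 3*2*4*5 = 120.
print(tf.size(rank_4_tensor).numpy())  # 120
```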
TensorFlow stores the data to be processed in tensors and, on top of this data structure, defines a suite of tensor operations to carry out deep-learning computations efficiently.
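Two representative operations from that suite are element-wise addition and matrix multiplication; a minimal sketch (assuming TensorFlow 2.x):

```python
import tensorflow as tf

a = tf.constant([[1, 2], [3, 4]])
b = tf.constant([[1, 1], [1, 1]])

# Element-wise addition (also available as a + b).
print(tf.add(a, b).numpy())     # [[2 3], [4 5]]
# Matrix multiplication (also available as a @ b).
print(tf.matmul(a, b).numpy())  # [[3 3], [7 7]]
```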