DenseNet

Background

  DenseNet: as the Best Paper of CVPR 2017, DenseNet breaks away from the fixed pattern of improving performance by making networks deeper (ResNet) or wider (Inception). Through feature reuse and bypass connections, it greatly reduces the number of parameters while also alleviating, to some extent, the vanishing-gradient problem.
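
The feature-reuse idea can be sketched without any framework: each layer's output is concatenated onto everything before it along the channel axis, so later layers see all earlier feature maps directly. A minimal numpy sketch (the function name and the default sizes here are illustrative, not from the original code):

```python
import numpy as np

def dense_connectivity_demo(height=8, width=8, init_channels=64,
                            growth_rate=32, num_layers=4):
    """Sketch of dense connectivity: each 'layer' produces growth_rate
    new channels, which are concatenated onto the running feature map
    instead of replacing it."""
    features = np.zeros((height, width, init_channels))
    channel_history = [features.shape[-1]]
    for _ in range(num_layers):
        # stand-in for a Conv(3x3) that outputs growth_rate channels
        new_features = np.zeros((height, width, growth_rate))
        features = np.concatenate([features, new_features], axis=-1)
        channel_history.append(features.shape[-1])
    return channel_history

print(dense_connectivity_demo())  # [64, 96, 128, 160, 192]
```

The channel count grows linearly with depth (by `growth_rate` per layer), which is why each individual layer can stay narrow while still having access to a wide input.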

DenseNet features

  At the same depth, a DenseNet needs far fewer parameters than the corresponding ResNet
  The Dense Block plays a role similar to ResNet's Identity Block, and the Transition Block is analogous to ResNet's Conv Block
  The structure is simple and combines receptive fields of different scales, which improves network performance
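
The parameter savings follow directly from the channel bookkeeping. With the configuration used in the implementation below (growth rate 32, transitions compressing to 128/256/512 channels, block sizes 6/12/24/16 from the standard DenseNet121 table), the channel count through the network can be tallied with plain arithmetic; the helper name here is mine:

```python
def densenet121_channels(init=64, growth_rate=32,
                         blocks=(6, 12, 24, 16),
                         transition_out=(128, 256, 512)):
    """Tally the channel count after each dense block and transition."""
    channels = init
    trace = []
    for i, num_layers in enumerate(blocks):
        channels += num_layers * growth_rate  # each layer adds growth_rate channels
        trace.append(('dense_block%d' % (i + 1), channels))
        if i < len(transition_out):           # the last block has no transition
            channels = transition_out[i]
            trace.append(('transition%d' % (i + 1), channels))
    return trace

for name, ch in densenet121_channels():
    print(name, ch)
# the trace ends with ('dense_block4', 1024): 1024 features feed global pooling
```

Note that each transition output is exactly half of the preceding block's channel count (256→128, 512→256, 1024→512), i.e. a compression factor of 0.5.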

DenseNet architectures at different depths

(Figure: table of DenseNet architecture configurations)

DenseNet121 architecture diagram

(Figure: DenseNet121 architecture diagram)

TensorFlow 2.0 implementation

from functools import reduce

import tensorflow.keras as keras


def compose(*funcs):
    """Chain layer calls left to right: compose(f, g)(x) == g(f(x))."""
    if funcs:
        return reduce(lambda f, g: lambda *a, **kw: g(f(*a, **kw)), funcs)
    raise ValueError('Composition of empty sequence not supported.')


class Conv_Bn_Relu(keras.layers.Layer):
    """Conv2D -> BatchNormalization -> ReLU."""

    def __init__(self, filters, kernel_size, strides, padding, name):
        super(Conv_Bn_Relu, self).__init__(name=name)
        self.conv = keras.layers.Conv2D(filters, kernel_size, strides, padding)
        self.bn = keras.layers.BatchNormalization()
        self.relu = keras.layers.ReLU()

    def call(self, inputs, **kwargs):
        conv = self.conv(inputs)
        bn = self.bn(conv)
        output = self.relu(bn)

        return output


def dense_block(x, name):
    """Bottleneck pair (1x1 then 3x3) whose 32-channel output is
    concatenated with the input, growing the channel count by 32."""
    shortcut = x
    x = compose(Conv_Bn_Relu(128, (1, 1), (1, 1), 'same', name='{}_conv_bn_relu1'.format(name)),
                Conv_Bn_Relu(32, (3, 3), (1, 1), 'same', name='{}_conv_bn_relu2'.format(name)))(x)
    x = keras.layers.Concatenate(name='{}_concatenate'.format(name))([x, shortcut])

    return x


def transition_block(x, filters, name):
    """1x1 convolution to compress channels, then 2x2 average pooling
    to halve the spatial resolution."""
    x = compose(Conv_Bn_Relu(filters, (1, 1), (1, 1), 'same', name='{}_conv_bn_relu'.format(name)),
                keras.layers.AveragePooling2D((2, 2), (2, 2), name='{}_averagepool'.format(name)))(x)

    return x


def densenet121(input_shape):
    input_tensor = keras.layers.Input(input_shape, name='input')
    x = input_tensor

    # Stem: 7x7/2 convolution followed by 3x3/2 max pooling
    x = compose(keras.layers.ZeroPadding2D((3, 3), name='zeropadding1'),
                Conv_Bn_Relu(64, (7, 7), (2, 2), 'valid', name='conv_bn_relu'),
                keras.layers.ZeroPadding2D((1, 1), name='zeropadding2'),
                keras.layers.MaxPool2D((3, 3), (2, 2), name='Max_Pooling'))(x)

    # Four dense blocks of 6, 12, 24 and 16 layers, joined by transitions
    for i in range(6):
        x = dense_block(x, name='dense_block1_{}'.format(i + 1))
    x = transition_block(x, 128, name='transition_block1')

    for i in range(12):
        x = dense_block(x, name='dense_block2_{}'.format(i + 1))
    x = transition_block(x, 256, name='transition_block2')

    for i in range(24):
        x = dense_block(x, name='dense_block3_{}'.format(i + 1))
    x = transition_block(x, 512, name='transition_block3')

    for i in range(16):
        x = dense_block(x, name='dense_block4_{}'.format(i + 1))

    # Classification head: global average pooling + 1000-way softmax
    x = compose(keras.layers.GlobalAveragePooling2D(name='global_averagepool'),
                keras.layers.Dense(1000, activation='softmax', name='dense'))(x)

    model = keras.Model(input_tensor, x, name='DenseNet121')

    return model


if __name__ == '__main__':
    model = densenet121(input_shape=(224, 224, 3))
    model.build(input_shape=(None, 224, 224, 3))
    model.summary()

(Figure: model.summary() output for DenseNet121)

DenseNet summary

  DenseNet is a simple deep network and a very effective feature extractor. As the model summary above shows, DenseNet121 has only about 8M parameters, roughly one third of ResNet50's, so in practice it can serve as a feature-extraction backbone that is efficient while saving both memory and computation.
