GhostNet

Background

  GhostNet comes from Huawei's Noah's Ark Lab and was accepted at CVPR 2020. Drawing on the strengths of many successful neural networks, it proposes a new network architecture.

Key Features of GhostNet

  The Ghost Module introduces a bottleneck structure and GroupConv (group convolution).
  The Ghost Bottleneck introduces DepthwiseConv (depthwise convolution) and the Squeeze-and-Excitation module.

Group Convolution

[Figure: group convolution (from ShuffleNet V2)]
  Group Convolution: a traditional convolution is "fully connected" across channels, in the sense that every position of the output feature map combines information from all channels of the input. With a group convolution, each output position uses only a subset of the input channels.
  Its main benefit is a large reduction in parameter count. Suppose a 64x64x256 feature map is mapped to a 64x64x256 output with 5x5 kernels: an ordinary convolution has 256x(256x5x5+1)=1,638,656 parameters, while a 32-group convolution has 256x(8x5x5+1)=51,456, roughly a 32x reduction (each group sees only 256/32=8 input channels). When the number of groups equals the number of channels, a group convolution becomes similar to Depthwise Convolution. The sketch below double-checks these parameter counts.
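  As a quick check of these figures, here is a minimal sketch (not from the original article) using keras.layers.Conv2D, which accepts a groups argument in TensorFlow 2.3 and later:

import tensorflow.keras as keras

inputs = keras.layers.Input((64, 64, 256))

# Ordinary convolution: every filter sees all 256 input channels.
plain = keras.layers.Conv2D(256, (5, 5), padding='same')
# Group convolution: 32 groups, so each filter sees only 256/32 = 8 channels.
grouped = keras.layers.Conv2D(256, (5, 5), padding='same', groups=32)

plain(inputs)
grouped(inputs)
print(plain.count_params())    # 1638656 = 256*(256*5*5+1)
print(grouped.count_params())  # 51456   = 256*(8*5*5+1)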

Depthwise Convolution

[Figure: depthwise convolution]
  Depthwise Convolution: performs a separate convolution on each channel independently.
  The depth_multiplier parameter defaults to 1, meaning each channel is convolved once on its own, so the number of output channels equals the number of input channels; setting depth_multiplier=n convolves each channel n times, making the output n times as wide as the input.
  Its main benefit is likewise a large reduction in parameter count. Suppose an 8x8x1024 feature map is mapped to an 8x8x1024 output with 5x5 kernels: an ordinary convolution has 1024x(1024x5x5+1)=26,215,424 parameters, while a depthwise convolution has 1024x(1x5x5+1)=26,624, a reduction of roughly 1000x, as the sketch below double-checks.
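  Again as a check, a minimal sketch (not from the original article) with keras.layers.DepthwiseConv2D, also illustrating the depth_multiplier parameter described above:

import tensorflow.keras as keras

inputs = keras.layers.Input((8, 8, 1024))

# depth_multiplier=1 (the default): one 5x5 filter (plus bias) per channel,
# so the output keeps 1024 channels.
dw1 = keras.layers.DepthwiseConv2D((5, 5), padding='same', depth_multiplier=1)
print(dw1(inputs).shape)   # (None, 8, 8, 1024)
print(dw1.count_params())  # 26624 = 1024*(1*5*5+1)

# depth_multiplier=2: two filters per channel, doubling the output channels.
dw2 = keras.layers.DepthwiseConv2D((5, 5), padding='same', depth_multiplier=2)
print(dw2(inputs).shape)   # (None, 8, 8, 2048)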

Squeeze-and-Excitation

[Figure: Squeeze-and-Excitation block (from SENet)]
  Squeeze-and-Excitation: also known as feature recalibration, or a channel attention mechanism. Concretely, the network learns the importance of each feature channel automatically, then uses these learned importances to boost useful features and suppress features of little use to the task at hand.
  First comes the Squeeze operation: global pooling, which has a global receptive field and outputs one value per input feature channel, characterizing the global distribution of responses over the channels.
  Then comes the Excitation operation: fully connected layers generate a weight for each feature channel, modeling the correlations between channels. These output weights can be read as the importance of each channel after feature selection; they are multiplied channel-wise onto the original features, completing the recalibration of the features along the channel dimension.
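  As a shape-level illustration, here is a minimal sketch (not from the original article), assuming a 7x7x64 input and the common reduction ratio of 4; the se_block actually used by GhostNet appears in the full implementation below:

import tensorflow.keras as keras

inputs = keras.layers.Input((7, 7, 64))

# Squeeze: global average pooling gives one scalar per channel, shape (64,).
w = keras.layers.GlobalAveragePooling2D()(inputs)
# Excitation: a bottleneck of two dense layers yields per-channel weights in [0, 1].
w = keras.layers.Dense(64 // 4, activation='relu')(w)
w = keras.layers.Dense(64, activation='sigmoid')(w)
w = keras.layers.Reshape((1, 1, 64))(w)
# Recalibration: weight the original features channel by channel.
outputs = keras.layers.Multiply()([inputs, w])  # shape (None, 7, 7, 64)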

GhostNet Architecture Analysis

[Figure: GhostNet architecture diagram]

TensorFlow 2.0 Implementation

from functools import reduce

import tensorflow.keras as keras


def compose(*funcs):
    # Chain a sequence of layers/callables so they are applied left to right.
    if funcs:
        return reduce(lambda f, g: lambda *a, **kw: g(f(*a, **kw)), funcs)
    else:
        raise ValueError('Composition of empty sequence not supported.')


class Conv_Bn_Relu(keras.layers.Layer):
    # Conv2D (or DepthwiseConv2D) + BatchNormalization + optional ReLU.
    # The layer name selects the variant: a name containing 'depthwise'
    # uses DepthwiseConv2D, and a name containing 'relu' appends a ReLU.
    def __init__(self, filters, kernel_size, strides, padding, name):
        super(Conv_Bn_Relu, self).__init__(name=name)
        self.block = keras.Sequential()
        if 'depthwise' not in name:
            self.block.add(keras.layers.Conv2D(filters, kernel_size, strides, padding=padding))
        else:
            self.block.add(keras.layers.DepthwiseConv2D(kernel_size, strides, padding=padding))
        self.block.add(keras.layers.BatchNormalization())
        if 'relu' in name:
            self.block.add(keras.layers.ReLU())

    def call(self, inputs, **kwargs):
        return self.block(inputs)


def ghost_module(x, out_channel, relu, name):
    # A cheap 1x1 convolution produces half of the output channels; a
    # depthwise convolution derives the other half ("ghost" features) from
    # them; the two halves are concatenated to form the full output.
    suffix = '_relu' if relu else ''
    shortcut = Conv_Bn_Relu(out_channel // 2, (1, 1), (1, 1), 'same',
                            name='{}_conv_bn{}'.format(name, suffix))(x)
    x = Conv_Bn_Relu(None, (3, 3), (1, 1), 'same',
                     name='{}_depthwiseconv_bn{}'.format(name, suffix))(shortcut)
    x = keras.layers.Concatenate(name='{}_concatenate'.format(name))([x, shortcut])

    return x


def se_block(x, filters, name):
    # Squeeze: global average pooling; Excitation: a two-layer bottleneck
    # producing per-channel weights; finally rescale the input channels.
    shortcut = x
    x = compose(keras.layers.GlobalAveragePooling2D(name='{}_global_averagepool'.format(name)),
                keras.layers.Dense(filters // 4, name='{}_dense1'.format(name)),
                keras.layers.ReLU(name='{}_relu'.format(name)),
                keras.layers.Dense(filters, name='{}_dense2'.format(name)),
                keras.layers.Activation('sigmoid', name='{}_sigmoid'.format(name)),
                keras.layers.Reshape((1, 1, filters), name='{}_reshape'.format(name)))(x)
    x = keras.layers.Multiply(name='{}_multiply'.format(name))([x, shortcut])

    return x


def ghost_bneck(x, out_channel, exp_channel, kernel_size, strides, se, name):
    # Ghost bottleneck: expansion ghost module -> optional strided depthwise
    # convolution -> optional SE block -> projection ghost module, with a
    # residual connection whenever the shapes allow it.
    shortcut = x
    x = ghost_module(x, exp_channel, relu=True, name='{}_module1'.format(name))
    if strides == (2, 2):
        x = Conv_Bn_Relu(None, kernel_size, strides, 'same', name='{}_depthwiseconv_bn_relu'.format(name))(x)
    if se:
        x = se_block(x, exp_channel, name='{}_se_block'.format(name))
    x = ghost_module(x, out_channel, relu=False, name='{}_module2'.format(name))
    if shortcut.shape[-1] == out_channel and strides == (1, 1):
        x = keras.layers.Add(name='{}_add'.format(name))([x, shortcut])

    return x


def ghostnet(input_shape):
    input_tensor = keras.layers.Input(input_shape, name='input')
    x = input_tensor

    # Stem: an ordinary 3x3 convolution with stride 2.
    x = Conv_Bn_Relu(16, (3, 3), (2, 2), 'same', name='conv_bn_relu1')(x)

    x = ghost_bneck(x, out_channel=16, exp_channel=16, kernel_size=(3, 3), strides=(1, 1), se=False, name='bneck1_1')
    x = ghost_bneck(x, out_channel=24, exp_channel=48, kernel_size=(3, 3), strides=(2, 2), se=False, name='bneck1_2')

    x = ghost_bneck(x, out_channel=24, exp_channel=72, kernel_size=(3, 3), strides=(1, 1), se=False, name='bneck2_1')
    x = ghost_bneck(x, out_channel=40, exp_channel=72, kernel_size=(5, 5), strides=(2, 2), se=True, name='bneck2_2')

    x = ghost_bneck(x, out_channel=40, exp_channel=120, kernel_size=(5, 5), strides=(1, 1), se=True, name='bneck3_1')
    x = ghost_bneck(x, out_channel=80, exp_channel=240, kernel_size=(3, 3), strides=(2, 2), se=False, name='bneck3_2')

    x = ghost_bneck(x, out_channel=80, exp_channel=200, kernel_size=(3, 3), strides=(1, 1), se=False, name='bneck4_1')
    x = ghost_bneck(x, out_channel=80, exp_channel=184, kernel_size=(3, 3), strides=(1, 1), se=False, name='bneck4_2')
    x = ghost_bneck(x, out_channel=80, exp_channel=184, kernel_size=(3, 3), strides=(1, 1), se=False, name='bneck4_3')
    x = ghost_bneck(x, out_channel=112, exp_channel=480, kernel_size=(3, 3), strides=(1, 1), se=True, name='bneck4_4')
    x = ghost_bneck(x, out_channel=112, exp_channel=672, kernel_size=(3, 3), strides=(1, 1), se=True, name='bneck4_5')
    x = ghost_bneck(x, out_channel=160, exp_channel=672, kernel_size=(5, 5), strides=(2, 2), se=True, name='bneck4_6')

    x = ghost_bneck(x, out_channel=160, exp_channel=960, kernel_size=(5, 5), strides=(1, 1), se=False, name='bneck5_1')
    x = ghost_bneck(x, out_channel=160, exp_channel=960, kernel_size=(5, 5), strides=(1, 1), se=True, name='bneck5_2')
    x = ghost_bneck(x, out_channel=160, exp_channel=960, kernel_size=(5, 5), strides=(1, 1), se=False, name='bneck5_3')
    x = ghost_bneck(x, out_channel=160, exp_channel=960, kernel_size=(5, 5), strides=(1, 1), se=True, name='bneck5_4')

    # Head: 1x1 conv, 7x7 average pooling (224/32 = 7), then 1x1 convolutions
    # acting as the classifier, reshaped to a flat 1000-way softmax output.
    x = compose(Conv_Bn_Relu(960, (1, 1), (1, 1), 'same', name='conv_bn_relu2'),
                keras.layers.AveragePooling2D((7, 7), (7, 7), name='averagepool'),
                Conv_Bn_Relu(1280, (1, 1), (1, 1), 'same', name='conv_bn_relu3'),
                keras.layers.Conv2D(1000, (1, 1), (1, 1), 'same', activation='softmax', name='conv'),
                keras.layers.Reshape((1000,), name='reshape'))(x)

    model = keras.Model(input_tensor, x, name='GhostNet')

    return model


if __name__ == '__main__':
    model = ghostnet(input_shape=(224, 224, 3))
    model.summary()

GhostNet Summary

  GhostNet is a sophisticated lightweight deep learning network with about 5M parameters. It borrows the essence of many successful deep learning networks, such as MobileNet's depthwise separable convolution, AlexNet's group convolution, and SENet's attention mechanism, and thus achieves strong results.
