
Activation Functions and Training Algorithms for Deep Neural Network

Gayatri Khanvilkar, Deepali Vora

Affiliations
Department of Information Technology, Vidyalankar Institute of Technology, Mumbai, India
DOI: 10.22362/ijcert/2018/v5/i4/v5i402


Abstract
Machine learning is a field of computer science that gives computers the ability to learn without being explicitly programmed; it is a core subfield of artificial intelligence. Whenever a program is exposed to new data, it is able to learn, grow, change and develop by itself. Machine learning is the study and construction of algorithms that learn from data and make predictions on it. Deep learning is a subfield of machine learning inspired by the structure and function of the human brain, and the name 'deep learning' is used for stacked neural networks. A deep neural network is an artificial neural network with a number of hidden layers, which distinguishes it from an ordinary (shallow) artificial neural network, and it can be trained in a supervised or unsupervised manner. Training such a deep neural network is difficult, and it mainly faces two challenges: overfitting and computation time. A deep neural network is trained with the help of training algorithms and activation functions. In this paper, therefore, the most widely used activation functions (Sigmoid, Tanh and ReLU) and training algorithms (greedy layer-wise training and dropout) are analysed, and a comparison of the activation functions and training algorithms is given based on this analysis.
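
To make the comparison concrete, the following is a minimal NumPy sketch of the three activation functions named in the abstract. It is illustrative only, not code from the paper, and the function names and sample inputs are our own.

import numpy as np

def sigmoid(x):
    # Squashes input into (0, 1); saturates for large |x|, which is what
    # gives rise to the vanishing-gradient problem named in the keywords.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centred squashing into (-1, 1); also saturates at both ends.
    return np.tanh(x)

def relu(x):
    # Identity for positive inputs, zero otherwise; no saturation for x > 0.
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(x))  # [0.119 0.378 0.5   0.622 0.881] (rounded)
print(tanh(x))     # [-0.964 -0.462 0.    0.462  0.964]
print(relu(x))     # [0.  0.  0.  0.5 2. ]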
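
Greedy layer-wise training pre-trains one layer at a time, feeding each trained layer's codes forward as the input of the next. The sketch below is a hypothetical autoencoder-based variant written for illustration; the layer sizes, learning rate and epoch count are arbitrary assumptions, not values from the paper.

import numpy as np

def pretrain_layer(inputs, hidden_dim, epochs=10, lr=0.1, rng=np.random.default_rng(0)):
    # Train a one-hidden-layer autoencoder on `inputs` and return the encoder weights.
    in_dim = inputs.shape[1]
    W_enc = rng.normal(0, 0.1, (in_dim, hidden_dim))
    W_dec = rng.normal(0, 0.1, (hidden_dim, in_dim))
    for _ in range(epochs):
        h = np.tanh(inputs @ W_enc)           # encode
        recon = h @ W_dec                     # linear decode
        err = recon - inputs                  # reconstruction error
        grad_dec = h.T @ err / len(inputs)    # gradient of mean squared error
        grad_h = err @ W_dec.T * (1 - h**2)   # backprop through tanh
        grad_enc = inputs.T @ grad_h / len(inputs)
        W_dec -= lr * grad_dec
        W_enc -= lr * grad_enc
    return W_enc

# Greedy layer-wise pretraining: each layer is trained on the codes of the one below.
X = np.random.default_rng(1).random((100, 20))
weights = []
layer_input = X
for hidden_dim in (16, 8):
    W = pretrain_layer(layer_input, hidden_dim)
    weights.append(W)
    layer_input = np.tanh(layer_input @ W)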
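
Dropout addresses the overfitting challenge mentioned above by randomly disabling units during training. Below is a minimal sketch of the common 'inverted dropout' formulation; the 0.5 rate is an illustrative assumption, not a value from the paper.

import numpy as np

def dropout(activations, rate=0.5, training=True, rng=np.random.default_rng(0)):
    # During training, zero each unit with probability `rate` and rescale the
    # survivors by 1/(1-rate) so the expected activation is unchanged at test time.
    if not training:
        return activations
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

h = np.ones((2, 4))          # toy hidden-layer activations
print(dropout(h, rate=0.5))  # roughly half the entries zeroed, the rest scaled to 2.0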


Citation
Gayatri Khanvilkar, Deepali Vora (2018). Activation Functions and Training Algorithms for Deep Neural Network. International Journal of Computer Engineering In Research Trends, 5(4), 98-104. Retrieved from http://ijcert.org/ems/ijcert_papers/V5I402.pdf


Keywords: Deep Neural Network, Activation Functions, Vanishing Gradient, Greedy Algorithm, Dropout Algorithm

References
[1]	Wikipedia contributors. "Machine learning." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 24 Oct. 2017. Web. 29 Oct. 2017
[2]	Wikipedia contributors. "Deep learning." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 23 Oct. 2017. Web.
[3]	Schmidhuber, Jürgen. "Deep learning in neural networks: An overview." Neural Networks 61 (2015): 85-117.
[4]	Deeplearning4j Development Team. Deeplearning4j: Open-source distributed deep learning for the JVM, Apache Software Foundation License 2.0. http://deeplearning4j.org
[5]	Aleksander, Igor, and Helen Morton. An introduction to neural computing. Vol. 3. London: Chapman & Hall, 1990. 
[6]	"Opening up deep learning for everyone", http://www.jtoy.net/2016/02/14/opening-up-deep-learning-for-everyone.html, October 2017.
[7]	Lau, Mian Mian, and King Hann Lim. "Investigation of activation functions in the deep belief network." Control and Robotics Engineering (ICCRE), 2017 2nd International Conference on. IEEE, 2017. 
[8]	"Understanding Activation Functions in Neural Networks", https://medium.com/the-theory-of-everything/understanding-activation-functions-in-neural-networks-9491262884e0, September 2017.
[9]	"Activation functions and it's types: which is better?", https://medium.com/towards-data-science/activation-functions-and-its-types-which-is-better-a9a5310cc8f, September 2017.
[10]	"The Vanishing Gradient Problem", https://medium.com/@anishsingh20/the-vanishing-gradient-problem-48ae7f501257, September 2017.
[11]	Qian, Sheng, et al. "Adaptive activation functions in convolutional neural networks." Neurocomputing (2017).
[12]	Gay, M. "IBM ILOG CPLEX Optimization Studio CPLEX User’s Manual." International Business Machines Corporation 12 (2012).
[13]	Liu, Jun, Chuan-Cheng Zhao, and Zhi-Guo Ren. "The Application of Greedy Algorithm in Real Life." DEStech Transactions on Engineering and Technology Research mcee (2016). 
[14]	Wikipedia contributors. "Greedy algorithm." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 19 Apr. 2017. Web. 28 Oct. 2017. 
[15]	Wang, Jian-Guo, et al. "A mothed of improving identification accuracy via deep learning algorithm under the condition of deficient labelled data." Control Conference (CCC), 2017 36th Chinese. IEEE, 2017. 
[16]	Tong, Li, et al. "Predicting heart rejection using histopathological whole-slide imaging and deep neural network with dropout." Biomedical & Health Informatics (BHI), 2017 IEEE EMBS International Conference on. IEEE, 2017. 
[17]	Wang, Long, et al. "Wind turbine gearbox failure identification with deep neural networks." IEEE Transactions on Industrial Informatics 13.3 (2017): 1360-1368.
[18]	Ko, Byung-soo, et al. "Controlled dropout: A different approach to using dropout on deep neural network." Big Data and Smart Computing (BigComp), 2017 IEEE International Conference on. IEEE, 2017. 
[19]	McMahan, H. Brendan, et al. "Ad click prediction: a view from the trenches." Proceedings of the 19th ACM SIGKDD international conference on Knowledge discovery and data mining. ACM, 2013.


DOI Link : https://doi.org/10.22362/ijcert/2018/v5/i4/v5i402

Download :
  V5I402.pdf

