Impact Factor: 6.549
Scopus Suggested Journal: UNDER REVIEW for TITLE INCLUSION

International Journal
of Computer Engineering in Research Trends (IJCERT)

Scholarly, Peer-Reviewed, Open Access and Multidisciplinary


Welcome to IJCERT


ISSN (Online): 2349-7084


Activation Functions and Training Algorithms for Deep Neural Network

Gayatri Khanvilkar, Deepali Vora
Affiliations: Department of Information Technology, Vidyalankar Institute of Technology, Mumbai, India
DOI: 10.22362/ijcert/2018/v5/i4/v5i402


Abstract
Machine learning is a field of computer science that gives computers the ability to learn without being explicitly programmed; it is a core subfield of artificial intelligence. Whenever new data is exposed to them, machine-learning programs are able to learn, grow, change, and develop by themselves. Machine learning is the study and construction of algorithms that learn from data and make predictions based on it. Deep learning is a subfield of machine learning inspired by the structure and function of the human brain; the name 'deep learning' refers to stacked neural networks. A deep neural network is an artificial neural network with a number of hidden layers, which distinguishes it from a conventional artificial neural network. It can be trained in a supervised or unsupervised manner. Training such a deep neural network is difficult, and it mainly faces two challenges: overfitting and computation time. Deep neural networks are trained with the help of training algorithms and activation functions. In this paper, the most widely used activation functions (sigmoid, tanh, and ReLU) and training algorithms (greedy layer-wise training and dropout) are analysed, and based on this analysis a comparison of the activation functions and training algorithms is given.
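As a concrete illustration of the three activation functions the abstract names (this sketch is illustrative and not taken from the paper itself), the following computes each function and the sigmoid's derivative, whose small magnitude at large |x| underlies the vanishing-gradient problem:

```python
import math

def sigmoid(x):
    """Squashes input to (0, 1); saturates for large |x|."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """Squashes input to (-1, 1); zero-centred, but still saturates."""
    return math.tanh(x)

def relu(x):
    """Returns max(0, x); does not saturate for positive inputs."""
    return max(0.0, x)

def sigmoid_grad(x):
    """Derivative of the sigmoid: peaks at 0.25 and shrinks toward 0
    for large |x|, which is the source of vanishing gradients."""
    s = sigmoid(x)
    return s * (1.0 - s)

for x in (-5.0, 0.0, 5.0):
    print(f"x={x:+.1f}  sigmoid={sigmoid(x):.4f}  tanh={tanh(x):+.4f}  "
          f"relu={relu(x):.1f}  sigmoid'={sigmoid_grad(x):.4f}")
```

At x = 5 the sigmoid's gradient is already below 0.01, so stacking many sigmoid layers multiplies many such small factors together, which is why ReLU is often preferred in deep networks.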


Citation
Gayatri Khanvilkar, Deepali Vora (2018). Activation Functions and Training Algorithms for Deep Neural Network. International Journal of Computer Engineering in Research Trends, 5(4), 98-104. Retrieved from http://ijcert.org/ems/ijcert_papers/V5I402.pdf


Keywords : Deep Neural Network, Activation Functions, Vanishing Gradient, Greedy Algorithm, Dropout Algorithm
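As an illustration of the dropout algorithm named in the keywords and abstract, here is a minimal sketch of inverted dropout (the common modern formulation, in which survivors are rescaled at training time; the helper below is hypothetical, not code from the paper):

```python
import random

def dropout(activations, p_drop=0.5, training=True, rng=random):
    """Inverted dropout sketch: during training, randomly zero each unit
    with probability p_drop and rescale the survivors by 1/(1 - p_drop)
    so the expected activation is unchanged at test time."""
    if not training or p_drop == 0.0:
        return list(activations)
    keep = 1.0 - p_drop
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

# Example: roughly half the units are zeroed, the rest doubled.
out = dropout([1.0] * 10, p_drop=0.5, rng=random.Random(0))
print(out)
```

Because each forward pass samples a different mask, the network cannot rely on any single unit, which is how dropout reduces the overfitting the abstract identifies as a main training challenge.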

References
[1]	Wikipedia contributors. "Machine learning." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 24 Oct. 2017. Web. 29 Oct. 2017
[2]	Wikipedia contributors. "Deep learning." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 23 Oct. 2017. Web.
[3]	Schmidhuber, Jürgen. "Deep learning in neural networks: An overview." Neural networks 61 (2015): 85-117. 


[4]	Deeplearning4j Development Team. Deeplearning4j: Open-source distributed deep learning for the JVM, Apache Software Foundation License 2.0. http://deeplearning4j.org
[5]	Aleksander, Igor, and Helen Morton. An introduction to neural computing. Vol. 3. London: Chapman & Hall, 1990. 
[6]	"Opening up deep learning for everyone", http://www.jtoy.net/2016/02/14/opening-up-deep-learning-for-everyone.html, October 2017.
[7]	Lau, Mian Mian, and King Hann Lim. "Investigation of activation functions in the deep belief network." Control and Robotics Engineering (ICCRE), 2017 2nd International Conference on. IEEE, 2017. 
[8]	“Understanding Activation Functions in Neural Networks”, https://medium.com/the-theory-of-everything/understanding-activation-functions-in-neural-networks-9491262884e0, September 2017.
[9]	“Activation functions and its types: which is better?”, https://medium.com/towards-data-science/activation-functions-and-its-types-which-is-better-a9a5310cc8f, September 2017.
[10]	“The Vanishing Gradient Problem”, https://medium.com/@anishsingh20/the-vanishing-gradient-problem-48ae7f501257, September 2017.
[11]	Qian, Sheng, et al. "Adaptive activation functions in convolutional neural networks." Neurocomputing (2017).
[12]	Gay, M. "IBM ILOG CPLEX Optimization Studio CPLEX User’s Manual." International Business Machines Corporation 12 (2012).
[13]	Liu, Jun, Chuan-Cheng Zhao, and Zhi-Guo Ren. "The Application of Greedy Algorithm in Real Life." DEStech Transactions on Engineering and Technology Research mcee (2016). 
[14]	Wikipedia contributors. "Greedy algorithm." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 19 Apr. 2017. Web. 28 Oct. 2017. 
[15]	Wang, Jian-Guo, et al. "A mothed of improving identification accuracy via deep learning algorithm under the condition of deficient labelled data." Control Conference (CCC), 2017 36th Chinese. IEEE, 2017. 
[16]	Tong, Li, et al. "Predicting heart rejection using histopathological whole-slide imaging and deep neural network with dropout." Biomedical & Health Informatics (BHI), 2017 IEEE EMBS International Conference on. IEEE, 2017. 
[17]	Wang, Long, et al. "Wind turbine gearbox failure identification with deep neural networks." IEEE Transactions on Industrial Informatics 13.3 (2017): 1360-1368.
[18]	Ko, Byung-soo, et al. "Controlled dropout: A different approach to using dropout on deep neural network." Big Data and Smart Computing (BigComp), 2017 IEEE International Conference on. IEEE, 2017. 
[19]	McMahan, H. Brendan, et al. "Ad click prediction: a view from the trenches." Proceedings of the 19th ACM SIGKDD international conference on Knowledge discovery and data mining. ACM, 2013.


DOI Link : https://doi.org/10.22362/ijcert/2018/v5/i4/v5i402

Download :
  V5I402.pdf



Announcements


Authors are not required to pay any article-processing charges (APC) for their articles to be published open access in IJCERT. No charge is involved at any stage of the publication process, from administering peer review to copy editing and hosting the final article on dedicated servers. Publication is free for all authors.

News & Events


Latest issue: Volume 10, Issue 1 (articles in press)

A plagiarism check is implemented for all articles using the world-renowned software Turnitin.


Digital Object Identifiers have been assigned to all articles published in the Journal from the September 2016 issue (Volume 3, Issue 9, 2016) onward.


IJCERT is a member of The Publishers International Linking Association, Inc. (“PILA”). Each IJCERT article has its own unique DOI reference.
DOI Prefix: 10.22362/ijcert


Emerging Sources Citation Index (in process)


IJCERT title is under evaluation by Scopus.


Key Dates


☞  Inviting submissions for the next issue
☞  Last date of submission: 31st March 2023
☞  Submission to first decision: in 7 days
☞  Final decision: in 3 weeks from the day of submission

Important Announcements


All authors, conference coordinators, conveners, and guest editors are kindly asked to check their articles' originality before submitting them to IJCERT. If, at any stage of processing, material is found to be a duplicate submission, to have been sent to another journal while under process with IJCERT, to contain fabricated data, or to be plagiarized (cut and paste), IJCERT is bound to take the following actions:
1. Rejection of the article.
2. The author will be blocked from future communication with IJCERT if duplicate articles are submitted.
3. A letter regarding the matter will be posted to the Principal/Director of the institution where the study was conducted.
4. A list of blacklisted authors will be shared among the chief editors of other prestigious journals.
We screen articles for plagiarism with the world-renowned tool Turnitin; an article is rejected only if it is found to be plagiarized. This stern action is taken because of the unethical conduct of a handful of authors. Screening such articles and deciding on them costs the journal considerable time and resources, and directly delays the processing of genuine material.

Citation Index


Citation Indices (All)
Citations: 1026
h-index: 14
i10-index: 20
Source: Google Scholar

Acceptance Rate (By Year)


Year Rate
2021 10.8%
2020 13.6%
2019 15.9%
2018 14.5%
2017 16.6%
2016 15.8%
2015 18.2%
2014 20.6%
