International Journal of Academic Pedagogical Research (IJAPR) ISSN: 2000-004X Vol. 2 Issue 11, November – 2018, Pages: 8-13 www.ijeais.org/ijapr

ANN for English Alphabet Prediction

Hazem H. Heriz, Hamza M. Salah, Sharief B. Abu Abdu, Mohammad M. El Sbihi
Department of Information Technology, Faculty of Engineering & Information Technology, Al-Azhar University, Gaza, Palestine

Abstract: In this paper we present an Artificial Neural Network (ANN) model for predicting letters drawn from twenty different fonts. The character images were originally based on twenty different fonts, and each letter within these twenty fonts was randomly distorted to yield a file of 20,000 unique stimuli. Each stimulus was converted into 16 primitive numerical attributes (statistical moments and edge counts), which were then scaled to fit into a range of integer values from 0 to 15. We randomly selected 1,000 unique stimuli for this research, and verified that the class distribution remained the same after the selection. A neural network tool (JustNN) was used to classify each of a large number of black-and-white rectangular pixel displays as one of the 26 capital letters of the English alphabet.

Keywords: Letter Recognition, Alphabet, Artificial Neural Network, Back-Propagation Neural Network, Prediction.

1. INTRODUCTION
The Letter Image Recognition data set was originally created by David Slate in 1991. He produced the data to examine the ability of several variants of Holland-style adaptive classifier systems to learn and correctly predict the letter classes associated with vectors of 16 integer attributes extracted from raster-scan images of the letters.
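The attribute scaling described above (16 numerical attributes fitted into the 0-15 integer range) can be sketched as follows. This is an assumption-laden illustration, not Slate's actual procedure: the paper does not give the exact scaling formula, so per-attribute min-max scaling is assumed, and the function name is invented.

```python
import numpy as np

# Hypothetical sketch: scale one raw attribute column into the 0-15
# integer range, assuming per-attribute min-max scaling (the exact
# formula used by Slate is not given in the paper).
def scale_to_0_15(column):
    """Min-max scale a 1-D array of raw attribute values into 0..15."""
    col = np.asarray(column, dtype=float)
    lo, hi = col.min(), col.max()
    return np.round(15 * (col - lo) / (hi - lo)).astype(int)
```

Applied to each of the 16 attribute columns independently, this maps the smallest observed value to 0 and the largest to 15, which matches the value ranges listed in Table 1.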
Classifying each of a large number of black-and-white rectangular pixel displays as one of the 26 capital letters of the English alphabet is a significant task for everyone who uses the English language. The attributes are: x-box, horizontal position of box; y-box, vertical position of box; width, width of box; high, height of box; onpix, total number of on pixels; x-bar, mean x of on pixels in box; y-bar, mean y of on pixels in box; x2bar, mean x variance; y2bar, mean y variance; xybar, mean x-y correlation; x2ybr, mean of x*x*y; xy2br, mean of x*y*y; x-ege, mean edge count left to right; xegvy, correlation of x-ege with y; y-ege, mean edge count bottom to top; yegvx, correlation of y-ege with x. For instance, x-box is the horizontal position, counting pixels from the left edge of the image, of the midpoint of the smallest rectangular box that can be drawn with all on pixels inside it, while y-box is the vertical position of that box, counting pixels from the bottom. All feature values are integers in the range 0-15. The output takes 26 values, corresponding to the letters A through Z. Therefore, in this research, a neural network tool was used to recognize each of a large number of black-and-white rectangular pixel displays as one of the 26 capital letters of the English alphabet.

2. LITERATURE REVIEW
Abu Naser employed Artificial Neural Networks [1] and expert systems [2-3] to acquire knowledge for the learner model in the Linear Programming Intelligent Tutoring System, in order to determine the academic performance level of the learners and offer each learner linear programming problems of a suitable difficulty level. A feed-forward back-propagation algorithm
was trained with a group of learners' data to predict their academic performance. The accuracy of predicting the performance of the learners was very high, which indicates that an Artificial Neural Network is capable of making suitable predictions.

3. ARTIFICIAL NEURAL NETWORKS
The DARPA Neural Network Study (1988) defines a Neural Network (NN) as: "a neural network is a system composed of many simple processing elements operating in parallel whose function is determined by network structure, connection strengths, and the processing performed at computing elements or nodes." An NN consists of three kinds of layers: an input layer, hidden layers, and an output layer; see Figure 1 [4-24].

Figure 1: Example of Neural Network architecture

The input and output values must be numeric, preferably scaled to the (0-1) range. Therefore, if an input has a non-numeric format, we must transform it into numerical values and scale it to the (0-1) range. The weights are the performance parameters of the feed-forward neural network. The training algorithm of an Artificial Neural Network (ANN) proceeds by starting with random weights, presenting the data instance by instance, adjusting the weights according to the error for each instance, and repeating until the error becomes very small [4]. The back-propagation algorithm adjusts the weights using the difference between the actual output and the target output for each instance. Neural Networks are networks of neurons such as those found in real (i.e. biological) brains [5, 41-46]. In this study, a neural network tool, Just Neural Network (JNN), was used for predicting and identifying each of a large number of black-and-white rectangular pixel displays as one of the 26 capital letters of the English alphabet.
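The training loop just described (start with random weights, present instances one by one, adjust the weights by the per-instance error, repeat until the error is small) can be sketched for a single sigmoid unit. This is a minimal illustration, not the JNN tool itself; the function names and the OR-gate example are invented for demonstration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative sketch of the training loop described above, for one
# sigmoid unit: random initial weights, instance-by-instance updates,
# repeat until the total squared error is small.
def train(X, t, lr=0.5, tol=1e-2, max_epochs=10_000):
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])   # small random weights
    for _ in range(max_epochs):
        total_err = 0.0
        for x, target in zip(X, t):              # instance by instance
            o = sigmoid(w @ x)
            err = target - o
            w += lr * err * o * (1 - o) * x      # adjust by the error
            total_err += err * err
        if total_err < tol:                      # error became very small
            break
    return w
```

For a linearly separable target such as the logical OR of two inputs (with a constant bias input), this loop drives the unit's thresholded output to match the targets.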
The system incorporates a Neural Network with the back-propagation learning algorithm [25-30] to establish a prediction model for letter recognition, employing a feed-forward network [31-40, 48-54].

4. METHODOLOGY
4.1 The Input Variables
To construct a prediction model, a data set containing 1,000 records was trained. The data was obtained from Odesta Corporation; 1890 Maple Ave; Suite 115; Evanston, IL 60201; David J. Slate (January 1991). The sample character images were based on 20 different fonts, and each letter within these 20 fonts was randomly distorted to yield a file of 20,000 unique stimuli. Each stimulus was converted into 16 primitive numerical attributes (statistical moments and edge counts), which were then scaled to fit into a range of integer values from 0 to 15 (as shown in Table 1).

Table 1: Input variables, their meaning, and range
No. | Input Variable | Meaning | Value Range
1 | x-box | horizontal position of box | 0-15
2 | y-box | vertical position of box | 0-15
3 | width | width of box | 0-15
4 | high | height of box | 0-15
5 | onpix | number of on pixels | 0-15
6 | x-bar | mean number of on pixels in x-direction | 0-15
7 | y-bar | mean number of on pixels in y-direction | 0-15
8 | x2bar | mean x variance | 0-15
9 | y2bar | mean y variance | 0-15
10 | xybar | mean x-y correlation | 0-15
11 | x2ybar | mean of x*x*y | 0-15
12 | xy2bar | mean of x*y*y | 0-15
13 | x-ege | mean edge count, left to right | 0-15
14 | xegvy | correlation of x-ege and y | 0-15
15 | y-ege | mean edge count, top to bottom | 0-15
16 | yegvx | correlation of y-ege and x | 0-15

4.2 The Output Variables
Table 2 shows the output variable (letter), its meaning, and its value range ('A'..'Z').

Table 2: Output variable, its meaning, and range
No. | Output Variable | Meaning | Value Range
1 | letter | capital letter | 'A'-'Z'

5.
DESIGN OF THE NEURAL NETWORKS
5.1 Network Architecture
The network is a multilayer perceptron using the sigmoid activation function, with 16 neurons in the input layer, 11 neurons in the first hidden layer, 1 neuron in the second hidden layer, and 1 neuron in the output layer (as seen in Figure 2).

5.2 The Back-propagation
Here is the back-propagation algorithm that was used in training the ANN model [11-12]:

Initialize each weight w_{i,j} to some small random value.
Until the termination condition is met, do:
  For each training example <(x_1, ..., x_n), t>, do:
    Input the instance (x_1, ..., x_n) to the network and compute the network outputs o_k.
    For each output unit k: δ_k = o_k(1 − o_k)(t_k − o_k)
    For each hidden unit h: δ_h = o_h(1 − o_h) Σ_k w_{h,k} δ_k
    For each network weight w_{i,j}, do: w_{i,j} = w_{i,j} + Δw_{i,j}, where Δw_{i,j} = η δ_j x_{i,j} and η is the learning rate.

Figure 2: ANN final architecture

6. Evaluation of Artificial Neural Network
The Alphabet Letters data set consists of 19,990 samples. We divided it into a training set of 14,900 samples (about 75%) and a validation set of 4,999 samples (about 25%). We trained the ANN model using the Just Neural Network (JNN) environment. Before training the ANN model with the training data set, we prepared the data by normalizing it using the min-max normalization method. We then trained the ANN model on the training set and validated it on the validation set. We determined the most important factors affecting the ANN model (as seen in Figure 3) and obtained an accuracy of 85.08% (as shown in Figure 4). The number of training cycles was 2,836.
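The update rules in the pseudocode above can be sketched in NumPy for a single hidden layer of sigmoid units. This is an illustrative implementation, not the JNN tool: the function and variable names (backprop_step, delta_k, delta_h) are invented, and the 16-input, 26-output dimensions used in the example merely echo the data set's shape.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One back-propagation step for a single training example, following
# the delta rules in the pseudocode above. Weights are updated in place.
def backprop_step(x, t, W_ih, W_ho, eta=0.1):
    h = sigmoid(W_ih @ x)                        # hidden outputs o_h
    o = sigmoid(W_ho @ h)                        # network outputs o_k
    delta_k = o * (1 - o) * (t - o)              # output-unit deltas
    delta_h = h * (1 - h) * (W_ho.T @ delta_k)   # hidden-unit deltas
    W_ho += eta * np.outer(delta_k, h)           # Δw = η δ_k o_h
    W_ih += eta * np.outer(delta_h, x)           # Δw = η δ_h x_i
    return o
```

Repeating this step over the training examples until the error becomes small is exactly the outer loop of the pseudocode.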
Figure 3: Most influential factors in the ANN model

Figure 4: Training and validating of the ANN model

7. Conclusion
In this study, a neural network model was developed for the purpose of predicting the 26 alphabet letters. The results show that the developed system can be used as a tool to assist in identifying each of a large number of black-and-white rectangular pixel displays as one of the 26 capital letters of the English alphabet. All 26 classes have almost the same number of samples. The accuracy of the model in the evaluation was 85.08%.

References
1. Abu-Naser, S. S. (2012). "Predicting Learners Performance Using Artificial Neural Networks in Linear Programming Intelligent Tutoring System." International Journal of Artificial Intelligence & Applications 3(2): 65.
2. Afana, M., et al. (2018). "Artificial Neural Network for Forecasting Car Mileage per Gallon in the City." International Journal of Advanced Science and Technology 124: 51-59.
3. Ahmed, A., et al. (2019). "Knowledge-Based Systems Survey." International Journal of Academic Engineering Research (IJAER) 3(7): 1-22.
4. Alajrami, E., et al. (2019). "Blood Donation Prediction using Artificial Neural Network." Blood 3(10): 1-7.
5. Alghoul, A., et al. (2018). "Email Classification Using Artificial Neural Network." International Journal of Academic Engineering Research (IJAER) 2(11): 8-14.
6. Alkronz, E. S., et al. (2019). "Prediction of Whether Mushroom is Edible or Poisonous Using Back-propagation Neural Network." International Journal of Academic and Applied Research (IJAAR) 3(2): 1-8.
7. Al-Massri, R., et al. (2018). "Classification Prediction of SBRCTs Cancers Using Artificial Neural Network." International Journal of Academic Engineering Research (IJAER) 2(11).
8. Al-Mubayyed, O. M., et al. (2019).
"Predicting Overall Car Performance Using Artificial Neural Network." International Journal of Academic and Applied Research (IJAAR) 3(1): 1-5.
9. Al-Shawwa, M. and S. S. Abu-Naser (2019). "Predicting Birth Weight Using Artificial Neural Network." International Journal of Academic Health and Medical Research (IJAHMR) 3(1): 9-14.
10. Al-Shawwa, M. and S. S. Abu-Naser (2019). "Predicting Effect of Oxygen Consumption of Thylakoid Membranes (Chloroplasts) from Spinach after Inhibition Using Artificial Neural Network." International Journal of Academic Engineering Research (IJAER) 3(2): 15-20.
11. Al-Shawwa, M., et al. (2018). "Predicting Temperature and Humidity in the Surrounding Environment Using Artificial Neural Network." International Journal of Academic Pedagogical Research (IJAPR) 2(9): 1-6.
12. Anderson, J., et al. (2005). "Adaptation of Problem Presentation and Feedback in an Intelligent Mathematics Tutor." Information Technology Journal 5(5): 167-207.
13. Ashqar, B. A. M. and S. S. Abu-Naser (2019). "Identifying Images of Invasive Hydrangea Using Pre-Trained Deep Convolutional Neural Networks." International Journal of Academic Engineering Research (IJAER) 3(3): 28-36.
14. Ashqar, B. A. M. and S. S. Abu-Naser (2019). "Image-Based Tomato Leaves Diseases Detection Using Deep Learning." International Journal of Academic Engineering Research (IJAER) 2(12): 10-16.
15. Ashqar, B. A., et al. (2019). "Plant Seedlings Classification Using Deep Learning." International Journal of Academic Information Systems Research (IJAISR) 3(1): 7-14.
16. Elzamly, A., et al. (2015). "Predicting Software Analysis Process Risks Using Linear Stepwise Discriminant Analysis: Statistical Methods." Int. J. Adv. Inf. Sci. Technol 38(38): 108-115.
17. Elzamly, A., et al. (2016). "A New Conceptual Framework Modelling for Cloud Computing Risk Management in Banking Organizations." International Journal of Grid and Distributed Computing 9(9): 137-154.
18. Elzamly, A., et al. (2017).
"Predicting Critical Cloud Computing Security Issues using Artificial Neural Network (ANNs) Algorithms in Banking Organizations." International Journal of Information Technology and Electrical Engineering 6(2): 40-45.
19. Elzamly, A., et al. (2019). "Critical Cloud Computing Risks for Banking Organizations: Issues and Challenges." Religación. Revista de Ciencias Sociales y Humanidades 4(18): 673-682.
20. Heriz, H. H., et al. (2018). "English Alphabet Prediction Using Artificial Neural Networks." International Journal of Academic Pedagogical Research (IJAPR) 2(11): 8-14.
21. Jamala, M. N. and S. S. Abu-Naser (2018). "Predicting MPG for Automobile Using Artificial Neural Network Analysis." International Journal of Academic Information Systems Research (IJAISR) 2(10): 5-21.
22. Kashf, D. W. A., et al. (2018). "Predicting DNA Lung Cancer using Artificial Neural Network." International Journal of Academic Pedagogical Research (IJAPR) 2(10): 6-13.
23. Khalil, A. J., et al. (2019). "Energy Efficiency Predicting using Artificial Neural Network." International Journal of Academic Pedagogical Research (IJAPR) 3(9): 1-8.
24. Li, L., et al. (2011). "Hybrid Quantum-inspired Genetic Algorithm for Extracting Association Rule in Data Mining." Information Technology Journal 12(4): 1437-1441.
25. Abu Naser, S. S., et al. (2016). "Design and Development of Mobile University Student Guide." Journal of Multidisciplinary Engineering Science Studies (JMESS) 2(1): 193-197.
26. Abu Naser, S. S., et al. (2016). "Design and Development of Mobile Blood Donor Tracker." Journal of
27. Marouf, A. and S. S. Abu-Naser (2018). "Predicting Antibiotic Susceptibility Using Artificial Neural Network." International Journal of Academic Pedagogical Research (IJAPR) 2(10): 1-5.
28. Masri, N., et al. (2019). "Survey of Rule-Based Systems." International Journal of Academic Information Systems Research (IJAISR) 3(7): 1-23.
29. Nasser, I. M. and S. S. Abu-Naser (2019).
"Artificial Neural Network for Predicting Animals Category." International Journal of Academic and Applied Research (IJAAR) 3(2): 18-24.
30. Nasser, I. M. and S. S. Abu-Naser (2019). "Lung Cancer Detection Using Artificial Neural Network." International Journal of Engineering and Information Systems (IJEAIS) 3(3): 17-23.
31. Nasser, I. M. and S. S. Abu-Naser (2019). "Predicting Books' Overall Rating Using Artificial Neural Network." International Journal of Academic Engineering Research (IJAER) 3(8): 11-17.
32. Nasser, I. M. and S. S. Abu-Naser (2019). "Predicting Tumor Category Using Artificial Neural Networks." International Journal of Academic Health and Medical Research (IJAHMR) 3(2): 1-7.
33. Nasser, I. M., et al. (2019). "A Proposed Artificial Neural Network for Predicting Movies Rates Category." International Journal of Academic Engineering Research (IJAER) 3(2): 21-25.
34. Nasser, I. M., et al. (2019). "Artificial Neural Network for Diagnose Autism Spectrum Disorder." International Journal of Academic Information Systems Research (IJAISR) 3(2): 27-32.
35. Nasser, I. M., et al. (2019). "Developing Artificial Neural Network for Predicting Mobile Phone Price Range." International Journal of Academic Information Systems Research (IJAISR) 3(2): 1-6.
36. Ng, S., et al. (2010). "Ad hoc Networks Based on Rough Set Distance Learning Method." Information Technology Journal 10(9): 239-251.
37. Owaied, H. H., et al. (2009). "Using Rules to Support Case-Based Reasoning for Harmonizing Melodies." Journal of Applied Sciences 11(14): 31-41.
38. Sadek, R. M., et al. (2019). "Parkinson's Disease Prediction Using Artificial Neural Network." International Journal of Academic Health and Medical Research (IJAHMR) 3(1): 1-8.
39. Salah, M., et al. (2018). "Predicting Medical Expenses Using Artificial Neural Network." International Journal of Engineering and Information Systems (IJEAIS) 2(20): 11-17.
40. Sulisel, O., et al. (2005).
"Growth and Maturity of Intelligent Tutoring Systems." Information Technology Journal 7(7): 9-37.
41. Zaqout, I., et al. (2015). "Predicting Student Performance Using Artificial Neural Network: in the Faculty of Engineering and Information Technology." International Journal of Hybrid Information Technology 8(2): 221-228.
42. Abu-Nasser, B. (2017). "Medical Expert Systems Survey." International Journal of Engineering and Information Systems (IJEAIS) 1(7): 218-224.
43. Abu-Nasser, B. S. and S. S. Abu-Naser (2018). "Cognitive System for Helping Farmers in Diagnosing Watermelon Diseases." International Journal of Academic Information Systems Research (IJAISR) 2(7): 1-7.
44. Abu-Nasser, B. S. and S. S. Abu Naser (2018). "Rule-Based System for Watermelon Diseases and Treatment." International Journal of Academic Information Systems Research (IJAISR) 2(7): 1-7.
45. Baker, J., et al., and Heller, R. (1996). "Information Visualization." Information Technology Journal 7(2).
46. Baker, J., et al. (1996). "Information Visualization." Information Technology Journal 7(2): 403-404.
47. Baraka, M. H., et al. (2008). "A Proposed Expert System For Guiding Freshman Students In Selecting A Major In Al-Azhar University, Gaza." Journal of Theoretical & Applied Information Technology 4(9).
48. Barhoom, A. M., et al. (2019). "Predicting Titanic Survivors using Artificial Neural Network." International Journal of Academic Engineering Research (IJAER) 3(9): 8-12.
49. Chand, P. S., et al. (2008). "MADAMS: Mining and Acquisition of Data by ANT-MINER Samples." Journal of Theoretical & Applied Information Technology 4(10).
50. Chen, R.-S., et al. (2008). "Evaluating Structural Equation Models with Unobservable Variables and Measurement Error." Information Technology Journal 10(2): 1055-1060.
51. Dalffa, M. A., et al. (2019). "Tic-Tac-Toe Learning Using Artificial Neural Networks." International Journal of Engineering and Information Systems (IJEAIS) 3(2): 9-19.
52.
El_Jerjawi, N. S. and S. S. Abu-Naser (2018). "Diabetes Prediction Using Artificial Neural Network." International Journal of Advanced Science and Technology 121: 55-64.
53. El-Khatib, M. J., et al. (2019). "Glass Classification Using Artificial Neural Network." International Journal of Academic Pedagogical Research (IJAPR) 3(2): 25-31.
54. Elzamly, A., et al. (2015). "Classification of Software Risks with Discriminant Analysis Techniques in Software Planning Development Process." International Journal of Advanced Science and Technology 81: 35-48.
55. Naser, S. S. A. "Top 10 Neural Network Papers: Recommended Reading – Artificial Intelligence Research."