Computer vision transforms data captured or generated by a webcam into another form for the purpose of decision-making. All kinds of transformations are carried out to attain specific aims. One supporting technique for implementing computer vision in a system is digital image processing, whose objective is to transform a digitally formatted picture so that it can be processed by a computer. Computer vision and digital image processing can be implemented in a system for capital-letter recognition and real-time reading of handwriting on a whiteboard, supported by an artificial neural network model, the perceptron algorithm, used as the learning technique through which the system learns to recognize the letters. The system works by capturing the letter pattern with a webcam, generating a continuous image that is converted into digital form and processed with several techniques such as grayscale conversion, thresholding, and image cropping.
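The thresholding step and the perceptron learning rule named above can be sketched in a few lines of Python. This is a minimal illustration only: the 1-D pixel vectors, threshold value, learning rate, and epoch count are invented assumptions, not the system's actual data or parameters.

```python
# Thresholding: dark ink pixels become 1, background becomes 0.
# The threshold value 128 is an illustrative assumption.
def binarize(gray_pixels, threshold=128):
    return [1 if p < threshold else 0 for p in gray_pixels]

# Step activation: fire if the weighted sum plus bias exceeds zero.
def predict(weights, bias, x):
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s > 0 else 0

# Perceptron rule: w <- w + lr * (target - output) * x
def train(samples, labels, lr=0.1, epochs=50):
    n = len(samples[0])
    weights, bias = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, target in zip(samples, labels):
            error = target - predict(weights, bias, x)
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias
```

In practice each letter image would be binarized and flattened into one such vector before training.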
A database is an important part of a system: it stores the data to be manipulated. A language called SQL (Structured Query Language) is used to manipulate those data into the information needed. Two types of error make SQL more difficult in practical use: syntax errors and logic errors. The difference between them is that a syntax error is detected by the compiler, so it is easy to learn from its warning, whereas the compiler shows no warning when a logic error occurs. This makes logic errors more difficult to understand than syntax errors. To help database users learn SQL in practice, a web-based SQL compiler able to detect both syntax and logic errors was developed using the Start End Mid algorithm.
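The syntax-error versus logic-error distinction can be demonstrated with Python's built-in sqlite3 module. The table and queries below are invented examples, not part of the developed compiler.

```python
# Demonstrate why logic errors are harder to catch than syntax errors.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (item TEXT, qty INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("pen", 5), ("book", 2), ("pen", 3)])

# Syntax error: the SQL engine rejects it immediately with a warning.
try:
    con.execute("SELEC item FROM sales")  # misspelled keyword
    syntax_caught = False
except sqlite3.OperationalError:
    syntax_caught = True

# Logic error: perfectly valid SQL, so no warning is raised --
# but the result is wrong. Intended: total qty of pens (8).
wrong = con.execute("SELECT SUM(qty) FROM sales").fetchone()[0]
right = con.execute(
    "SELECT SUM(qty) FROM sales WHERE item = 'pen'").fetchone()[0]
```

The first query fails loudly; the second pair both succeed, and only comparing the results against the intended meaning reveals the mistake.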
This paper presents an approach to network traffic characterization using statistical techniques: decomposition, Winter's exponential smoothing, and the autoregressive integrated moving average (ARIMA). The decomposition and Winter's exponential smoothing techniques were applied with both additive and multiplicative models, while the ARIMA models were built following the Box-Jenkins methodology. The results show that ARIMA(1,0,2) is the best model for forecasting the internet network traffic.
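The additive form of Winter's exponential smoothing mentioned above can be sketched in pure Python. The smoothing constants, initialization scheme, and toy series are illustrative assumptions, not the paper's fitted parameters.

```python
# Additive Holt-Winters (Winter's exponential smoothing) sketch.
# alpha, beta, gamma are the level, trend, and seasonal smoothing constants.
def holt_winters_additive(series, season_len, alpha=0.5, beta=0.3, gamma=0.2):
    # Simple initialization from the first two seasonal cycles.
    level = sum(series[:season_len]) / season_len
    trend = (sum(series[season_len:2 * season_len]) -
             sum(series[:season_len])) / season_len ** 2
    seasonal = [series[i] - level for i in range(season_len)]
    fitted = []
    for t, y in enumerate(series):
        s = seasonal[t % season_len]
        fitted.append(level + trend + s)
        last_level = level
        level = alpha * (y - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonal[t % season_len] = gamma * (y - level) + (1 - gamma) * s
    # One-step-ahead forecast after the last observation.
    forecast = level + trend + seasonal[len(series) % season_len]
    return fitted, forecast
```

The multiplicative variant divides by the seasonal component instead of subtracting it; ARIMA fitting itself is normally delegated to a statistics package rather than hand-coded.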
Decision-making to determine which working units should be prioritized for development, in order to improve fishery monitoring in FMA 711 (WPP-711), is imperative. The Ministry of Maritime Affairs and Fisheries must avoid mismatched decisions through long-term calculation and analysis. Determining the priority of working units is a complex problem, so an appropriate method is required to avoid a mismatched decision. TOPSIS is a decision-making method capable of solving multi-criteria problems. Its working principle selects the alternative with the shortest distance from the positive ideal solution and the furthest distance from the negative ideal solution. To improve the performance of TOPSIS, this research integrates it with fuzzy logic in order to assign the right numeric preference values. From the test of 11 alternatives against 6 criteria, the development priorities for fishery monitoring in FMA 711 are: Pontianak Working Unit = 0.917, Batam Working Unit = 0.791, Natuna Working Unit = 0.685, and Tanjung Pinang Working Unit = 0.607. The ranking result will then be used as the basis for determining the strategy for increasing the monitoring of WPP-711, to minimize state losses due to illegal fishing within Indonesia's WPP-711 regions.
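The TOPSIS working principle described above can be sketched compactly. The 3-alternative, 2-criterion matrix and equal weights below are invented illustrations, not the paper's 11-alternative, 6-criterion data, and the fuzzy preference step is omitted.

```python
# Classic (crisp) TOPSIS: rank alternatives by relative closeness to the
# positive ideal solution and distance from the negative ideal solution.
from math import sqrt

def topsis(matrix, weights, benefit):
    ncols = len(weights)
    # 1. Vector-normalize each column, then apply the criterion weights.
    norms = [sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)]
         for row in matrix]
    # 2. Positive ideal = best value per criterion; negative ideal = worst.
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    # 3. Closeness = d(negative) / (d(positive) + d(negative)); higher is better.
    scores = []
    for row in v:
        d_pos = sqrt(sum((x - i) ** 2 for x, i in zip(row, ideal)))
        d_neg = sqrt(sum((x - a) ** 2 for x, a in zip(row, anti)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores
```

Each score lies between 0 and 1, which is why the working-unit priorities in the abstract (0.917, 0.791, ...) can be read directly as a ranking.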
Transaction data are records generated by sales and purchase activities at a particular company. In recent years, transaction data have been widely used as research objects for discovering new information. One possible approach is to design an application that can analyze the existing transaction data by performing market basket analysis. The application is designed to be desktop-based, with components able to process as well as re-log the existing transaction data. The method used to design the application follows the established steps of the data mining technique. The trial results showed that the development and implementation of a market basket analysis application using the association rule method with the Apriori algorithm worked well: with a confidence value of 46.69% and a support value of 1.78%, 30 rules were generated.
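The support and confidence measures at the heart of Apriori-based association rules can be illustrated as follows. The four toy transactions and the 0.5 threshold are invented examples, unrelated to the paper's 1.78%/46.69% results.

```python
# Support and confidence, the two measures Apriori uses to prune rules.
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk"},
]

def support(itemset):
    # Fraction of transactions containing the whole itemset.
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    # Estimated P(consequent | antecedent).
    return support(antecedent | consequent) / support(antecedent)

# Frequent 2-itemsets above a minimum support threshold.
items = sorted(set().union(*transactions))
min_support = 0.5
frequent = [set(c) for c in combinations(items, 2)
            if support(set(c)) >= min_support]
```

Apriori then extends frequent itemsets level by level, exploiting the fact that every subset of a frequent itemset must itself be frequent.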
Climate change is expected to change people's livelihoods in significant ways. Several vulnerability factors and readiness factors are used to measure a country's prediction index, indicating how vulnerable that country is to global change. Primary data were collected from the University of Notre Dame Global Adaptation Index (ND-GAIN). The data were trained for forecasting purposes, supported by validated statistical analysis, and the summary of the predicted index is visualized using machine learning tools. The results establish the correlation between vulnerability and readiness factors and show the stability of each country in the face of climate change. The framework is applied to synthesize findings from prediction index studies on vulnerability to climate change in South East Asia.
A signature is a special form of handwriting used in the human identification process. The current identification process is highly ineffective: people have to manually compare signatures with previously stored data. This study proposes the Kohonen self-organizing map (SOM) algorithm as the method for signature pattern recognition; this method is able to visualize high-dimensional data. Image processing methods are used in the data pre-processing phase. The accuracy of the Kohonen SOM was 70%, indicating that the method used was good enough for pattern recognition.
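The core Kohonen SOM step, finding the best-matching unit and pulling its grid neighborhood toward the input, can be sketched briefly. The 4x4 grid, 3-dimensional inputs, learning rate, and neighborhood radius are illustrative assumptions; real signature vectors would be far higher-dimensional.

```python
# Minimal Kohonen self-organizing map training sketch.
import random

random.seed(0)
GRID, DIM = 4, 3  # 4x4 map of 3-dimensional weight vectors
weights = [[[random.random() for _ in range(DIM)]
            for _ in range(GRID)] for _ in range(GRID)]

def bmu(x):
    # Best-matching unit: the node whose weight vector is closest to x.
    return min(((i, j) for i in range(GRID) for j in range(GRID)),
               key=lambda ij: sum((weights[ij[0]][ij[1]][k] - x[k]) ** 2
                                  for k in range(DIM)))

def train_step(x, lr=0.3, radius=1):
    bi, bj = bmu(x)
    for i in range(GRID):
        for j in range(GRID):
            if abs(i - bi) <= radius and abs(j - bj) <= radius:
                # Move neighborhood weights toward the input vector.
                weights[i][j] = [w + lr * (xk - w)
                                 for w, xk in zip(weights[i][j], x)]

for _ in range(100):
    train_step([random.random() for _ in range(DIM)])
```

After training, similar inputs map to nearby grid cells, which is what makes the SOM useful for visualizing high-dimensional signature features on a 2-D map.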
Higher levels of image processing usually involve some kind of classification or recognition, and digit classification is an important subfield of handwriting recognition. Handwritten digits are characterized by large variations, so template matching is in general inefficient and low in accuracy. In this paper, we propose classifying the year digits of relic inscriptions from the Majapahit Kingdom using a Support Vector Machine (SVM). This method is able to cope with very large feature dimensions without reducing the extracted features. Feature extraction uses the Gray-Level Co-occurrence Matrix (GLCM), which is well suited to texture analysis. The experiment is divided into 10 classes, one for each digit from 0 to 9. Each class is tested with 10 samples, so the whole test set comprises 100 year-digit images. The combination of the GLCM and SVM methods obtained an average classification accuracy of about 77%.
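The GLCM feature-extraction step can be illustrated on a tiny image. The 4x4 image, 4 gray levels, and horizontal one-pixel offset below are invented examples; the SVM classifier itself would normally come from a machine learning library.

```python
# Gray-Level Co-occurrence Matrix for a single pixel offset (dx, dy),
# plus one common Haralick texture feature (contrast).
def glcm(image, levels, dx=1, dy=0):
    m = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                # Count each (gray level, neighbor gray level) pair.
                m[image[r][c]][image[r2][c2]] += 1
    return m

def contrast(m):
    # Contrast = sum over (i, j) of (i - j)^2 * p(i, j).
    total = sum(sum(row) for row in m)
    return sum((i - j) ** 2 * m[i][j] / total
               for i in range(len(m)) for j in range(len(m)))
```

Several such features (contrast, energy, homogeneity, correlation), computed at a few offsets and angles, form the fixed-length vector fed to the SVM.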
Data centers are fundamental pieces of network and computing infrastructure, and today more than ever they are relevant, since they support the processing, analysis, and assurance of the data generated in the network and by applications in the cloud, a volume that grows every day thanks to technologies such as the Internet of Things, virtualization, and cloud computing, among others. Precisely the management of this large volume of information makes data centers consume a great deal of energy, generating great concern among owners and administrators. Green data centers offer a solution to this problem, reducing the environmental impact of data centers through their monitoring and control. Metrics are the tools that allow us to measure, in our case, the energy efficiency of the data center and to evaluate whether it is friendly to the environment. These metrics will be applied to the data centers of the ITSA University Institution, Barranquilla and Soledad campuses, and their results will be analyzed. In previous research, the most common metric (PUE) was analyzed to measure the efficiency of the data centers and to verify whether the university's data center is friendly to the environment. It is planned to extend this study with an analysis of several metrics, to conclude which is the most effective and which allows defining the guidelines to update or convert the data center into an environmentally friendly one.
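The PUE metric mentioned above is a simple ratio; the kWh figures in the example are invented for illustration.

```python
# Power Usage Effectiveness: total facility energy divided by the energy
# consumed by IT equipment alone. A value of 1.0 would mean every watt
# goes to IT; the overhead (cooling, lighting, power distribution) pushes
# real values higher.
def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

example = pue(total_facility_kwh=180_000, it_equipment_kwh=100_000)
```

A PUE of 1.8 here would mean that for every kWh delivered to IT equipment, a further 0.8 kWh is spent on facility overhead, which is exactly the kind of overhead that green data center measures aim to reduce.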
This research focused on the integration of fuzzy set theory with the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) to choose the optimum maritime security policy for achieving Indonesia's recognition as the world's maritime axis. The method used is AHP with a fuzzy-based enhancement. Here, the weight of each criterion is calculated to overcome criticism of the unbalanced rating scale, uncertainty, and inaccuracy in the pairwise comparison process. The best recommendation for Indonesian maritime policy is a multi-task single agency, which is greatly influenced by several factors such as technology, regulation, infrastructure, economics, politics, and socio-culture. The findings show that the hybrid approach is able to produce the best recommendation for Indonesian maritime security policy.
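The criterion-weighting step of AHP can be sketched with the row geometric mean method on a crisp pairwise comparison matrix. This is a simplified illustration: the 3x3 matrix is an invented, perfectly consistent example, and the fuzzy enhancement (triangular fuzzy comparison values and defuzzification) used in the research is omitted.

```python
# AHP criterion weights via the row geometric mean of a pairwise
# comparison matrix, normalized to sum to 1.
from math import prod

def ahp_weights(pairwise):
    n = len(pairwise)
    gmeans = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# a_ij = relative importance of criterion i over criterion j,
# with a_ji = 1 / a_ij on the Saaty 1-9 scale.
matrix = [[1.0, 2.0, 4.0],
          [0.5, 1.0, 2.0],
          [0.25, 0.5, 1.0]]
weights = ahp_weights(matrix)
```

The resulting weights then feed the TOPSIS ranking of the alternative policies; the fuzzy variant replaces each crisp a_ij with a fuzzy number to capture judgment uncertainty.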