Aditya Polumetla, in partial fulfillment of the requirements for the degree of Master of Science



Chapter 5
Related Work

Machine learning methods have been used in various fields such as bioinformatics, natural language processing and speech recognition. In this chapter we discuss work related to ours that uses RWIS technology. We then discuss work done using ML methods in areas related to weather data modeling and forecasting, and time series prediction.


5.1 Using RWIS sensors
The Road Weather Management Program of the Federal Highway Administration (FHA), along with the National Weather Service (NWS), sponsors research projects that use weather information obtained from RWIS sensors for roadway maintenance and related operations. These projects are described in a technical report by FHA [2004].
Knight et al. [2004] describe the advantages and difficulties of integrating RWIS, AWOS and Automated Surface Observation System (ASOS) sensors into a mesonet, which can be used for weather forecasting. They discuss the benefits of including data from different networks operating in the same region in weather forecasting. Our work differs from theirs in that we focus on predicting individual sensors, rather than trying to produce a model for an entire network of sensors.
Gallus et al. [2004] use Artificial Neural Networks (ANNs) to develop a time series model for predicting frost conditions on roadways. From 180 different weather variables, they found the 8 to 10 most important for prediction by neural networks, and used these to predict the presence of frost. Frost predictions using RWIS data with ANNs did not provide good results, which was attributed to the methods used by RWIS in collecting frost information. They also compare data from the RWIS and AWOS networks and report bias in the values reported by the sensors, which they attribute to the siting of these sensors. In our work we use the Multilayer Perceptron algorithm to predict continuous variables. We include previous hours' information in the dataset, which improves the accuracy of our results. We found some correlation between RWIS and AWOS sites that are grouped together; the bias found by Gallus et al. may be attributable to the location of the sensors. They used sensors in Iowa, while we used sensors in Minnesota, which lie in different climatological regimes.
5.2 Weather Data Modeling and Forecasting using Machine Learning Algorithms
Hall et al. [1999] use ANNs to predict the probability of precipitation and the amount of precipitation. They include weather-related variables measured both at ground level and in the upper atmosphere in the dataset, along with observed rainfall information. They build two different network models, one for the probability of precipitation and one for the amount. They report a change in the variables significant for predicting precipitation between cold and warm seasons, which led to the development of a separate model for each season. Their network models allow users to change input variables in cases where the inclusion of some variables causes performance to drop, allowing year-round use of the model and use at different locations. They find that precipitation occurs when the precipitation probability predicted by the model is above 38.5%. They report high accuracy in the predictions made by the neural networks, for both probability and amount of precipitation. In our work, we use a single model whose predictions are independent of the time from which the input is derived. As indicators of model accuracy for predicting precipitation type, we use the percentage of instances classified correctly and incorrectly, both when precipitation is present and when it is not.
Schizas et al. [1991] use artificial neural networks to predict the minimum temperature of a day, taking temperature, wind, visibility and the previous day's minimum temperature as inputs. Their best results are obtained by a network with two hidden layers and 40 output units, with each output unit representing a temperature range. Gain and momentum were chosen to be 0.01 and 0.9, respectively. Their network predicts with an accuracy of 68% at a temperature confidence range of ±3°C. In our work we use previous hours' temperature values together with the current temperature to improve the prediction process. We use the MultiLayer Perceptron algorithm to predict hourly temperature values. The network we build has the number of hidden units determined by the number of attributes and the number of classes.
Park et al. [1991] use artificial neural networks to forecast electric load in order to detect grid failures, combining time series methods with artificial neural networks. They build the network using past and present information on load and weather variables. Each network is built to trace load patterns, detect errors and recognize different load conditions. They build different networks by varying the number of hidden layers (using 1, 5 and 10) and by using load together with different forms of temperature, such as average, peak, low and past temperature values. They report the best results when using the past 2 days' information on a network with 10 hidden units. Our work differs from theirs in that the number of hidden units used by the model is fixed: we use the average of the number of attributes and the number of classes. We use the temperature values in the same form as reported by the RWIS sensors.
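The fixed hidden-layer sizing we use can be sketched in a few lines (the function name is ours, for illustration; it mirrors the averaging rule described above):

```python
def default_hidden_units(num_attributes: int, num_classes: int) -> int:
    # Average of the attribute count and the class count,
    # as used to size the hidden layer of our MLP models.
    return (num_attributes + num_classes) // 2

# e.g. 12 input attributes and 4 temperature classes -> 8 hidden units
```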
Cano et al. [2004] use Bayesian networks to predict rainfall and wind conditions from a data mining point of view. Their weather data was collected from a set of 100 stations arranged in a grid. They construct the Bayesian network using historical data obtained from the sites and use it to predict present conditions. Each site used for modeling is taken as a node in the Bayesian network. They use the K2 learning method to train the Bayesian network. To improve the search criterion, the parents of a node in the network are allowed to include nodes that have a climatic similarity with it. They report improvement in the efficiency of the search process and better results when this condition is applied. Our work differs from theirs in that we use the K2 search algorithm without giving it any prior knowledge for building the network. This allows the network to identify patterns that are not seen or measurable.
McMillan et al. [2005] build a Bayesian hierarchical regime-switching model to describe the behavior of ozone over space and time. The model uses the relationship between ozone and weather conditions, such as temperature, wind speed and wind direction, to estimate spatial fields of ozone and to forecast ozone levels. The weather conditions are treated as fixed and known, and the changes in the ozone field are treated as first-order Markov in time. The models are used to detect ozone levels from given meteorological conditions and to capture key ozone behaviors, including the various dependencies of ozone on meteorological factors. We use Bayesian models to predict temperature, precipitation type and visibility at an RWIS sensor. We use Hidden Markov Models to predict temperature class values by taking the data in the form of a time series, with each sequence consisting of hourly temperature class values.
Lau and Weng [1995] apply the wavelet transform to climate time series analysis. Wavelet transforms are used to study non-stationary processes occurring over a period of time. They applied a wavelet transform to study variations in global ice volume and temperature data, and provide a tutorial on the use of the wavelet transform in weather-related time series domains. In our work we use Hidden Markov Models when the data is formatted as sequences in a time series.


5.3 Time Series Prediction using HMMs
Bellone et al. [2000] use a non-homogeneous HMM to predict precipitation amounts during winter months. They use weather data consisting of precipitation amounts, daily geopotential height, temperature, atmospheric pressure and relative humidity from different locations in the Washington State area to build and evaluate the model. Their model uses 6 states, where each state corresponds to a different amount of precipitation, and assumes precipitation occurrence to be conditionally spatially independent. In the HMM we developed to predict precipitation type, we use temperature information along with precipitation type as inputs. Our work differs from theirs in that, to predict a variable such as precipitation type for a site, we use precipitation type information from nearby sites; we do not include any other variable when predicting a particular variable. Our model has 24 states, and the symbols observed at each state are used for predicting the output values. We modify the Viterbi algorithm to use the symbols emitted at each state in the most probable path for a given sequence to determine the predicted value.
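The modification described above builds on standard Viterbi decoding, which recovers the most probable hidden-state path. A minimal sketch of the unmodified algorithm, assuming a discrete-emission HMM and NumPy arrays (variable names are ours):

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Most probable hidden-state path for a discrete-emission HMM.

    pi:  initial state probabilities, shape (n_states,)
    A:   transition probabilities,    shape (n_states, n_states)
    B:   emission probabilities,      shape (n_states, n_symbols)
    obs: list of observed symbol indices
    """
    n, T = len(pi), len(obs)
    delta = np.zeros((T, n))           # best path probability ending in each state
    psi = np.zeros((T, n), dtype=int)  # back-pointers to the best predecessor
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        for j in range(n):
            scores = delta[t - 1] * A[:, j]
            psi[t, j] = int(np.argmax(scores))
            delta[t, j] = scores[psi[t, j]] * B[j, obs[t]]
    # Backtrack from the most probable final state.
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]
```

Our modification would additionally read off the symbols emitted along this path to produce the predicted values.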
Zhang [2004] uses HMMs to predict and analyze financial time series, which are sequences of prices of financial entities, such as stocks, observed over a period of time at a stock market. Information from other stock markets is used to help in the prediction process. An HMM is used to predict the S&P 500 Index. The general EM algorithm is modified into an exponentially weighted EM algorithm to give more emphasis to recent data. The HMM developed performs better than the top mutual funds and neural networks. In our work we use information from surrounding sites to help in predicting values at a site.

Chapter 6
Future Work
In this thesis, we build predictive models using both machine learning (ML) algorithms and Hidden Markov Models (HMMs). These models are used to predict weather conditions, and the predictions are compared with actual values to detect RWIS sensor malfunctions. In this chapter we discuss some possible improvements to the models and to the general approach. Some obvious areas for potential improvement are using different ML methods to build predictive models, using larger datasets, including sites with micro-climates, and developing malfunction models.
Many different algorithmic approaches exist in the field of machine learning. The use of other algorithms, such as Kalman filters, for weather data modeling could be explored. Kalman filters [Harvey, 1989] are used to estimate the state of a dynamic system from data provided about the system. They are, in some sense, a regression version of HMMs: the hidden state variables in Kalman filters are continuous, making the state sequence a sequence of real numbers or of real-valued vectors. Kalman filters use linear operators with added noise to generate hidden states and outputs. They deal with linear systems and can also be extended to non-linear problems. Weather variables such as temperature can be used directly with Kalman filters (temperature had to be discretized for use in HMMs), as the state variables are continuous. Thus, a sequence of daily temperatures obtained over a duration of time can be used to build predictive models with Kalman filters.
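The Kalman-filter idea can be illustrated with a minimal one-dimensional sketch, assuming a random-walk model of temperature with made-up noise parameters; this illustrates the technique only and is not a model developed in this thesis:

```python
def kalman_1d(observations, q=0.1, r=1.0):
    """Minimal 1-D Kalman filter: the hidden state is the true temperature,
    with random-walk dynamics (process noise q) and observation noise r.
    Returns the filtered state estimate after each observation."""
    x, p = observations[0], 1.0    # initial state estimate and its variance
    estimates = [x]
    for z in observations[1:]:
        p = p + q                  # predict: random-walk state, variance grows
        k = p / (p + r)            # Kalman gain: weight given to the new reading
        x = x + k * (z - x)        # update estimate toward the observation
        p = (1.0 - k) * p          # updated estimate variance
        estimates.append(x)
    return estimates
```

A large, persistent gap between the filtered estimate and the raw readings could then serve as a malfunction signal, analogously to the prediction-versus-actual comparison used with the ML models.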
In this work, we use data collected over one or two years for training and testing the models. The performance of the models may be improved by using a larger dataset spanning a longer duration. The number of features in the dataset for the ML algorithms was limited to the present and previous three hours' temperature values, along with other variables such as temperature offset, precipitation type and visibility. More features can be added to the feature vector by selecting the features that most affect a particular variable. In the HMMs, the class string contained the value of one variable from the site used for predictions and its nearby sites. Information from other variables could also be added to the class string. To prevent the class string from becoming too long in such cases, which would lead to a large number of symbols and very noisy probability estimates, information from two or more variables can be combined to form a new class. In this work the models were built from data collected all year round. The yearly data could instead be split into climatic periods, such as winter, spring, summer and fall, with a separate model built for each period.
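One way to combine two variables into a single class, as suggested above, is to encode the pair of discretized values as one symbol. A hypothetical encoding for illustration (the class counts are made up, not taken from our datasets):

```python
def combine_classes(temp_class: int, precip_class: int,
                    n_precip_classes: int = 3) -> int:
    """Fold two discretized variables into one symbol so the HMM
    alphabet grows multiplicatively rather than the string length."""
    return temp_class * n_precip_classes + precip_class

def split_classes(symbol: int, n_precip_classes: int = 3):
    """Recover the original (temperature, precipitation) class pair."""
    return divmod(symbol, n_precip_classes)
```

With, say, 8 temperature classes and 3 precipitation classes this yields 24 combined symbols while keeping the class string at its original length.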
In this work we focus on 13 RWIS sites for predictions. The work can be extended to all the RWIS sites present in Minnesota. Models can also be built for sensors at sites with significant micro-climates, which were not included in the selected sites. Weather at a site with a micro-climate tends to follow a different pattern from that at surrounding sites; to build models for such sensors, more focus should be given to the historical and current information collected at the site itself and less to the surrounding sites. Additional information about weather conditions, such as wind, air pressure and cloud cover, could help in determining the weather patterns followed at such sites. Information from weather advisories could be used to identify the weather conditions at the location, and could also be matched against the current conditions reported by the sensors to assess sensor accuracy.
The RWIS data available to us did not include much information on days when re-calibrations or maintenance work was done. The maintenance records were made manually and did not provide much computer-understandable data. Information about malfunctions and their effect on the readings recorded by the sensor was not available. To deal with this issue, malfunction models could be generated by adding an additional physical sensor alongside an existing one. The additional sensor could then be tampered with or altered to simulate mis-calibrations and malfunctions. The data from the two sensors, one with induced malfunctions and one recording actual conditions, could be compared and the resulting differences used to build malfunction models. These models could give us a range of potential errors for that sensor, helping to identify malfunctions or mis-calibrations in it.
Mis-calibrations in the sensors can lead to small-scale deviations from the actual values that persist for a long duration of time. Such slow drifts in the recorded readings are difficult to identify. New models could be built, or the existing ones enhanced, to identify such drifts in the readings, by examining, for instance, the long-term historical differences in sensor values between sites and calculating the likelihood of the current history for a sensor.
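One simple heuristic along these lines compares a sensor's recent average offset from a neighboring site against its long-term historical offset; a growing gap suggests a slow calibration drift. This is an illustrative sketch of the idea, not a method developed in this thesis:

```python
def drift_score(site_a, site_b, window=24 * 30):
    """Difference between the recent mean offset of sensor A from
    neighbor B (last `window` readings) and the long-term historical
    mean offset. A score persistently far from zero suggests drift."""
    diffs = [a - b for a, b in zip(site_a, site_b)]
    historical = sum(diffs[:-window]) / max(len(diffs) - window, 1)
    recent = sum(diffs[-window:]) / min(window, len(diffs))
    return recent - historical
```

A threshold on this score, tuned per sensor pair from historical data, could then flag candidate drifts for manual re-calibration checks.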

Chapter 7
Conclusions
In this research we attempt to detect RWIS sensor malfunctions using real-time sensor data. Malfunctions are identified as significant deviations of the values reported by a sensor from the actual conditions present at the site. We use machine learning (ML) methods to build models for an RWIS site, which are used to predict a value at that site. The predicted value is then compared with the actual value from the site to detect sensor malfunctions. To build models for RWIS sites, the ML methods use historical weather data obtained from a representative sample of RWIS and AWOS sites.
We build models for RWIS sites to predict temperature, precipitation type and visibility, which were identified as critical aspects of weather data for Mn/DOT. We use a variety of ML methods, including classification methods (e.g., the J48 decision tree, Naive Bayes and Bayesian Networks), regression methods (e.g., Linear Regression, Least Median Square, M5P, MultiLayer Perceptron, RBF Networks and Conjunctive Rule) and Hidden Markov Models (HMMs), to predict these values. The effectiveness and accuracy of these methods was analyzed, and their ability to detect sensor malfunctions assessed.
From the results obtained for the ML methods applied to different representations of the data to predict temperature at an RWIS site, we see that Conjunctive Rule and RBF Network fail completely, with high errors in predicting temperature. These algorithms might have performed better had we significantly tuned their parameters. The prediction of temperature by Linear Regression, Least Median Squares, M5P and Multilayer Perceptron is accurate to within ±1°F. The models built by M5P and Least Median Square are used to detect sensor malfunctions because of their low standard deviation across different sites.
Including precipitation type as an additional source of information for predicting temperature has no significant effect; the results are comparable to the experiments done without it. The prediction of temperature class values is best when J48 decision trees are used, which correctly classify almost all instances. A threshold distance of 2 is used to detect malfunctions when J48 is applied to predict temperature class values.
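The threshold-distance test amounts to a simple check on the difference between predicted and reported class values. A sketch (the threshold is 2 for the J48 models here, and 3 for the HMM models discussed below):

```python
def flag_malfunction(predicted_class: int, actual_class: int,
                     threshold: int = 2) -> bool:
    """Flag a possible sensor malfunction when the predicted and the
    reported temperature class values differ by more than the
    threshold distance."""
    return abs(predicted_class - actual_class) > threshold
```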
For detecting sensor malfunctions when predicting precipitation type, a combination of results from J48 decision trees and Bayesian Networks is used, as J48 identifies instances with no precipitation accurately and Bayesian Networks are accurate in predicting the presence of precipitation. The model built using M5P shows reasonable accuracy for predicting visibility.
Hidden Markov Models (HMMs) perform well in classifying discretized temperature values, although the accuracy of the models built at different sites varies considerably. A threshold distance of 3 between the actual and the predicted temperature class value is used to detect malfunctions. In most cases on temperature sensors, the HMM models performed very well. Combining the data from different sites that belong to the same group and predicting the temperature class values without site information gives results similar to using the dataset from the respective site alone. We find that the HMM model for precipitation type has a high error when classifying the presence or absence of precipitation; we therefore concluded not to use the HMM model for detecting these sensor malfunctions.
We believe that the models built using selected ML methods for predicting temperature, precipitation type and visibility can be used effectively for detecting RWIS sensor malfunctions, and can be extended to other RWIS sites that were not included in the experiments performed.

Bibliography
[Akaike, 1974] Akaike, H., A new look at the statistical model identification, IEEE Transactions on Automatic Control, vol. AC-19, pp. 716-723, 1974.
[Allen & Greiner, 2000] Allen, T. and Greiner, R., A model selection criteria for learning belief nets: An empirical comparison, Proceedings of the International Conference on Machine Learning, pp. 1047-1054, 2000.
[AllWeatherInc] All Weather Inc., Automated Weather Observing System (AWOS): International Technical Description, http://www.allweatherinc.com/pdf/int_awos.pdf.
[Aurora] Aurora Program, About RWIS, http://www.aurora-program.org/what_is_rwis.cfm.
[Baum & Petrie, 1966] Baum, L. and Petrie, T., Statistical inference for probabilistic functions of finite state Markov chains, Annals of Mathematical Statistics, vol. 37, 1966.
[Bellone et al., 2000] Bellone, E., Hughes, J. and Guttorp, P., A hidden Markov model for downscaling synoptic atmospheric patterns to precipitation amounts, Climate Research, vol. 15, pp. 1-12, 2000.
[Bishop, 1995] Bishop, C., Neural Networks for Pattern Recognition, Oxford University Press, 1995.
[Boselly & Ernst, 1993] Boselly, S., and Ernst, D., Road Weather Information Systems, Volume 2: Implementation Guide, Report SHRP-H-351, Strategic Highway Research Program, National Research Council, Washington, DC, 1993.
[Boselly et al., 1993] Boselly, S., Thornes, J., Ulberg, C., and Ernst, D., Road Weather Information Systems, Volume 1: Research Report, Report SHRP-H-350, Strategic Highway Research Program, National Research Council, Washington, DC, 1993.
[Buhmann & Albowitz, 2003] Buhmann, M. and Albowitz, M., Radial Basis Functions: Theory and Implementations, Cambridge University Press, 2003.
[Cano et al., 2004] Cano, R., Sordo, C. and Gutierrez, J., Applications of Bayes nets in meteorology, Advances in Bayesian networks, pp. 309 – 327, Springer 2004.
[Cooper & Herskovits, 1992] Cooper, G. and Herskovits, E., A Bayesian Method for the Induction of Probabilistic Networks from Data, Machine Learning, vol. 9, pp. 309-347, 1992.

[Dietterich, 2002] Dietterich, T., Machine learning for sequential data: A review, Lecture Notes in Computer Science, vol. 2396, pp. 15-30, 2002.


[Dougherty et al., 1995] Dougherty, J., Kohavi, R. and Sahami, M., Supervised and unsupervised discretization of continuous features, Proceedings of the Twelfth International Conference on Machine Learning, pp 94-202, 1995.
[Durbin et al., 1998] Durbin, R., Eddy, S., Krogh, A. and Mitchison, G., Biological Sequence Analysis: Probabilistic models of proteins and nucleic acids, Cambridge University Press, 1998.
[FAA, 2003] Federal Aviation Administration (FAA), Automated Surface Observing System (ASOS) / Automated Weather Observing System (AWOS), updated Feb 2003, http://www.faa.gov/asos/.
[FAA, 2006] Federal Aviation Administration (FAA), Mechanism Data Report: Automated Weather Observing System, updated April 2006, http://www.nas-architecture.faa.gov/nas5/mechanism/mech_data.cfm?mid=37.
[Fayyad & Irani, 1993] Fayyad, U. and Irani, K., Multi-interval discretization of continuous-valued attributes for classification learning, Proceedings of 13th International Joint Conference on Artificial Intelligence, pp 1022-1027, Morgan Kaufmann, 1993.
[FHA, 2004] Collaborative Research on Road Weather Observations and Predictions by Universities, State Departments of Transportation, and National Weather Service Forecast Offices, U.S. Department of Transportation, Federal Highway Administration, Publication No. FHWA-HRT-04-109, October 2004.
[Forney, 1973] Forney, G., The Viterbi algorithm, Proceedings of the IEEE, vol. 61, no. 3, pp. 268-278, March 1973.
[Forsyth & Rada, 1986] Forsyth, R. and Rada, R., Machine Learning applications in expert systems and information retrieval, Ellis Horwood Ltd., 1986.
[Friedman et al., 1997] Friedman, N., Geiger, D. and Goldszmidt, M., Bayesian network classifiers, Machine Learning, vol. 29, pp. 131-163, 1997.
[Gallus et al., 2004] Gallus, W. Jr., Jungbluth, K. and Burkheimer, D., Improved Frost Forecasting through Coupled Artificial Neural Network Time-Series Prediction Techniques and a Frost Deposition Model, U.S. Department of Transportation, Federal Highway Administration, Publication No. FHWA-HRT-04-109, pp. 19-26, October 2004.

[Good, 1965] Good, I., The Estimation of Probabilities: An Essay on Modern Bayesian Methods, M.I.T. Press, 1965.


[Hall et al., 1999] Hall, T., Brooks, H. and Doswell, C., Precipitation forecasting using a Neural Network, Weather and Forecasting, vol. 14, num. 3, pp. 338-345, 1999.

[Harvey, 1989] Harvey, A., Forecasting, Structural Time Series Models and the Kalman Filter, Cambridge University Press, Cambridge, 1989.


[Heckerman et al., 1995] Heckerman, D., Geiger, D. and Chickering, D., Learning Bayesian networks: The combination of knowledge and statistical data, Machine Learning, vol. 20, pp. 197-243, 1995.
[Holland, 1962] Holland, J., Outline for a logical theory of adaptive systems, Journal for Association of Computing Machinery, vol. 3, pp. 297-314, 1962.
[Knight et al., 2004] Knight, P., Ayers, B., Ondrejik, D. and Uzowke, A., Developing an Interactive Mesonet for PennDOT, U.S. Department of Transportation, Federal Highway Administration, Publication No. FHWA-HRT-04-109, pp. 10-17, October 2004.
[Kohavi, 1995] Kohavi, R., A study of cross-validation and bootstrap for accuracy estimation and model selection, Proceedings of the 14th International Joint Conference on Artificial Intelligence, 1995.
[Langley et al., 1992] Langley, P., Iba, W. and Thompson, K., An Analysis of Bayesian Classifiers, Proceedings of the Tenth National Conference on Artificial Intelligence, pp. 223-228, AAAI Press, 1992.
[Langley, 1996] Langley, P., Elements of Machine Learning, Morgan Kaufmann, San Francisco, 1996.
[Lau & Weng, 1995] Lau, K. and Weng, H., Climate Signal Detection Using Wavelet Transform: How to Make a Time Series Sing, Bulletin of the American Meteorological Society, vol. 76, no. 12, pp. 2391-2402, 1995.
[Littlestone, 1988] Littlestone, N., Learning Quickly When Irrelevant Attributes Abound: A New Linear-threshold Algorithm, Machine Learning, vol. 2, pp. 285-318, 1988.
[Manfredi et al., 2005] Manfredi, J., Walters, T., Wilke, G., Osborne, L., Hart, R., Incrocci, T. and Schmitt, T., Road Weather Information System Environmental Sensor Station Siting Guidelines, U.S. Department of Transportation, Federal Highway Administration, Publication No. FHWA-HOP-05-026, April 2005.

[McMillan et al., 2005] McMillan, N., Bortnick, S., Irwin, M. and Berliner, M., A hierarchical Bayesian model to estimate and forecast ozone through space and time, Atmospheric Environment, vol. 39, pp. 1373-1382, 2005.


[Mitchell, 1997] Mitchell, T., Machine Learning, McGraw Hill, 1997
[Murphy, 1998] Murphy, K., Hidden Markov Model (HMM) Toolbox for MATLAB, 1998, http://www.cs.ubc.ca/~murphyk/Software/HMM/hmm.html.
[NCDC, 2005] National Climatic Data Center, Data Documentation for Data Set 3282 (DSI-3282): ASOS Surface Airways Hourly Observations, National Climatic Data Center, Asheville, NC, May 2005.
[Nilsson, 1996] Nilsson, N., Introduction to Machine Learning. Unpublished draft, Department of Computer Science, Stanford University, 1996.
[Orr, 1996] Orr, M., Introduction to radial basis function networks. Technical report, Institute for Adaptive and Neural Computation of the Division of Informatics at Edinburgh University, Scotland, UK, 1996, http://www.anc.ed.ac.uk/~mjo/papers/intro.ps.gz.
[Park et al., 1991] Park, D., El-Sharkawi, M., Marks, R. II, Atlas, L. and Damborg, M., Electric load forecasting using an artificial neural network, IEEE Transactions on Power Systems, vol. 6, issue 2, pp. 442-449, May 1991.
[Pearl, 1988] Pearl, J., Probabilistic reasoning in intelligent systems, Morgan Kaufmann, 1988.
[Quinlan, 1986] Quinlan, R., Induction of Decision Trees, Machine Learning, vol. 1, pp. 81-106, 1986.
[Quinlan, 1992] Quinlan, R., Learning with Continuous Classes, Proceedings of the 5th Australian Joint Conference on Artificial Intelligence, pp. 343-348. World Scientific, Singapore, 1992.
[Quinlan, 1993] Quinlan, R., C4.5: Programs for Machine Learning, Morgan Kaufmann, San Francisco, 1993.
[Rabiner & Juang, 1986] Rabiner, L. and Juang, B., An Introduction to Hidden Markov Models, IEEE ASSP Magazine, pp. 4-15, January 1986.

[Rabiner, 1989] Rabiner, L., A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition, Proceedings of the IEEE, vol. 77, no. 2, February 1989.

[Rissanen, 1978] Rissanen, J., Modeling by shortest data description. Automatica, vol. 14, pp. 465-471, 1978.
[Rousseeuw, 1984] Rousseeuw, P., Least Median of Squares Regression, Journal of the American Statistical Association, vol. 79, pp. 871-880, December 1984.
[Rumelhart et al., 1986] Rumelhart, D., Hinton, G. and Williams, R., Learning internal representations by error propagation, Parallel Distributed Processing: Explorations in the Microstructures of Cognition, vol.I, pp. 318–362, MIT Press, 1986.
[Russell et al., 1995] Russell, S., Binder, J., Koller, D. and Kanazawa, K., Local Learning in Probabilistic Networks with Hidden Variables, Proceedings of the 14th International Joint Conference on Artificial Intelligence, Morgan Kaufmann, 1995.

[Schizas et al., 1991] Schizas, C., Michaelides, S., Pattichis, C. and Livesay, R., Artificial neural networks in forecasting minimum temperature, Second International Conference on Artificial Neural Networks, pp. 112-114, November 1991.


[Todey et al., 2002] Todey, D., Herzmann, D., Gallus, W. Jr. and Temeyer, B., An intercomparison of RWIS data with AWOS and ASOS observations in the state of Iowa, The Third Symposium on Environmental Applications: Facilitating the Use of Environmental Information, American Meteorological Society, January 2002.
[Viterbi, 1967] Viterbi, A., Error bounds for convolutional codes and an asymptotically optimum decoding algorithm, IEEE Transactions on Information Theory, vol. IT-13, no. 2, pp. 260-269, April 1967.
[Wang & Witten, 1997] Wang, Y. and Witten, I., Inducing Model Trees for Continuous Classes, In Poster Papers of the Ninth European Conference on Machine Learning, pp. 128-137, Prague, Czech Republic, April, 1997.
[Witten & Frank, 2005] Witten, I. and Frank, E., Data Mining: Practical machine learning tools and techniques, 2nd Edition, Morgan Kaufmann, San Francisco, 2005.
[Zhang, 2004] Zhang, Y., Prediction of Financial Time Series with Hidden Markov Models, Masters Thesis, Simon Fraser University, 2004.

