Spatial statistics is the field of statistics dealing with the analysis of spatially distributed data, and Bayesian methods have recently been widely applied to such analysis. This article develops and describes a spatial data model for predicting algae quantity in the Baltic Sea. Black Carrageen quantity is the dependent variable; depth, sand, pebble, and boulder cover are the independent variables. Two models with different covariance functions (Gaussian and exponential) are built to determine which fits algae quantity prediction best. Unknown model parameters are estimated and the Bayesian kriging posterior predictive distribution is computed in the OpenBUGS modeling environment using Bayesian spatial statistics methods.
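The two covariance functions compared in the abstract can be illustrated with a minimal sketch; the variance and range parameters below are hypothetical stand-ins for the values the paper estimates in OpenBUGS, and the single-observation simple-kriging predictor is only a toy illustration of the kriging idea:

```python
import math

def exponential_cov(h, sigma2=1.0, phi=2.0):
    """Exponential covariance: C(h) = sigma^2 * exp(-h / phi)."""
    return sigma2 * math.exp(-h / phi)

def gaussian_cov(h, sigma2=1.0, phi=2.0):
    """Gaussian covariance: C(h) = sigma^2 * exp(-(h / phi)^2)."""
    return sigma2 * math.exp(-(h / phi) ** 2)

def simple_krige_one(z_obs, mean, h, cov):
    """Simple kriging from a single observation at distance h:
    weight = C(h) / C(0), prediction = mean + weight * (z_obs - mean)."""
    w = cov(h) / cov(0.0)
    return mean + w * (z_obs - mean)

# The Gaussian model is smoother (stays higher) at short range.
print(round(exponential_cov(0.5), 3))  # → 0.779
print(round(gaussian_cov(0.5), 3))     # → 0.939
print(round(simple_krige_one(5.0, 3.0, 1.0, exponential_cov), 3))  # → 4.213
```

The choice between the two functions governs how quickly the spatial correlation between observations decays with distance, which is exactly what the two fitted models in the paper differ in.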
In technical photography, there are cases that require full similarity between the object and its image. Aberrations of the optical systems used in photographic tools make unambiguous and precise application of projective geometry rules impossible. The problem persists in digital photography with traditional optical devices, and image processing with software tools is complicated both by the insensitivity of image sensor matrices and by the lack of suitable correction algorithms. The study has shown that the best results can be obtained by using a stenocamera (pinhole camera) and sensitive photographic film. This paper describes a new digital image acquisition method obtained by combining a classical stenocamera with a digital web camera matrix.
This paper describes a concept for building interactive human state recognition systems based on smart sensor design. The measures taken for proper ADC signal processing significantly lowered the interference level. A more reliable way of measuring human skin temperature was achieved by using Maxim DS18B20 digital thermometers, which respond more sensitively to temperature changes than the previously used analog LM35 thermometers. An adaptive HR measuring algorithm was introduced to suppress incorrect ECG signal readings caused by human muscular activity. A user-friendly interactive interface for a touch-sensitive GLCD screen was developed to present real-time physiological data readings both numerically and graphically, and the user can dynamically customize the data processing methods to his needs. Specific procedures were developed to simplify physiological state recording for further analysis. The physiological data sampling and preprocessing platform was optimized to be compatible with the “ATmega Oscilloscope” PC data collection and visualization software.
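The adaptive HR algorithm itself is described in the paper; as an illustration of the general idea of suppressing artifact readings, here is a minimal sketch that rejects heart-rate values deviating too far from a running estimate (the smoothing factor and jump threshold are hypothetical):

```python
def filter_hr(readings, alpha=0.2, max_jump=25.0):
    """Reject implausible beat-to-beat jumps (e.g. motion artifacts)
    and smooth accepted readings with an exponential moving average."""
    estimate = None
    accepted = []
    for hr in readings:
        if estimate is None:
            estimate = hr          # initialize from the first reading
        elif abs(hr - estimate) > max_jump:
            continue               # likely an artifact from muscular activity
        else:
            estimate = (1 - alpha) * estimate + alpha * hr
        accepted.append(estimate)
    return accepted

# The 180 bpm spike is dropped; the rest track the true rate near 72.
print(filter_hr([72, 74, 180, 73, 71]))
```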
When analyzing stock market data, it is common to encounter observations that differ from the overall pattern; handling them is known as the problem of robustness. Outlying observations in a data set may strongly influence the results of classical analysis methods (based on the mean and standard deviation) or of models built on that data. The problem can be handled with robust estimators, which make aberrant observations less influential or ignore them completely. An example of applying such procedures to outlier elimination in the optimization of a stock trading system is presented.
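As a minimal sketch of the robust-estimator idea (the median/MAD rule below is a standard robust technique, not necessarily the specific procedure used in the paper; the cutoff k is a hypothetical choice):

```python
import statistics

def mad_outliers(data, k=3.0):
    """Flag points more than k scaled MADs from the median.
    The factor 1.4826 makes the MAD consistent with the standard
    deviation for normally distributed data."""
    med = statistics.median(data)
    mad = 1.4826 * statistics.median(abs(x - med) for x in data)
    return [x for x in data if abs(x - med) > k * mad]

# A single aberrant return of 12.0 would drag the mean to ~1.84,
# but the median/MAD rule isolates it cleanly.
print(mad_outliers([0.5, -0.3, 0.8, 0.2, -0.4, 12.0, 0.1]))  # → [12.0]
```

Unlike the mean and standard deviation, the median and MAD have a high breakdown point, so a few aberrant observations cannot distort the estimate they are judged against.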
OpenCL, a modern programming language for heterogeneous parallel systems, enables problems to be partitioned and executed on modern CPU and GPU hardware, which increases the performance of such applications considerably. Since GPUs are optimized for, and specialize in, floating point and vector operations, they greatly outperform general-purpose CPUs in this field. The language also greatly simplifies the creation of applications for heterogeneous systems: it is cross-platform, vendor-independent, and embeddable, so it can be used from any general-purpose programming language via libraries. More and more tools are being developed for low-level programmers, scientists, and engineers alike who build applications or libraries for today's CPUs and GPUs as well as other heterogeneous platforms. The tendency today is to increase the number of cores or CPUs in the hope of increasing performance; however, the growing difficulty of parallelizing applications for such systems and the ever-increasing overhead of communication and synchronization limit the potential gains. This means there is a point at which adding cores or CPUs no longer increases application performance, and can even diminish it. Although parallel programming and GPUs with stream computing capabilities have reduced the need for communication and synchronization (since only the final result needs to be committed to memory), this remains a weak link in developing such applications.
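The diminishing-returns point made above can be illustrated with Amdahl's law, where a fixed serial fraction (standing in for unavoidable communication and synchronization overhead; the 5 % figure below is hypothetical) bounds the achievable speedup no matter how many cores are added:

```python
def amdahl_speedup(cores, serial_fraction):
    """Amdahl's law: speedup = 1 / (s + (1 - s) / n),
    where s is the serial fraction and n the number of cores."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# With a 5% serial fraction the speedup saturates near 1/0.05 = 20,
# however many cores are thrown at the problem.
for n in (2, 8, 64, 1024):
    print(n, round(amdahl_speedup(n, 0.05), 2))
```

Going from 64 to 1024 cores buys only a few extra units of speedup here, while in practice the added synchronization cost can push performance backwards, which is exactly the effect the abstract warns about.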
Ecological system modelling is a powerful tool that provides a better understanding of interspecies interaction. Although a complex model gives more information about the modelled object, it also drastically increases the computational time needed to obtain that information. This paper describes a fairly simple three trophic level population dynamics model with an evolution mechanism that can be run on any personal computer. The capacity of the evolution mechanism was demonstrated by running the model 1100 times for both carnivores and herbivores so that only one type of animal could evolve. It was also shown that attempts to control population abundances with chemicals or by hunting, while somewhat effective, can still be overcome by animals if they have the ability to evolve.
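The model itself (including its evolution mechanism) is described in the paper; as a minimal sketch of what a three trophic level population dynamics simulation looks like, here is a toy discrete-time system with plants, herbivores, and carnivores (all rate constants are hypothetical):

```python
def simulate(steps=200, plants=50.0, herbivores=10.0, carnivores=5.0):
    """Toy three trophic level dynamics: plants grow logistically,
    herbivores graze on plants, carnivores prey on herbivores."""
    dt = 0.01  # small time step for the Euler update
    for _ in range(steps):
        p, h, c = plants, herbivores, carnivores
        plants     += dt * (0.5 * p * (1 - p / 100.0) - 0.02 * p * h)
        herbivores += dt * (0.01 * p * h - 0.05 * c * h - 0.1 * h)
        carnivores += dt * (0.02 * h * c - 0.2 * c)
    return plants, herbivores, carnivores

p, h, c = simulate()
print(round(p, 1), round(h, 1), round(c, 1))
```

Even a system this small already couples the three levels: pressure applied to one population (analogous to the chemical or hunting control discussed in the paper) propagates up and down the food chain.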
When developing an information system, it is important to create clear models and to choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL, and OCL rule specification languages and the UML, DFD, CPN, EPC, IDEF3, and BPMN business process modeling languages, and presents a theoretical comparison of business rule and business process modeling languages. According to the selected modeling aspects, sets of business process modeling languages and business rule representation languages are compared, and the best-fitting language set is selected for a three-layer framework for business rule based software modeling.
Information systems become dated increasingly quickly because of the rapidly changing business environment. Usually, small changes are not sufficient to adapt complex legacy information systems to changing business needs, and new functionality must be installed while putting business data at the smallest possible risk. This paper analyzes information system modernization problems and proposes a method for information system modernization. It involves transforming program code into an abstract syntax tree metamodel (ASTM) representation and a model-based transformation from ASTM into the knowledge discovery metamodel (KDM). The method is validated on an example for the SQL language.
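The two-step pipeline can be illustrated with a toy sketch: a fragment of SQL is parsed into a simple AST (a stand-in for the much richer OMG ASTM) and then lifted into a higher-level structural summary (a stand-in for KDM). The dictionary shapes and key names below are invented for illustration and do not reflect the actual metamodels:

```python
import re

def sql_to_ast(sql):
    """Toy parse of a 'SELECT cols FROM table' statement into an AST dict."""
    m = re.match(r"SELECT\s+(.+?)\s+FROM\s+(\w+)", sql, re.IGNORECASE)
    if not m:
        raise ValueError("unsupported statement")
    cols = [c.strip() for c in m.group(1).split(",")]
    return {"type": "select", "columns": cols, "table": m.group(2)}

def ast_to_kdm(ast):
    """Toy model-to-model step: lift the syntax tree into a structural
    view recording which data elements the code depends on."""
    return {"action": "read",
            "data_element": ast["table"],
            "reads": ast["columns"]}

ast = sql_to_ast("SELECT id, name FROM clients")
print(ast_to_kdm(ast))
# → {'action': 'read', 'data_element': 'clients', 'reads': ['id', 'name']}
```

The point of the second step is the same as in the proposed method: once code is available as a model rather than text, dependencies on business data can be analyzed and the system modernized with that data at minimal risk.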