An essential part of an amperometric biosensor is an enzyme. It should be selective, i.e., react only with a certain substrate. This selectivity requirement reduces the set of enzymes that can be used. This paper demonstrates that non-selective enzymes (reacting with two substrates) can be used to determine the concentrations of both substrates. For this purpose, the steady-state currents of two biosensors were measured. The currents were used as input to an artificial neural network that determines the concentrations of the substrates. The proposed approach was validated, as the relative error of the determined concentrations was relatively small. The paper analyses the influence of biosensor parameters on the error values, and recommendations for minimising the errors are given.
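The core idea (two sensors with different sensitivities to two substrates yield two independent measurements) can be illustrated with a minimal sketch. The paper trains an artificial neural network on the steady-state currents; here, purely for illustration and assuming locally linear sensor responses with made-up sensitivity coefficients, the same inversion reduces to solving a 2×2 linear system:

```python
import numpy as np

# Hypothetical calibration: each biosensor's steady-state current responds
# linearly to both substrate concentrations (all coefficients are assumed).
#   i = K @ s,  where s = (s1, s2) and K is the sensitivity matrix.
K = np.array([[0.8, 0.3],
              [0.2, 0.9]])

s_true = np.array([1.5, 2.0])   # unknown substrate concentrations
i_meas = K @ s_true             # the two measured steady-state currents

# Because the two sensors weight the substrates differently, K is
# invertible and both concentrations can be recovered from two currents.
s_est = np.linalg.solve(K, i_meas)
```

In the nonlinear case treated in the paper this inverse mapping has no closed form, which is exactly why a trained neural network is used instead.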
In this study, an NPZD model and a trophic network model that includes organism groups at higher trophic levels were developed and linked using the "bottom-up control" approach. Such a linkage provides the possibility to use the advantages of both models: reproducing the erratic behaviour of nutrients and plankton as realistically as possible, while still accounting for the more complex organisms in the trophic network, which respond to external forcing on a larger time scale. The models developed in this study were applied to the Curonian Lagoon, an important estuarine ecosystem for Lithuania. The tests and simulations showed that the results of the NPZD model were accurate enough to represent the nutrient and phytoplankton dynamics in the Curonian Lagoon, as well as the spatial differences that are of ecological interest. Linkage with the trophic network model demonstrated that the NPZD model results are consistent with the Curonian Lagoon's ecosystem. The modelling results showed that primary production is relatively high in the Curonian Lagoon and is unlikely to be controlled by the organisms at the higher trophic levels of the food web. Analysis of the NPZD model scenarios with different nutrient inputs revealed that phosphorus is the main limiting nutrient for primary production in the Curonian Lagoon. However, different combinations of nitrogen and phosphorus inputs control the relative abundance of different phytoplankton groups. Investigation of the ecosystem's reaction to a water temperature increase showed that the temperature increase ultimately leads to a decrease in the phytoplankton available to the upper levels of the food web.
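A generic NPZD model couples four compartments (Nutrients, Phytoplankton, Zooplankton, Detritus) through uptake, grazing, mortality and remineralisation fluxes. The sketch below is not the Curonian Lagoon model from the study; it is a minimal closed NPZD system with assumed parameter values, shown only to illustrate the structure such models share. Because every flux leaves one compartment and enters another, total mass is conserved:

```python
# Minimal closed NPZD sketch (all parameter values are illustrative
# assumptions): nutrient uptake by phytoplankton, grazing by zooplankton,
# losses to detritus, and remineralisation of detritus back to nutrients.
def npzd_step(N, P, Z, D, dt=0.01):
    uptake = 1.0 * N / (0.5 + N) * P       # Michaelis-Menten nutrient uptake
    grazing = 0.6 * P / (0.3 + P) * Z      # Holling type-II grazing
    dP = uptake - grazing - 0.05 * P
    dZ = 0.3 * grazing - 0.05 * Z          # 30 % assimilation efficiency
    dD = 0.7 * grazing + 0.05 * P + 0.05 * Z - 0.1 * D
    dN = 0.1 * D - uptake                  # remineralisation closes the loop
    return N + dt * dN, P + dt * dP, Z + dt * dZ, D + dt * dD

state = (4.0, 0.5, 0.2, 0.1)   # initial N, P, Z, D (assumed units)
for _ in range(10000):         # 100 model days at dt = 0.01
    state = npzd_step(*state)
```

Linking such a model to a trophic network model "bottom-up", as in the study, means feeding its plankton output to the higher trophic levels without a feedback term in the NPZD equations themselves.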
This work puts forward an optimised BCI (Brain-Computer Interface) speller design based on Steady-State Visual Evoked Potentials (SSVEP) and an Artificial Neural Network (ANN), intended to help people with severe motor impairments. The work is carried out to enhance the accuracy and communication rate of the BCI system. The optimisation is divided into two steps: first, the design of an encoding technique for choosing characters from the speller interface; second, the development and implementation of a feature extraction algorithm to acquire optimal features, which are used to train the BCI system's neural network classifier. Optimisation of the speller interface focuses on the representation of the character matrix and its design parameters. Considerable effort is also devoted to optimising the selection of features and the user's time window. The optimised system performs nearly as well with new users, giving 13 ± 2 characters per minute (CPM) with an average accuracy of 94.5%, obtained by choosing the first two harmonics of the power spectral density as feature vectors and using a 2-second time window for each selection. The optimised BCI performs better with experienced users, with an average accuracy of 95.1%. Such accuracy has not been reported before at a comparable CPM.
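The feature vector described above, power at the stimulus frequency and its second harmonic over a 2-second window, can be sketched in a few lines. The signal, sampling rate and stimulus frequency below are made-up assumptions; the power estimate correlates the signal with sine/cosine references at the target frequency:

```python
import math

# Hypothetical SSVEP feature sketch: spectral power of a signal at the
# stimulus frequency and its second harmonic (all values are assumptions).
def band_power(signal, fs, freq):
    n = len(signal)
    c = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    s = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return (c * c + s * s) / n

fs, f0, T = 256, 12.0, 2.0      # 2-second window, 12 Hz flicker (assumed)
t = [i / fs for i in range(int(fs * T))]
# Synthetic "EEG": fundamental plus a weaker second harmonic.
eeg = [math.sin(2 * math.pi * f0 * ti) + 0.3 * math.sin(2 * math.pi * 2 * f0 * ti)
       for ti in t]

# Feature vector: power at the first two harmonics, as in the abstract.
features = [band_power(eeg, fs, f0), band_power(eeg, fs, 2 * f0)]
```

A classifier (the ANN in this work) would then pick the flicker frequency, and hence the character, whose harmonics carry the most power.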
Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang identified six phases of computing paradigms, from dummy terminals/mainframes, through PCs and network computing, to grid and cloud computing. There are four deployment models of cloud computing: public cloud, private cloud, hybrid cloud and community cloud. The most common and well-known deployment model is the public cloud. A private cloud is suited for sensitive data, where the customer requires a certain degree of security.
According to the different types of services offered, cloud computing can be considered to consist of three layers (service models): IaaS (infrastructure as a service), PaaS (platform as a service) and SaaS (software as a service). The main cloud computing solutions are web applications, data hosting, virtualization, database clusters and terminal services. The advantage of cloud computing is the ability to virtualize and share resources among different applications with the objective of better server utilization; without a clustering solution, a service fails the moment its server crashes.
The paper proposes a technology for mass optimization of a two-dimensional body applying genetic algorithms. The main attention is focused on the geometry of the 2D body, i.e., the search for optimal coordinates of body points. Direct analysis of the 2D body (von Mises stress determination) is performed using an original program based on the finite element method. The set of design parameters contains the coordinates of body points in 2D space. The results of numerical experiments proved the proposed technology to be an efficient tool for solving the 2D body mass optimization problem.
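The overall loop, encoding point coordinates as a chromosome and penalising designs that violate a stress constraint, can be sketched as follows. This is not the paper's program: the "mass" and the constraint below are made-up stand-ins for the FEM-based von Mises stress analysis, and all parameters are assumptions:

```python
import random

random.seed(0)

# Toy GA sketch: chromosomes are the y-coordinates of n boundary points;
# "mass" is their sum, penalised when a stand-in constraint y_i >= limit_i
# is violated. A real run would call an FEM solver for von Mises stresses.
N_POINTS, POP, GENS = 6, 40, 300
limit = [0.5, 0.8, 1.0, 1.0, 0.8, 0.5]   # assumed minimum admissible heights

def fitness(y):
    mass = sum(y)
    penalty = sum(max(0.0, limit[i] - y[i]) for i in range(N_POINTS))
    return mass + 100.0 * penalty         # lower is better

pop = [[random.uniform(0.0, 2.0) for _ in range(N_POINTS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    elite = pop[:POP // 2]                # keep the best half
    children = []
    while len(children) < POP - len(elite):
        a, b = random.sample(elite, 2)
        cut = random.randrange(1, N_POINTS)           # one-point crossover
        child = a[:cut] + b[cut:]
        i = random.randrange(N_POINTS)                # Gaussian mutation
        child[i] = min(2.0, max(0.0, child[i] + random.gauss(0.0, 0.1)))
        children.append(child)
    pop = elite + children

best = min(pop, key=fitness)
```

With this penalised objective the theoretical optimum is mass = sum(limit) = 4.6, which the population approaches over the generations.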
In this paper a stochastic adaptive method has been developed to solve stochastic linear problems by a finite sequence of Monte-Carlo sampling estimators. The method is based on adaptive regulation of the size of Monte-Carlo samples and a statistical termination procedure that takes statistical modelling accuracy into consideration. Our approach distinguishes itself by treating the accuracy of the solution in a statistical manner, testing the hypothesis of optimality according to statistical criteria, and estimating confidence intervals of the objective and constraint functions. To avoid "jamming" or "zigzagging" when solving a constrained problem we implement the ε-feasible direction approach. The proposed adjustment of the sample size, taken inversely proportional to the square of the norm of the Monte-Carlo estimate of the gradient, guarantees convergence a. s. at a linear rate. The numerical study and practical examples corroborate the theoretical conclusions and show that the developed procedures make it possible to solve stochastic problems with sufficient accuracy with an acceptable amount of computation.
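The sample-size rule is the distinctive ingredient: far from the optimum the gradient estimate is large and cheap small samples suffice, while near the optimum the gradient shrinks and the sample grows, refining accuracy exactly where it is needed. A minimal unconstrained sketch of this idea, on an assumed one-dimensional problem with assumed constants (not the paper's method in full):

```python
import random

random.seed(1)

# Sketch of the adaptive sample-size rule (problem and constants are
# illustrative assumptions): minimise f(x) = E[(x - W)^2] with W ~ N(1, 1)
# using sampled gradients, taking the next Monte-Carlo sample size
# inversely proportional to the squared norm of the gradient estimate.
x, n = 5.0, 50
for _ in range(60):
    grads = [2.0 * (x - random.gauss(1.0, 1.0)) for _ in range(n)]
    g = sum(grads) / n                     # Monte-Carlo gradient estimate
    x -= 0.1 * g                           # gradient descent step
    # small gradient -> larger sample, i.e. higher accuracy near the optimum
    n = min(100_000, max(50, int(16.0 / (g * g + 1e-12))))
```

Here the minimiser is x* = 1; the iterate settles near it while the sample size grows automatically as the gradient estimate vanishes.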
A mathematical model of a biosensor with competitive substrate conversion is analysed in this work. The model is described by partial differential reaction-diffusion equations with a non-linear reaction term. Because of the non-linearity, analytical solutions exist only for extreme parameter values, and thus in the general case the model is solved by finite difference methods. The validity of the computational model is checked by comparing the numerically obtained results with the known analytical solutions at the mentioned extreme parameter values. The purpose of this work is to determine the values of the model parameters at which the impact of one of the substrates on the biosensor response is minimized.
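To illustrate the kind of finite-difference computation involved, here is a minimal explicit scheme for a single substrate in one dimension; the competitive two-substrate kinetics of the paper are simplified to plain Michaelis-Menten, and the geometry and all parameter values are assumptions:

```python
# Explicit finite-difference sketch for one substrate of a reaction-diffusion
# biosensor model (kinetics and parameters are illustrative assumptions):
#   dS/dt = D * d2S/dx2 - Vmax * S / (Km + S)
NX, DX, DT = 51, 0.02, 1e-4          # grid over a membrane of unit thickness
D, VMAX, KM = 1.0, 1.0, 0.5
# stability of the explicit scheme requires D*DT/DX^2 <= 1/2
assert D * DT / DX**2 <= 0.5

S = [0.0] * NX                       # initially no substrate in the membrane
for _ in range(20000):               # march in time to (near) steady state
    S_new = S[:]
    for i in range(1, NX - 1):
        diff = D * (S[i - 1] - 2 * S[i] + S[i + 1]) / DX**2
        reac = VMAX * S[i] / (KM + S[i])
        S_new[i] = S[i] + DT * (diff - reac)
    S_new[0] = 0.0                   # electrode surface: substrate consumed
    S_new[-1] = 1.0                  # outer boundary: bulk concentration
    S = S_new

# The biosensor current is proportional to the flux at the electrode (x = 0).
current = D * (S[1] - S[0]) / DX
```

In the actual model, two such equations share the enzyme through a competitive reaction term, and parameter values are sought at which one substrate's contribution to this current is minimal.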
The current paper illustrates the importance of clustering the frequent items of code coverage during test suite reduction. A modular most-maximal-frequent-sequence clustering algorithm has been used along with a requirement-residue-based test case reduction process. DU-pairs form the basic code coverage requirement under consideration for test suite reduction. This algorithm fared well in covering all the DU-pairs when compared with other algorithms, such as the Harrold-Gupta-Soffa (HGS), Bi-Objective Greedy (BOG) and Greedy algorithms. The coverage achieved is 100% in many cases, except for a few insufficient and incomplete test suites.
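For context, the baseline Greedy reduction mentioned above repeatedly selects the test case covering the most not-yet-covered requirements. The sketch below uses made-up DU-pair sets purely for illustration:

```python
# Greedy test-suite reduction over coverage requirements (the DU-pair sets
# below are made-up): repeatedly pick the test case that covers the most
# not-yet-covered requirements until everything coverable is covered.
suite = {
    "t1": {"du1", "du2", "du3"},
    "t2": {"du3", "du4"},
    "t3": {"du4", "du5"},
    "t4": {"du1", "du5"},
}

def greedy_reduce(suite):
    remaining = set().union(*suite.values())
    reduced = []
    while remaining:
        best = max(suite, key=lambda t: len(suite[t] & remaining))
        if not suite[best] & remaining:
            break                    # leftover requirements are uncoverable
        reduced.append(best)
        remaining -= suite[best]
    return reduced

reduced = greedy_reduce(suite)       # here: ["t1", "t3"] covers all DU-pairs
```

Clustering-based approaches such as the one in the paper aim to do better than this baseline on realistic suites by grouping tests with frequently co-occurring coverage items.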
The article deals with the interaction of tumour cells and leucocytes in cylindrical cavities. This type of interaction is typical of tumour development in the intestine, a blood vessel or a bone cavity. Two cases are distinguished: a soft and a hard tumour. In the case of a hard (solid) tumour, leucocytes can interact only with the surface cells of the tumour. This type of interaction is described by a system of two nonlinear first-order differential equations. The expressions for the stationary points are obtained and an analysis of their stability is performed. In the case of a soft tumour, a system of two partial differential equations with first-order derivatives, together with initial and boundary conditions, is proposed. An algorithm for computing the numerical solution of the mathematical model is applied. In this case the diffusion of leucocytes and their ability to reach tumour cells in the whole volume of the tumour are included. The algorithm is constructed and the system is solved numerically. A bifurcation curve is obtained; it separates two qualitatively different areas in the two-parameter plane. Under the same initial conditions, in the first area the development of the tumour cells cannot be stopped, whereas in the second area leucocytes defeat the tumour cells.
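The two qualitative outcomes, unstoppable tumour growth versus elimination by leucocytes, can be demonstrated on a toy two-equation model. The equations below are an assumed stand-in, not the system from the article, and all parameter values are illustrative:

```python
# Toy sketch of the hard-tumour case (an assumed stand-in model):
#   T' = a*T - b*T*L    (tumour growth vs. killing by leucocytes)
#   L' = s - d*L        (leucocyte supply and decay)
# L settles at s/d, so the tumour dies out when a < b*s/d and grows otherwise:
# the line a = b*s/d plays the role of a bifurcation curve in parameter space.
def simulate(a, b=1.0, s=1.0, d=1.0, dt=0.001, steps=20000):
    T, L = 0.1, 0.0                  # initial tumour and leucocyte levels
    for _ in range(steps):
        T += dt * (a * T - b * T * L)
        L += dt * (s - d * L)
        T = max(T, 0.0)
    return T

dies = simulate(a=0.5)    # below the threshold: tumour eliminated
grows = simulate(a=2.0)   # above the threshold: tumour grows unboundedly
```

Varying two of these parameters and recording which outcome occurs traces out exactly the kind of two-parameter bifurcation diagram described in the abstract.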
Service-oriented architecture (SOA) is an architecture for distributed applications composed of loosely coupled distributed services that are designed to meet business requirements. One of the research priorities in the field of SOA is the creation of a software design and development methodology (SDDM) that takes into account all the principles of this architecture and allows for effective and efficient application development. Much investigation has been carried out to find out whether one of the popular SDDMs, such as agile methodologies or RUP, can be adapted for SOA, or whether there is a need to create a new SOA-oriented SDDM. This paper compares one SOA-oriented SDDM, SOUP, with the RUP and XP methodologies. The aim is to find out whether the SOUP methodology is already mature enough to ensure successful development of SOA applications. This aim is accomplished by comparing the activities and artifacts of SOUP and RUP and by highlighting which XP practices are used in SOUP.