The presence of chunky graphite is undesirable in cast iron with spheroidal graphite, as it significantly lowers the properties of ductile iron. This graphite morphology forms as a result of the slow cooling of castings with large thermal nodes, and also due to the presence of elements which suppress the formation of spheroidal graphite and promote the formation of chunky graphite. The spheroidal graphite present in ductile iron ensures excellent mechanical properties, while chunky graphite significantly reduces them. It is therefore important to establish conditions under which the formation of chunky graphite is prevented. The castings were produced under regular foundry operating conditions, and various types of modifiers, inoculants and pre-inoculants containing elements that suppress the formation of chunky graphite (Al, Sb and Ba) were tested. A chromium breaker core was also applied to suppress the formation of chunky graphite, which appeared in the structure at locations where the feeders had been removed. In total, eight casts were executed with various types of modifiers and inoculants.
In this paper, the capacity of non-uniform sampling rate conversion techniques, employing different interpolation methods, to reduce the wow defect is examined. The methods considered are: linear interpolation, four polynomial-based interpolation methods, and the windowed sinc-based method. The polynomial methods examined are: Lagrange interpolation, polynomial fitting with additional noise reduction, Hermite interpolation and spline interpolation. The performance on an artificially distorted audio signal, restored using non-uniform resampling, is evaluated on the basis of standard audio defect measurement criteria and compared across all of the aforementioned interpolation methods. The chosen defect descriptors are: total harmonic distortion, total harmonic distortion plus noise, and signal-to-noise ratio.
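As an illustration of one of the interpolation methods listed above, the following is a minimal sketch of non-uniform resampling with local Lagrange interpolation. The function name, the cubic order, the support-point selection and the wow-like modulated time grid are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def lagrange_resample(t, x, t_new, order=3):
    """Resample signal x, sampled at non-uniform (sorted) times t,
    onto times t_new using local Lagrange interpolation."""
    x_new = np.empty_like(t_new, dtype=float)
    n = order + 1  # number of support points per output sample
    for k, tk in enumerate(t_new):
        # pick the n samples of t nearest to tk as the local support
        i0 = np.searchsorted(t, tk)
        lo = max(0, min(i0 - n // 2, len(t) - n))
        ts, xs = t[lo:lo + n], x[lo:lo + n]
        # evaluate the Lagrange basis polynomials at tk
        y = 0.0
        for i in range(n):
            li = 1.0
            for j in range(n):
                if j != i:
                    li *= (tk - ts[j]) / (ts[i] - ts[j])
            y += xs[i] * li
        x_new[k] = y
    return x_new

# wow-like distortion: samples taken on a slowly modulated time grid
u = np.linspace(0.0, 1.0, 200)
t_distorted = u + 0.002 * np.sin(2 * np.pi * 3 * u)  # still monotonic
x = np.sin(2 * np.pi * 5 * t_distorted)              # signal values at those times
t_uniform = np.linspace(0.05, 0.95, 100)             # target uniform grid
x_rec = lagrange_resample(t_distorted, x, t_uniform)
```

Since the true underlying signal is known here, `x_rec` can be compared against `sin(2π·5·t)` on the uniform grid to quantify the residual distortion, which is the same idea as the defect descriptors used in the paper.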
Neutralisation of terrorist explosive devices is a risky task. Such tasks may be carried out by robots in order to protect human life. The article describes selected design problems concerning the new neutralisation and assisting robot SMR-100 Expert. The robot was designed for use in confined spaces, particularly inside aircraft, buses and rail cars. In order to achieve this ambitious goal, new advanced technological design tools had to be applied, and a number of interesting design issues were addressed. The successful development of the prototype robot Expert in Poland resulted in the creation of the first intervention robot in the world able to perform all necessary anti-terrorist tasks inside passenger planes.
DNA sequencing remains one of the most important problems in molecular and computational biology. One of the methods used for this purpose is sequencing by hybridization. In this approach, DNA chips composed of a full library of oligonucleotides of a given length are usually used, but in principle it is possible to use other types of chips. Isothermic DNA chips, one such alternative, may reduce the hybridization error rate when used for sequencing. However, it was not clear whether the number of errors resulting from subsequence repetitions is also reduced in this case. In this paper, a method for estimating the resolving power of isothermic DNA chips is described which allows such chips to be compared with classical ones. The analysis of the resolving power shows that the probability of sequencing errors caused by subsequence repetitions is greater for isothermic chips than for their classical counterparts of similar cardinality. This result suggests that isothermic chips should be chosen carefully, since in some cases they may not give better results than classical ones.
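To make the contrast between the two chip types concrete, the sketch below computes the spectrum (set of probes that would hybridize) for a classical chip of fixed probe length and for an isothermic chip. It assumes the simplified weighting often used for isothermic oligonucleotide libraries, in which A and T contribute 2 and C and G contribute 4 to a probe's "temperature"; the function names and the tiny example sequence are illustrative, not taken from the paper:

```python
# simplified "temperature" contributions per nucleotide (an assumption
# commonly made for isothermic chips: A/T -> 2, C/G -> 4)
WEIGHT = {'A': 2, 'T': 2, 'C': 4, 'G': 4}

def classical_spectrum(seq, k):
    """All substrings of fixed length k (a classical chip's probes)."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def isothermic_spectrum(seq, temp):
    """All substrings whose summed weights equal `temp` exactly.
    Probe lengths vary: AT-rich probes are longer, GC-rich ones shorter."""
    spectrum = set()
    for i in range(len(seq)):
        total = 0
        for j in range(i, len(seq)):
            total += WEIGHT[seq[j]]
            if total == temp:
                spectrum.add(seq[i:j + 1])
                break          # weights are positive: no longer match from i
            if total > temp:
                break
    return spectrum

classical = classical_spectrum("ATGC", 2)    # {'AT', 'TG', 'GC'}
isothermic = isothermic_spectrum("ATGC", 4)  # {'AT', 'G', 'C'}
```

A sequence is ambiguous for a given chip exactly when some other sequence yields the same spectrum, which is the repetition-driven error the paper's resolving-power analysis quantifies.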
Together with the dynamic development of modern computer systems, the possibilities of applying refined methods of nonparametric estimation to control engineering tasks have grown just as fast. This broad and complex subject is presented in this paper for the case of estimating the density of a random variable's distribution. Nonparametric methods allow a useful characterization of probability distributions without arbitrary assumptions regarding their membership of a fixed class. Following an illustrative description of the fundamental procedures used to this end, results of research on the application of kernel estimators, dominant in this field, are generalized and synthetically presented for problems of Bayes parameter estimation with an asymmetrical polynomial loss function, as well as for fault detection in dynamical systems treated as objects of automatic control, covering the detection, diagnosis and prognosis of malfunctions. To this end, the basics of data analysis and exploration tasks - recognition of outliers, clustering and classification - solved using a uniform mathematical apparatus based on the kernel estimator methodology were also investigated.
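The kernel estimators referred to above can be sketched minimally as follows. This is a textbook Gaussian kernel density estimator, not the paper's specific formulation; the bandwidth rule used here (Silverman's rule of thumb) is one common choice among many:

```python
import numpy as np

def gaussian_kde(samples, h):
    """Return a kernel density estimate built from `samples`
    with a Gaussian kernel and bandwidth (smoothing parameter) h."""
    samples = np.asarray(samples, dtype=float)
    m = len(samples)

    def f(x):
        # one scaled kernel per sample, averaged at the query points x
        u = (np.asarray(x, dtype=float)[..., None] - samples) / h
        return np.exp(-0.5 * u**2).sum(axis=-1) / (m * h * np.sqrt(2 * np.pi))

    return f

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 2000)
h = 1.06 * data.std() * len(data) ** (-1 / 5)  # Silverman's rule of thumb
f_hat = gaussian_kde(data, h)
```

No parametric family is assumed: the same estimator applies unchanged to multimodal or skewed data, which is what makes it usable for the outlier recognition, clustering and classification tasks mentioned in the abstract.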
Surface topography assessments with valley exploration are of great importance. Two-process surfaces are often proposed for many combustion engines. Among the errors committed in surface topography measurement and analysis are those that occur during data processing. In this paper, improper areal form removal was taken into consideration for plateau-honed cylinder surfaces with additionally burnished oil pockets. Usually, the reference plane is established by applying fitting algorithms (e.g. for cylindrical shape), polynomials, filters or other procedures. In many cases, the influence of the reference plane was not fully recognized when considering valley depth. Moreover, the influence of areal form removal on edge-to-dimple and valley-to-dimple distances was not precisely defined. In this research, commonly used algorithms for form separation in surface topography analysis were assessed for the applications considered. The digital filter bandwidth for valley depth analysis was also specified, and the distortion of edge-located oil pockets was characterized. It was found that the application of robust techniques does not necessarily provide the desired results.
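One of the reference-plane procedures mentioned above, least-squares polynomial form removal, can be sketched as follows on a synthetic plateau surface with a single deep oil pocket. The surface model, grid size and pocket geometry are illustrative assumptions; the point of the example is that the valley should survive form removal with its depth nearly intact:

```python
import numpy as np

def remove_polynomial_form(z, degree=2):
    """Subtract a least-squares polynomial reference surface of the
    given total degree from the height map z (areal form removal)."""
    ny, nx = z.shape
    x, y = np.meshgrid(np.linspace(-1, 1, nx), np.linspace(-1, 1, ny))
    # design matrix of all monomials x^i * y^j with i + j <= degree
    cols = [(x**i * y**j).ravel()
            for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    form = (A @ coeffs).reshape(z.shape)
    return z - form

# synthetic plateau surface on a parabolic form, with one burnished oil pocket
ny, nx = 64, 64
x, y = np.meshgrid(np.linspace(-1, 1, nx), np.linspace(-1, 1, ny))
z = 0.5 * x**2 + 0.1 * y        # areal form (curvature + tilt)
z[20:28, 20:28] -= 2.0          # deep valley: the oil pocket
residual = remove_polynomial_form(z, degree=2)
```

Because the pocket covers few pixels, an ordinary least-squares fit is only slightly biased by it here; the paper's point stands that for edge-located pockets, or with an ill-chosen degree or filter bandwidth, the reference surface can dip into the valleys and distort their measured depth.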