We describe the spatial variability of snow accumulation on three selected glaciers in Spitsbergen (Hansbreen, Werenskioldbreen and Aavatsmarkbreen) in the winter seasons 1988/89, 1998/99 and 2001/02, respectively. The distribution of snow cover is determined by the interplay between the orientation of the glacier axes and the dominant easterly winds. Snow distribution is regular on glaciers oriented E-W but more complicated on glaciers oriented meridionally, whose western parts are more predisposed to snow accumulation than the eastern ones owing to the intensity of snowdrift. Statistical relationships were determined between snow accumulation, the deviation of accumulation from mean values, and accumulation variability on the one hand, and topographic parameters (altitude, slope inclination, aspect, slope curvature and distance from the glacier edge) on the other. The only significant relationship occurred between snow accumulation and altitude (r = 0.64-0.91).
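The altitude dependence reported above (r = 0.64-0.91) is an ordinary Pearson correlation. A minimal sketch of how such a correlation and the corresponding accumulation gradient could be computed; the snow-pit arrays below are invented for illustration, not measurements from the paper:

```python
import numpy as np

# Hypothetical snow-pit profile along a glacier centreline (illustrative only):
# altitude in m a.s.l., winter accumulation in m of water equivalent
altitude = np.array([50.0, 120.0, 200.0, 280.0, 350.0, 420.0, 500.0])
accumulation = np.array([0.6, 0.8, 1.1, 1.2, 1.5, 1.6, 1.9])

# Pearson correlation between accumulation and altitude
r = np.corrcoef(altitude, accumulation)[0, 1]

# Linear accumulation gradient from a least-squares fit, per 100 m of altitude
slope, intercept = np.polyfit(altitude, accumulation, 1)
gradient_per_100m = slope * 100.0
```

For a strongly altitude-controlled glacier like the synthetic one above, r approaches 1 and the gradient summarizes the accumulation increase per 100 m of elevation.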
This paper concerns an approach to modelling the ledger-to-stand joints of modular scaffolds. Based on an analysis of the working range of the ledger (represented by a linear load-displacement relationship), two models of the joint are analysed: the first with flexible connections, the second with rigid connections and a transition segment of lower stiffness. The model parameters are selected on the basis of displacement measurements and numerical analyses of the joints, and are then verified. The research shows that both joint-modelling methods recommended in this paper can be applied in engineering practice.
The paper presents the results of a noise propagation analysis of ship structures, tested on a number of AHTS (Anchor Handling Tug Supply) vessels. A Statistical Energy Analysis (SEA) was conducted using a numerical model developed specifically for this investigation; the model covers both the structural elements and the acoustic spaces. For the detailed studies, 47 points fixed at various ship locations were selected. Predictions obtained with the numerical model were compared with experimental results gathered on six identical AHTS vessels. The experimental studies were performed in accordance with the requirements of International Maritime Organization (IMO) Resolution A.468(XII). Finally, a comparison of the model analysis and the experimental test results is presented.
The aim of the article is to assess the functioning of the NewConnect market over its first 10 years from the perspective of both the organizer and the participants. This helps to diagnose the most important organizational advantages and problems of the Polish MTF, determine its further development prospects and propose changes to neutralize the negative factors. To illustrate the problem, a comprehensive analysis is made of aggregated statistical data from 2007-2017, which show the changes and trends on this market, supplemented by data comparing the current state of NewConnect with other alternative markets organized by European stock exchanges. The research conducted does not allow the NewConnect market to be viewed as an organizational success. The analysis identified a number of problems in the functioning of the Polish MTF, ranging from the inappropriate organization of the primary market, which resulted in the admission of too many issuers of dubious credibility, to the consequences appearing on the secondary share market. Nor does it give unambiguous grounds to expect positive prospects for the market's future development. To stop the unfavorable trends and improve issuer quality, a discussion should be initiated on the regulations governing issuer admission, i.e. the minimum equity, the IPO, and the capitalization and issue price of the debuting company.
In 2018, the 90th anniversary of Professor Vasiliy Danilovich Bondaletov's birth will be celebrated. The aim of the article is to remind readers of the quantitative and qualitative method of statistical analysis in anthroponomastic research developed by Professor Bondaletov, and to show its advantages over simplified descriptions of the frequency of personal names. The article presents a detailed analysis of male Christian names found in customs books from Northern Russia (1633-1636 and 1678-1680). Comparing the statistical data as suggested by Professor V. D. Bondaletov enabled us to observe subtle differences between the two sources, namely to estimate their degree of (dis)similarity and to describe the dynamics of the evolution of the stock of male Christian names over the 17th century, as well as changes in the popularity of individual names.
‘Hard’ and ‘soft’ methods in analyses of territorial structures. This article discusses two distinct approaches to investigating territorial structures and their changes: the ‘intuitive’ or ‘soft’ approach and a more rigid, formalized or ‘hard’ one. Examples of analyses of regional patterns in Poland spanning almost 40 years are used to illustrate the relations between the two methodological standpoints. The conclusion is that both are valid and useful; however, their strengths are fully exposed only when they are applied in a complementary way, supporting each other in the difficult process of investigating the multidimensional and dynamic changes of social territorial systems.
This paper proposes a modification of the classical procedure for evaluating the statistical significance of displacements in heterogeneous (e.g. linear-angular) control networks established for deformation measurement and analysis. The proposed solution is based on the idea of local variance factors. The theoretical discussion is complemented by an example of its application to a simulated horizontal control network. The results show that the evaluation of the statistical significance of displacements in heterogeneous control networks should be carried out using estimators of local variance factors.
The paper presents the use of visual evoked potentials (VEP) for the objective assessment of visual acuity. VEP methods rely on the assessment of changes in the electrical action potentials generated within the visual cortex. To diagnose the degree of visual impairment, a series of studies was carried out on both healthy and visually impaired subjects. Electrophysiological examination of the eye using VEP allows a noninvasive and objective assessment of visual acuity. This objectivity may be very important in ophthalmology, particularly in the examination of children, people with mental disabilities and patients suspected of malingering.
The authors focus on the analysis of the probability density function of the equivalent noise level, in the context of determining the uncertainty of results obtained in the control of environmental acoustic hazards. They discuss whether the classical normal distribution can correctly be applied to estimate the interval of the expected value of the equivalent sound level. The authors also provide a set of procedures for its derivation, based on an assumption about the distribution of the measurement results. The results obtained then form the basis for the correct calculation of the uncertainty of the monitored environmental acoustic hazard indicator.
The paper addresses the assessment of the uncertainty of measurement results, an essential problem in environmental acoustic investigations. Attention is drawn to the usually omitted problem of verifying the assumptions behind the classic methods of confidence-interval estimation for the monitored quantity. In particular, the paper highlights the need to verify the assumption of a normal distribution of the measured quantity, which underlies the existing and binding procedures for assessing the uncertainty of acoustic measurements. The problem concerns the binding legal and standard acts related to acoustic measurements and the recommendations of the 'Guide to the expression of uncertainty in measurement' (GUM) (OIML 1993), developed under the aegis of the International Bureau of Weights and Measures (BIPM). The legitimacy of the hypothesis of a normal distribution of the measured quantity in acoustic measurements is discussed and tested against environmental acoustic results. The Jarque-Bera test, based on the skewness and kurtosis of the distribution, was used to verify this assumption; it allows simultaneous analysis of deviations from normality caused both by skewness and by kurtosis. The experiments concerned the distributions of the sound levels LD, LE, LN and LDWN, the basic noise indicators in assessments of environmental acoustic hazards.
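The Jarque-Bera statistic combines sample skewness S and kurtosis K as JB = n/6 (S^2 + (K-3)^2/4) and is compared against a chi-squared critical value with 2 degrees of freedom. A minimal sketch on simulated data; the "sound level" samples below are synthetic, not the paper's measurements:

```python
import numpy as np

def jarque_bera(x):
    """Jarque-Bera statistic: JB = n/6 * (S**2 + (K - 3)**2 / 4)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    d = x - x.mean()
    s2 = np.mean(d**2)
    skewness = np.mean(d**3) / s2**1.5
    kurtosis = np.mean(d**4) / s2**2
    return n / 6.0 * (skewness**2 + (kurtosis - 3.0)**2 / 4.0)

rng = np.random.default_rng(0)
# Simulated sound-level samples in dB (illustrative, not measured data)
normal_levels = rng.normal(60.0, 3.0, size=2000)        # Gaussian case
skewed_levels = 55.0 + rng.exponential(3.0, size=2000)  # strongly skewed case

jb_normal = jarque_bera(normal_levels)   # small for a normal sample
jb_skewed = jarque_bera(skewed_levels)   # large: normality clearly rejected
```

Values above the chi-squared(2) critical point (5.99 at the 5% level) indicate that the normal-distribution assumption, and hence the classical confidence interval, is not justified.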
The paper deals with frequency estimation methods for sine-wave signals spanning only a few signal cycles and consists of two parts. The first part contains a short overview in which analytical error formulae for a signal distorted by noise and harmonics are presented. These formulae are compared with more accurate equations presented previously by the authors, which remain accurate even below one cycle in the measurement window. The second part compares eight estimation methods (ESPRIT, TLS, Prony LS, a newly developed IpDFT method and four other 3-point IpDFT methods) with respect to calculation time and accuracy for an ideal sine-wave signal, a signal distorted by AWGN noise and a signal distorted by harmonics. The number of signal cycles is limited to between 0.1 and 3 or 5. The results enable the selection of the most accurate or fastest estimation method under various measurement conditions. Parametric methods are more accurate but also much slower than IpDFT methods (up to 3000 times for 5000 samples). The newly developed method is more accurate than the other IpDFT methods and much faster than the parametric methods, which makes it a practical alternative, especially in real-time applications.
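To make the IpDFT idea concrete, here is a sketch of a classic two-point interpolated-DFT estimator for a Hann window (Grandke's formula). This is a generic textbook variant, not the newly developed method from the paper, and it is demonstrated on a well-resolved multi-cycle tone rather than the sub-cycle regime the paper targets:

```python
import numpy as np

def ipdft_freq(x, fs):
    """Two-point interpolated-DFT frequency estimate (Hann window, Grandke)."""
    n = len(x)
    spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
    k = int(np.argmax(spectrum[1:-1])) + 1   # peak bin, skipping DC and Nyquist
    # Interpolate towards the larger neighbouring bin
    if spectrum[k + 1] >= spectrum[k - 1]:
        alpha = spectrum[k + 1] / spectrum[k]
        delta = (2.0 * alpha - 1.0) / (alpha + 1.0)
    else:
        alpha = spectrum[k - 1] / spectrum[k]
        delta = -(2.0 * alpha - 1.0) / (alpha + 1.0)
    return (k + delta) * fs / n

fs = 1000.0
t = np.arange(2048) / fs
f_true = 52.37                               # deliberately off the DFT bin grid
f_est = ipdft_freq(np.sin(2 * np.pi * f_true * t), fs)
```

The fractional bin offset delta follows from the ratio of the two largest Hann-windowed spectrum magnitudes; for a noiseless, well-resolved tone the residual error comes only from finite-length leakage.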
To achieve better precision of features generated using micro-electrical discharge machining (micro-EDM), the wear of the tool electrode must be minimized, because a change in the dimensions of the electrode is reflected directly or indirectly in the machined feature. This paper presents a novel modeling and analysis approach to tool wear in micro-EDM, using a systematic statistical method to exemplify the influences of capacitance, feed rate and voltage on the tool wear ratio. The association between the tool wear ratio and the input factors is examined using main effect plots, interaction effects and regression analysis. A maximum four-fold variation in the tool wear ratio has been observed, indicating that it varies significantly over the trials. As the capacitance increases from 1 to 10 nF, the tool wear ratio increases by 33%. An increase in voltage as well as capacitance leads to an increase in the number of charged particles and the number of collisions among them, which further enhances the proportion of heat energy transferred to the tool surface. Furthermore, to model the tool wear phenomenon, a regression relationship between the tool wear ratio and the process inputs has been developed.
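A main-effects regression of this kind can be sketched with ordinary least squares. The two-level trial data below are invented for illustration and are not the paper's experimental results:

```python
import numpy as np

# Invented micro-EDM trials (two-level settings; illustrative only)
capacitance = np.array([1.0, 1.0, 10.0, 10.0, 1.0, 10.0, 1.0, 10.0])         # nF
feed_rate   = np.array([5.0, 10.0, 5.0, 10.0, 5.0, 10.0, 10.0, 5.0])         # um/s
voltage     = np.array([80.0, 80.0, 80.0, 80.0, 100.0, 100.0, 100.0, 100.0]) # V
wear_ratio  = np.array([0.9, 1.0, 1.2, 1.3, 1.1, 1.6, 1.2, 1.4])

# Main-effects model: wear_ratio ~ b0 + b1*capacitance + b2*feed_rate + b3*voltage
design = np.column_stack([np.ones_like(wear_ratio), capacitance, feed_rate, voltage])
coef, *_ = np.linalg.lstsq(design, wear_ratio, rcond=None)
fitted = design @ coef
```

With a balanced (orthogonal) two-level design like this one, each coefficient equals the corresponding mean effect difference divided by the factor-level spacing, so the sign of `coef[1]` directly reflects whether higher capacitance increases the wear ratio.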
Forecasting and SWOT analysis are helpful tools in business activity, because under conditions of dynamic change in both the closer and the more distant environment, reliable forward-looking information and trend analysis play a decisive role. At present, the ability to use available data in forecasting and other analyses in step with changes in the business environment is a key managerial skill, since both forecasting and SWOT analysis are an integral part of the management process, and an appropriate level of forecasting knowledge is increasingly appreciated. Examples are given of the practical use of selected forecasting methods in optimizing procurement, production and distribution processes in foundries. The possibilities of using conventional quantitative forecasting methods based on econometric and adaptive models, applying the creeping trend and harmonic weights, are presented. The econometric models are additionally supplemented with a presentation of the error estimation methodology, quality assessment and statistical verification of the forecast. The possibility of using qualitative forecasts based on SWOT analysis is also discussed.
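The creeping-trend and harmonic-weights models mentioned above are specific techniques; as a simpler stand-in, the sketch below shows the most basic adaptive model, simple exponential smoothing, together with an ex-post error measure, on invented monthly foundry output:

```python
import numpy as np

# Invented monthly casting output of a foundry, in tonnes (illustrative only)
output = np.array([510.0, 523.0, 518.0, 540.0, 552.0, 549.0,
                   565.0, 578.0, 571.0, 590.0, 602.0, 598.0])

def ses_forecast(y, alpha=0.3):
    """One-step-ahead forecasts from simple exponential smoothing."""
    f = np.empty(len(y))
    f[0] = y[0]
    for t in range(1, len(y)):
        f[t] = alpha * y[t - 1] + (1.0 - alpha) * f[t - 1]
    next_month = alpha * y[-1] + (1.0 - alpha) * f[-1]
    return f, next_month

fitted, forecast = ses_forecast(output)
# Ex-post forecast quality: mean absolute percentage error over the fitted range
mape = np.mean(np.abs((output[1:] - fitted[1:]) / output[1:])) * 100.0
```

The MAPE computed on the fitted values is one of the standard error-estimation measures used in the statistical verification of forecasts; for trending series, adaptive models such as the creeping trend reduce the lag that plain exponential smoothing exhibits.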
The exploitation of lignite within the Muskau Arch area, carried out from the mid-nineteenth century, contributed to the transformation of the natural environment and to changes in the water regime. Pit lakes formed in the post-mining subsidence basins. The chemical composition of their waters is a consequence of the intensive weathering of pyrite (FeS2), which is present in the Miocene lignite-bearing rocks forming the banks of the lakes. This process leads to Acid Mine Drainage (AMD) and, ultimately, to the acidification of the lake waters. This paper presents the results of identifying the hydrogeochemical processes affecting the chemistry of the waters of these reservoirs, carried out using speciation and statistical (cluster and factor) analyses. Cluster analysis allowed the analyzed group of anthropogenic reservoirs to be separated into seven subgroups characterized by a similar chemical composition of waters. The major processes affecting the water chemistry were identified and interpreted with the help of factor analysis and the speciation analysis of two major parameters (iron and sulfur).
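The standardize-then-cluster workflow behind such a grouping can be sketched compactly. The water-chemistry table below is invented, the lake count and variables do not match the paper's, and a plain k-means (Lloyd's algorithm) with a fixed initialization stands in for whichever clustering variant the authors used:

```python
import numpy as np

# Invented water-chemistry table for eight pit lakes: pH, sulfate (mg/l), iron (mg/l)
chemistry = np.array([
    [2.8, 1400.0, 120.0],   # strongly acidic, AMD-dominated
    [3.0, 1250.0, 100.0],
    [2.9, 1500.0, 140.0],
    [6.5,  200.0,   2.0],   # near-neutral
    [6.8,  150.0,   1.0],
    [7.0,  180.0,   1.5],
    [4.5,  600.0,  30.0],   # intermediate
    [4.3,  650.0,  35.0],
])

# Standardize so that pH and sulfate contribute on comparable scales
z = (chemistry - chemistry.mean(axis=0)) / chemistry.std(axis=0)

def kmeans(points, init_idx, iters=20):
    """Plain Lloyd's algorithm with a fixed, reproducible initialization."""
    centers = points[init_idx].copy()
    labels = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels

labels = kmeans(z, init_idx=[0, 3, 6])   # seeded with three chemically distinct lakes
```

Standardization is the essential step here: without it, sulfate concentrations in the hundreds of mg/l would dominate the Euclidean distances and the pH signal would be lost.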
The use of quantitative methods, including stochastic and exploratory techniques, in environmental studies does not yet seem sufficient in practice: there is no comprehensive analytical system dedicated to this issue, nor much research on the subject. The aim of this study is to present the Eco Data Miner system, its concept, construction and the possibility of implementing it within existing environmental information systems. The methodological emphasis is placed on one-dimensional data quality assessment using the proposed QAAH1 method, which combines a harmonic model and robust estimators with the classical tests for outlier values and their iterative extensions. The results demonstrate both that the proposed solution complements the classical methods and that it significantly extends their range of applications. Its practical usefulness is also high, owing to its effectiveness, numerical efficiency and simplicity of use.
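The QAAH1 method itself is not specified here; the sketch below only illustrates the general idea it draws on, fitting a harmonic (annual-cycle) model and screening residuals with a robust MAD-based scale, on a synthetic series with two injected gross errors:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(48, dtype=float)               # four years of monthly samples
# Synthetic concentration series: annual harmonic plus noise, with two gross errors
series = 10.0 + 3.0 * np.sin(2.0 * np.pi * t / 12.0) + rng.normal(0.0, 0.2, 48)
series[10] += 6.0
series[30] -= 5.0

# Fit the harmonic model c0 + c1*sin + c2*cos by least squares
design = np.column_stack([np.ones_like(t),
                          np.sin(2.0 * np.pi * t / 12.0),
                          np.cos(2.0 * np.pi * t / 12.0)])
coef, *_ = np.linalg.lstsq(design, series, rcond=None)
residuals = series - design @ coef

# Robust scale: median absolute deviation, rescaled for consistency with sigma
mad_scale = 1.4826 * np.median(np.abs(residuals - np.median(residuals)))
outliers = np.where(np.abs(residuals) > 3.5 * mad_scale)[0]
```

Because the MAD is insensitive to the gross errors themselves, the screen flags the injected values without the masking effect that a classical standard-deviation threshold would suffer.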