Search Results
- Searching for the corner seismic moment in worldwide data (Felgueiras, Miguel; Santos, Rui; Oliveira Martins, João Paulo). In this paper the existence of a corner frequency value for the seismic moment distribution is investigated by analysing worldwide data. Pareto-based distributions, usually considered the most suitable for this type of data, are fitted to the most recent data available in a global earthquake catalog. Despite the undeniably finite nature of the seismic moment data, we conclude that no corner frequency can be established from the available data set. © 2015 AIP Publishing LLC.
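A test of this kind can be illustrated by a likelihood-ratio comparison between a plain Pareto tail and a Kagan-style tapered Pareto, whose exponential taper scale plays the role of the corner seismic moment. The sketch below is a minimal illustration of that idea, not the paper's procedure; the synthetic data, the starting values, and the use of scipy.optimize are all our assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

rng = np.random.default_rng(0)
x_min = 1e18                                 # assumed lower truncation (N*m)
x = x_min * (1 + rng.pareto(0.67, 5000))     # synthetic Pareto-like moments

def negloglik_pareto(b):
    # Pareto on [x_min, inf): f(x) = beta * x_min^beta / x^(beta+1)
    beta = b[0]
    if beta <= 0:
        return np.inf
    return -np.sum(np.log(beta) + beta * np.log(x_min) - (beta + 1) * np.log(x))

def negloglik_tapered(params):
    # Tapered Pareto (Kagan): S(x) = (x_min/x)^beta * exp((x_min - x)/theta),
    # where theta is the corner moment; density via f(x) = -dS/dx.
    beta, log_theta = params
    theta = np.exp(log_theta)
    if beta <= 0:
        return np.inf
    logdens = (np.log(beta / x + 1.0 / theta)
               + beta * np.log(x_min / x)
               + (x_min - x) / theta)
    return -np.sum(logdens)

fit0 = minimize(negloglik_pareto, x0=[0.7], method="Nelder-Mead")
fit1 = minimize(negloglik_tapered, x0=[0.7, np.log(1e21)], method="Nelder-Mead")

# Likelihood-ratio statistic: does a finite corner moment improve the fit?
lr = 2 * (fit0.fun - fit1.fun)
print(f"LR = {lr:.2f}, p = {chi2.sf(lr, df=1):.3f}")
```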
- Known Mean, Unknown Maxima? Testing the Maximum Knowing Only the Mean (Santos, Rui; Oliveira Martins, João Paulo; Felgueiras, Miguel). In the quantitative group testing problem, the use of the group mean to identify whether the group maximum is greater than a prefixed threshold (infected group) is analysed, using n independent and identically distributed individuals. Under these conditions, it is shown that the information in the mean is sufficient to classify each group as infected or healthy with a low probability of misclassification when the underlying distribution is a unilateral heavy-tailed distribution.
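The intuition is that under a heavy-tailed law a single large observation dominates the group sum, so a high mean almost always signals a high maximum. The Monte Carlo sketch below illustrates this for a Pareto population; the group size, threshold, and cutoff calibration are illustrative assumptions of ours, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)
n, groups = 20, 100_000          # group size, number of simulated groups
t = 50.0                         # threshold on the maximum: max > t => infected
samples = 1 + rng.pareto(1.0, size=(groups, n))   # heavy-tailed Pareto(alpha=1)

infected = samples.max(axis=1) > t    # true status, defined by the maximum
means = samples.mean(axis=1)

# Classify from the mean alone: flag the group when the mean exceeds c.
# Here c is calibrated by grid search against the simulated truth; in
# practice it would come from the assumed population distribution.
cs = np.linspace(1.0, 10.0, 400)
err = np.array([np.mean((means > c) != infected) for c in cs])
c_best = cs[int(np.argmin(err))]
print(f"best cutoff c = {c_best:.2f}, misclassification = {err.min():.4f}")
```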
- Estimation Through Array-Based Group Tests (Oliveira Martins, João Paulo; Felgueiras, Miguel; Santos, Rui). Pooling individual samples for batch testing is a common procedure for reducing costs. The recent use of multidimensional array algorithms, driven by the emergence of robotic pooling, is an innovative way of pooling. We show that two-dimensional array-based group tests can provide accurate estimates of the prevalence rate even in situations in which the traditional estimators, applied to one-dimensional arrays, are not valid. To this end, a computational script was developed to determine the prevalence rate estimate that minimizes the sum of the squared deviations between the observed and expected numbers of rows and columns whose pooled sample had a positive test result. © 2017, National Statistical Institute. All rights reserved.
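For an N×N array with perfect tests and independent infections, a row (or column) pool is negative only when all N of its members are healthy, so the expected number of positive row pools is N(1 - (1-p)^N). A least-squares estimator of the kind the abstract describes can then be sketched as below; the perfect-test assumption, the grid search, and the helper names are simplifications of ours, not the paper's script.

```python
import numpy as np

def expected_positive_pools(p, N):
    # A pool of N independent individuals is positive with probability
    # 1 - (1-p)^N, so among N such pools we expect N * (1 - (1-p)^N).
    return N * (1.0 - (1.0 - p) ** N)

def estimate_prevalence(r_obs, c_obs, N, grid=np.linspace(1e-4, 0.5, 5000)):
    # Choose p minimizing the sum of squared deviations between the observed
    # and expected counts of positive row and column pools.
    loss = ((r_obs - expected_positive_pools(grid, N)) ** 2
            + (c_obs - expected_positive_pools(grid, N)) ** 2)
    return grid[int(np.argmin(loss))]

# Example: a 10x10 array where 4 row pools and 3 column pools tested positive.
print(f"p_hat = {estimate_prevalence(4, 3, 10):.4f}")
```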
- Three-dimensional array-based group testing algorithms with one stage (Oliveira Martins, João Paulo; Felgueiras, Miguel; Santos, Rui). The use of three-dimensional array-based testing algorithms is, in some situations, more efficient and accurate than other, more commonly used, pooled-sample testing protocols. We evaluate the advantages of using these complex pooling schemes with only one stage for the problem of estimating the prevalence rate of some disease. Through simulation work, we show that there does not seem to be any advantage in using three- or even higher-dimensional arrays for this type of problem.
- Alternative heavy tailed models in seismology (Felgueiras, Miguel; Martins, João; Santos, Rui). Great earthquakes are commonly considered to be those with moment magnitude (Mw) greater than or equal to 8.0. Since these earthquakes can destroy entire communities located near the epicentre, the search for physical laws that explain the energy they release is an important issue. There is a connection between the radiated energy of an earthquake, its magnitude and its seismic moment (M0). Hence, when fitting a heavy or an extremely heavy tailed distribution to a seismic moment dataset, we are in fact adjusting a mathematical model that explains the amount of energy released by these great seisms. The main goal of this work is therefore to study the most appropriate Pareto-based models (the family most used in this field) for explaining the seismic moment of the great earthquakes. With this purpose in mind, we selected two different catalogs that contain recent events and are considered more accurate than the catalogs used in previous works. We conclude that the traditional Pareto distribution remains a good choice for this kind of data, but the Log-Pareto distribution leads to higher p-values and the location-scale Pareto fits the largest events better.
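The connection mentioned here between moment magnitude and seismic moment is usually taken to be the Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1), with M0 in newton-metres. A small conversion helper makes the scale of the quantities being modelled concrete; the function name and example magnitudes are ours.

```python
def moment_from_magnitude(mw: float) -> float:
    # Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1), M0 in N*m,
    # inverted to give the seismic moment from the moment magnitude.
    return 10.0 ** (1.5 * mw + 9.1)

# Each unit of Mw multiplies the seismic moment (and roughly the released
# energy) by 10^1.5, i.e. about 31.6 times.
for mw in (7.0, 8.0, 9.0):
    print(f"Mw {mw}: M0 = {moment_from_magnitude(mw):.2e} N*m")
```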
- The reference method influence on the sensitivity of the Clostridium difficile enzyme immunoassays: A meta-analysis (Martins, João Paulo; Felgueiras, Miguel; Santos, Rui). The use of enzyme immunoassays to screen for toxins A and B produced by Clostridium difficile is a common procedure in algorithms designed for its detection. Moreover, the absence of a single test capable of providing reliable results at low cost has motivated a long discussion about which algorithm is best. Several studies have therefore evaluated the performance of these enzyme immunoassays. However, all fail to sufficiently explain the different behaviours observed across studies that evaluate the same index test against a common reference method. Our main goal was to find out which factors affect the sensitivity of these assays, since their specificity is very close to 1. In this research, we verified that sensitivity increases with the prevalence rate and with the proportion of reported cases of onset diarrhea. Their use is therefore advisable for high prevalence rates (e.g. in an epidemic setting). As far as reference methods are concerned, nucleic acid amplification tests can be used as a reference method, with a performance similar to the well-accepted toxigenic culture. The method chosen for toxigenicity screening in a toxigenic culture also seems to affect the evaluated performance of the tests and should be studied further in the future.
- Estimation of prevalence in rare diseases using pooled samples (Martins, J. P.; Santos, R.; Felgueiras, M.). The use of pooled samples for screening infected individuals is a well-known procedure for reducing costs. In an estimation problem, the aim is only to determine how many individuals are infected, instead of determining who is infected (the classification problem). In that setting, our goal was to compare the performance of one- and two-dimensional arrays. The best performance was established according to one of the following criteria: minimizing the number of individuals, or the number of tests, required to attain a certain estimation accuracy. We observe that when the aim is to minimize the number of individuals used, the two-dimensional procedures have a slight advantage over the one-dimensional procedures. However, when the major concern is cost, the one-dimensional procedures clearly outperform the two-dimensional ones.
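For the one-dimensional case with perfect tests, the estimator has a closed form: if x of m pools of size k test positive, the MLE of the pool-positivity probability is x/m, and inverting π = 1 - (1-p)^k gives the prevalence estimate. The sketch below states this textbook estimator, not the comparison carried out in the paper; the function name and example counts are ours.

```python
def pooled_prevalence_mle(x_pos: int, m_pools: int, k: int) -> float:
    # A pool of k individuals is positive with prob pi = 1 - (1 - p)^k
    # (perfect tests assumed). The binomial MLE of pi is x/m; inverting
    # gives the prevalence estimate p_hat = 1 - (1 - x/m)^(1/k).
    pi_hat = x_pos / m_pools
    return 1.0 - (1.0 - pi_hat) ** (1.0 / k)

# Example: 12 positive pools out of 100, each pooling 10 individuals.
print(f"p_hat = {pooled_prevalence_mle(12, 100, 10):.4f}")
```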
- Pareto Models for the Energy Released in Earthquakes (Felgueiras, Miguel; Santos, Rui; Martins, João Paulo). In this paper we explore Pareto-based distributions to model the energy released by major seisms. This is a relevant problem because great earthquakes can cause heavy losses, both human and material. The standard Pareto distribution, despite usually fitting data on the energy released by seisms well, reveals some lack of fit for the energy released by the great earthquakes. Besides the more traditional Pareto and Log-Pareto distributions, we also consider the Extended Slash Pareto (ESP) and the Location-Scale Pareto Mixture (LSPM) distributions in this work. For the less studied ESP and LSPM distributions, we present the parameter estimators and perform a simulation study to evaluate their performance under different scenarios. The four distributions are then applied to two datasets (catalogs) containing information on seism magnitude, which has a direct connection to the energy released by earthquakes (the seismic moment). The catalogs used are considered suitably accurate and up to date, and have been used in recent works. In conclusion, the Pareto distribution is still appropriate for this kind of data, but other distributions emerge as better models. The Log-Pareto distribution led to higher fitting p-values than the Pareto distribution, and the LSPM also emerges as a strong competitor. The LSPM fits the largest observations better and therefore gives a more accurate prediction of the energy released by the greatest earthquakes.
- A Maximum Likelihood Estimator for the Prevalence Rate Using Pooled Sample Tests (Martins, João Paulo; Santos, Rui; Felgueiras, Miguel). Since Dorfman's seminal work, research on methodologies involving pooled sample tests has increased significantly. Moreover, the use of pooled samples concerns not only the classification problem (identifying all the infected individuals in a population), but also the problem of estimating the prevalence rate p, as Sobel and Elashoff stated. The use of compound tests is not restricted to hierarchical algorithms, of which the most common example is Dorfman's two-stage procedure. Matrix schemes, such as the square array algorithm or multidimensional matrix schemes, outperform Dorfman's procedure in certain cases. Maximum likelihood estimates are quite difficult to compute when a procedure does not classify all individuals. This paper presents two innovative methods for computing maximum likelihood estimates in both types of procedures.
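When tests are imperfect, even the one-dimensional pooled MLE loses its closed form: with sensitivity Se and specificity Sp, a pool is read positive with probability π(p) = Se(1 - (1-p)^k) + (1-Sp)(1-p)^k, and the binomial likelihood must be maximized numerically. The sketch below does this for that standard setting; it is not one of the paper's two methods, and the Se/Sp values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def mle_prevalence(x_pos, m_pools, k, se=0.95, sp=0.98):
    # Probability that a pool is *read* positive, allowing misclassification:
    # pi(p) = Se * (1 - (1-p)^k) + (1 - Sp) * (1-p)^k
    def negloglik(p):
        q = (1.0 - p) ** k
        pi = se * (1.0 - q) + (1.0 - sp) * q
        pi = np.clip(pi, 1e-12, 1.0 - 1e-12)   # guard the logarithms
        return -(x_pos * np.log(pi) + (m_pools - x_pos) * np.log1p(-pi))
    res = minimize_scalar(negloglik, bounds=(1e-9, 1.0 - 1e-9), method="bounded")
    return res.x

# Example: 12 positive readings out of 100 pools of size 10.
print(f"p_hat = {mle_prevalence(12, 100, 10):.4f}")
```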
- Explaining the seismic moment of large earthquakes by heavy and extremely heavy tailed models (Felgueiras, Miguel Martins). The search for physical laws that explain the energy released by great-magnitude earthquakes is a relevant question, since as a rule they cause heavy losses. Several statistical distributions have been considered in this process, namely heavy tailed laws such as the Pareto distribution with shape parameter α ≈ 0.6667. Yet, for the usually considered Californian region (where earthquakes with moment magnitude, Mw, greater than 7.9 have never been registered), a Pareto distribution with an index near this value seems to have a tail that is "too heavy" to explain the seismic moments of the larger earthquakes. Usually an exponential taper is applied to the right tail of the distribution (above the so-called corner seismic moment), or another distribution is used to explain these high seismic moment data (such as another Pareto with a different shape parameter). The situation is different for other regions where seisms of larger magnitude do occur, leading to data sets for which heavy or even extremely heavy tailed models are appropriate. The purpose of this paper is to model the seismic moment, M0, of the very large earthquakes by particular heavy and extremely heavy tailed distributions. Using worldwide seismic moment information, we apply the Pareto, Log-Pareto and extended slash Pareto distributions to the data, truncated at M0 ≥ 10^21 Nm and at M0 ≥ 10^21.25 Nm. For these great seisms we conclude that the extended slash Pareto is a promising alternative to the more traditional Pareto and Log-Pareto distributions as a candidate for the true model underlying the data.
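For the plain Pareto case, the shape parameter of such a truncated sample has a simple closed-form MLE, alpha_hat = n / sum(log(x_i / x_min)), where x_min is the truncation point (10^21 N·m above). The sketch below applies it to synthetic data; the sample itself, its size, and the seed are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
x_min = 1e21                                  # truncation point from the paper
m0 = x_min * (1 + rng.pareto(0.6667, 1000))   # synthetic seismic moments (N*m)

# MLE of the Pareto shape for a sample truncated below at x_min:
# alpha_hat = n / sum(log(x_i / x_min))  (the Hill estimator at this cutoff).
alpha_hat = len(m0) / np.sum(np.log(m0 / x_min))
print(f"alpha_hat = {alpha_hat:.4f}")   # should be close to 0.6667
```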
