SEMINAR on "PROBABILITY THEORY, STATISTICS and APPLICATIONS"
Rezumat: This paper develops a new global optimisation method that applies to a family of criteria that are not entirely known. This family includes the criteria obtained from the class of posteriors with analytically intractable normalising constants. The procedure applies to posterior probability densities that are continuously differentiable with respect to their parameters. The proposed approach avoids the re-sampling needed for classical Monte Carlo maximum likelihood inference, while providing the convergence properties missing from ABC-based methods. Results on simulated and real data are presented. The real data application fits an inhomogeneous area-interaction point process to cosmological data. The obtained results validate two important aspects of the galaxy distribution in our Universe: proximity of the galaxies to the cosmic filament network, together with territorial clustering at given interaction ranges. Finally, conclusions and perspectives are outlined.
Rezumat: The talk will review in the first part concepts such as martingale probability measures, the Black-Scholes model, and the martingale method (a duality method) applied to portfolio optimization. In the second part we will present an application to household portfolio sharing. This talk is based on joint work with Oumar Mbodji and Adrien Nguyen.
Rezumat: The problem considered is the test of the effect of a Hilbert space valued covariate on a Hilbert space valued response, that is, the test of the nullity of the conditional expectation of the response given a general covariate. This general framework includes the model check problem for standard mean and quantile regressions, functional regressions, etc., against general alternatives. It also includes the problem of testing conditional independence with functional data. The significance test for (functional) regressors in nonparametric regression with general covariates and responses is another example. We propose a new test based on kernel smoothing. The test statistic is asymptotically standard normal under the null hypothesis, provided the smoothing parameter tends to zero at a suitable rate. The one-sided test is consistent against any fixed alternative and detects local alternatives à la Pitman approaching the null hypothesis. In particular, we show that neither the dimension of the outcome nor the dimension of the covariates influences the theoretical power of the test against such local alternatives. The uniform consistency against special classes of functions of the covariate is also studied. Simulation experiments and a real data application illustrate the performance of the new test with finite samples.
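To fix ideas, a minimal sketch of a kernel-smoothing test of this type for a scalar covariate is given below. This is the classical Zheng (1996)-style degenerate U-statistic, shown only as an illustration of the construction; the talk's statistic, its standardisation, and the Hilbert-space setting are assumptions not reproduced here.

```python
import numpy as np

def kernel_test_statistic(x, residuals, h):
    """Kernel-smoothing statistic for testing E[eps | X] = 0.

    x: (n,) covariate sample; residuals: (n,) estimated residuals;
    h: bandwidth. The standardised statistic is asymptotically N(0, 1)
    under the null as h -> 0 and n*h -> infinity.
    """
    # Gaussian kernel at all pairwise scaled covariate distances
    d = (x[:, None] - x[None, :]) / h
    k = np.exp(-0.5 * d**2) / np.sqrt(2.0 * np.pi)
    np.fill_diagonal(k, 0.0)              # drop the i == j terms
    ee = residuals[:, None] * residuals[None, :]
    num = np.sum(k * ee)                  # degenerate U-statistic core
    var = 2.0 * np.sum(k**2 * ee**2)      # plug-in variance under the null
    return num / np.sqrt(var)
```

Under a fixed alternative (residuals correlated with the covariate) the statistic diverges to +infinity, which is what makes the one-sided test consistent.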
Rezumat: In 1981 Föllmer observed that, as long as we are dealing with paths that have a quadratic variation, Itô's formula is a pathwise identity. This was a precursor of Lyons's rough path theory of 1994-1998, in which paths are required to come with iterated integrals, and which leads to a pathwise robust version of stochastic (partial) differential equations. In recent years Föllmer's calculus has been revisited by a number of researchers, and his results have been sharpened and extended to many new scenarios. In my talk I will present a Föllmer calculus for paths that are more irregular than semimartingales (e.g. fractional Brownian motion with small Hurst index), I will discuss extensions to path-dependent functionals, and pathwise local times. Based on joint works with Rama Cont and David Prömel.
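For orientation, Föllmer's pathwise Itô formula states that for F in C^2 and a path x admitting a quadratic variation [x] along a fixed sequence of partitions,

```latex
F(x_t) = F(x_0) + \int_0^t F'(x_s)\,\mathrm{d}x_s
       + \frac{1}{2}\int_0^t F''(x_s)\,\mathrm{d}[x]_s ,
```

where the first integral is defined pathwise as a limit of left-endpoint Riemann sums along the same partition sequence; no probability measure is involved.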
Rezumat: We present a general method for extending Markov processes to a larger state space such that the added points form a polar set. The extension so obtained is an improvement on the standard trivial extension, in which the process is made to stay stuck at the added points, and it provides a new technique for constructing extended solutions to S(P)DEs from all starting points, in such a way that they are solutions at least after any strictly positive time. Concretely, we apply this extension to study SDEs with singular coefficients on an infinite dimensional state space, e.g. SPDEs of evolutionary type. This talk is based on joint work with L. Beznea and M. Röckner.
Rezumat: We revisit some classical facts about cumulants and substantially refine some of them to obtain fine information about sums of dependent random variables. We illustrate our results with some examples with models from statistical mechanics and mathematical finance.
Rezumat: In this talk we will present a few techniques and approaches in data analysis that have in common the use of geometric/topological ideas and concepts for computation and interpretation. In particular, we will discuss multiscale kernel exploratory data analysis, persistent homology, and manifold regression. We will describe the challenges these techniques present, in theory and practice, as well as some recent progress. Several applications will be discussed, in statistical genetics and other fields.
Rezumat: We present a qualitative analysis for stochastic variational inequalities with oblique subgradients, by advancing from the convex framework to the non-convex one. We first provide the existence and uniqueness of a solution for the multivalued equation
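The equation labelled (1) below is not reproduced in this abstract; following the cited papers, it has the schematic form (a reconstruction from the references, with H the matrix mapping and ∂ϕ the subdifferential discussed next):

```latex
\mathrm{d}X_t + H(X_t)\,\partial\varphi(X_t)(\mathrm{d}t) \ni
f(t, X_t)\,\mathrm{d}t + g(t, X_t)\,\mathrm{d}B_t , \qquad t \ge 0. \tag{1}
```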
The mixture between the maximal monotonicity property of the (convex) subdifferential operator ∂ϕ and the Lipschitz property of the matrix mapping X↦H(X) preserves, for the product H∂ϕ, neither the first property nor the second one. The existence result is based on a deterministic approach: a generalized Skorohod problem with oblique reflection is first analyzed. Replacing in (1) the subdifferential operator with the (non-convex) Fréchet subdifferential ∂⁻ϕ of a semiconvex function ϕ, the difficulties are enhanced because, even for the penalizing smooth deterministic problem, the existence of a solution is not provided by existing results. In order to achieve our goals we first extend the well-known results of Brézis concerning the regularization of convex functions, and we obtain the existence and uniqueness of the solution for the penalized problem. Imposing some geometrical assumptions on the domain, the study continues with the analysis of a non-convex Skorohod problem with oblique subgradients, followed by stochastic variational inequalities with generalized reflection. A similar approach is used for backward stochastic variational inequalities (BSVIs, for short) with multivalued operators of convex type. However, a gap arises when we consider this kind of equation governed by Fréchet-type generalized subgradients. The solution consists in using piecewise deterministic Markov processes for driving the considered backward equations. The study is accompanied by a model which aims at the analysis of the infection time in some multistable gene networks.
 Anouar Gassous, Aurel Răşcanu, Eduard-Paul Rotenstein, Stochastic variational inequalities with oblique subgradients, Stoch. Process. Appl. 122(7), 2668-2700, 2012
 Anouar Gassous, Aurel Răşcanu, Eduard-Paul Rotenstein, Multivalued BSDEs with oblique subgradients, Stoch. Process. Appl. 125(8), 3170-3195, 2015
 Dan Goreac, Eduard-Paul Rotenstein, Infection Time in Multistable Gene Networks. A Backward Stochastic Variational Inequality with Nonconvex Switch-Dependent Reflection Approach, Set-Valued Var. Anal. 24(4), 707-734, 2016
 Lucian Maticiuc, Eduard-Paul Rotenstein, Anticipated backward stochastic variational inequalities with generalized reflection, Stoch. Dyn. 18(2), article ID 1850008, 1-21, DOI: 10.1142/S0219493718500089, 2018
 Aurel Răşcanu, Eduard-Paul Rotenstein, A non-convex setup for multivalued differential equations driven by oblique subgradients, Nonlinear Anal.-Theor. 111, 82-104, 2014
Rezumat: In this talk we consider inverse problems in both continuous and discrete statistical frameworks. We review the latest developments in the methodology, emphasizing the similarities as well as the specifics related to the nature of each setting. Moreover, several notions of convergence and the corresponding analysis results are presented. Finally, new ideas are suggested for the theoretical study of this class of inverse problems.
Rezumat: We investigate the numerical reconstruction of the missing thermal boundary conditions on an inaccessible part of the boundary in the case of steady-state heat conduction in anisotropic solids, from the knowledge of over-prescribed noisy data on the remaining accessible boundary. This inverse boundary value problem is approached by employing a variational formulation which transforms it into an equivalent control problem. Four such approaches are presented, and both parameter-dependent and parameter-independent gradient-based algorithms are obtained in each case. The numerical implementation is realized for the 2D case by employing the boundary element method (BEM) and assuming that the available boundary data are either exact or noisy. For perturbed Cauchy data the numerical solution is stabilised/regularised by stopping the iterative procedure according to Morozov's discrepancy principle.
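As an illustration of the stopping rule mentioned at the end, here is a minimal sketch of Morozov's discrepancy principle applied to a generic linear ill-posed system solved by Landweber iteration. The BEM discretisation and the four variational formulations of the talk are not modelled; only the stopping criterion is.

```python
import numpy as np

def landweber_morozov(A, y_noisy, delta, tau=1.2, step=None, max_iter=10000):
    """Landweber iteration x_{k+1} = x_k + step * A^T (y - A x_k),
    stopped by Morozov's discrepancy principle: quit as soon as
    ||A x_k - y_noisy|| <= tau * delta, where delta is the noise level
    and tau > 1 a safety factor. Returns (solution, iterations used)."""
    _, n = A.shape
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # guarantees convergence
    x = np.zeros(n)
    for k in range(max_iter):
        r = y_noisy - A @ x
        if np.linalg.norm(r) <= tau * delta:     # discrepancy reached
            return x, k
        x = x + step * (A.T @ r)
    return x, max_iter
```

Because tau > 1 and the component of the noise orthogonal to the range of A has norm at most delta, the criterion is always reachable, so the iteration stops before the reconstruction starts fitting the noise.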
Rezumat: In my previous talk in Bucharest (11 January 2018) I presented a new algorithm, ABC Shadow, a versatile method for fitting point processes to data. This talk presents several Gibbs point interaction models (Geyer, Connected Components and Area-Interaction) that are fitted via this algorithm to real three-dimensional datasets from the SDSS galaxy catalogue. Under the hypotheses of the considered models, the fitted point processes allow a morphological and statistical characterization of the galaxy distributions. Several model validation techniques are also used, based on Monte Carlo likelihood asymptotics, summary statistics (Ripley's K-function, pair correlation function, ...) and residual analysis for point processes (residual plots, q-q plots, ...). Conclusions and perspectives are finally outlined.
Rezumat: Sarmanov’s family of multivariate distributions has recently gained the interest of researchers in various domains due to its flexible structure, which can model a large range of dependencies for given marginals. Therefore, we start by presenting the distribution’s main characteristics and some of its extensions studied in the literature. In particular, the flexible dependence structure motivated the consideration of Sarmanov’s distribution in the fields of insurance and finance, from which we will present several applications. More precisely, as a first example, we shall discuss the fit of the bivariate Sarmanov distribution with different types of truncated marginal distributions to a bivariate losses data set. As a second example, we introduce some trivariate Sarmanov distributions with Generalized Linear Models for marginals, with the aim of incorporating some individual characteristics of the policyholders when modeling a real trivariate data set of claims frequencies. Finally, we consider the capital allocation problem, which consists in fairly allocating the capital needed to cover the aggregate loss of a company (e.g., insurance company) among its various lines of business. Risk measures are well-known tools used for this purpose, and one of the most popular such risk measures is the Tail Value-at-Risk (TVaR). Based on this risk measure, we present some closed-form allocation formulas for risks modeled by Sarmanov’s distribution.
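To make the flexible structure concrete, here is a minimal sketch of the bivariate Sarmanov density with exponential marginals. The particular kernels and the exponential example are illustrative assumptions, not the truncated marginals of the talk.

```python
import numpy as np

def sarmanov_pdf(x, y, f, g, phi1, phi2, alpha):
    """Bivariate Sarmanov density
        h(x, y) = f(x) g(y) (1 + alpha * phi1(x) * phi2(y)),
    valid when E[phi1(X)] = E[phi2(Y)] = 0 (so the marginals are f, g)
    and 1 + alpha * phi1 * phi2 >= 0 everywhere (nonnegativity)."""
    return f(x) * g(y) * (1.0 + alpha * phi1(x) * phi2(y))

# Example: Exp(1) marginals with the classical kernel phi(t) = exp(-t) - 1/2;
# the centering holds because E[exp(-T)] = 1/2 for T ~ Exp(1).
f = lambda t: np.exp(-t)
phi = lambda t: np.exp(-t) - 0.5
```

Since phi takes values in [-1/2, 1/2], any |alpha| <= 4 keeps the density nonnegative in this example, which shows how the admissible dependence range is tied to the chosen kernels.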
1. Bahraoui, Z., Bolancé, C., Pelican, E. and Vernic, R. – On the bivariate Sarmanov distribution and copula. An application on insurance data using truncated marginal distributions. SORT 39(2), 209-230, 2015.
2. Bolancé, C. and Vernic, R. – Multivariate count data generalized linear models: Two approaches based on Sarmanov's distribution. 21st International Congress on Insurance: Mathematics and Economics, July 3-5, Vienna, 2017.
3. Kotz, S. et al. – Continuous Multivariate Distributions. Vol. 1: Models and Applications. Wiley, 2000.
4. Sarmanov, O.V. – Generalized normal correlation and two-dimensional Fréchet classes. Doklady (Soviet Mathematics) 168, 596-599, 1966.
5. Vernic, R. – On the distribution of a sum of Sarmanov distributed random variables. Journal of Theoretical Probability 29(1), 118-142, 2016.
6. Vernic, R. – Capital allocation for Sarmanov's class of distributions. Methodology and Computing in Applied Probability 19(1), 311-330, 2017.
Rezumat: Recent theoretical and empirical studies involving nonparametric conditional frontier models stress the importance of conditional efficiency measures as the only way to treat appropriately the presence of external factors and/or environmental variables (Z) in a production process. Conditional efficiency measures are based on the idea that the production process can be described as being conditioned by given values of the external/environmental factors. These factors can be included in the frontier model as exogenous variables and can help explain the efficiency differentials and improve the managerial policy of the evaluated units. Conditional efficiency measures are estimated by means of a nonparametric estimator of the conditional distribution function of the inputs and outputs, conditionally on values of Z. To do this, smoothing procedures and smoothing parameters, the bandwidths, are involved. The bandwidths for the conditioning variables play a crucial role in the process of estimating the efficiency measures, since they “tune” the localization used to compute the conditional efficiencies. Another important aspect is related to the second-stage analysis and the explanation of differences in the efficiency levels achieved by economic producers that are facing different external/environmental conditions. We present the most recent methodological developments in nonparametric estimation of conditional efficiency, completed by numerical illustrations on simulated data and useful insights on practical implementation.
Rezumat: In this presentation we consider the nonparametric robust estimation problem for regression models in continuous time with particular semi-Markov noises. To be more specific, we are interested in estimating an unknown function S on the basis of observations that can be in continuous or discrete time. This problem of nonparametric estimation in regression models is an important chapter of theoretical and applied statistics that has been considered in many frameworks ("signal + white noise" models, "signal + colored noise" regressions based on Ornstein-Uhlenbeck processes, etc.). Our main goal is to develop nonparametric adaptive robust estimation when the noise process exhibits strong dependence; to this end, we use particular cases of semi-Markov processes to model the dependent noises. We construct a series of estimators by projection, and thus we approximate the unknown function by a finite Fourier series. As we consider the estimation problem in an adaptive setting, i.e. in the situation when the regularity of the function is unknown, we develop a new adaptive method based on the model selection procedure proposed by Konev and Pergamenshchikov (2012). First, this procedure gives us a family of estimators; second, we choose the best possible one by minimizing a cost function. Under general moment conditions on the noise distribution, a sharp non-asymptotic oracle inequality for the robust risks is obtained. Our talk is based on:
- V. S. Barbu, S. Beltaief, S. Pergamenshchikov, "Robust adaptive efficient estimation for semi-Markov nonparametric regression models", to appear in Statistical Inference for Stochastic Processes, 1-48, 2018 (also available at https://arxiv.org/abs/1604.04516v2)
- V. S. Barbu, S. Beltaief, S. Pergamenshchikov, "Robust adaptive efficient estimation for a semi-Markov continuous time regression from discrete data", 1-37, 2017 (available at http://arxiv.org/abs/1710.10653)
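A minimal sketch of the projection step (a finite trigonometric Fourier series fitted to noisy equispaced observations) is given below. The basis indexing, white-noise setting and dimension choice are illustrative assumptions; the talk's adaptive model selection over such estimators is not reproduced.

```python
import numpy as np

def fourier_projection_estimate(x, y, d):
    """Projection estimator of S on the first d trigonometric basis
    functions on [0, 1]: hat S(t) = sum_j hat_theta_j * phi_j(t), with
    hat_theta_j = (1/n) * sum_i y_i * phi_j(x_i) for equispaced x_i."""
    def phi(j, t):
        if j == 0:
            return np.ones_like(t)                       # constant
        freq = (j + 1) // 2
        if j % 2 == 1:
            return np.sqrt(2) * np.cos(2 * np.pi * freq * t)
        return np.sqrt(2) * np.sin(2 * np.pi * freq * t)
    theta = np.array([np.mean(y * phi(j, x)) for j in range(d)])
    return lambda t: sum(theta[j] * phi(j, t) for j in range(d))
```

In the adaptive setting the dimension d would be selected by minimizing a penalized cost over a family of such estimators rather than fixed in advance.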
Rezumat: Stationary processes form an important class of stochastic processes that has been extensively studied in the literature. Their applications include modelling and forecasting numerous real-life phenomena, including natural disasters, sustainable energy sources, sales and market movements. One of the most essential families of stationary processes is the ARMA family. When modelling existing data with an ARMA process, the first step is to fix the orders of the model. After that, one can estimate the related parameters by using standard methods such as maximum likelihood (ML) or least squares (LS) estimators. The final step is to conduct various diagnostic tests in order to determine the quality of the model. In this talk we present a novel way of fitting a model to data that is assumed to be a realization of a discrete time stationary process. Our approach is based on a recently proved AR(1) characterisation of stationary processes, where the noise is not assumed to be white. As a result, we obtain a more general and simpler way to fit a model to a stationary time series, thus outperforming traditional ARMA approaches. In particular, we obtain closed-form consistent estimators of various model parameters and their asymptotic normality under general conditions. The results are then applied to the ARCH model with a memory effect. ARCH models can be employed, e.g., in modeling time-varying volatility. We also discuss continuous time extensions.
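For intuition, a closed-form AR(1)-type coefficient estimate is sketched below. This is the classical lag-one autocovariance ratio, shown only to illustrate what "closed-form" means here; the talk's estimator for the non-white-noise characterisation is an assumption not reproduced.

```python
import numpy as np

def ar1_coefficient(z):
    """Closed-form estimate of phi in z_t = phi * z_{t-1} + noise,
    via the lag-one autocovariance ratio hat_gamma(1) / hat_gamma(0)."""
    z = np.asarray(z, dtype=float)
    zc = z - z.mean()                      # center the series
    g0 = np.mean(zc * zc)                  # lag-0 autocovariance
    g1 = np.mean(zc[1:] * zc[:-1])         # lag-1 autocovariance
    return g1 / g0
```

No numerical optimisation is needed, which is the practical advantage of closed-form estimators over iterative ML fitting.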
Rezumat: In financial mathematics, we often model a financial market as a vector of stochastic processes on a given filtered probability space. These processes describe the evolution in time of the financial prices of securities (stocks, bonds, derivatives). The arbitrage pricing theory is a powerful tool for analyzing these prices, and the change of the underlying probability measure has become the classical tool. The aim of this talk is to explain that a different technique, the change of the underlying filtration, provides a characterization of risk premiums attached to particular events that impact security prices (such as the default event of a firm). Intuitively, the change of a filtration redefines the available information within the model.
Rezumat: Unlike empirical propositions (e.g., that ulcers are caused by bacteria), which are refutable and contingent, mathematical propositions are (considered) certain and necessary. The key difference seems to be that in mathematics we have *proofs* - from axioms, using deductive logic. But then what is the status of the axioms, in particular those in the ZFC system? In what sense, if any, do we know them? Is our knowledge of a mathematical truth dependent on our knowledge of the axioms? And in what sense, if any, are the axioms even true? Since we typically do not say that we -prove- the axioms, what kind of justification can we present for them? (The same questions can of course be raised for the logical truths involved in proofs, such as modus ponens.) This talk will survey some philosophical views proposed to address these questions. A suggestion I will gesture toward is that the difference between empirical and logico-mathematical knowledge may not be as deep as usually thought.
Rezumat: A stick of length 1 is broken into pieces by random i.i.d. cuts X_n. Sort the pieces after the n-th cut in ascending order and form their Lorenz curves. What happens asymptotically?
We prove that the limit of these Lorenz curves exists in some cases and conjecture that the most egalitarian distribution of the cuts is the uniform one.
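The experiment is easy to simulate; a minimal sketch for uniform cuts is below (function names are illustrative).

```python
import numpy as np

def break_stick(n, rng):
    """Lengths of the n + 1 pieces of [0, 1] after n iid uniform cuts."""
    cuts = np.sort(rng.uniform(size=n))
    return np.diff(np.concatenate(([0.0], cuts, [1.0])))

def lorenz_curve(pieces):
    """Lorenz curve of the piece lengths: sort ascending, cumulate,
    normalise; returns the curve's values at k/n, k = 0, ..., n."""
    s = np.sort(np.asarray(pieces))
    return np.concatenate(([0.0], np.cumsum(s) / s.sum()))
```

By construction the curve starts at 0, ends at 1, and is nondecreasing and convex; the question in the talk is what deterministic curve it approaches as n grows.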
Rezumat: In the first part, I will illustrate the fundamental role of Hardy's inequalities in the theory of function spaces through two basic examples: the functional calculus in Sobolev spaces and the theory of weighted Sobolev spaces. In the second part, I will present applications of these theories to the study of unimodular Sobolev functions.
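For reference, the classical Hardy inequality alluded to reads: for p > 1 and measurable f >= 0 on (0, ∞),

```latex
\int_0^\infty \Big( \frac{1}{x} \int_0^x f(t)\,\mathrm{d}t \Big)^{p} \mathrm{d}x
\le \Big( \frac{p}{p-1} \Big)^{p} \int_0^\infty f(x)^{p}\,\mathrm{d}x ,
```

with the constant (p/(p-1))^p being sharp.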
Rezumat: The object of this talk is to extend the classical definition of the multidimensional discrete scan statistic with the help of a score function. In this new framework, problems like finding the distribution of a monotone run in a sequence of i.i.d. random variables or scanning with different window shapes (rectangle, circle, ellipse or annulus) in a two-dimensional setting will be discussed. We propose several approximations for the distribution of the scan statistic and illustrate their accuracy by conducting a numerical comparison study.
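A minimal sketch of the classical two-dimensional discrete scan statistic with a rectangular window is given below (the score-function extension and the other window shapes of the talk are not modelled).

```python
import numpy as np

def scan_statistic_2d(counts, m1, m2):
    """Discrete 2D scan statistic with an m1 x m2 rectangular window:
    the maximum window total over all positions of the window inside
    the observation grid, computed via 2D prefix sums."""
    counts = np.asarray(counts, dtype=float)
    # prefix-sum table padded with a leading row/column of zeros
    p = np.zeros((counts.shape[0] + 1, counts.shape[1] + 1))
    p[1:, 1:] = counts.cumsum(axis=0).cumsum(axis=1)
    best = -np.inf
    for i in range(counts.shape[0] - m1 + 1):
        for j in range(counts.shape[1] - m2 + 1):
            # inclusion-exclusion gives the window total in O(1)
            s = p[i + m1, j + m2] - p[i, j + m2] - p[i + m1, j] + p[i, j]
            best = max(best, s)
    return best
```

The score-function generalisation of the talk replaces the raw counts by scores of the observations before scanning, leaving this maximisation step unchanged.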
Rezumat: This paper presents an original ABC algorithm, ABC Shadow, that can be applied to sample posterior densities that are continuously differentiable. The proposed algorithm addresses the main condition that any ABC algorithm must fulfil in order to be useful in practice: it requires enough samples in the parameter space region induced by the observed statistics. The algorithm is tuned on the posterior of a Gaussian model which is entirely known, and then applied to the statistical analysis of several spatial patterns. These patterns are, or are assumed to be, outcomes of point processes. The considered models are: Strauss, Candy and area-interaction.
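For readers unfamiliar with the ABC idea, a generic ABC rejection sampler is sketched below. This is *not* the ABC Shadow algorithm itself, only the baseline scheme it improves on; all names are illustrative.

```python
import numpy as np

def abc_rejection(y_obs, simulate, prior_sample, summary, eps, n_draws, rng):
    """Generic ABC rejection sampler: draw theta from the prior,
    simulate data, and keep theta whenever the simulated summary
    statistic falls within eps of the observed one. The accepted
    draws approximate the posterior as eps -> 0."""
    s_obs = summary(y_obs)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        s = summary(simulate(theta, rng))
        if abs(s - s_obs) < eps:
            accepted.append(theta)
    return np.array(accepted)
```

The "main condition" discussed in the abstract is visible here: if too few prior draws land in the region induced by the observed statistic, the accepted sample is too small to be usable.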
Rezumat: Quasi-stationary distributions describe the motion, conditioned on non-extinction, of Markov processes with almost surely finite lifetime. We present recent results on the existence and uniqueness of quasi-stationary distributions, as well as conditions under which the probabilities conditioned on non-extinction converge exponentially to a quasi-stationary distribution.
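For reference, a probability measure μ is quasi-stationary for a Markov process X with lifetime ζ if, for every measurable set A and every t >= 0,

```latex
\mathbb{P}_\mu \left( X_t \in A \mid t < \zeta \right) = \mu(A),
```

i.e. the conditional law of the process started from μ is invariant in time; exponential convergence refers to the conditional laws from other initial distributions approaching μ at a geometric rate.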
Rezumat: We present a method for constructing branching-fragmentation Markov processes on the space of fragment sizes, induced by a continuous or discontinuous fragmentation kernel, with applications to a stochastic model for the fragmentation phase of an avalanche. We then present simulations of the trajectories and of the distributions of the processes, using a numerical approximation method for solutions of stochastic fragmentation differential equations. Finally, we compute the distributions of the branching processes that approximate the fragmentation process. The talk is based on joint work with Lucian Beznea and Madalina Deaconu.
- October 2017: Functional data analysis - Cristian Preda (Lille/ISMMA)
Rezumat: Data represented by curves are considered as paths of an L_2-continuous stochastic process. The Karhunen-Loève expansion is then used in regression and visualisation frameworks with such data. Applications with data from industry illustrate the theory.
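For reference, the Karhunen-Loève expansion of an L_2-continuous process X on [0, T], with mean function μ and covariance eigenpairs (λ_k, f_k), reads

```latex
X(t) = \mu(t) + \sum_{k \ge 1} \xi_k f_k(t), \qquad
\xi_k = \int_0^T \big( X(t) - \mu(t) \big) f_k(t)\,\mathrm{d}t ,
```

where the scores ξ_k are uncorrelated with variances λ_k; truncating the sum gives the finite-dimensional representation used in regression and visualisation.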
- September 2017: Workshop on stochastics and PDEs