- Statistical Mechanics and Entropy
- Quantum Mechanics and Applications
- Advanced Thermodynamics and Statistical Mechanics
- Probabilistic and Robust Engineering Design
- Fault Detection and Control Systems
- Philosophy and History of Science
- Gaussian Processes and Bayesian Inference
- Neural Networks and Applications
- Bayesian Modeling and Causal Inference
- Probability and Statistical Research
- Quantum Information and Cryptography
- Adversarial Robustness in Machine Learning
- Anomaly Detection Techniques and Applications
- Cold Atom Physics and Bose-Einstein Condensates
- Advanced Text Analysis Techniques
- Sharing Economy and Platforms
- Nuclear Engineering Thermal-Hydraulics
- Forecasting Techniques and Applications
- Cognitive Science and Education Research
- Bayesian Methods and Mixture Models
- Statistical and Numerical Algorithms
- Quantum, Superfluid, and Helium Dynamics
- Scientific Measurement and Uncertainty Evaluation
- Transportation and Mobility Innovations
- Structural Response to Dynamic Loads
RTX (United States)
2022
Massachusetts Institute of Technology
2019-2020
University at Albany, State University of New York
2017-2018
We find that the standard relative entropy and the Umegaki entropy are designed for the purpose of inferentially updating probability distributions and density matrices, respectively. From the same set of inferentially guided design criteria, both of the previously stated entropies are derived in parallel. This formulates a quantum maximum entropy method for inferring density matrices in the absence of complete information.
The problem of measurement in quantum mechanics is studied within the Entropic Dynamics framework. We discuss von Neumann and weak measurements, wavefunction collapse, and weak values as examples of Bayesian and entropic inference.
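The Umegaki relative entropy referenced here is S(ρ||σ) = Tr[ρ(log ρ − log σ)], the quantum counterpart of the standard (Kullback–Leibler) relative entropy. A minimal numerical sketch of this quantity, with hypothetical example states (this illustrates the functional itself, not the article's derivation):

```python
import numpy as np
from scipy.linalg import logm

def umegaki_relative_entropy(rho, sigma):
    """Umegaki relative entropy S(rho || sigma) = Tr[rho (log rho - log sigma)]."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

# Hypothetical qubit states: a correlated mixed state and the maximally mixed state.
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])
sigma = np.eye(2) / 2

S = umegaki_relative_entropy(rho, sigma)
print(S)  # nonnegative; zero iff rho == sigma (Klein's inequality)
```

Minimizing this functional subject to expectation-value constraints is the mechanical core of a quantum maximum entropy update.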
The recent article 'Entropic Updating of Probability and Density Matrices' [1] derives and demonstrates the inferential origins of both the standard and quantum relative entropies in unison. Operationally, they are shown to be designed for the purpose of inferentially updating probability distributions and density matrices, respectively, when faced with incomplete information. We call the updating procedure for density matrices the 'quantum maximum entropy method'. Standard inference techniques in quantum theory can be criticized as lacking concrete physical...
Using first principles from inference, we design a set of functionals for the purpose of ranking joint probability distributions with respect to their correlations. Starting from a general functional, we impose its desired behavior through the Principle of Constant Correlations (PCC), which constrains the correlation functional to behave in a consistent way under statistically independent inferential transformations. The PCC guides us in choosing the appropriate design criteria for constructing the desired functionals. Since the derivations depend on...
This thesis synthesizes probability and entropic inference with Quantum Mechanics (QM) and quantum measurement [1-6]. It is shown that the standard and quantum relative entropies are tools designed for the purpose of updating probability distributions and density matrices, respectively [1]. The derivations, completed in tandem, follow from the same inferential principle of minimal updating [21,66]. As the quantum maximum entropy method is derived using the standard quantum mechanical formalism, it may be appended to that formalism to remove collapse as a required postulate, in agreement with [11]....
We motivate and calculate Newton–Cotes quadrature integration variance and compare it directly with Monte Carlo (MC) integration variance. We find an equivalence between deterministic quadrature sampling and random MC sampling by noting that deterministic quadrature is statistically indistinguishable from a method that uses quadrature on a randomly shuffled (permuted) function. We use this statistical equivalence to regularize the form of permissible Bayesian quadrature priors such that they are guaranteed to be objectively comparable with MC. This leads to a proof that simple quadrature methods have expected variances less...
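An illustrative numerical check of this comparison (my own construction, with a hypothetical integrand, not the article's proof): treat the trapezoid rule applied to a randomly permuted grid as a random estimator and compare its empirical variance against plain Monte Carlo with the same number of function evaluations.

```python
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sin(np.pi * x)   # hypothetical integrand on [0, 1]
n, trials = 32, 5000

# Composite trapezoid nodes and normalized weights (weights sum to 1).
x = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / (n - 1))
w[0] = w[-1] = 0.5 / (n - 1)

# Plain MC: mean of f at n uniform random points.
mc = [f(rng.random(n)).mean() for _ in range(trials)]
# "Shuffled" quadrature: trapezoid weights applied to a permuted grid.
shf = [w @ f(rng.permutation(x)) for _ in range(trials)]

print(np.var(mc), np.var(shf))  # shuffled-quadrature variance is far smaller here
```

The near-uniform trapezoid weights make the permuted-grid estimator's variance collapse toward zero, consistent with the claim that simple quadrature methods have expected variances no larger than MC.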
Kolmogorov's first axiom of probability is that probability takes values between 0 and 1; however, in Cox's derivation of probability theory, having a maximum value of unity is arbitrary, since he derives probability as a tool to rank degrees of plausibility. Probability can then be used to make inferences in instances of incomplete information, which is the foundation of Bayesian probability theory. This article formulates a rule that, if obeyed, allows probability to take complex values while still remaining consistent with this interpretation of probability theory. It is shown that the Kirkwood distributions and the conditional probabilities proposed by Hofmann do...
This article expands the framework of Bayesian inference and provides direct probabilistic methods for approaching tasks that are typically handled with information theory. We treat probability updating as a random process and uncover intrinsic quantitative features of joint probability distributions called inferential moments. Inferential moments quantify shape information about how a prior distribution is expected to update in response to information that is yet to be obtained. Further, we find a power...
This article offers a new paradigm for analyzing the behavior of uncertain multivariable systems using a set of quantities we call inferential moments. Marginalization is an uncertainty quantification process that averages conditional probabilities to quantify the expected value of a probability of interest. Inferential moments are higher-order conditional probability moments that describe how a distribution is expected to respond to new information. Of particular interest in this article is the inferential deviation, which is the expected fluctuation of the probability of one variable in response to an update of another. We find a power series...
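A minimal sketch of the inferential deviation as described above: for each outcome x, the expected fluctuation of p(x) when Y is learned, sqrt(Σ_y p(y) (p(x|y) − p(x))²). The joint distribution is hypothetical and the exact normalization in the article may differ:

```python
import numpy as np

def inferential_deviation(p_xy):
    """Expected fluctuation of p(x) under an update on Y:
    sigma(x) = sqrt( sum_y p(y) * (p(x|y) - p(x))^2 ).
    Rows index x, columns index y."""
    py = p_xy.sum(axis=0)                    # marginal p(y)
    px = p_xy.sum(axis=1)                    # marginal p(x) via marginalization
    p_x_given_y = p_xy / py                  # column y holds p(x|y)
    return np.sqrt(((p_x_given_y - px[:, None]) ** 2 * py).sum(axis=1))

# Hypothetical correlated joint distribution over (X, Y).
p_xy = np.array([[0.30, 0.10],
                 [0.10, 0.50]])
print(inferential_deviation(p_xy))  # strictly positive: learning Y moves p(x)
```

For a product (independent) joint distribution the deviation vanishes, since p(x|y) = p(x) for every y.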
This paper examines the statistical mechanical and thermodynamic consequences of a variable phase-space volume element $h_I=\Delta x_i\,\Delta p_i$. Varying $h_I$ leads to variations in the amount of measured information of a system, but the maximum entropy remains constant due to the uncertainty principle. By taking $h_u\rightarrow 0^+$, an infinite unobservable entropy is attained, leading to energy per particle and chemical equilibrium between all particles. The heat fluxing through a measurement apparatus...
Here we present a Multiple Observer Probability Analysis (MOPA) for the purpose of clarifying topics in experimental Bell scenarios. Because Bell scenarios are concerned with quantum effects between nonlocal measurement devices, we assign an observer to each device: Alice and Bob. Given that the observers are stationary and space-like separated, each is privy to different information along their shared equi-temporal lines due to permutations of the order in which they observe events. Therefore, the observers are inclined to assign different probability distributions to the same set...
In this article we construct a theoretical and computational process for assessing Input Probability Sensitivity Analysis (IPSA) using a Graphics Processing Unit (GPU) enabled technique called Vectorized Uncertainty Propagation (VUP). VUP propagates probability distributions through a parametric model in such a way that its time complexity grows sublinearly in the number of distinct propagated input distributions. VUP can therefore be used to efficiently implement IPSA, which estimates a model's probabilistic...
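The abstract does not give implementation details; the following is a minimal NumPy sketch of the batched-propagation idea only (the actual VUP technique is GPU-based and more elaborate). Several sample-represented input distributions are stacked into one array and pushed through the model in a single vectorized call, with the model and distribution parameters being hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    """Hypothetical parametric model; any vectorized NumPy function works."""
    return np.sin(x) + 0.1 * x ** 2

# K = 5 distinct input distributions, each represented by S = 2000 samples.
means = np.linspace(-1.0, 1.0, 5)
samples = rng.normal(means[:, None], 0.2, (5, 2000))   # shape (K, S)

# Batched propagation: one vectorized call instead of K separate loops.
outputs = model(samples)                                # shape (K, S)

# Per-distribution output statistics feeding the sensitivity analysis.
print(outputs.mean(axis=1))
print(outputs.std(axis=1))
```

Shared, batched evaluation is what lets the cost per additional input distribution shrink, which is the property IPSA exploits.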