- Sparse and Compressive Sensing Techniques
- Advanced Optimization Algorithms Research
- Merger and Competition Analysis
- Numerical methods in inverse problems
- Advanced Numerical Methods in Computational Mathematics
- Stochastic Gradient Optimization Techniques
- Advanced Numerical Analysis Techniques
- Gaussian Processes and Bayesian Inference
- Neurogenesis and neuroplasticity mechanisms
- Digital Platforms and Economics
- Single-cell and spatial transcriptomics
- Pluripotent Stem Cells Research
- Global trade and economics
- Model Reduction and Neural Networks
- Economic Growth and Productivity
University of Cambridge
2017-2024
Norwegian University of Science and Technology
2024
University of Bath
2024
Ufuk University
2023
The human brain has undergone rapid expansion since humans diverged from the other great apes, but the mechanism of this human-specific enlargement is still unknown. Here, we use cerebral organoids derived from human, gorilla, and chimpanzee cells to study developmental mechanisms driving evolutionary brain expansion. We find that neuroepithelial differentiation is a protracted process involving a previously unrecognized transition state characterized by a change in cell shape. Furthermore, we show that human organoids are larger due...
Discrete gradient methods are geometric integration techniques that can preserve the dissipative structure of gradient flows. Due to the monotonic decay of function values, they are well suited for general convex and nonconvex optimisation problems. Both zeroth- and first-order algorithms can be derived from the discrete gradient method by selecting different discrete gradients. In this paper, we present a comprehensive analysis of the discrete gradient method for optimisation, which provides a solid theoretical foundation. We show the method is well-posed by proving the existence and uniqueness...
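The Itoh–Abe discrete gradient mentioned in these abstracts uses only function evaluations, so the resulting optimisation scheme is derivative-free: each coordinate update solves a scalar equation whose nontrivial root enforces monotone decay of f. The sketch below illustrates the idea under stated assumptions; the probing, bracketing, and bisection details are illustrative choices, not the papers' implementation.

```python
def itoh_abe_step(f, x, tau=0.1, probe=1e-6, tol=1e-10):
    """One sweep of the (sequential) Itoh-Abe discrete gradient method.

    Derivative-free: only evaluations of f are used.  For each coordinate i
    the scalar equation
        phi(t) = (t - x_i)^2 + tau * (f(y_1,..,y_{i-1}, t, x_{i+1},..) - f_prev) = 0
    is solved for its nontrivial root, which enforces the dissipation law
        f_new = f_prev - (t - x_i)^2 / tau  <=  f_prev.
    """
    y = list(x)
    f_prev = f(y)
    for i in range(len(y)):
        xi = y[i]

        def f_i(t):
            # f restricted to coordinate i, leaving y unchanged afterwards
            y[i] = t
            val = f(y)
            y[i] = xi
            return val

        def phi(t):
            return (t - xi) ** 2 + tau * (f_i(t) - f_prev)

        # choose a descent direction along coordinate i by probing f
        if f_i(xi + probe) < f_prev:
            s = 1.0
        elif f_i(xi - probe) < f_prev:
            s = -1.0
        else:
            continue  # xi is (numerically) a minimiser along this axis

        a = xi + s * probe
        if phi(a) >= 0.0:
            continue  # decrease too small to exploit at this resolution

        # expand until phi changes sign, giving a bracket around the root
        h, b = probe, a
        for _ in range(200):
            h *= 2.0
            b = xi + s * h
            if phi(b) > 0.0:
                break
        else:
            continue  # f appears unbounded below along this line; give up

        # bisection: keep lo on the phi <= 0 side so descent is guaranteed
        lo, hi = a, b
        while abs(hi - lo) > tol:
            mid = 0.5 * (lo + hi)
            if phi(mid) <= 0.0:
                lo = mid
            else:
                hi = mid
        y[i] = lo
        f_prev = f(y)
    return y, f_prev
```

Because the accepted point satisfies phi(t) <= 0, every sweep can only decrease f, which is the monotonic-decay property the abstracts refer to; the randomised variants replace the fixed coordinate directions with randomly drawn ones.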
Abstract The optimisation of nonsmooth, nonconvex functions without access to gradients is a particularly challenging problem that is frequently encountered, for example in model parameter optimisation problems. Bilevel parameter learning is a standard setting in areas such as variational regularisation problems and supervised machine learning. We present efficient and robust derivative-free methods called randomised Itoh–Abe methods. These are generalisations of the discrete gradient method, a well-known scheme from geometric...
Stepwise models of technological progress described by Philippe Aghion and his co-authors (1997, 2001, 2005) capture the incentives for firms to innovate in order to escape competition as well as the disincentives that come from sharing profits with other technological leaders. The models yield intuitively appealing predictions about the effects of competition on innovation, but they are limited to duopolies. This paper extends the models to oligopolies and shows that the innovation results for duopoly do not generalize to oligopolies.
In this paper we propose optimisation methods for variational regularisation problems based on discretising the inverse scale space flow with discrete gradient methods. The inverse scale space flow generalises gradient flows by incorporating a generalised Bregman distance as the underlying metric. Its discrete-time counterparts, Bregman iterations and linearised Bregman iterations, are popular regularisation schemes that incorporate a priori information without loss of contrast. Discrete gradient methods are tools from geometric numerical integration for preserving energy...
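As a rough illustration of the linearised Bregman iteration this abstract refers to, the sketch below implements one standard form (soft-thresholded dual ascent). The matrix, right-hand side, and the parameters mu and delta are illustrative assumptions, not values from the paper.

```python
def shrink(v, mu):
    """Entrywise soft-thresholding, the proximal map of mu * ||.||_1."""
    return [max(abs(vi) - mu, 0.0) * (1.0 if vi > 0 else -1.0) for vi in v]

def linearised_bregman(A, b, mu=1.0, delta=0.4, iters=200):
    """Linearised Bregman iteration for
        min  mu * ||x||_1 + ||x||_2^2 / (2 * delta)   subject to  A x = b.

    v_{k+1} = v_k + A^T (b - A x_k)        (accumulate the residual)
    x_{k+1} = delta * shrink(v_{k+1}, mu)  (thresholded primal update)

    A small step keeps the iteration stable (here delta * ||A||_2^2 < 2).
    """
    m, n = len(A), len(A[0])
    v = [0.0] * n
    x = [0.0] * n
    for _ in range(iters):
        # residual r = b - A x
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        # dual update v += A^T r
        v = [v[j] + sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # primal update by scaled soft-thresholding
        x = [delta * s for s in shrink(v, mu)]
    return x
```

Unlike a fixed soft-thresholding penalty, the accumulated dual variable v lets thresholded entries recover their full magnitude at the fixed point, which is the "no loss of contrast" behaviour the abstract mentions.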
The effects of monopoly power or mergers on incentives to innovate are important issues for antitrust enforcement, but they receive relatively little attention in litigated cases compared to the analysis of predicted prices. This paper reviews what is known about the relationship between market structure and innovation and its implications for enforcement. A focus is the significance of the inverted-U result for dynamic markets identified in research by Philippe Aghion, Peter Howitt, and their co-authors. We note that these results...