QuIPS is a postgraduate seminar that allows PhD students to present their research, or any other interesting mathematical topic, while practicing their presentation skills in a friendly environment. Talks are usually on Thursdays from 12-1pm. All talks are in a hybrid mode, where you can watch the talk in person in MB-503 or online at the Zoom link provided in the information below. QuIPS is usually followed by a social event: we welcome everyone to stay for a while and have a little chat with the other participants after the end of each session, while enjoying a wide variety of refreshments. Everyone is welcome: QuIPS talks are designed to be accessible to all maths PhD students!
Have you ever wondered how living organisms navigate their environment? The most intuitive model involves taking a step in a chosen direction, repeating this process with a certain step length. Over a century ago, Pearson introduced random walk models to describe the movements of biological organisms. However, unlike passive particles driven by external forces, biological agents actively control their movement, raising important questions about how to appropriately model such active behaviour. In this talk, we delve into well-defined stochastic processes in two dimensions, starting from the Cartesian frame and transitioning into the comoving frame, which more naturally reflects the perspective of the moving organism. We analyse these processes through their probability distributions, autocorrelations, and underlying dynamical equations. Finally, we present two simple equations in the comoving frame that aim to replicate the stochastic behaviour observed in the Cartesian frame, offering fresh insights into the modelling of active biological movement.
Join Quips event (Zoom)
As Halloween looms, we delve into the mathematical modeling of a zombie apocalypse through extensions of the classic SIR model. Zombies, like infections, spread through contact—but with a twist. Using the SIRZ (Susceptible, Infected, Removed, Zombie) model, this talk explores the dynamics of a fictional outbreak in the Maths building, adding new compartments and parameters to capture the unique horror of the undead. Will the Postgraduate students survive, or is a building full of smart zombies inevitable?
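For readers unfamiliar with compartmental models, one rough way such a zombie extension of SIR could be written as ODEs (an illustrative sketch only; the compartments, interaction terms and parameters used in the talk may differ) is $\dot S = -\beta S Z$, $\dot I = \beta S Z - \gamma I$, $\dot Z = \gamma I - \alpha S Z$, $\dot R = \alpha S Z$, where $\beta$ is the contact rate at which zombies infect susceptibles, $\gamma$ the rate at which the infected turn, and $\alpha$ the rate at which susceptibles permanently remove zombies; all symbols here are placeholders.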
In this talk, we will do some algebra, some geometry but sadly no algebraic geometry. We will start by staring at the number 65537 for a while and then move on to the concept of constructible numbers. I'll try to make the geometry part fun and interactive although I can't guarantee that any of my circles will look legible. We will finish with the classification of constructible regular polygons.
Modern Portfolio Theory was brought to life by the pioneering work of Markowitz. The Portfolio Selection Problem has drawn both success and criticism through several proposals and refinements. In this talk, we will discover the Markowitz Portfolio Model, its structure, and some of the variants derived from the literature, and we will take a close look at the Limited Asset Version.
Topological data analysis (TDA) is a beautiful field which takes tools from geometry, topology, abstract algebra, representation theory and many other "pure" maths disciplines and applies them to real world data. A prominent tool in TDA is persistent homology (PH), which allows one to assign a quantifiable statistic describing the soft notion of a "shape" of a data set where many other fields fail. For this reason, it is becoming a very useful statistic used in machine learning and AI, as well as in many applications in physics, biology and even the social sciences. This talk will give a crash course of PH; covering its origins, its theoretical framework, and a discussion of how and why it is used in many applications.
Impartiality as a constraint was introduced in 2007, and since then it has been adapted to numerous settings. In this talk I will outline the history that led to the introduction of impartiality. I will then explain the methods used in this setting from 2007-2024, including methods from my own work.
Cat: On the infinite plane, parallel lines appear to meet at a mysterious point far away on the horizon, a point we could never reach. Projective space is constructed to include it and (thus) allow us to solve equations we previously couldn't. We will see how such a space is constructed, use it to find where parallel lines meet, and finally discuss a couple of real-world applications.
James: Do you often wonder how many ways there are to write a prime number as the sum of eight squares? Then this is the talk for you! In this talk, I will introduce some of the theory of modular forms as functions on the complex upper half plane, explain what makes them so special, and demonstrate a few nice applications of their theory.
We will explore the evolving mathematics behind Deep Learning. First we will delve into a short history of its mathematical foundations, then we will ponder the question posed by mathematicians: do we need a new inference theory for our understanding of Deep Learning models?
Further we will uncover the intricate generalization puzzle within neural networks, and how solving it has led to establishing connections between 2000s machine learning theory and contemporary deep learning. We will see some of the ongoing dance between emerging theories and practical numerical evidence, revealing a continuous dialogue where theories are born, challenged, and reshaped in the dynamic landscape of Deep Learning mathematics.
This talk aims to provide a brief and informal introduction to the field of solving inverse problems, specifically with variational regularisation methods. These are methods that find a solution to inverse problems by solving an optimisation problem which integrates data consistency and prior knowledge. As an example, I will discuss how this applies to medical imaging techniques such as MRI. Furthermore, I will give an introduction to neural networks.
Complex networks are everywhere in real life. The internet, social relationship networks, protein interaction networks, etc. are all ‘complex networks’ which we can informally define as a collection of nodes connected to each other. In a complex network, a node can be anything: a person, a website, or a cell, while connections among nodes can represent any interactions or relationships. Complex networks are good tools for generalizing all these systems and understanding how they operate. In this talk, we will go through three classic complex network models and explore their properties.
The numerical simulation of black string spacetimes in higher dimensions is a challenging task, as it typically requires a higher level of resolution than for lower-dimensional compact objects. Over the past few years, there has been progress in reexamining the endpoint of the well-known Gregory-Laflamme instability in black string spacetime, which suggests that certain black string solutions to the Einstein equations will pinch off within a finite amount of time. In this presentation, I will delve into the numerical formulation designed for simulating higher-dimensional black string spacetimes and modified theories of gravity. Then, I will present some notable numerical results from our simulation.
Random movement of particles is a fundamental process in physics. It was first described by Robert Brown, who observed random fluctuations in the position of small pollen particles immersed in water. This movement received the name of Brownian motion and is characterised by normal diffusion, that is, a linear behaviour of the Mean Square Displacement (MSD) in time. In this talk, we will delve into anomalous diffusion, a concept that disrupts the linear relationship between MSD and time. We will explore its two primary categories: subdiffusion and superdiffusion. Furthermore, I will present some of the most studied anomalous diffusion processes, namely fractional Brownian motion and Lévy walks.
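For reference, the standard way these regimes are usually distinguished (a textbook convention, not specific to this talk) is through the scaling of the mean square displacement: $\langle x^2(t) \rangle \sim t^{\alpha}$, with $\alpha = 1$ for normal (Brownian) diffusion, $0 < \alpha < 1$ for subdiffusion, and $\alpha > 1$ for superdiffusion ($\alpha = 2$ being the ballistic limit).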
At the heart of the vast number of new neural network models with unbelievable abilities lies the Transformer architecture. The most well-known of these are the GPT models (Generative Pre-trained Transformer), which brought us ChatGPT. In this talk, I will present the key components behind the Transformer architecture, and how one of those enabled the Transformer revolution we’ve seen in the last few years.
A novel approach is being considered for dose escalation in phase I-II clinical trials. Along with efficacy and toxicity as endpoints, pharmacokinetic (PK) and pharmacodynamic (PD) information is also being considered. Patan and Bogacka (2011) conducted some simulation studies taking into account such information. However, they only considered fixed effects models for the PK/PD effects. We are considering mixed effects models for the PK/PD.
The population Fisher information matrix for the PK/PD model is found by following Bazzoli, Retout, and Mentré (2009). For the dose-response model, the approach of Zhang, Sargent, and Mandrekar (2006) is followed, who consider a trinomial response $y = (y_0, y_1, y_2)^T$ for each patient, where $y_0$ is a neutral response, $y_1$ is an efficacy endpoint and $y_2$ is a toxicity endpoint. Following the assignment of the lowest dose to a cohort of patients, the trinomial response is observed for each patient. PK and PD responses are measured at the D-optimal time points.
The dose-response curves are updated sequentially. For the next cohort, the dose is selected in such a way that the estimated probability of efficacy is a maximum, subject to the condition that the estimated probability of toxicity is smaller than a pre-specified value and the efficiency of estimation of the PK/PD parameters is at least a desired level. When the trial is finished according to some rules, a complete analysis of the data is carried out and the dose to be recommended for further studies is chosen. Thus, an adaptive design is being implemented.
In two dimensional percolation problems, all exact results and most bounds for percolation thresholds make use of planar duality. We will look briefly at this and then consider methods of bounding thresholds without duality.
We consider the dynamics of small tracer particles (for example electrons, ions, or solid hydrogen or solid deuterium particles) in quantum turbulent flow. The complicated interaction processes of quantized vortices, the quantum mechanical constraints on vorticity and the varying influence of both the superfluid and the normal fluid on the tracer particle effectively lead to a superstatistical Langevin-like model. In a certain approximation, the model can be solved analytically. An analytic expression for the probability density function of the Lagrangian velocity $v$ of the tracer particle is derived that exhibits not only the experimentally measured $v^{-3}$ tails but also the correct behaviour in the neighbourhood of the centre of the distribution. The predicted PDF is in excellent agreement with experimental measurements and numerical simulations.
Questions in algebra, while deep and interesting, can be incredibly difficult. Thankfully, when studying the representation theory of the symmetric groups, one can often take algebraic properties and results and write them in the language of combinatorics; where one has a wide variety of tools and techniques to use. In this talk, we will look at the specific example of the submodule structure of 2-part Specht modules in characteristic 2, and answer which hook Specht modules are uniserial in characteristic 2. We will not need to assume the Riemann hypothesis for this talk.
Despite significant efforts to understand them, singularities in spacetime are to this day not fully understood. This talk aims to introduce some examples of physically relevant spacetimes which are believed to contain a so-called "singularity". In doing so, we can take a look at the different perspectives researchers have held over the years and where they agree or disagree when it comes to defining what constitutes a singularity. Moreover, we will discuss some of the consequences of the landmark theorems due to Hawking and Penrose and, importantly, some central conjectures in mathematical relativity relating to cosmic censorship.
Algebraic geometry often explores the duality between geometric objects and rings (or algebras) of functions on them. The spectrum of a ring is used within scheme theory to generalise the notion of a manifold to allow for non-smoothness and other bizarre behaviour. Starting with a (commutative) ring, we will learn how to build its spectrum first as a set, and then we will endow it with an unusual topology known as the Zariski topology. We will go through many examples, exploring strange new notions of 'large' and 'generic' points which do not arise within everyday topologies.
I will give a basic overview of the history of online problems. I will cover the initial formulations of the secretary problem and show who solved it. Along with this I will show some of the variant problems, such as prophet inequalities, Pandora's box and online matching problems, and some general methods people have used to solve them. This talk will be pretty informal and will just present a rough overview of the field.
Imagine driving around Manhattan, where each road segment between two intersections is randomly open or closed, independently and with constant probability. What is the probability that you can drive to infinitely many intersections via open roads? Now, consider a random landscape, where everything below height 0 is submerged underwater, and everything above is traversable land. What is the probability that this random landscape has an infinitely large island?
The connection between these two models, Bernoulli percolation and Gaussian percolation, has been a subject of increasing interest in recent times. Similar in many respects, the two models are conjectured to belong to the same ‘universality class’, but proving this is notoriously difficult. In this talk, we will give a brief introduction to the area of percolation. We will review the history of the subject, explore the link between these two models, and outline the progress that has been made to connect them. Along the way, we hope to introduce some methods and techniques used in the field, and discuss open questions and potential future work.
This talk is introductory and requires no prerequisite knowledge.
I will give some heuristic arguments on the importance of the Riemann Hypothesis and its connection to prime numbers. The talk will be very informal, and as long as you have seen the statement of the Cauchy residue theorem you will be fine. We will go over a sketch proof of the Prime Number Theorem and see how the Riemann Hypothesis implies a stronger statement.
Have you ever wondered, why the Axiom of Choice is obviously true, the well-ordering principle obviously false, and what Zorn's Lemma even is? We will discuss these things and learn how to best confuse undergraduate students when teaching NSF tutorials (empirically tested). I will introduce the logical foundations of modern set theory and present a choice of historical theorems and advancements in set theory. Join me on this adventure and choose how sarcastically you want to clap at any one of my uncountably many choice jokes.
In this talk, I will summarize the preliminary results of my research stay carried out here over the last 3 months. Our research has focused on trying to find a non-parametric and unbiased measure of the segregation of a system. To do this, we used the Delaunay triangulation of our system, and we simply performed statistics on these triangles, comparing the results with those that the same system would exhibit with a coloring that maximized (or broke) the spatial correlations. The contents of this talk do not require any previous background.
Our understanding of the history and origins of the universe has improved dramatically over the last hundred years, with major advancements in both theory and observation. Over the last few decades, cosmic inflation has become a standard part of the picture, and is still hotly debated. In this talk I will (very very briefly) go over the required background in relativity and standard cosmology, and at least motivate the theory of inflation.
The p-adic numbers are a key concept used throughout number theory. In this talk we will construct the p-adic numbers using an analytic approach. We will also see some of their general properties and why they are so important in number theory.
Join us on a captivating journey into the depths of the Collatz conjecture, a seemingly straightforward puzzle that has eluded resolution for almost a century. Explore the recursive intricacies of its sequences, uncover historical attempts at solving the mystery, and delve into the collaborative efforts shaping the ongoing discourse.
This talk invites seasoned mathematicians-in-training to unravel the complexities of the Collatz conjecture. Engage in a discourse that challenges the intellect and sparks curiosity, as we navigate the unsolved terrain of this mathematical puzzle, dissecting the subtleties that make it a captivating subject of study.
Incentive issues occur with some frequency when designing systems, and indeed in systems with strategic participants, the rules matter. Poorly designed systems suffer from unexpected and undesirable results. With this in mind we introduce the field of mechanism design, where the goal is to design rules or algorithms so that strategic behaviour by participants leads to desirable outcomes.
I will then cover some of my own work in this subject, based on impartially selecting subsets.
In this talk, we show that under a quadratic curvature pinching hypothesis, in regions of high curvature, the submanifold under the mean curvature flow becomes approximately codimension one in a quantifiable way. This enables us to prove that at a singularity, there exists a rescaling that converges to a smooth codimension-one limiting flow in Euclidean space, which is weakly convex and either moves by translation or is a self-shrinker.
James: A congruent number is a rational number which is the area of some right angled triangle with three rational side lengths. Tabulations of such numbers (which included 5 and 6) have been found in Arab manuscripts from the 10th century but the problem as to exactly which rational numbers are congruent remains open to this day. Only in the 13th century did Fibonacci discover that 7 is congruent, and it took until 1640 for Fermat to give the first accepted proof that 1 is not congruent. In this talk, I will outline how to relate the congruent number problem to elliptic curves and explain a theorem of Tunnell which, conditional on the Birch and Swinnerton-Dyer conjecture, completely resolves the congruent number problem.
Jordan: The causal nature of spacetime is a key takeaway of Einstein's theories of relativity: the ability to determine which events can influence others with a single glance at a diagram. The goal of this talk is to illustrate how we can retain this property whilst investigating how conformal geometry helps us to compactify spacetimes into a (finite) diagram. Consequently, we open a new way of looking at geodesics on such a manifold and how this changes the way we think about infinity for massless and massive particles.
During the last decade, the so-called replication crisis has been sweeping Social Sciences undermining the confidence generally held in academia. Among statistical methods, p-values and more specifically p-hacking have been singled out as the main culprits to blame for this trend. However, the use and interpretation of p-values is often a contentious topic even among Statisticians. In this talk we will try to discuss some of the complexities and nuances behind the use of p-values in experimental work, with the intention of shedding light on the interplay between Probability and Science that is foundational to the evaluation of empirical evidence.
We will start with a gentle introduction to the rich area of combinatorics known as extremal graph theory, assuming no previous knowledge. Time permitting, we will then see the proofs of some of the main theorems in the field, such as Mantel's Theorem, Turán's Theorem and the Erdős-Stone Theorem. We will end with a discussion of the open problems that remain.
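For orientation, the standard statements of the named results (as usually quoted; the talk may present sharper or different versions): Mantel's Theorem says a triangle-free graph on $n$ vertices has at most $\lfloor n^2/4 \rfloor$ edges; Turán's Theorem says the $K_{r+1}$-free graph on $n$ vertices with the most edges is the complete $r$-partite Turán graph, with roughly $(1 - 1/r)\,n^2/2$ edges; and the Erdős-Stone Theorem gives, for any graph $H$ with chromatic number $\chi(H) \geq 2$, $\mathrm{ex}(n, H) = \left(1 - \tfrac{1}{\chi(H)-1} + o(1)\right)\binom{n}{2}$.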
Matroids were defined to unite the notions of independence that we get from vector spaces and from graphs, where a subset of edges is dependent if it contains a cycle. Since its introduction, matroid theory has made contributions to many fields, including graph theory, optimization theory, algebraic and tropical geometry, and machine learning. This talk will focus on the basics of matroid theory, as well as exploring several "nice" classes of matroids.
In recent years, there has been growing interest in applying mathematical and computational techniques to real-world situations. For example, mathematics is widely used in the field of oncology: mathematical models have the potential to aid in predicting the progression of tumors and developing new therapeutic strategies. In this talk, we will explore an example of a mathematical model for investigating glioblastoma tumor evolution. Clinical evidence shows that this type of brain tumor is highly lethal, especially in older patients, and its response to therapies is usually poor and insufficient. Moreover, glioblastoma invasiveness, prognosis and unresponsiveness to conventional treatments and surgeries are highly related to the tumor cells' genome, which is usually abnormal due to the presence of multiple copies of chromosomes. Cells presenting an altered number of sets of chromosomes are called polyploid, and clinical studies suggest these cells are more energetically demanding and quicker to migrate from one region to another, making the development of metastases easier. We then propose a model that describes the evolution of glioblastoma over time, according to the oxygen and glucose resources present in the patient's brain. We investigate this using both ODE and machine learning approaches, describing cell migration, birth and death depending on resources and on the presence of different ploidies.
Additionally, we will touch on some of the challenges that arise when applying mathematical models to cancer research, such as data limitations and model complexity, and we will propose future direction of work for this project.
Technological progress is advancing by leaps and bounds, bringing with it new ethical dilemmas.
Are new tools to be banned because they make work too easy? Or should they rather be seen as supporting research work?
In this talk I will focus on what it means to do research, from developing an idea to publishing results in a paper.
I will then show how new technological tools may impact the way we will work within the next 10 years.
Mathematical relativity is constructed from intrinsic objects defined on special manifolds, but what exactly do we need to study Einstein's equations? How does this relate to the theory of PDEs which are commonly talked about in this context? Moreover, what does a solution to these equations even look like? These are some of the questions I aim to answer by the end of the talk, giving insight into what these abstract objects mean physically.
This talk will be a gentle introduction to ring theory and its applications to some specific Diophantine equations. We will recap the definitions of rings, ideals and Euclidean domains, and use these to prove why 26 is so special. We will then explore what else we can do with this theory and what limitations we have, before mentioning how these can be overcome in some cases by introducing the ideal class group and factorising ideals.
Have you ever been in that awkward situation where people are talking about a physical application of noncommutative geometry, and you want to be part of that exciting conversation but cannot? Then this is the perfect QuIPS for you. First, we cover the basics of quantum Riemannian geometry and the version applied to finite sets, and then we give some examples of physical models using them.
In this talk, I will first recap the basics of the classical integral Calculus of Variations. We can then discuss the corresponding generalisation to supremal functionals. These functionals are far more sophisticated than their integral counterparts. Integral functionals yield Euler-Lagrange equations in divergence form, allowing for weak solutions via standard Sobolev theory. However, the full PDE system associated to a vectorial first-order infinity functional is in non-divergence form and even exhibits discontinuous coefficients, no matter how smooth and convex the Lagrangian might be. I will conclude with a short summary of some specific problems I have investigated throughout my PhD.
Incentive issues occur with some frequency when designing systems, and indeed in systems with strategic participants, the rules matter. Poorly designed systems suffer from unexpected and undesirable results. With this in mind we introduce the field of mechanism design, where the goal is to design rules or algorithms so that strategic behaviour by participants leads to desirable outcomes. In the second part of my talk I will focus upon my research, which concerns questions of this nature in peer selection settings.
Samuel: Brownian motion, under which the mean square displacement of an ensemble of particles increases linearly with time, is well-observed in many natural diffusive processes. However, many systems are also seen to undergo anomalous diffusion, where this relationship is non-linear. In this talk I will discuss a class of time-discrete dynamical systems which exhibit anomalous diffusion and discuss methods of modelling these processes stochastically.
Adam: Spatially periodic cellular complexes are important scientific models for large homogeneous data sets, for example crystalline materials. For computational convenience, these infinite complexes are often represented by their finite quotient spaces. However, this introduces the need to extrapolate the topological structure of the infinite space in order to perform structure prediction and topological data analysis, which is difficult due to the potential for "holes" in an infinite cellular complex to disappear in the quotient space, or conversely for the quotient space to create phantom "toroidal cycles". In this talk I will discuss my research on finding "computer friendly" methods for constructing the homology of a periodic cellular complex from a finite quotient space, using tools from the crystallographic literature and homological algebra to counteract the problem for different cases.
In this talk I will give a brief introduction to the Representation Theory of Symmetric Groups, covering some key definitions (partitions, Young diagrams, tableaux, tabloids, polytabloids, Specht modules, James modules, etc.), show some "partition magic", and talk about a few open questions.
General relativity is the framework in which gravitational physics is described. It is often deemed difficult, both conceptually and mathematically, due to its demand for an understanding of differential geometry and the necessity of thinking in higher dimensions. In this talk, I plan to focus on the motivation behind general relativity as a theory of not just gravity, but the nature of spacetime itself. In particular, I will discuss why gravity is special, why it does not fit the standard notion of a force, how Einstein theorised we should view gravity, and how one can formulate his ideas into a mathematical theory. This talk is mathematically light and has no prerequisites beyond F=ma. It is intended to welcome non-GR folk into our loving gravity bubble within the School of Mathematics.
In this talk I will present the equations of the Einstein-scalar-Gauss-Bonnet theory, a modified theory of gravity, in the 3+1 formulation with a modified gauge that proves to be strongly hyperbolic. Then I'll show some of the numerical results I have obtained from the implementation of these equations with GRChombo, a numerical general relativity code with fully adaptive mesh refinement.
In my second QuIPS talk, I will give a (self-contained) update to my previous QuIPS talk given in November on topological synchronization.
Topological synchronization is a novel model of coupled topological oscillators defined on nodes, links and higher order simplices in a simplicial complex.
In my talk, I will present the recently introduced topological Dirac operator, which provides an elegant framework to couple topological signals of different orders. I will show how such coupling leads to very different phenomenology compared to previously studied models of synchronization such as the Kuramoto model and its higher order generalisations. I will illustrate our theoretical understanding of the rich phase diagram of topological synchronization with numerical simulations and show how the relevant order parameters of the dynamics undergo an explosive phase transition to a rhythmic synchronized phase. This rhythmic phase, characterised by coherent oscillations of the order parameters, is the stable phase of synchronization in the infinite network limit, an exotic phenomenon due to the subtle phenomenology of the dynamical model.
This talk will be based partly on results presented in Calmon, Lucille, Juan G. Restrepo, Joaquín J. Torres, and Ginestra Bianconi. "Topological synchronization: explosive transition and rhythmic phase." arXiv preprint arXiv:2107.05107 (2021).
In algebraic geometry, we often have very difficult questions about the geometry of algebraic varieties. We can use tropical geometry to reduce these problems into combinatorial questions about finite polyhedral complexes. Tropical geometry has become a powerful tool in many other fields of mathematics, from machine learning to pure algebraic geometry. In this talk, we will explore the recent development of tropical ideals, whose role is to create a unified tropical scheme theory consistent with its algebraic counterpart. In my previous talk “Paving tropical Ideals,” I highlighted that tropical ideals form a strict generalization of polynomial ideals, but these so-called non-realizable tropical ideals are hard to come by. In this talk, we will address the geometric counterpart of this problem. We pose the following question: Are the varieties of tropical ideals more general than algebraic varieties? In the process, we will highlight the important connection of this theory to matroid products, which have not received much attention for decades.
Extra-chromosomal DNA (ecDNA) is a circular element, composed of sequences of genes, located outside the central nucleus of cells.
There is ample evidence of its contribution to cancer initiation and treatment resistance through the promotion of oncogene amplification. Thus, investigating ecDNA's evolutionary paths and mechanisms of reproduction could lead to a deeper understanding of cancer development and treatment.
We present a complex mathematical model explaining the ecDNA somatic evolutionary process, combining analytical and computational techniques. We show that random ecDNA segregation is associated with peculiar dynamics, based on selection advantages and gene expression abilities. We consider complex scenarios, in line with specific biological contexts, and we present original theoretical results that confirm the existing medical literature.
Large Deviation Theory has provided analytical tools to measure long-term statistics and rare phenomena in equilibrium and non-equilibrium statistical mechanics. However, these tools as they currently stand cannot be readily applied to dynamical processes with memory (non-Markov) and even to some complicated memoryless (Markov) processes. In these circumstances, one must resort to numerical computing.
We begin with a brief non-rigorous background to large deviations (LDs) and reinforcement learning (RL). Then, through a simple toy example (in the Markov case), we explore RL as a viable and powerful computing paradigm for measuring large deviations in non-equilibrium statistical mechanics.
We will briefly survey the rich interplay between representation theory of symmetric groups and combinatorics of integer partitions, combining several results in the literature that allow us to compute explicitly the structure of important representations, a luxury rarely available in algebra.
Brownian motion, under which the mean square displacement of an ensemble of particles increases linearly with time, is well-observed in many natural diffusive processes. However, many systems are also seen to undergo anomalous diffusion, where this relationship is non-linear. In this talk I will discuss a class of time-discrete dynamical systems which exhibit anomalous diffusion, and discuss methods of modelling these processes stochastically.
In the past ten years, the anti-de Sitter/conformal field theory (AdS/CFT) correspondence has been used to study strongly correlated systems in condensed matter physics and has received a great deal of attention. In this talk, I will start with a brief introduction to the Bardeen–Cooper–Schrieffer (BCS) theory of superconductivity, and then move to the AdS black hole solutions in general relativity. Finally, we will see how the AdS/CFT correspondence connects an object in the lab (a superconductor) with a black hole.
Geometric Analysis is the study of geometric problems that can be formulated as systems of partial differential equations. In this talk, I will present one of the most interesting and challenging systems of PDEs in Geometric Analysis. It will be a smooth introduction to the Mean Curvature Flow of codimension-1 submanifolds, and we'll see how we can step up and overcome the singularities that this system is destined to face.
In this talk I will show how conformal methods can be used to analyse the non-linear stability of de Sitter-like spacetimes. Central to the analysis is the use of conformal Gaussian systems to obtain a hyperbolic reduction of the conformal Einstein field equations, for which standard Cauchy stability results for symmetric hyperbolic systems can be employed. The use of conformal methods allows us to rephrase the question of global existence of solutions to the Einstein field equations as a question of finite existence time for the conformal evolution system. The last part of the talk will focus on how to use this technique to study the stability of the subextremal Schwarzschild-de Sitter spacetime.
Applied topology is often discussed solely in the context of topological data analysis and applications of persistent homology. However, the field is rich with diverse applications in areas such as machine learning, game theory and phylogenetics, just to name a few. In this talk I will attempt to give a brief overview of some of these applications which, depending on time and audience interest, may include topics such as configuration spaces, sensor networks, neurotopology and the game of Hex (among other options).
Samuel: Brownian motion, under which the mean square displacement of an ensemble of particles increases linearly with time, is well-observed in many natural diffusive processes. However, many systems are also seen to undergo anomalous diffusion, where this relationship is non-linear. In this talk I will discuss a class of time-discrete dynamical systems which exhibit anomalous diffusion, and discuss methods of modelling these processes stochastically.
Nicholas: Ideals in polynomial rings over a field may be associated with so-called “realizable” tropical ideals in the field of tropical geometry. However, not all tropical ideals arise in this way. In my research we consider the geometric side of this question: are the varieties of tropical ideals necessarily associated with algebraic varieties? This question remains unsolved, but in this talk, we will discuss how it relates to the combinatorics of matroid products and what we call “tropically representable matroids.”
Counter-intuitive as it seems, the Banach-Tarski paradox has sparked interest from both mathematicians and laymen. It has been used both as an argument for the beauty and eccentricity of mathematics as well as against them. In this talk we will cover the precise statement, its proof as well as its connection to the Axiom of Choice, where we will take a detour. Attendees are invited to communicate interesting consequences/independences of the AoC in their fields, so we can produce a nice overview by the end of the talk.
Matroids were introduced to generalize the independence structure that arises from the columns of a matrix, as well as the independence structure arising from the forests of a graph. Remarkably, matroids can also help us understand why the greedy algorithm will always choose a spanning tree of minimum weight in a connected graph, and they have a rich structure of their own. In this talk, we will introduce the basic definitions and motivation of matroid theory.
Join us on a journey through chaos, groups, prime numbers and more! In this talk we will explore how different mathematical structures can contribute to the creation of theatre. During the talk we'll look at specific exercises used to devise theatre and the mathematics behind them, then explain our own plays 3+D and The Axiom of Choice, and finally have an open discussion on creativity and mathematics.
We give a beginner's introduction to Category Theory, affectionately known as "abstract nonsense". The talk will assume (some) memory of undergraduate pure maths, contain plenty of examples, and will look mostly at Universal Constructions, and perhaps Functors. Some applications may be mentioned.
Adam: Persistent homology is a robust tool developed in the last 25 years which has become a vital component throughout the burgeoning field of applied topology. This talk will give a brief overview of persistent homology; covering its origins, some fundamental results, and a discussion of how and why it is used in many applications.
Svetlana: We will discuss connections between random matrices and the statistical properties of energy levels in heavy elements. If a nuclear system is time-reversal invariant, the nuclear energy levels should behave as the eigenvalues of a random real symmetric matrix or a random Hermitian matrix.
In this talk, I will present a novel pathway to discontinuous synchronization. I will define topological signals and show how to couple them using the Dirac operator of networks and simplicial complexes. I will present the corresponding rich phase diagram that includes a hysteresis, discontinuous transitions and a predicted region of non-stationarity of the order parameters.
When proving theorems in any area of mathematics, it is always great if the proof can be reduced to looking at certain idealized/homogeneous objects. For manifolds, an ideal situation is being able to decompose a complicated manifold into simpler "prime" manifolds. This was the basic idea for Perelman's proof of the Poincare conjecture which used a surgery procedure involving Ricci flow. I will try to explain why you would want to use Ricci flow for this problem (and others), as well as the idea behind the surgery procedure.
In this talk I will give an overview of one of the most well-known unsolved problems in mathematics, namely the Riemann Hypothesis. This talk will be broken up into three parts. In the first part I will briefly explain how different areas of mathematics interact. In the second part I will define the Riemann zeta function and discuss some of its properties. In the third and final part, which will be more discussion based, I will state the Riemann Hypothesis and equivalent conjectures.
"Numbers measure quantity, and groups measure symmetry" is a fast and concise way of expressing the relevance of group theory in mathematics and beyond, in the task of understanding the diverse aspects of our existence and our universe. The group of permutations of $n$ elements $S_n$ is one the most studied finite groups. A specific representation theory has been developed for $S_n$ based on the combinatorics of integer partitions, which allows us to obtain results with techniques unavailable in the study of other finite groups. Moreover, surprising applications of the theory have been found in various areas of knowledge, both inside and outside mathematics.
We will briefly discuss a few of the applications of the representation theory of the symmetric groups in areas as seemingly unconnected as Chemistry, Machine Learning or the Social Sciences. No requirements, other than basic group theory and linear algebra, will be needed to follow this talk.
Higher-order interactions are increasingly recognized as a fundamental aspect of complex systems ranging from the brain to social contact networks. Hypergraphs, as well as simplicial complexes, capture the higher-order interactions of complex systems and allow us to investigate the relationship between higher-order structures and their functions.
In this talk, I will introduce the comprehensive multilayer framework we proposed to study higher-order percolation processes on hypergraphs. The framework provides insight into the interplay between the structure and dynamics of higher-order networks.
Join Quips event (MS Teams)
Nicholas: Tropical geometry is becoming a powerful tool in many branches of mathematics. This talk will focus on the recent development of tropical commutative algebra by Diane Maclagan and Felipe Rincon. The central object of study is the “tropical Ideal,” which generalizes the structure of polynomial ideals over fields to be suitable for study in the setting of tropical geometry (semirings). All polynomial ideals over a field can be associated to a tropical ideal, and it is a non-trivial fact that the converse is false. The specific focus of this talk will be to demonstrate how the combinatorics of matroid theory can allow us to show that most zero-dimensional tropical ideals are not associated with polynomial ideals over any field.
Llibert: In this talk I will give an overview of so-called quintessential inflation, which unifies the inflation of the universe at early times with the current cosmic acceleration through a single scalar field. I will describe some of the most interesting models, arguing how they match the observational constraints, and describe the evolution of the universe.
In 2034 LISA is due to be launched, which will provide the opportunity to extract physics from stellar objects and systems that would not otherwise be possible, among which are EMRIs. Unlike previous sources detected at LIGO, these sources can be simulated using an accurate computation of the gravitational self-force, resulting from the gravitational effects of the compact object orbiting around the massive BH. Whereas the field has seen outstanding progress in the frequency domain, metric reconstruction and self-force calculations are still an open challenge in the time domain. Such computations would not only further corroborate frequency-domain calculations/models but also allow for full self-consistent evolution of the orbit under the effect of the self-force. Given that we have a priori information about the local structure of the discontinuity at the particle, we will show how we can construct discontinuous spatial and temporal discretizations by operating on discontinuous Lagrange and Hermite interpolation formulae and hence recover higher-order accuracy. We will show how this technique, in conjunction with well-suited conformal (hyperboloidal slicing) and numerical (discontinuous time-symmetric) methods, can provide a relatively simple method-of-lines numerical recipe approach to the problem. We will show, in particular, how this method can be applied to solve the Regge-Wheeler and Zerilli equations with a moving particle source in the time domain.
Everyone is welcome. An introduction to the PDEs and formalisms involved will be provided in the first 20 minutes; this one-hour talk is mostly self-contained, and a maths UG background is assumed.
Mathematicians have long been interested in rigorously understanding the conductivity properties of disordered materials at the quantum level, in particular after the work of the Nobel Prize winning American physicist Philip W. Anderson (1923-2020).
In 1990, Klein, Lacroix and Speis analysed a well-studied random operator model for an electron moving on a portion of the lattice of the form $\mathbb{Z} \times [0, W]$, $W \in \mathbb{N}$, and subject to a random potential, called the Anderson model on the strip. They showed, in particular, that such a model boasts spectral localization on all of its energy spectrum, a well-defined mathematical property that is a very powerful signature of the electron getting trapped in a region by the potential.
In the present work, we focus on a more general model of a quantum particle with internal degrees of freedom moving in a quasi-1D random medium (disordered quantum wire), which we call the "generalized Wegner Orbital Model".
In particular, we prove spectral and dynamical localization at all energies for such a model, suggesting that the disordered materials belonging to the wide class described by this model are all perfect insulators.
In this talk, I will start by introducing basic concepts related to Anderson Localization in general, then move to the specific model considered in this work, and outline our proof of its spectral and dynamical localization. The proof combines techniques from probability theory, the spectral theory of self-adjoint operators and ergodic theory, with an unexpected visit from the theory of algebraic groups...
This is a joint work with Sasha Sodin (QMUL).
Innovation is the driving force of human progress. Recent urn models reproduce well the dynamics through which the discovery of a novelty may trigger further ones, in an expanding space of opportunities, but neglect the effects of social interactions. In this talk I am going to focus on a model we have recently proposed, in which many urns, representing different explorers, are coupled through the links of a social network and exploit opportunities coming from their contacts. We will show that the pace of discovery of an explorer depends on its centrality in the social network. Our model sheds light on the role that social structures play in discovery processes and represents a novel approach to the modelling of team creativity and efficiency.
Based on my recent paper in Physical Review Letters
We prove that the eigenvectors of Wigner matrices satisfy the Eigenstate Thermalisation Hypothesis (ETH), which is a strong form of Quantum Unique Ergodicity (QUE) with optimal speed of convergence. Then, using this a priori bound as an input, we analyse the Stochastic Eigenstate Equation (SEE) and prove the Gaussian fluctuations in the QUE.
The main methods behind the above results are: (i) multi-resolvents local laws established via a novel bootstrap scheme; (ii) energy estimates for SEE.
References:
https://arxiv.org/pdf/2012.13218.pdf
https://arxiv.org/pdf/2103.06730.pdf
https://arxiv.org/pdf/2012.13215.pdf
Random Matrices are a useful tool to model realistic systems, with applications ranging from physics and ecology to networks and machine learning. In this seminar, I will review some well known applications of random matrix theory starting from the work of Wigner on heavy nuclei. I will focus on properties of spectra of large random matrices, and show interesting consequences on some paradigmatic examples.
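As a toy illustration of the spectral properties mentioned above (my own sketch in Python, not material from the talk), one can diagonalise a large symmetric random matrix and compare its empirical eigenvalue density with Wigner's semicircle law $\rho(x) = \frac{1}{2\pi}\sqrt{4 - x^2}$ on $[-2, 2]$:

```python
# Sketch: empirical eigenvalue density of a Wigner matrix vs the semicircle law.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
a = rng.standard_normal((n, n))
h = (a + a.T) / np.sqrt(2 * n)        # real symmetric matrix, off-diagonal variance 1/n
eigs = np.linalg.eigvalsh(h)          # spectrum should (approximately) fill [-2, 2]

hist, edges = np.histogram(eigs, bins=40, range=(-2.0, 2.0), density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
semicircle = np.sqrt(np.clip(4 - centres**2, 0, None)) / (2 * np.pi)
print("max deviation from semicircle:", np.max(np.abs(hist - semicircle)))
```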
Tropical Geometry is a quickly expanding field of mathematics that has demonstrated its utility in a wide array of disciplines such as enumerative geometry, phylogenetics, cryptography and machine learning (just to name a select few). While the fundamentals of tropical geometry are closely related to commutative algebra and algebraic geometry, it wasn’t until recently that there has been a major push towards developing tropical commutative algebra. This talk will introduce “tropical ideals,” which provide a generalization of classical ideals through the combinatorics of matroid theory, and we will be particularly interested in the subclass of tropical ideals that we call “paving tropical ideals.” These tropical ideals carry the structure of paving matroids, and we show that the combinatorics governing paving matroids governs the structure of paving tropical ideals. This work suggests great utility in purely combinatorial approaches to understanding the unruly entirety of tropical ideals.
Link to the paper
Silvia: The use of labels to represent physical quantities on networks is tied up with the need to understand and quantify the presence of heterogeneity in the distribution of the variable of interest. In the case of spatial networks, the presence of spatial constraints can lead to the emergence of structures when a specific colouring process is implemented. In this work, we characterise and explore the spatial structures that emerge from a simple random colouring process P on a 2D lattice. We provide a dynamical random growth model that reproduces the same ensemble as P, and we measure some structural quantities of these spatial motifs, making a comparison with a well-known growth model, the EGM, used as a control. Then, we show that the exit time alone can capture the same structural information, proposing the exit time as a possible tool for the characterisation of these structures. The study of such motifs is crucial for assigning statistical significance to the measured quantities when a colouring process is implemented on spatial networks.
Antonino: In this talk I plan to give a short introduction to classical differential algebra, i.e. differential algebra internal to the topos of sets. I will then show how one can generalise this to any topos with a natural number object, in particular the topos \sigma-Set. Hopefully, with the time I have, I will be able to elaborate on some of the keywords I have mentioned above.
N customers arrive sequentially at an initially empty restaurant with a large number of large tables. At stage 1, customer 1 occupies the first (new) table. Given that at stage $j-1$ there are $k$ occupied tables, the $j$-th arriving customer either sits at an occupied table with probability $q_j = (j-1-k\alpha)/(\theta+j-1)$ or else sits at a new table with probability $p_j = 1-q_j$. This sampling process describes the $(\theta, \alpha)$ seating plan of a Chinese restaurant process (CRP). The objective of this talk is to introduce the classical secretary problem and some of its extensions, while commenting on the work undertaken to obtain an optimal stopping rule that maximises the probability of finding the last formed partition (table).
Keywords: Secretary problem, Optimal stopping, Poisson-Dirichlet distribution, Chinese Restaurant Process (CRP)
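The seating rule above is easy to simulate; here is a minimal Python sketch (my own illustration, not material from the talk; in particular, joining an occupied table with weight proportional to its current size minus $\alpha$ is the standard CRP convention and is an assumption on my part, since the abstract only specifies the aggregate probability $q_j$):

```python
# Simulate the (theta, alpha) seating plan of a Chinese restaurant process.
import random

def chinese_restaurant(n, theta, alpha, seed=None):
    """Return the table sizes after n customers under the (theta, alpha) CRP."""
    rng = random.Random(seed)
    tables = [1]                       # customer 1 sits at the first (new) table
    for j in range(2, n + 1):
        k = len(tables)                # occupied tables at stage j-1
        q = (j - 1 - k * alpha) / (theta + j - 1)   # probability of joining an occupied table
        if rng.random() < q:
            # standard CRP convention (assumed): join table i with weight (size_i - alpha)
            weights = [size - alpha for size in tables]
            r = rng.random() * sum(weights)
            for i, w in enumerate(weights):
                if r < w:
                    tables[i] += 1
                    break
                r -= w
            else:
                tables[-1] += 1        # guard against floating-point round-off
        else:
            tables.append(1)           # open a new table with probability 1 - q
    return tables

print(chinese_restaurant(100, theta=1.0, alpha=0.5, seed=42))
```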
In this basic talk, we will search for the treasure of a leprechaun, wisely hidden inside some manifold. Meanwhile, we will also introduce the essential definitions of Riemannian geometry, with particular emphasis on different concepts of curvature. Very few formulas and a copious amount of figures will be presented. Will you be able to find the shillings?
In this seminar, we will focus on a simple model of a person dancing in a famous pre-covid club in Limehouse. As rare events are pivotal in determining the dynamics of stochastic processes modelling physical systems, we will make use of spectral methods from large deviation theory to understand how unlikely it was for a 'good' dancer to get drifted away from the centre of the scene. Furthermore, we will also give insights into the processes responsible for such a drift. BYOB for an enhanced experience.
Svetlana: Renormalized Young diagrams with N cells chosen according to the Plancherel measure converge in probability to the Vershik-Kerov curve. For each generalized Young diagram, we can build a transitional measure. The transitional measure of the Vershik-Kerov curve is the semicircle law. From the convergence of the ESD of a random Wigner matrix to the semicircle law it follows directly that the Young diagram built from the interlacing sequence of roots and extrema of the characteristic polynomial converges to the Vershik-Kerov curve.
Aryan: In this talk, I will give a brief overview on some topics from the 90s; the Yang-Baxter equation, Quantum groups and how one can get families of the latter from the first. After giving an overview of these older results, I will describe some recent work on set-theoretical solutions of the YBE and how one can solve the same problem using lattices.
Nicholas: Matroids, first introduced in the 1930s, are a simple combinatorial generalization of linear independence. Since their introduction, matroids have become a powerful tool in contemporary mathematics, widely known for their classical applications in optimization and coding theory. This talk will focus on the basics. I will define matroids and matroid duality, demonstrate the existence of a “non-representable” matroid, and discuss the notion of matroid cryptomorphism, which is perhaps the theory’s greatest asset.
Marica: The extended conformal Einstein field equations and a gauge based on the properties of conformal geodesics can be used to analyse the non-linear stability of Einstein spaces with spatial sections of negative scalar curvature. This is done by considering a de Sitter-like spacetime, which is a vacuum spacetime with a de Sitter-like value of the cosmological constant. This class of spacetimes admits a conformal extension with a spacelike conformal boundary and represents the simplest application of the conformal field equations to the analysis of global properties of spacetimes. The existence and stability theorem for this type of spacetime can be proven by means of hyperbolic reduction procedures. The method that we use relies on conformal Gaussian systems which, combined with the conformal field equations, allow us to formulate initial value problems for the perturbed de Sitter-like spacetime not only on a standard initial hypersurface at a fiduciary finite time, but also on a hypersurface corresponding to the conformal boundary of the spacetime.
Join Quips event
Special Session for the PhD opportunities day.
We use the formalism of Quantum Riemannian Geometry to construct FLRW models, taking the real line as 'classical time' and a polygon or a fuzzy sphere as 'space'.
The Heisenberg model is a central model in statistical mechanics, modelling how particles behave at the macro scale by studying their microscopic interactions. I will introduce this model and its quantum version, which has a different flavour and can be analysed with different tools. In particular, I will give an overview of how an algebraic approach can work, using the symmetric group and the Brauer algebra, and their representation theory.
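For concreteness, the quantum Heisenberg Hamiltonian in its standard textbook form (sign and coupling conventions may differ from those used in the talk) is $H = -J \sum_{\langle i,j \rangle} \mathbf{S}_i \cdot \mathbf{S}_j = -J \sum_{\langle i,j \rangle} \left( S^x_i S^x_j + S^y_i S^y_j + S^z_i S^z_j \right)$, where the sum runs over nearest-neighbour pairs, the $S^a_i$ are spin operators, and $J > 0$ favours ferromagnetic alignment.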
Hanlin: With the emergence of new pandemic threats, scientific frameworks are needed to understand the unfolding of an epidemic. The use of mobile apps that are able to trace contacts is of utmost importance in order to control new infected cases and contain further propagation. In this talk I will present a theoretical approach, using both percolation and message-passing techniques, to the role of contact tracing in mitigating an epidemic wave.
This work is done jointly with Ginestra Bianconi, Giacomo Rapisardi and Alex Arenas.
Domagoj: The credit exposure measures the potential loss to a party if its counterparty defaults on a financial derivatives deal. In this talk, we propose a method based on Chebyshev interpolation to accelerate the computationally demanding task of calculating credit exposures. We assess the performance of the approach for several financial derivatives in different option pricing models: Black-Scholes-Merton, Merton’s jump-diffusion, and Heston’s stochastic volatility model. We consider four different types of equity derivatives: European, digital, barrier, and Bermudan options. We report the accuracy and runtimes of the Chebyshev interpolation approach against a more direct method of exposure calculation called full re-evaluation.
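To give a flavour of the acceleration idea (a toy sketch of my own, not the authors' implementation, with the Black-Scholes call standing in for an "expensive" pricer): fit a Chebyshev interpolant to the pricing function at a handful of nodes, then evaluate the cheap interpolant over the many scenarios an exposure profile requires.

```python
# Toy sketch: Chebyshev interpolation of an "expensive" pricing function.
import math
import numpy as np

def bs_call(spot, strike=100.0, t=1.0, r=0.01, sigma=0.2):
    """Black-Scholes call price (stands in for an expensive pricer)."""
    d1 = (math.log(spot / strike) + (r + 0.5 * sigma**2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return spot * cdf(d1) - strike * math.exp(-r * t) * cdf(d2)

lo, hi, deg = 50.0, 150.0, 12
k = np.arange(deg + 1)
u = np.cos((2 * k + 1) * np.pi / (2 * (deg + 1)))     # Chebyshev nodes in [-1, 1]
nodes = 0.5 * (lo + hi) + 0.5 * (hi - lo) * u         # nodes mapped to [lo, hi]
values = np.array([bs_call(s) for s in nodes])        # few "expensive" evaluations
coeffs = np.polynomial.chebyshev.chebfit(u, values, deg)

spots = np.linspace(lo, hi, 10_000)                   # many scenario spot levels
u_spots = (2 * spots - (lo + hi)) / (hi - lo)         # map scenarios back to [-1, 1]
approx = np.polynomial.chebyshev.chebval(u_spots, coeffs)
exact = np.array([bs_call(s) for s in spots])
print("max interpolation error:", np.max(np.abs(approx - exact)))
```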
In this talk, we discuss the eigenvalues of large random covariance matrices and their universal limiting distribution after the matrix is normalized. The limiting distribution of the eigenvalues is called the Marchenko-Pastur distribution. We discuss the technique of the Stieltjes transform, which can give results about convergence on local scales where there are only a constant number of eigenvalues. We prove the optimal rate of convergence on these local scales.
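For reference, in one standard normalisation (conventions in the talk may differ): if the data matrix $X$ is $p \times n$ with i.i.d. unit-variance entries and $p/n \to \lambda \in (0, 1]$, the sample covariance matrix $\frac{1}{n} X X^T$ has limiting spectral density $f(x) = \frac{1}{2\pi \lambda x}\sqrt{(\lambda_+ - x)(x - \lambda_-)}$ on $[\lambda_-, \lambda_+]$, where $\lambda_\pm = (1 \pm \sqrt{\lambda})^2$. The Stieltjes transform of a spectral measure $\mu$, the basic tool mentioned above, is $m(z) = \int \frac{\mathrm{d}\mu(x)}{x - z}$ for $z$ in the upper half-plane.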
Lidia: In 2034, the space-based satellite operating in the millihertz band, the Laser Interferometer Space Antenna (LISA), is due to be launched; this will provide the opportunity to extract physics from stellar objects and systems that would not otherwise be possible. Among these systems are those consisting of a compact object such as a black hole (BH) or neutron star (NS) and a massive/supermassive BH, also known as Extreme-Mass-Ratio Inspirals (EMRIs). These systems, unlike previous sources detected at the Laser Interferometer Gravitational-Wave Observatory (LIGO), need to account for an accurate estimate of the action of a local force, the gravitational self-force (GSF), resulting from the gravitational effects of the compact object orbiting around the massive BH. Modelling the orbital evolution of these systems and extracting gravitational waves is an open challenge. In this talk I outline a novel self-consistent time-domain numerical method for reconstructing the metric perturbation through the Regge-Wheeler-Zerilli/Teukolsky formalism. This method follows the method-of-lines recipe for the time evolution of partial differential equation problems and uses discontinuous (pseudospectral/finite difference) collocation methods to approximate the discontinuous source functions. This work is done jointly with Juan Valient Kroon and Haris Markakis. Any fellow PhD student facing problems handling discontinuously sourced linear PDEs may find this talk of particular interest and is warmly welcomed.
Luka: Gödel's Incompleteness Theorems have been some of the most famous modern mathematical results since their publication in 1931. Naively, they are often stated as
> "Every mathematical theory has some theorem that is true but cannot be proven."
and
> "No mathematical theory can prove its own consistency."
In this talk I will give an overview of these theorems in a way suitable even for non-logicians. In particular, we will discuss the precise statements and the implications these theorems have and had.
In his book "Thesaurus of scales and melodic patterns", the music theorist Nicholas Slonimsky introduced what he called the Mother Chord: a chord containing all of the twelve notes and all the eleven intervals. The very existence of this chord gives rise to an interesting problem in combinatorics: given a natural number n, do there exist permutations P of the numbers [1, n] such that every partial sum of the images of consecutive integers under P is nonzero mod n+1? If yes, how many are there? In this talk I will introduce this problem starting from its musical background and will make some very simple observations on it, hoping to draw the participants' interest.
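Taking the stated condition literally, the problem is easy to explore by brute force for small n (a sketch of my own; n = 11 is the musical case but is slow to enumerate this way):

```python
# Sketch: count permutations of 1..n all of whose consecutive block sums are nonzero mod n+1.
from itertools import permutations

def is_admissible(perm, modulus):
    # every consecutive block sum perm[i] + ... + perm[j] must be nonzero mod `modulus`
    n = len(perm)
    for i in range(n):
        s = 0
        for j in range(i, n):
            s = (s + perm[j]) % modulus
            if s == 0:
                return False
    return True

for n in range(1, 8):   # n = 11 corresponds to the Mother Chord, but brute force is slow there
    count = sum(is_admissible(p, n + 1) for p in permutations(range(1, n + 1)))
    print(n, count)
```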
Stability is usually a desired property. It guarantees that things will eventually end up as expected, provided that some boundary conditions are satisfied.
Stability is an idea so intuitive that it can easily be applied to a plethora of fields, ranging from mathematics to physics to engineering. For the same reason, the word has taken on slightly different meanings in different fields.
The notion of stability can be found in algebra, complex systems, dynamical systems, geometry, numerical analysis, probability, and statistics. Chances are almost all of us have dealt with stability at least once. And if you haven't yet, this is the right time to see how stability can be useful for your project (or how your project can be helpful in redefining stability!)
Possible goals:
1) Do an informal survey on what “stability” means in different fields.
2) Use “stability” as a side task on some current PhD projects.
3) Develop new techniques to impose stability (e.g. in nonlinear dynamical systems)
In order to uncover the mechanisms driving the exploration and exploitation of new ideas/songs/items, the new framework of the "adjacent possible" has recently been proposed. However, we still know very little about how it is structured or how it is explored. In this talk we will try to understand how to represent it through the use of complex networks tools. In particular, I will show a new multi-agent model which generalises one of the most recent interpretations of the adjacent possible (i.e. the urn model with semantic triggering). We will make use of a real-world data set crawled from Last.fm, containing all social and musical activities of 4836 users sampled from the platform data, which lets us know in real time which artists are "closest" to any artist of your choice (at least based on those users' tastes). We will finally see that the proposed model is able to reproduce the key features found in the empirical data set, namely (i) the heterogeneity of individuals' exploration rates, (ii) the correlations in the sequence of music consumption, (iii) the emergence of echo chambers, and (iv) the higher-order correlations between the rates of exploration and key topological properties of the social network.
In 1827 the British botanist R. Brown observed that pollen particles move chaotically on a water surface. This happens because of collisions with water molecules. The movement can be described as a random walk in the plane. The physical essence of the heat transfer problem is the exchange of energy (through collisions) between molecules of the substance.
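A minimal sketch of such a planar random walk (my own illustration): at each step a unit step is taken in a uniformly random direction, and the squared displacement grows roughly linearly with the number of steps, the signature of diffusion.

```python
# Sketch: Pearson-style random walk in the plane with unit steps in random directions.
import numpy as np

rng = np.random.default_rng(0)
steps = 1000
angles = rng.uniform(0.0, 2.0 * np.pi, size=steps)    # choose a direction at each step
displacements = np.column_stack((np.cos(angles), np.sin(angles)))
path = np.cumsum(displacements, axis=0)                # positions after each step

# Diffusive scaling: squared displacement is comparable to the number of steps.
print("final position:", path[-1])
print("squared displacement:", np.sum(path[-1] ** 2), "vs number of steps:", steps)
```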
Diego: Hecke algebras (of type A) can be seen as a "deformation" of the symmetric group algebra. They inherit many of the structural properties of the latter, including the study of the modules of this algebra via integer partitions. We will introduce a combinatorial presentation of partitions, the abacus, present a recent result, and formulate a conjecture generalising this result which involves a well-known sequence. The talk will be almost exclusively on combinatorics, so no algebra background will be required.
Dean: The algebraic structure induced from the symmetric group $S_n$ is intimately linked to the combinatorics of partitions of $n$. We will describe the connection between the two and will present a recent result from the author on bar partitions, i.e. subsets of the natural numbers.
Cryptography is vitally important in our modern day society and a whole host of things depend on its existence. In this talk I shall give an overview of several different crypto-systems which are currently in use, namely, RSA and elliptic curve cryptography. I will outline the method of how they work and the advantages and disadvantages of both. I will then give some ideas behind crypto-systems which can be used in a post quantum computer world.
With probability one there exists a unique (up to isomorphism) random graph on countably many vertices, called the Rado graph. This graph has the cute property that any finite or countable graph can be embedded into it as an induced subgraph (it is universal) and that any partial automorphism may be extended to an automorphism of the whole graph (it's very symmetric). Why is this interesting? Who knows, but these are properties that definitely do not happen in finite graphs and uniqueness is always nice. In this talk I'll discuss work generalising the Rado graph to a unique, universal and symmetric (infinite dimensional) simplicial complex. Won't assume much. Joint work with M. Farber and L. Strauss.
Cartan subalgebras of C*-algebras lie at the core of the interplay between C*-algebras, topological dynamics and geometric group theory. Thus it becomes natural to ask which C*-algebras have Cartan subalgebras, and to what extent these are unique.
Using K-theory, I will show that AF algebras have unique canonical Cartan subalgebras. To this end I will give a rough overview of what K-theory is and how it applies to C*-algebras, what AF-algebras are and what the canonical Cartan subalgebra looks like. I don’t assume any specific prerequisites, but the talk becomes more digestible if you know a bit about groups, matrix algebras, functional analysis, and the notion of functoriality.
Svetlana: The Combinatorial Nullstellensatz is an algebraic theorem relating the coefficient of a certain monomial in a polynomial to the polynomial's values. It can be applied in various problems of additive combinatorics and graph colouring to prove existence theorems: the existence of a point at which a polynomial is nonzero means that some object satisfies the desired property.
Julio Narciso Argota Quiroz: We show a way to construct differential calculi on the algebra of functions over a finite set. We also describe the case where the set has a group structure, and define geometric structures such as connections and curvature.
Mathematical billiards interest both pure and applied mathematicians. In this talk, I want to present a brief overview of some known facts in a very approachable way. We will review billiards on all kinds of different tables along with their respective dynamical properties. No prerequisites are necessary, however, there shall be some side notes for the more abstract mathematicians.
Luka: Contradicting geometric intuition, the Banach-Tarski Paradox is surely one of the most controversial consequences of the Axiom of Choice. Although many students -- and mathematicians in general -- have heard of the theorem's existence, the proof is seldom taught, despite being very easily understood. In this talk, I want to introduce the precise statement of the Banach-Tarski Paradox, give a good idea of the proof and a short discussion about why it does not necessarily contradict a mathematician's intuition. If time allows for it (or if everybody attending knows the original proof already), I can also explain how the theorem follows from weaker forms of the Axiom of Choice.
Antonino: It is usually standard in classical category theory to ask that the collection of morphisms between two objects is a set as part of our data; we will generalize this notion. In turn, this will allow us to develop category theory over a base category which is monoidal closed. We will define what it means to be monoidal closed and then give examples of such categories.
Elliptic curves are used in all areas of Number theory and also have a number of real world applications. Mathematicians have been studying these curves since the time of the Ancient Greeks. They were interested in whether these curves have any integer solutions. In this talk we shall be focusing on whether these curves have any rational solutions (rational points) and if so how many? I will start by discussing some background theory. I shall then discuss one of the major theorems in this area, namely the Mordell-Weil Theorem, which states that the group of rational points is finitely generated. Time permitting I shall then discuss some extensions to this theorem and also open problems in this area. Throughout this talk I will not be assuming any prior knowledge about elliptic curves or group theory.
Natalie: The generalised twin prime conjecture states that for any integer $n \geq 1$ there are infinitely many primes $p$ such that $p+2n$ is also prime. Hardy and Littlewood generalised this, conjecturing an asymptotic formula for the number of prime $k$-tuples up to $x$. In this talk, we will discuss what is known so far in the case of prime pairs of the form $(p, p+2n)$, including an introduction to the circle method. We will then explore a related question on the number of 2-almost prime pairs (pairs of numbers which are products of exactly two primes) differing by an integer $h$ up to $x$.
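For the curious, the counting function in question is easy to evaluate numerically for small $x$ (an illustrative sketch of my own; the gap $2n$ and the cutoff $x$ below are arbitrary):

```python
# Sketch: sieve-based count of prime pairs (p, p + 2n) with p <= x.
def prime_sieve(limit):
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return sieve

x, two_n = 100_000, 6                    # count pairs (p, p + 6) with p up to 100000
is_prime = prime_sieve(x + two_n)
pairs = sum(1 for p in range(2, x + 1) if is_prime[p] and is_prime[p + two_n])
print(pairs)
```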
Danilo: Neural Networks (NNs) are a family of black-box mathematical models. They have obtained astonishing results in many different fields, such as image classification, voice recognition, and reinforcement learning problems. A huge number of papers have dealt with this topic in the last two decades, showing that it is a hot topic. However, it may be hard to understand what is important to read to gain a basic knowledge of the subject. To overcome this issue, the main ideas underlying NNs will be briefly introduced and summarised in this talk. Provided that there is enough time, I will also show how to use a simple evolutionary algorithm to train a Neural Network on a reinforcement learning problem. The talk is open to everyone; no previous knowledge of this topic is required.
In this talk I want to give an overview of one way geometry and number theory are related. Symmetric spaces are Riemannian manifolds that are everywhere symmetric. Locally symmetric spaces are Riemannian manifolds that can be covered by a symmetric space. I will explain how this purely geometric definition relates to the notion of arithmetic groups.
Oliver's Abstract: Markov chains, those processes which do not have memory, are among the most common models used in applied mathematics (see the PageRank algorithm for a particularly over-used reference). Unfortunately there are some real-world systems which can’t be modelled in this way: processes with memory. It is natural then to ask about how one might measure the memory present in a given time series. We will first explore the general case in terms of an information theoretic framework, and find a usable estimator. We will then take a brief look at temporal networks and see that there are multiple ways to approach this, some of which don’t really work.
Francesco's Abstract: In this short talk we will get stunned by the beautiful probabilistic theory of large deviations, which predicts that the probability of rare events in a stochastic system decays exponentially with the system's size. To familiarise ourselves with some basic concepts of the theory we will try to understand the formation of Mexican waves, and (if time allows) try to reproduce one at the end of the talk to congratulate the speaker.
Diagrammatic algebra is very hyped up these days, so we'd like to know why. We'll take a very elementary approach to see how the multiplication of natural numbers can be described via diagrams. To really understand why the diagrams we choose work, we'll have to think about where the monoid of natural numbers really lives. Something this simple really turns out to be an object of a higher category in disguise! The talk should be open to everyone who knows what sets and functions are!
The technique of reaching a critical point of a functional by following a "mountain pass" has found manifold applications in both analysis and geometry, among other fields. We shall study the mountain pass theorem, widely regarded as a prototype result of modern critical point theory in the calculus of variations, and consider some basic applications to non-linear partial differential equations and minimal surfaces.
This talk will be about the jamming state of granular materials and will introduce the random close packing problem. Granular materials, which are composed of macroscopic grains such as sand, sugar, bearing balls etc., are ubiquitous in our everyday experience. Nevertheless, a fundamental description of both the statistical and dynamic properties of granular matter is challenging. For example, it is not clear whether the jamming transition (a fluid-to-solid phase transition) of granular materials is governed by a variational principle of an associated thermodynamic quantity like the free energy in equilibrium systems. Jamming transitions not only occur in granular media, but also in soft materials such as colloidal suspensions, compressed emulsions, foams, glasses and biological materials such as cells, DNA and protein packing. And you will see that this jamming transition is related to the problem of identifying the densest packing of particles. Esma will try to explain the central role of the shape of the particles in this phase transition.
We will introduce the concept of moving frames and where to use them, along with examples (and animations!). The notion of a quantum ribbon with the definition of the corresponding quantum Hamiltonian on such a strip will follow. Spectral results for different kinds of ribbons will be presented, and more importantly, explained. Last but not least, the spectrum of the Möbius strip will be tackled in three different models both analytically and numerically, with comparisons of the results.
Periodic functions appear in various areas of mathematics - including Number Theory. A fundamental tool to study them is Fourier series, and I will start with a reminder about them. Taking inspiration from this setting, I will move to a more general context and explain how spectral analysis can be used to decompose the so-called right regular representation of some algebraic groups and give rise to automorphic forms.
In this talk we will be recalling some fundamental notions of category theory, in particular the notion of a limit. We shall then define an elementary topos which will be followed by some important examples, such as the category of presheaves (and if time permits, a "light" discussion of sheaves).
Financial institutions form a highly interconnected system, and its disruption has serious consequences for the economy and society. I will describe the 'PD model', a dynamical model of the financial system as a network of banks (nodes) characterized by their total assets, equity and probability of default PD per unit time. The edges of the network represent credit exposures between banks (for example loans). The contagion mechanism is an increased PD for nodes in the neighbourhood of a defaulted bank. The network is also characterized by a correlation matrix that is linked to the tendency of the nodes to default during the same time period. The results show the existence of a 'strong contagion' regime where a lower correlation between nodes is associated with higher risk. This is in sharp contrast with the standard assumption in financial credit risk, where a more diversified portfolio (lower correlation) is considered less risky. The 'PD' model unifies credit risk techniques with network theory and allows measuring systemic risk (the risk that a considerable amount of the network is disrupted) and assigning a systemic risk rank to each node. No previous knowledge of Finance is required.
A brief introduction to the concept of General Relativity is outlined. The aim is to give a description satisfying the mathematician (rather improbable...) and yet accessible to the intuitive learner. The emphasis will not be on differential geometry and tensor calculus, though these will be used, but rather on the physical ideas of the theory that are essential to understand. If time allows, I will briefly mention my own research within the framework given. Warning for the pure mathematician: this talk will be highly contaminated by applicable material.
Counting the number of trees on a labelled vertex set is a classical result called Cayley's formula that has been known since the 1800s, but counting of their generalisations has a much more recent history. The first question is how do you even generalise a tree to higher dimensions? As with cat skinning there are multiple ways but I will focus on just two: Q-acyclic simplicial complexes and minimal connected covers, with a particular emphasis on the latter. We will end the talk with a few results that do give Cayley type formulae for these generalisations.
A system of equations involving differential forms on a manifold is called an exterior differential system. Equivalently, an EDS can be understood as a differential ideal of the algebra of differential forms. Many interesting partial differential equations can be translated into this geometric framework to great effect. We shall introduce the most important notions in the theory of EDS, discuss the fundamental Cartan–Kähler theorem, and consider some simple applications.
Without delving too much into the language of categories and Hopf algebras, we will look at how the search for Knot Invariants, and invariants of Ribbons in a general setting feed into the structure of Quantum Groups. The Jones polynomial was discovered in the 1980s as a stronger invariant of Knots compared to the Alexander polynomial. However, in the 90s, after the discovery of Quantum groups, it was seen that the Jones Polynomial can be realised via the representation theory of quantum groups. The reason behind this theory is now understood much more generally and relates to Topological Quantum Field Theories (whatever that is!). By looking at what TQFTs in lower dimensions are, we will rediscover the axioms of Hopf Algebras, and Tensor Categories all together!
Complexity arises in many systems where the interaction of their components results in evolutionary processes and often emergent behaviour at a macro level. Examples are our society, the brain, the financial market and ecosystems. The heterogeneity of the spatial organisation of these systems quite often carries important information about their function as a whole. A particularly interesting problem in this area is the quantification of spatial segregation, i.e., the tendency of people to cluster around uniform patches of spatial settings. Despite the vast existing literature on the topic, quantifying segregation is still problematic, mainly due to the granularity of the data used, the spatial scale of the neighbourhoods, or the presence of one or more free parameters in the measures. An alternative formulation is to quantify the heterogeneity of the distribution of classes across a city by looking at the statistical properties of the trajectories of a random walk over the city graph. In my talk, I plan to give a short introduction to random walks, move on to applications on networks and finally introduce my research problem with some fresh results.
I will introduce the notion of topological groups, and in particular abelian locally compact Hausdorff groups (a.l.c.h.g.). Since general a.l.c.h.g.'s are a bit fiddly to work with and a mouthful to say, I'll focus on discrete and compact groups, and the important examples of the reals, the integers and the unit circle. The dual of a topological group is its space of characters, which is related to the Fourier Transform, where it is the frequency space. Pontryagin Duality states that every a.l.c.h.g. is canonically isomorphic to its double dual, which I will attempt to prove for compact and discrete groups.
In this talk we show that variables defined by a generic chaotic system can be used to approximate a stochastic process with a Gaussian distribution. If we regard the initial values as random variables, any deterministic dynamical system generates a stochastic process, but generally this process is not Gaussian, and is characterised by non-vanishing higher-order correlation functions. Those correlations can be described by sets of simple graphs, such as N-ary trees. We use a simple chaotic map to illustrate that the system gives a Gaussian probability density in a perturbative way. A little knowledge of the Bernoulli shift in dynamical systems and probability theory would be helpful, but the talk is friendly to everyone.
Markov Chain Monte Carlo (MCMC) continues to be a significant application of Bayesian Statistical thought 50 years after its development at the dawn of the nuclear arms race. Modern revolutions in computing, including artificial intelligence, machine learning and stochastic optimization are founded upon this simple but powerful concept. Therefore MCMC continues to be a versatile tool of scientists, as its applications are limited only by the computing power available and the imagination of the Statistician, with Moore's Law largely negating the first limitation with time. This presentation attempts to engage an audience who may be unfamiliar with MCMC by providing a concise introduction into the theoretical foundations of MCMC, such as "simple" Monte Carlo integration and relevant Markov Chain concepts such as Reversibility, Ergodicity, and the Detailed Balance equation. Furthermore, the discussion will seek to briefly elucidate upon common methodologies of MCMC in practice, such as the Metropolis-Hastings Algorithm and Gibbs Sampling, along with a brief exposure to practical implementation issues such as idealized acceptance rates and convergence.
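As a taste of the Metropolis-Hastings algorithm mentioned above, here is a minimal random-walk sampler targeting a standard normal density (an illustrative sketch of my own, not code from the talk):

```python
# Sketch: random-walk Metropolis-Hastings targeting a standard normal distribution.
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0):
    x, samples, accepted = x0, [], 0
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)              # symmetric random-walk proposal
        log_alpha = log_target(proposal) - log_target(x)
        if random.random() < math.exp(min(0.0, log_alpha)): # accept with prob min(1, alpha)
            x, accepted = proposal, accepted + 1
        samples.append(x)
    return samples, accepted / n_samples

samples, rate = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=50_000)
print("acceptance rate:", rate)
print("sample mean and second moment:",
      sum(samples) / len(samples),
      sum(s * s for s in samples) / len(samples))
```

The chain is reversible with respect to the target by construction (detailed balance), which is exactly the property the theoretical part of the talk builds on.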
Linear operators and isometries are two important mathematical notions which are widely used in many branches of mathematics, especially in analysis-related branches. It's natural to consider the relation between them. In my talk, I will prove a theorem which shows you how far a linear operator on a Banach space is from an isometry. If time permits, I'd also like to give some cases in which an isometry is linear. A good command of real analysis and functional analysis will make it easier for you to understand the talk well.
In this talk we will survey the rich interplay between the representation theory of the symmetric group $S_n$ and the combinatorics of partitions of $n$. The latter happen to be in bijection with the irreducible representations of $\F{S_n}$ when $\F$ is an algebraically closed field of characteristic zero. Unfortunately, this may not be true when $Char(\F)>0$. We will have a look at what information about the simple modules of $\F{S_n}$ in positive characteristic it is possible to obtain by looking at partitions of $n$. Only elementary group theory will be needed to follow the results given here.
In this talk we will study the dynamical behaviour of a conservative, nonlinear ODE system of coupled oscillators governed by a "special" potential derived from Hollomon's law. This characterizes the phenomenon known in engineering as "work hardening". We will start with the quite famous simple harmonic oscillator and then proceed with generalizing this model, to obtain its solution in the form of generalized trigonometric functions. As an application, we study a two-degrees-of-freedom problem. To better understand the underlying dynamics, we produce and study the corresponding Poincaré sections for a specific energy value. This talk is focused on how a simple generalization of an ODE (mathematicians just love doing that) can open the door to applications in other fields.
[1] Wei, Dongming, and Yu Liu. "Some generalized trigonometric sine functions and their applications". (2012).
[2] D. Shelupsky, "A generalization of the trigonometric functions", The American Mathematical Monthly, vol. 66, no. 10, pp. 879-884, December 1959.
[3] F. D. Burgoyne, "Generalized trigonometric functions", Mathematics of Computation, vol. 18, pp. 314-316, 1964.
As the name 'controllability' suggests, we want to control something. Basically, we want to take the solution of the considered ODE from its initial state to a chosen final state, using a suitable control. I will show that this problem is equivalent to the notion of observability, and derive a condition specifying when the ODE is controllable. Control theory has applications in many other subjects such as Physics, Biology, Economics etc. But, don’t worry, I will not address any of those in the talk.
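For linear systems x' = Ax + Bu, the classical condition of this kind is the Kalman rank criterion: the system is controllable exactly when the matrix [B, AB, ..., A^(n-1)B] has full rank. Whether this is the condition derived in the talk is my assumption; here is a small sketch with an arbitrary example system:

```python
# Sketch: Kalman rank condition for controllability of x' = Ax + Bu.
import numpy as np

def is_controllable(A, B):
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])                 # build [B, AB, ..., A^(n-1)B]
    C = np.hstack(blocks)
    return np.linalg.matrix_rank(C) == n

A = np.array([[0.0, 1.0], [0.0, 0.0]])                # double integrator (position/velocity)
B = np.array([[0.0], [1.0]])                          # a single force input
print(is_controllable(A, B))                          # True: one force steers both states
```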
Heavy particles suspended in turbulent fluid flows, so-called turbulent aerosols, are common in Nature and in technological applications. A prominent example is rain droplets in turbulent clouds. Due to their inertia, ensembles of aerosol particles distribute inhomogeneously over space and can develop large relative velocities at small separations. We use statistical models that mimic turbulent flow by means of Gaussian random velocity fields to describe these systems. Compared to models that involve actual turbulence, our statistical models are simpler to study and allow for an analytical treatment in certain limits. Despite their simplicity, statistical models qualitatively explain the results of direct numerical simulations and experiments.
In my talk, I will discuss how methods of non-equilibrium statistical mechanics and large deviation theory are used to study statistical models of heavy particles in turbulence. The one-dimensional versions of the models will serve as simplified playgrounds to create intuition for, and give important insights into the behaviour of higher dimensional particle systems.
Linear programming is a well studied branch of optimisation where everything is linear. Linear is usually code for 'easy', but this isn't quite the case for linear programming, as fast algorithms tend to break for some special examples. This led Smale to list the complexity of linear programming as one of the 18 most important problems in modern mathematics. In this talk I'll outline some of the main algorithms and results from linear programming, and show how a new branch of maths is shedding light on this old problem in surprising ways.
Recently, seamless phase or combined phase clinical trials have become more popular for reducing the estimated time it takes to complete the development of drugs. A seamless phase II/III clinical trial's main purpose is to compare a number of drugs or doses in a single trial conducted in two stages. The first stage studies all of the experimental doses or drugs and selects the population with the largest sample mean. This selected treatment will continue to the second stage for further analysis. The problem is to obtain the best estimator of the mean of the selected population. In the analysis of two-stage trials, the issue of estimation bias introduced by treatment selection has long been known. The sample mean for the selected population and the maximum likelihood estimator (MLE) are biased estimators of the corresponding population mean, due to combining data from both stages. To correct for the bias efficiently, the uniformly minimum variance conditionally unbiased estimator (UMVCUE) has been derived for trial designs with normally distributed data and unequal stage one and stage two sample sizes. Moreover, formulae for the variances of the MLE and the UMVCUE have been obtained and are compared. Finally, simulation results for the bias of the MLE are presented.
Luckily, during this talk, we will not be buzzing around but will focus on a way more interesting problem: how likely is it that your trousers' zipper inevitably decides to mock you in a crowd and comes naturally down!? At first sight this looks like a bland problem to face (one can always start dressing in trousers with buttons!), but it will instead be an interesting introduction to the fascinating world of statistical mechanics as part of theoretical physics. We will study how entropy, and its convexity, are related to the zipper problem, realizing that in the (thermodynamic) limit of an infinitely long zip a phase transition appears, leaving us without trousers and with the need to formulate a new calculation tool to replace the Legendre transform.
Named after the famous mathematicians Issai Scary and Hermann Wail, Scary-Wail duality describes a positively ghoulish relationship between the representations of the symmetric group S_n, and the general linear group GL(k). I will attempt to explain this nightmarish affair, with the ever-present spectre of the Manhattan algebra looming in the shadows.
You may know what 'factorizing a cube' usually means, but forget all that number theory now: this is graph theory, where a 'cube' is the graph formed from the vertices and edges of an n-dimensional cube, and a 'factorization' is a partition of the edges into spanning subgraphs. I will give an introduction to factorizations of graphs, focusing in particular on 1-factorizations on the complete graph and the cube. This will include some nice results (the wonderful Walecki construction!), some less nice results, and some open problems. There will also be a surprise. We will finish with a (literal) sketch of my recent results on factorizing the cube.
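For readers who have not met a 1-factorization before, the classical rotational (round-robin) construction for the complete graph K_{2n} is easy to write down (a sketch of my own, unrelated to the new results on the cube mentioned above):

```python
# Sketch: the standard rotational 1-factorization of the complete graph K_{2n}.
def one_factorization(two_n):
    m = two_n - 1                                    # rotate vertices 0..2n-2, fix vertex 2n-1
    factors = []
    for r in range(m):
        factor = [(r, two_n - 1)]                    # fixed vertex paired with the rotating vertex r
        for j in range(1, two_n // 2):
            factor.append(((r + j) % m, (r - j) % m))
        factors.append(factor)
    return factors

for f in one_factorization(6):                       # K_6 splits into 5 perfect matchings
    print(f)
```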
Category theory can sometimes have a reputation for being abstract and complicated; this talk will aim to provide an easy introduction with no prior knowledge needed nor assumed. I will try to give motivation for learning a little bit of category theory, then define relevant notions from scratch (categories, functors, natural transformations). The main goal is for us to understand the first big result: Yoneda's Lemma.
In this marvellous talk, I will give a simple proof of the Riemann Hypo... ehm, take two, action! In this astonishing seminar, I will answer one of the most crucial questions of our life: why is the pizza round? Then I will give some intuition for how to solve the same problem from the point of view of a not-so-hungry alien living in another universe.
I will introduce the notion of a groupoid, and using lots of examples I will demonstrate how many classic objects in mathematics (such as groups, graphs, dynamical systems, relations ...) give rise to groupoids. I will then try to convince you that, algebraically, groupoids are very easy to classify, and so in order to have a richer theory I will introduce the étale topology on groupoids. If time permits I will briefly (and very gently) discuss how one can construct C*-algebras from groupoids and how this all relates to my current work on Cartan subalgebras of C*-algebras.
We will start off by discussing what is probably the most famous example of a sequential optimisation problem - the secretary problem - and then naturally move on to more sophisticated, and, inevitably, more complex problems that are of particular interest in the field. The latter are approached by means of dynamic programming, which often leads to Bellman optimality equations with no closed-form solutions. Consequently, we will review asymptotic properties of the solutions to selected problems.
The thermodynamic uncertainty relation and bounds on time-integrated current fluctuations are recent advances in non-equilibrium thermodynamics but have, so far, been derived only for Markovian processes. We explore the validity of one particular result (which states that the entropy production rate bounds statistical errors in current fluctuations) and discuss some open questions in the context of a simple non-Markovian toy model - a discrete-time asymmetric random walk on a ring with one-step memory.
I will give an overview of the results Peter Cameron, David Ellis and I have obtained on the following, self-contained, problem (and some extensions): how small can a subspace of F_2^n be if its cyclic shifts cover all of F_2^n? The talk will contain combinatorics, basic representation and Galois theory, proofs, and (ever popular) counterexamples.
The tropical semiring is the real numbers with the operations min and plus. This odd choice of ring crops up surprisingly regularly in many fields, especially optimisation and computer science. When we try to do geometry over this ring, some surprising geometry and combinatorics fall out. In this talk we shall examine arrangements of hyperplanes over the tropical semiring and discover a link to classical graph theory via matchings. No prior knowledge required (or expected!).
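As a flavour of min-plus arithmetic (my own sketch, not part of the talk): replacing "plus" by "min" and "times" by "plus" in matrix multiplication turns matrix powers into shortest-path computations, which is one hint at the optimisation connection.

```python
# Sketch: tropical (min, +) matrix multiplication computes shortest paths.
import numpy as np

INF = np.inf

def tropical_matmul(A, B):
    n, m = A.shape[0], B.shape[1]
    C = np.full((n, m), INF)
    for i in range(n):
        for j in range(m):
            C[i, j] = np.min(A[i, :] + B[:, j])      # "sum" is min, "product" is +
    return C

D = np.array([[0, 3, INF],                           # edge weights of a small digraph
              [INF, 0, 1],
              [2, INF, 0]])
D2 = tropical_matmul(D, D)                           # shortest paths using at most 2 edges
print(D2)                                            # e.g. D2[0, 2] == 4 via the middle vertex
```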
Extremal combinatorics is typically about extremising combinatorial properties of constrained set systems. The most recent problem I have been working on is minimising the number of distinct unions we can achieve when choosing a fixed size family of k-subsets of [n]. Though not the most efficient description of my results, I will give a chronological account of my research, in the hope that it gives a more expressive exposition of the problem and that it might be interesting to you/cathartic to me to have the true nature of my research honestly accounted. I will also include a rapid introduction to the compression method and its limitations so that people can come away from the talk with some useful mathematics.
Given a locally compact Hausdorff group G and a unitary representation $\pi$, we can extend it to the *-algebra $L^1(G)$, consisting of all absolutely integrable functions with respect to the left Haar measure, to get a *-representation. In this way, we can define two classical group C*-algebras related to G. As a result, the C*-algebras of a group encode all the information about the unitary representations of the group. In general, the two group C*-algebras are different. However, we can construct a distinguished homomorphism between the two, and this homomorphism is an isomorphism if and only if the group is amenable.
I will introduce the definitions of some relevant concepts, give some examples and special cases to help understand them, and finally present the central result without proof.
Abstract: Given a finite point set X in some Euclidean space, one can build a simplicial complex on X (in different ways) by fixing a scale constant 'r'. The homology groups of the simplicial complexes obtained from all possible scales 'r' can be used to characterize X. In fact, it is possible to compute the ranks of the homology groups (for a fixed homological degree k) for all 'r' with an efficient algorithm, whose output is known as a persistence diagram. These diagrams can be considered "good summaries" of the topological and geometric information of X. In my talk I will go through the procedure used to obtain persistence diagrams, hopefully give a motivation for the claim that they are "good summaries", and present the research questions I am studying.
Abstract: Finding the symmetries a manifold possesses is of great interest, as they can provide some insight into its global properties. Remarkably, Noether's theorem states that each symmetry generates a conserved quantity; in particular, these symmetries are encoded by the so-called Killing vectors. This is particularly useful for mechanical systems, as one is able to integrate the equations of motion via quantities like energy or angular momentum.
A review of the basic notions of differential geometry, with emphasis on flows on manifolds, will lead to the concept of Killing vectors and how they are related to the symmetries of a spacetime. Additionally, I will briefly discuss my work on understanding these kinds of vectors in the context of General Relativity, specifically when conformal transformations are performed.
Abstract: In the late 1950s models were introduced to study random graphs and through the probabilistic method understand properties of deterministic graphs. It wasn't until the mid 2000s however that a similar treatment was first given to study the topological properties of large random simplicial complexes. We will begin with a quick overview of the Gilbert random graph before looking at several models of random simplicial complexes and surveying some (hopefully) interesting results.
The Ricci flow is a powerful tool in geometry and topology. Most famously, it has been used by Perelman to resolve the Poincaré conjecture. While the Ricci flow can be defined in any dimension, I will focus on the case of surfaces. In this setting, the Ricci flow successfully deforms any given surface into a surface of constant curvature. After briefly reviewing the geometry of surfaces, I will define the Ricci flow and explain how it achieves this on the sphere, following a proof of Hamilton. Finally, I will connect this to my own attempts at understanding closely related geometric flows on surfaces with more singular behaviour.
Several popular linear machine learning algorithms can be adapted to a nonlinear setting by means of the kernel trick. The resulting kernel methods operate implicitly in a high-dimensional feature space without the need to carry out expensive computations in said space. However, they still require the computation, storage, and manipulation of a matrix whose size scales with the size of the dataset. As more data is readily available to researchers than ever, techniques have been developed to work with a low-rank approximation of the kernel matrix instead, sampling only a small subset of relevant columns. This talk shall serve as an introduction to these methods.
Santa is lost in Manhattan, and needs to escape to deliver gifts to Queen Mary Maths PhD students before the night is out! Will he make it?! Join us on an exploration of strange traffic-inducing Xmas traditions, a city that literally goes on forever, and some gentle algebra representation theory, as we journey to the very centre of (the) Manhattan (algebra)
The modelling of the spread of epidemics forms a large and active area of research. The onset of a zombie apocalypse seems like an ideal time to make use of this work… if we are not prepared then how can we hope to survive? But zombies don’t seem to work like the flu, so we have to make some adaptations to the older models. I will go through some cases of how traditional epidemics are studied, and then how the models can be adapted to help forecast the end of the world.
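Purely for illustration (my own toy sketch, with made-up compartments and parameters rather than the models discussed in the talk), an SIR-type system with an extra zombie compartment can be stepped forward with a simple Euler scheme:

```python
# Toy sketch: SIR-type dynamics with a zombie compartment, integrated by Euler steps.
import numpy as np

def step(state, beta=0.4, zeta=0.2, alpha=0.1, kappa=0.05, dt=0.1):
    S, I, Z, R = state
    N = S + I + Z + R
    dS = -beta * S * I / N - zeta * S * Z / N             # susceptibles infected or bitten
    dI = beta * S * I / N - alpha * I                      # infected eventually turn
    dZ = zeta * S * Z / N + alpha * I - kappa * S * Z / N  # zombies recruited and destroyed
    dR = kappa * S * Z / N                                 # destroyed zombies are removed
    return state + dt * np.array([dS, dI, dZ, dR])

state = np.array([99.0, 1.0, 0.0, 0.0])                    # almost everyone starts susceptible
for _ in range(500):
    state = step(state)
print("S, I, Z, R after 50 time units:", np.round(state, 2))
```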
Nowadays it is well known that there is no general solution by radicals for a polynomial f with arbitrary coefficients (lying in a field of characteristic 0) when deg(f)>4. This was proved by Abel and is illuminated by Galois Theory. We will start by giving a brief introduction to this theory, which we will then use to show the cases in which a polynomial of degree 5 is solvable by radicals. If time allows, we will also work out a characterisation of solvable quintics in terms of the resolvent and the discriminant associated with the polynomial.
C*-algebras were introduced, as natural abstractions of matrices acting on Hilbert spaces, by Murray and von Neumann in 1935 in order to explain certain physical observables of quantum mechanics. Today, they form an integral part of pure mathematics as they are able to generalize and improve on many of the notions we have for classic operators: eigenvalues, rank, projections etc. In this talk I will prove some spectral ("eigenvalue") theoretic results for Banach and C*-algebras.
I will start off by providing plenty of examples of C* and Banach algebras, as well as their spectral properties. Then I will move on to prove one of the most important spectral theorems for Banach algebras. If time permits, I will prove the Spectral Mapping Theorem and the existence of a continuous functional calculus for self-adjoint operators of C*-algebras.
I aim to make the talk accessible for all attending, but knowledge of basic Functional Analysis will make you a little happier.
In this talk I shall give an introduction to noncommutative Riemannian geometry based on quantum groups and its proposed role in quantum gravity. In doing so I shall discuss the notion of a differential anomaly and how we can resolve this by increasing the dimension of the cotangent bundle beyond the classical. One such way this is done is by a mechanism known as 'spontaneous time generation', where we demonstrate that if time did not exist we would be forced to create it if space is noncommutative.
This talk is based on work completed in the mid 2000's by my supervisor S. Majid.
What does saturation mean in the context of graph theory? We say a big graph G is saturated with a small graph F if the graph G doesn't contain a copy of F, but adding any edge to G creates a copy of F. A lot is known about the maximum number of edges a saturated graph can have, but what about the minimum number of edges? It turns out this number is a slippery beast, and I'll talk about why. And then, because why not, we'll see if we can generalise these ideas to hypergraphs.
A key feature of many classical theories is that a complete knowledge of the present state of a system is sufficient to know the exact past and future behaviour of the system. As a classical theory of gravity, is this also true in general relativity? This question is at the heart of one of the major unsolved problems in mathematical relativity, the so-called strong cosmic censorship hypothesis. In this talk, I will introduce the main ideas of general relativity, show how initial data is represented, and explain the role of the strong cosmic censorship hypothesis in preserving determinism.
In this talk I will give an introduction to the topic of Extremal Graph theory. We will start by looking at Mantel's theorem, which tells us the maximum number of edges a graph with no triangles can have. We will then see how this generalises to Turán's theorem, which deals with larger complete graphs than triangles. If we have enough time, we may also talk about the Erdős–Stone theorem or Dirac's theorem.
No prior knowledge of graph theory will be assumed.
In 1917 Sōichi Kakeya asked the following question: "What is the least area needed to continuously rotate a needle by 360°?". If this is a question you have not come across before then you may enjoy trying to see how well you can do (by a needle we mean a unit line segment, and we may move the segment in 2D). We will look at Besicovitch's (1928) arguably surprising answer to this question and consider some related questions of a similar flavour. In particular we consider subsets of d-dimensional space which contain a unit line segment in all directions and we replace area (or more accurately measure) by some other measure of size such as Minkowski dimension or, in the case of the finite field analogue, cardinality. Time permitting we may discuss the resolution of the Kakeya conjecture for d=2, talk about Zeev Dvir's recent (2008) solution of the finite field analogue of the Kakeya conjecture, or even have time to look at a gif of a line segment rotating.
The importance of the probabilistic method in Combinatorics can hardly be overstated. Introducing the method, Erdős demonstrated its value by giving non-constructive proofs of the existence of graphs with certain exceptional properties using little more than elementary probability. Some of the results obtained have scarcely been improved in the intervening 60 years. Assuming no familiarity with graph theory I will give a gentle introduction to the method, surveying some early approachable results, before talking briefly about some recent work of Conlon, Fox and Sudakov on Sidorenko's conjecture.
In Graph Theory, for d ≥ 1, s ≥ 0, a (d, d + s)-graph is a graph whose degrees all lie in the interval {d, d + 1, . . ., d + s}. For r ≥ 1, a ≥ 0, an (r, r + a)-factor of a graph G is a spanning (r, r + a)-subgraph of G. An (r, r + a)-factorization of a graph G is a decomposition of G into edge-disjoint (r, r + a)-factors.
In this talk, I will give more general results and different techniques used to find the simple graph (r, s, a, t)-threshold number.
In Graph Theory, for d ≥ 1, s ≥ 0, a (d, d + s)-graph is a graph whose degrees all lie in the interval {d, d + 1, . . ., d + s}. For r ≥ 1, a ≥ 0, an (r, r + a)-factor of a graph G is a spanning (r, r + a)-subgraph of G. An (r, r + a)-factorization of a graph G is a decomposition of G into edge-disjoint (r, r + a)-factors. In this talk, I will provide upper and lower bounds for the simple graph (r, s, a, t)-threshold number σ(r, s, a, t), and for the multigraph (r, s, a, t)-threshold number µ(r, s, a, t). We also determine the pseudograph (r, s, a, t)-threshold number π(r, s, a, t).
We consider a one-parameter family of invertible maps of a two-dimensional lattice, obtained by applying round-off to planar rotations. We let the angle of rotation approach π/2 and show that the limit of vanishing discretisation is described by an integrable piecewise-smooth Hamiltonian flow, whereby the plane foliates into families of invariant polygons with an increasing number of sides. The round-off perturbation introduces KAM-type phenomena: a positive fraction of the unperturbed curves survives, and locally this fraction converges to a rational number strictly less than 1.
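One commonly studied form of such a round-off rotation (an illustrative sketch of my own; the exact family in the talk may differ) replaces the planar rotation by the invertible lattice map (x, y) -> (floor(lambda*x) - y, x) with lambda = 2*cos(theta):

```python
# Sketch: a discretised (round-off) rotation of the integer lattice.
import math

def orbit(x, y, lam, n_steps):
    points = [(x, y)]
    for _ in range(n_steps):
        x, y = math.floor(lam * x) - y, x     # invertible: (x, y) <- (y, floor(lam*y) - x)
        points.append((x, y))
    return points

lam = 2.0 * math.cos(1.5)                     # rotation angle close to pi/2 makes lam close to 0
print(orbit(10, 0, lam, 12))                  # the orbit traces an approximate rotation of (10, 0)
```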
Let G be a multigraph without loops and let F be a subgraph of G. An F-Tutte trail of G is a trail H of G such that (i) each component of G\V(H) has at most three edges connecting it to H, and (ii) each component of G\V(H) containing a vertex of F has at most two edges connecting it to H.
In this talk, I will show that every 2-edge-connected plane graph has a Tutte trail. I will also talk about the relation between Tutte trails and the Hamiltonicity problem.
Model theory is a branch of mathematical logic which is concerned with interpreting mathematical statements in different algebraic structures. It has a reputation for being abstract and difficult to grasp, but this talk will aim to show the power of considering a model theoretic perspective. I will give an introduction to the language of model theory then give an example of a proof made much simpler using these techniques. If time permits I will also give an idea of what aspects of model theory I use in my work.
We derive bounds on the size of regular induced subgraphs in certain regular graphs.
Given a type of object, a common goal in mathematics is to classify them. Geometry aims to do this via a moduli space, a parameter space for the objects that "reflects the geometric data in some nice way". Such hand-wavy intuition turns out to be hard to define, and spaces often become incredibly messy very quickly. As a result, the study of moduli spaces has a (kind of fair) reputation as an impenetrable mess. In this talk, we'll cut through the formal definitions with lots of concrete examples and give some insight into why such spaces can very quickly spiral out of hand. We'll conclude with an example of a particularly nice (ie. drawable) moduli space: the space of metric trees.
General Relativity constitutes the best theory we have so far to understand how gravity works at large scales. Mathematically, it is formulated using the language of differential geometry and topology, but PDE theory and analysis come into play when the dynamics are studied. In this talk I will give a brief introduction to the initial value problem in relativity and how conformal methods provide a nice approach to it. In particular, the anti-de Sitter spacetime will be taken as an example of what happens when initial data is not sufficient to establish a well-posed problem. Finally, some comments will be made about why theoretical physicists are so interested in understanding it, even though it does not represent a physical world.
Representation Theory is one of the areas of mathematics in which a wide range of techniques can be applied, from combinatorics and differential geometry to category theory. Different abstract algebraic objects can be understood via representations of them, namely matrices (linear transformations of vector spaces). Thus, we turn the problem of studying abstract algebra into the considerably easier problem of studying linear algebra. We will start with a quick review of the purpose and applications of Representation Theory. Following this, we will address the representations of finite groups and one of its current open problems: to describe and understand the composition factors of modular representations. Then, we will focus on the modular representation theory of S_n and an object of significant importance for its study over an arbitrary field: the Specht modules.
Covariate-adjusted response-adaptive (CARA) designs use available responses to skew the treatment allocation in an ongoing clinical trial in favour of the treatment arm found at an interim stage to be best for a patient's covariate profile. There has recently been extensive research on CARA designs mainly involving binary responses. Though exponential survival responses have also been considered, the constant hazard property of the exponential model makes the mean residual life for patients constant, making it too restrictive for wide-ranging applicability. To overcome this limitation, designs are developed for Weibull distributed survival responses by deriving two variants of optimal designs based on an optimality criterion. The optimal designs are based on the doubly-adaptive biased coin design (DBCD) in one case, and the efficient randomised adaptive design (ERADE) in the other. The observed treatment allocation proportions for these designs converge to the expected targeted values, which are derived based on constrained optimization problems. The merits of these two optimal designs are also discussed. Given the treatment allocation history, response histories, previous covariate information and the covariate profile of the incoming patient, an expression for the conditional probability of a patient being allocated to a particular treatment has been obtained. To apply such designs, the treatment allocation probabilities are sequentially modified based on the history of previous patients' treatment assignments, responses, covariates and the covariates of the new patient.
The ERADE is preferable to the DBCD when the main objective is to minimise the variance of the allocation procedure. However, the former procedure, being discrete, tends to be slower in converging towards the expected target allocation proportion. Since the ERADE provides a design with minimum variance, it is better than the CARA design based on the DBCD as far as the power of the Wald test for testing treatment differences is concerned. An extensive simulation study of the operating characteristics of the proposed designs supports these findings. It is concluded that the proposed CARA procedures can be suitable alternatives to the traditional balanced randomization designs in survival trials, provided that response data are available during the recruitment phase to enable adaptations to the designs.
The generalised sum of remainders map over a finite non-empty set of positive integers is a map which computes the sum of the remainders upon dividing its argument by every number in the set. The dynamics of this map, i.e., the behaviour of the sequence of numbers generated by iterating the map, is being studied. This talk will present some of the recent results, which are hopefully understandable even to audiences with minimal prior knowledge of dynamical systems and number theory. This is a joint work with Ryan Kasyfil Aziz.
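The map itself is simple to write down and iterate, following the description above (the divisor set in this sketch of mine is an arbitrary example):

```python
# Sketch: iterate the generalised sum-of-remainders map over a fixed divisor set.
def remainder_sum(n, divisors):
    # sum of the remainders of n upon division by every element of the set
    return sum(n % d for d in divisors)

def orbit(n, divisors, max_steps=20):
    seen = [n]
    for _ in range(max_steps):
        n = remainder_sum(n, divisors)
        seen.append(n)
    return seen

print(orbit(1000, {3, 5, 7}))   # iterate the map and watch where the orbit settles
```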
Discrete autoregressive processes (or DAR(p)) provide a simple way of generating time series with a controlled amount of memory, and so can be used to build time-varying networks that themselves have memory. If we take these time-varying networks and run processes on top of them, we can then look at how these processes react to a change in the influence of memory. Here this network process will take the form of a simple epidemic spreading model, and we will see that memory can either speed up or slow down the infection's passage through the network.
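A minimal DAR(1) sketch (my own illustration): with probability q the new value copies the previous one, and otherwise it is resampled from the marginal, so q tunes the amount of memory without changing the marginal distribution.

```python
# Sketch: DAR(1) time series, e.g. the on/off state of a single edge in a temporal network.
import random

def dar1(n_steps, q=0.7, p_edge=0.3, seed=1):
    rng = random.Random(seed)
    x = [1 if rng.random() < p_edge else 0]          # initial state drawn from the marginal
    for _ in range(n_steps - 1):
        if rng.random() < q:                          # copy the previous state (memory)
            x.append(x[-1])
        else:                                         # resample from the marginal
            x.append(1 if rng.random() < p_edge else 0)
    return x

series = dar1(10_000)
print("empirical mean:", sum(series) / len(series))  # stays near p_edge; q only adds persistence
```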
Tropical Geometry arose in the late 90s as a way of studying algebraic geometry through polyhedra. However, it made news headlines in 2007 when economist Paul Klemperer used it to design an auction to protect banks and building societies against the financial crisis. Since then, it has found huge applications in multiple disciplines including combinatorics, game theory and, unsurprisingly, algebraic geometry. We'll give a quick introduction to the subject and then focus on a couple of the applications it has in other fields.
Most of us learnt at a young age that the game of 'Tic-tac-toe' (or 'noughts and crosses') ends in a draw, at least if both players play with a modicum of sense. At that stage, most of us probably wrote the game off, but perhaps we shouldn't have been so hasty. There are several natural generalisations of the game which are not so easy to play, and many basic questions about these are still unsolved. We will discuss some of the theory behind these and other positional games, such as 'Hex' and 'Sim', including providing answers to questions like 'how do you steal a strategy?', 'why is avoiding harder than achieving?' and 'who or what is Snaky?'
The octonions are an 8-dimensional normed real division algebra which are often overlooked due to their non-associativity. This talk will hopefully give an intuitive introduction to them and some of the challenges related to working with them, specifically in relation to their projective plane.
The talk will present a proof of the Alon-Boppana result due to A. Nilli, providing a motivation for the definition of Ramanujan graphs. This will be followed by a sketch of the result due to Marcus, Spielman and Srivastava on the existence of infinite families of Ramanujan graphs of arbitrary degree.
The Tutte polynomial for matroids is not directly applicable to polymatroids. For instance, deletion-contraction properties do not hold. We construct a polynomial for polymatroids which behaves similarly to the Tutte polynomial of a matroid, and in fact contains the same information as the Tutte polynomial when we restrict to matroids.
Braided Hopf algebras are a generalisation of quantum groups (also known as ordinary Hopf algebras): they are Hopf algebras that live in braided monoidal tensor categories, and they are commutative and cocommutative up to a braiding map. Associated to a braided group B in the category of A-modules over a dual quantum group A, we construct a new dual quantum group U(B^(op), A, B*), where "op" means the multiplication is braided-opposite, and B* is the dual of B. An application of this construction is to obtain a description of a basis of Cq[SL2], which is a very hard combinatorial problem.
In this talk, I will recall the definition of an ordinary Hopf algebra together with some related definitions and examples, then introduce the concept of a braided Hopf algebra. Afterwards, time permitting, I will state the new construction of dual quantum groups and some related results.
Simplicial complexes can be thought of as a generalisation of networks able to encode interactions occurring between two or more nodes via links, triangles, tetrahedra etc. They are emerging as a tool for describing networks with an abundant number of short loops and large clustering coefficients, and have already been used to describe a large variety of complex interacting systems ranging from brain networks to social and collaboration networks. They are also ideal mathematical objects for the discretisation of geometry and so may also open up possibilities for uncovering the hidden geometries of networks.
In this talk I’ll give a brief introduction to the field of networks, explaining the challenges faced and giving some motivation for the use of simplicial complexes. I’ll then introduce two ‘maximum entropy’ models of simplicial complexes based respectively on the placing of hard and soft constraints on their local structural properties.
The models are investigated from a statistical mechanics perspective, and an interesting relation between their entropies is identified. This relation allows us to obtain an expression for the total number of simplicial complexes that satisfy a set of constraints on their structure.
Piecewise-smooth dynamical systems have attracted a lot of interest in the last decade. Whereas deterministic models have been studied intensely, their stochastic counterparts are still in their infancy. Systems with dry friction subjected to stochastic perturbations are prominent examples of piecewise-smooth stochastic systems. There are only a few cases known where exact results can be obtained. In the first part of my talk I will give a short introduction to concepts and methods from statistical physics, e.g. the Langevin equation and the Fokker-Planck equation. Then I will talk about dry friction models subjected to Gaussian white noise. Finally I will present some recent results for a dry friction model and coloured (exponentially correlated) noise.
Questions like "why aren't identical twins genetically identical?" and "can we quantify how long a virus takes to infect a cell once it enters the body?" have haunted researchers for a long time. The answers to these questions are hidden in a fundamental biological phenomenon called gene expression, which is an intrinsically stochastic process. The expression of a gene can simply be described as the journey from DNA to proteins.
Modelling gene expression using the tools of stochastic processes can shed some light on the above questions. To be more precise, we will talk about the Fano factor (variance/mean), stochastic differential equations (SDEs), the master equation, the Fokker-Planck equation (FPE) and the mean first passage time (MFPT) in the context of gene expression (without delving into the intricate biological details).
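As a minimal illustration of the quantities mentioned above, here is a Gillespie-type simulation sketch of the simplest birth-death model of gene expression (the choice of method and the rate constants are my own illustrative assumptions, not the talk's models); for this model the stationary copy-number distribution is Poisson, so the Fano factor should come out close to 1.

```python
import random

# Birth-death sketch of constitutive gene expression: proteins are produced
# at constant rate k and degraded at rate gamma per molecule.  The rates
# below are illustrative assumptions.
k, gamma = 10.0, 1.0
n, t, t_end = 0, 0.0, 10_000.0
m1 = m2 = 0.0   # time-weighted first and second moments of the copy number

while t < t_end:
    rates = [k, gamma * n]
    total = sum(rates)
    dt = random.expovariate(total)        # waiting time to the next reaction
    m1 += n * dt
    m2 += n * n * dt
    t += dt
    n += 1 if random.random() < rates[0] / total else -1

mean = m1 / t
var = m2 / t - mean ** 2
print(f"mean = {mean:.2f}, Fano factor = {var / mean:.2f}")   # Fano factor ~ 1
```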
Often when considering games played on graphs it can be difficult to prove results which are applicable to all graphs (see for instance the notoriously difficult conjecture of Meyniel concerning the Cops and Robbers game). In these situations it can be instructive to think instead about the behaviour on a 'typical' graph. That is, to choose a graph randomly and investigate the likely behaviour.
I will discuss some results on the Cops and Robbers game, together with my own work on the Revolutionaries and Spies game, using no probabilistic machinery more difficult than a union bound.
McDonalds sell chicken nuggets in boxes of 6, 9 and 20 - what is the largest number of nuggets you can't buy in one transaction? This seemingly simple and silly question has been studied for over 100 years using a variety of techniques such as polyhedral geometry, semigroup theory and combinatorial optimization. We will instead use a relatively new approach via the commutative algebra of lattice ideals and see exactly how homological data can help you with your fast food purchases.
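As a concrete check of the nugget question above, here is a minimal brute-force sketch (dynamic programming rather than the lattice-ideal machinery the talk describes); the search bound of 200 is an arbitrary assumption that comfortably exceeds the answer.

```python
# Mark every total that can be assembled from boxes of 6, 9 and 20, then
# report the largest total below the search bound that cannot be made.
boxes = (6, 9, 20)
limit = 200                      # assumption: safely above the answer
reachable = [False] * (limit + 1)
reachable[0] = True
for total in range(1, limit + 1):
    reachable[total] = any(total >= b and reachable[total - b] for b in boxes)

print(max(t for t in range(limit + 1) if not reachable[t]))   # prints 43
```

The answer, 43, is the Frobenius number of the set {6, 9, 20}.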
Equation-free methods make possible an analysis of the evolution of a few coarse-grained or macroscopic quantities for a detailed and realistic model with a large number of fine-grained or microscopic variables, even though no equations are explicitly given on the macroscopic level. This will facilitate a study of how the model behaviour depends on parameter values including an understanding of transitions between different types of qualitative behaviour. These methods are introduced and explained for emergence of oscillatory pedestrian counter flow in a corridor with a narrow door. In addition, the concept of control-based continuation is combined with equation-free techniques.
Modelling the motion of agents in a system as random walks is often a useful approximation. Modelling a set of agents that are in some sense connected as a network is also useful in a number of settings (such as modelling the spread of diseases). Given some notion of this ‘connectedness’ between random walkers we can attempt to create a network, and hopefully gain some insight into the system we are modelling from its properties and how they change over time. I will attempt to show how a network of such Brownian walkers can be formed and improved, then show how long one might expect some properties of the network to persist for.
Since its origins in the study of fluid dynamics, dimensional analysis has been a powerful tool for obtaining remarkable insights about the behaviour of physical systems. In this sense, I will present Buckingham Pi Theorem as a useful (and some people would say 'magical') algorithm to obtain physically relevant quantities in cases where dynamical equations are not known. Also, I will discuss it briefly in more modern mathematical terms involving Lie Algebras and rescaling groups. For the sake of clarity, I will work out several examples where this method can be applied, ranging from the Pythagorean theorem to black holes.
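To give a flavour of the method, here is the textbook pendulum example (a standard illustration, not necessarily one of the examples the speaker will use):

```latex
% Assume the period T of a simple pendulum depends only on its length l, its
% mass m and the gravitational acceleration g.  The only dimensionless group
% that can be formed from these quantities is
\[
  \Pi \;=\; T\sqrt{g/l},
\]
% so the Buckingham Pi theorem forces
\[
  T \;=\; C\,\sqrt{l/g}
\]
% for some dimensionless constant C: the mass drops out, and the scaling
% T proportional to sqrt(l) follows without solving any equation of motion.
```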
The 27-dimensional exceptional Jordan algebra (or Albert algebra) of 3x3 Hermitian matrices over the octonions turns out to be a fruitful object for some exceptional groups of Lie type. It was shown that the group of automorphisms of the Jordan algebra over any field is isomorphic to the Chevalley group F_4 over the same field. The stabilizer of the determinant, which is represented by a certain cubic form, is actually a group of type E_6, and its twisted version is obtained by considering those elements of E_6 over the quadratic field which preserve a certain Hermitian form. In this talk we are going to look at the constructions of these finite groups defined by the Albert algebra.
There are many motivations from physics to construct quantum groups. You can think of a quantum group as a notion of quantum symmetry: an algebra with additional structure that can play the role of a symmetry, such as duplication. It turns out that quantum symmetries carry a Hopf algebra structure. Thus, algebraically speaking, quantum groups form a family of non-commutative, non-cocommutative Hopf algebras. In this talk, I will describe how to construct a Hopf algebra from a usual algebra, and give one of the simplest examples of a quantum group.
Partial Differential Equations are, in general, notoriously difficult to solve exactly, with many solution-generating techniques applicable only to a small class of problems. In this talk I will describe an algorithmic approach to generating PDE solutions, called the "Symmetry Method", and discuss some of the issues which arise in its application to covariant equations; in particular, the equations of General Relativity. No previous knowledge required!
Algebraic geometry has a reputation of being incredibly abstract, dealing with increasingly intricate and subtle objects in order to solve harder and harder problems. Tropical geometry blew this apart around ten years ago by showing these abstract objects could be reformulated as very real (even drawable) polyhedral complexes that could solve certain problems algebraic geometers could not previously. In this talk, I'll give an introduction as to how these complexes are constructed and why they're so nice to work with. No previous knowledge required, expect lots of drawings of polygons and spider webs.
The visibility graph is a method of turning a time series into a graph. I will be explaining the method and motivation, talking about some cool and interesting results, with some applications to financial time series. I'll also be talking about Feigenbaum graphs, which are super amazing visibility graphs related to the logistic map.
In this talk I will introduce an area of combinatorics called Ramsey Theory. Questions in Ramsey Theory are usually of the form:
"Suppose we are going to take some mathematical structure and cut it into pieces. How big must this structure be to guarantee one of the pieces has some property?"
This talk will include some nice colourful diagrams and assume no prior knowledge of Ramsey Theory.
"Do not worry about your difficulties in Mathematics, I can assure you mine are still greater.”Einstein never agreed with quantum mechanics, right up until the day he died. As of 1924 he was tormented with the knowledge that, on a subatomic level, space appeared 'fuzzy'. This was due to quantum effects and was encapsulated by the famous Heisenberg uncertainty principle. For him, this meant that the mathematical concept of a point in space and time does not work, the geometry of the real world appears quantum in structure. A search for the correct structure of space and time leads one into the realm of non-commutative geometry.
Non-commutative geometry aims to extend classical notions of geometry to situations where the underlying algebra is non-commutative, as is the case for the matrix algebra that occurs in quantum mechanics.In my talk I aim to give a brief introduction to a form of non-commutative geometry known as deformation quantisation. With this I will show how one can solve the problem of quantising Riemannian and other differential geometries to first order deformation (in a Planck-scale parameter). I also hope (time permitted) to discuss how this method of 'semiquantisation' could be used to quantise parallelizable manifolds, namely the 3-sphere and the 7-sphere.
I will start by defining the locally free semigroup of n generators and show that it has the algebraic structure of so-called heaps of pieces, where the pieces are dimers. After this I will outline how the generating function of this special kind of heaps can be found and will explain the connection between heap configurations and the weights of states in the stationary state of the asymmetric simple exclusion process (ASEP), a stochastic process describing the transport of particles along a discrete line.
The conformal field equations have proven to be an extremely useful tool in research in general relativity, especially in analysing properties of spacetimes at infinity. A currently ongoing area of research is whether or not it is possible to formulate conformal systems described by the conformal field equations as an initial value problem. It turns out that the answer to this question is yes, at least in the case of vacuum spacetimes, as one can recast the vacuum version of the conformal field equations as a system of wave equations (and wave equations are essential in formulating an initial value problem for any system). The question of whether or not it is possible to do the same thing with matter is still an open problem.
We'll begin the talk by briefly reviewing and summarizing the key ideas in General Relativity including some of the problems in GR and how these are solved with wave equations and conformal methods. From there we'll look at how one can derive a set of quasilinear wave equations for conformal spacetime models coupled to trace-free matter. The next part will be to show that any solution to these wave equations is also a solution to the field equations, which is achieved using a technique called the propagation of the constraints. Finally we'll discuss some applications of this method, including stability analysis of spacetimes coupled to trace-free matter.
The Einstein field equations can be thought of in different ways. In the first instance, they are a system of non-linear second order partial differential equations for the metric coefficients with no evident structure, i.e. they are not a priori hyperbolic, parabolic or elliptic. In 1952 Yvonne Choquet-Bruhat taught us that by choosing harmonic coordinates one can recast the Einstein field equations as a system of quasilinear wave equations (i.e. hyperbolic) for the metric coefficients. Since then, great efforts have been made to use the notions established in Choquet-Bruhat's seminal work to address general questions about solutions to the Einstein field equations, in particular existence and stability. In this regard, one of the main ways (but not the only one!) to make inroads into these types of problems is to use the so-called vector field methods. In this talk we will first motivate the ubiquitous presence of wave equations in General Relativity and then we will introduce some of the main ideas used in vector field methods.
The combinatorics of partitions play a crucial role in understanding the symmetric group and its representations. To each partition, we can associate a Specht module, which is a representation of the symmetric group. In positive characteristic, there are many open questions about these Specht modules. In this talk we will touch on classifying the "blocks" of the symmetric group and determine when two such modules belong to the same block by Nakayama's Conjecture.
One of the first questions in rigidity theory was: Given a structure made of bars of fixed length, which are connected at their endpoints by flexible joints, when is the resulting structure flexible? This is surprisingly difficult to answer in 3-dimensions, and is still open. However, much more is known in 2-dimensions.
We can model such a 2-dimensional structure by a framework (G,p) where G is a graph whose vertices represent joints, and whose edges represent bars; and p which maps the vertices of G to the plane. So long as the coordinates in p are algebraically independent over the rationals, we can determine whether our structure is rigid or flexible by just looking at the graph. However, most real-world structures have a large amount of symmetry, and so it is not sensible to assume that p is algebraically independent over the rationals.
In this talk I will give an overview of recent work in characterising rigidity and flexibility for symmetric frameworks. And, time allowing, explain how I am currently trying to extend these results to model the symmetry of structures built in CAD-software, where edges can have constraints on their angle as well as their length.
A surprisingly large number of physical theories (including general relativity) can be written as a Lagrangian system, for which Noether's theorem provides a powerful relation between symmetries and conservation laws. In this talk, I will introduce the Lagrangian formulation and illustrate the theorem for some simple Lagrangians. Then I will outline how the theorem can be used to find conserved quantities in general relativity, and characterise black hole spacetimes ("black holes have no hair").
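For readers who have not met the theorem, here is a one-line textbook instance (my own illustration, not necessarily the example the speaker will use):

```latex
% For a particle with Lagrangian L = (1/2) m \dot{q}^2 - V, where V does not
% depend on q (so the system is invariant under translations q -> q + eps),
% the Euler-Lagrange equation gives
\[
  \frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}}\right)
  \;=\; \frac{\partial L}{\partial q} \;=\; 0
  \qquad\Longrightarrow\qquad
  p \;=\; m\dot{q} \ \text{ is conserved},
\]
% i.e. invariance under spatial translations yields conservation of momentum.
```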
We will survey the concept of Decision Tree Complexity. Loosely speaking, it is the study of how much of a function's input do you need to know to determine its output. Time permitting, we will discuss some old conjectures in the area and touch briefly on connections with topological fixed point theorems.
Network science is very useful when analysing complex structures like the human brain. I will be talking about community detection (which is a method for partitioning networks) and the mathematics and algorithms it involves. The main measure of community structure I will discuss is called "flexibility", which gives information about how a node's connections to other nodes change as time progresses. I will then present some results obtained after applying these methods to a data set from Professor Bullmore’s lab at Cambridge. The data consists of parcellated fMRI scans of a group of control subjects, and a group of patients diagnosed with schizophrenia, under two different drugs.
In his seminal paper of 1931, Dirac posited the existence of the magnetic monopole in an attempt to explain the quantisation of electric charge. Since then, the monopole has played a central role in questions related to inflationary cosmology, particle confinement and electromagnetic duality, to name just a few. On the other hand, the subject of topological solitons has been one of great interest in various branches of pure mathematics, from integrable systems to algebraic geometry. In this talk I will describe the topological characterisations of the monopole, touching on the mathematics of fibre bundles and the related Chern invariants, and address the question of which gauge theories allow for their existence. If time permits, I will discuss various generalisations of the magnetic monopole, such as instantons, and some exact solutions of the governing field equations.
A graph G=(V,E) is d-sparse if each subset X \subseteq V with |X| \geq d induces at most d|X| - {{d+1}\choose{2}} edges in G. Laman showed in 1970 that a necessary and sufficient condition for a realisation of G as a generic bar-and-joint framework in 2 dimensions to be rigid is that G should have a 2-sparse subgraph with 2|V|-3 edges. Although Laman's theorem does not hold when d>2, Cheng and Sitharam recently showed that if G is generically rigid in 3 dimensions then every maximal 3-sparse subgraph of G must have 3|V|-6 edges. We extend their result to all d \leq 11 by showing that if G is generically rigid in d dimensions then every maximal d-sparse subgraph of G must have d|V| - {{d+1}\choose{2}} edges.
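The sparsity condition above is easy to test directly on small examples. Here is a brute-force sketch (exponential in the number of vertices, so only for toy graphs; the graphs chosen below are purely illustrative):

```python
from itertools import combinations
from math import comb

def is_d_sparse(vertices, edges, d):
    """Check that every X with |X| >= d induces at most d|X| - C(d+1, 2) edges."""
    for size in range(d, len(vertices) + 1):
        for X in combinations(vertices, size):
            Xset = set(X)
            induced = sum(1 for (u, v) in edges if u in Xset and v in Xset)
            if induced > d * size - comb(d + 1, 2):
                return False
    return True

# Illustrative examples (my own choices, not from the abstract):
K4 = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
print(is_d_sparse(range(1, 5), K4, 2))        # False: K4 has 6 > 2*4 - 3 edges
print(is_d_sparse(range(1, 5), K4[:-1], 2))   # True: K4 minus an edge is 2-sparse
```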
Random perturbations in dynamical systems are a ubiquitous phenomenon in many fields of science. These perturbations, e.g. represented by noise, lead to an unpredictable movement of the trajectories of the system. It is of great interest to control such a random motion or other noise-induced effects.
The first part of the talk consists of an introduction to the theory of dynamical systems and related concepts such as stability analysis and bifurcation theory. Then I will proceed with the investigation of dynamical systems subjected to Gaussian white noise. In the last part, I will show how noise effects can be modulated by time-delayed feedback control.
The combination of conformal geometry with general relativity has proven immensely useful in analysing the global properties of spacetimes. One relatively recent idea is to write these equations as a system of wave equations. The first part of this talk will very quickly show how trying to understand the evolution of Einstein's equations leads to them being written as a system of wave equations. The next part will cover the basics of conformal geometry, why one needs conformal geometry and how this motivates the conformal field equations (with a brief explanation of how one derives the equations). Once we have covered this background, we will see how these two ideas can be combined to derive a system of wave equations for the variables of the conformal field equations. This part will include the list of wave equations that have been derived, namely those for Einstein-Maxwell spacetimes and for the conformal field equations in vacuum, together with the derivation of some of the simpler ones. Finally, to conclude, we will look at the future of this subject, namely how one extends this to include matter models.
General Relativity is one of the most successful physical theories ever devised. But as well as providing accurate physical predictions, it has a mathematical elegance which makes it interesting in its own right. Assuming no previous knowledge of general relativity, I will explain the basics of the theory, and describe how symmetries of physical solutions are represented naturally in terms of Killing vectors. Then, I will describe how general relativity can be set up as an initial value problem; and finally, prove the Lichnerowicz theorem, which states that any static spacetime which possesses an initial hypersurface with Euclidean topology cannot contain black holes.
Non-equilibrium networks or growing networks are network models in which new vertices are continuously added in time to the graph and are connected to the already existing vertices according to some attachment rule. In order to find out the degree distribution of such models a general approach is to define a master equation for the system under study. I will show how to use the master equation approach to solve two classical network models, the growing exponential network and the Barabasi-Albert network model.
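As a complement to the master-equation calculation, here is a quick simulation sketch of the Barabasi-Albert model (the parameters m = 2 and n = 20000 are arbitrary illustrative choices), against which the predicted k^{-3} degree-distribution tail can be compared:

```python
import random
from collections import Counter

# Preferential attachment: each new vertex attaches to m existing vertices
# chosen with probability proportional to their degree.  Sampling uniformly
# from the list of edge endpoints ('targets') implements this implicitly.
m, n = 2, 20_000
targets = [0, 1]                 # start from a single edge between vertices 0 and 1
degree = Counter({0: 1, 1: 1})

for new in range(2, n):
    chosen = set()
    while len(chosen) < m:       # m distinct, degree-biased neighbours
        chosen.add(random.choice(targets))
    for old in chosen:
        targets.extend((new, old))
        degree[new] += 1
        degree[old] += 1

hist = Counter(degree.values())
for k in (2, 4, 8, 16, 32):
    print(k, hist.get(k, 0) / n)   # frequencies should decay roughly like k**-3
```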
A well-studied graph invariant is the Tutte polynomial, which can be used to calculate structural properties of a given graph. While the Tutte polynomial extends to matroids, it cannot be applied to a superclass of matroids called polymatroids. I will give an introduction to the Tutte polynomial and to matroids, and give a candidate for a Tutte-like polynomial for polymatroids.
Geometric cluster models are a class of physical toy models used to mirror the behaviour of systems such as polymer chains or cell membranes. In order to understand how their physical properties depend on external parameters such as the temperature, one seeks to know their partition functions (or generating functions) and to analyse their asymptotic behaviour in the limit of large system sizes.
I will start by giving a few examples of geometric cluster models and discuss some of their generic properties. Using the example of Dyck paths, I will then show how their asymptotic behaviour can be investigated.
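To give a flavour of the kind of computation involved, here is the classical Dyck-path generating function and its singularity analysis (a standard result, stated as background rather than as the content of the talk):

```latex
% Counting Dyck paths by half-length: decomposing a path at its first return
% to the axis gives the functional equation
\[
  D(x) \;=\; 1 + x\,D(x)^2
  \quad\Longrightarrow\quad
  D(x) \;=\; \frac{1-\sqrt{1-4x}}{2x} \;=\; \sum_{n\ge 0} C_n x^n ,
\]
% and the square-root singularity at x = 1/4 yields the asymptotics of the
% Catalan numbers,
\[
  C_n \;\sim\; \frac{4^{\,n}}{\sqrt{\pi}\, n^{3/2}} \qquad (n\to\infty).
\]
```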
In this talk we will briefly motivate and introduce a conformal formulation of the Einstein field equations that has been used to obtain global non-linear stability results for spacetimes such as de Sitter and Minkowski. We will discuss how to use these equations to analyse the behaviour of black hole spacetimes. In particular, we will pose an asymptotic initial value problem (initial data "at infinity") and analyse the evolution equations. We will point out the mechanism in the equations that is responsible for the formation of singularities.
Nonequilibrium systems are characterized by the presence of macroscopic currents. Steady states conserve the average currents, but their statistics can be complex, with anomalous phases and strong fluctuations. No general theory allows one to describe them without solving the microscopic dynamics.
We present the large deviation framework for macroscopic observables and use it to find the current statistics in a time-correlated zero range process (ZRP). We derive the exact stationary solution and a mean-field approximation on the one-dimensional lattice. Analytical and numerical calculations show that, while the steady state corresponds to that of a Markovian ZRP with effective interaction, the probability of rare currents differs significantly from the Markovian case, with memory-induced dynamical phase transitions playing a central role.
The results are also interesting for problems that usually fall outside the domain of fundamental physics, such as congestion-avoidance strategies in data streams and stochastic modelling of strongly correlated biological systems. In fact, in these cases the rare events can be more important than the typical ones.
Random walks represent a paradigmatic model for studying the diffusion properties of complex networks, and have also been extensively employed in the last decade as a tool to characterise the centrality of nodes and to identify densely connected subgraphs or communities. Biased random walks are a particularly interesting class of walks in which the probability of jumping from one node to one of its neighbours depends on a function of a chosen topological property of the destination node, usually its degree, and can therefore be tuned at will in order to systematically prefer (or avoid) moving toward nodes having certain characteristics. We present here an analytical treatment of biased random walks on multiplex networks, i.e. complex networks whose units are connected by means of a variety of different relationships, represented by M separate yet interacting layers corresponding to different communication channels, and study how the entropy rate of such processes is affected by the presence of inter-layer degree correlations and other structural properties of the considered systems.
Representation theory of the symmetric group enables us to study the symmetric group via linear algebra. We seek to understand the irreducible representations of the symmetric group over any field. Over a field of characteristic 0 (say, the complex numbers), the modern construction of these irreducible representations, the Specht modules, was developed by G. D. James using combinatorics in the 1970s. In 2009, it was shown that the symmetric group algebra is non-trivially graded, and subsequently that the Specht modules are also non-trivially graded, a significant development in modular representation theory (the theory of representations over a field of positive characteristic). I will give an outline of the representation theory of the symmetric group and construct the Specht modules.
In 1954, Frame, Robinson and Thrall introduced a combinatorial formula, the hook length formula, for the dimension of the Specht modules over any field. Recent results have enabled us to define the graded dimension of the Specht modules combinatorially. One hopes to obtain an analogous graded hook length formula; I will discuss my developments on this problem.
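For reference, here is the formula together with a tiny worked example (the partition is my own illustrative choice):

```latex
% The hook length formula, illustrated for the partition (2,1) of n = 3,
% whose three cells have hook lengths 3, 1 and 1:
\[
  \dim S^{\lambda} \;=\; \frac{n!}{\prod_{(i,j)\in\lambda} h(i,j)},
  \qquad
  \dim S^{(2,1)} \;=\; \frac{3!}{3\cdot 1\cdot 1} \;=\; 2 .
\]
```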
We consider a game where n people each have a randomly (but not independently) chosen natural number on their foreheads, and no two players have the same number. The players are not allowed to communicate, but they all know the distribution of the numbers. After seeing the numbers on all the other people's foreheads, each player chooses two numbers, i and j, between 1 and n. The player wins £1 if the number on his forehead is the i'th smallest, but loses £1 if it is the j'th smallest. We want to choose a distribution such that the players do not win too much in expectation. How well can we do?
We will see that no matter the distribution, the players will always be able to ensure a positive expectation, but for any epsilon, we can choose the distribution such that they have expectation at most epsilon. However, to do this we would need to use numbers as large as 2^{2^{\dots 2^{k_n/\epsilon}}}, where the height of the tower is n-2.
The presentation is based on joint work with Troels B. Sørensen and Vincent Conitzer.
I will introduce dynamic algorithms for maintaining all pairs shortest paths (APSP) in undirected graphs with real-valued edge weights. I will highlight the central role played by shortest paths in the analysis of real-world networks.
Because of the high volatility and the increasing size of today's network data sets, the development of efficient methods for the real-time analysis of dynamic networks is becoming more and more crucial.
Lastly, I will try to open up the discussion by showing some (blurred) ideas in connection with network dynamics, group theory, and quantum field theory.
I will introduce some (hopefully) interesting ideas from intersection theory. I will talk about the question `where should I intersect stuff?' which will lead us from school maths to something a bit more fancy!
Imagine two people are each given a set of rods, and instructions on how to connect these rods together with joints at their endpoints. Assuming they both follow the instructions correctly, is it possible that they could still end up with different structures?
If the given constraints (in this case the lengths of the rods and the rules on how to connect them) are sufficient to guarantee a unique realisation, then we say that the structure is globally rigid. In this talk we'll look at the global rigidity of frameworks in two dimensions, and see to what extent this can be determined by their underlying graph.
A scattering process can be completely characterized by its K-matrix. For chaotic quantum systems it can be modelled within the framework of Random Matrix Theory, where either the K-matrix itself or its underlying Hamiltonian is taken as a random matrix. I will show that both approaches are equivalent for a broad class of unitary invariant ensembles of random matrices, using correlation functions of products and ratios of integer powers of characteristic polynomials.
We will discover that for orthogonal invariant ensembles one needs instead correlation functions of half-integer powers, and I will present results for a few of these correlation functions in the limit of large GOE-matrices along with some further examples where these objects also arise.
Percolation Theory is the study of the structures of clusters and connectivity in (infinite) random graphs. The percolation threshold is the point at which a global (or infinite) component first appears. Percolation thresholds have important applications in many physical and real-world problems, such as materials science and epidemiology. In all but a few simple cases, the exact thresholds of most models are not known, and so a lot of work is put into estimating them. I will give a general introduction to percolation theory before moving on to look at how to obtain rigorous confidence intervals for two- and three-dimensional lattices.
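A rough feel for why estimation is needed can be had from a small Monte Carlo experiment. The sketch below (an illustration under assumed parameters, not the rigorous confidence-interval method of the talk) estimates the probability of a top-to-bottom open crossing in site percolation on an n by n square lattice; the crossing probability jumps near the numerically estimated threshold of about 0.593.

```python
import random
from collections import deque

def crosses(n, p):
    """Does an n x n site-percolation sample contain a top-to-bottom open path?"""
    open_site = [[random.random() < p for _ in range(n)] for _ in range(n)]
    seen = [[False] * n for _ in range(n)]
    queue = deque((0, j) for j in range(n) if open_site[0][j])
    for _, j in queue:
        seen[0][j] = True
    while queue:                                  # breadth-first search downwards
        i, j = queue.popleft()
        if i == n - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n and open_site[a][b] and not seen[a][b]:
                seen[a][b] = True
                queue.append((a, b))
    return False

n, trials = 50, 200                               # illustrative sizes
for p in (0.5, 0.59, 0.7):
    hits = sum(crosses(n, p) for _ in range(trials))
    print(p, hits / trials)
```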
A number of network growth mechanisms have been suggested to explain how social connections are forged and severed over time. Among these mechanisms, a key role is played by homophily, namely the principle that similarity breeds connection. However other studies have pointed in the opposite direction. For example, economists have suggested that similarity can lead to competition for scarce resources.
This talk will examine to what extent homophily appears to govern communication in an online social network.
We consider a one-parameter family of invertible maps of a two-dimensional lattice, obtained by applying round-off to planar rotations. We let the angle of rotation approach $\pi/2$, and show that the system exhibits a failure of shadowing: the limit of vanishing discretisation corresponds to a piecewise-affine Hamiltonian flow, whereby the plane foliates into invariant polygons with an increasing number of sides.
Considered as perturbations of the piecewise-affine flow, the lattice maps assume a different character, described in terms of strip maps, a variant of those found in outer billiards of polygons. Furthermore, the flow is nonlinear (unlike the rotation) and a suitably chosen Poincaré return map is a twist map.
We show that the motion at infinity, where the invariant polygons approach circles, is a dichotomy: there is one regime in which the nonlinearity tends to zero, leaving only the perturbation, and a second where the nonlinearity dominates. In the domains where the nonlinearity remains, numerical evidence suggests that the distribution of the periods of orbits is consistent with that of random dynamics, whereas in the absence of nonlinearity, the fluctuations result in intricate discrete resonant structures.
I will give an introduction to (some of) proof theory, emphasising links with combinatorics. Beginning with Hilbert's definition of a formal proof and Godel's incompleteness theorems, I will focus on Gentzen's 1936 proof of the consistency of arithmetic. This introduced a proof language called sequent calculus, and the idea that proofs have a natural dynamics given by cut elimination. This leads to linear logic and the hard problem of finding good invariants for the dynamics of proofs (which computer scientists call 'denotational semantics').
I will explain how the asymptotics of functions can be obtained by the use of the saddle point method. To gain intuition, I will show how Stirling's formula can be obtained in this way. Afterwards, I will explain the case of two coalescing saddle points, which I am dealing with in my PhD project.
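For orientation, here is the classical saddle-point derivation of Stirling's formula in outline:

```latex
% Start from the integral representation of the factorial,
\[
  n! \;=\; \int_0^\infty e^{-t}\,t^{\,n}\,dt
       \;=\; \int_0^\infty e^{\,n\ln t - t}\,dt .
\]
% The exponent n ln t - t is maximised at the saddle point t = n; expanding it
% to second order there reduces the integral to a Gaussian one and gives
\[
  n! \;\sim\; \sqrt{2\pi n}\,\left(\frac{n}{e}\right)^{\! n}
  \qquad (n\to\infty).
\]
```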
We will start with an easy introduction to Hamming Codes- a code that is both optimally error-correcting and optimal as a covering code. We will then discuss an application to an extremal problem in the hypercube. This talk is aimed at a general audience; in particular all non-standard terms in title and abstract will be defined.
I examine two online social networks where nodes are connected by directed and signed links. The analysis of degree-degree correlations in such networks provides an insightful view of how the nature of interactions influences the peculiar correlation patterns observed in many social networks. Positive correlations are associated with a collaborative relation between nodes, but other kinds of interaction generate different correlation patterns. A simple theoretical model is introduced in order to reproduce the distinct degree correlation patterns emerging from positive and negative networks.
In the study of the global behaviour of the Einstein equations and their solutions, conformal geometry has proved its usefulness in existence and stability results. A useful way to gain intuition for the geometric meaning of a conformal transformation in general relativity is the construction of Penrose diagrams. In this talk, the general procedure for constructing such diagrams will be outlined via three examples: the stereographic projection, and the Penrose diagrams for the Minkowski and Schwarzschild spacetimes. With the perspective given by those examples, we will discuss the definition of asymptotically simple spacetimes and the main difficulty in deriving a conformal version of the Einstein field equations.
In this talk, I will present a fairly natural and easy introduction to deforming a (commutative) associative algebra. Given an associative noncommutative algebra, we analyse the classical structures (like the Poisson bracket, a connection, etc.) of which it is a quantisation. This approach is very different from the algebraic deformation theory of associative algebras introduced by Gerstenhaber (1964). In my case, I will focus on deformations of classical differential graded algebras (super-commutative super Hopf algebras) over a Poisson-Lie group. A large class of examples of such noncommutative deformations is given by pre-Lie algebras, i.e. (not necessarily associative) algebras with a product such that (ab)c-(ba)c = a(bc)-b(ac).
Spline models are piecewise polynomials that are optimally smooth, but they are practically unworkable in higher dimensions and hard to fit over non-standard regions. Smooth supersaturated models are high degree polynomial models that can be fitted over non-standard regions, portray spline-like behaviour and, as polynomials, are more tractable than splines. This allows us to apply optimal design theory over non-standard regions. We use orthogonality to simplify the fitting of smooth supersaturated models via three (and a half) different basis concepts: A monomial basis, an orthogonal polynomial product (with an example using Legendre polynomials), and a multivariate orthonormal polynomial basis. A simulation study demonstrates smoothing over arbitrary regions using a multivariate orthonormal polynomial basis.
The dynamics of the Lorentz gas is deterministic but, when one deals with an ensemble of particles, macroscopic properties arise in such a way that we can explore microscopic details and understand macroscopic properties of the system. In this talk, I'll introduce the concept of billiards, or the Lorentz gas, and explain how they are useful as a starting point for exploring diffusion. Then, I will discuss the case of billiard systems with soft potentials and summarise relevant investigations in the field.
In this talk, I will briefly introduce the model of Brownian motion with dry friction, and then give some remarks on how to solve a first-passage time (FPT) problem. After that, I will show that the FPT problems of a pure dry friction model and of a full force model can be solved analytically. I will also show some interesting phase transition phenomena related to the FPT problem.
Suppose you know the distribution of some discrete random variable X, and you want to learn the specific value x that it takes by asking yes/no questions. Each question costs you £1; how much will it cost you on average to learn x? If you could buy questions that can have 3 different answers instead of only 2, how much should you be willing to pay for such a question? If you could buy yes/no questions that cost you £2 when the answer is "no" but only £0.50 when the answer is "yes", would that be better in the long run? How much is a question worth if there is a probability that you get the wrong answer? In this talk, I am going to answer these questions to give an introduction to information theory. I will then give a short introduction to cryptogenography (the study of how to leak information without revealing yourself), where I present the idea and some results.
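A small sketch of the first question (the distribution below is an arbitrary illustrative choice): the binary entropy H(X) lower-bounds the expected number of £1 yes/no questions, and an optimal questioning strategy, equivalent to a Huffman code, achieves an average cost within one question of it.

```python
import heapq
from math import log2

# Illustrative distribution for X (an assumption, not from the abstract).
p = {"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.1}

entropy = -sum(q * log2(q) for q in p.values())

def huffman_cost(probs):
    """Expected codeword length = expected number of yes/no questions."""
    heap = [(q, 0.0) for q in probs.values()]   # (probability, accumulated cost)
    heapq.heapify(heap)
    while len(heap) > 1:
        q1, c1 = heapq.heappop(heap)
        q2, c2 = heapq.heappop(heap)
        # merging two subtrees adds one extra question for all their outcomes
        heapq.heappush(heap, (q1 + q2, c1 + c2 + q1 + q2))
    return heap[0][1]

print(f"H(X) = {entropy:.3f} bits, optimal average cost = {huffman_cost(p):.3f} questions")
```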
Topological stacks are just jazzed up topological spaces. I will talk about the following question: We have homology/homotopy groups of topological spaces, so can we extend this notion to topological stacks? I will give lots of examples and explain situations where topological stacks naturally arise.
We will be looking at enumeration problems relating to walks on a lattice with a boundary. These can be used to model polymers interacting with a surface.
Generating function techniques can be a powerful tool for solving these, and we will demonstrate this, focusing first on the simpler case of Dyck paths before moving onto a triangular lattice. We will also present some bijective results between walks, including an open problem.
We will look at a very brief overview of the representation theory of the Iwahori-Hecke algebra, before putting it into a modern, graded setting. In this setting, we will look at Specht modules and the homomorphisms between them. We will discuss new results, extending column and row removal for homomorphisms into this graded setting. We conclude with a brief application of this result.
Cops and Robbers is a perfect information graph game in which a set of cops $C=\{C_{1},\ldots,C_{m}\}$ tries to catch a robber, $R$. This can be used as a simple model of network security.
Given a graph $G$, the cop number of the graph, $c(G)$, is the minimum number of cops that can guarantee to catch $R$ on $G$. We will look at some bounds on the cop numbers of some simple graphs before moving on to look at the cop number of random geometric graphs.
Scattering is a ubiquitous phenomenon which is observed in a variety of physical systems. As a consequence of the complicated dependence on the parameters, scattering is quite often of a chaotic nature, which allows many problems to be tackled within the framework of Random Matrix Theory (RMT). After a short introduction to the general theory of quantum chaotic scattering, we discuss the basic ideas of two different RMT approaches (called the Heidelberg and Mexico approaches) and show a proof of their equivalence.
In graph theory, a graph is called Hamiltonian if there is a cycle which contains every vertex of the graph. Thomassen conjectured that every 4-connected claw-free graph is Hamiltonian. In this talk, I will introduce Tutte's technique, which is useful for proving Hamiltonicity for some classes of graphs related to this problem.
Dynamics based on replication and mutation form the core of game theory and population dynamics. Its outcome is often not an equilibrium, but can include oscillations and chaos. In this talk I will introduce a class of models of immunological interest and discuss a method for computing all their Lyapunov exponents.
In 1999 Zhang and Yeung showed that you can send information through a network more effectively than by the routing method that is used in the internet today. In this talk I will show how that is possible. I will also define entropy, which is a measure of how much information a random variable contains, and I will define the entropy cone $\overline{\Gamma^*_n}$, which is an attempt to describe how n random variables can be related to each other. We know what this cone looks like for $n\leq 3$, but already for $n=4$ we do not have any good description of the cone. I will talk about what we know about the cone, and how it is related to network coding.
Random motion is a powerful and simple model to describe processes characterized by stochastic activity, and it is widely used in many fields of science. In particular, random walks have been used extensively over the last few years to investigate the dynamical properties of real-world networks. Here we focus on degree-biased random walks, a particular class of walks in which the transition probability from a given node to a neighbour depends on a one-parameter function of the degree of the destination node. We analyse a small set of characteristic quantities, the Mean Return Time (MRT), the Mean First Passage Time (MFPT) and the Mean Coverage Time (MCT), and we show numerically and analytically the interplay between topological quantities and the dynamical properties of the motion. We also compare the predictions of the mean-field approximation for synthetic models to the exact numerical solutions calculated for a dataset of 20 real networks.
The Tangled Nature Model of evolution is an individual-based, stochastic model which describes, in good agreement with actual observations, the evolution of a simple ecology. Its dynamics alternates between periods of meta-stable configurations and periods of hectic transitions, where the model does not show clear occupancy patterns and the population is spread randomly across the type space. In this talk I will briefly introduce the model, explain the features of its dynamical evolution and discuss the possibility of forecasting its characteristic transitions by using a deterministic approach.
Random Matrix Theory (RMT) is a rich topic with many applications, for example in physics, multivariate statistics and number theory. After giving a short introduction into this field I will focus on the supersymmetry method which is a powerful tool to reduce the number of integrals involved in various RMT calculations.
I will introduce the topic of topological field theories with some low dimensional examples. I will then work towards some kind of classification of these low dimensional cases. Time permitting, I may say something about how this generalises.
The representation theory of the symmetric group is an old and rich subject. The modern perspective was developed by Gordon James in the 1970s. The Iwahori-Hecke algebra of type A is a deformation of the symmetric group, and one motivation for studying it is that it provides a bridge between the representation theory of the symmetric and general linear groups.
The Specht modules are a family of modules of fundamental importance for the Iwahori-Hecke algebra, and an open problem is to determine which Specht modules are decomposable. I will be discussing my attempts to answer this using a very new approach via Khovanov-Lauda-Rouquier algebras. These algebras at first glance seem completely different from the Iwahori-Hecke algebra, but an amazing result of Brundan and Kleshchev is that the Iwahori-Hecke algebra is isomorphic to a certain KLR algebra. I will explain how using the KLR algebra approach makes Specht modules easier to understand.
A Langevin equation with dry friction is studied by using the path integral approach and the saddle-point approximation. First, the pure dry friction case is investigated in detail and its corresponding direct and indirect paths are derived. Then the Langevin equation obtained by regularizing the dry friction force with a smooth hyperbolic tangent force is analysed. An interesting bifurcation phenomenon for the solutions of the optimal paths is observed in the regularized case.
Origami has come under considerable mathematical attention over the last 50 years. I will give an outline of a proof of the so-called "fundamental theorem of origami", which more or less states that it is possible to fold any model. The proof will be constructive, so in other words, it will tell you how to design any model yourself!
An opportunity to meet other PhD students in the department!
Recent studies on quantum turbulence have shown that the velocity statistics of a tracer particle obey power law statistics (Paoletti et al. [Phys. Rev. Lett. 101, 154501 (2008)], White et al. [Phys. Rev. Lett. 104, 075301 (2010)]) rather than the Gaussian statistics observed in classical turbulence (Vincent et al. [J. Fluid Mech. 225, 1 (1991)], Noullez [J. Fluid Mech. 339, 287 (1997)]). A superstatistical model (Beck [Phys. Rev. Lett. 98, 064502 (2007)]) is constructed for a Lagrangian tracer particle in a quantum turbulent flow. The result is in excellent agreement with a recent experiment by Paoletti et al. [Phys. Rev. Lett. 101, 154501 (2008)] observing the motion of tracer particles in quantum turbulent flow of superfluid 4He.
We give a very rough introduction to some aspects of three-dimensional computer graphics, and discuss the applicability of the associated hardware to general parallel computation. We then consider a problem in computational group theory and speculate about parallel solutions. No particular programming knowledge will be assumed.
Seminar series: Queen Mary Internal Postgraduate Seminar
I will introduce some basic notions from Game Theory and explain how players can “learn” to play a game by repeatedly playing it. I will show how such a learning process can be modelled mathematically by a differential inclusion. One such learning dynamics is known as “Fictitious Play”. I will demonstrate some of the remarkable features of this dynamical system and some of the mathematical questions it gives rise to.
We consider the standard model of a discretised rotation in the limit where the rotation number goes to zero, and find that orbits approach integral curves of a piecewise-constant Hamiltonian vector field. Furthermore we define an ordering on the set of possible orbit topologies and make a conjecture about the existence of so-called 'minimal orbits'.
Families of subsets of {1,2, ... n} -- or equivalently subfamilies of the n-dimensional hypercube Q_n -- are one of the main objects of study in extremal combinatorics. The kind of question we are typically interested in is: what is the maximal size of a family F contained in Q_n with a given property P? Recently there has been some interest in answering similar questions when F lives inside a different combinatorial space. In this talk, I will review the classical results of Erdos-Ko-Rado and Sperner, before considering how they might generalise to the so-called separated hypercube.
We introduce the notions of vector fields and differential forms for Euclidean space using both a local and a coordinate-free approach. We define the exterior derivative of a smooth function and look at its unique extension to an operator on the exterior powers of the module of differential forms. We then discuss the basics of de Rham cohomology. We finish with Euclidean space by reformulating the familiar notions of div, grad, curl, and the Laplacian in terms of the exterior derivative, using Maxwell's equations as an example of an application. We then recall the notion of a differential manifold and show how the differential form approach to multivariable calculus generalises to this more general setting almost painlessly. Finally, we will mention a little about the generalisation of all of this to the setting of noncommutative geometry.
We will talk about a method for visualising bifurcation structure using transient processes for one-dimensional maps. Further, applying this technique, we will show how to detect unstable periodic orbits of maps and flows.
Any undirected graph can be viewed as an adjacency matrix, and the eigenvalues of this matrix are known as the graph's spectrum. I will talk about various aspects of a graph's structure which are encoded by its spectrum, and present two proofs of the well-known theorem which says that if all a graph's eigenvalues are simple, then its automorphism group is an abelian 2-group.
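A tiny concrete example (the graph is my own choice): the spectrum of the path on four vertices, whose eigenvalues are all simple and whose automorphism group is Z/2, an abelian 2-group, consistent with the theorem above.

```python
import numpy as np

# Adjacency matrix of the path 1 - 2 - 3 - 4; its eigenvalues are 2*cos(k*pi/5).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])

eigenvalues = np.linalg.eigvalsh(A)   # symmetric matrix, so use eigvalsh
print(np.round(eigenvalues, 4))       # approx [-1.618, -0.618, 0.618, 1.618]
```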
The Sharkovsky Theorem: an ordering of the natural numbers corresponding to periodic orbits of continuous maps on the real line.
Colin will try to show the following: let G be a finite group such that p is the largest prime dividing the order of G, and every Sylow subgroup of G is generated by at most d elements. Then G has a nilpotent normal subgroup of index bounded by a function of p and d. (In this context, nilpotent means a direct product of groups of prime power order.) In particular, there are only finitely many primitive groups with these parameters. He will explain the group theory terminology at the start for those who are unfamiliar with it.
We had a first QuIPS problem session towards the end of last term. This is our second attempt.
People are encouraged to bring along open problems that they think other members of the group might be interested in working on, or able to help with. It can be something related to your own research, or something completely different. Hopefully we can get some useful collaborations going.
Inspired by Scott Aaronson's essay "Who can name the Bigger Number?", we began the talk with a game: everyone was given 30 seconds and a piece of paper, and asked to write down the biggest number they could. The discussion proceeded to an attempt to discuss some real-life big numbers - the number of cigarettes smoked in the world in a year, the number of letters in the university library, the number of seconds in a human lifetime, the number of elementary particles in the observable universe, etc.
We then went on to discuss some really big numbers - we defined Knuth's Up Arrow Notation, and made an attempt to understand quite how big Graham's number really is. Then we moved onto *really* big numbers. After a (brief) discussion of Turing Machines and the Halting Problem we defined the Busy Beaver, and the Busy Beaver shift function, which provably grows faster than any computable function.
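For anyone who wants to experiment, here is a direct transcription of Knuth's up-arrow notation (it overflows spectacularly fast for anything but tiny inputs, which is rather the point):

```python
# One arrow is exponentiation; each extra arrow iterates the previous operation.
def up_arrow(a, n, b):
    """Compute a followed by n up-arrows followed by b, for small arguments."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(2, 1, 4))   # 2^4 = 16
print(up_arrow(2, 2, 4))   # 2^^4 = 2^(2^(2^2)) = 65536
print(up_arrow(3, 2, 3))   # 3^^3 = 3^27 = 7625597484987
# Graham's number starts from 3 with four up-arrows and 3, already far beyond reach.
```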