The Azimuth Project
Applied Category Theory Seminar

Idea

We’re going to have a seminar on applied category theory here at U. C. Riverside, starting in January 2019. We will make it easy to have discussions on the Azimuth Forum and Azimuth Blog. These will work best if you read the papers we’re talking about and then join these discussions. We will also try to videotape the talks, to make it easier for you to follow along.

Here’s how the schedule of talks is shaping up so far.

January 8, 2019: John Baez - Mathematics in the 21st century

John Baez will give an updated, synthesized version of these earlier talks of his, so check out these slides and the links:

Abstract. The global warming crisis is part of a bigger transformation in which humanity realizes that the Earth is a finite system and that our population, energy usage, and the like cannot continue to grow exponentially. If civilization survives this transformation, it will affect mathematics - and be affected by it - just as dramatically as the agricultural revolution or industrial revolution. We should get ready!

You can look at these slides and videos from related talks:

January 15, 2019: Jonathan Lorand - Problems in symplectic linear algebra

Jonathan Lorand is visiting U. C. Riverside to work with our group on applications of symplectic geometry to chemistry. His talk will be about other research of his:

Abstract. In this talk we will look at various examples of classification problems in symplectic linear algebra: conjugacy classes in the symplectic group and its Lie algebra, linear Lagrangian relations up to conjugation, and tuples of (co)isotropic subspaces. I will explain how many such problems can be encoded using the theory of symplectic poset representations, and will discuss some general results of this theory. Finally, I will recast this discussion from a broader category-theoretic perspective.

January 22, 2019: Christina Vasilakopoulou - Systems as wiring diagram algebras

Christina Vasilakopoulou, a visiting professor at U.C. Riverside, previously worked with David Spivak. So, we really want to figure out how two frameworks for dealing with networks relate: Brendan Fong’s ‘decorated cospans’ and Spivak’s ‘monoidal category of wiring diagrams’. Vasilakopoulou will give a talk on open systems as algebras for the operad (or more precisely, symmetric monoidal category) of wiring diagrams. It will be based on this paper:

but she will focus more on the algebraic description (and conditions for deterministic/total systems) rather than the sheaf theoretic aspect of the input types. This work builds on earlier papers such as these:

January 29, 2019: Daniel Cicala - Dynamical systems on networks

Cicala will discuss a topic from this paper:

His leading choice is a model for social contagion (e.g. opinions) which is discussed in more detail here:

February 5, 2019: Jade Master - Backprop as functor: a compositional perspective on supervised learning

Here is Master’s abstract:

Abstract. Fong, Spivak and Tuyéras have found a categorical framework in which gradient descent algorithms can be constructed in a compositional way. To explain this, we first give a brief introduction to backpropagation and gradient descent. We then describe their monoidal category Learn, where the morphisms are given by abstract learning algorithms. Finally, we show how gradient descent can be realized as a monoidal functor from Para, the category of Euclidean spaces with differentiable parameterized functions between them, to Learn.

Her talk will be based on this paper:

• Brendan Fong, David I. Spivak and Rémy Tuyéras, Backprop as functor: a compositional perspective on supervised learning.
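To get a concrete feel for the compositionality in the abstract above, here is a minimal sketch in Python of a “learner” in roughly the sense of Fong, Spivak and Tuyéras — a parameter together with an implementation map, an update map, and a request map — and of sequential composition, where the downstream learner’s request tells the upstream learner what to train toward. The field names, the `scale_learner` example, and the learning-rate choice are all ours, not the paper’s:

```python
from dataclasses import dataclass
from typing import Callable, Tuple, Any

# A learner A -> B, sketched after Fong-Spivak-Tuyeras: a parameter p,
# an implementation I(p, a), an update U(p, a, b), and a request
# r(p, a, b) that says what input the learner would have preferred.
@dataclass
class Learner:
    p: Any
    implement: Callable[[Any, float], float]
    update: Callable[[Any, float, float], Any]
    request: Callable[[Any, float, float], float]

def compose(f: Learner, g: Learner) -> Learner:
    """Sequential composite: feed f's output into g."""
    def implement(pq, a):
        p, q = pq
        return g.implement(q, f.implement(p, a))
    def update(pq, a, b):
        p, q = pq
        mid = f.implement(p, a)
        # g requests a better intermediate value; f trains toward it.
        return (f.update(p, a, g.request(q, mid, b)),
                g.update(q, mid, b))
    def request(pq, a, b):
        p, q = pq
        return f.request(p, a, g.request(q, f.implement(p, a), b))
    return Learner((f.p, g.p), implement, update, request)

def scale_learner(p0: float, lr: float = 0.1) -> Learner:
    """Learner computing a -> p*a, with one gradient-descent step
    on the squared error (p*a - b)**2 as its update."""
    return Learner(
        p=p0,
        implement=lambda p, a: p * a,
        update=lambda p, a, b: p - lr * 2 * (p * a - b) * a,
        request=lambda p, a, b: a - lr * 2 * (p * a - b) * p,
    )

composite = compose(scale_learner(2.0), scale_learner(3.0))
print(composite.implement(composite.p, 1.0))  # 2 * 3 * 1 = 6.0
```

The point of the sketch is that `compose` never inspects the internals of `f` or `g`: the composite’s update is assembled entirely from the two learners’ own update and request maps, which is the compositional structure the functor from Para to Learn is meant to capture.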

category: courses