All posts by Mikael Vejdemo-Johansson

PhD studentship at KTH Royal Institute of Technology

Doctoral student in Machine Learning
KTH Royal Institute of Technology, School of Computer Science and Communication

KTH Royal Institute of Technology in Stockholm has grown to become one of Europe’s leading technical and engineering universities, as well as a key centre of intellectual talent and innovation. We are Sweden’s largest technical research and learning institution and home to students, researchers and faculty from around the world. Our research and education cover a wide area including the natural sciences and all branches of engineering, as well as architecture, industrial management, urban planning, history and philosophy.

Department information

KTH Computer Science and Communication (CSC) announces PhD positions in Machine Learning at the Department of Robotics, Perception and Learning (RPL).

Job description

The scientific work will be conducted along either of the following research directions:

1) Geometric and Topological Methods for Machine Learning with applications to Robotics

Topological Data Analysis is a recently emerged sub-branch of machine learning that enables inference about the global structure of datasets, based on rigorous mathematical theory with origins in Algebraic Topology. The research under this theme will focus on developing new geometric and topological techniques for machine learning, with a focus on potential application areas in robotics. Possible applications of Geometric and Topological Data Analysis in Robotics include reasoning about robot configuration spaces and free space. This could include developing approaches to autonomously detect unsafe configurations in a self-driving car scenario, and to understand how compact components of the free space can be utilized to enable new types of robot manipulation interactions by means of the concept of "Caging". A second potential sub-thread is to investigate how Topological Data Analysis may be of use in analyzing the representations of data determined by Deep Learning algorithms. The research will be supervised by Florian Pokorny, Assistant Professor at RPL.

2) Data-driven scene understanding and control in Human-Robot collaborative settings

Future robotic applications are expected to increasingly involve collaboration with humans, humans who do not necessarily have a technical background. Interaction between human and robot thus needs to happen in a manner that the human finds natural. A collaborative robot should adapt to changes in its environment and in the tasks placed on it. It needs to be in constant learning mode, gradually adjusting its behaviours given feedback from both its sensory system and the human collaborator. Research under this theme will focus on a combination of deep learning for scene understanding and generative action modelling, with reinforcement learning applied for control. Emphasis will be placed on methods with which a human can teach a robot the best way of solving a particular task, either through demonstration or by physically guiding the robot. The research will be supervised by Mårten Björkman, Associate Professor at RPL.

This is a four-year, time-limited position that can be extended by up to one year through the inclusion of a maximum of 20% departmental duties, usually teaching. In order to be employed, you must apply for and be accepted as a doctoral student at KTH.

University Lecturer position, Uni Bremen

Dmitry Feichtner-Kozlow writes:
Dear Colleagues,

There is an open position for a university lecturer at the University of Bremen.
Applied Topology is definitely a possible profile which we
are looking for, so please spread the word.
The deadline is rather soon!

ATMCS7 Torino July 25-29

ATMCS7: 25-29 July 2016 -- Torino, Italy

We are excited to announce the 7th edition of Algebraic Topology: Computation, Methods, and Science (ATMCS), scheduled from 25-29 July 2016 in Torino, Italy! This conference is a joint effort of the Polytechnic University of Turin (Politecnico di Torino) and the ISI Foundation, to be hosted at the Politecnico.

We are now welcoming submissions for our Contributed sessions. For more information, please visit:

The website is continually being updated with more logistical information, so be sure to check back often. Don't hesitate to contact us if you have any questions; we are available at

Looking forward to seeing you in Torino in July!
- The ATMCS Team

Main Organizers for ATMCS7:
Vin de Silva, Pomona College, USA
Anthea Monod, Duke University, USA

Local Organizers:
Giovanni Petri, ISI Foundation, Italy
Francesco Vaccarino, ISI Foundation & Politecnico di Torino, Italy

PhD position at Queen Mary, University of London

PhD Studentship in Mathematics

We invite applications for a fully funded three-year PhD studentship commencing in January 2016. The research project will deal with developing new topological and geometric algorithms for video data compression, and with other problems in the analysis of large data. The project will be sponsored by Queen Mary, University of London and by Penteract 28, and will be supervised by representatives of both institutions. Candidates must have a basic knowledge of pure and applied mathematics; it is desirable that candidates are familiar with geometry, topology, statistics, and elements of computer science and data analysis.

The deadline for applications is: November 16, 2015.

The application procedure is described at

For further enquiries please contact Prof. Michael Farber or Dr Danijela Horak.

Women in Computational Topology

Brittany Fasy writes:

Recently, a listserv for women in Computational Topology was initiated. The idea is that we can use this mailing list to remind each other about upcoming deadlines, find out who is going to SoCG or CCCG next year, etc. Feel free to encourage others (men or women) to join our network too! If you are interested in joining, send a blank email to <>. If you have any trouble joining or posting to the listserv, please email Brittany <>.

AATRN Seminar: Robert Ghrist

Today, the promised AATRN seminar series got started with Robert Ghrist as the inaugural speaker. His lecture, delivered through WebEx, built up the cellular sheaf perspective on networks with capacities, Max Flow / Min Cut (MF/MC), and the work done by Ghrist, Yasu Hiraoka, and Sanjeevi Krishnan on categorifying and sheafifying MF/MC.

Among the insights novel even to someone who has been following the UPenn developments for a while was the connection to Poincaré duality: “Flow duality is a form of Poincaré duality” — S. Krishnan

The approach, detailed in Krishnan's preprint, encodes a flow network as a sheaf of semi-modules of capacities over a semi-ring on the directed graph of the network. Flows correspond to homology, cuts to cohomology.

MF/MC translates to:

Theorem (S. Krishnan)

Given a network X, a distinguished (virtual) edge e, and a capacity sheaf F, the flow values at e of flows on F correspond to the homotopy limit, over all cuts, of the cut values of cuts on F.

This approach, connecting flows and cuts to a lattice structure on the constraints, produces a setting where multi-kind flows can easily be analyzed with the same tools as ordinary flows, and where the algebra closes the duality gaps that show up when naively searching for minima and maxima.
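For reference, the classical statement that this machinery categorifies is the max-flow/min-cut theorem: the maximum flow value from source to sink equals the minimum total capacity over all cuts separating them. Below is a minimal sketch of that classical duality in code, using the Edmonds-Karp algorithm on a small network; the function name and the example network are my own illustrations, not anything from the talk.

```python
from collections import deque

def max_flow_min_cut(capacity, source, sink):
    """Edmonds-Karp max flow on a dense capacity matrix, returning both the
    maximum flow value and the value of a minimum cut."""
    n = len(capacity)
    residual = [row[:] for row in capacity]  # residual capacities, mutated below
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual network
        parent = [-1] * n
        parent[source] = source
        queue = deque([source])
        while queue and parent[sink] == -1:
            u = queue.popleft()
            for v in range(n):
                if residual[u][v] > 0 and parent[v] == -1:
                    parent[v] = u
                    queue.append(v)
        if parent[sink] == -1:
            break  # sink unreachable: the current flow is maximal
        # collect the augmenting path and push its bottleneck capacity through
        path = []
        v = sink
        while v != source:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck
    # a minimum cut separates what is still reachable in the residual network
    reachable = {v for v in range(n) if parent[v] != -1}
    cut = sum(capacity[u][v]
              for u in reachable for v in range(n) if v not in reachable)
    return flow, cut

# a small example network: source 0, sink 3
C = [[0, 3, 2, 0],
     [0, 0, 1, 2],
     [0, 0, 0, 3],
     [0, 0, 0, 0]]
flow, cut = max_flow_min_cut(C, 0, 3)  # flow == cut == 5
```

Krishnan's theorem replaces this numerical coincidence with a structured correspondence: flows and cuts become homology and cohomology classes, and the equality of values a consequence of duality.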

The talk culminated in a primer on the homology and cohomology of directed sheaves, setting us up to read Krishnan's paper, which constructs compactly supported cohomology and Borel-Moore homology for sheaves over directed spaces.

The added abstraction levels seem to enable MF/MC theorems for, for instance, probability distributions. They also carry a promise of insights into duality gaps in MF/MC-type problems.

Applied Algebraic Topology Research Network

Peter Bubenik writes:

[…] Robert Ghrist, Konstantin Mischaikow, Fadil Santosa and I are starting a Research Network in Applied Algebraic Topology. To start, the main activity of the network will be a weekly interactive online seminar. We have plans to expand our activities in the future.

Rob Ghrist will give the first talk on Tue Sept 23.

Please check out our web site at and become a member.

Wasserstein barycenters and Fréchet means

Over this past year at the IMA, some attention has been devoted to a number of interesting aspects of bringing persistent homology closer to statistical methods.

One core step in this process has been to figure out what we mean by an average persistence diagram, a question with an answer proposed by Munch, Bendich, Turner, Mukherjee, Mattingly, Harer and Mileyko in various constellations.

The details of Fréchet means for persistent homology are not what this post is about, however. Instead, I want to bring up something I just saw presented today at ICML. Marco Cuturi and Arnaud Doucet had a paper entitled Fast Computation of Wasserstein Barycenters accepted to this large machine learning conference.

In their paper, Cuturi and Doucet present work on computing the barycenter (or average, or mean) of N probability measures under the Wasserstein distance: they articulate the optimal transport problem of finding a measure minimizing the sum of Wasserstein distances to the N given measures, and present a smoothing approach that turns this into a convex optimization problem. In the talk — though not in the paper — the authors argue that they can achieve quadratic running times through these approximation steps. Not only that, but their approach ends up amenable to GPGPU computation, with significant parallelization and vectorization benefits.
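As a point of comparison, in one dimension the 2-Wasserstein barycenter of equal-size empirical measures has a well-known closed form: average the quantile functions, which for point clouds means averaging the sorted samples. A minimal sketch (the function name is my own); the Cuturi-Doucet machinery is what makes the general, higher-dimensional case tractable, where no such closed form exists.

```python
import numpy as np

def wasserstein_barycenter_1d(samples, weights=None):
    """2-Wasserstein barycenter of 1-d empirical measures, each given as an
    equal-size sample: average the sorted samples (the quantile functions)."""
    sorted_samples = np.sort(np.asarray(samples, dtype=float), axis=1)
    return np.average(sorted_samples, axis=0, weights=weights)

# the barycenter sits halfway between the two point clouds
bary = wasserstein_barycenter_1d([[2.0, 0.0, 1.0], [10.0, 12.0, 11.0]])  # [5, 6, 7]
```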

Their approach works anywhere the Wasserstein metric is defined, and so in particular should work (and most likely give the same results) in the persistence diagram setting studied by the constellations of persistence statisticians mentioned above.

I for one would be excited to hear anything about the relations between these (mostly) parallel developments.

ATMCS 6: Day 4

Today's summaries are provided by Isabel Darcy.

Omer Bobrowski talked about Topological Estimation for Super Level Sets.  The goal is to determine the homology of an unknown space from a sample of noisy data points.  Super-level sets of a density function f correspond to dense regions: {x | f(x) > L}.  In general, the density function is not known but can often be estimated.  One can try to reconstruct the homology by looking at U_n(L, r) = the union of balls of fixed radius r around each point in a super level set, {x | f(x) > L}.

But if not enough points are chosen (i.e., L large), then the space may not be adequately covered by U_n(L, r).  If too many points are chosen (i.e., L small), then more noise may be picked up.  However one can obtain the correct homology with high probability by looking at how U_n(L_1, r) includes into U_n(L_2, r) for L_2 < L_1.  This induces a map on their homologies.  The image of this map is the homology estimator, which equals the correct homology of the space of interest with high probability.
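The degree-0 part of this estimator can be sketched in a few lines: the rank of the image of H0(U_n(L_1, r)) → H0(U_n(L_2, r)) is the number of connected components of the low-threshold ball union that contain at least one high-threshold point. The following is a toy illustration of that idea only (the function names, the hand-rolled kernel density estimate, and the test geometry are my own assumptions, not code from the talk).

```python
import numpy as np

def kde(sample, x, bandwidth=0.3):
    """Gaussian kernel density estimate (up to normalization) at rows of x."""
    d2 = ((x[:, None, :] - sample[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * bandwidth ** 2)).mean(axis=1)

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, i):
        while self.parent[i] != i:
            self.parent[i] = self.parent[self.parent[i]]  # path halving
            i = self.parent[i]
        return i
    def union(self, i, j):
        self.parent[self.find(i)] = self.find(j)

def h0_image_rank(points, density, L1, L2, r):
    """Rank of the image of H0(U_n(L1, r)) -> H0(U_n(L2, r)) for L2 < L1:
    the number of components of the low-threshold ball union that contain
    at least one high-threshold point."""
    assert L2 < L1
    low = np.where(density > L2)[0]        # indices in the larger super-level set
    high = set(np.where(density > L1)[0])  # indices in the smaller one
    uf = UnionFind(len(low))
    pts = points[low]
    for i in range(len(low)):
        for j in range(i + 1, len(low)):
            if np.linalg.norm(pts[i] - pts[j]) <= 2 * r:  # radius-r balls overlap
                uf.union(i, j)
    return len({uf.find(i) for i, p in enumerate(low) if p in high})

# noise-free sanity check: points on a circle form a single component
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
circle = np.stack([np.cos(theta), np.sin(theta)], axis=1)
density = kde(circle, circle)
rank = h0_image_rank(circle, density, L1=0.01, L2=0.0, r=0.1)  # 1
```

Higher homology degrees need a persistence computation over the two ball unions rather than a union-find, but the inclusion-and-take-the-image structure is the same.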

Elizabeth Munch talked about The Interleaving Distance for Reeb Graphs. Reeb graphs provide an efficient description of the properties of a real-valued function on a topological space and are useful in many applications.  Thus it would be very useful to have a method for comparing two Reeb graphs.  Interleavings (and interleaving distances) have been used to compare persistence modules.  Interleavings can be applied to Reeb graphs by generalizing the Reeb graph to a functor.  One consequence is a concrete algorithm for smoothing a Reeb graph in order to remove noise.

Peter Bubenik talked about Generalized Persistence Modules and Stability.  Generalized persistence modules is an abstract formulation of persistence modules using category theory which includes many forms of persistence modules that are currently studied.  One consequence of this formulation is that one can give simpler common proofs for many standard results such as stability.

Yuliy Baryshnikov talked about Integral Operators in Euler World. One can integrate functions that take on a finite number of values using the Euler characteristic as the measure.  For example if f(x) = 4 for x in [0, 1] and 0 elsewhere, then the integral of f with respect to the Euler characteristic = 4 times the Euler characteristic of [0, 1] = 4(2 - 1) = 4, since [0, 1] has two vertices and one edge.  In applications, sometimes it is easier to solve a problem by transforming it into a simpler problem using an integral transform.

An example of an integral transform is convolution: Given functions f and g, one can create a new function, f*g, where the value f*g(x) is obtained from f and g by integrating the product f(t)g(x-t) with respect to the euler characteristic.  Given f and f*g, one would like to be able to reconstruct g: that is one would like to calculate the inverse of the Euler integral transform.  Cases where one can calculate the inverse transform were discussed.