
I really liked this article, though I don't know why the author is calling the idea of finding a separating surface between two classes of points "topology." For instance, they write

"If you are trying to learn a translation task — say, English to Spanish, or Images to Text — your model will learn a topology where bread is close to pan, or where that picture of a cat is close to the word cat."

This is everything that topology is not about: a notion of points being "close" or "far." If we have some topological space in which two points are "close," we can stretch the space so as to get the same topological space, but with the two points now "far". That's the whole point of the joke that the coffee cup and the donut are the same thing.
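To make that concrete, here is a minimal sketch (plain Python, my own toy example, not anything from the article) of a homeomorphism of the open interval (-1, 1) onto the real line: it preserves the topology exactly, yet it sends two nearby points very far apart.

    import math

    # Homeomorphism from the open interval (-1, 1) onto the whole real line:
    # continuous, with a continuous inverse, so the topology is untouched --
    # but it distorts distances without bound near the endpoints.
    def stretch(x):
        return x / (1 - x * x)

    def stretch_inverse(y):
        # Solves y = x / (1 - x^2) for x in (-1, 1).
        return 2 * y / (1 + math.sqrt(1 + 4 * y * y))

    a, b = 0.998, 0.999                   # "close" points in (-1, 1)
    print(abs(b - a))                     # 0.001 -- close before stretching
    print(abs(stretch(b) - stretch(a)))   # ~250  -- far apart after

Topologically nothing has changed; only the metric picture has.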

Instead, the entire thing seems to be a real-world application of something like algebraic geometry: we want to find something like an algebraic variety that the points lie near. It's all about geometry and all about metrics between points. That's what it seems like to me, anyway.
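As a toy illustration of what I mean (again my own sketch): take the unit circle, the zero set of p(x, y) = x^2 + y^2 - 1, as the variety. Saying that points lie "near" it is an inherently metric statement, since you need a distance function to say how near.

    import math
    import random

    # The unit circle as an algebraic variety: the zero set of
    # p(x, y) = x^2 + y^2 - 1.
    def p(x, y):
        return x * x + y * y - 1

    # For the circle the distance to the variety has a closed form:
    # |sqrt(x^2 + y^2) - 1|.  This is a metric quantity -- topology alone
    # gives you no way to state it.
    def dist_to_circle(x, y):
        return abs(math.hypot(x, y) - 1)

    # Sample points near the variety: points on the circle plus small noise.
    random.seed(0)
    for _ in range(3):
        t = random.uniform(0, 2 * math.pi)
        x = math.cos(t) + random.gauss(0, 0.05)
        y = math.sin(t) + random.gauss(0, 0.05)
        print(f"p = {p(x, y):+.3f}, distance to variety = {dist_to_circle(x, y):.3f}")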



> This is everything that topology is not about

100 percent true.

I can only hope that in an article about two things, i) topology and ii) deep learning, the evident confusion is confined to just one of them: topology.


Fair, I was using 'topology' more colloquially in that sentence. I should have said 'surface'.


Ah! That clears it up.

You mean, then, that Deep Learning has a lot in common with differential geometry and manifolds in general. That I will definitely agree with. DG and manifolds have a far richer and more informative structure than topology.


If I had to give a loose definition of topology, I would say that it is actually about studying spaces which have some notion of what is close and far, even when no metric exists. The core idea of neighborhoods in point-set topology captures the idea of points being near another point, and it lets us define things like continuity and sequence convergence, which require a notion of closeness. From Wikipedia [0], for example:

> The terms 'nearby', 'arbitrarily small', and 'far apart' can all be made precise by using the concept of open sets. If we change the definition of 'open set', we change what continuous functions, compact sets, and connected sets are. Each choice of definition for 'open set' is called a topology. A set with a topology is called a topological space.

> Metric spaces are an important class of topological spaces where a real, non-negative distance, also called a metric, can be defined on pairs of points in the set. Having a metric simplifies many proofs, and many of the most common topological spaces are metric spaces.
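To make the "no metric needed" point concrete, here is a minimal sketch (my own toy example, in Python) that checks continuity using nothing but a chosen collection of open sets, with no distances defined anywhere:

    # A topology is just a chosen collection of "open sets".  A map f is
    # continuous iff the preimage of every open set is open.

    X = {0, 1}
    topology_X = [set(), {1}, {0, 1}]                # the Sierpinski topology on X

    Y = {'a', 'b'}
    topology_Y = [set(), {'a'}, {'b'}, {'a', 'b'}]   # the discrete topology on Y

    def preimage(f, subset):
        return {x for x in X if f(x) in subset}

    def is_continuous(f):
        return all(preimage(f, U) in topology_X for U in topology_Y)

    print(is_continuous(lambda x: 'a'))                # True: constant maps are continuous
    print(is_continuous(lambda x: 'a' if x else 'b'))  # False: preimage of {'b'} is {0}, not open in X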

That's not to say that topology is necessarily the best lens for understanding neural networks, and the article's author has shown up in the comments to state he's moved on in his thinking. I'm just trying to clear up a misconception.

[0] https://en.wikipedia.org/wiki/General_topology



