I’ve long had this dream of building a system that allows symbolic algebra manipulations in which the representation knows about the semantics. The idea is to replace long and tedious pen-and-paper symbolic calculations, while still having direct control over each step of the calculation. In looking for something like this I’ve found specialized proof verifiers, which do something different. At the other end of the spectrum there’s Mathematica, which does symbolic manipulation but doesn’t understand enough of the semantics.
Are these projects aiming at something like what I’m describing? Or are they more about something else like verifying proofs?
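To illustrate the “direct control over each step” part: a CAS can already be driven one explicit rewrite at a time instead of through a black-box simplify(). A rough sketch in SymPy (my choice of library and of the trig identity is purely illustrative); what’s still missing is the semantic layer on top:

from sympy import symbols, sin, cos, expand_trig

x, y = symbols('x y')
expr = sin(x + y) * sin(x - y)

step1 = expand_trig(expr)   # apply the angle-addition formulas, nothing else
step2 = step1.expand()      # distribute the product
step3 = step2.subs(cos(x)**2, 1 - sin(x)**2).subs(cos(y)**2, 1 - sin(y)**2)
print(step3.expand())       # sin(x)**2 - sin(y)**2, every intermediate step stays inspectable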
Well, in principle a CAS like Mathematica (or any mature alternative) allows you to implement such a workflow (i.e. name mappings or define an algebra on your objects). I think what Wolfram has been concentrating on for the past decade is accumulating "knowledge" in their system, i.e. exactly the kind of semantic information we are talking about. However, as an end user I don't really see much of that; it seems to be curated mostly behind Wolfram Alpha.
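To make "an algebra on your objects" concrete, here is a tiny sketch (in SymPy rather than Mathematica, purely for illustration): you declare the algebraic structure, e.g. that your symbols do not commute, and the system respects it in every manipulation:

from sympy import Symbol, expand

A = Symbol('A', commutative=False)   # declare the algebra: A and B do not commute
B = Symbol('B', commutative=False)

print(expand((A + B)**2))            # A**2 + A*B + B*A + B**2, not A**2 + 2*A*B + B**2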
On the other hand, there are these research projects, which seem to concentrate on standards rather than on actually accumulating semantic knowledge.
I would love to see hypertext and semantic mathematical notation adopted in scientific papers. Instead of writing $E=m c^2$ in a (LaTeX) paper, we would define the symbols machine-readably with something like
set E = physics/Energy
set m = physics/Mass
set c = physics/constants/speed-of-light
I have never seen actual scientific papers that do this kind of thing, i.e. that are machine readable.
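The closest executable analogue I know of is a units/quantities library, where a symbol carries its physical meaning and a relation like $E=m c^2$ becomes machine-checkable. A rough sketch with SymPy's units module (the "set E = physics/Energy" lines above are of course only a hypothetical notation):

from sympy.physics.units import kilogram, joule, speed_of_light, convert_to

m = 1 * kilogram             # a quantity rather than a bare symbol: it knows it is a mass
E = m * speed_of_light**2    # the relation is evaluated with the semantics attached
print(convert_to(E, joule))  # 89875517873681764*joule, i.e. roughly 9.0e16 J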