Sure. First, I'd recommend checking out the website of HN user 'triska' at https://www.metalevel.at/prolog -- there's more to Prolog itself than most people are ever made aware of.
Frank Pfenning at CMU does some really cool research on concurrency using logical semantics and sequent calculi (https://www.cs.cmu.edu/~fp/). Session types might be the coolest thing out of this particular corner of the field, but there's so much more to explore here.
Going back some decades, concurrent constraint programming (CCP) takes the position that concurrent processes communicate by posting constraints to a shared store. Vijay Saraswat's ask/tell CCP model has been a real inspiration to me. As best I can tell, his work is based on forward reasoning, as opposed to Prof. Pfenning's work, which appears to use backward reasoning (as does Prolog). (There's an interesting duality between the two kinds of reasoning, which I am not at all positioned to opine on.)
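To give a flavor of ask/tell (and this is only a flavor, not Saraswat's actual calculus): in SWI-Prolog you can fake the idea with freeze/2, where "tell" is just binding a shared variable (adding information to the store) and "ask" suspends a goal until the store knows enough to run it. The predicate names tell/2, ask/2, and demo/0 below are my own, not anything standard:

    % Rough sketch of the ask/tell flavor using SWI-Prolog coroutining.
    % "tell" adds information to the shared store by binding a variable;
    % "ask" suspends a goal until that information is present.

    tell(Var, Value) :-
        Var = Value.               % posting to the store = unification here

    ask(Var, Goal) :-
        freeze(Var, Goal).         % suspend Goal until Var is instantiated

    demo :-
        ask(X, format("consumer saw ~w~n", [X])),  % agent 1 waits on the store
        tell(X, hello).                            % agent 2 posts; agent 1 wakes

Running demo prints "consumer saw hello": the asking goal sits there until the telling goal commits a value, which is the whole point of the model.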
Some of the functional work I've seen on concurrency ends up going in a logical direction without realizing it; the more recent LVars papers (Lindsey Kuper and Ryan Newton) build up a lovely abstraction in Haskell that's conceptually the same as Saraswat's ask/tell logical framework, but they don't seem to realize it until several papers in. (It's all fantastic work! It's just a shame the communities have so little overlap.)
Logic programming allows you to model mutable state, and it gives you high-powered tools to reason about how that state changes over time. As a concrete example, you only need term unification and logical variables to get undirected communication channels in a logic program. Unifying a variable with a term in one part of the program allows other users of the variable to match on it and obtain information about that term. In this sense, it goes in the complete opposite direction from functional programming, which eschews state and treats it as something to be derived from a stateless foundation.
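Here's a minimal sketch of what I mean, in plain sequential Prolog, so it only shows the data flow, not real concurrency; a concurrent logic language (or coroutining as above) would let the two sides interleave. The predicate names producer/1, consumer/1, and demo/0 are just for illustration:

    % A shared logical variable as a communication channel.
    % The producer binds it to a partial list, leaving a fresh tail;
    % the consumer pattern-matches on whatever has been bound so far.

    producer(Chan) :-
        Chan = [msg(1)|Rest],      % "send" by instantiating the head
        Rest = [msg(2)|_].         % leave the tail open for more messages

    consumer(Chan) :-
        var(Chan), !.              % nothing more has been sent; stop
    consumer([msg(N)|Rest]) :-     % "receive" by matching on the binding
        format("got ~w~n", [N]),
        consumer(Rest).

    demo :-
        producer(Chan),
        consumer(Chan).            % prints: got 1, got 2

Note that the producer never mutates anything in the imperative sense; it only adds information to a partially instantiated term, and the consumer reads exactly as much as has been committed so far.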