There are some interesting arguments that we'd be better off not assuming free will. Yuval Noah Harari (of Sapiens fame) talks about it quite a bit. He mentions his stance briefly in this interview https://www.theguardian.com/culture/2018/aug/05/yuval-noah-h... , but he goes into it more deeply in his books. One of his main points is that, as neuroscience and AI improve, actors external to you become able to "understand you better than you understand yourself" and, in effect, start to program you. People are already concerned that this is happening to some degree (the Cambridge Analytica scandal, for example), and it's likely to get more severe as the technology improves. Another of his big points is that the people who believe most strongly in the sovereignty of their own will are the least likely to protect themselves from such outside influences.
I agree there would be a large moral shift, and it kind of boggles my mind to think through, but I don't think it's necessarily untenable.
Taking the candy bar example, in current society, even someone who doesn't believe in free will probably won't steal a candy bar, since there's a good chance they'll get caught, and then shamed and fined. There can still be a system of rules without a sense of free will.
The mindset would probably affect every aspect of life, but just looking at criminal justice, my intuition is that we would make it less retributive and switch to something more rehabilitation-focused. Then again, it's hard to imagine what the knock-on effects of that would be. I could see it going too far and getting exploited. Still, I'd be happy to see society experiment with moves in that direction.