Anyone can legally complain if you copy their content: they send a C&D letter, you remove whatever offended them, and no harm done.
If you want to prevent bots from scraping your content, take advantage of the fact that most bots don't execute JavaScript: in your server code, render the content of each page with some simple encoding that makes the text unreadable, then hook a piece of JavaScript to window.onload that decodes and displays the content.
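A minimal sketch of the idea, using Base64 as the stand-in encoding; the element id and data attribute are just illustrative names:

    <!-- served HTML: the real text was Base64-encoded on the server -->
    <div id="article" data-encoded="SGVsbG8sIHJlYWRlci4="></div>

    <script>
      // Runs after the page loads; bots that don't execute JS only ever
      // see the encoded blob in the markup above.
      window.onload = function () {
        var el = document.getElementById('article');
        // atob() reverses the Base64 encoding applied server-side.
        el.textContent = atob(el.getAttribute('data-encoded'));
      };
    </script>

In practice you'd want something less obvious than plain Base64, but anything that forces a scraper to run your JS raises the bar.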
I've always wondered: are there any bots that can run JavaScript? Is there a JS engine you can stick in your bot so it reads everything on a page just like a user would see it?
You always have a copyright, even if you don't say so on the page.
They could of course add a robots.txt and stop nicely behaved scrapers that way, but stopping all scraping is impossible. There's always a way. The best you can hope for is to make it so hard that they don't bother writing a custom scraper.
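For reference, a robots.txt that asks well-behaved crawlers to stay out of a section looks like this (the path is just an example):

    User-agent: *
    Disallow: /articles/

Nothing enforces it, though; a scraper that ignores robots.txt sees the same pages as everyone else.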