I did the same lol (via scraping): it runs every midnight against multiple bank accounts, then syncs the data to a Google spreadsheet. I was writing web automation scripts the hard way for a while, until I made a tool to simplify the process of writing them - https://github.com/tebelorg/TagUI (frankly I am surprised my banks never banned me or sent me a letter)
I use Selenium to drive a Firefox browser under the assumption that banks don't care enough to detect it: https://github.com/tmerr/bank_wrangler . I am also slightly worried about banks not liking it, but if they do care, I guess I will receive a scary cease-and-desist in the mail and I will cease and desist.
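For anyone curious what this looks like, here is a minimal sketch of the Selenium approach. The URL, form field names, and CSS selectors are all hypothetical - every bank's login page is different, so you have to inspect the real page and adjust:

```python
import csv


def fetch_balance(bank_url, username, password, out_path="balance.csv"):
    """Log in with a Selenium-driven Firefox and save the balance to a CSV.

    The field names ("username", "password") and the ".account-balance"
    selector below are made up for illustration; substitute whatever the
    actual login form uses.
    """
    # Imported inside the function so the sketch can be defined/read
    # even on a machine without Selenium installed.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()  # requires geckodriver on PATH
    try:
        driver.get(bank_url)
        driver.find_element(By.NAME, "username").send_keys(username)
        driver.find_element(By.NAME, "password").send_keys(password)
        driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()

        balance = driver.find_element(By.CSS_SELECTOR, ".account-balance").text
        with open(out_path, "w", newline="") as f:
            csv.writer(f).writerow(["balance", balance])
        return balance
    finally:
        driver.quit()
```

In practice you would run something like this from cron at midnight, and add explicit waits for pages that load slowly.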
My concern is what happens if there is fraudulent activity, such as my account getting hacked. Then they look into my account, discover I've been doing this, tell me I've broken their TOS, and I'm on my own.
Basically, use a web automation tool to simulate the login process, then grab the data you want and save it to a .csv file or something. Web automation tools can generally work with saved HTML pages too. For saved PDFs I'm not sure what works well.
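To illustrate the saved-HTML case with nothing but the standard library, here is a sketch that pulls rows out of a made-up transactions table and writes them to CSV. The table markup and column names are invented; a real bank's statement page will have its own structure:

```python
import csv
from html.parser import HTMLParser


class TransactionTableParser(HTMLParser):
    """Collect the text of every <td> cell, grouped into rows by <tr>."""

    def __init__(self):
        super().__init__()
        self.rows = []      # completed rows
        self._row = []      # cells of the row currently being read
        self._in_td = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag == "td":
            self._in_td = False

    def handle_data(self, data):
        if self._in_td and data.strip():
            self._row.append(data.strip())


# A saved statement page might contain a table like this (made-up markup):
saved_html = """
<table>
  <tr><td>2024-01-02</td><td>Coffee</td><td>-4.50</td></tr>
  <tr><td>2024-01-03</td><td>Salary</td><td>2500.00</td></tr>
</table>
"""

parser = TransactionTableParser()
parser.feed(saved_html)

with open("transactions.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["date", "description", "amount"])
    writer.writerows(parser.rows)
```

For messier pages a library like BeautifulSoup is more forgiving, but the idea is the same: parse the saved file, pick out the rows, dump them to CSV.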
In theory they shouldn't, though I believe many financial websites explicitly state in their terms of use that automated access or scraping of data is not allowed.
Do a careful reading of your bank's terms of service and you will likely find a clause either barring you from doing this entirely or voiding your fraud protection.