A clever idea. People are wondering why such a thing might be useful, so let me advance a theory:
Latency.
Suppose you have a bunch of people somewhere, say, the US, and a bunch of other people somewhere else, say, China, and there's an ocean in between. If they need to work collaboratively on something, placing a datacenter in one country or the other yields asymmetric latency; whoever is on the far side gets a lot more of it.
If you can just plop a datacenter exactly at the midpoint, everyone wins. It needn't be the biggest datacenter ever, just one that can handle the latency-sensitive tasks.
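To put rough numbers on it (a back-of-envelope sketch, assuming ~10,000 km between the two groups and light in fiber at about 200,000 km/s, ignoring routing and processing overhead):

    # Back-of-envelope propagation delay; the distance and fiber speed are assumptions.
    FIBER_SPEED_KM_S = 200_000  # light in fiber is roughly 2/3 of c

    def one_way_ms(distance_km: float) -> float:
        """One-way propagation delay in milliseconds over fiber."""
        return distance_km / FIBER_SPEED_KM_S * 1000

    total_km = 10_000  # assumed separation between the two user groups

    # Datacenter on one coast: ~0 ms for one side, ~50 ms each way for the other.
    print(one_way_ms(0), one_way_ms(total_km))
    # Datacenter at the midpoint: ~25 ms each way for both sides.
    print(one_way_ms(total_km / 2), one_way_ms(total_km / 2))

Same total path length either way, but with the midpoint nobody is stuck with the 50 ms worst case.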
They pretty much say why on the project page: renewable energy [tidal, currents?] and cooling. The third, as you mention, is latency -- they want to be where the people are.
Plus, at depth, storms and typhoons don't affect things all that much; it's rather calm. So the main threat might be from saboteurs rather than natural disasters [aside from the salty environment], because apart from a coast guard at the surface [which, if contraband coming in is any proxy, is pretty porous], you don't have a "police presence". So they'd have to rely heavily on monitoring systems.
I think you have that backwards -- sharks aren't attracted to copper cables, but are attracted to undersea fiber optic cables, because those carry high-voltage power (for the undersea repeaters), which emits an electromagnetic field that attracts the sharks.
I stand corrected; I forgot about the power cables. I know sharks were attracted to the EM field around the old transatlantic phone and telegraph wires. Well, at least it will be a good reason for an outage.
That may be part of it, but I also think a major component is cooling. Cooling accounts for 30-40% of the running cost of a datacenter, so by building with access to enough cold water they've essentially cut the running cost to roughly two-thirds that of conventional datacenters.
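Rough arithmetic on that claim (a sketch, assuming the 30-40% cooling share above and that seawater cooling eliminates essentially all of it):

    # Assumed: cooling is 30-40% of running cost and seawater cooling removes it.
    for cooling_share in (0.30, 0.40):
        remaining = 1.0 - cooling_share
        print(f"cooling at {cooling_share:.0%} of cost -> ~{remaining:.0%} of the original bill remains")

which lands at 60-70% of the original running cost, i.e. roughly two-thirds.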
My guess is they are more concerned with latency for real-time services used by millions in big cities, where land is expensive. Think VR servers.
Quick googling yields [1] datacenter land selling for more than $1 million per acre in SV and [2] Google's requirements for datacenter placement. The first four points listed are cheap electricity, carbon neutrality, lots of water, and large parcels of land; the 215 to 1200 acres mentioned in [2] would cost $240 million to $1.5 billion at the price quoted in [1]. Sealed containers anchored to the free sea floor, running on free wave energy and cooled with free sea water would be a very clever way to satisfy those requirements while staying close to the customers.
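As a sanity check on the land figure (a sketch using the flat ">$1 million per acre" number from [1] as a floor; the $240 million to $1.5 billion range above presumably uses the more specific prices in that article):

    # Lower-bound land cost at $1M/acre; acreage range from [2].
    PRICE_PER_ACRE = 1_000_000  # USD, floor taken from [1]
    for acres in (215, 1200):
        print(f"{acres} acres -> at least ${acres * PRICE_PER_ACRE / 1e6:,.0f} million")
    # 215 acres -> at least $215 million
    # 1200 acres -> at least $1,200 million

Either way, the sea floor starts to look cheap by comparison.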
More likely it's to reconcile latency requirements with national borders. If you want to be close to a country to offer low latency, but political or legal or tax reasons mean you don't want to be in that country, then an ocean datacenter can get you close enough.
Neat project.