Map of the Roman Empire Published Under CC-BY 3.0 (ancient.eu)
126 points by a_bonobo on Nov 22, 2015 | 29 comments



"An Incredibly Detailed Map of the Roman Empire At Its Height in 211AD" http://brilliantmaps.com/roman-empire-211/

Direct link: https://i.imgur.com/lHoCQtt.jpg

Edit: this is not as detailed, but more responsive.


I actually bought this map a few months ago after seeing it in a reddit post. It's a beautiful map, and you can tell care was taken in producing it. It's about $45 USD, but I'm sure that barely pays for the amount of research they put into it. The labeling itself is meticulous.

Be warned: it ships from Germany, so shipping takes about 2-3 weeks.


Thanks for the link! Yeah, the custom Google Maps tiles in the OP could be great, but they take about 10 seconds to load whenever you scroll around. Hopefully they can get some CDN image hosting.


The map uses today's terrain and coastlines, though, as far as I can see, which doesn't help with understanding e.g. the lack of roads in former marshes.


The "Antiquity à la carte" map has an ancient terrain layer by default : http://awmc.unc.edu/awmc/applications/alacarte/


Actually, if you zoom in on the Netherlands - by far the most adapted (okay, also naturally changed) coastline since Roman times - you can see a thin blue line indicating the contemporary coastline.


If you zoom in, it switches suddenly between modern and ancient coastlines, but even with the ancient coastlines, it still shows all the modern canals, polders and the distinctive grachtengordel of Amsterdam.


The about page discusses the data sources:

http://pelagios.dme.ait.ac.at/maps/greco-roman/about.html

And also links an article describing the map:

http://pelagios-project.blogspot.com/2012/09/a-digital-map-o...


It's a weird mix of modern and ancient map features. If you zoom in on the Netherlands, you can see it switch from modern coastlines to plausible ancient coastlines, but it still has the modern map of Amsterdam clearly visible.

It seems the creators may not have been aware just how much geography changed in the past 2000 years. Or maybe they just didn't care enough about the edges of the Empire.


Well, I found Lutetia, but not Babaorum, Aquarium, Laudanum or Petibonum.


They were erased from history to hide their failure over some wild uncivilized Gauls.


Very nice, but the labelling is a little suboptimal: Rome itself doesn't get a label until you zoom in twice. I don't know how this works with Google Maps content - is there a way to assign importance weightings to individual markers?


It's using Google Maps JavaScript to show tiles from pelagios.org (which redirects to their blog), so they could be using any rendering system (many would of course support importance weights for markers).
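
I don't know what they're using either, but if it were the stock Maps JavaScript API, you could fake importance weighting by gating marker visibility on zoom yourself. A minimal sketch, assuming a hypothetical importance field on each place (the threshold rule is my own invention, not an API feature):

    // Sketch: show only sufficiently "important" markers at low zoom.
    // The Maps API has no built-in weighting, so we gate visibility
    // on the current zoom level ourselves.
    const map = new google.maps.Map(document.getElementById('map'), {
      center: { lat: 41.9, lng: 12.5 }, // Rome
      zoom: 4,
    });

    const places = [
      { name: 'Roma',    position: { lat: 41.90, lng: 12.50 }, importance: 10 },
      { name: 'Lutetia', position: { lat: 48.85, lng: 2.35 },  importance: 5 },
    ];

    const markers = places.map(p => ({
      importance: p.importance,
      marker: new google.maps.Marker({ position: p.position, title: p.name, map }),
    }));

    function updateVisibility() {
      const zoom = map.getZoom();
      // Rough rule: each zoom step in reveals one more importance tier.
      for (const { importance, marker } of markers) {
        marker.setVisible(importance >= 10 - zoom);
      }
    }

    map.addListener('zoom_changed', updateVisibility);
    updateVisibility(); // apply the rule at the initial zoom too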


Very interesting; however, it paints a misleading picture of the empire because it is missing the most important trade and shipping routes: those over the sea.


I always found it incredible how Rome, a city of angry shepherds without a real port and with no serious tradition of seafaring, ended up using the whole of the Mediterranean as its personal highway and the heart of a huge Empire.


Their empire lasted a while. Given enough centuries, it's not so hard to acquire ports and develop a tradition of seafaring.


In fact, they did the exact opposite, at least initially. Pretty soon they were up against Carthage, a huge naval power with a long tradition of seafaring inherited from its Phoenician roots, and were soundly beaten in the first few engagements. The Romans turned it around by changing the game: they developed boarding tech (the corvus, a spiked boarding bridge) that basically turned naval engagements into more traditional brawls -- what Caribbean pirates were still doing almost 2000 years later.

In the long run, of course they acquired real sea skills; but they only leveraged those skills after they had acquired total domination over the Western Med. That's just amazing, imho.


The web page is HTTPS but tries to load the map over HTTP :(


And? Why does it need to be secured? After all it's an ancient map. Unless of course you are afraid someone will snoop on your plans with Doc Brown?


It needs to be authenticated because I don't want to run injected javascript. (I have no idea if the inner frame contains javascript. My browser blocked it for me by default. That is the correct and good behaviour.)

> After all it's an ancient map.

It needs to be encrypted because I don't want other people to know that I'm browsing the ancient map. The map itself is public, the fact that I am reading it need not be public. Why are you afraid that I don't want you to know my reading list? Would you tell me yours?


> Why are you afraid that I don't want you to know my reading list? Would you tell me yours?

I agree with your privacy-oriented attitude and the bug report, but this sentence made me think that:

1) lots of people love to share their reading lists. Yes, of course it's always an edited version (which enables privacy), but still...

2) ... websites like this very one are basically aggregated reading lists we all share. Others like Goodreads are basically just shared reading lists for offline content.

3) There are some browser plugins that tried to build communities around webpages, i.e. you'd go to a webpage and see other people reading it at the same time. They never seemed to get much traction. However...

4) ... would I like a system where I submit my browser history and get back recommendations for other pages or for communities to join? Yeah, I'd like that. Huge privacy implications, sure, but I bet a lot of people wouldn't mind trading some privacy with an automated entity if it means getting back more meaningful connections. In practice, Google Search does that already.

In the end, for most people the problem is how to share more, in order to establish significant connections and augment their own knowledge; the only issue is who or what we're sharing with. We all joke that our browser history is horrible, but we would love nothing more than a nonjudgmental entity that could use it to improve our lives.


I never thought of it that way.

An interesting idea that I had was quite the opposite.

Run a program that does totally random searches and web page requests (with appropriate headers) in order to create so much noise as to render the data useless to anyone spying. Almost as if you were running a public proxy.

Reminds me of an old Perry Mason episode where he had 25 guys help him move a car to create a bunch of fingerprints. Maybe it wasn't Perry Mason but some other crime show; either way, that was the concept.
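
In code, the idea could look something like this minimal sketch (the word list, the search endpoint and the timing are all placeholder choices; real camouflage would need far more realistic behaviour):

    // Sketch: issue random search queries at random intervals to bury
    // real traffic in noise. Endpoint and vocabulary are placeholders.
    const words = ['roman', 'empire', 'map', 'coastline', 'aqueduct', 'legion'];

    function randomQuery() {
      const n = 1 + Math.floor(Math.random() * 3); // 1-3 words per query
      return Array.from({ length: n },
        () => words[Math.floor(Math.random() * words.length)]).join(' ');
    }

    async function noiseLoop() {
      while (true) {
        const q = encodeURIComponent(randomQuery());
        // Any public search endpoint would do; this one is illustrative.
        await fetch('https://duckduckgo.com/html/?q=' + q)
          .catch(() => { /* ignore network errors, keep going */ });
        // Wait 30-300 seconds so the cadence isn't obviously robotic.
        await new Promise(r => setTimeout(r, (30 + Math.random() * 270) * 1000));
      }
    }

    noiseLoop();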


You might be interested in this: https://cs.nyu.edu/trackmenot/

However, the difficulty is making these requests look indistinguishable from normal browsing. It might "work OK" now, but the behaviour is certainly different from normal browsing habits. If everyone started using this, you can bet someone would pour a lot of money into research that filters out the automated requests. Then one basically needs to solve steganography, which is a hard problem, harder than cryptography.


That's interesting, thanks for that. I wonder if someone could put together a product that would just sit on the network and do the same thing unattended, 24x7, without having to be installed as an extension.


Always using HTTPS (encryption) is not just about keeping content secret. It is also about ensuring the content was not tampered with and that you got it from the source you expect.

In addition to that, there's the whole debate about encrypting everything: if only sensitive stuff uses encryption, it's that much easier to pinpoint the sensitive stuff for metadata analysis etc. Much better if everything is just encrypted by default.


True (plus no more mixed-content warnings), but that's not exactly 100% of the picture; it warrants a footnote here. Unless you actually know which source you expect to load from, you can't tell whether https://evil.com/evil.js or https://legit.com/legit.js was the intended one. In the end, both are served over HTTPS, and both are authenticated.

From a user's point of view, I am not sure how we can actually defend against that, but as the author of a page, you can try to enforce your expectations with various mechanisms (CSP headers, subresource integrity).

Authentication is one thing, but determining the integrity is another.
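
For illustration, a rough sketch of what that enforcement could look like from the page author's side (the hostnames and the integrity hash are placeholders, not real values):

    // Sketch: a Node server sending a CSP header that whitelists script
    // origins, plus an SRI attribute pinning the exact script contents.
    const http = require('http');

    http.createServer((req, res) => {
      // Scripts may only come from this origin and legit.com;
      // an injected https://evil.com/evil.js would be refused.
      res.setHeader('Content-Security-Policy',
        "default-src 'self'; script-src 'self' https://legit.com");
      res.setHeader('Content-Type', 'text/html');
      res.end(`<!doctype html>
    <html><head>
      <!-- The browser rejects the script if its hash doesn't match. -->
      <script src="https://legit.com/legit.js"
              integrity="sha384-PLACEHOLDER_HASH"
              crossorigin="anonymous"></script>
    </head><body>map goes here</body></html>`);
    }).listen(8080);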


I find it pretty hard to navigate at higher zoom levels, as there's no visible indication of the size of a river or stream after switching from the 100 km zoom level to 50 km.


Pretty cool (unfortunately also pretty slow - at least for me).


FAIL: I don't see anything.



