I read the whole article, and it's very hard to judge the correctness of most of its claims. Most importantly, it's hard to judge the author's own research.
However, I can say that there are different ways to interpret things. The author sees tenure as a way to establish obedience to the unwritten rules of the department. Based on my knowledge of academia, including friends who were granted tenure, the process is mostly fair, but of course most people are going to want to do everything they can to maximize their chances. What the author interprets as proving obedience, I interpret as not wasting time on things the committee can't measure objectively, or doing things that might piss off people on the committee.
As someone in industry, I see academia as somewhere where people can make immense contributions to human knowledge, that cannot be done elsewhere, but they have to jump through certain hoops to do so. Outside of academia there is very little opportunity to do research. Industry is conservative, and prefers to implement and refine known techniques.
I agree with this. But research isn't everything. And there are at least some hints and signs that this person doesn't know how to get along with colleagues.
> I worked really hard to bring an exciting and rigorous operating systems class to UB.
Why did this effort require working "really hard"? Was it because of obstructionist, jealous, or stupid colleagues? Or people who wanted a boring and unrigorous course instead? Or were there perhaps legitimate reasons why others didn't want to change the existing course?
> I led a complete overhaul of our department’s undergraduate computer science curriculum. It includes two new exciting introductory programming courses that I spent a great deal of time designing.
Let me guess: the existing curriculum was terrible, boring, not at all rigorous, and there was no reason to keep any of it, and the author made sure everyone knew it. And why did the author have to spend a great deal of time? Because no one else in the department was capable of doing as good a job? Because nobody else could comprehend this grand vision?
Everything listed under "speaking out" gives me the same vibe. It doesn't seem to have crossed this person's mind that there are reasons why other people have different approaches to teaching, research, administration, hiring, etc., beyond others being obstructionist, brainwashed, or just stupid. I'm reminded of the parable [1] reminding us not to take down a fence until we have truly understood why the fence was erected in the first place.
And really... bringing a dog to work in violation of a clearly stated campus policy, repeatedly, even after having been warned, then encouraging a student petition and getting your name in a local paper about the incident? That's just asking for trouble.
(Full disclosure: I'm coming up for tenure myself, and one lesson it has taken me 5 years to understand is that people who disagree with me on campus aren't doing so out of spite, stupidity, or carelessness, they often just have different priorities than I do. Just because our department absolutely needs more resources to do a good job handling our rapidly growing student population, doesn't mean the college should make this a priority over other things.)
> Why did this effort require working "really hard"? Was it because of obstructionist, jealous, or stupid colleagues?
I was one of the students at UB while this took place. It took a lot of effort, partially because the rewrite involved a lot of student feedback, and there was also a massive e-mail discussion, (accidentally?) sent to the entire computer science undergraduate mailing list, in which one of the senior faculty chastised a more junior faculty member over how student feedback was used in remaking the program.
I can't say for anything else (including the dog situation, since I know other faculty in other departments also bring their pets to school), but I know there was some severe and public obstruction from senior faculty to more junior faculty going on during the remaking.
I am very thankful to be in a department with almost zero nasty internal politics, and I've heard real horror stories from people I trust about how nasty things can get.
So thanks for providing more context. It's hard to tell from the post whether this person is the cause of or recipient of all this drama. From the way he tells it, it wasn't just the department, but pretty much everyone he interacted with across campus. It wouldn't surprise me if the entire university was poisoned by nasty politics, but it also wouldn't surprise me if a self-assessed superstar would see it that way even if it were not.
Mind you, I don't really know the rest of the "behind closed doors" stuff, so I can't tell if this was just a one-off chastisement or a system-wide thing. I have heard of some broader school-wide drama associated with funding; in fact, more than one computer science professor had expressed concerns through personal blogs regarding the overhead taken from research funding when I attended. So I don't think these concerns are necessarily unfounded or written by a wannabe superstar.
> Full disclosure: I'm coming up for tenure myself, and one lesson it has taken me 5 years to understand is that people who disagree with me on campus aren't doing so out of spite, stupidity, or carelessness, they often just have different priorities than I do
In my experience, one problem is that the priorities are frequently rooted in the self-interest of powerful PIs or staff, to the detriment of the department / university as a whole. As a staff scientist working on many different types of projects, I frequently bump up against stupid problems which should be fixed at a university level. At one point, I tried spearheading a number of these projects (creation of a central index of core facilities and support services for our university so people can actually find resources efficiently, centralized billing and training services for shared facilities across departments, secure storage and an EMR for investigators working with patient information).
All of these projects failed to launch for selfish reasons:
Central index: powerful PIs feared discovery of the private core labs they were abusing; core labs feared institute-level data would lead to institute-wide optimization and loss of local control.
Centralized billing: financial admins feared loss of control and had job-security issues.
Centralized training: core labs feared loss of control.
Secure storage and EMR: PIs thought this was too inconvenient, preferring to leave shit on external hard drives with no access control, and feared that if it were created they would be forced to use it.
I don't ever try to fix anything now beyond the lab level, and even that is frequently challenging.
I've also seen two talented investigators passed over for tenure in our department because my PI is powerful and other PIs fear that having another person from our lab in the department will further consolidate power in my PI's hands. Our department recently spent an enormous amount of money renovating a single floor of our aging building. That floor had the department chair's lab on it. In my experience, academia is full of people who for the most part are in it for themselves and have no interest in improving the situation of the group / lab / department / university as a whole.
I'm sorry, but I suspect you're deeply, deeply wrong and your work will damage your university. Everything you talk about is about centralising knowledge and control.
In practice, this never helps in the business of getting experiments done. Students and postdocs just spend some time doing pointless training courses, overheads increase, lead times for ordering equipment increase. And to what purpose?
> Everything you talk about is about centralising knowledge and control. In practice, this never helps in the business of getting experiments done
We have researchers that literally cannot get work done efficiently because they don't know that core labs exist on campus to serve them, and labs that spend tens of thousands to buy instruments that they seldom use for the same reasons. At a department level (let alone an institute level), we have no idea what instruments or services people need, and no usage statistics for instruments that we already have. This means that it is likely that shared facilities are sub-optimally serving the community as a whole, and labs are buying multiple copies of the same pieces of equipment when one unit could do if it were shared. The lack of central indexing also means that most labs have no idea what other labs are working on, which hinders collaboration.
For training, right now EACH core lab forces researchers to do similar training courses for the same instruments; there is no way to prove you know how to use an instrument without taking each course. Similarly, every core lab employs different billing software which financial admins / lab admins have to sign up for and deal with.
How is this at all productive or efficient for anyone? If you were to suggest a similar setup for interacting business units, you would literally be laughed out of the room at a company.
The reason people like this current system is precisely because it's inefficient and confusing. This makes it difficult to regulate at a high level and makes it ripe for abuse by powerful people.
The unofficial open dog door policy at my school was rescinded after the chairman found a puddle of vomit in the elevator. While escorting visitors. Twice.
It's a really powerful perspective, but when you do it, make sure that you actually do respect the people around you, and that you can remain true to yourself in the process. Empathy changes a person (if you're doing it right); make sure that you're changing in a way that you want to change.
It's not clear to me whether the curriculum work was all his vision, whether it was a fight, or whether he simply felt that others didn't view it as something helpful to his tenure case. It's not even clear whether he drove the whole effort or mostly volunteered to lead it.
Every department is different, but I've definitely seen cases where the expectation is that you wait until you have tenure to participate in some changes or take on service work.
You're absolutely right about respecting others, but from his perspective others in the department also should at least listen to him, so it's hard to say whether "speaking out" is fighting or just trying to voice what he believes - which he should do if he does believe it would be helpful. (Doesn't mean he has to "win", but stopping speaking out or questioning things leads to problems.)
> Everything listed under "speaking out" gives me the same vibe.
> then encouraging a student petition and getting your name in a local paper about the incident? That's just asking for trouble.
On the whole, it sounds like there's a clear view that he's "loud" about his work, and this sounds like the biggest problem.
You see, I expect universities to prefer boring, non-rigorous courses, because more students can do them. When I was an undergrad, I could see that my own course was duller and less rigorous than that of my seniors, but more interesting and rigorous than those of my juniors.
>What the author interprets as proving obedience, I interpret as not wasting time on things the committee can't measure objectively, or doing things that might piss off people on the committee.
I'd be astonished if any academic committee in any university anywhere in the world had any idea how to measure the value of research objectively.
As for not pissing people off - I'd imagine it's impossibly hard to do truly original research without pissing at least some people off. There will be petty jealousies, back-biting, gossip, and all the usual nonsense. Too much, too soon, and hackles will be raised.
It's very sad. Academia seems to have become stifling rather than expansive.
My working definition of organisational dysfunction is when politics and status become primary motivators, and quality of output - and pride in that quality - become secondary.
From the outside, that seems more true of academia now than it should be.
Yep.
The academic job market is mind-bogglingly brutal, meaning that academic departments have no reason to go out on any limbs. As a result, hiring and tenure processes are extremely conservative and, as the author points out, normative.
I completely sympathize with the author; the promise of such shenanigans in the academic career path was part of my motivation for passing on a job offer and instead taking a software engineering job.
> I see academia as somewhere where people can make immense contributions to human knowledge
The problem is that, just like industry (or even more so), academia is plagued with infighting, personal vendettas, skewed incentives and lots of politics.
I think many believe academia is a purer, more rational and calm environment where it is all sharing and the peaceful pursuit of knowledge and so on. The reality is very different.
> but they have to jump through certain hoops to do so
The hoops are part of the problem. It seems he cared more about teaching and administration than about sucking up to other faculty or just cranking out publications like crazy.
The most important bit is probably him trying to improve things.
Improving things means undoing something that is already there. Given tenure, and how long tenured people hang around in academia, the status quo was established by many of the tenured people still there. Saying "I want to improve that" was read as "what you did sucked, I will make it better". That stuff is never acknowledged publicly, but come voting time, it won't be forgotten either.
"I interpret as .. doing things that might piss off people on the committee."
Sounds like obedience to me.
There's a great HBO tv show, "The Wire". It's all about how an institution can consume personalities of those who belong to it. I'm pretty sure tenure is the same way. I see this happening to people who work in corporations for a long time. Their personalities change as they become the sort of person who can succeed in such environments.
That being said, I suspect standout researchers get tenure and get to be themselves, just as talented people can succeed in corporations and still be themselves.
They are generally the exception rather than the norm, however.
Hi, although I'm new here and all of you have been teaching here for 20 years on average, let me tell you why everything you have done with the curriculum is terrible and should be completely rewritten in my style. And of course, I'll have to do all of this myself, since you are all incompetent. And I better teach the first round myself too, because I can't trust you with that either.
So yes, there is a difference between not pissing off people on the committee and "obedience".
Just because someone has been doing something for twenty years doesn't mean that what they're doing can't be improved. It doesn't even mean they're necessarily competent.
The professional response to a newcomer isn't to have a snitty fit of tutting and hissing, but to consider the possibility that maybe the younger newcomer has something to offer.
If they're just being Dunning-Kruger-ish then fine - snipe away.
But it's not at all a given that the situation is that simple - especially when they've been employed as a prospect in the first place, which suggests that at least an entire hiring committee met them and considered they had promise.
No, you are right, I was perhaps too harsh. But it takes some modesty and humility to pause and consider why the system ended up the way it has, and why the elders are resistant to change. The author shows no sign of modesty or humility, so it isn't at all clear to me that the elders were snitty or tutting and hissing. In fact, it sounds like there were other issues the author was deliberately ignoring: finite resource allocation, campus-wide priorities, an MS program, etc.
Being able to convince others, win allies, balance priorities, and just get along with others are important skills, and the author seems to excel at none of these. At least, judging by this post. Elsewhere in this thread, a student suggested that there may really have been some nasty infighting happening, in which case this poor guy may have just been in the wrong place at the wrong time.
As a student who has participated in some decisions regarding changing courses at my institution, I think classes here are usually bad because no one really cares. It requires tact to say "this course is garbage and you don't care about it anyway; let me handle it" without bruising egos.
But I think it's bullshit to say that not bruising egos is an important skill, especially in science. The kinds of scientists whose egos are easily bruised are the kinds of scientists who Max Planck was talking about when he said "science progresses one funeral at a time." People who refuse to acknowledge constructive criticism unless it's sugar-coated will continue to pursue the same ideas even after others have proved them wrong.
This is a systematic issue. Science is full of people with big egos. When scientist A criticizes scientist B's idea, ideally scientist A would think really hard about scientist B's criticism and either say "yes, you're right" or "no, here's what you're missing" (ideally with experiments). Scientists with big egos don't do this. They reject other scientists' criticism out of hand, and they criticize people based on feelings rather than ideas. The problem is contagious: Scientists with big egos attract more scientists with big egos, since those are the people who continue to believe they are great despite the barrage of nonsensical criticism. And egos tend to grow larger, not smaller, as people rise in rank.
Appropriately weighing others' evaluations of your ideas is really hard. It requires the technical skill necessary to come up with those ideas in the first place; the social skill to distinguish between sycophantic praise and true positive evaluation; and the emotional control to ignore anger or disappointment that might result from negative evaluation and focus on the content instead. In my experience, people who can do these things make much better scientists, and are much better to work with. They are less guarded when brainstorming ideas and more willing to change their minds in the face of superior evidence. But in modern science, there are relatively few incentives that favor accurate self-evaluation, and many that favor persistence above all else. In a world where ~5% of incoming graduate students go on to become tenured professors, people who aren't great and know it (or are great but don't know it) drop out early, and the people looking for positions end up being a combination of great people and mediocre people who think they're great.
Maybe. Or maybe they care and simply have a different perspective on what is the best approach. Or maybe they have different priorities. Optimizing for one variable (e.g. making one specific course awesome) at the expense of all others (hedging against future enrollment trends, pleasing the board of directors & alumni, limited faculty resources, ....) is simply not how universities work.
So I agree with you, somewhat, in some circumstances. On the science side of things, maybe. But we're also talking about the college's dog policy here. The author seems to think it is completely obvious that he should be free to bring his dog to his campus lab, and shows no awareness that there may be good reasons for a no-dog policy. Maybe there are students with allergies? Or maybe UB has been sued over this in the past? Or maybe it's a state law? I don't know.
So reading your comment carefully, it isn't actually obvious if you think it was the author's colleagues who had the big egos and won't listen to criticism, or the author himself.
At my university, I've asked about educational priorities only to be told, by senior professors, "making this class better or worse won't help anyone's career." This is probably what you mean by "having different priorities."
A benefit to what? People arguing for years over boring research questions where the evidence clearly favors one viewpoint over another does not benefit anyone except the people involved in the argument, who can ask for more funding to resolve this hotly debated open question.
At least people should recognize the symmetry of hiring and firing. At some point, Bombardier hired each of those 7,500 people. Did this make the news? Were there HN posts praising the management for creating those jobs?
I think you're missing the implication in the original comment, that the NSA has put a backdoor into NIST and the Russian equivalent has put a backdoor into GOST, but neither can use the other's backdoor.
I cannot tell you how silly this is. If you're using GOST, you're no longer building NIST-compliant crypto. If you're using NIST, you're no longer building GOST-compliant crypto. For Christ's sake, just stop using standardized crypto if you're worried about backdoors like this. Use an eSTREAM portfolio cipher for bulk crypto, use Blake2 as your hash, and use Curve448 for key agreement and signatures.
You can just use the Noise protocol framework to accomplish this, which was designed to use all of these components.
It's still NIST-compliant crypto on the outside regardless of the inner contents. Assuming NIST(GOST(plaintext)), NIST would be terribly broken if using GOST ciphertext as the payload weakened its security.
It's crypto 101 that, to anyone without the key, a ciphertext should be indistinguishable from random data of the same length.
I'm shocked that you think the plaintext contents would have an effect on whether or not something is NIST compliant.
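To make that concrete, here's a toy sketch of the composition (my own illustration, not anyone's standard; ChaCha20-Poly1305 stands in for the GOST layer, since common Python libraries don't ship GOST ciphers):

    # Toy cascade: outer "NIST" layer (AES-GCM) over an inner layer
    # (ChaCha20-Poly1305 standing in for GOST, which the 'cryptography'
    # library doesn't provide). Independent keys: breaking one layer
    # alone still leaves only the other layer's ciphertext.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

    inner_key = ChaCha20Poly1305.generate_key()
    outer_key = AESGCM.generate_key(bit_length=256)

    def cascade_encrypt(plaintext: bytes):
        n1, n2 = os.urandom(12), os.urandom(12)
        inner_ct = ChaCha20Poly1305(inner_key).encrypt(n1, plaintext, None)
        # The outer cipher sees only the inner ciphertext, which is
        # indistinguishable from random bytes -- the point made above.
        return n1, n2, AESGCM(outer_key).encrypt(n2, inner_ct, None)

    def cascade_decrypt(n1: bytes, n2: bytes, outer_ct: bytes) -> bytes:
        inner_ct = AESGCM(outer_key).decrypt(n2, outer_ct, None)
        return ChaCha20Poly1305(inner_key).decrypt(n1, inner_ct, None)

    n1, n2, ct = cascade_encrypt(b"attack at dawn")
    assert cascade_decrypt(n1, n2, ct) == b"attack at dawn"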
I don't follow this objection even a little bit, but I'm also not very motivated to try, so: no need to clarify. I'm just going to reiterate.
I am making a very simple point. If you don't trust NIST standards because you think they're backdoored, but won't run Russian standards because you think they might be too, the answer isn't to compose the two flawed standards.
Instead: just use a crypto stack composed of well-reviewed, well-regarded components that are neither NIST nor GOST standards.
Nobody in the world thinks Curve25519 is backdoored, or that Chapoly is, or that Blake2 is.
In fact: this is what I think you should do anyways. Maybe, just maybe, you should keep using AES because it will be more performant --- but the cycles/byte cost of bulk encryption is so low that I'm skeptical that this matters. Otherwise: avoid crypto standards like NIST and GOST. Standards processes produce crypto that is at best ungainly and at worst actively harmful. Standards are evil.
I am, of course, addressing this advice to the very, very limited subset of engineers who should be working with crypto directly. Everyone else should just use Nacl.
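For illustration, a minimal sketch of that advice using PyNaCl (the libsodium binding): Box gives you Curve25519 key agreement plus XSalsa20-Poly1305 authenticated encryption, and blake2b is right there too. The names and message here are made up:

    # Sketch of the "just use NaCl" advice via PyNaCl. None of these
    # primitives are NIST or GOST standards.
    from nacl.public import PrivateKey, Box
    from nacl.hash import blake2b

    alice_sk = PrivateKey.generate()
    bob_sk = PrivateKey.generate()

    # Encrypting from Alice to Bob; PyNaCl generates and prepends a
    # random nonce when none is supplied.
    ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"no standards here")

    # Bob decrypts with his private key and Alice's public key.
    assert Box(bob_sk, alice_sk.public_key).decrypt(ciphertext) == b"no standards here"

    digest = blake2b(b"some data")  # hex-encoded Blake2b digest by default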
If you think standard A is good except for a potential backdoor with the key held by entity X and you think standard B is good except for a potential backdoor held by entity Y and you assume entity X and entity Y do not cooperate, then composing A and B is completely reasonable.
It's the same thing as having 3 computers vote on the space shuttle control signals. You could follow your argument and claim, "If the software has a bug in it that would produce output different from the other 2, then don't use it!" The problem is that we don't know if there is an issue or not, so we go the safer route with multiple implementations.
You also did not address the main issue I have with your comment. You made this assertion: "If you're using GOST, you're no longer building NIST-compliant crypto.", which implies that the contents of the plaintext determine if the crypto is NIST compliant. This is completely false.
It's like claiming that uploading an AES encrypted file over an HTTPS connection is less secure than uploading via HTTP.
Considering only the secure channel problem and not the entire systems problem (which might motivate encrypting clientside in anticipation of the file being stored), encrypting before sending on a secure channel is indeed pointless, which is the reason you'll find very few soundly designed cryptosystems that do this.
The point is again simple: there are far better options to untrustworthy standards than composing them in the hopes of mitigating their flaws. It's for the same reason that we used to use hash combiners to handle MD5 and SHA1, but now we use HKDF over SHA-2.
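To be concrete about the modern construction, a minimal HKDF-over-SHA-256 sketch with Python's 'cryptography' library (the labels are invented for illustration):

    # HKDF over SHA-256: derive independent keys from one shared secret,
    # instead of composing hash functions ad hoc.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    shared_secret = os.urandom(32)  # e.g. the output of a DH key agreement
    salt = os.urandom(16)

    def derive(info: bytes, length: int = 32) -> bytes:
        # HKDF objects are single-use, so build a fresh one per derivation.
        return HKDF(
            algorithm=hashes.SHA256(),
            length=length,
            salt=salt,
            info=info,  # distinct labels yield independent keys
        ).derive(shared_secret)

    encryption_key = derive(b"encryption")
    mac_key = derive(b"mac")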
>which is the reason you'll find very few soundly designed cryptosystems that do this.
Nearly every secure system I've dealt with (in the military side) encrypted at the network layer (VPN) and they sent encrypted files over that channel.
Yes, because (as I just said), encrypting files mitigates systems problems outside the scope of the secure channel problem. A secure channel doesn't help you if the bag of bits you send down it ends up persisted on an exported, unencrypted filesystem.
That doesn't mean that redundant clientside encryption of files is a sensible feature for a secure channel to have.
As a former "left-anarchist" I'm familiar with this sort of story. We had similar stories of left-anarchist utopias from (1) pirates (2) partisans in the Spanish civil war and (3) various "workers collectives" throughout history. None of them lasted and most collapsed due to internal problems not being crushed by the nation state.
What has been shown to work in the long run, is Western democracy.
That is a very skewed view of history. The Roman Empire lasted around 400 years. The British Empire controlled the world for almost 300 years (and the monarchy is still in place after almost 1000 years), and those are only two examples. The US was founded in 1776; the French Revolution, which gave birth to the modern Western Democracy concept, happened around 1790. That is at most 240 years, and it only became predominant after WWI, so around 100 years. Although in my opinion superior (or maybe I'm just biased), western democracy is not assured to work in the long run; many countries, after adopting it, have gone back to monarchies or dictatorships. And after all, all forms of government that preceded it have been replaced.
Democracies demand that political leaders limit their power, both in the time-span of their mandate and in their share of it, by seeking consensus among parties with different views. This forces governance to be moderate by design, compared to the merely chosen moderation of other forms of government. However, it also presents an attack surface to external powers: political interference is much easier. Also, holding a leadership role in a democracy, given its limitations on power, can be quite frustrating if you want results quickly. That alone is enough to explain a few of the relapses into dictatorship through power encroachment by ambitious political leaders. Add in other deficiencies of the political context, like weak political opposition and dirty (as in crime-like) methods of rivalry, and you get the idea. The populace may live "content" to various degrees under many forms of governance, but that is not what makes the defining difference. The real difference lies in the inherent ability to cope with various challenges (those that have to be dealt with at the political level), my favorite of which is change in all of its forms. All forms of governance have benefits and drawbacks, and employing the right one is tricky; it's an ongoing experiment that humans are yet to learn from.
We are in agreement then, democracies can fall for lots of reasons. They also have the potential to be very successful.
My point is democracy as it stands today is very young and if you only compare it against the failed attempts of Communism you will miss almost all of civilized human history.
Yeah that makes sense, but it seems fair to argue that relative to other forms of government, a properly balanced democracy which offsets the functions of government tends to be more stable and last longer than competing alternatives such as dictatorships or communism.
Dictatorship (or totalitarianism, my preferred term for it) is not assured to be unstable. If done right, totalitarian regimes manage to periodically weed out internal threats, including dissidence, and thus keep the remaining masses content, a trick employed at least since the dawn of written history with the Ancient Egyptian police[1]. As I see it, the totalitarian political system, however, being one of the most rigid of all, is more prone to corruption than others. (You may think of the pun about power tending to corrupt and absolute dictatorial power corrupting absolutely, but I'm more concerned with the precarious ability to react to different forms of corruption.) Communism's main idea is to please the majority, which happens to be the mass of the mediocre, at the expense of the ambitious minority, whose needs take a back seat. This makes communism a lesser environment for development, but it's not in itself a destabilizing factor.
Cuba, Chile, Argentina. They all had democracies, and then collapsed into dictatorships. I'm sure someone more knowledgeable could also name some African or Middle Eastern countries.
Does that even matter? Democracy does not make you immune to the rest of the world. Democracy is not only an idea; if you want to claim its robustness, you have to make sure it is able to cope, in practice, with internal and external opposition, just like all other forms of government. I'm just pointing out that in many cases it has not.
Oh I totally agree, it almost seems that democracy works best when wealth and resources are abundant. This abundance is often at the cost of other nations, in which democracy probably won't last because of the external pressure.
Another example of "western democracy is not assured to work in the long run, many countries after adopting it have gone back to monarchies or dictatorships": the Weimar Republic.
In a few years' time, we will read about the Cheran mafia's ascent to power. These utopian dreams almost always result in the oppressed becoming the oppressors. This quote sums it up quite nicely:
“But here's some advice, boy. Don't put your trust in revolutions. They always come around again. That's why they're called revolutions.”
― Terry Pratchett, Night Watch
I'm sorry I don't understand your comment. Not that important for what? The British tradition (which tradition?) had a bigger impact on what?
And all I'm saying is that at some point we get bored of calling successful revolutions, well, revolutions, and just accept them as the new normal. But we remember the short-lived ones as revolutions because we never got around to giving them another name, and thus we associate revolution with failure, when that is not the case at all (one way or the other).
> I'm sorry I don't understand your comment. Not that important for what? The British tradition (which tradition?) had a bigger impact on what?
The American revolution, Constitution and Bill of Rights are all inspired and influenced by, among other things, the Magna Carta, the English revolution and English Bill of Rights.
Let's ignore the unexplained impact you claim the American Revolution has had on "Western Democracy" and examine the impact it had on democracy merely within the United States:
In 1776 the only people who were entitled to vote were white men who owned property.
100 years and a civil war later, the 15th Amendment was passed in an attempt to give black men the vote.
Another half century after that, women were given the vote.
Almost another half century after that - 200 years after the American Revolution - the Voting Rights Act had to be passed to finally, properly give black people the vote.
To put it another way, New Zealand had universal suffrage when Queen Victoria was on the throne; the USA didn't even fully have it for black people when Barack Obama was born.
If we follow your example then no one really has a true democracy yet, after all in most countries people under 18 (or 21 or whatever age they use) are not allowed to vote. This might be considered barbaric in the future.
You are confusing your ideal of democracy with what was actually a change in the way to look at government: it was a complete rejection of the monarchy and its god mandated right to rule unlike anything that came before.
And sure, everything is influenced by what came earlier; after all, Thomas Paine, considered by many the intellectual father of both the French and American Revolutions, was actually British. What started in 1776 was the beginning of modern democracy, and it inspired many western colonies to break from their colonizers with their own democratic revolutions (not all successful). It was not perfect, it has evolved a lot since then, and it will continue to do so.
I will also argue your point about New Zealand. If your democratic process can be subverted by the whims of a single person using the power of his/her inherited authority on the assumption of being divine, then you don't really have a democracy, no matter how many people are allowed to vote.
But again, that is not the point I was making in my response to op. I was just merely pointing out that not all revolutions end in failure.
> If we follow your example then no one really has a true democracy yet, after all in most countries people under 18 (or 21 or whatever age they use) are not allowed to vote.
They're allowed to vote when they reach the age of 18 or 21. How do people excluded from voting due to their race or gender become eligible to vote?
> You are confusing your ideal of democracy with what was actually a change in the way to look at government: it was a complete rejection of the monarchy and its god mandated right to rule unlike anything that came before.
I've already given examples where monarchical rule was rejected and curtailed - events which inspired and influenced the American revolution - so why are you still pretending the first people to come up with such concepts were American revolutionaries?
> I will also argue your point about New Zealand. If your democratic process can be subverted by the whims of a single person using the power of his/her inherited authority on the assumption of being divine, then you don't really have a democracy, no matter how many people are allowed to vote.
Take a look at English history, say, Roman times (43 AD) to present. It's fascinating on multiple counts. I'm sketchy on this myself, but a rough outline:
The Romans were, of course, a foreign occupying force, but they served to keep other foreign occupying forces out. Once the Romans left (~450 AD), that stopped being the case, and England was successively invaded, mostly by Saxons (from present-day northern Germany) and later by Norsemen ("Northmen", a/k/a Vikings; Normandy was also settled by the Norse).
William the Conqueror (Norman) was the last foreign invader to substantially engage the dominant British inhabitants on their own soil, and the last to defeat England (absent invited parties, e.g., Mortimer and the Prince of Orange). 1066 was the last time until the First World War that England was subject to any significant foreign attack, and the last time until the Second World War that London itself was engaged by an enemy. I find that pretty significant.
There were numerous, almost (though not quite) always peaceful, devolutions of power from the Crown to an ever-expanding scope of at first nobles, and finally commoners: the Magna Carta (1215), the Wars of the Roses (1485), the English Civil Wars (1642 & 1648), the Commonwealth of England (1649), the Protectorate (1653-1659), the Bill of Rights (1689), Chartism (1838-1858) (the UK's answer to the Revolutions of 1848 and their populist reforms), the Local Government Acts (1888, 1894), and the increasing welfare-state reforms of the 1930s-1950s.
Throughout almost all of this has been a devolution of power from the centre to the periphery.
It's also occurred largely by the fact that those claiming additional power -- barons, lords, a growing mercantile and political class, and finally the proletariat -- could effectively make such demands, threatening disruption otherwise.
The concept of a revolution in the modern sense seems, well, quite modern: France (1789), Europe generally (1848), Russia (1917), and the subsequent Communist revolutions of China and Cuba. (I'm excluding the Communisation of Eastern Europe, which was in fact an act of imperial oppression by Soviet Russia.) The Fascist revolutions of Italy and Germany (a strongly cautionary tale). The fall of the USSR and Communist Eastern Europe, though, should fit. I'd include the Iranian Revolution (1979), though that was not in the Liberal tradition.
More generally, power transitions are from one oligarchical group to another, though frequently playing on public sentiments.
The Brits had a gradual accumulation of rights. First more for the upper nobility, later for the common man, too. They also had pretty stable governance for quite a while now.
The Civil War interrupted that stability a bit (as AlgorithmicTime points out in a dead comment), but that wasn't as much upheaval as e.g. the French revolution.
> Not that important for what?
Western Democracy. Sorry for the confusion, I was relying too much on context here.
I now get what you are saying, thanks for the clarification.
I agree with you, democracy can be traced to the British, and further back to the Romans, and even the Greeks, but it wasn't until the French Revolution that its core concept was implemented: namely, that there are no divine rights of kings (Britain also contemplated that idea; it is a shame they didn't follow through).
Democracy is not only about voting or rights or property; it is also the idea that the people can decide for themselves the way they want to live. Voting is just the best way we've come up with so far to decide that.
The Industrial Revolution was a factor in the uptake of democracy, sure. Just as WWI, WWII and the Cold War were. But it all started in 1776 in the US with the Declaration of Independence and in the 1790s with the French Revolution, which was the inspiration for South America to declare independence from Spain and establish its own democracies with varying levels of success. The French Revolution didn't last in France, not at first, but its core concepts are the basis of our current democracy, in large part because the American Revolution kept them alive and spread them.
> What has been shown to work in the long run, is Western democracy.
How "long" has "Western democracy" been around for?
In any case, I don't think it matters which -cracy/-archy/-ism it is. What ultimately works is that the people be allowed to do what they want to do... within universal moral standards (e.g., murder is bad).
As long as people are allowed to buy what they want, watch, read and play what they want, build what they want, go where they want, work where they want — if they have enough money — and say what they want and have relationships with whomever they want, then on the whole they will be complacent if not content or happy. They won't care what type of government it is or who is running it.
Take a contemporary issue for example. Do you think you can even vote out the NSA and dismantle all domestic surveillance mechanisms at this point in America? But the majority of people don't care because they're more or less content.
I reside in a third-world country that isn't exactly a bastion of Western Democracy, yet the people here more-or-less live their daily lives out as one might in a Western country, so they don't really pay their government much mind.
I think you are missing the bigger picture. This is a story of a town dealing with the problems that arise in a failed state. The Mexican state and federal governments have failed to provide the basic services that they are responsible for, most importantly security.
Even when every single person complains, the result of democracy is not to make each and every person happy.
Case in point: you can change things, even radically, in the western democracies. All it takes is to create a new party (the law and the constitutions place only very broad limits) - then you can get elected.
The point that new parties have not actually been elected is not a sign that democracy doesn't work, it is a sign that people in summary don't actually think any of them would be an improvement.
Just because you don't like some of the outcome doesn't mean it's not a democracy.
> The point that new parties have not actually been elected is not a sign that democracy doesn't work, it is a sign that people in summary don't actually think any of them would be an improvement.
Whilst this is true, it also matters how the people's will is summarised, and (perhaps even more importantly) how that summing method affects the behaviour of the people.
The obvious example being that first-past-the-post systems have a tendency to get ~2 dominant parties (e.g. US and UK), whilst proportional systems tend to get more parties or coalitions (e.g. Germany and Scotland).
Hence, in a first-past-the-post system, the fact that new parties don't tend to get elected is a sign that people don't think those parties offer enough of an improvement to convince a majority of voters to sacrifice their influence on the race between the big 2. It's like a giant prisoner's dilemma, where it's possible for every individual to think that some new party is a big improvement, but for nobody to vote for that party as they don't trust others to cooperate.
Although there are second-order effects, e.g. it may be worth chasing a seat in parliament even if there's no hope of forming a government; or getting second-order effects from an issue/protest vote (e.g. voting UKIP to influence whichever majority party wins to focus on immigration)
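To make the contrast concrete, here's a toy sketch (invented vote shares, nothing empirical) comparing a single-seat first-past-the-post race with D'Hondt proportional allocation:

    # Toy illustration (invented numbers): the same vote shares produce
    # very different outcomes under first-past-the-post vs. D'Hondt
    # proportional allocation.
    votes = {"Big A": 40, "Big B": 35, "Newcomer": 25}

    # First-past-the-post: one seat, winner takes all.
    fptp_winner = max(votes, key=votes.get)

    def dhondt(votes: dict[str, int], seats: int) -> dict[str, int]:
        """Allocate seats by repeatedly awarding the highest quotient v/(s+1)."""
        alloc = {party: 0 for party in votes}
        for _ in range(seats):
            best = max(votes, key=lambda p: votes[p] / (alloc[p] + 1))
            alloc[best] += 1
        return alloc

    print(fptp_winner)        # Big A -- the Newcomer's 25% wins nothing
    print(dhondt(votes, 10))  # {'Big A': 4, 'Big B': 4, 'Newcomer': 2}

Under FPTP the newcomer's 25% yields nothing, which is exactly the coordination problem described above; under the proportional rule it translates directly into seats.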
No, just like the language lawyering it's neither necessary nor sufficient, but the term is of course a bit fuzzy anyway.
I guess "xenophobic asshole" is a more accurate term but I thought "nazi" conveys enough meaning without resorting on language Americans might find offensive.
To clarify: there are valid arguments against immigration and there are especially valid criticisms on how immigration is handled (e.g. I would argue the German approach has for a long time largely consisted of hoping they either go away or magically assimilate on their own). But any argument that is based either on the requirement for immigrants to abandon their entire identity or on ethnicity alone is blatantly xenophobic and dehumanizing.
Another possibility is that the prosecutor didn't consider it relevant to the case whether Goodman was a journalist or not, but chose to answer the question anyway. E.g. the quote "I think she put together a piece to influence the world on her agenda, basically. That’s fine, but it doesn’t immunize her from the laws of her state" in no way implies that her actions would have been legal if she were a journalist.
It is the article that asserts that "arresting a working journalist" requires a special explanation.
This is a very real problem for Trump. Trump has already been forced to change his rhetoric on Israel, and even Pence doesn't support Trump's plan for Syria (letting Russia/Assad win).
On the other hand, Trump has a huge amount of support for him personally that hasn't been seen before, so maybe he can leverage this against the neocons in his own party.
Clinton's plan for Syria is scary. Ironically it displays the kind of toughness that Trump claims to represent. Personally I think it is a bad move, and Russia has already been reasonable, e.g. backing the Iran deal.
You don't even need to be a libertarian or an isolationist to believe that the neoconservative approach is wrong.
This is the only issue that really worries me, Clinton might actually attempt to get some leverage over Russia as she says. In such a bold move, Russia will feel so cornered that it may break out into a major war, with possible use of limited nuclear weapons.
The western world doesn't understand the mentality of Russia, they have had two regime collapses this century already. Putin believes that everything needs to be done to avoid a third one.
I'm not even in the US, but have been following the election closely. What a circus, both candidates are horrible choices. But Clinton's foreign policy is what scares me, at least based on what I have heard so far.
Another concern is some type of civil unrest after the election.
This is how Japan was pushed into WWII, and how China right now is being pushed into arming up.
Quick history lesson:
China basically dominated the world, even when the world wasn't aware of China's existence for most of history. Then, when England and the US came knocking on China's door, they managed to force China into a mix of submission, collapse and opening.
Then the US tried to repeat the feat with Japan, starting with the infamous "black ships" (what the Japanese called the mysterious US warships, back when Japan still used wooden ships).
Japan then made a serious attempt to avoid "being the new China", and started to literally imitate the US and Europe: invade everywhere, and try to become a colonizing superpower.
This, in the end, is the reason Japan ended up in WWII.
Russia saw what happened to the countries around it. Iraq was literally created by England, with borders intentionally drawn badly to create internal problems (Lawrence of Arabia publicly proposed this). US and European actions in the Japan and Korea region basically turned Japan and Korea into virtual US colonies; in fact, the Japanese plans I mentioned earlier failed badly, and Japan DID become a "new China" that must obey US interests, ending up not with "black ships" at its ports but with a permanent US base on its territory.
Not only to Putin but to the Russian population, things like sanctioning Russia are viewed as a strongarm attempt to force Russia into submission. To the Russian population, the fact that they are becoming poorer due to US sanctions and US allies' oil-price meddling isn't a reason to become angry at Putin; to them it is reason to consider the US the ultimate enemy, and to do their best to support Putin no matter what happens.
The US, England and France seemingly don't understand that after two centuries of meddling in Asia in an imperialistic manner, a country that has always been very imperial itself will see them as a major threat and will never, ever back down.
To Russia, nuclear war is more desirable than "slavery"; it is better to die than to submit.
(And this is not even counting the psychological effects of Russian terrain... Russian geography is so fucked up that only people who are mentally resilient and willing to endure famine, poverty and extreme situations will live there.)
I wish the press would call her out on using the term "leverage". If the US military confronts Russia's proxies at the no-fly zone and forces them to turn back, then that would give the US leverage against Russia. That's what the term implies: changing the situation on the ground and then going to Russia and saying "now what?". When Clinton uses the term "leverage" she is misleading voters, who don't realize that enforcing the no-fly zone will require confronting Russian planes directly.
This is what worries me, especially given the state of both the mainstream press and feminism right now. Here in the UK, we've already seen the leader of the opposition accused of supporting harassment and violence against women for opposing bombing Syria, merely because some of the pro-bombing MPs who had anti-war protests outside their offices were female. Imagine what will happen once the President of the US is a hawkish, well-connected Democrat woman: one who can defend a man accused of brutally raping a 12-year-old girl, leaving her with massive internal injuries, by convincing the court to put the girl through a forced psychiatric examination so nasty she refused to testify afterwards, and by using an expert witness who argued that little girls fantasise and lie about sex with older men all the time, and have the press spin this as a feminist act that only right-wing propagandists could object to. We're doomed.
Putin made two points about this in June, 2016[0]. First, it's not possible to verify what's in the warheads of anti-ballistic missiles. Second, defense and offense are part of a single set of capabilities. If you remove the opponent's ability to retaliate against a first strike, that is just as dangerous to them as if you had increased your offensive capability. I'm well out of my depth in terms of judging if these statements are true, but this is what he said.
The issue isn't the tactical implications of putting ABM launchers in Poland. The issue is the false claim (propaganda?) that the US put nuclear weapons in Poland.
I'm actually talking about Romania, which isn't technically on the border (if we now insist that Ukraine is not Russia-affiliated), but still damn close in terms of missile flight time.
And yes, they are nukes. They were moved there after the Turkish "coup".
So how does that work? The government/rich people/"capital" conditioned people to consume services whether they need them or not, in order to create these jobs? If so can you give some examples of this? Or is it that government/industry created regulations and practices that created these jobs (I could think of law as being like this, but not many people work in law, less than 1% of non-farm employment in fact)?
My take is that we use the labor that was being used in farms for new productive purposes including health, education (as consumption, not investment) and entertainment. It's undeniable that life expectancy is increasing so we are getting something out of this increased efficiency.