The Rise of "Logical Punctuation". (slate.com)
253 points by brianl on May 13, 2011 | 200 comments



>> If it seems hard or even impossible to defend the American way on the merits, that's probably because it emerged from aesthetic, not logical, considerations. According to Rosemary Feal, executive director of the MLA, it was instituted in the early days of the Republic in order "to improve the appearance of the text. A comma or period that follows a closing quotation mark appears to hang off by itself and creates a gap in the line (since the space over the mark combines with the following word space)." I don't doubt Feal, but the appearance argument doesn't carry much heft today; more to the point is that we are simply accustomed to the style.

This is the real story here, I think: people invented the rule to suit their preferences, but over time we've forgotten the rule's origin and now treat it like a holy truth. (Or worse yet, a matter of "grammar"! Run, run - you've made a grammatical mistake!) You are likely to discover this over and over again if you study the background of many rules that (some) writing teachers insist on and that people like Lynne Truss use as an excuse to foam at the mouth.

Here are some of my least-favorite myths, in no particular order:

+ You should never end a sentence with a preposition. (Sheer bullshit: English uses countless phrasal verbs ('throw away') and in many other cases avoiding the final preposition produces stuffy nonsense.)

+ You should never split an infinitive. (A completely made-up rule, based on mistakenly trying to apply Latin rules to a Germanic language.)

+ The word 'hopefully' can only mean 'in a hopeful spirit' and therefore you shouldn't say, "Hopefully, we'll arrive before lunch tomorrow." (Sheer bullshit again: 'hopefully' there functions as an adverb modifying the entire clause 'we'll arrive before lunch tomorrow'. The sentence as a whole clearly and obviously means "It is to be hoped that..." or less formally "We hope that..." This use of 'hopefully' is no different than 'fortunately', 'sadly', 'happily' or 'luckily' in countless sentences.)

+ Don't start sentences with 'but' or 'and' or 'however'. (Just goofy.)

+ Never use the passive voice. (Overdoing it at the least: Yes, a lot of bureaucratic and other bad writing uses the passive in excess, but the passive is not per se evil or always wrong.)


You should never end a sentence with a preposition.

The mathematician Paul Halmos loved issues like these and once constructed a sentence that ends in five prepositions: "What did you want to bring that book that I didn't want to be read to out of up for?"


  I lately lost a preposition
  It hid, I thought, beneath my chair
  And angrily I cried, "Perdition!
  Up from out of in under there."

  Correctness is my vade mecum,
  And straggling phrases I abhor,
  And yet I wondered, "What should he come
  Up from out of in under for?"
-Morris Bishop in the New Yorker, 27th September, 1947


This is the kind of thing up with which I will not put.


In my mind this "sentence" validates the need for the rule. :-)


Ah, but can you rephrase it using the rule to make it better?

"For what did you bring up that book from out of which I did not want to be read?"

Not a huge improvement.


It's a shitty and awkward sentence to begin with (with which to begin), primarily because it unnecessarily uses the passive voice to entangle two ideas that are clearer when approached one at a time in the active voice:

I didn't want anyone to read out of that book to me. Why did you want to bring it up?

See? Contrary to what is suggested above, the whole thing is MUCH better with the active voice. Note that in the case of "bring up," "up" is an adverb, not a preposition, so I am not ending the second sentence with a preposition.


Why did you bring me this book? I don't want to read it.

Speak normally and you don't have an issue. :-)


Simpler, but with a changed meaning. The original speaker does not want to be read to from the book; he does not contemplate reading it to himself.


I'd say it's an improvement; unlike the original, I am able to parse your version.


"I do not want to be read to from that book, why did you bring it?"

There is always a better rewrite.


Every language has ways of "legally" abusing it, and individual examples of such abuse do not invalidate the language as a whole.

See also http://www.ioccc.org/ .


I'm trying to parse this, but it still doesn't make perfect sense:

  What
   did you want to bring that book
    that I didn't want
     to be
      read
      to
     out
    of
   up
  for


It's got some idiom in it as well. Try:

What did you want to bring that book (that I didn't want to be read to out of) up for?


  What
    did you want to bring that
      book that I didn't want to be read 
        to
      out of
    up
  for?


It's also a chiasm!


Adherence to arbitrary grammar rules is a signal. It shows that the writer has training in what the rules are, and the ability to follow them. To some readers, this can be a signal of credibility. I don't care if the rules make sense; I care if someone will think less of my writing if I break the rules.


People often reply something along these lines. I don't entirely disagree, but I have some big reservations. First, which rules? That is, does it still bother you if a writer uses 'will' where some very stuffy people would prefer (even insist on!) 'shall'? If not, why not? If so, why? Notice that no matter which way you jump on this, you're already making choices. What about 'nauseous' and 'nauseated'? Do you insist on that distinction in writing? How about in speech? Do you correct people who say 'can' instead of 'may'? How about 'less' and 'fewer'? Do you say 'your being here makes me happy' or 'you being here makes me happy'? There's always going to be room for some pedantic asshat to correct you. It's up to you, however, to pick the cases where you push back, or shrug, or accept the correction quietly (because you've had the fight too often). You also have to decide who you want to impress. If you carefully rearrange all sentences so that they don't end with a preposition, you may impress the grammar school teachers and the self-satisfied pedants of the world, but you will only look silly to the professors of Linguistics over at Language Log[1].

So it's not entirely good enough to say "Just follow the arbitrary rules in order to demonstrate that you are a thoughtful, well-educated, intelligent person." It's always a case of pick your rules and your battles.


I do have to choose among rules. The choices that I make are also a signal. I am an American, and I work in a profession where writing is important. I try to make style and grammar choices that match those of well-thought-of American writers in my profession. I do try to avoid choices that might make some people think I don't know the "correct" definitions of words (nauseous vs. nauseated, for example). I don't care as much about "will" and "shall," because American writers I respect mostly use "will," unless they are writing statutes or contracts. Also, I try never to correct other people; I am not much of a prescriptivist.


You have a [1] but forgot to include the footnote.


Sorry about that. It's linked later in this thread, but still, here it is.

http://languagelog.ldc.upenn.edu/nll/


This is similar to "geeks do not respect suits". Suits are meant to signal credibility. But since geeks value ability much more highly than style, the suit, being worn by "less intelligent" salespeople and managers, became a negative signal.

Similarly, perhaps online the use of American punctuation is becoming a negative signal, showing pedantic training as a "professional writer" rather than a true ability to convey ideas.


It can also be a powerful signal of an unoriginal thinker, forever destined to do what they're told and to simply be a replaceable cog in a machine.

If you are looking for a technical writer, accountant, copy machine operator, security manager, etc., a rule follower is okay. These people don't ask questions, don't go against the grain, and can be relied upon to simply produce output in a well-structured process at a uniform consistency. In effect they are trained to produce fast food. This can be highly desirable in many cases -- but it shouldn't be lauded.

If you are looking for a novel thinker, a rule follower is the last place you want to go. You'd want them to know the rules, and be familiar with them, but not be chained to them/use them as a crutch or excuse. However, like most creative endeavors, it can be hard to get consistent, regular output. They aren't unaware of the rules, they simply ignore them. They aren't better than a person who can assemble words into sentences with the regularity of an automated assembly line, but they are definitely different.

Where it gets confusing is knowing the difference between somebody who purposely ignores the rules to achieve a desired output vs. somebody who is completely ignorant of the rules. I've personally found that people who are strict rule followers have the hardest time with this distinction.


When I am reading, I first look for signs that the author has been trained to know the rules. Later, I look for him or her to break them.

There's something about knowing that you are breaking a rule that makes it feel more legitimate to me. You are using your license instead of your ignorance. I enjoy seeing that in what I am reading.

Edit: I wrote this in part to encourage you to relax your style and care a little less. Tell me what you think :)


I recently finished Bill Simmons' The Book of Basketball. He frequently breaks a grammatical rule, then includes a footnote saying something like "Yes, that really did deserve a double negative."


Understood, but what about this: Before reading this article I didn't know about these rules. In Germany, we seem to follow the "British way". Couldn't resist...

So this argument seems only to hold if you a) assume that your readers are from the USA (or North America in general; no idea about Canada?) so that they value this style, and b) assume that they don't just stumble upon your texts: they need to know where you're from.

This is necessary because it seems that a British person could easily claim that you're incompetent because you follow the exact rule we're talking about, but it doesn't apply locally for him. Okay, that's a bad example. AE and BE in writing are probably different enough that someone speaking/writing BE will just chalk this up as another difference.

But here comes the rest of the world. We know about fall/autumn, colo[u]r and whatnot, but I'd say that it's not generally easy to say where some English text originates from geographically, or where its author lives. So if I cannot distinguish AE and BE with certainty, do I blame you or a BE person for handling this "wrong"?


Knowledgeable readers who care will likely know both standards; at that point it's a matter of consistency (or, at the very least, never doing something that breaks both sets of rules).


Agreed. It's largely a class marker, and class registers as credibility, depending on the subject matter.


I'd say that in this case I'd give more credibility to someone who follows logical punctuation, as long as they followed it consistently. I prefer it both logically and aesthetically, and, perhaps because of this bias, would wonder if someone following the "periods and commas inside the quotes" rule is just mindlessly following a rule.

Language evolves over time, and in my opinion logical punctuation has become common enough to be a credible choice. Most people these days take the "do not end sentences with a preposition" rule with a grain of salt, and I think that the MLA method of quoting is headed in a similar direction.


As the legend goes, the "never end a sentence with a preposition" rule was actually a revisionist prescription at some point in the (relatively recent) past. The goal was to make English more "elegant" and Latinate in its usage. This prescription actually flies in the face of English's origins. English did not derive from Latin, even though it borrows a lot of words and phrases from Latin. The rules of English grammar do not follow those of Latin, nor should they.


Modern English is a hodgepodge of a variety of languages. The history of the language is really quite interesting, and I suggest anyone look into it if they have an interest. If you tried to read pre-Norman English today, you'd be hopelessly lost (unless you were previously trained to do so). Take a look at the original text of Beowulf and compare it to, say, the Canterbury Tales. The Canterbury Tales is a rough read; Beowulf is unintelligible.


True, although the rules of grammar and construction have not changed as dramatically as the vocabulary, spellings, or pronunciations have. Grammatically and syntactically speaking, English is still quite Germanic. That's why it is (supposedly) easy for a native English speaker to learn Dutch or German; their conjugations and sentence structures are more intuitive to us than those of the Romance languages.


+ You should never end a sentence with a preposition. (Sheer bullshit: English uses countless phrasal verbs ('throw away') and in many other cases avoiding the final preposition produces stuffy nonsense.)

This is an ineffective counterexample. "Away" is an adverb, not a preposition.


Hah: I'm a dope. I originally wrote 'throw up' and changed it because I thought people would object that it was informal. I picked 'throw away' pretty much at random, starting with 'throw', and I didn't even stop to analyze 'away'.

I can no longer edit the original, but thanks for the correction.


A better example might be, "Do you want to come with?"


But easily revised to "Do you want to come with us?"

Not that I particularly think the rules are inviolable. I see them as guidelines: if you're using the passive voice, then make sure that it wouldn't be better in the active. If you're ending with a preposition, then perhaps there's a clearer way to phrase it.


Which is extremely informal, as the pronoun "me" or "us" is omitted from the end, but the context of usage would imply its presence.


(If you feel a bit like the author of this comment you might enjoy the Language Log: http://languagelog.ldc.upenn.edu/)


With the exception of the third point, I actually like all of these rules, kind of. However, I don't really like them as rules, per se, so much as guidelines. The thing of it is, if a sentence has been ended with a preposition, more often than not it can be rewritten in a clearer manner. Similarly, while passive voice is useful, to abundantly use it tends to make your writing dull and unclear. Likewise, split infinitives often obfuscate meaning, and sentences beginning with "but", "or", "and", or "however" tend to be better served by being combined with the preceding sentence. Thus, as a general rule, I prefer for amateur writers to follow these guidelines as rules.

That being said, once you know how to write well enough and you understand when these guidelines make your writing better and when they make it worse, for the love of $deity, please break them all over the place.


"Where's the library at?"

"You should never end a sentence with a preposition."

"Where's the library at, jerk?"


Well, everyone has their own standard. Rules falling within that standard are important for clarity, and rules outside that standard are needless and fussy. However...

You should never end a sentence with a preposition.

I like to follow this one, but only as a hobby, and not in casual conversation unless I'm talking to another pedant. I do admit that the syntactical knots into which I sometimes need to tie myself in order to avoid the preposition at the end make the sentence harder rather than easier to parse. On the other hand, sometimes prepositions at the end can make sentences harder to understand, in instances where the sentence seems to have a different meaning until you get to the preposition at the end. (I can't think of any examples right now.)

You should never split an infinitive

Agreed, that's just plain random.

Never use the passive voice

I have never heard this from anyone except Microsoft Word Grammar Check.

Don't start sentences with 'but' or 'and' or 'however'.

Now now, here I'm really going to have to disagree with you. "However" is reasonable, but "and" and "but" really belong in the middle of the sentence, not at the beginning. There are exceptional circumstances under which you can break this rule, but they're few and far between.


Never use the passive voice

I have never heard this from anyone except Microsoft Word Grammar Check.

An English teacher told my class that it's often used to lie. I think she was right. Maybe it'd be better to say that it is often used to mask the truth (hey, we were fourteen or fifteen, and subtlety wasn't exactly one of our strengths). Consider the difference between "mistakes were made" and "I screwed up."

(On the other hand, also consider the difference between "I screwed up" and "we screwed up." You can still play fast and loose with the truth using the active voice.)

I think it's fair to say that overuse of the passive voice is often associated with strained credibility, so you'd often be better off avoiding it, or at least using it sparingly. Of course, sometimes it reads more smoothly, and in those cases you should use it.


However, if you're writing fiction, passive voice is your friend. My high school teachers never realized this despite handing out creative fiction assignments.


Passive voice is for lab reports, research papers, and similar scientific/engineering documentation, although I also see a lot of "we" used in papers, especially CS papers.


The use of active voice in scientific papers is an increasing trend. In the old days it was thought more scientifically appropriate to merely state that things were done without personalizing it, but gradually folks came to realise that scientific papers are hard enough to read already and that page after page of passive voice just makes it harder.

I tend to use a blend.


Well, I certainly agree that it's typically harder to write a long document in the passive voice. Describing what I did accurately and clearly is a big enough challenge without also having to contort everything into the passive voice.


Using the passive voice is not recommended by me. A book was read by me that contained the rule against it: The Elements of Style by Strunk and White. Also a passage in Stephen King's On Writing that makes the same point was read by me a bit later in my life.

In short, I do not recommend the passive voice. The Elements of Style has a rule against it, and I read the same point later in life in a passage from Stephen King's On Writing.

But you don't have to believe them. You can figure out for yourself why they recommend against it.


Julia Lennon bore John Lennon in 1940 and Mark David Chapman murdered him in 1980.

John Lennon was born in 1940 and murdered in 1980.

The passive voice is recommended in some situations.


John Lennon: 1940-1980
Mother: Julia
Killer: Mark David Chapman

J/k

Point taken.


A critical thinker would try to find a sentence that sounds best in the active voice, then try to find one that works best in the passive voice. Succeeding both times, he would then conclude that both are useful.

Specifically, the passive voice allows you to mention the action while omitting the actor. This is sometimes useful, other times harmful.


A mean-spirited person would manage to slip a dig at someone's critical thinking skills into an otherwise well-taken comment. Not that this paragraph implies anything about the comment I'm replying to.

(Seriously, wtf)

Using passive voice is also a characteristic error of young bad writers. I would not presume to tell an old good writer how to do their job.

And incidentally, it doesn't take the awesome might of my critical thinking army, arrayed like the crusaders going to war, pennons snapping in the wind, to point out that there are better reasons to avoid the passive voice than 'MS Grammar Check says so'.


> Never use the passive voice
>
> I have never heard this from anyone except Microsoft Word Grammar Check.

"Never" is definitely over-stated, but avoiding passive voice does tend to make writing more concise, clear, and engaging. Using passive voice habitually is probably just careless.


I have to admit I use "But", in particular, at the beginning of sentences fairly frequently. I usually follow the shorter-is-better rule when it comes to sentences and try to avoid long sentences that have semicolons and other non-period bridges. (Not always, but most of the time.) I find it tends to make writing snappier and easier to read.


The arbitrariness of grammar rules you mention is nothing compared with the standardization of spelling. I see a connexion between the motivations behind both - pedantry.


In both cases, standardized spelling and standardized grammar serve to streamline communication. Try reading posts that have non-standardized spelling: if the spelling is phonetically based, you have to pause and think about how the word sounds. Grammatically awkward sentences are hard to parse.

There is a point to standardizing these things, though past a certain point it's a bit absurd, and your point stands.


Regarding phonetic spelling, non-phonetic spelling such as is found in English is unusual - the idea of a French or German spelling bee is incoherent. Even though a book such as Le Morte d'Arthur is unfamiliar at first, the spelling differences disappear pretty quickly once an English speaker's attention is focused on it.


Rules are useful limitations until you fully understand the subject; then you can break them.

By following your "myths" you will avoid writing bad English; when you know what you are doing, you can selectively break them for a desired effect.


If any word is improper at the end of a sentence, a linking verb is.

A preposition is a word you should never end a sentence with.


Where has anyone ever said that a sentence cannot be started with "however," which is an adverb?


I wonder if Wikipedia adopting that style has had any influence on its popularity online, or if that's an example of convergent evolution. It was decided way back in 2002 to use the logical punctuation style there, in a fairly ad-hoc way when it was still a small project. Part of the motivation appears to have been a sort of UK/US compromise. It's since been reworded significantly, but this was the original style suggestion there (introduced on August 23, 2002):

In most cases, simply follow the usual rules of English punctuation. A few points where the Wikipedia may differ from usual usage follow.

With quotation marks, we suggest splitting the difference between American and English usage.

Although it is not a rigid rule, it is probably best to use the "double quotes" for most quotations, as they are easier to read on the screen, and use 'single quotes' for "quotations 'within' quotations". This is the American style.

When punctuating quoted passages, put punctuation where it belongs, inside or outside the quotation marks, depending on the meaning, not rigidly within the quotation marks. This is the British style.


Like the two spaces after a sentence rule, the traditional quote style has much to do with typesetting. And so in the monospaced, limited font, early online setting, those rules were not very relevant, and were often ignored. When it wasn't being overdiscussed. Waaay before Wikipedia.

Now, the programmer style is actually different from these. You'll occasionally see a programmer write «He said "Foo.".»


> Part of the motivation appears to have been a sort of UK/US compromise.

That may've been part of the reason, but why were we trying to have a UK/US compromise when the user base was so small? I suggest that it's the previously mentioned programmer habit of using logical quotation, and what occupation was heavily represented among early Wikipedians...?


>When I asked Feal and Carol Saller, who oversees the Chicago Manual of Style, if there was a chance their organizations would go over to the other side, they both replied, in essence: "How about never? Is never good for you?"

It seems to me the next logical question here is, "why not?" Just about the only arguments in favor of "American" punctuation are tradition and some hazy sense that periods outside quotes look wrong, whereas the best argument for logical punctuation is that the point of writing is to communicate clearly, and logical punctuation is more clear at virtually no cost.


As a non-native speaker, this is one thing I found very irritating about American English writing. It never made sense to me to put something in quotes that is not part of the quote.


Having grown up in the American system, I have to confess that this has always bothered me, but I didn't realize that it was only American writing! I guess I never noticed it when I read well-edited British source material.

Now that I know doing it the 'logical' way is also considered conventional, I'm going to have a hard time ever doing it the American way again.


As an English English speaker (writer), I concur!


Informal writing rarely features narration to any significant degree, which is where the so-called American Style is most "logical" to the extent that we're even really talking about logic. In forums and email and texting, quotes are usually either blocked-off text, or the quotes are used to specifically emphasize an exact character string (often a single word).

Wikipedia is also generally not about narration, and quotes are usually meant to be exact.

With narration, the goal is not to convey exactness but rather to tell a story. Interrupting a character's quote to insert ", he said," influences the original meaning (if there even is such a thing) no matter where the punctuation lies. But that's not important, because specifying precisely what a character said usually isn't the point of a story.

Furthermore, if you write your own sentence, and finish with a quote of an entire sentence, why isn't there a period for both sentences? Brian said, "let's go.".

Looked at this way, it's easy to see why, given the choice, narrators would choose the more aesthetically pleasing placement inside the quotation marks.


if you write your own sentence, and finish with a quote of an entire sentence, why isn't there a period for both sentences? Brian said, "let's go.".

I actually do this sometimes. But I'm not consistent with it. I think it's also correct to say <<Brian said, "let's go".>> [1]. It's correct because we are allowed to quote just a portion of the sentence, which in this case happens to be every word of it.

[1] angle quotes just for the clarity that another level of normal quotes would destroy.


It might be interesting to determine what IS consistent about your use of periods like that. Is it purely random or does it depend on context?


I like quotation dashes for dialogue. The French style can be somewhat ambiguous (although it rarely results in a problem). The style commonly used in some other languages is both elegant and easy to read. See http://en.wikipedia.org/wiki/Non-English_usage_of_quotation_..., paragraph starting with "In Italian, Catalan, ...."


Shouldn't it be: "Brian said, "[l]et's go.".?"?


One too many question marks, imho. Shouldn't it be: "Brian said, "[l]et's go."."?


Heh, that would be precise.


> Furthermore, if you write your own sentence, and finish with a quote of an entire sentence, why isn't there a period for both sentences? Brian said, "let's go.".

You'll never completely avoid ambiguities no matter what you do. Take the question mark in the following sentence:

Brian asked, "should we go?"

Is Brian asking a question or the person who's quoting Brian?


Right, the point being that with certain styles of writing it makes a lot more sense to resolve for aesthetics rather than a simpler grammar. This perspective was largely ignored by the Slate article.

The question sentence could be resolved "logically" by placing the period outside the quotes, and possibly dropping the question mark entirely as you are already describing the quotation as a question. But if you're telling a story, including a question mark and putting it inside the quotes and leaving off extraneous periods is the best way to convey the overall meaning.


Now if only ISO 8601 date formatting would catch on... (yyyy-mm-dd)


Also: 24h time with no timezones.


No on the timezone deletion. While I support the ISO Date and 24 hour clock, doing away with timezones adds confusion and accomplishes little.

People generally live their lives within the local sphere. There is a lot said in the phrase "My plane lands at Narita at 13:00/1PM" that isn't said in the phrase "My plane lands at Narita at 05:00 GMT".

With the first, I know that the locals are just finishing lunch, it's likely to be the warmest part of the day, and that it'll be busy getting to the hotel, but everything should be open.

Converting everyone to GMT means that every time you leave your region, you need to figure out what time customary activities take place - and you need to do this for every single different place you visit.

Standardized GMT doesn't work unless you are a hermit with no friends or need for communication.


How would that help? Timezones help to coordinate hours with day and night. If you know something happened somewhere at 3 a.m., you can assume it was night (unless it was a polar day). Without timezones the task becomes much more difficult.


I don't know. I think that we're at a point where knowing the absolute time is more important than knowing the relative time (relative to daylight).

If I'm in New York following the nuclear crisis in Japan, it's probably more important for me to know that the next news conference is two hours from now than that it's in "the morning" there. If I'm planning a meeting I certainly want to know the absolute time it's occurring, and only care about the relative time depending on how courteous I am.

I'm not saying we should stop caring about working hours and daylight hours in various places. I'm just saying that those vary wildly and are dynamic, and we should reflect that in our concept of "time".


I disagree. If I make an appointment for a meeting I'm flying to, 5 timezones away, it's better to be able to say 'let's meet at 9am' rather than having to look up what time '2 hours after getting out of bed' is in that particular location. Similarly, if I read news about Japan that says 'an explosion occurred at 10pm' I prefer to know that that means 'after the regular working day' instead of having to look up when the working day ends there.


Maybe the ideal solution would involve specifying all times in two timezones (local and GMT), side by side? Practically, I'm now interested in adding something like a <time> HTML tag that would let users decide how they want to see it, and see it in their own local time automatically. jQuery plugin, anyone?


The real problem I have with everyone using UTC is that the day of the week would change at a time other than midnight.


Just keep the day of the week relative to local time; it isn't really important except in local issues anyway.


There was a useful (I found) discussion about no timezones on here a few days ago, when discussing Samoa's decision to switch to the other side of the date line - http://news.ycombinator.com/item?id=2528338


I agree with this. yyyy.mm.dd.hh.mm.ss Also? The Holocene calendar, and the International Fixed Calendar.

But then, I'm a nerd.


I say we should stop beating around the bush and just switch to stardates already.


Why no timezones? Do you propose that all times always be given as 24-hour UTC?


Exactly. With international communication on the rise, it's best (IMHO) to do away with the timezone concept, and just use one time format everywhere.

India is +5:30, Nepal is +5:45, California observes DST but Arizona doesn't... the EU does, but they start and end DST a week before the US...

It's ridiculous.

The initial gut reaction against this is that people would hate waking up at 18:00 and going home at 8:00, but I think people would get used to this pretty quickly. After all, we do this twice a year as it is.


Consider also that there is one hour per year which does not exist and one hour per year which happens twice (at least within the context of local time, if you live in an area with daylight savings).

There is also Namibia, which used to lose hours in daylight savings time, and Bangladesh, which had DST in 2009 and then dropped it; this makes historical time calculations tricky.


Times should really be given as Miami Internet Beats[1], since it's compatible with everyone in US Eastern and doesn't have antiquated concepts like "hours," or "minutes."

[1]: http://miamibeats.org/


How about going metric?


Charles Stross does that in his novel Glasshouse: durations are described in kiloseconds, megaseconds, and gigaseconds. (100Ksec is just under 28 hours -- a reasonable proxy for a day -- and 1 Gsec is about 31 years.) In a space-faring society, with no standard astronomical day or year, it made sense.


It's the national standard in my country!

(Then again, it's the national standard of two whole countries, everyone else ignores it. :-( )


The same in mine. I had no idea it was so rare :(


One is China, I think. Which is the other one?


Last time I checked it was Sweden and Japan. Dunno if more countries have seen the light since. :-)


That's the way it's not only written, but spoken in Japanese.


Japan uses three different calendars, depending on context.

Imperial / kōki, which counts from the alleged founding of Japan in 660 BC

Era / nengō, which counts the years from the start of the Emperor's reign

European / seireki, adopting the international standard of counting dates from the alleged birth date of Jesus Christ

Today is 平成23年5月13日 of Mikado Heisei, the 125th Emperor of Japan


In all three systems, it is both written and spoken year-month-day, which was the point.

Kōki is not used by anyone with a straight face in this day and age.

"Mikado Heisei" is an extremely uncommon way to refer to the emperor. "帝平成" gets 724 hits on Google, and "Mikado Heisei" gets 6, in 5 different languages---the first of which is your post above. Even the Japanese-language wikipedia page on him doesn't use the word "帝" anywhere.


Yes please!

I'm sick and tired of people arguing that you put the month first because "That's how you say it".


Of course you only say it that way in America.

Today is the Thirteenth of May, not May 13.

I prefer dd/mm/yyyy to yyyy/mm/dd though, just to keep what is usually the most important information at the front.


I like YYYY-MM-DD for two reasons:

1) Sorting the string representations sorts in time order.

2) It eliminates ambiguity almost completely, because extremely few places use YYYY-DD-MM. The ambiguity of most ##/##/## dates is irritating.
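
A toy demonstration of point 1, assuming a database that supports the VALUES table constructor (the dates and names here are made up):

  -- Lexicographic ORDER BY on ISO-8601 strings is also chronological order.
  SELECT d
  FROM (VALUES ('2010-12-31'), ('2011-05-13'), ('2011-01-02')) AS t(d)
  ORDER BY d;
  -- Returns 2010-12-31, 2011-01-02, 2011-05-13. dd/mm/yyyy strings would not sort this way.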


I prefer how we did it in the military: 13MAY2011. mysql: %d%b%Y

No ambiguity, easy to parse, and the 3-letter month is a natural delimiter between the day and year numbers.
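
If it helps, a quick sketch of that format string in MySQL (UPPER() is added here only to match the all-caps military style; the date is arbitrary):

  -- %d = day, %b = abbreviated month name, %Y = four-digit year
  SELECT UPPER(DATE_FORMAT('2011-05-13', '%d%b%Y'));  -- '13MAY2011'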


dd-mm-yyyy can easily be confused with mm-dd-yyyy. The only format that is obvious is yyyy-mm-dd.


Until somebody starts using yyyy-dd-mm and screws everything up.

Besides, often when you write dates you leave the year off entirely.


I'm with you on that.

If only MSSQL Server wasn't broken in the way it interprets that if your user locale is set to UK English (give it a string formatted that way in other locales and the behaviour is as expected; in "English (British)" it sees the format as yyyy-dd-mm).


I'm British, and I've worked with a large number of clients' SQL Server installs. I think I've only twice seen yyyy-dd-mm as a default behaviour; yyyy-mm-dd was far more common.


I've only seen it assume yyyy-dd-mm when the user's locale is set to "English (British)". In most instances I've come across, this setting is just left at the default ("English", which implies "English (American)") even if everything else in the stack is localised for the UK, so the oddity isn't noticed. My build instructions now explicitly state that "English (British)" should not be used due to the way it affects how dates in strings are interpreted when the format NNNN-NN-NN is seen.


Odd.

From memory, the safest format to use is actually yyyymmdd (without the hyphens) though, which is apparently supposed to be interpreted unambiguously by SQL Server.
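
For illustration, a rough sketch of the difference (this applies to the legacy DATETIME type; the newer DATE/DATETIME2 types parse the dashed ISO form unambiguously, if I remember correctly):

  SET LANGUAGE British;                    -- sets DATEFORMAT to dmy
  SELECT CONVERT(DATETIME, '2011-05-13');  -- read as yyyy-dd-mm: out-of-range error
  SELECT CONVERT(DATETIME, '20110513');    -- always read as yyyymmdd: 13 May 2011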


Ah, the old 112 instead of 120 format.


Y'know, I got so fed up with memorising those numbers at an old job that I built a view that enumerated all the valid ones against GETDATE() :-)


I have a formatDate() function lying around somewhere, but I cannot create functions or views on many of the DBs. convert(varchar(8), @myDate, 112) seems a daft way to go about things - and I have been wondering if I have been doing something wrong...


Agreed. The CONVERT syntax is rubbish, but the output is so much more powerful. I'd rather have an equivalent of Date.ToString("YYYY-mmm-DD hh:mi:ss.ssss") in SQL but, sadly.... It could be written as a UDF but I'm not sure performance would be up to production use.

I had a wonderful stack of little helper functions and views I'd cobbled together at that job and had had free rein to stick into the real databases; they made development so much easier. The downside of employment, though, is that that sort of work gets stuck with each employer you do it for and can't so easily be carried around like a toolkit.

I experimented a bit with some of the code from http://www.simple-talk.com/community/blogs/philfactor/archiv... a while back on some toy projects; it makes some interesting reading and had me building a few similar functions around my own conventions and rules. On a larger scale, I would be sorely tempted to build something like that into the Model database and insist on a clean output as part of the approval criteria.


It could be written as a UDF but I'm not sure performance would be up to production use.

If you ever find yourself in that situation again, write the UDF as an in-line table-valued function. You get multiple return values and avoid the scalar UDF hit, since the engine will execute it in-line (appropriately enough).

eg:

  select t.date, d.date_as_string
  from table1 t
  cross apply dbo.format_date(t.date, 'yyyy-mm-dd') d
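
For anyone curious, a minimal sketch of what such an inline table-valued function might look like; dbo.format_date and its parameters are hypothetical (not a built-in), and only the one format string is handled here:

  CREATE FUNCTION dbo.format_date (@d DATETIME, @fmt VARCHAR(20))
  RETURNS TABLE
  AS
  RETURN
  (
      -- Style 120 is 'yyyy-mm-dd hh:mi:ss'; the first 10 chars give 'yyyy-mm-dd'.
      SELECT CASE WHEN @fmt = 'yyyy-mm-dd'
                  THEN CONVERT(VARCHAR(10), @d, 120)
                  ELSE CONVERT(VARCHAR(30), @d, 121)
             END AS date_as_string
  );

Because it's an inline TVF, the optimizer expands it into the calling query rather than invoking it row by row, which is where the win over a scalar UDF comes from.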


yyyymmdd is the only safe format on MSSQL.


Just imagine a programming language where you have to write

    print "some text," function(), "moretext";
instead of

    print "some text", function(), "moretext";


  print "some text," function(), "moretext;"


print("some text," function(), "moretext;)"


Just looking at this makes me physically uncomfortable.


A closing parenthesis would never be moved inside quotation marks in any system I know of.


You know, I think the next INTERCAL may come from ideas like this.


In Forth, you write

      ." some text"
to print 'some text' without the leading blank. But that's for a completely different reason.


Just imagine having to write perl in the first place ;P


That's clearly inconsistent, it should be:

   print "some text," function(,) "moretext;"


And the American system is very inconsistent as well: exclamation marks and question marks lie logically outside the quotes even in the American system. For example:

- And then he told me he was 'sleeping in late.'

But:

- "What did he mean when he said he was "sleeping in late"?

- And can you believe it, he was "sleeping in late"!


I think the American style looks better in non-fixed-width fonts, because it looks much closer to the way actual handwriting should look: the comma or period underneath the quotation mark.

I suspect this is the reason people started doing it that way in the first place.

And I suspect people started doing it that way on both sides of the Atlantic. It's just that Britain ended up standardizing one way, America the other. (Possibly yet another instance of English language usage evolving more quickly in Britain than in America.)

I'd appreciate if anybody can confirm or deny this hypothesis. And I find it disappointing that the Slate article has no historical treatment of the issue.

(Honestly, both styles look weird to me in fixed-width fonts, which basically arose in tandem with the modern computer.)


Computer fonts nowadays should be able to change ." and ". so the . is under the " and ditto ," and ", so the , is under the "


Somebody should totally engineer a font that does that; if Zapfino can have a special ligature for "Zapfino," you could certainly make ones for '".', '",', and '";'.

And yes, I just mixed punctuation styles. Period-in-quotes simply looks better.


I thought the font had a ligature for the string "Zapfino,". That was confusing.


You should be able to do it in TeX with \kern.


The technology to do this has been around for some time if you want to. However, this is the first time I recall seeing anyone advocate it for normal usage, and I have never encountered a pro-grade font that is typeset this way by default. Why would you want to remove potentially significant ordering information from punctuation, particularly when multiple punctuation marks in close proximity tend to be confusing enough already?


Actually, fixed-width fonts arose with the typewriter, which is how these sorts of differences have had time to get established. Variable-width fonts basically didn't exist except in professionally published documents before computers and word processing.


You're right that fixed-width fonts would have arisen with the typewriter, my mistake.

Variable-width fonts basically didn't exist except in...

You're not giving enough credit to the printing press. Consider newspapers and books (even the Bible, which practically everybody read), which (I imagine) were printed with variable-width fonts for centuries, right from "the beginning" (i.e. Gutenberg).


Right--but newspapers and books are "professionally published documents." Latterly, there were variable-width fonts in typewriters but that was relatively late in the history of that instrument.


there were variable-width fonts in typewriters but that was relatively late in the history of that instrument

Interesting, I did not know that.


I'm British and, when I was at school, we were taught that punctuation should go inside the quotation mark. This was one of those horribly prescriptive rules, the breaking of which was considered very wrong indeed. I'm surprised that this article calls this "American Style", given my experiences.

I started ignoring the convention pretty much as soon as I started using computers because, as the article says, it's hard to defend it on merits and it just looks plain odd, especially in a fixed-width font.


I'm British, and it's only in the last few years that I've heard of the American compulsion to modify quotes by adding punctuation.

I'm glad it seems to be vanishing.


I'm American, and I was taught to use the "British style". In fact, I didn't even realize there was an "American style"--I always considered it a punctuation error, though I guess it's common and inconspicuous enough that I didn't even realize publications like the NYT would make this 'error'.


I'm kind of surprised that this hasn't appeared in the news before, given how important an issue it is for technical writing. After all, you don't tell someone to delete a line in vi by typing "dd."


As a British tech blogger who writes on American tech sites, I've spent a lot of time arguing this particular subject with editors, copy editors, and proofreaders.

Grammar should help the reader, not hinder. Logical/readable grammar all the way.


Good. I never quite understood why I was putting punctuation inside quotation marks all these years, besides the fact that my teachers told me that it is the right way.

Let me see how this feels when I use "logical punctuation".

Yes, that feels good.


As a teen I refused to put the punctuation in the quotes, because it felt wrong to me. I wish I had had a better articulation of why, or had even known that the British did it the way I did. I had a teacher who consistently dinged me on it, but I persisted. Finally I've been vindicated!


I've been doing this for years despite knowing what it's "supposed" to be. I'm glad to finally not be in the minority.

And I had no idea it had a name or that it was common outside the US.


Even though I have followed the 'proper' way all my life, I recently realized that there are many instances where having the punctuation inside is confusing. I now have absolutely no problem putting it on the outside when it's less confusing.

I'm seriously considering putting it on the outside all the time now.


Well, if we're going to start making English logical, we have a lot of work to do.


Simplified English has been around for a while, there's even a Wikipedia language for it: http://simple.wikipedia.org/

iirc Lincoln was a big proponent of it.


And it still hasn't caught on, which probably indicates that it never will.

Though actually, many of the differences between US and proper English spellings come from Webster's attempts to rationalize the English language. It didn't work.


Weird.

Last week I just made a commitment to start doing this the "correct" way. I find it's very difficult after years of programming, though.

Another problem I have is with capitalization on titles. You're supposed to capitalize only the larger words, but I have to go all initial caps. The inconsistency between caps drives me nuts, even though I know it's the "right" way to do things.

It's fascinating to see topics like this kind of float around for months or years and then suddenly become news items. Wonder if a shift is really happening? Or is the story just noticing a trend in people making the same mistakes?


> Another problem I have is with capitalization on titles. You're supposed to capitalize only the larger words, but I have to go all initial caps.

The "rules" for titles are particularly stupid.

A title is almost invariably distinguished in some other way. If it's a heading, it is typically printed in bold if typeset and underlined if written by hand. Citations are normally printed in italics when typeset and written within quotation marks by hand.

Meanwhile, it has been shown beyond any doubt by now that Capitalising Every Word Except a Few in Some Arbitrary Fashion Hurts Readability, which is particularly damaging when you're talking about text that readers will often want to scan at speed.


Down in England, title caps like that went out around the 1950s. Compare, say, the UK Guardian with the New York Times and there is a huge readability difference. US print newspaper design is very retro, like the UK was in the nineteenth and early twentieth centuries.


It's true that some of our newspapers here (I'm in England) have moved into the new millennium with their headings, but alas quite a few textbooks and business reports still languish in the typographical dark ages, even new ones.

As far as I can tell, title case is still one of those quaint ideas that you teach in English classes at school because the syllabus says so, even though it is an objectively inferior approach and is not particularly popular in real world usage any more. (See also: Almost any comma usage when handwriting letters or envelopes that you were taught as a child; not splitting an infinitive, beginning a sentence with a conjunction, or ending a sentence with a preposition; spelling out certain small integer numbers in full; and your teacher's pet view of the Oxford comma.)


Title caps is much less common in British writing.


It's weird that this just got to the home page of Hacker News too: http://news.ycombinator.com/item?id=2544198. I've never heard about 'logical' punctuation before, and today I've seen it twice on HN alone...


Is there a recognised difference between:

The Rise of "Logical Punctuation".

and

He said, "I've been outside".

?

To me, "logical" punctuation in the first case would be as written, and in the second case would be:

He said, "I've been outside.".

indicating that both the enclosing and the enclosed sentences are complete.


Punctuation is supposed to remove ambiguities, and help you read and understand more easily. In your example

He said, "I've been outside.".

the collection of punctuation marks does very little for making the sentence more understandable, and (IMHO) looks ugly. The name "logical punctuation" is just a name for the style; do not take the word "logical" too literally.


Perhaps "compositional" would be more appropriate.


No, the second one would be written:

He said, "I've been outside."


It's implicit in the closing quotes that you've reached a pause or end of some kind.


Not necessarily:

He said, "I've been outside.", but I don't believe him.


I don't see a conflict there:

He said, "I've been outside", but I don't believe him.

The period within the quotes adds nothing.


It adds that the sentence is complete.

Consider:

He said, "The world".

vs

He said, "The world.".

vs

He said, "The world [...]".


Also according to the Internet, "you" is spelled "u". Everyone should adjust their internal dictionaries so that u don't become an establishment sellout.

Personally, it makes sense for "scare quotes" to not contain punctuation, as they are not complete sentences. But it doesn't make sense for direct quotes to not contain punctuation, as in `He said, "Hello there".' He didn't say "Hello there", he said "Hello there."

(It might seem logical to have two periods in that case, but it's ugly, so the second one can just be omitted for maximum conciseness. That's why the period goes inside the quotation marks. Similarly, it would be confusing and ugly to pretend to end a sentence in the middle of a sentence, so quotes that are not at the end of the sentence "end" with a comma. The period is a pretty strong message to pause, and you don't want to overuse it.

IMO.)


With the end-of-statement period, I agree you can make that case, but it's done with other punctuation as well, since the American rule is more of a typographical rule than a semantic one (you put low-on-the-line punctuation before high-on-the-line punctuation).

For example:

  British: "I would not", he objected, "enjoy that".
  American: "I would not," he objected, "enjoy that."
The period is arguably part of the quoted statement as you say, but the comma definitely isn't.


The British style would actually include the period/full stop in this case, since it's quoted speech and the quoted speech ends with a period.


Hence the "logical" bit. It's not logical to have the period or commas outside of the quote if the quote actually includes a period or comma.


Strange. I am British and I was taught your American style. And have always done it that way.


If the period is part of the quote, it would be inside the quote, as in `He said, "Hello there."' I thought the article made that pretty clear. Certainly nobody argued for two periods.


I use two periods.


I'm pretty sure the article is in agreement with you that when the punctuation is part of the quote, then it remains in the quotes. For example, he said "Hello, there."


Why do you choose to omit the second rather than the first period? If you're after conciseness, surely either is up for omission, and the external period unambiguously means the current sentence is at an end. An internal period is ambiguous as to whether the current sentence ends or not.


It amuses me that an article about punctuation and typographic conventions completely ignores the standard of using typographer’s quotes instead of inch marks. “This,” instead of "this." Same goes for apostrophes versus foot marks.

As a graphic design student myself, I am quite snobbish about using perfect typography. This includes proper quotation marks, as well as following the rule of putting periods and commas within quotes. (I also follow the rules religiously when it comes to en-dashes versus em-dashes and hyphens, and when to use spaces around them. I also make sure to only use a single space after a period.)


This is something I've struggled with in my writing, especially as someone who's done blogging and AP style writing for a news outlet... Sometimes I write a sentence that might end in a question mark which shouldn't be part of a quote, and rather than placing the punctuation mark outside of the quotes (or inside, for that matter), I end up completely re-working the sentence to avoid the problem. But I would have to agree that it doesn't always make sense to stick so firmly to that grammar rule. It's nice to see that I'm not alone in that thought.


I wonder what your thoughts are on double-spacing after full stops. I find it terrible, but I know projects that enforce this style in code comments.


You should only double-space if you use a typewriter.

If you use a computer to write your text, and especially if it's going online, one space is all you should need to use.


Actually code comments is a rare place where double spacing would make sense, as code in general is designed to be readable in fixed-width type.


Or if you're following PEP 8.




Been doing it for years, never looked back. Quotes should delimit the quote; your period is outside that quote. The closest I get is where a comma would work in a quote and the writing interjects the speaker, e.g., "Grab that," foo barred, "and get over here", because "Grab that", foo barred, ", and get over here" is fugly.


Yeah, I thought the period-inside-quotes thing was stupid and only used it when I had to for school. Stupid stupid stupid.


I've been ignoring that rule for a long time. If it wasn't in the original, it doesn't go in quotes.

And another thing: using "an" when a word starts with an H is followed very inconsistently. In fact, I only seem to hear it in the phrase "an historic"; people don't really say "an human".


Whether to use "a" or "an" should be decided based on the first sound of the next word as spoken, not on the first letter as written.


"An historic event" is wrong:

An is the form of the indefinite article that is used before a spoken vowel sound: it doesn’t matter how the written word in question is actually spelled. So, we say ‘an honour’, ‘an hour’, or ‘an heir’, for example, because the initial letter ‘h’ in all three words is not actually pronounced. By contrast we say ‘a hair’ or ‘a horse’ because, in these cases, the ‘h’ is pronounced.(http://www.oxforddictionaries.com/page/aoranhistoric/a-histo...)


There's nothing wrong with "An historic event".

I'm from London.

(However, it does irritate me no end to hear RP speakers say "An historic", pronouncing the "h")


Yeah, I hear people pronounce the "h" and use "an", but not do it consistently. Didn't realize it's silent in England.


I've only ever seen that with variations on "history". This is probably wrong, but when I first came across it at a young age I thought it was because British English often doesn't pronounce the leading 'H', making the word start with a vowel sound. <shrug>


And another thing: using "an" when a word starts with an H is followed very inconsistently.

Is that a rule in American English? I don't remember hearing it in school.


As best I recall, the rule is based on pronunciation, not on spelling. So for words where the "h" is pronounced, you treat it as a consonant and lead with "a": "go for a hike." Whereas for words where "h" is silent, so pronunciation thus leads with a vowel, you lead with "an": "it would be an honor."

"Historic" is pronounced differently, mostly based on region. You can go to "a historic occasion," with hard H, or "an 'istoric occasion," with a soft one.

Or, for native speakers: type what you'd pronounce.


An herb? An hour? An horchata?


Sure, but there's also 'an historical novel' - less likely, unless you are a character in said novel.

I would say that the real rule is not about the letters (vowel or consonant), but the sounds. So, you receive 'an MBA' because it's pronounced 'an em-bee-eigh'.


Incidentally, the soft h on "herb" is one of the things that bugs me most about American English (and it took me several years of living here to even figure out that it was consistently done that way).

Correct me if I'm wrong, but don't Americans pronounce a hard "h" on "herbivore"? How does "herb", then, lose its h?


I actually say "erbivore", as do most people I know.


Really? Oh, in that case it's at least consistent (though it still bugs me, for no particularly good reason other than that it's unusual).

What about the name "Herb"?


I don't know any personally but I usually hear the H enunciated in the name. But Herb is short for "Herbert", which is a German name, so maybe we inherited the pronunciation rule there.


Of course the period should follow the quotation marks. Periods, exclamation marks, and question marks are the delimiters for English sentences and belong at the end in the same way that a semicolon belongs at the end of a line in C/C++.


The resolution here seems obvious to me: when you find yourself in a bike shed conversation, go with the standard. If, however, one option or the other is preferable for some practical reason, choose that option.


Try writing a parser that recognizes punctuation inside the quotes, then try to write one that recognizes it outside. It's quite clear that the latter is cleaner and simpler.


Is it just me, or is this site using primes (http://en.wikipedia.org/wiki/Prime_(symbol)) instead of proper quotation marks (“ ”)?


A recent development in Swedish - at least in online communications - is the use of the French manner of putting a bloody space in front of exclamation marks ! Looks awful.


The Chicago Manual of Style can go jump in a ditch. American style is terrible and should be abandoned immediately.



