gcp123 3 days ago

Author makes a good point. "1700s" is both more intuitive and more concise than "18th century". The very first episode of Alex Trebeck's Jeopardy in 1984 illustrates how confusing this can be:

https://www.youtube.com/watch?v=KDTxS9_CwZA

The "Final Jeopardy" question simply asked on what date did the 20th century begin, and all three contestants got it wrong, leading to a 3-way tie.

  • runarberg 3 days ago

    In Icelandic, 1-based counting toward the next unit is used almost everywhere. People do indeed say: “The first decade of the 19th century” to refer to the 18-aughts, and the 90s is commonly referred to as “The tenth decade”. The same is done with age ranges: people in their 20s (or 21-30 more precisely) are said to be þrítugsaldur (in the thirty age). Even the hour is sometimes counted towards (though this is rarer among young folks): “að ganga fimm” (or going 5) means 16:01-17:00.

    Speaking for myself, this doesn’t become any more intuitive the more you use it; people constantly confuse decades, get insulted by age ranges (and freaked out when suddenly the clock is “going five”). People are actually starting to refer to the 90s as nían (the nine) and the 20-aughts as tían (the ten), though I don’t think it will stick. When I want to be unambiguous and non-confusing I usually add -og-eitthvað (and something) as a suffix to a year ending with zero, so the 20th century becomes nítjánhundruð-og-eitthvað, the 1990s nítíu-og-eitthvað, and a person in their 20s (including 20) becomes tuttugu-og-eitthvað.

  • tempodox 3 days ago

    Logic is in short supply and off-by-one errors are everywhere. Most people don't care. I think it's more doable to learn to just live with that than to reprogram mankind.

    • ryukoposting 2 days ago

      The publishing industry already has style guides for large swaths of the industry.

      Imagine, for a moment, that AP adopted the OP's "don't count centuries" guidance. An enormous share of English-language publishing outfits would conform to the new rule in all future publications. Within a couple months, a large share of written media consumption would adopt this improved way of talking about historical time frames.

      The best part? Absolutely no effort on the part of the general public. It's not like the OP is inventing new words or sentence structures. There's zero cognitive overhead for anyone, except for a handful of journalists and copywriters who are already used to that kind of thing. It's part of their job.

      I think a lot of people take these sorts of ideas to mean "thou shalt consciously change the way you speak." In reality, we have the systems in place to make these changes gradually, without causing trouble for anyone.

      If you don't like it, nobody's trying to police your ability to say it some other way - even if that way is objectively stupid, as is the case with counting centuries.

    • phkahler 2 days ago

      Nobody's asking to reprogram anyone. Just stop using one of two conventions. The reason to do it is simple and obvious. I'm really baffled at the responses here advocating strongly for the current way. But I guess that's just a "people thing".

      • 1over137 2 days ago

        Asking me to stop using my preferred convention is tantamount to 'reprogramming' me.

        • BobaFloutist 2 days ago

          I'm amazed you didn't even hedge by saying "telling me to"; claiming that a request to shift convention is tantamount to reprogramming is certainly a bold, provocative claim.

        • mananaysiempre 2 days ago

          Reprogramming mankind is unreasonable. Reprogramming you may not be.

    • swyx 2 days ago

      oooor we can slowly migrate towards sensibility as we did celsius and centimeters

      • jonathan_landy 2 days ago

        Re temp, I’m glad we use F for daily life in the USA. The most common application I have for temp is to understand the weather and I like the 0-100 range for F as that’s the typical range for weather near me.

        For scientific work I obviously prefer kelvin.

        Celsius is nearly useless.

        • SkeuomorphicBee 2 days ago

          For me the best feature of Celsius, the one that makes it much better for weather, is the zero at the freezing point of water. Everything changes in life when water starts to freeze: roads get slippery, pipes burst, crops die. So it is important that such a crucial threshold is represented numerically in the scale. In other words, going from 5 to -5 in Fahrenheit is just getting 10° colder, nothing special, while going from 2 to -2 in Celsius is a huge change in your daily life.

        • fennecbutt a day ago

          Perhaps it's just because you're not used to it. 17-18c is perfect, 25 is a mild summer day. 30-35 full swing summer and 40 and up is oh no global warming. 5-7 is chilly, 0 is cold, -single digit is damn it's a cold winter and -double digits is when tf did I move to Canada.

        • dtech 2 days ago

          95% of the world uses Celsius without problems because they're used to it. You'd either also be fine with it or you belong to a sub-5th percentile which couldn't figure it out; take your pick.

          • rootusrootus 2 days ago

            > sub-5th percentile which couldn't figure it out

            Ironic, given that one of the prime arguments in favor of metric is that it is easier.

            Why do non-US people even care? And do y'all care that you are wrong? The US has recognized the SI. Citizens continue to use measurements they are comfortable with, and it does not hurt anyone. We are also not the only nation that has adopted SI but not made it mandatory. The UK is an obvious example.

            Again, I'm back to 'why does anyone else even give a shit'? Aren't there more interesting things to ponder?

            • BrandoElFollito 2 days ago

              What does "adopted" mean in that context? (serious question)

        • inglor_cz 2 days ago

          "Celsius is nearly useless."

          http://i.imgur.com/3ZidINK.png?1

          For anyone not living in the US or Jamaica or Belize, it is Fahrenheit that is completely useless. Which is something like 7.7 billion people.

          0 = water freezing temp is a hugely useful heuristic for anyone living in a moderate climate.

          • rootusrootus 2 days ago

            > For anyone not living in the US

            So what I am hearing is that sure, it makes perfect sense for US citizens to continue using Fahrenheit.

            • inglor_cz 2 days ago

              US residents...

              If you, as a US citizen, settle abroad, be prepared to run into a wall with Fahrenheit. People in the rest of the world don't have an intuitive grasp of whether 50 degrees Fahrenheit is warm or cold.

              • rootusrootus 2 days ago

                > US residents

                Yeah, that's the right terminology. I knew when I said "citizens" it wasn't quite right, but I blanked on the right answer. "Residents" is pretty obvious.

                > be prepared to run into a wall with Fahrenheits

                I agree it's worth knowing just enough about celsius to use it casually when you are traveling. e.g. I just remember 20 is room temperature and every 5C is about 10F. Close enough. And remembering '6' is enough to remember how km and miles are related.
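
                A minimal sketch of that rule of thumb against the exact formula (Python, just to illustrate; the numbers are the ones above):

                    # Rule of thumb: 20 C is room temperature (68 F), every 5 C is about 10 F.
                    def f_approx(c):
                        return 68 + (c - 20) * 2      # anchor at 20 C = 68 F, slope 2 F per C

                    def f_exact(c):
                        return c * 9 / 5 + 32         # exact slope is 1.8 F per C

                    for c in (0, 10, 20, 30, 40):
                        print(c, f_approx(c), f_exact(c))
                    # drifts by a few degrees at the extremes -- close enough for travel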

                Anyone who is settling abroad ought to be able to pick up intuitive celsius in a couple days. When everyone around you uses the same measuring unit, you adapt pretty quickly IME.

        • creeble 2 days ago

          I agree. For ambient temperature, F is twice as precise in the same number of digits. It also reflects human experience better; 100F is damn hot, and 0F is damn cold.

          Celsius is for chemists.

          • lye 2 days ago

            There's very little difference between e.g. +25°C and +26°C; not sure why you would need even more precision in day-to-day life. There are decimals if you require that for some reason.

            Celsius works significantly better in cold climates for reasons mentioned in another comment.

            • _gabe_ 2 days ago

              If that’s the case why do the Celsius thermostats I used while on vacation in Canada use 0.5C increments? The decimals are used, because the change between 25C and 26C is actually pretty big :)

              In my old apartment, the difference between 73F and 74F was enough to make me quite cold or hot. And that's a difference of about 0.5C. I'm not arguing that Fahrenheit is better, but I definitely do prefer it for setting my thermostat (which is a day-to-day thing), but then again I grew up using it, so that could be why I prefer it too.

              • ajuc 2 days ago

                > If that’s the case why do the Celsius thermostats I used while on vacation in Canada use 0.5C increments?

                Probably because they were made for US and changed the labels? I've never seen a thermostat with 0.5 C increments in Europe.

                > the change between 25C and 26C is actually pretty big

                I would maybe be able to tell you if it's 23 or 27, but I certainly can't tell a 1 C difference.

          • nephanth 12 hours ago

            > Celsius is for chemists

            Or cooks. Or anyone who cooks, which is most people

          • ajuc 2 days ago

            The difference between -1 C and +1 C is VASTLY more important in daily life than the difference between 26.5 and 27 C.

            Farmers, drivers, people with gardens need to know if it will get subzero at night.

            Nobody cares if it's 26.5 C or 26 C.

        • abraae 2 days ago

          > Celsius is nearly useless.

          That's like ... your opinion man.

          Personally I like knowing that water boils at exactly 100 degrees.

          • prerok 2 days ago

            At sea level, yes :)

            I do agree, though I live in Europe and C is the norm. I could never wrap my head around F.

            That said, I think 0 is more important in daily life, below or above freezing. How much is that in F again?

            • dboreham 2 days ago

              As a dweller of a cold place in the USA, F is pretty handy because "freezing" isn't terribly cold. Having 0F be "actually quite seriously cold" is useful.

              • ajuc 2 days ago

                My parents care a lot about "przymrozek" - which is when it gets sub-zero C at night and you need to cover the plants, close the greenhouse doors, and put a heater there so the plants survive. They give warnings on the radio when this happens outside of the regular winter months.

                There's also a special warning for drivers if it was sub-zero, because then the water on the roads freezes and it's very hard to brake.

                I'd say it's way more important a distinction than anything that F makes obvious.

              • User23 2 days ago

                Also, conveniently, freezer temperature is 0F not 32F.

            • autoexec 2 days ago

              We just need a new scale just for weather where 100 is 100F and 0 is 32F/0C then everyone can be happy. We'd have a lot more days with subzero temperatures though
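
              For what it's worth, that hypothetical scale is just a linear remap of Fahrenheit; a throwaway sketch (the name is made up):

                  # Made-up "weather degrees": 0 at 32 F (freezing), 100 at 100 F.
                  def to_weather(f):
                      return (f - 32) * 100 / 68

                  print(to_weather(32))    # 0.0
                  print(to_weather(100))   # 100.0
                  print(to_weather(0))     # about -47: many more subzero days indeed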

          • runarberg 2 days ago

            You just use one thing and you’ll learn it. When I was a kid my country changed from archaic 12 point “wind levels” to m/s. It took everybody a few weeks to adjust but it wasn’t hard. It was a bit harder for me after moving to America to adjust to Fahrenheit, but as you experience a temperature, and are told it is so many Fahrenheit, you’ll just learn it. I have no idea at what temperature water boils in F simply because I never experience that temperature (and my kettle doesn’t have a thermometer).

            That said, I wish the USA would move over to the unit everyone else is using, but only because everyone else is using it; that is the only thing that makes it superior, and it would take Americans at worst a couple of months to adjust.

            • rootusrootus 2 days ago

              > only for the reason that everyone else is using it

              That is an honest answer, which is refreshing. Beside that, there is not really any particular reason that the US has to make SI mandatory. We adopted SI nearly 50 years ago, we just did not make it mandatory. The US has a bit of national identity which leans towards rebelling, so making SI mandatory would probably be contentious anyway. And it's just not worth the argument, since it buys us very little of actual value.

              • runarberg 2 days ago

                Temperature is easy, probably the easiest unit to convert... Everyone would get used to it pretty soon after they started using it regularly. There would be some legacy systems out there which would be annoying to convert (which is already the case), but within a generation nobody would bother with Fahrenheit at all.

                I think the hardest unit to convert is probably length, as there is not only a bunch of legacy systems and equipment out there, but Americans are very accustomed to fractional sub-units as opposed to the decimal cm, mm, etc. I'm not sure the building industry would ever stop saying e.g. four and five eighths. Personally I hate fractional lengths when using American tools. I'm used to an 11 mm wrench being smaller than a 13 mm wrench. I need to stop and think before I know which is smaller, five eighths or three quarters.

                • rootusrootus 2 days ago

                  > american tools

                  That's an interesting way to phrase it. I, and everyone I know, have both metric and SAE tools. At least for wrenches & sockets.

                  > I need to stop and think before I know which is smaller a five eights or a three quarters.

                  I'm with you there. I've gotten in the habit of just mentally converting every SAE size to 32nds. I wouldn't really mind losing SAE, but that is not happening. What really makes my blood pressure go up is Ford ... they mix metric and SAE fasteners on their cars. WTF! Pick one! Subaru is at the other end, easy to work on because 10 & 12mm wrenches will work for maybe 9 out of 10 bolts or nuts.

        • MostlyStable 2 days ago

          I agree that for weather F is better, but I don't think it's so much better as to be worth having two different temp scales, and unlike K, C is at least reasonable for weather, and it works fine for most scientific disciplines.

        • ryukoposting 2 days ago

          I don't see enough love for feet and inches.

          A foot can be divided cleanly into 2, 3, 4, and 6. Ten is a really sucky number to base your lengths on. It only divides nicely into 2 and 5.

          • runarberg 2 days ago

            People normally just use the subunit which doesn't divide. E.g. height is usually referred to in cm; if accuracy is important they use millimeters. Road signs for cars use km, but downtown wayfinding signs for pedestrians use meters.

            I agree it is really nice to use base-12 until it breaks, but it breaks much worse than metric. If you have to divide into 32nds, everything about feet and inches is much worse (in metric we would just use millimeters). The worst offenders are wrenches, which don't order intuitively. In metric, if your 13 mm wrench is too big, you just grab an 11 mm wrench. In inches, if your 13/16th inch wrench is too big, do you grab the 5/8th or the three-quarters next? (See the sketch below.)
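
            Sorting a drawer of SAE sizes makes the ordering problem obvious; a minimal Python sketch using the standard fractions module:

                from fractions import Fraction

                # Typical SAE wrench sizes, as stamped on the tools.
                sizes = ["5/8", "3/4", "13/16", "11/16", "7/16", "1/2"]

                for s in sorted(sizes, key=Fraction):
                    print(s)
                # 7/16, 1/2, 5/8, 11/16, 3/4, 13/16
                # so the next size down from 13/16 turns out to be 3/4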

            • ryukoposting an hour ago

              Stepping down to the next unit doesn't necessarily make anything tidier. If I need to cut a 3.5-foot piece of wood into thirds, then I cut it into 14-inch pieces. If I need to cut a 1-meter piece of wood into thirds, I cut it into 33.3-centimeter pieces.

              Or, perhaps I want to hang two photos on a wall, spacing them evenly - the math from the example above applies again.

              Regarding your example of dividing 12 into 32 parts - I think that's another good example of the elegance of imperial units. Dividing a foot into 32 parts is 3/8 of an inch! A nice, tidy unit that you'll find on any ruler or measuring tape.

              >In inches if your 13/16th inch wrench is too big, do you grab the 5/8th? or three-quarters next?

              Neither - I'd grab the 25/32" wrench ;) You make a good point.

              I will say that fractional units become more and more intuitive as you use them more often. In a pinch you can just multiply both parts of the fraction by two.

              Here's the thing: with wrenches in fractional units, you can do a binary search. Let's say you start with the 1/2 inch wrench. Too small? grab the 3/4. Too big? Try the 1/4. Work your way down.

              ...or, just remember that a huge share of bolts you'll come by are 7/16" and just start there.

        • nephanth 12 hours ago

          I find it quite strange that Fahrenheit stuck in the USA with its wide range of climates of all places.

          I mean, that "0F to 100F is weather temperature range" completely falls apart unless you live in a very cold climate.

          • ryukoposting an hour ago

            Sure, temperatures go outside those bounds, but only in the most extreme of weather conditions. Below zero? Above 100? You should probably stay inside today.

        • LocalH 2 days ago

          At least conversion between Celsius degrees and Kelvin is easy and lossless

        • lye 2 days ago

          What the hell are you talking about. If it's 0°C outside (or below that), I know that it's high time to put winter tires on because the water in the puddles will freeze and driving on summer tires becomes risky. I had to look it up, but apparently that's +32 °F. Good luck remembering that.

          +10°C is "it's somewhat cold, put a jacket on". +20°C is comfortable in light clothing. +30°C is pretty hot. +40°C is really hot, put as little clothing as society permits and stay out of direct sun.

          Same with negatives, but in reverse.

          Boiling water is +100°C, melting ice is very close to 0°C. I used that multiple times to adjust digital thermometers without having to look up anything.

          It's the most comfortable system I can imagine. I tried living with Fahrenheit for a month just for fun, and it was absolutely not intuitive.

          • ryukoposting an hour ago

            If you had to "look it up" to remember that 32°F is freezing (or that 212°F is boiling), then you clearly didn't "live with Fahrenheit" long enough to have developed even the most basic intuitions for it. That's first-grade stuff.

          • zdragnar 2 days ago

            You'll want winter tires on well before the air temperature hits freezing for water. Forecasts aren't that predictable, and bridges (no earth heat sink underneath) will ice over before roads do.

            40 F is a good time for getting winter tires on.

            As someone who lives in a humid, wet area that goes from -40 at night in winter to 100+ F in summer, I also vastly prefer Fahrenheit.

            The difference between 60, 70, 80 and 90 is pretty profound with humidity, and the same is true in winter. I don't think I've ever set a thermometer to freezing or boiling, ever. All of my kitchen appliances have numbers representing their power draw.

            • lye 2 days ago

              Well, it's been working fine for me for about 15 years, let's agree to disagree here. I would still find it easier to remember to change the tires at +1°C than whatever the hell it comes down to in Fahrenheit.

              I too live in a region with 80 (Celsius) degree yearly variation (sometimes more; the maximum yearly difference I've lived through is about 90 degrees IIRC: -45 in January to +43 in July), and Fahrenheit makes absolutely no sense to me in this climate.

              • rootusrootus 2 days ago

                > Well, it's been working fine for me for about 15 years, let's agree to disagree here.

                If you want to convince yourself, go out on the road in non-winter tires when it is sub-40F, find an open space where you can experiment, and then do a panic stop. Like you might have to do if someone jumps out in front of you.

                That is what convinced me to not wait until it was freezing before I put on cold weather tires.

              • happyraul 2 days ago

                Winter tires are less to do with freezing water and more to do with the way the compound in summer tires hardens and loses elasticity, and therefore grip, at lower temperatures, around 7 degrees Celsius.

      • User23 2 days ago

        It’s been tried. The “rational” calendar reform was something of a failure.

    • zepolen 3 days ago

      That's what most people think and the world keeps trucking along.

      It's the rare people that don't who actually change the world.

      • tempodox 3 days ago

        You can change the world if you make it easier to meet a need enough people have. Persuading everyone they're holding it wrong is not that.

        • lukan 3 days ago

          "You can change the world if you make it easier to meet a need enough people have"

          True and should not be forgotten in this debate.

          But clear communication is a need many people have.

        • __MatrixMan__ 2 days ago

          Persuasion by argument, maybe not. But if you simply ask for clarification when you hear "nth century" but not when you hear "n-hundreds" then you've effectively made it easier for the speaker to meet their need one way over the other way.

          Same thing for "this weekend" when not spoken during a weekend.

    • dgb23 2 days ago

      Specifically I agree, but generally I disagree. I’m very glad we got the metric system, standards for commonly used protocols and so on.

    • jodrellblank 2 days ago

      What's "more logical" about "the seventeenth century" compared to "the sixteen hundreds"?

      • hombre_fatal 2 days ago

        I’d say more sensible. It’s always weird to me to use the number 17 to talk about years that start with 16. Makes more sense to just say the 1600s.

        • dantyti 2 days ago

          After their 16th birthday, the person is going through their 17th year.

          Just like 11:45 can be read as "a quarter to 12".

          • rootusrootus 2 days ago

            > After their 16th birthday, the person is going through their 17th year.

            While that is true, does it not illustrate exactly the problem? Nobody ever says someone is in their 17th year when they are 16. That would be very confusing.

            • dantyti a day ago

              People in my country sometimes do, as do uni students, who always say which year they're in, not how many years they've finished.

    • adamomada 2 days ago

      You just made me realize that the common saying “the eleventh hour” isn’t what anyone thinks it is

    • leereeves 2 days ago

      > I think it's more doable to learn to just live with that than to reprogram mankind.

      Why not just fix the calendar to match what people expect?

      There was no time when people said "this is year 1 AD". That numbering was created retroactively hundreds of years later. So we can also add year 0 retroactively.

  • jrockway 3 days ago

    On the other hand "1700s art" sounds like trash compared to "18th century art".

    • burkaman 3 days ago

      I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless, and if possible it would be more useful to say something like "neoclassical art from the 1700s". "18th century" isn't an artistic category, but it kind of sounds like it is if you just glance at it. "Art from the 1700s" is clearly just referring to a time period.

      • BlarfMcFlarf 2 days ago

        Agreed. The haiku is “18th century art” as that’s when it was first invented. So it’s either a uselessly broad category, or an indefensibly Eurocentric one.

      • darby_nine 3 days ago

        > I think that's good, because it helps you realize that categorizing art by century is kind of arbitrary and meaningless

        no it won't lol, people will pay just as much through the new dating system as they would through the old.

        • coldtea 3 days ago

          People pay that much for art because they are that rare combination: an educated person with money who values the aesthetics and artifacts of an era, or wants something to signal their wealth to others, or needs a way to launder money.

    • bandyaboot 3 days ago

      If using “1700s”, I’d write it as “art of the 1700s”.

    • rz2k 3 days ago

      How about if you say "settecento"? Maybe it is a new confusion that they drop a thousand years, and maybe it would imply Italian art specifically.

      • lucb1e 3 days ago

        Just to make sure I understood this, that would be used as "17th settecento" to mean 1700s right?

        (This Xth century business always bothered and genuinely confused me to no end and everyone always dismissed my objections that it's a confusing thing to say. I'm a bit surprised, but also relieved, to see this thread exists. Yes, please, kill all off-by-one century business in favor of 1700s and 17th settecento or anything else you fancy, so long as it's 17-prefixed/-suffixed and not some off-by-anything-other-than-zero number)

        • animaomnium 3 days ago

          "settecento" can be read as "seven hundred" in Italian; gramps is proposing to use a more specific word as a tag for Italian art from the 1700s. Of course, 700 is not 1700, hence the "drop 1000 years". The prefix seventeen in Italian is "diciassette-" so perhaps "diciasettecento" would be more accurate for the 1700s. (settecento is shorter, though.)

          Hope this clarifies. Not to miss the forest for the trees, to reiterate, the main takeaway is that it may be better to define and use a specific tag to pinpoint a sequence of events in a given period (e.g. settecento) instead of gesturing with something as arbitrary and wide as a century (18th century art).

        • rz2k 3 days ago

          Think of it as the 700s, which is a weird way to refer to the 1700s, unless you are taking a cue from the common usage. That’s just how the periods are referenced by Italian art historians.

          • hk__2 3 days ago

            > That’s just how the periods are referenced by Italian art historians.

            And Italian people in general.

          • psychoslave 3 days ago

            Not much different from the 60s referring to 1960 to 1969, to my mind.

        • coldtea 3 days ago

          settecento means "700". Just proposed above as a way to say 18th century or 1700s, same as we sometimes remove the "2000" and just say "the 10s" for the decade starting 2010 (nobody cares for the 2011-as-start convention except people you don't want to talk to in the first place).

    • cm2187 2 days ago

      And 1700s already has a different meaning, i.e. early 18th century.

  • semireg 3 days ago

    The right answer was, and still is: Jan 1, 1901

    • glitcher 3 days ago

      Incorrect, this answer wasn't given in the form of a question ;)

    • readthenotes1 3 days ago

      How can that be if 15 of those centuries are on the Julian calendar?

      • whycome 3 days ago

        Also, when they switched things in 1582:

        https://www.britannica.com/story/ten-days-that-vanished-the-....

        > The most surreal part of implementing the new calendar came in October 1582, when 10 days were dropped from the calendar to bring the vernal equinox from March 11 back to March 21. The church had chosen October to avoid skipping any major Christian festivals.

      • Archelaos 2 days ago

        The "original" Julian calendar was indifferent to year number systems. The Romans typically used the consular year, although Marcus Terentius Varro "introduced" the ab urbe condita (AUC) system in the 1st century BC, which was used until the Middle Ages. From the 5th to the 7th century, the anno Diocletiani (also called anno martyrum) after emperor Diocletian was used primarily in the eastern empire (Alexandria), or the anno mundi (after the creation of the world). It was Dionysius Exiguus in the 6th century, who replaced the anno Diocletiani era with the Anno Domini era. His system become popular in the West, but it took a long time until it also was adopted in the East. Its application to years before the birth of Christ is very late: we come across it first in the 15th century, but it was not widespread before the 17th century.

        All these systems used the Julian system for months and days, but differed in terms of the year and (partialy) in the first day of the year.

      • pdonis 3 days ago

        The century in which the switch occurred (which was different in different countries) was shorter than the others. As were the decade, year, and month in which the switch occurred.

    • hgomersall 3 days ago

      No, the first century began Jan 1, 0000. Whether that year actually existed or not is irrelevant - we shouldn't change our counting system in the years 100, 200 etc.

      • Izkata 3 days ago

        The calendar goes from 1 BC to 1 AD, there is no year 0.

        • rcoveson 3 days ago

          There is no year zero according to first-order pedants. Second-order pedants know that there is a year zero in both the astronomical year numbering system and in ISO 8601, so whether or not there is a year zero depends on context.

          It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.
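
          Concretely, astronomical numbering just shifts BC years by one so that the arithmetic works; a small sketch of the mapping:

              # Astronomical year numbering: 1 BC -> 0, 2 BC -> -1, n BC -> 1 - n.
              def bc_to_astronomical(n_bc):
                  return 1 - n_bc

              def label(y):
                  return f"AD {y}" if y > 0 else f"{1 - y} BC"

              print(bc_to_astronomical(1))   # 0 -- the year zero
              print(label(0))                # 1 BC
              print(label(-44))              # 45 BC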

          • worstspotgain 3 days ago

            Yes but, is there such a thing as a zeroth-order pedant, someone not pedantic about year ordinality? As a first-order meta-pedant, this would be my claim.

            Moreover, I definitely find the ordinality of pedantry more interesting than the pedantry of ordinality.

            • jessekv 2 days ago

              Interesting indeed. I suppose third-order pedantry must be "jerk".

            • nativeit 3 days ago

              Thank you for your service.

          • marc_abonce 3 days ago

            > It's ultimately up to us to decide how to project our relatively young calendar system way back into the past before it was invented. Year zero makes everything nice. Be like astronomers and be like ISO. Choose year zero.

            Or, just to add more fuel to the fire, we could use the Holocene/Human year numbering system to have a year zero and avoid any ambiguity between Gregorian and ISO dates.

            https://en.wikipedia.org/wiki/Holocene_calendar

          • m2f2 2 days ago

            Talking about standards, let's not pick and choose.

            First, let's get rid of miles and feet, then we could even discuss this.

            • lobsterthief 2 days ago

              If only—I think most US citizens who actually work with units of measurement on a daily basis would love to switch to the metric system. Unfortunately, everyone else wants to keep our “freedom units” (and pennies)

          • jl6 3 days ago

            We are all de facto ISO adherents by virtue of our lives being so highly computer-mediated and standardized. I’m fully on board with stating that there absolutely was a year zero, and translating from legacy calendars where necessary.

          • hollerith 3 days ago

            I vote for a year zero and for using two's complement for representing years before zero (because it makes computing durations that span zero a little easier).
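
            Signed years (which is what two's complement buys you) do make the duration arithmetic trivial; a sketch of the difference:

                # Astronomical: 5 BC is -4, so AD 5 minus 5 BC is just 5 - (-4) = 9.
                # With signed BC/AD years (-5 means 5 BC) you must skip the missing year 0:
                def elapsed(y1, y2):
                    gap = y2 - y1
                    return gap - 1 if y1 < 0 < y2 else gap

                print(elapsed(-5, 5))   # 9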

        • hgomersall 3 days ago

          What does that even mean? Do we allow for the distortion due to the shift from the Julian to Gregorian calendars, such that the nth year is 11 days earlier? Of course not, because that would be stupid. Instead, we accept that the start point was arbitrary and refer to our normal counting system, rather than getting hung up about the precise number of days since some arbitrary epoch.

          • pdonis 3 days ago

            > What does that even mean?

            It means just what it says. In the common calendar, the year after 1 BC (or BCE in the new notation) was 1 AD (or CE in the new notation). There was no "January 1, 0000".

            • hgomersall 3 days ago

              As I said twice, whether that date actually existed or not is irrelevant.

              • pdonis 3 days ago

                > whether that date actually existed or not is irrelevant.

                No, it isn't, since you explicitly said to start the first century on the date that doesn't exist. What does that even mean?

                • lupire 3 days ago

                  The first day of the 1st Century is Jan 1, 1 AD.

                  The point is that some days got skipped over the centuries, but there's no need to make the Centuries have weird boundaries.

                  • pdonis 3 days ago

                    > The first day of the 1st Century is Jan 1, 1 AD.

                    That's not what the poster I originally responded to is saying. He's saying the 1st Century should start on a nonexistent day.

                    • antonvs 3 days ago

                      You can make this work by having the 1st century start on the last day of 1 BC. Think of it as an overlap if you like; it doesn't really matter.

                      That allows for consistent zero-indexed centuries. It doesn't have any other practical consequences that matter.

                    • hgomersall 3 days ago

                      No, I'm saying we ignore when it actually started and instead use the normal rules of counting to decide what to call the respective centuries.

                • zarzavat 2 days ago

                  0 CE = 1 BCE

                  10 C = 50 F = 283.15 K

                  1 = 0.999…

                  Things can have more than one name. The existence of the year 0 CE is not in question. What’s in question is whether that’s a good name for it or not.

        • coldtea 3 days ago

          Hence why the parent wrote "Whether that year actually existed or not is irrelevant".

          They might or might not have a point, but they already addressed yours.

  • d0mine 3 days ago

    There is no "0" year, 1 is the 1st year, so 100th year is still the 1st century, therefore 2nd century starts in 101 and 20th in 1901.

    • notfed 3 days ago

      I find this decree frustrating. Someone could have just as easily said "the 'first' century starts at 1 BC" to account for this.

      • Thorrez 3 days ago

        Then what is the last year of the first century BC? 2 BC? Now there's an off-by-2!

      • 38 3 days ago

        Or better yet just year 0, why not? Do we say the 80s start in 1981?

        • rdlw 3 days ago

          The concept of zero had not yet been popularized in 500s Europe, when the system was devised.

          • karatinversion 3 days ago

            And also, the system is a direct descendant of regnal numbering, where zero wouldn’t have made sense even if invented (there is no zeroth year of Joe Biden’s term of office).

    • coldtea 3 days ago

      Doesn't matter, we can just agree the first century had 99 years, and be done with it.

      We have special rules for leap years, that would just be a single leap-back century.

      At the scale of centuries, starting the 2nd century at 100 as opposed to 101 is just a 1% error, so we can live with it. For the kind of uses we put centuries to (not doing math, but talking roughly about historical eras) it's inconsequential anyway.

  • ajuc 2 days ago

    Depends on the language. Century being 3 syllables really makes it long in English, but it's still 5 syllables vs 5 syllables.

    In Polish: [lata] tysiącsiedemsetne ("the seventeen-hundreds [years]", 6 [+2] syllables) vs osiemnasty wiek ("eighteenth century", 5 syllables).

  • zozbot234 2 days ago

    1700s means 1700–1709, i.e. roughly the first decade in the 18th century. Just like '2000s'. The OP acknowledges this issue and then just ignores it.

    • Viliam1234 2 days ago

      I have a solution that would work in writing, but not sure how to pronounce it:

      1700s means 1700–1709

      1700ss means 1700–1799

      To go one step further:

      2000s means 2000-2009

      2000ss means 2000-2099

      2000sss means 2000-2999
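
      In code the proposal is tidy enough: the number of trailing s's is how many trailing digits vary. A toy parser of this hypothetical notation:

          def s_range(label):
              base = label.rstrip("s")
              k = len(label) - len(base)   # count of trailing s's
              start = int(base)
              return start, start + 10**k - 1

          print(s_range("1700s"))     # (1700, 1709)
          print(s_range("1700ss"))    # (1700, 1799)
          print(s_range("2000sss"))   # (2000, 2999)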

  • masswerk 3 days ago

    So shouldn't this be the "0-episode"? ;-)

    (0, because only after the first episode do we actually have 1 episode performed. Consequently, the 1-episode is then the second one.)

  • swyx 2 days ago

    that is fascinating trivia. you could do a whole Jeopardy on Jeopardy facts alone

  • drivers99 3 days ago

    There are numerous common concise ways to write the 18th century, at the risk of needing the right context to be understood, including “C18th”, “18c.”, or even “XVIII” by itself.

    • anyfoo 3 days ago

      These are even more impractical, so I wonder what your point is? I can come up with an even shorter way to say 18th century, by using base26 for example, so let's denote it as "cR". What has been gained?

  • Kwpolska 3 days ago

    > very first

    It’s actually the second.

    > Trebeck's

    Trebek's*

    • card_zero 3 days ago

      Let's reform Alex Trebek's name, it's difficult.

      • c0balt 3 days ago

        And while we are at it, Tim Apple

        • bitwize 3 days ago

          There was an airport-novel series about a future where people's surnames are the company they work for. It was called Jennifer Government.

          Some of the characters in Death Stranding, namely the main one, have a given-name, profession, employer convention -- as in Sam Porter Bridges.

          • krsdcbl 3 days ago

            Death Stranding's naming is not too far from very common naming conventions throughout history; it's a nicely subtle touch.

            Glenn Miller, Gregory Porter and Sam Smith just happen to have been more inclined to make music.

          • monkeyfun 3 days ago

            Ahhh, reminding me of NationStates too. What a curious little website / online community.

          • nkrisc 3 days ago

            And in the far future of that future surnames like Government or PepsiCo or Alcoa will be as common as Smith, Fletcher, and Miller.

  • dantyti 2 days ago

    What about languages that don’t have an equivalent to “the Xs” for decades or centuries?

    Also, 1799 is obviously more than 1700, as is 1701 > 1700 – so why should the naming convention tie itself to the lesser point? After one's third birthday, the person is starting their fourth year and is not living in their third year.

    I feel this is relevant https://xkcd.com/927/

  • darby_nine 3 days ago

    > Author makes a good point. "1700s" is both more intuitive and more concise than "18th century".

    Yea, but a rhetorical failure. This sounds terrible and far worse than alternatives.

    If we want a better system we'll need to either abandon the day or the Gregorian (Julian + drift) calendar.

milliams 3 days ago

It's easy, we should have simply started counting centuries from zero. Centuries should be zero-indexed, then everything works.

We do the same with people's ages. For the entire initial year of your life you were zero years old. Likewise, from years 0-99, zero centuries had passed so we should call it the zeroth century!

At least this is how I justify to my students that zero-indexing makes sense. Everyone's fought the Xth-century-vs-X-hundreds confusion before, so they welcome the relief.

Izzard had the right idea: https://youtu.be/uVMGPMu596Y?si=1aKZ2xRavJgOmgE8&t=643

  • mmmmmbop 3 days ago

    > We do the same with people's ages.

    No, we don't.

    When we refer to 'the first year of life', we mean the time from birth until you turn 1.

    Similarly, you'd say something like 'you're a child in the first decade of your life and slowly start to mature into a young adult by the end of the second decade', referring to 0-9 and 10-19, respectively.

    • Uehreka 3 days ago

      > No, we don't.

      But practically speaking we usually do. I always hear people refer to events in their life happening “when I was 26” and never “in the 27th year of my life”. Sure you could say the latter, but practically speaking people don’t (at least in English).

      • xg15 2 days ago

        I think of the age number "practically" as the number of "birthday celebrations" I have experienced, excluding the actual day of birth. That's the same as the amount of completed years I've lived on this earth, and one less than the year I'm living in, because that year is not yet completed. (Except of course on birthdays)

        But I think this also illustrates just how averse our culture is to using zero-indexing in counts: the age number absolutely is zero-indexed - a baby before the first birthday is zero years old. But no one puts it that way; instead we drop the year count entirely and fall back to the next-largest nonzero unit, i.e. we say the baby is so-and-so-many months old. And for newborns not yet a month old, we count in weeks, etc.

        I think, culturally, it's not that surprising as this method of counting is older than the entire concept of "zero". But I think it shows that there is little hope of convincing a large number of non-nerd people to start counting things with zeros.

      • mjmahone17 3 days ago

        “Half one” is archaic English, and common German, for 12:30. Similarly, “my 27th year” just sounds archaic to me: I wonder, if you went through a bunch of 19th century writing, whether you'd see ages given more often as “Xth year” vs “X-1 years old”.

        There may be something cultural that caused such a shift, like a change in how math or reading is taught (or even that it’s nearly universally taught, which changes how we think and speak because now a sizeable chunk of the population thinks in visually written words rather than sounds).

        • HPsquared 2 days ago

          A lot of European languages say "I have x years" instead of "I am x years old". It emphasises the "milestone" nature, as in "I have x full years".

        • einherjae 3 days ago

          Isn’t “half one” used as a short form of “half past one” these days, i.e. 01:30? That has been a source of confusion for someone used to the Germanic way.

          • larusso 3 days ago

            I had this exact topic with an Irish coworker who lives in Germany and has trouble conveying the right time. For me as a German, „half one“ is half of one, so 12:30. Same for „Dreiviertel eins“ -> „three-quarters one“ being 12:45 and „Viertel eins“ -> „quarter one“ being 12:15. To be fair, the logic behind this is itself a constant source of confusion, as some parts of Germany instead use „viertel vor“ and „viertel nach“ -> „quarter to“ and „quarter after“ and have no understanding of the three-quarters business.

          • OJFord 2 days ago

            In the UK yes, I think not in AmE? At least I'm pretty sure they don't say 'quarter to' or 'quarter past', and do say 'a half after'.

            (I had some confused conversation with a bus driver once. Bizarre experience to have so much language barrier between two EFL speakers, in English!)

          • beAbU 2 days ago

            The Irish like to say "half one" meaning "half past one". In my native timekeeping parlance "half een" means 12h30. Germanic/Dutch origin.

            So whenever I talk time with the locals here I repeat the time back in numerical style to avoid confusion.

            "The shop opens tomorrow at half ten".

            "Thanks, store opens at nine thirty. See you then."

            "No..."

        • Unbefleckt 3 days ago

          Had no idea myself, my peers, my family and my community used archaic English.

          • stavros 3 days ago

            Do you say "half ten" to refer to 9:30? If so, you're using archaic English, yep!

      • mtlmtlmtlmtl 3 days ago

        That's not really indexing from 0 though. It's just rounding the amount of time you've lived down to the nearest year. You get the same number, but semantically you're saying roughly how old you are, not which year you're in. This becomes obvious when you talk to small children, who tend to insist on saying e.g "I'm 4 and a half". And talking about children in their first year, no one says they're 0. They say they're n days/weeks/months old.

      • SllX 3 days ago

        In an indirect manner, we do mark having lived the 27th year in the following forms; we just don’t say it exactly the way you phrased it:

        1. On your 26th birthday, when you say you turned 26, what it means is that you have now lived 26 years. People generally understand this, even if they are going to spend the next year saying they are 26.

        2. It is not uncommon for people to demarcate their age on their birthday in revolutions around the Sun, as a kind of meme. “I’ve now traveled around the Sun twenty-six times.” or something like that, when reflecting on their lives on their Birthday.

        The colloquial usage is our legally-defined age. A shortcut for our laws to take, the age-gating ones anyway. It hasn’t replaced our cultural understanding of what the first year of our life actually was.

    • jcelerier 3 days ago

      The first year of life is the year indexed with zero, just like the first centimeter/inch in a ruler is the centimeter/inch indexed with zero

      • Sardtok 3 days ago

        And so is the first century of the zero-indexed calendar.

      • mmmmmbop 3 days ago

        I agree, that was my point.

    • kelnos 3 days ago

      > When we refer to 'the first year of life', we mean the time from birth until you turn 1.

      Sure, but no one ever uses that phrasing after you turn one. Then it's just "when they were one", "when they were five", whatever.

      So sure, maybe we can continue to say "the 1st century", but for dates 100 and later, no more.

      • rootusrootus 2 days ago

        > Sure, but no one ever uses that phrasing after you turn one.

        Heck, few people say anything about 'the first year of life' even when talking about someone that young. It is too imprecise, because things change so rapidly. In my experience the most common convention is to use months to describe age before someone turns 2.

    • furyofantares 3 days ago

      On your sixth birthday we put a big 5 on your cake and call you a 5 year old all year.

      Can't say I've ever had to refer to someone's first year or first decade of their life, but sure I'd do that if it came up. Meanwhile, 0-indexed age comes up all the time.

      • subroutine 3 days ago

        The number we put on the cake represents the number of "years old" (i.e. the number of birthday anniversaries) not the number of birth days someone had (obviously). Zero year-olds are 0, one year-olds are 1, ...

      • drdec 3 days ago

        > On your sixth birthday we put a big 5 on your cake and call you a 5 year old all year.

        If you are going to be that pedantic, I would point out that one only has one birthday.

        (Well, unless one's mother is extremely unlucky.)

        • naniwaduni 3 days ago

          "Birthday" does not mean the same thing as "date of birth".

          • Detrytus 2 days ago

            “Birthday” is just short for “birth day anniversary”, I guess…

          • copperx 3 days ago

            yes, birthday and birth day are different things. Just like everyday and every day have different meanings, and it isn't confusing (to most people).

          • furyofantares 3 days ago

            Well regardless the number we constantly use is 0-indexed.

      • layer8 3 days ago

        birthday != date of birth

        “Birthday” really means “anniversary of the date of birth”.

        • samatman 3 days ago

          Spanish has us beat on this one: cumpleaños, "completed year" basically.

    • daynthelife 3 days ago

      My preference is semi-compatible with both conventions:

      First = 0, Second = 1, Toward = 2, Third = 3, …

      This way, the semantic meaning of the words “first” (prior to all others) and “second” (prior to all but one) are preserved, but we get sensical indexing as well.

  • kstrauser 3 days ago

    We don't 0-index people's ages. There are a million books about "baby's first year", while they're still 0 years old.

    • Terretta 3 days ago

      Except we do, as soon as we need the next digit.

      In "figure of speech", or conventual use, people start drinking in their 21st year, not their 22nd. In common parlance, they can vote in their 18th year, not their 19th.

      We talk of a child in their 10th year as being age 10. Might even be younger. Try asking a people if advice about a child in their "5th year of development" means you're dealing with a 5 year old. Most will say yes.

      So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!

      Similarly "centuries" don't have a century digit until the 100s, which would make that the 1st century and just call time spans less than that "in the first hundred years" (same syllables anyway).

      It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.

      • kergonath 3 days ago

        > In "figure of speech", or conventual use, people start drinking in their 21st year, not their 22nd. In common parlance, they can vote in their 18th year, not their 19th.

        That’s not the case, though. They can vote (and drink, in quite a few countries) when they are at least 18 years old, not when they are in their 18th year (who would even say that?)

        People are 18 years old (meaning that 18 years passed since their date of birth) on their 18th birthday. There is no need of shoehorning 0-based indexing or anything like that.

        > Most will say yes.

        Most people say something stupid if you ask tricky questions, I am not sure this is a very strong argument. Have you seriously heard anybody talking about a child’s “5th year of development”, except maybe a paediatrician? We do talk about things like “3rd year of school” or “2nd year of college”, but with the expected (1-indexed) meaning.

        > So perhaps it's logical to count from zero when there's no digit in the magnitude place, because you haven't achieved a full unit till you reach the need for the unit. Arguably a baby at 9 months isn't in their first year as they've experienced zero years yet!

        It’s really not. To have experienced a full year, you need a year to have passed, which therefore has to be the first. I think that’s a cardinal versus ordinal confusion. The first year after an event is between the event itself and its first anniversary. I am not aware of any context in which this is not true, but obviously if you have examples I am happy to learn.

        > It's unsatisfying, but solves the off by one errors, one of the two hardest problems in computer science along with caching and naming things.

        Right. I know it is difficult to admit for some of us, but we are not computers and we do not work like computers (besides the fact that computers work just fine with 1-indexing). Some people would like it very much if we counted from 0, but that is not the case. It is more productive to understand how it works and why (and again cardinals and ordinals) than wishing it were different.

        • Terretta 3 days ago

          > who would even say that?

          Writers.

          And yes, cardinal versus ordinal is my point. The farther from the origin, the less people are likely to want them different.

      • kdmccormick 3 days ago

        What? No. When you are 0, it is your first year. When you are 21, you have begun your 22nd year. In the US you are legal to drink in your 22nd year of life.

        You are correct that nobody says "22nd year" in this context, but nobody says "21st year" either. The former is awkward but the latter is just incorrect.

        • Terretta 3 days ago

          > nobody says "21st year" either

          On the contrary, enough people say it that it's a Quora question:

          https://www.quora.com/What-does-it-mean-to-be-in-your-twenty...

          Authors love phrases like this. Which, in turn, comes from another ordinal/cardinal confusion stemming back to common law:

          "A person who has completed the eighteenth year of age has reached majority; below this age, a person is a minor."

          That means they completed being 17, but that's just too confusing, so people think you stop being a minor in your 18th year.

          • bregma 3 days ago

            It's just not true. You've completed being 17 years old on your 18th birthday, when you enter your 19th year and can count 18 years under your belt.

            Consider a newborn. As soon as they're squeezed out they are in their first year of life. That continues until the first anniversary of their decanting, at which point they are one year old and enter their second year of life.

            There is nobody, nobody, who refers to a baby as being in their zeroth year of life. Nor would they refer to a one-year-old as still being in their first year of life as if they failed a grade and are being held back.

            The pattern continues for other countable things. Breakfast is not widely considered the zeroth meal of the day. Neil Armstrong has never been considered the zeroth man on the moon nor is Buzz Aldrin the first. The gold medal in the Olympics is not awarded for coming in zeroth place.

            • kelnos 3 days ago

              > It's just not true.

              No one's saying it's true! All that's being claimed is that writers will often use phrases like "became an adult in their 18th year" or "was legally allowed to drink in their 21st year".

              It's completely incorrect, but some people use it that way, and ultimately everyone understands what they actually mean.

          • antonvs 3 days ago

            The top response in your Quora link is that your 21st year "means you’re 20. You have had your 20th birthday, but not yet your 21st." That is the conventional definition.

            People commonly make the mistake of thinking otherwise, but that's all it is. A mistake.

      • afiori 2 days ago

        This is an instance of the general fencepost problem: a path of length n (n edges) has n+1 nodes.

    • bmacho 3 days ago

      If you point at year-long intervals, then those will be year-long intervals indeed.

      Nevertheless, the traditional "how old are you" system uses a number one less.

    • User23 3 days ago

      We also talk about someone’s first day at work during that day.

    • bitwize 3 days ago

      You are, in general, n-1 years old in your nth year. Only when you complete your nth year do you turn n years old.

    • mixmastamyk 3 days ago

      How old is the baby now? Six months…

    • copperx 3 days ago

      Ages are zero-indexed, but people avoid saying zero by counting age in months during year 0, then switching to years from year one.

    • OJFord 2 days ago

      Yeah we do, because their 'first year' isn't their age. We do their age in (also zero-indexed) months/weeks/days.

      In Indian English terms, we do 'complete' age - aiui more common in India is to one-index, i.e. you're the age of the year you're in, and to disambiguate you might hear someone say they're '35 complete', meaning they have had 35 anniversaries of their birth (36 incomplete).

  • drewcoo 3 days ago

    > we should have simply started counting centuries from zero

    Latin, like Lua, is 1-indexed.

    • IshKebab 3 days ago

      I feel like the Romans had an excuse for that mistake. Not sure about Lua.

      • bigstrat2003 3 days ago

        I don't think it is a mistake for Lua. The convention to zero-index arrays is not sacrosanct, it's just the way older languages did it (due to implementation details) and thus how people continue to do it. But it's very counter-intuitive, and I think it's fair game for new languages to challenge assumptions that we hold because we're used to past languages.

        • worstspotgain 3 days ago

          Zero-based arrays are counter-intuitive for a while, but if you deal with a lot of data, you typically realize that it's a small price to pay to make manipulation much easier in many contexts. For instance, if you have a ring buffer of size N and an unwrapped position P, the wrapped position is:

          Zero-based: P % N

          One-based: ((P - 1) % N) + 1

          It might seem trivial, but each +/-1 is an opportunity for confusion and a bug nest. With zero-based arrays, it's often the case that the only required +/-1's are when producing and consuming human-readable one-based text.
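
          A minimal sketch in Python (function names are mine, purely illustrative):

              # Wrapped position in a ring buffer of n slots,
              # given an unwrapped position p.
              def wrap_zero_based(p, n):
                  return p % n

              def wrap_one_based(p, n):
                  return ((p - 1) % n) + 1

              assert wrap_zero_based(5, 5) == 0  # slots run 0..4
              assert wrap_one_based(5, 5) == 5   # slots run 1..5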

          The next stop on the zero-based epiphany train is the realization that a convenient way to store a range is a { first, first_past } tuple. The size of the range is (first_past - first). The whole-array range is { 0, size }, while a simple empty range is { 0, 0 } (zero is often the default initialization, simplifying things further.)

          Both elements are indices, so they can be similarly manipulated, compared and range-checked, making many 'if' clauses easier to think about and verify. If there is a bug, it often ends up being harmless because of the arithmetic properties of this scheme.

          Once you start dealing with multiple ranges, the advantages are even more obvious. Two ranges are adjacent iff (first_a == past_b || first_b == past_a). The intersection of two ranges is { max(first_a, first_b), min(past_a, past_b) }, which is nonempty iff they overlap. An array of M adjacent ranges is stored as a uniform (M+1)-tuple.
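
          A rough Python sketch of the same tuple scheme (the names are mine, just to make the properties concrete):

              # A range is (first, first_past); its size is first_past - first.
              def is_empty(r):
                  return r[1] <= r[0]

              def adjacent(a, b):
                  return a[0] == b[1] or b[0] == a[1]

              def intersection(a, b):
                  # Empty iff the two ranges don't overlap.
                  return (max(a[0], b[0]), min(a[1], b[1]))

              left, right = (0, 4), (4, 10)
              assert adjacent(left, right)
              assert is_empty(intersection(left, right))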

          This realization has become so second-nature for me that I'm probably overlooking four or five even better examples here.

        • weinzierl 3 days ago

          > "it's just the way older languages did"

          It's a C family (predecessors and descendants) idiosyncrasy that very unfortunately got out of hand. Most other old languages had either 1-based indexing or were agnostic. Most notably FORTRAN, the language for numerical calculations, is 1-based.

          The seminal book Numerical Recipes was first published as 1-based for FORTRAN and Pascal, and they only later added a 0-based version for C.

          Personally, coming from Pascal, I think the agnostic way is best. It is not only about 0-based or 1-based but that the type system encodes and verifies the valid range of index values, e.g. in Pascal you define an array like this:

              temperature = array [ -35 .. 60 ] of real;
          
          You will get an immediate compile-time error if you use

             temperature[61];
          
          At least with Turbo Pascal you could choose whether you wanted run-time checks as well.

          I have a hard time wrapping my head around the fact that this feature is pretty much absent from every practically used language except Ada.

        • IshKebab 3 days ago

          It's not sacrosanct but by the time Lua was created it was very clearly the right choice.

          • afiori 2 days ago

            It is the best choice, but the difference mostly shows up when working manually with slices, offsets, and array windows; for plain indexing and iterating the two are mostly the same, with maybe a small benefit for mathy notation (the reason why Julia is 1-indexed).

        • antonvs 3 days ago

          It's not counter-intuitive at all, it only seems that way because people are now used to languages with zero-based indexing. That's almost entirely because of the C language, which used pointer offset arithmetic with its arrays.

          Outside of that machine context, where an array is a contiguous block of RAM that can be indexed with memory pointers, there's no particular reason to do offset indexing. 1-based indexing ("first element, second element") works just fine and is perfectly intuitive.

          Different types of indexing can make sense in different situations. Some languages even allow that. In Ada, for example, arrays can start at whatever index you define.

          • demurgos 3 days ago

            There are reasons unrelated to pointer implementations such as the interval argument from Dijkstra's article or conversions between flat and multidimensional array indexes. There's a reason why most mathematical sequences start at zero: it leads to simpler expressions. Vec (and especially matrix) indexing should have been zero-based.
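
            A small Python illustration of the conversion point (helper names are hypothetical):

                # Zero-based: flat <-> (row, col) conversion has no +/-1 noise.
                def to_flat(row, col, ncols):
                    return row * ncols + col

                def from_flat(i, ncols):
                    return divmod(i, ncols)  # (row, col)

                # One-based: the same conversion picks up correction terms.
                def to_flat_one_based(row, col, ncols):
                    return (row - 1) * ncols + col

                assert from_flat(to_flat(2, 3, 7), 7) == (2, 3)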

          • IshKebab 3 days ago

            > 1-based works just fine

            It really doesn't. You can make it work, obviously, but you end up with much less elegant code, with +1 and -1 all over the place. E.g. for accessing row i of a matrix you get [width*(i-1)+1, width*i+1) instead of the far saner [width*i, width*(i+1)).

            Generic code also becomes much more awkward.

            • antonvs 3 days ago

              Both are less elegant in different scenarios. In many business scenarios with zero-based indexes, you need i+1 everywhere because no-one talks about, e.g., the zeroth year of a company's operation.

              Neither is a true one-size-fits-all solution. They're different kinds of indexes that serve different purposes. The choice of zero-based everywhere is an engineering tradeoff, nothing more.

              • IshKebab 2 days ago

                > In many business scenarios with zero-based indexes, you need i+1 everywhere because no-one talks about the e.g. the zeroth year of a company's operation.

                Perhaps, but this is extremely rare compared to tasks that are far more elegant with 0-based indexing. Also the worst you can get there is a single +1 in the display code, while trying to shoe-horn algorithms into 1-based code can get much more awkward.

  • munchler 3 days ago

    If people start saying “zeroth century”, it’s only going to create confusion, because “first century” will then become ambiguous.

  • eddieroger 2 days ago

    Your metaphor is comparing apples and oranges. When we count life, it's "one year old" or "aged one year," both of which mark the completion of a milestone. Using the term "18th century" is all-encompassing of that span, which is a different use case. When one recollects over the course of someone's life, like in a memoir, it would be normal to say "in my 21st year", referring to the time between turning 20 years old and 21 years old.

TheAceOfHearts 2 days ago

Another example of confusing numeric systems emerges from 12-hour clocks. For many people, asking them to specify which one is 12AM and which one is 12PM is likely to cause confusion. This confusion is immediately cleared up if you just adopt a 24-hour clock. This is a hill I'm willing to die on.
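
The quirk is easy to see if you write the conversion down; a throwaway Python sketch:

    def to_24h(hour_12, meridiem):
        # 12 AM is 00:00 and 12 PM is 12:00, so 12 is the one
        # value that doesn't follow the "PM adds 12" rule.
        if meridiem == "AM":
            return 0 if hour_12 == 12 else hour_12
        return 12 if hour_12 == 12 else hour_12 + 12

    assert to_24h(12, "AM") == 0   # midnight
    assert to_24h(12, "PM") == 12  # noon
    assert to_24h(6, "PM") == 18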

  • flakes 2 days ago

    A few months ago, my girlfriend and I missed a comedy show because we showed up on the wrong day. The ticket said Saturday 12:15am, which apparently meant Sunday 12:15am, as part of the Saturday lineup. Still feel stupid about that one.

  • rbits 16 hours ago

    That's not inherent to AM/PM; it's because, for some reason, AM/PM switches on the boundary of 11 to 12, not 12 to 1. To fix this while still having AM work with 24hr time, it should be 0 instead of 12.

  • rootusrootus 2 days ago

    > just adopt a 24-hour clock. This is a hill I'm willing to die on.

    I don't know if I feel that strongly about it but I tend to agree. I see more value in adopting a 24 hour clock than making SI mandatory. AM/PM is silly.

  • runarberg 2 days ago

    You usually know it from context, and if not, 12 noon or 12 midnight is quite common.

    But I do wish people would stop writing schedules in the 12-hour system. You get weird stuff like bold meaning PM etc. to compensate for the space inefficiency of the 12-hour system.

  • cvdub 2 days ago

    That’s why I always say “12 noon” when writing out scheduling instructions for something at 12PM.

wryoak 3 days ago

I thought this article was railing against the lumping together of entire spans of hundreds of years as being alike (ie, we lump together 1901 and 1999 under the name ”the 1900s” despite their sharing only numerical similarity), and was interested until I learned the author’s real, much less interesting intention

  • endofreach 3 days ago

    Many people find their own thoughts more interesting than the ones of others. Some write. Many don't.

  • layer8 3 days ago

    [flagged]

    • psychoslave 3 days ago

      I found your sarcasm very enjoyable and ended up disappointed I couldn't come up with the next meta-sarcasm. How dare you?!

MarkLowenstein 3 days ago

A lot of this runaround is happening because people get hung up on the fact that the "AD" era began as AD 1. But that year is not magic--it didn't even correlate with the year of Jesus's birth or death. So let's just start the AD era a year before, and call that year "AD 0". It can even overlap with BC 1. BC 1 is the same as AD 0. Fine, we can handle that, right? Then the 00s are [0, 100), 100s are [100, 200), etc. Zero problem, and we can start calling them the 1700s etc., guilt free.
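
Under that scheme the century label is pure integer division; a tiny sketch, assuming the proposed AD 0 counting:

    def century_label(year):
        # 0-99 -> "the 0s", 1700-1799 -> "the 1700s", etc.
        return f"the {(year // 100) * 100}s"

    assert century_label(1776) == "the 1700s"
    assert century_label(1800) == "the 1800s"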

  • jrockway 3 days ago

    I would also accept that the 1st century has one less year than future centuries. Everyone said Jan 1, 2000 was "the new millennium" and "the 21st century". It didn't bother anyone except Lua programmers, I'm pretty sure.

    • tetris11 3 days ago

      > It didn't bother anyone except Lua programmers, I'm pretty sure.

      What's this reference to? Afaik, Lua uses `os.time` and `os.date` to manage time queries, which is then reliant on the OS and not Lua itself

      • jrockway 3 days ago

        Array indexes start at 1 by default, instead of 0.

  • arp242 3 days ago

    Things like "17th century", "1600s", or "1990s" are rarely exact dates, and almost always fuzzy. It really doesn't matter what the exact start and end day is. If you need exact dates then use exact dates.

    A calendar change like this is a non-starter. A lot of disruption for no real purpose other than pleasing some pedants.

    • jhbadger 3 days ago

      Exactly. Historians often talk about things like "the long 18th century" running from 1688 (Britain's "Glorious Revolution") to 1815 (the defeat of Napoleon) because it makes sense culturally to have periods that don't exactly fit 100-year chunks.

      https://en.wikipedia.org/wiki/Long_eighteenth_century

      • chefandy 3 days ago

        I thought the article was going to argue against chunking ideas into centuries because it's an arbitrary, artificial construct superimposed on fluid human culture. I could get behind that, generally, while acknowledging that many academic pursuits need arbitrary bins other people understand for context. I did not expect to see arguments for stamping out the ambiguities in labelling these arbitrary time chunks. Nerdy pub trivia aside, I don't see the utility of instantly recalling the absolute timeline of the American revolution in relation to the Enlightenment. The 'why's (the relationships among the ideas) hold the answers. The 'when's just help with context. To my eye, the century-count labels suit their purpose for colloquial usage and the precise years work fine for more specific things. Not everything has to be good at everything to be useful enough for something.

  • card_zero 3 days ago

    This reminds me that centuries such as "the third century BC" are even harder to translate into date ranges. That one's 201 BC to 300 BC, inclusive, backward. Or you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750. [Edit: no it doesn't.]

    In fact archeologists have adapted to writing "CE" and "BCE" these days, but despite that flexibility I've never seen somebody write a date range like "the 1200s BCE". But they should.

    • arp242 3 days ago

      Some people have proposed resetting year 1 to 10,000 years earlier. The current year would be 12024. This way you can have pretty much all of recorded human history in positive dates, while still remaining mostly compatible with the current system. It would certainly be convenient, but I don't expect significant uptake any time soon.

      For earlier dates "n years ago" is usually easier, e.g. "The first humans migrated to Australia approximately 50,000 years ago".

      • abhinavk 2 days ago

        Human Era. There is a Wendover Productions video about this. They also sell a calendar.

    • localhost8000 3 days ago

      > you might see "the last quarter of the second millennium BC", which means minus 2000 to about minus 1750.

      From comparing some online answers (see links), I'd conclude that even though the numbers are ordered backward, "first"/"last"/"early"/"late" would more commonly be understood to reference the years' relative position in a timeline. That is, "minus 2000 to about minus 1750" would be the first quarter of the second millennium BC.

      https://en.wikipedia.org/wiki/1st_century_BC (the "last century BC") https://www.reddit.com/r/AskHistorians/comments/1akt4zm/this... https://www.quora.com/What-is-the-first-half-of-the-1st-cent... https://www.quora.com/What-is-meant-by-the-2nd-half-of-the-5... etc

      • card_zero 3 days ago

        Oh you're right, I tripped up. "The last quarter of the second millennium BC" means about minus 1250 to minus 1001.

        I often get excited by some discovery sounding a lot older than it actually is, for reasons like this.

  • pictureofabear 3 days ago

    We're too deep into this now. Imagine how much code would have to be rewritten.

    • hypertele-Xii 3 days ago

      Imagine how much code is being rewritten all the time for a variety of reasons. Code is live and must be maintained. Adding one more reason isn't much of a stretch.

wavemode 3 days ago

I do tend to say "the XX00s", since it's almost always significantly clearer than "the (XX+1)th century".

> There’s no good way to refer to 2000-2009, sorry.

This isn't really an argument against the new convention, since even in the old convention there was no convenient way of doing so.

People mostly just say "the early 2000s" or explicitly reference a range of years. Very occasionally you'll hear "the aughts".

  • greenbit 2 days ago

    How about the 20-ohs?

    Think of how individual years are named. Back in, for example, 2004, "two thousand and four" was probably the most prevalent style. But "two thousand and ..." is kind of a mouthful, even if you omit the 'and' part.

    Over time, people will find a shorter way. When 2050 arrives, how many people are going to call it "two thousand and fifty"? I'd almost bet money you'll hear it said "twenty fifty". Things already seem to be headed this way.

    The "twenty ___" style leads to the first ten years being 20-oh-this and 20-oh-that, so there you have it, the 20-ohs.

    (Yes, pretty much the same thing as 20-aughts, gotta admit)

  • conception 3 days ago

    The 2000-2009’s are the aughts!

    • kmoser 3 days ago

      I think you mean "twenty-aughts" (to differentiate them from the nineteen-aughts, 1900-1909).

      • jhbadger 3 days ago

        I wonder at what point we can just assume decades belong to the current century. Will "the twenties" in the US always primarily mean Prohibition, flappers, and Al Capone or will it ever mean this decade?

        • fragmede 3 days ago

          I say give it 11 years or so for 2020s kids to start coming of age, and "twenties babies" will refer to babies born in the 2020s and not centenarians.

    • buzzy_hacker 3 days ago

      The noughties!

      • hansvm 3 days ago

        You ought naught to propose such noughts!

  • savanaly 3 days ago

    You can always just say "the 2000s" for 2000-2010. If the context is such that you might possibly be talking about the far future then I guess "the 2000s" is no longer suitable, but how often does that happen in everyday conversation?

BurningFrog 3 days ago

Immigrating from a country that uses "1700s", it probably took a decade before I had internalized to subtract 1 to get the real number.

I will resent it till I die.

  • Too 3 days ago

    Here we say something like the "nineteen-hundred-era" for the 1900s, "nineteen-hundred-ten-era" for the 1910s, "nineteen-hundred-twenty-era", etc. In writing: 1900-era, 1910-era, 1920-era. The most recent decades are referred to with only the "70-era" for the 70s. The word for age/epoch/era in our language is a lot more casual in this setting.

    The 20xx vs 200x does indeed leave some room for ambiguity in writing, verbally most people say 20-hundred-era vs 20-null-null-era.

  • bowsamic 3 days ago

    I find it weird when people take a long time for these little things. My wife still struggles with the German numbers (85 = fünfundachtzig) and the half thing with time (8:30 = halb neun) even though I managed to switch over to those very quickly. I think it depends on the person how hard it is

    • BurningFrog 3 days ago

      I think this one was hard for me because centuries don't come up that often.

      If it was something I saw/heard every day, I would have adapted much faster.

  • dinkumthinkum 2 days ago

    Resent is a little strong isn’t it? I feel like today we have such a victim culture that we are oppressed by the most trivial of matters.

networked 3 days ago

> This leaves ambiguous how to refer to decades like 1800-1809.

There is the apostrophe convention for decades. You can refer to the decade of 1800–1809 as "the '00s" when the century is clear from the context. (The Chicago Manual of Style allows it: https://english.stackexchange.com/a/299512.) If you wanted to upset people, you could try adding the century back: "the 18'00s". :-)

There is also the convention of replacing parts of a date with "X" characters or an em dash ("—") or an ellipsis ("...") in fiction, like "in the year 180X". It is less neat, but it is unambiguous about the range when it's one "X" per digit. (https://tvtropes.org/pmwiki/pmwiki.php/Main/YearX has an interesting collection of examples. A few give you the century, decade, and year and omit the millennium.)

Edit: It turns out the Library of Congress has adopted a date format based on ISO 8601 with "X" characters for unspecified digits: https://www.loc.gov/standards/datetime/.
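
Resolving such a pattern to a year range is mechanical: substitute the smallest and largest digit for each "X". A quick sketch (my own, not the Library of Congress's implementation):

    def year_range(pattern):
        # "180X" -> (1800, 1809); "18XX" -> (1800, 1899)
        return (int(pattern.replace("X", "0")),
                int(pattern.replace("X", "9")))

    assert year_range("180X") == (1800, 1809)
    assert year_range("18XX") == (1800, 1899)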

  • dkdbejwi383 3 days ago

    How do you speak it though? The “oh ohs”? “Noughts”? “Zeroes”?

pentagrama 3 days ago

I got into a fight recently with a philosophy teacher about that. I changed the dates, like the OP, to be clearer in my writing; she took it very seriously, and it turned into a big fight about clarity vs. tradition, really superficial and mean on both sides. Now I wish to have been* more articulate and to have had a good debate. I wrote it my way on the final exam and passed; she had to deal with it, I guess.

* Sorry, I don't know how to write that in past, like haber sido in Spanish, my main language.

treve 3 days ago

This was confusing to me as a kid, especially as we entered the 21st. I also still remember learning about the Dutch golden age in elementary school, but can't remember if it was the 1600s or 16th century.

I'm running into a similar issue recently. Turns out that many people saying they are '7 months pregnant' actually mean they are in the 7th month, which starts after 26 weeks (6 months!)

Doctor_Fegg 3 days ago

> Did the American revolution happen before, during, or after the Enlightenment?

I’ve no idea. When did the American revolution happen?

Not everyone’s cultural frame of reference is the same as yours. I can tell you when the Synod of Whitby happened, though.

  • mixmastamyk 3 days ago

    The American and French Revolutions are a pretty big deal on the road to modern democracy, as well as being tied to 1700s Enlightenment ideals. Everyone educated should know this.

    • Tainnor 3 days ago

      Of course, they are important, but so are many other things - and speaking e.g. from a European POV, a lot of other events are simply much more salient and commonplace - and the same is probably even more true for other continents (would a random reasonably educated American or European person know when the Meiji Restoration happened, or when Latin America became independent?). You can't expect everyone to have memorised all the important dates.

      • mixmastamyk 3 days ago

        America was a backwater at the time and therefore the best place to experiment with European enlightenment ideals. Which it did, and was a direct factor in the French Revolution. I also learned about numerous revolutions in Latin America from Mexico to Bolivar to San Martin over the early 1800s.

        The events that directly affect the modern world should be covered in school. I’d say revolutions that created large modern states would be among them.

        • Tainnor 3 days ago

          Of course, we learn about the American Revolution in schools, but people aren't going to remember every date they were taught in schools. The founding of Rome or the Punic Wars are also hugely important for today's world, but not everyone can place them.

          The reason most US Americans probably can place the American Revolution is because I assume it's so often commemorated there. In Germany, people would be much more likely to remember the years 1933, 1949 and 1989, because of how often they're referenced.

        • kergonath 3 days ago

          > I’d say revolutions that created large modern states would be among them.

          The Russian revolution as well. And the Chinese one. It's quite difficult to make sense of the late 20th century (yes, I know) without them. Or the early 21st.

          • mixmastamyk 3 days ago

            Agreed, and it doesn't mean memorizing a date as folks here keep mentioning. The nearest decade is fine in most cases.

            • kergonath 2 days ago

              Indeed. The order of things (what influenced what, and what directly caused what) is much more important than the exact date.

            • Tainnor 2 days ago

              do you remember the decade of the Treaty of Westfalia? Because if you live in central Europe, that's at least as important as the American Revolution.

              • mixmastamyk 2 days ago

                I slightly misspoke above. The important part in my opinion is the milestone of democracy. That it happened nearby wars is somewhat incidental and common.

                What I mostly remember about the HRE is that it wasn’t Holy, nor Roman, nor an Empire. :D. My first guess of mid 1600s is not too far off.

                (Westphalian sovereignty is interesting… reading about it now.)

                Those governments have been replaced multiple times over the centuries however, so achievements at the time were not stable.

                • Tainnor a day ago

                  > Those governments have been replaced multiple times over the centuries however, so achievements at the time were not stable.

                  The Peace of Westfalia signalled the end of major religious strife within Christianity and laid the foundation for today's modern conception of a nation-state. In my opinion, this is at least as important as the American Revolution.

    • IshKebab 3 days ago

      A pretty big deal in America. I don't think knowledge of the exact date of the American Revolution is a requirement for education outside America. At least no more than "17something...ish".

      • IIAOPSW 3 days ago

        "17something...ish" is enough to answer (or at least make a high confidence guess at) the original question (was the American Revolution contemporary with the enlightenment?)

      • Dalewyn 3 days ago

        For that matter, a lot of historical dates we consider important are only important to us because A) we're westerners and B) we got them drilled into us by textbooks and classes.

        The remaining majority of the world (the west is a minority) sincerely couldn't care less about the American or French or Industrial Revolutions or Columbus (re)discovering America or the Hundred Years War or the Black Death or the Fall of Rome or whatever else.

        Kind of like how we as westerners generally couldn't care less about Asian, African, Middle Eastern, Indian, or Polynesian histories.

        The culture we grow up in and become indoctrinated by determines what is important and what is not.

        And just so we're clear, this bit of ignorance is perfectly fine: Life is short, ain't nobody got time for shit that happened to people you don't even know who lived somewhere you will never see.

        • mixmastamyk 3 days ago

          Appeal to ignorance and anti-intellectualism, congrats.

          • Dalewyn 3 days ago

            Let me put it this way: Can you really blame someone for not knowing a historical fact that is completely irrelevant to their life, especially when they probably have more pressing concerns to learn and care about?

            We only have so many hours in a day and so many days in a lifetime, while knowledge is practically infinite.

            • mixmastamyk 3 days ago

              You could say that about anything.

              I limited my initial statement to educated folks, and presumably those who would like to be one.

              /history/democracy/milestones -> relatively important.

              • IshKebab 2 days ago

                Absolute rubbish. There are plenty of facts that it would be quite surprising for an educated person not to know. The date of the American Revolution is not one of those facts, except for Americans.

                Just like it would be unusual for an educated person not to know when the Battle of Hastings was... unless they aren't British.

                And that's about as close a country to America as you can get. Do you think the educated in Singapore or Kenya learn about the American Revolution? Hell, even in the UK we did not spend a single history lesson on it. I'm not exaggerating.

                • mixmastamyk 2 days ago

                  It’s not entirely surprising your country would prefer to forget that war. Disappointing perhaps but not surprising.

                  In my opinion, a battle is generally not important compared to a milestone of democracy. I learned about the Magna Carta in 1215 and saw a copy once and appreciate that knowledge.

                  But sure, keep arguing for ignorance and dismissing education if you’d like—we’ll enjoy the show here in posterity.

                  • IshKebab a day ago

                    > It’s not entirely surprising your country would prefer to forget that war. Disappointing perhaps but not surprising.

                    Maybe not, but that's hardly the only thing from history that we didn't learn about. I think you're massively underestimating how much history there is. Most of the world has millennia of history. It's not like America where you can cover the entire history of the country.

                    Also history is an optional lesson past age ~14. You can choose to do Geography instead. But I'm not going to call you uneducated for not knowing what a medial moraine is.

                    • Tainnor a day ago

                      > Most of the world has millennia of history.

                      And I also feel that it's myopic to think that a historical event is less significant because it happened longer ago and its effects are more diluted. But there are plenty of turning points in history that explain why today's world looks the way it does, the American (or French) revolutions are but some of them.

              • Dalewyn 3 days ago

                As very broad subjects? Yes. But any particular factoid about them is practically irrelevant for most people who don't actually have anything to do with that factoid.

                Hell, I would even go as far as to say the American Revolution is irrelevant even for most Americans because it has nothing of practical value. We (Americans) all know about it to varying degrees, but again that is due to growing up and being indoctrinated in it.

                • mixmastamyk 2 days ago

                  "Nothing is important to know besides my lunch time..." yawn

                  • im3w1l 2 days ago

                    This is such a boring dismissal of a very interesting subject - what information is important to whom and why.

                    A pretty important reason for learning about the history of democracy is to learn how and why to preserve it.

                    • mixmastamyk 2 days ago

                      This is the most boring knuckledragging subthread I’ve ever read here. Arguing we shouldn’t bother knowing anything but what’s underneath our nose, without qualifications. Not even a theoretically useful philosophy, such as solipsism is being espoused here.

                      All the while masquerading as intelligent conversation. It’s only interesting until you follow it to its conclusion. Anti-intellectualism in a nutshell, and you should be embarrassed to be seen in the vicinity. :D

                      • Dalewyn 2 days ago

                        Is it nice to learn about the American or French Revolutions? Absolutely. But time is a limited resource and must be rationed according to each individual's needs.

                        Most people are far too busy living lives to have time to spare spelunking into tomes; they usually don't even understand what democracy actually is either, but they get by fine in life just the same.

                        The reality is no man can learn all there is to know, it's physically impossible and lines in the sand must be drawn. Given that, which side of the line a certain historical factoid lies will vary; some will consider the American Revolution important and some will not, and some will be indoctrinated by their society, all are fine.

                        Also, I'm going to cite the HN Guidelines here for your reference going forwards:

                        >Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.

                        >Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.

          • Tainnor 2 days ago

            Your problem is that, instead of encouraging people to learn, you're just dismissing people who don't know one very specific historical fact as "uneducated". People in Europe generally know that America is a democracy and many probably also know that it was the first modern one, but to an average European it doesn't matter that much that it happened in the 1770s as opposed to, say, the 1650s.

            • mixmastamyk 2 days ago

              If you didn’t know it, why not learn it now? Here’s your opportunity, one of today’s 10k.

              Arguing you shouldn’t need to learn it isn’t impressive in any shape or form. It’s middle-brow level (non)curiosity, and less than welcome here.

              • Tainnor a day ago

                Never said one shouldn't learn new things. Just that it's unreasonable to expect that "every educated person knows X" - because time is finite.

          • bowsamic 3 days ago

            Yes, those things cannot be blamed

    • bowsamic 3 days ago

      No one that I know of in Europe gets taught this

      • scbrg 3 days ago

        Swede here. Both the American and the French revolution were taught when I went to school in the ninth and tenth decades of the twentieth century. As GP points out, they're both fairly significant events that had side effects relevant even to us up here.

        I would be extremely surprised if Sweden was unique among European nations in this regard.

      • dboreham 2 days ago

        I grew up in Scotland. We studied American History in school. In fact we didn't study English History, except as it pertained to Scottish History, so to this day I'm hazy on Magna Carta, turbulent priests and so on.

        • bowsamic 2 days ago

          I’m glad to be proven wrong. I’m English btw. We learnt nothing about America but we learnt everything about King Henry VIII

      • kzrdude 2 days ago

        Why not learn it anyway. Broader 'bildung' is a lifelong pursuit.

        • bowsamic 2 days ago

          I’m from working class England and that’s absolutely not the culture for us. Education is considered superfluous unless it makes you money

          • kzrdude 2 days ago

            Right, and I'm from a (partly) academic family. I guess it's a culture/class thing too.

  • kelnos 3 days ago

    It's tiresome when people seem to think it's necessary to be annoyed that others make reference to their own cultural frame of reference in their writing.

    Even more tiresome when they feel the need to comment about it.

    Most people here, even non-Americans, likely have at least a rough idea of when the American Revolution was. And those who don't will either just gloss over it and think no more of it, or find the answer on the internet in a shorter amount of time than it took me to type this sentence. And then there are people like you. Look at the completely useless subthread you've spawned! Look at the time I've bothered to waste typing this out! Sigh.

    • jcul 3 days ago

      Also the author does give the year in the next paragraph. So no googling required.

      I also didn't know what the date of the American revolution was, but I understood it was just an example.

      > if you’re like me, you’ll find the question much easier to answer given the second version of the sentence, because you remember the American revolution as starting in 1776, not in the 76th year of the 18th century.

    • Doctor_Fegg 2 days ago

      Yes, that's entirely the point.

      The article is "Here's a thing I don't understand! Let's say how silly it is by comparing it to a thing I do understand."

      I mean, thanks for that. I could write an article about "Why do people insist on quoting the American revolution as a reference date when they could just say 'late 18th century'". Which would make the same point and get us precisely as far along as this article did.

      > Look at the time I've bothered to waste typing this out!

      I feel the same way, dude. I feel exactly the same.

layer8 3 days ago

As a kid I came across a book titled “Scientists of the 20th century”, and I was intrigued how the authors knew about future scientists.

cvoss 2 days ago

The same off-by-one annoyance under discussion bites the author in this very article and he didn't even notice: He calls 1776 the 76th year of the 18th century. But it's not! It's the 77th year of that century!

  • bonzini 2 days ago

    The 18th century started in 1701 and ended in 1800.

    • OJFord 2 days ago

      I could give you 'ended in 1800' (I'd prefer 'at') but it very much started in 1700, not 1701.

      • bonzini 2 days ago

        I didn't make the convention. The 18th century started on January 1, 1701 and ended on December 31, 1800.

        That's because the purported year of the birth of Christ was the "first year of the lord", or AD 1, and that's when the first century started. In turn that's because at the time Latin had a word for nothing but not a word for zero, so you couldn't count years starting at zero.

        That also means that those old enough to have partied on December 31, 1999 were technically partying for the beginning of the last year of the second millennium.

        • OJFord 2 days ago

          Only American Christians say 'year of our lord' after the date. If that's what it means to you and that's how you want to think of it, fine, but be aware nobody else is doing that, even though they're working in AD/BC.

          • bonzini 2 days ago

            "Year of the lord" (not "our" lord, please double check what I wrote) is the literal translation of Anno Domini and it's the reason why years are counted from one instead of zero. I quoted the expression, wrote "lord" in lowercase and added "purported" to make it clear that the religious reference was only for historical reasons, I don't know what else I could have done. If I wanted to imbue some religious meaning I probably would have said "Jesus" or "the Christ".

            I am also not American and not a native English speaker. In fact Latin languages say in expanded form "après Jésus-Christ", "dopo Cristo", "despues de Cristo" (abbreviation is only used in writing), so even speakers who are not religious very much know the reference even if they couldn't care less. We're stuck with it.

            I agree that it's more of a "who wants to be a millionaire" quirk than something that actually matters, but this case of correcting someone was one of the few cases where it matters.

            • OJFord 2 days ago

              That is what they say though.

              I am Christian in a very literal sense, would use AD dating etc. and have no problem with it whatsoever, I just don't feel the need to suffix (or sometimes prefix) it with 'year of our lord' the way some Americans do; I find it quite jarring.

              It's like feeling the need to say 'day of rest' or 'sabbath' every time you say a particular day of the week, or some other (possibly areligious) descriptor of something whenever it's mentioned.

              • satvikpendem a day ago

                You're on a tangent, who cares what it's called but the fact is that years started at 1, not 0, therefore centuries also start on the first year, not the zeroth year.

              • bonzini 2 days ago

                I am replying specifically to "if that's what it means to you", which felt a bit ad hominem since I didn't even use the words "year of our lord".

                • OJFord a day ago

                  I was referring to 1701 to 1800 inclusive, which is not the standard way people refer to centuries and is not what other people will mean or will think you mean, in general.

                  • bonzini a day ago

                    Ok, then you're just wrong, just like you were wrong about my nationality and faith, and it's pointless to argue with you.

  • satvikpendem 2 days ago

    You didn't read far enough because he specifically notes this. The 18th century started on January 1, 1701, therefore 1776 is indeed the 76th year of that century, not the 77th, as the year 1700 is part of the 17th century.
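
    Spelled out as code (a sketch of that convention, nothing more):

        def year_within_century(year):
            century = (year - 1) // 100 + 1    # 1776 -> 18
            return year - 100 * (century - 1)  # 1776 -> 76

        assert year_within_century(1776) == 76
        assert year_within_century(1701) == 1    # first year of the 18th
        assert year_within_century(1700) == 100  # last year of the 17th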

osigurdson 3 days ago

If I lived much longer than 100 years I might care more about the precision of such language. However, as it stands, I know what people mean when they say "remember the early 2000s". I know that doesn't mean the 2250s for example - a reasonable characterization for someone living in 3102 perhaps.

James_K 3 days ago

One point: the singular they has been in use for centuries, whereas this essay suggests it's a recent invention.

  • networked 3 days ago

    The essay doesn't really say anything about when the singular "they" was invented. What it says is that it used to be low-status and unsophisticated language.

    > In the 1970s, fancy people would have sniffed at using “they” rather than “he” for a single person of unknown sex like this. But today, fancy people would sniff at not doing that. How did that happen?

    > I think “they” climbed the prestige ladder—people slowly adopted it in gradually more formal and higher-status situations until it was everywhere.

    • dahart 3 days ago

      The essay’s narrative is overly simplified and misleading about details. Singular they has been common for centuries. The idea that it was lower status is a more recent invention. The author might be referring more to use of they as a personal pronoun. Anyway, whatevs, language changes, that part of author’s message is good.

    • Tao3300 2 days ago

      Yeah, I'm old enough that I remember being taught not to use singular they in elementary school, but I'm young enough that he/she was given as the preferred alternative.

MobiusHorizons 2 days ago

> there's no good way to write 2000 - 2009

I have heard people use "the aughts" to refer to this time range [1]. I guess if I was trying to be specific about which century, one could say "the two thousand aughts" or "the eighteen hundred aughts". But I think in that context I'd be more likely to say "in the first decade of the 1800s".

1: https://en.wikipedia.org/wiki/Aughts

  • navane 2 days ago

I call them the two thousand zeroes (2000s). Or, when people ask me what year my car is from, I say "two thousand zero", emphasizing the smallest digit.

    • PopAlongKid 2 days ago

      Why not twenty hundred for 2000, just like 1900 is nineteen hundred?

      Similarly, I never understood why people still say things like "two thousand ten" instead of twenty-ten for year 2010. No one ever went around saying "one thousand nine hundred ten" for 1910, did they?

      • navane 2 days ago

        I don't think they said ten hundred for the year 1000 either. I do use the "twenty ..." style for individual years, though.

bee_rider 3 days ago

I’ve just taken to writing things like: 201X or 20XX. This is non-standard but I don’t care anymore, referencing events from 20 years ago is just too annoying otherwise.

In spoken conversation, I dunno, it doesn’t seem to come up all that often. And you can always just say “20 years ago” because conversations don’t stick around like writing, so the dates can be relative.

adolph 2 days ago

The issue, of course, is that “counted centuries” are off by one from how we normally interact with dates—the 13th century starts in AD 1201.

The author misses the mark. The author would like to skip some significant interpretive steps, chiefly how our dynamically typed language uses number words or characters in different contexts. Suggested reading: the differences between, and the use cases of, nominal, ordinal, cardinal, interval, and ratio scales.

https://www.statisticshowto.com/probability-and-statistics/s...

kazinator 2 days ago

> No one gets confused about what “the 1700s” means.

Well, there are a few longtermists who do. Do you mean the 01700s? Or some other 1700s that have been cut to four digits for conversational convenience?

:)

When someone's salary depends on being confused, there are ways to dissuade them. But when it's their hobby, forget it.

frithsun 3 days ago

I am emotionally invested in having been born in "the twentieth century" instead of "the 1900s."

ash 3 days ago

The article does not really resolve the ambiguity with 2000s which usually means 2000-2009, not 2000-2099.

That said, in the Finnish language people never count centuries. It's always "2000-luku" and "1900-luku", not 21st and 20th.

  • tetris11 3 days ago

    > This leaves ambiguous how to refer to decades like 1800-1809. For these you should ~~specify the wildcard digits as “the 180*s” manually~~ write out the range. Please do not write “the 181st decade”.

    I think it wouldn't be wrong to say "the 1800s decade" or "the primus 2000s" or "the alpha 1900s"

    • ash 3 days ago

      I don't find it persuasive. First of all, people are not going to start writing and saying wildcards. Second, there is an established convention to say "2000s" and mean 2000-2009.

      • tetris11 2 days ago

        no, that part was crossed out as a joke (hence my attempt to add '~' before and after the crossed out section)

  • danschuller 2 days ago

    It means that now but that meaning will fade away as the decades advance.

acheron 3 days ago

Sure, then we can switch to French Revolutionary metric time.

coldtea 3 days ago

>The issue, of course, is that “counted centuries” are off by one from how we normally interact with dates—the 13th century starts in AD 1201. There’s a simple solution. Avoid saying “the 18th century”, and say “the 1700s” instead. Besides being easier to understand, it’s also slightly shorter.

The kind of person who cares and reads about "the Xth century" can also trivially understand the date range involved.

The kind of person who can't tell that the 18th century is the 1700s and the 21st century is the 2000s will get little good out of reading history anyway, unless they get the basics of counting, calendars, and so on down.

  • UberFly 3 days ago

    About 20% would understand you. About 50% would tell you the 18th century was the 1800s, and 30% would just stare at you confused.

  • yard2010 3 days ago

    My high school history teacher taught me a trick: just subtract 1 from the century to get the correct year. I never remember if it's subtract or add, though, so I try both with the 21st century and see if I'm right first.

  • cobbaut 2 days ago

    I was expecting this as the top comment. We learned about centuries in school when I was 9 or 10 or something and nobody found this a hard issue to tackle.

  • Perseids 3 days ago

    I'm sorry, but that is just elitist bullshit. First, even if we accept your implicit premise, that it is a training hurdle only, there is enormous value in accessible science, literature and education. In our connected society and in a democracy everyone benefits from everybody else understanding more of our world. In software engineering we have a common understanding that accidental complexity reduces our ability to grasp systems. It's no different here.

    Second, your implicit premise is likely wrong. Different people have different talents and different challenges. Concrete example: In German we say eight-and-fifty for 58. Thus 32798 becomes two-and-thirty-thousand-seven-hundred-eight-and-ninety, where you constantly switch between higher and lower valued digits. There are many people, me included, that not infrequently produce "Zahlendreher" (transposed digits) because of that, when writing those numbers down from hearing alone, e.g. 32789. But then, there are also people for whom this is so much of a non-issue that when they dictate telephone numbers they read them in groups of two: 0172 346578 becomes zero-one-seven-two-four-and-thirty-five-and-sixty-eight-and-seventy. For me this is hell, because when I listen to these numbers I need to constantly switch them around in my head with active attention. Yet others don't even think about it and value the useful grouping it does. My current thesis is that it comes down to a difference between auditory and visual perception. When they hear four-and-thirty they see 34 in their head, whereas I parse the auditory information purely auditorily.

    What I want you to take from my example, is that these issue might not be training problems alone. I have learned the German number spelling from birth and have worked in number intensive field and yet I continue to have these challenges. While I have not been deeply into history, I suspect that my troubles with Xth century versus x-hundreds might persist, or persist for a long time, even if I get more involved in the field.

    • coldtea 3 days ago

      >I'm sorry, but that is just elitist bullshit.

      That's fine, the thought-stopping accusation of "elitism" doesn't bother me. It's a preoccupation of people who prefer equality based on dumbing things down to the lowest common denominator, lest (god forbid) someone has to make an effort.

      I don't think lowly and condescendingly of people like that, I think they're capable of learning and making the effort - they're just excused and encouraged not to.

      People who actually have learning difficulties (because of medical conditions or other issues), or people from different cultures accustomed to other systems, are obviously not the ones I'm talking about - and they don't excuse the ones without such difficulties, in countries that have used this convention for 1000+ years.

      >Second, your implicit premise is likely wrong. Different people have different talents and different challenges

      The huge majority that confuses this doesn't do it because they have a particular challenge or because their talents lie elsewhere. They do it because they never bothered, same way they don't know other basic knowledge, from naming the primary colors to pointing to a major country on the map. They also usually squander their talents in other areas as well.

      Besides, if understanding that the 18th century is the 1700s is "a challenge", then the rest of history study would be even more challenging. This is like asking to simplify basic math for people who can't be bothered to learn long division, thinking this will somehow allow them to do calculus.

      • dahart 2 days ago

        > This is like asking to simplify basic math for people who can’t be bothered to learn long division, thinking this will somehow allow them to do calculus.

        Oh dang, you kinda sorta had me until here. This sways me towards @Perseid’s point. :P Math education is full of unnecessary mental friction, and it pushes lots of people away. We know that finding better, simpler ways to explain it does, statistically, allow more people to do calculus. Long division is a good example, because it’s one of the more common places kids separate & diverge between the ones who get it and the ones who don’t, and there are simpler alternatives to explaining long division than the curriculum you and I grew up with, alternatives that keep more people on the path of math literacy.

        We can see similar outcomes all over, in civil and industrial design, and in software and games, from cars to road signs and building signs to user interface design - that making things easier to understand even by small amounts affects outcomes for large numbers of people, sometimes meaningfully affecting safety.

        The numbering of centuries is admittedly a simple thing, but maybe it actually is unintentionally elitist, even if you don’t think condescendingly, to suggest people shouldn’t complain about a relatively small mental friction when having to convert between century and year? Yes most educated people can handle it without problems, but that doesn’t tell us enough about how many more people would enjoy it more or become educated if we smoothed out how we talk about it and make it slightly easier to talk about history. This particular example might not change many lives, but it adds up if we collectively improve the design of writing and education traditions, right? Especially if we start to consider the ~20% of neurodivergent people, and ~50% of less than average people.

        > They do it because they never bothered

        Why should people have to bother, if it’s not necessary? Your argument that some people are lazy might be deflecting. Is there a stronger argument to support the need to continue using this convention? Being able to read old history might be the strongest reason, but why should we waste energy and be okay excluding people, even if they are just lazy, by perpetuating a convention that has a better alternative?

      • card_zero 2 days ago

        There are a lot of shibboleths and pointless conventions in the world. Flutists are called "flautists" because classical musicians aspire to be Italian, and if you don't pretend to be Italian too you'll embarrass yourself. Minute hands were originally long because they pointed to an outer dial of minutes, while hour hands were distinguished by being decorated, but now being slightly longer is just a stylization that means minute hand (even though "minute" means "small") and we have two pointers using the same dial for different enumerations, one without the relevant numbers and distinguished by fractional differences in its width and length. British English spellings are substantially French, and this is perpetuated as a matter of national pride.

        Like the word shibboleth, these examples are all kinds of language. Even the clock hands are a sort of visual language. Nth century is another language element. The conventions make outsiders stumble, but for insiders they're familiar and shedding them would be disturbing. Over time they become detached from their origins, and more subtle and arbitrary.

        In programming we have "best practice", which takes good intentions and turns them into more arbitrary conventions. These decisions are unworked again later by people saying "no, that's dumb, I'm not going to do that", even if it is "how we do it" and even if learning it is a sign of cleverness. We have to be smart to learn to do dumb pointless things like all the other smart people.

        Is this good? Keeps us on our toes, maybe? Or keeps us aligned with bodies of knowledge? I think it's definitely good that we have the force of reformist skeptics to erode these pointless edifices, otherwise we'd be buried in them. But new ones are clearly being built up, naturally, all the time. Is that force also good? Alright, yes, it probably is. Put together, this is a knowledge-forming process with hypothesises (I don't like using Latin plurals, personally) and criticisms, and it's never clear whether tradition or reform is on the dumb or overcomplicated side: it remains to be seen, as each case is debated (if we can be bothered).

ineedaj0b 2 days ago

"There’s no good way to refer to 2000-2009, sorry."

I believe this time is called 'the aughts', at least online. I say it in person but I might be the outlier.

  • shrimp_emoji 2 days ago

    Fuck that. It's "the 2000s".

    • linearrust 2 days ago

      That only works for now because we are so close to 2000. So there is little ambiguity whether "the 2000s" refer to the century or the decade. But in the future, "the 2000s" will refer to the century. Just like the 1900s refer to the 20th century rather than the decade (1900-1909).

      • wasmitnetzen 2 days ago

        Not my problem, that can be fixed by a future generation of pedants. It'll be good enough for 50 years or so.

    • baobabKoodaa 2 days ago

      I would have thought that extends beyond 2009

      • saalweachter 2 days ago

        We don't need to refer to the third millennium as a whole right now, being in the first part of it. And by the time people need to refer to the 2000s, they will no longer have a reason to reference the 2000s.

        You can also refer to, in speech, the twenty-hundreds, without ambiguity; I suppose, if you want it more compact in text, you could always write the 20-00s.

_dain_ 2 days ago

They should have names:

    - 1500s: The Columbian Century
    - 1600s: The Westphalian Century
    - 1700s: The Century of Enlightenment
    - 1800s: The Imperial Century
    - 1900s: The Century of Oil
    - 2000s: The Current Century (to be renamed in 2100)
You might think it's Eurocentric, and you'd be right. But every language gets to name them differently, according to local history.

arp242 3 days ago

I've always written it like "1900s", and always considered "20th century" to be confusing. Having to mentally do c-- or c++ is confusing and annoying.

I deal with the "2000s-problem" by using "00s" to refer to the decade, which everyone seems to understand. Sometimes I also use "21st century"; I agree with the author that it's okay in that case, because no one is confused by it. For historical 00s I'd probably use "first decade of the 1700s" or something along those lines. But I'm not a historian and this hasn't really come up.

samatman 3 days ago

Technically, decades and centuries start in a January with one or two zeros at the end, respectively. So the 1700s and the 18th century are exactly the same interval of time.

ISO 8601-2:

> Decade: A string consisting of three digits represents a decade, for example “the 1960s”. It is the ten-year time interval of those years where the three specified digits are the first three digits of the year.

> Century: Two digits may be used to indicate the century which is the hundred year time interval consisting of years beginning with those two digits.
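
In other words, both are just prefixes of the year written in digits. A sketch of the stated rules (assuming four-digit years; this is my paraphrase, not the standard's notation):

    def iso_decade(year):
        return str(year)[:3]   # 1965 -> "196", i.e. the 1960s

    def iso_century(year):
        return str(year)[:2]   # 1776 -> "17", i.e. 1700-1799

    assert iso_decade(1965) == "196"
    assert iso_century(1776) == "17"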

ggm 3 days ago

Pick your battles. Is this easier or harder to win on than de-gendering romance languages?

How about Americans stop with dropping the "and" in nineteen hundred AND twenty?

  • psychoslave 2 days ago

    I'm close to finishing a linguistic project where I provide a different paradigm perspective, in which gender is a subcategory of grammatical geste, and I give five additional flections to all French nouns that refer to living entities, which usually decline under only two genders at most.

    Maybe I should start thinking about my next battle. :)

  • UberFly 3 days ago

    Nineteen Twenty is just so much more to the point. Don't have time for those dang extra baggage words.

cabalamat 2 days ago

You need to count from 0.

1 BC should be renamed year 0. Then the years 0-99 are the 0th century, the years 1900-1999 are the 19th century, etc.

To avoid confusion between new style and old style centuries, create a new word, "centan", meaning "100 years" and use cardinal instead of ordinal numbers, for conciseness. Then the years 1900-1999 are the 19-centan.
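
The mapping is then plain integer division; a sketch using the hypothetical "centan":

    def centan(year):
        # Years 1900-1999 -> the 19-centan; years 0-99 -> the 0-centan.
        return year // 100

    assert centan(1999) == 19
    assert centan(1900) == 19
    assert centan(99) == 0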

  • im3w1l 2 days ago

    It's always fun to debate how to square circles; something has to give, but what? My proposed solution is to make the first "century" 99 years.

dash2 2 days ago

I use "the 1900s" to mean 1900-1910.

I understand the difficulty, but I don't think it is too terrible for us to get used to it, and we aren't gonna change the past 500 years of literature that already did this.

Also, it's ironic for a bunch of people who literally count arrays from zero to be complaining about this... :-P

  • fsckboy 2 days ago

    Counting arrays from zero was C's contribution to the public consciousness (inherited from BCPL, and perhaps from ASM). People here generally hate C, so it's triggering, and they are trying to forget[1]

    [1] this wasn't a footnote, it was forget array index 1.

cat_multiverse 2 days ago

Great article. In my Master's and PhD, despite being in a stodgy philological field, I always opted for this for clarity and conciseness. It can be hard for people to let go because they want to sound clever.

But oh, dear writer, slightly irksome that you learned copyediting but do not use en-dashes for your date ranges!

  • elric 2 days ago

    Saying stuff like "the seventeen hundreds" works in English, but it doesn't necessarily work in other languages. In Dutch that would be "de jaren zeventienhonderd" ("the years seventeen hundred"), which sounds like crap compared to "de achttiende eeuw" ("the eighteenth century").

    > because they want to sound clever

    Citation needed. I've never considered "nth century" to be a product of trying to sound clever.

t_mann 3 days ago

> There’s no good way to refer to 2000-2009

I like the German Nullerjahre (roughly, the nil years). Naught years or twenty-naughts works pretty well too imho.

  • Georgelemental 3 days ago

    In the USA we say “the aughts”

    • kelnos 3 days ago

      Speak for yourself. I refuse to use that abominable construct!

      • zuminator 3 days ago

        I'd agree to the extent that it sounds kind of twee and affected, but what would you use in its place?

        • greenbit 2 days ago

          The "ohs". Twenty-ohs. Rhymes with that oaty breakfast cereal.

the__alchemist 2 days ago

Confusing convention. I think most pre-teens realize this when learning history, then move on and accept it as a quirk. I would prefer we stop propagating it, as the author says. Don't accept confusing notation when a better alternative is also in common use!

renewiltord 3 days ago

I'm going to count centuries but just call it the 0th century until 100 CE. I anticipate no problems.

yard2010 3 days ago

There are only 3 tough problems in engineering: 1) naming stuff 2) off-by-one errors

nurtbo 3 days ago

The aughts or naughts (or aughties) are a pretty easy to understand way to refer to 2000-2009, though saying “the early aughts” is clearly more verbose than saying 2000-2003 (except that 2000-2003 looks more specific than is meant)

  • kelnos 3 days ago

    I think that last point, "there’s no good way to refer to 2000-2009, sorry", was a bit tongue in cheek, refusing to acknowledge "the aughts", since it is a terrible, terrible, stupid way to refer to anything.

croemer 3 days ago

I think this is incorrect. Don't centuries start with the 00 year? In that case the first year of a century is the 00, and the 76th year would be 1775, not 1776 as the author writes:

> starting in 1776, not in the 76th year of the 18th century.

  • mason55 3 days ago

    No, because there was no year zero. The first century started in the year 1 AD.
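
    A quick Python sketch of that 1-based arithmetic (the helper names are mine, not the author's):

        def century_of(year_ad: int) -> int:
            # No year zero: years 1-100 are the 1st century,
            # 1701-1800 the 18th.
            return (year_ad + 99) // 100

        def year_within(year_ad: int) -> int:
            # Position within the century, counting from 1.
            return year_ad - (century_of(year_ad) - 1) * 100

        assert century_of(1776) == 18
        assert year_within(1776) == 76  # 1776 really is the 76th year of the 18th century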

    • labster 3 days ago

      There was an astronomical year zero, it’s mainly historians who don’t want the year zero to happen.

      • croemer 3 days ago

        And there is a year zero in the ISO 8601:2004 system, the interchange standard for all calendar numbering systems (where year zero coincides with the Gregorian year 1 BC; see conversion table). [via Wikipedia]
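
        The conversion is just an offset; e.g. in Python (my own illustration):

            def astronomical_year(bc_year: int) -> int:
                # ISO 8601 / astronomical numbering: 1 BC -> 0, 2 BC -> -1, etc.
                return 1 - bc_year

            assert astronomical_year(1) == 0     # 1 BC is year 0
            assert astronomical_year(44) == -43  # 44 BC becomes year -43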

  • globular-toast 3 days ago

    I had no idea that the "21st century" started in 2001. Totally thought it started in 2000. Is that why so many things refer to 2001? Mind blown...

nvader 3 days ago

> There’s no good way to refer to 2000-2009, sorry.

"Noughty", Naughty!

dan-robertson 3 days ago

Terms like the ‘long 18th century’ feel like they make less sense when talking about the 1700s. Though that mightn’t outweigh the confusion from the ordinals.

hamasho 3 days ago

It's funny: for me it feels right to associate the 20s with the 2020s and the 40s with the 1940s, but somehow the 30s is very foreign, and I can't pin it to either the 1930s or the 2030s.

  • kelnos 3 days ago

    Funny, when you say "20s" I think of the 1920s (aka the "Roaring 20s" here in the US). I wonder if it's an age thing (I'm in my 40s), or perhaps a cultural or regional thing.

    The 30s as 1930s seems pretty solid to me: US Great Depression, start of WWII, not to mention many of the smaller conflicts that led up to it.

    • globular-toast 3 days ago

      I definitely think of the 1920s too, although I'm only a bit younger. I don't get why anyone would refer to now as the 20s because it's still happening and hasn't necessarily been shaped yet. That and it's close enough to still refer to individual years.

      What is odd is that I never feel the need to refer to the 00s. I've never said "naughties" or anything else. In the 90s, people already talked about the 80s, but for me it's all been pretty much the same since the 00s. I don't even know what I'd talk about from the 00s. The music? 2001 by Dr. Dre came out in 1999. It could have been made today. Fight Club was 1999. Could have been made today. The fashion? People have straight hair and wear t-shirts still today.

    • hamasho 2 days ago

      You may have a point. My background as a Japanese person in my 30s may affect how I perceive those periods. I have associations with the 40s because WWII is mentioned often. For the 80s, even though I wasn't born yet, many people talk about it as the best period for Japan, so I have a concrete image of that too. And the 90s and 00s are when I was a kid/teenager, so I have an emotional connection with that period. Other than these, I have weak connections and just think of the current or near-past periods.

    • greenbit 2 days ago

      When you find yourself putting a sticker that says 2030 on your license plate, chances are 30s won't mean 1930s very much longer.

transfire 3 days ago

We just need a new word to mean “0-indexed count of centuries”.

Not sure what a good word for this would be, but maybe just use what we already say — “hundreds”.

So, in the late 17-hundreds, …

karaterobot 3 days ago

I have to say that I don't think this is really a problem.

  • kelnos 3 days ago

    When compared to climate change, sure, not really. But it's definitely annoying sometimes.

ivanwood 2 days ago

Nobody* outside of these circles cares about these questions, let alone spends hours arguing about them.

* You know, the ones you weirdos call 'normies'

huma 3 days ago

Thankfully, most of us quit writing centuries in Roman numerals; it's about time we quit centuries as well :) Sadly, however, regnal numbers continue to persist.

bowsamic 3 days ago

No. I like the current system and I will continue to use it even if OP somehow manages to make everyone switch over, which they won’t manage to

selimnairb 2 days ago

We should also stop talking about decades for the purposes of periodization. For example, the “’80s” arguably ran from the election of Reagan in 1980 to the appointment of George W. Bush by the Supreme Court in 2000. Politically, economically, and culturally, the “’90s” were an elaboration on the “’80s” (e.g. the deregulation and market reforms of Clintonism being just a whitewashed expansion of what Reagan did).

  • navane 2 days ago

    The nineties clearly started in '89 with the fall of the Berlin Wall and the consequent "victory" in the Cold War, and ended with the dotcom crash in 2000; the cyclical nature of economic growth wasn't bested after all.

  • ineedaj0b 2 days ago

    The presidency of Bill Clinton was a much different time than that of Reagan in the US.

runarberg 3 days ago

> There’s no good way to refer to 2000-2009, sorry.

The author is wrong here. The correct way (at least in spoken West Coast American English) is the twenty-aughts. There is even a Wikipedia page dedicated to the term: https://en.wikipedia.org/wiki/Aughts If you want to be fancy you could spell it like the 20-aughts. I suppose there is no spelling it with only digits+s though, which may be what the author was looking for.

  • whycome 3 days ago

    I think you actually just agreed with them.

  • kelnos 3 days ago

    I'm not sure why, but every time I hear someone use that term, I cringe. The word just feels... off... to me for some reason. Like it's an abomination.

carrja99 2 days ago

We reset them every time a religious figure comes along anyway.

articlepan 2 days ago

Solution for 2000-2009: "the first decade of the 2000s".

I realize this is counting by decades/centuries again, but if we just do it for the first decade/century under a larger span it's easy to read.

  • ysofunny 2 days ago

    nonetheless, typing out, and even saying "the first decade of the 2000s" takes too long

    we need something shorter, like the double-ohs "00s"

dudeinjapan 3 days ago

> There’s no good way to refer to 2000-2009, sorry.

In terms of music this is true.

kingkawn 3 days ago

This is pretty easy to understand if you try at all.

Y_Y 2 days ago

> Please do not write “the 181st decade”.

Well now I have to

redhippo 3 days ago

I thought the exact same thing... when I was eight. Then I just learned the mental gymnastics of hearing, say, "20th century" and associating it with "1900", and got on with it. Really, it's a bit dated, but if people can function with miles and furlongs, they can handle centuries...

  • kelnos 3 days ago

    It's funny, I'm in my 40s and my brain still pauses longer than I'd expect is necessary to do that conversion. I think for 21st or 20th century I'm fine (since I've lived in both of them), but anything prior and it takes me a beat to figure out which date range it is.

Grom_PE 3 days ago

I agree. Another good reason to get rid of counting centuries is that in some languages (e.g. Russian) centuries are written in Roman numerals. It's annoying having to pause and think about the conversion.

James_K 3 days ago

Another ambiguity, though perhaps less important, is that "2000s" could refer to 2000-2999.

  • lucb1e 3 days ago

    Took me a while to understand what's ambiguous about that. For anyone else, what they're saying is that this often (typically, I guess) refers to 2000-2099 and not the other 900 years.

  • MarkLowenstein 3 days ago

    Could make the convention to say "the twenty hundreds" when referring to [2000, 2100), and "the two thousands" for [2000, 3000).

    • kelnos 3 days ago

      I actually really like that!

      But then how about [2000, 2010)? During the current time period, I expect people are more likely to refer to the decade rather than the century or millennium.

      • MarkLowenstein 36 minutes ago

        I don't have a good solution for that. Perhaps we're finding out why people have come up with weird terms like "naughts" or saying "aught-five".

  • kelnos 3 days ago

    Not "another"; the article addresses this, and admits to not having a good solution, aside from just writing out the actual dates as you did.

  • riffic 3 days ago

    Alternatively: the third millennium.

  • whycome 3 days ago

    life in the 20s is weird.

Isamu 3 days ago

Well yeah, most of the time if I want to be understood I will say “the 1700s” because it is straightforward to connect with familiar dates.

We still say “20th century” though because that’s idiomatic.

  • macintux 3 days ago

    The author's point is that idioms change, and this one should.

    • Isamu 2 days ago

      I think you misunderstand me. I mean ONLY the specific idiom "the 20th century", because we've spent more than a century using it and it is unambiguous; it's clear.

      Use “the 1700s” and the like for all other cases.

szundi 3 days ago

Resistance is futile

wwilim 3 days ago

1800-1899 is the eighteen-hundreds, 1800-1809 is the eighteen-noughties. Easy.

tempodox 3 days ago

Meh. I acknowledge that the author can split a hair with their bare hand while blindfolded. But to convince everyone else they would have to lift the rest of the world to their level of pedantry.

istrice 2 days ago

There is no such thing as a word for "1700s" in most European languages.

Also in English it sounds weird, as you have to pronounce it "seventeen-hundreds" whereas the correct pronunciation is "one-thousand-seven-hundred". So 1700s is unsuitable for formal writing or speaking and doesn't map naturally to most languages of Western civilization.

But yeah, I guess the author finds it hard to subtract 1 in his mind :) I could go off about the typical US-centric arrogance that I see on this site, but I think it's already pretty funny as it is.

  • tyg13 2 days ago

    Seventeen-hundreds doesn't sound stilted at all, to this native English speaker. I would use this even in a very formal context, and certainly no one would bat an eye. One-thousand-seven-hundreds is almost certainly incorrect in English -- I would actually find this to be a very clear marker of not speaking English very well. Are you a native speaker? I find your claims rather bold and quite frankly incorrect.

    Also, it was curious to find out (from elsewhere in this thread) that Finnish does not typically use centuries. Rather, it uses a construction that maps directly to "1700s" (1700-luku). I would be careful about accidentally applying your own cultural bias when accusing others of the same ;)

    • jonashus 2 days ago

      In Swedish it's also very rare to count centuries. We say 1700-talet. Also 2000-talet (pronounced either tvåtusen-talet/twothousand-talet or tjugohundra-talet/twentyhundred-talet) would mainly refer to 2000-2100. To refer to 2000-2009 we say 00-talet (nollnoll-talet/zerozero-talet).

hcfman 2 days ago

Because humanity won’t make it through another century ?

psychoslave 3 days ago

A bit disappointing, as I was expecting something far more disruptive, like an alternative calendar that makes the century a useless notion.

I wish we had a calendar with a departure point far less anthropocentric. Then, instead of all the genocides of the Roman empire, each look at a calendar would be an occasion to connect with the vastness of the cosmos and the vacuity of all human endeavors in comparison.

kleiba 3 days ago

Meh, too small of an issue to be bothered about it.

  • OmarShehata 3 days ago

    small issues are easy to solve!

    It isn't as sexy/interesting as what I thought the article was going to be about (a different way of talking about our history, maybe in eras instead of centuries or something).

    this strikes me as kind of like a small PR to our language that makes an incremental improvement to a clearly confusing thing. Should be easy to merge :)

    • kergonath 3 days ago

      > Should be easy to merge :)

      Famous last words.

      Seriously, though, we should have learnt by now that we cannot solve social issues with technology. Everybody is working on a different tree, and the social dynamics that govern the spread of idioms, and which fork you cherry-pick or merge from, are much more complex than what git is designed to handle.

    • AnimalMuppet 3 days ago

      No, not easy. You want to change a social convention? That is not easy.

      Even changing all the places it got encoded into software probably wouldn't be easy.

    • bowsamic 3 days ago

      > small issues are easy to solve!

      Citation heavily needed

      • OmarShehata 2 days ago

        > Citation heavily needed

        I'm not going to give you any proof of this, because I know you've seen it, but I think I just wasn't very clear on what I meant here.

        I'm talking about PRs that fix a typo in the docs, or tweak confusing wording. You know the ones I'm talking about: the ones that get merged immediately because they're clearly just better.

    • kleiba 2 days ago

      Maybe not easy to solve, but not of enough importance/impact that would justify spending an extended amount of time changing the current practice.

      My original post got downvoted, but I stand by it because, empirically, it seems clear: the current practice may not be perfect but has obviously stood the test of time. So let's move on; there are more important things to worry about.

      • OmarShehata 2 days ago

        "the current practice may not be perfect but has obviously stood the test of time"

        has it stood the test of time? or is just an arbitrary choice that no one has bothered to change?

        I'm sorry, it just seems like we're looking at spilled milk in the middle of the room and saying, "well, it's been here since I walked into the room, so obviously it's supposed to be here." No, we can just clean it up; no one's going to stop you.

        It's actually super, super easy to change this: you just start using the new term. You don't even need to get anyone else on board; everyone understands what the new term means (this isn't even like inventing a new word, it's just a rephrasing that's easier for the speaker and listener).

        The goal isn't to get everyone to switch overnight. That's not how conventions are created nor how they spread. You just start doing the thing if it's useful to you. Whether it spreads beyond that is not up to you.

        My takeaway from the post is simply: hey, if you find this confusing, here's a much simpler way you can talk about it! I am personally going to start using this, and I am going to get immediate benefits from this, and so are my readers. That's it, that's all it takes.

        • kleiba 2 days ago

          > I am personally going to start using this, and I am going to get immediate benefits from this, and so are my readers. That's it, that's all it takes.

          More power to you!

          My point, however, is that while most people would be bothered by spilled milk in the middle of a room and clean it up, the fact that we've stuck with "counting centuries" for so long implies that it really isn't a big deal to most people.

          If it's important enough for you to adapt your speech patterns, there's nothing wrong with it. I wouldn't expect most people to follow suit, though, because no-one cares.