Friday, March 6, 2026

cinema history class: the crawling eye (1958)

The session: The Cold Can Kill Ya!
With plummeting temperatures, Keith shows us four movies with achingly cold settings


As always, there may be spoilers here. And the trailer may be NSFW and/or NSFL.

Week 4: The Crawling Eye (1958)
Directed by Quentin Lawrence

My Level of Prior Knowledge:
Never heard of it.

Plot Synopsis:
A series of mysterious deaths near a Swiss mountain coincide with a strange radioactive cloud that never leaves the summit. Scientists discover the cloud hides telepathic, tentacled creatures that descend from the mountain to hunt humans, forcing the investigators to confront the monsters before they spread beyond the isolated alpine town.

Plot:
The Crawling Eye is a fascinating artifact of 1950s science fiction. One of the more interesting aspects for me was seeing Forrest Tucker in a relatively restrained leading-man role. I’m so used to him as the loud, blustery Sergeant O’Rourke on F Troop (and in a similar mode on Dusty's Trail) that it almost feels like watching a completely different actor.

The movie takes its time getting where it’s going. For a while it’s a slow-moving mystery about a strange radioactive cloud hanging over a mountain and the unexplained deaths of climbers who wander too close to it. Eventually, though, the movie shifts gears and gives us a full-on climactic confrontation with the titular creatures — enormous tentacled eyeballs that emerge from the cloud and begin attacking everything in sight.

Visually, the fiendish eyes are…well, interesting. They’re certainly memorable. But aside from that central effect, there’s not a lot in the way of spectacle. The real standout, oddly enough, is the sound design. The noises the creatures make — especially the awful, squishy shrieks when they’re injured — are surprisingly effective and do a lot of the heavy lifting in making the monsters feel threatening.

This is very much classic 1950s sci-fi territory: scientists, mysterious radiation, remote mountain laboratories, and alien invaders whose plans are never entirely explained. In fact, the movie never really tells us what the creatures want. Are they scouts for a conquering alien race? Are they colonizers preparing Earth for takeover? Or are we simply dealing with an extremely unfortunate case of cloudy with a chance of eyeballs?

While I can appreciate the film on its own terms, this particular brand of 1950s creature feature isn’t really where my main interests lie. This one was much more a Bobbo choice -- and his rating relative to mine reflected that.

Yet, despite its flaws — the pacing, the limited effects, and the somewhat vague alien agenda — The Crawling Eye is kind of low-key great in its own way.

And all jokes aside, I just know that Joe would have rated this a 10 -- if he had been there.



Wednesday, March 4, 2026

the big arch: mcdonald's gets it right


When I go to McDonald’s, which isn't very often, my default order is a Big Mac. I genuinely love the taste of a Big Mac. The sauce, the lettuce, the pickles, the whole odd architectural arrangement of the thing — it’s a very distinctive flavor.

But there’s one problem: A Big Mac doesn’t satisfy.

I eat one, and when I’m done, I immediately feel like I could eat another. And after that… maybe another. A Big Mac is delicious, but it never quite leaves me feeling like I actually ate.

So when McDonald’s introduced a new burger called the Big Arch (currently being offered for a limited time), I figured I’d give it a try. OK. That's not quite accurate. I was champing at the bit, waiting for the grand introduction. And after making the rounds of distributing our shaloch manos baskets to friends and neighbors, the next stop was McDonald's.

One thing I noticed was simple but important: one was enough.

The Big Arch actually satisfied me. I ate it, finished it, and didn’t feel the urge to immediately order a second burger. That alone puts it in a very different category from the Big Mac.

Meat vs. Everything Else

My theory is that the key difference is the ratio of meat to “other stuff.”

The Big Mac has two thin patties buried under a lot of bun, lettuce, and sauce — plus the famous middle bun, which seems designed primarily to increase the bread-to-meat ratio.

The Big Arch, by contrast, is built around two much larger patties, with white cheddar cheese, onions (both fresh and crispy), pickles, lettuce, and a tangy sauce. The toppings are there, but the meat is clearly the main attraction.

And since meat is the thing that actually satisfies hunger, this feels like the correct design philosophy for a hamburger.

A Familiar Flavor…for a Moment

When I first bit into the Big Arch, I briefly got a hint of Big Mac flavor. That’s probably coming from the sauce, which is clearly related to Big Mac sauce but seems a bit tangier.

But that sensation lasted only a moment. Very quickly it felt like I was eating what the Big Mac has always pretended to be: an actual burger.

A Big Mac is delicious, but, while it is burgerlike, it never really feels like a burger in the traditional sense. The Big Arch does.

Is It Basically a Double Quarter Pounder?

A colleague of mine — who asked not to be named (I don't know why) — told me that the Big Arch is essentially a Double Quarter Pounder with Cheese, just with different toppings.

Instead of ketchup and mustard, you get the tangy sauce. Instead of the standard American cheese, you get white cheddar. There are more onions, and the overall construction is a little different.

I haven’t tested this theory yet, but if the Big Arch disappears (and it’s currently being advertised as a limited-time item), I may experiment with the Double Quarter Pounder as a substitute.

The Cheese Question

One thing I’m still unsure about is the cheese.

The Big Arch uses white cheddar rather than the standard American cheese McDonald’s puts on Big Macs and Quarter Pounders. I think it’s better — it certainly tasted better to me — but I’m not entirely confident that wasn’t just the overall burger being better.

More research may be required.

Final Verdict

The bottom line is simple: I would absolutely order the Big Arch again.

In fact, I’d go further than that. It’s way, way, way better than any burger McDonald’s has ever sold before.

Which raises an interesting question: What would Joe rate it?

I can’t say for sure. But if a made-for-TV movie about frozen scientists can earn a 10, I suspect a McDonald’s burger that finally gets the meat-to-everything-else ratio right would at least be in contention.

Tuesday, March 3, 2026

cinema history class: a cold night's death (1973)

The session: The Cold Can Kill Ya!
With plummeting temperatures, Keith shows us four movies with achingly cold settings

(note: This is not an official trailer)

As always, there may be spoilers here. And the trailer may be NSFW and/or NSFL.

Week 3: A Cold Night's Death (1973)
Directed by Jerrold Freedman

My Level of Prior Knowledge:
Never heard of it.

Plot Synopsis:
After a scientist dies under mysterious circumstances at an isolated research station, two investigators are sent to continue his work and determine what happened. As strange occurrences mount and tensions rise, the men begin to suspect that something more than the cold and isolation is lurking in the facility.

Plot:
The movie is essentially a two-man show, and it works beautifully on that level. Robert Culp and Eli Wallach carry most of the film, and their performances create a slow-burn tension that never lets up. Much of the movie is just the two of them talking, arguing, speculating, and gradually becoming more suspicious of both the situation and each other. It’s a reminder that when the acting is good enough, you don’t need elaborate spectacle to hold an audience.

In many ways, the film is an endurance test. The pacing is deliberate and the mystery unfolds slowly, which may try the patience of viewers expecting constant action. But that patience is rewarded. The final reveal is handled with remarkable restraint and effectiveness, and it lands as one of the best-executed reveals I can remember seeing. It’s the kind that suddenly re-contextualizes everything that came before it. It's not quite Sixth Sense-level reinvention, but it's up there.

One thing the movie does extraordinarily well is make you feel the cold. The isolated research station, the howling wind outside, and the sense of being trapped in a hostile environment all come through vividly. And when Robert Culp is stuck digging outdoors, you shiver for him. Of the three films we've watched so far in this session, this one was probably the most thematically appropriate for a “The Cold Can Kill Ya!” session.

I did have one small but persistent annoyance. Throughout the film, the scientists repeatedly refer to the chimpanzees used in their experiments as “monkeys.” If this were just ordinary people talking, I wouldn’t think twice about it. But these are supposed to be scientists studying primates. You’d think they’d know the difference between a monkey and an ape.

The movie is also interesting in a broader cinematic context. It feels strongly reminiscent of The Thing from Another World, with its isolated research station, creeping paranoia, and sense of an unseen threat lurking nearby. At the same time, it clearly anticipates elements that would later appear in The Thing. That creates a nice bit of cinematic symmetry: a movie influenced by a 1951 film that in turn feels like a precursor to the 1982 remake.

Perhaps the most surprising thing about the whole experience is that this was a made-for-TV movie. Television movies from that era are often remembered as cheap or disposable, but this one is neither. It’s tightly written, well acted, atmospheric, and genuinely suspenseful.

And, of course, Joe gave it a 10. While I didn't, I can understand where that 10 comes from.
 


Saturday, February 28, 2026

strategically asking about cheese



Sharon and I try to get breakfast together regularly. Almost always on Saturdays — which is why we call it Dadurday. We do miss a week here and there. Life has a way of scheduling over the things that matter most. But in principle, Saturday morning is ours.

We usually go to the Landmark Diner in Roslyn. I usually order a burger. And I say this without exaggeration: their burgers are among the best I’ve ever had. Perfectly grilled, great flavor, emotionally reassuring.

And then comes the question:

“Do you want cheese?”

This is where things become complicated.

In theory, I like cheese on a burger. In practice, I like pepper jack on a burger. Other cheeses are...other cheeses. American is tolerable, I guess, though it hardly counts as cheese. Cheddar is serviceable. Swiss is OK. Blue cheese is, in my considered judgment, gross. I understand that some people claim to enjoy it. I wish them well.

Plain is also good -- burgers don't need cheese. Whether I want pepper jack or plain depends on mood, weather conditions, and perhaps deeper existential factors.

Early on, when asked about cheese, I would inquire about options. Pepper jack was not among them. Sometimes I would specifically ask. The answer remained no. Sometimes I declined cheese. Sometimes I chose a different cheese.

But over time, I realized I was not merely ordering lunch.

I was participating in a game.

This is a classic signaling problem. The diner is a rational actor. Its objective function is simple: maximize profit by selling food people want. My objective is to consume a burger, ideally topped with pepper jack.

However, information is imperfect.

If I simply decline cheese, the diner concludes I have no cheese demand.

If I hear the list and decline without commentary, the signal is noisy. Perhaps I’m indecisive. Perhaps I’m temporarily lactose-averse. No actionable data.

If I select cheddar or Swiss, then from the diner’s perspective, the equilibrium holds. The absence of pepper jack did not cost them a sale. No incentive to adjust supply.

Which leads to my optimal strategy.

To shift the equilibrium, I must create a credible signal of unmet demand.

So now, I:

Express interest in cheese.

Ask what kinds they have.

Specifically inquire about pepper jack.

Upon learning (again) that they do not have it, I decline cheese and note, gently, that I would have taken it if pepper jack were available.

This transforms a private preference into observable lost revenue.

I am, in effect, conducting a one-man market intervention.

Will this tip the curve? Probably not. It’s entirely possible that my weekly inquiry disappears into the noise of a busy Saturday shift. The kitchen may not be maintaining a Pepper Jack Request Ledger.

But in game-theoretic terms, I have at least moved from a pooling equilibrium (all cheese preferences indistinguishable) to a separating one (my preference clearly signaled).
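The pooling-versus-separating distinction above can be sketched as a toy model. Everything here is invented for illustration: the diner can only infer pepper jack demand from what it actually observes in the orders.

```python
# Toy model of the signaling idea: the diner infers pepper jack demand
# only from explicit mentions in orders. All data here is invented.

def inferred_pepper_jack_demand(orders):
    """Count orders that explicitly signal a pepper jack preference."""
    return sum("pepper jack" in order for order in orders)

# Pooling: all cheese preferences look the same, so the signal is lost.
pooling = ["no cheese", "cheddar", "no cheese"]

# Separating: the preference is stated even though it can't be fulfilled.
separating = ["no cheese, but I'd have taken pepper jack"] * 3

demand_hidden = inferred_pepper_jack_demand(pooling)       # 0
demand_signaled = inferred_pepper_jack_demand(separating)  # 3
```

Under pooling the diner records zero unmet demand; under separating, every visit registers as observable lost revenue.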

And I have done so politely.

None of this is criticism. The people at Landmark are great. Truly. Always friendly. Always welcoming. Always patient when I ask for the cheese list as though new dairy products might have entered the market since last week. It’s consistently a terrific experience.

Dadurday, of course, is not actually about cheese.

It’s about sitting across from Sharon and talking. About her week. About what she’s thinking. About whatever small or large thing is occupying her mind. It’s about carving out time before the day accelerates. And even if we're spending some of the time on our phones, that's OK.

The burger is excellent. The game theory is mildly amusing.

But the real equilibrium I’m protecting is this one: we show up.

And if, someday, the server says, “Yes, we have pepper jack,” I’ll smile, order it, and Sharon will probably roll her eyes — because she has heard this analysis before.

Which is, in its own way, part of the tradition.

Wednesday, February 25, 2026

the alphabetic struggle for the presidency


Political historians tend to divide American history into eras — Founding, Reconstruction, the New Deal, the Cold War, the modern age. These divisions track ideology, party realignments, wars, and economic upheaval.

But beneath those visible shifts lies a quieter contest — measurable, cumulative, and surprisingly dramatic.

The running total of letters in the surnames of Presidents of the United States.

By counting, day by day, the cumulative occurrence of each letter across administrations, a pattern emerges. The results are not random. They form arcs. They show reversals. They reveal consolidation and realignment.

What follows is the verified history of the struggle for alphabetical supremacy.
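The tallying logic behind these standings can be sketched in a few lines. This is a minimal illustration, not the actual dataset: the day counts below are approximations for just three early presidencies, while the real series runs through every administration.

```python
from collections import Counter

# Hypothetical illustration: a few presidencies with (surname, approx. days
# in office). The real tally uses every administration's exact dates.
terms = [
    ("washington", 2865),
    ("adams", 1460),
    ("jefferson", 2922),
]

totals = Counter()
for surname, days in terms:
    for letter in surname:
        totals[letter] += days  # each letter accrues one count per day served

# Cumulative leader after these three terms
leader = max(totals, key=totals.get)
```

Run on just these three terms, the two N's in Washington are enough to put N on top, which is exactly the founding configuration described below.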

The Founding Volatility (1789–1825)

The republic begins with N in the lead, thanks to the two N’s in Washington. The early administrations of Washington, Jefferson, and Madison allow N to establish an initial advantage.

That stability proves short-lived.

In 1801, under John Adams, A briefly takes the cumulative lead — the only time in American history that A sits atop the standings. The moment is fleeting. Later that same year, under Jefferson, N reclaims the lead.

The early republic is not yet settled. Margins are small. A single presidency can alter the balance.

In 1825, during the administration of Monroe, O overtakes N for the first time. At this point, it might appear that O’s long reign has begun.

History, however, remains unsettled.

The Nineteenth-Century Tug-of-War

The middle of the nineteenth century reveals a system in flux.

In 1837, under Van Buren, N retakes the lead.
In 1853, under Fillmore, O reclaims it.
In 1857, under Buchanan, N takes it back again.

The margins during this era are narrow, and leadership changes hands through incremental accumulation rather than dramatic surges. No letter establishes durable supremacy. The system oscillates.

This is the Alphabetic Reconstruction Period — unstable, competitive, unresolved.

The Roosevelt Realignment (1933)

The next decisive shift comes in 1933.

Under the second Roosevelt, O retakes the cumulative lead. Unlike earlier reversals, this one holds. For more than five decades, O remains on top.

The twentieth century does not invent O’s strength, but it consolidates it. Coolidge, Hoover and Roosevelt reinforce what had previously been contested ground. From 1933 forward, O governs the cumulative standings with quiet durability.

Other letters rise but do not displace it.

The Eisenhower Acceleration and the E Near-Miss

The closest challenge to O during its long reign came not from N, but from E.

The groundwork was laid by eight years under Eisenhower — the only instance in presidential history in which a single letter appears three times in a surname. No other administration has produced such concentrated orthographic reinforcement.

Those years produced a measurable compression of the gap between E and O. And that continued under Kennedy with his two E's. By the time Kennedy died, E had narrowed the difference to just 32 cumulative occurrences (53,378 to 53,346).

Had Kennedy completed his full term — and, speculatively, a second term — E would have overtaken O for the first time in American history and solidified its lead.

It is tempting to search for deeper significance in that near-miss.

But the data are dramatic enough without inventing motive for the alphabet.

E ultimately settles into a stable third position.

The Modern N Restoration (1987–Present)

In 1987, under Reagan, N retakes the cumulative lead.

This time, it does not relinquish it.

From that point forward — through Clinton, Bush, Obama, Trump, and beyond — N maintains its position at the top. What had once been a volatile rivalry stabilizes into a modern alignment.

As of the projected end of the current term in 2029, the standings are clear:

N: 67,786
O: 65,916
E: 59,190

The margin is not overwhelming, but it is steady.

After nearly two centuries of oscillation and mid-century consolidation under O, the system has returned to its founding configuration — N on top.

What Does This Reveal?

The early republic was volatile. The nineteenth century oscillated. The twentieth century consolidated. The late twentieth century realigned.

These are phrases historians already use to describe American political development.

Here, they apply equally well to cumulative orthography.

It would be irresponsible to claim deeper meaning. The letters accumulate because surnames contain them. The graph rises because time passes.

And yet, when viewed across 240 years, the pattern feels structured. A led once. O governed for decades. E nearly staged a coup. N reclaimed supremacy and has held it since 1987.

History is written by the winners.

Even when the winners are consonants.

Sunday, February 22, 2026

cinema history class: curtains (1983)

The session: The Cold Can Kill Ya!
With plummeting temperatures, Keith shows us four movies with achingly cold settings


As always, there may be spoilers here. And the trailer may be NSFW and/or NSFL.

Week 2: Curtains (1983)
Directed by Richard Ciupka

My Level of Prior Knowledge:
Never heard of it.

Plot:
Thursday night at Cinema History Class brought us Curtains — but with a twist before the first frame even flickered. Keith, our usual ringmaster of celluloid mayhem, ceded the podium to his friend Chris Gullo, who does some acting but is also known as a film historian and author (with books on Peter Cushing, Donald Pleasence and others to his credit). More to the point, he recently wrote about Curtains for Dark Side magazine and was clearly champing at the bit to share his research. He did so with enthusiasm, context, and just enough behind-the-scenes intrigue to make the film’s rough edges feel like part of the legend. He even raffled off a vintage lobby poster. Bobbo won it. I’ll admit to a flicker of envy, but then I realized Bobbo will treasure it in a way I probably wouldn’t. The universe distributes ephemera wisely.

As for the film itself: it’s very good. Genuinely tense. The jump scares are plentiful enough to keep your shoulders up around your ears, but not so relentless that you never get to exhale. There’s a rhythm to the dread. And the imagery — especially that now-iconic skating scene with the doll-faced killer — is striking, eerie, and memorable. Even when the narrative wobbles, the visuals carry authority.

That said, the script doesn’t give us much to hang onto in terms of character. The actresses assembled for the audition blur together; their rivalries are sketched rather than etched. A stronger investment in who these women are might have elevated the body count into something more tragic than procedural. One performer who does stand out, as he almost always does, is John Vernon. Vernon brings that oily gravitas of his — cultured, manipulative, faintly amused — and you can’t take your eyes off him. Even when the movie falters, he doesn’t.

And falter it does, especially at the end. The finale is ambiguous in a way that left me more confused than intrigued. Not so much “Let’s ponder the implications” as “Wait, what exactly just happened?” As Chris explained in his talk, much of this unevenness stems from behind-the-scenes conflict between director Richard Ciupka and producer Peter R. Simpson. Simpson reportedly took a heavy editorial hand, reshaping scenes and even adding the ending — shot much later — that Ciupka didn’t want. In fact, Ciupka was so unhappy that he declined to have his name on the finished product; the film is credited to “Jonathan Stryker,” which amusingly is the name of the manipulative director character within the story. That tug-of-war also explains some of the more puzzling continuity glitches — including the moment when a body falls from a window and appears to execute a physics-defying 90-degree turn into another window. That wasn’t supernatural horror; it was post-production horror.

Still, for all its production scars, Curtains lingers. The atmosphere works. The set pieces work. The mask works. It’s the kind of flawed genre piece that invites discussion — which, in our little Thursday enclave, is half the fun anyway.

Joe missed this session, but he would have given the film a 10 and Chris Gullo’s presentation a 10. The raffle would have gotten a lower score — unless he won. In that case, it would have been a 10 too.




Saturday, February 21, 2026

the crossover chicago needs


Television keeps rebooting things that already ended properly. Meanwhile, one of the most structurally inevitable crossovers in TV history has never happened:

Married... with Children and Shameless.

They share a city. They share a worldview. They share a spiritual allergy to self-improvement.

And here’s the key: Married…with Children never had a real finale. It didn’t conclude. It just stopped in 1997. The Bundys were left wandering Chicago continuity without narrative supervision. Shameless ended — and in doing so, it killed Frank Gallagher.

Chicago now has a vacancy.

And Chicago does not tolerate a vacuum in dysfunction.

What Happened to the Bundys

The Bundy house survived decades. What it could not survive was property tax reassessment and a variable-rate mortgage Bud once described as “strategically aggressive liquidity optimization.”

Al described it differently.

“We didn’t lose the house. The house lost us. It couldn’t handle the pressure.”

Bud Bundy, improbably successful in crypto-security compliance, convinced his parents to extract equity. Then the market shifted. Then taxes rose. Then Al ignored mail.

“I don’t open envelopes,” Al explains. “That’s how they get you.”

Peggy refused to “downsize into moral surrender.” Kelly assumed escrow was a person. Bud called it “temporary dislocation.” Al called it:

“Living proof that America hates a man who once scored four touchdowns in a single game.”

Bud purchased a distressed two-flat on the South Side as a “cash-flow property with urban upside.” Al’s review:

“So we downgraded from a house we couldn’t afford to a neighborhood that can’t afford us.”

Bud also acquires the commercial note on the building housing the Alibi. Al finds this out the hard way.

“So my son doesn’t own the bar,” Al says. “He just owns the panic attached to it.”

The Gallaghers, Post-Frank

Frank Gallagher is gone. The Gallagher house still stands — stubborn, dented, defiant.

Fiona Gallagher returns to stabilize paperwork and destabilize herself. Lip is sober and tense. Ian and Mickey are married and combustible. Debbie is running three side hustles and one emotional deficit. Liam is the only adult within a five-mile radius.

Bud shows up to inspect his investment.

Al comes along. Al surveys the block.

“I like it. It’s like our old neighborhood — but honest about it.”

Inside the Gallagher house, Al studies the chaos.

“This isn’t neglect. This is efficiency. Why clean something you’re just going to disappoint again?”

Within weeks, Al has adopted Frank’s old stool at the Alibi.

“Relax,” Al tells the regulars. “I’m not replacing the guy. I’m just here to lower expectations.”

Fiona and Bud

Fiona meets Bud in his upgraded form: tailored suits, controlled tone, generational resentment. They bond over paternal disappointment. Al finds out.

“You’re dating my son?” Al says to Fiona. “That’s like trading in a broken appliance for a refurbished one.”

Kelly discovers the affair and arrives on the South Side in full influencer mode. She launches a lifestyle stream titled “Suburban Goddess Goes Urban.”

Al watches one episode.

“You know, Peg, if stupidity were electricity, we could power this block.”

Debbie immediately monetizes Kelly’s following. Lip distrusts Bud’s money. Al distrusts Bud’s existence.

“I raised him to be a failure. Now he’s a success. Where did I go wrong?”

Peggy Finds Her Climate

Peggy Bundy thrives. The Gallagher block runs on entropy, unpaid utilities, and creative denial. Peggy calls it “community.” She and Mickey bond instantly.

Peggy to Al:

“I finally found people who understand that housework is a social construct.”

Al replies:

“So is marriage. And yet here we are.”

Marcy and Carl: The Cougar Doctrine

Marcy D'Arcy meets Carl Gallagher. Carl is younger. Uniformed. Confident in the vague way authority sometimes is. Marcy frames it as mentorship. It escalates.

Jefferson objects. Marcy informs him she is “evolving.”

Al weighs in:

“Marcy trading in Jefferson for Carl? I guess she finally decided to date someone with arrest authority instead of just arrest potential.”

Carl, navigating midlife suburban ambition, asks Al for advice. Al considers this.

“Son, if a woman tells you she sees potential in you, run. Potential is what women see when they don’t like the current model.”

Liam writes a school essay titled Late Capitalism and the Migration of Sitcom Predators.

The Central Conflict

Bud restructures the Alibi’s lease. Al explodes.

“I don’t need a landlord. I need a liver.”

Bud pitches a neighborhood redevelopment scheme involving digital escrow and tax incentives. Al’s summary:

“So the plan is we fix everything by charging ourselves more?”

The deal unravels. Fiona nearly leaves again. Marcy files paperwork. Carl investigates a complaint that circles back to Bud. Kelly livestreams the collapse. Peggy orders takeout.

The Gallagher house remains theirs. The Alibi survives. The Bundys remain displaced. Al returns to his stool.

“You know what the difference is between me and Frank Gallagher?”
He takes a sip.
“Timing.”

Why This Isn’t Nostalgia

In the 1990s, the Bundys were exaggerated stagnation.

In the 2010s, the Gallaghers were exposed precarity.

In the 2020s, those aren’t different genres.

They’re neighbors.

Married…with Children never got a finale. Shameless lost its patriarch.

Chicago still has room for one more man sitting where responsibility should be.

Al Bundy walked so Frank Gallagher could fall down the stairs.

Now somebody has to keep the stool warm.

Monday, February 16, 2026

you always remember your first subway map

Last year the subway system introduced a new map — or perhaps it had already begun phasing it in before 2025 and I only noticed when the old one had mostly disappeared.

That old one — the map that replaced the 1972 Vignelli diagram — had been with us since 1979. Designed by Michael Hertz Associates, it became simply the subway map. It lasted more than forty-five years.

And yet, to me, the definitive New York subway map is still the Vignelli map.

Which is odd.

Because it was only in use for seven years.

Abstraction

The Vignelli map was beautiful — provided you already knew where you were going.

If you knew you needed the F to West 4th Street, it was perfect. The design was schematic, rational, almost electrical. Lines ran at 45- or 90-degree angles. The subway looked like a circuit board. It felt solvable.

It treated the subway as a system.

It did not treat New York as a city.

Distances were abstract. Central Park was a tidy shape. Boroughs were compressed or widened. Landmarks were mostly absent. If you were trying to understand where things actually were in relation to each other above ground, the map could mislead you.

I learned that the hard way.

Woodside and the Popsicle Sticks

While the Vignelli map was still in use, I decided I needed to go to an art supply store in Woodside to buy popsicle sticks for some craft project. I don’t remember how I figured it out, but I determined which stop on the 7 train was closest and concluded that it would be a short walk.

The map seemed to confirm this.

I got off the train and started walking.

And walking.

And walking.

Eventually I reached the store and bought my popsicle sticks, but the walk was much longer than I had anticipated. The map had flattened distance. Stations that appeared neatly spaced were not, in fact, equidistant in real life. And Queens, with its relatively sparse subway coverage compared to Manhattan, has a way of stretching space between stops.

The Vignelli map had made the trip look like a hop.

Reality was a hike.

It was a beautiful system diagram. It was not a pedestrian guide.

The Hertz Correction

The 1979 redesign tried to fix that. The Hertz map overlaid the subway lines onto something resembling actual geography. Shorelines curved. Subway lines curved. Parks looked like parks. You could approximate direction and distance.

It also introduced one of the system’s smartest ideas: trunk-line color coding through Midtown Manhattan.

The B, D and F were orange because they ran on the Sixth Avenue trunk.
The N, Q and R were yellow because they ran on Broadway.
The 1, 2 and 3 were red; the 4, 5, 6 green; the A, C, E blue.

The color didn’t belong to the letter. It belonged to the trunk.

Which meant the letters could change color.

The Q has not always been yellow. From 1988 to 2001, when it ran via Sixth Avenue, it was orange. The V — which existed for just under a decade, from 2001 to 2010 — was orange for the same reason. The M used to be brown, part of the old Nassau Street trunk, before rerouting through Sixth Avenue turned it orange.

Even the Q’s name is historical shorthand. It descends from “QB” (“Brighton via Bridge”), alongside QT (“Brighton via Tunnel”) and QJ (“Brighton to Jamaica”). Historically, the Brighton Line in Brooklyn was the Q. I wish I could end this paragraph with the snarky observation that today’s Q doesn't go to Brighton anymore. But it does.

The W, despite what my memory occasionally insists, has always been yellow. Broadway. My recollections are not immune to rerouting.

The Loss of Purple

As a kid, I did not think in terms of trunk-line logic.

I thought in terms of identity. I thought of the F as my train.

The F used to be purple.

It was a beautiful purple. Distinct. Singular. Royal.

Then trunk-line standardization arrived and the F turned orange, joining the B and D. Meanwhile the E — which I irrationally regarded as a rival — remained blue (though a darker shade).

This felt like betrayal.

I understood that color now signified shared Manhattan infrastructure. But I also knew that the F had lost something beautiful. The 7 train became purple, and it has remained so ever since.

Clutter and Compromise


The Hertz map’s weakness was the inverse of Vignelli’s. It tried to show everything. Geography. Water. Parks. Bus connections. Street grids. Transfers. It was useful — and sometimes cluttered.

The new map introduced in 2025 appears to be an attempt at compromise. It returns to schematic clarity — strong geometry, simplified layout — while preserving trunk-line color coding. Instead of a single thick orange band in Midtown, multiple orange services run in parallel. It is less geographic than Hertz, more geographic than Vignelli.

It tries to split the difference.

The First Map Is the Real Map

The Vignelli map lasted seven years. The Hertz map lasted more than forty-five.

And yet the Vignelli map is the one in my head.

It was the map in use when I first became fascinated with the subway — when tracing colored lines across boroughs and memorizing station names felt like decoding a secret system. It made the city look legible.

When I was young, I found it odd that the map wasn’t redesigned every few years. Why not refresh it like baseball cards? Why not issue new editions, with completely new designs?

It took adulthood to realize that redesigning infrastructure costs money. Stability matters. A transit map is not a collectible.

It is a working tool.

I’m not entirely convinced the system needed this latest redesign. But maps are arguments about what matters — clarity or fidelity, abstraction or realism, system or city.

In the end, the definitive subway map is the one you first learned to read — the one that made you believe the city could be understood.

Even if it occasionally left you walking much farther than expected for popsicle sticks in Woodside.

Saturday, February 7, 2026

cinema history class: the thing from another world (1951)

The session: The Cold Can Kill Ya!
With plummeting temperatures, Keith shows us four movies with achingly cold settings


As always, there may be spoilers here. And the trailer may be NSFW and/or NSFL.

Week 1: The Thing from Another World (1951)
Directed by Christian Nyby

My Level of Prior Knowledge:
I knew of this movie's existence, since Keith showed us the remake a year ago.

Plot:
A group of Air Force personnel and scientists at an Arctic base uncover a crashed spacecraft frozen in the ice. They soon realize the wreck carried a hostile alien life-form that feeds on blood and begins stalking the isolated outpost. As tensions rise between scientific curiosity and military caution, the group bands together to stop the creature before it can escape into the world.

Reaction and Other Folderol:
I wasn't entirely fair to The Thing from Another World when I watched and graded it, and I'll own that up front. I had seen the remake (titled simply The Thing) at Keith's about a year ago, and that made it hard not to spend this entire screening waiting for things that simply weren't going to happen. The two films tell very different stories: the 1951 version is essentially "there is a big, scary alien loose, stalking the base, and we need to stop it," while the remake leans into a far more paranoid idea—an alien that imitates whatever it comes into contact with. As I watched, I kept expecting shape-shifting paranoia and molecular body horror, even finding myself wondering whether the Star Trek: The Next Generation episode "Aquiel" was inspired by the later film. Ironically, my understanding is that the remake is actually closer to the original short story than this movie is.

That mismatch probably cost the film a few points in my mental scorecard, because I did like it—just not as much as the remake. I want to believe that's mostly about plot: the later version simply gives me more to chew on. But I also have to admit that I'm a product of my time, and early-1980s movie sensibilities feel more natural to me than early-1950s ones. Bobbo, being older and much more steeped in '50s sci-fi, was clearly more on the movie's wavelength than I was, which is exactly how these things tend to shake out.

A lot of the movie really works. The pacing is solid, the atmosphere is tense, and the setting does a lot of heavy lifting. That said, the human-looking vegetable alien—kept mostly in shadow for understandable reasons—was, at moments, a little hard not to chuckle at. There’s also a scene where the alien runs out of the base while on fire, and I genuinely found myself wondering whether this image somehow planted a seed that later bloomed into Flaming Carrot. I’m not saying it did. I’m just saying my brain went there.

One area where the movie really impressed me was in a couple of its quieter science-fiction ideas. The fact that shooting the vegetable creature doesn’t immediately kill it actually makes perfect sense: plants don’t have vital organs the way humans do, so bullets aren’t automatically fatal. If I went out to my front yard and fired a cannon at my oak tree, it would be damaged, sure, but not “dead” in any meaningful sense — whereas if the oak tree somehow returned fire, I would be extremely dead. Even creepier (and wonderfully so) is the scene where the scientist calmly shows off the tray of plants he's been growing. He explains that he started with little bits of the alien that fell off in one of the scuffles, and he’s feeding them human blood. That moment lands like proper science fiction: unsettling, logical in its own warped way, and far more disturbing than anything that jumps out from the shadows.

One thing that deserves real credit is how believable the human characters are. People argue, make bad decisions, push their own agendas, and react in ways that feel recognizably human rather than purely “movie logic.” That grounded behavior goes a long way toward selling the danger and gives the film a seriousness that still holds up.

And Joe wasn't there. Had he been, he would have confidently declared this a ten, immediately, without hesitation, and then spent the next five minutes explaining why any objections were missing the point entirely. Because some tens are tennier than others. Or maybe not: since Bobbo gave it a ten, it would be hard to argue that Joe shouldn't have.


Sunday, February 1, 2026

cinema history class: dig your grave friend... sabata's coming

The session: Viva Sabata!
Four Movies featuring Sabata, a James Bond of the wild west


As always, there may be spoilers here. And the trailer may be NSFW and/or NSFL.

Week 4: Dig Your Grave Friend... Sabata is Coming (1971)
Directed by Gianfranco Parolini

My Level of Prior Knowledge:
I know I've seen this before -- there were a couple scenes that I recognized. But I have no idea when or under what circumstances.

Plot:
A Civil War soldier returns to his father's home, only to find the old man dead. Seeking revenge, he finds himself partnered with an unlikely ally and at odds with a beautiful and brilliant woman who doesn't quite know whom to trust. Together they navigate corrupt officials, hired guns, and shifting loyalties as old scores are settled and new ones are created.

Reaction and Other Folderol:

Let’s get this out of the way first: Dig Your Grave Friend… Sabata’s Coming is a Sabata film in name only.

There is a character named Sabata in the movie, but he’s a tertiary figure and about as far from the iconic Lee Van Cleef version as you can get. This Sabata is just a run-of-the-mill hired gun. Unlike Sabata, he’s a bad guy. Unlike Sabata, he has no clever gadgets or gimmicks. And unlike Sabata, he dies.

Joe said what we were all thinking: it’s obvious this movie was not developed as a Sabata film at all. The name was slapped onto it later to capitalize on the popularity of the character. Maybe they could have given Richard Harrison’s character, Steve, the Sabata name instead—that might have made it fit a little better structurally. But even then, it still wouldn’t really feel like a Sabata movie.

So rather than judge this as a failed or bogus entry in the Sabata saga, it makes more sense to look at it simply as a spaghetti western on its own terms.

And on those terms, it’s entertaining. Good in some ways. But not great by any means.

One thing the movie does surprisingly well is function as a kind of buddy film. Steve ends up paired with Leon, an unintended sidekick who helps him navigate an increasingly complicated situation. (It’s hard not to notice that Leon looks a lot like Ron Jeremy, once that thought enters your head.) The buddy dynamic works better than expected—Steve and Leon play off each other nicely, and their interactions give the movie a bit of momentum it might otherwise lack.

In fact, that relationship is probably the main reason the film works as well as it does. When it’s leaning on the interplay between those two characters, the movie feels lighter on its feet and more engaging.

Where it starts to falter is in its reliance on humor. Several of the barroom fight scenes tip over into outright slapstick, drifting into Three Stooges territory. It’s not that humor has no place in a spaghetti western—but here it’s sometimes pushed too far, undercutting any tension the scene might otherwise have had.

On the other hand, the film does occasionally tap into the kind of nastiness that spaghetti westerns are known for. There are moments of cruelty and grit that remind you this genre can still bite when it wants to. Those moments help keep the movie from becoming entirely frivolous.

This isn’t a deep film, and it doesn’t really offer anything new. But it is fun. As long as you’re not expecting Sabata—despite what the title insists—you can get some enjoyment out of it.

After thoroughly discussing what was wrong with the movie, Joe gave it a ten. But he acknowledged that, within the realm of tens, it’s an eight. Because some tens aren’t as tenny as others.

Saturday, January 31, 2026

struggle was always the default


I’ve seen a lot of social media posts lately lamenting what’s presented as one of life’s great injustices: most people work for decades, often until around age 65, and only then—if they’re lucky—get to stop working and “enjoy life.” Some retire earlier, some later. Some never really retire at all, continuing to work well past traditional retirement age because they have no choice. This reality is often described not just as unfortunate, but as fundamentally wrong.

And increasingly, the blame is laid at the feet of “civilization.”

The argument seems to go something like this: modern society has imposed an unnatural burden on people, forcing them into decades of toil that humans were never meant to endure. I’ve even seen claims that humans are the only animals saddled with this bizarre notion of “work,” as if the very concept is an invention of spreadsheets, office parks, and capitalism.

But this strikes me as getting things almost exactly backwards.

Modern civilization—especially the specialization of labor that comes with it—has been one of the greatest improvements in human quality of life. Other animals don’t go to jobs in the way we do, but they absolutely still work. For animals in the wild, life is a constant, unrelenting struggle for survival. Food must be hunted or foraged. Shelter must be found or built. Predators must be avoided. Injuries can be fatal. Aging doesn’t come with a gold watch and a pension plan. There is no retirement in the wild.

The same was true for humans before civilization, and for much of early civilization as well. People gathered food. They hunted. They planted. They defended themselves. They worked simply to stay alive. Civilization didn’t invent work; it gradually made survival less brutal. The progress hasn’t been perfectly smooth or monotonic, but the long-term trend is unmistakable. Life today—certainly in America, and broadly across the developed world—is vastly better than life anywhere a hundred years ago, let alone a thousand or ten thousand years ago. We enjoy comforts our ancestors couldn’t have imagined.

The uncomfortable truth is that humans are animals, and our natural condition involves effort. Food does not arrive at the table without work. Shelter does not magically appear. Clothing, transportation, medicine, and infrastructure all require human labor somewhere along the chain. One of the great achievements of advanced civilization is that we now get more of these things with less total human effort than ever before—not that effort has disappeared entirely.

Some of the confusion may be fueled by science fiction. Franchises often portray futures in which humans have transcended mundane labor, freed at last to pursue self-actualization without economic necessity. It’s an appealing vision. But it’s also a fantasy—at least for now.

There are people who believe that artificial intelligence, advancing exponentially and perhaps even designing future versions of itself, will finally deliver that utopia. A world where all material needs are met and no one has to do work they don’t enjoy. I’m skeptical. Human nature hasn’t been repealed, and neither have scarcity, incentives, or power. But it’s an open question. We’ll see.

Wednesday, January 28, 2026

the reunion we need. the reunion we deserve.


TV keeps rebooting the wrong shows. Enough already. What we actually need is a BJ and the Bear reunion movie.

Picture this: BJ McKay rolls back into Orly County for Sheriff Lobo’s funeral. The man was a pain, a bully, and—let’s be honest—a legend. The town turns out. There are speeches about “law and order” that somehow leave out the part where he terrorized long-haul truckers for sport.

BJ reconnects with Deputy Hawkins, who—miraculously—has aged into exactly what he always was: a straight-arrow, by-the-book lawman who still thinks the system works if you just follow the rules hard enough. Hawkins is genuinely happy to see BJ. Less happy to see Bear riding shotgun, because some things never change.

Enter the problem: Sheriff Lobo’s son. He’s running for sheriff himself, fueled entirely by resentment. He grew up hearing bedtime stories about “that drifter trucker and his damn monkey” humiliating his father on a weekly basis. He bears a grudge. (Yes. Bears. I regret nothing.)

At first it’s petty harassment—traffic stops, inspections, old ordinances dusted off for no reason. But it escalates when BJ realizes Orly County has been “cleaned up” in the worst possible way: small operators pushed out, corruption repackaged as respectability, and Lobo Jr. using his father’s legacy as a cudgel. Hawkins is stuck in the middle, trying to do the right thing while pretending this isn’t personal.

And Bear? Bear sees through everything. The chimp knows the score immediately. He becomes the moral center of the story, which feels right.

By the end, BJ isn’t just passing through anymore. He’s forced to decide whether some towns are worth fighting for—even when the fight looks suspiciously like the same old one. Also there’s a truck chase, at least one courtroom scene that makes no legal sense, and Bear absolutely steals a set of keys at a crucial moment.

Tell me this wouldn’t work. Tell me this wouldn’t be better than the ninth reboot of something nobody asked for. And Joe will give it a ten!

Sunday, January 25, 2026

on tipping -- and why I still do it


I'm posting this because I have seen a bunch of social media posts (on a bunch of platforms) about the subject of tipping. I know that, the algorithm being what it is, most people aren't seeing what I'm seeing. I also realize that no one asked for my opinion. Too bad.

I don’t like the current tipping system. If I were designing things from scratch, I’d much prefer a system where tips are not expected — or better yet, not allowed at all. In that world, servers would be paid a proper, predictable wage, menu prices would be higher to reflect that reality, and everyone would know what they’re paying for upfront. No math at the table, no moral arithmetic afterward.

But that isn’t the system we have.

In the system we do have, servers rely on tips. And just as importantly, the prices I see on the menu are built around that fact. They’re lower precisely because the restaurant is not paying full wages and is instead shifting part of that responsibility onto the customer. So if I were to refuse to tip on principle, I wouldn’t be staging a protest — I’d simply be taking advantage of artificially low pricing while someone else absorbs the cost.

That doesn’t sit right with me.

I can dislike the system and still acknowledge the reality I'm participating in. Until the rules change, choosing not to tip doesn't punish "the system"; it punishes the person who brought the food to my table. And simply saying that "it's the restaurant's job to pay them better" doesn't address the fact that I'm getting lower menu prices because the restaurant isn't paying them better.

That said, I do notice that expectations have shifted. When I was growing up, 15% was considered a solid tip for decent service. You tipped more for exceptional service, less if things went badly, but 15% was the baseline. Now the default seems closer to 20%, with suggested amounts sometimes climbing higher than that. Whether that reflects rising costs of living, social pressure, or tip creep driven by point-of-sale screens, I’m not entirely sure — but it has changed.

So I find myself in an uncomfortable middle ground: disliking the system, recognizing its flaws, noticing its evolution, and still tipping — because opting out unilaterally isn’t reform, it’s just imbalance.

I’d happily support a different model. I’d even pay higher menu prices for it. But until that model actually exists, I tip — not because I love the system, but because I live in it.

Saturday, January 24, 2026

cinema history class: return of sabata (1971)

The session: Viva Sabata!
Four Movies featuring Sabata, a James Bond of the wild West


As always, there may be spoilers here. And the trailer may be NSFW and/or NSFL.

Week 3: Return of Sabata (1971)
Directed by Gianfranco Parolini

My Level of Prior Knowledge
I may have heard of it. Maybe. I dunno. I certainly didn't really know much about it except that there were several films with the Sabata character. Sort of like Sartana or Django.

Plot:
Master gunman Sabata comes out of hiding to expose and dismantle a powerful gold-smuggling conspiracy run by corrupt officials and bankers, outwitting rivals with gadgets, disguises, and razor-sharp strategy.

Reaction and Other Folderol:
Before getting into the movie itself, a small sidenote about something that annoyed me more than it should have: there’s a character named McIntock who is very frequently referred to as “McClintock.” I realize this is not a major sin, or even a particularly interesting one. But once I noticed it, I couldn’t not notice it, and it became one of those tiny irritations that just sat there, tapping me on the shoulder for the whole movie.

Stepping back for a moment, Return of Sabata also marks the end of the official Sabata trilogy, and Keith deserves thanks for guiding us through all three films. I found that they form a neat little case study in how quickly a character can drift into self-parody, depending on tone, direction, and how much the filmmakers trust the audience versus how much they feel the need to mug for it. I acknowledge that that wasn't Keith's intention. He, like most of the others in the room, likes Sabata more than I do. I guess there's no accounting for taste.

This time out, Lee Van Cleef is back after sitting out the second film, where he was replaced by Yul Brynner. One might think this was a win for me, but after the second Sabata film I realized that I prefer Brynner's serious interpretation of the character. This movie leans hard into broad comedy—much more than the first two—and for me that's the single biggest reason it didn't work. I'm increasingly realizing that I prefer Van Cleef when he plays it straight: not necessarily as a villain, but as a controlled, intimidating presence. He's terrific in For a Few Dollars More, The Good, the Bad and the Ugly, and Death Rides a Horse—all performances built on stillness, menace, and restraint. Here, he's asked to be a comedian far too often, and it just doesn't suit him.

The opening scene didn’t help. The whole dinner-theater-meets-circus setup felt less like a western and more like a particularly campy episode of Batman, complete with winking theatrics and exaggerated villainy. That tone more or less sets the agenda for what follows.

Once again, the film relies heavily on familiar faces in familiar roles, giving the whole thing the feel of a repertory theater company endlessly remixing the same parts. Some people seemed to enjoy that sense of continuity. I didn’t. Maybe if there had been more here that I genuinely liked, I would have found it comforting instead of repetitive.

Gimmicks also return in force: more acrobatics, and now an elaborately deployed foot-operated slingshot, just in case you were worried the movie might go five minutes without reminding you how clever it is. We also get, yet again, the slimy Mexican character and the untrustworthy associate who’s only in it for the money—familiar beats, hit once more.

Add to that a score I found actively annoying, and a runtime that felt far longer than it needed to be, and I spent a good chunk of the film thinking, Jesus Christ—when will this end?

And, of course, Joe gave it a ten -- no surprise there. But at least this time he limited his review to a movie that actually exists.



Wednesday, January 21, 2026

cinema history class: adios, sabata (1970)

The session: Viva Sabata!
Four Movies featuring Sabata, a James Bond of the wild West




As always, there may be spoilers here. And the trailer may be NSFW and/or NSFL.

Week 2: Adios, Sabata (1970)
Directed by Gianfranco Parolini

My Level of Prior Knowledge
I'd heard of it, but didn't really know much about it except that there were several films with the Sabata character. Sort of like Sartana or Django.

Plot:
An enigmatic gunslinger arrives in a corrupt frontier town where a stolen gold shipment has entangled politicians, businessmen, and outlaws alike. Playing rival factions against each other with wit, gadgets, and lethal precision, he uncovers the conspiracy behind the theft and ensures that justice—on his own terms—is served.

Reaction and Other Folderol:
One of the more interesting bits of trivia surrounding Adiós, Sabata is its casting sleight-of-hand. Lee Van Cleef—who originated the role of Sabata—was unavailable, having been contracted to star in The Magnificent Seven Ride! as Chris Adams, a role famously played by Yul Brynner in the original The Magnificent Seven. So Brynner, in a bit of cinematic role-swapping irony, stepped into Van Cleef’s boots and became Sabata.

I’ll admit I’m in the minority here, but I actually preferred Brynner’s take on the character. His Sabata has more gravitas—less of a wink, more of a stare. The humor is still present, but it’s dialed back in favor of poise and authority, with Brynner striking deliberate poses that feel almost mythic. Most of the room leaned toward Van Cleef’s version precisely because of its playful, ironic edge; I liked that this Sabata took himself (and the stakes) more seriously.

In fact, Adiós, Sabata often feels less like a sequel than a careful remake, following much of the same structural DNA as the first film. The character Ballantine is essentially a stand-in for Banjo: the same greedy, slippery ally you can’t quite trust, but who sticks around as long as the arrangement benefits him. Like Banjo before him, Ballantine ultimately tries to make off with the gold—only to be outplayed by Sabata in almost the same fashion.

Even the action escalation feels familiar. When straightforward gunplay wasn’t enough in the first film, Sabata introduced dynamite as a recurring visual flourish. Here, that role is filled by nitroglycerin, with explosions punctuating shootouts in much the same way. Likewise, the flashy physical gimmicks have simply been swapped out: the coin-tossing and acrobatics of the original are replaced by a character who can hurl a metal ball with his feet at terrifying speed and accuracy. One improbable trick retires; another clocks in.

One thing that did grate on me, though, was seeing actors from the first film reappear in very similar roles—but as entirely different characters. I know this is a common spaghetti-western practice, but it chips away at the internal logic of the world. For me, it breaks the illusion of continuity and makes the whole thing feel more like repertory theater than a shared cinematic universe.

Still, despite all the repetition, I ended up rating Adiós, Sabata higher than its predecessor. The familiar framework works better for me when it’s treated with a straighter face, and Brynner’s more solemn interpretation elevates material that might otherwise feel like a retread.

And then there’s Joe, who gave it a ten—because in this universe, Yul Brynner plays Sabata seriously, and that’s a ten—while confidently insisting that the nonexistent alternate-universe version starring Lee Van Cleef doing it more humorously would also be a ten, which is impressive given that he apparently now reviews movies that were never made.





Sunday, January 11, 2026

a's or athletics -- what a trademark dispute reveals, and why it matters to me


I’ve long been interested in sports franchise movement and name changes, particularly in Major League Baseball. So when I learned that the team formerly known as the Oakland Athletics had been denied trademarks for “Las Vegas Athletics” and “Vegas Athletics,” but approved for “Las Vegas A’s,” it immediately caught my attention.

On its face, this looks like a narrow legal story. But it touches on something much older and more fundamental: what baseball teams are actually called, who decides that, and how those decisions have changed over time.

What I called them — and what the record says

Growing up, I thought of the team simply as the A’s. I knew that was short for “Athletics,” just as I knew the Mets were short for “Metropolitans” and the Yankees for "Yankee-ee-ees." But in conversation, on uniforms, and on baseball cards, they were the A’s.

That impression was reinforced by my father. When we talked baseball, he talked about the A’s. Not the Athletics.

At some point much later, when I started doing more systematic historical work, I needed a single, consistent source of truth for team identities. For me, that source is Baseball-Reference. It’s not perfect, but it is transparent, consistent, and careful about continuity.

Baseball-Reference lists the franchise’s nickname as “Athletics” continuously from 1901 to the present. No alternation. No official back-and-forth. Just Athletics.

That surprised me at first — but I accepted it, because when you decide on a source of truth, you have to live with it.

A childhood lesson in “correctness”

The distinction mattered to me even as a kid.

I remember wanting to show off for my grandpa Ed by reciting all the major league teams in alphabetical order. There were only twenty-four teams then — sometime between 1969 and 1976 — but that detail matters only because I had memorized the list by sorting my baseball cards by team.

I knew there was one potential snag. Did I say A’s or Athletics? It mattered, because it determined whether the team came before or after the Astros.

So I asked my grandfather which was right.

Without hesitation, he said, “Athletics, of course.”

That caught me off guard. I was used to saying the A’s. My father said the A’s. But my grandfather’s answer carried a different kind of authority — as though, whatever fans said informally, there was still a sense that the proper name existed underneath.

I didn’t articulate it that way at the time, but the lesson stuck.

When team names weren’t “official” yet

Part of the confusion here comes from the fact that baseball team names did not begin as formal, declared brands.

In the 19th and early 20th centuries, nicknames were often:

  • informal,
  • media-driven,
  • situational,
  • and sometimes accidental.

Teams could have multiple nicknames at once, and newspapers freely experimented. The Cincinnati Kelly’s Killers, the Chicago Orphans, the Brooklyn Bridegrooms — these weren’t the result of branding exercises. They were labels that caught on because writers used them and readers understood them.

There was no trademark strategy. No naming committee. No press conference unveiling a logo.

From informal tradition to formal branding

Over time, that looseness disappeared.

As franchises became long-lived commercial entities, names hardened into official identities. By the late 20th and early 21st centuries, naming (and renaming) a team became a major corporate exercise, often accompanied by fan outreach, surveys, and carefully managed rollouts.

A clear example is the Cleveland franchise’s transition to the Cleveland Guardians, a process that explicitly solicited public input and emphasized deliberateness. Expansion franchises, in particular, often lean heavily on fan suggestions when selecting names, precisely because the name is now understood as a long-term asset.

In other words, the sport moved from organic, informal naming to formal, legally protected branding — and that evolution sets the stage for the current trademark dispute.

Why the trademark office cares

Trademark law has little patience for ambiguity.

“Athletics” is an extremely old name, one that predates modern trademark norms and was used by multiple teams in different cities over the decades. It is also descriptively weak: the word does not inherently distinguish one specific commercial source.

“A’s,” by contrast, is distinctive. It is visually iconic, closely associated with a specific logo, and has decades of consistent use on uniforms and merchandise. From a trademark perspective, it cleanly identifies this franchise.

Seen that way, it is not especially surprising that “Las Vegas A’s” cleared a hurdle that “Las Vegas Athletics” did not.

Why I find this funny

Here’s where this circles back to my own work — and why the situation genuinely amuses me.

For years, in my ongoing tracking project (“Stoopidstats,” which I really should trademark), I have listed this franchise as the Athletics, even though I personally thought of them as the A’s. I did that because Baseball-Reference is my source of truth, and consistency matters.

This isn’t cosmetic. I track cumulative wins — and games over .500 — by nickname.

A friend recently noticed this and thought it odd. He was sure that the team’s official name must have alternated over time between A’s and Athletics, and was surprised to learn that, according to my source, it never did.

Which makes the current situation deliciously ironic: after decades of thinking of them as the A’s but listing them as the Athletics, the franchise may now be forced, by trademark reality, to become the A’s.


What this means for my stoopidstats

From a purely statistical standpoint, this makes things interesting.

Adding Las Vegas (or Vegas, or Nevada) as a location already complicates franchise tracking. Adding Nevada as a state does too. And adding “A’s” as a distinct nickname would create something entirely new.

Right now, “Athletics” ranks third all-time in wins by nickname, with 10,302, behind only “Giants” and “Reds.” Fourth place belongs to the “Pirates,” with 10,263 wins. That 39-win margin between them will, of course, change over the next few years, but when the "Athletics" name goes away (if it goes away), it is likely to still be third and "Pirates" is likely to still be fourth. From that point on, though, "Athletics" will sink, and "Pirates" will likely be the first to pass it. And, of course, I'll get to watch a new entry ("A's") start with zero wins and slowly climb the rankings, passing such memorable but defunct names as "Mansfields," "Tip-Tops," and "Quicksteps."
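The bookkeeping behind that scenario is simple enough to sketch: wins accrue to whatever nickname the franchise is using, a retired nickname just stops accruing, and a new one starts from zero. The starting totals below come from this post; the two added seasons are entirely hypothetical, just to show the mechanics:

```python
from collections import defaultdict

# Cumulative wins by nickname. Starting figures are the ones quoted
# in the post; any nickname not yet seen implicitly starts at zero.
wins_by_nickname = defaultdict(int, {
    "Athletics": 10302,
    "Pirates": 10263,
})

def record_season(nickname: str, wins: int) -> None:
    """Credit a season's wins to whatever nickname the franchise used."""
    wins_by_nickname[nickname] += wins

# Hypothetical future: the franchise plays as the "A's", so "Athletics"
# freezes at 10,302 while the new name starts climbing from zero.
record_season("A's", 81)       # made-up first season under the new name
record_season("Pirates", 75)   # made-up season for the Pirates

margin = wins_by_nickname["Athletics"] - wins_by_nickname["Pirates"]
print(margin)  # -36: in this toy example the Pirates have already passed
               # the frozen "Athletics" total
```

The `defaultdict` is what makes "a new entry starts with zero wins" automatic: the first `record_season("A's", ...)` call needs no special casing.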

That’s not a tragedy. It’s a reminder.

Names, memory, and authority

This whole episode highlights the tension between:

  • how fans remember teams,
  • how historians catalog them,
  • and how the law insists they be defined.

I grew up saying the A’s. My father said the A’s. My grandfather insisted on Athletics. Baseball-Reference sided with my grandfather. Trademark law may ultimately side with my childhood self. And I still think "Phillies" is a lazy name, but that's a subject for a different post.