Items to fit into your overhead compartment |
| How about some dubious travel advice today? From TimeOut, whoever they are. Why Western Montana is best visited outside of summer Here’s my case for an off-season visit to the home of Yellowstone and Glacier National Parks. I want to be clear: I'm not ragging on Montana here. I know we have at least two members here from that scenic state. So that's, what, 10% of its population? What I take issue with is "outside of summer." I've driven through Montana "outside of summer." Sure was pretty... until I had to venture out of my nice warm car with its heated seats to refuel, and nearly froze into a statue. Western Montana, in particular, overflows with visitors during the summer, thanks to its proximity to Yellowstone and Glacier National Parks. I would, admittedly, like to see Yellowstone, preferably at some time when it's not overly infested with tourists. Glacier National Park, though? That sounds cold. Even though June through August might be the most popular months to visit, that doesn’t mean it’s the best time for a trip. "Best" is, of course, a value judgement and an opinion. Because I don't like being in a crowd or following one, I do enjoy traveling in the less-than-peak season. We used to go to the Outer Banks every September, which was still pretty warm but less crowded, especially with most kids being in school and consequently not shrieking and running around. Sure, the hurricane risk was greater, but we only had one vacation interrupted by impending doom, and that's what travel insurance is for. Still, you won't catch me there in December. That's probably why the Wright brothers did their testing there and then: everyone else had the good sense to stay the hell out of Kitty Hawk. Peak season problems don’t end with endless elbow rubbing on the trails and inside restaurants. I know I joke a lot about *shudder* the outdoors, but, shh, don't tell anyone, I do enjoy the occasional stroll out in full view of the accursed daystar. 
What I do not enjoy is having to deal with other people while doing so. While I'm not in the least bit interested in climbing Mount Everest anyway, pictures like this one make my point for me. Temperatures are at their highest, making outdoor activities a little less enjoyable. And you just lost me. I want warm. Hell, I'm okay with hot. Not to mention, the mosquitoes and black flies can be relentless. That's because you ventured *shudder* outside in the first place. Wildfire season overlaps with this busy time of the year, bringing added risk to your trip. Okay, that's fair. Being stuck in the wild during a wildfire would be an actual nightmare. Lest we forget, though, Yellowstone is home to a supervolcano that will one day wipe out most of the life in North America and maybe beyond, and if you're there, you're at Ground Zero. Fine, though, I'll admit that a) being at Ground Zero would be preferable to suffering somewhere on the fringe, and b) the risk of supervolcanic activity is way lower than that of wildfires. One of the best parts of going to a new place is connecting with those who live there. I can name endless trips where a random conversation or even just small talk with a barista has resulted in amazing recommendations and provided important additional context on a place. Now, on this point, I can absolutely relate. Except you need to replace "barista" with "bartender." Montanans will tell you that fall is the state’s most underrated season. Many of the outdoor activities popular in summer are still possible, but with fewer people. I think it depends on exactly when in the fall. Late September? Sure. Early December? Did that, hated it. Winter gets a bad rap, but it’s actually one of the most picturesque times to visit Western Montana. Sure, temps can be frigid, but that’s part of the appeal. No it's not. The author also waxes poetic about spring there. Like I said, the area is worth seeing. 
It's just that some of it is worth seeing through a window from a climate-controlled space. |
| "We're doomed." "Oh, come on, Waltz. Be more positive!" "Okay. I'm positive we're doomed." From Psychology Today, back in May: The Paradox of Trying to Be Positive Why real well-being starts with feeling, not forcing positivity. A hydrogen ion walks into a bar. "I'd like a beer, please." Bartender goes "Are you sure?" "Yes, I'm positive." Do you get the sense that positivity has become a kind of moral code? It's certainly a forced social norm. We're encouraged to focus on the bright side, reframe the negative, and show up with a smile—regardless of how we're actually feeling. When your paycheck depends on you projecting happiness, you learn to act happy fast. I vaguely remember doing a bit in here, or maybe it was the previous blog, about how no, forcing a smile doesn't make you happy. Can't be arsed to find it again now. But here's the paradox: In trying to be positive, we often end up disconnected—from ourselves, from others, and from what's really going on. On the positive side, have you seen what's really going on? Disconnection starts to sound like the key to happiness. There is, of course, value in finding solutions and making meaning out of difficult experiences. Assertion without evidence. Emotions are messy, inconvenient, and at times overwhelming and difficult to interpret. Yet they play a vital role. They're not instructions; they're information. Much like the lights on a car dashboard, emotions signal that something needs our attention. Ignoring them doesn’t solve the problem. It often makes things worse. You know what that reminds me of? It sounds a lot like how we're supposed to handle physical pain, too. Suck it up. Walk it off. Take another lap. Emotions often precede cognition. They arise from a deep, non-verbal place—an intuitive knowing that's rooted in the body. I will give this author points for not making up some off-the-cuff evolutionary psychology "reason" for this. But why do we do this? 
Often because feeling what we're actually feeling is uncomfortable. There’s also cultural reinforcement. Social media encourages us to present a life that looks successful, joyful, and ever-improving. Okay, well, I'm not a shrink, but from personal experience: no one wants to be around the Eeyore. The person who projects loneliness, sadness, melancholy, depression: they are shunned and avoided. We don't want to catch whatever it is they're carrying. So of course people are going to act all happy and fulfilled on social media, because social media is all about generating likes and followers. Or, alternatively, or maybe additionally, there's a lot of negativity out there, but it gets lost in the radiance. This is the real paradox, by the way: the people who need attention the most can't get it because no one wants to be around them. This is where the concept of toxic positivity comes in—the idea that positivity becomes harmful when it invalidates real emotional experience. That may be the clearest definition I've seen of toxic positivity. And ironically, the more we suppress, the more intense those suppressed emotions tend to become. What we resist doesn't disappear—it builds. And it can show up in unexpected ways: irritability, disconnection, fatigue, even physical symptoms. The mind-body link is still underexplored (I blame Descartes, who insisted they were different entities). But it's real. The impacts don’t stop with mental health. There’s emerging evidence suggesting that chronic emotional suppression may also affect the body. It works the other way around, too. One issue is that people continue to think of the mind as something separate from the body. Psychological research shows that people who can name and describe their emotions with greater precision (a skill known as emotional granularity) are better able to regulate them, experience fewer symptoms of distress, and recover more quickly from adversity. Oh, great. Another thing I suck at: naming emotions. 
Once you get past anger or joy, I'm at a loss. So perhaps the invitation isn’t to be positive, but to be real. To meet ourselves where we are, without rushing to reframe or override. And yet, there's a lot of potential humor to be found in looking at the bright side of things. "My house just burned down." "Hey, it lit up the neighborhood for a while!" "My dog died." "Look on the bright side: no more picking up dookie!" "The world is experiencing an unprecedented warming trend." "Not to worry: nuclear winter will counteract global warming!" Dark humor is like food: Not everyone gets it. So how do we live with sadness and melancholy and darkness in a world that demands joy and cheer and light? I don't know. Me, I try to remind myself that depression is a wonderful driver of creativity. See? I can look at the bright side. |
| I like to laugh. I like to make people laugh. I especially like to laugh when people groan at my puns. But comedy is one of those elusive subjects: the more you analyze it, the less funny it is. Here, Big Think tries their hand at analyzing it. The biggest joke here is that this is from BT's "Mini Philosophy" department. I've long maintained that philosophers have no sense of humor. We have another word for philosophers with a sense of humor: we call them comedians. There are many kinds of laughter. You can laugh cheerfully, mirthlessly, dryly, cruelly, drunkenly, unexpectedly, and pointedly. Or my personal favorite, sarcastically. Ha! Laughter is a noun with many possible adverbs. And you can't spell slaughter without it. This raises a problem for anyone wanting to tell a joke. Because a joke, at its most basic, is something that is intended to make someone laugh. And so, given the sheer variety of laughter, it makes sense that there’s an equally sheer variety of jokes. I'm all for transparency in humor. A joke might be good-natured or mean. It might be childish or intellectual. Anyone who knows me should know that I like the ones that are both childish and intellectual. The philosophy of humor is such an ill-defined and borderless discipline that a writer would be foolish to try to say anything meaningful at all. And yet, who knows comedy better than a fool? In this week’s Mini Philosophy interview, I spoke with Brett Belle, who runs the hugely successful social media account Mom’s Dad Jokes, about what she thinks makes her and her jokes so popular. Well, they're dad jokes, so of course they're pop-ular. I’m going to put my neck out and suggest that all jokes and all laughter can be divided into two categories: affiliative and adversarial. Naturally, as a childish intellectual, my Oppositional Defiant Disorder kicks in here and demands to find counterexamples. 
But all I can come up with, at least for now, is the observation that many jokes fall into both categories. And that those categories can switch depending on speaker and audience. Let me give you an example. When I was a kid, a friend of mine and I would exchange Jewish jokes. These jokes could be really fucking dark (most of them involved the Holocaust). I'm not going to perpetuate Judeophobia by repeating them here, but chances are you know most of them because I was a kid a very long time ago. Later in life, when I decided that comedy would be a core principle of my existence, I thought about why we did that. Was it a sense of self-hatred? A pushing back against the cruel, unfeeling God who decided to make us Jewish? No, none of that. We did it, at least unconsciously, to armor ourselves against the real cruelty, which is Other People who would inevitably make such jokes as outsiders. Mostly, though, we did it because it was funny. Now, anyone who's not Jewish tells those jokes, I'll zap 'em with my space laser. But we give a pass to those of our in-group telling jokes to others of the in-group. In other words, the joke itself can be affiliative or adversarial; it all depends on context. A lot of what we call good-natured and wholesome humor is intended to be affiliative. These are uncontroversial jokes that aim to bring people together and get everyone chuckling or smiling along. “Why couldn’t the skeleton go to the ball?” I ask, and you smile, nod, and say, “Yeah, I know that one.” Okay, fine, I'll admit it. I didn't know that one. "Because she had no body to go with." Dude, that's a pun, and we've already established that puns are only funny to the punster. Of course, the epitome of the affiliative joke is the Christmas cracker joke. It’s been much commented on, but the fact that cracker jokes are so awful is not a bug but a feature. My fellow Americans might not be familiar with Christmas crackers. They're a British thing. Think, like, fortune cookies? 
Only inedible ("cracker" in this sense is like "popper") and containing puns instead of questionable wisdom. Again with the puns. Like I said: not really jokes. For an affiliative joke to work, it has to be almost universally inoffensive. No one around the table can be offended or object to the joke. Rudolph is not there to cry about his shining nose, and Jim Skelton doesn’t storm off in a huff because you mocked his appearance. Note the qualifier there: "almost." The joke doesn't exist that won't offend someone. Like with me and my friend up there I told you about: if we told those jokes to our parents, we'd have been grounded until we were 18. Maybe 25. They'd be offended. The problem is that to be so universally inoffensive, an affiliative joke usually ends up being overworn and anodyne. On that point, I can agree. Of course, not all affiliation has to be universal. Sometimes an affiliative joke aims to affiliate only a certain portion of the room, group, or world... This is a kind of tribal affiliation that depends on an adversary. Much has been made in recent years about the difference between "punching up" and "punching down" in comedy. It's okay to make jokes about the group in power. It's not usually okay to make jokes, especially mean-spirited ones, about those without social power. Apparently, the author here did a whole column on it. There's even a link in the article. I didn't follow it. “I don’t find laughing at someone else’s expense that funny, personally,” Brett says. “Like, that’s a personal thing. But I do think there’s a line. Satire, for example — poking fun at politicians or someone in power when things get ridiculous — can help people see how silly a situation is. There’s a difference between that and just making fun of someone because of who they are. There’s a fine line, and I try to stay on the side that doesn’t hurt.” We all have our own lines we won't cross. Mine's a picket line. 
Seriously, though, that's really just a subset of what I just said about punching. Of course, not all jokes can be so easily divided into these categories. Because often, jokes come from and are about something far harder to identify. “Jokes and dreams,” the Freudian says. Jokes and dreams are two of the best ways to reveal someone’s unconscious life. As pretty much everything Freud hypothesized turned out to be less than true, I'm not sure I can believe this. But I do agree that the categorization has many exceptions, which makes me wonder how useful the categorization really is. Still. I've been known to make puns in my dreams. Sometimes, I even remember them after I wake up. Whether affiliative or adversarial, jokes tell us how we relate to others — and to ourselves. The affiliative joke wants to belong; the adversarial joke wants to set apart. While I'm on board with this, at least provisionally, I should note that this isn't limited to jokes. All kinds of writing or spoken communication reveal something about the communicator. I mean... that's a big part of what communication is for. In the end, I think it boils down to the basic advice any comedian receives: "read the room." Comedy is a two-way street, and sometimes, there's an accident or construction and traffic gets backed up to the intersection with Philosophy Avenue, or the other one at Stretched Metaphor Boulevard. |
| Kids these days. No respect. And what is this source? VegOut? Get the hell out of here with that hippie crap. You know you're peak Boomer when these 6 modern conveniences genuinely confuse (and annoy) you. Nothing says “peak Boomer” like getting mad at a QR code. Generation wars exist to take our attention away from the true villains. But okay, this one time, I'll bite. (Note: I am not, by any definition, a Boomer. But, you know. Meh. Whatever.) My dad can rebuild a carburetor blindfolded. They still make carburetors? I'm surprised this brat even knows what one is. But ask him to scan a QR code at a restaurant and he looks like I've handed him a Rubik's cube. Oh, I'm perfectly capable of scanning a QR code at a restaurant. I just hate it. I put up with it some during the pandemic, but now? Now, it's just an excuse for restaurants to introduce dynamic pricing, and while I may not be a Boomer, I get tired of squinting at the damn phone. Physical menus let you see everything at once, flip back and forth, point to items. He's not being difficult; he's mourning an interface that made sense. Solution: don't go to restaurants that still do this bullshit. 2. Self-checkout machines that assume you're a thief. Self-checkout machines operate on suspicion, requiring constant validation. For a generation that values trust and efficiency, this feels insulting. I don't care about that shit. What I care about are a) loss of jobs; b) if I gotta do your job, where's my employee discount; and c) I have to call an attendant over anyway to check my ID for the beer, so it's easier to just go through a staffed line. 3. Apps for things that used to take one phone call. He needed to schedule a doctor's appointment. The receptionist told him to download their app. Twenty minutes later, he'd created three accounts, forgotten two passwords, and was ready to throw his phone. That's bad app design, not old age. 4. 
Subscription services for things you should own. This goes deeper than technology frustration. Boomers grew up in an economy where you bought things and they were yours. Subscription-everything feels like paying forever for something you should possess. This, now. This, I'm 100% on board with. Companies love that shit because of revenue streams and shareholder value and whatnot. I hate it. I hate it with a passion only exceeded by my hatred for commercials. I can see subscribing to streaming services, but I'm absolutely not okay with having to rent everything. 5. Contactless payment when cash works fine. He stands at checkout holding exact change while the cashier points to a card reader. "Just tap it." He doesn't want to tap it. On the other extreme, I embraced card payments long ago. And while I do wish they'd make it more standardized where to tap (it's not always where the little tappy logo is situated), I'm perfectly okay with it. Carrying money would mean I have to lie to beggars when they ask me for spare change, and I don't like lying to people. Well, most of the time. Maybe he's onto something. Maybe we've gotten so fluent in these systems that we've stopped asking whether they're actually better or just newer. I'll give the author points for seeming to understand what's behind the generational difference, rather than just the old "okay, Boomer" bullshit. A little more of that understanding might ease these made-up generational battles, so They'll have to come up with something new to distract us. |
| Here's something fairly recent from PopSci, though its subject is a century old: The radioactive ‘miracle water’ that killed its believers. In the 1920s, Radithor promised to cure everything from wrinkles to leukemia, but its unintended results were deadly. I mean, technically, it cured wrinkles and leukemia. Along with that pesky condition called "living." William Bailey promised to cure anything that ailed you. “Just a tiny bottle of apparently lifeless, colorless, and tasteless water” was, he advertised in a 1929 pamphlet for his product, Radithor, “the greatest therapeutic force known to mankind.” Oh, sure, we're horrified (or amused, or both at the same time) now, but think about all the panaceae being pushed and promoted by hucksters now. While they might not be as deadly as radioactive water, I promise you, in 100 years, if any humans are still around then, they'll be looking back at us and going, "Horse paste? Seriously?" Hell, some of us are already doing that. The life-threatening (heart disease, leukemia), embarrassing (impotence, flatulence), and annoying (poison ivy, wrinkles) could all be remedied with Radithor’s main ingredient, “internal sunshine”—that is, highly radioactive radium isotopes. This is how pseudoscience works in its unholy union with marketing. It was effective then. It's effective now. The classic example is "snake oil," but that's unfair; the original snake oil, at least, reportedly contained compounds with some genuine anti-inflammatory benefit. Radium “was mainstream, and it became mainstream because the radium industry wanted this to happen,” explains the historian of science and technology. “Science and commerce are so intertwined that we cannot really separate them,” she says. It is a phenomenon that, like the radioactive elements of Radithor, remains dangerous today, if not handled with care. People want to believe in miracle cures, and desperate people will try anything. Remember Laetrile? They do not. 
The story of Radithor’s best-known victim has also endured: In 1927, Eben Byers, a wealthy and well-known Pittsburgh businessman, broke his arm and a physician recommended Radithor. It does not help that not all doctors are equally competent. They are, of course, also human. Except for the AI ones. You can't trust those, either. Eben Byers hadn’t been fooled into consuming radium; every bottle of Radithor proudly announced itself as “CERTIFIED Radioactive Water.” Instead Byers had been caught in the intersection where fledgling scientific understanding met an untapped commercial market. Want to see the latest incarnation of that? I'm not saying that it's on the same level of hazard as radium water, but it's still using poorly understood science to play on people's fears and get money for it. Marie and Pierre Curie discovered radium in 1898 and spent the first decade of the new century refining the method of isolating the element from ore. No, I don't blame the Curies. While acknowledging that radium therapies were not yet fully accepted by the scientific community, the magazine noted “there is a growing belief that radium emanation does have a definite place in medical treatment.” And of course, belief means it's true. Right? Yes, eventually, radioactivity did find a place in medical treatment (the article mentions this at the very end). But that was after years, even decades, of studies and development and scientific understanding. Not to mention how radiation quickly went from miracle cure to movie villain, what with, you know, atomic bombs and whatnot. The takeaway here should not be "you can't trust science." It should be a healthy skepticism, even outright rejection, of hucksters using pseudoscience, fear, and hope to discredit actual science and make money from people's ignorance. |
| Finally! Someone gets it! From Nautilus: How Scavenging Made Us Human Our early ancestors were more like vultures than we might like to think You'll have to follow the link to see the lovely illustration of a vulture. Vultures, hyenas, and other scavengers tend to have less than stellar reputations. Yeah, well, so do certain politic- oh. Right. If you see vultures circling, you can probably assume that some creature is nearing its end or has just departed. Ugh. Wrong. And they were doing so well, too. Okay, technically, "you can probably assume" that, but you'd be wrong, too. And they’re freeloaders: They don’t work for their lunch as much as the hunters of the animal kingdom do, they just steal the spoils. That's not right, either. I mean, it is, in a way, but it's not like your moocher cousin who comes over and eats the food you were planning on munching on yourself. They clean up what the predators shun, and in doing so, they perform an important service. It's like the sanitation workers who clean fatbergs out of sewers: no one wants to be them, but someone has to do it, or things get worse for all of us. It might come as a surprise, then, to learn that early humans may have relied heavily on scavenging, even after they had the tools to hunt. It shouldn't be all that surprising. We don't have the strength or speed of some other predators, so it's reasonable to think our ancestors lived off leavings. What's not reasonable is to assume they did without evidence, and this article claims evidence. Earlier scholars thought scavenging was too unpredictable, and already-dead animals too scarce, for it to be a frequent approach to finding food for ancient humans. And the risks—of attack from a lingering predator or of catching a disease from the rotting meat—would have been too great. This is how science works. 
The scientists also suggest that humans are, in fact, well adapted to scavenge: They have defenses that could protect against disease from carrion, such as a particularly acidic stomach to help kill off potential pathogens. And when humans learned to use fire to cook, that would have added another layer of protection. Next question: was fire harnessed for that reason, or was it adapted to that activity? The new work suggests that scavenging persisted among humans long after hunting emerged. And whaddaya know? We still eat dead things. What? Eating live things is better? |
| From aeon, an article that might put you to sleep. If it doesn't, my comments probably will. What sleep is It is our biggest blind spot, a bizarre experience that befalls us every day, and can’t be explained by our need for rest After decades of research, there is still no clearly articulated scientific consensus on what sleep is or why it exists. This is a good thing. It leaves room for future discoveries. Also, when we finally do figure out sleep, someone else will figure out a way to project ads into dreams, and I don't want to be alive for that. Discussed exclusively in utilitarian terms, we are force-fed the idea that sleep exists solely for our immediate benefit. Is this really all we ever want to know about a third of our existence? Sleep is perhaps the biggest blind spot, or the longest blind stretch, if you will, of our life. Oh, I don't know about that. I've had articles in here before about consciousness, and there's no scientific consensus about that, either. I feel like if we had a handle on one, we'd have some clues about the other, but what do I know? However, in my opinion, to say sleep is important is to miss the point entirely. Sleep is the single most bizarre experience that happens to all of us, against our will, every day. For various definitions of "against our will." As far as I'm concerned, consciousness and wakefulness happen against our will. The predominant view is that sleep provides some sort of restoration for the brain or the body: what goes awry – out of balance – in waking is almost magically recalibrated by sleep. It is true that I sometimes refer to getting some sleep as "rebooting." Like a snake eating its own tail, waking and sleep consume each other in an endless cycle, without beginning or end. There is no mercy, and lack of sleep can be paid back only by sleep. The image of burning a candle at both ends endures. I used to doodle a candle burning on both ends and also in the middle. 
You know, back in college, when I had to pull all-nighters just to stay on the treadmill. We live on a half-asleep planet where, speaking of our species alone, at any given time close to 2 billion people may be asleep. Can't resist the urge to point out the discrepancy here: 2 billion isn't even close to half of us. I do understand poetic license, however. Not surprisingly, echoing Aristotle’s view of it as ‘a privation of waking’, sleep is typically defined by what it is not, rather than what it is. It is not moving, not acting, not responding, being disconnected from the world, doing nothing, at least from the observer’s perspective. Which is one reason, I think, why people sleep-shame: it's a holdover from Calvinism, on which I have another article waiting in the green room. The 19th-century Scottish physician and philosopher Robert MacNish thought of sleep as ‘the intermediate state between wakefulness and death: wakefulness being regarded as the active state of all animal and intellectual functions, and death as that of their total suspension.’ You want my philosophical, entirely non-scientific, description of what sleep is for? No? Well, you're getting it anyway: sleep exists to prepare us for our inevitable death. The article swings close to this admittedly morbid proclamation, but never quite touches it. According to one theory, it is in the organism’s best interest to remain ignorant, and sleep exists to prevent us from acquiring unnecessary knowledge. I think it should be obvious by now that I can't agree with that "theory," not without a whole lot more evidence. Because there is no such thing as unnecessary knowledge. What is going on inside the sleeping brain during our typical night, and what can we learn from studying it to explain changes in our responsiveness to the world outside, or an occurrence of mixed states such as somnambulism, sleep paralysis, lucid dreaming or out-of-body experiences? 
The article lumps those "mixed states" under parasomnia, or sleep disorders. But how can we know these are disorders, if we don't fully understand the "order"? The belief in the superiority of wakefulness over sleep, combined with our inability to suppress our primitive, primordial need, breeds only resentment toward sleep. Sleep has become a problem we have to deal with. Having failed to find a quick cure, losing the war on sleep, there is a growing incentive among entrepreneurs and scientists alike to find a creative solution to that. On the other hand, sleep is becoming a commodity, a beauty product, a medicine, something that can be packaged and sold for our benefit. I kind of alluded to this up there: if we ever do figure out sleep, from a scientific perspective, it will become even more a victim of capitalism. Want a nightmare scenario for science fiction? Someone figures out how to get by on an hour's sleep, or to do away with it entirely. Imagine the productivity gains! There's a lot more at the link, of course (including a brief divergence into evo-psych guesswork, which lowers my estimation of the article). It is, as I said, enough to put one to sleep. Or maybe wake one up to a different perspective. |
| A language lesson (or not) from Mental Floss: America’s 10 Most Commonly Misunderstood Slang Terms A dirty bird in Kentucky is a good thing, actually. First, researchers used data from two sites, OnlyInYourState and EnjoyTravel.com, to create a list of state-specific terms. They then asked 1028 U.S. residents to guess what they thought each one meant. The 10 terms that were wrongly defined most frequently are listed below (along with some entertaining honorable mentions). I'd treat this as "for entertainment purposes only." Tavern // South Dakota In South Dakota, a tavern isn’t always—as most survey participants assumed—a bar. Sometimes, it’s a ground-beef sandwich similar to a sloppy joe. I was unaware of this name for that kind of sandwich. In my defense, I've never spent more time in South Dakota than I had to. Carry // Mississippi The common assumption was that carry in Mississippi meant “to have a gun on your person.” And it does mean that—but it can also mean “to drive (someone),” in the same way you might say, “I have to take my mom to the airport.” And then shoot her. Gnarly // California Gnarly is such a classic bit of ’80s slang that you can’t fault respondents for assuming it’s a synonym for cool. Funny thing about words: they mean what we want them to mean. Borrow pit // Montana A borrow pit is a pit formed when material is excavated (i.e. borrowed) from it and relocated somewhere else. Montana? We call it that in Virginia. There are a few more there at the site. Some are mildly amusing. Nothing earth-shakingly important; just a bit of fun. |