ZINE

Matt Klein

Overlooked cultural trends explained. Webby-winning intel on our media and social shifts to understand tomorrow. zine.kleinkleinklein.com

Episodes

  1. 11/19/2024

    Interview with Douglas Rushkoff: Program or Be Programmed

Interview Intro — MK: We’re here to celebrate the 15th anniversary of Program or Be Programmed, arguably one of the most foundational books on digital literacy. Program or Be Programmed originally offered 10 commands to intentionally navigate the biases of our digital technologies. Now a decade and a half since its original release, Program or Be Programmed — for better or worse — is more important than it has ever been… especially in the context of AI… the newest and 11th command in the re-released version... which was published earlier this week. In prepping for this chat, I came across an online review that I think is perhaps the best description of the value and intent of this work. Now mind you, this is a random review from Nathan G. on Amazon nearly 15 years ago... And I think Nathan nails it. Thank you, Nathan G. “What makes this book worthy of the Neil Postman Award that it won (I just learned that such an award exists) is its refusal to let any digital technology become transparent. From the first Arpanet connections to email to the ubiquitous vibrating phones (and "phantom phone buzz syndrome"), Rushkoff keeps his sharp eye on the assumptions that one has to make before the technology makes any sense: that one should adjust one's personal biological rhythms to the atemporal "always on" existence of computer networks rather than vice versa; that the world should conform its complexity to the reductionism of binary choices; and that human beings are meant to exist as infinitesimal nodes in a vast global network, just to name three. Spelling out those assumptions, Rushkoff does not so much give ten commands as ask ten penetrating questions, questions that ought to haunt human beings as we jump on board the Internet train.” Now 15 years later, we are certainly haunted by these questions and the Internet train feels hijacked, barreling forward and completely off its tracks. Program or Be Programmed is a time machine to a moment of opportunity 15 years ago.
At the time, a chance to pre-read the rules of engagement in order to mindfully approach our tech. Re-reading it in 2024, in addition to its new text, elicited, for me, a total spectrum of emotion. On one side, sweet relief that the biases are all laid out for us. There’s no mystery. We know what our tech wants from us. And on the other side of the spectrum, utter frustration, as if we totally blew it. Yet now, excitingly and optimistically, we’re presented with another opportunity. Another chance to reapply these commands amidst the emergence of AI. I’m eager to not just continue reflecting upon these commands, but further, inquire why we didn’t embrace tech’s anti-human biases, and ultimately what we can do better this time around. It’s not too late. With that, I’d like to welcome to the Zoom stage window, author of Program or Be Programmed: 11 Commands for the AI Future out this week, Douglas Rushkoff. A Snippet — MK: This past summer, you were caught in the spin cycle of the podcast Substack post content whirlwind. I find myself too often programmed, getting demanded for more and more and more, whether that be other people or whether that be myself. How did you turn off that spin cycle? DR: For me, it took a willingness to risk everything. So, in the good old days, I know this sounds freaking insane, a journalist could get a contract with a magazine for like a year or two. And this is what I would do. They say, Rushkoff, we want you to write a thousand words or a 1,500 word piece about technology and society once a month and we'll give you like $3,000, like $2 a word to do this thing. And you would have that and you might have a publisher or I got a gig doing commentaries for NPR, and between three or four things, there's enough money to live your life through these sort of entities. And I was lucky to get under the wire so I could depend on a lot of these entities, but the entities are pretty much all gone. 
And as a journalist, even a super respectable place like The Atlantic, you can write a thousand word piece for $200 and it's like, oh man, how am I going to live? So I, like many migrated to the self-funded beg for your supper, you know platforms: Patreon for the podcast and Substack. I moved from Medium where I used to get paid to Substack to make my own little thing. And I can feel, and we don't need to go into the cues of it, but between the business models and the way ads work and the way subscriptions work, I can feel Substack wanting two pieces a week. And I can feel Patreon wanting at least a podcast and some other content drop a week, and this many cross promotions with other podcasters, and I got to then do three Instagram posts and a LinkedIn thing. And I could feel the implied pace of the machine impacting my pace of living. And I realized that I mean, I can, but I, at my best, I'm not writing two Substack pieces a week. I'm not going to impose myself on people two times a week. They don't need to hear from me two times a week. They really don't. Honestly, I believe I say deep things. I think about them a long time. I craft the things that I write so they work on two or three levels — like Program or Be Programmed. It's working on three levels at once. It's not that it's dense, but it's multidimensional stuff. And honestly, it takes a while to unpack. If it's good writing, it should take a while to unpack. So I had to say, I'm not going to use these platforms the way they're asking me to use them. I'm going to write one or two Substacks a month. I'm going to go back to doing Team Human, my podcast, really every other week, the way I used to. I was doing it every week and my producer said, you know, we're getting a lot of email from people saying it takes a week or two to really digest everything that happened in that episode. Can you slow down? 
Can you slow down because they don't want, “Oh, no, there's a whole other one.” So I'm doing that and I wrote a nice note to everybody saying: Look I'm gonna be doing less content. Drop out if you want. It's really okay. We'll make this work. It's all good. And I slowed it down and ended up back in that beautiful exploratory place. I meet about one person a month who I really, really have to do a Team Human conversation with, and I have about one or two big independent ideas that I want to share in a monologue once a month. And when I go down to that pace, life naturally supports it. I'm never searching for the idea or who can I interview next? It just happens in the natural unfolding. And then you get this much more organic and supported rhythm. The easiest thing to get to conform is the technology. The hardest thing to get to conform is your nervous system is the sun is the sky is your life, you know? So Program or Be Programmed really means understand the technology well enough to use it in a way that's compatible with you and your creative cycle, rather than trying to conform your humanity to it. This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit zine.kleinkleinklein.com/subscribe

    1h 3m
  2. 09/30/2024

    Unplugging Is Not The Solution You Want

The crisis of endless troughs of content, outrage-bait, attention incentives, intermittent reward slot machine dynamics, comparative quantified self-worth, static profiles demanding fixed personal brands, Zoom-fatigue, and our weakening muscle of discerning fact from fiction: it’s all just. too. much. for our brains to manage. Statistics about our declining mental health amid rising screen time aren’t even needed to prove tech’s effects. Pause and take a breath. You can feel it. That’s enough. For remedies, we’ve been prescribed to delete our apps, curb notifications, turn our phones greyscale, invest in flip phones, and look away from screens two hours before bed. Oh, and especially don’t doom-scroll or check work email (same thing) as soon as you wake up. Whoops. Meanwhile, the recommendations from the other extreme advise us to become more productive by upgrading our tool stacks — as if we can transform into machines, ourselves, dominating our problems with “hacks.” Using more tech to defeat the pitfalls of tech? Nice try. How many times have you plowed through your timed app limits in the last week? Over the years, countless books and campaigns such as The Social Dilemma raised awareness and proposed the aforementioned solutions. But I ask: Has anything meaningfully changed over the last decade? No. Really. What’s changed? We now have TikTok. Screen time climbs higher. Our meditation apps interrupt us with push notifications urging us to return to their feed of more video content. A three-year-old is now impressively proficient with an iPad — good luck snatching that thing out of their clutches. Sludge content, unafraid, pleases our needs for stimulation. Hell, “brain-rot” is a phrase we don’t even question. Meanwhile, techie public intellectuals pivoted to “AI safety” and now promote their podcasts and newsletters on the very platforms they were catastrophizing just a few years earlier.
With the emergence of AI, we face an onslaught of new issues (consent, copyright, fictitious content, etc.) without ever adequately rectifying those which came before. Most dramatic yet least questioned, AirPods and wearables continue to break sales records... Our tech is increasingly on and inside of our bodies. So much for “getting away.” Via digital trackers, we willingly give away our heartbeat, body temperature, and DNA to companies without an iota of control. And we’re the ones paying them. Whoops. We’re heading in the opposite direction of what we’ve been proposing. We are no better off than when we coined “the tech-lash” a decade ago. Our interventions have failed. Or did we even try? With such a lack of any meaningful progress, perhaps we require a wholly different approach if we’re to experience a healthier, happier and more productive life with our devices, screens and apps. Perhaps we must mature our discourse beyond fighting against these issues, to accepting them. It’s a strange word choice and seemingly antithetical to what we want, but bear with me... Instead of unplugging, how may we become more mindful in embracing and strengthening our ability to live with these consequences? Only then can we find better mastery with and over them. You are not weak. The Ignorance of Unplugging In Digital Minimalism, one of the most popular books on developing a better relationship with modern tech, author Cal Newport writes, “People don’t succumb to screens because they’re lazy, but instead because billions of dollars have been invested to make this outcome inevitable.” To overcome this malicious investment against us, Cal urges readers to digital detox. Simply spend a month away from your screens to discover new hobbies and take up journaling. Let me be clear: I could not agree more with this advice. We all need to touch more grass (or mosh), kindle both passions and meaning, establish new rituals, and engage in a habit of reflection. Absolutely. Yes. But!
I for one refuse to accept these (“detoxing”) tactics as comprehensive or sustainable in our current moment and, further, the moments which come next. Unplugging is a short-term, unsustainable, selfish and frankly, privileged approach to the downsides of our everyday technological struggles. While billions of “bad” dollars are spent hooking, billions of “good” dollars are spent onboarding more and more and more people around our planet online. Countries, states, townships, schools, hospitals and non-profits are all ensuring that every facet of society is accessible via a smartphone. This includes countries which have historically been left behind — connection begets prosperity whether we like it or not. As a result, employment, banking, housing, education, dating, healthcare, you name it — the foundations of any functioning society are now only accessible online. It takes a certain privilege to opt out. I’m not alone in this controversial perspective. Author and consultant Venkat Rao calls such harsh disconnection “Waldenponding”: “I'd like to try living off the grid for a bit in a log cabin, myself, for a summer or something. I'm also on board with trying and adopting experimental rituals like a no-devices sabbath day... But as an attitudinal foundation for relating to society and technology, Waldenponding is, I am convinced, a terrible philosophy at both a personal and collective level. It's a world-and-life negation. A kind of selfish free-riding, tragedy of the commons: not learning to handle your share of the increased attention-management load...” [...] “The way to manage your attention is not to ‘unplug’ or do some sort of b******t Classical Liberal virtue signaling crap of ‘I only read Ancient Greek authors’ but to be sensitive to your current mind size and consciously target the zone you want to be in...” What is it (Exactly) You’re Fighting Against?
New organizations are popping up to fight this “media addiction.” This is the equivalent of picking a fight with air. I suppose these orgs don’t have an issue with cave paintings, which are just another form of social media. Books? Ok. eBooks? Ok. Book reviews? Ok. A book review on a social platform? Someone’s comment on a book review? A number which represents the number of people who agree with that book review? Where exactly would you like to draw the line between kosher and caustic? Concern and attention are noble, but they — and we — require specificity and nuance in such conversations. Otherwise, we get more noise without change. This is where we are now. What exactly makes cave paintings, art galleries, newspapers or bulletin boards acceptable forms of social media and how do we distinguish good vs. bad, and healthy vs. unhealthy? Is it algorithmic rankings geared toward engagement, no age minimums, endless feeds, abundance of metrics, growth-for-growth’s-sake business models, or something else? Pick those, not “media.” We’ve got a lot to improve. But righteously banning, outlawing or withholding social tech and media means stripping away the support for millions dealing with mental health, alcoholism, religion, chronic illness, cancer, divorce, sexuality, or child raising. This isn’t an all or nothing deal. Unplugging is a false binary. That’s the computer thinking spreading amongst us like a virus. The very limiting 0’s and 1’s. Black and white. Disconnect and thrive or stay plugged in and rot. Why is it all or nothing? We are not machines — we’re better at thinking than that and than “them.” Also, not for nothing, have you actually tried leaving the house without a phone recently? How is that supposed to make us less anxious? What’s Left Behind For Others A frequent proposed resolution to this conundrum is: “Adopt self-restraint and create better habits.” Love it. How may we begin to treat the Internet as a tool, like a fishing rod?
We only fish when we’re “hungry.” The prickly flaw in this thinking is that unplugging does not reverse any psychological damage already inflicted upon us, nor does it eliminate the opportunities for others to also fall into the trap while you’re unplugged. It’s all still there awaiting everyone else. Further, unplugging — or the failure to do so — now installs shame. The more we preach “better habits,” and the less we adopt them, the worse off we feel. Starting the day checking email in bed somehow feels even worse when the world is telling us we’re losers for doing so. Judgement compounds our predicament. When our “tech issues” are no longer contained to a device and spill out into our streets affecting us physically, our solutions must not just revolve around the technology itself (i.e. removing it), but address the human condition first and foremost. Human intervention, not technological elimination. As Chris Dancy, the author of Don’t Unplug: How Technology Saved My Life and Can Save Yours Too puts it, “We don’t need a virtual reality, we have a reality. We don’t need innovation, we need compassion. We don’t need to fix the problems of the world with technology, we need to fix technology with our human spirit.” (Please) Don’t Forfeit Your Agency Now at this point, all of these arguments may be coming off as tech-defensive, algorithmic-apologetic, or straight up nihilistic — i.e. Just give up. Accept this dystopian state. However, this is not an admission of defeat, but a realistic approach for empowerment. What if unplugging is the running away? I struggle to accept the fact that our minds are too weak. That’s defeat. How may we learn to live with? How may we learn to live in light of? And how may we learn to live despite? This holistic approach honors our capability, not diminishes it. We can both adopt schedules to unplug and develop the strength for healthy online engagements in this environment. We can work towards limiting screen time and accepting our new di

    19 min
  3. 07/16/2024

    Interview with Rory Sutherland: Emotional Efficiency & Reframes

Interview Transcript — MK: You’re listening to an audio edition of ZINE, the Webby-Award-Winning publication making sense of our current cultural moment, relationship with tech and one another, and what may come next. My name is Matt Klein and I am a digital anthropologist, cultural theorist, strategist and writer, researching overlooked social shifts. I’m also currently the Head of Global Foresight at Reddit. If we’re to author our preferred futures, we first have to be proficient in our zeitgeist. In other words, we can’t write culture if we first don’t know how to read it. And today’s chat is an attempt at exactly that. Celebrated as “one of the leading minds in the world of branding” by NPR and "the don of modern advertising" by The Times, Rory Sutherland is the Vice Chairman of Ogilvy U.K. He’s also the founder of their behavioral science practice. Rory writes the Spectator's 'Wiki Man' column and presents series for BBC Radio 4. His TED talks about reframing perspective and re-prioritizing details have racked up millions and millions and millions of views. He’s also the author of Alchemy: The Dark Art and Curious Science of Creating Magic in Brands, Business, and Life, which was published in 2019, and is an absolute must read. Rory decodes human behavior and blends scientific research, absurdly entertaining storytelling, and deep psychological insight, which makes him, in my eyes, one of the most important and influential thinkers of our time. In college, I first came across Rory’s talk, “Life lessons from an ad man” where he makes it clear that advertising simply adds value to a product by changing our perception, not the product itself. Yet such reframes can be applied to all elements of our life. He let me see that marketing isn’t simply about slinging crap that people don’t need, but rather is a practice in helping solutions be adopted by those who need them most.
Rory discusses how the Eurostar could have spent its budget not trying to increase the speed and decrease the time of its trains, but instead could have spent only a fraction of its budget on models and alcohol, and passengers would request the train ride to be longer, not shorter. These are examples of simply reframing existing problems and solutions, recognizing innate value. In my eyes, strategies to dial up humanity and empathy, and resist the urge to reinvent wheels and spend unnecessarily. And it’s these stories that made me want to start working in communications and strategy. I’ve been following Rory ever since, and find that his best, most insightful interviews are the ones where he just goes off. I’m excited that this was one of those experiences. As a significant personal influence, here is my chat with Rory Sutherland. MK: I am endlessly fascinated by making sense of culture. More specifically, what's overlooked? What are people not paying enough attention to? And I cannot think of a better person to help answer those questions than yourself. I have a laundry list of questions, but maybe we'll, we'll start simple. What's on your mind? What are you thinking about? What's exciting you? What's worrying you? What are you thinking about in culture right now? RS: I think that question, by the way, is the right question to ask, which is — what we're talking about quite often is the product of a kind of media feedback loop where effectively every news publication and to some extent social media, but actually I think social media is less guilty in some ways than the mainstream media is — effectively decides what's important based on what other people are reporting. And it's been a facet of mainstream media for ages where there's this kind of effective echo chamber where people in the newspapers watch 24-hour TV news, and people on 24-hour TV news read the newspapers to decide what's actually worth talking about.
And I think it leads to this complete imbalance where certain things get discussed far too much or ignored for far too long, and other things get ludicrously over-publicized. And this leads to the problem, because I was just writing about this actually — Daniel Kahneman's observation, he died last month, great man, and his, one of his most famous sort of dictums was: “Nothing is as important as you think it is while you are thinking about it.” In other words, what determines our sense of priority and our sense of importance is often our attention. And our attention, although not beyond our control, is often quite arbitrary. And as a consequence, I think, what happens is because things garner people's attention, they then deem those things to be important. They talk about them more, which then spreads the degree of attention they get elsewhere. And, you know, I'm not necessarily, well, I am actually supporting those people who, you know, there are various scientists who said, look, we give far too little attention to the risk of a meteorite strike or a really major earthquake. There are certain potential catastrophes against which possibility we spend almost nothing, not because they're unimportant, they're monumentally important, indeed they could be cataclysmic sort of extinction events, but because they're unlikely. And I think those people had a point, which is that, look, at the very least, you know, the odds of a meteorite strike are relatively low, we don't seem to have had a real, really biggie since the dinosaurs got eliminated, unless you include, what is it, the Tunguska event in northern Siberia. But nonetheless, the consequences of such an event would be so immense that we should at least give it, you know, what you might call a few million dollars a year of scientific attention. And the same thing, by the way, happens in business where certain trends become absurdly over-dominant. 
I would argue that offshoring and globalization and actually, also, the replacement of human beings with technology have become almost just self-replicating activities where you know, you don't even have to — nobody even makes the contrary case — they've just been assumed to be true. You manufacture something in a low cost economy. You optimize around a single supply chain with whichever supplier is cheapest, regardless of the risk that might pose to resilience. And there were certain business behaviors, which achieve that kind of weird escape velocity, where if you suggest doing them, no one — investing enormously in AI — no one even makes the contrary case. It's become so fashionable that effectively everybody just nods along with you for fear of looking stupid. I don't know if you know Chris Williamson well, you might have met him in Austin for all I know, but he talks a lot about the Abilene Paradox or the Abilene Effect, where people don't really believe things, they just pretend to believe them in order to fit in and go along with what they think is prevailing opinion. At moments of sublime absurdity with the Abilene Paradox, you literally get a group of people agreeing to do something, which no one individual in the group wants to do at all, simply because they've misread the room and they assume that everybody else is in favor of what's being proposed and that to raise a doubt would make them look awkward and, you know, feel silly and therefore people stay quiet. One of the things I think we don't talk about nearly enough is the importance of — particularly post-COVID — the importance of the widespread adoption of video conferencing, indeed of technologies such that we are using now, because I think it's actually very, very important and should have major economic effects. And I also think many, many businesses and many organizations have the opportunity to reinvent themselves around this technology.
Everything from remote medicine to remote psychology to remote financial advice. And I think the technology is much, much more important than people give it credit for. The reason it doesn't get much attention is because the technology itself is old. I mean, you know, you had video conferencing on Star Trek. Okay. You had technology like that on Star Trek. You then had Skype in the late nineties. There's nothing remotely new about it as a technological possibility. What is new about it is the widespread adoption and normalization of it. And many, many technologies only really deliver the goods 20 to 30 years after they're invented. That was true of electrification of manufacturing. True of, actually, the internal combustion engine. You might argue the electric car — the electric car, as distinct from the electric motor, which was being pioneered in the Edwardian era, you know, has only finally come into its own with the invention of better batteries more than a hundred years later. The fact that a technology is not new does not make it unimportant, but it does make it much less talk-able, simply because you feel stupid talking about something that's been kicking around for 50 years. And there used to be a comedy sketch in the U.K. called The Fast Show. And there was a character there who was — it was a kind of character sketch of, “Isn't X brilliant?” And he'd go around going, “Electricity, brilliant!” Right? And he'd get excited about things that it was, somehow socially ridiculous to be excited about. And you know, if I go, “Isn't video conferencing fantastic?” I look like a bit of a t**t. By the way, that's the British use of the word, which is not very offensive. I will make that point to American listeners — in Britain the T-word is a slightly strengthened word of the word ‘Twit.' It's not particularly rude. But you feel a bit of an idiot if you get excited about a 20 year old technology. It's rather like getting excited about running water. 
Now, the fact that running water isn't new doesn't mean it's not important. I had a kind of early epiphany, bizarrely, with video conferencing before the pandemic. Actually, in the summer, in the

    58 min
  4. 09/12/2023

    Betty Crocker's Egg is a Myth. Embrace Unknowing.

Originally published via Future Commerce Have you heard of the infamous “Betty Crocker Egg” story? It goes: During the 1950s, sales of instant cake mixes were struggling. A worried General Mills, owner of the Betty Crocker brand, brought in consumer psychologist Ernest Dichter (creator of the focus group) to conduct interviews with housewives. In his discussions, he learned that housewives felt guilt over the effortlessness of the instant cake mix; using the product was “too simple.” The process (or lack thereof) was self-indulgent “cheating” compared to the more rewarding process of baking from scratch. Therefore, the mix was a problematic buy. An insight and opportunity: “What if we left out the powdered eggs from the mix and allowed people to add fresh ones themselves, increasing participation, decreasing guilt, and ultimately increasing sales?” It worked. Once the new cake mix requiring fresh eggs was released, sales of the product began to soar — a win for both the baker and brand. This story reveals the seemingly irrational consumer mind and is a case study of the importance of in-person qualitative research. Only by looking beyond market data could we learn about “premium friction” or that the opposite of a good idea (e.g., more work, not less) may also be a good idea. For this reason, marketers, strategists and innovators alike love sharing it. The Betty Crocker tale supports “The IKEA Effect,” a cognitive bias coined by behavioral economist and author Dan Ariely. As proposed in his study, by putting together our furniture (rather than buying it pre-assembled), we create a unique, more personal relationship with it, increasing the perceived value of our creation. Like requiring fresh eggs, our participation changes perceived value. But here’s the problem: The “Egg Story” as we know it is b******t. Critical Omissions and Confirmation Bias Why is it b******t? It’s missing critical nuance.
There are five missing details which re-tellers leave out: First, Dichter’s findings include, but no one acknowledges, that fresh eggs produce superior cakes. Author and historian Laura Shapiro confirms this overlooked truth in Something from the Oven: Reinventing Dinner in 1950s America: "Chances are, if adding eggs persuaded some women to overcome their aversion to cake mixes, it was at least partly because fresh eggs made better cakes." The original dry egg mix produced cakes that stuck to the pan, burnt quickly, had a shorter shelf life, and tasted like eggs. We knew fresh eggs made for better cakes because... Second, a patent for fresh eggs in cake mixes was first filed in 1933, decades before Dichter discovered their “psychological importance.” The original patent reads: “The housewife and the purchasing public in general seem to prefer fresh eggs...” Companies had been debating dry vs. fresh eggs since the very inception of the cake mix product, not just when “sales were struggling.” (More on that in a second.) Paul Gerot, CEO of Pillsbury at the time, called the egg mix “the hottest controversy we had over the product” from the get-go. The story makes it seem like fresh eggs were this novel discovery. In reality, these companies had been debating them for years. Third, around this time, cake mixes were actually selling incredibly well; worry only set in when they stopped flying off the shelves. Between 1947 and 1953, sales of cake mixes doubled. The concern only arose during the late ‘50s, when there wasn’t a “decline,” but just a modest +5% growth — a “flattening,” if anything. Cake mix sales didn’t suddenly flatten because of a mass onset of guilt... especially after years of excitement and growth. There are endless explanations for a flattening, such as novelty wearing off, market saturation, product competition or evolving tastes. The story makes this seem like a brand problem when in fact it was a shared category problem.
Which brings us to... Fourth, when sales stalled for the category, General Mills and Pillsbury adopted two different strategies: General Mills required the fresh egg, while Pillsbury offered the complete dry egg mix. If the fresh egg were such a business-saving idea, we should have seen Betty Crocker wipe Pillsbury out of business. We didn’t. Pillsbury still thrives today. And fifth, while Dichter was onto something with baker participation, the egg shouldn’t be the star of this story; it should be the icing. According to historian Laura Shapiro, it wasn’t so much the fresh eggs that brought cake mixes back from their slowed growth as inspirational advertisements empowering homemakers to decorate their cakes with extravagant and personal flair. The introduction of frosting and elaborate decorations turned excitement away from the cake's inside and taste to the cake's exterior and splendor. This is how General Mills and Pillsbury brought cake mix sales back to life — through the broader cultural turn towards the mimetics of homemaker aesthetics. As Michael Park writes for Bon Appétit on the history of cake mixes: “It didn't hurt that slathering a cake-mix cake with sugary, buttery frosting helped mask the off-putting chemical undertones that still haunted every box. It worked. By the time the over-the-top cake-decorating fad was over, cake mixes had invaded the average American kitchen, and have been there ever since.” There we have it: the full story of Dichter and Betty Crocker’s egg. Alternative Truths Form Their Own Realities But with these untold truths now laid out, new lessons emerge. First and foremost, nuance isn’t fun and doesn’t make for biting hot-take lessons on social media. Details are inconveniences and potential contradictions when pithiness spreads. It reminds us to question our viral headlines: “What’s missing?” Something always is.
When an alternative “truth” like the almighty fresh egg eclipses the real truth (the complete story of egg patents, real sales figures, and icing), new realities form. Stories don’t have to be true to be effective – they just have to sound right, or confirm our existing biases. The world we perceive is manufactured from the stories we hear. Any narrative which prevails becomes “the truth,” whether or not it’s complete or factually correct. Perception is reality, and reality is only the stories available. But aren’t all of our stories made up? Isn’t everything just a social construct? In 2021, Dan Ariely (author of that “IKEA Effect” paper) was accused of manipulating data in a 2012 study after other researchers could not replicate its results and found the raw data suspiciously inconsistent with the published findings. The paper was later retracted. The he-said-they-said drama thickens. While Ariely claims he didn’t touch the raw data provided to him, in a statement to NPR, The Hartford (the originator of the consumer data) insists that someone altered the data after they gave it to him. The data differs, yet supposedly neither party altered it. Meanwhile, Harvard Business School professor Francesca Gino, a collaborator of Ariely’s, is currently on leave after being accused of falsifying her data... from the exact same 2012 study. The kicker: this paper is about “nudges” to prevent people from lying. Over the last decade, behavioral economics has become a thrilling topic for psychologists, marketers and anyone interested in the mind. Ariely puts it best in his book title — we’re Predictably Irrational. One study by Gino claims silently counting to 10 before deciding what to eat can increase the likelihood of choosing healthier food. It’s now common to believe that small “nudges” like requiring a fresh egg can influence our psyches at scale. 
Further, our behavioral idiosyncrasies can supposedly be distilled down and explained by simple cognitive biases. Or maybe not. What if we don’t know as much as we think we do about what’s happening inside our brains? Maybe we can’t actually explain stalling cake mix sales. And maybe signing the top of a piece of paper doesn’t actually prevent lying, as Ariely and Gino’s paper suggests. Maybe we just don’t know. And that’s okay. Our current moment with UFOs, or now UAPs (“unidentified anomalous phenomena”), is a great exercise. When a former Navy pilot testifies in front of Congress that the Pentagon is hiding evidence of alien spacecraft and knowledge of “non-human remains,” we’re left with an opportunity. The real truth here is that we’ll never ever get the full truth. We’re invited to admit, “Maybe we’ll just never know.” Wonder, awe, mystery and unknowing are beautiful traits that our AI competition will never experience. Bask in naivety. One of our fatal flaws is our adamancy that we know how everything works. How might we be happier or more productive if we were mindful of our hubris? On a warpath for not just truth, but an easy truth, we overlook other valuable lessons: the world cannot always make sense, personalized participation (icing) is more effective than conventional participation (eggs), and elusive focus groups may not reveal as much as some extensive desk research may. Yes, there’s a purpose for the half-truthed Betty Crocker tale, but there’s much to be learned in full-truthed tales. We should be open to the full truth as much as we’re open to admitting we just don’t know. And if there’s a pithy story to come from the Betty Crocker tale, it’s not about the power of participation. It’s that maybe your product is just crap. This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit zine.kleinkleinklein.com/subscribe

    13 min
  5. 07/12/2023

    Colleges Are Dying, Long Live Higher Education

To forecast our future, we have to identify patterns of change early. But rather than only seeking out collections of signals representing growth, it behooves us to simultaneously study what’s crumbling – signals of decay. After all, growth stems from deep fractures. One of today’s most glaring fractures worthy of our attention is higher education. The changing landscape of higher education is ground zero for radical social change and required innovation. Good news! TVs, toys and software have never been cheaper in human history. Bad news: college tuition and textbooks have never been more expensive. This is according to the Bureau of Labor Statistics, which has been tracking the prices of consumer goods and services relative to inflation for the last two decades. College tuition — second to healthcare — is the most “increasingly expensive” buy in America. How coincidental that these are two of the most important purchases one can make, and certainly the sort more people should have access to, not fewer. According to the National Center for Education Statistics, in the 1968 academic year, it cost $1,545 to attend a public, four-year institution (including tuition, fees, room and board). In 2020, it was $29,033. For the fifth of college students attending private schools, that figure is significantly higher. Noteworthy, as the cost of “manufacturing” education and textbooks has not risen at the same rate. Is it any more expensive to “produce” education today? This is perhaps why NYU, like many schools across the country, is developing “Schools for Professional Studies” — certificate program alternatives dedicated to furthering education during a moment when traditional degrees are slipping. According to a report from the National Student Clearinghouse Research Center, the number of students who earned undergraduate degrees fell by 1.6% in 2022, reversing nearly a decade of steady growth. 
As of last year, only 51% of Gen Z are interested in pursuing a four-year degree, down from 71% a couple of years earlier. The pandemic and Zoom screens have put things into focus. And students’ parents are on the same page: nearly half of parents don’t want their kids to go straight to a four-year college. Graduate degrees are falling out of favor just as dramatically. For The Wall Street Journal, Lindsay Ellis reports, “At Harvard, widely regarded as the nation’s top business school, M.B.A. applications fell by more than 15% [in 2022]. The Wharton School of the University of Pennsylvania recorded more than a 13% drop. At other elite U.S. programs — including Yale University’s School of Management, as well as the business schools at the University of Chicago and New York University — applications dropped by 10% or more for the class of 2024. Cost was the biggest factor blunting demand.” Meanwhile, this decline is about to worsen — not just because of prices and attitudes, but because of significant demographic change. Kevin Carey, VP for Education Policy at New America, a think-tank, wrote for Vox: “[In 2026] the number of students graduating from high schools across the country will begin a sudden and precipitous decline, due to a rolling demographic aftershock of the Great Recession. Traumatized by uncertainty and unemployment, people decided to stop having kids during that period. But even as we climbed out of the recession, the birth rate kept dropping, and we are now starting to see the consequences on campuses everywhere. Classes will shrink, year after year, for most of the next two decades. People in the higher education industry call it ‘the enrollment cliff.’” Like any business facing disruption, many are pivoting to diversify revenue. Earlier this year I learned NYU was growing its Marketing certificate program for those seeking to enter the field or gain more experience from practicing experts. 
I raised my hand and began the process to volunteer as an Adjunct Professor at night. I reviewed syllabi, audited classes, and had planning talks which spanned months. The paperwork finally began and I was set to lead seminar discussions for NYU’s Fundamentals of Advertising course. That was until the provost learned that I, myself, didn’t hold a master’s degree. I was axed. Despite my desperate pleas, they “weren’t interested in having any further discussions with me.” The very institution struggling to keep up with the evolving education landscape by providing degree alternatives couldn’t fathom that anyone without a postgrad degree could be qualified to provide their students value. Or they could, until they learned I wasn’t of their ilk. It was flawed logic academics would have loved to call out. As a practitioner with a decade of marketing strategy experience and a current guest speaker at schools including Yale, Parsons, Queens College, Franklin & Marshall, University of Oregon, and oh, NYU already that month, my lack of a formal degree called for immediate disqualification. Meanwhile, NYU’s certificate program was left with a gentleman deconstructing TV ads from the ’90s to a dozen black Zoom screens. (Mind you: TV is the fastest declining advertising medium in this field's history, and this instructor’s “acceptable” higher education was a law degree.) Am I still salty? Isn’t it clear? It was utterly perplexing and hard not to take personally. But then: clarity. In a moment when higher education must be reworked and reimagined, perhaps institutions themselves may not be the best qualified providers of our required alternatives. There’s an inherent conflict and rigidity preventing these educational gatekeepers from offering a fair and valuable alternative. As much as they want to bet on red — alternatives — they must simultaneously bet on black — protecting their historic brands and upholding the value of the traditional degree. This may be an incompatible strategy. 
This is not about me and my opportunity to teach, but about students’ return on investment, experiences and preparedness. A genuine commitment to education would mean integrating more diverse perspectives, and valuing practitioners’ experience over their (lack of) degrees. Failure to prioritize students’ outcomes will only accelerate the exact decline that forced NYU to develop an alternative in the first place. NYU isn’t incentivized by its Continued Learning cohorts. It’s incentivized by its astronomically priced degrees and brand. I was deemed a valuable asset for students until the institution learned I was at the same “academic achievement tier” as their students. My overwhelming passion, practical expertise and, ultimately, the students’ education were all overlooked and deprioritized. As Clay Shirky (coincidentally, Vice Provost at NYU) put it in Bryan Alexander’s book Academia Next: The Futures of Higher Education, “The biggest threat those of us working in colleges and universities face isn’t video lectures or online tests. It’s the fact that we live in institutions perfectly adapted to an environment that no longer exists.” Alexander, a futurist and senior scholar at Georgetown University, wrote, “Much of American higher education now faces a stark choice: commit to experimental adaptation and institutional transformation, often at serious human and financial costs, or face a painful decline into an unwelcoming century.” And lastly, as anthropologist Grant McCracken puts it, “The university that cannot fix itself is disqualified from educating our young.” The Value of Edu To envision solutions for higher education and continued learning, we have to understand the current landscape and how we got here. Understanding begets autonomy and action. Institutional higher education is headed straight off a cliff. If we agree not everyone has to or should get a four-year degree from a university, why is this institutional implosion problematic? 
Environments for emotional and intellectual growth are critical for both culture and society. For individual and collective growth, we should be fighting to ensure increased opportunities for people to explore new subjects, broaden worldviews, and develop critical thinking skills. This ultimately leads to increased curiosity, creativity and innovation — attributes which develop more informed and engaged citizens in our democracy. This isn’t happening. The inverse is — fewer young adults are obtaining such experiences. The recent ruling on affirmative action will further widen the access gap to educational opportunity. How will colleges, employers, and organizations maintain their commitment to diversity and inclusion? Considerations might include new selection criteria for systems of admission, rigorous outreach programs for increasing the number of minority applicants, and stronger partnerships between high schools and postsecondary institutions with an emphasis on matriculating students who face adversity. This decision can’t be the final word, according to President Joe Biden. He’s right. Without affirmative action, universities will be limited in their ability to consider race, ethnicity, nationality and socioeconomic status in the admissions process. But that doesn’t prevent them from iterating upon decades’ worth of progress in diversifying America’s campuses. We can no longer rely upon institutions to be the sole providers of continued education in our futures. Financially, it no longer makes sense. Approximately 44M Americans have student loan debt, amounting to more than $1.6T by the end of last year — roughly the GDP of the state of New York or the entire country of Spain. While Federal Reserve data reveals adults under 30 are more likely to have student loan debt compared to older adults, nearly a quarter of outstanding student loan debt is owed by Americans over 50. This multi-generational burden restricts social mobility and cultural participation. 
Debt prohibits. Meanwhile, the Supreme C

    48 min
  6. 03/23/2023

    How To Approach Online Culture

The following is a summary of my 2023 SXSW Talk: Movements > Trends. Here’s Part II. Firstly — there is no such thing as “online culture” vs. “culture.” That’s the digital dualism fallacy kicking in. It’s all one and the same. But for the sake of common understanding — “online culture” in this instance is the fast-moving, memeified online discourse which organizations are too often obsessed with. It’s a shift that occurred ~15 years ago. 2007 was a monumental year for marketing. Facebook introduced Pages. Brands suddenly looked exactly like our friends. They weren’t. But nonetheless, brands saw the opportunity. And it was a glimmering one. “What do we have to do or say to feel like a friend?” Ever since the ’00s, brands have been seeking out material and excuses to join in online discourse across social — the perceived “hotbed” of culture. “If we win these discussions, we win culture... and then sales.” It’s uncertain if this notion has ever been measured or supported, but it was — and often remains — the collective hypothesis. Regardless, “trending” headlines and the meme of the moment became the focus for “friendly relatability.” Attempts to resonate and cut through, optimizing for attention, have resulted in an obsession: scan, track, measure, understand and activate upon whatever’s “trending.” Brands say bae, express nihilism — are they depressed? — and are now seemingly... horny? Hashtags, challenges, and aesthetics have replaced the original intention of a “trend”: a meaningful social shift in human behavior. We’ve come to conflate “trending” with “trends.” In the process of chasing cool, most discussed “trends” are really just frivolous entertainment. We’ve lost the plot. Meanwhile, two other macro factors have helped further reverse the figure and ground. In a moment of chronic uncertainty, trends have become our “answers” — comforting explanations of what comes next. 
And simultaneously, while culture also feels stagnant, trends have become our “progress” — comforting change. As a result, the number of published trend reports has roughly tripled since 2016. Trends are trending. And the trending is seen as trends. It’s a mess. Yet in primary research, when asking 1,500 people globally if they’ve heard of ten “trends” — from Cottagecore and Barbiecore, to Indie Sleaze and Permacrisis — 43% haven’t heard of a single one. Utter “vibe shift” to the general public, and they’ll think you’re speaking a foreign language. ...Because you are. And meanwhile, of the 57% of people who have heard of one of the most discussed “trends,” fewer than half have actually participated in any capacity. The vast majority of people have not heard of what cultural thinkers and strategists obsess over, and the general public isn’t doing anything with it. “Trends” as we currently know them are really only for ourselves. That’s fine... but only for as long as we recognize they’re untethered from the real needs and desires of real people. These are empty vessels into which we pour whatever explanations we wish. They are our Rorschach tests. Cottagecore is whatever we want it to be... because it doesn’t actually exist. If our foundational task is to understand people, we’re way off the mark. For this reason, we need to break up with trends as we currently know them. It’s a toxic relationship. The critical caveat here is that understanding culture remains a priority, but the nuance is mistaking “trending” for substantial ideas worthy of strategy and investment. We must continue to study these signals, but with a dose of skepticism and healthy distance. If anything, they’re signals in themselves, not substantial shifts. Cottagecore as a viral, idyllic aspirational aesthetic is one thing. A sensibility. But we have to hold that in conjunction with the reality that this “trend” only applies to a fraction of a fraction of people... 
with minimal behavior being nudged. More precisely, there are three reasons why our current approach to understanding “online culture” requires a gut-check: Firstly, it’s exhausting. Sixty-four percent of people feel culture is accelerating. And that’s according to consumers. How about the strategists tasked with keeping the pulse, analyzing and activating? We now have anti-trend trends, and currently #corecore — the trends have gone meta: online commentary about the absurdity of living online. We’re chasing “trends” which are inherently fleeting, and ephemerality has a notoriously low ROI. At SXSW this year, a leading social platform argued for the importance of “ephemeral trends” — we know what that means, right? When has investing in “temporary” ever been a sound business decision? This is simply an unsustainable and unwise practice. The second reason we need to interrogate our approach is because our current process is futile. Two-thirds of people believe brands are trying way too hard today. Even if a brand were to successfully chase down the fleeting and act upon it, its mere presence undermines the outcome. As a brand, wrap your arms around something and you (often) kill it. That’s just how it works. But many still don’t want to accept this law. Take a look at r/FellowKids — the unfortunately still growing graveyard of cringe. Brand participation begets erosion. And for the brand that doesn’t mind the cringe and leans in regardless for engagement’s sake, psst... it’s still cringe. And the third reason we need to break up with “internet trends” as we know them is because these concepts are often inherently empty — devoid of meaning. Let’s go back to physics. Sorry. The equation for force is mass times acceleration. (F)orce = (m)ass • (a)cceleration Or more simply, force is calculated by the “weight” of something times its “speed.” Why do we care about force? 
Because culture is made up of forces: the crosswinds, efforts and influences of ideas and behaviors. For us to understand what to pay attention to, we need to be calculating “force.” But the problem is, we’re using the wrong variables. We’re failing physics. Of course we are. For the variable of speed, we have to recognize that, today, everything is fast. Everything. Seventy-four percent of people believe algorithms can make anything go viral. In this context, speed is table-stakes. Anything new just moves fast. Fast is the norm. And as a result, we’re confusing speed with newness. Ironically, it’s perhaps the slower moving or sustained shifts that are more valuable to us today — the ones with prolonged energy. And for weight, today, everything is big. Everything. Again thanks to algorithms, everything has a trillion views. Fame is democratized and each piece of content can reach more people than the average blockbuster. Size is what’s distracting us. But size isn’t the metric we need to be paying attention to. Consider a balloon and a bowling ball. Both are the same size, but very different weights. Remember, it’s weight that we’re after. We’re too often confusing what’s trending for a machine with the real desires of humans. So our current working formula is: Force = Size • Newness We’re way off and exhausted. We need to go back to the original formula. Force = Weight (or the meaningfulness) • Speed (or the momentum) Or more simply, we need to focus on bowling balls over balloons. Balloons are cheap, pop or fly away. Why would that ever be a winning strategy? When the vast majority of people would prefer brands to “serve my needs by understanding what I care about” (70%) over “appears relevant by leaning into the latest trends” (30%), a new strategy is required. It’s about going back to basics. Physics 101: remembering the true definition of force — does this actually have weight and sustained energy? 
Psychology 101: remembering the human — does this actually mean something to a real person, not an algorithm? Business 101: remembering ROI — does this actually move a needle and is a sound investment of time, energy and resources? And if there’s no astounding “yes” to the above, let’s put it aside for the moment and just keep tabs on it. Not doing so is a disservice to our clients, ourselves and our industry. This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit zine.kleinkleinklein.com/subscribe

    10 min
  7. 11/28/2022

    Modern Religions For A Lonely World

Meet Liver King. He’s a media personality caricature repping the “all meat diet.” He chomps animal brains to win big in the attention economy as much as he fights for a reassessment of what a more nutritious diet may entail. His success primarily lies in the former: attention. Many question his honesty. There are countless videos “exposing” his regimen and potential steroid use. But it’s moot. Controversy only adds to his hyper-masculine mythology. His Carnivore Diet has been around for as long as the internet has. The pitch ranges from weight loss and increased energy to higher testosterone and mental clarity. But several more drivers are now giving this “lifestyle” newfound energy. Firstly, it’s never been easier to get in touch with a tribe of like-minded thinkers. Often exposed via algorithmic means, an odd practice effortlessly reaches millions today. A video — or the mere thumbnail of one — is an invite for new, potential inductees. With this, we can now choose our own adventure of truth and determine what’s healthiest for us. Secondly, attention around the all meat diet has risen with the larger adoption of veganism — also coincidentally driven by health benefits. The blossoming of plant-based diets has allowed a counter trend to enter and thrive. It’s no surprise that we see the Carnivore Diet rage in a moment when meat alternatives are increasingly finding their way onto menus. After all, many cultural trends are just tensions. Equal and opposite reactions. Trend. Counter-trend. Cause. Effect. Further, meat consumption also symbolizes status and mastery over one’s domain — one which is currently aflame and which we’re hastily losing. Promoting one’s machismo dominance is also quite timely as we simultaneously evolve beyond a gender binary. Again: Trend. Counter-trend. Back to Liver King... A six pack, grizzly beard and bloody goat intestines appear to run counter to animal rights, environmental decline and gender fluidity. 
And here lies the ultimate overarching pitch and final driver of this all meat diet: identity, and the community which comes along with it. You don’t even have to consume the raw liver. You just have to consume the content. The all meat diet is a starter pack of values. Worship him or ridicule him — either gives you the opportunity to express your beliefs, find a vocal role in this world, and bring you closer to those who feel the same about animals, the environment or gender. Modern Religions In Tara Isabella Burton’s book, Strange Rites: New Religions for a Godless World, she reminds us that religion is more than places of worship or mere deities. Religion can be anything that provides us meaning, purpose, ritual and community. An all meat diet is a religion. And Liver King is our high priest. Burton reports: “Back in 2007, 15% of Americans called themselves religiously unaffiliated, meaning that they didn’t consider themselves to be members of any traditional organized religion. By 2012, that number had risen to 20%, and to 30% when it came to adults under thirty. Now, those numbers are higher. About a quarter of American adults say they have no religion. And when you look at young millennials — those born after 1990 — those numbers reach almost 40%.” But while younger generations claim to be “less religious,” that’s not to say they aren’t rabidly seeking spirituality, answers or belonging. Definitions and modern examples of religion just haven’t caught up to the surveys. Outside of entertainment fandom, more glaring today: politics and social justice have become our loudest religious replacements. As Helen Lewis, staff writer at The Atlantic, puts it, “Many common social-justice phrases have echoes of a catechism: announcing your pronouns or performing a land acknowledgment shows allegiance to a common belief, reassuring a group that everyone present shares the same values. 
But treating politics like a religion also makes it more emotionally volatile, more tribal (because differences of opinion become matters of good and evil) and more prone to outbreaks of moralizing and piety.” Burton points out: “A full 72% of the Nones [those who are religion-less] say they believe in, if not the God of the Bible, at least something.” Today, righteousness is up for creative interpretation and gospel is co-written in the comments. Dogecoin Dogma Meme stonks and crypto provide hundreds of thousands with moral meaning (giving power to the people), devout purpose (going to the moon or taking down “The Man”), steady ritual (buying the dip or “gm”), and passionate community (servers to subreddits). There’s a prophet, Satoshi, and a sacred text, The White Paper. The very first block is even called The Genesis Block. This is all a profoundly deep, shared belief in something. A contagious energy. A shared spirit. There are morals and morale here — crypto is seen as a path to salvation, “the answer to all of humanity’s problems.” Bloomberg’s Lorcan Roche Kelly calls bitcoin “the first true religion of the 21st century.” Karl Marx claimed that “religion is the opium of the people,” but modern religions are really the amphetamines of the people. Praying with Potter Harry Potter is perhaps the most established modern religion we’ve got. With a moral compass from shared sacred scripture, Potter has been offering a profound sense of belonging to the Wizarding World for a quarter of a century now. Potterheads take pilgrimages to Hogsmeade™ village at Universal Studios Orlando and congregate around their own interpretations of the new testament: fan fiction. The Hogwarts house system even provides specific denominations for even deeper affiliations. Endangering the ecosystem to pay their respects, fans have recently been urged to stop leaving socks at the fictional grave of Dobby at Freshwater West Beach in Wales. 
And since the very beginning, traditional religious groups have either attacked the magic of the series or compared it to their own beliefs. Religious disaffiliation now also occurs when members reject the base actions of their own leaders. Bitcoin and Gryffindor are symbols of modern religions if we’ve ever seen them. In Sync For younger generations raised on remix culture, we see the stitching together of behaviors and content as new religions. And these religions also stitch us together. As Burton writes, “In his 1911 book The Elementary Forms of Religious Life, Durkheim argues that religion is basically the glue that keeps a society together: a set of rituals and beliefs that people affirm in order to strengthen their identity as a group. Religion is a ‘unified system of beliefs and practices which unite in one single moral community called a Church all those who adhere to them.’ This church, furthermore, is sustained not through a top-down hierarchy, or through some invisible spirit, but rather through the collective energy of its adherents, a process he calls ‘collective effervescence,’ a shared intoxication participants experience when they join together in a symbolically significant, socially cohesive action.” From diets like all meat, to the absence of food like OMAD (one meal a day), to the slur-hurled cultish targets like Goop or CrossFit — the gospel of wellness grants opportunities for shared values, goals and rituals. These socially cohesive practices are Durkheim’s “collective effervescence.” And this religious collectiveness is a solve for Cultural Synchrony — cohesion and concurrence during a moment of social polarities and algorithmic segmentation. Modern religions sync us. Worshiping Workism Our “Great Resignation,” Anti-Work and Overemployed movements also check the boxes of modern religions. For the last two decades, as traditional religion declined and capitalism thrived, work stepped in as a seamless substitute. 
Blackberries and boardrooms as altars, we prayed for promotions. We went as far as replacing “career” with calling and passion. WeWork’s entire rise (and fall) can be traced back to Neumann’s religious aspirations. And with that comes another component of religion: the leader. As Joe Rogan ironically points out, “There’s some weird thing about human beings where they gravitate towards a big leader [...] There’s almost like a cheat code.” From Musk and Trump, to Billy McFarland, Anna Delvey, Elizabeth Holmes, and Sam Bankman-Fried, the line between a charismatic leader and a cult of personality is razor thin. The exploitation of scam culture within the context of our yearning for modern religion is worthy of our mindfulness. Gary Vee is our “youth pastor of capitalism.” But only recently — with a pandemic, unemployment, and widespread WFH holding a mirror to this greedy, corporate faith, a catalyst for mass reflection — have many reconsidered this theology. In 2020, when governments legally withheld purpose from the masses, the vibe shift was underway. As a truly endless spiritual pursuit, millions more are now stepping off the treadmill toward dream job nirvana. Did it ever really exist, though? Arguably most influential of all: when the church is physically closed and our religious practice is reduced to a Zoom screen in an empty apartment without real socialization, we lose our religion. Loneliness While other religious stand-ins like QAnon, cosplay, K-pop, stans, wicca, astrology, anti-vax, Disney Adults, online sleuthing (think: Couch Guy detectives), or young men devout to their Bored Apes or DAOs’ governance all check the boxes of meaning, purpose, ritual and community — the most influential driver of our newfound spirituality is our loneliness. Above all, today’s modern religions provide community. According to multiple studies, 56% of Gen Z report “growing up lonely” (more likely than any other cohort). 
59% of 18-29 year olds have “lost contact with friends” since 2020 (more than any other cohort). And 9-in

    19 min
  8. 09/19/2022

    The Creator Paradox: Cultural Stasis Amidst Creative Surplus

Part I: The Tension. There’s a new dilemma. Only it’s not that “new” of a dilemma. At the beginning of this summer, decades of glacier-paced cultural change were captured perfectly in a single weekend. The top of the charts revealed our endangered media ecosystem. You’ve heard this song plenty before. Thanks to inclusion in Netflix’s fourth season of Stranger Things, Kate Bush’s 1985 song “Running Up That Hill (Make a Deal with God)” found itself back in the zeitgeist. It went from 22,000 streams per day to 5.1M. Momentarily, a 37-year-old track was the most streamed song on Spotify. Meanwhile, Top Gun: Maverick, a sequel to the 1986 original, broke box office records, banking $156 million the same weekend. This was right before Jurassic World stomped in — the sixth installment since 1993. Then came Minions 2 — a sequel and a spin-off of the Despicable Me franchise, which in itself already had three installments. Further, in video games that weekend, 9 out of 10 best-selling titles were from franchises. And the New York Times Best Sellers list saw James Patterson, the Guinness World Records holder for the most #1 New York Times bestsellers, taking up two of the top five spots in fiction. It was the summer weekend for big premieres. But in fact, nothing about these releases was particularly new. Most noteworthy though, this pattern of mega-successful reboots stood against the backdrop of another story... These titles were released at a moment when more people are creating more content than ever before in history. Spotify boasts 70,000 tracks uploaded every day. YouTube sees 30,000 hours of new content uploaded every hour. Nearly 3M unique podcasts exist. Twitch is broadcasting +7.5M streamers, indie game releases and play are both growing year over year, and roughly 4M books are published annually in the U.S. — nearly half of those self-published, a +250% increase over just five years. 
On one hand, we have a booming Creator Economy, with an ever-expanding democratization of production tools for anyone with an idea. So much so that, according to 1,000 Americans surveyed by ZINE, 86% of people believe there is an overwhelming amount of entertainment available today. Yet on the other hand, we seem to have also found ourselves culturally stunted. Our box office and streaming platforms are soggy with the same regurgitated franchises. Reboots rule the roost, and familiar faces hog our charts, while notable newcomers redefining genres feel few and far between. With this, 64% of people declare they are getting fed up with today’s reboots, sequels and remakes. What gives? How is it that during a moment of radical creator liberation and audience frustration, we’re finding ourselves with the same tropes and hooks? Chris Anderson’s optimistic 2006 Long Tail vision promised us that “specificity” — the shallow and obscure — would be economically feasible as the internet connected the niche to its audience. Aggregators would win, the odd would thrive, and those on the edges would celebrate. Creators could finally connect to their 1,000 true fans. But as seen from the macro view, a diverse, bottom-up media ecosystem is in fact not thriving. Instead, the inverse is happening. Homogeneity is winning. Part II: Sameness Everywhere. In an analysis by Adam Mastroianni, a postdoc scholar at Columbia Business School, “the same” keeps rising to the top — across all media. Simply, there are fewer winners. Mastroianni calls this our Cultural Oligopoly. “A cartel of superstars has conquered culture,” he writes. “Until the year 2000, about 25% of top-grossing movies were prequels, sequels, spin offs, remakes, reboots, or cinematic universe expansions. Since 2010, it’s been over 50% every year. 
In recent years, it’s been close to 100%.” “Since 2000, about a third of the top 30 most-viewed shows are either spin offs of other shows in the top 30 (e.g., CSI and CSI: Miami) or multiple broadcasts of the same show (e.g., American Idol on Monday and American Idol on Wednesday).” “In the 1950s, a little over half of the authors in the Top 10 had been there before. These days, it’s closer to 75%.” “In the late 1990s, 75% or less of best selling video games were franchise installments. Since 2005, it’s been above 75% every year, and sometimes it’s 100%.” Software engineer Azhad Syed identified the same “Cultural Oligopoly” in his analysis of the music industry. “The number of different artists that crack the Top 100 is decreasing over time. In conjunction with fewer and fewer artists on the charts, each of those artists is charting 1.5x to 2x as many songs per year.” Meanwhile, “old” music — defined as having been released more than 18 months ago — now accounts for 72% of the market in the U.S. And though 18 months is admittedly a flawed definition of “old,” more widely, the consumption of old music is growing, while demand for new music is declining. In assessing this record for The Atlantic, music critic and historian Ted Gioia writes, “Never before in history have new tracks attained hit status while generating so little cultural impact.” The old is winning financially, but it’s also winning creatively. Rolling Stone magazine forecasts the continued rise of “interpolations” — the cousin of sampling in which song structure is borrowed and made “new.” “Don’t expect interpolations to slow down anytime soon — rather, the total opposite is likely. Publishing companies are sitting on mountains of instantly recognizable songs [...] Now that the business is focused around streaming singles, they have a chance to juice them once again.” As a result, the hottest private equity investments as of late have been the publishing catalogs of accomplished artists. 
In fact, according to VP of Business and Legal Affairs at Sony Music Publishing, Dag Sandsmark, “The world’s largest music publisher has received twice as many requests for samples and interpolations from its catalog two years in a row.” Which translates to this: today, from film and TV, to books, video games, and music, there’s statistically less diversification rising to the top. And while it’s a given that everything in culture is a remix, the intensity of today’s reliance on what’s come before seems worthy of our attention. What’s causing this systemic malfunction? Part III: Causes of Creative Collapse. 01. Conflicting Ecosystems. Most obviously, we’re discussing two very distinct and seemingly competing media environments. For creators, there’s the bottom-up, democratized access to tools, enabling massive amounts of content to be made and syndicated frictionlessly. In the Creator Economy everyone can be a player and “make it.” On the other hand, there’s the top-down, institutional power of filtering and recommendation, held by establishments incentivized by outsized financial returns. Large, risk-averse institutions — arguably just run by in-house lawyers and accountants at this point — play it safe to “protect shareholder value.” These divergent models are fundamentally at odds. It’s this dynamic that sits center stage in our paradox. When there are two drastically different sets of environments, incentives, and breeds of “Creators” today — everyday maker vs. established institution — it’s hard to expect normies to be plucked out and bet on by gatekeepers already in power. 02. It’s (Mostly) Trash. Then there’s the question of quality. While the Long Tail is certainly diverse, it’s also made up of a lot of... noise. Amateurs are amateurs, no matter how many there are. A reason we don’t see new creators’ work rise is simply because the majority of it isn’t worthy (or because there’s just too much to sift through). 
Another angle here is the lack of funding to fuel emerging creators’ pursuits. For a young, talented artist today, where are the grants or backing opportunities outside of peer crowdsourcing? In the absence of infinite time but facing infinite content, we actually need some gatekeepers. Further, we need financing for those who aren’t... trash. 03. Institutional Consolidation. By its very nature, the Long Tail of content is segmented into ever-smaller pieces for ever-more discerning audiences. But as the Long Tail lengthens and more create, the classic bell curve forms: the obscure gets more obscure, while the largest common denominator gets more... basic. Look no further than Netflix’s most recent pivots, which make it clear they’re no longer interested in many, risky, artistic bets, but instead, “Bigger, better, fewer.” Ironically, this is no different than what preceded them. Also, Netflix was once seen as the promising example of the Long Tail’s opportunity. Instead, over the last decade, Netflix has been slashing its library of titles. In 2010, Netflix housed 6.7K films. Today, a decade later, that number is down 45%. Much of today’s mass-produced work aims to satisfy the average. As a result, we’re left with average. The middle is saltine-cinema: the largest financial opportunity. Take or leave Martin Scorsese’s critique of Marvel; his take on the state of film — this “consolidation” — shouldn’t be controversial: “The art of cinema is being systematically devalued, sidelined, demeaned, and reduced to its lowest common denominator, ‘content.’” This dovetails with one of Mastroianni’s own hypotheses for today’s Cultural Oligopoly: a systemic reflex towards concentration. The big habitually eats the small. Movie studios, music labels, TV stations, and publishers of books and video games have all consolidated. 
And this concentration is simultaneously occurring across religion, political parties, language, top visited websites, newspapers, cities, and most discussed: wealth and businesses. The winners we’re left with today are so

    36 min
  9. 08/01/2022

    A_Framework_To: Find Overlooked & De-bias Trends

The META Trends are invaluable in identifying where the collective trend-forecaster psyche is at. But as we learned in a five-year look back: biases thrive, agendas direct, risk is feared, quantification is scarce and toxic optimism influences. Deeper, as we learned in a series of exercises with AI: analyzed cultural data reveals that what we humans think is most important may not actually be the case. All of this META Trend work is predicated upon industry trend reports... which, as we’re learning, may not be as dependable as we once hoped. The META Trends are insightful, but they, and the industry reports used to get there, leave us with an incomplete picture of what’s driving culture forward. Only with friction, daringness and originality can we analyze the sharp edges and fringes of culture that have influence. The weak, the uncomfortable and the complex help color our picture of the future. As we uncovered, AI can be helpful in discovering overlooked micro-trends hidden within the one million words of analyzed reports. However, many of these discoveries are things: Gut Health, Fluid Fashion, Privacy Enhancing Tech, etc. What we also need to augment is our ability to identify more nuanced, emotional overlooked trends. But this is a task a human can do best. Creative extrapolation is our superpower. So to identify the overlooked, we can use the META Trends as filters to seek out what’s not surfaced. But we can also use these META Trends in another way... Sarah DaVanzo and I created a framework to spin out unique perspectives on any existing trend. 4X Interrogative Questions_ To Identify the Overlooked
* Outside = What is an outsider’s POV or experience?
* Other Side = What is the inverse or contradictory tension?
* Dark Side = What is the malicious or distressing angle?
* Back Side = What is the devious or inappropriate twist?
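For those who like their frameworks executable, the 4X Interrogative Questions are simple enough to encode as a reusable checklist. A trivial sketch in Python — the framework is Sarah’s and mine; the function name and structure here are purely illustrative:

```python
# The 4X Interrogative Questions as a reusable data structure.
# The four sides and their prompts come from the framework above;
# everything else is an illustrative convenience.
FOUR_X = {
    "Outside":    "What is an outsider's POV or experience?",
    "Other Side": "What is the inverse or contradictory tension?",
    "Dark Side":  "What is the malicious or distressing angle?",
    "Back Side":  "What is the devious or inappropriate twist?",
}

def interrogate(trend: str) -> list[str]:
    """Return the four prompts to ask of any existing trend."""
    return [f"{side} of '{trend}': {q}" for side, q in FOUR_X.items()]

prompts = interrogate("Eco- Everything")
print(prompts[0])  # Outside of 'Eco- Everything': What is an outsider's POV or experience?
```

Feed any trite, over-reported trend through it and you get four trailheads to research rather than one headline to repeat.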
Interrogating non-obvious dimensions of even the most trite, overly reported trends can reveal new ideas, threats and opportunities. For example, let’s use the most reported trend for 2022, Eco- Everything: a continued obsession with sustainability and an integration of green-thinking into all products and services. Applying the 4X Interrogative Questions, we get: Outside = How are those on the equator signaling new norms of climate migration? → What does this reveal about the effects of climate on the less prepared, mobile or privileged? Other Side = How do consumers reckon with still opting in for two-day shipping amidst climate marches? → What does this reveal about a fear of sacrifice and collective cognitive dissonance? Dark Side = How are therapists managing to counsel those with new-onset climate anxiety, a new diagnosis? → What does this reveal about the spillover, emotional toll of something once believed to be just a physical crisis? Back Side = How do we account for the carbon footprint of online porn? → How can we speak to the stigmatized and uncomfortable drivers of humanitarian risk? By using this framework, we can open the door to new, often overlooked components of any cultural discussion. We net out with valuable trailheads to then explore. Call them insights, counter-trends, or just components of the original trend itself — it makes no difference. These are simply elements of culture that should be acknowledged. To continue this exercise, let’s go through all 14 of the 2022 META Trends to reveal some critical, often overlooked pieces of the puzzle. 01. Eco- Everything ♻️ Overlooked = Sustainable living is unaffordable for many, climate migration will be unachievable for a growing elderly population, and paper straws and PR plays are jokes to Gen Z 02. Digital Default 🌐 Overlooked = 27.6M U.S. 
households still don’t have home internet, motion sickness and wanting to know what’s behind us still curb VR adoption, and our desire to experiment with identity runs deep 03. xX~VIBES~Xx 🍄 Overlooked = Indigenous communities are being destroyed by drug tourism, vibe-therapeutics only address the surface level of social trauma, and bad trips and high-THC products will ironically exacerbate mental health issues 04. Radical Inclusivity 🌎 Overlooked = Many are frozen, genuinely unsure how to appropriately participate, deadly racism remains omnipresent and largely unaddressed at an institutional level, and are dating apps designed to facilitate discrimination with race filters? 05. Kid’ing 🪁 Overlooked = A generation just skipped a pivotal developmental period of play, humor and nuance (our greatest assets) keep dividing us, and joking amidst a backdrop of complete devastation can feel complicated 06. Home Hubs 🏠 Overlooked = For some, more time at home means more abuse, young adults can’t even afford a “Home Hub,” and as co-living thrives, bringing home sexual partners — which is already declining — becomes even more difficult 07. Algo_Minded 🧠 Overlooked = Parents struggle to limit screen time, which stunts young adults’ social lives, McMindfulness has capitalism seeping into our wellness and dreams, and analog sex and companionship remain undefeated in bringing mental relief 08. Renews & Reinventions 💭 Overlooked = It’s a privilege to drop out — but those who most want to can’t, what happens when YOLO savings run out and startups fail?, and how quickly will employees return to a 9-to-5 in-office job when there’s economic uncertainty? 09. Virtual Valuables 🔑 Overlooked = Almost everyone wants to be an expert and “new things” are an easy topic, grifters can find ways to infiltrate and exploit any technology, and practical case studies of “world-changing tech” remain scarce 10. Now! Now! NOW! 
🛒 Overlooked = You only lose from bad commerce experiences — rarely winning from average or good ones, deals are the only consistent predictor of loyalty, and predictive commerce may be too fast and uncomfortable for many 11. Me Inc. 💼 Overlooked = People with personal brands are struggling to launch personal businesses, financial illiteracy is failing us, and sex-work is minting a new class of millionaires overnight 12. Sounds Good 🔊 Overlooked = Overcrowded, growing cities are victims of noise pollution, how do the 360M hearing impaired globally participate here?, and erotic audio is soothing millions to bed 13. Op-purr-tunity 🐩 Overlooked = We are seeing pets as replacements for children and spouses, stigmatization and restrictions remain across establishments (parks, bars, theaters, etc.), and animal abuse has risen alongside pet adoptions 14. Feed University 🎓 Overlooked = The line between expert and amateur is razor thin, academic institutions are struggling for relevance, and this is only jet fuel for our existing mis- and disinformation dilemmas These overlooked components are not comprehensive by any means — and arguably some may be obvious — but they offer a slice of critical nuance that’s missing from our daily trend conversations. We consistently need tensions, devil’s advocates and contrarians to see the full picture. Mindfulness of blind spots, a willingness to both challenge and expand upon worldviews, and respect paired with interrogation are the imperative and often missing traits required to more comprehensively grasp the zeitgeist and author a preferred future. Complete Series: Part I: Using AI To Quantify & Size META Trends | Part II: How To Spot Trends with AI | Part III: A_Framework_To: Find Overlooked & De-bias Trends This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit zine.kleinkleinklein.com/subscribe

    9 min
  10. 07/25/2022

    How To Spot Trends with AI

After Sarah DaVanzo and I leveraged NWO.ai’s invaluable AI to score and re-rank the META Trends, we were left stuck with one finding: both the global and U.S. AI data-driven ranks were significantly different from the original human rank. The AI declared that what we humans thought was most important was not actually the case. Were we just splitting hairs of importance here, or were these divergent rankings a signal that our META Trends (which came from source material) were not as important as we once thought? Maybe more influential cultural shifts are out there waiting to be exposed. And if so, how can we find them? We debated important but missing META Trends for weeks — but how important could these be if the experts hadn’t collectively highlighted them within the reports we analyzed? Simultaneously, according to our work analyzing the last five years of META Trends, the “most important trends” being reported haven’t changed much. There was no denying it, though: important, nuanced cultural shifts were missing from our list of 14 META Trends. So, how could we identify and highlight these overlooked trends... and further, in a way that isn’t subjective (Sarah’s opinion against mine)? We considered just naming our favorite cultural phenomena not included in the original META rank, or surfacing interesting leftover trends from the 40+ reports that didn’t make their way into one of the 14 META Trend themes, but both approaches would have thrown us into the same trap we called out immediately after publishing the most recent annual META Trend report: the prevalence of bias and scarcity of risk in the trends and foresight field is concerning at best... While Sarah and I both have historical proof and a pedigree of accurate trend forecasting, our life experiences and methods differ. Just listing our favorites felt too qualitative. So we designed another experiment with NWO.ai. 
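One simple way to put a number on how far apart the human and AI rankings sit is Spearman’s rank correlation, which runs from 1.0 (identical order) to -1.0 (exactly reversed). A minimal sketch — the rankings below are invented stand-ins, not our actual META Trend results:

```python
# Spearman's rho for two tie-free rankings of the same items:
# rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)), where d is the per-item
# rank difference. Values near 0 mean the two rankings barely agree.
def spearman_rho(rank_a: list[int], rank_b: list[int]) -> float:
    n = len(rank_a)
    d_sq = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - (6 * d_sq) / (n * (n ** 2 - 1))

# Invented example: the item humans ranked #1 lands near the bottom
# of the AI's list (as Eco- Everything did in our experiment).
human_rank = [1, 2, 3, 4, 5]
ai_rank    = [5, 1, 3, 2, 4]
print(round(spearman_rho(human_rank, ai_rank), 2))  # -0.1
```

A rho near zero, as in this toy example, is exactly the kind of “significantly different” divergence we saw between the expert and data-driven ranks.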
Experiment 04. AI META Trend Identification_ Comparison. Left on its own, could AI identify similar or different — perhaps missing — META Trends? This time we fed all of the text from the original 40+ sourced trend reports into the NWO.ai platform: nearly one million words of text. We figured the AI could process this information with a different, extraordinary comprehension than us humans, who attempted the same when creating the original 2022 rank. We hypothesized the AI would make more connections — ergo, identify META Trends completely overlooked by the humans. By crunching all of the reports and instructing the AI to identify META Trend patterns (clusters, themes, etc.), would it come back with missing, valuable social shifts? Answer: Not even close. We were very wrong to believe AI could complete this exercise like an expert trend spotter. From the one million words of inputted text, the AI used Natural Language Processing (NLP) and clustered like-with-like, arriving at 72 clusters of “trends.” Interestingly, there was very little overlap with our 14 META Trends — a handful at best, and those were really just optimistic stretches. Further, the AI’s clustered “trends” weren’t even trends, but rather general topics like “technology” and “pandemic.” That’s not to say these themes weren’t impressive — they were — but these findings aren’t helpful to an experienced cultural strategist who can arrive at more provocative groupings. So, to answer the question: could AI identify overlooked META Trends? No. But feeling we were onto something, we asked a follow-up... Experiment 05. AI Micro-Trend Identification_ Extrapolation. Rather than identifying large patterns which we’d call META Trends, could we use the AI to identify and rank smaller, perhaps overlooked micro-trends from within the reports? 
To figure this out, instead of having the AI merely organize the reports’ text, we instructed it to take its newly created meaning of the one million words (i.e. its 72 clusters) and use diverse internet data sources to measure and rank each and every signal. The goal of the experiment was to understand the consumer energy behind every micro-trend and rank them accordingly. We’d call these the AI-identified trends. It was a complicated process, but essentially we asked the AI to take the signals it captured from the 40+ industry reports, use them as a jumping-off point, and then use all the available online information to rank and validate them. The AI came back with 1,062 newly scored micro-trends. This was a ranking of AI-identified, human-overlooked trends, via abstracted meanings and associations, all from the 40+ reports’ text. It turned up gold. Precisely, these were trends buried — or hidden — within the industry reports that the AI pulled out using advanced NLP techniques and a vast amount of data. 
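For intuition, here’s a toy, pure-Python stand-in for that extraction step: pull candidate two-word signals out of raw report text and rank them. NWO.ai’s real pipeline scored each signal against diverse external data (social, search, patents, and so on); in this sketch, in-corpus frequency is a crude proxy, and the stopword list and sample sentences are invented:

```python
# Extract and rank candidate two-word "signals" (bigrams) from report
# text. A deliberately tiny stand-in for NWO.ai's proprietary pipeline.
from collections import Counter
import re

STOPWORDS = {"the", "of", "and", "a", "to", "in", "is", "for", "with"}

def candidate_signals(reports: list[str]) -> list[tuple[str, int]]:
    """Count two-word phrases across all report text, most common first."""
    counts = Counter()
    for text in reports:
        words = [w for w in re.findall(r"[a-z']+", text.lower())
                 if w not in STOPWORDS]
        counts.update(" ".join(pair) for pair in zip(words, words[1:]))
    return counts.most_common()

reports = [
    "Gut health and gut health supplements keep growing.",
    "Interest in gut health, fluid fashion and privacy tech rises.",
]
print(candidate_signals(reports)[0])  # ('gut health', 3)
```

The real system’s advantage over this sketch is the second step: taking each extracted signal and measuring the energy behind it across outside data sources, rather than just counting it where it was found.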
Here is a curation of the top 40 ranked, overlooked micro-trends discovered by the AI: Inclusive Insurance, Disruptive Winds, Food Inflation, DAOs (Decentralized Autonomous Organizations), Rising Energy, Super Apps, Longevity Food, Unisex Fragrance, Dating Fatigue, Clothing Rental, AI-Music, Biodynamic Farming, Gender Affirmation, Rainwater Harvesting Systems, Wearable Robotics, Dopamine Dressing, Sleep Coaches, Financial Coaches, Gut Health, Caregiver Leave, Self-Hypnosis, Alcohol-Free Beer, Psychoactive Tea, Sperm Freezing, Touch-Free, Post traumatic, Carbon Pawprint, Sustainability Calculator, Land Stewards, Privacy Enhancing Tech, Anonymous Marketplace, Subscriptions, Fluid Fashion, Land Availability, Period Products, Paid Menstrual Leave, Mutual Aid, Workplace Conditions, Banned Advertising, and Media Anxiety Perhaps most noteworthy: “Russia Initiative” was a buried “trend” identified by the AI from the structured text of the industry trend reports. The AI then used various online data sources to measure and score the energy behind this (and all of the other 1,061 signals). While the AI picked up this shift, not a single report explicitly mentioned a pending war at the time of their writing. The AI was literally able to give voice to cultural change alluded to only indirectly within the human-authored reports. Conclusion: Humans for Sensemaking & AI for Discovery and Inspiration. Our experiments found that humans outperform AI in decoding the zeitgeist and defining cultural shifts at large (META Trends, Mega or Macro Trends). We’d call this “sensemaking.” This skill is essentially the ability to synthesize wide-ranging, already structured data, and intuitively pattern-match and creatively stitch narratives. Humans have an edge over AI when it comes to seeing the big picture and making non-obvious connections. Humans can derive META Trend patterns. But as we found out, the AI cannot. 
Humans bring context to the table: historical knowledge, existing understanding of worthy trend criteria, and most importantly, ties to business use cases and priorities. Simply, we humans know what to look for. But... this is also our fatal flaw, as it translates into bias... Meanwhile, we uncovered that AI has a distinct advantage when it comes to unifying, processing and analyzing diverse unstructured data sets at scale and with unmatched speed. AI beats humans at finding the most noteworthy weak and emerging signals (aka micro-trends) — concepts undetectable to the human eye due to the sheer volume of data. AI’s superpower in this context is “discovery” and “inspiration.” We also learned AI works well when it deconstructs and analyzes both human-structured data (ex. our original META Trends) and massive troves of unstructured data, using them as source material or trailheads in its own search for novelty. Ultimately, with insight from AI’s more precise rankings, its detection of signal vs. noise, and its delivered inspiration, it’s undeniable: AI is a crucial fixture in a successful cultural intelligence system. This series of experiments run by Sarah, NWO.ai and myself demonstrates the need for and role of AI and cultural data at scale. This is the future of cultural intelligence. That’s the clearest takeaway here. The optimal cultural intelligence system combines humans and machines in a series of orchestrated hand-offs, repeating the pattern of construction and deconstruction. Humans are best utilized for sourcing and defining large, complex social themes, while AI is best utilized for prioritizing these weighty trends, sourcing micro-trends, and checking humans’ sometimes messy, qualitative approaches. Nobody’s perfect. And together is better than alone. But one question remains: are there other trends out there unreported by the industry’s published trend reports, unidentified by the META Trend analysis, and undiscovered by the AI? Answer: No doubt. 
Complete Series: Part I: Using AI To Quantify & Size META Trends | Part II: How To Spot Trends with AI | Part III: A_Framework_To: Find Overlooked & De-bias Trends

    10 min
  11. 07/18/2022

    Using AI To Quantify & Size META Trends

Earlier this year Sarah DaVanzo and I published the fifth annual 2022 META Trend analysis, a distillation of 40+ industry trend reports to ultimately identify the most frequently reported (i.e. noteworthy) trends for the year. Fourteen (meta) trends were identified to represent what the entire trends industry was collectively forecasting. During this time we also announced that, for the first year, we’d quantify each of these META Trends to more precisely size and evaluate their cultural influence. It’s been a moment, but we finally crunched the data... To get here, though, we first had to answer a series of very thorny questions: How do we define the criteria or “borders” of each META Trend? Which sources of data should be leveraged to score each? Should we study these trends’ influence within a U.S. or global data set? How do we even collect and analyze this data at scale? And how do we complete this exercise without adding any human interference, tipping the scales of objectivity? Only once we answered those could we shine light on the larger questions at hand:
* By leveraging cultural data, would AI rank the META Trends differently than the humans did?
* Left on its own, could AI identify similar or different (perhaps missing, overlooked) META Trends?
* And conclusively, what are the best tasks and roles for humans versus AI in order to develop a successfully orchestrated cultural intelligence system?
While asking ourselves all of these questions, we formed a partnership with the team at NWO.ai. The NWO.ai platform, among the finalists for LVMH’s Innovation Award and an Industry Cloud partner of SAP, was formed in 2020 to identify consumer signals before they become exponential. Its AI algorithms have been learning over the last 2.5 years, and the platform now boasts strong statistical trend-prediction accuracy. They effectively quantify culture. 
With Sarah’s previous experience collaborating with them earlier this year, we determined their platform would be the perfect tool for our quantitative META Trend analysis. Experiment 01. 2022 META Trend Scoring & Ranking_ Humans vs. AI. As for our first question: if AI was to collect cultural data against each of the META Trends and then score their importance to rank them, would the AI ranking match or diverge from the original human rank? Answer: Different. As a reminder, our “human ranking” was completed by Sarah and myself manually counting the frequency of similar trend mentions throughout the 40+ industry reports. For example, sustainability trends received the most attention and real estate across the analyzed 2022 reports, and hence the “Eco- Everything” META Trend was born and ranked in the top spot. So, to more precisely size and rank these META Trends, NWO.ai’s AI built a keyword “portfolio” for each trend. These portfolios were essentially groupings of keywords and phrases (i.e. booleans) representing each of the 14 META Trends. Sarah and I authored these portfolios ourselves, but to curb any subjectivity, we only leveraged the language used in the original reports’ descriptions. To be clear, Sarah and I did not forecast these 14 META Trends – these were simply the most talked-about concepts throughout the industry. NWO.ai then measured the cultural importance of each META Trend via its portfolio of keywords. Consumer interest was quantified using a variety of data sources spanning social, news publications, search, investments, patents, scientific journals, e-commerce data, and even film scripts. Ultimately, an AI-derived “Impact Score” was calculated by aggregating volume (quantity of these signals across sources), frequency (volume per day), reach (distribution of publications), etc. 
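For intuition, here’s a minimal sketch of that aggregate-then-normalize idea: per-trend measures are summed into a raw score, then min-max scaled across trends to a 0-100 range. The field names, equal weighting and numbers are all invented for illustration; NWO.ai’s actual Impact Score formula is proprietary:

```python
# Hedged sketch of an "Impact Score": aggregate per-trend signal
# measures into one raw number, then min-max normalize across trends
# to a 0-100 scale. Values and unweighted sum are assumptions.
def raw_score(signals: dict[str, float]) -> float:
    return signals["volume"] + signals["frequency"] + signals["reach"]

def normalize_0_100(scores: dict[str, float]) -> dict[str, float]:
    lo, hi = min(scores.values()), max(scores.values())
    return {trend: 100 * (s - lo) / (hi - lo) for trend, s in scores.items()}

trends = {
    "Now! Now! NOW!":  {"volume": 9.0, "frequency": 8.5, "reach": 7.0},
    "Eco- Everything": {"volume": 4.0, "frequency": 3.0, "reach": 5.0},
    "xX~VIBES~Xx":     {"volume": 3.0, "frequency": 2.0, "reach": 2.5},
}
impact = normalize_0_100({t: raw_score(s) for t, s in trends.items()})
# The highest-scoring trend lands at 100.0, the lowest at 0.0.
```

The min-max step is what makes trends with very different raw magnitudes comparable on one shared scale.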
These scores were finally normalized on a 0-100 scale to fairly pit each of the META Trends against one another and create our official AI ranking. This AI ranking of the cultural data behind each of this year’s 14 META Trends revealed that the convenience economy (Now! Now! NOW!) is dominating culture, with a score more than double that of some other META Trends. In other words, Now! Now! NOW!, or the endless demands of innovation surrounding online shopping, has a cultural impact more than 3x the size of the META Trend xX~VIBES~Xx, our desire to tune in, drop out, and create spaces or purchase products to fulfill and focus. With these AI scores, we then re-ranked the META Trends from the original human approach. This is where things got interesting. We had rank discrepancies. While the human ranking process (i.e. mention count) identified Eco- Everything as the most prevalent META Trend... according to Eco- Everything’s portfolio of AI-scored keywords, it is in fact only the 9th most culturally impactful META Trend globally. Meanwhile, Now! Now! NOW! received the highest Impact Score from the AI. Originally, it was identified as only the 10th most important META Trend. But according to the AI, it is in fact #1. Perhaps unfortunately, we’re engaging in some wishful thinking here: according to millions of cultural data points scraped and analyzed, consumerism beats out industry trend reports’ hype of sustainable innovation. Experiment 02. 2022 META Trend Scoring & Ranking_ U.S. vs. Global. When the first differentiation of rankings came back, we were only using a U.S. data set, achieved by geo-fencing our analyzed cultural data to North America. We wondered if there would still be discrepancies in the ranking if we created a new, specific global rank by opening up our aperture, sources and data. Was there a difference? Answer: Not really. The META Trend rankings by the AI are by and large the same globally vs. 
U.S., statistically reinforcing America’s cultural influence. In other words, META Trends’ impact in the U.S. reflects similar impact globally. Therefore, this scoring suggests that U.S. trends can be proxies for global trends as they ultimately ripple outwards from the states. But more importantly, because we tested the scoring twice (for the U.S. and globally), it reveals that AI data-driven trend scoring and ranking is impressively consistent. So to zoom back out, the rank difference between the AI and human methods shows glaring variation. This should make us all think — perhaps even question — the subjectivity and accuracy of the industry’s reported trends. After all, we were scoring what the humans (the most experienced trend forecasters, no less) originally published. How valid are these concepts if millions of data points and AI couldn’t mirror our collectively proclaimed importance of them? Or conversely, maybe these were in fact the most worthy trends to score and we’re just splitting hairs between the most important of the important. Or again on the other hand, just perhaps, there are trends out there with higher Impact Scores just never identified by the experts... In any case... The AI scoring revealed that Now! Now! NOW!, Home Hubs and Radical Inclusivity are the top three META Trends of 2022, from both a U.S. and global perspective. This suggests that no matter one’s vertical, these three META Trends are both qualitatively (human-identified) and quantitatively (AI-scored) important. Whether you’re national or global, strategically double down in these spaces. Experiment 03.AI Deconstructs The Anatomy of Trends_ Identifying Drivers In exploring NWO.ai’s platform, we noticed the AI could do something the human process could never — it allowed us to dig deeper and uncover the keywords (i.e. signals, concepts, trends, etc.) beneath the META Trend’s surface. In other words, what is driving each META Trend forward? 
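The "impressively consistent" claim from Experiment 02, and the "glaring variation" between the human and AI rankings, can both be quantified the same way: with a rank-correlation statistic. The sketch below uses Spearman's rank correlation (1.0 means identical ordering, -1.0 a full reversal). The orderings are illustrative stand-ins, not NWO.ai's actual output; only the trend names and the broad human-vs-AI disagreement (Eco-Everything first for humans, Now! Now! NOW! first for the AI) come from this piece.

```python
# Spearman's rank correlation between two orderings of the same trends.
# 1.0 = identical order, 0 = unrelated, -1.0 = fully reversed.
def spearman(rank_a, rank_b):
    n = len(rank_a)
    pos_b = {trend: i for i, trend in enumerate(rank_b)}
    d_sq = sum((i - pos_b[t]) ** 2 for i, t in enumerate(rank_a))
    return 1 - (6 * d_sq) / (n * (n ** 2 - 1))

# Illustrative orderings (not NWO.ai's real rankings).
us_rank     = ["Now! Now! NOW!", "Home Hubs", "Radical Inclusivity", "Eco-Everything"]
global_rank = ["Now! Now! NOW!", "Home Hubs", "Radical Inclusivity", "Eco-Everything"]
human_rank  = ["Eco-Everything", "Home Hubs", "Radical Inclusivity", "Now! Now! NOW!"]

print(spearman(us_rank, global_rank))  # identical orders → 1.0
print(spearman(us_rank, human_rank))   # divergent orders, well below 1.0
```

A correlation near 1.0 for U.S. vs. global matches the "not really different" finding, while a low or negative value for AI vs. human captures the discrepancy the text describes.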
If you recall, because we originally created portfolios of keywords for each META Trend in order to score them, we had the opportunity to score each META Trend’s DNA strands and determine which specific elements are having the most influence. Knowing what is driving a trend by ranking its most important components (i.e. keywords) can help us envision how it will evolve over time. NWO.ai granted us the ability to plot each META Trend’s portfolio of keywords on a 2x2 matrix and rank them by their current growth, speed, tone and forecast — essentially answering the question: which explicit keywords are growing or declining in volume, and is this change exponential or momentarily stalled?

Conclusion: AI Trend Scoring, Ranking & Driver Identification Is Superior to Humans

By leveraging AI to score each of the META Trends, not only did we create a more accurate prioritization with a U.S. vs. global nuance — something humans could never achieve — but the AI also allowed us to reach a new granularity and inspect the anatomy of each META Trend. We learned which components are most significantly influencing each trend’s growth.

But from these experiments, we also have a warning: an AI data-driven system can prioritize completely different trends than us humans. While AI can confidently unlock insight humans can only dream of, its results are so divergent from a human approach that healthy questioning is required: of the emerging software, but also, primarily, of the humans’ “expert” input. The AI declared that what we humans thought was most important was not actually the case.

This raised our next question: What does the AI find most important?

Complete Series: Part I: Using AI To Quantify & Size META Trends | Part II: How To Spot Trends with AI | Part III: A Framework To Find Overlooked & De-bias Trends

This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit zine.kleinkleinklein.com/subscribe

    11 min
  12. 03/31/2021

    Preparing and Mourning for Deletion Death

I remember the first video I watched on YouTube. It was also the first I saved to my Favorites playlist. “Muffins” is an absurdist ad for a bakery, whose long list of muffin flavors descended into bird, fire and blood. I was 13 and it was the funniest thing I had ever watched. That was, of course, before I came across Charlie the Unicorn.

YouTube became a home where I collected countless creations. I’d later share these discoveries with friends at sleepovers, watching their faces, awaiting cackles of approval. We’d then go off to look for more. It was entertainment, but also an opportunity for pure connection.

Over the years, my compilation of favorited videos grew. OkGo music videos, a Soul Train dance line, viral ad analyses, a Reggie Watts performance, song mashups, and random clips lived amongst even more nonsensical skits like The Landlord and Unbelievable Dinner. As SNL’s Digital Shorts hit YouTube, clips like Laser Cats pushed me well over 100 saved videos. I continued exploring and curating into high school and college, frequently spending nights traveling in its time machine and rewatching the best of the best.

The playlist became a library reflecting my passions, tastes and sense of humor of the time — an archive of nostalgia — but it wasn’t just a mirror. It was literally me in videos. A meticulously curated self-portrait. A personal collage of URLs formed over the years. From 2007 to March 2021.

The morning YouTube emailed me that my entire playlist was deleted, 14 years of favoriting had accumulated to over 2,000 videos. Now they were all gone. As I skimmed the notice during my morning email routine in bed, my heart sank and fury burned. I tried to grasp their rationale. YouTube claimed my playlist — and I — had violated their community guidelines, specifically their child safety policy. “Because it’s the first offense, your account isn’t affected.” Yet my playlist was permanently deleted. I was unsure how this was a “warning.” Nothing made sense.
A video must have been flagged, and instead of taking down just that piece of content, my playlist of thousands of other acceptable videos got wiped along with it. But even that felt unlikely, as I couldn’t imagine favoriting a video that would violate their guidelines, let alone child safety. I don’t have a traffic ticket to my name, and here YouTube was convicting me of peddling content which “sexualized, endangered or inflicted emotional distress on minors.” That YouTube didn’t even specify which video in my playlist led to their decision made the email even colder.

In my appeal, my only hope of salvation, I constructed and deconstructed my defense for close to an hour so it could fit within their character limit. No amount of space could fit my resentment and desperation. Could I trick the machine or convince the human to give me my playlist back? No. Sincerely, no-reply@youtube.com

I’m still processing the loss. According to research, over 8,000 Facebook users die each day, and by 2060 there will be hundreds of millions of dead users. By the end of the century: two billion. One day there will be more dead users than active ones on today’s most popular platforms. What does an online, global graveyard look like?

My content grievance is difficult to describe. After all, it’s a new human condition. And while the loss could be trivialized as a list of videos, it feels like so much more than that. The death of my playlist feels like losing progress, or something cherished, but it also feels as if I lost a piece of myself. Because content is so intertwined with identity — we are what we tweet — we can’t solely focus on how we treat human death online. We also need to focus on content death online.

We lie to ourselves when we believe personal creations online will never decay, fade, rust or go missing. They’d be secure in the elusive cloud. Indefinitely pristine. But not only are we wrong about their longevity, we never consider they could be murdered.
Not by villainous criminals, but by the platforms we trusted to host our personal artifacts.

To make sense of my experience, I spoke with Katie Gach, a Ph.D. candidate in Technology, Media & Society at the University of Colorado Boulder, an expert on digital death and a researcher with Facebook’s Memorialization team. “The lack of tangible, tactile reality that our data has lets us imagine it in these really spiritual, ethereal, eternal ways,” says Gach. “When we don’t see where these things are, we expect them to be everywhere all the time.” After all, only some are privy to the fact that our data is housed in a very material server rack in a cold, humming room. It’s this disconnect between presumption and reality — a confusion between virtual and physical — that clouds our ability to see issues like digital death clearly.

We’re also wrong to believe that our information superhighways won’t crumble. Much like our bridges and roads offline, the foundational infrastructure of our online experiences also requires planning and maintenance. When sites and apps go down nowadays, frustration is met with shock. How could this ever happen?

Dr. Elaine Kasket, author of All the Ghosts in the Machine: Illusions of Immortality in the Digital Age, warns us: “You cannot trust corporations to safeguard your data.” They won’t last forever. Data should always be under our control. Dr. Kasket’s invaluable advice runs counter to another widely held false presumption: that it’s the platform which owns or regulates our data. Not as long as we also maintain a copy...

This blurred responsibility is disconcerting, as it implies a tension between us and the platform. “We presume a level of control and agency in our technology, because we made it,” says Gach. “It’s for us.” So then how is it that we built, and live with, technology which doesn’t entirely work on our behalf? That’s the question I keep coming back to when thinking about my deleted playlist.
While I commend YouTube’s intent to clean up its platform, another fellow human actively designed the system — or rather overlooked a part of that system — which inflicted my pain. Swap “inflict pain” for “create echo chambers,” “spread misinformation,” “catalyze violence” or “addict via dopamine” and we get all the fuel for today’s techlash. How could we do this to ourselves?

My loss feels traumatic for multiple reasons. The sudden, unexpected deletion didn’t allow for any preparation. The deletion hurt, but it felt more barbaric at the hands of a friend, a platform I revered and devoted hours to for over a decade. My loyalty and goodwill weren’t considered. Despite my smiling avatar, I was a faceless violator.

This facelessness also went the other way. Because YouTube is a faceless hundred-billion-dollar corporation, there is no one person to blame. For this reason, it’s easy to understand why Wojcicki, YouTube’s former CEO, was so often a target — she was the only one who could be named and held accountable, no matter the dilemma.

Further, there’s no ritual to memorialize this type of loss. It’s difficult to wake up to an automated email and just carry on. And lastly, I have no closure. I crave just one employee responsible for this design or decision to personally voice understanding. Mourning without empathy or recognition is achingly isolating.

I don’t want offenders spoiling the platform I loved, but I also don’t want to be wrongly labeled an offender in an unwieldy effort to clean up a mess YouTube created itself. More than anything else, I don’t want anyone else to experience what I did.

While we optimistically wait for YouTube to insert more “human” into their process, or wait for the systems to improve from their unbearable and nearly unforgivable state, for now the burden seemingly falls on us — the user. “Doing things at a community level is the intervention that I have come to,” shares Gach.
“Where you make sure that people have their stuff backed up and have their stuff preserved. If your friend is making something that’s meaningful to you, save it for them.” As we depend upon today’s platforms, we need to be aware of the possibility of loss and willing to perform the maintenance. Unlike offline death, this loss is preventable.

Gach continues, “I want people to be more prepared about what their data can be beyond themselves. It also gives people permission to say, ‘Hey, the attachment I feel to this data isn’t stupid.’ It makes sense that you feel attached to this thing that you created, or this person who made this thing.”

When it comes to legacy, I won’t be able to pass down my YouTube playlist or let it represent me after I pass. But at least I’ll have this piece to memorialize its importance in my life. And it will be printed.

This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit zine.kleinkleinklein.com/subscribe

    10 min
