Tuesday, April 18, 2023

Facebook owes you money

 https://www.fastcompany.com/90882616/facebook-lawsuit-settlement-claim-file-money-how-privacy?utm_source=newsletters&utm_medium=email&utm_campaign=FC%20-%20Compass%20Newsletter.Newsletter%20-%20FC%20-%20Compass%204-18-23&leadId=632792&mkt_tok=NjEwLUxFRS04NzIAAAGLMhWt2k6xJ8Sr_8LTwT1Lie3YtPa_3rA8QIWqPEMSepJROkinkSfLjMx8TQiAa4l5mwFKKCxrUQksXuRW-hRMmkFv_557exg14PRY0no


Facebook may owe you money as part of a major privacy lawsuit. Here’s how to find out

If you used Facebook in the United States from 2007 through 2022, you can probably file a claim to collect.

[Photo: Thomas Trutschel/Photothek via Getty Images]

Meta Platforms, Inc., the owner of Facebook, has agreed to a $725 million settlement covering numerous lawsuits alleging that Facebook improperly shared user data without users’ knowledge. Nearly every Facebook user in the United States can claim a share of that $725 million. Here’s what you need to know:

  • What’s happened? Facebook faced numerous lawsuits over various privacy allegations: that it shared user data without the user’s permission, shared a user’s friends’ data without their permission, or failed to properly monitor how third parties used shared user data. Because of the number of suits, most were consolidated into a single class-action suit against the company. Facebook has now agreed to settle that suit by paying a total of $725 million to affected users.
  • Has Facebook admitted wrongdoing? No, Facebook denies any wrongdoing and liability for the allegations. But that’s normal when a company agrees to settle a class action.
  • Who can get part of the $725 million? With a few exceptions, nearly every Facebook user in the United States between May 24, 2007, and December 22, 2022, can make a claim for part of the settlement.
  • Will all of the $725 million go to those affected? No. Some of it will go to cover the costs of administration and legal fees.
  • How much will I get as part of the settlement? That depends upon how many users submit a claim to be included in the settlement.
  • Do I need to take action to get paid? Yes. In order to receive a payment from the settlement, you need to submit a claim. You must submit a claim by August 25, 2023. You can also choose to opt out of the settlement if you want to sue Facebook yourself, or you can object to the settlement. You must do either of those things by July 26, 2023.
  • How will I get paid? You can choose how you want to receive your settlement payment. Options include a prepaid Mastercard, direct deposit to your bank account, PayPal, Venmo, and more.
  • What if I do nothing? You won’t get any payment from the settlement and you’ll give up your right to sue Facebook over allegations covered in this settlement.
  • Where can I learn more about the settlement and my options? The official settlement website has a detailed FAQ.
  • How do I make a claim? You can file a claim directly on the official settlement website.

ABOUT THE AUTHOR

Michael Grothaus is a novelist and author. His new novel, the speculative fiction 'BEAUTIFUL SHINING PEOPLE,' is out now.


Origins of Creativity





 https://www.newyorker.com/magazine/2023/04/24/the-cult-of-creativity-samuel-weil-franklin-book-review?utm_source=nl&utm_brand=tny&utm_mailing=TNY_Magazine_041723&utm_campaign=aud-dev&utm_medium=email&utm_term=tny_weekly_digest&bxid=630de66c203059c3580154ca&cndid=70710298&hasha=551c37974dfebd5e2b0fa3e6a8733435&hashb=5a215e847b8f37515c1083d8def6262bb7431858&hashc=4410bb874c8ac507e87c3b923f198f5016bb851a5f736770abc0caec342b90df&esrc=NL_page



The Origins of Creativity

The concept was devised in postwar America, in response to the cultural and commercial demands of the era. Now we’re stuck with it.
The rise of a consumer economy, worries about conformity and alienation, competition from the Soviets: all were addressed by promoting “creativity.” [Illustration: Cristina Spano]

What is “creative nonfiction,” exactly? Isn’t the term an oxymoron? Creative writers—playwrights, poets, novelists—are people who make stuff up. Which means that the basic definition of “nonfiction writer” is a writer who doesn’t make stuff up, or is not supposed to make stuff up. If nonfiction writers are “creative” in the sense that poets and novelists are creative, if what they write is partly make-believe, are they still writing nonfiction?

Biographers and historians sometimes adopt a narrative style intended to make their books read more like novels. Maybe that’s what people mean by “creative nonfiction”? Here are the opening sentences of a best-selling, Pulitzer Prize-winning biography of John Adams published a couple of decades ago:

In the cold, nearly colorless light of a New England winter, two men on horseback traveled the coast road below Boston, heading north. A foot or more of snow covered the landscape, the remnants of a Christmas storm that had blanketed Massachusetts from one end of the province to the other. Beneath the snow, after weeks of severe cold, the ground was frozen solid to a depth of two feet. Packed ice in the road, ruts as hard as iron, made the going hazardous, and the riders, mindful of the horses, kept at a walk.

This does read like a novel. Is it nonfiction? The only source the author cites for this paragraph verifies the statement “weeks of severe cold.” Presumably, the “Christmas storm” has a source, too, perhaps in newspapers of the time (1776). The rest—the light, the exact depth of frozen ground, the packed ice, the ruts, the riders’ mindfulness, the walking horses—seems to have been extrapolated in order to unfold a dramatic scene, evoke a mental picture. There is also the novelistic device of delaying the identification of the characters. It isn’t until the third paragraph that we learn that one of the horsemen is none other than John Adams! It’s all perfectly plausible, but much of it is imagined. Is being “creative” simply a license to embellish? Is there a point beyond which inference becomes fantasy?


One definition of “creative nonfiction,” often applied to the New Journalism of the nineteen-sixties and seventies, is “journalism that uses the techniques of fiction.” But the techniques of fiction are just the techniques of writing. You can use dialogue and a first-person voice and description and even speculation in a nonfiction work, and, as long as it’s all fact-based and not make-believe, it’s nonfiction.

The term “creative nonfiction” is actually a fairly recent coinage, postdating the advent of the New Journalism by about twenty years. The man credited with it is the writer Lee Gutkind. He seems to have first used “creative nonfiction,” in print, anyway, thirty years ago, though he thought that the term originated in the fellowship application form used by the National Endowment for the Arts. The word “creative,” he explained, refers to “the unique and subjective focus, concept, context and point of view in which the information is presented and defined, which may be partially obtained through the writer’s own voice, as in a personal essay.”

But, again, this seems to cover most writing, or at least most writing that holds our interest. It’s part of the author function: we attribute what we read not to some impersonal and omniscient agent but to the individual named on the title page or in the byline. This has little to do with whether the work is classified as fiction or nonfiction. Apart from “just the facts” newspaper journalism, where an authorial point of view is deliberately suppressed, any writing that has life has “unique and subjective focus, concept, context and point of view.”

Maybe Gutkind wasn’t naming a new kind of writing, though. Maybe he was giving a new name to an old kind of writing. Maybe he wanted people to understand that writing traditionally classified as nonfiction is, or can be, as “creative” as poems and stories. By “creative,” then, he didn’t mean “made up” or “imaginary.” He meant something like “fully human.” Where did that come from?

One answer is suggested by Samuel W. Franklin’s provocative new book, “The Cult of Creativity” (Chicago). Franklin thinks that “creativity” is a concept invented in Cold War America—that is, in the twenty or so years after 1945. Before that, he says, the term barely existed. “Create” and “creation,” of course, are old words (not to mention, as Franklin, oddly, does not, “Creator” and “Creation”). But “creativity,” as the name for a personal attribute or a mental faculty, is a recent phenomenon.

Like a lot of critics and historians, Franklin tends to rely on “Cold War” as an all-purpose descriptor of the period from 1945 to 1965, in the same way that “Victorian” is often used as an all-purpose descriptor of the period from 1837 to 1901. Both are terms with a load of ideological baggage that is never unpacked, and both carry the implication “We’re so much more enlightened now.” Happily, Franklin does not reduce everything to a single-factor Cold War explanation.

In Franklin’s account, creativity, the concept, popped up after the Second World War in two contexts. One was the field of psychology. Since the nineteenth century, when experimental psychology (meaning studies done with research subjects and typically in laboratory settings, rather than from an armchair) had its start, psychologists have been much given to measuring mental attributes.

For example, intelligence. Can we assign amounts or degrees of intelligence to individuals in the same way that we assign them heights and weights? One way of doing this, some people thought, was by measuring skull sizes, cranial capacity. There were also scientists who speculated about the role of genetics and heredity. By the early nineteen-hundreds, though, the preferred method was testing.

The standard I.Q. test, the Stanford-Binet, dates from 1916. Its aim was to measure “general intelligence,” what psychologists called the g factor, on the presumption that a person’s g was independent of circumstances, like class or level of education or pretty much any other nonmental thing. Your g factor, the theory went, was something you were born with.

The SAT, which was introduced in 1926 but was not widely used in college admissions until after the Second World War, is essentially an I.Q. test. It’s supposed to pick out the smartest high-school students, regardless of their backgrounds, and thus serve as an engine of meritocracy. Whoever you are, the higher you score the farther up the ladder you get to move. Franklin says that, around 1950, psychologists realized that no one had done the same thing for creativity. There was no creativity I.Q. or SAT, no science of creativity or means of measuring it. So they set out to, well, create one.

They ran into difficulties almost immediately, and Franklin thinks that those difficulties have never gone away, that they are, in a sense, intrinsic to the concept of creativity itself. First of all, how do you peel away “creativity” from other markers of distinction, such as genius or imagination or originality or, for that matter, persistence? Are those simply aspects of a single creative faculty? Or can one score high on an originality or a persistence measure but low on creativity?

Then, do you study creativity by analyzing people commonly acknowledged to be creative—the canonical artist or composer or physicist—and figure out what they all have in common? Or could someone who has never actually created anything be creative, in the way that innately intelligent people can end up in unskilled jobs—the “born to blush unseen” syndrome? If that were the case, you would need an I.Q. test for creativity—call it a C.Q. test—to find such latent aptitudes.

But are all acts we call creative in fact commensurable? Is there some level on which the theory of relativity is no different from “Hamlet” or Pokémon? Psychologists said yes. Making something new, original, and surprising is what is meant by being creative, and a better mousetrap qualifies. What about creating something new, original, and terrible, like a weapon of mass destruction? Psychologists seem to have danced around that problem. For the most part, being creative, like being intelligent, rich, and thin, was something a person could never have too much of.

When psychologists asked what sort of habits and choices were markers of creativity, they came up with things like “divergent thinking” and “tolerance for ambiguity.” They reported that, on tests, creative people preferred abstract art and asymmetrical images. As Franklin points out, those preferences also happened to match up with the tastes of the mid-century educated classes. To put it a little more cynically, the tests seem to have been designed so that the right people passed them.

Franklin is understandably skeptical of the assumptions about mental faculties and inherent aptitudes made by the psychologists whose work he writes about. “By insisting on a psychological cause for creative accomplishment,” he says, “and bracketing all social factors, they deprived themselves of some of the most obvious explanations for creative accomplishment, trapping themselves instead in a tautological spiral that left them bewildered and frustrated.”

But, of course, this is also the problem with the SAT. In a meritocratic society, if creative accomplishment is, like intelligence, rewarded in the workplace, then it must be correlated with some inborn aptitude. Otherwise, we are just reproducing the existing social hierarchy. As Franklin observes, the creativity fad of the nineteen-fifties seems to have had zero impact on the privileged status of white males. The same is true of the SAT. It was not until colleges developed other methods of evaluating students with an eye toward increasing diversity, which generally meant giving less weight to standardized tests, that more dramatic effects on the demographics of higher education were seen.

[Cartoon: Becky Barnicoat]

Workplaces, including businesses and the military, were the other area where the concept of creativity shows up after 1945. Postwar organizations prized creativity. In Franklin’s account, these two streams, psychological research and business demands, arose semi-independently, but they obviously fed into and reinforced each other. Employers wanted creative workers; psychologists claimed they had the means to identify them. The former gave work to the latter.

Why the imperative to hire creative people? Franklin suggests that competition with the Soviets, spurred by anxiety about a technology gap, drove the country to search for better ways to get the most out of its human resources. You could argue that the women’s movement arose out of the same impulse. Forget about women’s dignity and right to self-fulfillment. It was just irrational, when you were fighting a Cold War, to exclude half the population from the labor force.

But American industry might have come up with other rubrics besides “creativity” to use in retooling the workforce. Probably the principal factor in the shift to creativity was not the Cold War but the transformation of the American economy from manufacturing to service (which includes financial services, health care, information, technology, and education). Franklin reports that, in 1956, the number of white-collar workers exceeded the number of blue-collar workers for the first time in American history. That is a huge shift on the production side, and it coincided with a huge shift on the demand side—consumerism. The postwar economy was the supermarket economy: products, many of which might be manufactured offshore, sit on the shelf, begging you to buy them. This meant that business had to conceive of its priorities in a new way.

In the old manufacturing economy, if you operated a factory using the techniques of “scientific management,” your workers were not required to think. They were required only to perform set tasks as efficiently as possible. In that kind of business, creativity just gets in the way. But, if your business is about sales, marketing, product design, innovation, or tweaks on standard products, you need ideas, which means that you want to hire the kind of people who can come up with them.

An early and persistent strategy for maximizing creativity in the workplace was known as “brainstorming.” Management set up sessions where workers got together and batted around ideas, on the theory that discussions held without an agenda or top-down guidance would encourage people to speculate freely, to think outside the box. The belief was that this was how creative people, like artists and poets, came up with new stuff. They needed to be liberated from organizational regimens. So workers played at being artists. Dress was informal; sessions were held in relaxed settings designed to look like living rooms; conversation was casual (though someone was taking notes). The idea was not to accomplish tasks. The idea was to, essentially, make stuff up.

Brainstorming would eventually morph into a process called Synectics. Synectics is a far more immersive and permissive form of problem-solving, closer to group therapy. The assumption there is that you want to access the subconscious. That’s where the really novel ideas are.

Franklin suggests that brainstorming and Synectics sessions produced lots of bad or unusable ideas, and no surprise. You can’t free-associate a design solution or a marketing strategy from scratch. You need to have a pretty informed idea of what the box is before you can think outside it.

But part of the point of this brainstorming must have been to enable workers to feel ownership of the product. They weren’t just punching a clock. They were contributing to the creation of something, even if it was something for which there was no crying need. Franklin tells us that Synectics can be credited with two products: Pringles and the Swiffer. I guess you can’t argue with that—though it’s interesting to learn that when you descend into the depths of the subconscious, you emerge with . . . a Pringle.

Franklin argues that the appeal of workplace creativity was that it addressed two anxieties about modern life: conformity and alienation. Postwar intellectuals worried about the “organization man” (the title of a book by the journalist William Whyte) and the “other-directed” personality (diagnosed in the sociologist David Riesman’s “The Lonely Crowd”). These were seen as socially dangerous types. People who did what they were told and who wanted to be like everyone else, who were not “inner-directed,” were people easily recruited to authoritarian movements. They were threats to liberal democracy, and hence to the free-market economy.

The branch of psychology most attuned to anxieties about alienation and conformity is known as humanistic psychology. For the humanistic psychologists, creativity is linked to the concept of authenticity. It is, at bottom, a means of self-expression. Uncreative people are rigid and repressed; creative people are authentically themselves, and therefore fully human. As the psychologist and popular author Rollo May put it, creativity is not an aberrant quality, or something associated with psychic unrest—the tormented-artist type. On the contrary, creativity is “the expression of normal people in the act of actualizing themselves.” It is associated with all good things: individualism, dignity, and humanity. And everyone has it. It just needs to be psychically unlocked.

You can see hints of the counterculture here, and humanistic psychology did lead, as Franklin notes, to encounter therapy, T-groups, and sensitivity training. What’s interesting, though, is that it was in American business, and not the Haight-Ashbury, that these ideals first became enshrined. Countercultural values turned out to be entirely compatible with consumer capitalism in the information age. “The postwar cult of creativity,” Franklin says, “was driven by a desire to impart on science, technology, and consumer culture some of the qualities widely seen to be possessed by artists, such as nonconformity, passion for work, and a humane, even moral sensibility, in addition to, of course, a penchant for the new.”

The industry that most avidly grabbed on to the term “creative” to glamorize what it did was the very motor of consumerism: advertising. In the nineteen-fifties and sixties, ad agencies abandoned the old “reason why” mode of advertising a product (“Here is what you need it for”) and replaced it with branding. They were no longer selling a product. They were selling an idea about the product. People were buying an image they wanted to be associated with. It was the adman’s job to create that image.

Creating an image for a marketing campaign or tweaking a product line seems pretty distant from writing a poem or painting a picture. But the creativity conceptualizers, Franklin says, sought to elide the difference. He thinks that management theorists wanted to appropriate the glamour and prestige of the artist and confer those attributes on admen and product designers.

Yet wasn’t the glamour and prestige of the artist related to a popular belief that artists are not interested in worldly things or practicality? Workplace creativity was supposed to be good for business. It was supposed to increase productivity and make money, things that are not supposed to motivate poets. Yet it’s easy to believe that business could co-opt the reputation of the fine arts without much trouble. The joy of creation plus a nice income. It was the best of both worlds.

Readers do not normally wish books longer, but a couple of discussions are missing from “The Cult of Creativity.” One is about art itself. The early Cold War was a dramatic period in cultural history, and claims about originality and creativity in the arts were continually being debated. Among the complaints about Pop art, when it bounded onto the scene, in 1962, was that the painters were just copying comic books and product labels, not creating. It’s possible that as commercial culture became more invested in the traditional attributes of fine art, fine art became less so.

One also wishes for more on the twenty-first century. Franklin says that the creativity bubble began to shrink in the nineteen-sixties, but it plainly got reinflated in the nineteen-nineties. The pages Franklin devotes to the contemporary creativity landscape are the freshest and most fun in the book.

The iconic image of the startup economy—casually dressed workers in open spaces jotting inspirations on a whiteboard—is a barely updated version of the old nineteen-fifties brainstorming sessions. Those startup workers are also taking ownership (usually in the form of stock options, it’s true) of the products the company makes.

The landscape of the tech universe is shifting right now, but for several decades a whole creativity life style became associated with it. Work was play and play was work. Coders dressed like bohemians. Business was transacted (online) in cafés, where once avant-gardists had sipped espresso and shared their poems. “The star of this new economy,” Franklin writes, was “the hip freelancer or independent studio artist, rather than the unionized musician or actor who had been at the heart of the cultural industries.” In his view, this is perfectly natural, since “creativity” was an economic, not aesthetic, notion to begin with. “The concept of creativity,” he concludes, “never actually existed outside of capitalism.”

Franklin doesn’t mention “creative nonfiction,” either. But his book does give us a way of understanding the term as an effort to endow nonfiction writers with the same qualities—individualism, outside-the-box thinking, and invention—that creative people are assumed to possess. “Creative nonfiction” in this respect doesn’t mean “made up.” It’s an honorific. In an economy that claims to prize creative workers, the nonfiction writer qualifies.

Creating things today seems to be as cool as it ever was. Fewer college students may be taking literature courses, but creative-writing courses are oversubscribed. And what do those students want to write? Creative nonfiction. ♦