
Away C.E.O. Is Back, Just Weeks After Stepping Down

She apologized for her management style and stepped down as chief executive. Now, she says it was a mistake to fall on her sword and is taking her job back.

In an article published by the technology website The Verge that went viral last month, former employees of the luggage company Away, one of the fastest-growing retail start-ups of recent years, accused its chief executive, Steph Korey, of creating a toxic culture.

The article included text messages that a Verge editor described on Twitter as showing Ms. Korey using the workplace messaging application Slack “as a tool to stalk and bully junior and minority employees.”

In the article, former employees — who were identified by pseudonyms — contended that Ms. Korey pushed them too hard. In one message quoted in the article, which was sent at 3 a.m., she told employees on the customer service team that they could not work from home or submit vacation requests until customer service problems she had identified were resolved. In others, she came across as passive-aggressive.

Within hours of its publication, the article had created a social media firestorm around the company, which is worth more than $1 billion in the private market with plans to go public. For a company focused on a millennial audience and a brand that seeks to evoke a sense of community, the story was viewed internally as existential.

Within 24 hours, Ms. Korey had issued a lengthy apology. “I am sincerely sorry for what I said and how I said it. It was wrong, plain and simple,” she wrote. “I can imagine how people felt reading those messages from the past, because I was appalled to read them myself.” Days later, the company said that it was hiring a new chief executive and that Ms. Korey would become executive chairwoman.

The episode, the latest example of a fast-growing company run by young founders that has found itself in a crisis, was viewed within the insular world of start-ups as a swift fall for Ms. Korey, Away’s 31-year-old co-founder.

The new chief executive, Stuart Haselden, plans to start his job on Monday, having been recruited from Lululemon Athletica, the company famous for its leggings.

But there is one new, significant wrinkle: His title won’t be chief executive — he will be co-chief executive with Ms. Korey. She isn’t going anywhere. The company plans to announce the move on Monday morning.

“Frankly, we let some inaccurate reporting influence the timeline of a transition plan that we had,” Ms. Korey said in an interview last week. With some time and perspective, she said, the company’s board members decided to reverse themselves. “All of us said, ‘It’s not right.’”

The members of Away’s board say they feel as if they fell victim to management by Twitter mob.

The company now says it disputes The Verge’s reporting and has hired Elizabeth M. Locke, the lawyer who successfully brought a defamation case against Rolling Stone magazine for a story about a supposed gang rape at the University of Virginia. It is unclear whether Away plans to bring a lawsuit.

In a statement, The Verge said, “Steph Korey responding to our reporting by saying her behavior and comments were ‘wrong, plain and simple’ and then choosing to step down as C.E.O. speaks for itself.”

Sitting in a windowless conference room at the company’s SoHo headquarters, Ms. Korey, at one point nearly breaking down in tears, said that the month since the article was published had been a tough lesson about management — and herself. She was bombarded by criticism on Twitter and other social media platforms that she thought would put the company’s future in jeopardy.

“It’s very upsetting if suddenly total strangers tell you that you should get an abortion,” said Ms. Korey, who is pregnant. One user on Twitter wrote: “Imagine how she’ll treat that baby.”

In the moment, she said, she chose to take herself out of the chief executive role and make herself executive chairwoman. “I said, ‘I don’t know if the company needs a C.E.O. under fire right now,’” she said. “‘Why don’t we just accelerate our transition plan?’”

In a separate interview, Ludwig Ensthaler, a partner at the venture capital firm Global Founders Capital and the only independent director on Away’s four-member board, confirmed that it had been Ms. Korey’s decision to step down and that there was no pressure from outside investors. He added that he should not have accepted the restructuring plan she proposed in the first place.

Ms. Korey had already recruited Mr. Haselden to the company to become its president, with the promise that, after a transition period, he would be elevated to chief executive to help take the company public. When the plan changed after the Verge article was published, she said she would become executive chairwoman and Mr. Haselden the chief executive. But behind the scenes, she said, she expected both of them to operate pretty much in their original roles, just with different titles. Ms. Korey’s co-founder, Jen Rubio, will remain president and chief brand officer.

“I honestly thought that people didn’t care that much about the inner workings of Away,” she said. “Who is C.E.O. and who is executive chairman — that wasn’t something that, at a private company that’s less than four years old that sells travel products, I just didn’t think would be news and people would care.”

But, she said, it quickly became clear that her plan to remain at Away — effectively in the same role but with a new title — was not understood inside or outside the company.

“The way it became perceived it was like I stepped down and like I left the company,” she said. “I have a very external-facing role working with new vendors, working with new partners, recruiting new candidates. And without a change, it looks like they have a board director reaching out to them who doesn’t work at the company.”

Mr. Haselden said in a telephone interview that the article didn’t paint Ms. Korey as the person he knew and said her original decision to step aside “was very selfless in trying to defuse the firestorm of social media.”

“But it just created a misconception that she was exiting the business, which was never the intent,” he added. Making them both co-chief executives, he said, “will clarify how we intended to operate from the beginning.” Ms. Korey said she still planned to step aside eventually, after a transition period, at which point Mr. Haselden would become the sole chief executive.

Whatever the accuracy of the article — The Verge has since published several updates, clarifications and corrections — it is hard to judge whether Ms. Korey herself has changed.

The company provided a trove of emails from employees that suggested they loved working for her. Yet even after the Verge article appeared, employees continued to leak screenshots of Away’s Slack channels to the site, suggesting that whatever changes had been made, some people inside the company remained unhappy.

Ms. Korey said she has done a lot of soul-searching since the article was published. While she maintained that it misrepresented her behavior, she said she recognized that she had made mistakes and could improve.

“When I think back on ways I’ve phrased feedback, there have been times where the word choice isn’t as thoughtful as it should have been, or the way it was framed actually wasn’t as constructive as it could have been,” she said. “Those are not, in the eyes of our leadership and the eyes of our board, terminal, unsolvable problems.”


Facebook Says It Won’t Back Down From Allowing Lies in Political Ads

SAN FRANCISCO — Defying pressure from Congress, Facebook said on Thursday that it would continue to allow political campaigns to use the site to target advertisements to particular slices of the electorate and that it would not police the truthfulness of the messages sent out.

The stance put Facebook, the most important digital platform for political ads, at odds with some of the other large tech companies, which have begun to put new limits on political ads.

Facebook’s decision, telegraphed in recent months by executives, is likely to harden criticism of the company heading into this year’s presidential election.

Political advertising cuts to the heart of Facebook’s outsize role in society, and the company has found itself squeezed between liberal critics, who want it to do a better job of policing its various social media platforms, and conservatives, who say their views are being unfairly muzzled.

The issue has raised important questions regarding how heavy a hand technology companies like Facebook — which also owns Instagram and the messaging app WhatsApp — and Google should exert when deciding what types of political content they will and will not permit.

By maintaining the status quo, Facebook executives are essentially saying they are doing the best they can without government guidance and see little benefit to the company or the public in changing.

In a blog post, a company official echoed Facebook’s earlier calls for lawmakers to set firm rules.

“In the absence of regulation, Facebook and other companies are left to design their own policies,” Rob Leathern, Facebook’s director of product management overseeing the advertising integrity division, said in the post. “We have based ours on the principle that people should be able to hear from those who wish to lead them, warts and all, and that what they say should be scrutinized and debated in public.”

Other social media companies have decided otherwise, and some had hoped Facebook would quietly follow their lead. In late October, Twitter’s chief executive, Jack Dorsey, banned all political advertising from his network, citing the challenges that novel digital systems present to civic discourse. Google quickly followed suit with limits on political ads across some of its properties, though narrower in scope.

Reaction to Facebook’s policy broke down largely along party lines.

The Trump campaign, which has been highly critical of any attempts by technology companies to regulate political advertising and has already spent more than $27 million on the platform, largely supported Facebook’s decision not to interfere in targeting ads or to set fact-checking standards.

“Our ads are always accurate so it’s good that Facebook won’t limit political messages because it encourages more Americans to be involved in the process,” said Tim Murtaugh, a spokesman for the Trump campaign. “This is much better than the approaches from Twitter and Google, which will lead to voter suppression.”

Democratic presidential candidates and outside groups decried the decision.

“Facebook is paying for its own glowing fake news coverage, so it’s not surprising they’re standing their ground on letting political figures lie to you,” Senator Elizabeth Warren said on Twitter.

Ms. Warren, who has been among the most critical of Facebook and regularly calls for major tech companies to be broken up, reiterated her stance that the social media company should face tougher policies.

The Biden campaign was similarly critical. The campaign has confronted Facebook over an ad run by President Trump’s campaign that attacked Joseph R. Biden Jr.’s record on Ukraine.

“Donald Trump’s campaign can (and will) still lie in political ads,” Bill Russo, the deputy communications director for Mr. Biden, said in a statement. “Facebook can (and will) still profit off it. Today’s announcement is more window dressing around their decision to allow paid misinformation.”

But many Democratic groups willing to criticize Facebook had to walk a fine line; they have pushed for more regulation when it comes to fact-checking political ads, but they have been adamantly opposed to any changes to the ad-targeting features.

On Thursday, some Democratic outside groups welcomed Facebook’s decision not to limit microtargeting, but still thought the policy fell short.

“These changes read to us mostly as a cover for not making the change that is most vital: ensuring politicians are not allowed to use Facebook as a tool to lie to and manipulate voters,” said Madeline Kriger, who oversees digital ad buying at Priorities USA, a Democratic super PAC.

Facebook has played down the business opportunity in political ads, saying the vast majority of its revenue came from commercial, not political, ads. But lawmakers have noted that Facebook ads could be a focal point of Mr. Trump’s campaign as well as those of top Democrats.

Facebook’s hands-off ad policy has already allowed for misleading advertisements. In October, a Facebook ad from the Trump campaign made false accusations about Mr. Biden and his son, Hunter Biden. The ad quickly went viral and was viewed by millions. After the Biden campaign asked Facebook to take down the ad, the company refused.

“Our approach is grounded in Facebook’s fundamental belief in free expression, respect for the democratic process and the belief that, in mature democracies with a free press, political speech is already arguably the most scrutinized speech there is,” Facebook’s head of global elections policy, Katie Harbath, wrote in a letter to the Biden campaign.

In an attempt to provoke Facebook, Ms. Warren’s presidential campaign ran an ad falsely claiming that the company’s chief executive, Mark Zuckerberg, was backing the re-election of Mr. Trump. Facebook did not take the ad down.

Criticism seemed to stiffen Mr. Zuckerberg’s resolve. Company officials said he and Sheryl Sandberg, Facebook’s chief operating officer, had ultimately made the decision to stand firm.

In a strongly worded speech at Georgetown University in October, Mr. Zuckerberg said he believed in the power of unfettered speech, including in paid advertising, and did not want to be in the position to police what politicians could and could not say to constituents. Facebook’s users, he said, should be allowed to make those decisions for themselves.

“People having the power to express themselves at scale is a new kind of force in the world — a Fifth Estate alongside the other power structures of society,” he said.

Facebook officials have repeatedly said significant changes to its rules for political or issue ads could harm the ability of smaller, less well-funded organizations to raise money and organize across the network.

Instead of overhauling its policies, Facebook has made small tweaks. Mr. Leathern said Facebook would add greater transparency features to its library of political advertising in the coming months, a resource for journalists and outside researchers to scrutinize the types of ads run by the campaigns.

Facebook also will add a feature that allows users to see fewer campaign and political issue ads in their news feeds, something the company has said many users have requested.

There was considerable debate inside Facebook about whether it should change. Late last year, hundreds of employees supported an internal memo that called on Mr. Zuckerberg to limit the abilities of Facebook’s political advertising products.

On Dec. 30, Andrew Bosworth, the head of Facebook’s virtual and augmented reality division, wrote on his internal Facebook page that, as a liberal, he found himself wanting to use the social network’s powerful platform against Mr. Trump.

But Mr. Bosworth said that even though keeping the current policies in place “very well may lead to” Mr. Trump’s re-election, it was the right decision. Dozens of Facebook employees pushed back on Mr. Bosworth’s conclusions, arguing in the comments section below his post that politicians should be held to the same standard that applies to other Facebook users.

For now, Facebook appears willing to risk disinformation in support of unfettered speech.

“Ultimately, we don’t think decisions about political ads should be made by private companies,” Mr. Leathern said. “Frankly, we believe the sooner Facebook and other companies are subject to democratically accountable rules on this, the better.”

Mike Isaac reported in San Francisco and Cecilia Kang reported from Washington. Nick Corasaniti contributed reporting from New York.


Lord of the Rings, 2020 and Stuffed Oreos: Read the Andrew Bosworth Memo

On Dec. 30, Andrew Bosworth, a longtime Facebook executive and confidant of Mark Zuckerberg, wrote a long memo on the company’s internal network.

In the post, titled “Thoughts for 2020,” Mr. Bosworth — who oversaw Facebook’s advertising efforts during the 2016 election and is now in charge of the company’s virtual and augmented reality division — admitted that President Trump’s savvy use of Facebook’s advertising tools “very well may lead to” his re-election. But he maintained that the company should not change its policies on political advertising, saying that doing so in order to avert a victory by Mr. Trump would be a misuse of power, comparing it to a scene from “The Lord of the Rings.”

Mr. Bosworth, who is seen by some inside Facebook as a proxy of sorts for Mr. Zuckerberg, also weighed in on a variety of issues that have vexed Facebook for the past few years, including data privacy scandals, Russian interference, political polarization and the debate over whether Facebook is healthy for society.

Here is the full post as written:

The election of Donald Trump immediately put a spotlight on Facebook. While the intensity and focus of that spotlight may be unfair I believe it isn’t unjust. Scrutiny is warranted given our position in society as the most prominent of a new medium. I think most of the criticisms that have come to light have been valid and represent real areas for us to serve our community better. I don’t enjoy having our flaws exposed, but I consider it far better than the alternative where we remain ignorant of our shortcomings.

One trap I sometimes see people falling into is to dismiss all feedback when they can invalidate one part of it. I see that with personal feedback and I see it happening with media coverage. The press often gets so many details wrong it can be hard to trust the veracity of their conclusions. Dismissing the whole because of flaws in parts is a mistake. The media has limited information to work with (by our own design!) and they sometimes get it entirely wrong but there is almost always some critical issue that motivated them to write which we need to understand.

It is worth looking at the 2016 Election which set this chain of events in motion. I was running our ads organization at the time of the election and had been for the four years prior (and for one year after). It is worth reminding everyone that Russian Interference was real but it was mostly not done through advertising. $100,000 in ads on Facebook can be a powerful tool but it can’t buy you an American election, especially when the candidates themselves are putting up several orders of magnitude more money on the same platform (not to mention other platforms).

Instead, the Russians worked to exploit existing divisions in the American public for example by hosting Black Lives Matter and Blue Lives Matter protest events in the same city on the same day. The people who shows up to those events were real even if the event coordinator was not. Likewise the groups of Americans being fed partisan content was real even if those feeding them were not. The organic reach they managed sounds very big in absolute terms and unfortunately humans are bad at contextualizing big numbers. Whatever reach they managed represents an infinitesimal fraction of the overall content people saw in the same period of time and certainly over the course of an election across all media.

So most of the information floating around that is widely believed isn’t accurate. But who cares? It is certainly true that we should have been more mindful of the role both paid and organic content played in democracy and been more protective of it. On foreign interference, Facebook has made material progress and while we may never be able to fully eliminate it I don’t expect it to be a major issue for 2020.

Misinformation was also real and related but not the same as Russian interference. The Russians may have used misinformation alongside real partisan messaging in their campaigns, but the primary source of misinformation was economically motivated. People with no political interest whatsoever realized they could drive traffic to ad-laden websites by creating fake headlines and did so to make money. These might be more adequately described as hoaxes that play on confirmation bias or conspiracy theory. In my opinion this is another area where the criticism is merited. This is also an area where we have made dramatic progress and don’t expect it to be a major issue for 2020.

It is worth noting, as it is relevant at the current moment, that misinformation from the candidates themselves was not considered a major shortcoming of political advertising on FB in 2016 even though our policy then was the same as it is now. These policies are often covered by the press in the context of a profit motive. That’s one area I can confidently assure you the critics are wrong. Having run our ads business for some time it just isn’t a factor when we discuss the right thing to do. However, given that those conversations are private I think we can all agree the press can be forgiven for jumping to that conclusion. Perhaps we could do a better job exposing the real cost of these mistakes to make it clear that revenue maximization would have called for a different strategy entirely.

Cambridge Analytica is one of the more acute cases I can think of where the details are almost all wrong but I think the scrutiny is broadly right. Facebook very publicly launched our developer platform in 2012 in an environment primarily scrutinizing us for keeping data to ourselves. Everyone who added an application got a prompt explaining what information it would have access to and at the time it included information from friends. This may sound crazy in a 2020 context but it received widespread praise at the time. However the only mechanism we had for keeping data secure once it was shared was legal threats which ultimately didn’t amount to much for companies which had very little to lose. The platform didn’t build the value we had hoped for our consumers and we shut this form of it down in 2014.

The company Cambridge Analytica started by running surveys on Facebook to get information about people. It later pivoted to be an advertising company, part of our Facebook Marketing Partner program, who other companies could hire to run their ads. Their claim to fame was psychographic targeting. This was pure snake oil and we knew it; their ads performed no better than any other marketing partner (and in many cases performed worse). I personally regret letting them stay on the FMP program for that reason alone. However at the time we thought they were just another company trying to find an angle to promote themselves and assumed poor performance would eventually lose them their clients. We had no idea they were shopping an old Facebook dataset that they were supposed to have deleted (and certified to us in writing that they had).

When Trump won, Cambridge Analytica tried to take credit so they were back on our radar but just for making [expletive] claims about their own importance. I was glad when the Trump campaign manager Brad Parscale called them out for it. Later on, we found out from journalists that they had never deleted the database and had instead made elaborate promises about its power for advertising. Our comms team decided it would be best to get ahead of the journalists and pull them from the platform. This was a huge mistake. It was not only bad form (justifiably angering the journalists) but we were also fighting the wrong battle. We wanted to be clear this had not been a data breach (which, to be fair to us, it absolutely was not) but the real concern was the existence of the dataset no matter how it happened. We also sent the journalists legal letters advising them not to use the term “breech” which was received normally by the NYT (who agreed) and aggressively by The Guardian (who forged ahead with the wrong terminology, furious about the letter) in spite of it being a relatively common practice I am told.

In practical terms, Cambridge Analytica is a total non-event. They were snake oil salespeople. The tools they used didn’t work, and the scale they used them at wasn’t meaningful. Every claim they have made about themselves is garbage. Data of the kind they had isn’t that valuable to being with and worse it degrades quickly, so much so as to be effectively useless in 12-18 months. In fact the United Kingdom Information Commissioner’s Office (ICO) seized all the equipment at Cambridge Analytica and found that there was zero data from any UK citizens! So surely, this is one where we can ignore the press, right? Nope. The platform was such a poor move that the risks associated were bound to come to light. That we shut it down in 2014 and never paid the piper on how bad it was makes this scrutiny justified in my opinion, even if it is narrowly misguided.

So was Facebook responsible for Donald Trump getting elected? I think the answer is yes, but not for the reasons anyone thinks. He didn’t get elected because of Russia or misinformation or Cambridge Analytica. He got elected because he ran the single best digital ad campaign I’ve ever seen from any advertiser. Period.

To be clear, I’m no fan of Trump. I donated the max to Hillary. After his election I wrote a post about Trump supporters that I’m told caused colleagues who had supported him to feel unsafe around me (I regret that post and deleted shortly after).

But Parscale and Trump just did unbelievable work. They weren’t running misinformation or hoaxes. They weren’t microtargeting or saying different things to different people. They just used the tools we had to show the right creative to each person. The use of custom audiences, video, ecommerce, and fresh creative remains the high water mark of digital ad campaigns in my opinion.

That brings me to the present moment, where we have maintained the same ad policies. It occurs to me that it very well may lead to the same result. As a committed liberal I find myself desperately wanting to pull any lever at my disposal to avoid the same result. So what stays my hand?

I find myself thinking of the Lord of the Rings at this moment. Specifically when Frodo offers the ring to Galadrial and she imagines using the power righteously, at first, but knows it will eventually corrupt her. As tempting as it is to use the tools available to us to change the outcome, I am confident we must never do that or we will become that which we fear.

The philosopher John Rawls reasoned that the only moral way to decide something is to remove yourself entirely from the specifics of any one person involved, behind a so called “Veil of Ignorance.” That is the tool that leads me to believe in liberal government programs like universal healthcare, expanding housing programs, and promoting civil rights. It is also the tool that prevents me from limiting the reach of publications who have earned their audience, as distasteful as their content may be to me and even to the moral philosophy I hold so dear.

That doesn’t mean there is no line. Things like incitement of violence, voter suppression, and more are things that same moral philosophy would safely allow me to rule out. But I think my fellow liberals are a bit too, well, liberal when it comes to calling people Nazi’s.

If we don’t want hate mongering politicians then we must not elect them. If they are getting elected then we have to win hearts and minds. If we change the outcomes without winning the minds of the people who will be ruled then we have a democracy in name only. If we limit what information people have access to and what they can say then we have no democracy at all.

This conversation often raises the alarm around filter bubbles, but that is a myth that is easy to dispel. Ask yourself how many newspapers and news programs people read/watched before the internet. If you guessed “one and one” on average you are right, and if you guessed those were ideologically aligned with them you are right again. The internet exposes them to far more content from other sources (26% more on Facebook, according to our research). This is one that everyone just gets wrong.

The focus on filter bubbles causes people to miss the real disaster which is polarization. What happens when you see 26% more content from people you don’t agree with? Does it help you empathize with them as everyone has been suggesting? Nope. It makes you dislike them even more. This is also easy to prove with a thought experiment: whatever your political leaning, think of a publication from the other side that you despise. When you read an article from that outlet, perhaps shared by an uncle or nephew, does it make you rethink your values? Or does it make you retreat further into the conviction of your own correctness? If you answered the former, congratulations you are a better person than I am. Every time I read something from Breitbart I get 10% more liberal.

What does all of this say about the nature of the algorithmic rewards? Everyone points to top 0.1% content as being acutely polarized but how steep are the curves? What does the top 1% or 5% look like? And what is the real reach across those curves when compared to other content? I think the call for algorithmic transparency can sometimes be overblown but being more transparent about this type of data would likely be healthy.

What I expect people will find is that the algorithms are primarily exposing the desires of humanity itself, for better or worse. This is a Sugar, Salt, Fat problem. The book of that name tells a story ostensibly about food but in reality about the limited effectiveness of corporate paternalism. A while ago Kraft foods had a leader who tried to reduce the sugar they sold in the interest of consumer health. But customers wanted sugar. So instead he just ended up reducing Kraft market share. Health outcomes didn’t improve. That CEO lost his job. The new CEO introduced quadruple stuffed Oreos and the company returned to grace. Giving people tools to make their own decisions is good but trying to force decisions upon them rarely works (for them or for you).

In these moments people like to suggest that our consumers don’t really have free will. People compare social media to nicotine. I find that wildly offensive, not to me but to addicts. I have seen family members struggle with alcoholism and classmates struggle with opioids. I know there is a battle for the terminology of addiction but I side firmly with the neuroscientists. Still, while Facebook may not be nicotine I think it is probably like sugar. Sugar is delicious and for most of us there is a special place for it in our lives. But like all things it benefits from moderation.

At the end of the day we are forced to ask what responsibility individuals have for themselves. Set aside substances that directly alter our neurochemistry unnaturally. Make costs and trade-offs as transparent as possible. But beyond that each of us must take responsibility for ourselves. If I want to eat sugar and die an early death that is a valid position. My grandfather took such a stance towards bacon and I admired him for it. And social media is likely much less fatal than bacon.

To bring this uncharacteristically long and winding essay full circle, I wanted to start a discussion about what lessons people are taking away from the press coverage. My takeaway is that we were late on data security, misinformation, and foreign interference. We need to get ahead of polarization and algorithmic transparency. What are the other big topics people are seeing and where are we on those?


Facebook Says It Will Ban ‘Deepfakes’

WASHINGTON — Facebook said on Monday that it would ban videos that are heavily manipulated by artificial intelligence, known as deepfakes, from its platform.

In a blog post, a company executive said Monday evening that the social network would remove videos altered by artificial intelligence in ways that “would likely mislead someone into thinking that a subject of the video said words that they did not actually say.”

The policy will not extend to parody or satire, the executive, Monika Bickert, said, nor will it apply to videos edited to omit or change the order of words.

Ms. Bickert said all videos posted would still be subject to Facebook’s system for fact-checking potentially deceptive content. Content found to be factually incorrect appears less prominently in the site’s news feed and is labeled false.

The company’s new policy was first reported by The Washington Post.

Facebook was heavily criticized last year for refusing to take down an altered video of Speaker Nancy Pelosi that had been edited to make it appear as though she was slurring her words. At the time, the company defended its decision, saying it had subjected the video to its fact-checking process and had reduced its reach on the social network.

It did not appear that the new policy would have changed the company’s handling of the video of Ms. Pelosi.

The announcement comes ahead of a hearing before the House Energy & Commerce Committee on Wednesday morning, during which Ms. Bickert, Facebook’s vice president of global policy management, is expected to testify on “manipulation and deception in the digital age,” alongside other experts.

Because Facebook remains the No. 1 platform for sharing false political stories, according to disinformation researchers, there is particular urgency to spot and halt novel forms of digital manipulation before they spread.

Computer scientists have long warned that new techniques used by machines to generate images and sounds that are indistinguishable from the real thing can vastly increase the volume of false and misleading information online. And false political information is circulating rapidly online ahead of the 2020 presidential elections in the United States.

In late December, Facebook announced it had removed hundreds of accounts, including pages, groups and Instagram feeds, meant to fool users in the United States and Vietnam with fake profile photos generated with the help of artificial intelligence.

David McCabe reported from Washington, and Davey Alba from New York.


Awash in Disinformation Before Vote, Taiwan Points Finger at China

TAIPEI, Taiwan — At first glance, the bespectacled YouTuber railing against Taiwan’s president, Tsai Ing-wen, just seems like a concerned citizen making an appeal to his fellow Taiwanese.

He speaks Taiwanese-accented Mandarin, with the occasional phrase in Taiwanese dialect. His captions are written with the traditional Chinese characters used in Taiwan, not the simplified ones used in China. With outrage in his voice, he accuses Ms. Tsai of selling out “our beloved land of Taiwan” to Japan and the United States.

The man, Zhang Xida, does not say in his videos whom he works for. But other websites and videos make it clear: He is a host for China National Radio, the Beijing-run broadcaster.

As Taiwan gears up for a major election this week, officials and researchers worry that China is experimenting with social media manipulation to sway the vote. Doing so would be easy, they fear, in the island’s rowdy democracy, where the news cycle is fast and voters are already awash in false or highly partisan information.

China has been upfront about its dislike for President Tsai, who opposes closer ties with Beijing. The Communist Party claims Taiwan as part of China’s territory, and it has long deployed propaganda and intimidation to try to influence elections here.

Polls suggest, however, that Beijing’s heavy-handed ways might be backfiring and driving voters to embrace Ms. Tsai. Thousands of Taiwan citizens marched last month against “red media,” or local news organizations supposedly influenced by the Chinese government.

That is why Beijing may be turning to subtler, digital-age methods to inflame and divide.

Recently, there have been Facebook posts saying falsely that Joshua Wong, a Hong Kong democracy activist who has fans in Taiwan, had attacked an old man. There were posts about nonexistent protests outside Taiwan’s presidential house, and hoax messages warning that ballots for the opposition Kuomintang, or Chinese Nationalist Party, would be automatically invalidated.

So many rumors and falsehoods circulate on Taiwanese social media that it can be hard to tell whether they originate in Taiwan or in China, and whether they are the work of private provocateurs or of state agents.

Taiwan’s National Security Bureau in May issued a downbeat assessment of Chinese-backed disinformation on the island, urging a “‘whole of government’ and ‘whole of society’ response.”

“False information is the last step in an information war,” the bureau’s report said. “If you find false information, that means you have already been thoroughly infiltrated.”

Taiwanese society has woken up to the threat. The government has strengthened laws against spreading harmful rumors. Companies including Facebook, Google and the messaging service Line have agreed to police their platforms more stringently. Government departments and civil society groups now race to debunk hoaxes as quickly as they appear.

The election will put these efforts — and the resilience of Taiwan’s democracy — to the test.

“The ultimate goal, just like what Russia tried to do in the United States, is to crush people’s confidence in the democratic system,” said Tzeng Yi-suo of the Institute for National Defense and Security Research, a think tank funded by the government of Taiwan.

Fears of Chinese meddling became acute in recent months after a man named Wang Liqiang sought asylum in Australia claiming he had worked for Chinese intelligence to fund pro-Beijing candidates in Taiwan, buy off media groups and conduct social media attacks.

Mr. Wang’s account remains largely unverified. But there are other signs that Beijing is working to upgrade its techniques of information warfare.

Twitter, which is blocked in mainland China, recently took down a vast network of accounts that it described as Chinese state-backed trolls trying to discredit Hong Kong’s protesters.

A 2018 paper in a journal linked to the United Front Work Department, a Communist Party organ that organizes overseas political networking, argued that Beijing had failed to shape Taiwanese public discourse in favor of unification with China.

In November, the United Front Work Department held a conference in Beijing on internet influence activities, according to an official social media account. The department’s head, You Quan, said the United Front would help people such as social media influencers, live-streamers and professional e-sports players to “play an active role in guiding public opinion.”

“We understand that the people who are sowing discord are also building a community, that they are also learning from each other’s playbooks,” said Audrey Tang, Taiwan’s digital minister. “There are new innovations happening literally every day.”

In Taiwan, Chinese internet trolls were once easily spotted because they posted using the simplified Chinese characters found only on the mainland.

That happens less these days, though there are still linguistic slip-ups.

In one of the YouTube videos from Mr. Zhang, the China National Radio employee, a character in the description is incorrectly translated into traditional Chinese from simplified Chinese. Mr. Zhang did not respond to a message seeking comment.

Puma Shen, an assistant professor at National Taipei University who studies Chinese influence efforts, does not believe that disinformation from China is always guided by some central authority as it spreads around the internet.

“It’s not an order from Beijing,” Mr. Shen said. Much of the activity seems to come from scattered groups of troublemakers, paid or not, who feed off one another’s trolling. “People are enthusiastic about doing this kind of stuff there in China,” he said.

In December, Taiwan’s justice ministry warned about a fake government notice saying Taiwan was deporting protesters who had fled Hong Kong. The hoax first appeared on the Chinese social platform Weibo, the ministry said, before spreading to a Chinese nationalist Facebook group.

Sometimes, Chinese trolls amplify rumors already floating around in Taiwan, Mr. Shen said. He is also on the lookout for Taiwanese social media accounts that may be bought or supported by Chinese operatives.

Ahead of midterm elections in 2018, his team had been monitoring several YouTube channels that discussed Taiwanese politics. The day after voting ended, the channels disappeared.

After Yu Hsin-Hsien was elected to the City Council that year in Taoyuan, a city near Taipei, mysterious strangers began inquiring about buying his Facebook page, which had around 280,000 followers. Mr. Yu, 30, immediately suspected China.

His suspicions grew after he demanded an extravagantly high price and the buyers accepted. Mr. Yu, who represents Ms. Tsai’s party, the Democratic Progressive Party, did not sell.

“Someone approaches a just-elected legislator and offers to buy his oldest weapon,” Mr. Yu said. “What’s his motive? To serve the public? It can’t be.”

Recently, internet users in Taiwan noticed a group of influencers, many of them pretty young women, posting messages on Facebook and Instagram with the hashtag #DeclareMyDeterminationToVote. The posts did not mention candidates or parties, but they included selfies with a fist held at the chest, a gesture often used by Han Kuo-yu, the Kuomintang’s presidential candidate.

Mr. Han’s campaign denied involvement. But some have speculated that China’s United Front might be responsible. The United Front Work Department did not respond to a fax requesting comment.

One line of attack against Ms. Tsai has added to the atmosphere of mistrust and conspiracy ahead of this week’s vote.

Politicians and media outlets have questioned whether Ms. Tsai’s doctoral dissertation is authentic, even though her alma mater, the London School of Economics, has confirmed that it is.

Dennis Peng hosts a daily YouTube show dedicated to proving otherwise. His channel has 173,000 subscribers. Theories about Ms. Tsai’s dissertation have circulated in China, too, with the help of the Chinese news media.

Mr. Peng, a former television anchor, once supported Ms. Tsai. He was proud that Taiwan elected a female president. Now he says he is not being paid by anyone, including China, to crusade against her.

He is not worried about being smeared as fake news.

“Let news and fake news compete against each other,” Mr. Peng said. “I trust that most people aren’t so stupid. Everybody eventually figures it out.”

Steven Lee Myers contributed reporting. Wang Yiwei contributed research from Beijing.


Hype House and the Los Angeles TikTok Mansion Gold Rush

LOS ANGELES — Hype House, the physical location of a new content creator collective, is a Spanish-style mansion perched at the top of a hill on a gated street in Los Angeles. It has a palatial backyard, a pool and enormous kitchen, dining and living quarters.

Four of the group’s 19 members live in the house full time; several others keep rooms to crash in when they are in town. And all day long, a stream of influential young internet stars come by to pay homage to the new guard.

Hype House was formed in December by some of TikTok’s most talked-about stars. They introduced themselves with a Backstreet Boys-esque photo shoot, and within minutes #hypehouse began trending; videos including the hashtag #hypehouse have accrued nearly 100 million views on TikTok.

The group handle that distributes their content surpassed three million followers on TikTok in just over a week and a half. In the days leading up to Christmas it was all anyone under the age of 18 on TikTok seemed to be talking about.

So-called collab houses, also known as content houses, are an established tradition in the influencer world. Over the last five years they have formed a network of hubs across Los Angeles.

In 2014, members of an early collab channel called Our Second Life lived and worked together in what they called the O2L Mansion. The next year, nearly all the top talent on Vine moved into a large apartment complex at 1600 Vine Street.

Soon after, YouTuber mansions were popping up all over the city. The Vlog Squad shacked up in Studio City, while Team 10, Jake Paul’s infamous YouTuber collective, rented a giant house in West Hollywood before eventually decamping to a mansion in Calabasas.

Another group of YouTubers rented a $12 million mansion in the Hollywood Hills and deemed it the Clout House.

Now, the TikTokers have arrived — and everything about TikTok happens faster than it does anywhere else.


Collab houses are beneficial to influencers in lots of ways. Living together allows for more teamwork, which means faster growth, and creators can provide emotional support for what can be a grueling career.

“It’s a brilliant move for power players on these platforms to lift each other up,” said Sam Sheffer, a YouTuber and technologist. “‘Elevate others to elevate yourself’ is a saying, and it really rings true with this new generation of TikTokers.”

“From a management perspective, it’s great,” he added. “It just means all the kids will focus on content.”

Hype House was the brainchild of Chase Hudson, 17, a TikTok star with more than eight million followers who is known online as Lilhuddy, and Thomas Petrou, 21, a YouTube star.

The pair began plotting a move in November. Within 13 days they had signed a lease on their current residence. Originally, Chase hoped to name the group House of Olympus. He still thinks it sounds cooler, but then Alex Warren, 19, suggested the name Hype House, and Chase was outvoted.


Finding the right location for the house was key. A good collab house has lots of natural light and open space, and sits far from prying neighbors. A gated community is ideal, to prevent swarms of fans from showing up.

Brent Rivera, a YouTube star with more than 17 million followers on TikTok who also runs a talent incubator, said the perfect collab house “needs to be big, and the more amenities the better, like a pool, nice bathroom, nice lighting, big back and front yard, room for activities and fun stuff you can do inside or outside.”

Residents also must be able to film. Many influencers prefer the short-term rental structure of Airbnb, in part because obtaining a lease can be tough when you’re young and have an unpredictable income.

But unfortunately many Airbnbs in Los Angeles have a no-filming rule. (Homeowners worry about, among other things, tripods scratching the floors and the potential property damage that comes with YouTube stunts.)

The location Chase and Thomas found for Hype House checked all the boxes and had some additional features that make it perfect for TikTok: plenty of giant mirrors and a bathroom the size of a small apartment to film in. Because everyone just moved in, Hype House is also nearly without furniture, which makes shooting easier.

On Dec. 30, members clustered into the bathroom in rotating groups, doing back flips in front of a phone propped up on a roll of toilet paper supported by a Smartwater bottle. Fifteen-second clips of a DaBaby song looped until everyone had memorized the agreed-upon choreography.

After one group finished filming, they headed downstairs to lounge on three beanbag chairs. The house has a large glistening pool, but it’s too cold to swim in it right now. Hype House members prefer to hang out on the stone porches overlooking it. The sweeping staircase is also a popular backdrop.

Alex, Thomas, Daisy Keech, 20, and Kouvr Annon, 19, live at the house full time. As the oldest, Thomas acts as a default den mother. Though Chase helped put money down for the house, Thomas manages schedules, handles the house issues and resolves the inevitable conflicts. Unlike Team 10 and other groups, Hype House doesn’t take a cut of anyone’s revenue.

The house does have strict rules, however. Creators can have friends over, but it is not a party house. If you break something, you have 15 days to replace it. And if you want to be a part of the group, you need to churn out content daily.

“If someone slips up constantly, they’ll not be a part of this team anymore,” Thomas said. “You can’t come and stay with us for a week and not make any videos, it’s not going to work. This whole house is designed for productivity. If you want to party, there’s hundreds of houses that throw parties in L.A. every weekend. We don’t want to be that. It’s not in line with anyone in this house’s brand. This house is about creating something big, and you can’t do that if you’re going out on the weekends.”

To make a splash on the internet, you need the right people, so Chase acts as Hype House’s unofficial talent scout and behind-the-scenes operator. He has a knack for spotting influencers early and knows what qualities it takes to get big online.

You have to be young, you have to “have a lot of energy and personality and honestly a little weird. The weird people get the furthest on the internet,” Chase said. “You either have to be talented at something, or a weird funny mix, or extremely good looking.”

Alex said, “If you have all three, you’re a TikTok god.”

The undisputed star of the group is Charli D’Amelio, a 15-year-old from Connecticut known as the reigning queen of TikTok. She and Chase appear to be dating; the two most often speak of each other as best friends.

Charli has amassed more than 15 million followers since joining the app this summer, and her fan base continues to grow at a wild rate. Her dance routines spur thousands of copycat videos; her rise has been so sharp and fast that she has become a meme.

Charli’s sister, Dixie D’Amelio, is 18 and has five million followers. Because they are still in school, both girls will continue to live with their parents in Connecticut but come out to Los Angeles when their schedules allow.

Charli is polite, thoughtful and soft-spoken in person. She is a trained dancer and has ambitions to dance full time. In December she performed with Bebe Rexha at a Jonas Brothers concert. Hype House has provided a safe space to help her cope with the stress and attention that come with overnight fame.

“The internet can be a little harsh,” she said. “Everyone here is ready to bring positivity and kindness.” Charli also credits the group for expanding her creativity and helping her branch into new content formats like vlogging.

“I’m trying things outside my comfort zone that I might not have done if I was alone in my room,” she said.

But her roots remain in dance. “I grew up in the dance competition world — everyone’s dream is to dance onstage. I’ve been a performer my whole life,” she said. “I say all the time, this is a dream. I’m living out everything I’ve ever wanted to do so early.”

Marc D’Amelio, who is Charli and Dixie’s father, said: “As parents, one thing we say all the time is that this is just about creating options for our kids. We don’t know where this is going, we don’t have any plans for Charli or Dixie to do this or that. We’re just riding it and enjoying it, and hopefully they can do things they love and most importantly be happy.”

The competition among young influencers in Los Angeles is fierce. Many YouTubers who have felt secure in their status as internet elites are now being threatened by the new wave of talent from TikTok that is flooding the city.

And ever since the arrival of Hype House, many other TikTok collectives have been making plans to take on Los Angeles. Some TikTokers began discussing a Melanin Mansion for black creators, noting that Hype House is predominantly white.

Cabin Six, an L.G.B.T.-focused collective, held public auditions on TikTok last week, as did Diversity University, another TikTok group with plans to organize in Los Angeles in March.

“TikTok has brought a younger group of creators. That energy is kind of pushing on a lot of older creators,” said Josh Sadowski, 19, a TikToker with nearly four million followers who lived in another TikTok collab house. “There’s all these kids who want to move to L.A. and make content, and TikTok is pushing their growth so much. Everybody is really, really driven. They’re bringing that energy to L.A., and it’s rubbing off on everyone else. No one wants to miss out.”

Evidence of this is all over the city. TikTok’s primary United States office — the company is based in China — is in Los Angeles. At sunset on a recent Friday, six TikTok shoots were taking place simultaneously on the Venice boardwalk.

Several TikTok creators began hosting twice-a-week collab days at the Burbank Town Center in the fall; Josh was shocked at how many kids began showing up.

Every influencer brings friends and “the group just gets bigger and bigger,” he said. “The energy is very different. I’ve been around YouTubers, but the energy now, people are so motivated and you can feel that motivation in these collabs. It creates a hype.”

TalentX Entertainment, a talent management incubator, has rented a giant collab house in Bel Air called the Sway House, where six TikTokers, all with millions of followers, will move in on Jan. 3. One member of the Council House, a group of British and Irish TikTokers, visited Los Angeles this week and posted about his plans to “infiltrate America.”

Too much hype inevitably attracts drama, and Hype House members are extremely wary of it. They are careful about who they film with, what they wear, how they act and how things can be interpreted online.

If a Hype House member has a girlfriend, for instance, that member may avoid filming with another girl alone, so as not to start rumors.

The house itself could bring drama someday. MaiLinh Nguyen, a former videographer for Jake Paul, said money can become a major source of trouble.

“I don’t think it’s sustainable to just be a collective forever,” she said. “At some point if they want to do a pop-up shop, or release Hype House merch, they need to figure out how to divvy things up financially and they’re going to have to legitimize it as a business.”

Michael Gruen, the vice president of talent at TalentX Entertainment, said many of these collectives are creating valuable intellectual property. A commission structure should be negotiated from the start, he said, and thought should be given to incorporation and insurance and everything else that comes along with running a business.

“As I’ve told many of these creator houses,” Mr. Gruen said, “before you dig deep into raising the value of the I.P., make sure that you have the splits organized so it doesn’t come into play and ruin friendships.”

Carson King, 20, a YouTuber who lives in a collab house with several YouTuber friends, said that for him and many others, a looser arrangement can work well and create less pressure.

“I think it’s a dream for a lot of people to be able to move in with friends and be able to work on whatever you want to work on,” he said. He and his housemates keep things like whiteboards around their collab house so they can write down video ideas anytime.

“The big struggle creators have is that people around them don’t understand at all the culture of what they’re doing,” said Mitch Moffit, 31, a YouTuber who lived in a collab house when he was starting out.

This is the value for young people: If you want to immerse yourself in influencer and internet culture, there’s no better place to be. Chase, Thomas, Charli and other members of Hype House are aware of how lucky they are, how fleeting fame can be, and they don’t want to squander the opportunity.

“It’s 24/7 here. Last night we posted at 2 a.m.,” Thomas said. “There’s probably 100 TikToks made here per day. At minimum.”

Categories
ace&jig Computers and the Internet Dresses Elizabeth Suzann (Fashion Label) Facebook Inc Fashion and Apparel Instagram Inc Noihsaf Pyne&Smith Clothiers Shopping and Retail Social Media Uncategorized Women and Girls your-feed-fashion

How to Make Friends Online the Old-Fashioned Way (Buying Clothes Together)

Emily Useche, who is 27 and lives in Arkansas, had just put her baby down for a nap one afternoon when she decided to post some family photos on Facebook. But she didn’t simply upload them for friends and family to see.

She also posted the photos to a private Facebook group for a whole other community: a fan club for Pyne & Smith Clothiers. Ms. Useche was wearing one of that brand’s dresses in the photos — a style she had posted about once before when she saw it being sold secondhand — and was ready to show it off. Minutes after she posted, other members replied with compliments for her and praise for the sunflower check dress she was wearing.

The group, Pyne & Smith Clothiers BST and Chat, is one of a number of so-called buy-sell-trade communities. Part social club and part marketplace, the groups have sprung up on Instagram and Facebook and have, for some users, become a daily place to socialize and shop.

While many serve enthusiasts of mass market brands, others are powered by dedicated followers of idiosyncratic indie brands, the sort rarely featured in glossy magazines and often overlooked by major retailers. But these brands have devoted followers, many of whom are attracted by the idea of slow, ethical fashion.

Facebook and Instagram communities can be a very real alternative to traditional retailers, providing shoppers with not only products, but also friends.

“A lot of us are millennials who are trying really hard to take steps toward sustainability,” said Lacey Camille Schroeder, 32 and a jewelry designer who lives in Cedar Rapids, Iowa. She created the PSC buy-sell-trade Facebook group. “People buying these dresses tend to be like-minded when it comes to fashion. A lot of them are in the ‘crunchy’ category.”

The Pyne & Smith line was founded by Joanna McCartney. She stumbled into making clothes in 2014 when she couldn’t find a linen dress she liked during the hot Los Angeles summer.

Made of flax linen and produced in California, the dresses look like the kind you could wear to a dinner party and to collect eggs from your free-range chickens the next day. Their prices range from $146 to $186, though by the time the dresses make it to this group, they’re usually sold for about $120 each.

Ms. Schroeder set up the group, which has 2,888 members, two years ago when a follower of the Pyne & Smith Clothiers Instagram said she was looking to sell a gently used Pyne & Smith dress that was taking up space in her closet.

Ms. Schroeder got on the phone with Ms. McCartney and hammered out the group guidelines.

Civility and a promise to be kind when posting critical feedback are among the few requirements for membership, and Ms. Schroeder said she rarely has to moderate conversations.

In some cases, a single dress may be sold and passed between three or four members, who connect with each other and facilitate their own sales along the way.

Groups range from small pop-up Instagram hashtags like #JamieandTheJonesForSale, with fewer than 100 posts, to accounts like Noihsaf Bazaar, which was started on Instagram in 2013 and now has more than 30,000 followers.

Noihsaf was founded when Kate Lindello, 36, a stylist, fashion blogger and stay-at-home mother, wanted to sell a pair of Rachel Comey flats that didn’t fit.

Today Noihsaf, which focuses on emerging and independent designers, operates multiple Instagram accounts, including one for vintage and one for beauty products, and posts 1,200 to 1,500 items weekly on its main resale account.

Ms. Lindello employs three freelancers to help her sort through the hundreds of daily submissions and choose items to post. Unlike volunteer-run accounts, Noihsaf charges a $3.80-per-sale fee.

“Tech is a blessing and a curse,” Ms. Lindello said. “We’re behind our phones so much, but you also have the chance to make this human connection.” In 2017, after posting a pair of her own denim jeans on the account, she was surprised to see that the buyer lived only two miles down the road.

“I could have mailed those jeans to Allison in Duluth, but I wanted to know who this person was,” she said. “I emailed her, and she said she’d just drop by my house. She ended up being a New Yorker who had just moved here, and we’re buddies now. She’s my kid’s dentist.”

Around that same time, Nicolle Rountree, an African-American logistics manager who lives in New Orleans and wears plus-size clothing, was fed up with feeling unwelcome in stores and buying new pants every month when fast fashion ones fell apart.

Through online research, Ms. Rountree discovered Elizabeth Suzann, a label that offers classic staples in natural fabrics in sizes XXS through 4XL — and then discovered that used Elizabeth Suzann clothing was being sold on Instagram accounts like Sell/Trade Elizabeth Suzann and Sell/Trade Slow Fashion.

One day, a fellow Instagram shopper tagged her in a post for a used pair of black Clyde pants in size 16 that she had spotted. Ms. Rountree bid by commenting on the post and bought them from the seller for $125 (normally $245), becoming the third owner of the pants and a committed Elizabeth Suzann customer.

This year, Ms. Rountree became a volunteer moderator of the Sell/Trade Slow Fashion Instagram account (more than 18,000 followers), which curates sale posts for slow fashion items, hosts trade forums and prompts weekly discussions about ethical fashion. Through the group, she has met more and more women who care about slow fashion.

It’s an online community that became even more real in October, when Ms. Rountree met two other moderators of the group and road-tripped to the Elizabeth Suzann sample sale in Nashville.

“I got out of the car, and there’s this line of women, many of whom I knew, mostly by their Instagram handles, and they ran up to me and hugged me. It blew me away,” she said. “We were all there waiting and shopping in terrible 90-degree Southern summer heat, all stripped down to just bras and underwear. And people are handing you stuff to try on, and you’re handing them stuff to try on, and you don’t even know them. They’re strangers who aren’t strangers.

“I’m a black woman who lives in the South,” Ms. Rountree said. “I have never felt that safe around that many people before.”

Sali Kelley, 50 and an American child care provider and E.S.L. teacher in Italy, has also seen her life changed by online buy-sell-trade communities. Between 2015 and 2016, Ms. Kelley’s best friend left the country, leaving her adrift and depressed, and she and her family moved from Milan to Varese, a smaller city in northern Italy.

Feeling alone and isolated, Ms. Kelley found herself having more interactions online. Eventually, most of them centered around a newly discovered passion: slow fashion, and one brand in particular, Ace & Jig, a female-run American company that uses vivid Indian textiles to create whimsical, colorful clothing.

Though Ms. Kelley was initially turned off by Ace & Jig’s retail prices (new pieces are $200 to $300), she began searching Instagram, where she discovered hundreds of women selling under hashtags like #aceandjigforsale (more than 16,000 posts) and #aceandjigcommunity (more than 5,000). Noihsaf also has a channel dedicated to Ace & Jig.

Before long, Ms. Kelley had started an Instagram account dedicated to celebrating the label, as well as a private message group for plus-size members to trade their Ace & Jig items. She even began organizing an April 2020 meeting for fans in Paris and London, and says it’s not unusual for her to spend hours each week chatting with other Ace & Jig fans and commenting on community posts.

She is also managing the cross-country journey of an Ace & Jig shirt that is being mailed from fan to fan every couple of weeks.

“The rules are basically there’s no rules,” Ms. Kelley said. “You wear it once and post a picture of it and pass it on.” Termed the “traveling Baja,” after the shirt style and “The Sisterhood of the Traveling Pants,” the shirt is size XS but seems to fit most of the women who want to participate, Ms. Kelley said.

Currently making its way through Tennessee after traveling from Italy through 13 other states, the shirt is a way for people in the community to connect that Ms. Kelley said she dreamed up one night when she couldn’t sleep.

“Most of us are women with the same core values who care about women’s issues,” Ms. Kelley said of the 500 or so online friends in her network. “We talk about kids, life, jobs. We’re constantly messaging each other and commenting on each other’s posts. If I haven’t seen someone post for a while, I’ll check and ask, ‘Hey, are you O.K.?’”

Categories
Biobot Analytics Inc Computers and the Internet Corporate Social Responsibility Delivery Services DynamiCare Health Entrepreneurship Food Greenhouse Gas Emissions Lemontree Foods Inc Mobile Applications Nonprofit Organizations OpenAQ Opioids and Opiates Pear Therapeutics Inc Pinterest Propel Inc Social Media Start-ups Two Thousand Nineteen Uncategorized

The 2019 Good Tech Awards

Two years ago, I started what has become one of my favorite annual traditions. Instead of a year-end column rounding up all the dubious and objectionable things technology companies did over the last year — a true fish-in-a-barrel assignment — I highlighted some examples of “good tech.” I wanted to give kudos to the kinds of tech projects that don’t always make headlines but that improve people’s lives in tangible ways.

I’ll admit, handing out awards for good technology in 2019 feels a little like congratulating Godzilla for not destroying all of Tokyo. There was plenty of bad tech news to write about this year: Facebook’s foibles, Amazon’s aggression, SoftBank’s stumbles. But to me, the tech industry’s very public shortfalls make celebrating its quieter successes even more important. The tech industry, after all, is not a monolith, and many engineers and entrepreneurs work on projects that help society. So here, with no further ado, are this year’s winners.

To OpenAQ, for educating us about the air we breathe.

Air pollution is a vastly underestimated problem. Polluted air is linked to one in eight deaths worldwide, and studies have shown that bad air quality can cause cognitive impairment in young people and increase the risk of dementia and Alzheimer’s disease in the elderly. But until recently, there was no good source of air quality data that researchers and activists could rely on.

Christa Hasenkopf, an atmospheric scientist, decided to fix that. She and a software developer started OpenAQ, an open-source platform that collects air quality data from governments and international organizations in a single place and makes it free and accessible. Want to know how the nitrogen dioxide levels in Hyderabad, India, compare with those in Kampala, Uganda? OpenAQ can tell you. Want to build an app that alerts people in your city when air quality dips below a healthy threshold? You can do that, too.
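
To give a flavor of what building on that data might look like, here is a minimal sketch in Python of how a developer could pull recent nitrogen dioxide readings from OpenAQ. The endpoint path, query parameters and response fields shown below are assumptions about OpenAQ’s public REST API rather than anything documented in this column, so treat it as an illustration and check OpenAQ’s current API documentation before relying on it.

import requests

def latest_no2(city, limit=5):
    # Query an assumed OpenAQ measurements endpoint for recent NO2 readings.
    resp = requests.get(
        "https://api.openaq.org/v2/measurements",
        params={"city": city, "parameter": "no2", "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    # The field names below ("location", "value", "unit") are assumed, not verified.
    return [(m["location"], m["value"], m["unit"]) for m in resp.json().get("results", [])]

# Compare two cities, as the column suggests you could.
for city in ("Hyderabad", "Kampala"):
    print(city, latest_no2(city))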

The company says it has processed 188 million air quality measurements this year, making it a powerful weapon for policymakers, environmental groups and concerned citizens trying to clean up the air.

To DynamiCare Health, Biobot Analytics and Pear Therapeutics, for using tech to address the opioid crisis.

Few public health problems in the United States have proved as intractable as the opioid epidemic. But in 2019, three Massachusetts start-ups used technology to chip away at it.

DynamiCare Health, based in Boston, has built a mobile app meant to help keep recovering users of opioids and other drugs on the wagon. The app — already in use in eight addiction treatment systems across the country — allows users to test their breath and saliva remotely, check into group meetings and therapy sessions, and earn money on an electronic debit card by meeting their sobriety goals.

Biobot, a company started by two graduates of the Massachusetts Institute of Technology, analyzes sewage samples to determine the opioid use levels in a given neighborhood. (Opioid use leaves telltale byproducts called metabolites, which can be chemically detected in urine.) Once this data is collected, public health officials can use it to set priorities for treatment programs, detect spikes in use in a neighborhood and monitor the effectiveness of prevention programs over time.

Pear Therapeutics, another Boston outfit, makes “digital therapeutics” — essentially apps that use cognitive behavioral therapy techniques to help recovering addicts stick with their treatment programs. Its anti-opioid program, Reset-O, was cleared by the Food and Drug Administration late last year and can now be prescribed by doctors in conjunction with other treatments.

To Lemontree, Goodr and Propel, for helping feed the hungry.

Lemontree, a nonprofit food-delivery app based in New York, was started by Alex Godin, an entrepreneur who sold a workplace collaboration start-up to Meetup several years ago. The company sells Blue Apron-style meal kits to low-income families for $3 apiece. Meal kits are packed by volunteers, and they can be bought with food stamps.

Goodr, described by its founder, Jasmine Crowe, as a “food delivery app in reverse,” is a platform based in Atlanta that helps save some of the 72 billion pounds of food wasted in the United States every year and give it to people in need. Restaurants sign up on the site to have their excess food picked up and donated to local nonprofits and homeless shelters. Goodr operates in six cities, including Chicago, Miami and Philadelphia, and says it has diverted 2.1 million pounds of food and provided 1.8 million meals since 2017.

Propel, a Brooklyn start-up, is the creator of Fresh EBT, a popular app that helps low-income users manage their food stamps and other benefits. After doing battle with a larger government contractor last year, Propel recovered this year and says more than two million households use it every month.

To Pinterest, for taking a stand against social media toxicity.

When you think of Pinterest, you probably picture mood boards, D.I.Y. hacks and mommy-bloggers. But the social network spent much of 2019 doing the kinds of tough, principled work that its bigger rivals often neglected.

In August, the company announced that users searching for vaccine-related information would be shown results from authoritative sources like the World Health Organization and the Centers for Disease Control and Prevention, rather than being led down rabbit holes filled with misinformation. The company also introduced a “compassionate search” experience, which offers mental health advice and exercises to users whose behavior indicates they might be feeling anxious or depressed, such as people who search for things like “sad quotes” or who look up terms relating to self-harm. And in December, Pinterest joined other wedding websites in announcing that it would limit the promotion of wedding venues that were once slave plantations.

Pinterest hasn’t always operated flawlessly. But while its competitors were giving grandiose speeches and supplicating at the White House, the company’s content-moderation choices stood out as an example of a social network with a moral compass.

To Big Tech’s climate activists, for pressuring executives to walk the walk.

In a year when climate change was the subject of mass global demonstrations, Silicon Valley’s silence could have been deafening. Tech companies like Amazon, Microsoft and Google count fossil fuel companies and anti-environmental groups among their customers — a fact that doesn’t sit well with some employees. Those employees made their dissatisfaction known this year, joining climate strikes and walkouts and publicly calling on their own executives to do more to fight climate change.

In April, more than 4,200 Amazon employees sent an open letter to Jeff Bezos, the company’s chief executive, urging him to end the company’s contracts with oil and gas companies and commit to ambitious carbon-reduction goals. Amazon later announced a plan to become carbon neutral by 2040.

To Gypsy Guide, for enlightening my summer road trip.

If I’m being honest, the best app I used in 2019 wasn’t TikTok or some new A.I.-powered facial recognition app. It was Gypsy Guide, a simple, understated app that gives guided audio tours of national parks and other tourist destinations. The app uses your phone’s GPS to track your route through a park, and it narrates relevant facts as you drive past them. My wife and I drove through Yellowstone and the Grand Tetons this summer, and Gypsy Guide (which could really use a new name) quickly became our car soundtrack.

Gypsy Guide is not the slickest app in the world, and it’s not making anyone a billionaire. But it kept us entertained for hours, and it taught me things I wouldn’t have known. (Did you know that a concave depression in a mountain caused by a glacier’s erosion is called a “cirque”? Me neither.) It was a good reminder that not every tech start-up has to address some deep, existential need to be worthwhile. There are simpler pleasures, too.

Categories
Berners-Lee, Tim Bitcoin (Currency) Blockchain (Technology) Computers and the Internet Dorsey, Jack Facebook Inc Libra (Currency) Social Media Twitter Uncategorized Zuckerberg, Mark E

Internet Giants, Defied by Bitcoin, Now See Its Tech as a Remedy

SAN FRANCISCO — Not so long ago, the technology behind Bitcoin was seen in Silicon Valley as the best hope for challenging the enormous, centralized power of companies like Twitter and Facebook.

Now, in an unexpected twist, the internet giants think that technology could help them solve their many problems.

The chief executive of Twitter, Jack Dorsey, said last week that he hoped to fund the creation of software for social media that, inspired by the design of Bitcoin, would give Twitter less control over how people use the service and shift power toward users and outside programmers.

Likewise, Facebook’s chief executive, Mark Zuckerberg, has said he hopes the same concepts from Bitcoin could “take power from centralized systems and put it back into people’s hands.”

This push toward decentralization — the buzzword people in tech are using to describe these projects — has gained enough currency, and sounded outlandish enough, that it was one of the central themes of the satirical HBO show “Silicon Valley.”

Though Bitcoin’s digital tokens are widely used among the tech set, its underlying concept — a network of computers managing the currency without anyone in charge — is what’s most interesting to many people working on decentralization.

Countless entrepreneurs are working on decentralization projects, including the creator of the World Wide Web, Tim Berners-Lee. He founded Solid, which seeks to fix the problems of the centralized internet by shifting the ownership of personal data away from big companies and back toward users.

But the other efforts have largely been aimed at taking down Twitter and Facebook rather than helping them solve their problems. And the two behemoths have plenty of problems, from policing their sites for toxic content to dealing with pressure from regulators who think tech companies have grown too powerful.

Not surprisingly, the efforts at Twitter and Facebook have faced skepticism and questions about whether they are just trying to land some positive press while dodging responsibility — and regulations.

“When a company does something like this when it is under pressure, it becomes a way to distract attention by appearing to do something,” said Mitra Ardron, the head of the decentralized web project at the Internet Archive, which has hosted the Decentralized Web Summit the last four years.

Many people working on decentralization projects are concerned that Twitter and Facebook are trying to align themselves with the work’s countercultural spirit without giving up their enormous power.

“The monoliths see it as a threat to their model, so they try to weave the concepts into their own products to maintain control,” said Eugen Rochko, the founder of Mastodon, a competitor to Twitter. With around two million users, Mastodon has been one of the most successful alternative projects.

Mr. Dorsey said Twitter was just starting to look at the idea and had committed only five people to it. Facebook has moved ahead with its Bitcoin-inspired cryptocurrency and has beefed up encryption, but the company has otherwise taken few steps to decentralize its services. Mr. Dorsey and Mr. Zuckerberg, though, have frequently discussed decentralization, suggesting they have a personal fascination that goes beyond business interests.

Mr. Dorsey also hired a small team at his second company, Square, to work full time on Bitcoin, without any commercial responsibilities. And he recently announced that he was hoping to take an extended sojourn in Africa to understand how Bitcoin was working there.

“It’s clearly catching on in part because people believe in it,” said Neha Narula, the director of the Digital Currency Initiative at the M.I.T. Media Lab. “It’s not necessarily that it is cheaper or more efficient or faster or easier. In fact, it is much harder. But it’s clear that this idea speaks to people.”

Mr. Dorsey’s tweets last week suggest that he wants the new team, Blue Sky, to build essentially a basic version of Twitter that would be available for anyone to copy. This would make it easier for outside developers to build on top of Twitter and to compete with it. A competitor might be able to offer a version without ads, or one that recommends tweets to readers based on different standards.

While that would most likely pose a commercial threat to Twitter, Mr. Dorsey said it would also force the service to be “far more innovative than in the past” and could draw more overall users to it.

The idea of decentralization harks back to the basic design and ideals of the internet, which was supposed to be a global gathering place where everyone was welcome and no one was in charge.

Mr. Dorsey said the invention of Bitcoin had made it possible to revive those early ideals. The key to Bitcoin is its blockchain database, which provides a way for a network of disconnected computers to agree on a single set of records for every Bitcoin in existence.
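
For readers who want to see the mechanics, here is a toy sketch in Python of the hash-chaining idea behind that kind of database: each block stores the hash of the block before it, so no earlier record can be altered without breaking every link that follows. It illustrates the general concept only, not Bitcoin’s actual data structures, mining or consensus rules.

import hashlib, json

def make_block(records, prev_hash):
    # Bundle the records with the hash of the previous block, then hash the bundle.
    body = json.dumps({"records": records, "prev": prev_hash}, sort_keys=True)
    return {"records": records, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

genesis = make_block(["alice pays bob 1"], prev_hash="0" * 64)
block2 = make_block(["bob pays carol 1"], prev_hash=genesis["hash"])
block3 = make_block(["carol pays dan 1"], prev_hash=block2["hash"])
# Tampering with the genesis block would change its hash, which would no longer
# match the "prev" value recorded in block2, exposing the edit.
print(block3["hash"])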

Mr. Dorsey is following in the footsteps of the many cryptocurrency advocates who have argued that the underlying technology could be used to record all the users and activity on a social network, and to agree on a single set of rules for the network, without having any single company in charge. He said, though, that it would most likely take “many years.”

Facebook has pursued several projects over the past year that would shift control to its users.

The company’s most notable effort with blockchains is the Libra cryptocurrency, which aims to create money outside the control of any one company. The Libra effort has faced crippling opposition from politicians, regulators and even some of the project’s original partners. But it appears to have inspired central banks in China and Europe, which are also considering ways to duplicate Bitcoin’s underlying technology.

Already, many start-ups have tried to use blockchains to create social networks to compete with Twitter and Facebook. But these networks, with names like Minds and Steemit, have faced many of the same problems that Bitcoin has, struggling to attract mainstream attention and leaving users to fend off hackers themselves. Many investors have largely given up on blockchain investments.

Several up-and-coming projects focused on decentralization, including Mr. Berners-Lee’s Solid, have steered clear of the blockchain entirely because they don’t believe it is useful for anything other than financial transactions.

Mr. Dorsey said one of the great appeals of a decentralized future was that Twitter would no longer be the only one in charge of deciding what is and isn’t allowed on the network.

To many people, that sounded like an effort by Mr. Dorsey to wash his hands of the hardest but arguably most important responsibility of social networks today: identifying and filtering bad actors and disinformation.

“I’m concerned that Twitter may try to foist the responsibility for dealing with these problems onto the decentralization community,” said Ross Schulman, the senior policy technologist at New America’s Open Technology Institute.

A spokeswoman for Facebook had no comment on the company’s efforts.

Mastodon, the Twitter competitor, allows anyone to tweak the software in order to create his or her own version of Mastodon. If people don’t like the rules set up in one version, they can move to another.

But Mastodon has provided a window into just how difficult these problems are to deal with, even with decentralization.

The Mastodon software was created to form a refuge from anger and hate speech on Twitter. But recently, a social network with close ties to hate crimes and the far right, Gab, used Mastodon’s software to create a new home after it was pushed off the mainstream internet. Mastodon’s leaders were opposed to it but could do little to stop it.

“Building these types of decentralized social networks comes with a slew of challenges that we haven’t figured out how to solve yet,” said Ms. Narula, who was a co-author of an article titled “Decentralized Social Networks Sound Great. Too Bad They’ll Never Work.”

Categories
Democratic National Committee News and News Media Presidential Election of 2020 Rumors and Misinformation Shorenstein Center on Media, Politics and Public Policy Social Media Uncategorized

2020 Campaigns Throw Their Hands Up on Disinformation

In 2018, Lisa Kaplan assembled a small team inside the re-election campaign for Senator Angus King, an independent from Maine. Wary of how Russia had interfered in the 2016 presidential election, the team set out to find and respond to political disinformation online.

The team noticed some false statements shared by voters, and traced the language back to Facebook pages with names like “Boycott The NFL 2018.” It alerted Facebook, and some pages were removed. The people behind the posts, operating from places like Israel and Nigeria, had misled the company about their identity.

Today, Ms. Kaplan said, she knows of no campaigns, including among the 2020 presidential candidates, that have similar teams dedicated to spotting and pushing back on disinformation.

They may “wake up the day after the election and say, ‘Oh, no, the Russians stole another one,’” she said.

The examples are numerous: A hoax version of the Green New Deal legislation went viral online. Millions of people saw unsubstantiated rumors about the relationship between Ukraine and the family of former Vice President Joseph R. Biden Jr. A canard about the ties between a Ukrainian oil company and a son of Senator Mitt Romney, the Utah Republican, spread widely, too.

Still, few politicians or their staffs are prepared to quickly notice and combat incorrect stories about them, according to dozens of campaign staff members and researchers who study online disinformation. Several of the researchers said they were surprised by how little outreach they had received from politicians.

Campaigns and political parties say their hands are tied, because big online companies like Facebook and YouTube have few restrictions on what users can say or share, as long as they do not lie about who they are.

But campaigns should not just be throwing their hands up, said some researchers and campaign veterans like Ms. Kaplan, who now runs a start-up that helps fight disinformation. Instead, they said, there should be a concerted effort to counter falsehoods.

“Politicians must play some defense by understanding what information is out there that may be manipulated,” said Joan Donovan, a research director at Harvard University’s Shorenstein Center. Even more important for politicians, she said, is pushing “high-profile and consistent informational campaigns.”

Too many campaigns are now left on their heels, said Simon Rosenberg, who tried to thwart disinformation for the Democratic Congressional Campaign Committee before the 2018 midterm election.

“The idea of counterdisinformation doesn’t really exist as a strategic objective,” he said.

Political groups are not ignoring false information. Bob Lord, the chief security officer of the Democratic National Committee, encourages campaigns to alert his organization when they see it online.

The committee also gives advice on when and how to respond. He said campaigns had to decide when the cost of ignoring a falsehood outweighed the risk of drawing additional attention to it by speaking out.

But he said his reach was limited.

“The amount of disinformation that is floating around can cover almost any possible topic,” Mr. Lord said, and his team cannot look into each reported piece. If campaigns need connections to social media companies, he said, “we’re happy to make some.”

In September, President Trump’s re-election campaign released an ad that included an incorrect statement about Mr. Biden’s dealings with Ukraine. The campaign posted the ad on Facebook and the president’s Twitter account. Between the two services, the ad has been viewed more than eight million times.

Mr. Biden’s campaign publicized letters that it had written to Facebook, Twitter, YouTube and Fox News, asking the companies to ban the ad. But it remained up. In mid-November, the Biden campaign released a website called Just the Facts, Folks.

Jamal Brown, a spokesman for the Biden campaign, said it was not the campaign’s responsibility alone to push back on all falsehoods. But, he said, “it is incumbent upon all of us, both public- and private-sector companies, users, and elected officials and leaders, to be more vigilant in the kinds of content we engage and reshare on social media.”

Several months ago, a team at the Democratic Congressional Campaign Committee flagged some ads on Facebook to the office of Representative Ilhan Omar, a Minnesota Democrat. The ads called for an investigation into unfounded accusations that she had violated several laws.

After the committee and Ms. Omar’s campaign contacted Facebook, the company said it would limit the prominence of the ads in people’s feeds. But the ads, which have now reached over one million views, remain active.

Facebook does not remove false news, though it does label some stories as false through a partnership with several fact-checking organizations. It has said politicians like Mr. Trump can run ads that feature their “own claim or statement — even if the substance of that claim has been debunked elsewhere.”

In October, Twitter announced plans to forbid all political ads. But the company does not screen for false accusations. Twitter said it did not want to set a precedent for deciding what is and is not truthful online.

In an email, Ms. Omar said it was “not enough” to rely on private companies alone.

“We as a nation need to think seriously about ways to address online threats to our safety and our democracy while protecting core values like free speech,” she said.

Academics and researchers said it was surprising how little outreach there had been from campaigns that faced disinformation operations. Many of the researchers can dissect when a false idea first appeared online, and how it spread.

Graham Brookie, the director of the Atlantic Council’s Digital Forensic Research Lab, said there needed to be “more ingrained information sharing” among politicians, campaign staff, social media companies, civil society groups and, in some cases, law enforcement to counteract the increasing volume of election disinformation.

But when disinformation is used as a tool in partisan politics, Mr. Brookie said, the discussion becomes “a Rorschach test to reaffirm each audience’s existing beliefs, regardless of the facts.”

“One side will accuse the other, and then disinformation itself is weaponized,” he said.

Chris Pack, communications director of the National Republican Congressional Committee, said the disinformation that his party fought was “perpetuated by a liberal press corps that is still incapable of wrapping their heads around the fact that President Trump won the 2016 election.”

That leaves some in the research community wary of wading in at all, said Renee DiResta, the technical research manager for the Stanford Internet Observatory, which studies disinformation.

“I think this is a concern for a lot of academics who don’t want to work directly with a campaign,” Ms. DiResta said, “because that would be problematic for their neutrality.”

Nick Corasaniti contributed reporting.