
What’s the Price of Getting Your Data? More Data

The new year ushered in a landmark California privacy law that gives residents more control over how their digital data is used. The Golden State isn’t the only beneficiary, though, because many companies are extending the protections — the most important being the right to see and delete the personal data a company has — to all their customers in the United States.

In the fall, I took the right of access for a test drive, asking companies in the business of profiling and scoring consumers for their files on me. One of the companies, Sift, which assesses a user’s trustworthiness, sent me a 400-page file that contained years’ worth of my Airbnb messages, Yelp orders and Coinbase activity. Soon after my article was published, Sift was deluged with over 16,000 requests, forcing it to hire a vendor to deal with the crush.

That vendor, Berbix, helped verify the identity of people requesting data by asking them to upload photos of their government ID and to take a selfie. It then asked them to take a second selfie while following instructions. “Make sure you are looking happy or joyful and try again” was one such command.

Many people who read the article about my experience were alarmed by the information that Berbix asked for — and the need to smile for their secret file.

“This is a nightmare future where I can’t request my data from a creepy shadow credit bureau without putting on a smile for them, and it’s completely insane,” Jack Phelps, a software engineer in New York City, said in an email.

“It just seems wrong that we have to give up even more personal information,” wrote another reader, Barbara Clancy, a retired professor of neuroscience in Arkansas.

That’s the unpleasant reality: To get your personal data, you may have to give up more personal data. It seems awful at first. Alistair Barr of Bloomberg called it “the new privacy circle of hell.”

But there’s a good reason for this. Companies don’t want to give your data away to the wrong person, which has happened in the past. In 2018, Amazon sent 1,700 audio files of a customer talking to his Alexa to a stranger.

The right to have access to personal data is enshrined in the new California Consumer Privacy Act. The law is modeled in part on privacy regulations in Europe, known as the General Data Protection Regulation, or G.D.P.R. Soon after Europe’s law went into effect, in May 2018, a hacker gained access to the Spotify account of Jean Yang, a tech executive, and successfully filed a data request to download her home address, credit card information and a history of the music she had listened to.

Since then, two groups of researchers have demonstrated that it’s possible to fool the systems created to comply with G.D.P.R. to get someone else’s personal information.

One of the researchers, James Pavur, 24, a doctoral student at Oxford University, filed data requests on behalf of his research partner and wife, Casey Knerr, at 150 companies using information that was easily found for her online, such as her mailing address, email address and phone number. To make the requests, he created an email address that was a variation on Ms. Knerr’s name. A quarter of the companies sent him her file.

“I got her Social Security number, high school grades, a good chunk of information about her credit card,” Mr. Pavur said. “A threat intelligence company sent me all her user names and passwords that had been leaked.”

Mariano Di Martino and Pieter Robyns, computer science researchers at Hasselt University in Belgium, had the same success rate when they approached 55 financial, entertainment and news companies. They requested each other’s data, using more advanced techniques than those of Mr. Pavur, such as photoshopping each other’s government ID. In one case, Mr. Di Martino received the data file of a complete stranger whose name was similar to that of Mr. Robyns.

Both sets of researchers thought the new law giving the right to data was worthwhile. But they said companies needed to improve their security practices to avoid compromising customers’ privacy further.

“Companies are rushing to solutions that lead to insecure practices,” Mr. Robyns said.

Companies employ different techniques for verifying identity. Many simply ask for a photo of a driver’s license. Retail Equation, a company that decides whether a consumer can make returns at retailers like Best Buy and Victoria’s Secret, asks only for a name and driver’s license number.

The wide array of companies now required to hand over data, from Baskin Robbins to The New York Times, have varying levels of security expertise and experience in providing data to consumers.

Companies such as Apple, Amazon and Twitter can ask users to verify their identity by logging into their platforms. All three give a heads-up via email after data is requested, which can help warn people if a hacker got access to their account. An Apple spokesman said that after a request is made, the company uses additional methods to verify the person’s identity, though the company said it couldn’t disclose those methods for security reasons.

If consumers can’t verify their identity by logging into an existing account, Mr. Di Martino and Mr. Robyns recommend that companies email them, call them or ask them for information that only they should know, such as the invoice number on a recent bill.

“Regulators need to think more about the unintended consequences of empowering individuals to access and delete their data,” said Steve Kirkham, who worked on Airbnb’s trust and safety team for five years, before founding Berbix in 2018. “We want to prevent fraudulent requests and let the good ones go through.”

It is on regulators’ minds. The California law requires businesses to “verify the identity of the consumer making the request to a reasonable degree of certainty” and to have a more stringent verification process for “sensitive or valuable personal information.”

Mr. Kirkham said Berbix requested the first selfie to test whether a person’s face matched their ID; the second selfie, with a smile or some other facial expression, ensures that someone isn’t simply holding a photo up to the camera. Mr. Kirkham said Berbix ultimately deleted the data collected within seven days to a year, depending on the retention period requested by the company that hires the firm. (Sift deletes its data after two weeks.)

“It’s a new threat vector companies should consider,” said Blake Brannon, vice president of product at OneTrust, another company that helps businesses comply with the new data privacy laws. OneTrust offers the 4,500 organizations using its service the option to create several levels of identity verification, such as sending a token to someone’s phone or verifying ownership of an email address.

“If I’m requesting something simple or lightweight, the verification is minimal, versus a deletion request,” Mr. Brannon said. “That will require more levels of verification.”
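That tiering can be sketched in a few lines of Python. This is a hypothetical illustration, not OneTrust's actual implementation: the request types, thresholds, and function names are all invented for the example.

```python
import hmac
import secrets

# Hypothetical policy: a simple access request needs one verified channel
# (e.g. an emailed code); a deletion request needs two (e.g. email and phone).
REQUIRED_PROOFS = {"access": 1, "deletion": 2}

def issue_token():
    """Generate a six-digit one-time code to send to the user's email or phone."""
    return f"{secrets.randbelow(10**6):06d}"

def token_matches(sent, entered):
    # Constant-time comparison, so timing doesn't leak how many digits matched.
    return hmac.compare_digest(sent, entered)

def is_verified(request_type, proofs_passed):
    """Has the requester completed enough verification steps for this request?"""
    return proofs_passed >= REQUIRED_PROOFS[request_type]

code = issue_token()
print(token_matches(code, code))        # True once the user enters the code
print(is_verified("access", 1))         # True
print(is_verified("deletion", 1))       # False: deletion demands a second proof
```

The point of the sketch is the asymmetry: a request that can only reveal data to its rightful owner clears a lower bar than one that can destroy it.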

Mr. Kirkham of Berbix said the verification process discouraged some people from making the data request at all.

“A lot of people don’t want to give more information,” Mr. Kirkham said. “Their assumption is that you will do something nefarious with it.”

He added: “But that’s the irony here. We require additional information from people to protect them. We want to make sure you are who you say you are.”


Apple Takes a (Cautious) Stand Against Opening a Killer’s iPhones

SAN FRANCISCO — Apple is privately preparing for a legal fight with the Justice Department to defend encryption on its iPhones while publicly trying to defuse the dispute, as the technology giant navigates an increasingly tricky line between its customers and the Trump administration.

Timothy D. Cook, Apple’s chief executive, has marshaled a handful of top advisers, while Attorney General William P. Barr has taken aim at the company and asked it to help penetrate two phones used by a gunman in a deadly shooting last month at a naval air station in Pensacola, Fla.

Executives at Apple have been surprised by the case’s quick escalation, said people familiar with the company who were not authorized to speak publicly. And there is frustration and skepticism among some on the Apple team working on the issue that the Justice Department hasn’t spent enough time trying to get into the iPhones with third-party tools, said one person with knowledge of the matter.

The situation has become a sudden crisis at Apple that pits Mr. Cook’s longstanding commitment to protecting people’s privacy against accusations from the United States government that it is putting the public at risk. The case resembles Apple’s clash with the F.B.I. in 2016 over another dead gunman’s phone, which dragged on for months.

This time, Apple is facing off against the Trump administration, which has been unpredictable. The stakes are high for Mr. Cook, who has built an unusual alliance with President Trump that has helped Apple largely avoid damaging tariffs in the trade war with China. That relationship will now be tested as Mr. Cook confronts Mr. Barr, one of the president’s closest allies.

“We are helping Apple all of the time on TRADE and so many other issues, and yet they refuse to unlock phones used by killers, drug dealers and other violent criminal elements,” Mr. Trump said Tuesday in a post on Twitter. “They will have to step up to the plate and help our great Country.”

Apple declined to comment on the issue on Tuesday. Late Monday, after Mr. Barr had complained that the company had provided no “substantive assistance” in gaining access to the phones used in the Pensacola shooting, Apple said it rejected that characterization. It added that “encryption is vital to protecting our country and our users’ data.”

But Apple also offered conciliatory language, in a sign that it did not want the showdown to intensify. The company said it was working with the F.B.I. on the Pensacola case, with its engineers recently holding a call to provide technical assistance.

“We will work tirelessly to help them investigate this tragic attack on our nation,” Apple said.

At the heart of the tussle is a debate between Apple and the government over whether security or privacy trumps the other. Apple has said it chooses not to build a “backdoor” way for governments to get into iPhones and to bypass encryption because that would create a slippery slope that could damage people’s privacy.

The government has argued it is not up to Apple to choose whether to provide help, as the Fourth Amendment allows the government to violate individual privacy in the interest of public safety. Privacy has never been an absolute right under the Constitution, Mr. Barr said in a speech in October.

Mr. Cook publicly took a stand on privacy in 2016 when Apple fought a court order from the F.B.I. to open the iPhone of a gunman involved in a San Bernardino, Calif., mass shooting. The company said it could open the phone in a month, using a team of six to 10 engineers. But in a blistering, 1,100-word letter to Apple customers at the time, Mr. Cook warned that creating a way for the authorities to gain access to someone’s iPhone “would undermine the very freedoms and liberty our government is meant to protect.”

Bruce Sewell, Apple’s former general counsel who helped lead the company’s response in the San Bernardino case, said in an interview last year that Mr. Cook had staked his reputation on the stance. Had Apple’s board not agreed with the position, Mr. Cook was prepared to resign, Mr. Sewell said.

The San Bernardino case was bitterly contested by the government and Apple until a private company came forward with a way to break into the phone. Since then, Mr. Cook has made privacy one of Apple’s core values. That has set Apple apart from tech giants like Facebook and Google, which have faced scrutiny for vacuuming up people’s data to sell ads.

“It’s brilliant marketing,” Scott Galloway, a New York University marketing professor who has written a book on the tech giants, said of Apple. “They’re so concerned with your privacy that they’re willing to wave the finger at the F.B.I.”

Mr. Cook’s small team at Apple is now aiming to steer the current situation toward an outside resolution that doesn’t involve the company breaking its own security, even as it prepares for a potential legal battle over the issue, said the people with knowledge of the thinking.

Some of the frustration within Apple over the Justice Department is rooted in how police have previously exploited software flaws to break into iPhones. The Pensacola gunman’s phones were an iPhone 5 and an iPhone 7 Plus, according to a person familiar with the investigation who declined to be named because the detail was confidential.

Those phones, released in 2012 and 2016, lack Apple’s most sophisticated encryption. The iPhone 5 is even older than the device in the San Bernardino case, which was an iPhone 5C.

Security researchers and a former senior Apple executive who spoke on the condition of anonymity said tools from at least two companies, Cellebrite and Grayshift, have long been able to bypass the encryption on those iPhone models.

Cellebrite said in an email that it helps “thousands of organizations globally to lawfully access and analyze” digital information; it declined to comment on an active investigation. Grayshift declined to comment.

Cellebrite’s and Grayshift’s tools exploit flaws in iPhone software that let them remove limits on how many passwords can be tried before the device erases its data, the researchers said. Typically, iPhones allow 10 password attempts. The tools then use a so-called brute-force attack, or repeated automated attempts of thousands of passcodes, until one works.
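A toy Python sketch shows why removing that attempt limit is the whole game. Everything here is made up for illustration (the secret passcode, the check function); it is not any vendor's tool, just the arithmetic of trying four-digit codes in order.

```python
import itertools

def try_passcode(guess, secret="7341"):
    """Stand-in for a device's passcode check (hypothetical secret)."""
    return guess == secret

def brute_force(attempt_limit=None):
    """Try every 4-digit passcode in order; stop if an attempt limit is hit."""
    for attempts, digits in enumerate(itertools.product("0123456789", repeat=4),
                                      start=1):
        if attempt_limit is not None and attempts > attempt_limit:
            return None, attempts - 1   # locked out, as an unmodified iPhone would be
        guess = "".join(digits)
        if try_passcode(guess):
            return guess, attempts
    return None, attempts

# With the usual 10-attempt limit the search fails almost immediately;
# with the limit stripped away, it succeeds after a few thousand tries.
print(brute_force(attempt_limit=10))    # (None, 10)
print(brute_force())                    # ('7341', 7342)
```

At machine speed, 10,000 four-digit combinations fall in moments, which is why the erase-after-10-failures limit, not the passcode itself, is the real defense on older devices.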

“The iPhone 5 is so old, you are guaranteed that Grayshift and Cellebrite can break into those every bit as easily as Apple could,” said Nicholas Weaver, a lecturer at the University of California, Berkeley, who has taught iPhone security.

Chuck Cohen, who recently retired as head of the Indiana State Police’s efforts to break into encrypted devices, said his team used a $15,000 device from Grayshift that enabled it to regularly get into iPhones, particularly older ones, though the tool didn’t always work.

In the San Bernardino case, the Justice Department’s Office of Inspector General later found the F.B.I. had not tried all possible solutions before trying to force Apple to unlock the phone. In the current case, Mr. Barr and other Justice Department officials have said they have exhausted all options, though they declined to detail exactly why third-party tools have failed on these phones as the authorities seek to learn if the gunman acted alone or coordinated with others.

“The F.B.I.’s technical experts — as well as those consulted outside of the organization — have played an integral role in this investigation,” an F.B.I. spokeswoman said. “The consensus was reached, after all efforts to access the shooter’s phones had been unsuccessful, that the next step was to reach out to start a conversation with Apple.”

Security researchers speculated that in the Pensacola case, the F.B.I. might still be trying a brute-force attack to get into the phones. They said major physical damage may have impeded any third-party tools from opening the devices. The Pensacola gunman had shot the iPhone 7 Plus once and tried destroying the iPhone 5, according to F.B.I. photos.

The F.B.I. said it fixed the iPhones in a lab so that they would turn on, but the authorities still couldn’t bypass their encryption. Security researchers and the former Apple executive said any damage that prevented third-party tools from working would also preclude a solution from Apple.

A Justice Department spokeswoman said in an email: “Apple designed these phones and implemented their encryption. It’s a simple, ‘front-door’ request: Will Apple help us get into the shooter’s phones or not?”

While Apple has closed loopholes that police have used to break into its devices and resisted some law enforcement requests for access, it has also routinely helped police get information from phones in cases that don’t require it to break its encryption. Apple has held seminars for police departments on how to quickly get into a suspect’s phone, and it has a hotline and dedicated team to aid police in time-sensitive cases.

In the past seven years, Apple has also complied with roughly 127,000 requests from American law enforcement agencies for data stored on its computer servers. Such data is unencrypted and access is possible without a customer’s passcode.

In 2016, when the standoff between Apple and the government was at its most acrimonious, Mr. Cook said Congress should pass a law to decide the boundaries between public safety and technological security. In court filings, Apple even identified an applicable law, the Communications Assistance for Law Enforcement Act.

On Monday, Mr. Barr said the Trump administration had revived talks with Congress to come up with such a law.

Jack Nicas reported from San Francisco, and Katie Benner from Washington.


Barr Asks Apple to Unlock Pensacola Killer’s Phones, Setting Up Clash

WASHINGTON — Attorney General William P. Barr declared on Monday that a deadly shooting last month at a naval air station in Pensacola, Fla., was an act of terrorism, and he asked Apple in an unusually high-profile request to provide access to two phones used by the gunman.

Mr. Barr’s appeal was an escalation of a continuing fight between the Justice Department and Apple pitting personal privacy against public safety.

“This situation perfectly illustrates why it is critical that the public be able to get access to digital evidence,” Mr. Barr said. He called on technology companies to find a solution and complained that Apple had provided no “substantive assistance,” a charge that the company strongly denied on Monday night, saying it had been working with the F.B.I. since the day of the shooting.

Detailing the results of the investigation into the Dec. 6 shooting that killed three sailors and wounded eight others, Mr. Barr said the gunman, Second Lt. Mohammed Saeed Alshamrani — a Saudi Air Force cadet training with the American military — had displayed extremist leanings.

Mr. Alshamrani warned on last year’s anniversary of the Sept. 11, 2001, attacks that “the countdown has begun” and posted other anti-American, anti-Israeli and jihadist social media messages, some within hours of attacking the base, Mr. Barr said. “The evidence shows that the shooter was motivated by jihadist ideology,” the attorney general said.

The government has also removed from the country some 21 Saudi students who trained with the American military, Mr. Barr said. He stressed that investigators found no connection to the shooting among the cadets, but said that some had links to extremist movements or possessed child pornography. Mr. Barr said the cases were too weak to prosecute but that Saudi Arabia kicked the trainees out of the program.

The battle between the government and technology companies over advanced encryption and other digital security measures has simmered for years. Apple, which stopped routinely helping the government unlock phones in late 2014 as it adopted a more combative stance and unveiled a more secure operating system, has argued that data privacy is a human rights issue. If Apple developed a way to allow the American government into its phones, its executives argued, hackers or foreign governments like China would exploit the tool.

But frustrated law enforcement officials accuse Apple of providing a haven for criminals. They have long pushed for a legislative solution to the problem of “going dark,” their term for how increasingly secure phones have made it harder to solve crimes, and the Pensacola investigation gives them a prominent chance to make their case.

In a statement Monday night, Apple said the substantive aid it had provided law enforcement agencies included giving investigators access to the gunman’s iCloud account and transaction data for multiple accounts.

The company’s statement did not say whether Apple engineers would help the government get into the phones themselves. It said that “Americans do not have to choose between weakening encryption and solving investigations” because there are now so many ways for the government to obtain data from Apple’s devices — many of which Apple routinely helps the government execute.

Apple will not back down from its unequivocal support of encryption that is impossible to crack, people close to the company said.

Justice Department officials said that they needed access to Mr. Alshamrani’s phones to see data and messages from encrypted apps like Signal or WhatsApp to determine whether he had discussed his plans with others at the base and whether he was acting alone or with help.

“We don’t want to get into a world where we have to spend months and even years exhausting efforts when lives are in the balance,” Mr. Barr said. “We should be able to get in when we have a warrant that establishes that criminal activity is underway.”

The confrontation echoed the legal standoff over an iPhone used by a gunman who killed 14 people in a terrorist attack in San Bernardino, Calif., in late 2015. Apple defied a court order to assist the F.B.I. in its efforts to search his device, setting off a fight over whether privacy enabled by impossible-to-crack encryption harmed public safety.

The San Bernardino dispute was resolved when the F.B.I. found a private company to bypass the iPhone’s encryption. Tensions between the two sides, however, remained, and Apple worked to ensure that neither the government nor private contractors could open its phones.


Mr. Barr said that Trump administration officials had again begun discussing a legislative fix.

But the F.B.I. has been bruised by Mr. Trump’s unsubstantiated complaints that former officials plotted to undercut his presidency, and by a major inspector general’s report last month that revealed serious errors in aspects of the Russia investigation. A broad bipartisan consensus among lawmakers to expand the bureau’s surveillance authorities is most likely out of reach, though some lawmakers singled out Apple for its refusal to change its stance.

“Companies shouldn’t be allowed to shield criminals and terrorists from lawful efforts to solve crimes and protect our citizens,” Senator Tom Cotton, Republican of Arkansas, said in a statement. “Apple has a notorious history of siding with terrorists over law enforcement. I hope in this case they’ll change course and actually work with the F.B.I.”

Apple typically complies with court orders to turn over information on its servers. But it said that it would turn over only the data it had, implying that it would not work to unlock the phones.

Investigators secured a court order within a day of the shooting, allowing them to search the phones, Mr. Barr said. He turned up the pressure on Apple a week after the F.B.I.’s top lawyer, Dana Boente, asked the company for help searching Mr. Alshamrani’s iPhones.

Officials said that the F.B.I. was still trying to gain access to the phones on its own and approached Apple only after asking other government agencies, foreign governments and third-party technology vendors for help, to no avail.

The devices were older models: an iPhone 7 with a fingerprint reader and an iPhone 5, according to a person familiar with the investigation.

Justice Department officials said that investigators have yet to make a final determination about whether Mr. Alshamrani conspired with others. They said that the Saudi government was offering “unprecedented” cooperation but that “we need to get into those phones.”

Mr. Barr and other law enforcement officials described a 15-minute shootout before security officers shot and killed Mr. Alshamrani. During the firefight, Mr. Alshamrani paused at one point to shoot one of his phones once, Mr. Barr said, adding that his other phone was also damaged but that the F.B.I. was able to repair them well enough to be searched.

Mr. Alshamrani also shot at photographs of President Trump and one of his predecessors, said David Bowdich, the deputy director of the F.B.I. A person familiar with the investigation identified the unnamed president as George W. Bush.

Mr. Alshamrani’s weapon was lawfully purchased in Florida under an exemption that allows nonimmigrant visa holders to buy firearms if they have a valid hunting license or permit, officials said.

Law enforcement officials have continued to discuss Mr. Alshamrani’s phones with Apple, they said.

“We’re not trying to weaken encryption, to be clear,” Mr. Bowdich said at a news conference, noting that the issue has come up with thousands of devices that investigators want to see in other cases.

“We talk about this on a daily basis,” he said. Mr. Bowdich was the bureau’s top agent overseeing the San Bernardino investigation and was part of the effort to push Apple to crack into the phone in that case.

But much has also changed for Apple in the years since Tim Cook, its chief executive, excoriated the Obama administration publicly and privately in 2014 for attacking strong encryption. Obama officials who were upset by Apple’s stance on privacy, along with its decision to shelter billions of dollars in offshore accounts and make its products almost exclusively in China, aired those grievances quietly.

Now Apple is fighting the Trump administration, and Mr. Trump has shown far more willingness to publicly criticize companies and public figures. When he recently claimed falsely that Apple had opened a manufacturing plant in Texas at his behest, the company remained silent rather than correct him.

At the same time, Apple has financially benefited more under Mr. Trump than under President Barack Obama. It reaped a windfall from the Trump administration’s tax cuts, and Mr. Trump said he might shield Apple from the country’s tariff war with China.

Mr. Barr had said last month that finding a way for law enforcement to gain access to encrypted technology was one of the Justice Department’s “highest priorities.”

Mr. Alshamrani, who was killed at the scene of the attack, came to the United States in 2017 and soon started strike-fighter training in Florida. Investigators believe he may have been influenced by extremists as early as 2015.

Mr. Barr rejected reports that other Saudi trainees had known of and recorded video of the shooting. Mr. Alshamrani arrived at the scene by himself, and others in the area began recording the commotion only after he had opened fire, Mr. Barr said. They and other Saudi cadets cooperated with the inquiry, he added.

Jack Nicas contributed reporting from San Francisco.


4 Things to Know About YouTube’s New Children’s Privacy Practices

In September, Google agreed to pay a $170 million fine and make privacy changes as regulators said that its YouTube platform had illegally harvested children’s personal information and used it to profit by targeting them with ads. The penalty and changes were part of an agreement with the Federal Trade Commission and the attorney general of New York, which had accused YouTube of violating the federal Children’s Online Privacy Protection Act.

On Monday, YouTube said it was beginning to introduce changes to address regulators’ concerns and better protect children. Here is what you need to know about those changes.

YouTube said that, starting Monday, it would begin to limit the collection and use of personal information from people who watched children’s videos, no matter the age of the viewer. Federal law prohibits online services aimed at children under 13 from collecting the personal information of those young users without parental consent.

YouTube said it had also turned off or limited some features on children’s videos tied to personal information. These include comments and live-chat features, as well as the ability to save videos to a playlist.

YouTube will no longer show ads on children’s videos that are targeted at viewers based on their web-browsing or other online activity data. Instead, the company said, it may now show ads based on the context of what people are viewing.

YouTube said viewers who watched a video made for children on its platform would now be more likely to see recommendations for other children’s videos.

In September, YouTube said it would require all video producers on its platform to designate their videos as made for children or not made for children. In November, it introduced a new setting to help producers flag children’s content, a designation that signals YouTube to limit data collection on those videos. The video service said that it was also using artificial intelligence to help identify children’s content and that it could override a video producer’s categorization if its system detected a mistake.

YouTube is one of the most popular platforms for children. Some animated videos on YouTube channels aimed at younger children — like Cocomelon Nursery Rhymes and ChuChu TV — have been viewed more than a billion times.

The platform’s new limits on data-mining send a signal to other popular sites offering children’s content that they also may be subject to the federal children’s online privacy law. Musical.ly, a wildly popular video social network now known as TikTok, also had to pay a fine last year to settle F.T.C. charges that it had illegally collected children’s personal information.


BigID bags another $50M round as data privacy laws proliferate

Almost exactly four months after BigID announced a $50 million Series C, the company was back today with another $50 million round. The Series D came entirely from Tiger Global Management, bringing the company’s total raised to $144 million.

What has drawn $100 million from investors in just four months is BigID’s mission: helping companies understand what data they hold and manage it against a backdrop of expanding privacy regulation, including GDPR in Europe and CCPA in California, which went into effect this month.

BigID CEO and co-founder Dimitri Sirota admits that his company formed at the right moment when it launched in 2016, but says he and his co-founders had an inkling that there would be a shift in how governments view data privacy.

“Fortunately for us, some of the requirements that we said were going to be critical, like being able to understand what data you collect on each individual across your entire data landscape, have come to [pass],” Sirota told TechCrunch. While he understands that there are lots of competing companies going after this market, he believes that being early helped his startup establish a brand identity earlier than most.

Meanwhile, the privacy regulation landscape continues to evolve. Even as California’s privacy legislation takes effect, many other states and countries are considering similar rules, and Canada is weighing an overhaul of its existing privacy regulations.

Sirota says that he wasn’t actually looking to raise either the C or the D, and in fact still has Series B money in the bank, but when big investors want to give you money on decent terms, you take it while the money is there. These investors clearly see the data privacy landscape expanding and want to get involved. He recognizes that economic conditions can change quickly, and it can’t hurt to have money in the bank for when that happens.

That said, Sirota says you don’t raise money to keep it in the bank. At some point, you put it to work. The company has big plans to expand beyond its privacy roots and into other areas of security in the coming year. Although he wouldn’t go into too much detail about that, he said to expect some announcements soon.

For a company that is only four years old, it has been amazingly proficient at raising money, with a $14 million Series A and a $30 million Series B in 2018, followed by the $50 million Series C last year and the $50 million round today. And, Sirota said, he didn’t even have to go looking for the latest funding. Investors came to him — no trips to Sand Hill Road, no pitch decks. Sirota wasn’t willing to discuss the company’s valuation, saying only that the investment was minimally dilutive.

BigID, which is based in New York City, already has some employees in Europe and Asia, but Sirota expects additional international expansion in 2020. Overall the company has around 165 employees at the moment, and he sees that going up to 200 by mid-year as the company makes a push into some new adjacencies.

Categories
Cameras Jennifer Ebichi Michael Egwuagu Murders, Attempted Murders and Homicides Pflugerville (Tex) privacy Ring Inc Security and Warning Systems Surveillance of Citizens by Government Uncategorized

Man Captured on Doorbell Camera Footage Confessing to Murder

A man was captured on home security camera footage confessing to the murder of his sister on Friday, shortly after she was stabbed to death in a Texas home, the authorities said.

The woman, Jennifer Chioma Ebichi, 32, had been stabbed at least a dozen times when the authorities found her on the kitchen floor at the home in Pflugerville, according to documents provided by the Travis County District Clerk’s Office. Her younger brother, Michael Egwuagu, 25, was arrested on a murder charge.

An arrest affidavit said one witness saw Mr. Egwuagu “exit the residence smiling and with a bloody kitchen knife in his hand stating, ‘I killed Jennifer.’ Michael’s clothing was covered in blood.”

It added that footage from a doorbell camera at the home corroborated the witness testimony.

The episode is one of several recent examples of doorbell cameras — increasingly affordable and popular security tools that can be connected to home Wi-Fi systems — yielding footage that becomes useful to the local authorities.

“Every time there is more surveillance and more captured of the lived experience, that will be helpful for police investigators,” said Andrew Guthrie Ferguson, a law professor and the author of “The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement.”

“The consequences are an erosion of privacy and security at our homes and in our private moments,” he added. “The trade-off is one that is hard, but also one I’m not sure citizens have fully understood when they decided to buy a little extra security for their home.”

One of the best-known doorbell camera brands is Ring, which makes a doorbell that doubles as a security camera and was acquired by Amazon in 2018. According to data shared publicly by the company, it now has partnerships with over 700 local police and sheriff’s departments, including the Travis County Sheriff’s Office.

The authorities can access footage via Ring’s Neighbors app, which people can use to share videos and monitor criminal activity in their neighborhood. When the police seek videos from a certain location, Ring asks users in the area if they are willing to share their footage.

Users can refuse, but the police can still obtain footage using other legal avenues, such as obtaining a warrant.


“Ring will not disclose user videos to police unless the user expressly consents or if disclosure is required by law, such as to comply with a warrant,” the company said in a statement Thursday. “Ring objects to overbroad or otherwise inappropriate legal demands as a matter of course.”

It was unclear whether a Ring camera was involved in the Pflugerville case; other popular home security camera brands include Wyze and Nest. The sheriff’s department declined to say which brand of camera had filmed Mr. Egwuagu on Friday.

The murder charge captured additional attention because Mr. Egwuagu had been known as a star football player at the University of Texas at San Antonio. He was a safety who tried out for National Football League scouts in 2017 and 2018.

After Mr. Egwuagu left the residence in Pflugerville, an Austin suburb, around 5 p.m. on Friday, witnesses said he knelt down in the street as though he were praying, then removed his clothing and placed it in a trash can, the arrest affidavit said. The arrest affidavit also said that Ms. Ebichi’s two children were present at the time of her death.

An autopsy showed that Ms. Ebichi had been in her first trimester of pregnancy when she died. Dr. J. Keith Pinckard, the chief medical examiner in Travis County, estimated that she had sustained one dozen to two dozen stab wounds, according to the arrest affidavit.

Mr. Egwuagu is being held on a $500,000 bond. A statement from the office of Krista A. Chacona, a lawyer representing Mr. Egwuagu, said: “We do not have any comment at this time except to say that this is a very painful and difficult time for the family. We would ask that people please respect their privacy and allow them time to grieve.”

In recent weeks, home security cameras have raised concerns about data leaks and hacking. Executives at Wyze, the company behind a budget-friendly home security camera, said this week that the information of 2.4 million of their customers had been exposed to the public because of an employee error.

And last month, there were reports of at least four individual cases of camera security systems being hacked; in one case involving a Ring security camera, a man was able to speak to an 8-year-old girl whose bedroom was being filmed. He used a racist slur and said he was Santa Claus.

On Wednesday, a violent episode that had been captured on home surveillance footage was posted on YouTube by the Las Vegas Metropolitan Police Department. The footage shows a woman who appears to be trying to escape from a man. He can be seen running after her, kicking her down some stairs and dragging her toward a white car. The police posted the video to seek help from the public in identifying the man and the woman.

“Police are going to see new opportunities, and they’re going to seize those opportunities because more information is obviously better for them,” Mr. Ferguson said. “But it all comes at a cost to a certain sense of personal privacy, and also the collective privacy of your neighborhood and your community and who’s surveilling whom in particular neighborhoods.”

Categories
California Computers and the Internet Data-Mining and Database Marketing e-commerce Law and Legislation Mobile Applications privacy Uncategorized

What Does California’s New Data Privacy Law Mean? Nobody Agrees

Millions of people in California are now seeing notices on many of the apps and websites they use. “Do Not Sell My Personal Information,” the notices may say, or just “Do Not Sell My Info.”

But what those messages mean depends on which company you ask.

Stopping the sale of personal data is just one of the new rights that people in California may exercise under a state privacy law that takes effect on Wednesday. Yet many of the new requirements are so novel that some companies disagree about how to comply with them.

Even now, privacy and security experts from different companies are debating compliance issues over private messaging channels like Slack.

The provision about selling data, for example, applies to companies that exchange the data for money or other compensation. Evite, an online invitation service that discloses some customer information for advertising purposes, said it would give people a chance to opt out if they do not want their data shared with third parties. By contrast, Indeed, a job search engine that shares users’ résumés and other information, posted a notice saying that people seeking to opt out “will be asked to delete their account.”


The issue of selling consumer data is so fraught that many companies are unwilling to discuss it publicly. Oracle, which has sold consumer information collected by dozens of third-party data brokers, declined to answer questions. T-Mobile, which has sold its customers’ location details, said it would comply with the law but refused to provide details.

“Companies have different interpretations, and depending on which lawyer they are using, they’re going to get different advice,” said Kabir Barday, the chief executive of OneTrust, a privacy management software service that has worked with more than 4,000 companies to prepare for the law. “I’ll call it a religious war.”

The new law has national implications because many companies, like Microsoft, say they will apply their changes to all users in the United States rather than give Californians special treatment. Federal privacy bills that could override the state’s law are stalled in Congress.

The California privacy law applies to businesses that operate in the state, collect personal data for commercial purposes and meet other criteria like generating annual revenue above $25 million. It gives Californians the right to see, delete and stop the sale of the personal details that all kinds of companies — app developers, retailers, restaurant chains — have on them.

“Businesses will have to treat that information more like it’s information that belongs, is owned by and controlled by the consumer,” said Xavier Becerra, the attorney general of California, “rather than data that, because it’s in possession of the company, belongs to the company.”

Some issues, like the practices that qualify as data selling, may be resolved by mid-2020, when Mr. Becerra’s office plans to publish the final rules spelling out how companies must comply with the law. His office issued draft regulations for the law in October. Other issues may become clearer if the attorney general sues companies for violating the privacy law.

For now, even the biggest tech companies have different interpretations of the law, especially over what it means to stop selling or sharing consumers’ personal details.

Google recently introduced a system for its advertising clients that restricts the use of consumer data to business purposes like fraud detection and ad measurement. Google said advertisers might choose to limit the uses of personal information for individual consumers who selected the don’t-sell-my-data option — or for all users in California.

Facebook, which provides millions of sites with software that tracks users for advertising purposes, is taking a different tack. In a recent blog post, Facebook said that “we do not sell people’s data,” and it encouraged advertisers and sites that used its services “to reach their own decisions on how to best comply with the law.”

Uber responded to Facebook’s notice by offering a new option for its users around the world to opt out of having the ride-hailing service share their data with Facebook for ad targeting purposes.

“Although we do not sell data, we felt like the spirit of the law encompassed this kind of advertising,” said Melanie Ensign, the head of security and privacy communications at Uber.

Evite, the online invitation service, decided in 2018 to stop selling marketing data that grouped its customers by preferences like food enthusiast or alcohol enthusiast. Since then, the company has spent more than $1 million and worked with two firms to help it understand its obligations under the privacy law and set up an automated system to comply, said Perry Evoniuk, the company’s chief technology officer.

Although Evite no longer sells personal information, the site has posted a “do not sell my info” link. Starting Wednesday, Mr. Evoniuk said, that notice will explain to users that Evite shares some user details — under ID codes, not real names — with other companies for advertising purposes. Evite will allow users to make specific choices about sharing that data, he said. Customers will also be able to make general or granular requests to see their data or delete it.

“We took a very aggressive stance,” Mr. Evoniuk said. “It’s beneficial to put mechanisms in place to give people very good control of their data across the board.”

Companies are wrangling with a part in the law that gives Californians the right to see the specific details that companies have compiled on them, like precise location information and facial recognition data. Residents may also obtain the inferences that companies have made about their behavior, attitudes, activities, psychology or predispositions.

Apple, Facebook, Google, Microsoft, Twitter and many other large tech companies already have automated services enabling users to log in and download certain personal data. Amazon said it would introduce a system that allowed all customers of its United States site to request access to their personal information.

But the types and extent of personal data that companies currently make available vary widely.

Apple, for instance, said its privacy portal allowed people whose identities it could verify to see all of the data associated with their Apple IDs — including their App Store activities and AppleCare support history.

Microsoft said its self-service system enabled users to see the most “relevant” personal information associated with their accounts, including their Bing search history and any interest categories the company had assigned them.

Lyft, the ride-hailing company, said it would introduce a tool on Wednesday that allowed users to request and delete their data.

A reporter who requested data from the Apple portal received it more than a week later; the company said its system might need about a week to verify the identity of a person seeking to see his or her data. Microsoft said it was unable to provide a reporter with a list of the categories it uses to classify people’s interests. And Lyft would not say whether it will show riders the ratings that drivers give them after each ride.

Experian Marketing Services, a division of the Experian credit reporting agency that segments consumers into socioeconomic categories like “platinum prosperity” and “tough times,” is staking out a tougher position.

In recent comments filed with Mr. Becerra’s office, Experian objected to the idea that companies would need to disclose “internally generated data about consumers.” Experian did not return emails seeking comment.

The wide variation in companies’ data-disclosure practices may not last. California’s attorney general said the law clearly requires companies to show consumers the personal data that has been compiled about them.

“That consumer, so long as they follow the process, should be given access to their information,” Mr. Becerra said. “It could be detailed information, if a consumer makes a very specific request about a particular type of information that might be stored or dispersed, or it could be a general request: ‘Give me everything you’ve got about me.’”

Categories
23andMe Ancestry.com Barbara Rae-Venter DeAngelo, Joseph James DNA (Deoxyribonucleic Acid) FamilyTreeDNA Forensic Science GEDmatch Inc Gene by Gene Ltd Genealogy privacy Uncategorized

What You’re Unwrapping When You Get a DNA Test for Christmas

The company GEDmatch, the DNA database that facilitated an arrest in the Golden State Killer case and in dozens of other cases since, emerged from a desire to connect people to their relatives. For the past decade, the site’s co-founder Curtis Rogers has been running the company out of a small yellow house in Lake Worth, Fla.

When Mr. Rogers first learned that the DNA of GEDmatch users had played a critical role in identifying a suspected serial killer, he was upset. “I didn’t think this was an appropriate use of our site,” he said in an interview in May 2018, five weeks after the arrest of Joseph James DeAngelo. This month, Mr. Rogers sold GEDmatch to Verogen, a commercial forensic company best known for providing police and F.B.I. labs with tools for making predictions about suspected criminals’ ancestry, eye color and hair color.

FamilyTreeDNA, a DNA database of 2 million people, similarly was built from its founders’ desire to help people connect with relatives. “We feel the only person that should have your DNA is you,” Bennett Greenspan, the company’s president, said in a news release in 2017. But it’s also a company that offers law-enforcement officials, for a fee of $800, the ability to search its database for relatives of suspected killers and rapists.

So what do these developments mean for that DNA kit sitting under your Christmas tree? Men’s Journal calls them “one of the hottest gifting ideas,” and US Weekly promises that “they’re going to love it, no matter how tough of a critic they are.” But is using one of these kits also opening the door to letting the police use your DNA to arrest your cousin?

The answer in this rapidly evolving realm depends largely on which sites you join and the boxes you check off when you do. And even if you never join any of these sites, their policies could affect you so long as one of your 800 closest relatives has.

Ancestry and 23andMe: 15 million and 10 million DNA profiles

If there is a DNA test under your tree it probably came from one of these two companies. Both market themselves extensively during the holidays. 23andMe likens its myriad ancestry and health reports to “150 personalized gifts in one colorful box,” while Ancestry takes a more sentimental tack, urging families to discover their “unique story.”

Can the police search these databases? Short answer: No. But the fight over access is intensifying.

Longer answer: Each of these databases is big enough to identify nearly all 300 million Americans’ DNA through their cousins, researchers have found. This makes them a tantalizing tool for law enforcement officials, who say the data could help them solve thousands of violent crimes and identify unknown victims if only they could put a name to associated DNA.

To identify a suspect’s blood, for example, investigators do not need to find the person who cut his hand smashing through a window. They just need to match to a couple of his second or third cousins in a DNA database. From there, a genetic genealogist can puzzle out how these cousins are related to one another and the suspect by building out a series of family trees. Often this leads to an arrest.

Part of the reason that these databases have grown so rapidly, however, is that they have promised to keep law enforcement out.

Both companies require a court order for access and say that they have not yet permitted law enforcement to conduct a genetic search. But interest is high. Eric Heath, Ancestry’s chief privacy officer, said in an interview last month that he received 24 emails from law enforcement in 2019 requesting access to the site. These emails included a request to upload DNA to try to identify a suspect in a cold case and a request to search for relatives of an unidentified body. Mr. Heath said he responded by sharing the site’s policy, and the requests ended there.

But they may not in the future. In July, a judge in Florida granted a detective a search warrant to obtain access to nearly 1 million GEDmatch users (more on GEDmatch below) who had elected not to help law enforcement. Many privacy advocates and genealogists were horrified and warned that the development would encourage warrants for searching bigger sites like Ancestry.

Anne Marie Schubert, the Sacramento County district attorney, has been involved in advising law enforcement agencies on how to solve crimes with genealogy sites since her agency helped crack the Golden State Killer case. She said she supported the judge’s ruling. “I commend Florida for taking that first step,” she said, calling it “a natural progression in an evolving world.” She said she believes that investigators will eventually gain access to Ancestry.

Should such a search warrant be served to Ancestry, Mr. Heath said, the company is prepared to fight. “We’re willing to push back and narrow the scope and squash it however we need to,” he said. 23andMe made a similar commitment on its blog.

Ms. Schubert is among a group of people who are encouraging the big companies to allow users to opt in to help law enforcement. Mr. Heath said that won’t happen. Other databases serve that purpose, he said, adding, “I don’t want to manage that.”

GEDmatch: 1.3 million DNA profiles

It is unlikely that anyone will be getting you the gift of a GEDmatch subscription. Uploading to the site is free and the company does not offer DNA tests. But when uploads to other sites rise, uploads to GEDmatch typically follow. That’s because the site functions as a means to get more from your DNA; you can take a DNA file analyzed by another company, like 23andMe, and upload it to GEDmatch to find more relatives and ancestry data.

Can the police search it? Short answer: Maybe.

Longer answer: People who upload to GEDmatch can choose among four settings. These include “help law enforcement,” “opt out of law enforcement searches” and “research mode,” which is supposed to hide your profile from everyone.

It recently became clear, however, that no one on the site is fully protected from law enforcement searches. When the judge in Florida granted a detective a warrant to search the full database, that included nearly a million profiles of people who had chosen not to help police. This search warrant applies only to this case, but could encourage other detectives to request similar warrants.

Just last week, a forensic company purchased GEDmatch. The move delivers dueling messages about the site’s future. The new owner, Verogen, said that it would actively fight future search warrants and that users can still opt out of helping police. But Verogen is also a company that has built its business, so far, on catering to law enforcement.

FamilyTreeDNA: 2.5 million DNA profiles

Like the others, FamilyTreeDNA wants to help connect you with your family history. But unlike most other such companies, it actively welcomes law enforcement uploads. The company offers a variety of packages for police departments, including one that comes with a genetic genealogist who works for the site to help authorities parse their results and potentially solve a crime.

(Barbara Rae-Venter, who helped crack the Golden State Killer case, is the director of this new unit.)

Can the police search it? Yes. If you join the site without modifying the settings, you are agreeing to help law enforcement officials identify DNA from crime scenes. If you opt out or are based in Europe, however, you will not appear in search results for police officers who follow the rules.

FamilyTreeDNA has created a highly unusual vetting system for each formal law enforcement request. Connie Bormans, the company’s lab director, reviews the details of each case. “If it meets the criteria, then we’ll say O.K., this is a good candidate; we’ll send you all the paperwork we need.” Recently, for example, she denied a request from a law enforcement agency involving a missing person case that wasn’t technically an abduction, she said.

MyHeritage: 2.5 million DNA profiles

MyHeritage exists primarily to help people find relatives and build out family trees. Recently, the company also began offering health risk reports.

Can the police search it? Short answer: They are not supposed to.

Longer answer: Law enforcement is “strictly prohibited” from using the site without a court order, according to the terms of service. Logistically it would be possible, however. Like GEDmatch and FamilyTreeDNA, MyHeritage accepts DNA files analyzed elsewhere.

Categories
DNS DNS over HTTPS DOH google NCTA Policy privacy Security

Why big ISPs aren’t happy about Google’s plans for encrypted DNS



When you visit a new website, your computer probably submits a request to the domain name system (DNS) to translate the domain name (like arstechnica.com) to an IP address. Currently, most DNS queries are unencrypted, which raises privacy and security concerns. Google and Mozilla are trying to address these concerns by adding support in their browsers for sending DNS queries over the encrypted HTTPS protocol.
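The mechanics are simple enough to sketch. Below is an illustrative Python snippet — not Google’s or Mozilla’s actual implementation — that builds a minimal DNS query in wire format and encodes it into a DoH GET URL in the style of RFC 8484; the resolver endpoint shown is just an example:

```python
import base64
import struct

def build_dns_query(hostname: str) -> bytes:
    """Build a minimal DNS wire-format query for an A record."""
    header = struct.pack(
        ">HHHHHH",
        0,       # query ID 0, as RFC 8484 recommends for cacheability
        0x0100,  # flags: standard query, recursion desired
        1,       # QDCOUNT: one question
        0, 0, 0, # no answer/authority/additional records
    )
    # Encode the name as length-prefixed labels, ending with a zero byte.
    question = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    question += struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

def doh_get_url(resolver: str, hostname: str) -> str:
    """Encode the query as base64url (unpadded) into a DoH GET URL."""
    wire = build_dns_query(hostname)
    b64 = base64.urlsafe_b64encode(wire).rstrip(b"=").decode("ascii")
    return f"{resolver}?dns={b64}"

print(doh_get_url("https://dns.google/dns-query", "arstechnica.com"))
```

A DoH client sends this URL over HTTPS, so an on-path observer sees only encrypted traffic to the resolver instead of a plaintext DNS query naming the site being visited.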

But major Internet service providers have cried foul. In a September 19 letter to Congress, Big Cable and other telecom industry groups warned that Google’s support for DNS over HTTPS (DoH) “could interfere on a mass scale with critical Internet functions, as well as raise data-competition issues.”

On Sunday, The Wall Street Journal reported that the House Judiciary Committee is taking these concerns seriously. In a September 13 letter, the Judiciary Committee asked Google for details about its DoH plans—including whether Google plans to use data collected via the new protocol for commercial purposes.


Categories
Biz & IT GPS privacy Security trackers vulnerabilities

600,000 GPS trackers for people and pets are using 123456 as a password

Dog plush toy with a tracker attached (credit: Shenzhen i365 Tech)

An estimated 600,000 GPS trackers for monitoring the location of kids, seniors, and pets contain vulnerabilities that open users up to a host of creepy attacks, researchers from security firm Avast have found.

The $25 to $50 devices are small enough to wear on a necklace or stash in a pocket or car dash compartment. Many also include cameras and microphones. They’re marketed on Amazon and other online stores as inexpensive ways to help keep kids, seniors, and pets safe. Ignoring the ethics of attaching a spying device to the people we love, there’s another reason for skepticism. Vulnerabilities in the T8 Mini GPS Tracker Locator and almost 30 similar models from the same manufacturer, Shenzhen i365 Tech, leave users open to eavesdropping, spying, and spoofing attacks that falsify users’ true location.

Researchers at Avast Threat Labs found that ID numbers assigned to each device were based on its International Mobile Equipment Identity, or IMEI. Even worse, during manufacturing, devices were assigned precisely the same default password of 123456. The design allowed the researchers to find more than 600,000 devices actively being used in the wild with that password. As if that wasn’t bad enough, the devices transmitted all data in plaintext using commands that were easy to reverse engineer.
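To see why IMEI-derived IDs are so easy to sweep, note that an IMEI is just 14 digits plus a Luhn check digit, so valid candidate IDs can be generated sequentially. The sketch below is illustrative only — the TAC prefix is made up, and this is not Avast’s actual tooling:

```python
def luhn_check_digit(digits: str) -> int:
    """Compute the Luhn check digit that completes a numeric prefix."""
    total = 0
    # From the rightmost digit of the prefix, double every other digit.
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 0:  # positions 0, 2, 4, ... from the right are doubled
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10

# A hypothetical 8-digit TAC (device-model code) plus a serial number:
tac = "86723902"          # made-up prefix, not a real allocation
for serial in range(3):   # an attacker would sweep the whole serial space
    prefix = f"{tac}{serial:06d}"
    imei = prefix + str(luhn_check_digit(prefix))
    print(imei, "-> try password 123456")
```

With IDs this predictable and a shared default password, enumerating live devices becomes a simple counting exercise — which is consistent with the researchers finding over 600,000 of them.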
