“In July 2020, the feds indicted more Chinese government hackers for their part in “a hacking campaign lasting more than 10 years to the present, targeting companies in countries with high technology industries, including the United States, Australia, Belgium, Germany, Japan, Lithuania, the Netherlands, Spain, South Korea, Sweden, and the United Kingdom.” In September of the same year, the U.S. Cybersecurity and Infrastructure Security Agency announced that hackers with China’s Ministry of State Security used “commercially available information sources and open-source exploitation tools to target U.S. Government agency networks.”
In March of this year, Mandiant, a cybersecurity firm, revealed that hackers sponsored by the Chinese state were able to “successfully compromise at least six U.S. state government networks.”
Many reports about state-sponsored hacking note that this isn’t a one-sided affair. U.S. officials don’t advertise it, but there’s evidence they’re doing their part to steal sensitive data from Chinese companies and government agencies.”
“Over the coming weeks, AT&T is rolling out cellphone location tracking that’s designed to route emergency calls to 911 more quickly. The company says the new feature will be nationwide by the end of June and should make it easier for, say, an ambulance to reach someone experiencing a medical emergency. At first glance, it seems like a no-brainer. But it’s also a reminder that as phone companies promise to save lives, they’re also using a lot more data about you in the process.”
“Compliance costs are high, and fines for noncompliance are significant”
“The working paper’s four authors drilled down into the data and determined that GDPR drove roughly a third of available apps out of the market and also suppressed the introduction of new apps. New app introductions in the quarters following the launch of GDPR enforcement dropped by half.
“Whatever the privacy benefits of GDPR, they come at substantial costs in foregone innovation,” the authors note.”
“a sharp decline in both successful and unsuccessful apps entering the market. It wasn’t just bad or predatory apps that were affected by GDPR.”
“They also calculate that GDPR raises costs to produce apps by more than 30 percent.”
“The report’s authors conclude that when the quality of a product is unpredictable (like an unknown or not-yet-existent app), the ease of entry into a marketplace is important to help determine its value to consumers. When regulatory barriers like GDPR drive up entry costs, then there can be “substantial [consumer] welfare losses” in the form of stillborn products and services we might want or need but never see.”
“It’s called a “keyword warrant,” and it’s basically an open request for information on anyone who searches for particular terms online. Instead of the government saying, “I want all of arson suspect John Doe’s Google searches,” it’s, “I want information on all the people who searched Google for ‘arson.’”
The problem is evident. In the first scenario, investigators have already identified a suspect based on evidence that they present to a judge, the typical standard for requesting a search warrant. In the second scenario, the government is asking search engines to provide data that it can use for whatever reason. It’s an open invitation for a fishing expedition. And many innocent people could get caught in the net.”
“We are often told that law enforcement must have a way to get around strong encryption technologies in order to catch bad guys. Such a “backdoor” into security techniques would only be used when necessary and would be closely guarded so it would not fall into the wrong hands, the story goes.
The intelligence community does not yet have a known custom-built backdoor into encryption. But intelligence agencies do hold a trove of publicly unknown vulnerabilities, called “zero days,” which they use to obtain hard-to-get data. One would hope that government agencies, especially those explicitly dedicated to security, could adequately protect these potent weapons.
A recently released 2017 DOJ investigation into the 2016 breach of the CIA Center for Cyber Intelligence’s (CCI) “Vault 7” hacking tools, which WikiLeaks began publishing in 2017, suggests that might be too big of an ask. Not only was the CCI found to be more interested in “building up cyber tools than keeping them secure”; the nation’s top spy agency also routinely made rookie security mistakes that ultimately allowed personnel to leak the goods to WikiLeaks.”
“It would be reassuring, in a sense, if the FBI’s misfeasance could be explained by anti-Trump bias. But as Horowitz noted in his report, the fact that “so many basic and fundamental errors were made by three separate, hand-picked teams on one of the most sensitive FBI investigations,” one that “was briefed to the highest levels within the FBI” and “FBI officials expected would eventually be subjected to close scrutiny,” suggests a much deeper problem involving unrestrained overzealousness, confirmation bias, tunnel vision, and groupthink—tendencies that threaten all Americans who value their privacy and reputations.
Even Comey, who claims the dishonesty described by Horowitz “does not reflect the FBI culture of compliance and candor,” wonders if the failure might be “systemic,” meaning there could be “problems with other cases.””