How the NSA is disrupting foreign hackers targeting COVID-19 vaccine research

The headlines aren’t always kind to the National Security Agency, a spy agency that operates almost entirely in the shadows. But a year ago the NSA launched its Cybersecurity Directorate, which has since emerged as one of the agency’s more visible divisions.

At its core, the directorate focuses on defending and securing the critical national security systems that the government uses for its sensitive and classified communications. But it has become best known for publicizing emerging, large-scale cyber threats from foreign hackers. In the past year the directorate has warned about attacks targeting the secure boot features in most modern computers, and doxxed a malware operation linked to Russian intelligence. By going public, NSA aims to make it harder for foreign hackers to reuse their tools and techniques, while helping to defend critical systems at home.

But six months after the directorate started its work, COVID-19 was declared a pandemic and large swathes of the world — and the U.S. — went into lockdown, prompting hackers to shift gears and change tactics.

“The threat landscape has changed,” Anne Neuberger, NSA’s director of cybersecurity, told TechCrunch at Disrupt 2020. “We’ve moved to telework, we move to new infrastructure, and we’ve watched cyber adversaries move to take advantage of that as well,” she said.

Publicly, the NSA advised on which videoconferencing and collaboration software was secure, and warned about the risks associated with virtual private networks, whose usage boomed after lockdowns began.

But behind the scenes, the NSA is working with federal partners to help protect the efforts to produce and distribute a vaccine for COVID-19, a feat that the U.S. government called Operation Warp Speed. News of NSA’s involvement in the operation was first reported by Cyberscoop. As the world races to develop a working COVID-19 vaccine, which experts say is the only long-term way to end the pandemic, NSA and its U.K. and Canadian partners went public with another Russian intelligence operation aimed at targeting COVID-19 research.

“We’re part of a partnership across the U.S. government, we each have different roles,” said Neuberger. “The role we play as part of ‘Team America for Cyber’ is working to understand foreign actors, who are they, who are seeking to steal COVID-19 vaccine information — or more importantly, disrupt vaccine information or shake confidence in a given vaccine.”

Neuberger said that protecting the pharma companies developing a vaccine is just one part of the massive supply chain operation that goes into getting a vaccine out to millions of Americans. Ensuring the cybersecurity of the government agencies tasked with approving a vaccine is also a top priority.

Here are more takeaways from the talk, and you can watch the interview in full below:

Why TikTok is a national security threat

TikTok is just days away from an app store ban, after the Trump administration earlier this year accused the Chinese-owned company of posing a threat to national security. But the government has been less than forthcoming about what specific risks the video sharing app poses, only alleging that the app could be compelled to spy for China. Beijing has long been accused of cyberattacks against the U.S., including the massive breach of classified government employee files from the Office of Personnel Management in 2014.

Neuberger said that the “scope and scale” of the TikTok app’s data collection makes it easier for Chinese spies to answer “all kinds of different intelligence questions” on U.S. nationals. She conceded that U.S. tech companies like Facebook and Google also collect large amounts of user data, but said there are “greater concerns on how [China] in particular could use all that information collected against populations other than its own.”

NSA is privately disclosing security bugs to companies

The NSA is trying to be more open about the vulnerabilities it finds and discloses, Neuberger said. She told TechCrunch that the agency has shared a “number” of vulnerabilities with private companies this year, but “those companies did not want to give attribution.”

One exception was earlier this year, when Microsoft confirmed NSA had found and privately reported a major cryptographic flaw in Windows 10 that could have allowed hackers to run malware masquerading as a legitimate file. The bug was deemed dangerous enough that NSA disclosed it rather than keep it for its own use, and Microsoft patched the flaw.

Only a few years earlier, the spy agency was criticized for finding and using a Windows vulnerability to conduct surveillance instead of alerting Microsoft to the flaw. The exploit was later leaked and used to infect hundreds of thousands of computers with the WannaCry ransomware, causing billions of dollars’ worth of damage by some estimates.

As a spy agency, NSA exploits flaws and vulnerabilities in software to gather intelligence on adversaries. Each flaw it finds has to run through the Vulnerabilities Equities Process, which decides whether the government discloses a bug or retains it for spying.

Fearing coronavirus, a Michigan college tracks its students with a flawed app

Schools and universities across the United States are split on whether to open for the fall semester, thanks to the ongoing pandemic.

Albion College, a small liberal arts school in Michigan, said in June it would allow its nearly 1,500 students to return to campus for the new academic year starting in August. Lectures would be limited in size and the semester would finish by Thanksgiving rather than December. The school said it would test both staff and students upon their arrival to campus and throughout the academic year.

But less than two weeks before students began arriving on campus, the school announced it would require them to download and install a contact-tracing app called Aura, which it says will help it tackle any coronavirus outbreak on campus.

There’s a catch. The app is designed to track students’ real-time locations around the clock, and there is no way to opt out.

The Aura app lets the school know when a student tests positive for COVID-19. It also comes with a contact-tracing feature that alerts students when they have come into close proximity with a person who tested positive for the virus. But the feature requires constant access to the student’s real-time location, which the college says is necessary to track the spread of any exposure.

The school’s mandatory use of the app sparked privacy concerns and prompted parents to launch a petition to make using the app optional.

Worse, the app had at least two security vulnerabilities that were discovered only after it was rolled out. One of the vulnerabilities allowed access to the app’s back-end servers. The other allowed us to infer a student’s COVID-19 test results.

The vulnerabilities were fixed. But students are still expected to use the app or face suspension.

Track and trace

Exactly how Aura came to be and how Albion became its first major customer is a mystery.

Aura was developed by Nucleus Careers in the months after the pandemic began. Nucleus Careers is a Pennsylvania-based recruiting firm founded in 2020, with no apparent history or experience in building healthcare apps besides a brief mention in a recent press release. The app was built in partnership with Genetworx, a Virginia-based lab providing coronavirus tests. (TechCrunch asked Genetworx about the app and its involvement, but did not hear back from the company.)

The app helps students locate and schedule COVID-19 testing on campus. Once a student is tested for COVID-19, the results are fed into the app.

If the test comes back negative, the app displays a QR code which, when scanned, says the student is “certified” free of the virus. If the student tests positive or has yet to be tested, the student’s QR code will read “denied.”

Aura uses the student’s real-time location to determine if they have come into contact with another person with the virus. Most other contact-tracing apps use nearby Bluetooth signals, which experts say is more privacy-friendly.

Hundreds of academics have argued that collecting and storing location data is bad for privacy.

The Aura app generates a QR code based on the student’s COVID-19 test results. Scan the QR code to reveal the student’s test result status. (Image: TechCrunch)

In addition to having to install the app, students were told they are not allowed to leave campus for the duration of the semester without permission over fears that contact with the wider community might bring the virus back to campus.

If a student leaves campus without permission, the app will alert the school, and the student’s ID card will be locked and access to campus buildings will be revoked, according to an email to students, seen by TechCrunch.

Students are not allowed to turn off their location and can be suspended and “removed from campus” if they violate the policy, the email read.

Private universities in the U.S. like Albion can largely set and enforce their own rules and have been likened to “shadow criminal justice systems — without any of the protections or powers of a criminal court,” where students can face discipline and expulsion for almost any reason with little to no recourse. Last year, TechCrunch reported on a student at Tufts University who was expelled for alleged grade hacking, despite exculpatory evidence in her favor.

Albion said in an online Q&A that the “only time a student’s location data will be accessed is if they test positive or if they leave campus without following proper procedure.” But the school has not said how it will ensure that student location data is not improperly accessed, or who has access.

“I think it’s more creepy than anything and has caused me a lot of anxiety about going back,” one student going into their senior year, who asked not to be named, told TechCrunch.

A ‘rush job’

One Albion student was not convinced the app was safe or private.

The student, who asked to go by her Twitter handle @Q3w3e3, decompiles and analyzes apps on the side. “I just like knowing what apps are doing,” she told TechCrunch.

Buried in the app’s source code, she found hardcoded secret keys for the app’s backend servers, hosted on Amazon Web Services. She tweeted her findings — with careful redactions to prevent misuse — and reported the problems to Nucleus, but did not hear back.
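Hardcoded cloud credentials are a well-known class of bug, and scanning decompiled sources for them is routine. The sketch below shows, in Python, how an analyst might grep extracted app code for strings shaped like AWS credentials; the patterns and the sample snippet are illustrative only, not the actual keys found in the Aura app.

```python
import re

# Patterns matching the general shape of AWS access key IDs ("AKIA" + 16
# uppercase alphanumerics) and of 40-character secret keys assigned near
# the word "aws". These are heuristics, not an exhaustive scanner.
AWS_KEY_ID = re.compile(r"\bAKIA[0-9A-Z]{16}\b")
AWS_SECRET = re.compile(r"(?i)aws(.{0,20})?['\"][0-9a-zA-Z/+]{40}['\"]")

def find_hardcoded_keys(source_text):
    """Return any strings in the source that look like AWS credentials."""
    hits = AWS_KEY_ID.findall(source_text)
    hits += [m.group(0) for m in AWS_SECRET.finditer(source_text)]
    return hits

# Example against a fabricated snippet of decompiled code
# (AKIAIOSFODNN7EXAMPLE is AWS's own documentation placeholder key):
sample = 'String accessKey = "AKIAIOSFODNN7EXAMPLE";'
print(find_hardcoded_keys(sample))  # ['AKIAIOSFODNN7EXAMPLE']
```

Tools such as truffleHog and Gitleaks automate this kind of pattern matching at scale, which is partly why shipping credentials inside a client app is considered indefensible.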

A security researcher, who asked to go by her handle Gilda, was watching the tweets about Aura roll in. Gilda also dug into the app and found and tested the keys.

“The keys were practically ‘full access’,” Gilda told TechCrunch. She said the keys — since changed — gave her access to the app’s databases and cloud storage in which she found patient data, including COVID-19 test results with names, addresses and dates of birth.

Nucleus pushed out an updated version of the app on the same day with the keys removed, but did not acknowledge the vulnerability.

TechCrunch also wanted to look under the hood to see how Aura works. We used a network analysis tool, Burp Suite, to understand the network data going in and out of the app. (We’ve done this a few times before.) Using our spare iPhone, we registered an Aura account and logged in. The app normally pulls in recent COVID-19 tests. In our case, we didn’t have any, and so the scannable QR code generated by the app declared that we had been “denied” clearance to enter campus — as expected.

But our network analysis tool showed that the QR code was not generated on the device but on a hidden part of Aura’s website. The web address that generated the QR code included the Aura user’s account number, which isn’t visible from the app. If we increased or decreased the account number in the web address by a single digit, it generated a QR code for that user’s Aura account.

In other words, because we could generate another user’s QR code, we could also see that student’s full name, their COVID-19 test result status and the date the student was certified or denied.
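The flaw described above is a textbook insecure direct object reference (IDOR): a sequential, guessable ID in a URL with no server-side check that the requester owns the record. The sketch below illustrates the vulnerable pattern and the standard mitigation; the URL and parameter names are hypothetical, not Aura's actual API.

```python
import secrets

def qr_code_url(account_number):
    # Vulnerable pattern: a sequential ID in the URL, so any authenticated
    # (or even unauthenticated) client can request someone else's record.
    return f"https://app.example.com/qr?account={account_number}"

def neighboring_urls(account_number):
    """URLs an attacker could try simply by nudging the ID up or down."""
    return [qr_code_url(account_number - 1), qr_code_url(account_number + 1)]

def make_qr_token():
    # Common mitigation: key the endpoint on an unguessable per-user token,
    # and still verify ownership server-side on every request.
    return secrets.token_urlsafe(32)  # ~256 bits of randomness

print(neighboring_urls(1042))
# ['https://app.example.com/qr?account=1041', 'https://app.example.com/qr?account=1043']
```

Random tokens alone are not a complete fix — the server must still tie each token to an authenticated session — but they close off the trivial increment-the-number enumeration demonstrated here.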

TechCrunch did not enumerate each QR code, but through limited testing found that the bug may have exposed about 15,000 QR codes.

We described the app’s vulnerabilities to Will Strafach, a security researcher and chief executive at Guardian Firewall. Strafach said the app sounded like a “rush job,” and that the enumeration bug could be easily caught during a security review. “The fact that they were unaware tells me they did not even bother to do this,” he said. And, the keys left in the source code, said Strafach, suggested “a ‘just-ship-it’ attitude to a worrisome extreme.”

An email sent by Albion president Matthew Johnson, dated August 18 and shared with TechCrunch, confirmed that the school has since launched a security review of the app.

We sent Nucleus several questions, including about the vulnerabilities and whether the app had gone through a security audit. Nucleus fixed the QR code vulnerability after TechCrunch detailed the bug, but a spokesperson for the company, Tony Defazio, did not provide comment. “I advised the company of your inquiry,” he said. The spokesperson did not return follow-up emails.

In response to the student’s findings, Albion said that the app was compliant with the Health Insurance Portability and Accountability Act, or HIPAA, which governs the privacy of health data and medical records. HIPAA also holds companies — including universities — accountable for security lapses involving health data. That can mean heavy fines or, in some cases, prosecution.

Albion spokesperson Chuck Carlson did not respond to our emails requesting comment.

At least two other schools, Bucknell University and Temple University, are reopening for the fall semester by requiring students to present two negative COVID-19 tests through Genetworx. The schools are not using Aura, but their own in-house student app to deliver the test results.

Albion students, meanwhile, are split on whether to comply, or refuse and face the consequences. @Q3w3e3 said she will not use the app. “I’m trying to work with the college to find an alternative way to be tested,” she told TechCrunch.

Parents have also expressed their anger at the policy.

“I absolutely hate it. I think it’s a violation of her privacy and civil liberties,” said Elizabeth Burbank, a parent of an Albion student, who signed the petition against the school’s tracking effort.

“I do want to keep my daughter safe, of course, and help keep others safe as well. We are more than happy to do our part. I do not believe however, a GPS tracker is the way to go,” she said. “Wash our hands. Eat healthy. And keep researching treatments and vaccines. That should be our focus.

“I do intend to do all I can to protect my daughter’s right to privacy and challenge her right to free movement in her community,” she said.


Send tips securely over Signal and WhatsApp to +1 646-755-8849 or send an encrypted email to: zack.whittaker@protonmail.com

Google launches the final beta of Android 11

With the launch of Android 11 getting closer, Google today launched the third and final beta of its mobile operating system ahead of its general availability. Google had previously delayed the beta program by about a month because of the coronavirus pandemic.

Since Android 11 had already reached platform stability with Beta 2, most of the changes here are fixes and optimizations. As a Google spokesperson noted, “this beta is focused on helping developers put the finishing touches on their apps as they prepare for Android 11, including the official API 30 SDK and build tools for Android Studio.”

The one exception is a set of updates to the Exposure Notification System contact-tracing API, which can now be used without turning on device location settings. Exposure Notification is an exception here, as all other Android apps need to have location settings on (and user permission to access them) to perform the kind of Bluetooth scanning Google is using for this API.

Otherwise, there are no surprises here, given that this has already been a pretty lengthy preview cycle. Mostly, Google really wants developers to make sure their apps are ready for the new version, which includes quite a few changes.

If you are brave enough, you can get the latest beta over the air as part of the Android Beta program. It’s available for Pixel 2, 3, 3a, 4 and (soon) 4a users.

Legal clouds gather over US cloud services, after CJEU ruling

In the wake of yesterday’s landmark ruling by Europe’s top court — striking down a flagship transatlantic data transfer framework called Privacy Shield, and cranking up the legal uncertainty around processing EU citizens’ data in the U.S. in the process — Europe’s lead data protection regulator has fired its own warning shot at the region’s data protection authorities (DPAs), essentially telling them to get on and do the job of intervening to stop people’s data flowing to third countries where it’s at risk.

Countries like the U.S.

The original complaint that led to the Court of Justice of the EU (CJEU) ruling focused on Facebook’s use of a data transfer mechanism called Standard Contractual Clauses (SCCs) to authorize moving EU users’ data to the U.S. for processing.

Complainant Max Schrems asked the Irish Data Protection Commission (DPC) to suspend Facebook’s SCC data transfers in light of U.S. government mass surveillance programs. Instead, the regulator went to court to raise wider concerns about the legality of the transfer mechanism.

That in turn led Europe’s top judges to nuke the Commission’s adequacy decision, which underpinned the EU-U.S. Privacy Shield — meaning the U.S. no longer has a special arrangement greasing the flow of personal data from the EU. Yet, at the time of writing, Facebook is still using SCCs to process EU users’ data in the U.S. Much has changed, but the data hasn’t stopped flowing — yet.

Yesterday the tech giant said it would “carefully consider” the findings and implications of the CJEU decision on Privacy Shield, adding that it looked forward to “regulatory guidance.” It certainly didn’t offer to proactively flip a kill switch and stop the processing itself.

Ireland’s DPA, meanwhile, which is Facebook’s lead data regulator in the region, sidestepped questions over what action it would be taking in the wake of yesterday’s ruling — saying it (also) needed (more) time to study the legal nuances.

The DPC’s statement also only went so far as to say the use of SCCs for taking data to the U.S. for processing is “questionable” — adding that case by case analysis would be key.

The regulator remains the focus of sustained criticism in Europe over its enforcement record for major cross-border data protection complaints — with still zero decisions issued more than two years after the EU’s General Data Protection Regulation (GDPR) came into force, and an ever-growing backlog of open investigations into the data processing activities of platform giants.

In May, the DPC finally submitted to other DPAs for review its first draft decision on a cross-border case (an investigation into a Twitter security breach), saying it hoped the decision would be finalized in July. At the time of writing we’re still waiting for the bloc’s regulators to reach consensus on that.

The painstaking pace of enforcement around Europe’s flagship data protection framework remains a problem for EU lawmakers — whose two-year review last month called for uniformly “vigorous” enforcement by regulators.

The European Data Protection Supervisor (EDPS) made a similar call today, in the wake of the Schrems II ruling — which only looks set to further complicate the process of regulating data flows by piling yet more work on the desks of underfunded DPAs.

“European supervisory authorities have the duty to diligently enforce the applicable data protection legislation and, where appropriate, to suspend or prohibit transfers of data to a third country,” writes EDPS Wojciech Wiewiórowski, in a statement, which warns against further dithering or can-kicking on the intervention front.

“The EDPS will continue to strive, as a member of the European Data Protection Board (EDPB), to achieve the necessary coherent approach among the European supervisory authorities in the implementation of the EU framework for international transfers of personal data,” he goes on, calling for more joint working by the bloc’s DPAs.

Wiewiórowski’s statement also highlights what he dubs “welcome clarifications” regarding the responsibilities of data controllers and European DPAs — to “take into account the risks linked to the access to personal data by the public authorities of third countries.”

“As the supervisory authority of the EU institutions, bodies, offices and agencies, the EDPS is carefully analysing the consequences of the judgment on the contracts concluded by EU institutions, bodies, offices and agencies. The example of the recent EDPS’ own-initiative investigation into European institutions’ use of Microsoft products and services confirms the importance of this challenge,” he adds.

Part of the complexity of enforcement of Europe’s data protection rules is the lack of a single authority; a varied patchwork of supervisory authorities responsible for investigating complaints and issuing decisions.

Now, with a CJEU ruling that calls for regulators to assess third countries themselves — to determine whether the use of SCCs is valid in a particular use-case and country — there’s a risk of further fragmentation should different DPAs jump to different conclusions.

Yesterday, in its response to the CJEU decision, Hamburg’s DPA criticized the judges for not also striking down SCCs, saying it was “inconsistent” for them to invalidate Privacy Shield yet allow this other mechanism for international transfers. Supervisory authorities in Germany and Europe must now quickly agree how to deal with companies that continue to rely illegally on the Privacy Shield, the DPA warned.

In the statement, Hamburg’s data commissioner, Johannes Caspar, added: “Difficult times are looming for international data traffic.”

He also shot off a blunt warning that: “Data transmission to countries without an adequate level of data protection will… no longer be permitted in the future.”

Compare and contrast that with the Irish DPC talking about use of SCCs being “questionable,” case by case. (Or the U.K.’s ICO offering this bare minimum.)

Caspar also emphasized the challenge facing the bloc’s patchwork of DPAs to develop and implement a “common strategy” toward dealing with SCCs in the wake of the CJEU ruling.

In a press note today, Berlin’s DPA also took a tough line, warning that data transfers to third countries would only be permitted if they have a level of data protection essentially equivalent to that offered within the EU.

In the case of the U.S. — home to the largest and most used cloud services — Europe’s top judges yesterday reiterated very clearly that that is not in fact the case.

“The CJEU has made it clear that the export of data is not just about the economy but people’s fundamental rights must be paramount,” Berlin data commissioner Maja Smoltczyk said in a statement [which we’ve translated using Google Translate].

“The times when personal data could be transferred to the U.S. for convenience or cost savings are over after this judgment,” she added.

Both DPAs warned the ruling has implications for cloud services that process data in other third countries where the protection of EU citizens’ data likewise cannot be guaranteed, i.e. not just the U.S.

On this front, Smoltczyk name-checked China, Russia and India as countries EU DPAs will have to assess for similar problems.

“Now is the time for Europe’s digital independence,” she added.

Some commentators (including Schrems himself) have also suggested the ruling could see companies switching to local processing of EU users’ data. Though it’s also interesting to note the judges chose not to invalidate SCCs — thereby offering a path to legal international data transfers, but only provided the necessary protections are in place in that given third country.

Also issuing a response to the CJEU ruling today was the European Data Protection Board (EDPB). AKA the body made up of representatives from DPAs across the bloc. Chair Andrea Jelinek put out an emollient statement, writing that: “The EDPB intends to continue playing a constructive part in securing a transatlantic transfer of personal data that benefits EEA citizens and organisations and stands ready to provide the European Commission with assistance and guidance to help it build, together with the U.S., a new framework that fully complies with EU data protection law.”

Short of radical changes to U.S. surveillance law, it’s tough to see how any new framework could be made to legally stick, though. Privacy Shield’s predecessor arrangement, Safe Harbour, stood for around 15 years. Its shiny “new and improved” replacement didn’t even last five.

In the wake of the CJEU ruling, data exporters and importers are required to carry out an assessment of a country’s data regime to assess adequacy with EU legal standards before using SCCs to transfer data there.

“When performing such prior assessment, the exporter (if necessary, with the assistance of the importer) shall take into consideration the content of the SCCs, the specific circumstances of the transfer, as well as the legal regime applicable in the importer’s country. The examination of the latter shall be done in light of the non-exhaustive factors set out under Art 45(2) GDPR,” Jelinek writes.

“If the result of this assessment is that the country of the importer does not provide an essentially equivalent level of protection, the exporter may have to consider putting in place additional measures to those included in the SCCs. The EDPB is looking further into what these additional measures could consist of.”

Again, it’s not clear what “additional measures” a platform could plausibly deploy to “fix” the gaping lack of redress afforded to foreigners by U.S. surveillance law. Major legal surgery does seem to be required to square this circle.

Jelinek said the EDPB would be studying the judgement with the aim of putting out more granular guidance in the future. But her statement warns data exporters they have an obligation to suspend data transfers or terminate SCCs if contractual obligations are not or cannot be complied with, or else to notify a relevant supervisory authority if it intends to continue transferring data.

In her roundabout way, she also warns that DPAs now have a clear obligation to terminate SCCs where the safety of data cannot be guaranteed in a third country.

“The EDPB takes note of the duties for the competent supervisory authorities (SAs) to suspend or prohibit a transfer of data to a third country pursuant to SCCs, if, in the view of the competent SA and in the light of all the circumstances of that transfer, those clauses are not or cannot be complied with in that third country, and the protection of the data transferred cannot be ensured by other means, in particular where the controller or a processor has not already itself suspended or put an end to the transfer,” Jelinek writes.

One thing is crystal clear: Any sense of legal certainty U.S. cloud services were deriving from the existence of the EU-U.S. Privacy Shield — with its flawed claim of data protection adequacy — has vanished like summer rain.

In its place, a sense of déjà vu and a lot more work for lawyers.

Rocket Lab’s first launch of 2020 is a mission for the National Reconnaissance Office

Rocket Lab has announced its first mission of 2020: a dedicated launch on behalf of the U.S. National Reconnaissance Office (NRO), with a launch window that opens on January 31. The Electron rocket for this mission will take off from the company’s Launch Complex 1 (LC-1) in New Zealand, and it will be the first mission Rocket Lab has secured under a new contract vehicle that allows the NRO to source launch providers quickly and at short notice.

This new Rapid Acquisition of a Small Rocket (RASR) contract model is pretty much ideal for Rocket Lab, since the whole company’s thesis is based around using small, affordable rockets that can be produced quickly thanks to carbon 3D printing used in the manufacturing process. Rocket Lab has already demonstrated the flexibility of its model by bumping a client to the top of the queue when another dropped out last year, and its ability to win an NRO mission under the RASR contract model is further proof that its aim of delivering responsive, timely rocket launch services for small payloads is hitting a market sweet spot.

The NRO is a U.S. government agency that’s in charge of developing, building, launching and operating intelligence satellites. It was originally established in 1961, but was only officially declassified and made public in 1992. Its mandate includes supporting the work of both the U.S. Intelligence Community, as well as the Department of Defense.

Increasingly, the defense industry is interested in small satellite operations, mainly because using smaller, more efficient and economical satellites means that you can respond to new needs in the field more quickly, and that you can also build resiliency into your observation and communication network through sheer volume. Traditional expensive, huge intelligence and military satellites carry giant price tags, have multi-year development timelines and offer sizeable targets to potential enemies without much in the way of redundancy. Small satellites, especially acting as part of larger constellations, mitigate pretty much all of these potential weaknesses.

One of the reasons that Rocket Lab opened its new Launch Complex 2 (LC-2) launch pad in Wallops Island, Virginia, is to better serve customers from the U.S. defense industry. Its first mission from that site, currently set to happen sometime this spring, is for the U.S. Air Force.

Source: TechCrunch