Running Ubuntu VMs on Windows just got a whole lot more streamlined

Microsoft and Canonical have been working for some time to make Ubuntu and Windows play nice with each other. Ubuntu was the first distribution supported in the Windows Subsystem for Linux, and now an Ubuntu image is available through Hyper-V Quick Create, which offers three-click creation of Virtual Machines.

The system image has Ubuntu Desktop 18.04 LTS configured and ready to go, and this showcases some of the other Linux integration work that Microsoft has been doing. The Hyper-V virtual machine client, Virtual Machine Connection, has two ways of working. The normal way is to display the output of the virtual video card that the virtual machine uses and, similarly, to emulate PS/2 mouse and keyboard input, as if the client were the physical hardware. This works with any operating system (the virtual video card supports rudimentary modes like VGA and the text mode used by DOS; it can also support high-resolution graphics modes when used with a suitable display driver). But it is relatively slow and inflexible.

The other way, used automatically with modern Windows VMs, is “Enhanced Session Mode.” In an Enhanced Session, the virtual machine transmits a variation of RDP (Remote Desktop Protocol, Microsoft’s protocol for Windows’ Remote Desktop features) directly to the hypervisor, which then delivers it to the Hyper-V client. Enhanced Sessions have a number of advantages: you can resize the client window, and the VM is notified of the change of resolution; you can copy and paste between the virtual machine and the host; there’s automatic sharing of folders between guest and host; and the mouse doesn’t get trapped inside the client window.
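For readers who want to poke at this from the host side, the same settings can be scripted. Below is a minimal sketch in Python that shells out to the Hyper-V PowerShell module; it assumes an elevated prompt, and the VM name is a guess at what Quick Create produces rather than a guaranteed default.

```python
# Illustrative sketch only: enable Enhanced Session Mode on the Hyper-V host
# and switch a Linux VM's session transport to Hyper-V sockets by calling the
# Hyper-V PowerShell cmdlets. Requires an elevated prompt on Windows 10 Pro or
# better; the VM name below is an assumption, not necessarily what Quick
# Create names the machine (check Get-VM for the real name).
import subprocess

VM_NAME = "Ubuntu 18.04 LTS"  # hypothetical name, adjust to your VM

def ps(command: str) -> None:
    """Run a single PowerShell command and raise if it fails."""
    subprocess.run(["powershell", "-NoProfile", "-Command", command], check=True)

ps("Set-VMHost -EnableEnhancedSessionMode $true")
ps(f"Set-VM -VMName '{VM_NAME}' -EnhancedSessionTransportType HvSocket")
```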

Ultimate.ai nabs $1.3M for a customer service AI focused on non-English markets

For customer service, Ultimate.ai's thesis is that it's not humans or AI but humans and AI. The Helsinki- and Berlin-based startup has built an AI-powered suggestion engine that, once trained on clients' datasets, is able to provide real-time help to (human) staff dealing with customer queries via chat, email and social channels. So the AI layer is intended to make the humans behind the screens smarter and faster at responding to customer needs — as well as freeing them up from handling basic queries so they can focus on more complex issues.

The market for AI-fuelled chatbots has fast become very crowded, with hundreds of so-called ‘conversational AI’ startups all vying to serve the customer service cause.

Ultimate.ai stands out by merit of having focused on non-English-language markets, says co-founder and CEO Reetu Kainulainen. This is a consequence of the business being founded in Finland, whose language belongs to a cluster of Eastern and Northern Eurasian languages far removed from English in sound and grammatical character.

“[We] started with one of the toughest languages in the world,” he tells TechCrunch. “With no available NLP [natural language processing] able to tackle Finnish, we had to build everything in house. To solve the problem, we leveraged state-of-the-art deep neural network technologies.

“Today, our proprietary deep learning algorithms enable us to learn the structure of any language by training on our clients’ customer service data. Core within this is our use of transfer learning, which we use to transfer knowledge between languages and customers, to provide a high-accuracy NLU engine. We grow more accurate the more clients we have and the more agents use our platform.”
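Ultimate.ai hasn't published its architecture, but the general shape of cross-lingual transfer can be sketched with off-the-shelf tools: fine-tune a multilingual encoder on intent labels, so that queries in different languages land in the same label space. Everything below (the model name, the intent labels, the example queries) is an illustrative assumption rather than the company's actual stack.

```python
# Illustrative sketch of cross-lingual intent classification with a shared
# multilingual encoder. Model, labels and queries are placeholders; the
# classification head here is untrained, so the output is only meaningful
# after fine-tuning on real customer-service data.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL = "bert-base-multilingual-cased"                   # assumed encoder
INTENTS = ["refund", "delivery_status", "account_help"]  # hypothetical labels

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL, num_labels=len(INTENTS))

# A Finnish and a German query share one label space, which is the essence
# of transferring knowledge between languages.
batch = tokenizer(
    ["Missä tilaukseni on?", "Wo ist meine Bestellung?"],
    padding=True, truncation=True, return_tensors="pt",
)
with torch.no_grad():
    logits = model(**batch).logits
print([INTENTS[i] for i in logits.argmax(dim=-1).tolist()])
```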

Ultimate.ai was founded in November 2016 and launched its first product in summer 2017. It now has more than 25 enterprise clients, including the likes of Zalando, Telia and Finnair. It also touts partnerships with tech giants including SAP, Microsoft, Salesforce and Genesys — integrating with their Contact Center solutions.

“We partner with these players both technically (on client deployments) and commercially (via co-selling). We also list our solution on their Marketplaces,” he notes.

Before taking in this first seed round, the startup had raised a €230k angel round in March 2017, and otherwise relied on revenue the product began generating as soon as it launched.

The $1.3M seed round is co-led by Holtzbrinck Ventures and Maki.vc.

Kainulainen says one of the “key strengths” of Ultimate.ai’s approach to AI for text-based customer service touch-points is rapid set-up when it comes to ingesting a client’s historical customer logs to train the suggestion system.

“Our proprietary clustering algorithms automatically cluster our customer’s historical data (chat, email, knowledge base) to train our neural network. We can go from millions of lines of unstructured data into a trained deep neural network within a day,” he says.

“Alongside this, our state-of-the-art transfer learning algorithms can seed the AI with very limited data — we have deployed Contact Center automation for enterprise clients with as little as 500 lines of historical conversation.”
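To make the clustering step concrete, here is a rough sketch of the generic approach (TF-IDF vectors plus k-means over historical messages). The pipeline, parameters and example messages are assumptions for illustration, not Ultimate.ai's proprietary algorithms; in a real system the resulting clusters would be reviewed and used as intent labels for training.

```python
# Rough sketch: group unstructured support messages into candidate intents.
# Vectorizer settings and cluster count are arbitrary choices for the example.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

messages = [
    "Where is my order?",
    "My package never arrived",
    "How do I reset my password?",
    "I can't log in to my account",
]

vectors = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(messages)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for cluster_id, text in sorted(zip(labels, messages)):
    print(cluster_id, text)
```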

Ultimate.ai’s proprietary NLP achieves “state-of-the-art accuracy at 98.6%”, he claims.

It can also make use of what he dubs “semi-supervised learning” to further boost accuracy over time as agents use the tool.

“Finally, we leverage transfer learning to apply a single algorithmic model across all clients, scaling our learnings from client-to-client and constantly improving our solution,” he adds.

On the competitive front, it’s going up against the likes of IBM’s Watson AI. However, Kainulainen argues that IBM’s manual tools — which he says “require large onboarding projects and are limited in languages with no self-learning capabilities” — make that sort of manual approach to chatbot building “unsustainable in the long-term”.

He also contends that many rivals are saddled with “lengthy set-up and heavy maintenance requirements”, which make them “extortionately expensive”.

A closer competitor (in terms of approach) which he namechecks is TechCrunch Disrupt Battlefield alum Digital Genius. But again, that company has English-language origins — so he flags that as a differentiating factor vs the proprietary NLP at the core of Ultimate.ai’s product (which he claims can handle any language).

“It is very difficult to scale out of English to other languages,” he argues. “It is also uneconomical to rebuild your architecture to serve multi-language scenarios. Out of necessity, we have been language-agnostic since day one.”

“Our technology and team is tailored to the customer service problem; generic conversational AI tools cannot compete,” he adds. “Within this, we are a full package for enterprises. We provide a complete AI platform, from automation to augmentation, as well as omnichannel capabilities across Chat, Email and Social. Languages are also a key technical strength, enabling our clients to serve their customers wherever they may be.”

The multi-language architecture is not the only claimed differentiator, either.

Kainulainen points to the team’s mission as another key factor on that front, saying: “We want to transform how people work in customer service. It’s not about building a simple FAQ bot, it’s about deeply understanding how the division and the people work and building tools to empower them. For us, it’s not Superagent vs. Botman, it’s Superagent + Botman.”

So it’s not trying to suggest that AI should replace your entire customer service team, but rather that it should enhance your in-house humans.

Asked what the AI can’t do well, he says this boils down to interactions that are transactional vs relational — with the former category meshing well with automation, while the latter (interactions that require emotional engagement and/or complex thought) is definitely not something to attempt to automate away.

“Transactional cases are mechanical and AI is good at mechanical. The customer knows what they want (a specific query or action) and so can frame their request clearly. It’s a simple, in-and-out case. Full automation can be powerful here,” he says. “Relational cases are more frequent, more human and more complex. They can require empathy, persuasion and complex thought. Sometimes a customer doesn’t know what the problem is — “it’s just not working”.

“Other times are sales opportunities, which businesses definitely don’t want to automate away (AI isn’t great at persuasion). And some specific industries, e.g. emergency services, see the human response as so vital that they refuse automation entirely. In all of these situations, AI which augments people, rather than replaces, is most effective.

“We see work in customer service being transformed over the next decade. As automation of simple requests becomes the status-quo, businesses will increasingly differentiate through the quality of their human-touch. Customer service will become less labour intensive, higher skilled work. We try and imagine what tools will power this workforce of tomorrow and build them, today.”

On the ethics front, he says customers are always told when they are transferred to a human agent — though that agent will still be receiving AI support (i.e. in the form of suggested replies to help “bolster their speed and quality”) behind the scenes.

Ultimate.ai’s customers define cases they’d prefer an agent to handle — for instance where there may be a sales opportunity.

“In these cases, the AI may gather some pre-qualifying customer information to speed up the agent handle time. Human agents are also brought in for complex cases where the AI has had difficulty understanding the customer query, based on a set confidence threshold,” he adds.
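What that hand-off might look like in code is simple enough to sketch; the threshold value and the response structure below are invented for illustration and would be tuned per client in practice.

```python
# Minimal sketch of confidence-threshold routing: automate above the
# threshold, escalate to a human agent below it. All values are illustrative.
CONFIDENCE_THRESHOLD = 0.85  # hypothetical; tuned per client in practice

def route(prediction: dict) -> dict:
    """prediction: {'intent': str, 'confidence': float, 'suggested_reply': str}"""
    if prediction["confidence"] >= CONFIDENCE_THRESHOLD:
        return {"handler": "bot", "reply": prediction["suggested_reply"]}
    # Low confidence: hand off, but keep the suggestion to speed up the agent.
    return {"handler": "human_agent", "suggestion": prediction["suggested_reply"]}

print(route({"intent": "refund", "confidence": 0.92,
             "suggested_reply": "I can help with that refund."}))
print(route({"intent": "unknown", "confidence": 0.41,
             "suggested_reply": "Could you tell me a bit more?"}))
```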

Kainulainen says the seed funding will be used to enhance the scalability of the product, with investments going into its AI clustering system.

The team will also be targeting underserved language markets to chase scale — “focusing heavily on the Nordics and DACH [Germany, Austria, Switzerland]”.

“We are building out our teams across Berlin and Helsinki. We will be working closely with our partners – SAP, Microsoft, Salesforce and Genesys — to further this vision,” he adds. 

Commenting on the funding in a statement, Jasper Masemann, investment manager at Holtzbrinck Ventures, added: “The customer service industry is a huge market and one of the world’s largest employers. Ultimate.ai addresses the main industry challenges of inefficiency, quality control and high people turnover with latest advancements in deep learning and human machine hybrid models. The results and customer feedback are the best I have seen, which makes me very confident the team can become a forerunner in this space.”

Microsoft Managed Desktop lets Redmond handle your desktop devices

Just as the cloud freed many administrators from the day-to-day tedium of tending to Exchange servers and infrastructure like Domain Controllers, Microsoft Managed Desktop (MMD) could do the same for the corporate desktop. The new service combines Microsoft 365 Enterprise (a combined Windows 10, Office 365, and Enterprise Mobility bundle), hardware leasing, and cloud-based device management to deliver secured, updated, and maintained systems, all with software maintenance handled by Microsoft.

Redmond says that it’s offering the service in response to customer desire to hand off day-to-day device management tasks and spend more time addressing the needs of their organizations.

The new service will work on what the company calls “modern hardware”: systems with the right hardware security features and remote-management capabilities. This will include both first-party Surface systems and, in coming months, third-party machines from companies such as Dell and HP. With MMD, customers will be able to put their credentials into systems straight from the OEM. Machines will retrieve their configuration, enroll in device management, and install necessary applications using Windows AutoPilot. There should be no need for IT personnel to ever touch the machines.

Why the Pentagon’s $10 billion JEDI deal has cloud companies going nuts

By now you’ve probably heard of the Defense Department’s massive winner-take-all $10 billion cloud contract dubbed the Joint Enterprise Defense Infrastructure (or JEDI for short).
Star Wars references aside, this contract is huge, even by government standards. The Pentagon would like a single cloud vendor to build out its enterprise cloud, believing rightly or wrongly that this is the best approach to maintain focus and control of its cloud strategy.

Department of Defense (DOD) spokesperson Heather Babb tells TechCrunch the department sees a lot of upside by going this route. “Single award is advantageous because, among other things, it improves security, improves data accessibility and simplifies the Department’s ability to adopt and use cloud services,” she said.

Whatever company they choose to fill this contract, this is about modernizing their computing infrastructure and their combat forces for a world of IoT, artificial intelligence and big data analysis, while consolidating some of their older infrastructure. “The DOD Cloud Initiative is part of a much larger effort to modernize the Department’s information technology enterprise. The foundation of this effort is rationalizing the number of networks, data centers and clouds that currently exist in the Department,” Babb said.

Setting the stage

It’s possible that whoever wins this DOD contract could have a leg up on other similar projects in the government. After all, it’s not easy to pass muster on security and reliability with the military, and if one company can prove that it is capable in this regard, it could be set up well beyond this one deal.

As Babb explains it though, it’s really about figuring out the cloud long-term. “JEDI Cloud is a pathfinder effort to help DOD learn how to put in place an enterprise cloud solution and a critical first step that enables data-driven decision making and allows DOD to take full advantage of applications and data resources,” she said.

The single-vendor component, however, could explain why the various cloud vendors who are bidding have lost their minds a bit over it — everyone except Amazon, that is, which has been mostly silent, apparently happy to let the process play out.

The belief amongst the various other players is that Amazon is in the driver’s seat for this bid, possibly because it delivered a $600 million cloud contract for the government in 2013, standing up a private cloud for the CIA. It was a big deal back in the day on a couple of levels. First of all, it was the first large-scale example of an intelligence agency using a public cloud provider. And of course the amount of money was pretty impressive for the time, not $10 billion impressive, but a nice contract.

For what it’s worth, Babb dismisses such talk, saying that the process is open and no vendor has an advantage. “The JEDI Cloud final RFP reflects the unique and critical needs of DOD, employing the best practices of competitive pricing and security. No vendors have been pre-selected,” she said.

Complaining loudly

As the Pentagon moves toward selecting its primary cloud vendor for the next decade, Oracle in particular has been complaining to anyone who will listen that Amazon has an unfair advantage in the deal, going so far as to file a formal complaint last month, even before bids were in and long before the Pentagon made its choice.

Somewhat ironically, given its own past business model, Oracle complained among other things that the deal would lock the department into a single platform over the long term. It also questioned whether the bidding process adhered to procurement regulations for this kind of deal, according to a report in the Washington Post. In April, Bloomberg reported that co-CEO Safra Catz complained directly to the president that the deal was tailor-made for Amazon.

Microsoft hasn’t been happy about the one-vendor idea either, pointing out that by limiting itself to a single vendor, the Pentagon could be missing out on innovation from the other companies in the back-and-forth world of the cloud market, especially when we’re talking about a contract that stretches out for so long.

As Microsoft’s Leigh Madden told TechCrunch in April, the company is prepared to compete, but doesn’t necessarily see a single vendor approach as the best way to go. “If the DOD goes with a single award path, we are in it to win, but having said that, it’s counter to what we are seeing across the globe where 80 percent of customers are adopting a multi-cloud solution,” he said at the time.

He has a valid point, but the Pentagon seems hell-bent on going forward with the single-vendor idea, even though the cloud offers much greater interoperability than the proprietary stacks of the 1990s (of which Oracle and Microsoft were prime examples at the time).

Microsoft has its own large DOD contract in place for almost a billion dollars, although this deal from 2016 was for Windows 10 and related hardware for DOD employees, rather than a pure cloud contract like Amazon has with the CIA.

It also recently released Azure Stack for government, a product that lets government customers install a private version of Azure with all the same tools and technologies you find in the public version, which could prove attractive as part of its JEDI bid.

Cloud market dynamics

It’s also possible that the fact that Amazon controls the largest chunk of the cloud infrastructure market might play a part here at some level. While Microsoft has been coming on fast, it’s still about a third of Amazon’s size in terms of market share, as Synergy Research’s Q4 2017 data clearly shows.

The market hasn’t shifted dramatically since this data came out. While market share alone wouldn’t be a deciding factor, Amazon came to market first and is much bigger, in terms of market share, than the next four combined, according to Synergy. That could explain why the other players are lobbying so hard and see Amazon as the biggest threat here, because it’s probably the biggest threat in almost every deal where they come up against each other, due to its sheer size.

Consider also that Oracle, which seems to be complaining the loudest, was rather late to the cloud after years of dismissing it. They could see JEDI as a chance to establish a foothold in government that they could use to build out their cloud business in the private sector too.

10 years might not be 10 years

It’s worth pointing out that the actual deal has the complexity and opt-out clauses of a sports contract, with just an initial two-year term guaranteed. A couple of three-year options follow, with a final two-year option closing things out. The idea being that if this turns out to be a bad idea, the Pentagon has various points where it can back out.

In spite of the winner-take-all approach of JEDI, Babb indicated that the agency will continue to work with multiple cloud vendors no matter what happens. “DOD has and will continue to operate multiple clouds and the JEDI Cloud will be a key component of the department’s overall cloud strategy. The scale of our missions will require DOD to have multiple clouds from multiple vendors,” she said.

The DOD accepted final bids in August, then extended the deadline for Requests for Proposal to October 9th. Unless the deadline gets extended again, we’re probably going to finally hear who the lucky company is sometime in the coming weeks, and chances are there is going to be a lot of whining and continued maneuvering from the losers when that happens.

Cryptocurrency mining attacks using leaked NSA hacking tools are still highly active a year later

It’s been over a year since highly classified exploits built by the National Security Agency were stolen and published online.

One of the tools, dubbed EternalBlue, can covertly break into almost any Windows machine around the world. It didn’t take long for hackers to start using the exploits to run ransomware on thousands of computers, grinding hospitals and businesses to a halt. Two separate attacks in as many months used WannaCry and NotPetya ransomware, which spread like wildfire. Once a single computer in a network was infected, the malware would also target other devices on the network. The recovery was slow and cost companies hundreds of millions in damages.

Yet, more than a year since Microsoft released patches that slammed the backdoor shut, almost a million computers and networks are still unpatched and vulnerable to attack.

Although WannaCry infections have slowed, hackers are still using the publicly accessible NSA exploits to infect computers to mine cryptocurrency.

Nobody knows that better than one major Fortune 500 multinational, which was hit by a massive WannaMine cryptocurrency mining infection just days ago.

“Our customer is a very large corporation with multiple offices around the world,” said Amit Serper, who heads the security research team at Boston-based Cybereason.

“Once their first machine was hit the malware propagated to more than 1,000 machines in a day,” he said, without naming the company.

Cryptomining attacks have been around for a while. It’s more common for hackers to inject cryptocurrency mining code into vulnerable websites, but the payoffs are low. Some news sites are now installing their own mining code as an alternative to running ads.

But WannaMine works differently, Cybereason said in its post-mortem of the infection. By using those leaked NSA exploits to gain a single foothold into a network, the malware tries to infect any computer within. It’s persistent so the malware can survive a reboot. After it’s implanted, the malware uses the computer’s processor to mine cryptocurrency. On dozens, hundreds, or even thousands of computers, the malware can mine cryptocurrency far faster and more efficiently. Though it’s a drain on energy and computer resources, it can often go unnoticed.

After the malware spreads within the network, it modifies the power management settings to prevent the infected computer from going to sleep. Not only that, the malware tries to detect other cryptomining scripts running on the computer and terminates them — likely to squeeze every bit of energy out of the processor, maximizing its mining effort.

Based on up-to-date statistics from Shodan, a search engine for open ports and databases, at least 919,000 servers are still vulnerable to EternalBlue, with some 300,000 machines in the US alone. And that’s just the tip of the iceberg — that figure can represent either individual vulnerable computers or a vulnerable network server capable of infecting hundreds or thousands more machines.
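Those Shodan figures can be approximated with the service’s Python client. The sketch below is illustrative only; it assumes an API key on a plan that exposes the vuln search filter, and the exact totals will drift over time.

```python
# Sketch: counting hosts Shodan tags as still vulnerable to MS17-010
# (the patch bulletin that closes EternalBlue). Requires an API key with
# access to the "vuln" search filter; the key below is a placeholder.
import shodan

api = shodan.Shodan("YOUR_API_KEY")  # placeholder key

worldwide = api.count("vuln:ms17-010")
us_only = api.count("vuln:ms17-010 country:US")

print("Exposed worldwide:", worldwide["total"])
print("Exposed in the US:", us_only["total"])
```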

Cybereason said companies are still severely impacted because their systems aren’t protected.

“There’s no reason why these exploits should remain unpatched,” the blog post said. “Organizations need to install security patches and update machines.”

If not ransomware yesterday, it’s cryptomining malware today. Given how versatile the EternalBlue exploit is, tomorrow it could be something far worse — like data theft or destruction.

In other words: if you haven’t patched already, what are you waiting for?

It’s the end of crypto as we know it and I feel fine

Watching the current price madness is scary. Bitcoin is falling and rising in $500 increments with regularity, and Ethereum and its attendant ICOs are in a seeming freefall with a few “dead cat bounces” to keep things lively. What this signals is not that crypto is dead, however. It signals that the early, elated period of trading, whose milestones included the launch of Coinbase and the growth of a vibrant (if often shady) professional ecosystem, is over.

Crypto still runs on hype. Gemini announcing a stablecoin, the World Economic Forum saying something hopeful, someone else saying something less hopeful – all of these things and more are helping define the current market. However, something else is happening behind the scenes that is far more important.

As I’ve written before, the socialization and general acceptance of entrepreneurs and entrepreneurial pursuits is a very recent thing. In the old days – circa 2000 – building your own business was considered somehow sordid. Chancers who gave it a go were considered get-rich-quick schemers, worthy of little more than derision.

As the dot-com market exploded, however, building your own business wasn’t so wacky. But to do it required the imprimaturs and resources of major corporations – Microsoft, Sun, HP, Sybase, etc. – or a connection to academia – Google, Netscape, Yahoo, etc. You didn’t just quit school, buy a laptop, and start Snapchat.

It took a full decade of steady change to make the revolutionary thought that school wasn’t so great and that money was available for all good ideas to take hold. And take hold it did. We owe the success of TechCrunch and Disrupt to that idea and I’ve always said that TC was career pornography for the cubicle dweller, a guilty pleasure for folks who knew there was something better out there and, with the right prodding, they knew they could achieve it.

So in looking at the crypto markets currently we must look at the dot-com markets circa 1999. Massive infrastructure changes, some brought about by Y2K, had computerized nearly every industry. GenXers born in the late 70s and early 80s were in the marketplace of ideas with an understanding of the Internet the oldsters at the helm of media, research, and banking didn’t have. It was a massive wealth transfer from the middle managers who pushed paper since 1950 to the dot-com CEOs who pushed bits with native ease.

Fast forward to today and we see much of the same thing. Blockchain natives boast about having been interested in bitcoin since 2014. Oldsters at banks realize they should get in on things sooner rather than later, and price manipulation is rampant simply because it is easy. The projects we see now are the Kozmo.com of the blockchain era, pie-in-the-sky dream projects that are sucking up millions in funding and will produce little in real terms. But for every hundred Kozmos there is one Amazon.

And that’s what you have to look for.

Will nearly every ICO launched in the last few years fail? Yes. Does it matter?

Not much.

The market is currently eating its young. Early investors made (and probably lost) millions on early ICOs but the resulting noise has created an environment where the best and brightest technical minds are faced with not only creating a technical product but also maintaining a monetary system. There is no need for a smart founder to have to worry about token price but here we are. Most technical CEOs step aside or call for outside help after their IPO, a fact that points to the complexity of managing shareholder expectations. But what happens when your shareholders are 16-year-olds with a lot of Ethereum in a Discord channel? What happens when little Malta becomes the de facto launching spot for token sales and you’re based in Nebraska? What happens when the SEC, FINRA, and Attorneys General from here to Beijing start investigating your hobby?

Basically your hobby stops being a hobby. Crypto and blockchain have weaponized nerds in an unprecedented way. In the past, if you were a Linux developer or knew a few things about hardware, you could build a business and make a little money. Now you can build an empire and make a lot of money.

Crypto is falling because the people in it for the short term are leaving. Long-term players – the Amazons of the space – have yet to be identified. Ultimately we are going to face a compression in the ICO market and, for a while, it’s going to be a lot harder to launch an ICO. But give it a few years – once the various financial authorities get around to reading the Satoshi white paper – and you’ll see a sea change. Coverage will change. Services will change. And the way you raise money will change.

VC used to be about a team and a dream. Now it’s about a team, $1 million in monthly revenue, and a dream. The risk takers are gone. The dentists from Omaha who once visited accelerator demo days and wrote $25,000 checks for new apps are too shy to leave their offices. The flashy VCs from Sand Hill have to keep Uber and Airbnb’s plates spinning until they can cash out. VC is dead for the small entrepreneur.

Which is why the ICO is so important and this is why the ICO is such a mess right now. Because everybody sees the value but nobody – not the SEC, not the investors, not the founders – can understand how to do it right. There is no SAFE note for crypto. There are no serious accelerators. And all of the big names in crypto are either goldbugs, weirdos, or Redditors. No one has tamed the Wild West.

They will.

And when they do expect a whole new crop of Amazons, Ubers, and Oracles. Because the technology changes quickly when there’s money, talent, and a way to marry the two in which everyone wins.

Microsoft acquires Lobe, a drag-and-drop AI tool

Microsoft today announced that it has acquired Lobe, a startup that lets you build machine learning models with the help of a simple drag-and-drop interface. Microsoft plans to use Lobe, which only launched into beta earlier this year, to build upon its own efforts to make building AI models easier, though, for the time being, Lobe will operate as before.

“As part of Microsoft, Lobe will be able to leverage world-class AI research, global infrastructure, and decades of experience building developer tools,” the team writes. “We plan to continue developing Lobe as a standalone service, supporting open source standards and multiple platforms.”

Lobe was co-founded by Mike Matas, who previously worked on the iPhone and iPad, as well as Facebook’s Paper and Instant Articles products. The other co-founders are Adam Menges and Markus Beissinger.

In addition to Lobe, Microsoft also recently bought Bonsai.ai, a deep reinforcement learning platform, and Semantic Machines, a conversational AI platform. Last year, it acquired Disrupt Battlefield participant Maluuba. It’s no secret that machine learning talent is hard to come by, so it’s no surprise that all of the major tech firms are acquiring as much talent and technology as they can.

“In many ways though, we’re only just beginning to tap into the full potential AI can provide,” Microsoft’s EVP and CTO Kevin Scott writes in today’s announcement. “This in large part is because AI development and building deep learning models are slow and complex processes even for experienced data scientists and developers. To date, many people have been at a disadvantage when it comes to accessing AI, and we’re committed to changing that.”

It’s worth noting that Lobe’s approach complements Microsoft’s existing Azure ML Studio platform, which also offers a drag-and-drop interface for building machine learning models, though with a more utilitarian design than the slick interface that the Lobe team built. Both Lobe and Azure ML Studio aim to make machine learning easy to use for anybody, without having to know the ins and outs of TensorFlow, Keras or PyTorch. Those approaches always come with some limitations, but just like low-code tools, they do serve a purpose and work well enough for many use cases.
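For a sense of what such tools abstract away, here is roughly what a “minimal” hand-built image classifier looks like in Keras. The layer sizes and the ten-class output are placeholders for illustration, not anything Lobe or Azure ML Studio actually generates.

```python
# Illustrative sketch: even a small hand-written image classifier involves
# explicit layer, loss and optimizer choices that drag-and-drop tools hide.
# Input shape and class count are placeholders for a hypothetical dataset.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # hypothetical 10 classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```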

Security flaw in ‘nearly all’ modern PCs and Macs exposes encrypted data

Most modern computers, even devices with disk encryption, are vulnerable to a new attack that can steal sensitive data in a matter of minutes, new research says.

In new findings published Wednesday, F-Secure said that in every laptop it tested, none of the existing firmware security measures “does a good enough job” of preventing data theft.

F-Secure principal security consultant Olle Segerdahl told TechCrunch that the vulnerabilities put “nearly all” laptops and desktops — both Windows and Mac users — at risk.

The new exploit is built on the foundations of a traditional cold boot attack, which hackers have long used to steal data from a shut-down computer. Modern computers overwrite their memory when a device is powered down, scrambling the data so it can’t be read. But Segerdahl and his colleague Pasi Saarinen found a way to disable the overwriting process, making a cold boot attack possible again.

“It takes some extra steps,” said Segerdahl, but the flaw is “easy to exploit.” So much so, he said, that it would “very much surprise” him if this technique isn’t already known by some hacker groups.

“We are convinced that anybody tasked with stealing data off laptops would have already come to the same conclusions as us,” he said.

It’s no secret that if you have physical access to a computer, the chances of someone stealing your data are usually greater. That’s why so many use disk encryption — like BitLocker for Windows and FileVault for Macs — to scramble and protect data when a device is turned off.

But the researchers found that, in nearly all cases, they could still steal data protected by BitLocker and FileVault.

After the researchers figured out how the memory overwriting process works, they said it took just a few hours to build a proof-of-concept tool that prevented the firmware from clearing secrets from memory. From there, the researchers scanned for disk encryption keys, which, when obtained, could be used to mount the protected volume.

It’s not just disk encryption keys at risk, Segerdahl said. A successful attacker can steal “anything that happens to be in memory,” like passwords and corporate network credentials, which can lead to a deeper compromise.

Their findings were shared with Microsoft, Apple, and Intel prior to release. According to the researchers, only a smattering of devices aren’t affected by the attack. Microsoft said in a recently updated article on BitLocker countermeasures that using a startup PIN can mitigate cold boot attacks, but Windows users with “Home” licenses are out of luck. And any Apple Mac equipped with a T2 chip is not affected, though a firmware password would still improve protection.

Both Microsoft and Apple downplayed the risk.

Acknowledging that an attacker needs physical access to a device, Microsoft said it encourages customers to “practice good security habits, including preventing unauthorized physical access to their device.” Apple said it was looking into measures to protect Macs that don’t come with the T2 chip.

When reached, Intel would not comment on the record.

In any case, the researchers say, there’s not much hope that affected computer makers can fix their fleet of existing devices.

“Unfortunately, there is nothing Microsoft can do, since we are using flaws in PC hardware vendors’ firmware,” said Segerdahl. “Intel can only do so much, their position in the ecosystem is providing a reference platform for the vendors to extend and build their new models on.”

Companies, and users, are “on their own,” said Segerdahl.

“Planning for these events is a better practice than assuming devices cannot be physically compromised by hackers because that’s obviously not the case,” he said.