Windows Vista’s Full Volume Encryption & TPM, part 5: does FVE require a TPM or not?

Tonight I stumbled on a rant that quoted a Microsoft web site around various Vista features including Full Volume Encryption (FVE). The stunning thing for me was the following quote (emphasis mine):

“Windows Vista supports full-volume encryption to prevent disk access to files by other operating systems. It also stores encryption keys in a Trusted Platform Model (TPM) v1.2 chip. The entire system partition is encrypted-both the hibernation file and the user data. It also stores encryption keys in a Trusted Platform Model (TPM) v1.2 chip, if one is available on the PC.”

Did I read that right? Does this mean that FVE can actually encrypt the entire system partition whether there’s a TPM 1.2 chip on the system or not? Presumably, if this is true, the key used to encrypt the volume is stored in the 50 MB pre-boot partition that FVE requires. That is, the key is stored in software.

So how does this improve upon what’s available in Windows XP? Frankly I don’t know right now, but I can take a couple of educated guesses. Presumably the Secure Startup sequence requires a user-supplied password before it can decrypt the Vista system partition, so this means there’s yet another password for an attacker to have to brute-force.

However, I gotta wonder whether a software-based Secure Startup boot password is any different from a SYSKEY boot password – no complexity requirements, never needs to be changed, and impossible to manage [pretty much by design] over a large population – how do you archive and recover such a boot password? If it isn’t any different, then it’s just as dangerous and difficult to manage a security control as SYSKEY is.

OK, so I got excited there for a sec, but on further reflection, maybe this isn’t any better than what we had before. In fact, it’s even scarier: what if I forgot my Secure Startup boot password, and its encryption key was stored in software? What would I do then? Presumably ALL my data is encrypted with that key (now irretrievable); whereas with SYSKEY I’d lose the OS but could presumably recover my data, now I’ve lost both the OS and my data. Ugh, sounds pretty gross to me.

I think I read about some capability to archive the encryption key used by Full Volume Encryption, but I’ll have to dig around to confirm (a) if it’s true, and (b) how it works. Until then, consider this entire sub-rant one man’s opinion, no more.

Windows Vista’s Full Volume Encryption & TPM, part 4: available PCs that include TPM 1.2 chip

[Edit: corrected the Broadcom adapter model #, and removed the listing for the Dell Precision 380 Workstation, which turns out to only have a TPM 1.1b chip via the Broadcom BCM5751 chip.]

Since I only talked about Tablet PCs in part 2, I figure I owe it to the community to collect together a listing of any and all shipping PCs that include a v1.2 TPM chip.

What follows are all the servers, desktops, notebooks and Tablets that I could confirm currently include a TPM 1.2 chip:

Servers
none to date

Desktops & Workstations
Dell Optiplex GX620
http://www1.us.dell.com/content/products/productdetails.aspx/optix_gx620?c=us&cs=555&l=en&s=biz
Gateway FX400XL (via the Broadcom NIC referenced in Bonus 1 below)
http://www.gateway.com/products/GConfig/proddetails.asp?system_id=fx400xl&seg=hm
Gateway FX400S (via the Broadcom NIC referenced in Bonus 1 below)
http://www.gateway.com/products/GConfig/proddetails.asp?system_id=fx400s&seg=hm
Gateway FX400X (via the Broadcom NIC referenced in Bonus 1 below)
http://www.gateway.com/products/GConfig/proddetails.asp?system_id=fx400x&seg=hm
Gateway E-6500D SB (via the Broadcom NIC referenced in Bonus 1 below)
http://www.gateway.com/products/gconfig/proddetails.asp?system_id=e6500dsb&seg=sb
HP Compaq Business Desktop DC7600 (via Broadcom NIC)
http://h10010.www1.hp.com/wwpc/us/en/sm/WF04a/12454-64287-89301-321860-f56.html
Vector GZ desktop
http://www.pdspc.com/products/vectorgz.aspx

Notebooks
Gateway M250 Series
http://www.gateway.com/products/gconfig/prodseries.asp?seg=sb&gcseries=gtwym250b
Gateway M460 Series
http://www.gateway.com/products/gconfig/prodseries.asp?seg=sb&gcseries=gtwym460b
Gateway M680 Series
http://www.gateway.com/products/gconfig/prodseries.asp?seg=sb&gcseries=gtwym680b

** HP TC4200 [THEORY: the TPM is an orderable part (Part #383545-001, $42.00 list price), which implies that it’s a removable/replaceable part (and thus that a TPM 1.2 chip could be swapped in later), but this is only an unconfirmed theory on my part] **

Tablets
Gateway M280 Series
http://www.gateway.com/products/gconfig/proddetails.asp?seg=sb&system_id=m280eb

Bonus 1: Add-on Components
Broadcom BCM5752 & BCM5752M network controller chips (which include an integrated TPM 1.2)
http://www.broadcom.com/press/release.php?id=700509

Bonus 2: Linux drivers
Linux driver with support for Infineon’s TPM v1.2 chip
http://www.prosec.rub.de/tpm/

And again, don’t forget to check Tony McFadden’s TPM Matrix. NOTE: I only used Tony’s TPM Matrix to start my search – I haven’t copied any entries without external confirmation, so there may be disagreements between our pages. When in doubt, remember that unless I could confirm a TPM 1.2 chip was included in a PC system, I did not list that system here. Tony’s page is meant to be more comprehensive, so he lists both PC systems with TPM 1.1 chips as well as those with unknown chips or which haven’t been confirmed to include a TPM chip.

P.S. Do you know of any other PC systems shipping a TPM 1.2 chip? If so, add your comment below!


P.P.S. What have I learned in my searches for TPM 1.2-integrated PC systems? Here are a couple of tips that may be helpful if and when you head off on your own search:

  1. If the spec sheet only mentions non-version-specific phrases such as “TPM chip”, “TPM Embedded Security Chip” or “the TCG standard” [emphasis mine], you can and should assume that the chip is a TPM 1.1 chip. Anytime I was able to confirm a TPM 1.2 chip, the PC system vendor made specific and repeated mention of the 1.2 version number. [Apparently this is a big differentiator, though few if any references on the Internet have clarified why.]
  2. If you are looking into a PC that was shipped before Summer 2005, you can rest assured that it did NOT ship with a TPM 1.2 chip, since the TPM chip vendors didn’t have production chips on the market until at least mid-summer of 2005.

Windows Vista’s Full Volume Encryption & TPM, part 3: links & background reading

Paul Thurrott indicates that FVE will appear in Enterprise & Ultimate editions of Vista:
http://www.winsupersite.com/showcase/winvista_beta1_vs_tiger_02.asp

Bart DeSmet digs in deep on EFS, TPM, Secure Startup and more:
http://community.bartdesmet.net/blogs/bart/archive/2005/08/17/3471.aspx

David Berlind speculates on possible incompatibility between Vista/TPM & virtual machine technology:
http://blogs.zdnet.com/microsoftvistas/?p=17

George Ou shines light on a potential key export “backdoor” for FVE, and his ideas on why smartcards would be an ideal FVE key storage mechanism:
http://blogs.zdnet.com/Ou/?p=109

William Knight vaguely alludes to some proprietary algorithms used in FVE that could lead to “a possibility of in-memory attacks for keys.”
http://www.contractoruk.com/002386.html

David Berlind speculates again on a possible use of the TPM by Windows Product Activation (totally unconfirmed at this point):
http://blogs.zdnet.com/microsoftvistas/?p=44

An out-of-date but still “best there is” collection of TPM-related hardware, software and integration information:
http://www.tonymcfadden.net/tpmvendors.html

And last but not least, Microsoft’s Technical Overview of FVE:
http://www.microsoft.com/whdc/system/platform/pcdesign/secure-start_tech.mspx

Windows Vista’s Full Volume Encryption & TPM, part 2: FVE on Tablet PC?

OK, so where was I when I last left the TPM topic? Oh yeah…

Frankly I don’t know what to think about the state of TPM-backed data encryption. I really *want* to be able to say “yeah baby – your best bet for securing data on a laptop will be Vista’s FVE” (or any other OS-level TPM-backed file encryption option). For a few hours, I actually believed it could be true – not just for an individual, but for any really big organization as well.

However, the past couple of months’ effort has me pretty much convinced otherwise. I’m not exactly optimistic for the prospect of widespread TPM-secured data protection in the near future.

It looks to me like Full Volume Encryption (FVE) in Windows Vista won’t be a viable option for anyone who isn’t prepared to drop a bundle on new computing hardware at the same time. That’s because there are almost no computers – especially mobile computers – on the market that have a v1.2 TPM.

While I realize that there are other IHV- and ISV-supplied TSS packages to support TPM-backed file encryption, I am mostly focused on Vista FVE for a couple of reasons:

  1. Until a service is provided in the box with the OS, my experience with customers is that integrating vendor-specific security software is a huge hassle, and isn’t supportable at scale even over shorter periods of time (e.g. 2-3 years).
  2. There’ll often be more than one TPM-enabled package to support – generally, it looks like an organization will have multiple packages, one for every desktop/notebook/tablet/server vendor that integrates a different TPM module.
  3. It’s not clear at this time how the TSS packages are licensed, but I’ll take a SWAG and assume that you’re only licensed to use the TSS package on the system with which it was shipped, and that you’ll have to pay extra to use that package on PCs that were shipped with a different TSS package.
  4. An organization could scrap the bundled software packages entirely and just license a third-party product across the board (e.g. Wave), but the choices are pretty limited from what I’ve seen, and personally (without having had any hands-on experience to support my gut feeling) I don’t know how much confidence I’d have locking my organization’s most prized data up under this – e.g. what’s the enterprise management (archival & recovery, configuration management, identity management) story like?
  5. [Disclosure: I’m a former Microsoft employee and security consultant, and I spent most of my tenure consulting on EFS, RMS and other security technologies.]

I’ve been in the market for a new laptop for a while, and one of the reasons for my recent obsession with TPM is that (a) any purchase I make now will have to last well beyond the release date of Vista, (b) since I intend to continue to leverage my Windows security expertise, I should really get a computer that supports FVE so I get first-hand knowledge of how it works, and (c) you generally can’t add a TPM chip to a computer after you’ve purchased it (with at least one known exception).

Oh, and I’ve committed myself to the Tablet PC variant, since I am a committed “whiteboard zealot” and I expect to use the freehand drawing capability quite a bit.

So my mission is to find a Tablet PC that meets my “requirements”:

  • TPM v1.2 chip
  • max RAM > 1 GB
  • dedicated video RAM > 32 MB (to support the lunatic Vista graphical improvements)
  • can run from battery for at least three hours a day (e.g. bus rides to and from work, meetings away from my desk)
  • won’t break my wrist if I use it standing up (e.g. weight under 5 lbs)
  • will withstand dropping it once in a while – I’m more than a bit clumsy

I have spent countless hours scouring the Internet for TPM-enabled Tablets. After my initial survey of the PC vendors’ offerings, I figured there’d be at least a couple of options from which to choose. However, the longer I looked, the bleaker it became. Of the major vendors of Tablet PCs (Acer, Fujitsu, Gateway, HP, Lenovo, Motion and Toshiba), I have so far found exactly ONE Tablet on the market with a v1.2 TPM chip.

One.

And not exactly the industry standard for large enterprise deployment – Gateway!

Did I mention that Windows Vista will require the v1.2 chip to support Secure Startup and Full Volume Encryption?

Oh, and did you hear that Microsoft is trying like h*** to get Tablet PCs in the hands of as many users as possible?

Geez Louise, I even went so far as to contact Fujitsu (who have a really fantastic Tablet with a v1.1 TPM chip) to see if they were sitting on any about-to-be-released v1.2-enabled Tablets, asking them the following:

Could you give me some idea of the following:
– whether Fujitsu is committed to integrating v1.2 TPM chips in their computing products?
– when we can expect to see Tablet PCs with v1.2 TPM chips integrated into them?
– Any planned model or series of Tablets that the v1.2 TPM chips will be used in, e.g. Lifebook 4000 series, Slate vs. Convertible, etc.?

And this is the response I got:

We fully intend to continue our support of TPM and transition to v1.2.

However, at this time we can not provide a date as to when this will be available. Fujitsu company policy and NDA agreements with suppliers do not allow us to publicly disclose future plans prior to product launch.

So what’s a guy to think? Right now we’ve got exactly one FVE-ready Tablet on the market, and according to this guy, the big wave of computer upgrades in the business sector may already be passing by. [Let me ignore the fact that I haven’t looked into notebooks yet, and assume that TPM v1.2-equipped notebooks are just as scarce. I’ll check into this further and report back.]

Between now and the shipment of Vista (perhaps October 2006, if you can believe these rumours), less than a year away, am I to believe that hordes of TPM v1.2-equipped PCs will show up on people’s desks? If so, then perhaps there might be a minority of organizations who would consider testing the Vista FVE technology (though I doubt they’d be ready to standardize on it, assuming – rightly – that they’ll have less than a majority of Vista FVE-ready PCs in their organization).

But even if TPM v1.2-equipped PCs were to quickly dominate these organizations, would I feel comfortable urging such organizations to adopt Vista to enable use of FVE to protect their data? I honestly don’t know – I don’t feel a resounding “YES” coming on, but neither do I feel a “NO” building in my gut. Perhaps it’s because I feel like this question won’t be practical for a number of years yet.

By requiring the v1.2 TPM chip for FVE & Secure Startup, I believe that:

  • Third-party TSS packages will get a lot of leeway to take the “organizational standard” position – especially for those TSS packages that also support v1.2 TPM chips
  • Most mid-sized to large organizations won’t be in a position to adopt FVE & SS as their data protection standard until say 2008 or later.

This leaves me wondering: what data will be left to protect by then? Given that most organizations are being forced through one regulation or another to encrypt customer-sensitive data, I believe that the next couple of years will be the final window for unencrypted user data to reside on client PCs.

Put another way: if you’re the InfoSec officer in charge of recommended strategies for regulatory compliance & avoiding liability, wouldn’t you rather just encrypt every disk on every “physically insecure” PC throughout the organization? That’s one sure-fire way to know that users haven’t accidentally stored a sensitive file in an unencrypted volume, folder or file. Only then would the organization be able to claim that a lost or stolen PC did not contain unencrypted customer data.

[Now, sure, in 3-5 years there’ll be room to re-evaluate the technology used to maintain protected data on hard drives, and it’s quite possible that by then Vista’s SS & FVE will get the nod from many organizations. Migrating from one highly-technical solution to another is never easy in large orgs, and is pretty scary for small outfits or self-supporting end users, but I’m leaving the door open for the landscape to change beyond my wildest imaginings in the 3-5 year timeframe…]

Does anyone see things differently? Does Vista FVE look like it’ll capture a significant portion of the “data protection” market? I’d really like to be wrong about this – it would suck if the best “free” on-disk data protection technology to come out of Microsoft won’t be practical for the majority until long after they had to commit to another on-disk encryption solution.

Windows Vista’s Full Disk Encryption is only available if you have Microsoft Software Assurance?

http://www.computerworld.com/softwaretopics/os/windows/story/0,10801,104918,00.html

Wow – personally, I think someone in marketing at Microsoft has miscalculated on this one. Don’t get me wrong, I can understand the rationale – “Well, most of the customers that have asked us for this feature are already on Software Assurance or wouldn’t have to spend much additional $$$ to get it. The smaller orgs still have EFS to be able to protect their data, and since they haven’t asked for anything else, they must be satisfied with EFS right?”

I don’t buy it – here’s my thinking:

  • Just because those few organizations who’ve actually taken the time to articulate their needs happen to have the SA arrangements already made (or have the EA leverage to negotiate cheap SA rates), doesn’t mean they’re the only ones who would (or could) use this feature;
  • Just because SA has been considered by many Microsoft customers to be a rip-off, and not worth buying again, shouldn’t lead to the effect (intentional or not) of holding some of the most critical features of Vista hostage from the rest of the Microsoft customer base (especially those who wish to purchase one of the premium Vista SKUs such as the rumoured Professional or Full Media editions);
  • Many of the organizations who haven’t explicitly articulated a need to their Microsoft reps for Windows-native full disk encryption [at least based on my experience with them] are either (a) still struggling with their much more limited – and challenging, in most cases – deployments of some form of file encryption on users’ PCs, and are sick of talking about encryption, or (b) have committed to another technology because Microsoft hasn’t yet provided a solution for this critical business need. However counterintuitive it might sound, those organizations who fall under (b) should be given the chance to try Vista’s full-disk encryption without having to commit to SA to do so. Many organizations with whom I’ve worked have told me they’d far rather use a technology that already comes with the products they’re using, than have to integrate yet another piece of third-party hardware into an already-overly-complex “desktop” deployment – just so long as they believe the built-in technology reasonably achieves their overall goals. Nothing like hands-on testing (and widespread talk from others also testing) to help convince them – but it’s very difficult to get that groundswell of opinion when so few organizations even qualify to be able to use a technology like Secure Startup.

It’s not like the need isn’t critical in every organization – just the opposite in fact, based on my experience with customers over the years. I wonder if it just happens that there hasn’t been enough formal market research at Microsoft to show how widespread the need really is.

Makes me wonder what ELSE is being locked up in the SA-only Vista Enterprise SKU. I’d love to hear a response to this from those at Microsoft who’ll have to defend this to the legions of Microsoft customers for whom Secure Startup won’t be available…

Trusted Computing Best Practices, the TNC spec, and Microsoft’s involvement – hypocritical?

Below are excerpts from Bruce Schneier’s “Schneier on Security” blog, asserting that Microsoft is making an effort to prevent the TCG’s software-only spec from applying to Windows Vista before its release:

In May, the Trusted Computing Group published a best practices document: “Design, Implementation, and Usage Principles for TPM-Based Platforms.” Written for users and implementers of TCG technology, the document tries to draw a line between good uses and bad uses of this technology.

[…]

Meanwhile, the TCG built a purely software version of the specification: Trusted Network Connect (TNC). Basically, it’s a TCG system without a TPM.

The best practices document doesn’t apply to TNC, because Microsoft (as a member of the TCG board of directors) blocked it. The excuse is that the document hadn’t been written with software-only applications in mind, so it shouldn’t apply to software-only TCG systems.

This is absurd. The document outlines best practices for how the system is used. There’s nothing in it about how the system works internally. There’s nothing unique to hardware-based systems, nothing that would be different for software-only systems. You can go through the document yourself and replace all references to “TPM” or “hardware” with “software” (or, better yet, “hardware or software”) in five minutes. There are about a dozen changes, and none of them make any meaningful difference.


If true, this feels to me like some form of hypocrisy, at least at a company level. Microsoft took a decidedly different stance on the use of the “no execute” (NX) feature of the latest generation of CPUs from Intel and AMD, and in an ideal world I’d expect them to do the same here.

In Windows XP Service Pack 2 (SP2), Microsoft implemented changes to the OS that would enable it to assert the “no execute” flag on any and all processes running on the system – if a process attempted to execute a “page” that was previously considered a data page (i.e. memory meant to hold data, not executable code), then the OS could immediately halt the program and alert the user. The intent is to prevent things like “buffer overruns” from being able to successfully circumvent a program’s intended purpose and ultimately cause the program to do something the attacker wishes (usually a malicious attack on the OS, its programs, or the user’s data). Worms and viruses have had a field day with this kind of attack for years, and Microsoft and the CPU vendors finally got around to implementing an idea that had kicked around the security community for quite a while.

So far so good. However, while this feature was intended to work with the cooperation of software and hardware, it left most of the existing base of XP users (those without NX-capable CPUs) up the creek. So Microsoft decided to implement a subset of those ideas on any computer running Windows XP SP2. This is a software-only implementation of NX – not perfect, not foolproof, and definitely not as strong as the hardware-backed NX you get with the NX-capable CPUs, but a major leap forward from the “buffer overrun friendly” versions of Windows that preceded it.
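
[If you’re curious where a given box stands, here’s a minimal sketch – modern Python purely for brevity, and the dep_status() helper is my own invention – that queries the DEP-related properties that WMI is documented to expose on Win32_OperatingSystem as of XP SP2:]

    import subprocess

    def dep_status():
        # Query the DEP/NX settings WMI exposes on Win32_OperatingSystem
        # (documented as available starting with XP SP2).
        # /format:list emits Name=Value pairs, one per line.
        out = subprocess.run(
            ["wmic", "os", "get",
             "DataExecutionPrevention_Available,"
             "DataExecutionPrevention_SupportPolicy",
             "/format:list"],
            capture_output=True, text=True, check=True).stdout
        pairs = (line.split("=", 1) for line in out.splitlines() if "=" in line)
        return {k.strip(): v.strip() for k, v in pairs}

    info = dep_status()
    print("Hardware NX available:", info.get("DataExecutionPrevention_Available"))
    # SupportPolicy: 0 = AlwaysOff, 1 = AlwaysOn, 2 = OptIn (default), 3 = OptOut
    print("DEP policy:", info.get("DataExecutionPrevention_SupportPolicy"))

[On a machine without an NX-capable CPU you’d expect Available to come back FALSE – which is exactly the population the software-only subset was meant to help.]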

And actually, it seems to work pretty well. I’ve enabled the NX feature on all the computers I touch, and seen it catch a number of programs (in most cases accidentally) doing the very things that NX is set to trap. It doesn’t interfere with the stable, mature applications I’m running, and it hasn’t yet prevented me from doing anything really important. Mostly, it’s trapped this behaviour in the third-party “shareware” type apps that are nice to have. [Hopefully I’ve been able to help the developers of these apps by sending them the crash dumps from these apps. When I am notified by XP SP2 that an app was caught by NX, I’ll trace through the dialogs that tell me where the dump files are located – indicated as the “technical information” that would be submitted to Microsoft through the Error Reporting feature – I’ll find the dump folder, Zip up a copy, and email that Zip file to the ISV who developed the app. Microsoft probably does this as well for apps that often show up in their error reporting queues, but I figure it can’t hurt to make sure anyway. Hint: I don’t have one on my system right now – the folder is deleted once it’s uploaded to Microsoft’s error reporting site – but the crash dump files will be written to your %temp% folder, with a folder name containing “WER”, and the major files will have the extensions “.hdmp” and “.mdmp”. The files compress quite well.]
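
[In case it helps anyone else do the same, here’s a minimal sketch of that zip-up step. The %temp% location, the “WER” folder naming, and the .hdmp/.mdmp extensions are as described above; everything else is just glue:]

    import os, zipfile

    # Find leftover WER crash-dump folders under %TEMP% and zip up the dumps
    # so they can be mailed to the app's developer.
    temp = os.environ["TEMP"]
    for name in os.listdir(temp):
        folder = os.path.join(temp, name)
        if "WER" in name and os.path.isdir(folder):
            archive = folder + ".zip"
            with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
                for fname in os.listdir(folder):
                    if fname.endswith((".hdmp", ".mdmp")):
                        zf.write(os.path.join(folder, fname), fname)
            print("Wrote", archive)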

So here’s my concern: if Microsoft’s Windows division was comfortable with taking a hardware-assisted feature like NX and implementing it as a “software-only” feature, wouldn’t it seem hypocritical to resist applying a software-only spec for TPM to the premier OS next on the horizon? I know I’m being naive here, but it seems like Microsoft would be in a near-ideal position to apply TNC to Vista. They’ve been working on the formerly code-named “Palladium” technology for ages now – or at least talking about it in the press. As well, they’ve apparently been involved with the TCG and the development of these documents for quite a while now, and presumably had at least some level of influence over their content (though probably not a dominant hand in them, given the number of other players with just as much at stake here).

So I wonder aloud: what possible benefit does Microsoft gain from Vista “escaping” the confines of the TNC spec? I would guess it’s because, at this late stage in the development of Windows Vista (they just passed Beta 1), there aren’t a lot of fundamental changes to the OS that could be introduced without significant risk of delaying the release of Vista AGAIN. [How many scheduling delays now, and how many valuable features REMOVED to keep the schedule from slipping further?]

Perhaps there are other just as innocent explanations as well, e.g.:

  • They’ve been trying to get the TNC spec worked into Vista all along, but at the same time as they decided to pull the “Palladium” features out of Vista, they also had to decide whether to further delay Vista (and continue to stabilize the TNC components) or take the TNC components out of Vista and stabilize the Vista ship schedule.
  • The TNC spec may have taken a late change that drastically altered the requirements for Vista, and the Vista team couldn’t add the major code change without resetting the Vista development milestones.
  • There are plans to add TNC into Vista post-RTM – not unlike the way that many significant features were added to XP via SP2.

It would certainly help quell a potential firestorm of controversy if Microsoft got out ahead of Schneier’s allegations and discussed their plans for TNC implementation in Windows, and what prevents them from incorporating the spec in Vista before it ships. Despite the nefarious personality that some would like to attribute to every action from Microsoft, I’ve found that the people I’ve met and with whom I’ve worked there really do have the best of intentions at heart.

Encrypting files on the server – WTF???

I can’t tell you how irritated I get when I read yet another recommendation from some well-meaning security expert that says you should use EFS to encrypt files on a Windows SERVER. I have little or no problem with EFS on a Windows CLIENT (though if you’re not using domain accounts, or you don’t use SYSKEY [shudder], you’re only keeping your files safe from grandma, not your kids), but I have to wonder how many people understand how decryption keys are protected (and by what) when they recommend using EFS on a server.

SQL Database (mdf) encryption example
Let’s take a simple case: you want to protect your SQL database files from remote attackers, so naturally you think “I’ll encrypt the data using EFS – cheap, free and easy – and then remote attackers won’t be able to access the data.” Yes, in one sense that is quite true – if a remote attacker were to try to copy the files on disk – e.g. from a buffer overflow exploit that gave them remote LocalSystem access – then NTFS would return an Access Denied error. But let’s walk through how those decryption keys are actually protected:

  • When you encrypt a file that is to be accessible to a Service (such as the “MS SQL Server” service that mounts the SQL database files), you are in reality required to encrypt the file in the context of the Service account in which the service runs (see the sketch after this list).
  • In this example, you’d have to encrypt in the MSSQLServer service’s account context – and if you’ve been reading your SQL Server security guidance, you’ll already have created a service account and downgraded MSSQLServer from the default LocalSystem service account context.
  • This means that only the service account (e.g. you’ve created a local account named SERVER\SQLServiceAcct) can decrypt the files.
  • What happens when the service starts? The service “logs on” with the SQLServiceAcct (actually the Service Control Manager calls CreateProcessAsUser() or similar API and runs the new process in the context of the user account specified as the Service Account in the service’s configuration).
  • How does the Service Control Manager “authenticate” the service? The service account name is stored in cleartext in the Registry, and the service account password is stored as an LSA Secret elsewhere in the Registry.
  • LSA Secrets are ACL’d so they are not readable by any user except LocalSystem, and they are further encrypted with the System Key (aka SYSKEY), so that only the LSA process (which has the ability to use the SYSKEY decryption key) can access the LSA Secrets.
  • [AFAIK] The Service Control Manager requests that the LSA decrypt the service account password and pass it to the Service Control Manager for use in the CreateProcessAsUser() API call.
  • Once the MSSQLServer service is running in the correct user context, then the EFS driver in NTFS will decrypt the encrypted database files for the MSSQLServer process, and SQL Server will be able to mount the now-decrypted database files.
  • Any process running in any other user context will not be able to supply the correct RSA private key for EFS to be able to decrypt the files. In our example, if the attacker remotely ran a script in the LocalSystem context that tried to copy the database files, NTFS would return an Access Denied message to that script process.
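
To make the “who encrypts matters” point concrete, here’s a minimal sketch – the file path and account name are hypothetical, though EncryptFileW is the real Win32 API underneath tools like cipher.exe. The important part isn’t the call, it’s the context you run it in (e.g. via runas /user:SERVER\SQLServiceAcct, with the service stopped):

    import ctypes
    from ctypes import wintypes

    advapi32 = ctypes.WinDLL("advapi32", use_last_error=True)
    advapi32.EncryptFileW.argtypes = [wintypes.LPCWSTR]
    advapi32.EncryptFileW.restype = wintypes.BOOL

    def efs_encrypt(path):
        # EFS protects the file's encryption key with the RSA key pair of the
        # *calling* user context - run this as SERVER\SQLServiceAcct, or SQL
        # Server won't be able to decrypt its own database files at startup.
        if not advapi32.EncryptFileW(path):
            raise ctypes.WinError(ctypes.get_last_error())

    efs_encrypt(r"D:\MSSQL\Data\mydb.mdf")  # hypothetical database file path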

However, if that same remote attacker were really interested in getting access to that encrypted file, they could quite easily grant themselves access:

  • Anyone with LocalSystem access (or local Administrators membership) could grant themselves the SeDebugPrivilege, and then run any number of “hacker” tools that could dump the LSA Secrets from memory into cleartext form.
  • e.g. the family of lsadump*.exe tools attach to the LSASS.EXE process (via the Debug privilege) and dump out all the decrypted LSA Secrets.
  • Once you have the decrypted LSA Secrets, you can quickly find the SQLServiceAcct password, which then gives you the ability to logon as that user account.
  • Once you can authenticate as the SQLServiceAcct user account, you’ll have access to all the RSA decryption keys stored in that user’s profile. From then on, any files encrypted by that account will be transparently decrypted by EFS whenever you read or copy them.

This is an unavoidable consequence of the scenario. Services must be able to start automatically (at least, on all Windows servers for which I’ve had to recommend security measures), which means that the Service Control Manager must be able to read the password from LSA Secrets without user intervention.
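
And don’t take my word on the cleartext part – the service account *name* sits right there in the registry for anyone to read (only the password is protected, as an LSA Secret). A minimal sketch, assuming the default SQL instance’s service name MSSQLSERVER:

    import winreg

    # The service's account name lives in cleartext under its service key;
    # only the password is protected (as an LSA Secret, per the list above).
    key = winreg.OpenKey(
        winreg.HKEY_LOCAL_MACHINE,
        r"SYSTEM\CurrentControlSet\Services\MSSQLSERVER")
    account, _ = winreg.QueryValueEx(key, "ObjectName")
    winreg.CloseKey(key)
    print("MSSQLServer runs as:", account)  # e.g. SERVER\SQLServiceAcct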

[This also usually means that SYSKEY boot passphrases or boot floppies won’t be used, since the use of an “off-system SYSKEY” means the server will never boot without an administrator intervening, which makes remote management a heckuva lot harder. Unless you have some of those fancy Remote Insight boards AND a sysadmin who doesn’t mind getting paged every time the server has to reboot.]

My conclusion: EFS-encrypting files for processes that start without user intervention provides very little protection against remote attackers who can gain LocalSystem or Administrators access to your server. This applies to *any* Service, whether on a server or a client – e.g. the ol’ ArcServ backup agent that runs on every Windows server and client, and [at least used to] “require” a Domain Admin account as the service account. That’s another hairy security implementation for another day’s rant, lemme tell you…

[Note: Netscape web server had this same “problem” back in the days when I still administered Netscape-on-Windows. If you had an SSL certificate configured for the site, and you didn’t want to have to stand at the keyboard every time you wanted to start the web server, you’d have to store the private key’s decryption password in a plaintext file on the server. Kinda ruled out any *real* security that you could claim for that private key, but whatever – SSL was just there to encrypt the session key anyway, and very few SSL sessions lasted long enough for the fabled “sniff the SSL session on the wire” attacks anyway.]

SQL Database dump file example
“But wait Mike – what if the MSSQLServer service was always running? Doesn’t SQL have an exclusive lock on all its database files while the service is running?” Yes, absolutely. This brings to mind a couple of different thoughts:

  • How do you make sure the service is always running – prevent it from being shut down, or ensure that the server reboots as soon as the service is no longer running?
  • If the files are already exclusively locked, doesn’t that mean the remote attacker won’t be able to read or copy the files off the filesystem? Why bother encrypting if the service never stops running? (See the probe sketch below.)
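
Here’s the probe sketch I mentioned – the path is hypothetical; run it while the MSSQLServer service is up. A sharing violation on a file another process holds exclusively surfaces in Python as a PermissionError:

    # Probe whether SQL Server still holds its exclusive lock on the .mdf.
    try:
        with open(r"D:\MSSQL\Data\mydb.mdf", "rb"):  # hypothetical path
            print("Opened the file - the service is NOT holding its lock!")
    except PermissionError as err:
        print("Sharing violation - file is exclusively locked:", err)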

Also note: the “exclusive lock” principle obviously won’t apply to scheduled database dump files – the files are written once, then unlocked by the scheduled dump process/thread. This should make you think twice/thrice about encrypting the database dump files on disk – the files will be unlocked, waiting on the filesystem for that same LocalSystem/Admin attacker to logon as the dump user context and copy the files at their leisure. [It would also mean that any remote process that needs to read or copy the dump files – e.g. an enterprise backup system running on a central server – would have to be able to decrypt the files remotely. This requires “Trusted for Delegation” configuration for the server where the dump files are held, which is a security headache that warrants careful thought before implementing.]

My best advice for protecting the database dumps from remote attackers?

  • Don’t ever dump to the local filesystem of the server – stream your database backups over the network, either to a remote file share that wouldn’t be accessible to the remote attackers, or directly to a backup device that writes the files to backup media; OR,
  • Minimize the amount of time that the database dumps are stored on a locally-accessible filesystem. Have the files copied off-device as soon as possible, and if possible wipe the free space after you’ve deleted the files (if you’re concerned about the remote attackers undeleting the files). A sketch of this approach follows.
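
Here’s a minimal sketch of that second approach (the local folder and share name are hypothetical): copy any finished dumps to a remote share, delete the local copies, then wipe the volume’s free space with the built-in cipher /w so the deleted dumps can’t simply be undeleted:

    import os, shutil, subprocess

    DUMP_DIR = r"D:\MSSQL\Backup"      # hypothetical local dump folder
    REMOTE = r"\\backupsrv\sqldumps"   # hypothetical remote share

    for name in os.listdir(DUMP_DIR):
        if name.lower().endswith(".bak"):
            src = os.path.join(DUMP_DIR, name)
            shutil.copy2(src, os.path.join(REMOTE, name))  # copy off-device
            os.remove(src)                                 # then delete locally

    # cipher.exe /w overwrites the free space on the volume containing the
    # given folder, so the deleted dump files can't simply be undeleted.
    subprocess.run(["cipher", "/w:" + DUMP_DIR], check=True)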