Yes, this is contagious…
Your results:
You are Spider-Man
You are intelligent, witty, a bit geeky and have great power and responsibility.
I’ve been toying with the notion of learning some “real” coding for years now. No matter how good I get at my areas of expertise, and no matter how much demand there is for infrastructure geeks like me, I’ve felt a growing pressure to get some “chops”. Yeah, I can read an API, I can sometimes *follow* a codepath (almost easy in VBScript by now, still brutally hard in a C++ fragment), and I feel comfortable using tools like Depends.exe and ProcExp.exe. Hell, I’ve even gotten to *almost* understand what I’m doing when I run a debugger like windbg.exe.
I took a great introductory college course on ASP.NET development from a really good friend a couple of years ago, but didn’t quite finish it (i.e. I didn’t write the final). I’ve had an IDE installed on most of my computers for years now, but hardly did more than fire up a sample and feel inadequate.
So a few months back I spotted the Visual Studio Express betas – stripped-down IDEs that are targeted at folks just like me. At first I felt just as inadequate with them as with the full-fledged beasties – I still didn’t really know where to start, and without a good sense of the “vocabulary” of a coding language, I always felt like I was crippled from doing something practical with it. [Sorry, but I’m one of those guys that doesn’t really *learn* the lesson by using artificial dev scenarios that don’t do much more than “Hello World” crap. Maybe that works for a lot of folks, and I’m just broken, I dunno.]
Then I started seeing some really encouraging signs:
And so I took more and more steps to get closer. I got a couple of books out from the library that would give me some fun, easy, quick stuff to play with:
And most importantly, I sketched out a design idea for a simple application that I would actually use. [More on that later, when I get some of the cool features working.]
But here’s the kicker: not only was it fairly easy to stumble across the basic code fragments that I would need to make the basics of my app work. Not only did I find that things like the “Me” object were damned intuitive, and some of the new controls (like the Menu Bar Toolstrip) were brilliant for quickly whipping up the stuff I *never* want to have to write from scratch. No, the bit that finally got me to blog about this “dirty secret” of mine was this:
[hmm, uploading the screenshot doesn’t seem to be working.]
I’ve run across an error like this before: “NullReferenceException was unhandled” – “Object reference not set to an instance of an object”. Seen it tons of times, and never knew what to do with it.
So when did they finally figure out how to translate these errors into English? Now there’s a dialog that includes:
Troubleshooting tips:
Use the “new” keyword to create an object instance.
Check to determine if the object is null before calling the method.
Get general help for this exception.
*I* can actually do something with that information. OK, so hell, if I can get past this kind of vague-as-everything error message, I’m figuring this is do-able, and I’ll keep pounding away at this code.
Then I checked back at Microsoft’s web site to see the current offerings, and was surprised to find the released version of the Express editions available for download directly off the web. !!!
Well holy freak, this is a pretty good deal – download any one of the Express Edition dev tools and use it free for a YEAR. What? Are you guys nuts? What happened to the 60/90/120-day evals? Won’t this eat into a giant sales opportunity? Must be giving some Marketing guy chills just considering this approach…
Well, call me crazy but I think this is great – give guys like me enough time to actually start using the stuff – long enough that I can actually justify to a manager the cost of buying one of these things.
No, wait – WHAT? [OK, I’m done after this] Seems that if you download ’em before 2006-11-07 (i.e. next year), they’re free to use forever. [which means they’re free from now on, because you *know* that you’ll always be able to dig up a download of them somewhere on the ‘net once they’re out like this.]
Sweet.
http://www.computerworld.com/securitytopics/security/story/0,10801,105429,00.html
I am very much interested in anything that helps an organization get a handle on the kinds of “attacks” this device is intended to detect.
My first reaction when I read “The current version of the Symantec appliance does not actually block suspicious queries — it simply monitors and reports on what the database is up to — but that feature is being considered for a future version…” was – Wow, doesn’t that make this a pretty useless piece of tech then?
However, when I think back on all the customers with whom I’ve worked, I’ve found that most of them are happy enough to be able to detect unauthorized behaviour. Sure, if preventative controls cost no more (time, effort, resources, usability) than the equivalent detective control, they’d be happy to use that instead. However, most of us have had enough experience with “prevention is the only path to security” approaches to understand that preventative security can only guarantee that it’ll block some form of intended usage, and that (as Schneier so often points out) the bad guys will always find some other way to accomplish their goals, if they’re determined enough.
Such as: if you block unauthorized use through a database “intrusion prevention” appliance, the bad guys will then try other attack vectors such as:
Bottom line: I like the thinking that went into Symantec’s database security appliance, and I hope to see more creative ideas like this in the future. As the article said, “…enterprise users are becoming increasingly focused on data security and regulation compliance.” [emphasis mine]
http://www.computerworld.com/softwaretopics/os/windows/story/0,10801,104918,00.html
Wow – personally, I think someone in marketing at Microsoft has miscalculated on this one. Don’t get me wrong, I can understand the rationale – “Well, most of the customers that have asked us for this feature are already on Software Assurance or wouldn’t have to spend much additional $$$ to get it. The smaller orgs still have EFS to be able to protect their data, and since they haven’t asked for anything else, they must be satisfied with EFS right?”
I don’t buy it – here’s my thinking:
It’s not like the need isn’t critical in every organization – just the opposite in fact, based on my experience with customers over the years. I wonder if it just happens that there hasn’t been enough formal market research at Microsoft to show how widespread the need really is.
Makes me wonder what ELSE is being locked up in the SA-only Vista Enterprise SKU. I’d love to hear a response to this from those at Microsoft who’ll have to defend this to the legions of Microsoft customers for whom Secure Startup won’t be available…
A colleague recently asked me about a previous post of mine:
“Mike, in your blog you mentioned you must use Syskey for real protection of EFS protected data. You said if you didn’t use Syskey, it was relatively easy to get to EFS files. So 3 questions that I haven’t been able to find an answer:
First I should clear up the misunderstanding I may have created regarding SYSKEY and EFS. What I meant to assert is that EFS files are relatively easy to get at (for educated attackers) unless you use either:
(a) SYSKEY boot floppy or SYSKEY boot password, or
(b) domain logon accounts (and a relatively decent password/passphrase).
I don’t generally recommend SYSKEY in a domain environment; instead I recommend domain accounts and strong passwords or passphrases for reasonable security against brute force attacks.
As for the direct questions I *was* asked:
I had a quick look around the Internet for current details on leveraging a TPM (Trusted Platform Module) chip for encrypting files on disk – here’s what I learned on my first pass:
This is fascinating, and a lot more than I expected to turn up. It seems that TPM has finally started to catch on with the PC vendors – I was shocked to see that pretty much all the major PC vendors had TPM-enabled PCs. It’s not that I didn’t expect this to happen, but that since I hadn’t heard any of my customers asking me about this so far, I assumed it was still “on the horizon” (like “the year of the PKI” is still just a year or two away, for the tenth year in a row).
I’m going to do some serious research into the state of TPM-enabled data encryption, and over the next few posts I’ll be putting up my findings and opinions on where I think TPM-enabled encryption fits into the kinds of solutions I normally recommend.
Watch for it.
Below are excerpts from Bruce Schneier’s “Schneier on Security” blog, asserting that Microsoft is making an effort to prevent the TCG’s software-only spec for TPM from applying to Windows Vista before its release:
In May, the Trusted Computing Group published a best practices document: “Design, Implementation, and Usage Principles for TPM-Based Platforms.” Written for users and implementers of TCG technology, the document tries to draw a line between good uses and bad uses of this technology.
[…]
Meanwhile, the TCG built a purely software version of the specification: Trusted Network Connect (TNC). Basically, it’s a TCG system without a TPM.
The best practices document doesn’t apply to TNC, because Microsoft (as a member of the TCG board of directors) blocked it. The excuse is that the document hadn’t been written with software-only applications in mind, so it shouldn’t apply to software-only TCG systems.
This is absurd. The document outlines best practices for how the system is used. There’s nothing in it about how the system works internally. There’s nothing unique to hardware-based systems, nothing that would be different for software-only systems. You can go through the document yourself and replace all references to “TPM” or “hardware” with “software” (or, better yet, “hardware or software”) in five minutes. There are about a dozen changes, and none of them make any meaningful difference.
If true, this feels to me like some form of hypocrisy, at least at a company level. Microsoft took a decidedly different stance on the use of the “no execute” (NX) feature of the latest generation of CPUs from Intel and AMD, and in an ideal world I’d expect them to do the same here.
In the release of Windows XP’s Service Pack 2 (SP2), they implemented changes to the OS that would enable it to assert the “no execute” flag on any and all processes running on the system – if a process attempted to execute a “page” that was previously considered a data page (i.e. data, not executable code), then the OS could immediately halt the program and alert the user. The intent is to prevent things like “buffer overruns” from successfully circumventing a program’s intended purpose and ultimately causing the program to do something the attacker wishes (usually a malicious attack on the OS, its programs, or the user’s data). Worms and viruses have had a field day with this kind of attack for years, and Microsoft and the CPU vendors finally got around to implementing an idea that had kicked around the security community for quite a while.
So far so good. However, while this feature was intended to work with the cooperation of software and hardware, it left most of the existing base of XP users (those without NX-capable CPUs) up the creek. So Microsoft decided to implement a subset of those ideas on any computer running Windows XP SP2. This is a software-only implementation of NX – not perfect, not foolproof, and definitely not as strong as the hardware-backed NX you get with the NX-capable CPUs, but a major leap forward from the “buffer overrun friendly” versions of Windows that have preceded it.
And actually, it seems to work pretty well. I’ve enabled the NX feature on all the computers I touch, and seen it catch a number of programs that were (in most cases accidentally) caught doing the very things that NX is set to trap. It doesn’t interfere with the stable, mature applications I’m running, and it hasn’t yet prevented me from doing anything really important. Mostly, it’s trapped this behaviour in the third-party “shareware” type apps that are nice to have. [Hopefully I’ve been able to help the developers of these apps by sending them the crash dumps. When XP SP2 notifies me that an app was caught by NX, I’ll trace through the dialogs that tell me where the dump files are located – indicated as the “technical information” that would be submitted to Microsoft through the Error Reporting feature – find the dump folder, Zip up a copy, and email that Zip file to the ISV who developed the app. Microsoft probably does this as well for apps that often show up in their error reporting queues, but I figure it can’t hurt to make sure anyway. Hint: I don’t have one on my system right now – the folder is deleted once it’s uploaded to Microsoft’s error reporting site – but the crash dump files will be written to your %temp% folder, in a folder whose name contains “WER”, and the major files will have the extensions “.hdmp” and “.mdmp”. The files compress quite well.]
So here’s my concern: if Microsoft’s Windows division was comfortable with taking a hardware-assisted feature like NX and implementing it as a “software-only” feature, wouldn’t it seem hypocritical to resist applying a software-only spec for TPM to the premier OS next on the horizon? I know I’m being naive here, but it seems like Microsoft would be in a near-ideal position to apply TNC to Vista. They’ve been working on the formerly code-named “Palladium” technology for ages now – or at least talking about it in the press. As well, they’ve apparently been involved with the TCG and the development of these documents for quite a while now, and presumably had at least some level of influence over their content (though probably not a dominant hand in them, given the number of other players with just as much at stake here).
So I wonder aloud: what possible benefit does Microsoft gain from Vista “escaping” the confines of the TNC spec? I would guess it’s because, at this late stage in the development of Windows Vista (they just passed Beta 1), there aren’t a lot of fundamental changes to the OS that could be introduced – without significant risk of delaying the release of Vista AGAIN. [How many scheduling delays now, and how many valuable features REMOVED to keep the schedule from slipping further?]
Perhaps there are other just as innocent explanations as well, e.g.:
It would certainly help quell a potential firestorm of controversy if Microsoft got out ahead of Schneier’s allegations and discussed their plans for TNC implementation in Windows, and what prevents them from incorporating the spec in Vista before it ships. Despite the nefarious personality that some would like to attribute to every action from Microsoft, I’ve found that the people I’ve met and with whom I’ve worked there really do have the best of intentions at heart.
I predict this will be a watershed moment in terms of focus on security of DATA, and (thankfully) take the primacy away from perimeter, network and host security (which in my opinion has consumed an inordinate share of attention), leaving the ONLY UNIQUE AND IRREPLACEABLE [information security] ASSET EACH ORGANIZATION HAS – their data – to languish in insecure obscurity no longer. Let’s hope this helps get those infosec security audit & remediation efforts refocused on the ASSETS, not on the IMPACT, part of the threat analysis equation.
Not to underestimate the efforts this will kick off, I believe those truly interested in securing the privacy and confidentiality of their customers’ data (credit cards, PII and other privacy-occluded data) will have to spend considerable effort on:
I can’t tell you how irritated I get when I read yet another recommendation from some well-meaning security expert that says you should use EFS to encrypt files on a Windows SERVER. I have little or no problem with EFS on a Windows CLIENT (though if you’re not using domain accounts, or you don’t use SYSKEY [shudder], you’re only keeping your files safe from grandma, not your kids), but I have to wonder how many people understand how decryption keys are protected (and by what) when they recommend using EFS on a server.
SQL Database (mdf) encryption example
Let’s take a simple case: you want to protect your SQL database files from remote attackers, so naturally you think “I’ll encrypt the data using EFS – cheap, free and easy – and then remote attackers won’t be able to access the data.” Yes, in one sense that is quite true – if a remote attacker were to try to copy the files on disk – e.g. from a buffer overflow exploit that gave them remote LocalSystem access – then NTFS would return an Access Denied error.
However, if that same remote attacker were really interested in getting access to that encrypted file, they could quite easily grant themselves access: with LocalSystem privileges they can read the service account’s password out of LSA Secrets (the same place the Service Control Manager reads it from), then log on as that service account – the very account under whose EFS keys the files were encrypted – and copy the files in the clear.
This is an unavoidable consequence of the scenario. Services must be able to start automatically (at least, on all Windows servers for which I’ve had to recommend security measures), which means that the Service Control Manager must be able to read the password from LSA Secrets without user intervention.
[This also usually means that SYSKEY boot passphrases or boot floppies won’t be used, since the use of an “off-system SYSKEY” means the server will never boot without an administrator intervening, which makes remote management a heckuva lot harder. Unless you have some of those fancy Remote Insight boards AND a sysadmin who doesn’t mind getting paged every time the server has to reboot.]
My conclusion: EFS-encrypting files for processes that start without user intervention provides very little protection against remote attackers who can gain LocalSystem or Administrators access to your server. This means *any* Service, whether on a server or a client (e.g. the ol’ ArcServ backup agent that runs on every Windows server and client, and [at least used to] “require” a Domain Admin account as the service account. That’s another hairy security implementation for another day’s rant, lemme tell you…).
[Note: Netscape web server had this same “problem” back in the days when I still administered Netscape-on-Windows. If you had an SSL certificate configured for the site, and you didn’t want to have to stand at the keyboard every time you wanted to start the web server, you’d have to store the private key’s decryption password in a plaintext file on the server. Kinda ruled out any *real* security that you could claim for that private key, but whatever – SSL was just there to encrypt the session key anyway, and very few SSL sessions lasted long enough for the fabled “sniff the SSL session on the wire” attacks anyway.]
SQL Database dump file example
“But wait Mike – what if the MSSQLServer service was always running? Doesn’t SQL have an exclusive lock on all its database files while the service is running?” Yes, absolutely. This brings to mind a couple of different thoughts:
Also note: the “exclusive lock” principle obviously won’t apply to scheduled database dump files – the files are written once, then unlocked by the scheduled dump process/thread. This should make you think twice/thrice about encrypting the database dump files on disk – the files will be unlocked, waiting on the filesystem for that same LocalSystem/Admin attacker to logon as the dump user context and copy the files at their leisure. [It would also mean that any remote process to read or copy the dump files – e.g. an enterprise backup system running on a central server – would have to be able to decrypt the files remotely. This requires “Trusted for Delegation” configuration for the server where the dump files are held, which is a security headache that warrants careful thought before implementing.]
My best advice for protecting the database dumps from remote attackers?
The New York Times recently posted an article about a wide-ranging set of data security issues that I found interesting.
This is the kind of thing that’s recently been guiding my thinking – not just encryption because California SB 1386 [for example] says you should, but holistic means of preventing loss of Confidentiality via Information Disclosure threats: not just the always-discussed-but-rare-in-the-real-world SSL MITM attacks, but lo-tech attacks like getting hired at the delivery company that picks up your backup tapes.
The amount of effort that MS has poured into tabbed browsing over the years – first denying its utility, then showing how you can do it with other IE-based browsers, and finally actually implementing it in IE7 – makes me wonder what’s up in the minds of the competition, and all the “whiners”, who kept harping on the lack of tabs in IE [caveat: I like tabbed browsing as much as the next person].
At best, tabbed browsing is a “nice to have”, and if you *really* didn’t like that IE didn’t have it, I gotta wonder why you didn’t just go use another browser that *did* have tabs. I can see *some* legitimate reason for putting tabs in IE, but the effort that it generated on MSFT’s behalf I believe was disproportionate to the benefit of finally getting tabs in IE.
It reminds me of the times when the press are goaded into spending inordinate amounts of time reporting on trivial issues, giving the government or industry plenty of “cover” in which to execute much more controversial policies and decisions. Like implementing extreme policies while the press spends every waking moment wondering about Terri Schiavo, as one example.
Microsoft, you should watch out for what people are *really* doing while you’re not watching – while you’re being goaded into focusing all this attention on such a trivial implementation (even if some of your biggest customers are “demanding” tabbed browsing). At minimum, you’re being taunted into playing a game of catch-up with the competing browsers, and getting no more benefit from this than being able to claim “me too” on a feature that I believe is ultimately trivial. What major “big-bang” features do your competition get to deliver, while you’re playing me-too rather than implementing new features?
I know this sounds a bit paranoid, but it doesn’t mean it’s inaccurate…