Spoiler: It's not the end of the world

Your PC will still be able to load other OSes if you wish to do so. Windows 8-based ARM devices, on the other hand, will be locked down tighter than a drum. #seb #computing #UEFI #Windows #Microsoft

Embedded Link

Windows 8's locked bootloaders: much ado about nothing, or the end of the world as we know it?
Microsoft has published the hardware requirements that manufacturers must follow if they want to slap a "Designed for Windows 8" sticker onto their systems. In among many innocuous requirements—multitouch systems must support at least five points of touch, there must be at least 10 GB of free space available to the user, and more—are a set of requirements for Windows 8 systems' firmware. These requirements have reignited Linux users' fears that they will be locked out of Windows 8 hardware.

Get your test boxes ready to play.

Windows 8 beta coming in late February. #seb #Microsoft #Windows #Computing

Embedded Link

CES 2012: Windows 8 Beta In Late February
There has been some speculation over the last few months over when to expect the beta release of Windows 8. During Microsoft's final CES keynote tonight, Microsoft put that speculation to rest (more or less), announcing that the Windows 8 beta will be released to the public in late February. Also being released alongside the Windows 8 beta will be the Windows Store, Microsoft’s central repository for Metro applications. The Windows Store will be available globally, and will support every lang…

ArsTechnica ponders if it’s time for Microsoft to force critical updates.

Meanwhile, back in the Windows ’verse, all the anti-virus and system patches in the world won’t make a bit of difference if no one bothers to actually apply them to their systems. A new malware package known as Conficker has been making sudden gains on systems across the net, taking advantage of a vulnerability in Windows that was patched months ago. This prompts Joel Hruska over at ArsTechnica.com to ponder whether critical updates should be forced onto systems:

Microsoft issued a patch for MS08-067 on October 23 and rates the severity of the flaw as “Critical” for all previous versions of Windows 2000, XP, XP-64, and Server 2003. Windows Vista and Windows Server 2008 are apparently less vulnerable; Microsoft’s aggregate severity rating for these two operating systems is “Important.”

There’s a story within the rise of Conficker that I think is worth exploring. Microsoft appears to have dealt with this issue in textbook fashion; the company issued a warning, released a patch, and (presumably) rolled that patch into November’s Patch Tuesday. A significant amount of time—five to six weeks—has passed since Microsoft released its fix, yet PC World reports Conficker may have already infected as many as 500,000 systems.

It would be extremely fascinating to see data on how a patch spreads throughout the Internet once released by Microsoft as well as information on whether or not the severity of any particular flaw affects how rapidly users move to apply the patch. Events like this raise the question of whether or not Microsoft should have the capability to push critical security updates out to home users automatically, regardless of how AutoUpdate is configured. I say home users for a reason; businesses and enterprise-class companies may still need to deploy the patch on a specialized timeline in order to ensure servers stay operational.

The idea of mandatory updates is unpopular with a lot of folks, myself included, but there’s a fair argument to be made here. Microsoft takes a lot of shit for having major holes in their OS, but a lot of those holes are patched within a reasonable time of their discovery. Those patches don’t do any good if they’re not applied, though, and the average PC user is not a technical support guy like me. He probably won’t even be aware that he needs to apply patches, but he won’t hesitate to blame Microsoft if he gets infected. At the very least I could see an argument for making automatic installation of critical updates the default, with the option to turn it off for folks who know what they’re doing. We already have a number of different software packages, mostly DRM systems, that update themselves automatically whether the user wants them to or not, and a lot of folks seem to have no problem living with that situation (the rest of us just don’t use that software). There’s a much stronger argument to be made for Microsoft doing the same with critical updates than there is for any DRM system.

The problem of unpatched systems has gotten bad enough that back in 2005 some ISPs started blocking infected systems from using their services, and others have been breaking Internet protocols in controversial ways to try to combat the problem. But the best offense is a good defense, and that means individual users keeping their systems patched and running current anti-virus software. The question then becomes: should Microsoft be allowed to force at least the critical updates on its users?
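For what it’s worth, the setting in question already lives in the registry on XP-era boxes. Here’s a minimal sketch, in Python, of how you might check where AutoUpdate is currently set; the key path and the meaning of the AUOptions values reflect my understanding of the documented layout, so treat it as illustrative rather than definitive:

```python
# Illustrative sketch: read how Automatic Updates is configured on an
# XP-era Windows machine. The key path and value meanings below are my
# understanding of the documented layout; verify before relying on them.
import winreg

AU_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update"

MEANINGS = {
    1: "Automatic Updates disabled",
    2: "Notify before downloading updates",
    3: "Download automatically, notify before installing",
    4: "Download and install automatically on a schedule",
}

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, AU_KEY) as key:
    value, _ = winreg.QueryValueEx(key, "AUOptions")

print(f"AUOptions = {value}: {MEANINGS.get(value, 'unknown setting')}")
```

A forced-update scheme would essentially mean Microsoft treating anything other than option 4 as advisory for home users, which is exactly what makes the idea so contentious.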

Microsoft uses Vista haters to demonstrate that Vista’s not so bad.

If you’ve spent much time here then you already know that I think Windows Vista is a decent operating system that is unfairly maligned. If I had a dime for every time someone has told me how much Vista sucks, only to admit when I ask that they’ve never actually touched the OS, well, I’d have at least a few bucks to spend. Surely I’m not the only person who’s noticed that it’s become “common knowledge” that Vista blows chunks, with the criticisms repeated endlessly by people who haven’t even used the OS.

It seems Microsoft noticed that trend as well and set out to put it to the test:

Spurred by an e-mail from someone deep in the marketing ranks, Microsoft last week traveled to San Francisco, rounding up Windows XP users who had negative impressions of Vista. The subjects were put on video, asked about their Vista impressions, and then shown a “new” operating system, code-named Mojave. More than 90 percent gave positive feedback on what they saw. Then they were told that “Mojave” was actually Windows Vista.

“Oh wow,” said one user, eliciting exactly the exclamation that Microsoft had hoped to garner when it first released the operating system more than 18 months ago. Instead, the operating system got mixed reviews and criticisms for its lack of compatibility and other headaches.

To be sure, the focus groups didn’t have to install Vista or hook it up to their existing home network. Still, the emotional appeal of the “everyman” trying Vista and liking it clearly packs an emotional punch, something the company has desperately needed. Microsoft is still trying to figure out just how it will use the Mojave footage in its marketing, though it will clearly have a place.

I wouldn’t be surprised by that at all. Certainly Vista has its issues, but then what OS doesn’t? The truth is the problems it had at launch were nowhere near as bad as what XP went through and, as was the case with past versions of Windows, it’s been slowly improving ever since.

Apparently Microsoft is rolling out a new campaign promoting Vista that will run into the hundreds of millions of dollars and will include such things as free technical support for small businesses that switch to Vista. Along the way you can be sure they’re going to be using that Mojave footage to show that Vista has gotten a bad rap:

“In the weeks ahead, we’ll launch a campaign to address any lingering doubts our customers may have about Windows Vista,” Ballmer wrote. “And later this year, you’ll see a more comprehensive effort to redefine the meaning and value of Windows for our customers.”

What gives the Mojave project its power, though, is the fact that it isn’t Ballmer or someone else at Microsoft saying that Vista has gotten a bad rap. It’s everyday people.

With scenes reminiscent of both Apple’s “real people” campaign of a few years back as well as classic commercials from Folgers and others, the Mojave project could prove a formidable weapon.

The Mojave project is remarkable both for its humble origin as well as the speed with which it was pulled off. The idea started barely two weeks ago in an e-mail from Microsoft’s David Webster to several superiors, including Veghte. Given the green light, Microsoft started videotaping responses just last week, in San Francisco. The preview Veghte gave to CNET News on Wednesday was the first time the footage had been shown outside the company and its contractors.

The footage could get a public airing as soon as next week or even at Thursday’s financial analyst meeting, although plans were still in flux as of late Wednesday night.

With the success of Apple’s anti-Vista ads—Macs are up to an 8.5 percent market share now—I’m surprised it’s taken this long for Microsoft to get around to fighting back. Now the question is: are they big enough to overcome “conventional wisdom”?

Linux doesn’t seem to live up to the stability hype.

Every time I write about computing on the Windows platform here, particularly when discussing problems, I can be assured that I’ll get at least a dozen comments/emails from folks encouraging me to switch to Linux because it’s the most secure and stable operating system in the universe. I’d only dabbled in Linux previously, having used it mainly because my webhosts are all Linux based, but since taking on the new job some two months ago I’ve had to become a lot more familiar with it. Specifically, I’ve had to get to know Ubuntu Linux rather quickly as it’s the primary distro used here at the office. I’ve mentioned before how one of the tasks assigned to me was to put together an Ubuntu-based kiosk for the scanners to use to browse the web while on break. I’ve made a lot of progress since I started on that project, but there’s still a bit to go before it’s completely done. I’m still far from a Linux expert, but I’m much further along than I was when I started two months ago. Installing the OS and various packages is no longer a knuckle-biting experience and I’m getting quite comfortable with VIM despite the fact that I hate it.

One of the things I’ve noticed in that time is that the much-vaunted stability Linux is supposed to be known for is largely a myth; or at least it appears to be for me. Hardly a day goes by that I don’t have at least one crash that requires me to completely reboot the laptop to get it working again, and there are often several crashes during the day that I manage to recover from. When you consider that I spend around 85% of my time running nothing other than Firefox 3 and Pidgin, that’s an impressive bit of crashing. Just getting a malfunctioning program to close is an annoying process, and half the time it doesn’t restore the system to a usable state. Killing the X session with the CTRL-ALT-BACKSPACE key combination and then logging back in will fix things maybe once in every six tries. Every now and then I’ll get lucky and an application will freeze up, the window turning an ominous gray, and then unfreeze on its own after a couple of moments for no discernible reason.

Now I accept that it’s possible I could just be a dumb fuck who’s doing something wrong and causing his own problems, but I find that difficult to believe because I’m not really doing anything all that advanced. I’m running a web browser and an IM chat client—two things that hardly ever crash on my Windows XP box at home. I’m sure I’ll get tons of emails about how some folks have had their Linux workstations running non-stop for 10 years with nary a crash in sight, but, based on my own experiences, I can only imagine it’s because they never do anything with them. I say this because I have two laptops here, both running Ubuntu, and the one that doesn’t crash at least once a day is the one I only touch occasionally to check the web-based workstation monitor. It seems that as long as I don’t do much with it, it runs along just fine, but if I spend any amount of time using it, it gets all pissy. I’ve been good about patching things when the little icon shows up and says there are important updates to be installed—a near-daily exercise as well—and I’ve been scanning the web looking for info on what may be causing the issues. The possible culprits are a whole host, ranging from the various hardware drivers in use to issues with some Firefox plugins.

Don’t get me wrong. There’s plenty to like about Linux in general and Ubuntu in particular (it’s a shitload easier to install than it used to be, for example), but from the standpoint of an everyday user I have to say that it crashes at least as often as any Windows installation does and is about ten times harder to diagnose when it does. For as far along as the GUI has come on Linux, it still seems that if you really want to be sure the changes you make take hold, and to see all there is to see, you have to open a command line and wade through endless text-based configuration files and logs which, if you’re lucky, might be semi-readable. I recognize that my years of using Windows make it seem easier to use in some respects, but I don’t think it’s all an illusion brought on by familiarity either. I’m sure some of this will become easier to diagnose as I become ever more familiar with Linux in my day-to-day use, but at the moment I’m less than impressed with its much-vaunted aura of stability.
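For what it’s worth, some of that log-wading can be automated. Here’s a rough sketch, in Python, of the kind of triage I end up doing by hand: scanning the tail of the main system log for anything that looks like trouble. The log path and the keyword list are assumptions you’d tune for your own distro and your own problems:

```python
# Rough sketch of log triage on an Ubuntu box: scan the tail of the main
# system log for lines that look like trouble. The log path and keyword
# list below are assumptions; adjust them for your own setup.
from pathlib import Path

LOG = Path("/var/log/syslog")  # /var/log/messages on some other distros
KEYWORDS = ("error", "segfault", "warn", "oops")

lines = LOG.read_text(errors="replace").splitlines()[-500:]  # recent entries only
for line in lines:
    if any(keyword in line.lower() for keyword in KEYWORDS):
        print(line)
```

It won’t tell you why Firefox froze, but it at least narrows down which subsystem was complaining at the time.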

I’ve been dicking around with Ubuntu.

I don’t know if I mentioned it or not, but one of the other things I picked up as a result of some PC side work lately is an old donated IBM Thinkpad 600E. Damn thing is ancient (Pentium II, 366MHz), but I was able to bump the RAM in it up to 512MB and slap a 20GB HD in it, so I’ve at least got a working laptop once more. So I figured I’d see if I could get Ubuntu to install on it to try out, and quickly discovered why Linux has a long way to go before it’s going to replace Windows Vista or any other Microsoft OS.

Everything I read about Ubuntu claims it’s the easiest of the Linux distros to work with. So far that has not been my experience. I started off by downloading the Live CD/Install CD image that was recommended on the Ubuntu website. That was a mistake, as it apparently doesn’t give you a choice between launching the Live CD (which essentially runs Ubuntu from the CD-ROM) and just doing an install. It turns out that starting a Live CD takes some time, no, make that a lot of time. So much so that I thought it wasn’t doing anything at all and that maybe I’d gotten a bad image. After talking with some coworkers, one of them mentioned that it took upwards of an hour for his to start up on hardware that wasn’t quite as old as what I was running, but he put that down to only having had 256MB of RAM. So I tried again that night and let it sit for two and a half hours with no apparent signs of life coming from the system.

Returning to the Ubuntu website, I didn’t find any suggestions that would be helpful in speeding this process up or bypassing the launch of the Live CD, but I did find a link to an “alternate text-only install CD,” which I proceeded to grab. This drops the whole Live CD bit and gets straight to doing the install, but it still took an inordinate amount of time to complete. By my estimate it took at least an hour and a half to finish the install, and it wasn’t an entirely smooth process. The laptop has a Linksys PCMCIA wireless card in it, and Ubuntu did manage to see the card but wasn’t able to actually get it to work for some reason. I tossed in a 3COM 10/100 card I had, and that didn’t fare any better despite the fact that it’s listed as compatible on the Ubuntu website.

But it did finally install, and I found some help pages on the Ubuntu site that offered suggestions on how to get the network cards working. To say that installing alternate drivers and enabling them was a convoluted and involved process would be an understatement. Hell, just finding where to configure the damned things was a lesson in trial and error. To top it all off, it still didn’t work even after trying everything suggested on the website. Not having a working network interface pretty much negates the whole point of having the laptop for me, as I wanted it specifically for accessing the Internet away from my desktop.

So I wiped the hard drive and tossed in my Windows XP CD-ROM. Total install time was an hour and four minutes. Both network cards were detected, and while I did have to download drivers for the Linksys wireless card, I was able to do so using the 3COM card without issue. Considering the age of the laptop, XP seemed to run pretty well, probably thanks to the half-gig of RAM I had in it. The difference between the two experiences was amazing. Despite being a pretty crappy OS in many ways, getting Windows up and running was a no-brainer.

While I’m certainly nowhere near as knowledgeable about Linux as I am about Windows, I have been working with it for years through my webhosts, so it’s not like I’m clueless. If the difference in setting up the two OSes is that profound for me, then I can only imagine what it’d be like for your average I-just-want-the-damned-thing-to-work Joe User, and it drives home why Linux won’t be replacing Windows anytime soon no matter how much safer, faster, or better it happens to be.

I’ve not completely given up on getting Ubuntu to work, as I’ve had some more suggestions from coworkers who use it on how to possibly get it up and running. I might even try reinstalling it tonight, though I’m debating downloading the Kubuntu variant as I like the KDE desktop a bit better than GNOME. Depends on whether I feel like tearing out what little is left of my hair.

Vista SP1 due in first quarter 2008.

Official word has come down from Microsoft that Windows Vista’s first service pack will arrive sometime in the first quarter of 2008. The plan calls for 15,000 people to have access to the SP1 beta by the end of September, with the final release date depending on how many bugs the beta testers manage to discover once they get their hands on it. No new features are planned at this point, as this is mainly to be a performance and stability update.

This is significant if only because it’s become a tradition among some segments of the Windows population to wait until the first service pack is released before even considering moving to a new OS, and Microsoft is clearly hoping this will help move those folks along. The article linked above is a Q&A with Jon DeVaan, a senior vice president of the Windows Core Operating System division, and it provides a couple of interesting insights:

PressPass: How do you know and decide what gets fixed for a service pack?

DeVaan: We are constantly monitoring the quality of users’ experience through Windows Vista’s built-in, automated feedback systems, such as the Customer Experience Improvement Program (CEIP) and Windows Error Reporting (WER). These are systems that customers anonymously and privately participate in via an explicit opt-in choice. Through the data we get back, we can identify, diagnose and then repair the most detrimental and prevalent problems users encounter.

Our primary focus after launch became addressing ecosystem compatibility issues that the data showed had adversely impacted some users’ Windows Vista experience. For example, when consumers see a “Device Not Found” message or the systems report back that a device failed to install, we can prioritize getting the needed drivers available on Windows Update or up on the hardware vendor’s Web site. As a result, our driver coverage went from 1.4 million in January to more than 2.2 million today. We also work directly with our partners to improve overall driver quality. We are able to see which drivers are causing system crashes or contributing to hangs and other performance problems, and then work across the ecosystem to bring solutions to market via Windows Update.

PressPass: So what changes should we expect to see in Windows Vista SP1?

DeVaan: I should start by saying that one thing people shouldn’t expect to see is new features, although some existing components and features will be enhanced. For example, we’ve added support in BitLocker Drive Encryption for encrypting multiple volumes on the PC, and have improved printer management by simplifying printing to a local printer from within a Terminal Server session. Service packs typically are not vehicles for new features, and the same will be true with Windows Vista SP1.

Windows Vista SP1 will contain changes focused on addressing feedback from our customers across a number of areas. In addition to all the fixes delivered via other channels like Windows Update, Windows Vista SP1 will address specific reliability and performance issues that have been discussed on many self-help forums, such as copying files and shutdown time. It will support new types of hardware and emerging standards, like EFI (Extensible Firmware Interface) and ExFat (a new file format that will be used in flash memory storage and consumer devices). It will also include some management, deployment, and support improvements, such as adding the ability to detect and correct common file sharing problems to Network Diagnostics. Windows Vista SP1 also will include Secure Development Lifecycle process updates, where we identify the root cause of each security bulletin and improve our internal tools to eliminate code patterns that could lead to future vulnerabilities.

As we’ve done in the past, we will document all of the changes through our support.microsoft.com site in a Knowledge Base article, which will be available around the time the beta is released.

Early word has it that SP1 is unusually big for what it’s supposed to contain:

Based on current test versions, the operating system update will be a 1GB file when uncompressed. By way of comparison, Windows XP—the whole thing—shipped on a CD, which only holds about three quarters of a gigabyte. On the plus side, systems that already have the latest Vista patches can be brought up to the Service Pack 1 level with only a 50MB compressed file through Microsoft’s online Windows Update utility.

Also notable, installing the OS will require 7GB of free hard drive space, though much of that will be returned to the user once the megapatch is applied.

This shouldn’t be a problem for most folks unless their HDs are close to capacity, and if you’re running Vista then chances are you’ve got a hefty hard drive already. Still, it’s something to plan for if you’re one of those data junkies.
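If you’re curious whether a given machine has the headroom, a quick sketch like this one will tell you. It uses Python’s standard library; the 7GB figure comes straight from the report above, and the drive path is just an example:

```python
# Pre-flight check: does the drive have the ~7GB of free space the SP1
# installer reportedly needs? (shutil.disk_usage requires Python 3.3+.)
import shutil

REQUIRED_BYTES = 7 * 2**30                     # the reported 7GB requirement
total, used, free = shutil.disk_usage("C:\\")  # example path; use "/" on Linux

print(f"Free space: {free / 2**30:.1f} GB")
print("Enough room for SP1." if free >= REQUIRED_BYTES else "Too tight.")
```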

The Tech Report takes Hitachi’s new 1 Terabyte hard drive for a spin.

We’ve been able to have over 1 terabyte of hard drive space in our personal computers for a while now by doubling up on 500GB or larger hard drives, but this is still a milestone, as it’s the first consumer-level hard drive to lay claim to being a terabyte in a single unit. The folks over at The Tech Report got their hands on one to test it out, but first they discussed the age-old problem of when a terabyte isn’t really a terabyte:

By now I’ve no doubt been heckled by someone insisting that the 7K1000 doesn’t actually offer a full terabyte of storage capacity. This person probably sounds like the comic book guy from The Simpsons, but don’t dismiss him. He has a point, sort of.

According to the International System of Units (SI), a terabyte consists of 1,000,000,000,000 bytes—1,000^4, or 10^12. Windows confirms that the 7K1000 delivers 1,000,202,240,000 bytes, which is more than it needs, so what’s the comic book guy on about?

Look a little closer, and you’ll see that while the 7K1000 does indeed offer over a trillion bytes, that capacity only translates to 931 gigabytes. For an explanation of why, we have to delve into the always exciting world of numerical systems. SI units are built on the same base 10 decimal system we’ve been using since grade school. Computers, however, use a binary base 2 system. So, while a kilobyte in decimal is 1,000 bytes, a kilobyte in binary translates to 1,024 bytes. A binary terabyte, then, is not 1,000^4, but 1,024^4, or 2^40.

Multiplying that out, a binary terabyte yields 1,099,511,627,776 bytes, which is why the 7K1000 falls short of a thousand gigabytes. The drive would actually need 1,024 gigabytes to achieve terabyte status in the binary world. This translation problem isn’t unique to the 7K1000, either. Virtually all hard drives advertise their capacities in SI units, so their actual capacities fall short of binary expectations.

The discrepancy between the stated size on the box and what you actually see once the drive is installed in your PC has been the source of grumbling for years now, but it really hasn’t come to a head. I suspect, however, that with the gap now amounting to what was once a hard drive unto itself (69GB), said grumbling may grow a little louder as capacities—and the associated gap—continue to increase.
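The arithmetic is easy enough to check yourself. Here’s a quick sketch using the byte count Windows reports for the 7K1000, per the quote above:

```python
# Verify the decimal-vs-binary gap: drive makers count capacity in
# powers of ten, while the OS reports it in powers of two.
DRIVE_BYTES = 1_000_202_240_000   # what Windows reports for the 7K1000

GIB = 2**30                       # a "binary" gigabyte, what the OS calls a GB
DECIMAL_TB = 10**12               # what the box calls a terabyte

print(f"{DRIVE_BYTES / GIB:.1f} GB")   # ~931.5, which Windows truncates to 931GB
print(f"{DECIMAL_TB / GIB:.1f} GB")    # ~931.3 binary GB per decimal terabyte
print(f"gap: {1000 - DECIMAL_TB / GIB:.0f} GB")  # the ~69GB shortfall per TB
```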

But that’s being nitpicky. The real question is how well the drive performs. According to the folks at TR it performs pretty well, but not so well as to justify its price compared to some of the smaller (smaller being a relative term here) and cheaper drives, such as the Western Digital 750GB Caviar SE16. This is particularly important when you consider that you’re probably going to want two drives to set up a mirrored RAID array in case one of them fails, because 931GB is a lot of data to lose to a hard drive failure.

Still, kudos to Hitachi for being first to market with a hard drive that us old-timers once only dreamed of. I won’t be getting one anytime soon, as I’m still finding I have plenty of room on the measly little 320GB hard drive in my current machine, but with the impending release of Windows Home Server it may not be that far off in the future that I find myself looking for lots of storage space.

My Windows Vista experiment has ended… for now.

It’s true: the other day I sat down and restaged my PC back to Windows XP Professional, but not so much due to any failing on Vista’s part. For the not-quite-a-month that I ran Vista I had very few problems. The biggest one was the audio driver for my motherboard, as nVidia isn’t including it with their standard driver package the way they do with the XP drivers. The default driver from Microsoft would work fine for a bit, but tended to corrupt sound in games for no apparent reason. A quick visit to the Realtek website for their latest AC97 Vista drivers was all it took to smooth out that problem.

No, the reason I switched back to XP is that I started playing World of Warcraft once more after landing the job at Meijers. WoW runs just fine under Vista, but I’d previously gotten into the habit of running dual monitors while playing, with the game on my primary screen and my secondary screen set aside for looking up info as needed, such as good grinding spots for getting your jewelcrafting skill up. Having both WoW and several browser windows going at the same time eats up quite a bit of memory, and while 1GB of RAM is enough to pull this off under XP, Vista uses up a bit more RAM, which made trying to do the same thing more of an annoyance.

That’s the only reason I switched back: I was spoiled in how I was used to playing WoW. If I had 2GB of RAM I’d probably still be running Vista. Most of the other games I tried under Vista worked pretty well, though I didn’t get to test all of my usual titles. Civ 4, F.E.A.R., and Half-Life 2 all ran without a hitch, and day-to-day use was just fine as well. I’m still working on an entry about what installing and using Vista is like, but thought I’d mention that the experiment is at an end for the moment.

My Windows Vista experiment.

Today I got a free copy of Windows Vista Business Edition, compliments of Microsoft themselves, for my participation in their Power Together promotion a while back. I had to sit through three half-hour webcasts on different parts of Vista and how I can use it to do this, that, and the other thing much better than I ever could before, and for my patience they sent me a full free copy.

Seeing as I have it, I figure I may as well try it out and get used to it. I already know there will be some issues because I’ve been keeping up on them at the various tech websites I read. In all I should expect about a 17% drop in the performance of my games (assuming they don’t crash out) and some immature drivers from nVidia, among many others. But the simple truth of the matter is that I need to learn Vista one way or the other because, like it or not, it is the future of Windows PCs, and if I’m to stay in the technical support field it helps to be up on the latest version of the OS.

There’s also the fact that at least a few of you regulars actually seem to pay attention to my tech entries, and a couple of you have written in asking for my opinion on Vista. I figure this is one way to come up with a few blog entries, at least, as I write about my experiences with the OS. If things are so bad that I switch back to XP before very long, then that in itself will say a lot. You may recall that I played around with the beta version a while back, but I had that set up as a secondary install and didn’t really use it the way I would if it were my primary OS. You don’t really get to know an OS until you’ve fought to do your day-to-day stuff on it.

So once I’m done with this entry and getting caught up on my email, I’ll be backing up my data and doing a fresh install. My PC has been in need of a restage for a while now anyway, so I may as well experiment while I’m at it. Stay tuned to find out how things go.