Practical Technology

for practical people.

March 1, 2006
by sjvn01

Why Windows Vista will suck

Oh! My aching head.

When I first saw ExtremeTech’s Why Windows Vista Won’t Suck, I thought: “Aha, sarcasm.”

Nope. I was wrong.

They really were saying that Vista is pretty good.

Oh please.

First, let me say, I’ve been running Vista myself for quite some time. Next to me at this very moment is a Gateway 835GM. Under the hood, it has an Intel Pentium D 2.8GHz dual-core processor, an Intel 945G chipset, 1GB DDR2 (double data rate) DRAM, a 250GB SATA hard drive, and built-in Intel GMA (graphics media accelerator) 950 graphics. That’s a fairly powerful machine. Which is a good thing, because it’s the only PC in my office of 20 PCs that’s got enough oomph to run the Windows Vista February CTP (Community Technology Preview) build 5308 without driving me into fits of rage.

Mind you, it’s not enough machine for Vista. I could run any Linux with all the bells and whistles on it without a problem. But, even though this system meets Intel’s recommendations for a Vista-capable Intel Professional Business Platform, it still doesn’t have the graphics horsepower needed to carry off Vista’s much ballyhooed three-dimensional Aero Glass interface.

My point is, though, that while I write a lot about Linux, and I prefer it, my real specialty is that I know operating systems of all types and sorts, including Vista.

So when I say Vista sucks, well, I know what I’m talking about.

“Suck” is a relative term, though. Vista will be better than XP, which has easily been Microsoft’s best desktop operating system to date.

However, Vista also requires far more hardware oomph than previous Windows systems. I’d say Intel’s recommendations are pretty much a minimum for Vista. I would only add that if you expect to see the fancy desktop, you need to invest in, say, an ATI Radeon XPress 200, an Nvidia nForce4, or a high-end graphics card.

The truth is that very, very few people are going to be upgrading their existing systems to Vista. To make it work well, you’re really going to need a new computer. If you didn’t buy your PC in 2006, I wouldn’t even try to run Vista on it.

OK, so the first reason that Vista sucks is that, no matter what version you get, it’s likely to be expensive. No matter what Microsoft ends up charging for it, the only way most people are likely to be running it is when they get a new PC.

Now, let’s see what my colleagues at ExtremeTech have to say in Vista’s defense …

Vista is much safer and more secure. “The whole kernel has been reorganized and rewritten to help prevent software from affecting the system in unsavory ways.”

Well, yes, this is certainly what Microsoft would have to do to make it truly secure. I've said as much myself. Unfortunately, while Microsoft has worked hard on improving Vista's security, it's still pretty much the same old rickety kernel underneath it.

Need proof? In January, Microsoft shipped the first security patch for Vista. It was for the WMF (Windows Metafile) hole. You know, the one that my security guru friend Larry Seltzer called "one of those careless things Microsoft did years ago with little or no consideration for the security consequences."

Good job of cleaning up the core operating system, Microsoft!

Of course, Linux never had this kind of garbage to clean up in the first place.

The ExtremeTech guys also say that Microsoft has done a good job of cleaning up Windows' memory management and heap handling. They're right about that.

What they don’t mention is that Linux and Mac OS X have both done that kind of thing well for years. They also don’t mention that for an application to actually get the most from these improvements, it will need to be rewritten. So, if you want to get the most from Vista, be sure to set some money aside for new applications as well as a new PC. You’ll need it.

They also praise SuperFetch, Microsoft’s new combination application pre-fetching technique and hyper-active virtual memory manager. Intelligent pre-fetching is a fine idea for boosting performance. You’ve been able to use it in any application written with the open-source GCC for years. Microsoft’s execution of it, however, has one of the biggest “What were they thinking of?” mistakes I’ve seen in a long time.

You see, with SuperFetch you can use a USB 2.0-based flash drive as a fetch buffer between your RAM and your hard disk. Let me spell that out for you. Vista will put part of your running application on a device that can be kicked off, knocked out, or that your dog can carry away as a chew toy. Do you see the problem here? Me too!
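The pre-fetching idea itself, as noted above, is sound; in GCC's world it shows up as the `__builtin_prefetch` builtin. As a rough illustration of the same principle at the application level, here is a minimal Python sketch (the `prefetch` helper is my own hypothetical example, not anything from SuperFetch or GCC) in which a background thread fetches the next few items before the consumer asks for them:

```python
# Illustrative only: application-level read-ahead, loosely analogous to
# the pre-fetching discussed above (not GCC's __builtin_prefetch).
import threading
import queue

def prefetch(iterable, depth=2):
    """Wrap an iterable so the next `depth` items are produced
    in a background thread before the consumer asks for them."""
    q = queue.Queue(maxsize=depth)
    _END = object()  # sentinel marking the end of the stream

    def worker():
        for item in iterable:
            q.put(item)      # blocks once `depth` items are queued
        q.put(_END)

    threading.Thread(target=worker, daemon=True).start()
    while True:
        item = q.get()
        if item is _END:
            return
        yield item

# Usage: items are computed ahead of the (possibly slow) consumer.
squares = list(prefetch(x * x for x in range(5)))  # -> [0, 1, 4, 9, 16]
```

The point of the sketch is only that pre-fetching trades a little memory (the queue) for latency hiding, which is fine so long as the buffer itself is reliable, unlike a removable flash drive.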

I also understand that Vista will have improved TCP/IP networking. It’s nice to know that they’ve finally done something with that open-source BSD code that’s the basis of their TCP/IP network protocol.

What ExtremeTech doesn’t mention, though, is that Microsoft is also planning on making it so that you can use IPSec (IP security protocol) for internal network security. This is another of their “What were they thinking of?” moments.

IPSec works fine for VPNs (virtual private networks). But, as John Pescatore, an analyst at Gartner Inc., said about this scheme, “Once you try to encrypt internal communications, your network architecture breaks.” He’s got that right.

Next up, they say wonderful things about Home Premium Vista having Media Center capability built into it. Maybe I'm just a little confused here, but after looking at the feature sets, the only thing I see that's changed here is that they'll be calling the next media-enabled Windows "Home Premium Vista" instead of "Media Center Vista."

They also praise this version for having CableCard support, which means you'll be able to record HD (high-definition) broadcasts from cable, instead of being stuck with OTA (over-the-air) HDTV, without turning your entertainment room into an electronics lab.

Excuse me, but that’s not because Microsoft is being innovative. It’s because they are still not shipping CableCard cards for PCs. Come the day they finally ship — and I’m betting the ATI OCCUR makes it out first — I suspect MythTV and the other open-source PVR (personal video recorder) projects will be right there.

The ExtremeTech crew also has nice things to say about Vista's audio support. Mea culpa: it is better than anything else out there. So, Linux desktop designers, it's time to get cracking on audio support. Vista won't be out until the fourth quarter of this year at the earliest, which gives you plenty of time to play catch-up.

DirectX 10, which is mostly used for game graphics and in the aforementioned Aero, is also much improved. It's also, however, completely different from DirectX 9. Current games and current graphics cards won't be able to do anything with it, which is why Vista also supports DirectX 9.

Here again, I'll give the Microsoft guys some credit. DirectX 10 is a big improvement for gamers. It's still not going to make your PC the equal of a dedicated game console, however.

The folks from ExtremeTech also like the fact that Vista will have many more built-in applications. Isn’t this why Microsoft got into trouble with the Department of Justice a while back? Isn’t this the kind of thing that has both South Korea and the European Union raking them over the coals? Why, yes. Yes, it is.

Be that as it may, as I sit here looking at my SUSE 10 Linux desktop, I can’t help but notice that I have, for free, every software application I could ever want. Advantage: Linux.

At the end of the story, the ExtremeTech crew ‘fesses up that “We don’t know that it’s going to be great just yet.” True. And, I don’t know that it’s going to suck yet, either.

Expensive? Yes. Awful? We’ll see.

What I do know is that I really don't see a thing, not one single thing, that will make the still undelivered Vista significantly better than the Linux or the Mac OS X desktops I have in front of me today.

A version of “Why Windows Vista will suck” first appeared in DesktopLinux.

February 24, 2006
by sjvn01

RIM, NTP and Patent Madness

Is this a great country or what, when millions of users and a billion-dollar company can be held hostage by a court that's taking seriously patents the U.S. Patent and Trademark Office has already rejected?

If there are still people who doubt the essential stupidity of software patents, they haven't been following RIM vs. NTP.

Three million U.S. business users live and die by their RIM BlackBerrys. The U.S. Department of Justice asked the federal court in Virginia to keep BlackBerry wireless e-mail service going because government workers need it.

The court, however, turned down the Department of Justice's request. And on Feb. 24, the judge who has kept NTP's action going is considering granting NTP an injunction that would shut down the mobile e-mail service.

The basis for all this? Five patents. Five, if I may say so, bad, lousy patents.

But who cares what I think? What should count for something is that the U.S. Patent and Trademark Office has re-evaluated the contested patents and rejected all of them!

Yes, these initial rejections were “non-final rejections.” Yes, NTP has already appealed these rejections.

And how is NTP doing? On Feb. 22, the USPTO issued a final rejection for one of the patents, and eWEEK has just learned that the office has slam-dunked another patent with a final rejection.

The remaining three? The USPTO has made it clear from its public statements that it plans to reject all of NTP's patents.

In short, businesses, the federal government and a multibillion-dollar e-mail business are being held hostage by NTP, its five bad patents and Judge James Spencer of the Eastern District Court of Virginia, who denied RIM's request to stay a possible injunction pending the USPTO's final decision on all of these patents, which is probably months away.

What kind of garbage is this?

Before these patents were discredited, RIM was willing to pay $450 million to the patent holding company as long as NTP granted “RIM and its customers an unfettered right to continue its BlackBerry-related wireless business without further interference from NTP or its patents.”

Almost half a billion smackers wasn't good enough for NTP, so talks broke down.

So here we are. A major business and a vital part of American business communications are hanging by a thread because of software patents that everyone, except for NTP and one judge, realizes are garbage.

NTP, on the other hand, like other patent trolls, has nothing to lose. It has no customers and no products. NTP does nothing except collect licensing fees and sue companies that don't pay it off.

A patent troll, according to Peter Detkin, the former assistant general counsel for Intel who coined the phrase, "is somebody who tries to make a lot of money off a patent that they are not practicing and have no intention of practicing and in most cases never practiced." That's NTP with its RIM shakedown to a T.

This isn't the Sopranos, though. What NTP is doing is completely legal. It's just also completely wrong.

Because of the fatally flawed U.S. patent system, NTP and many other companies are preventing good ideas from becoming good products and services. And, as in RIM's case, patent trolls are making existing products and services more expensive for all of us.

I'm not the only one who thinks so. Carmi Levy, senior research analyst at Info-Tech Research Group, said, "An injunction would give free rein to patent trolls. We are in danger of devolving into an era where technology companies expend their energy on legal battles rather than innovation. Shareholder value will decline and the best interests of the market will be ignored."

Exactly.

But even if RIM or other "trolled" companies win, we'll still have to pay more for our technology. After all, RIM will still have to pay millions in legal fees. Where will that money come from? Why, from every BlackBerry subscriber's pocket, of course.

This is only going to continue, unless we demand that our congressmen realize, as the Public Patent Foundation has pointed out, that today's wrongly issued patents and unsound patent policies are harming the public. Once they finally get those ideas into their heads, they can give the patent system the top-to-bottom reform it sorely needs.

When that day comes, I hope, I really hope, that they just kill the idea of patenting software once and for all. It does no one any good except for the patent trolls.

A version of this story first appeared in eWEEK.

February 13, 2006
by sjvn01

Google Windows apps coming to Linux

Google and CodeWeavers Inc. are working together to bring Google’s popular Windows Picasa photo editing and sharing program to Linux. The program is now in a limited beta test. If this program is successful, other Google applications will be following it to the Linux desktop, sources say.

The Linux Picasa implementation includes the full feature set of the Windows Picasa 2.x software. It is not, strictly speaking, a port of Picasa to Linux. Instead, Linux Picasa combines Windows Picasa code and Wine technology to run Windows Picasa on Linux. This, however, will be transparent to Linux users when they download, install, and run the free program on their systems.

Wine is an open-source implementation of the Windows API (application programming interface). It runs, in turn, on top of the X Window System and Linux (or Unix). Wine is not, as has sometimes been said, a Windows emulator. Wine provides a Windows API middleware layer that enables Windows programs, such as Office 2003, to run on Linux without the slowing effects of an operating system emulation or a virtual machine. Indeed, in some respects, Wine on Linux is faster than XP on the same hardware.

The new program is reportedly re-tooled to work smoothly under CodeWeavers' CrossOver Office version of Wine. This may mean that Linux Picasa is using the program's own native Windows DLLs (dynamic link libraries). Wine enables developers to use Windows DLLs for greater speed when they're available.

The free Linux Picasa download will include a runtime version of CodeWeavers' modified Wine, so that users can simply download the package from Google and run it on their Linux systems. Users will not need to download and install Wine, or to purchase CodeWeavers' commercial version of Wine, CrossOver Office.

Sources close to the project said that the Linux version of Picasa is meant to be as easy to install as the Windows version.

Officially, CodeWeavers had no comment about this project. Google public relations replied, "We don't have any information to share at this time," regarding the project or any business relationship with CodeWeavers.

Sources close to CodeWeavers, though, said that CodeWeavers has been tasked with making sure that Picasa works well with Wine and Linux. If successful, future versions of Picasa will be written to the Wine APIs so that the program can easily run on both Windows and Linux with Wine.

Sources also said that Picasa for Linux will be out shortly. Further, if the Linux Picasa project is successful, you can expect to see other Google Windows programs migrating to Linux via Wine and CodeWeavers.

According to sources close to Google, another popular Google Windows program is also coming over to Linux: the Google Talk client.

This project, however, is not going through CodeWeavers. Instead, sources indicate that a beta version of Google Talk for Linux has been created within Google.

While no further information was forthcoming on this project, it is worth noting that Google hired Sean Egan, the lead developer of the popular open-source IM client GAIM, in October 2005. GAIM itself already supports the Jabber protocol, which Google Talk uses, and can be configured to work with Google Talk.

February 3, 2006
by sjvn01

Why Photoshop tops most-wanted Linux app list

Photoshop? The application most people want, at this date, to be ported to Linux from Windows is Photoshop? Color me surprised!

When Novell Inc. started its survey of what applications people wanted ported to Linux, Novell CoolSolutions site editor Scott Morris and I were both surprised to find Adobe Photoshop anywhere near the top of the list in early results.

Quicken, my own favorite, QuickBooks, Dreamweaver — sure. But Photoshop — when we already have GIMP (Gnu Image Manipulation Program)?

GIMP, in case you don't know, has long been considered one of the Linux desktop's success stories. For example, it was described in LinuxPlanet as offering "a level of functionality comparable to Photoshop for free" — back in the year 2000.

Now, what I know about photo editing programs could fit on a small, say, 360KB 5.25-inch floppy disk. If you can't do it with Google's Picasa on Windows, F-Spot on Linux/GNOME, or iPhoto on Mac OS X, it's beyond me.

So I pestered some of my friends in the graphics business to see why Linux users would prefer Photoshop over GIMP.

First of all, Photoshop — on either Mac OS X or Windows — is the default photographic and prepress program for serious graphics firms. Just as Quark Inc.’s QuarkXPress was for the longest time the best layout program in serious publishing work, Photoshop is simply “The” application that professionals use.

It’s also not really thought of as a “Windows” application in many shops. For many graphic pros, it’s a Mac OS program. So this appears to be a case where it’s not really so much that people want a Windows application ported to Linux, they want what they see as the best-of-breed application, regardless of operating system, to run on Linux.

I was also told that while GIMP’s functionality may rival Photoshop’s, how you get there is very different. For instance, to users who know Photoshop, GIMP’s SDI (Single Document Interface) can be confusing. In GIMP, each image gets a separate window, whereas Photoshop’s MDI (Multiple Document Interface) groups them all together in a single window.

Now, this may not sound like much, but I picked this example because the debate over whether SDIs or MDIs are the better way to handle a desktop is one of those endless debates in interface usability circles. Most people aren’t aware of the details of these issues; they just know that if they’re used to doing it one way, doing it the other way is a lot more difficult.

GIMP’s interface has also been criticized for hiding menus that should be near the top. I did some checking on this and I wonder if some Photoshop fans just haven’t looked at GIMP’s upcoming 2.4 update. For example, as Nathan Willis, in his early look at GIMP 2.4 in NewsForge noted, in the new GIMP, operations now have a top-level menu of their own instead of being buried in the Layer menu.

If you want a more Photoshop-like interface to GIMP today, your best choice is Scott Moschella’s self-described “hack” Gimpshop.

GIMP 2.4 will also include color management. This is a must for serious graphic designers, who must work with on-screen images that will end up in print. It’s still not perfect, according to both Willis and my friends in the business, but it’s a start.

Willis also lists several things that are still missing from GIMP 2.4.

“The most fundamental shortcoming of the GIMP, according to graphics professionals, remains its limitation to grayscale and RGB image modes; press-ready images need CMYK (Cyan-Magenta-Yellow-Black), and many designers make heavy use of Lab color and duotone (tinted) modes. Second is its limitation to 8-bit color — as high-end scanner and digital camera prices drop, more and more people need to work with 16-bit-per-channel data,” wrote Willis.
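To make the CMYK point concrete, here is the textbook RGB-to-CMYK formula in a few lines of Python. This naive conversion is nothing like real prepress work, which goes through ICC color profiles and device characterization; it only shows what the extra black channel is:

```python
# Naive, textbook RGB -> CMYK conversion, for illustration only.
# Real prepress conversion uses ICC color profiles, not this formula.
def rgb_to_cmyk(r, g, b):
    """r, g, b in 0..255; returns (c, m, y, k), each in 0.0..1.0."""
    if (r, g, b) == (0, 0, 0):
        return 0.0, 0.0, 0.0, 1.0          # pure black: all key, no ink
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    k = 1.0 - max(r, g, b)                 # black (key) channel
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return c, m, y, k

# Pure red: no cyan, full magenta and yellow, no black.
print(rgb_to_cmyk(255, 0, 0))   # -> (0.0, 1.0, 1.0, 0.0)
```

The fourth channel is the whole point: an editor locked to RGB, as GIMP was at the time, simply has nowhere to store the separation a press needs.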

Photoshop also has its own world of software training. There are many different ways to get to slightly different ends in Photoshop. People who’ve taken the time and trouble to learn many of them have very little incentive to learn another graphics program.

Another problem, according to my buddies, is that besides Photoshop itself, there are hundreds of Photoshop plug-in programs. Of those, everyone has their handful of favorites that they use on most of their projects. GIMP simply doesn’t have anything close to this sort of third-party add-on software community.

In addition, other important graphics and publishing programs are set to work with Photoshop. For example, remember how I said QuarkXPress “used to be” the pros’ only choice for pre-print and page layout? It’s being knocked off its pedestal by Adobe InDesign. In part, that’s because Adobe Bridge enables easy file integration across the entire Adobe creative suite.

As my colleague, John Rizzo, observed in our sister publication, Publish, “It’s no stretch to say that Adobe Systems Inc.’s Photoshop has made the transition from user application to major developer platform.”

And, there you have it — the reasons why users want Photoshop.

GIMP? It’s good, and it’s getting better, but unless Adobe takes a wrong step, I don’t see it playing a major role on professional desktops.

Some would argue, of course, that since GIMP is free software, it will eventually play a larger role. I still don’t see it.

As Rizzo said, and I’ve seen and heard now, Photoshop really is a platform, not just an application. When you’re buying into an entire system, as the graphics business clearly has, the upfront cost of a single application doesn’t amount to a hill of beans in the buying decision.

Still, if you look beneath the surface, simply bringing Photoshop over to Linux isn't going to be enough. For Linux to be taken seriously in design shops, Adobe needs to start moving its entire creative suite of software to Linux.

While Adobe has been edging toward Linux for some time now, it also took its own sweet time in bringing the latest version of Adobe Reader, aka Acrobat, to Linux. After all, Adobe didn’t even release Version 6 for Linux.

Still, Adobe did show up for the OSDL's desktop architect meeting this past December. This was not a meeting for just anyone; only a few dozen of the top Linux desktop designers and architects were there. That Adobe was in the room suggests it's taking the Linux desktop seriously.

I certainly hope they are. After all, as the Novell survey is showing, Linux desktop users are certainly taking Photoshop seriously.

A version of this story first appeared in DesktopLinux.

February 2, 2006
by sjvn01

Linux Wi-Fi hacking for fun and features

OK, how many of you are Wi-Fi hackers? Don’t be shy, we’re all friends here.

Besides, you're in good company. There are several different groups working on better ways to get the most out of the Linux hidden away within many Wi-Fi devices. In particular, Linksys Wi-Fi routers and APs (access points), like the WRT54G and WAP54G, have long been hacker favorites. (Editor's note: If you're considering getting a Linksys router to hack, be sure to read about the WRT54GL — a special Linux-friendly version with more RAM and Flash).

Indeed, Eric Raymond, hacker extraordinaire and the person behind a little term you might have heard of — open source — has written a guide to hacking the 54G.

Why do this to perfectly functional Wi-Fi boxes? Well, if the famous climber George Mallory had been a hacker, he might have said, “Because it is there.”

But, it’s more than that. Firmware and Linux hacks on Wi-Fi devices can increase their range, add in security, add ssh (Secure Shell) functionality, and add VPN (virtual private network) services.

In other words, hacking a Wi-Fi device can add a great deal more functionality to an already useful network device.

Until recently, all of this has been, well, not difficult to do, but not a job for a newbie either. Now, however, as Joe Barr explains in a Linux.com article entitled OpenWrt nears prime-time, a new program called OpenWrt, now at its fourth release candidate, makes hacking many different kinds of routers a lot easier.

According to Barr, OpenWrt RC4 is really a Linux distribution based on the 2.4.30 kernel. As such, you can use it to run a wide variety of applications. Besides the functionality I mentioned earlier, you can even use it, on some devices, to run such open-source applications as the Asterisk PBX (private branch exchange); SANE, the Linux scanner driver; or DansGuardian, a Web proxy and content filtering program.

Better still for new users, OpenWrt now has a Web interface: OpenWrt Admin Console. This makes setting up the basic functionality — type of Wi-Fi security, operating mode (access point, bridge, client, or ad hoc), etc., etc. — much easier.

None too shabby, eh?

Of course, like any of these projects, you do take the chance of taking the perfectly sound and working heart of your Wi-Fi network and… turning it into a brick.

If you install OpenWrt, or any other such program like Sveasoft or HyperWRT, you will void your warranty and there’s always the chance that you’re going to end up with a paperweight with antennas.

Don’t say I didn’t warn you!

That said, in the case of OpenWrt, I’ve managed to get it running successfully on both of my Linksys WRT54G version 2.0 router/APs and it’s made both of them a lot more useful to me.

So, if you're feeling brave and you want to give your Wi-Fi network a kick in the pants, what can I say except, "Go for it!"

January 30, 2006
by sjvn01

Selling Linux and Open Source to Bean-Counters

I’m no bean-counter. I’m an IT guy. But, I know that over the last few years, it’s CFOs and dollars, not CIOs and gigabytes, that determine what technology companies buy.

But, here's news you can use to get your CFO on board with a Linux and open-source makeover: E-Trade Financial saved $13 million a year, and saw a performance boost as well, by switching from Solaris to Linux.

Now, $13 million isn’t chicken feed, even to a company like E-Trade that reported $1.7 billion of revenue in 2005.

It wasn’t just Linux that made the difference, though. It was also the Apache Web server and the Jakarta Tomcat JSP (Java Server Pages) servlet system.

In another eWEEK E-Trade story, E-Trade’s VP of architecture, Lee Thompson, explained, “the Red Hat 7.2 kernel came out, which had support for SMP (symmetric multi-processing) and a 32-bit message queue for shared memory. And, all of a sudden, our application booted.”

That, however, was only the first part of the story. It’s what E-Trade did next that many would-be corporate Linux supporters fail at.

Thompson went ahead and "grabbed a bunch of our architects and we ran like crazy and got a full stack of our application… and we ported a representative stack of our application — our authentication, quote services, product services, some of our trading services and the servlets that rendered the HTML — over to this new stack."

See the point? Thompson didn’t just show that Linux and open-source could now potentially run the company’s software; they went ahead and showed that it really could run the company’s software.

It’s one thing though to simply run software. It’s another to show that you can do it effectively. Thompson and his crew made that next step.

"We ran some load testing on it, and we knew when it fell over. The way the Sun systems worked, we could keep adding more and more test users on the Sun box and it would just keep cranking along — it didn't really elbow. The Linux box was much faster, and then, somewhere around 180 users, it would elbow… But before 180 users it was much faster."

So, they showed that Linux was faster, but they also found out where it would stop working effectively. I think it’s very important when trying to sell Linux and open-source in a business to not oversell it.

Time and again, I've seen Linux supporters go on and on about how wonderful Linux is, but never mention its weaknesses. By doing this, they come off more as fans than serious IT workers. This, in turn, weakens their case.

The final part, and this is where Thompson et al. showed they know how modern IT works: while each Linux system could only support 180 users, as opposed to the 300 to 400 users of a Sun 4500, the Linux servers cost about $4,000 apiece while the 4500s ran around a quarter of a million dollars each.
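Working those figures through, with the one assumption that the Sun 4500 lands at the midpoint of its 300-to-400-user range, the back-of-the-envelope arithmetic looks like this:

```python
# Back-of-the-envelope cost-per-user check, using the article's figures.
# Assumption: 350 users as the midpoint of the Sun 4500's 300-400 range.
linux_cost, linux_users = 4_000, 180
sun_cost, sun_users = 250_000, 350

linux_per_user = linux_cost / linux_users   # roughly $22 per concurrent user
sun_per_user = sun_cost / sun_users         # roughly $714 per concurrent user
ratio = sun_per_user / linux_per_user       # Sun costs ~32x more per user

print(f"Linux: ${linux_per_user:.0f}/user, Sun: ${sun_per_user:.0f}/user, "
      f"ratio: {ratio:.0f}x")
```

Even if you buy two Linux boxes for every Sun box's worth of users, the gap is so wide that the exact midpoint you assume barely matters.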

Now, those are the kind of numbers, not GHz or Mbps, that make a CFO’s day sunny.

Better still, from the big-picture viewpoint, Keynote measurements of transaction times showed that with Linux, transactions took only 4 to 5 seconds, rather than 8 to 9 seconds with Solaris. For E-Trade's customers — online traders who want speed, more speed, and then more speed on top of that — this was great.

So what are the lessons from this tale?

First, test out Linux and open-source on the actual jobs your company does. Next, load-test it with real corporate workloads. Then, try it out in production. And, last but never, ever least these days, show how Linux and open-source will make money for your business.

It’s all about the bottom line, and with lower upfront costs than Windows or Solaris, there’s no reason why Linux shouldn’t become THE business operating system of the 21st century.