Practical Technology

for practical people.

April 8, 2008
by sjvn01

Linux supporters gather at Foundation annual meeting

Austin, Texas—How do you herd cats? Well, as the famous EDS commercial shows, it isn’t easy. In a sense, that’s what the Linux Foundation, the nonprofit pro-Linux organization, will be doing this week at the invitation-only LF Collaboration Summit at the University of Texas Supercomputing Center here.

Linux, as anyone who follows it knows, is the result of the efforts of hundreds of developers. It serves the needs of at least as many companies and, thanks to its role in powering leading Web sites such as Google and its popularity with Web-hosting companies, hundreds of millions of users.

So, while no one “runs” Linux the way Microsoft runs Windows, the Linux Foundation, along with related projects such as the Linux Standard Base, does the best it can to herd the Linux cats.

Or, to be more precise, the LF brings together the top cats from both the corporate world and the open-source community. At this week’s meeting, engineers and developers will join CEOs and CIOs to talk about Linux’s recent path and its future for the coming year and beyond.

While the technology of the Linux kernel and Linux printing, for example, will be star subjects of the meeting, the attendees will also be discussing the economics of Linux. Make no mistake, this is no gathering of Linux fanboys. This is a gathering of chip vendors, such as AMD and Intel; PC OEMs, such as Asus, Dell, Everex and Hewlett-Packard; and Linux companies, from the biggest, Red Hat and Novell, to some of the smallest.

Among the subjects the attendees will be addressing are how well PC vendors are doing with their preinstalled Linux offerings; how MySQL, in a speech given by MySQL CEO Marten Mickos, will work with Linux now that the popular open-source DBMS company has been acquired by Sun; and how Google, the LiMo Foundation and OpenMoko, among others, will be bringing out Linux phones.

In addition, IDC Vice President of Research Al Gillen will be presenting a Linux Foundation-sponsored white paper: “The Role of Linux Servers in Commercial Workloads.” The gist of the paper is that Linux servers are moving beyond edge duties (Web servers and services) and infrastructure duties (file and print sharing) to running “mainstream business-oriented workloads.” The bottom line is that IDC sees Linux server spending growing from 2007’s $21 billion to almost $50 billion by 2011.

Also, top executives from Red Hat, IBM, Intel, Motorola, Oracle, and VIA Technologies, among others, will be speaking at the conference. But the more important part of the gathering won’t be the speeches and panel discussions; it will be open-source developers, desktop and server vendors, ISVs, corporate Linux customers, and end users working out where the Linux cat herd will be going next. It should be an interesting week.

A version of this story first appeared in Linux-Watch.

April 4, 2008
by sjvn01

Microsoft Gives Up on Vista

The question now isn’t “Is Vista dead?” It is. The real question is: Can Microsoft get Windows 7 out in time to save its desktop domination? I think Microsoft could pull it off. Here’s how.

Vista is dead.

That’s not what Bill Gates said at a seminar on corporate philanthropy in Miami on April 4, but it might as well have been. What Gates actually said, according to the Reuters report, is that he expects the next desktop version of Windows, Windows 7, to be released “sometime in the next year or so.”

Goodbye Vista. It has not been fun knowing you.

I predicted in January that Microsoft was giving up on Vista. It seems I was right. Microsoft’s own top brass hated Vista when it first came out; why should they expect anyone else to like it?

Vista SP1 has proven to be a painful upgrade, and its performance still lags behind XP SP2 and the still-unreleased XP SP3. Worse still, from a Microsoft executive’s viewpoint, Windows is actually losing desktop market share to Mac OS X and Linux. Microsoft never loses desktop market share. But with Vista, Microsoft is finally losing customers.

I think Microsoft saw the handwriting on the wall early on. The company started playing up Windows 7 as early as July 2007. Now, Microsoft’s business plan is always to get its customers to upgrade to the next version. It’s how they make their billions. But, in this case, Vista was barely out the door.

Can Microsoft actually ship a Windows 7 by 2009 that will win customers? Vista was infamous for its blown deadlines. Windows 7 must not only replace the failed Vista; it also has to convince Microsoft’s customers that it will really be better than XP.

That isn’t going to be easy. I find it more than a little telling that Microsoft has given XP Home a new lease on life for UMPCs (ultra-mobile PCs). Still, I think Microsoft has one card up its sleeve that just might keep its customers happy and make it out the door in 2009: Server 2008 Workstation.

In stark contrast with Vista, Server 2008 works extremely well in eWEEK Labs and in my own Linux-dominated office. Even with some security troubles, Server 2008 is a darn sight better than Vista or Server 2003.

Cleaned and Speeded Up

So, what Microsoft could do is use Server 2008’s kernel as the core of Windows 7. On top of that, it could add a cleaned-up and speeded-up Aero Glass interface, Silverlight and Internet Explorer 8. At the same time, Microsoft should dump the Vista user-interface command structure and return to XP’s.

One reason people don’t like Vista is that it’s not only slower than XP; it also forces them to relearn how to do bread-and-butter operations. While Microsoft is at it, it can also throw out such annoying ‘Vistaisms’ as the seemingly endless prompts asking users whether they really want to install a program or what have you.

To make darn sure that Windows 7 doesn’t have the software compatibility problems that still plague Vista SP1, Microsoft can also add an XP compatibility layer. This would actually be an XP VM (virtual machine) running under Server 2008’s Hyper-V virtualization. If an application doesn’t run on the native Server 2008 core, no problem; just automatically run it in the XP VM.

Old Windows hands will recall that Microsoft once used a similar approach in Windows NT 3.5, whose WOW (Windows on Windows) subsystem let users run 16-bit Windows 3.x applications on NT.

If Microsoft were to take this path, I can actually see the company delivering a new desktop operating system by 2009 that users would actually want to use. If they try, as they did with Vista, to reinvent the desktop operating system wheel, there’s no way they’ll get anything out until 2011 that users will want to run.

And, by then, Microsoft’s problem may be convincing Linux and Mac OS users to come back to Windows rather than trying to get XP users to upgrade.

A version of this story first appeared in eWEEK.

April 4, 2008
by sjvn01

XP Home Lives, and So Does Linux, on UMPCs

When I thought Microsoft was going to extend XP’s lifetime to better slug it out with Linux on Ultra Mobile PCs and Mobile Internet Devices, I was afraid Linux was going to have to fight hard for the low end of the desktop market.

Now that we know that only XP Home is going to have a longer life, and that Microsoft is going to have to contort itself over which systems can and can’t get it, I’m much happier. XP Pro living on would have been much more troublesome, to my mind.

You see, while XP Home will keep, well, the home users happy, it has always been useless for businesses. It all comes down to one simple fact: you can put XP Pro on a business network’s Windows domain, but not XP Home.

As my colleague Joe Wilcox over at Microsoft-Watch put it: “Ultra low-cost PCs and MIDs aren’t Windows PC companions, they’re replacements—for many end users. And Linux will deliver the enterprise capabilities lacking in Windows XP Home.”

Exactly.


April 3, 2008
by sjvn01

Another SCO buyout stumbles

Like the 11th chapter of a bad horror movie, the SCO zombie keeps stumbling forward moaning “Linux,” instead of “Brains.” This time, however, there may not be another movie left for SCO. SCO’s most recent would-be buyer, Stephen Norris & Co. Capital Partners LP, no longer wants to buy SCO.

SCO has staggered on since August 2007, when the U.S. District Court in Salt Lake City ruled that Novell, and not SCO, owned UNIX’s intellectual property. Thus, with the legs cut out from under all of its Linux cases, SCO filed for Chapter 11 bankruptcy in mid-September 2007.

In late October, SCO claimed it had found a buyer, York Capital Management. That proposed deal fell apart barely a month later. By year’s end, SCO had dropped off the Nasdaq.

You’d think that would be the end, wouldn’t you? No case, no money, no buyer, and no more stock-market presence. Alas, no. On Valentine’s Day, SCO showed that even it could still get a little fiscal love (up to $100 million worth, if you count the credit line) from Stephen Norris & Co.

In return for $5 million in cash, and loans of up to $95 million, Stephen Norris would buy SCO, move the company out of bankruptcy, take it private, and, of course, continue its seemingly endless and pointless Linux litigation.

It wasn’t a done deal, although SCO’s ownership agreed to both sell the company and fire long-time CEO Darl McBride. First, the proposed sale had to pass the scrutiny of the U.S. Bankruptcy Court in Delaware.

The deal didn’t make it that far. At the Bankruptcy Court on April 2, according to a report by Steven Church of Bloomberg News, SCO attorney Arthur Spector said the Stephen Norris deal was now off. “We don’t have a new deal. But, when we get the deal that we think we are going to get, it’s going to be better.”

Instead of buying SCO the company, Stephen Norris & Co. now wants to buy the Lindon, Utah, firm’s business assets. Of course, the largest remaining question, in both Novell’s lawsuit in Utah and in the Delaware bankruptcy court, is what assets, if any, the company has left and who legally owns them.

In short, what could SCO possibly have left to sell that could, according to a report in Groklaw, leave SCO with “plenty of cash to pay their creditors with interest and have cash or cash reserves for paying future debts”?

There were no details given about this latest proposed deal. The Groklaw on-the-scene report continued that SCO’s attorney said he was “waiting until the deal is done before making full disclosures. He’s not going to leave any loose ends this time. He mentions that SCO may need an extension to get everything prepared and do due diligence.”

The Groklaw report also stated that Joseph McMahon Jr., an attorney with the U.S. Trustee’s office, said “that the former York deal was basically a plan to sell IP that they [SCO] didn’t own and he wants to make sure that this new deal is clear about what is being sold since it is an asset purchase.”

So it is that, now without a deal in place, SCO continues to stumble forward. Whether the company will still be around to make its next court date remains an open question.

A version of this story was first published in Linux-Watch.

April 1, 2008
by sjvn01

LPI to offer security training with LPIC-3 certification

There are Linux certifications, such as the RHCE (Red Hat Certified Engineer). There are security certifications, like the CCSP (Cisco Certified Security Professional). Now we have the first certification that combines Linux and security: the Linux Professional Institute’s LPIC-3 with its new “Security” exam elective.

The LPI certification, unlike the Red Hat and Novell certifications, is vendor-neutral. The long-delayed, top-level LPIC-3 arrived in 2007.

To obtain this certification, roughly equivalent to the RHCE or the NCLE (Novell Certified Linux Engineer), a Linux administrator must have already achieved the LPIC-1 and LPIC-2 certifications. In addition, he or she must pass the enterprise-level core certification exam (LPI-301) and a “Mixed Environment” elective (LPI-302). According to the LPI press release, “In 2008 LPI will develop a ‘Security’ elective (LPI-303) to accompany this level of the program. The introduction of this advanced-level ‘Security’ elective follows extensive consultation with industry leaders and enterprise customers on priorities for the LPIC-3 program.”

While Linux is known for being secure, security is not an operating system feature; it’s a process. And while recent major security breaches, such as the Advance Auto Parts network break-in and the Hannaford credit card heist, had nothing to do with Linux, companies can be expected to want more security training for their administrators.

Development of the LPI-303 Security exam has already begun. The focus will be on dealing with real-world security threats, according to LPI. After the exam has been developed to an alpha stage, LPI and its corporate partners will give it a “global Job Task Analysis survey” during the summer of 2008.

This will be followed by a series of beta exams from October to November 2008. If all goes well, the course will be published and made part of the LPIC-3 certification in February 2009.

April 1, 2008
by sjvn01

Q: Who Really Creates Linux? A: The Enterprise

Some people are still under the delusion that Linux is written by unwashed hackers living in their parents’ basements, whose only social life is playing D&D, having flame wars over IRC (Internet Relay Chat) about whether vi or Emacs is better, and debating Picard versus Kirk. Nothing, nothing could be further from the truth.

The LF (Linux Foundation) has just released a new report, “Linux Kernel Development: How Fast It is Going, Who is Doing It, What They are Doing, and Who is Sponsoring It.” This comprehensive study of the last three years of Linux kernel development, from the 2.6.11 through 2.6.24 releases, reveals that the average Linux developer is being paid by a major corporation to develop Linux.

To be exact, between 70 and 95 percent of Linux developers over the last three years have been paid to work on Linux. According to the report, “More than 70 percent of total contributions to the kernel come from developers working at [such companies as] IBM, Intel, The Linux Foundation, MIPS Technology, MontaVista, Movial, NetApp, Novell and Red Hat.”

Over the years, the number of Linux developers has been increasing. Version 2.6.11 had only 483 programmers whose code actually made it into the kernel. The latest kernel, 2.6.24, had 1,057 developers. Over the last three years, 3,678 programmers have had their work included in Linux’s core.

That said, the report also stated that “despite the large number of individual developers, there is still a relatively small number who are doing the majority of the work. Over the past three years, the top 10 individual developers have contributed almost 15 percent of the number of changes and the top 30 developers have contributed 30 percent.”

In fact, the top five developers, Al Viro (1.9 percent of the total percentage of changes to the kernel); David Miller (1.8 percent); Adrian Bunk (1.7 percent); Ralf Baechle (1.6 percent); and Andrew Morton (1.5 percent), alone accounted for 8.5 percent of Linux’s recent code changes.

Of all the developers, 74.1 percent work on Linux for their companies. Of the rest, many programmers (the 12.9 percent with unknown employers) made 10 changes or fewer to the kernel. Only 13.9 percent of developers were clearly working on Linux as a hobby.

So, while Linux does have a substantial contribution being made to it by amateurs, the vast bulk of it is being written by corporate programmers. The companies that are building Linux, in order of their contributions to the kernel, are:

1) Red Hat, 11.2 percent
2) Novell, 8.9 percent
3) IBM, 8.3 percent
4) Intel, 4.1 percent
5) LF, 3.5 percent
6) SGI, 2.0 percent
7) MIPS Technology, 1.6 percent
8) Oracle, 1.3 percent
9) MontaVista, 1.2 percent
10) Linutronix, 1.0 percent.

In addition, consultants’ efforts have accounted for 2.5 percent of the total work on Linux.

The authors of the study, Linux kernel developers Jonathan Corbet and Greg Kroah-Hartman, and Linux Foundation Director of Marketing Amanda McPherson, also note that, “What we see here is that a small number of companies are responsible for a large portion of the total changes to the kernel. But there is a ‘long tail’ of companies which have made significant changes.”

They also point out in the study that “none of these companies are supporting Linux development as an act of charity; in each case, these companies find that improving the kernel helps them to be more competitive in their markets.”

Besides Linux distributors, like Red Hat, Novell and MontaVista, where the profit motive is clear, the study also finds that “companies like IBM, Intel, SGI, MIPS, Freescale, HP, etc. are all working to ensure that Linux runs well on their hardware. That, in turn, makes their offerings more attractive to Linux users, resulting in increased sales.”

Other businesses that work on developing Linux, “like Sony, Nokia, and Samsung ship Linux as a component of products like video cameras, television sets, and mobile telephones. Working with the development process helps these companies ensure that Linux will continue to be a solid base for their products in the future.”

It’s not just IT companies these days that are working on improving Linux. For example, the study’s writers state that “The 2.6.25 kernel will include an implementation of the PF_CAN [Controller Area Network] network protocol which was contributed by Volkswagen. PF_CAN allows for reliable communications between components in an interference-prone environment, such as that found in an automobile. Linux gave Volkswagen a platform upon which it could build its networking code; the company then found it worthwhile to contribute the code back so that it could be maintained with the rest of the kernel.”
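
For readers curious about what that contribution looks like from user space, here is a minimal sketch in C of sending a single frame through a PF_CAN raw socket using the SocketCAN interface the study describes. The interface name “can0” and the frame ID 0x123 are illustrative assumptions on my part, not details from the study or from Volkswagen’s code.

    /* Minimal PF_CAN sketch: open a raw CAN socket, bind it to an
     * interface, and send one frame. Assumes a configured "can0". */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <sys/ioctl.h>
    #include <net/if.h>
    #include <linux/can.h>
    #include <linux/can/raw.h>

    int main(void)
    {
        /* Open a raw socket in the CAN protocol family. */
        int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
        if (s < 0) {
            perror("socket");
            return 1;
        }

        /* Look up the index of the CAN interface (assumed name). */
        struct ifreq ifr;
        memset(&ifr, 0, sizeof(ifr));
        strncpy(ifr.ifr_name, "can0", IFNAMSIZ - 1);
        if (ioctl(s, SIOCGIFINDEX, &ifr) < 0) {
            perror("ioctl");
            close(s);
            return 1;
        }

        /* Bind the socket to that interface. */
        struct sockaddr_can addr;
        memset(&addr, 0, sizeof(addr));
        addr.can_family = AF_CAN;
        addr.can_ifindex = ifr.ifr_ifindex;
        if (bind(s, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            perror("bind");
            close(s);
            return 1;
        }

        /* Build and send one frame: ID 0x123, two data bytes. */
        struct can_frame frame;
        memset(&frame, 0, sizeof(frame));
        frame.can_id = 0x123;
        frame.can_dlc = 2;
        frame.data[0] = 0xDE;
        frame.data[1] = 0xAD;
        if (write(s, &frame, sizeof(frame)) != sizeof(frame)) {
            perror("write");
            close(s);
            return 1;
        }

        close(s);
        return 0;
    }

Because PF_CAN rides on the standard BSD socket API, the same socket(), bind() and write() calls that drive Internet networking also drive in-car networking, which is much of why maintaining the code inside the mainline kernel made sense.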

So, since your typical Linux developer is more likely to be a full-time, upper-middle-class software engineer, why does the FUD about Linux developers still hang on? McPherson believes it’s because “it’s difficult for most people to get their minds around competitive mass collaboration. It’s obviously a huge shift from the command and control models of old. Most people seem to have a hard time understanding that companies will pay people to work on software that their competitors use and profit from.”

McPherson continued: “People are still stuck in the zero-sum game of the past. But now, with papers like this, they can see that companies who support open source actually profit through a shared R&D cost. The myth persists, but as open source is becoming more common than proprietary development, I think you’ll see a shift in understanding.”

A version of this story was first published in Linux-Watch.