Hilarious Windows 98 Review!

This past summer, Microsoft released Windows 98 to a far more muted fanfare. Instead of prompting people to “Start it up,” the update to Windows 95 modestly promised tighter integration with the Web, a few performance tweaks, a slew of bug fixes, and an almost imperceptibly improved design. Because Win 98 looks a lot like its predecessor, and because many of its enhancements are actually found in Microsoft’s separately available Internet Explorer (IE) 4.01 Web browser, moving from Win 95 to 98 isn’t nearly the profound experience the previous upgrade was.

Of course, that makes the decision of whether to upgrade to Win 98 even tougher. To help you decide, we’ve gone under the hood to uncover the differences between Windows 95 and its newer, spiffier sibling. We point out what’s been improved, what remains the same, and what’s still sorely lacking. Then we discuss whether you should pay the $90 and move to Windows 98 right away, or wait until you buy your next computer and get Win 98 preinstalled.

A New Coat of Paint

At first glance, the new version of Windows resembles the old. In fact, you have to look carefully to spot any cosmetic differences, although Windows 98 does add a few more icons to the default desktop. Along with the My Computer, Recycle Bin, and shameless Setup the Microsoft Network icons, we found other icons that can help you with your work. For instance, someone at Microsoft wised up and placed the My Documents folder, where we like to keep our Word and Excel files, right on the desktop and in the Documents menu in the Start pop-up menu. If you use Microsoft Office 97 under Windows 95, the office suite creates the My Documents folder on your hard disk, but you have to launch the Windows Explorer file manager to find it. The new operating system puts the default file-save stash in plain view for quick and easy access. Pretty smart.

However, for some reason, Microsoft still keeps Windows Explorer–as opposed to Internet Explorer–off the desktop. For three years, each time we reviewed a new Win 95 PC, we had to drag and drop the Windows Explorer icon onto our desktop so we could launch the file manager whenever we wanted. You must do the same thing with Windows 98. Although you can now technically use IE as a file manager, we prefer to use Windows Explorer to navigate through the files on our hard disk.

The Taskbar in Windows 98 has been slightly tweaked. The Start button still resides in the left-hand corner, and in the right-hand corner you can still find the System Tools tray holding a digital clock plus icons for the programs automatically launched when you start up your PC. But next to the Start button, where Windows 95 displayed icons of your launched applications, you’ll now see what’s called a Quick Launch bar containing four small icons–three of them, as it happens, for the Microsoft Internet tools that have launched a flurry of Windows 98-related antitrust action. These include the IE 4.01 Web browser, the Outlook Express email program, and a special icon for viewing channels (Web sites to which IE 4.01 users can subscribe to automatically receive updated content).

Although we love the fourth mini-icon–for Show Desktop, a nifty tool that minimizes all your open applications with one click so you can swiftly return to a clean screen if interrupted by prying eyes–you may find the others intrusive. Because we’re dedicated Ecco Pro users, we deleted the Outlook Express icon from the Taskbar by right-clicking on it and choosing Delete. Conversely, if you’re a fan of Microsoft freebies, you can drag and drop icons from the Quick Launch bar onto the desktop.

Just Browsing, Thanks

Between the launch of Windows 95 and Windows 98, a seismic event called the World Wide Web greatly affected the way the new Windows works and looks.

Several key components in Windows 98 look and operate much like a Web browser. According to Microsoft’s usability studies, nontechnical users have an easier time using a Web browser than the rest of Windows itself. Armed with that info, the company’s programmers turned the new OS into a virtual browser.

Need an example? Look no further than Windows Explorer or Control Panel. These modules once resembled their stodgy Windows 3.1 File Manager and Control Panel counterparts: a box full of icons. Now Windows Explorer looks like IE 4.01, complete with browser-like Forward and Back buttons in the toolbar and an address line where you can type either a DOS-style file address, such as C:\Program Files\Quicken, or a Web address to jump to an online site.

Not only do Control Panel, My Computer, and Recycle Bin offer views resembling Web pages, they also include a smattering of HTML code to give you more information about the contents of folders along with their bare-bones property lists. Move the mouse pointer over a disk icon, for instance, and you’ll see how much free space the disk has. We especially liked the warning that appeared when we clicked on the Windows folder, stating that we could potentially cause programs to stop working correctly if we modified the folder’s contents.

Changing to a traditional icon or list view is as easy as clicking a pull-down menu, but the browser metaphor is certainly snazzier and more helpful than the cut-and-dried look of Windows 95.

Nothing But Net

Windows 98 shines when working with the Web. In our admittedly unofficial tests, Internet Explorer 4.01 felt more stable and less prone to crashes than previous releases of IE, just as the OS itself did compared with Windows 95. If the majority of your home office tasks take place on the Web, the newlywed pair of IE 4.01 and Win 98 is worth a long look.

But legions of Windows 95 users are already browsing with IE 4.01, available free for the downloading from Microsoft’s site (www.microsoft.com/ie) or third-party libraries such as c|net’s www.browsers.com. If you’d like to join them, be prepared to set aside at least 50MB of hard-disk space to store the new browser. Expect a slight performance hit as well–IE 4.01 isn’t ideal for slower, older PCs.

Call us free-market radicals, but we still prefer Netscape’s Navigator browser (www.netscape.com)–which, like Internet Explorer, is readily available for free download. If Navigator is your favorite way to maneuver through the Web, it’ll work fine with Windows 98, but you won’t be able to use the OS’s seamless file management/Web browsing features.

Although the integration of Windows Explorer and Internet Explorer is almost enough to make us give up Navigator, we’re not in the least tempted by the Active Desktop feature that represents Microsoft’s exceptionally pushy brand of push technology. By default, the Windows 98 desktop displays a vertical Channel Bar of Web-site icons. Most of these take you to Microsoft-related sites such as the Microsoft Network and MSNBC. A few featured sites don’t belong to Microsoft (Warner Brothers, America Online, Disney, and PointCast), but you can be sure those companies paid hefty fees to appear in the Channel Bar.

Automatic delivery of favorite-site content sounds great in theory, but Web searching in the background while you try to work in the foreground is a notorious system hog. Unless you have an ultra-fast Internet connection and at least 64MB of memory, your MMX Pentium PC may start to feel like the 386 desktop you bought eight years ago.

Let’s be honest: Does the world need another invitation to try a taste of AOL? We think the regular AOL disks that litter the environment are enticement enough to try the online service, and find the Channel Bar to be overkill if you’re accustomed to using Web browser bookmarks to check your favorite sites. Fortunately, you can turn off the channels and get to work.

Performance Anxiety

PC processors have grown many times faster over the past few years. Software hasn’t. In fact, applications have become heavier and more bloated with extra features, multimedia clips, and animated help files. And while you wait for Windows 95 to launch, you may have wondered why you paid for a PC with a blazing chip.

Microsoft has streamlined Windows 98 to load and shut down quicker, and tweaked Disk Defragmenter to rearrange application components on your hard disk to launch faster (after you’ve opened and closed them a few times for Win 98 to monitor their behavior). However, you’ll need an advanced Pentium II-class system and ample memory to take advantage of these performance boosts. If you have an older machine, you’re out of luck–Windows 98 will perform about the same as Windows 95.

For this review, we used a Gateway Solo 2500 LS notebook with a 233MHz Pentium II processor, 64MB of RAM, and Windows 98 preinstalled. Although we had no complaints about launching such typical sluggards as Microsoft Word 97 and Excel 97, we still had to endure a tedious wait when we powered up the notebook.

If you’re still waiting for the day when Windows appears before your eyes as quickly as the image when you switch on a TV set, you’ll need one of the new PCs starting to appear with special fast-boot BIOS ROM code. Or you’ll have to pay a slightly higher electric bill and adopt a desktop that, like the newest Compaq and Hewlett-Packard consumer models, can enter a notebook-style “sleep” mode instead of actually switching off. On such a computer, Win 98 can schedule tasks such as defragmenting the hard disk or downloading Internet pages while you sleep.

To improve performance in a broader sense, Windows 98 provides a ton of new drivers for up-to-date peripherals, which will be immensely helpful when you add hardware such as a DVD-ROM drive to your system. Some other new hardware-related capabilities are flashy, but less universally useful: Got a spare monitor? You can finally mimic the Macintosh desktop-publishing trick of using two displays simultaneously (for, say, your page layout and your toolbar palettes). Got an ATI Technologies TV tuner or All-in-Wonder card connected to your cable TV outlet? As of press time, you’re the only customer for Microsoft’s TV Guide-like WebTV for Windows 98 viewer.

By contrast, we’re truly impressed with the operating system’s support for the Universal Serial Bus (USB) standard, which lets you simply plug a device such as a scanner or printer into a USB port without having to install an internal SCSI card. With USB, the promised days of plug-and-play will have finally arrived. It’s about time.

Two other under-the-hood improvements that earn our applause are FAT32 support and the new Windows Update feature. The former replaces DOS’s ancient disk format with a 32-bit File Allocation Table–a more efficient formatting scheme that stores files in smaller chunks to reduce wasted or slack space on your hard disk.
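To see why those smaller chunks matter: every file occupies a whole number of clusters, so roughly half a cluster per file goes to waste. Here is a quick back-of-the-envelope sketch in generic C; the cluster sizes are the standard defaults for a 2GB FAT16 versus FAT32 partition, and the 10,000-file count is our own invented example:

```c
#include <stdio.h>

/* Back-of-the-envelope slack-space arithmetic. Every file occupies a
 * whole number of clusters, so on average about half a cluster per
 * file is wasted. 32K is the FAT16 default cluster size on a 2GB
 * partition; 4K is the FAT32 default. The file count is invented. */
int main(void)
{
    const long fat16_cluster = 32L * 1024; /* bytes per FAT16 cluster */
    const long fat32_cluster = 4L * 1024;  /* bytes per FAT32 cluster */
    const long files = 10000;              /* hypothetical file count */

    long fat16_waste = files * fat16_cluster / 2;
    long fat32_waste = files * fat32_cluster / 2;

    printf("FAT16 slack: about %ld MB\n", fat16_waste / (1024 * 1024));
    printf("FAT32 slack: about %ld MB\n", fat32_waste / (1024 * 1024));
    return 0;
}
```

On those invented numbers, converting reclaims roughly 140MB of otherwise wasted disk space.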

As for Windows Update, you can click its icon at the top of the Start menu and dial into a special Microsoft site where an Update Wizard automatically downloads and installs software updates, bug patches, and other assorted fixes to your operating system, drivers, and applications. Think of it as a regularly scheduled tune-up for your PC.

The Final Verdict

And now the real question: Should you upgrade to Windows 98? The answer isn’t crystal clear.

If you live on the Internet, enjoy the simplicity of using a Web browser, and daily move among and manage scores of downloaded HTML pages and files, then we wholeheartedly recommend making the move to the new Windows. For Webaholics, Windows 98 can help tame the onslaught of online data and speed up your computing chores.

Similarly, if you have a recent PC with USB ports or plan on upgrading your old one to take advantage of USB peripherals, then run, don’t walk, to Windows 98. It’s the ideal solution for avoiding potential technical problems down the road, especially when combined with Windows Update (PC, heal thyself). You’ll also find Microsoft’s performance tweaks have diminished the system crashes that plagued Windows 95, but you shouldn’t expect an entirely crash-free version of Windows–ever.

But if Windows 95 works fine for you and doesn’t crash often, and your Web activity is going well enough, then stay put: Windows 98 will be an insignificant upgrade. Microsoft says that any future software upgrades designed for Windows 98 will work under Win 95, and vice versa–including, we hear, the upcoming Office 2000 productivity suite.

If you still use Windows 3.1, you’re long overdue to take advantage of the current versions’ improved interface and the extra capacity of 32-bit computing, but the catch-22 is that your hardware likely isn’t up to Windows 98’s standards. For you, it probably makes more sense to take advantage of today’s thrifty PC prices by trading up to a brand-new notebook or desktop.

If you do buy a new system, the OS decision may be out of your hands: You’ll get Windows 98 rather than 95 preinstalled. This is the ideal way to graduate to a new operating system, because the PC manufacturer installs it from scratch and tests its performance with the hardware and software you’ll be using. With a new operating system on a virgin hard disk, the chances of clashes with leftover drivers, libraries, and other bits of remaining code from previous versions of Windows are near zero.

The bottom line on Windows 98? Microsoft has created a nice update to Win 95, but with nothing that screams “Buy me! Install me now!” as the latter did at its debut three years ago. Don’t get us wrong–Windows 98 will remain on some of our test systems. But for our mission-critical machines, we can wait for the inevitable bug fixes and add-ons that didn’t make it in the initial release. Windows 98 will be here for a while…what’s the hurry?

64-Bit Chips – Still Underutilized?

The solution, of course, is a disaster recovery and backup plan. But providing functional duplication of the key SP2s would mean incurring tremendous costs. So when IBM offered an alternative — its new 64-bit S70 — it took almost no time to make Holder a buyer.

“Economics was the key driver,” he says. “We brought in an S70 on a trial basis, and the loading of the apps was so much faster that it was obvious, even without tuning, the performance was there.”

Without bothering to do a detailed analysis, Holder concluded that a single S70 server would be sufficient to back up several SP2s and, down the road, would be able to scale well beyond current plans to increase capacity by installing even more SP2s in the future.

Now, the S70 hums away in a remote, unattended operation near Toronto, ready to shoulder the burden if necessary.

But while Holder clearly feels a 64-bit machine was the right solution for this particular problem, he, like other IT managers, is not yet ready to jump ship on tried-and-true 32-bit technology. More to the point, however, there are still precious few applications written to take full advantage of 64-bit addressing.

Indeed, as Dan Kusnetzky, program director for operating environments and serverware services at International Data Corp., observes, “it is almost as if vendors are rushing to 64-bit only to get there before some other vendor does, not because a mass of end users demanded it.”

For his part, Holder is counting on IBM’s plans to continue to expand the capabilities of the SP2 architecture. For now, though, he’s content knowing that 64-bit can deliver substantially higher performance, even with code that is not 64-bit native. And for his “worst case-only” app, that’s still a good enough fit.

Remortgaging The Future

While Holder has chosen to limit his use of 64-bit computing, a small cadre of early adopters has taken the next step — investing in 64-bit Digital AlphaServers as a cornerstone of future operations. One of the more high-profile examples is Fannie Mae. Established in 1938 under the appellation Federal National Mortgage Association, Fannie Mae is a private corporation, federally chartered to provide financial products and services that increase the availability and affordability of housing for low-, moderate-, and middle-income Americans.

By one measure, Fannie Mae has the distinction of being the largest corporation in the U.S., with $366 billion in assets and an additional $674 billion in mortgage-backed securities outstanding. The company is one of the largest issuers of debt after the U.S. Treasury and its stock is traded on the NYSE.

What makes Fannie Mae successful is, in part, its ability to accurately predict which borrowers will be reliable debtors and which will be likely to default. “Whoever analyzes the risk best can price best,” says Eric Rosenblatt, senior credit analyst, and better pricing in turn increases the size and productivity of the loan portfolio. “On the margins you will end up much less relaxed about taking a deal when your equations are producing less specific results,” he says.

The problem is not unlike that described by Coleridge’s Ancient Mariner — there’s water everywhere but nothing to drink. Fannie Mae has more than 10 million loans, each of which comes with complex transactional records. But slogging through that sea of data takes lots of processing horsepower. The problem is aggravated by the type of processing that needs to be done, where the questions are nearly always open-ended and the answers are difficult to predict.

“The statistical work we do is an iterative process,” says Rosenblatt. “There is no closed-form solution, you just keep making estimates and getting closer to the point where you feel comfortable with the answer. We look at the relationships among factors such as the amount of down payment, the borrower’s credit history, and the likelihood of default,” he says. “In practice it is very complicated to isolate this relationship out of a huge economic pie.”
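For flavor, here is a toy sketch of the iterative approach Rosenblatt describes. It is emphatically not Fannie Mae’s model or data: just a one-variable logistic fit, refined by repeated gradient steps until the estimate stops moving, with every number invented.

```c
#include <stdio.h>
#include <math.h>

/* Toy iterative estimation: fit the slope b in a one-variable
 * logistic model P(default) = 1 / (1 + exp(-b * x)) by gradient
 * ascent on the log-likelihood. There is no closed-form solution;
 * you refine the estimate until it stops changing. Data invented.
 * Build: cc toy.c -lm */
int main(void)
{
    double x[] = { 0.60, 0.80, 0.90, 0.95, 0.70, 0.97 }; /* loan-to-value */
    int    y[] = { 0,    0,    1,    1,    0,    1    }; /* defaulted? */
    int n = 6;
    double b = 0.0;

    for (int iter = 0; iter < 10000; iter++) {
        double grad = 0.0;
        for (int i = 0; i < n; i++) {
            double p = 1.0 / (1.0 + exp(-b * x[i]));
            grad += (y[i] - p) * x[i];   /* d(log-likelihood)/db */
        }
        b += grad;                       /* one more estimate */
        if (fabs(grad) < 1e-9)           /* "comfortable with the answer" */
            break;
    }
    printf("estimated slope b = %.4f\n", b);
    return 0;
}
```

The real models juggle dozens of factors across millions of loans, which is exactly where the memory and addressing headroom of 64-bit systems pays off.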

Why has Fannie Mae invested in 64-bit processing? Because in the financial industry the ability to make rapid and reliable decisions confers competitive advantage. “We want to be one step ahead of the competition,” says Rosenblatt.

Fannie Mae has come a long way. “In the past this data resided on mainframes and was not easily accessible to our economists and policy makers,” says Mary Ann Francis, director of database systems development. “We sometimes spent weeks spinning through tapes to gather information about a particular cohort of loans for research. On the AlphaServer we’re able to put as much of this data online as we want. For researchers this means faster, more flexible SAS System queries — and the ability to look across multiple databases.”

“For my own work,” adds Francis, “the most striking benefit is the speed of database administration, thanks to the parallel-processing features of Alpha and Oracle. Files that we once had to load overnight on the mainframe now load in two to three minutes during the day. Now we load 60-million-row source files into our database table in 28 minutes. It has truly changed our world.”

Fannie Mae’s search for an alternative to its mainframe focused on three key criteria: performance, availability, and affordability. “Our new systems had to support parallel processing and relational databases,” says Francis. “For the database, we did a structured competitive benchmark, and Oracle won based on company strengths and test execution. After that, Digital did a benchmark for us of Oracle on Alpha.” “In some cases,” she says, “the AlphaServer produced results that were an order of magnitude faster than other platforms.”

Implementation went smoothly. For example, says Rosenblatt, more than 95% of Fannie Mae’s SAS code was migrated without change from the mainframe. “It took just three months to get the system running and integrated into our environment, our systems people trained, and the data loaded and accessible to users,” adds Francis.

Server Connections Needed

Where the rubber meets the road, in the Wild West world of the ISP, there is also a glimmer of interest in 64-bits — sometimes because of scalability issues and sometimes simply because of the need to squirt lots of data out to users.

SBC Internet Services (formerly Pacific Bell Internet), which came in “tops in overall customer care service quality” among U.S. ISPs, according to a recent benchmarking report by the Washington Competitive Group LLC, has been wrestling with runaway growth. Currently, SBC provides Internet access services to more than 300,000 business and residential customers — a customer base it wants to continue to expand at a high rate.

Ensuring continued growth and those great service marks is, in part, the responsibility of Chris Torlinsky, senior systems manager. “Our requirement is the ability to scale to millions of users,” he says. “In order to do that we will be stretching the limits of our 32-bit computers. When it comes to data-intensive applications, like a database, there is a need for 64-bits, if not right now then in the very near future.”

Currently, Torlinsky is waiting for his 32-bit supplier, Sun Microsystems, to provide for full 64-bit processing. “Maybe in a year we will be on 64-bit systems,” he says. Although Torlinsky says SBC does use some formal metrics for making major decisions — like a 64-bit migration — he says some things are still decided on a seat-of-the-pants basis. And this may be one of them.

“We may decide very quickly that some applications have to go to 64-bits but for the foreseeable future I don’t think we will decide to move everything to 64-bit,” he says. Indeed, Torlinsky says one of the positive attributes about Sun’s offerings, which helped shut out other vendors, is that customers can continue to choose either the 32- or 64-bit option on 64-bit hardware. “With Sun, if you want compatibility with 64-bits, it’s just a matter of choosing it,” he says.

That kind of freedom — to choose 64-bits for select apps while keeping tried-and-true 32-bit apps elsewhere — seems to be what many IT managers wish they could get as they contemplate the potential perils of migration.

Clearly, 64-bit computing is making its way into the mainstream. But enthusiasm for 64-bits can be misplaced, warns Jonathan Eunice, an analyst with Nashua, N.H.-based Illuminata. “64-bits is just one of maybe a dozen major engineering variables a technical evaluator should examine as part of an enterprise server purchase,” he says. Things like I/O channel, the system bus, clustering, and operating system multithreading can be just as important.

“You need to look at deliverable performance through all the software layers,” says Eunice. And that, he adds, is best done by measuring hoped-for outcomes by testing your apps with custom benchmarks. Only then, he argues, can you understand the end-to-end performance characteristics of 64-bit when compared to more mainstream options.

32-bit or 64-bit?

A move to a 64-bit hardware and operating system platform isn’t for everyone. But for those with specific high-performance requirements it can be the answer. Here’s what to expect from 64-bit computing:

* With 64-bits you have the ability to cache much more in memory. For applications such as data warehousing that means queries will run much faster. Similarly, with OLTP applications you can have much more concurrency among users without requiring additional servers and overhead.

* 32-bit machines are generally limited to 2Gb of directly addressable memory — ample for run-of-the-mill computing but not for large mission-critical or highly memory-intensive applications such as data warehousing.

* With 64-bits there is no longer a 2Gb limit on “chunk size”–the size of a file that can be loaded (see the sketch after this list for where these limits come from).

* Although 32-bit machines can still handle most of the tasks that a 64-bit machine might tackle, they will usually require more CPUs and/or more individual servers–potentially increasing initial purchase costs, as well as ongoing support and maintenance.
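A minimal sketch of where those ceilings come from (generic C, independent of any vendor’s platform): the 2Gb figures correspond to the largest signed 32-bit quantity, 2^31 - 1, while a full 32-bit address space spans 2^32 bytes, or 4Gb.

```c
#include <stdio.h>
#include <inttypes.h>

/* The arithmetic behind the 2Gb and 4Gb ceilings: a signed 32-bit
 * file offset or size stops at 2^31 - 1 bytes, a 32-bit pointer
 * reaches 2^32 bytes, and a signed 64-bit quantity raises the
 * ceiling by a factor of roughly four billion. */
int main(void)
{
    printf("32-bit signed limit : %" PRId32 " bytes (~2Gb)\n", INT32_MAX);
    printf("32-bit address space: %" PRIu64 " bytes (~4Gb)\n",
           (uint64_t)UINT32_MAX + 1);
    printf("64-bit signed limit : %" PRId64 " bytes\n", INT64_MAX);
    return 0;
}
```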

Building a Virtual Organization on 64-bits

California dreaming is becoming a reality in the Secretary of State’s office, where, according to CIO David Gray, they are well on their way to creating online analogs for every function they handle. “Within five years we want all our services to be on the Internet — everything from registering to become a notary to filing corporate documents or accessing the California state archives,” he explains.

“Web-enabling our public services stands to lower our costs by reducing manpower, office rental, paper forms, and processing and filing,” says Gray.

“As an engineer I understand the technology — the larger instruction size and registers of 64-bit,” says Gray. “I can’t say we have any metrics that would quantify whether there is a big difference between being 64-bit or 32-bit. My gut feeling is that most applications wouldn’t know the difference — unless they were doing something computationally intensive.”

Part of the formula for achieving Gray’s ambitious goal has been a flight from mainframes to 64-bit Digital AlphaServers running 64-bit Digital Unix and Oracle 7 for the enterprise database, as well as 32-bit Windows NT and Microsoft SQL Server.

But in some applications, the story is different. Gray, for instance, supports a distributed database with 20 million voter registration records. It connects to 58 counties and does regular data cleansing. “It runs very well on the AlphaServers,” says Gray.

Indeed, his office is in the process of expanding the system to provide an online repository for campaign and lobbyist disclosures. “And our intent is to make that available on the Internet,” he adds.

Auto Auction Tune-Up

While corporations like federal mortgage lender Fannie Mae, with its mounds of data needing to be crunched, represent a classic vendor argument for 64-bit computing, others have found that even comparatively modest IT operations can benefit from a bit boost. Take, as an example, ADT Automotive Inc. in Nashville, Tenn., one of the nation’s top three wholesale vehicle redistribution and auction companies.

“If it weren’t for our 64-bit system we would have had to scale back our applications or distribute them over multiple computers,” says Eddy Patton, the company’s project leader for systems and development.

ADT’s LION provides online information and supports the auctioning system. It was first implemented as part of a private intranet, but is now also available to 900 auto dealers around the country via the Internet. LION runs as part of an integrated 32-bit IBM AS/400 and 64-bit Digital Alpha system linking 28 auction sites and supporting up to 500 simultaneous log-ins.

Patton’s enthusiasm is undiminished despite the fact that he is currently running 32-bit NT as his operating system. Still, he says, having 64-bit hardware has provided overall performance benefits.

“Although ADT was an all-IBM and SNA shop prior to bringing in the AlphaServer, we have found that the integration of Windows NT and Alpha technology into our environment went very smoothly,” says Jim Henning, vice president of information services.

‘Gotta Have It’ – Eventually

Nicholas-Applegate, a global investment management firm with offices in San Diego, Houston, San Francisco, New York, London, Singapore, and Hong Kong, currently manages more than $31.6 billion in global assets in more than 60 countries. While nowhere near as large an operation as federal mortgage lender Fannie Mae, the company is finding that 32-bit computing has limitations.

Unix Systems Manager David Buckley says he and his shop are eagerly awaiting the availability of a full-fledged 64-bit hardware and OS platform from Sun, their long-time hardware vendor. According to Buckley, his database indexes can’t grow much larger with Sun’s current Solaris 32-bit operating system. “We really need the ability to have files bigger than 2Gb,” he says.

Buckley says he is happy to follow the forward-migration path set out by Sun. “In our part of the financial market,” he says, “most of the applications are written for Sun. We have been married to them for a while and our staff is more Sun-literate than they are on any other Unix.” Those factors, he says, kept them from even considering another vendor for the 64-bit transition.

Full 64-bit functionality, including the kernel, will be available with Sun Solaris 2.7, probably before the end of the year, Buckley says. Then, the company can begin to explore 64-bit computing in earnest, he adds.

Achieving Higher Performance

What is meant by 64-bit and what are the risks and rewards of adopting this technology? According to Informix Corp.’s Nasir Ameireh, a senior product marketing manager for Informix Dynamic Server (IDS), there are some fundamental concepts you must grasp and steps you should take when considering implementation.

For instance, he notes that 64-bit isn’t just a CPU with a large word size or the ability to fetch more instructions per cycle. Nor is it simply a matter of data bus performance or choosing 64-bit applications or compilers. It’s all of these things and more. You can’t have a 64-bit application without a 64-bit compiler and an operating system to support the results. In short, you need it all.

What’s more, the true value of 64-bit systems may not appear until all of these things are present because all are interdependent.
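One concrete way to see that interdependence (a throwaway check in generic C, our illustration rather than anything Ameireh prescribes): the same source code yields different type sizes depending on whether the compiler and OS target a 32-bit ILP32 data model or a 64-bit LP64 one.

```c
#include <stdio.h>

/* Report the data model the compiler targeted. A typical 32-bit
 * build (ILP32) prints 4/4/4 for int/long/pointer; a 64-bit Unix
 * build (LP64) prints 4/8/8. Without a 64-bit compiler and OS,
 * "64-bit hardware" alone changes nothing here. */
int main(void)
{
    printf("int: %zu bytes, long: %zu bytes, pointer: %zu bytes\n",
           sizeof(int), sizeof(long), sizeof(void *));
    return 0;
}
```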

Some, he notes, raise interesting issues: If it is possible to stuff more instructions into a single fetch, how do you handle out-of-order execution? What is the impact on cache performance? How good is the compiler at improving this? This, he says, gives rise to the need for, and importance of, 64-bit standards such as Intel’s IA-64 and the 64-bit architectures from Digital/Compaq, Sun, and IBM.

To take advantage of 64-bit hardware and operating systems, you need a minimum of 4Gb of memory. Why? Because 32-bit systems can already address 4Gb of memory. Moving to 64-bit hardware and operating systems with less than 4Gb of memory, and without a complete plan for database engines, applications, and system tuning, could actually result in poorer performance, Ameireh explains.

Furthermore, he says, it is important to determine whether your applications need to be 64-bit, and you should have a plan– including a test plan — in place. If you decide to stay with 32-bit applications you should understand how they will exist in future mixed or all 64-bit environments. It is also important to understand the impact of 64-bit on third-party products which need to be “64-bit safe” or “64-bit enabled.”

CE Was An Interesting Beast

“The best fit for CE today is in places where there’s a visual user interface,” said Paul Zorfass, lead embedded analyst at First Technology Inc. He cited consumer-electronics applications, along with smart phones, vehicle navigation systems and designs requiring interapplication connectivity.

Paul Rosenfeld, director of marketing at Microtec, believes the “bottom line” is that “there are some apps that CE does really well now; there are some that it might do better in a year, when CE 3.0 comes along; and there are some that it’ll never do well.”

CE 3.0 will be best-positioned for broader design wins if it can meet specific technical targets, analyst Zorfass said. “Microsoft has to do two things: CE 3.0 has to have a good level of determinism, and it has to be configurable.” The latter would mean a truly modular implementation, probably including source code, which would enable customers to pick and choose the pieces of the OS they want to use.

Despite the paucity of CE-based deeply embedded apps so far, many industry players believe it’s a mistake to dismiss the OS, because Microsoft and its partners are still priming the pump.

Indeed, to get momentum behind CE, Microsoft has strung together an unusual series of partnerships and marketing arrangements in the RTOS community. For example, Integrated Systems is offering support services for CE. Yet the company competes with CE via its own pSOS RTOS.

Most vendors explain such fence-straddling approaches by stating their belief that CE won’t encroach on their bread-and-butter, deeply embedded applications. “CE 3.0 will be harder, as opposed to hard, so it addresses a bigger piece of the market,” said Microtec’s Rosenfeld. “But there’s still a big difference between where [Microtec’s] VRTX is targeted and where CE is aimed.”

At this week’s conference, however, CE will get a push that could prod it into one such arena: complex, ASIC-based embedded designs.

In that regard, Microtec will announce a deal with ARM under which Microtec will provide an integrated tool kit to enable deployment of CE on custom ARM-based embedded applications.

More important, the tools are intended to work in a hardware/software co-development environment. That’s the kind of setup engineers are likely to use with ARM, argues Microtec.

“There are some unique challenges, such as debugging, that engineers face in deploying CE on custom ASIC hardware,” said Rosenfeld. “In the traditional X86 world, you have standard boards. However, in the ARM environment, you don’t even have standard processors; you have core-based ASICs.”

Indeed, the use of application-specific processors is becoming the rule rather than the exception in the embedded world. Along with ARM, rivals include core-based offerings built around the MIPS Technologies architecture. Also in the picture are processors powered by Hitachi’s SH3 and SH4 designs, along with PowerPC and X86 implementations. On the software front, all are supported by CE.

Applied Microsystems Inc. (Redmond, Wash.), for one, is addressing deeply embedded designs based on PowerPC, X86 and SH with its new Foundation-CE development suite (see Oct. 19, page 44). The company will highlight the suite at this week’s conference.

Indeed, with so many platforms in play, the industry is realizing that serious embedded developers working in joint software/hardware environments won’t be able to get by with old-style software tools. “There’s multiple layers here,” Rosenfeld said. “If all you’re going to do is build apps, you can get away with a $500 tool. However, if you’re building your own board, then you’ve got a problem, because those tools don’t have the foundation for hardware debug.

“Looking further ahead, if you’re going to do your own ASIC, you have another level of challenge, because you need $50,000 ASIC design tools.”

While support for CE on embedded CPUs is important if the OS is to gain ground, there’s another key element in the equation. Specifically, Microsoft must improve the real-time performance of CE 3.0. Yet even Microsoft acknowledges that CE may never deliver the guaranteed, ultrafast interrupt response times (features known as determinism and low latency) that are the equal of the best of the traditional microkernels.

That is one reason Microsoft isn’t pushing CE for the toughest of the hard real-time apps but is pulling back half a step to tout the OS for many, but not all, applications. At the same time, Microsoft has expanded the discussion that’s long been used to quantify interrupt response.

Indeed, Microsoft’s Barbagallo argues that two numbers must be taken into consideration: “There’s the time it takes when an interrupt comes in. Then there’s the time it takes to respond to that interrupt.”

The time it takes for CE to acknowledge that an interrupt has come in is under 10 microseconds, Barbagallo said. “On that playing field, we’re exactly the same as all the other RTOSes. Where we in fact are slower is in the time it takes to register that interrupt if you want the OS to do something with it. In that scenario, we’re under 50 microseconds.

“In contrast, an RTOS like QNX or pSOS probably takes on the order of 10 to 15 microseconds to flip it over to the OS.”
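To make Barbagallo’s two numbers concrete, here is a minimal user-space analogy in portable C (POSIX threads; a sketch of the pattern, not Windows CE kernel code). The first stage merely latches the event, corresponding to the quick acknowledgment; the second stage, a service thread, wakes up to do the real work, corresponding to the longer time it takes before the OS actually responds.

```c
#include <pthread.h>
#include <stdio.h>

/* Two-stage interrupt handling, simulated with a condition variable.
 * Stage 1 (the "ISR") only records that the event happened; stage 2
 * (the service thread) wakes up and does the real work. RTOS vendors
 * quote two latencies: time to reach stage 1, and time until stage 2
 * actually runs. Build: cc sketch.c -lpthread */
static pthread_mutex_t lock  = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  fired = PTHREAD_COND_INITIALIZER;
static int pending = 0;

static void isr_stage(void)            /* stage 1: acknowledge only */
{
    pthread_mutex_lock(&lock);
    pending = 1;                       /* latch the interrupt */
    pthread_cond_signal(&fired);       /* hand off to the service thread */
    pthread_mutex_unlock(&lock);
}

static void *ist_stage(void *arg)      /* stage 2: do the real work */
{
    (void)arg;
    pthread_mutex_lock(&lock);
    while (!pending)
        pthread_cond_wait(&fired, &lock);
    pending = 0;
    pthread_mutex_unlock(&lock);
    printf("interrupt serviced\n");    /* device-specific work goes here */
    return NULL;
}

int main(void)
{
    pthread_t ist;
    pthread_create(&ist, NULL, ist_stage, NULL);
    isr_stage();                       /* pretend the hardware fired */
    pthread_join(ist, NULL);
    return 0;
}
```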

Still, Microsoft is giving no quarter to its competitors, noting that CE can use hardware-trapped interrupts for specific cases in which immediate service is mandatory.

RTOS counterpunch

Not surprisingly, Microsoft’s rivals don’t see things that way.

With CE 3.0, Microsoft will be competing most aggressively against two specific RTOSes: VxWorks, from Wind River Systems, and QNX, from QNX Software Systems Ltd. For their part, both vendors will provide their own respective counterpunches at this week’s confab.

Wind River will come to the conference to showcase a major new release of its real-time development environment (see related story, page 80). Dubbed Tornado II, the integrated development environment comes with a new debugger and with integrated simulation capabilities to support hardware/software co-development. It’s also closely coupled to the company’s VxWorks real-time operating system.

Tornado II consists of three highly integrated components: the Tornado suite of tools, a set of cross-development tools and utilities running on both the host and the target; the VxWorks run-time system, which is billed as a high-performance, scalable real-time operating system that executes on the target processor; and a full range of communications software options.

“Tornado II provides a complete framework for our new networking and graphics environments,” said Wind River president and chief executive officer Ron Abelmann.

In addition, the company recently launched a major push into graphical user interfaces for embedded applications, with the release of eNavigator and HTMLWorks. The two GUI-development tools are built around hypertext markup language (HTML) technology.

That strategy directly addresses another debate that’s roiling the embedded arena: how to support GUIs. Previously, many resource-constrained embedded apps had displays that were rudimentary at best. Now, fancy display-based interfaces are the order of the day.

To meet developers’ needs, vendors are pursuing a couple of approaches. One tack assumes that embedded developers are better off with a fixed (and often heftier) set of GUI resources. The other approach gives developers the option of mixing and matching GUI components, the better to meet the parameters of resource-constrained designs.

For example, one of the raps against CE is that it is too bulky because it includes a full GUI (see related story, page 79). Microsoft disputes that assertion, claiming CE is completely customizable.

“If you want to build a full-up user interface that looks like Windows, then that takes up a fair bit of code, whether you build it with CE’s primitives or with Zinc primitives,” said Microsoft’s Barbagallo. “There’s no getting around that. However, if you want a smaller, specialized interface (maybe your application uses an LCD display), then you can do it with a very small amount of CE code.

“There’s fundamentally no difference there between us and our competitors; it’s just software functionality,” Barbagallo added.

To prove its flexibility, Microsoft will showcase at the embedded conference a Unisys bank-check scanner based on CE that uses a downsized user interface.

For its part, QNX Software Systems is looking to blunt CE by playing its Java card. At this week’s show, QNX will demonstrate a real-time Java application for robotics control running under its Neutrino RTOS.

“We have been extremely active in developing the real-time requirements for Java,” said a company source, noting that the demo will also be shown in the HP and IBM booths at the conference.

As the QNX demo indicates, Java in the embedded world is beginning to move from the talking stage to the implementation phase. However, as with CE, things seem to be at the early end of the pipeline.

Sun Microsystems’ EmbeddedJava spec (which has some nine licensees) is continuing its movement from paper spec into code. At the same time, HP is proceeding with the incursion into the embedded Java world that it launched this past spring.

Neither Sun’s nor HP’s Java implementations run by themselves but must be paired with a traditional RTOS. HP’s Java implementation has gained ground through license agreements with Integrated Systems, Lynx Real-Time Systems Inc., Microware and QNX. In addition, Wind River has endorsed (though it has not formally licensed) the HP technology. (Microware, QNX and Wind River are also licensees of Sun’s EmbeddedJava.)

Microsoft, too, is in the Java picture. It has licensed HP’s Java Virtual Machine and plans to integrate it into Windows CE.

Lurching forward

The CE and Java arenas share one notable similarity: Their respective proponents are attempting to build support via marketing deals. Just as Sun and HP are trying to sign up Java partners, Microsoft is lining up supporters for CE.

“Our business model for the embedded space is no different than our business model in the desktop space,” Microsoft’s Barbagallo said. That model is “the run-time revenues we receive from the OS.”

“We’ve been very clear to all our partners as far as what tools we need to provide support for CE in the embedded space,” Barbagallo added. “Because they see we have this platform approach, they’re saying it makes sense to build tools for CE, because we’re not going to be competing against them.”

However, Microsoft may have to tune its long-term marketing tactics to better address embedded designers.

“The embedded market is different from the desktop market,” analyst Zorfass said. “It’s got slower adoption cycles and different requirements.”