64-Bit Chips – Still Underutilized?

The solution, of course, is a disaster recovery plan and backup. But providing functional duplication of the key SP2s would mean incurring tremendous costs. So when IBM offered an alternative — its new 64-bit S70 — it took almost no time to make Holder a buyer.

“Economics was the key driver,” he says. “We brought in an S70 on a trial basis, and the loading of the apps was so much faster that it was obvious, even without tuning, the performance was there.”

Without bothering to do a detailed analysis, Holder concluded that a single S70 server would be sufficient to back up several SP2s and, down the road, would scale well beyond current plans to add capacity by installing more SP2s.

Now, the S70 hums away in a remote, unattended operation near Toronto, ready to shoulder the burden if necessary.

But while Holder clearly feels a 64-bit machine was the right solution for this particular problem, he, like other IT managers, is not yet ready to jump ship on tried-and-true 32-bit technology. More to the point, however, there are still precious few applications written to take full advantage of 64-bit addressing.

Indeed, as Dan Kusnetzky, program director for operating environments and serverware services at International Data Corp., observes, “it is almost as if vendors are rushing to 64-bit only to get there before some other vendor does, not because a mass of end users demanded it.”

For his part, Holder is counting on IBM’s plans to continue to expand the capabilities of the SP2 architecture. For now, though, he’s content knowing that 64-bit can deliver substantially higher performance, even with code that is not 64-bit native. And for his “worst-case-only” app, that’s still a good enough fit.

Remortgaging The Future

While Holder has chosen to limit his use of 64-bit computing, a small cadre of early adopters has taken the next step — investing in 64-bit Digital AlphaServers as a cornerstone of future operations. One of the more high-profile examples is Fannie Mae. Established in 1938 as the Federal National Mortgage Association, Fannie Mae is a private corporation, federally chartered to provide financial products and services that increase the availability and affordability of housing for low-, moderate-, and middle-income Americans.


By one measure, Fannie Mae has the distinction of being the largest corporation in the U.S., with $366 billion in assets and an additional $674 billion in mortgage-backed securities outstanding. The company is one of the largest issuers of debt after the U.S. Treasury and its stock is traded on the NYSE.

What makes Fannie Mae successful is, in part, its ability to accurately predict which borrowers will be reliable debtors and which will be likely to default. “Whoever analyzes the risk best can price best,” says Eric Rosenblatt, senior credit analyst, and better pricing increases the size and productivity of the loan portfolio. “On the margins you will end up much less relaxed about taking a deal when your equations are producing less specific results,” he says.

The problem is not unlike that described by Coleridge’s Ancient Mariner — there’s water everywhere but nothing to drink. Fannie Mae has more than 10 million loans, each of which comes with complex transactional records. But slogging through that sea of data takes lots of processing horsepower. The problem is aggravated by the type of processing that needs to be done, where the questions are nearly always open-ended and the answers are difficult to predict.

“The statistical work we do is an iterative process,” says Rosenblatt. “There is no closed-form solution, you just keep making estimates and getting closer to the point where you feel comfortable with the answer. We look at the relationships among factors such as the amount of down payment, the borrower’s credit history, and the likelihood of default,” he says. “In practice it is very complicated to isolate this relationship out of a huge economic pie.”
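To give a flavor of what an estimation problem with no closed-form solution looks like, here is a toy sketch in C. It is strictly our illustration, not Fannie Mae’s model: it fits a one-variable logistic default curve to a handful of invented loans by gradient ascent, repeatedly nudging the estimates until they stop moving, which is the iterative loop Rosenblatt describes.

```c
/* Toy illustration (not Fannie Mae's model) of iterative estimation:
 * fit P(default) = 1 / (1 + exp(-(a + b*x))) to invented loan data.
 * There is no closed-form solution for a and b, so we climb the
 * log-likelihood gradient until the estimates settle.
 * Build with: cc fit.c -lm */
#include <stdio.h>
#include <math.h>

int main(void)
{
    /* x: loan-to-value ratio (invented); y: 1 if the loan defaulted */
    double x[] = {0.60, 0.70, 0.80, 0.90, 0.95, 0.97};
    int    y[] = {0,    0,    1,    0,    1,    1};
    int n = 6;
    double a = 0.0, b = 0.0, rate = 2.0;

    for (int iter = 0; iter < 20000; iter++) {
        double ga = 0.0, gb = 0.0;
        for (int i = 0; i < n; i++) {
            double p = 1.0 / (1.0 + exp(-(a + b * x[i])));
            ga += y[i] - p;            /* gradient w.r.t. intercept */
            gb += (y[i] - p) * x[i];   /* gradient w.r.t. slope     */
        }
        a += rate * ga / n;            /* another, better estimate  */
        b += rate * gb / n;
        if (fabs(ga) + fabs(gb) < 1e-6)
            break;                     /* comfortable with the answer */
    }
    printf("fitted: a = %.2f, b = %.2f\n", a, b);
    printf("P(default) at 95%% loan-to-value: %.2f\n",
           1.0 / (1.0 + exp(-(a + b * 0.95))));
    return 0;
}
```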

Why has Fannie Mae invested in 64-bit processing? Because in the financial industry the ability to make rapid and reliable decisions confers competitive advantage. “We want to be one step ahead of the competition,” says Rosenblatt.

Fannie Mae has come a long way. “In the past this data resided on mainframes and was not easily accessible to our economists and policy makers,” says Mary Ann Francis, director of database systems development. “We sometimes spent weeks spinning through tapes to gather information about a particular cohort of loans for research. On the AlphaServer we’re able to put as much of this data online as we want. For researchers this means faster, more flexible SAS System queries — and the ability to look across multiple databases.”

“For my own work,” adds Francis, “the most striking benefit is the speed of database administration, thanks to the parallel-processing features of Alpha and Oracle. Files that we once had to load overnight on the mainframe now load in two to three minutes during the day. We load 60-million-row source files into our database table in 28 minutes. It has truly changed our world.”

Fannie Mae’s search for an alternative to its mainframe focused on three key criteria: performance, availability, and affordability. “Our new systems had to support parallel processing and relational databases,” says Francis. “For the database, we did a structured competitive benchmark, and Oracle won based on company strengths and test execution. After that, Digital did a benchmark for us of Oracle on Alpha.” “In some cases,” she says, “the AlphaServer produced results that were an order of magnitude faster than other platforms.”

Implementation went smoothly. For example, says Rosenblatt, more than 95% of Fannie Mae’s SAS code was migrated without change from the mainframe. “It took just three months to get the system running and integrated into our environment, our systems people trained, and the data loaded and accessible to users,” adds Francis.

Server Connections Needed

Where the rubber meets the road, in the Wild West world of the ISP, there is also a glimmer of interest in 64-bits — sometimes because of scalability issues and sometimes simply because of the need to squirt lots of data out to users.

SBC Internet Services (formerly Pacific Bell Internet), which came in “tops in overall customer care service quality” among U.S. ISPs, according to a recent benchmarking report by the Washington Competitive Group LLC, has been wrestling with runaway growth. Currently, SBC provides Internet access services to more than 300,000 business and residential customers — a customer base it wants to continue to expand at a high rate.

Ensuring continued growth and those great service marks is, in part, the responsibility of Chris Torlinsky, senior systems manager. “Our requirement is the ability to scale to millions of users,” he says. “In order to do that we will be stretching the limits of our 32-bit computers. When it comes to data-intensive applications, like a database, there is a need for 64-bits, if not right now then in the very near future.”

Currently, Torlinsky is waiting for his 32-bit supplier, Sun Microsystems, to provide for full 64-bit processing. “Maybe in a year we will be on 64-bit systems,” he says. Although Torlinsky says SBC does use some formal metrics for making major decisions — like a 64-bit migration — he says some things are still decided on a seat-of-the-pants basis. And this may be one of them.

“We may decide very quickly that some applications have to go to 64-bits but for the foreseeable future I don’t think we will decide to move everything to 64-bit,” he says. Indeed, Torlinsky says one of the positive attributes about Sun’s offerings, which helped shut out other vendors, is that customers can continue to choose either the 32- or 64-bit option on 64-bit hardware. “With Sun, if you want compatibility with 64-bits, it’s just a matter of choosing it,” he says.

That kind of freedom — to choose 64-bits for select apps while keeping tried-and-true 32-bit apps elsewhere — seems to be what many IT managers wish they could get as they contemplate the potential perils of migration.

Clearly, 64-bit computing is making its way into the mainstream. But enthusiasm for 64-bits can be misplaced, warns Jonathan Eunice, an analyst with Nashua, N.H.-based Illuminata. “64-bits is just one of maybe a dozen major engineering variables a technical evaluator should examine as part of an enterprise server purchase,” he says. Things like I/O channel, the system bus, clustering, and operating system multithreading can be just as important.

“You need to look at deliverable performance through all the software layers,” says Eunice. And that, he adds, is best done by testing your own apps with custom benchmarks and measuring the results against hoped-for outcomes. Only then, he argues, can you understand the end-to-end performance of 64-bit systems compared with more mainstream options.

32-bit or 64-bit?

A move to a 64-bit hardware and operating system platform isn’t for everyone. But for those with specific high-performance requirements it can be the answer. Here’s what to expect from 64-bit computing:

* With 64-bits you have the ability to cache much more in memory. For applications such as data warehousing that means queries will run much faster. Similarly, with OLTP applications you can have much more concurrency among users without requiring additional servers and overhead.

* 32-bit machines are generally limited to 2Gb of directly addressable memory — ample for run-of-the-mill computing but not for large mission-critical or highly memory-intensive applications such as data warehousing.

* With 64-bits there is no longer a 2Gb limit on “chunk size” — the size of a file that can be loaded (see the sketch after this list).

* Although 32-bit machines can still handle most of the tasks that a 64-bit machine might tackle, they will usually require more CPUs and/or more individual servers — potentially increasing initial purchase costs, as well as ongoing support and maintenance.
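For readers who like to see such limits from the keyboard, the short C program below (our sketch, not vendor code) prints the widths that drive both 2Gb ceilings. On an ILP32 platform pointers, long, and often off_t are 32 bits wide; on an LP64 platform they widen to 64 bits. Note that 32-bit systems built with large-file support may already carry a 64-bit off_t.

```c
/* Sketch: where the sidebar's 2Gb ceilings come from. Pointer width
 * bounds directly addressable memory; off_t width bounds how far a
 * program can seek into a file ("chunk size"). Output varies with
 * the platform's data model (ILP32 vs. LP64). */
#include <stdio.h>
#include <sys/types.h>   /* off_t on POSIX systems */

int main(void)
{
    printf("pointer width: %zu bytes\n", sizeof(void *));
    printf("long width:    %zu bytes\n", sizeof(long));
    printf("off_t width:   %zu bytes\n", sizeof(off_t));

    /* Largest signed file offset off_t can represent: with a 32-bit
     * off_t this is 2^31 - 1, i.e. the 2Gb "chunk size" ceiling. */
    unsigned long long max_off =
        ((unsigned long long)1 << (sizeof(off_t) * 8 - 1)) - 1;
    printf("max file offset: %llu bytes\n", max_off);
    return 0;
}
```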

Building a Virtual Organization on 64-bits

California dreaming is becoming a reality in the Secretary of State’s office, where, according to CIO David Gray, they are well on their way to creating online analogs for every function they handle. “Within five years we want all our services to be on the Internet — everything from registering to become a notary to filing corporate documents or accessing the California state archives,” he explains.

“Web-enabling our public services stands to lower our costs by reducing manpower, office rental, paper forms, and processing and filing,” says Gray.

“As an engineer I understand the technology — the larger instruction size and registers of 64-bit,” says Gray. “I can’t say we have any metrics that would quantify whether there is a big difference between being 64-bit or 32-bit. My gut feeling is that most applications wouldn’t know the difference — unless they were doing something computationally intensive.”

Part of the formula for achieving Gray’s ambitious goal has been a flight from mainframes to 64-bit Digital AlphaServers running 64-bit Digital Unix and Oracle 7 for the enterprise database, as well as 32-bit Windows NT and Microsoft SQL Server.

But in some applications, the story is different. Gray, for instance, supports a distributed database with 20 million voter registration records. It connects to 58 counties and does regular data cleansing. “It runs very well on the AlphaServers,” says Gray.

Indeed, his office is in the process of expanding the system to provide an online repository for campaign and lobbyist disclosures. “And our intent is to make that available on the Internet,” he adds.

Auto Auction Tune-Up

While corporations like federal mortgage lender Fannie Mae, with its mounds of data needing to be crunched, represent a classic vendor argument for 64-bit computing, others have found that even comparatively modest IT operations can benefit from a bit boost. Take, as an example, ADT Automotive Inc. in Nashville, Tenn., one of the nation’s top three wholesale vehicle redistribution and auction companies.

“If it weren’t for our 64-bit system we would have had to scale back our applications or distribute them over multiple computers,” says Eddy Patton, the company’s project leader for systems and development.

ADT’s LION provides online information and supports the auctioning system. It was first implemented as part of a private intranet, but is now also available to 900 auto dealers around the country via the Internet. LION runs as part of an integrated 32-bit IBM AS/400 and 64-bit Digital Alpha system linking 28 auction sites and supporting up to 500 simultaneous log-ins.

Patton’s enthusiasm is undiminished despite the fact that he is currently running 32-bit NT as his operating system. Still, he says, having 64-bit hardware has provided overall performance benefits.

“Although ADT was an all-IBM and SNA shop prior to bringing in the AlphaServer, we have found that the integration of Windows NT and Alpha technology into our environment went very smoothly,” says Jim Henning, vice president of information services.

‘Gotta Have It’ – Eventually

Nicholas-Applegate, a global investment management firm with offices in San Diego, Houston, San Francisco, New York, London, Singapore, and Hong Kong, currently manages more than $31.6 billion in global assets in more than 60 countries. While nowhere near as large an operation as federal mortgage lender Fannie Mae, the company is finding that 32-bit computing has limitations.

Unix Systems Manager David Buckley says he and his shop are eagerly awaiting the availability of a full-fledged 64-bit hardware and OS platform from Sun, their long-time hardware vendor. According to Buckley, his database indexes can’t grow much larger under Sun’s current 32-bit Solaris operating system. “We really need the ability to have files bigger than 2Gb,” he says.

Buckley says he is happy to follow the forward-migration path set out by Sun. “In our part of the financial market,” he says, “most of the applications are written for Sun. We have been married to them for a while and our staff is more Sun-literate than they are on any other Unix.” Those factors, he says, kept them from even considering another vendor for the 64-bit transition.

Full 64-bit functionality, including the kernel, will be available with Sun Solaris 2.7, probably before the end of the year, Buckley says. Then, the company can begin to explore 64-bit computing in earnest, he adds.

Achieving Higher Performance

What is meant by 64-bit and what are the risks and rewards of adopting this technology? According to Informix Corp.’s Nasir Ameireh, a senior product marketing manager for Informix Dynamic Server (IDS), there are some fundamental concepts you must grasp and steps you should take when considering implementation.

For instance, he notes that 64-bit isn’t just a CPU with a large word size or the ability to fetch more instructions per cycle. Nor is it simply a matter of data bus performance or choosing 64-bit applications or compilers. It’s all of these things and more. You can’t have a 64-bit application without a 64-bit compiler and an operating system to support the results. In short, you need it all.

What’s more, the true value of 64-bit systems may not appear until all of these things are present because all are interdependent.

Some, he notes, raise interesting issues: If it is possible to stuff more instructions into a single fetch, how do you handle out-of-order execution? What is the impact on cache performance? How good is the compiler at improving this? This, he says, underscores the need for, and importance of, 64-bit standards such as Intel’s IA-64 and the architectures from Digital/Compaq, Sun, and IBM.

To take advantage of 64-bit hardware and operating systems, you need a minimum of 4Gb of memory. Why? Because 32-bit systems can already address 4Gb of memory. Moving to 64-bit hardware and operating systems with less than 4Gb of memory, and without a complete plan for database engines, applications, and system tuning, could actually result in poorer performance, Ameireh explains.
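A rough way to see why the 4Gb threshold matters, sketched below under our own assumptions (the 6Gb buffer pool is an arbitrary example): on a 32-bit platform a request that large cannot even be expressed in size_t, so a database engine gains nothing from 64-bit-sized caches until the memory and the data model are both in place.

```c
/* Sketch: a 32-bit size_t cannot describe an allocation past 4Gb,
 * so an oversized database buffer pool is unrequestable there. On a
 * 64-bit platform the request is expressible, though the OS may
 * still refuse it if physical memory and swap fall short. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    unsigned long long wanted = 6ULL << 30;   /* 6Gb, arbitrary */

    if (wanted > (unsigned long long)(size_t)-1) {
        printf("%llu bytes will not fit in size_t on this platform\n",
               wanted);
        return 1;
    }
    void *pool = malloc((size_t)wanted);      /* the 6Gb buffer pool */
    printf("6Gb allocation %s\n", pool ? "succeeded" : "failed");
    free(pool);
    return 0;
}
```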

Furthermore, he says, it is important to determine whether your applications need to be 64-bit, and you should have a plan — including a test plan — in place. If you decide to stay with 32-bit applications, you should understand how they will exist in future mixed or all-64-bit environments. It is also important to understand the impact of 64-bit on third-party products, which need to be “64-bit safe” or “64-bit enabled.”
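What “64-bit safe” means at the source level deserves one concrete picture. The fragment below is our own illustration, not Informix’s code: the classic porting bug is an assumption that an int can hold a pointer, true on ILP32 systems and silently false on LP64 ones.

```c
/* Sketch of the classic "not 64-bit safe" bug: assuming int and
 * pointers are the same width. True under ILP32, false under LP64,
 * where a pointer squeezed into an int loses its upper 32 bits. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    int n = 42;
    int *p = &n;

    /* Unsafe (left commented out): on LP64 this cast can truncate
     * the address, and most compilers will warn about it.
     * int bad = (int)p; */

    /* Safe: intptr_t is defined to be wide enough to round-trip a
     * pointer on both 32- and 64-bit platforms. */
    intptr_t stored = (intptr_t)p;
    int *back = (int *)stored;

    printf("pointer survived the round trip: %s\n",
           (*back == 42) ? "yes" : "no");
    return 0;
}
```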
