Starting A Blog Is Not Hard When You Have Basic Coding Knowledge


Whether or not your domain host requires HTML to create your website, you need to learn how to build a page with HTML. Why? What if you want to expand your blogging and try more domain hosts, and they do require HTML? HTML is widely taught in schools, and there are courses in classrooms and online that give people an idea of how to start a blog using HTML.

Everything starts with the basics, of course. As you blog more each day, you will gradually learn how to write more complicated HTML. Basic HTML is simple, so the results are simple. Once you learn more complex HTML, you can expect your blog to look better than basic blogs built with basic HTML. For those who already have basic-to-advanced HTML knowledge, starting a blog won’t be a problem, since they can build whatever they imagine: text boxes, comment boxes where readers can enter their comments, or radio buttons that let visitors rate how good an article is so the blogger can improve in the future.

Why Learning The Basics Matters When You Start A Blog

It is important to learn the basics of how you start a blog because you learn where each technique came from and how it is meant to be used. As the saying goes, it is better to understand where something came from before jumping to the most advanced version. Why? Jumping straight to the most advanced tool is like buying a touchscreen phone with no idea how to use it, because you never experienced how its basic form was used.

If you do jump straight to the most advanced tools, you end up struggling to use something without a single clue how it is supposed to work. Many sites cover both the basic and the advanced material, which is helpful whether you know both or only one of them. You can also learn where a technique is derived from by paying careful attention to the resemblance between its basic and advanced forms.


The Need For An Attorney When Seeking IRS Tax Help


Some people ask whether they need an attorney when seeking IRS tax help. Whether to hire a law practitioner really depends on the needs of the individual. In some cases tax help is sought to check on a tax refund, while other people seek help because they are facing penalties. Taxes can be complicated, which is why it is important to find a firm with expertise. When offenses arise from tax issues, that is the right time to seek counsel from an attorney who is familiar with the process. IRS tax help can contribute to the welfare of the individual, but by retaining an attorney, the stress of facing these issues is minimized.

Hiring an attorney for tax issues is quite an expense, but the role of this practitioner is to defend you and provide sufficient help in times like this. He or she will gather all the necessary records and find ways to minimize your penalties and have your problems with the government resolved. The process is definitely a long one, so you have to be patient in dealing with it. Always keep calm when talking to an agent for IRS tax help so that everything stays at ease and concerns are resolved appropriately.

IRS Tax Help Through Telephone Assistance

IRS tax help can be obtained not only through the agency’s website or a government office but also via a telephone call. Some individuals cannot make a personal appointment for one reason or another, which is why the Internal Revenue Service extends its assistance over the phone. Seeking help via telephone is convenient because it does not require the individual to visit the office, and dedicated IRS agents take responsibility for these calls. Hence, IRS tax help is made easy so that every individual can know their tax status.

Through this method of assistance, one can request current or previous tax forms, publications, and instructions. For business-related tax problems, the agent will also guide the caller on how to go about it. Since the caller sometimes has to wait for an agent to become available, pre-recorded information relevant to common tax questions plays in the meantime. As long as the individual calls within business hours, he or she can receive a response to tax concerns and find solutions for them.


Linux Exploded For Several Reasons


You can’t work in the information technology sector and not be touched by Linux in some way, even if it’s only in a debate about what role Linux might play in your organization.

Linux is an Open Source operating system developed by a team of programmers led by Linus Torvalds and originally targeted at the Intel x86 platform. Linux is a UNIX-like operating system, but since it was written completely from scratch, it uses no original UNIX source code. On the positive side, that means there are no copyright restrictions on the code. The biggest disadvantage, however, is that the operating system isn’t built upon a base of heavily used code. It also doesn’t have all the features you’d expect from a production UNIX system, like large-scale storage management.

Red Hat recently introduced an Enterprise Edition that includes Computer Associates’ ARCserveIT. The tacit admission is that a 100 percent Open Source operating system still leaves high-end users wanting.

Linux began in the early 1990s and has moved from Intel to Motorola, Alpha, Sparc, and other major platforms. Because of its broad base of programmer support and its ability to run software from the large GNU software library, Linux is a viable alternative to both commercial UNIX servers, and Microsoft Windows and NT client machines. Ease of use–a common concern with UNIX–is less of a problem with two widely available desktop environments, KDE and GNOME, both of which are easy for Windows or Macintosh OS users and developers to learn. Both environments support a variety of themes that affect the appearance of windows, buttons, and scrollbars, letting users adopt a Windows or Macintosh look and feel.

To be a serious contender in the e-business market, Linux needs to offer more than a pretty desktop, and it does. First, it’s becoming the operating system of choice for high-end server vendors, such as IBM, Silicon Graphics, and Compaq Computer. Other vendors, such as Dell Computer, offer Linux preconfigured on their servers. Customer support is available from hardware manufacturers, as well as from Linux distributors like Red Hat, Caldera, and SuSE.

Also, Linux tools abound. The most popular Linux distributions include web and FTP servers, as well as other Internet applications such as Gopher, Domain Name Services (DNS), Mail, News, Proxy, and Search servers. The breadth of applications available with the widely supported, inexpensive operating system makes it an ideal candidate for web server applications. Industry analysts report more than 30 percent of web servers run on Linux.

Apache: Web server of choice

The Apache Web Server is a freely distributed HTTP server developed by the Apache Group and managed by the Apache Project (http://www.apache.org). Evolving from the National Center for Supercomputing Applications (NCSA) web server developed at the University of Illinois, Apache has become the most popular web server. According to a Netcraft survey of more than 13 million web sites conducted in March 2000 (http://www.netcraft.com/survey), Apache is used by 60 percent of the respondents. The next most popular server, Microsoft IIS, came in at just under 21 percent.

Popularity, though, is hardly the only consideration in judging e-business software. Scalability, reliability, and integration with other applications are crucial. Apache has been a stable platform on UNIX for some time, but Windows implementations haven’t proved as reliable. The Apache Group’s recently announced Apache 2.0 includes better support for non-UNIX platforms, which should improve Windows stability.

On the performance side, Apache 2.0 supports a hybrid multiprocessor/multithreaded mode that promises to improve scalability, though more real-world use is required before we know how well it meets that promise. The Apache Software Foundation supports projects focused on integrating Apache into the larger e-business environment, including XML-Apache, Java-Apache, and Java Servlet and Java Server Page support. The mod_perl project provides developers the tools to create Apache modules in Perl that eliminate the need to run CGI scripts in a separate process, thus avoiding costly process instantiation.
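
To make the mod_perl idea concrete, here is a minimal sketch of what a mod_perl 1.x-style content handler looks like. The package name My::Hello and the greeting are invented for the example, and details vary across mod_perl versions, so treat this as an illustration of the embedded-interpreter approach rather than a canonical module.

    # Illustrative mod_perl 1.x-style content handler. Because the Perl
    # interpreter is embedded in the Apache process, this code runs on each
    # request without spawning a separate CGI process.
    package My::Hello;

    use strict;
    use Apache::Constants qw(OK);

    sub handler {
        my $r = shift;                     # the Apache request object
        $r->content_type('text/plain');
        $r->send_http_header;
        $r->print("Hello from inside Apache\n");
        return OK;
    }

    1;

Mapping a URL to the handler is then a matter of a SetHandler perl-script / PerlHandler My::Hello stanza in the Apache configuration, again assuming the mod_perl 1.x conventions of the time.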

Web servers also must provide access to middle-tier business services, especially those driven by database applications. Apache is recognized as a viable HTTP server by high-end web applications servers, such as the Oracle Application Server and IBM WebSphere.

Like Linux, Apache’s wide market acceptance demonstrates that Open Source development can create an essential tool for e-business. The release of Apache 2.0 (which was in alpha testing at press time) further shows that Open Source software can keep up with the demands of changing needs. However, while Apache is a solid choice for UNIX and Linux platforms, until a stable Windows version is available (and perhaps it’s close), Microsoft IIS is probably a better alternative for NT.

Perl: Portable programming

While C/C++ is a portable programming language, the learning curve and the time required to develop fully-functioning applications are often prohibitive. While Java has eliminated some of the most (potentially) problematic aspects of C–especially pointers–and offers a rich set of libraries like C/C++, development is still coding-intensive. The Perl programming language has emerged as the programming tool of choice for as many as one million developers, according to The Perl Journal.

Perl was originally used as a systems administration tool on UNIX platforms, but was quickly adopted for Web development and data-intensive programming because it includes many features found in other tools, such as C, awk, sed, and BASIC. According to http://www.perl.com, Perl is the most popular web programming language, due in large part to its powerful text manipulation. From an e-business perspective, the fact that Perl runs on so many platforms and can handle operating system administration tasks, as well as more traditional database-oriented tasks, makes it an ideal candidate for a development tool. Since Perl applications can be written to run as embedded modules in Apache, web server processing can improve by as much as 2,000 percent.
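
As a small taste of the text-manipulation strength mentioned above, the following self-contained Perl sketch tallies requests per client address from web server log lines read on standard input. The assumption that the client address is the first whitespace-delimited field (as in Apache’s common log format) is mine, not the article’s.

    #!/usr/bin/perl
    # Illustrative sketch: count requests per client address from log lines
    # on standard input, assuming the address is the first field.
    use strict;
    use warnings;

    my %hits;
    while (my $line = <STDIN>) {
        if ($line =~ /^(\S+)\s/) {    # capture the first field
            $hits{$1}++;
        }
    }

    # Print addresses sorted by request count, busiest first.
    foreach my $host (sort { $hits{$b} <=> $hits{$a} } keys %hits) {
        printf "%6d  %s\n", $hits{$host}, $host;
    }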

Can Perl pass muster in an e-business environment? Amazon.com and Deja.com are two of the major dot-coms that use Perl to run their sites. In addition to Apache integration, the nsapi_perl module embeds a Perl interpreter in a Netscape web server, so e-businesses don’t have to give up the faster performance of an embedded interpreter if they don’t use Apache.

The Perl language is also widely supported by third-party developers, offering more than 400 modules at the Comprehensive Perl Archive Network (CPAN) (http://www.perl.com/CPAN/). Perl has been ported to major UNIX platforms, as well as Microsoft Windows and NT, and Macintosh. Also, Perl can easily integrate with databases through DBI modules, making it an ideal tool for creating applications from database-centric web pages to data warehouse extraction, transformation, and load scripts.
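
For readers unfamiliar with the DBI pattern referred to above, here is a hedged sketch of what database access from Perl typically looks like. The data source name, credentials, table, and column names are placeholders invented for the example, not anything from the article.

    #!/usr/bin/perl
    # Illustrative DBI sketch: connect, run a parameterized query, print rows.
    # The DSN, credentials, and schema below are placeholders.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:database=shop;host=localhost',
                           'user', 'secret',
                           { RaiseError => 1 });

    my $sth = $dbh->prepare('SELECT id, name FROM customers WHERE region = ?');
    $sth->execute('west');

    while (my ($id, $name) = $sth->fetchrow_array) {
        print "$id\t$name\n";
    }

    $sth->finish;
    $dbh->disconnect;

Swapping the driver portion of the DSN (dbi:mysql, dbi:Pg, and so on) is what lets much the same script move between databases, which is part of Perl’s appeal as a glue tool.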

While popular, the Perl syntax can be cryptic. This is understandable given its origin as a systems management-oriented tool, but this limits its use for large-scale software development. Python, an object-oriented scripting language, is a better choice if you’re looking at larger applications where reuse and object orientation will pay off.

Databases: The weak link

When we think of e-business and databases we generally think of the big names: Oracle, IBM DB2, Microsoft SQL Server, Sybase, and Informix. Occasionally, we’ll hear about the two most popular Open Source offerings–MySQL and Postgres–but not too often. Why not? Those offerings simply can’t compete with the features and functionality of today’s commercial relational database management systems. In some cases, the lack of features makes the Open Source databases unusable in an e-business environment.

For example, MySQL’s lack of subqueries and right outer joins requires developer work-arounds. More seriously, the database’s lack of transaction support completely eliminates it as a serious contender for e-business. The transaction support found in major commercial offerings makes it possible to define a logical unit of work consisting of a series of steps that must all occur for the transaction to be completed. For example, transferring funds from your savings to your checking account may consist of two distinct steps–withdrawing money from savings and then depositing it into checking. Without a way to group those two steps, it would be possible for the withdrawal to be made without the corresponding deposit. That could occur if the server crashed in the middle of the operation, and no server is immune to crashing or other problems that could disrupt a transaction.
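
To make the savings-to-checking example concrete, here is a hedged Perl/DBI sketch of grouping the two steps into a single transaction. The accounts table, column names, and the choice of a PostgreSQL data source are assumptions made for illustration; the point is simply that commit makes both steps permanent together, and rollback undoes both if either fails.

    #!/usr/bin/perl
    # Illustrative sketch: a two-step transfer wrapped in one transaction.
    # Table and column names (accounts, balance, acct_id) are invented here.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=bank', 'user', 'secret',
                           { RaiseError => 1, AutoCommit => 0 });

    eval {
        # Step 1: withdraw from savings.
        $dbh->do('UPDATE accounts SET balance = balance - ? WHERE acct_id = ?',
                 undef, 100, 'savings-42');

        # Step 2: deposit into checking.
        $dbh->do('UPDATE accounts SET balance = balance + ? WHERE acct_id = ?',
                 undef, 100, 'checking-42');

        $dbh->commit;    # both steps succeed or neither is kept
    };
    if ($@) {
        warn "Transfer failed, rolling back: $@";
        $dbh->rollback;  # no partial transfer is left behind
    }

    $dbh->disconnect;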

PostgreSQL, originally developed at the University of California at Berkeley, has many of the features found in commercial database systems, including transactions, stored procedures, and extensive SQL support. If you’re looking for an Open Source database, this is probably the best. However, remember there’s much more to a database than what the programmer sees. If you need to consider integration with enterprise resource planning systems, failure recovery, parallel query optimization, advanced partitioning options, or support for data warehousing (such as summarization and query redirection), then the commercial offerings are your best bet.

Conclusion: No simple answer

Can an e-business succeed with Open Source? Of course. TCP/IP and other core Internet protocols were developed in an Open Source environment. The key to success is to not blindly ignore or embrace Open Source any more than you would a particular vendor’s offering.

Open Source offers strong programming tools, Web servers, and a popular operating system that’s making steady inroads into production environments. Your database, like your Web server, must be reliable, scalable, and easy to integrate with your other systems. If you stick to the most widely supported Open Source solutions, including Linux, Apache, and Perl–to name just three–you can build a stable, reliable e-business platform.

Linux isn’t as established as UNIX, and it may not scale or promise the kinds of uptime you expect from UNIX, but for small- and mid-sized web sites, it may fit the bill. Apache and Perl both have strong developer support and in key areas, such as database access and performance, they get as much attention as any commercial product would.

For real-world, large-scale e-business, Open Source mixed with commercial applications is the best approach. While making significant inroads, Open Source still can’t marshal the resources to keep up with changes in technology. Red Hat’s alliance with Computer Associates for high-end storage options is a case in point. While advanced storage applications and robust databases may emerge from Open Source development, there aren’t any now–and we need them now.


Down With NT! Up With Linux!


This could be described as the year of the operating system. Microsoft Corp. is racing to finish Windows 2000 before 2000, IBM and Santa Cruz Operation (SCO) are forging a new version of Unix for Intel 64-bit processors and Compaq, Sun and Hewlett-Packard are toughening their proprietary Unix versions.

For an IS manager it’s an embarrassment of riches – and a time of confusion. The expected lock-down of systems starting this summer due to Year 2000 concerns may give users a few months of breathing room and time to compile a list of questions.

Among them: Is it worth waiting for the much-delayed Win2000 or should we stick to NT Server 4.0? Will Win2000 be as reliable as Unix? Will the Data Centre edition be worthy of a data centre? And, as always, is Windows’ alleged cost of ownership advantage just window-dressing?

“NT versus Unix? I get this question from clients two times a day,” says Tom Bittman, vice-president and research director at the Gartner Group in Stamford, Conn. With more mission-critical applications coming out, NT supporters want them added to the IT mix. But managers wonder if Microsoft is up to the task.

“In almost all cases they’re leaning towards Unix,” said Bittman, “and wondering if they’re crazy.”

But Bittman warned NT is “probably two years behind the hype.”

“NT still has issues of scalability for the majority of the market, and that’s one reason many companies are going with Unix.”

In fact, he added, Unix is undergoing a bit of a come-back.

“We’re definitely seeing the start of a slow-down in the server space. A lot of companies are realizing Unix still has a place, and that NT is still missing some attributes. The belief is they may be fixed in Windows 2000. But while they’re waiting, that puts more of a focus on Unix for certain mission-critical and back-end applications.”

Microsoft’s ambitions are high: the Data Centre edition will scale up to 16 processors and 64 GB of memory, said Erik Moll, Windows 2000 marketing manager for Microsoft Canada. The Advanced Server edition (formerly Enterprise Edition) will go up to four-way symmetric multi-processing. Both will have advanced clustering and load-balancing capabilities.

Other features will include the Active Directory, a management service with file-level encryption and support for Public Key Infrastructure, and Web and Internet support services.

“There’s no question as we move forward to Windows 2000 one of our key focuses will be on reliability and availability,” said Moll. “We’ve greatly reduced the number of reboots as you do reconfigurations of your system, and we also worked hard to make sure device drivers have gone through compatibility testing.”

Until Win2000 is proven, however, users are looking at NT.

For Bittman, Unix has it over NT for scalability, mixed workloads, high availability and reliability, maturity of applications and the availability of people skilled enough to run mission-critical deployments.

Expect more than 200 concurrent users on your system? Don’t use NT, he advises, unless perhaps you’ve got SAP R/3. Gartner has seen SAP implementations with 850 users, not big by Unix standards, but impressive considering most applications on NT can’t handle more than 200 users. It speaks to how well SAP has been tuned for Windows, said Bittman.

However, he added, the majority of the market would need about 400 users, and that’s where Win2000 is headed.

As scalability declines as an issue, high availability and reliability will become more important, said Bittman. Microsoft’s “Wolfpack” clustering technology is two years behind Unix, said Bittman, but will be better in Win2000.

“Our biggest concern is NT reliability,” he said, “and it’s not getting better. In fact, I have clients who tell me in their view, every release of NT has only gotten worse. I don’t think that’s true, but there’s that perception out there.”

The biggest issue is memory leaks. Microsoft acknowledges the problem, but most will be plugged only in Win2000. Meanwhile, Gartner tells clients to expect an NT system will go down at least once a week. For companies who can tolerate that kind of performance, he said, NT will be good enough.

However, Ritchie Leslie, director of Montreal-based DMR Consulting Group Inc.’s Microsoft strategic alliance, believes NT has a big place in the enterprise.

“For medium-scale applications, say transaction systems supporting a few hundred users, Windows environments have a clear total cost of ownership advantage over Unix environments,” he said. “NT is cheaper to buy, most organizations have NT servers so if they can avoid having to buy a specialized Unix box to run an application, they can reduce the total cost.”

There have been “huge advances” in NT management tools, he said, adding the operating system is catching up in stability and reliability.

“For all but the very largest applications, Windows NT is becoming an increasingly practical platform,” he said. But, he added, it’s not ready for a large data warehouse or large transaction-based system.

Bev Crone, who watches both platforms as general manager of midrange system sales for IBM Canada Ltd., acknowledges that NT/Win2000 use will continue to eat into the Unix market. “At first blush,” he said, Unix looks more expensive than Microsoft’s offering, but not when the user considers factors such as reliability, availability and scalability.

Unix vendors aren’t standing still, he added, pointing to the IBM Unix collaboration on Project Monterey for Intel chips.

But Bittman is skeptical. Santa Cruz, Calif.-based SCO Inc. hasn’t set sales records with UnixWare, its Intel offering, he said.

Vendors are only now realizing that the first version of the IA-64 chip will create a small market, which won’t grow until the next generation of processors, called McKinley, debuts.

While some analysts believe Microsoft is aiming for an October release of Win2000 Professional, Server Edition and Advanced Server, Bittman says that will only be for “bragging rights.” He expects it will come out next year and will be filled with bugs that weren’t caught during beta testing, because users are telling him they aren’t testing it with mission-critical apps.

“We’re telling clients don’t deploy it in a production mode for at least six to nine months after general availability, after at least one service pack and maybe two,” Bittman said. “We’re also telling them it will be less reliable than NT 4 with service pack 5 for a year.”


Unix, Despite Its Age, Still Has Followers


Operating systems, graphics cards and processors all have combined to improve the workstation and give shape to the market. Although these factors are conspiring to generate a considerable shift towards the Windows NT or personal workstation, the traditional Unix workstation still has its loyal followers.

According to figures released last year by research firm IDC Canada Ltd., NT workstation shipments, between 1998 and 2003, are projected to have a compound annual growth rate of 15 per cent. For the same period, traditional workstations are expected to decline by three per cent each year.

Despite the Unix workstation’s decline, it will still find a home in many niche areas and among companies that have made large purchases in the past and are reluctant to switch, said IDC.

“It’s still doing very well in the scientific area, as well as in large-scale manufacturing,” says Alan Freedman, research manager for servers and workstations at IDC. “It’s the organizations with the huge installed base that haven’t made the transition, while the organizations that are small or more agile are moving towards NT.”

Traditional Unix workstations are still found in departments devoted to engineering, mapping, geology and other technical applications. “The Unix market is not shrivelling up or fading away,” Freedman says. “But what we’re seeing now is that some of the mechanical and electrical design areas that were wholeheartedly Unix are now at least taking a look at the NT workstations.”

The reason they are looking at personal workstations has a lot to do with lower prices, increasing operating system reliability and the advances in processor and graphics technology. Independent software vendors have responded by porting many of their applications to Windows NT.

Backing up the capabilities of the personal workstation are improvements on the processor front. Of particular importance are Streaming SIMD Extensions (SSE), an innovation from Intel Corp. of Santa Clara, Calif. Similar to the MMX innovation, SSE gives the Pentium III the ability to better perform the floating point calculations needed for high-end graphics.

Coinciding with the advances in processing are low-cost graphics cards which ease entry into the world of high-end graphics work. In the past, the customer had to spend $3,000 or more to get a graphics card with an on-board geometry accelerator but now there are cards that can do this for less than half that amount.

Able to leverage the power of on-board processors, users get graphic performance that scales with their processing performance. One of the prominent applications of workstations’ high-end graphics capabilities is in Geographical Information System (GIS) applications. Some of the major GIS vendors, such as ESRI Inc. of Redlands, Calif., are porting many of their products to the Windows NT operating system. Higher processing power, in conjunction with the latest graphics cards, allow for a more dynamic presentation of geographic information.

The newer graphics cards also allow workstation users to use two monitors on a single workstation. Graphics cards that make this possible include the Millennium G400 Series from Matrox Graphics Inc. of Montreal. Based on what Matrox calls the “DualHead Display,” the feature allows the user to extend one application across two monitors, or open multiple applications at once. Some users, the company says, display applications on one monitor while showing tool bars on the other.

Six months ago desktop PCs were equipped with 4MB video cards. Now users are getting 8MB or 16MB.

But while this is pretty powerful for the desktop level, hardware isn’t necessarily the only means of defining a workstation, argues Kevin Knox, a senior research analyst with the Gartner Group in Stamford, Conn. “I think workstations are defined more by the application than they are by the hardware,” he says. “Generally, workstations are systems optimized for a specific vertical application. It’s not just high-end, mid-range and low-end.

“I agree that the lines are blurred and there are vendors out there that play to that. I think their workstation numbers are inflated significantly because they are showing workstations in the desktop market.”

“The high-end market is flat to a small decline in terms of revenues, and a larger decline in terms of units because of NT,” says IDC’s Freedman. “However, some companies are coming down with lower-priced Unix workstations to combat that — most notably Sun Microsystems with workstations that are lower in price and target the same markets as the NT workstations.”

“So while Unix does not have the majority of the units, it does have the lion’s share of the revenue,” says Freedman. “We are predicting over the next four or five years, slight negative growth in units and a bit higher negative growth in revenue — about two or three per cent.”

Gartner Group reports that NT will eventually supersede Unix in the high-end market. The Unix versus NT operating system game has been playing for some time now, and vendors, which at one time clearly chose sides, no longer seem as sure of the winning team.

Not too long ago, the workstation market consisted of Sun, HP, IBM and SGI, but there has been a rapid penetration of Wintel systems, says Knox. “Sun is trying to protect its installed base, and frankly not doing very well on the low end,” he says. “They introduced the Darwin product and that really hasn’t taken off as I know they wish it had.”

What users are saying, he continues, is they have an office productivity machine for the everyday applications, and a Unix box, and they want to consolidate them into a single system. “Right now that’s the NT system,” adds Knox. He expects traditional PC vendors such as Compaq and Dell to take the lead in market share because of the improved performance of NT, Xeon processors and other technologies.

There are, however, still some markets that can only be served, at this point, by the robustness Unix delivers. Traditionally, high-end workstation markets have included mechanical computer-aided design (MCAD) and electronic design (ECAD) in industries as diverse as auto, finance and software design.

Changes in workstation market

The rise of the personal workstation has dramatically changed the face of the workstation market in Canada — at least in terms of vendors.

In 1997, according to IDC, Hewlett-Packard Co. was the leading vendor with more than 14,000 units shipped in that year. Second was Sun Microsystems Inc. with approximately 8,000 units shipped. Following Sun were IBM, Digital, Compaq, Dell and Silicon Graphics. Since that time, the Windows NT/personal workstation market has been growing at 15 per cent compound annual growth while the Unix market has been declining by a three per cent annual growth rate. Trends for both camps are expected to continue until 2003.

In 1999, 19,500 workstations were shipped in Canada. As much as 32.6 per cent of the market is now held by Dell.

Compaq follows at 23.7 per cent, then Hewlett-Packard at 21.6 per cent, followed by IBM at 14.7 per cent. Other workstations account for the remaining 7.4 per cent of the market, IDC Canada reports.

RISC machines no longer dominate

Three years ago, the workstation market was dominated by RISC (Reduced Instruction Set Computing) processor-based products running Unix operating systems and applications.

Today, several developments in this marketplace have allowed advanced application users to rely on other processors to provide comparable performance to a traditional workstation at a lower price.

A workstation-class system is a higher-performance computer specifically engineered for applications with more demanding processing, video and data requirements. It is intended for professional users who need exceptional performance for computer-aided design (CAD), geographic information systems (GIS), digital content creation (DCC), computer animation, software development and financial analysis.

With the introduction of Pentium II processors, many computer companies expanded their product lines to offer Intel-based workstations. The added performance provided by these and the successive Intel Pentium III and Pentium III Xeon processors has resulted in a strong shift from proprietary, traditional workstations to branded personal workstations, which use the Windows NT operating system.

Workstation users benefit from rapidly evolving processor technology. High performance workstation-class systems let power users be more productive as projects can be completed much faster, saving organizations time and money.

The workstation market has been one of the first to benefit from the set of instructions incorporated into Intel’s Pentium III processors, called Streaming SIMD Extensions (SSE). This performance improvement will come from the new SSE-enhanced applications and drivers being introduced by hardware and software vendors.

Most branded workstations also provide the option to add a second processor, allowing users to take advantage of the multi-tasking and multi-threading capabilities of their applications and operating systems.

In addition to dual processor support, workstation-class products are differentiated by their options for advanced OpenGL graphics adapters, greater memory and storage expansion, higher performance hard drives and application certification.

It is important to understand that all 3D graphics cards are not created equal. 3D video adapters can generally be categorized as those optimized for advanced workstation applications or those that are good for games.

OpenGL (OGL) support is the industry standard that separates the workstation from a gaming workstation.

Most of the workstation graphics glory goes to the high-end 3D video cards, but multiple monitors are also an important productivity tool for many workstation users. Two or more monitors can benefit those who require more display space for increased efficiency and effectiveness while multi-tasking.

For instance, multiple monitors can help software developers create and debug applications by having an application on one screen and a debugger on another, or a programming editor on one and an online reference manual on the other.


Fanatics? The War Was Won!


I started out in 1994 as a Linux advocate, saying to myself, “This is great, but I wish it were easier to install and didn’t screw up my boot sector.” In 1995-1996, I forgot about it so that I could concentrate on applications. In 1997 I went into denial, and in 1998 I tried to remain objective.

In 1999, I’m taking a “who cares” attitude. I’m not denying Linux per se; I’m simply refusing to get caught up in an OS holy war.

It’s not 1999 yet, though, so I still have a little time left for some more denial–not just about Linux, but also about Windows 2000 and NetWare.

Linux fanatics out there, you’re going to have to get over this: In some aspects, Windows NT is a better operating system. The biggest NT advantage is that it has a development model and a ton of rich consumer-friendly applications.

Were that the only thing, Linux would be home free, since the mass acceptance of the operating system will spur on more applications. But Linux also has problems with its scheduler. I’ve written before that the Windows NT scheduler is not up to par with what is available on some Unix platforms (see PC Week, June 1, Page 71). However, NT’s scheduler makes Linux’s look like dog meat.

Another Linux problem is with I/O. An engineer I know says that Linux is rife with Ring 3 scaling problems. But he added that the operating system will “get there” soon enough.

The trouble with NT starts with its registry, which most engineers complain is a horrible mess. The only people who like the NT Registry are those who sell packages to “fix” it.

As bad as it is, the registry is the least of Microsoft’s worries. We have today a big need for 24-by-7 uptime, and NT just doesn’t cut it. NT doesn’t allow IT managers to gracefully kill rogue programs. It makes organizations reboot systems too often, even when minor, noncritical application-level changes are made. Sure, Microsoft and others have patches, kludges and fixes that let NT function in this environment. But corporations want guaranteed uptime; that’s why Linux is perfect here.

NetWare 5.0 should have been poised to reap profits from a delay in Windows 2000 and the newness of Linux. Unfortunately, there are a ton of problems with NDS (Novell Directory Services), including incompatibilities between NDS with NetWare 5.0 and NDS with NetWare 4.0 implementations.

There are also unconfirmed reports that NetWare 5.0 is slower than NetWare 4.0 in some instances. The performance problem stems from NetWare’s single-threaded TCP/IP stack. But really, these performance differences are so slight that they shouldn’t make a big difference.

All this hand wringing is meaningless in a way. We in the press and in the community constantly operate in an “exclusive OR” world. That is, if something new comes along, we have to assume it will displace something else. But the buying practices of corporations rarely function in this way. Corporations buy to solve problems.

That’s why I see businesses forcing vendors to work together. Consumers will push Microsoft to accept Linux; they’ll push for development of stronger NT APIs (for example, Winsock) on the Linux kernel. They’ll push Microsoft to accept NDS because consumers don’t plan to dump it.

Next year, though, Linux will be pushing other Unix vendors out of the market. The smartest move Novell could make would be to completely dump the NetWare code base and move all of the NetWare services to Linux. Caldera supports NetWare for Linux now.

Watch out for Caldera, by the way. In 1999, it’s going to make some Linux announcements that will knock your socks off.


Corporate Acceptance Was Completely Necessary For Linux


For IT organizations such as West Virginia Network in Morgantown, which runs the network applications for the state’s higher education institutions on AIX RS/6000, NT, and Intel platforms, “it would probably take a killer app to move us to Linux quickly,” says Jeff Brooks, lead systems programmer for WVNET.

On the desktop side, however, he says Linux is taking hold. “We have an increasing number of people running a Linux desktop environment for personal productivity, including mainframe programmers. IT professionals who are not supporting PC products are spending too much time maintaining their PCs. We calculated the time people spent doing non-business, non-mission-related maintenance–doing upgrades, or rebooting when that blue screen comes up on NT–and it’s not trivial.”

As an embracer of Linux, Java, and other things non-Microsoft, IBM, which dropped to Number 2 in the Software 500, grew its software revenue 6% to $11.9 billion, with total corporate revenue growing 4% to $81.7 billion. IBM, like other Top 10 companies Sun, HP, Compaq, and Hitachi, derives the bulk of its revenue from hardware sales.

IBM’s strategy for making money from software and services is a model the others are also following, each in their own way, says IDC’s Bozman. “If you look at IBM you see a model for how you can make more from software and services, but [IBM's offerings are] more marketable, more cross-platform. Look at Lotus and Tivoli.”

Like IBM, HP “realizes there is a synergy there in combining software sales and services.”

Sun, she says, uses software more as a lever to drive hardware sales. And unlike IBM, “Sun has a small professional services organization and doesn’t want to be concerned about competing with the Andersens, etc.”

Compaq entered the enterprise software fray with its acquisition of Digital Equipment, inheriting not only the software but Digital’s services organization. While Compaq so far has not been able to meld this acquisition as smoothly as it probably hoped, and recently replaced CEO Eckhard Pfeiffer with chairman and acting CEO Ben Rosen, analysts are positive about Compaq’s ability to move forward in the enterprise software world. “Compaq already has a lot of corporate IT accounts; they have extremely strong relations with the IT world. The ability to take DEC and integrate it is an obvious question mark, but you have to believe Compaq will make [DEC] an important part of the future,” says Tim Bajarin, president, Creative Strategies Inc, San Jose, Calif.

He adds, “Clearly they see enterprise-driven software as a key component to the overall value-added products they sell in the marketplace. The demand for complete systems solutions is on the rise, not the decline. I would be surprised if Compaq doesn’t get it right.”

Also pushing hard in the one-stop shopping realm is Computer Associates (#4), now busy absorbing its recently announced acquisition of Platinum Technology. CA in 1998 grew its software revenue 12% to almost $4.9 billion, with total corporate revenue growing to almost $5.1 billion. The combination of Platinum and CA, based on 1998 software revenue, would be $5.6 billion, which would rank the combined company this year at #3, surpassing Oracle. Size seems to matter in the systems management market, as the industry consolidates into a smaller group of large suppliers. Collectively, companies in the Software 500 that compete in the systems/network management space grew software revenue an average of 16.2%.

In the enterprise applications arena, where both Oracle (#3) and PeopleSoft (#10) compete, “the key priority for ERP vendors is to extend their applications and frameworks to the world of e-business,” says Steve Bonadio, senior analyst, Hurwitz Group, Framingham, Mass. “ERP will be the backbone that enables companies to efficiently and effectively communicate and collaborate with themselves, their customers, partners, and suppliers.”

Both Oracle and PeopleSoft grew at a good pace, with Oracle reporting a 20% growth in software revenue (which also includes their database business) to $5.3 billion. And PeopleSoft hit the billion dollar mark in 1998, with its 48% increase in software revenue. Both companies beat the average software revenue growth rate (17.4%) for companies in the Software 500 that compete in the ERP/enterprise applications market.

While the ERP suppliers are benefiting from a still-healthy demand for their solutions, corporate IT professionals still need to evaluate the health of their potential vendors, and the health of their strategy, says Hurwitz’s Bonadio. “As ERP vendors aggressively round out their product functionality and architectural strategies, companies using ERP applications need to make sure that their existing ERP investments are secure. There is nothing worse than spending millions and taking years to implement an ERP solution only to find out that you have to do it all over again.”

ERP was not the only software segment thriving in 1998. Among the Software 500, suppliers in both the Internet/e-commerce and data warehouse/OLAP markets saw software revenues rise an average of 17.8%.

Across all segments of the software industry, many companies grew through merger or acquisition, a trend that has continued as the industry matures. Among the Software 500, 32% merged with or acquired another company during 1998 (see “Here Today Gone Tomorrow,” pg. 28 for a look at what happened to some of last year’s Software 500 companies). While investment banks tend to say this is largely a positive trend for IT buyers, assuring the continuance of new and innovative products from startups and small companies, and enabling more one-stop shopping and more formalized support organizations, IT professionals are not so sure.

The supposed benefits of one-stop shopping “depend almost completely on the willingness of the vendor to try to fit our enterprise situation,” says WVNET’s Brooks. Some vendors, in his experience, “try very hard to lock you into multiyear contracts with little added value. Some conglomerates have offered us software they think is a great deal but that we don’t necessarily need. In that sense, the whole merger mania for us every year looks a little gloomier, because we have to deal with a smaller number of suppliers that give us heartburn. But then again, I’m not a stockholder.”

In addition, says Brooks, “we tend to lose relationships with developers we may have worked with for 10 years. All of a sudden there’s a layer of management between us and them–if they’re still there.”

Agreeing with Brooks is Lynn Manzano, Y2K project manager at Experian Inc., a leading supplier of information on consumers, businesses, motor vehicles, and property, in Orange, Calif. “You do lose a lot of continuity, and you lose the depth of support.” On the other hand, she says, “Hopefully I’ll get a better price. With some of these bundled purchases we saved a lot of money, from a cost perspective. From a functionality perspective, I don’t know yet” the benefits from a merger or acquisition.

Another overriding concern for IT in 1998 was the Y2K issue. 1998 was the year the software industry and the IT community got serious about Y2K, scrambling to address millennium date issues before year-end, so 1999 could be spent testing. Among the Software 500, 89% of the companies reported that their primary software products are now Y2K compliant. Only 1% of the companies said they will not be compliant by year-end ’99, nor will they make it by 2000. And 10% of the companies did not respond.

The Y2K issue proved to be a double-edged sword for IT. On one hand, many organizations were forced to put off new development projects to concentrate on their millennium fix. On the other hand, Y2K has prompted a massive updating of legacy systems to, in many cases, new packaged applications.

Says WVNET’s Brooks, “In 1998 we were largely sidetracked by Y2K. Every application we were working with from mid-year 1998 through now had been with an eye to getting everything done for Y2K. The fringe benefit is massive updating. I think, architecturally, that’s clearing out a lot of deadwood. It’s really a new broom–inadvertently.”

And while it didn’t get quite the attention in the U.S. that the Y2K issue received, Europeans were busy grappling with the debut of the Euro currency, which observers say creates a more complex coding challenge than Y2K, as it affects fundamental business rules. Among the Software 500, which are predominantly U.S.-based companies, 50% reported that their primary software products are Euro compliant, while 6% said they are not compliant yet. Twenty-nine percent reported that Euro compliancy was not applicable to their software products, while 15% declined to answer.

With the Euro now launched, and Y2K soon to be winding down, what’s ahead for 1999 and beyond?

WVNET’s Brooks cites storage management as an area to watch–”the whole issue of integrating storage management across the enterprise, particularly the interoperability of storage and the prospects for data interchange on a rapid, secure basis.”

Says Mark Gorenberg, a partner in the venture capital firm Hummer Winblad Venture Partners, San Francisco, Calif., “Lower cost and plentifulness of storage is a huge trend for people in IT.”

Other trends, Gorenberg notes, include the management of the extended enterprise, the movement of ERP vendors to e-commerce, and the rise of vertical software markets. “Vertical markets are very much in vogue. It’s possible to create a vertical company as a standalone company now. Before, they couldn’t grow large enough.”

Brooks is looking to the new millennium to bring simplification. “It seems that over the 25 years I’ve been following the industry, complexity grows at a fairly good clip until people refuse to tolerate it, then something comes out to simplify it. It used to be the desktop environment was fairly complicated, then Windows came out, and pretty much everybody’s PC looked the same for a few years. There has been another one of those [trends] with the Web, but it’s not done yet. I have a feeling our architectural issues in about four years will look very different. But I’m not seeing anything from the pundits that satisfies me about what could be the next big thing.”

So is life e-beautiful for IT professionals today? Both WVNET’s Brooks and Experian’s Manzano agree there are lots of opportunities for IT professionals, both employment-wise and innovation-wise. And the software they have to work with keeps getting better. “The tools are significantly more compatible,” says Brooks. “You can have a toolbox and have some hope that most of them work together somewhat.” With more packaged products, he notes there are also fewer opportunities to provide solutions for users that the software vendors won’t, “but at the same time you can devote more time to doing other things, like developing new applications, or training, or doing production evaluation, and spend less time looking at code.”

Gorenberg adds, “There are a number of huge opportunities in IT. Outsourcing has created real capitalism for people in IT. Outsourcing and the whole service provider phenomenon are growing like gangbusters. For the first time venture firms are funding service companies, and those companies are growing and going public, making great IT people the new rock stars.”


Remembering The Kings Of A Fragmented Market


At the annual CeBIT conference in Germany last month a Santa Cruz Operation Inc. (SCO) official issued what might be called a considerable boast.

Tamar Newberger, SCO’s marketing director, was talking about her firm’s partnership with IBM Corp. to produce a new unified 64-bit version of their Unix versions for Intel’s upcoming IA-64 platform.

“Together the two companies’ market share will be 50 per cent of the Unix server market,” she predicted.

That would assume both companies can combine and keep up their current market share after the operating system, code-named Project Monterey, is released next year.

Analysts doubt it will be easy. By 2003 Monterey will have only limited penetration on 32-bit Intel machines and “limited success” on IA-64, wrote George Weiss, vice-president and research director of Gartner Group Inc. of Stamford, Conn., in a recent report.

Onlookers will get an idea of Monterey’s future when the partners hold a press conference this month.

The move illustrates vendor expectations for the new platform and raises a question: Will one Unix emerge to challenge Microsoft Corp.’s Windows 2000 (formerly NT) operating system?

Unix has shouldered its way into the Intel world after being concentrated on proprietary platforms. Just under 60 per cent of the Unix servers sold worldwide last year ran on Intel chips, according to International Data Corp. of Framingham, Mass.

The platform beat Windows NT in revenues for server software, US$2.9 billion to $1.4 billion. And as SCO demonstrated at CeBIT in announcing UnixWare 7 for the Data Center, Unix players are trying harder to move to the heights of the enterprise.

The IA-64 technology may accelerate these trends. The possibility of relatively inexpensive Intel servers powering corporations has some vendors smiling and others shaking.

After years of spawning over a dozen versions, the Unix world is slowly shrinking.

Challenged by the success of NT and by the costs of moving their proprietary Unix operating systems to a 64-bit architecture, many vendors have given up.

Instead they’re forming alliances to back the next generation of Unix and letting customers know that down the line they’ll have to switch.

Hewlett-Packard Co., a partner with Intel on IA-64 design, will port HP-UX to the new platform, which will also be licensed to Hitachi and NEC Corp.

But while it will continue to make HP-UX for its PA-RISC servers, HP said eventually it will drop those chips and focus on the Intel market.

Fragmentation will continue, however. Sun Microsystems Inc. is working on an IA-64 version of Solaris Unix, but will keep making Solaris for its SPARC-powered servers. Compaq Computer Corp. will port its Tru64 Unix to the IA-64 platform, but isn’t giving up its Alpha chip-powered servers running Tru64. It will also continue selling servers with SCO’s UnixWare and Monterey.

IBM Corp. will have Monterey as well as its stake in the PowerPC and RS/6000 platforms.

Monterey “could make IBM into a stronger Unix power,” said Weiss, “but it could be several years before it creates the infrastructure needed to be able to market it effectively.”

Generally, vendors say 64-bit Unix on their proprietary systems will be for high-end demands, while 64-bit Unix on Intel will be for less demanding applications and in small to medium-sized organizations.

Meanwhile, Unix contenders are pumping out new high-end features for their current versions, hoping to leave NT behind while they move higher in the enterprise.

For example, Compaq is promising to bring out enhanced clustering capabilities for Tru64 this year, including application and system partitioning.

Hewlett-Packard will add a scalable architecture to HP-UX, allowing four of its V-class RISC servers to run under one copy of the operating system by the summer.

UnixWare 7 for the Data Centre, which sells for US$9,999, is licensed for 150 users, eight CPUs and 32 GB of memory, and is claimed to be able to run more than 10,000 hours continuously. It can be upgraded to support up to 32 processors and 64 GB of memory. It also includes an event-logging feature which allows report logs to be filed into a SQL database.

Company spokesmen say it will meet the “new era of volume economics” offered by the Intel platform.

However, Weiss said he’s skeptical that UnixWare can play in the data centre yet even if it’s optimized for the new Pentium III Xeon processor. The real Intel data centre technology is IA-64, he said, because it will have 64-bit memory addressing capacity.

Besides, he added, a data centre to him means high availability hardware, large disk farms, storage management – things that aren’t deployed using “high volume” (read mass market) equipment.

That leaves only one other Unix version: Linux.

It may sound odd to include an operating system that doesn’t have a major company behind it for service and support or offer high-availability clustering.