Enterprise System Planning: Using tomorrow's technologies on yesterday's equipment.
by Ted Vukelich
"Sir, we need to start formulating a plan to replace our desktops every two years to stay poised for rapid growth and expansion."
Three years ago you would have received praise for thinking ahead and taking the initiative to stay ahead of the "IT game." But in today's world of application integration and online computing, this statement will quickly make you look out of touch. Not only must we consider how to add computing tools to our company's arsenal, we must find ways to cut costs and extend the longevity of our existing systems.
Ask any experienced MCP and they will tell you that they are often asked to contribute to customer system reviews and make recommendations on how to use existing hardware and software with new technologies. This usually amounts to figuring out how to run system-intensive applications on outdated machines. We are constantly asked, for example, "How do we make a system-intensive application such as SQL Server run on our 486s?" Until recently, the honest answer would have been, "You don't." But with an emerging technology known as server-based computing, new applications can be easily deployed on these older systems, saving the customer money and administrators needless headaches.
In the past it was standard to say that upgrading our desktops would increase employee performance by allowing them to work faster and with fewer "glitches." And though this still holds true, in today's world of mixed operating systems and mission-critical software, many companies are turning to the server-based computing model to plan their long-term IT strategy. As MCPs, we are expected to be knowledgeable about nearly all of today's technologies and to make recommendations on them. Many MCPs have asked me how to handle a situation where intensive applications need to run on subpar systems. As an MCSE and support engineer for a company that specializes in server-based computing products, I have seen firsthand the amount of time both clients and administrators have saved by implementing this model. The key now is for all of us to be at least knowledgeable about what's out there so we can better serve our customers' needs.
Though server-based computing has been around for some time, we are just recently seeing products that truly leverage the network model in a manner that reduces costs, saves time, and provides a competitive advantage. Instead of having your IT staff roll out applications by going to every office PC and installing them, we can now install a piece of software once and make it available to the entire enterprise with a few mouse clicks. Not only can we save precious dollars, but we can also implement a long-term strategy that isn't dependent on replacing all hardware with the latest "fat client."
So what do we need to implement this, and how does it really work? The good news is that we can switch to this network model painlessly and can even use what we already have. A true server-based network works like this: imagine all of your resources residing on your server, with virtually nothing on end-user desktops. Your desktops connect to your server and are presented with a familiar desktop. However, since all of your applications run on the server, the only traffic crossing your network is keystrokes, mouse clicks, and screen updates. Granted, your server has to be able to handle this load. But if you use a product such as Citrix's MetaFrame, you can implement this strategy easily while leveraging your existing infrastructure.
For a company with hundreds of workstations and critical computing needs, this can save thousands of hours while optimizing current resources. By removing the dependence on individual workstations, a company is better positioned for rapid growth and change within the business environment, able to stay on top of technological advances rather than be left behind while competitors move forward.
ATTENTION MCPs: YOU CAN DO THIS!
O.K., this all sounds great, but can it be implemented and really deliver what the customer expects? Yes, it can, if you are aware of the existing resources to make it happen. The first question you need to address is: will this model be feasible for my client? If you're in an office of four machines, probably not. However, if your office is planning to grow at a fast pace, it is imperative that the computer system be able to grow as well. If we spend all of our time setting up bigger and better workstations, it uses up time we could be spending on network optimization or something else that will make us shine.
Let's take a company with over 10,000 users. It is no longer feasible for a company that large to constantly face upgrading each desktop with the latest software release or hardware driver. It must be able to make changes to its entire network quickly and without problems. With a product such as MetaFrame, an administrator can simply install the needed application once and have users connect to the application on that server. This saves costs in the short term and allows the company to grow using its existing infrastructure.
The beauty of all this is that even though the administrator is able to "lock down" the production environment people connect to, users can still maintain control of their local machines if there is a need to. Also, since everything is on the server, there is no need for a "beefed-up" desktop machine. The only true function of the client device is to send keystrokes and mouse clicks to the session on the server you are attached to. Citrix's MetaFrame product, which runs on top of Microsoft's Terminal Server, can be used with most client operating systems. This alleviates the need to change operating systems in order to meet your applications' needs. It also offers users the ease of a Windows interface no matter what operating system they are connecting from.
THE MEAT: METAFRAME
So what are the ins and outs for us to worry about? So good of you to ask! First, Citrix MetaFrame, in conjunction with Microsoft's Terminal Server, makes our job far easier by allowing seamless integration of different client operating systems. MetaFrame is installed on the server, and from there you can "push" the client software out to the desktops. These clients include all Windows clients as well as Linux, Unix, Macintosh, and Java, just to name a few.
MetaFrame allows a user to connect either to a pre-configured desktop or to a published application. This means you can set up separate desktops for your different departments. For example, the marketing department can connect to the marketing desktop while the engineering division connects to a development desktop that doesn't have the marketing applications on it. You can make these desktops available across subnets as well as to remote users. This is ideal for salespeople who are desperate for system access but still have a P100 laptop.
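To make the published-application idea concrete, here is a rough sketch of what a plain-text connection file for the ICA client might look like. The server address, application name, and specific keys below are invented for illustration; the exact settings vary by client version, so treat this as an assumption rather than a definitive template.

```ini
; hypothetical connection file (marketing.ica) -- the address and
; the "Marketing Desktop" name are made up for this example
[WFClient]
Version=2

[ApplicationServers]
Marketing Desktop=

[Marketing Desktop]
; point the client at the MetaFrame server
Address=10.0.0.25
; a leading # marks this as a published application
; rather than a path to an executable on the server
InitialProgram=#Marketing Desktop
TransportDriver=TCP/IP
```

Because the file only names a published application and a server, the same few lines work whether the client is a new Pentium or that salesperson's P100 laptop.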
As for day-to-day administration, MetaFrame allows you to control what a user sees from the server. This makes walking to a user's desk to change his wallpaper a thing of the past. If a user needs an application, you can publish it to his server desktop or set him up to connect to just that application. There are also utilities for load balancing across multiple servers, as well as installation add-ons to assist in rolling out an application across multiple servers.
THE CUSTOMER IS WHO REALLY WINS
Clients have often asked me how much of my time is spent on client-side issues. The answer usually ranges anywhere from 20 to 40 percent. Almost all of this is due to errors that stem from users having too much control over, or access to, the system. Even a company with just 50 users can end up paying thousands of dollars a month to correct user mistakes. NT does offer significant ways to alleviate these mistakes using system policies, but in an enterprise environment those policies don't cover everything. With MetaFrame, we control the actual desktop a user sees when they connect. We can control which applications are used in the production environment while still letting users access their local settings.
Another benefit of MetaFrame is that users can connect to any pre-configured desktop they have rights to. This is imperative for users who need to connect to different application sets. Instead of the traditional model, which forces users to hop from PC to PC to gain access to these applications, the server-based model allows applications to run independently of what's installed on the desktop. And no matter the speed of the user's desktop, they can still run these applications efficiently.
REMOTE USERS CAN CONNECT TOO!
In the past, remote users have usually been inhibited by slow connections, limited access, and the lack of powerful laptops. With MetaFrame, even remote users can connect to a desktop or published application. This is a huge benefit because it gives mobile users seamless access to all of the applications they have at the office.
As more companies turn to the Web for remote-access solutions, it is important that their networks leverage the Web to give remote users all of the resources available at the office. With MetaFrame, administrators can easily configure web pages with links to applications that a user can access directly through a web browser. The Published Application Manager utility includes a feature that allows administrators to configure an application to be accessed this way. The software walks you through this step by step using a graphical tool where you can specify the exact settings the application will run with.
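For a sense of what the browser side boils down to, the generated page is essentially an ordinary link to a connection file. The page and the file name below are invented for illustration; the actual markup Published Application Manager produces will differ.

```html
<!-- hypothetical page; "expenses.ica" is an invented file name.
     Clicking the link downloads the connection file, which the
     browser hands to the installed ICA client to launch the
     application session on the server. -->
<html>
<head><title>Company Applications</title></head>
<body>
<h1>Applications</h1>
<a href="expenses.ica">Launch Expense Reporting</a>
</body>
</html>
```

The heavy lifting happens on the server; the web page just delivers the pointer to it, which is why any machine with a browser and the client software can participate.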
As companies continue to grow at a rapid pace, scalable computer systems will be imperative for survival. Replacing desktops and upgrading software user by user will become a thing of the past. There will be no need to replace existing platforms on the client side if we can leverage our servers to perform the bulk of the work. With the explosion of e-commerce and Internet business systems, our systems must be able to integrate with one another for data sharing and transaction processing.
In the future we can expect an increased need to share internal information with our business partners. This will become increasingly important with the infusion of interlinked business systems that require access to several data resources. Accomplishing this under the traditional networking model can take considerable time and unnecessary legwork for an IT department. With a server-based computing model, administrators gain central control over data warehouses, allowing them to make changes seamlessly across multiple locations.
Though we will continue to see "power desktops" that can do more, it is imperative to remember that the backbone of our network is the server, and we must allow it to be scalable to fit our business needs. Warehousing our information so that it can be accessed by differing operating systems is the future, and many large companies have already adopted this strategy.