
IBM Systems Journal

Issue 1 • 2004


  • Preface

    Page(s): 3 - 4

    Utility computing is a recent development in information technology (IT) outsourcing, whereby service capacity is provided as needed and the customer pays only for actual use. To provide what amounts to computing on demand, providers face business and technological challenges, including pricing of new services, adopting new business models, automated management of resource provisioning, and management of service level agreements (SLAs).

  • Preparing for utility computing: The role of IT architecture and relationship management

    Page(s): 5 - 19

    In this paper we examine the likely impact of utility computing on information technology (IT) outsourcing. Drawing on a set of eleven outsourcing cases and on IT outsourcing literature, we identify four risks that lessen the potential benefits of IT outsourcing. We consider two approaches to outsourcing: selectively managing a network of outsourcing partners and managing large-scale exclusive partnerships. The firms in our sample introduced a number of popular relationship management practices in order to counter the risks of outsourcing. We describe their practices but then observe that, in addition to the capability of managing their vendor relationships, the firms' ability to generate value from outsourcing depends on the maturity of their IT architectures. We discuss the implications of both vendor relationship management and architecture design capabilities as firms seek the benefits of utility computing, and conclude that both continue to play key roles. We close with some recommendations as to how firms can use relationships to build effective architectures and how an effective architecture built around standards-based technologies and business process components can enable a firm to capitalize on the strategic agility that utility computing offers.

  • Price-at-Risk: A methodology for pricing utility computing services

    Page(s): 20 - 31

    Most companies use century-old cost-plus pricing, a method that is especially inadequate for services on demand because these services have uncertain demand, high development costs, and a short life cycle. In this paper we propose a novel methodology, Price-at-Risk, that explicitly takes uncertainty into account in the pricing decision. By explicitly modeling contingent factors, such as uncertain rate of adoption or demand elasticity, the methodology can account for risk before the pricing decision is taken. The methodology optimizes the expected “net present value,” subject to financial performance constraints, and thus improves on both the cost-based and value-based approaches found in the marketing literature.
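    The abstract does not state the optimization in closed form; as a reading aid, the following formulation is a sketch introduced here, and its symbols ($p$, $\omega$, $v_{\min}$, $\alpha$) are assumptions rather than notation from the paper:

        \max_{p} \; \mathbb{E}_{\omega}\bigl[\mathrm{NPV}(p,\omega)\bigr]
        \quad \text{subject to} \quad
        \Pr\bigl[\mathrm{NPV}(p,\omega) < v_{\min}\bigr] \le \alpha

    Here $p$ is the price, $\omega$ an uncertain scenario (for example, a particular adoption rate or demand elasticity), $\mathrm{NPV}(p,\omega)$ the resulting net present value, $v_{\min}$ the lowest acceptable financial outcome, and $\alpha$ the tolerated probability of falling below it, which is the "at-risk" element the name suggests.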

  • The utility business model and the future of computing services

    Page(s): 32 - 42

    The utility business model is shaped by a number of characteristics that are typical in public services: users consider the service a necessity, high reliability of service is critical, the ability to fully utilize capacity is limited, and services are scalable and benefit from economies of scale. This paper examines the utility business model and its future role in the provision of computing services.

  • A Web content serving utility

    Page(s): 43 - 63

    Utility computing allows users, or customers, to utilize advanced technologies without having to build a dedicated infrastructure. Customers can use a shared infrastructure and pay only for the capacity that each one needs. Each utility offers a specific information technology service, delivered on a pay-as-you-go model. This paper describes the design and development of a content-serving utility (CSU) that provides highly scalable Web content distribution over the Internet. We provide a technology overview of content distribution and a summary of the CSU from a customer perspective. We discuss the technical architecture underlying the service, including topics such as physical infrastructure, core service functions, infrastructure management, security, and usage-based billing. We then focus on the key issues affecting the performance and capacity of both the service infrastructure and the customer Web sites it supports.

  • How to build a WebFountain: An architecture for very large-scale text analytics

    Page(s): 64 - 77

    WebFountain is a platform for very large-scale text analytics applications. The platform allows uniform access to a wide variety of sources, scalable system-managed deployment of a variety of document-level “augmenters” and corpus-level “miners,” and finally creation of an extensible set of hosted Web services containing information that drives end-user applications. Analytical components can be authored remotely by partners using a collection of Web service APIs (application programming interfaces). The system is operational and supports live customers. This paper surveys the high-level decisions made in creating such a system.

  • An architecture for the coordination of system management services

    Page(s): 78 - 96

    Today, system management services are implemented by dedicated subsystems, built using proprietary system management components. These subsystems are customized to automate their operations to the extent feasible. This model has been successful in dedicated enterprise environments, but there are opportunities to broaden the scope of these services to multicustomer utility computing environments while reducing the costs of providing these services. A new model suitable for utility computing is being developed to address these opportunities. This model features several new elements: (1) a repository to represent the state of the remotely managed components and of the services that manage them, (2) a repository of policies and operational constraints, and (3) a set of meta-management services that use existing management services to analyze, construct, and safely execute a complex set of management tasks on remote systems. The meta-management services manage the system management services provided by the utility; they guide and modify the behavior of the services, often as a result of the collective analysis of the state of one or more services. In this paper, we describe requirements and behaviors of such meta-management services and the architecture to provide them. We focus on the components of this architecture that enable and provide effective meta-management services in a utility environment.

  • Using a utility computing framework to develop utility systems

    Page(s): 97 - 120

    In this paper we describe a utility computing framework, consisting of a component model, a methodology, and a set of tools and common services for building utility computing systems. This framework facilitates the creation of new utility computing systems by providing a set of common functions, as well as a set of standard interfaces for those components that are specialized. It also provides a methodology and tools to assemble and re-use resource provisioning and management functions used to support new services with possibly different requirements. We demonstrate the benefits of the framework by describing two sample systems: a life-science utility computing service designed and implemented using the framework, and an on-line gaming utility computing service designed in compliance with the framework.

  • Policy-based automated provisioning

    Page(s): 121 - 135

    The term “policy-based computing” refers to a software paradigm that incorporates a set of decision-making technologies into its management components in order to simplify and automate the administration of computer systems. A significant part of this simplification is achieved by allowing administrators and operators to specify management operations in terms of objectives or goals, rather than detailed instructions that need to be executed. A higher level of abstraction is thus supported, while permitting dynamic adjustment of the behavior of the running system without changing its implementation. This paper focuses on the application of the policy-based software paradigm to the automated provisioning architecture described elsewhere in this issue. We show how the use of policies can enhance utility computing services by making them more customizable and more responsive to business objectives.
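    The paper's policy language and architecture are not reproduced here; the Python sketch below (every name and threshold is hypothetical) only illustrates the core idea of stating an objective declaratively and letting a management component derive the provisioning action:

        # Hypothetical sketch of goal-oriented (policy-based) provisioning.
        # The policy states what should hold; the runtime decides how to get there.
        from dataclasses import dataclass

        @dataclass
        class Policy:
            metric: str        # observed quantity the objective refers to
            target: float      # the goal the administrator cares about
            max_servers: int   # operational constraint

        def provisioning_decision(policy: Policy, observed: dict, allocated: int) -> int:
            """Return a new server count chosen to meet the stated goal."""
            value = observed[policy.metric]
            if value > policy.target and allocated < policy.max_servers:
                return allocated + 1   # scale out toward the objective
            if value < 0.5 * policy.target and allocated > 1:
                return allocated - 1   # release capacity that is no longer needed
            return allocated           # objective already satisfied

        # The operator states only the objective; observations drive the action.
        policy = Policy(metric="response_time_ms", target=200.0, max_servers=10)
        print(provisioning_decision(policy, {"response_time_ms": 340.0}, allocated=3))  # -> 4

    The point of the paradigm, as the abstract notes, is that changing the objective adjusts the behavior of the running system without changing the provisioning code itself.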

  • Web services on demand: WSLA-driven automated management

    Page(s): 136 - 158

    In this paper we describe a framework for providing customers of Web services differentiated levels of service through the use of automated management and service level agreements (SLAs). The framework comprises the Web Service Level Agreement (WSLA) language, designed to specify SLAs in a flexible and individualized way, a system to provision resources based on service level objectives, a workload management system that prioritizes requests according to the associated SLAs, and a system to monitor compliance with the SLA. This framework was implemented as the utility computing services part of the IBM Emerging Technologies Tool Kit, which is publicly available on the IBM alphaWorks™ Web site.
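    WSLA itself is an XML-based specification language and is not reproduced here; the Python sketch below (classes, fields, and numbers are all hypothetical) only illustrates two of the roles the abstract describes, prioritizing requests by SLA and monitoring compliance with service level objectives:

        # Hypothetical sketch: SLA objectives driving workload priority and monitoring.
        from dataclasses import dataclass

        @dataclass
        class ServiceLevelObjective:
            customer: str
            max_response_ms: float   # objective agreed in the SLA
            priority: int            # differentiated service class (lower serves first)

        def order_requests(requests, slas):
            """Workload management: serve requests under stricter SLAs first."""
            return sorted(requests, key=lambda r: slas[r["customer"]].priority)

        def check_compliance(measured_ms, slas):
            """Monitoring: report customers whose objective was violated."""
            return [c for c, ms in measured_ms.items() if ms > slas[c].max_response_ms]

        slas = {
            "gold_corp":  ServiceLevelObjective("gold_corp", 100.0, priority=0),
            "bronze_inc": ServiceLevelObjective("bronze_inc", 500.0, priority=2),
        }
        queue = [{"customer": "bronze_inc"}, {"customer": "gold_corp"}]
        print(order_requests(queue, slas))                                        # gold_corp first
        print(check_compliance({"gold_corp": 140.0, "bronze_inc": 300.0}, slas))  # ['gold_corp']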

  • Utility computing SLA management based upon business objectives

    Page(s): 159 - 178

    It has become increasingly desirable for companies worldwide to outsource their complex e-business infrastructure under the utility computing paradigm by means of service level agreements (SLAs). A successful utility computing provider must be able not only to satisfy its customers' demand for high service-quality standards, but also to fulfill its service-quality commitments based upon business objectives (e.g., cost-effectively minimizing the exposed business impact of service level violations). This paper presents the design rationale of a business-objectives-based utility computing SLA management system, called SAM, along with implementation experiences.
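    The business objective mentioned in the abstract, cost-effectively minimizing the exposed business impact of violations, can be read as a trade-off of the following form; the notation is introduced here and is not taken from the paper:

        \min_{a} \; C(a) \;+\; \mathbb{E}\Bigl[\,\sum_{i} b_i \,\mathbf{1}\{\text{objective } i \text{ violated under } a\}\Bigr]

    where $a$ ranges over candidate resource allocations, $C(a)$ is the cost of operating allocation $a$, $b_i$ is the business impact of violating service level objective $i$, and the expectation is taken over workload uncertainty.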

  • The utility metering service of the Universal Management Infrastructure

    Page(s): 179 - 189

    One of the main characteristics of on demand computing in general and of utility computing services in particular is the “pay-as-you-go” model. To implement this model, one needs a flexible way to meter the services and resources being used. The UMI (Universal Management Infrastructure) architecture, designed to provide common functions that are needed by most, if not all, of the utilities in a utility computing system, therefore includes a metering function. The architecture of the metering system is hierarchical and highly flexible. This paper reviews the metering service architecture and describes how UMI's metering service function is used in the context of utility computing services, for collecting and storing metered data, computing service metrics (which are useful to the data-consuming applications), and feeding the metrics to various consumer modules (e.g., for accounting and billing).
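    As a reading aid, here is a minimal sketch of the pay-as-you-go flow the abstract describes, written in Python; the record format, metric, and rate are assumptions for illustration and are not the UMI metering service's actual interfaces:

        # Hypothetical sketch: collect metered data, compute a service metric,
        # and feed it to a consumer module (accounting and billing).
        from collections import defaultdict

        def aggregate(usage_records):
            """Turn raw usage records into per-customer CPU-hour totals."""
            totals = defaultdict(float)
            for record in usage_records:
                totals[record["customer"]] += record["cpu_hours"]
            return dict(totals)

        def bill(metrics, rate_per_cpu_hour):
            """Consumer module: convert the metric into a pay-as-you-go charge."""
            return {customer: round(hours * rate_per_cpu_hour, 2)
                    for customer, hours in metrics.items()}

        records = [
            {"customer": "acme", "cpu_hours": 12.0},
            {"customer": "acme", "cpu_hours": 3.5},
            {"customer": "globex", "cpu_hours": 7.25},
        ]
        print(bill(aggregate(records), rate_per_cpu_hour=0.40))  # {'acme': 6.2, 'globex': 2.9}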

  • Design of an enablement process for on demand applications

    Page(s): 190 - 203

    In today's business and economic conditions, enterprise customers are demanding information technology (IT) solutions that are cheaper, less complex, and easier to install. At the same time, independent software vendors (ISVs) are seeing revenue from their core licensed offerings erode because of competition and market saturation. Many believe that the answer to these problems is to offer IT solutions by means of utility computing. Like an electric utility, software applications can be offered as on demand services, and customers pay only for what they use. Creating and implementing such utilities is by no means trivial. It requires some expert help and adherence to established standards and guidelines. In this paper we describe the design of such a process, which we call the Application Enablement Program. The process helps ISVs transform their applications into on demand services. This process is structured, repeatable, and globally deployable.


Aims & Scope

Throughout its history, the IBM Systems Journal has been devoted to software, software systems, and services, focusing on concepts, architectures, and the uses of software.


Meet Our Editors

Editor-in-Chief
John J. Ritsko
IBM T. J. Watson Research Center