IBM Journal of Research and Development

Issue 3.4 • May 2007

  • Message

    Page(s): 246 - 247
    PDF (44 KB)
    Freely Available from IEEE
  • Preface

    Page(s): 247 - 249
    PDF (43 KB)

    In our rapidly changing economy, businesses face the challenges of increasing their responsiveness and resiliency to varying market forces while minimizing business costs. Decisions that involve these challenges may catapult a business to success in the marketplace or cause a business to fail. Many examples exist of even a single poor business decision having a devastating impact on a business. Naturally, businesses are using the vast computation power of modern computers to help make crucial decisions. This computational power helps businesses study a large number of scenarios and possibilities in order to reach favorable outcomes, given a particular set of circumstances.

  • Applying supply chain optimization techniques to workforce planning problems

    Page(s): 251 - 261
    PDF (306 KB)

    The IBM Research Division has developed the Resource Capacity Planning (RCP) Optimizer to support the Workforce Management Initiative (WMI) of IBM. RCP applies supply chain management techniques to the problem of planning IBM's needs for skilled labor to satisfy service engagements, such as consulting, application development, or customer support. This paper describes two RCP models and presents two approaches to solving each of them. We also describe the motivation for using one approach over another. The models are built using the Watson Implosion Technology toolkit, which consists of a supply chain model, solvers for analysis and optimization, and an Application Programming Interface (API) for developing a solution. The models that we built solve two core resource planning problems, gap/glut analysis and resource action planning. The gap/glut analysis is similar to material requirements planning (MRP), in which shortages (gaps) and excesses (gluts) of resources are determined on the basis of expected demand. The goal of the resource action planning problem is to determine what resource actions to take in order to fill the gaps and reduce the gluts. The gap/glut analysis engine is currently deployed within the IBM service organization to report gaps and gluts in personnel.

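    As a rough illustration of the gap/glut netting idea described in this abstract (and not of the Watson Implosion Technology toolkit or its API), the following hypothetical sketch nets projected supply of each skill against expected demand period by period; negative values are gaps, positive values are gluts.

        # Minimal MRP-style netting sketch: per-period gaps (shortages) and
        # gluts (excesses) of skilled resources. All data are hypothetical.
        def gap_glut(supply, demand):
            """supply, demand: {skill: [units per period]}.
            Returns {skill: [net per period]}; negative = gap, positive = glut."""
            report = {}
            for skill in sorted(set(supply) | set(demand)):
                s, d = supply.get(skill, []), demand.get(skill, [])
                periods = max(len(s), len(d))
                s = s + [0] * (periods - len(s))
                d = d + [0] * (periods - len(d))
                report[skill] = [si - di for si, di in zip(s, d)]
            return report

        supply = {"java_developer": [10, 10, 12], "project_manager": [3, 3, 3]}
        demand = {"java_developer": [8, 14, 12], "project_manager": [2, 4, 5]}
        print(gap_glut(supply, demand))
        # {'java_developer': [2, -4, 0], 'project_manager': [1, -1, -2]}
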
  • Workforce optimization: Identification and assignment of professional workers using constraint programming

    Page(s): 263 - 279
    PDF (1254 KB)

    Matching highly skilled people to available positions is a high-stakes task that requires careful consideration by experienced resource managers. A wrong decision may result in significant loss of value due to understaffing, underqualification or overqualification of assigned personnel, and high turnover of poorly matched workers. While the importance of quality matching is clear, dealing with pools of hundreds of jobs and resources in a dynamic market generates a significant amount of pressure to make decisions rapidly. We present a novel solution designed to bridge the gap between the need for high-quality matches and the need for timeliness. By applying constraint programming, a subfield of artificial intelligence, we are able to deal successfully with the complex constraints encountered in the field and reach near-optimal assignments that take into account all resources and positions in the pool. The considerations include constraints on job role, skill level, geographical location, language, potential retraining, and many more. Constraints are applied at both the individual and team levels. This paper introduces the technology and then describes its use by IBM Global Services, where large numbers of service and consulting employees are considered when forming teams assigned to customer projects.

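    To give a feel for the kind of hard constraints mentioned in this abstract (job role, skill, location), here is a deliberately tiny, hypothetical sketch that searches all assignments exhaustively; the actual system uses constraint programming to handle pools of hundreds of positions and people, and none of the names or rules below come from the paper.

        # Toy assignment sketch: hard constraints (skill, location) plus a
        # preference for higher skill levels. Exhaustive search over a tiny
        # hypothetical pool; illustrative only.
        from itertools import permutations

        jobs = [{"id": "J1", "skill": "SAP", "location": "Berlin"},
                {"id": "J2", "skill": "Java", "location": "Zurich"}]
        people = [{"id": "C1", "skills": {"SAP"}, "location": "Berlin", "level": 3},
                  {"id": "C2", "skills": {"Java", "SAP"}, "location": "Zurich", "level": 2},
                  {"id": "C3", "skills": {"Java"}, "location": "Zurich", "level": 4}]

        def feasible(job, person):
            return job["skill"] in person["skills"] and job["location"] == person["location"]

        best, best_score = None, -1
        for team in permutations(people, len(jobs)):
            if all(feasible(j, p) for j, p in zip(jobs, team)):
                score = sum(p["level"] for p in team)    # prefer higher skill levels
                if score > best_score:
                    best, best_score = team, score

        print([(j["id"], p["id"]) for j, p in zip(jobs, best)])   # [('J1', 'C1'), ('J2', 'C3')]
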
  • Statistical methods for automated generation of service engagement staffing plans

    Page(s): 281 - 293
    PDF (192 KB)

    In order to successfully deliver a labor-based professional service, the right people with the right skills must be available to deliver the service when it is needed. Meeting this objective requires a systematic, repeatable approach for determining the staffing requirements that enable informed staffing management decisions. We present a methodology developed for the Global Business Services (GBS) organization of IBM to enable automated generation of staffing plans involving specific job roles, skill sets, and employee experience levels. The staffing plan generation is based on key characteristics of the expected project as well as selection of a project type from a project taxonomy that maps to staffing requirements. The taxonomy is developed using statistical clustering techniques applied to labor records from a large number of historical GBS projects. We describe the steps necessary to process the labor records so that they are in a form suitable for analysis, as well as the clustering methods used for analysis, and the algorithm developed to dynamically generate a staffing plan based on a selected group. We also present results of applying the clustering and staffing plan generation methodologies to a variety of GBS projects.

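    The clustering step can be pictured with a small, hypothetical example: each historical project becomes a vector of labor hours by job role, the vectors are normalized to a role mix, and a standard clustering routine groups similar projects. This sketch assumes scikit-learn and invented feature columns; the paper's actual data preparation and clustering method may differ.

        # Hypothetical sketch: cluster projects by role mix so that each
        # cluster can seed a staffing-plan template.
        import numpy as np
        from sklearn.cluster import KMeans

        # rows = projects; columns = hours by role (architect, developer, PM, tester)
        labor = np.array([[120,  900, 200, 300],
                          [100,  850, 180, 280],
                          [600,  300, 250,  80],
                          [650,  280, 240,  90],
                          [ 50, 1500, 150, 500]], dtype=float)

        mix = labor / labor.sum(axis=1, keepdims=True)   # normalize so project size does not dominate
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(mix)
        print("cluster of each project:", km.labels_)
        print("cluster centers (role mix):", np.round(km.cluster_centers_, 2))
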
  • A quantitative optimization model for dynamic risk-based compliance management

    Page(s): 295 - 307
    PDF (199 KB)

    The changing nature of regulation forces businesses to continuously reevaluate the measures taken to comply with regulatory requirements. To prepare for compliance audits, businesses must also implement an effective internal inspection policy that identifies and rectifies instances of noncompliance. In this paper, we propose an approach to compliance management based on a quantitative risk-based optimization model. Our model allows dynamic selection of the optimal set of feasible measures for attaining an adequate level of compliance with a given set of regulatory requirements. The model is designed to minimize the expected total cost of compliance, including the costs of implementing a set of measures, the cost of carrying out periodic inspections, and the audit outcome cost for various compliance levels. Our approach is based on dynamic programming and naturally accounts for the dynamic nature of the regulatory environment. Our method can be used either as a scenario-based management support system or, depending on the availability of reliable input data, as a comprehensive tool for optimally selecting the needed compliance measures and inspection policy. We illustrate our approach in a hypothetical case study.

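    The dynamic-programming flavor of such a model can be sketched with a toy backward induction: states are compliance levels, each period a measure (or none) is chosen, and the terminal cost reflects the audit outcome. All states, costs, and probabilities below are invented for illustration; the paper's model is substantially richer.

        # Toy backward-induction sketch for choosing compliance measures over a
        # planning horizon; hypothetical numbers only.
        T = 4                               # periods before the audit
        levels = [0, 1, 2]                  # 0 = poor, 2 = fully compliant
        audit_cost = {0: 100.0, 1: 30.0, 2: 0.0}
        p_decay = 0.3                       # chance of slipping a level if no measure is taken
        actions = [("none", 0.0, 0), ("minor_fix", 8.0, 1), ("major_fix", 20.0, 2)]

        V = dict(audit_cost)                # value at the audit
        first_period_action = {}
        for t in reversed(range(T)):
            V_new = {}
            for s in levels:
                best_cost, best_action = float("inf"), None
                for name, cost, gain in actions:
                    if gain > 0:            # measure taken: level rises deterministically
                        future = V[min(s + gain, levels[-1])]
                    else:                   # no measure: level may slip
                        future = p_decay * V[max(s - 1, 0)] + (1 - p_decay) * V[s]
                    if cost + future < best_cost:
                        best_cost, best_action = cost + future, name
                V_new[s] = best_cost
                if t == 0:
                    first_period_action[s] = best_action
            V = V_new

        print("expected total cost by starting level:", V)
        print("first-period action by starting level:", first_period_action)
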
  • Modeling of risk losses using size-biased data

    Page(s): 309 - 323
    PDF (277 KB)

    In this paper we present a method for drawing inferences about the process of financial losses that are associated with the operations of a business. For example, for a bank such losses may be related to erroneous transactions, human error, fraud, lawsuits, or power outages. Information about the frequency and magnitude of losses is obtained through the search of a number of sources, such as printed, computerized, or Internet-based publications related to insurance and finance. The data consists of losses that were discovered in the search. We assume that the probability of a loss appearing in the body of sources and also being discovered increases with the magnitude of the loss. Our approach simultaneously models the process of losses and the process of populating the database. The approach is illustrated using data related to operational risk losses that are of special interest to the banking industry.

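    The size-biased sampling assumption can be written down in its textbook form: if losses have true density f and a loss of magnitude x is discovered with probability proportional to a nondecreasing weight w(x), then the density of losses appearing in the database is the tilted density below, and likelihood-based inference about the parameters of f must correct for w. This is the generic relationship only, not the paper's specific model of the loss and discovery processes.

        f_{\mathrm{obs}}(x) \;=\; \frac{w(x)\, f(x)}{\int_0^{\infty} w(u)\, f(u)\,\mathrm{d}u},
        \qquad w(\cdot)\ \text{nondecreasing}.
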
  • Multicommodity network flow approach to the railroad crew-scheduling problem

    Page(s): 325 - 344
    PDF (315 KB)

    We present our solution to the crew-scheduling problem for North American railroads. (Crew scheduling in North America is very different from scheduling in Europe, where it has been well studied.) The crew-scheduling problem is to assign operators to scheduled trains over a time horizon at minimal cost while honoring operational and contractual requirements. Currently, decisions related to crew are made manually. We present our work developing a network-flow-based crew-optimization model that can be applied at the tactical, planning, and strategic levels of crew scheduling. Our network flow model maps the assignment of crews to trains as the flow of crews on an underlying network, where different crew types are modeled as different commodities in this network. We formulate the problem as an integer programming problem on this network, which allows it to be solved to optimality. We also develop several highly efficient algorithms using problem decomposition and relaxation techniques, in which we use the special structure of the underlying network model to obtain significant increases in speed. We present very promising computational results of our algorithms on the data provided by a major North American railroad. Our network flow model is likely to form a backbone for a decision-support system for crew scheduling.

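    A generic (textbook) integer multicommodity min-cost flow formulation of the kind referred to in this abstract treats each crew type k as a commodity on a network with arc set A, arc costs c, node supplies/demands b, and coupling capacities u; the paper adds many scheduling-specific side constraints not shown here.

        \min \sum_{k} \sum_{(i,j) \in A} c_{ij}^{k}\, x_{ij}^{k}
        \quad \text{s.t.} \quad
        \sum_{j:(i,j) \in A} x_{ij}^{k} - \sum_{j:(j,i) \in A} x_{ji}^{k} = b_{i}^{k} \ \ \forall i, k;
        \qquad
        \sum_{k} x_{ij}^{k} \le u_{ij} \ \ \forall (i,j) \in A;
        \qquad
        x_{ij}^{k} \in \mathbb{Z}_{\ge 0}.
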
  • Production design for plate products in the steel industry

    Page(s): 345 - 362
    PDF (343 KB)

    We describe an optimization tool for a multistage production process for rectangular steel plates. The problem we solve yields a production design (or plan) for rectangular plate products in a steel plant, i.e., a detailed list of operational steps and intermediate products on the way to producing steel plates. We decompose this problem into subproblems that correspond to the production stages, where one subproblem requires the design of casts by sequencing slabs which, in turn, have to be designed from mother plates. The design of mother plates is a two-dimensional packing problem. We develop a solution approach that combines mathematical programming models with search techniques from artificial intelligence. The use of these tools provides two types of benefits: improvements in the productivity of the plant and an approach to making the key business performance indicators, such as available-to-promise at a production level, operational.

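    The two-dimensional packing subproblem mentioned for mother-plate design can be pictured with a simple first-fit decreasing-height shelf heuristic over hypothetical plate sizes; the paper actually combines mathematical programming with AI search rather than this heuristic.

        # Toy shelf-packing sketch: pack ordered plates (width, height) onto a
        # mother plate of fixed width, first-fit by decreasing height.
        def shelf_pack(plates, mother_width):
            shelves = []                                  # each: {"height", "used", "plates"}
            for w, h in sorted(plates, key=lambda p: p[1], reverse=True):
                for shelf in shelves:
                    if shelf["used"] + w <= mother_width and h <= shelf["height"]:
                        shelf["used"] += w
                        shelf["plates"].append((w, h))
                        break
                else:
                    shelves.append({"height": h, "used": w, "plates": [(w, h)]})
            return shelves

        orders = [(120, 80), (60, 80), (200, 50), (100, 50), (90, 40)]
        for i, shelf in enumerate(shelf_pack(orders, mother_width=300)):
            print("shelf", i, "height", shelf["height"], "plates", shelf["plates"])
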
  • The material allocation problem in the steel industry

    Page(s): 363 - 374
    PDF (287 KB)

    A major challenge in the initial stage of production planning for the steel industry is the material allocation problem (MAP): finding the best match of orders and materials (steel slabs and coils) with respect to an objective function that takes into account order due dates, preferences for matches, allocated weights, surplus weights of materials, and other factors. The MAP is NP-hard and is difficult to solve optimally. We apply a local search algorithm for the MAP that includes rich moves, such as ejection chain methods. Our algorithm yields considerable cost reductions in a real steelworks. In particular, a two-variable integer programming (TVIP) neighborhood search technique contributed to the cost reductions. TVIP defines a neighborhood space for the local search as a two-variable integer programming problem and efficiently finds a solution in the neighborhood. By using TVIP, we successfully reduce the number of small batches of surplus material.

  • Strategic planning of preparedness budgets for wildland fire management

    Page(s): 375 - 390
    PDF (303 KB)

    As part of the prototyping effort for the preparedness module (PM) of the Fire Program Analysis (FPA) system that IBM developed for five U.S. federal agencies, we designed and implemented an optimization model for determining budgets necessary for managing wildland fires during the initial response period. For a given budget, the model uses a mixed-integer linear optimization approach to maximize the number of acres managed (i.e., land protected from fire damage as a result of the initial response). The model is solved iteratively to establish a function that maps budget levels to the best achievable effectiveness in terms of acres managed. To handle the computationally prohibitive size of the resulting model instances, we devised a heuristic-based solution approach, and we reformulated the client's original model by switching to a continuous time domain and introducing piecewise-linearized functions. As a result, we not only built a tractable model, but also succeeded in delivering a performance speedup of more than 150-fold. We also conducted validation experiments for certain assumptions in the model to assess their impact on the solution quality.

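    The piecewise-linearization step mentioned above is commonly handled in MILP practice (not necessarily exactly as in the paper) with the lambda-formulation: a nonlinear function g is sampled at breakpoints x_1 < ... < x_n and replaced by a convex combination of those samples, with the lambda variables restricted so that at most two adjacent ones are positive (an SOS2 condition).

        x = \sum_{i=1}^{n} \lambda_i x_i, \qquad
        g(x) \approx \sum_{i=1}^{n} \lambda_i\, g(x_i), \qquad
        \sum_{i=1}^{n} \lambda_i = 1, \quad \lambda_i \ge 0, \quad \{\lambda_i\}\ \text{SOS2}.
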
  • Inventory allocation and transportation scheduling for logistics of network-centric military operations

    Page(s): 391 - 407
    PDF (349 KB)

    This paper describes a prototype inventory-placement and transportation-scheduling solution developed in support of the emerging military doctrine of Network-Centric Operations (NCO). NCO refers to an unprecedented ability to share information among cooperating forces, enabled by modern communications and computing technology. The objective of the Network-Centric concept is to collect, disseminate, and react to real-time information in order to improve the performance of the U.S. Army as a fighting force. One problem that arises in the logistics domain involves the maintenance of combat vehicles. We seek to determine the improvement, if any, made possible by exploiting accurate information on the status of available repair parts inventory, the current locations of mobile supply points, and the demand for parts. We describe logistics algorithms for maximizing the operational availability of combat vehicles by producing flexible, optimized inventory and delivery plans that decrease replenishment times and prioritize parts allocations and repairs. Our algorithms are designed to leverage real-time information available from modern communications and inventory tracking technology by employing state-of-the-art mathematical optimization models. Our simulations indicate that Network-Centric Logistics (NCL) can significantly improve combat vehicle availability in comparison with current practice.

  • Marketing event optimization

    Page(s): 409 - 419
    PDF (261 KB)

    We present an algorithm for performing multi-channel marketing event optimization. Previous related work, which made use of a complex mixed-integer linear program to generate a marketing plan, was capable of providing plans for only groups of individuals, referred to as micro-segments, and considered only the direct mail channel. Today, most firms use multiple channels, such as e-mail, call centers, and direct mail, to contact customers with marketing events. Our method successfully overcomes many past restrictions and reduces the run time for the marketing scenarios so that computation can take place on a daily basis. The algorithm used is an advanced form of a greedy heuristic that, given a set of marketing events by channel, a set of individuals, some constraints, and the concepts of saturation and cannibalization, determines the optimal set of marketing events to present to an individual customer. The algorithm is embedded in a solution that is designed to operate as an interactive what-if scenario planner, or as a batch-oriented job that can continually maintain each customer's future contact plans in an optimal fashion.

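    The saturation idea in a greedy heuristic of this kind can be illustrated with a toy per-customer selection loop: each pick is the remaining event with the highest value after discounting for how often that customer's channel has already been used. The events, values, and discount are invented, and the paper's heuristic also handles cannibalization and many operational constraints.

        # Toy greedy event selection for one customer: contact cap plus a
        # per-channel saturation discount. Hypothetical data.
        events = [("spring_email", "email", 4.0),
                  ("upgrade_email", "email", 3.5),
                  ("callcenter_offer", "call", 6.0),
                  ("catalog_mailing", "mail", 5.0)]
        max_contacts = 3
        saturation = 0.5          # each repeat contact on a channel keeps 50% of the value

        chosen, per_channel = [], {}
        for _ in range(max_contacts):
            best, best_gain = None, 0.0
            for event in events:
                if event in chosen:
                    continue
                name, channel, value = event
                gain = value * saturation ** per_channel.get(channel, 0)
                if gain > best_gain:
                    best, best_gain = event, gain
            if best is None:
                break
            chosen.append(best)
            per_channel[best[1]] = per_channel.get(best[1], 0) + 1

        print([name for name, _, _ in chosen])   # ['callcenter_offer', 'catalog_mailing', 'spring_email']
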
  • Optimizing marketing planning and budgeting using Markov decision processes: An airline case study

    Page(s): 421 - 431
    PDF (667 KB)

    Many companies have no reliable way to determine whether their marketing money has been spent effectively, and their return on investment is often not evaluated in a systematic manner. Thus, a compelling need exists for computational tools that help companies to optimize their marketing strategies. For this purpose, we have developed computational models of customer buying behavior in order to determine and leverage the value generated by a customer within a given time frame. The term “customer value” refers to the revenue generated from a customer's buying behavior in relation to the costs of marketing campaigns. We describe a new tool, the IBM Customer Equity Lifetime Management Solution (CELM), that helps to determine long-term customer value by means of dynamic programming algorithms in order to identify which marketing actions are the most effective in improving customer loyalty and hence increasing revenue. Simulation of marketing scenarios may be performed in order to assess budget requirements and the expected impact of marketing policies. We present a case study of a pilot program with a leading European airline, and we show how this company optimized its frequent flyer program to reduce its marketing budget and increase customer value.

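    The dynamic-programming core of such a tool can be illustrated with a tiny value-iteration example: states are loyalty tiers, actions are marketing treatments, rewards are revenue net of campaign cost, and the fixed point gives a long-term customer value and a best action per tier. Every number and label below is invented; CELM's actual model is far more detailed.

        # Minimal value-iteration sketch for a customer-value MDP; hypothetical data.
        states = ["occasional", "regular", "frequent"]
        actions = ["no_contact", "promo"]
        gamma = 0.9          # discount factor for future customer value

        # P[a][s] = {next_state: probability}; R[a][s] = expected immediate net revenue
        P = {
            "no_contact": {"occasional": {"occasional": 0.9, "regular": 0.1},
                           "regular":    {"occasional": 0.3, "regular": 0.6, "frequent": 0.1},
                           "frequent":   {"regular": 0.3, "frequent": 0.7}},
            "promo":      {"occasional": {"occasional": 0.6, "regular": 0.4},
                           "regular":    {"regular": 0.6, "frequent": 0.4},
                           "frequent":   {"regular": 0.1, "frequent": 0.9}},
        }
        R = {
            "no_contact": {"occasional": 1.0, "regular": 4.0, "frequent": 10.0},
            "promo":      {"occasional": 0.5, "regular": 5.0, "frequent": 11.0},
        }

        V = {s: 0.0 for s in states}
        for _ in range(200):                      # value iteration
            V = {s: max(R[a][s] + gamma * sum(p * V[s2] for s2, p in P[a][s].items())
                        for a in actions)
                 for s in states}

        policy = {s: max(actions,
                         key=lambda a: R[a][s] + gamma * sum(p * V[s2] for s2, p in P[a][s].items()))
                  for s in states}
        print("long-term value per state:", {s: round(v, 1) for s, v in V.items()})
        print("best action per state:", policy)
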
  • An SQL-based cost-effective inventory optimization solution

    Page(s): 433 - 445
    PDF (395 KB)

    Successful implementation of an inventory optimization solution requires significant effort and can pose certain risks to companies implementing such solutions. Depending on the complexity of the requirements, the solution may also involve a substantial IT investment. In this paper, we present a cost-effective solution for inventory optimization that can be useful for small and medium-sized businesses with limited IT budgets. This solution can be implemented on any application platform that is capable of processing basic SQL™ (Structured Query Language) commands. The solution eliminates the need to purchase additional software and has a framework in which sales data in an Enterprise Resource Planning (ERP) system are accessed, demand statistics based on this data are generated along with other key parameters, and optimal inventory policies, such as those involving safety stocks and lot sizes, are calculated and reported.

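    In their textbook form, the safety-stock and lot-size policies mentioned above reduce to simple closed-form expressions over demand statistics that can be obtained with aggregate queries on sales history. The formulas below are the standard ones, shown only to illustrate the kind of quantities involved (mean per-period demand μ, demand standard deviation σ, lead time L, service-level factor z_α, order cost K, annual demand D, holding cost h); the paper's exact statistics and policies may differ.

        \mathrm{SS} = z_{\alpha}\,\sigma\sqrt{L}, \qquad
        \mathrm{ROP} = \mu L + \mathrm{SS}, \qquad
        Q^{*} = \sqrt{\frac{2 K D}{h}}.
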
  • Inventory budget optimization: Meeting system-wide service levels in practice

    Page(s): 447 - 464
    PDF (352 KB)

    The work described in the literature on inventory and supply chain management has advanced greatly over the last few decades and now covers many aspects and challenges of applied supply chain management. In this paper we describe an approach that combines many of these academic aspects in a practical way to manage the spare parts logistics at a German automobile manufacturer. The basic problem is a single-echelon inventory problem with a system-wide service-level requirement and the possibility of issuing emergency orders. There exist two related optimization problems: one is to maximize the system-wide service level under the constraint of a given budget; the other is to minimize the budget for a given system-wide service level. The most important requirements and constraints considered are a detailed cost structure, different packaging sizes, capacity constraints, several storage zones, the decision whether or not to stock a product, stochastic lead times, highly sporadic demands, and the stability of the optimization result over time. Our approach has been implemented successfully in an automotive spare parts planning environment. The complete solution package integrates with mySAP ERP®, the SAP Enterprise Resource Planning system, and with APO 4.0, the SAP Advanced Planning and Optimization system. A detailed description of the model is given and results are presented.

  • Insider attack and real-time data mining of user behavior

    Page(s): 465 - 475
    PDF (743 KB)

    Early detection of employees' improper access to sensitive or valuable data is critical to limiting negative financial impacts to an organization, including regulatory penalties for misuse of customer data that results from these insider attacks. Implementing a system for detecting insider attacks is a technical challenge that also involves business-process changes and decision making that prioritizes the value of enterprise data. This paper focuses primarily on the techniques for detecting insider attacks, but also discusses the processes required to implement a solution. In particular, we describe a behavior-anomaly-based system for detecting insider attacks. The system uses peer-group profiling, composite feature modeling, and real-time statistical data mining. The analytical models are refined and used to update the real-time monitoring process. This continues in a cyclical manner as the system self-tunes. Finally, we describe an implementation of this detection approach in the form of the IBM Identity Risk and Investigation Solution (IRIS).

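    A stripped-down illustration of peer-group profiling is a simple standardized score of a user's activity against peers in the same role; the data and threshold below are hypothetical, and IRIS itself relies on composite feature models and real-time statistical mining rather than this toy score.

        # Toy peer-group anomaly score: flag a user whose daily count of
        # sensitive-record accesses sits far above the peer-group mean.
        import statistics

        peer_daily_accesses = [12, 9, 15, 11, 10, 13, 8, 14, 12, 11]   # same job role
        user_daily_accesses = 41

        mu = statistics.mean(peer_daily_accesses)
        sigma = statistics.stdev(peer_daily_accesses)
        z = (user_daily_accesses - mu) / sigma

        print(f"z-score vs. peers: {z:.1f}", "-> alert" if z > 3.0 else "-> ok")
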
  • Advances in analytics: Integrating dynamic data mining with simulation optimization

    Page(s): 477 - 487
    PDF (234 KB)

    We introduce a simulation optimization approach that is effective in guiding the search for optimal values of input parameters to a simulation model. Our proposed approach, which includes enhanced data mining methodology and state-of-the-art optimization technology, is applicable to settings in which a large amount of data must be analyzed in order to discover relevant relationships. Our approach makes use of optimization technology not only for data mining but also for optimizing the underlying simulation model itself. A market research application embodying agent-based simulation is used to illustrate our proposed approach.

  • Mixed-integer nonlinear programming: Some modeling and solution issues

    Page(s): 489 - 497
    PDF (491 KB)

    We examine various aspects of modeling and solution via mixed-integer nonlinear programming (MINLP). MINLP has much to offer as a powerful modeling paradigm. Recently, significant advances have been made in MINLP solution software. To fully realize the power of MINLP to solve complex business optimization problems, we need to develop knowledge and expertise concerning MINLP modeling and solution methods. Some of this can be drawn from conventional wisdom of mixed-integer linear programming (MILP) and nonlinear programming (NLP), but theoretical and practical issues exist that are specific to MINLP. This paper discusses some of these, concentrating on an aspect of a classical facility location problem that is well-known in the MILP literature, although here we consider a nonlinear objective function.

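    As a generic illustration of the kind of model discussed (not necessarily the exact variant studied in the paper), take an uncapacitated facility location problem in which the operating cost at an open facility is a nonlinear function g_j of the demand it serves; the fixed opening costs f_j, demands d_i, binary open/close decisions y_j, and assignment variables x_ij make it a MINLP.

        \min \ \sum_{j} f_j\, y_j + \sum_{j} g_j\!\Big(\sum_{i} d_i\, x_{ij}\Big)
        \quad \text{s.t.} \quad
        \sum_{j} x_{ij} = 1 \ \ \forall i; \qquad
        x_{ij} \le y_j \ \ \forall i, j; \qquad
        x_{ij} \in \{0,1\}, \ \ y_j \in \{0,1\}.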

Aims & Scope

The IBM Journal of Research and Development is a peer-reviewed technical journal, published bimonthly, which features the work of authors in the science, technology and engineering of information systems.


Meet Our Editors

Editor-in-Chief
Clifford A. Pickover
IBM T. J. Watson Research Center