
Incremental Weighted Ensemble for Data Streams With Concept Drift


Impact Statement:
In many applications of information systems, learning algorithms have to adapt to concept drift in dynamic environments where data arrives continuously and sequentially over time; such algorithms are called data stream learning algorithms. However, existing chunk-based stream learning algorithms must wait for an entire chunk of data before updating their classifiers, which may cause delayed adaptation. This paper draws on the strengths of both chunk-based learning and online learning and proposes two incremental weighted ensemble models. The proposed algorithms not only resolve the adaptation delay but also retain valuable historical information from the non-drift regions of a local drift.

Abstract:

As a popular strategy for tackling concept drift, chunk-based ensemble methods adapt to a new concept by adjusting the weights of historical classifiers. However, most previous approaches evaluate the historical classifiers on an entire newly arrived chunk, which may cause delayed adaptation. To address this issue, two novel ensemble models, named incremental weighted ensemble (IWE) and incremental weighted ensemble for multi-classification (IWE-M), are proposed. At each time step, all base classifiers are incrementally updated on a newly arrived instance. The instance is then collected into a cache array. Once a data chunk is formed, a new base classifier is created. More specifically, a forgetting mechanism based on a variable-size window is designed to adjust the weight of each base classifier in IWE according to its classification accuracy on the latest instances in an online manner. IWE-M, an extension of IWE, aims to solve multiclass problems with local concept drifts...
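The hybrid chunk-plus-online update loop described in the abstract can be sketched in Python. This is a minimal illustrative sketch, not the authors' exact method: the class and parameter names (`IncrementalWeightedEnsemble`, `chunk_size`, `window_size`) are assumptions, and a fixed-size recent-accuracy window stands in for the paper's variable-size forgetting window.

```python
# Illustrative sketch of the IWE-style loop: each instance incrementally
# updates all base classifiers and refreshes their weights online, while a
# cache accumulates instances until a chunk triggers a new base classifier.
# Names and the fixed-size window are assumptions, not the paper's design.
from collections import deque

class IncrementalWeightedEnsemble:
    def __init__(self, base_learner_factory, chunk_size=100, window_size=50):
        self.base_learner_factory = base_learner_factory  # builds a new base classifier
        self.chunk_size = chunk_size
        self.window_size = window_size   # the paper uses a variable-size window instead
        self.classifiers = []            # list of (classifier, recent-hit deque) pairs
        self.cache = []                  # instances collected toward the next chunk

    def predict(self, x):
        # Weighted vote: each classifier's weight is its accuracy on the
        # latest windowed instances (the forgetting mechanism).
        votes = {}
        for clf, hits in self.classifiers:
            weight = sum(hits) / len(hits) if hits else 1.0
            label = clf.predict(x)
            votes[label] = votes.get(label, 0.0) + weight
        return max(votes, key=votes.get) if votes else None

    def partial_fit(self, x, y):
        # 1) Score each base classifier on the new instance (online weight
        #    update), then incrementally update the classifier itself.
        for clf, hits in self.classifiers:
            hits.append(1 if clf.predict(x) == y else 0)
            clf.partial_fit(x, y)
        # 2) Cache the instance; once a full chunk is formed, train a new
        #    base classifier on the chunk and reset the cache.
        self.cache.append((x, y))
        if len(self.cache) >= self.chunk_size:
            new_clf = self.base_learner_factory()
            for xi, yi in self.cache:
                new_clf.partial_fit(xi, yi)
            self.classifiers.append((new_clf, deque(maxlen=self.window_size)))
            self.cache = []
```

Because weights are refreshed on every instance rather than once per chunk, a classifier that starts failing after a drift loses influence immediately instead of after the next full chunk, which is the adaptation-delay fix the abstract emphasizes.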
Published in: IEEE Transactions on Artificial Intelligence ( Volume: 5, Issue: 1, January 2024)
Page(s): 92 - 103
Date of Publication: 23 November 2022
Electronic ISSN: 2691-4581

