The optimal pixel expansion of an (n, n) visual cryptographic scheme (VCS) was proven to be 2^(n-1) in 1995, and that of a (2, n)-VCS was derived in 2002. Yet most of the best known pixel expansions of (k, n)-VCSs for k ≥ 3 have not been improved since 1996. Whether they are already optimal, and if not, how to find the optimums, has never been explored. In this paper, we model the minimization of the pixel expansion of a (k, n)-VCS as an integer linear program to obtain the optimum solution. Computational results demonstrate that our integer linear program is simple, effective in obtaining the optimum solution, and flexible enough to cope with various types of (k, n)-VCSs.
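To make the minimization problem concrete, the following sketch brute-forces the smallest pixel expansion of a (2, 2)-VCS; this is an illustrative exhaustive search over basis matrices, not the integer linear program of the paper, and the function name and search bound are assumptions for the example:

```python
from itertools import product

def min_pixel_expansion_2_2(max_m=4):
    """Brute-force the smallest pixel expansion m of a (2, 2)-VCS.

    A candidate scheme is a pair of 2 x m binary basis matrices:
    S0 encodes a white pixel, S1 a black pixel. We require:
      * security: each participant's row has the same Hamming weight
        in S0 and in S1 (so a single share reveals nothing), and
      * contrast: stacking both shares (bitwise OR of the two rows)
        is strictly darker for S1 than for S0.
    """
    for m in range(1, max_m + 1):
        rows = list(product((0, 1), repeat=m))
        # Enumerate the two rows of S0 and the two rows of S1.
        for r0a, r0b, r1a, r1b in product(rows, repeat=4):
            # Security: per-participant row weights match across S0 and S1.
            if sum(r0a) != sum(r1a) or sum(r0b) != sum(r1b):
                continue
            w0 = sum(a | b for a, b in zip(r0a, r0b))  # stacked white pixel
            w1 = sum(a | b for a, b in zip(r1a, r1b))  # stacked black pixel
            if w1 > w0:  # contrast condition met
                return m
    return None

print(min_pixel_expansion_2_2())  # smallest feasible m for (2, 2)
```

For (2, 2) the search returns m = 2, matching the known optimum from the 1995 result (2^(n-1) with n = 2), e.g. with S0 = [[1,0],[1,0]] and S1 = [[1,0],[0,1]].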