Abstract
Imagine a large building with many corridors. A robot cleans these corridors in a greedy fashion: the next corridor cleaned is always the dirtiest incident to its current location. We determine bounds on the minimum, s(G), and maximum, S(G), number of time steps (over all edge weightings) before every edge of a graph G has been cleaned. We show that Eulerian graphs have a self-stabilizing property that holds for any initial edge weighting: after the initial cleaning of all edges, every subsequent cleaning of all edges requires exactly s(G) time steps. Finally, we show that the only self-stabilizing trees are a subset of the superstars.
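The greedy rule described in the abstract can be illustrated with a minimal simulation sketch. The dirt dynamics below are an assumption for illustration only (the paper's precise edge-weighting model may differ): each corridor's dirt level is a nonnegative weight, cleaning an edge resets its weight to zero, every other edge gains one unit of dirt per time step, and the robot always traverses the dirtiest edge incident to its current vertex.

```python
from collections import defaultdict

def greedy_clean(edges, start, steps):
    """Simulate a robot that repeatedly cleans the dirtiest incident edge.

    edges: dict mapping frozenset({u, v}) -> initial dirt level (edge weight)
    start: starting vertex
    steps: number of time steps to simulate
    Returns the sequence of edges cleaned, as sorted tuples.
    """
    # Adjacency structure: vertex -> list of incident edges.
    incident = defaultdict(list)
    for e in edges:
        for v in e:
            incident[v].append(e)

    dirt = dict(edges)
    pos = start
    trace = []
    for _ in range(steps):
        # Greedy rule: clean the dirtiest edge incident to the current vertex.
        e = max(incident[pos], key=lambda edge: dirt[edge])
        trace.append(tuple(sorted(e)))
        dirt[e] = 0                        # cleaning resets the dirt (assumed model)
        for other in edges:
            if other != e:
                dirt[other] += 1           # all other corridors get dirtier (assumed)
        (pos,) = (set(e) - {pos}) or {pos} # move across the cleaned corridor
    return trace
```

On a triangle (an Eulerian graph) with distinct initial weights, three steps suffice to clean every edge, matching the intuition that s(G) equals the number of edges for graphs the robot can traverse without repetition.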
Additional information
Partially supported by grants from NSERC.
Cite this article
Messinger, M.E., Nowakowski, R.J. The robot cleans up. J Comb Optim 18, 350–361 (2009). https://doi.org/10.1007/s10878-009-9236-7