Manfred K. Warmuth

Manfred Klaus Warmuth
Alma mater: University of Colorado Boulder
Known for: Computational learning theory
Awards: Elected to the Leopoldina (2021)
Scientific career
Fields: Computer science
Institutions: University of California, Santa Cruz
Doctoral advisor: Hal Gabow
Doctoral students: Yoav Freund

Manfred Klaus Warmuth is a computer scientist known for his pioneering research in computational learning theory.[1] He is a Distinguished Professor emeritus at the University of California, Santa Cruz.

Education and career

After studying computer science at the University of Erlangen–Nuremberg, earning a diploma in 1978, Warmuth went to the University of Colorado Boulder for graduate study, earning a master's degree there in 1980 and completing his Ph.D. in 1981.[2] His doctoral dissertation, Scheduling on Profiles of Constant Breadth, was supervised by Harold N. Gabow.[3]

After postdoctoral research at the University of California, Berkeley and Hebrew University of Jerusalem, Warmuth joined the University of California, Santa Cruz in 1983, became Distinguished Professor there in 2017, and retired as a professor emeritus in 2018. He was a visiting faculty member at Google Brain from 2019 to 2020.[4]

Contributions

With his student Nick Littlestone,[3] Warmuth published the weighted majority algorithm for combining the predictions of multiple predictors in 1989.[5][WM]
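The algorithm keeps one weight per predictor, follows the weighted majority vote, and multiplicatively shrinks the weights of the predictors that were wrong. The Python sketch below shows the deterministic variant of this scheme; the function name, input format, and the penalty parameter beta are illustrative assumptions, not taken from the paper.

    # Illustrative sketch of a deterministic weighted-majority combiner.
    # Each trial supplies one 0/1 prediction per expert plus the true label;
    # experts that err have their weight multiplied by beta (0 < beta < 1).
    def weighted_majority(trials, labels, beta=0.5):
        n_experts = len(trials[0])
        weights = [1.0] * n_experts      # every expert starts with weight 1
        mistakes = 0
        for preds, y in zip(trials, labels):
            # weighted vote: total weight behind prediction 1 vs. prediction 0
            w1 = sum(w for w, p in zip(weights, preds) if p == 1)
            w0 = sum(w for w, p in zip(weights, preds) if p == 0)
            guess = 1 if w1 >= w0 else 0
            if guess != y:
                mistakes += 1
            # shrink the weights of the experts that erred on this trial
            weights = [w * beta if p != y else w
                       for w, p in zip(weights, preds)]
        return mistakes

    # Toy run with three experts, the first of which is always correct:
    # weighted_majority([[1, 0, 0], [0, 1, 1], [1, 1, 0]], [1, 0, 1]) returns 2.

Littlestone and Warmuth show that the number of mistakes made by the combined predictor is bounded by a constant times the mistakes of the best single expert plus a term logarithmic in the number of experts.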

Warmuth was also a coauthor of an influential 1989 paper in the Journal of the ACM, with Anselm Blumer, Andrzej Ehrenfeucht, and David Haussler, introducing the Vapnik–Chervonenkis dimension to computational learning theory.[6][VC] With the same authors, he also introduced Occam learning in 1987.[7][OR]
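The VC dimension of a concept class is the size of the largest set of points that the class can shatter, i.e. label in every possible way. As a small illustration not drawn from the article, the Python sketch below checks shattering for one-dimensional threshold classifiers, a class whose VC dimension is 1; the function name and point-set format are hypothetical.

    from itertools import product

    # Illustrative only: does the class of thresholds h_t(x) = [x >= t]
    # shatter a given finite point set? Shattering some set of size d
    # (and no set of size d + 1) means the VC dimension is d.
    def thresholds_shatter(points):
        # candidate thresholds: each point itself, plus one above the maximum
        candidates = sorted(points) + [max(points) + 1.0]
        for labeling in product([0, 1], repeat=len(points)):
            realized = any(
                all((1 if x >= t else 0) == y for x, y in zip(points, labeling))
                for t in candidates
            )
            if not realized:
                return False
        return True

    print(thresholds_shatter([0.0]))        # True: a single point is shattered
    print(thresholds_shatter([0.0, 1.0]))   # False: labeling (1, 0) is unrealizable

The Journal of the ACM paper ties the number of examples needed for distribution-free (PAC) learning to this dimension.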

Recognition

In 2021, Warmuth became a member of the German National Academy of Sciences Leopoldina.[4]

Selected publications

VC. Blumer, Anselm; Ehrenfeucht, Andrzej; Haussler, David; Warmuth, Manfred K. (1989), "Learnability and the Vapnik–Chervonenkis dimension", Journal of the ACM, 36 (4): 929–965, doi:10.1145/76359.76371, MR 1072253, S2CID 1138467; a preliminary version, "Classifying learnable geometric concepts with the Vapnik–Chervonenkis dimension", was presented at the ACM Symposium on Theory of Computing (STOC 1986), doi:10.1145/12130.12158
OR. Blumer, Anselm; Ehrenfeucht, Andrzej; Haussler, David; Warmuth, Manfred K. (1987), "Occam's razor", Information Processing Letters, 24 (6): 377–380, doi:10.1016/0020-0190(87)90114-1, MR 0896392
WM. Littlestone, Nick; Warmuth, Manfred K. (1994), "The weighted majority algorithm", Information and Computation, 108 (2): 212–261, doi:10.1006/inco.1994.1009, MR 1265851; announced at the IEEE Symposium on Foundations of Computer Science (FOCS 1989), doi:10.1109/SFCS.1989.63487

References

  1. Manfred Warmuth, Simons Institute for the Theory of Computing, retrieved 2023-05-17
  2. "Manfred K. Warmuth", IEEE Xplore, IEEE, retrieved 2023-05-17
  3. Manfred K. Warmuth at the Mathematics Genealogy Project
  4. Warmuth, Manfred K., "Curriculum Vita" (PDF), German National Academy of Sciences Leopoldina
  5. Blum, Avrim; Mansour, Yishay (2007), "Learning, regret minimization, and equilibria", in Nisan, Noam; Roughgarden, Tim; Tardos, Éva; Vazirani, Vijay V. (eds.), Algorithmic Game Theory, Cambridge University Press, pp. 79–101, ISBN 978-0-521-87282-9, MR 2391751; see 4.3.2 Randomized Weighted Majority Algorithm, pp. 85–86
  6. Kearns, Michael J.; Vazirani, Umesh V. (1994), An Introduction to Computational Learning Theory, MIT Press, Cambridge, MA, p. 70, ISBN 0-262-11193-4, MR 1331838
  7. Valiant, Leslie G., "A view of computational learning theory", in Meyrowitz, Alan L.; Chipman, Susan (eds.), Foundations of Knowledge Acquisition, The Springer International Series in Engineering and Computer Science, vol. 195, Springer, pp. 263–289, doi:10.1007/978-0-585-27366-2_8; see p. 280