Distributed Submodular Maximization
- Submitting institution
-
University of Edinburgh
- Unit of assessment
- 11 - Computer Science and Informatics
- Output identifier
- 59250457
- Type
- D - Journal article
- DOI
-
-
- Title of journal
- Journal of Machine Learning Research
- Article number
- 235
- First page
- 1
- Volume
- 17
- Issue
- -
- ISSN
- 1532-4435
- Open access status
- Out of scope for open access requirements
- Month of publication
- December
- Year of publication
- 2016
- URL
-
-
- Supplementary information
-
-
- Request cross-referral to
- -
- Output has been delayed by COVID-19
- No
- COVID-19 affected output statement
- -
- Forensic science
- No
- Criminology
- No
- Interdisciplinary
- No
- Number of additional authors
-
3
- Research group(s)
-
C - Foundations of Computation
- Citation count
- 15
- Proposed double-weighted
- No
- Reserve for an output with double weighting
- No
- Additional information
- The paper addressed the problem of performing submodular maximization in a distributed setting such as Hadoop. The conceptual and practical properties of the algorithm were established by complete probabilistic proofs, accompanied by extensive experiments on massive datasets. By distributing the computation over Hadoop-like systems, the paper makes many machine learning tasks scalable to massive datasets. Follow-up work has been taken up by multiple international research groups, e.g. by Justin Ward in the EPFL theory group.
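- The distributed pattern described above (partition the data, solve locally on each machine, then combine the local solutions) can be sketched as follows. This is an illustrative two-round greedy scheme, not the paper's exact algorithm; the function names (`greedy`, `distributed_greedy`) and the toy coverage objective are assumptions for the example.

```python
def greedy(f, ground_set, k):
    """Standard greedy: repeatedly add the element with the largest marginal gain."""
    selected = []
    candidates = set(ground_set)
    for _ in range(k):
        best = max(candidates, key=lambda e: f(selected + [e]) - f(selected))
        selected.append(best)
        candidates.remove(best)
    return selected

def distributed_greedy(f, data, k, num_machines):
    """Round 1 (map): each machine runs greedy on its own partition.
    Round 2 (reduce): run greedy again over the union of the local solutions."""
    partitions = [data[i::num_machines] for i in range(num_machines)]
    local_solutions = [greedy(f, part, k) for part in partitions]
    merged = [e for sol in local_solutions for e in sol]
    return greedy(f, merged, k)

if __name__ == "__main__":
    # Toy monotone submodular objective: set cover over a small universe.
    sets = {1: {1, 2}, 2: {2, 3}, 3: {3, 4}, 4: {1, 4, 5}}
    f = lambda S: len(set().union(*(sets[e] for e in S)))
    print(distributed_greedy(f, list(sets), k=2, num_machines=2))
```

  Each round runs entirely on one machine's data, so the scheme maps directly onto a MapReduce job: round 1 is the map phase, the merge is the shuffle, and round 2 is the reduce phase.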
- Author contribution statement
- -
- Non-English
- No
- English abstract
- -