Robust optimization (RO) is a powerful paradigm for decision making under uncertainty. Existing computational approaches, including the reformulation approach and the cutting-plane method, do not scale well, hindering the application of RO to modern high-dimensional decision problems. In this paper, we devise a novel first-order algorithm for solving RO based on a max-min-max perspective. Our algorithm operates directly on the model functions and sets through subgradient and projection oracles, which makes it easy to exploit problem structure and renders the algorithm especially suitable for large-scale instances. Theoretically, we prove that the oracle complexity of our algorithm for attaining an ε-approximate optimal solution is O(ε^{-3}) or O(ε^{-2}), depending on the smoothness of the model functions. The algorithm and its convergence analysis are then extended to distributionally robust optimization. We also show via extensive numerical experiments that the proposed algorithm significantly outperforms the reformulation approach, the cutting-plane method, and two other recent first-order algorithms, often by a factor of 10^2 to 10^4 in terms of accuracy and/or computational speed.