We analyze a simple randomized subgradient method for approximating solutions to stochastic systems of convex functional constraints, the only input to the algorithm being the minibatch size. By introducing a new notion of what it means for a point to approximately solve the constraints, we reduce bounding the expected number of iterations to computing a hitting time for a compound Bernoulli process, an exercise in elementary probability. Besides bounding the expected number of iterations quite generally, we easily establish concentration inequalities on the number of iterations, and, more interestingly, we establish much-improved bounds when a notion akin to H\"{o}lderian growth is satisfied, for all degrees of growth, not just the linear growth of piecewise-linear convex functions or the quadratic growth of strongly convex functions. Finally, we establish analogous results under a slight modification of the algorithm, under which the user knows with high confidence that an iterate in hand approximately solves the system. Perhaps surprisingly, the iteration bounds here are deterministic: all of the probability is wrapped into the confidence level (albeit at the expense of potentially large minibatches).
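As a minimal illustration of the hitting-time reduction (the notation and the reduction to independent trials are our gloss, not quoted from the paper): if each iteration independently "succeeds" with probability at least $p$, and $T_k$ denotes the first iteration by which $k$ successes have accumulated, then elementary probability gives
\[
\mathbb{E}[T_k] \le \frac{k}{p}, \qquad \Pr\bigl(T_k > t\bigr) \;=\; \Pr\bigl(\mathrm{Bin}(t,p) < k\bigr) \;\le\; e^{-tp/8} \quad \text{for } t \ge 2k/p,
\]
where the tail bound follows from the multiplicative Chernoff inequality applied to $\mathrm{Bin}(t,p)$ with deviation $\delta = 1/2$. This is the shape of both the expected-iteration bound and the concentration inequalities described above.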