We propose a derivative-free algorithm for optimizing computationally expensive functions whose evaluations are subject to computational error. The algorithm is based on the trust region regression method of Conn, Scheinberg, and Vicente [4], but uses weighted regression to obtain more accurate model functions at each trust region iteration. We propose a heuristic weighting scheme that simultaneously handles (i) differing levels of uncertainty in function evaluations and (ii) errors induced by poor model fidelity. We also extend the theory of Λ-poisedness and strong Λ-poisedness to weighted regression. We report computational results comparing interpolation, regression, and weighted regression methods on a collection of benchmark problems. Weighted regression appears to outperform interpolation and unweighted regression models on nondifferentiable functions and functions with deterministic noise.
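To make the weighted model-building step concrete, the following display is a minimal illustrative sketch; the symbols $Y$, $\phi$, $M(\phi,Y)$, and $W$ are generic regression-model notation assumed here, not definitions taken from the abstract. Given a sample set $Y=\{y^1,\dots,y^p\}$ with noisy function values $f(y^i)$, nonnegative weights $w_i$, and a polynomial basis $\phi=\{\phi_1,\dots,\phi_q\}$, the weighted model coefficients $\alpha$ solve
\[
\min_{\alpha\in\mathbb{R}^q}\;\sum_{i=1}^{p} w_i\Bigl(\sum_{j=1}^{q}\alpha_j\,\phi_j(y^i)-f(y^i)\Bigr)^{2}
\;=\;\min_{\alpha\in\mathbb{R}^q}\;\bigl\|W^{1/2}\bigl(M(\phi,Y)\,\alpha-f(Y)\bigr)\bigr\|_2^2,
\]
where $W=\operatorname{diag}(w_1,\dots,w_p)$ and $M(\phi,Y)$ is the matrix with entries $\phi_j(y^i)$. Unweighted regression is recovered by $W=I$, and interpolation by the square case $p=q$ with $M(\phi,Y)$ nonsingular.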