Not only can you, it is the very definition of Big-O complexity.
Big-O complexity is not as simple as many programmers casually understand it.
Simply put, Big-O is defined asymptotically. That means you are describing how the algorithm behaves as n approaches infinity, and in the majority of cases one term dominates the expression. Your example of
O(n^2) + O(n) = O(n^2)
is a good representation of that. While O(n) does influence the running time for small n, as n approaches infinity O(n) becomes negligible compared to O(n^2), so it is fine to just ignore it.
This is also why you often see Big-O written with only a single term: whichever term dominates as n approaches infinity is the only one worth keeping.
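A quick numeric sketch of that dominance (not part of the original answer, just an illustration): the ratio between the full cost n^2 + n and the dominant term n^2 alone shrinks toward 1 as n grows, meaning the O(n) term contributes less and less.

```python
# Compare the "full" cost n^2 + n with only the dominant n^2 term.
# The ratio approaches 1 as n grows, so the n term is asymptotically negligible.
for n in [10, 1_000, 100_000]:
    full = n**2 + n       # O(n^2) + O(n) style cost
    dominant = n**2       # keeping only the dominant term
    print(n, full / dominant)
```

For n = 10 the lower-order term still adds 10%; by n = 100,000 it adds only 0.001%.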
One more note: while Big-O is a good way to gauge how an algorithm scales, it is more of a theoretical mathematical tool than a practical way to assess algorithms.
For example, you could have two algorithms, one O(n^2) and a second O(1000*n). The first one would clearly be faster for n smaller than 1000. But because constants are dropped for the complexity to be "correct", the second would really have to be written as O(n), which hides exactly the constant that made it slower for small inputs.
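A minimal sketch of that crossover, using hypothetical step counts rather than real algorithms (the function names and the constant 1000 are just the ones from the example above):

```python
# Hypothetical step counts for the two algorithms in the example.
def steps_quadratic(n):
    """The O(n^2) algorithm."""
    return n * n

def steps_linear(n):
    """The O(1000*n) algorithm, i.e. O(n) with a large hidden constant."""
    return 1000 * n

# Below n = 1000 the "worse" quadratic algorithm actually does fewer steps;
# above it, the linear one wins, as the asymptotic analysis predicts.
for n in [10, 100, 1_000, 10_000]:
    winner = "quadratic" if steps_quadratic(n) < steps_linear(n) else "linear"
    print(n, winner)
```

This is why libraries sometimes switch algorithms based on input size: the asymptotically better choice is not always the faster one for the sizes you actually have.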