The definition of big O notation is pure math - nothing about it is specific to the analysis of algorithms.
For example: "the function x^2 is O(x^3)" is a valid sentence in big-O notation, and is true.
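To make the claim concrete, here is a minimal numeric sketch of the definition: x^2 is O(x^3) because we can exhibit constants C and x0 (the illustrative choices C = 1, x0 = 1 below) such that x^2 <= C * x^3 for every x >= x0.

```python
# Numeric check of the big-O definition for x^2 = O(x^3).
# Illustrative witness constants: C = 1, x0 = 1.
C, x0 = 1, 1

for x in [1, 2, 10, 100, 1000]:
    # For every sampled x >= x0, the bound x^2 <= C * x^3 holds.
    assert x**2 <= C * x**3

print("x^2 <= 1 * x^3 holds for all sampled x >= 1")
```

Note the direction of the bound: big O only promises an upper bound, which is why the (loose but true) statement x^2 = O(x^3) is valid alongside the tight one x^2 = O(x^2).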
Big O is commonly used in other places besides analysis of algorithms, such as when truncating the higher-order terms in a Taylor series approximation.
Another example is in statistics and learning theory, where we see claims like "if we fit the model with N samples from the population, then the expected error is O(1/sqrt(N))." Notice the word expected - this is an average-case, not worst-case, analysis.
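The 1/sqrt(N) rate can be seen empirically. The sketch below (an assumed toy setup, not from the original text) "fits" the simplest possible model - the sample mean of N draws from a uniform population with true mean 0.5 - and measures the expected absolute error by averaging over many trials. If the error is O(1/sqrt(N)), quadrupling N should roughly halve it.

```python
import random
import statistics

random.seed(0)

def mean_abs_error(n, trials=2000):
    """Average |sample mean - true mean| over many repeated fits of size n."""
    errs = []
    for _ in range(trials):
        sample = [random.random() for _ in range(n)]  # uniform on [0, 1], true mean 0.5
        errs.append(abs(statistics.mean(sample) - 0.5))
    return statistics.mean(errs)

# O(1/sqrt(N)) predicts: error at N=400 is about half the error at N=100.
e_100 = mean_abs_error(100)
e_400 = mean_abs_error(400)
print(f"N=100: {e_100:.4f}  N=400: {e_400:.4f}  ratio: {e_100 / e_400:.2f}")
```

The ratio comes out near 2, consistent with the 1/sqrt(N) rate - and since it is an average over trials, this is exactly the average-case flavor of analysis the text describes.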