All the other answers are correct, but we can consider a special case that gives a time complexity of O(|E|).
The following answer is based on the Algorithms book by Dasgupta et al. (chapter 5, page 140, the section on path compression):
In the running-time analysis of this algorithm, the dominant part is the edge-sorting step, which is O(|E| log |E|), or, as the other answers explain, O(|E| log |V|).
But what if the given edges are already sorted?
Or what if the weights are small (say, O(|E|)), so that sorting can be done in linear time, for example with counting sort (see the sketch after this paragraph)?
In such a case, the data structure (the Union-Find) becomes the bottleneck, and it is useful to think about improving its performance beyond log n per operation.
The solution is to use path compression while performing the find() operation.
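
Here is a minimal sketch of that linear-time sorting step, assuming the weights are non-negative integers bounded by some max_weight in O(|E|) (the function name counting_sort_edges is just for illustration):

```python
def counting_sort_edges(edges, max_weight):
    """Sort (u, v, w) edges by weight in O(|E| + max_weight) time.

    Assumes weights are non-negative integers <= max_weight; when
    max_weight is O(|E|), the whole sort is linear in |E|.
    """
    buckets = [[] for _ in range(max_weight + 1)]  # one bucket per possible weight
    for u, v, w in edges:
        buckets[w].append((u, v, w))
    # Concatenating the buckets in order of weight yields a sorted edge list.
    return [edge for bucket in buckets for edge in bucket]
```

For instance, counting_sort_edges([(0, 1, 3), (1, 2, 1), (0, 2, 3)], 3) returns the edges in weight order 1, 3, 3.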

With path compression, the amortized cost per operation turns out to be just barely more than O(1), down from the earlier O(log n). For more details, please check this reference.
The brief idea is this: whenever find(v) is called to find the root of the set that v belongs to, every node on the path from v to the root has its parent link changed to point directly at the root. This way, a later find(x) on any node x from that same path returns the set's root (label) in O(1). Hence, in this case, the bottleneck of the algorithm is the Union-Find, and with the described solution its operations are effectively O(1) amortized, so the running time of the algorithm in the described situation is O(|E|).
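
To make the idea concrete, here is a minimal sketch of a Union-Find with path compression and of Kruskal's main loop over already-sorted edges (the names parent, rank, find, union, and kruskal_presorted are illustrative, not from the book):

```python
def find(parent, v):
    """Return the root of v's set, re-pointing every node on the
    path directly at the root (path compression)."""
    root = v
    while parent[root] != root:   # first pass: walk up to the root
        root = parent[root]
    while parent[v] != root:      # second pass: compress the path
        parent[v], v = root, parent[v]
    return root

def union(parent, rank, x, y):
    """Merge the sets containing x and y (union by rank).
    Returns False if they were already in the same set."""
    rx, ry = find(parent, x), find(parent, y)
    if rx == ry:
        return False
    if rank[rx] < rank[ry]:       # attach the shallower tree under the deeper one
        rx, ry = ry, rx
    parent[ry] = rx
    if rank[rx] == rank[ry]:
        rank[rx] += 1
    return True

def kruskal_presorted(n, sorted_edges):
    """Kruskal's algorithm on n vertices, given (u, v, w) edges already
    sorted by weight; each union/find is effectively O(1) amortized,
    so the loop runs in O(|E|) overall."""
    parent = list(range(n))
    rank = [0] * n
    mst = []
    for u, v, w in sorted_edges:
        if union(parent, rank, u, v):
            mst.append((u, v, w))
    return mst
```

For example, kruskal_presorted(3, [(1, 2, 1), (0, 2, 3), (0, 1, 4)]) returns [(1, 2, 1), (0, 2, 3)], skipping the last edge because its endpoints are already connected.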