The chart, which I made myself, shows how each algorithm performs in the best and worst cases in Big-O terms, and here are the reasons why they behave this way.
Bubble Sort
It starts on the left, compares adjacent items, and keeps bubbling the larger one to the right. In the best case we only need one pass, because everything is already in order and no swaps are made, but that single pass still makes n - 1 comparisons, so it's O(n). (This relies on the implementation stopping early when a pass makes no swaps.) In the worst case we need n - 1 passes with a swap at every comparison, which takes (n - 1) + (n - 2) + ... + 3 + 2 + 1 steps, so we get O(n^2).
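Here's a minimal sketch in Python (the post doesn't fix a language, so that choice and the function name are my own) that includes the early-exit check the O(n) best case relies on:

```python
def bubble_sort(items):
    """Bubble sort, in place, with the early exit that gives the O(n) best case."""
    n = len(items)
    for i in range(n - 1):                 # at most n - 1 passes
        swapped = False
        for j in range(n - 1 - i):         # pass i bubbles the largest remaining item right
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:                    # no swaps means the list is sorted: stop
            break
    return items
```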
Selection Sort
It finds the largest item in the whole list and swaps it with the last item; then it finds the largest item among the remaining ones, ignoring the last one since it's already in its correct position, and so on. It therefore treats the two lists 1, 2, 3, 4, 5 and 5, 4, 3, 2, 1 the same, since it scans for the largest item on every iteration no matter how the data is arranged, which always costs (n - 1) + (n - 2) + ... + 2 + 1 comparisons. So its O(n^2) efficiency is independent of the data.
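A short sketch of the same idea, again in Python with a made-up function name; notice the inner loop always scans the entire unsorted part, which is why the comparison count never depends on the data:

```python
def selection_sort(items):
    """Selection sort: find the largest remaining item, swap it to the end."""
    for end in range(len(items) - 1, 0, -1):
        largest = 0
        for j in range(1, end + 1):        # always scans the whole unsorted part
            if items[j] > items[largest]:
                largest = j
        items[largest], items[end] = items[end], items[largest]
    return items
```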
Insertion Sort
It maintains a sorted sublist on the left, and each new item is inserted into that sublist such that the sorted sublist becomes one item larger. In the worst case (a reverse-sorted list), inserting each new item into the right place takes (n - 1) + (n - 2) + ... + 2 + 1 steps, so O(n^2). In the best case (an already sorted list), every new item is simply appended to the sublist after a single comparison, which takes O(n) time.
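A sketch under the same Python assumption; the while loop is what collapses to a single comparison per item on sorted input and blows up to i comparisons per item on reverse-sorted input:

```python
def insertion_sort(items):
    """Insertion sort: grow a sorted sublist on the left, one item at a time."""
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:   # shift larger items one slot right
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current                 # sorted input: loop never runs
    return items
```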
Merge Sort
As I described in the previous Blog, the dividing step produces log n levels of splits, and the merge step always takes at most a + b comparisons to merge two sorted lists of lengths a and b into one larger list. Since all the merges on a given level touch n items in total, each level costs O(n) comparisons, and the whole sorting process takes n times log n steps, i.e. O(n log n). That's why the best and worst cases take the same runtime.
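A compact recursive sketch (same Python assumption, hypothetical function name); the merge loop below does the a + b comparison work described above:

```python
def merge_sort(items):
    """Merge sort: split in half (log n levels), merge back (O(n) work per level)."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):    # at most a + b comparisons
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]       # append whichever half has leftovers
```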
Quick Sort