
Results 1 to 10 of 10
Thread: Quickest Sorting Algorithm

07-22-2003, 10:08 PM #1
 Join Date
 Jun 2002
 Posts
 70
 Thanks
 0
 Thanked 0 Times in 0 Posts
Quickest Sorting Algorithm
Okay, I'm back again with this sorting business. I was wondering if anyone knew what the fastest sorting algorithm is. I'm currently using a quicksort, which has O(n log n) efficiency. I was wondering if there were any faster. Thanks.
Obiwan Jabroni
May the Schwartz be With You
07-23-2003, 01:26 AM
#2
 Join Date
 Feb 2003
 Location
 California
 Posts
 925
 Thanks
 0
 Thanked 0 Times in 0 Posts
Quicksort is the fastest of the more conventional ways to sort. It is possible to write a sorting algorithm that sorts faster, but like I said, it is the fastest of the ways that everyone uses...
Jason
07-23-2003, 03:34 AM
#3
Originally posted by Jason
quick sort is the fastest of the more conventional ways to sort, though it is possible to write a sorting algorithm that sorts faster, but like I said it is the fastest of the ways that everyone uses...

No, not proportionally. You can prove the most efficient comparison-based sorting algorithms are O(n log n). Non-comparison-based sorts, such as Proxmap or various bucket sorts, can operate at O(n).
07-23-2003, 04:35 PM
#4
 Join Date
 Jun 2002
 Posts
 70
 Thanks
 0
 Thanked 0 Times in 0 Posts
Proxmap and bucket sorts? Can you shed some light on these terms? Thanks in advance.
Obiwan Jabroni
May the Schwartz be With You
07-23-2003, 07:37 PM
#5
 Join Date
 Feb 2003
 Location
 California
 Posts
 925
 Thanks
 0
 Thanked 0 Times in 0 Posts
Originally posted by jkd
No, not proportionally. You can prove the most efficient comparison-based sorting algorithms are O(n log n). Non-comparison-based sorts, such as Proxmap or various bucket sorts, can operate at O(n).

But Obiwan Jabroni is doing a comparison-based sort, which is why I was suggesting that quicksort is the fastest he could possibly get, unless he writes his own, which could prove to be a touch faster. There is data here being compared as it's sorted, so comparison-based sorting algorithms should be used.
Jason
07-24-2003, 02:39 PM
#6
 Join Date
 Jun 2002
 Posts
 70
 Thanks
 0
 Thanked 0 Times in 0 Posts
Still no consensus on Shell Sort, eh?
Obiwan Jabroni
May the Schwartz be With You
07-25-2003, 01:06 AM
#7
You can try various O(n log n) sorts. It depends on the nature of the data being sorted. When using Big-Oh notation, you drop any constant coefficients and insignificant terms. For example, some may be n lg n and others may be n ln n, but by change of base, we still get O(n log n) for both. Or in the case of convex hull algorithms, it could even be O(n lg n) + O(n): sorting the data points, then O(n) to find the convex hull.
You can do one of two things:
1. Take the popular O(n log n) sorts and actually prove the number of operations they require precisely.
2. Run them through your expected data repeatedly, and see which one performs slightly faster.
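For option 2, a benchmark harness along these lines works (this is just an illustrative sketch: the simple out-of-place quicksort and the random data generator are my own placeholders for whatever algorithms and data you actually want to compare):

```python
import random
import timeit

def quicksort(a):
    # Simple out-of-place quicksort, used here only as a benchmark candidate.
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]
    return (quicksort([x for x in a if x < pivot])
            + [x for x in a if x == pivot]
            + quicksort([x for x in a if x > pivot]))

def benchmark(sort_fn, data, repeats=5):
    """Time sort_fn on fresh copies of data; return the best of `repeats` runs."""
    return min(timeit.timeit(lambda: sort_fn(list(data)), number=1)
               for _ in range(repeats))

data = [random.randint(0, 10**6) for _ in range(10_000)]
print("quicksort:", benchmark(quicksort, data))
print("built-in :", benchmark(sorted, data))
```

Taking the best of several runs (rather than the average) reduces noise from whatever else the machine is doing; what matters is running the candidates on data shaped like your real input.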
Anyway, Proxmap is an O(n) sort, though only in most cases. I believe it can be O(n^2) in the worst case, and it uses n^2 memory (or maybe 2n... it's been a while).
Bucket sorts are for sorting data without needing comparisons. Say you were sorting strings. You could put every string starting with "a" in one bucket, every string starting with "b" in another, and so on through "z". Then you take them out of the buckets in order, and repeat for the next character. Once you've gone through all the character positions, always taking the strings out in order, you end up with the sorted data.
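That bucket idea can be sketched as an LSD radix sort for equal-length lowercase strings (my own illustrative code, not from the thread; note that to make the repeated bucketing come out sorted, this version processes characters from the last position back to the first, and relies on each pass keeping equal keys in order):

```python
def radix_sort_strings(strings, width):
    """LSD radix sort for equal-length lowercase ASCII strings of length `width`."""
    for pos in range(width - 1, -1, -1):       # last character position first
        buckets = [[] for _ in range(26)]      # one bucket per letter a-z
        for s in strings:
            buckets[ord(s[pos]) - ord('a')].append(s)
        # concatenating the buckets in order is a stable pass on this character
        strings = [s for bucket in buckets for s in bucket]
    return strings
```

Each pass is O(n), and there are `width` passes, so the whole thing is O(n * width) with no comparisons between strings, which is how these sorts get around the O(n log n) bound.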
07-26-2003, 03:33 AM
#8
 Join Date
 Oct 2002
 Location
 Milwaukee, Wisconsin
 Posts
 123
 Thanks
 1
 Thanked 0 Times in 0 Posts
Hmmm, really depends on what you're trying to sort... but I would say you could use an array of strings and a nested for loop, maybe?
08-01-2003, 09:39 PM
#9
 Join Date
 Jun 2002
 Location
 Mumbai, India
 Posts
 218
 Thanks
 0
 Thanked 0 Times in 0 Posts
08-28-2003, 05:31 PM
#10
 Join Date
 Jun 2002
 Posts
 185
 Thanks
 0
 Thanked 0 Times in 0 Posts
Depending on the implementation, quicksort has a possible worst case of O(n^2). It all depends on how you choose the pivot value (which determines how a given list gets broken down into two sublists).
If you choose a bad pivot, you can end up with only one item in one sublist and all the remaining items in the other. That leads to the O(n^2) run time. To prevent that, you need to choose a pivot value that splits the list into equal-sized sublists as closely as possible.
That's what they call a balanced quicksort. A merge sort is similar and also gives a worst case of O(n log n).
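To make the pivot choice concrete, here's a minimal sketch of one common strategy, median-of-three (the function name and structure are my own illustration, not from the thread): take the median of the first, middle, and last elements, which in particular avoids the degenerate split on already-sorted input.

```python
def median_of_three(items, lo, hi):
    """Return the index (lo, mid, or hi) whose element is the median of the three."""
    mid = (lo + hi) // 2
    a, b, c = items[lo], items[mid], items[hi]
    if a <= b <= c or c <= b <= a:
        return mid       # middle element is the median
    if b <= a <= c or c <= a <= b:
        return lo        # first element is the median
    return hi            # otherwise the last element is the median
```

On a sorted list this always picks the true middle element, so each partition splits the list roughly in half instead of peeling off one item at a time.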
That's all theory, of course. In practice you can often improve sorting by combining methods. The overhead of a quicksort can make it run slower than other sorting algorithms on short lists, so you might speed up your quicksort by switching to something like a shell sort or insertion sort once a sublist gets below a certain length.
Other factors like memory requirements and I/O (if you're sorting files or database records) can affect the actual times as well. A lot depends on the data and where it's stored. So, like jkd said, you'd have to experiment in order to optimize for a particular situation.
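The switch-below-a-cutoff idea a couple of paragraphs up can be sketched like this (the cutoff of 16 and the random pivot are my own choices for illustration; real libraries tune the cutoff empirically):

```python
import random

CUTOFF = 16  # below this length, insertion sort's low overhead tends to win (tunable)

def insertion_sort(a, lo, hi):
    """Sort a[lo..hi] in place by insertion."""
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def hybrid_quicksort(a, lo=0, hi=None):
    """In-place quicksort that hands short sublists to insertion sort."""
    if hi is None:
        hi = len(a) - 1
    if hi - lo + 1 <= CUTOFF:
        insertion_sort(a, lo, hi)
        return
    p = a[random.randint(lo, hi)]   # random pivot guards against bad splits
    i, j = lo, hi
    while i <= j:                   # Hoare-style partition around the value p
        while a[i] < p:
            i += 1
        while a[j] > p:
            j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i += 1
            j -= 1
    hybrid_quicksort(a, lo, j)
    hybrid_quicksort(a, i, hi)
```

The partition cost is the same either way; the win comes purely from skipping quicksort's recursion overhead once a sublist is small enough that insertion sort finishes it faster.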