Will a sorting algorithm faster than n*log(n) be found before 2030?
19 · Ṁ1492 · 2029 · 9% chance

If it only works on quantum computers, that is enough for a YES resolution.

The time complexity should be less than O(n*log(n)).

As people have hinted that I am bad at market creation, here are the additions made after opening:

  • The algorithm has to be implemented by 2030, not just "proven to exist". Proving existence is not "finding".

  • The worst-case time of the proposed algorithm has to be better than O(n*log(n)) for a YES resolution.

  • The market closes at the end of 2029 and resolves at the start of 2030.

  • The intended focus of the question was whether quantum computers or related technologies would give any improvement, not whether there are data types which are easier to sort.

  • I will not resolve this question myself, as I am not qualified to evaluate the results of work in quantum physics. I will not bet.


https://arxiv.org/pdf/quant-ph/0102078.pdf

we consider queries of the type “Is xi < xj ?”

It seems that an improvement would need a new type of query, like being able to test "is b the max of {a,b,c}" in a single operation, instead of the two comparison tests a<b, b<c.

You provably cannot use fewer than O(n log n) comparisons to sort a list using a comparison-based sorting algorithm.

I don't know much about quantum computing, but https://quantumcomputing.stackexchange.com/questions/1276/what-is-the-current-state-of-the-art-in-quantum-sorting-algorithms says we have the same lower bound for that. I am not going to bet, however, because it seems like you don't particularly understand the question you're asking, given that you also talk about sorting a list of integers, for which there are well-known O(n) algorithms already.

@osmarks thanks for the link.

1) A list of integers is just a convenient case to show why I am skeptical about the mathematical validity of comparing f(n) to f(n,k).

2) It was Jon who brought up counting sort, and I don't know how it would be applicable to fractions, for example, so I started talking about integers from there. I did not mention integers in the question itself, as I expected something not limited to a specific case.

If I am wrong to talk about integers in the context of counting sort, please educate me: how would I iterate over "Double"?

@KongoLandwalker Sub-O(n log n)-time sorting algorithms are only applicable to particular data types (for example, radix sort will work on anything which is sortable using a fixed-length binary representation). The general case is comparison-based sorts, where the items being sorted are "black boxes" which can only be compared with each other. It is provably impossible to do better than O(n log n) worst case with this (roughly speaking, because there are n! possible permutations which take you from the existing list to the sorted list, requiring log(n!) bits of information to distinguish; each comparison provides at most log 3 = O(1) bits; log(n!) = O(n log n) by Stirling's approximation).
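The counting argument above can be checked numerically. A small sketch (log2(n!) is the minimum number of yes/no answers a comparison sort needs, and it tracks n*log2(n)):

```python
import math

# Information-theoretic lower bound: a comparison sort must distinguish
# n! permutations, so it needs at least log2(n!) bits of information,
# and log2(n!) grows like n*log2(n) by Stirling's approximation.
def min_comparisons(n: int) -> float:
    return math.log2(math.factorial(n))

for n in (10, 100, 1000):
    print(n, round(min_comparisons(n)), round(n * math.log2(n)))
```

The ratio of the two columns approaches 1 as n grows, which is the Stirling point made above.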

There already exist linear time sorting algorithms like Counting Sort. https://en.m.wikipedia.org/wiki/Counting_sort
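For reference, a minimal counting sort sketch (assuming non-negative integers below a known bound k; it runs in O(n + k) time and O(k) space, which is only "linear" when k is comparable to n):

```python
def counting_sort(items, k):
    """Sort non-negative integers below k in O(n + k) time, O(k) space."""
    counts = [0] * k
    for x in items:          # tally each value
        counts[x] += 1
    out = []
    for value, c in enumerate(counts):  # emit values in order
        out.extend([value] * c)
    return out

print(counting_sort([4, 1, 3, 1, 0], 5))  # → [0, 1, 1, 3, 4]
```

The dependence on k is exactly the f(n) vs f(n,k) issue debated below: with 10^9 possible values, the counts array alone dominates the work.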

@jonsimon let's say I have 40 numbers to sort, but 10^9 possible values.

I'm not quite optimistic about iterating over 10^9.

@KongoLandwalker You didn't specify anything about the implementation being realistic in large-N cases. Discussions of big-O complexity are never about realism. Quicksort is in wide use as the "practically fastest" sorting algorithm despite having a worst-case big-O runtime of n^2.

If you wanted to specify something more specific you should create a new market.

predicts YES

@jonsimon Not to mention that you could absolutely sort integers up to 1e9, it would just take ~1GB RAM to do so.

Look, comparing the rate of growth of f(n) to f(n,k) looks mathematically incorrect to me. I will wait for more discussion to happen here.

@jonsimon isn't it the point of using O-notation to discuss arbitrarily large N cases?

predicts YES

@KongoLandwalker Alright, in that case I'm exiting the market. I recommend updating the description to clarify in more detail what does and doesn't count. Comparison-based sorting algorithms are already proven to be no faster than n*log(n), so you're only going to be getting weird fringe algorithms as possible Yes resolutions.

@jonsimon

I don't understand; what criteria do you suggest adding?

I am adding new points for the market to make it easier for future readers.

@jonsimon Technically quicksort has an expected runtime of n log n. The proof is quite elegant, too.
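A minimal random-pivot quicksort sketch (the random pivot choice is what makes the O(n log n) bound hold in expectation rather than only on average-case inputs; this list-building version trades memory for clarity):

```python
import random

def quicksort(a):
    """Random-pivot quicksort: expected O(n log n), worst case O(n^2)."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)                      # random pivot defeats adversarial inputs
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]          # grouping equals avoids infinite recursion
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([9, 1, 8, 2, 7, 3]))  # → [1, 2, 3, 7, 8, 9]
```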

@KongoLandwalker You technically "did not say that it has to be comparison based", but you also didn't specify any information that other classes of algorithms could use to beat the n log n lower bound, and then you excuse yourself by doubting the "mathematical validity to compare f(n) to f(n,k)" or writing that it "looks mathematically incorrect to me". So you're effectively restricting the market to comparison-based algorithms, but instead of stating that openly you weasel around it.

@__nobody yes, there is a restriction, because how are we supposed to compare entities from different sets? If f1(n) is mentioned, and it is comparable to other functions f2(n), then the set of all fx(n) is the one we look at.

I think it is obvious.

If there is a legitimate way to compare functions defined on different argument domains (and you give me a link to familiarize myself with it), then I will write an update point to specify that.

I do think it is a mathematical restriction, not a terminology restriction.

I am to assume that asymptotically superior but impractical algorithms such as Levin search are ignored implicitly here.

@kugelblitz_acolyte could you provide a link? There are several papers on the internet, but I see no connection to actual sorting.

I will add to the description that the algorithm should be implemented, not "proven to exist".

@KongoLandwalker http://www.scholarpedia.org/article/Universal_search

There are implementations of this already. The issue is that it is computationally infeasible despite its asymptotic complexity.

@kugelblitz_acolyte thanks for the link. It's difficult to understand; correct me if I'm wrong. Does it mean that any time I want to run a sorting function, I would need an infinite number of machines?

If I am allowed an arbitrary number of machines running in parallel, then I feel I have a better solution.

Let's take N machines (or N-1 if N is odd). We divide them into two sets. The first set is "placed above" the memory where the array to sort will be located. These machines are connected: the first to places 1 and 2, the second to places 3 and 4, and so on. The other set of machines is "placed under": the first connected to places 2 and 3, the second to 4 and 5...

Each machine has to swap its two places if the smaller value is on the right. All machines in set 1 can run with no conflicting memory accesses (same for set 2). A compare-and-swap takes a predictable, constant number of ticks.

Now let them flip-flop: set 1 makes a computation, then set 2, and so on. Any element will reach its place after at most N such rounds. Does that mean we have O(N)?

Unlimited machines is a weird concept. What I described could not be scaled as easily as treesort: adding more memory to a system is easier than adding new processors.

Maybe I understood the whole concept wrong, but what I described seems better to me than infinite algorithms, most of which will just... run.

@KongoLandwalker what I described is basically a bubblesort in N timesteps.
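For what it's worth, the scheme described above is known as odd-even transposition sort. A sequential simulation sketch (each round's compare-and-swaps touch disjoint pairs, so they could run on separate machines; N rounds always suffice, giving O(N) parallel rounds but O(N^2) total comparisons):

```python
def odd_even_transposition_sort(a):
    """Alternate rounds of disjoint compare-and-swaps on pairs
    (0,1),(2,3),... ("set 1") and (1,2),(3,4),... ("set 2").
    n rounds always suffice to sort n elements."""
    a = list(a)
    n = len(a)
    for round_no in range(n):
        start = round_no % 2              # even rounds: set 1; odd rounds: set 2
        for i in range(start, n - 1, 2):  # these pairs are disjoint, so no memory conflicts
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

print(odd_even_transposition_sort([5, 3, 1, 4, 2]))  # → [1, 2, 3, 4, 5]
```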

@KongoLandwalker We define algorithm complexity with reference to the number of timesteps in a standard model of computation, which this is not. Hutter search is probably what the other person meant. It is guaranteed to be asymptotically optimal but unusable in practice: it works by searching through all possible algorithms sorted by runtime and all proofs of algorithm correctness sorted by length. Because the time taken to find the best algorithm does not scale with the size of the input, it's a constant factor and is ignored for the purposes of big-O/asymptotic complexity. That constant factor, however, is astronomically large.

Sorry for the empty space in the description. Some kind of bug.
