This problem is equivalent to integer/polynomial squaring:
1. Polynomial multiplication is known to correspond to integer multiplication.
2. You have already reduced your problem to polynomial/integer squaring; therefore your problem is at most as hard as squaring.
Now I will reduce integer squaring to this problem.
Suppose we had an algorithm
$$F(\vec{a}) \to P^2(x), \quad \text{where } P(x) = \sum_{a_i \in \vec{a}} x^{a_i}.$$
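For example, for $\vec{a} = (0, 2, 3)$ we have $P(x) = 1 + x^2 + x^3$, and $F(\vec{a})$ yields $P^2(x) = 1 + 2x^2 + 2x^3 + x^4 + 2x^5 + x^6$, i.e. the exponent/coefficient pairs $(0,1), (2,2), (3,2), (4,1), (5,2), (6,1)$.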
This algorithm is essentially the one asked for in the question. So, if we had a magic algorithm that could do this, we could build a function SQUARE(y) that squares an integer y (oh yes, mathjax :P).
Algorithm 1 Squaring
 1: procedure SQUARE(y):
 2:     a⃗ ← ()                  ▹ a⃗ starts as the empty polynomial sequence
 3:     i ← 0
 4:     while y ≠ 0 do           ▹ break y down into a polynomial of base 2
 5:         if y & 1 then        ▹ if the lsb of y is set
 6:             a⃗ ← a⃗ i         ▹ append i to a⃗ (appending x^i)
 7:         end if
 8:         i ← i + 1
 9:         y ← y ≫ 1            ▹ shift y right by one
10:     end while
11:     P²(x) ← F(a⃗)            ▹ obtain the squared polynomial via F(a⃗)
12:     return P²(2)             ▹ simply sum up the polynomial
13: end procedure
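Continuing the example above: for $y = 13 = 1101_2$ the loop produces $\vec{a} = (0, 2, 3)$, so $P(x) = 1 + x^2 + x^3$, and evaluating the squared polynomial gives $P^2(2) = 1 + 2\cdot 4 + 2\cdot 8 + 16 + 2\cdot 32 + 64 = 169 = 13^2$.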
Python (test with codepad):
#/cs//q/11418/2755
def F(a):
    # a is the list of exponents a_i of P(x) = sum_i x^{a_i}
    for ai in a:
        assert ai >= 0
    # S maps each exponent r to its coefficient in P(x)^2, i.e. coefficient * x^r
    S = {}
    for ai in a:
        for aj in a:
            r = ai + aj
            if r not in S:
                S[r] = 0
            S[r] += 1
    return list(S.items())

def SQUARE(x):
    x = int(x)
    a = []
    i = 0
    while x != 0:          # break x down into a base-2 polynomial
        if x & 1 == 1:     # if the lsb is set, record exponent i
            a += [i]
        x >>= 1
        i += 1
    print('a:', a)
    P2 = F(a)              # the squared polynomial P^2(x)
    print('P^2:', P2)
    s = 0
    for e, c in P2:        # evaluate P^2(2): sum of c * 2^e
        s += (1 << e) * c
    return s
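A quick sanity check, assuming the Python 3 definitions above (any Python 3 interpreter will do):

for y in (0, 1, 12, 13, 255):
    assert SQUARE(y) == y * y    # e.g. SQUARE(13) == 169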
3. Thus, squaring is at most as hard as this problem.
4. Therefore, integer squaring is equivalent to this problem (they are each at most as hard as the other, due to (2), (3), and (1)).
Now, it is unknown whether integer/polynomial multiplication admits bounds better than $O(n \log n)$; in fact, the best multiplication algorithms currently all use the FFT and have run-times like $O(n \log n \log \log n)$ (Schönhage–Strassen algorithm) and $O(n \log n \cdot 2^{O(\log^* n)})$ (Fürer's algorithm). Arnold Schönhage and Volker Strassen conjectured a lower bound of $\Omega(n \log n)$, and so far this conjecture seems to be holding.
This doesn't mean your use of the FFT is quicker; $O(n \log n)$ for the FFT counts the number of operations (I think), not the bit complexity, so it ignores the cost of the smaller multiplications on the coefficients; when used recursively, it would come closer to the FFT-based multiplication algorithms listed above (see Where is the mistake in this apparently-O(n lg n) multiplication algorithm?).
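As a side illustration of the operation-count view, here is a minimal sketch (assuming numpy; the name F_convolve is only illustrative, not from the question or answer) that computes the same exponent/coefficient table as F above by convolving P's coefficient vector with itself; swapping the direct convolution for an FFT-based one (e.g. scipy.signal.fftconvolve) is exactly where the $O(n \log n)$ operation count comes from, while the bit cost of the coefficient arithmetic is the caveat above:

import numpy as np

def F_convolve(a):
    # Sketch only (assumes numpy). P(x) is represented by its coefficient
    # vector, and P(x)^2 is obtained by convolving that vector with itself.
    # np.convolve does the convolution directly in O(n^2) coefficient
    # operations; an FFT-based convolution (e.g. scipy.signal.fftconvolve)
    # would use O(n log n) coefficient operations -- the operation count,
    # not the bit complexity.
    if len(a) == 0:
        return []
    coeffs = np.zeros(max(a) + 1, dtype=np.int64)
    for ai in a:
        coeffs[ai] += 1                    # P(x) = sum_i x^{a_i}
    sq = np.convolve(coeffs, coeffs)       # coefficients of P(x)^2
    return [(e, int(c)) for e, c in enumerate(sq) if c != 0]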
5. Now, your problem is not exactly multiplication; it is squaring. So is squaring easier? Well, that is an open problem (so, for now, no): squaring is not known to have a faster algorithm than multiplication. If you could find a better algorithm for your problem than going through multiplication, that would likely be a breakthrough.
So, as of now, the answer to both your questions is no: as of now, all of the ~$O(n \log n)$ multiplication algorithms use the FFT, and as of now squaring is as hard as multiplication. And no, unless a faster algorithm for squaring is found, or multiplication breaks the $O(n \log n)$ barrier, your problem cannot be solved faster than $O(n \log n)$; in fact, it cannot currently be solved in $O(n \log n)$ either, since the best multiplication algorithms only approach that complexity.