I am trying to convert a MATLAB program to Python, but it is not giving the results I expect.
Matlab Code:

```matlab
for jj = 1:data_length  % for each symbol in the input symbol sequence
    [a, b] = min(abs(phase_recovered(jj) - U_alphabets));
    quantized(jj) = U_alphabets(b);
end
```
Here `quantized` is initialized inside the for loop the first time it is assigned. In Python, however, this gave the error that `quantized` is undefined, so I defined `quantized` as an array of zeros with length equal to `data_length`.
Python Code:

```python
import numpy as np

quantized = np.zeros(data_length, dtype='complex')
for jj in np.arange(0, data_length):
    diff = np.absolute(phase_recovered[jj] - u_alphabets)
    a = diff.argmin()
    b = diff[a]
    quantized[jj] = u_alphabets[b]
```
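One difference worth noting: in MATLAB, `[a, b] = min(...)` returns the minimum *value* in `a` and its *index* in `b`, and the loop indexes `U_alphabets` with the index `b`. In the Python version, `a = diff.argmin()` is the index and `b = diff[a]` is the value, so `u_alphabets[b]` indexes by the value instead of the index. A minimal sketch of the intended nearest-neighbor quantization, assuming `u_alphabets` is a 1-D complex NumPy array (the alphabet and input values below are made-up example data):

```python
import numpy as np

# Hypothetical example data: a QPSK-like alphabet and some noisy received symbols.
u_alphabets = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
phase_recovered = np.array([0.9 + 0.8j, -0.7 - 0.6j, 0.1 + 1.0j])
data_length = len(phase_recovered)

quantized = np.zeros(data_length, dtype=complex)
for jj in range(data_length):
    diff = np.abs(phase_recovered[jj] - u_alphabets)
    b = diff.argmin()                 # index of the nearest alphabet point,
                                      # like MATLAB's second output of min
    quantized[jj] = u_alphabets[b]    # index by position, not by the distance value
```

Each `quantized[jj]` ends up being the alphabet point closest to `phase_recovered[jj]`, which matches what the MATLAB loop computes.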