
I am trying to convert a Matlab program into Python. It's not giving me the results I'd like.

Matlab Code:

for jj=1:data_length                                      %for each symbol in the input symbol sequence
    [a,b] = min(abs(phase_recovered(jj)-U_alphabets));    %a = smallest distance, b = index of the nearest alphabet point
    quantized(jj) = U_alphabets(b);                        %quantize to the nearest alphabet symbol
end

In the Matlab code, quantized is created implicitly the first time it is assigned inside the for loop. In Python, however, this gave the error

quantized is undefined.

So I preallocated quantized as an array of zeros of length data_length.

Python Code:

quantized=zeros(data_length,dtype='complex')
for jj in arange(0,data_length):
    diff=np.absolute((phase_recovered[jj]-u_alphabets))
    a=diff.argmin()
    b=diff[a]
    quantized[jj]=u_alphabets[b]

1 Answer


I don't think you want the line b = diff[a]; you want something more like:

import numpy as np

quantized = np.zeros(data_length, dtype='complex')
for jj in np.arange(0, data_length):
    diff = np.absolute(phase_recovered[jj] - u_alphabets)   # distance to every alphabet point
    b = diff.argmin()                                        # index of the nearest point
    quantized[jj] = u_alphabets[b]
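
As a side note, the whole loop can also be done in one shot with broadcasting, at the cost of building a (data_length, len(u_alphabets)) distance matrix. A minimal sketch, assuming phase_recovered and u_alphabets are 1-D NumPy arrays:

import numpy as np

# distance from every received symbol to every alphabet point,
# shape (data_length, len(u_alphabets))
diff = np.absolute(phase_recovered[:, np.newaxis] - u_alphabets)
quantized = u_alphabets[diff.argmin(axis=1)]   # nearest alphabet point per symbol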

Also, if u_alphabets happens to be sorted, you could use a solution like the one described here.
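
That kind of solution is typically based on np.searchsorted. A minimal sketch, assuming u_alphabets is a sorted, real-valued 1-D array (an assumption on my part; in the question the alphabet looks complex, where ordering doesn't apply):

import numpy as np

# candidate insertion points, clipped so that idx-1 and idx are both valid
idx = np.searchsorted(u_alphabets, phase_recovered)
idx = np.clip(idx, 1, len(u_alphabets) - 1)
left = u_alphabets[idx - 1]
right = u_alphabets[idx]
# step back one position wherever the left neighbour is closer
idx -= (phase_recovered - left) < (right - phase_recovered)
quantized = u_alphabets[idx]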
