I have a Python code intended to create a chirp (like in LoRa) from one frequency to the other.

On purpose, I want to create it by sampling at a 5 MHz sampling rate. I will move the values to an FPGA later on so it can just run through the list of samples. I want to sweep around 100 kHz with a bandwidth of 20 kHz (90.000 Hz to 110.000 Hz).

It looks roughly like this:

[plot: the total chirp]

When zooming in on the first two peaks, I calculate a time difference of 11.02 µs -> 90.744 kHz. That is about what I expect.

But when zooming in on the last two peaks, I calculate a time difference of 7.77 µs -> 128.7 kHz. That is way higher than the 110 kHz I'm aiming for.
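For reference, converting those measured peak spacings into frequencies is plain arithmetic (this is not part of the script, just the calculation I did on the plot):

```python
# The peak-to-peak spacing of a sine equals one period, so f = 1 / dt
dt_start = 11.02e-6  # spacing of the first two peaks, in seconds
dt_end = 7.77e-6     # spacing of the last two peaks, in seconds
print(1 / dt_start)  # ~90.7 kHz, close to the 90 kHz start frequency
print(1 / dt_end)    # ~128.7 kHz, well above the 110 kHz target
```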

In my code, I create a sampling frequency of 5 MHz by dividing 50 MHz by 10. I calculate the expected duration of my chirp and, from that, the number of samples in the whole chirp.

Using that, I calculate my delta t and my delta f. That lets me iterate over the range of samples and create a sine value for each sample by incrementally increasing the time by delta t and the frequency by delta f.
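With the parameters used at the bottom of the script (SF = 2, Fc = 100 kHz, BW = 20 kHz), those intermediate quantities work out as follows; this just restates the arithmetic described above, it is not extra code from the script:

```python
# Derived constants for SF = 2, Fc = 100 kHz, BW = 20 kHz, Fsample = 5 MHz
Fsample = 50_000_000 / 10              # 5 MHz sampling rate
SF, BW = 2, 20_000
num_chips = 2 ** SF                    # 4 chips
Tchip = 1 / BW                         # 50 us per chip
Tsymbol = num_chips * Tchip            # 200 us for the whole chirp
Tsample = 1 / Fsample                  # 0.2 us per sample
num_samples = round(Tsymbol / Tsample) # 1000 samples in the chirp
Fdelta = BW / num_samples              # 20 Hz frequency step per sample
print(num_samples, Fdelta)
```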

My code below also contains some parts that write all those values in binary format for my FPGA later on; those outputs are written to a txt file but are not important right now.

I have my code print the timestamps and the corresponding frequency at each timestamp. That output shows the expected frequencies at the expected timestamps, so I'm really confused about why my plot gives a result that I do not expect.

The code:

import numpy as np
import matplotlib.pyplot as plt

Fclock = 50000000
ClockFactor = 10
binaryRange = 0b1111111111


def chirp(s, SF, Fc, BW, counter, file):
    Fsample = Fclock / ClockFactor
    #   general calculations

    #   number of chips
    num_chips = int(2**SF)  

    #   chip time            
    Tchip = 1 / BW                         

    #   symbol time
    Tsymbol = num_chips * Tchip                 
    
    #   time of a sample
    Tsample = 1 / Fsample

    #   start frequency
    f_begin = Fc - BW/2      

    #   stop frequency           
    f_end = Fc + BW/2              

    sineValuesBin = []
    sineValues = []
    num_samples = int(Tsymbol / Tsample)

    #   Frequency increment (delta) per sample
    Fdelta = BW / num_samples

    t = 0
    tlist = []
    f = f_begin + s / num_chips * num_samples * Fdelta

    

    for i in range(num_samples):
        sineValue = np.sin(2 * np.pi * f * t)
        sineValuesBin.append(bin(int(sineValue * binaryRange / 2 + binaryRange / 2))[2:].zfill(10))
        sineValues.append(sineValue)
        tlist.append(t)
        file.write(f"Frequency: {f}\n")
        f += Fdelta
        if f > f_end:
            f = f_begin
        t += Tsample
        print(f"freq: {f}; t: {t}")
    

    plt.plot(tlist, sineValues, '.-')    
    plt.show()

    i = counter
    for sineValueBin in sineValuesBin:
        file.write(f"    sig_sineList({i}) <= \"{sineValueBin}\";")
        file.write("\n")
        i += 1
    
    return i



def message(symbol_items, SF, Fc, BW):
    counter = 0


    f = open("output.txt", "w")
    for symbol in symbol_items:
        counter = chirp(symbol, SF, Fc, BW, counter, f)
    f.close()


symbols = [0]
message(symbols, 2, 100000, 20000)

TL;DR: I create a sampled frequency sweep from 90.000 Hz to 110.000 Hz with a sampling rate of 5 MHz. The plot shows that the last frequency is not around 110.000 Hz but about 128.000 Hz. Why?

Small update: I found that when I halve the frequency delta, it works perfectly. I'd appreciate it if anyone could tell me why that is.
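A minimal, self-contained sketch of the observation (not the original script; the 5 MHz rate and the 1000-sample count are hard-coded): generating the sweep the same way, by stepping f inside sin(2*pi*f*t), and then estimating the final frequency from the spacing of the last two rising zero crossings lands near 128 kHz, matching the plot measurement rather than the 110 kHz target:

```python
import numpy as np

# Same stepping scheme as the script, vectorized, with derived values
# (5 MHz sample rate, 1000 samples, 20 Hz step) hard-coded
Fsample, BW, f_begin = 5_000_000, 20_000, 90_000
num_samples = 1000
Fdelta = BW / num_samples                      # 20 Hz per sample
t = np.arange(num_samples) / Fsample           # sample timestamps
f = f_begin + Fdelta * np.arange(num_samples)  # stepped frequency
x = np.sin(2 * np.pi * f * t)                  # same formula as the loop

# Estimate the final frequency from the last two rising zero crossings
zc = np.flatnonzero((x[:-1] < 0) & (x[1:] >= 0)) / Fsample
f_end_est = 1 / (zc[-1] - zc[-2])
print(f_end_est)  # well above the 110 kHz target
```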
