
I'm having issues setting a timer on the STM32F7 Discovery board to 500 kHz; I seem to top out at around 370 kHz for some reason. I'm toggling a GPIO pin, watching it with a scope, and simply changing the Period of the timer to monitor what's happening.

I'm using CubeMX to generate my project files and I initialise my timer:

static void MX_TIM1_Init(void)
{
  TIM_ClockConfigTypeDef sClockSourceConfig = {0};
  TIM_MasterConfigTypeDef sMasterConfig = {0};
  htim1.Instance = TIM1;
  htim1.Init.Prescaler = 0;
  htim1.Init.CounterMode = TIM_COUNTERMODE_UP;
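  /* The update event fires every (Prescaler + 1) * (Period + 1) timer clock cycles. */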
  htim1.Init.Period = 108;
  htim1.Init.ClockDivision = TIM_CLOCKDIVISION_DIV1;
  htim1.Init.RepetitionCounter = 0;
  htim1.Init.AutoReloadPreload = TIM_AUTORELOAD_PRELOAD_DISABLE;
  if (HAL_TIM_Base_Init(&htim1) != HAL_OK)
  {
    Error_Handler();
  }
  sClockSourceConfig.ClockSource = TIM_CLOCKSOURCE_INTERNAL;
  if (HAL_TIM_ConfigClockSource(&htim1, &sClockSourceConfig) != HAL_OK)
  {
    Error_Handler();
  }
  sMasterConfig.MasterOutputTrigger = TIM_TRGO_OC1;
  sMasterConfig.MasterOutputTrigger2 = TIM_TRGO2_OC1REF;
  sMasterConfig.MasterSlaveMode = TIM_MASTERSLAVEMODE_DISABLE;
  if (HAL_TIMEx_MasterConfigSynchronization(&htim1, &sMasterConfig) != HAL_OK)
  {
    Error_Handler();
  }

}

I then start the timer in interrupt mode:

if (HAL_TIM_Base_Start_IT(&htim1) != HAL_OK)
{
    Error_Handler();
}
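
CubeMX also enables the TIM1 update interrupt in the NVIC; this is roughly what the generated HAL_TIM_Base_MspInit() in stm32f7xx_hal_msp.c looks like (a sketch, assuming the default priority settings):

/* Sketch of the CubeMX-generated MSP init. On the F7, TIM1's update
   interrupt shares the TIM1_UP_TIM10 interrupt vector. */
void HAL_TIM_Base_MspInit(TIM_HandleTypeDef *htim_base)
{
  if (htim_base->Instance == TIM1)
  {
    __HAL_RCC_TIM1_CLK_ENABLE();
    HAL_NVIC_SetPriority(TIM1_UP_TIM10_IRQn, 0, 0);
    HAL_NVIC_EnableIRQ(TIM1_UP_TIM10_IRQn);
  }
}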

I then toggle a GPIO pin when the period has elapsed:

void HAL_TIM_PeriodElapsedCallback(TIM_HandleTypeDef *htim)
{
    if (htim->Instance == TIM1)
    {
        HAL_GPIO_TogglePin(GPIOG, GPIO_PIN_6);
    }
}

The GPIO pin is set as:

  GPIO_InitStruct.Pin = GPIO_PIN_6;
  GPIO_InitStruct.Mode = GPIO_MODE_OUTPUT_PP;
  GPIO_InitStruct.Pull = GPIO_NOPULL;
  GPIO_InitStruct.Speed = GPIO_SPEED_FREQ_VERY_HIGH;
  HAL_GPIO_Init(GPIOG, &GPIO_InitStruct);

However, I've played around with the Period on the timer and I've gotten the following results:

539 = 100 kHz
239 = 300 kHz
215 = 330 kHz
108 = 369 kHz

I'd expect to get 500 kHz with a Period of 215, but this isn't the case. Is there anything wrong with my settings?


1 Answer


The timer settings are right. The interrupt code is too slow.

The HAL library is not suited for timing-critical applications. HAL tries (and fails) to handle every possible use case in one-size-fits-all functions, which means lots of unnecessary processing and the associated delays. Instead of letting the CubeMX-generated TIM1 update interrupt handler call HAL_TIM_IRQHandler(), use a minimal handler that just clears the interrupt status and inverts the bit directly in GPIOG->ODR. This should do:

/* In stm32f7xx_it.c, replace the body of the CubeMX-generated handler
   (the call to HAL_TIM_IRQHandler) with this. On the F7, TIM1's update
   interrupt uses the TIM1_UP_TIM10 vector. */
void TIM1_UP_TIM10_IRQHandler(void) {
    TIM1->SR = ~TIM_SR_UIF;      /* write 0 to UIF only; other SR bits ignore writes of 1 */
    GPIOG->ODR ^= (1 << 6);      /* toggle PG6 directly */
}

Just two lines of code, instead of the 100+ lines of HAL_TIM_IRQHandler(). It should work up to 1 MHz, maybe more.

Toggling output pins in a timer interrupt handler is fine as an embedded programming exercise, but it wastes a significant number of CPU cycles to achieve what a timer can do on its own, delaying and possibly blocking other tasks.

A timer can output a square wave (a PWM signal) on its output channels at frequencies up to half of its source clock, with no CPU involvement at all. Look for PWM edge-aligned mode in the Reference Manual.
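
A minimal sketch of that approach in the question's HAL style (assuming TIM1 is clocked at 108 MHz, as the measurements suggest, that the TIM1_CH1 pin (PA8 on AF1) is already configured as an alternate-function output, and that the function name is just for illustration):

/* Sketch only: TIM1 CH1 in PWM edge-aligned mode, generating a 500 kHz,
   roughly 50% duty square wave entirely in hardware. Reuses the htim1
   handle from the question; assumes a 108 MHz TIM1 clock and that the
   CH1 pin is set up as an alternate-function output elsewhere. */
static void TIM1_PWM_500kHz_Init(void)
{
  TIM_OC_InitTypeDef sConfigOC = {0};

  htim1.Instance = TIM1;
  htim1.Init.Prescaler = 0;
  htim1.Init.CounterMode = TIM_COUNTERMODE_UP;
  htim1.Init.Period = 215;                         /* 108 MHz / (215 + 1) = 500 kHz */
  htim1.Init.ClockDivision = TIM_CLOCKDIVISION_DIV1;
  htim1.Init.RepetitionCounter = 0;
  if (HAL_TIM_PWM_Init(&htim1) != HAL_OK)
  {
    Error_Handler();
  }

  sConfigOC.OCMode = TIM_OCMODE_PWM1;
  sConfigOC.Pulse = 108;                           /* ~50% duty cycle */
  sConfigOC.OCPolarity = TIM_OCPOLARITY_HIGH;
  sConfigOC.OCFastMode = TIM_OCFAST_DISABLE;
  if (HAL_TIM_PWM_ConfigChannel(&htim1, &sConfigOC, TIM_CHANNEL_1) != HAL_OK)
  {
    Error_Handler();
  }

  /* For advanced timers (TIM1/TIM8), HAL_TIM_PWM_Start() also sets the
     main output enable (MOE) bit, so the waveform reaches the pin. */
  if (HAL_TIM_PWM_Start(&htim1, TIM_CHANNEL_1) != HAL_OK)
  {
    Error_Handler();
  }
}

Once started, the square wave runs with zero CPU load, and the timer's TRGO/update event is still available as an ADC trigger.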

  • I'm simply using the GPIO pin toggling to figure out what the frequency is. My goal is to use the timer as an external trigger for the ADC. My question is then: if the timer settings are right and I'm not toggling a GPIO, does that mean the ADC will still trigger at the right time? Or is there a way to use the LL libraries to achieve this? – MaskedAfrican Jul 12 '19 at 13:11
  • @MaskedAfrican Once started, the timer runs at the set speed no matter what. It will dutifully trigger interrupts, DMA conversions, whatever you tell it to do, at equal intervals. The timer doesn't care whether the code toggles any pins, or whether it completes before the next interrupt request, or if there's any code at all. If the code is too slow, it will miss some interrupts, but it won't cause the timer to stop and wait for the code to catch up. – followed Monica to Codidact Jul 12 '19 at 13:56
  • @MaskedAfrican The same holds for ADC, the timer would happily trigger it at any frequency up to 108MHz, but you should ensure that the trigger interval is greater than the sum of the ADC stabilization time, conversion time, and a third time that I can't recall right now. If the ADC can't keep up, it'll start missing the triggers, and you'd get fewer samples than expected. Now consider this carefully: the code has trouble inverting a single bit 600,000 times per second. How would it deal with samples coming in at nearly that speed? There is barely enough time to make a single library call. – followed Monica to Codidact Jul 12 '19 at 14:25
  • You're right, so essentially the LL libraries are the way to go. I'll take a look at the implementation and report back. I did use your `TIM1_UP_TIM10_IRQHandler` and it toggled at 500 kHz, so I guess this question is answered – MaskedAfrican Jul 12 '19 at 15:16
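
Following up on the ADC discussion in the comments: below is a rough sketch of timer-triggered regular conversions with DMA, so no per-sample interrupt code is needed. It assumes an STM32F7 HAL project where CubeMX has configured the ADC clock, the analog pin and a circular DMA stream, that the F7 HAL exposes the TIM1 TRGO trigger as ADC_EXTERNALTRIGCONV_T1_TRGO (check stm32f7xx_hal_adc.h in your firmware package), and that the timer's MasterOutputTrigger is set to TIM_TRGO_UPDATE rather than TIM_TRGO_OC1 so TRGO fires on every update. The buffer size and channel are placeholders.

/* Sketch only: ADC1 converts one regular channel on every TIM1 TRGO event,
   and DMA moves the results to RAM. Assumes CubeMX has already set up the
   ADC clock, the analog pin, and a circular DMA stream linked to hadc1. */
ADC_HandleTypeDef hadc1;
uint16_t adc_buf[256];                              /* placeholder buffer size */

static void ADC1_TimerTriggered_Init(void)
{
  ADC_ChannelConfTypeDef sConfig = {0};

  hadc1.Instance = ADC1;
  hadc1.Init.ClockPrescaler = ADC_CLOCK_SYNC_PCLK_DIV4;
  hadc1.Init.Resolution = ADC_RESOLUTION_12B;
  hadc1.Init.ScanConvMode = DISABLE;
  hadc1.Init.ContinuousConvMode = DISABLE;          /* one conversion per trigger */
  hadc1.Init.ExternalTrigConv = ADC_EXTERNALTRIGCONV_T1_TRGO;   /* assumed macro name */
  hadc1.Init.ExternalTrigConvEdge = ADC_EXTERNALTRIGCONVEDGE_RISING;
  hadc1.Init.DataAlign = ADC_DATAALIGN_RIGHT;
  hadc1.Init.NbrOfConversion = 1;
  hadc1.Init.DMAContinuousRequests = ENABLE;        /* keep DMA requests coming */
  hadc1.Init.EOCSelection = ADC_EOC_SINGLE_CONV;
  if (HAL_ADC_Init(&hadc1) != HAL_OK)
  {
    Error_Handler();
  }

  sConfig.Channel = ADC_CHANNEL_0;                  /* placeholder: use your channel */
  sConfig.Rank = 1;
  sConfig.SamplingTime = ADC_SAMPLETIME_15CYCLES;   /* must fit inside the trigger period */
  if (HAL_ADC_ConfigChannel(&hadc1, &sConfig) != HAL_OK)
  {
    Error_Handler();
  }

  /* Start the ADC+DMA first, then start TIM1; each TRGO event then starts
     one conversion and DMA stores the result without any interrupt code. */
  if (HAL_ADC_Start_DMA(&hadc1, (uint32_t *)adc_buf, 256) != HAL_OK)
  {
    Error_Handler();
  }
}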