
I am trying to compute buy-and-hold abnormal returns (BHAR) for a sample of stocks with different event dates.

The formula to compute BHAR is as follows:

$$BHAR_{i,T} = \prod_{t=1}^{T}\left(1 + R_{i,t}\right) - \prod_{t=1}^{T}\left(1 + E[R_{i,t}]\right)$$

$R_{i,t}$ is the actual firm return and the expected return is computed using the market model:

$$E[R_{i,t}] = \alpha_i + \beta_i R_{m,t}$$

The alpha and beta are estimated by regressing the firm's returns on the market proxy's returns over an estimation period of 252 trading days that ends 11 days before the event date (to shield the estimation from the effect of the announcement).
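To check that I understand the mechanics, here is a toy illustration with made-up numbers (the return vectors, alpha and beta below are placeholders, not estimates from my data):

# Made-up daily returns for one firm and the market over a short event window
firm_ret <- c(0.010, -0.005, 0.020, 0.003)
mkt_ret  <- c(0.008, -0.002, 0.015, 0.001)

# Alpha and beta would normally come from the estimation-window regression
alpha <- 0.0001
beta  <- 1.1

# Market-model expected return for each day of the window
exp_ret <- alpha + beta * mkt_ret

# BHAR = buy-and-hold actual return minus buy-and-hold expected return
bhar <- prod(1 + firm_ret) - prod(1 + exp_ret)
bhar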

I have 3 dataframes:

  • index_data with prices and date of the market index used as proxy
  • stock_data with the tickers of my selected firms, the date and the closing price
  • event_data with event dates and the corresponding tickers (simulated stand-ins for all three are sketched right below)
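To give an idea of their structure, here are simulated stand-ins for the three dataframes (random prices over roughly one year; the column names match the ones in my code below, but the ticker and the values are obviously not my real data):

set.seed(1)
dates <- seq(as.Date("2021-06-01"), as.Date("2022-06-30"), by = "day")

# Market index proxy: date + closing price
index_data <- data.frame(
  date  = dates,
  close = 4000 * cumprod(1 + rnorm(length(dates), 0, 0.01))
)

# Firm prices: ticker + date + adjusted closing price (a single ticker here for brevity)
stock_data <- data.frame(
  Ticker   = "AAA",
  date     = dates,
  adjusted = 100 * cumprod(1 + rnorm(length(dates), 0, 0.02))
)

# Events: ticker + event date (stored as dd-mm-yyyy strings, like in my file)
event_data <- data.frame(
  Ticker = "AAA",
  date   = "15-06-2022"
)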

In pseudo-code, what I'd like to do is:

for each value in event_data

  • compute the estimation window start as: event date - 265 days - 11 days
  • compute the estimation window end as: event date - 11 days
  • fit a linear regression model between stock_data and index_data on the estimation window (on returns rather than prices; see the sketch right after this list)
  • extract the alpha and beta coefficients for each stock
  • compute the expected return as alpha + beta * market return in a new column of the stock_data dataframe

end for

  • compute the BHAR as the cumulative product of the actual firm returns minus the cumulative product of the expected returns
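One step the pseudo-code glosses over: stock_data and index_data hold prices, while the market-model regression and the BHAR both work on returns. So I plan to add simple daily return columns first, roughly like this (actual_return and market_return are placeholder column names, and rows are assumed to be sorted by date within each ticker):

# Daily simple returns from closing prices (placeholder column names)
index_data$market_return <- c(NA, diff(index_data$close) / head(index_data$close, -1))

# Same for each stock; ave() applies the function ticker by ticker
stock_data$actual_return <- ave(stock_data$adjusted, stock_data$Ticker,
                                FUN = function(p) c(NA, diff(p) / head(p, -1)))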

I have tried the following code:

# Loop through the first 19 tickers in event_data
for (ticker in event_data$Ticker[1:19]) {
  # Get the event date for the current ticker
  event_date <- as.Date(event_data$date[event_data$Ticker == ticker], format = "%d-%m-%Y")
  
  # Compute the start and end dates for the estimation window
  start_date <- event_date - 265 - 11
  end_date <- event_date - 11
  
  # Subset the stock_data and index_data dataframes to only include the dates in the estimation window
  stock_subset <- stock_data[stock_data$Ticker == ticker & stock_data$date >= start_date & stock_data$date <= end_date, ]
  index_subset <- index_data[index_data$date >= start_date & index_data$date <= end_date, ]
  
  # Fit a linear regression model between the stock and index data
  model <- lm(stock_subset$adjusted ~ index_subset$close)
  
  # Extract the alpha and beta coefficients from the model
  alpha <- coef(model)[1]
  beta <- coef(model)[2]
  
  # Compute the expected return for the current ticker and add it to the stock_data dataframe
  # (market_return is not defined anywhere above; this line is still a placeholder)
  stock_data[stock_data$Ticker == ticker, "expected_return"] <- alpha + beta * market_return
}

The error I get when running the code is:

"Error in model.frame.default(formula = stock_subset$adjusted ~ index_subset$close, : variables length differ (found for 'index_subset$close')"


I am a big newbie; I am working on providing an MRE (minimal reproducible example).

Thank you for your help.

Simon
    I think the problem with your regression is that the two variables are of different lengths. They need to be the same length. It's hard to help you because not all the relevant information is in your post. Please read this and edit your question to address the points therein: https://stackoverflow.com/questions/5963269/how-to-make-a-great-r-reproducible-example – John Polo Dec 20 '22 at 02:01
  • The contrasts error usually happens when your factor variable has only one level. We can't say for sure what is the problem without a sample dataset. – Iyar Lin Dec 20 '22 at 12:56
