
To implement a rather intricate design of a screen in an iOS app, I have a UITableView nested inside of a UIScrollView.

To keep the logic simple, I implemented a method on the UITableView that calculates its entire height, and I use the result of that method to set a height constraint on the nested table view, so that all scrolling is handled solely by the UIScrollView. (I forward methods such as scrollRectToVisible from the UITableView to the UIScrollView.)
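For context, a minimal sketch of the setup described above. The class and property names (`NestedTableView`, `outerScrollView`, `heightConstraint`) are my own, not from any framework:

```swift
import UIKit

// Sketch: a table view pinned to its full content height, with
// scrolling delegated to an enclosing scroll view.
final class NestedTableView: UITableView {

    var heightConstraint: NSLayoutConstraint?
    weak var outerScrollView: UIScrollView?

    // Recompute the table's full height and pin a constraint to it.
    // This is what makes UIKit treat every row as "visible".
    func pinToContentHeight() {
        layoutIfNeeded()
        let fullHeight = contentSize.height
        if let constraint = heightConstraint {
            constraint.constant = fullHeight
        } else {
            let constraint = heightAnchor.constraint(equalToConstant: fullHeight)
            constraint.isActive = true
            heightConstraint = constraint
        }
        isScrollEnabled = false // the outer scroll view scrolls instead
    }

    // Forward scroll requests to the outer scroll view, converting the
    // rect into its coordinate space first.
    override func scrollRectToVisible(_ rect: CGRect, animated: Bool) {
        guard let outer = outerScrollView else {
            super.scrollRectToVisible(rect, animated: animated)
            return
        }
        let converted = convert(rect, to: outer)
        outer.scrollRectToVisible(converted, animated: animated)
    }
}
```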

While this works great with small data sets, I have recently discovered that the reuse capabilities of the UITableView are not used, because the framework believes the entire UITableView is visible when I set that height constraint. A simple log statement in the cellForRowAtIndexPath method shows that all cells get created at once.

My question is: is there any way to tell the nested UITableView how much of it is actually visible on screen, so that it only computes those visible cells?

I basically need to override whatever part of UITableView that is responsible for calculating what cells should be visible on screen.

rmaddy
Gagan Singh

1 Answer


The table view will think of itself as filling its whole frame with cells. If you limit its height, it will limit the number of cells it considers visible. Are you using the dequeue-with-reuse-identifier method? (If not, see below.)

How can I recycle UITableViewCell objects created from a XIB?
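For reference, a minimal sketch of cell reuse via a registered identifier. The data source class and its contents are illustrative, not from the question:

```swift
import UIKit

// Minimal data source demonstrating cell reuse.
final class ExampleDataSource: NSObject, UITableViewDataSource {
    static let reuseID = "Cell"
    private let items = ["One", "Two", "Three"]

    func tableView(_ tableView: UITableView,
                   numberOfRowsInSection section: Int) -> Int {
        items.count
    }

    func tableView(_ tableView: UITableView,
                   cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        // dequeueReusableCell(withIdentifier:for:) recycles off-screen
        // cells instead of allocating a fresh one for every row.
        let cell = tableView.dequeueReusableCell(
            withIdentifier: Self.reuseID, for: indexPath)
        cell.textLabel?.text = items[indexPath.row]
        return cell
    }
}

// Registration, e.g. in viewDidLoad:
// tableView.register(UITableViewCell.self,
//                    forCellReuseIdentifier: ExampleDataSource.reuseID)
```

Note that reuse only helps when some rows are off screen; with the height constraint described in the question, the table considers every row visible, so dequeuing alone will not reduce the cell count.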

Michael Voznesensky