Below is my drawRect function for a simple custom view. When I use

var h: CGFloat = 200

the view displays properly: a rectangle that is part black and part red. But when I use

var h: CGFloat = fractionalHeight*parentViewHeight

all I get is a black rectangle. If fractionalHeight (a property of the view) is 0.5, half of the rectangle should be red. The print statement confirms that h is what I think it should be. Hmmmmm? What's going on?

    override func drawRect(rect: CGRect) {
        let ctx = UIGraphicsGetCurrentContext()

        CGContextClearRect(ctx, rect)

        let parentViewBounds = self.bounds
        let parentViewWidth = CGRectGetWidth(parentViewBounds)
        let parentViewHeight = CGRectGetHeight(parentViewBounds)
        println("w,h = \(parentViewWidth),\(parentViewHeight)")

        CGContextSetRGBFillColor(ctx, 0.0, 0.0, 0.0, 1) // black
        CGContextFillRect(ctx, CGRectMake(0, 0, parentViewWidth, parentViewHeight))

        println("in drawRect, fractionalHeight = \(fractionalHeight)")

        var h: CGFloat = 200 // fractionalHeight*parentViewHeight

        CGContextSetRGBFillColor(ctx, 1.0, 0.0, 0.0, 1) // red
        CGContextFillRect(ctx, CGRectMake(0, 0, parentViewWidth, h))
        println("in drawRect, h = \(h)")
    }
rmaddy
jxxcarlson
  • Your code works for me. How do you have `fractionalHeight` declared? I used `var fractionalHeight: CGFloat = 0.5`. – vacawama Jun 14 '14 at 23:35
  • Yes, I have `class OCIndicator: UIView {`, then `var fractionalHeight: Float` declaring the property. – jxxcarlson Jun 14 '14 at 23:38
  • Where are you setting `fractionalHeight` to `0.5`? – vacawama Jun 14 '14 at 23:46
  • I'm setting in `viewDidLoad: override func viewDidLoad() { super.viewDidLoad() dict = NSMutableDictionary(contentsOfFile: path) var indicator: OCIndicator = OCIndicator(frame: CGRectMake(10, 30, 50, 400)) self.view.addSubview(indicator) self.indicator.fractionalHeight = 0.5 println("indicator.fractionalHeight = \(self.indicator.fractionalHeight)") self.indicator.setNeedsDisplay() }` – jxxcarlson Jun 15 '14 at 00:14
  • The initialization code for the indicator object sets the property fractionalHeight to zero. Later, in viewDidLoad, I set it to 0.5 and call setNeedsDisplay on indicator. I've found that if I set it to 0.5 in the initialization code, it displays properly. But of course I want to be able to reset it later. – jxxcarlson Jun 15 '14 at 00:31

1 Answer

I believe you are talking to two different OCIndicator instances.

I believe you have an OCIndicator (in your storyboard, perhaps) hooked up to an IBOutlet in your view controller.

In viewDidLoad you are creating a second instance of OCIndicator, assigning it to the local variable indicator, and adding it as a subview, which covers up the first one. But then you are setting fractionalHeight on self.indicator, which is not the same indicator, and telling that one it needs to display; it has been covered up by the one added in viewDidLoad.

So, get rid of:

var indicator: OCIndicator = OCIndicator(frame: CGRectMake(10, 30, 50, 400))
self.view.addSubview(indicator)

from viewDidLoad and see how that works.
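With those two lines gone, viewDidLoad only has to configure the single storyboard instance reachable through the outlet. A minimal sketch, in the question's Swift 1 era style; the view controller name OCViewController and the outlet name indicator are assumptions, not taken from your project:

    class OCViewController: UIViewController {

        // Outlet connected to the OCIndicator placed in the storyboard
        @IBOutlet var indicator: OCIndicator!

        override func viewDidLoad() {
            super.viewDidLoad()
            // Configure the existing storyboard instance;
            // do NOT create and add a second OCIndicator here
            self.indicator.fractionalHeight = 0.5
            self.indicator.setNeedsDisplay()
        }
    }

Now the instance whose fractionalHeight you set is the same instance that receives setNeedsDisplay and is actually on screen.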

vacawama
  • @vacawama Could you help me with my drawRect issue? http://stackoverflow.com/questions/39855349/make-2-contradictory-methods-work-in-drawrect – Andy Jazz Oct 16 '16 at 15:21