I have this method, defined as an extension on the String class, which calculates the size of a piece of text before rendering it:
Size textSize({
  required TextDirection textDirection,
  required double textScaleFactor,
  required double minWidth,
  required double maxWidth,
  TextStyle? textStyle,
  int? maxLines,
  TextAlign? textAlign,
  Locale? locale,
  String? ellipsis,
  StrutStyle? strutStyle,
  TextHeightBehavior? textHeightBehavior,
  TextWidthBasis? textWidthBasis,
}) {
  final textSpan = TextSpan(
    text: this,
    style: textStyle,
    locale: locale,
  );
  final TextPainter tp = TextPainter(
    textDirection: textDirection,
    textScaleFactor: textScaleFactor,
    text: textSpan,
    maxLines: maxLines,
    textAlign: textAlign ?? TextAlign.start,
    locale: locale,
    ellipsis: ellipsis,
    strutStyle: strutStyle,
    textHeightBehavior: textHeightBehavior,
    textWidthBasis: textWidthBasis ?? TextWidthBasis.parent,
  )..layout(
      minWidth: minWidth,
      maxWidth: maxWidth,
    );
  return tp.size;
}
This method works and gives the expected results when running in the app. But when I call it in a test with the same parameters I used in the app, it gives different results.
I tried digging into its internals, but I couldn't figure out what difference between my test and the app is causing the problem. Can anyone explain why it gives two different results for the same parameters, and how to fix it?
Note: the parameters I tried are:
text: "Type the text here:",
textDirection: Directionality.of(context),
minWidth: 0.0,
maxWidth: moc.size.width,
textScaleFactor: 1.0,
textStyle: const widgets.TextStyle(fontSize: 30, color: Colors.black),
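In context, the call looks roughly like this (a minimal sketch, not my real widget; I'm assuming here that `moc` is the MediaQueryData obtained from MediaQuery.of(context), and the widget name is illustrative):

// Sketch of the call site in the app (simplified; names are illustrative).
// Assumes `moc` is the MediaQueryData from MediaQuery.of(context).
import 'package:flutter/material.dart';
// ...plus the import for the String.textSize extension.

class MeasuredLabel extends StatelessWidget {
  const MeasuredLabel({super.key});

  @override
  Widget build(BuildContext context) {
    final MediaQueryData moc = MediaQuery.of(context);
    final Size size = 'Type the text here:'.textSize(
      textDirection: Directionality.of(context),
      textScaleFactor: 1.0,
      minWidth: 0.0,
      maxWidth: moc.size.width,
      textStyle: const TextStyle(fontSize: 30, color: Colors.black),
    );
    // In the running app this reports Size(258.0, 34.0).
    return Text('measured: $size');
  }
}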
In the normal app it gives Size(258.0, 34.0), but in the test it gives Size(570.0, 30.0).
I know there are some implicit parameter differences between the test environment and the normal app, but I can't figure out which ones. Even setting the screen size in the test didn't help.
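For reference, this is roughly what my test looks like (a simplified sketch; the real widget and the screen-size values differ, I'm only showing how I set the size via the test binding):

// Sketch of the widget test (simplified; real widget and values differ).
import 'package:flutter/material.dart';
import 'package:flutter_test/flutter_test.dart';
// ...plus the import for the String.textSize extension.

void main() {
  testWidgets('textSize matches the size measured in the app', (tester) async {
    // Attempt to match the app's screen size (example values only).
    tester.binding.window.physicalSizeTestValue = const Size(1080, 1920);
    tester.binding.window.devicePixelRatioTestValue = 2.75;
    addTearDown(tester.binding.window.clearPhysicalSizeTestValue);
    addTearDown(tester.binding.window.clearDevicePixelRatioTestValue);

    late Size measured;
    await tester.pumpWidget(
      MaterialApp(
        home: Builder(
          builder: (context) {
            measured = 'Type the text here:'.textSize(
              textDirection: Directionality.of(context),
              textScaleFactor: 1.0,
              minWidth: 0.0,
              maxWidth: MediaQuery.of(context).size.width,
              textStyle: const TextStyle(fontSize: 30, color: Colors.black),
            );
            return const SizedBox();
          },
        ),
      ),
    );

    // Prints Size(570.0, 30.0) here, while the app reports Size(258.0, 34.0).
    debugPrint('measured: $measured');
  });
}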