And the culprit was (drum roll): the Instabug framework.
They tell you right there on their marketware pages that they let users take audio notes while composing feedback. So I've added an NSMicrophoneUsageDescription entry to the app's Info.plist explaining that.
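For reference, the entry looks something like this (the description string is my own wording, not Instabug's):

```xml
<key>NSMicrophoneUsageDescription</key>
<string>The bug-reporting feature lets you attach a voice note to your feedback. The app itself never records audio.</string>
```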
Note how much Apple API Instabug pulls in. Here is the linker output (I've removed the symbols that seem legitimate given what the framework claims to do, and kept the ones I see no claims for in the marketware):

Undefined symbols for architecture arm64:
"_AVMakeRectWithAspectRatioInsideRect", referenced from:
+[IBGIAMImageAttachmentView sizeForContent:forWidth:] in InstabugHost_lto.o
"OBJC_CLASS$_CTTelephonyNetworkInfo", referenced from:
objc-class-ref in InstabugHost_lto.o
"_AVNumberOfChannelsKey", referenced from:
-[IBGVoiceNoteManager startRecording] in InstabugHost_lto.o
"_CTRadioAccessTechnologyHSDPA", referenced from:
+[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyGPRS", referenced from:
+[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyWCDMA", referenced from:
+[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyEdge", referenced from:
+[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyCDMA1x", referenced from:
+[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyCDMAEVDORevA", referenced from:
+[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyCDMAEVDORevB", referenced from:
+[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyLTE", referenced from:
+[IBGInspector getCarrier] in InstabugHost_lto.o
"OBJC_CLASS$_AVURLAsset", referenced from:
OBJC_CLASS$_IBGAsset in InstabugHost_lto.o
"OBJC_METACLASS$_AVURLAsset", referenced from:
OBJC_METACLASS$_IBGAsset in InstabugHost_lto.o
"_CTRadioAccessTechnologyCDMAEVDORev0", referenced from:
+[IBGInspector getCarrier] in InstabugHost_lto.o
"_CTRadioAccessTechnologyHSUPA", referenced from:
+[IBGInspector getCarrier] in InstabugHost_lto.o
ld: symbol(s) not found for architecture arm64
So in this post-Snowden world I have to wonder: why does it need CoreTelephony, for example?
So what I'm getting at is this: if you do not have the source of a 3rd-party framework, you have to disclose to the user that your app itself is NOT using the microphone or camera, so that the user has the option of denying access to that device.
You don't want to be in the news someday due to some security flaw
exploited via YOUR app.
Unresolved: the carefully crafted microphone usage description does not completely solve the security issue, though, in case your app DOES use the microphone and a 3rd-party framework needs (or thinks it needs) it too. You'd have to craft a lengthy description outlining the risks.
Here's where a credits disclosure could come in handy, giving users an idea of which 3rd-party code you are relying on. Give credit where it's due :^)
If you are lazy like myself and never read through the iOS security whitepaper, here's a short WWDC talk instead: https://developer.apple.com/videos/play/wwdc2016/705/
In case you have no desire to watch the video in its entirety: around the 19:00 mark the speaker tells you explicitly that you must not be lazy with those descriptions (you are responsible for 3rd-party code potentially abusing the permissions the user has granted to your app).
Gotta love binary frameworks ;^)
UPD for iOS 15: Apple has acted upon the security hole of 3rd-party binary-only frameworks requesting access to the microphone and added an audit trail for (among other things) microphone usage in iOS 15. It's called App Privacy Report, in Settings. Thusly, part of the responsibility to audit that trail is shifted towards the users of any app that has 3rd-party junkware embedded. Amen.