I am trying to stream audio from the microphone of an Android device to a server over TCP. The problem is that I get an error in the log: the TCP connection is established, but no audio data is ever sent.
I realize it could be a bad choice of codec, since some container formats need to seek in the output stream, which a socket does not support. I can use any codec that works, but I read that MediaRecorder.OutputFormat.RAW_AMR together with MediaRecorder.AudioEncoder.AMR_NB was the best combination for streaming. Please suggest an alternative if there is a better one.
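For reference, this is the recorder setup I mean; a minimal sketch. (As far as I can tell, OutputFormat.RAW_AMR is deprecated and OutputFormat.AMR_NB is the newer constant for the same raw AMR container, so I would switch to that if it matters.)

MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
// AMR_NB output format replaces the deprecated RAW_AMR; raw AMR is
// written sequentially, so the writer should not need to seek.
recorder.setOutputFormat(MediaRecorder.OutputFormat.AMR_NB);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);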
Here is what I see in the log (status -38 appears to be the media framework's INVALID_OPERATION error):
11-06 11:09:27.276 22983-22983/se.jensolsson.test.test D/ViewRootImpl@5ed8717[MainActivity]: ViewPostImeInputStage processPointer 0
11-06 11:09:27.355 22983-22983/se.jensolsson.test.test D/ViewRootImpl@5ed8717[MainActivity]: ViewPostImeInputStage processPointer 1
11-06 11:09:27.387 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: setup
11-06 11:09:27.394 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: setAudioSource(1)
11-06 11:09:27.397 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: setAudioEncoder(1)
11-06 11:09:27.400 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: setOutputFile
11-06 11:09:27.400 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: prepare
11-06 11:09:27.407 22983-25466/se.jensolsson.test.test I/MediaRecorderJNI: start
11-06 11:09:27.408 22983-25466/se.jensolsson.test.test E/MediaRecorder: start failed: -38
11-06 11:09:27.408 22983-25466/se.jensolsson.test.test W/System.err: java.lang.IllegalStateException
11-06 11:09:27.411 22983-25466/se.jensolsson.test.test W/System.err: at android.media.MediaRecorder._start(Native Method)
11-06 11:09:27.411 22983-25466/se.jensolsson.test.test W/System.err: at android.media.MediaRecorder.start(MediaRecorder.java:1170)
11-06 11:09:27.411 22983-25466/se.jensolsson.test.test W/System.err: at se.jensolsson.test.test.MainActivity$1$1.run(MainActivity.java:78)
11-06 11:09:27.411 22983-25466/se.jensolsson.test.test W/System.err: at java.lang.Thread.run(Thread.java:762)
Here are the relevant parts of the AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA"/>
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<application
android:allowBackup="true"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/AppTheme">
<activity android:name=".MainActivity">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
</application>
Here is the source code:
import android.Manifest;
import android.content.pm.PackageManager;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.ParcelFileDescriptor;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.widget.Button;

import java.io.IOException;
import java.net.Socket;

public class MainActivity extends AppCompatActivity {

    private static final int REQUEST_RECORD_AUDIO_PERMISSION = 200;

    private MediaRecorder mediaRecorder;
    private boolean permissionToRecordAccepted;
    private ParcelFileDescriptor pfd;

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        switch (requestCode) {
            case REQUEST_RECORD_AUDIO_PERMISSION:
                permissionToRecordAccepted = grantResults[0] == PackageManager.PERMISSION_GRANTED;
                if (!permissionToRecordAccepted) finish();
                break;
        }
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        ActivityCompat.requestPermissions(this, new String[] { Manifest.permission.RECORD_AUDIO }, REQUEST_RECORD_AUDIO_PERMISSION);

        Button buttonStartRecording = (Button) findViewById(R.id.button_start_recording);
        buttonStartRecording.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                new Thread(new Runnable() {
                    @Override
                    public void run() {
                        try {
                            // Connect to the server and wrap the socket's file descriptor.
                            Socket s = new Socket("10.0.83.8", 8888);
                            ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(s);

                            MediaRecorder recorder = new MediaRecorder();
                            recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
                            recorder.setOutputFormat(MediaRecorder.OutputFormat.RAW_AMR);
                            recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
                            // Record straight into the socket's descriptor.
                            recorder.setOutputFile(pfd.getFileDescriptor());
                            try {
                                recorder.prepare();
                            } catch (IllegalStateException e) {
                                e.printStackTrace();
                            } catch (IOException e) {
                                e.printStackTrace();
                            }
                            recorder.start(); // <-- fails here with IllegalStateException (-38)
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                }).start();
            }
        });
    }
}
The device I am running on is a Samsung Galaxy A5 with Android 7.0, and I am using minSdkVersion 22 and targetSdkVersion 26 in the Gradle file.
EDIT: The preinstalled Voice Recorder app works fine, so I don't see how the microphone could be busy.
EDIT 2: If I change to the following and save to a file instead of a stream, it seems to work. So I still suspect the problem is the combination of the sound format and streaming, since a network stream does not support seeking. If that is the case, what format should I use?
//recorder.setOutputFile(pfd.getFileDescriptor());
File outputFile = File.createTempFile("test", "mp4", getApplicationContext().getCacheDir());
recorder.setOutputFile(outputFile.getPath());
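One way I could test the seek theory is a small helper (using android.system.Os, available since API 21; isSeekable is my own name) that checks whether a descriptor supports lseek:

import android.system.ErrnoException;
import android.system.Os;
import android.system.OsConstants;
import java.io.FileDescriptor;

// Returns true if the descriptor supports seeking. A regular file should
// return true; a socket or pipe descriptor should fail with ErrnoException
// (ESPIPE, "illegal seek"), which would explain why the recorder can start
// on a file but not on the socket.
static boolean isSeekable(FileDescriptor fd) {
    try {
        Os.lseek(fd, 0, OsConstants.SEEK_CUR);
        return true;
    } catch (ErrnoException e) {
        return false;
    }
}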
EDIT 3: None of the answers are correct. I have now found out that the main problem is that I cannot write the sound data to a descriptor created by ParcelFileDescriptor.fromSocket.
It works, however, if I do this:
ParcelFileDescriptor[] mParcelFileDescriptors = ParcelFileDescriptor.createPipe();
final ParcelFileDescriptor mParcelRead = new ParcelFileDescriptor(mParcelFileDescriptors[0]);
ParcelFileDescriptor mParcelWrite = new ParcelFileDescriptor(mParcelFileDescriptors[1]);
And then I send the pipe's contents to the server myself. I do not know whether there is a timing issue with this, or whether it could corrupt certain sound formats, since I would guess the recorder may seek back and rewrite header bytes at any time depending on the format.
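For completeness, this is roughly how I pump the pipe's read end to the server; a minimal sketch, where socket is my placeholder name for the connected TCP socket (both it and mParcelRead have to be final to be visible inside the anonymous class), and java.io.InputStream/OutputStream are imported:

// Pump thread: copies everything MediaRecorder writes into the pipe's
// write end (mParcelWrite.getFileDescriptor() passed to setOutputFile)
// over to the TCP socket. read() blocks until data arrives and returns
// -1 once the write end is closed.
new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            InputStream in = new ParcelFileDescriptor.AutoCloseInputStream(mParcelRead);
            OutputStream out = socket.getOutputStream();
            byte[] buffer = new byte[4096];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
            out.flush();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}).start();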