I'm trying to create an app for Glass that streams video using the GDK. I found a nice example using libstreaming (https://github.com/fyhertz/libstreaming) here: Google Glass stream video to server
The solution there runs a streaming server on the phone, and VLC can be used to connect to it. With the receive buffer set to 0, I can get the delay down to roughly 0.1 s, at the cost of quality.
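For reference, this is roughly how I open the stream in VLC; the IP address is just an example, 8086 is libstreaming's default RTSP port, and --network-caching=0 is what I mean by a zero receive buffer:

vlc rtsp://192.168.1.10:8086 --network-caching=0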
I was able to run the example on my Note 2 at a maximum of 320x240, 30 fps, 500 kbps. When I try to run it on Glass, the maximum is 176x144 and I get a "Fail to connect to camera service" error.
I'm curious: could I get more than 176x144 over H.264 or another codec/protocol? I'm completely stuck on the "Fail to connect to camera service" error.
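For what it's worth, the error seems to come from Camera.open() itself rather than from libstreaming: "Fail to connect to camera service" is the RuntimeException message Camera.open() throws when another app (or a leaked preview) is still holding the camera. A minimal check along these lines (my own sketch, not code from the example; the helper name is made up):

import android.hardware.Camera;

// Returns true if the camera can actually be acquired right now.
private boolean isCameraAvailable() {
    Camera camera = null;
    try {
        camera = Camera.open(); // throws "Fail to connect to camera service" if the camera is busy
        return true;
    } catch (RuntimeException e) {
        return false;
    } finally {
        if (camera != null) camera.release(); // always hand the camera back
    }
}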
My code is pretty much the same as here: Google Glass stream video to server, except:
in MainActivity:
SessionBuilder.getInstance()
        .setSurfaceView((SurfaceView) findViewById(R.id.surface))
        .setCallback(this)
        .setPreviewOrientation(90)
        .setContext(getApplicationContext())
        .setAudioEncoder(SessionBuilder.AUDIO_NONE)              // video only, no audio track
        .setVideoEncoder(SessionBuilder.VIDEO_H264)
        .setVideoQuality(new VideoQuality(176, 144, 12, 24000)); // 176x144 @ 12 fps, 24 kbps
// the RtspServer service is then started as sketched after the manifest below
in Manifest:
<uses-permission android:name="com.google.android.glass.permission.DEVELOPMENT"/>
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
Any help or fresh ideas are much appreciated. I've heard that the Mirror API can be used to stream video; unfortunately, I haven't investigated that yet.