* Android 4.0 or more recent is required.
* Supported encoders include H.264, H.263, AAC and AMR.
The first step you will need to achieve to start a streaming session to some peer is called 'signaling'. During this step, you will contact the receiver and send a description of the incoming streams. There are three ways to do that with libstreaming:
* With the RTSP client: if you want to stream to a Wowza Media Server, this is the way to go. [Example 3](https://github.com/fyhertz/libstreaming-examples#example-3) illustrates that use case; a rough sketch follows this list.
* With the RTSP server: in that case the phone will act as an RTSP server and wait for an RTSP client to request a stream. This use case is illustrated in [example 1](https://github.com/fyhertz/libstreaming-examples#example-1).
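Signaling with the RTSP client might look roughly like the sketch below. It assumes a **Session** already built with **SessionBuilder** (shown later in this README); the server address, port and stream path are hypothetical placeholders, and the method names follow the libstreaming examples rather than a guaranteed API.

```
// Rough sketch of signaling with the RTSP client (cf. example 3).
// "my.wowza.server", 1935 and "/live/myStream" are hypothetical placeholders.
RtspClient client = new RtspClient();
client.setSession(mSession);                      // a Session built with SessionBuilder
client.setServerAddress("my.wowza.server", 1935); // where the RTSP server listens
client.setStreamPath("/live/myStream");           // the path announced to the server
client.startStream();                             // connects and starts streaming
```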
There are three ways on Android to get encoded data from the peripherals:
### Encoding with the MediaRecorder API
The **MediaRecorder** API was not intended for streaming applications, but it can be used to retrieve encoded data from the peripherals of the phone. The trick is to configure a MediaRecorder instance to write to a **LocalSocket** instead of a regular file (see **MediaStream.java**).
Edit: as of Android Lollipop, using a **LocalSocket** is no longer possible for security reasons, but using a [**ParcelFileDescriptor**](http://developer.android.com/reference/android/os/ParcelFileDescriptor.html) does the trick. More details in the file **MediaStream.java**! ([Thanks to those guys for the insight](http://stackoverflow.com/questions/26990816/mediarecorder-issue-on-android-lollipop))
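As an illustration, here is a minimal audio-only sketch of that trick on Lollipop and later. It is not libstreaming's actual code (see **MediaStream.java** for that), just the general idea; exception handling is omitted.

```
// Sketch: make MediaRecorder write into a pipe instead of a regular file.
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);

ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
recorder.setOutputFile(pipe[1].getFileDescriptor()); // write end of the pipe
recorder.prepare();
recorder.start();

// The encoded stream can now be read back from the read end of the pipe:
InputStream encoded = new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
```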
### Encoding with the MediaCodec API

The **MediaCodec** API does not present the limitations I just mentioned, but it has issues of its own.
The buffer-to-buffer method uses calls to [**dequeueInputBuffer**](http://developer.android.com/reference/android/media/MediaCodec.html#dequeueInputBuffer(long)) and [**queueInputBuffer**](http://developer.android.com/reference/android/media/MediaCodec.html#queueInputBuffer(int, int, int, long, int)) to feed the encoder with raw data.
That seems easy, right? Well, it's not: the video encoders you can access with this API use different color formats, and you need to support all of them. A list of those color formats is available [here](http://developer.android.com/reference/android/media/MediaCodecInfo.CodecCapabilities.html). Moreover, many encoders claim support for color formats they don't actually handle properly, or exhibit small glitches.
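To give an idea of the method itself, here is a stripped-down sketch; the encoder is assumed to be already configured and started, and **rawFrame** and **presentationTimeUs** are hypothetical variables holding a frame converted to the encoder's color format and its timestamp.

```
// Sketch of the buffer-to-buffer method (pre-API-21 style).
ByteBuffer[] inputBuffers = encoder.getInputBuffers();
int inIndex = encoder.dequeueInputBuffer(500000); // timeout in microseconds
if (inIndex >= 0) {
    inputBuffers[inIndex].clear();
    inputBuffers[inIndex].put(rawFrame);          // raw YUV data in the encoder's format
    encoder.queueInputBuffer(inIndex, 0, rawFrame.length, presentationTimeUs, 0);
}

MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outIndex = encoder.dequeueOutputBuffer(info, 500000);
if (outIndex >= 0) {
    ByteBuffer encodedData = encoder.getOutputBuffers()[outIndex];
    // hand encodedData (info.offset .. info.offset + info.size) to the packetizer
    encoder.releaseOutputBuffer(outIndex, false);
}
```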
The whole [**hw**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/hw/package-summary.html) package is dedicated to solving those color-format issues. See in particular the [**EncoderDebugger**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/hw/EncoderDebugger.html) class.
If streaming with that API fails, libstreaming falls back on streaming with the **MediaRecorder** API.
The surface-to-buffer method uses the [createInputSurface()](http://developer.android.com/reference/android/media/MediaCodec.html#createInputSurface()) method. This is probably the best way to encode raw video from the camera, but it requires Android 4.3 and up.
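A bare-bones sketch of that method follows; the format parameters are illustrative only and error handling is omitted.

```
// Sketch of the surface-to-buffer method (requires API 18 / Android 4.3).
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
Surface inputSurface = encoder.createInputSurface(); // call between configure() and start()
encoder.start();
// Frames rendered into inputSurface (e.g. with OpenGL ES, see the gl package)
// come out of the encoder already encoded.
```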
The [**gl**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/gl/package-summary.html) package is dedicated to using the MediaCodec API with a surface.
It is not yet enabled by default in libstreaming, but you can force it with the [**setStreamingMethod(byte)**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/MediaStream.html#setStreamingMethod(byte)) method.
### Packetization process
If you are looking for a basic implementation of one of the RFCs mentioned above, check the sources of the corresponding class.
Since version 2.0 of libstreaming, RTCP packets are also sent to the receiver. Only Sender Reports are implemented; they are needed for lip synchronization.
The [**rtp**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/rtp/package-summary.html) package handles packetization of encoded data in RTP packets.
# Using libstreaming in your app
This example is extracted from a simple Android app:

```
@Override
public void onSessionError(int message, int streamType, Exception e) {
    // Might happen if the streaming at the requested resolution is not supported
    // or if the preview surface is not ready...
    // Check the Session class for a list of the possible errors.
    Log.e(TAG, "An error occurred", e);
}

...

// Stops the streaming session
mSession.stop();
}
```
The **SessionBuilder** simply facilitates the creation of **Session** objects. The call to **setSurfaceView** is needed for video streaming, which should not come as a surprise since Android requires a valid surface for recording video (an annoying limitation of the **MediaRecorder** API). On Android 4.3, streaming with no **SurfaceView** is possible but not yet implemented. The call to **setContext(Context)** is necessary: it allows **H264Stream** objects and **AACStream** objects to store and recover data using **SharedPreferences**.
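A typical builder call chain, pieced together from the libstreaming examples (the encoder constants are defined on **SessionBuilder**):

```
mSession = SessionBuilder.getInstance()
        .setCallback(this)
        .setSurfaceView(mSurfaceView)
        .setContext(getApplicationContext())
        .setAudioEncoder(SessionBuilder.AUDIO_AAC)
        .setVideoEncoder(SessionBuilder.VIDEO_H264)
        .build();
```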
A **Session** object represents a streaming session to some peer. It contains one or more **Stream** objects that are started (resp. stopped) when the **start()** (resp. **stop()**) method is invoked.
The method **getSessionDescription()** will return an SDP description of the session in the form of a String. Before calling it, you must make sure that the **Session** has been configured: after calling **configure()** or **startPreview()** on your Session instance, the callback **onSessionConfigured()** will be called.
```
}
String sdp = mSession.getSessionDescription();
...
// Blocks until streaming actually starts.
try {
    mSession.syncStart();
} catch (Exception e) {
    ...
}
```
Check out [the wiki](https://github.com/fyhertz/libstreaming/wiki) to learn more.
Visit [this github page](https://github.com/fyhertz/spydroid-ipcamera) to see how this streaming stack can be used and how it performs.