
Zoom in and zoom out in camera #176

Open: wants to merge 67 commits into base: mediacodec
67 commits
e46c57e
Better support of the MediaCodec API, but still disabled by default. …
Nov 2, 2013
19b0271
Mavenize library for easy inclusion in other projects
Jan 8, 2014
187ce18
Merge pull request #16 from soundmonster/master
Jan 8, 2014
279e6ac
Merge branch 'mediacodec'
Feb 8, 2014
dd0ddbd
libstreaming requires android 4.0
Feb 8, 2014
84ea97f
Git forgot some stuff during the merge.
Feb 8, 2014
5e42220
Small bug fix
Feb 9, 2014
bbf73d6
Cleaner logs and important fix in EncoderDebugger
Feb 10, 2014
819a8ef
Cleaner logs and important fix in EncoderDebugger
Feb 10, 2014
4f4cd4d
Updated README
Feb 10, 2014
47de3fc
Fixed some typos.
Feb 10, 2014
f3efff3
Fixed some typos.
Feb 10, 2014
14faa7a
Fixed some typos.
Feb 10, 2014
8f749e7
Fixed some typos.
Feb 10, 2014
64ccdf5
Fixed some typos.
Feb 10, 2014
b7e1ae7
Fix that prevent two Session instances to open the camera at the same…
Feb 10, 2014
b562457
Fixed some typos.
Feb 11, 2014
2359e79
A fix for some phone that have a buggy MediaCodecList API.
Feb 14, 2014
5b7e1c7
A fix for some phone that have a buggy MediaCodecList API.
Feb 15, 2014
ada92e2
I readded the \r\n in the RtspClient, remove it if you need. Sps and …
Feb 19, 2014
8c45c58
Bug in the RtspServer finally solved ! Fixed a bug in the MP4Parser a…
Feb 22, 2014
7b59c68
Started the implementation of TCP support. Not ready yet.
Mar 28, 2014
9530973
New feature: the aspect ratio of the gl.SurfaceView can now easily ma…
Mar 29, 2014
6a97798
Updated the javadoc.
Mar 29, 2014
0472180
Some bug fixes, some unecessary calls to the Camera API are now avoided.
Apr 24, 2014
e1758aa
New handler created for each session instance. Otherwise, it gives a …
brunosiqueira May 19, 2014
6893171
Merge pull request #35 from igarape/master
May 28, 2014
f95f91b
FOund a cleaner way to solve the dead thread issue.
May 28, 2014
2e62cbc
fix "ERROR: resource directory 'libstreaming/res' does not exist"
pnemonic78 Jun 2, 2014
b369ce8
5 bits for the object type / 4 bits for the sampling rate / 4 bits for
Jun 2, 2014
2758df6
spelling.
Jun 2, 2014
9c5df6c
Cleaner error message when the Callback buffer was to small error occ…
Jun 13, 2014
b931f21
Merge branch 'master' of https://github.com/pnemonic78/libstreaming i…
Jun 13, 2014
0752b10
Merge branch 'pnemonic78-master'
Jun 13, 2014
41b1818
Fixed typo
Jun 13, 2014
d8a6498
Updated javadoc.
Jun 13, 2014
4315bc1
Fix a bug preventing the size of the PPS from being written into STAP…
Aug 12, 2014
e41aa61
Merge pull request #58 from sigmabeta/stapa-fix
Apr 19, 2015
2343e3a
Lollipop support attempt
May 28, 2015
39ead53
Updated Javadoc
Jun 1, 2015
38abef3
Update README.md
Jun 1, 2015
628e730
Update README.md
Jun 1, 2015
c3b707a
Two fixes in the RTSP client (RtspClient.java) proposed by Magnus Joh…
Aug 9, 2015
aff4392
Merge branch 'master' of https://github.com/fyhertz/libstreaming
Aug 9, 2015
b2ad11f
Add basic Authorization
Aug 28, 2015
b8c9134
Deal with null username and password
Aug 28, 2015
ac44416
Merge pull request #129 from grunk/master
Aug 28, 2015
587b516
Removed org.apache.http.* imports
serpro Sep 9, 2015
94189f4
Merge pull request #132 from serpro/patch-1
Oct 3, 2015
2d9a17f
Fixing squid:S2293 - The diamond operator ("<>") should be used.
the-best-dev Mar 17, 2016
8312cf7
Fixing squid:S1066 - Collapsible "if" statements should be merged.
the-best-dev Mar 17, 2016
b050628
Fixing squid:S2178 - Short-circuit logic should be used in boolean co…
the-best-dev Mar 17, 2016
68b19dd
Fixing squid:S2259- Null pointers should not be dereferenced.
the-best-dev Mar 17, 2016
fefcf29
Fixing squid:S1155, squid:S1126
the-best-dev Mar 17, 2016
38ecb1d
fixing `NullPointerException` when no query in URI string (e.g. rtsp:…
Apr 26, 2016
197b569
Merge pull request #187 from DevFactory/release/collapsable-if-statem…
Mar 13, 2017
3a7a52f
Merge pull request #189 from DevFactory/release/general-code-quality-…
Mar 13, 2017
af5a95b
Merge pull request #184 from DevFactory/release/short-circuit-logic-s…
Mar 13, 2017
bd30223
Merge pull request #186 from DevFactory/release/null-pointers-should-…
Mar 13, 2017
f620117
Merge pull request #205 from apavlenko/fix/null-pointer-uri-parse
Mar 13, 2017
71558f6
libstreaming is now licensed under the Apache license version 2.0
Mar 13, 2017
fe23825
libstreaming is now licensed under the Apache license version 2.0
Mar 13, 2017
ba6dc86
Updated readme.
Mar 12, 2018
89e0382
Updated readme.
Mar 12, 2018
e71b49c
Merge pull request #185 from DevFactory/release/diamond-operator-shou…
Apr 10, 2018
0c2078d
The 'client_port' parameter's range specification (i.e RTCP port part…
KentVu Apr 17, 2018
3a78d22
Merge pull request #276 from KentVu/pr_prepare
Feb 8, 2019
4 changes: 2 additions & 2 deletions AndroidManifest.xml
@@ -4,7 +4,7 @@
android:versionName="4.0" >

<uses-sdk
android:minSdkVersion="9"
android:targetSdkVersion="18" />
android:minSdkVersion="14"
android:targetSdkVersion="19" />

</manifest>
876 changes: 202 additions & 674 deletions LICENSE

Large diffs are not rendered by default.

208 changes: 166 additions & 42 deletions README.md
@@ -4,26 +4,68 @@

**libstreaming** is an API that allows you, with only a few lines of code, to stream the camera and/or microphone of an Android-powered device using RTP over UDP.

* Android 4.0 or more recent is required.
* Supported encoders include H.264, H.263, AAC and AMR.
* Since version 2.0, basic support for RTCP has been implemented.
* libstreaming also features an RTSP server for easy remote control of the phone's camera and microphone.

The first step to start a streaming session to some peer is called 'signaling': you contact the receiver and send a description of the incoming streams. You have three ways to do that with libstreaming.

* With the RTSP client: if you want to stream to a Wowza Media Server, this is the way to go. [Example 3](https://github.com/fyhertz/libstreaming-examples#example-3) illustrates that use case.
* With the RTSP server: in that case the phone will act as an RTSP server and wait for an RTSP client to request a stream. [Example 1](https://github.com/fyhertz/libstreaming-examples#example-1) illustrates that use case.
* Without the RTSP protocol at all: you signal the session yourself, using SDP over a protocol of your choice. [Example 2](https://github.com/fyhertz/libstreaming-examples#example-2) illustrates that use case.

The full javadoc documentation of the API is available here: http://guigui.us/libstreaming/doc

## How does it work? You should really read this, it's important!

There are three ways on Android to get encoded data from the peripherals:

* With the **MediaRecorder** API and a simple hack.
* With the **MediaCodec** API and the buffer-to-buffer method which requires Android 4.1.
* With the **MediaCodec** API and the surface-to-buffer method which requires Android 4.3.

### Encoding with the MediaRecorder API

The **MediaRecorder** API was not intended for streaming applications but can be used to retrieve encoded data from the peripherals of the phone. The trick is to configure a MediaRecorder instance to write to a **LocalSocket** instead of a regular file (see **MediaStream.java**).

Edit: as of Android Lollipop, using a **LocalSocket** is no longer possible for security reasons, but using a [**ParcelFileDescriptor**](http://developer.android.com/reference/android/os/ParcelFileDescriptor.html) does the trick. More details in the file **MediaStream.java**! ([Thanks to those guys for the insight](http://stackoverflow.com/questions/26990816/mediarecorder-issue-on-android-lollipop))
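
To make the trick concrete, here is a minimal sketch of the pipe-based variant. It is an illustration under assumptions, not the actual code of **MediaStream.java**; error handling is omitted and the recorder settings are placeholders.

```java
// Sketch only: a MediaRecorder writing into a pipe instead of a file.
ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe(); // throws IOException

MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
// The hack: hand the recorder the write end of a pipe instead of a file path.
recorder.setOutputFile(pipe[1].getFileDescriptor());
recorder.prepare();
recorder.start();

// A separate thread reads the encoded stream from the read end of the pipe,
// strips the MP4/3GP container on the fly and feeds NAL units to a packetizer.
InputStream encodedStream = new ParcelFileDescriptor.AutoCloseInputStream(pipe[0]);
```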

This hack has some limitations:
* Lip sync can be approximate.
* The internal buffers of the MediaRecorder can introduce significant jitter; libstreaming tries to compensate for it.

It's hard to tell how well this hack will work on a given phone. It does work well on many devices, though.

### Encoding with the MediaCodec API

The **MediaCodec** API does not present the limitations just mentioned, but it has issues of its own. There are actually two ways to use the MediaCodec API: with buffers or with a surface.

The buffer-to-buffer method uses calls to [**dequeueInputBuffer**](http://developer.android.com/reference/android/media/MediaCodec.html#dequeueInputBuffer(long)) and [**queueInputBuffer**](http://developer.android.com/reference/android/media/MediaCodec.html#queueInputBuffer(int, int, int, long, int)) to feed the encoder with raw data.
That sounds easy, right? Well, it's not, because the video encoders you get access to with this API use different color formats and you need to support all of them. A list of those color formats is available [here](http://developer.android.com/reference/android/media/MediaCodecInfo.CodecCapabilities.html). Moreover, many encoders claim support for color formats they don't actually handle properly, or exhibit small glitches.
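
As a rough illustration of the buffer-to-buffer method: the sketch below is not libstreaming's actual code, and the color format constant and the `yuvFrame`/`timestampUs` variables are placeholders.

```java
// Illustrative sketch of buffer-to-buffer encoding (Android 4.1+).
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 320, 240);
format.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 20);
// The encoder only accepts some color formats, and they differ between devices.
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();

// Feed one raw frame (e.g. obtained from Camera.PreviewCallback) to the encoder.
int inIndex = encoder.dequeueInputBuffer(500000); // timeout in microseconds
if (inIndex >= 0) {
    ByteBuffer inputBuffer = encoder.getInputBuffers()[inIndex];
    inputBuffer.clear();
    inputBuffer.put(yuvFrame);
    encoder.queueInputBuffer(inIndex, 0, yuvFrame.length, timestampUs, 0);
}

// Drain one encoded access unit on the other side.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outIndex = encoder.dequeueOutputBuffer(info, 500000);
if (outIndex >= 0) {
    ByteBuffer encoded = encoder.getOutputBuffers()[outIndex];
    // ...hand the encoded data over to the packetizer...
    encoder.releaseOutputBuffer(outIndex, false);
}
```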

The whole [**hw**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/hw/package-summary.html) package is dedicated to solving those issues. See in particular the [**EncoderDebugger**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/hw/EncoderDebugger.html) class.

If streaming with that API fails, libstreaming falls back to streaming with the **MediaRecorder** API.

The surface-to-buffer method uses the [createInputSurface()](http://developer.android.com/reference/android/media/MediaCodec.html#createInputSurface()) method. This is probably the best way to encode raw video from the camera, but it requires Android 4.3 and up.
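
Schematically, and reusing the `format` from the previous sketch, the surface-to-buffer method looks like this (again an illustration, not libstreaming's actual code):

```java
// Illustrative sketch of surface-to-buffer encoding (Android 4.3+).
MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// createInputSurface() must be called between configure() and start():
// everything rendered into the returned Surface gets encoded.
Surface inputSurface = encoder.createInputSurface();
encoder.start();
// The camera preview is then rendered into that surface with OpenGL ES,
// and encoded frames are drained with dequeueOutputBuffer() as usual.
```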

The [**gl**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/gl/package-summary.html) package is dedicated to using the MediaCodec API with a surface.

It is not yet enabled by default in libstreaming, but you can force it with the [**setStreamingMethod(byte)**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/MediaStream.html#setStreamingMethod(byte)) method.

### Packetization process

Once the raw data from the peripherals has been encoded, it is encapsulated in a proper RTP stream. The packetization algorithm to use depends on the format of the data (H.264, H.263, AMR or AAC); each one is specified in its respective RFC:

* RFC 3984 for H.264: **H264Packetizer.java**
* RFC 4629 for H.263: **H263Packetizer.java**
* RFC 3267 for AMR: **AMRNBPacketizer.java**
* RFC 3640 for AAC: **AACADTSPacketizer.java** or **AACLATMPacketizer.java**

If you are looking for a basic implementation of one of the RFCs mentioned above, check the sources of the corresponding class.

Since version 2.0 of libstreaming, RTCP packets are also sent to the receiver. Only Sender Reports are implemented; they are needed for lip sync.

The [**rtp**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/rtp/package-summary.html) package handles packetization of encoded data in RTP packets.
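
For reference, whatever the format, each packetizer ultimately prepends the same 12-byte RTP header (RFC 3550) to every packet it sends. A simplified sketch, with `payloadType`, `seq`, `timestamp`, `ssrc` and `payload` as placeholders:

```java
// Simplified construction of the fixed 12-byte RTP header (RFC 3550).
byte[] packet = new byte[12 + payload.length];
packet[0] = (byte) 0x80;                 // version 2, no padding/extension/CSRC
packet[1] = (byte) (payloadType & 0x7F); // marker bit cleared
packet[2] = (byte) (seq >> 8);           // 16-bit sequence number, incremented per packet
packet[3] = (byte) seq;
packet[4] = (byte) (timestamp >> 24);    // 32-bit timestamp (90 kHz clock for video)
packet[5] = (byte) (timestamp >> 16);
packet[6] = (byte) (timestamp >> 8);
packet[7] = (byte) timestamp;
packet[8] = (byte) (ssrc >> 24);         // 32-bit synchronization source identifier
packet[9] = (byte) (ssrc >> 16);
packet[10] = (byte) (ssrc >> 8);
packet[11] = (byte) ssrc;
System.arraycopy(payload, 0, packet, 12, payload.length);
```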

# Using libstreaming in your app

@@ -36,32 +78,125 @@
<uses-permission android:name="android.permission.CAMERA" />
```

And if you intend to use multicasting, add this one to the list:

```xml
<uses-permission android:name="android.permission.CHANGE_WIFI_MULTICAST_STATE" />
```

## How to stream H.264 and AAC

This example is extracted from [this simple Android app](https://github.com/fyhertz/libstreaming-examples#example-2). The code could be part of an Activity, a Fragment or a Service.

```java
protected void onCreate(Bundle savedInstanceState) {

...

mSession = SessionBuilder.getInstance()
.setCallback(this)
.setSurfaceView(mSurfaceView)
.setPreviewOrientation(90)
.setContext(getApplicationContext())
.setAudioEncoder(SessionBuilder.AUDIO_NONE)
.setAudioQuality(new AudioQuality(16000, 32000))
.setVideoEncoder(SessionBuilder.VIDEO_H264)
.setVideoQuality(new VideoQuality(320,240,20,500000))
.build();

mSurfaceView.getHolder().addCallback(this);

...

}

@Override
public void onPreviewStarted() {
Log.d(TAG,"Preview started.");
}

@Override
public void onSessionConfigured() {
Log.d(TAG,"Preview configured.");
// Once the stream is configured, you can get an SDP formatted session description
// that you can send to the receiver of the stream.
// For example, to receive the stream in VLC, store the session description in a .sdp file
// and open it with VLC while streaming.
Log.d(TAG, mSession.getSessionDescription());
mSession.start();
}

@Override
public void onSessionStarted() {
Log.d(TAG,"Streaming session started.");
...
}

@Override
public void onSessionStopped() {
Log.d(TAG,"Streaming session stopped.");
...
}

@Override
public void onBitrateUpdate(long bitrate) {
// Informs you of the bandwidth consumption of the streams
Log.d(TAG,"Bitrate: "+bitrate);
}

@Override
public void onSessionError(int message, int streamType, Exception e) {
// Might happen if streaming at the requested resolution is not supported
// or if the preview surface is not ready...
// Check the Session class for a list of the possible errors.
Log.e(TAG, "An error occurred", e);
}

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width,
int height) {

}

@Override
public void surfaceCreated(SurfaceHolder holder) {
// Starts the preview of the Camera
mSession.startPreview();
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
// Stops the streaming session
mSession.stop();
}

```

The **SessionBuilder** simply facilitates the creation of **Session** objects. The call to **setSurfaceView** is needed for video streaming; that should not come as a surprise, since Android requires a valid surface for recording video (it's an annoying limitation of the **MediaRecorder** API). On Android 4.3, streaming with no **SurfaceView** is possible but not yet implemented. The call to **setContext(Context)** is necessary; it allows **H264Stream** objects and **AACStream** objects to store and recover data using **SharedPreferences**.

A **Session** object represents a streaming session to some peer. It contains one or more **Stream** objects that are started (resp. stopped) when the **start()** (resp. **stop()**) method is invoked.

The method **getSessionDescription()** will return an SDP description of the session in the form of a String. Before calling it, you must make sure that the **Session** has been configured. After calling **configure()** or **startPreview()** on your Session instance, the callback **onSessionConfigured()** will be called.

The method **setDestination()** allows you to specify the IP address to which RTP and RTCP packets will be sent.
**In the example presented above, the Session instance is used in an asynchronous manner and calls to its methods do not block. You know that an operation has completed when its callback is called.**

The complete source code of this example is available here: https://github.com/fyhertz/libstreaming-examples
**You can also use a Session object in a synchronous manner, like this:**

```java
// Blocks until all streams are configured.
try {
mSession.syncConfigure();
} catch (Exception e) {
...
}
String sdp = mSession.getSessionDescription();
...
// Blocks until streaming actually starts.
try {
mSession.syncStart();
} catch (Exception e) {
...
}
...
mSession.syncStop();
```

## How to use the RTSP client

Check out [this page of the wiki](https://github.com/fyhertz/libstreaming/wiki/Using-libstreaming-with-Wowza-Media-Server) and the [example 3](https://github.com/fyhertz/libstreaming-examples#example-3).
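
For orientation, here is a rough sketch of what example 3 does with the client. The method names below are assumptions based on the examples, so check **RtspClient.java** for the exact API:

```java
// Sketch only: streaming an existing Session to an RTSP server (e.g. Wowza).
RtspClient client = new RtspClient();
client.setSession(mSession);               // the Session built with SessionBuilder
client.setCredentials("user", "password"); // only if the server requires it
client.setServerAddress("my.wowza.server", 1935);
client.setStreamPath("/live/myStream");
client.startStream();                      // signals the session, then streams
...
client.stopStream();
```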

## How to use the RTSP server

@@ -102,18 +237,7 @@

```java
// Starts the RTSP server
context.startService(new Intent(this,RtspServer.class));
// Stops the RTSP server
context.stopService(new Intent(this,RtspServer.class));
```
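
**RtspServer** is a regular Android Service, so it must also be declared in your manifest, and its listening port can be changed through **SharedPreferences** before the service starts. A hedged sketch (the **RtspServer.KEY_PORT** key is an assumption, check **RtspServer.java**):

```java
// Sketch only: overriding the port the RTSP server will listen on.
SharedPreferences.Editor editor =
        PreferenceManager.getDefaultSharedPreferences(this).edit();
editor.putString(RtspServer.KEY_PORT, String.valueOf(8086));
editor.commit();
```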

# Class diagram

![Class Diagram](http://dallens.fr/majorkernelpanic/libstreaming/ClassDiagram.png "Class diagram")

*SessionBuilder* and *RtspServer* are the classes that you will want to use directly; they make use of everything else in the streaming package.

# Spydroid-ipcamera

Visit [this github page](https://github.com/fyhertz/spydroid-ipcamera) to see how this streaming stack can be used and how it performs.
Further information about Spydroid can be found on the Google Code page of the project [here](https://spydroid-ipcamera.googlecode.com).
The app is also available on Google Play [here](https://play.google.com/store/apps/details?id=net.majorkernelpanic.spydroid).

# Licensing

This streaming stack is available under two licenses, the GPL and a commercial license. *If you are willing to integrate this project into a closed-source application, please contact me at fyhertz at gmail.com.* Thank you.