
Commit 1d5d7f2: Gradle project
1 parent: 3a78d22

138 files changed: 3007 additions, 2821 deletions


.classpath

Lines changed: 0 additions & 9 deletions
This file was deleted.

.gitignore

Lines changed: 12 additions & 14 deletions
````diff
@@ -1,15 +1,13 @@
-# built application files
-*.apk
-*.ap_
-*.dex
-*.class
-bin/
-gen/
-*.class
-*.o
-*.so
-*.sh
+*.iml
+.gradle
 local.properties
-custom_rules.xml
-ant.properties
-*~
+
+.idea/*
+!.idea/codeStyles/
+
+.DS_Store
+build
+captures
+
+.externalNativeBuild
+.cxx
````

.project

Lines changed: 0 additions & 33 deletions
This file was deleted.

AndroidManifest.xml

Lines changed: 0 additions & 10 deletions
This file was deleted.

README.md

Lines changed: 15 additions & 14 deletions
````diff
@@ -7,7 +7,7 @@
 * Android 4.0 or more recent is required.
 * Supported encoders include H.264, H.263, AAC and AMR.
 
-The first step you will need to achieve to start a streaming session to some peer is called 'signaling'. During this step you will contact the receiver and send a description of the incomming streams. You have three ways to do that with libstreaming.
+The first step you will need to achieve to start a net.majorkernelpanic.streaming session to some peer is called 'signaling'. During this step you will contact the receiver and send a description of the incomming streams. You have three ways to do that with libstreaming.
 
 * With the RTSP client: if you want to stream to a Wowza Media Server, it's the way to go. [The example 3](https://github.com/fyhertz/libstreaming-examples#example-3) illustrates that use case.
 * With the RTSP server: in that case the phone will act as a RTSP server and wait for a RTSP client to request a stream. This use case is illustated in [the example 1](https://github.com/fyhertz/libstreaming-examples#example-1).
````
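The session description that the signaling step hands to the receiver is an SDP document. As an illustration only (the field values below are hypothetical placeholders, not what libstreaming's `Session.getSessionDescription()` actually emits), a minimal SDP body for a single H.264 stream could look like this:

```java
// Hypothetical sketch: a minimal SDP description of one H.264 RTP stream.
// Real descriptions come from Session.getSessionDescription() in libstreaming.
class SdpSketch {
    static String minimalSdp(String destIp, int rtpPort) {
        return "v=0\r\n"                                  // protocol version
             + "o=- 0 0 IN IP4 127.0.0.1\r\n"             // origin (placeholder)
             + "s=Unnamed\r\n"                            // session name
             + "c=IN IP4 " + destIp + "\r\n"              // where RTP should be sent
             + "t=0 0\r\n"                                // unbounded session
             + "m=video " + rtpPort + " RTP/AVP 96\r\n"   // one video stream, dynamic PT 96
             + "a=rtpmap:96 H264/90000\r\n";              // PT 96 = H.264, 90 kHz clock
    }
}
```

Whichever of the three signaling options is used, a description of this shape is what ends up on the receiver's side.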
````diff
@@ -25,7 +25,7 @@ There are three ways on Android to get encoded data from the peripherals:
 
 ### Encoding with the MediaRecorder API
 
-The **MediaRecorder** API was not intended for streaming applications but can be used to retrieve encoded data from the peripherals of the phone. The trick is to configure a MediaRecorder instance to write to a **LocalSocket** instead of a regular file (see **MediaStream.java**).
+The **MediaRecorder** API was not intended for net.majorkernelpanic.streaming applications but can be used to retrieve encoded data from the peripherals of the phone. The trick is to configure a MediaRecorder instance to write to a **LocalSocket** instead of a regular file (see **MediaStream.java**).
 
 Edit: as of Android Lollipop using a **LocalSocket** is not possible anymore for security reasons. But using a [**ParcelFileDescriptor**](http://developer.android.com/reference/android/os/ParcelFileDescriptor.html) does the trick. More details in the file **MediaStream.java**! ([Thanks to those guys for the insight](http://stackoverflow.com/questions/26990816/mediarecorder-issue-on-android-lollipop))
 
````
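The LocalSocket/ParcelFileDescriptor trick described in this hunk boils down to giving MediaRecorder a pipe instead of a file and reading the encoded stream back from the other end. A desktop-JVM sketch of that plumbing, with `java.io` pipes standing in for the Android classes (the names here are illustrative, not from MediaStream.java):

```java
import java.io.*;

// Sketch of the pipe trick: a producer (standing in for MediaRecorder)
// writes encoded bytes into one end of a pipe, and a consumer (standing in
// for the packetizer) reads them back from the other end. On Android the
// two ends would come from a ParcelFileDescriptor pair instead.
class PipeTrickSketch {
    static byte[] roundTrip(byte[] encoded) throws IOException {
        PipedOutputStream recorderEnd = new PipedOutputStream();
        PipedInputStream packetizerEnd = new PipedInputStream(recorderEnd);
        recorderEnd.write(encoded);       // MediaRecorder would write continuously
        recorderEnd.close();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        int b;
        while ((b = packetizerEnd.read()) != -1) out.write(b);
        return out.toByteArray();         // what the packetizer would then parse
    }
}
```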
````diff
@@ -42,15 +42,15 @@ The **MediaCodec** API do not present the limitations I just mentionned, but has
 The buffer-to-buffer method uses calls to [**dequeueInputBuffer**](http://developer.android.com/reference/android/media/MediaCodec.html#dequeueInputBuffer(long)) and [**queueInputBuffer**](http://developer.android.com/reference/android/media/MediaCodec.html#queueInputBuffer(int, int, int, long, int)) to feed the encoder with raw data.
 That seems easy right ? Well it's not, because video encoders that you get access to with this API are using different color formats and you need to support all of them. A list of those color formats is available [here](http://developer.android.com/reference/android/media/MediaCodecInfo.CodecCapabilities.html). Moreover, many encoders claim support for color formats they don't actually support properly or can present little glitches.
 
-All the [**hw**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/hw/package-summary.html) package is dedicated to solving those issues. See in particular [**EncoderDebugger**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/hw/EncoderDebugger.html) class.
+All the [**hw**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/net.majorkernelpanic.streaming/hw/package-summary.html) package is dedicated to solving those issues. See in particular [**EncoderDebugger**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/net.majorkernelpanic.streaming/hw/EncoderDebugger.html) class.
 
-If streaming with that API fails, libstreaming fallbacks on streaming with the **MediaRecorder API**.
+If net.majorkernelpanic.streaming with that API fails, libstreaming fallbacks on net.majorkernelpanic.streaming with the **MediaRecorder API**.
 
 The surface-to-buffer method uses the [createInputSurface()](http://developer.android.com/reference/android/media/MediaCodec.html#createInputSurface()) method. This method is probably the best way to encode raw video from the camera but it requires android 4.3 and up.
 
-The [**gl**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/gl/package-summary.html) package is dedicated to using the MediaCodec API with a surface.
+The [**gl**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/net.majorkernelpanic.streaming/gl/package-summary.html) package is dedicated to using the MediaCodec API with a surface.
 
-It is not yet enabled by default in libstreaming but you can force it with the [**setStreamingMethod(byte)**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/MediaStream.html#setStreamingMethod(byte)) method.
+It is not yet enabled by default in libstreaming but you can force it with the [**setStreamingMethod(byte)**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/net.majorkernelpanic.streaming/MediaStream.html#setStreamingMethod(byte)) method.
 
 ### Packetization process
 
````
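The color-format mismatch this hunk alludes to is concrete: camera frames typically arrive as NV21 (a Y plane followed by interleaved VU samples), while many encoders expect planar I420 (Y, then U, then V planes). A minimal sketch of that conversion (this is not the actual EncoderDebugger logic, just an illustration of the kind of fixup the **hw** package performs):

```java
// Sketch: convert an NV21 frame (Y plane + interleaved VU) to planar I420
// (Y plane, U plane, V plane). Illustrative only; not libstreaming's code.
class Nv21ToI420 {
    static byte[] convert(byte[] nv21, int width, int height) {
        int ySize = width * height;
        byte[] i420 = new byte[ySize * 3 / 2];
        System.arraycopy(nv21, 0, i420, 0, ySize);        // luma plane is identical
        int uOffset = ySize;                              // U plane starts after Y
        int vOffset = ySize + ySize / 4;                  // V plane after U
        for (int i = 0; i < ySize / 4; i++) {
            i420[vOffset + i] = nv21[ySize + 2 * i];      // NV21 stores V first
            i420[uOffset + i] = nv21[ySize + 2 * i + 1];  // then U
        }
        return i420;
    }
}
```

An encoder advertising a semi-planar format would need the opposite shuffle, which is why supporting "all of them" is the painful part.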
````diff
@@ -65,7 +65,7 @@ If you are looking for a basic implementation of one of the RFC mentionned above
 
 RTCP packets are also sent to the receiver since version 2.0 of libstreaming. Only Sender Reports are implemented. They are actually needed for lip sync.
 
-The [**rtp**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/streaming/rtp/package-summary.html) package handles packetization of encoded data in RTP packets.
+The [**rtp**](http://guigui.us/libstreaming/doc/net/majorkernelpanic/net.majorkernelpanic.streaming/rtp/package-summary.html) package handles packetization of encoded data in RTP packets.
 
 # Using libstreaming in your app
 
````
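Packetization, as handled by the **rtp** package, prefixes each chunk of encoded data with the fixed 12-byte RTP header defined in RFC 3550. A sketch of what building that header involves (field values are illustrative; this is not libstreaming's actual packetizer code):

```java
import java.nio.ByteBuffer;

// Sketch: the fixed 12-byte RTP header (RFC 3550) that precedes every packet.
// Illustrates the wire layout a packetizer produces.
class RtpHeaderSketch {
    static byte[] header(int payloadType, int seq, long timestamp, int ssrc) {
        ByteBuffer b = ByteBuffer.allocate(12);        // big-endian by default
        b.put((byte) 0x80);                            // V=2, P=0, X=0, CC=0
        b.put((byte) (payloadType & 0x7F));            // M=0 + 7-bit payload type
        b.putShort((short) seq);                       // sequence number
        b.putInt((int) timestamp);                     // media clock timestamp
        b.putInt(ssrc);                                // stream identifier
        return b.array();
    }
}
```

The RTCP Sender Reports mentioned above carry the mapping between this media-clock timestamp and wall-clock time, which is what makes audio/video lip sync possible on the receiver.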
````diff
@@ -139,7 +139,7 @@ This example is extracted from [this simple android app](https://github.com/fyhe
 
 @Override
 public void onSessionError(int message, int streamType, Exception e) {
-// Might happen if the streaming at the requested resolution is not supported
+// Might happen if the net.majorkernelpanic.streaming at the requested resolution is not supported
 // or if the preview surface is not ready...
 // Check the Session class for a list of the possible errors.
 Log.e(TAG, "An error occured", e);
````
````diff
@@ -159,15 +159,15 @@ This example is extracted from [this simple android app](https://github.com/fyhe
 
 @Override
 public void surfaceDestroyed(SurfaceHolder holder) {
-// Stops the streaming session
+// Stops the net.majorkernelpanic.streaming session
 mSession.stop();
 }
 
 ```
 
-The **SessionBuilder** simply facilitates the creation of **Session** objects. The call to **setSurfaceView** is needed for video streaming, that should not come up as a surprise since Android requires a valid surface for recording video (it's an annoying limitation of the **MediaRecorder** API). On Android 4.3, streaming with no **SurfaceView** is possible but not yet implemented. The call to **setContext(Context)** is necessary, it allows **H264Stream** objects and **AACStream** objects to store and recover data using **SharedPreferences**.
+The **SessionBuilder** simply facilitates the creation of **Session** objects. The call to **setSurfaceView** is needed for video net.majorkernelpanic.streaming, that should not come up as a surprise since Android requires a valid surface for recording video (it's an annoying limitation of the **MediaRecorder** API). On Android 4.3, net.majorkernelpanic.streaming with no **SurfaceView** is possible but not yet implemented. The call to **setContext(Context)** is necessary, it allows **H264Stream** objects and **AACStream** objects to store and recover data using **SharedPreferences**.
 
-A **Session** object represents a streaming session to some peer. It contains one or more **Stream** objects that are started (resp. stopped) when the **start()** (resp. **stop()**) method is invoked.
+A **Session** object represents a net.majorkernelpanic.streaming session to some peer. It contains one or more **Stream** objects that are started (resp. stopped) when the **start()** (resp. **stop()**) method is invoked.
 
 The method **getSessionDescription()** will return a SDP of the session in the form of a String. Before calling it, you must make sure that the **Session** has been configured. After calling **configure()** or **startPreview()** on you Session instance, the callback **onSessionConfigured()** will be called.
 
````
````diff
@@ -184,7 +184,7 @@ The method **getSessionDescription()** will return a SDP of the session in the f
 }
 Strinf sdp = mSession.getSessionDescription();
 ...
-// Blocks until streaming actually starts.
+// Blocks until net.majorkernelpanic.streaming actually starts.
 try {
 mSession.syncStart();
 } catch (Exception e) {
````
````diff
@@ -203,7 +203,8 @@ Check out [this page of the wiki](https://github.com/fyhertz/libstreaming/wiki/U
 #### Add this to your manifest:
 
 ```xml
-<service android:name="net.majorkernelpanic.streaming.rtsp.RtspServer"/>
+
+<service android:name="net.majorkernelpanic.net.majorkernelpanic.streaming.rtsp.RtspServer" />
 ```
 
 If you decide to override **RtspServer** change the line above accordingly.
````
````diff
@@ -239,5 +240,5 @@ context.stopService(new Intent(this,RtspServer.class));
 
 # Spydroid-ipcamera
 
-Visit [this github page](https://github.com/fyhertz/spydroid-ipcamera) to see how this streaming stack can be used and how it performs.
+Visit [this github page](https://github.com/fyhertz/spydroid-ipcamera) to see how this net.majorkernelpanic.streaming stack can be used and how it performs.
 
````

build.gradle

Lines changed: 26 additions & 0 deletions
New file:

```groovy
import org.gradle.internal.jvm.Jvm

buildscript {
    ext.kotlin_version = "1.6.20"
    repositories {
        google()
        mavenCentral()
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:7.1.3'
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
    }
}

println "Gradle uses Java ${Jvm.current()}"

allprojects {
    repositories {
        google()
        mavenCentral()
    }
}

task clean(type: Delete) {
    delete rootProject.buildDir
}
```

build.xml

Lines changed: 0 additions & 92 deletions
This file was deleted.
