[Tutorial] Live Camera Previews in Android

This code replaces an earlier version, which is still available.

The Android SDK provided by Google doesn't provide any camera emulation beyond a stand-in animation (see below). Since Google haven't yet provided a working implementation (or even a date by which one will be available), I'm publishing the source code for a set of temporary classes that I wrote to overcome this limitation. It's the first code I wrote on the Android platform, and it was written in haste, though it has functioned very well for me.

The code is public domain. I make no warranty as to its fitness for any particular purpose.

The code consists of:

  • CameraSource
    A light interface that protects client code from the choice of camera (a sketch of this interface follows the list).
  • GenuineCamera
    A CameraSource implementation that uses the device's camera.
  • HttpCamera
    A CameraSource implementation that GETs images over an HTTP connection.
  • SocketCamera
    A CameraSource implementation that obtains images directly over a TCP/IP connection.
  • BitmapCamera
    A CameraSource implementation that uses a supplied bitmap.
  • WebcamBroadcaster
    A small Java application that uses JMF to broadcast captured stills over the network.
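
As shown in the usage section below, a camera source is opened, asked to capture frames onto a canvas, and then closed. The following is a minimal sketch of what the CameraSource interface presumably looks like, inferred from that usage; the exact method signatures (in particular the return type of capture) may differ in the published source.

  import android.graphics.Canvas;

  public interface CameraSource {

      // Attempts to open the underlying camera; returns false on failure.
      boolean open();

      // Draws the most recent frame onto the supplied canvas.
      // A boolean result indicating success is assumed here.
      boolean capture(Canvas canvas);

      // Releases any resources held by the camera source.
      void close();
  }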

Results

When drawing the output of the genuine camera to screen, you will see something that looks like this:

Device's Camera preview output.

When drawing the output of a remote camera that is obtaining frames from a running webcam broadcaster, you would see this (if you were standing over my desk):

Still obtained from a webcam broadcaster.

To save myself time, I also produced a small settings configuration activity (the code for which is not included).

A simple configuration activity for the camera settings.

Usage

Using a camera source is very simple:

  CameraSource cs = new SocketCamera("192.168.0.100", 9889, 320, 240, true);
  if (!cs.open()) { /* deal with failure to obtain camera */ }
  while (/* some condition */) {
      cs.capture(canvas); // capture the frame onto the canvas
  }
  cs.close();
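
In an application, the loop above would typically run on its own thread, drawing onto the canvas of a SurfaceView. The following is only a sketch of that arrangement, reusing the example address, port and dimensions from above; the PreviewView class and its SurfaceView plumbing are my own illustration and not part of the published code.

  import android.content.Context;
  import android.graphics.Canvas;
  import android.view.SurfaceHolder;
  import android.view.SurfaceView;

  public class PreviewView extends SurfaceView implements Runnable {

      private final CameraSource cs =
              new SocketCamera("192.168.0.100", 9889, 320, 240, true);
      private volatile boolean running;

      public PreviewView(Context context) {
          super(context);
      }

      public void startPreview() {
          running = true;
          new Thread(this).start();
      }

      public void stopPreview() {
          running = false;
      }

      public void run() {
          if (!cs.open()) return; // deal with failure to obtain camera
          SurfaceHolder holder = getHolder();
          while (running) {
              Canvas canvas = holder.lockCanvas();
              if (canvas == null) continue; // surface not ready yet
              try {
                  cs.capture(canvas); // capture the frame onto the canvas
              } finally {
                  holder.unlockCanvasAndPost(canvas);
              }
          }
          cs.close();
      }
  }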

The HttpCamera implementation should be able to obtain stills from any HTTP-serving webcam or webcam software. At present I am using WebCam2000, which I run under Windows XP.
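
At its core, an HttpCamera-style capture amounts to issuing a GET for a single JPEG and decoding it onto the canvas. The sketch below is illustrative only: the URL is an example, the class name is my own, and the published HttpCamera will differ in structure (note also the connect-timeout bug described in the notes later).

  import java.io.InputStream;
  import java.net.HttpURLConnection;
  import java.net.URL;

  import android.graphics.Bitmap;
  import android.graphics.BitmapFactory;
  import android.graphics.Canvas;

  // Illustrative only: fetch one still from an HTTP-serving webcam and draw it.
  public final class HttpStillFetcher {

      public static boolean captureOnto(Canvas canvas, String address) {
          try {
              // e.g. address = "http://192.168.0.100:8080/webcam.jpg"
              HttpURLConnection conn =
                      (HttpURLConnection) new URL(address).openConnection();
              conn.setConnectTimeout(5000); // disregarded by early Android releases (see notes)
              InputStream in = conn.getInputStream();
              try {
                  Bitmap bitmap = BitmapFactory.decodeStream(in);
                  if (bitmap == null) return false;
                  canvas.drawBitmap(bitmap, 0f, 0f, null);
                  return true;
              } finally {
                  in.close();
                  conn.disconnect();
              }
          } catch (Exception e) {
              return false; // connection or decoding failed
          }
      }
  }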

Fabian Necci kindly emailed me to let me know that he has successfully used this code on a MacBook using EvoCam to publish JPEG frames to an internal web server. Other software should be available on other systems.

Torben Vesterager has used the code with his MacBook Pro laptop, which has Apple's own iSight built in. He reports that the camera only supports YUV formats, so he had to change the code in the WebcamBroadcaster to use YUVFormat instead of RGBFormat.

Jeffrey Sharkey has used Gentoo Linux with a Logitech QuickCam express using the WebcamBroadcaster and the Linux-native JMF library. He followed this by using DV4Linux to get JMF to recognize his firewire MiniDV camera. This gave him much clearer pictures in Android; he wrapped his JMF calls using the dv4lstart utility:

dv4lstart jmfinit
dv4lstart java WebcamBroadcaster

If you successfully use this code on a platform other than those listed above, or have used other webcam software, please let me know so that I can include it in this documentation for the benefit of others.

The SocketCamera implementation relies on a WebcamBroadcaster serving webcam stills at the address and port specified in its constructor.

To run a WebcamBroadcaster you will need JMF installed and you may need to supply the library location to the JVM when you run the application (this depends on how you install it). For example, on my machine (currently Windows XP) I use:

java "-Djava.library.path=C:\Program Files\JMF2.1.1e\lib" WebcamBroadcaster

You can pass the following combinations of parameters to the application, all of which are strictly positive integers (all dimensions are in pixels):

  • WebcamBroadcaster port
  • WebcamBroadcaster width height
  • WebcamBroadcaster width height port

Alternatively, you may provide no parameters, in which case the defaults are width: 320, height: 240 and port: 9889.
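
For example, assuming the same library path as above, broadcasting 640x480 stills on port 9000 (both values arbitrary) would be:

java "-Djava.library.path=C:\Program Files\JMF2.1.1e\lib" WebcamBroadcaster 640 480 9000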

Notes

  • I am not able to provide support/advice relating to JMF. Sorry, but the library is old and its documentation stale. I am using it here because it's the quickest option, not necessarily the best. You can reimplement the WebcamBroadcaster application using any language/library you choose, so long as it emulates the extremely simple behaviour of the WebcamBroadcaster class (a sketch of such a replacement follows this list). For an easy life, I recommend using the HttpCamera implementation.
  • The SocketCamera implementation can be expected to outperform the HttpCamera implementation significantly.
  • Obviously performance is poor: every frame needs to be compressed, sent over a network and then decompressed, so this is not a useful long-term implementation. Nevertheless, performance isn't so bad that the code is unusable. On my machine, capturing a frame using SocketCamera takes about 100-150ms; HttpCamera is typically two to three times slower.
  • Android currently has a very significant bug whereby the connect timeout setting for URL connections is disregarded. Until that bug is fixed, the HttpCamera implementation will hang the first time it fails to connect to the webcam server. I believe this bug was resolved in the m5 emulator releases, but I have not confirmed it.
  • When capturing a sequence of stills, JMF has a habit of 'freezing' on a single frame. If this happens, just stop and start the WebcamBroadcaster application. You do not need to restart your Android application.
  • I don't recommend using "localhost" when specifying the address to which the camera should connect; the emulator will probably treat this as the loopback of the emulated device itself. Instead, use the IP address of the machine hosting the webcam (or WebcamBroadcaster).
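
As mentioned in the notes above, the WebcamBroadcaster can be reimplemented in any language. The sketch below shows the general shape of such a replacement in plain Java (no JMF): accept a connection, write one JPEG-encoded frame, and repeat. It is an illustration only; the actual wire format is defined by the WebcamBroadcaster and SocketCamera sources, and the four-byte length prefix and one-frame-per-connection behaviour assumed here may not match them. The frame grabber is a stand-in that returns a blank image; a real replacement would substitute whatever capture library is available.

  import java.awt.Graphics2D;
  import java.awt.image.BufferedImage;
  import java.io.ByteArrayOutputStream;
  import java.io.DataOutputStream;
  import java.net.ServerSocket;
  import java.net.Socket;

  import javax.imageio.ImageIO;

  public class FrameServer {

      public static void main(String[] args) throws Exception {
          int width = 320, height = 240, port = 9889; // the defaults described above
          ServerSocket server = new ServerSocket(port);
          while (true) {
              Socket socket = server.accept();
              try {
                  BufferedImage frame = grabFrame(width, height);
                  ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
                  ImageIO.write(frame, "jpeg", jpeg);
                  DataOutputStream out =
                          new DataOutputStream(socket.getOutputStream());
                  out.writeInt(jpeg.size()); // assumed four-byte length prefix
                  jpeg.writeTo(out);         // followed by the JPEG data
                  out.flush();
              } finally {
                  socket.close();
              }
          }
      }

      // Stand-in for a real capture: returns a blank frame of the requested size.
      private static BufferedImage grabFrame(int width, int height) {
          BufferedImage image =
                  new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
          Graphics2D g = image.createGraphics();
          g.fillRect(0, 0, width, height);
          g.dispose();
          return image;
      }
  }

Any program that serves frames in whatever format SocketCamera actually expects can stand in for the JMF-based broadcaster in this way.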