Playing video with VideoView · Advanced Android Development Course - Practicals
In this practical you build a simple app that plays a video clip that is either embedded in the app's resources, or available on the internet.

What you should already KNOW
- Creating, building, and running apps in Android Studio
- The Activity lifecycle, and how configuration changes such as changes to the device screen orientation affect that lifecycle
- Making data persistent across configuration changes with the instance state bundle

What you will LEARN
- How to use the VideoView class to play video in your app
- How to use the MediaController class to control video playing in a VideoView
- How activity state changes affect video playback state in the VideoView
- About event listeners and callbacks for media events
- How to play media files from different locations (embedded in the app, or streamed from the internet)

What you will DO
- Build an app called SimpleVideoView that plays a video clip by using the VideoView and MediaController classes.
- Preserve the playback position for the video across activity state changes.
- React to the end of the video playback with an event listener.
- Handle the time period in your app where the media is being prepared (buffering or decoding), and provide a helpful message to the user.
- Play media files as embedded files in the app, or streamed from the internet.

App overview
In this practical you build the SimpleVideoView app from scratch. SimpleVideoView plays a video file in its own view in your app. The SimpleVideoView app includes a familiar set of media controls. The controls allow you to play, pause, rewind, fast-forward, and use the progress slider to skip forward or back to specific places in the video.
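The "preserve the playback position" and "react to the end of playback" goals can be sketched as follows. This is a minimal sketch, not the practical's own listing: the PLAYBACK_TIME key, the videoview id, and the Toast message are illustrative names chosen here.

```java
public class MainActivity extends AppCompatActivity {

    // Illustrative key name for the instance state bundle.
    private static final String PLAYBACK_TIME = "play_time";
    private VideoView mVideoView;
    private int mCurrentPosition = 0;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        mVideoView = findViewById(R.id.videoview);

        if (savedInstanceState != null) {
            // Remember the saved position; seek to it when the
            // player is (re)initialized.
            mCurrentPosition = savedInstanceState.getInt(PLAYBACK_TIME);
        }

        // React to the end of playback with a completion listener.
        mVideoView.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
            @Override
            public void onCompletion(MediaPlayer mediaPlayer) {
                Toast.makeText(MainActivity.this,
                        "Playback completed", Toast.LENGTH_SHORT).show();
            }
        });
    }

    @Override
    protected void onSaveInstanceState(Bundle outState) {
        super.onSaveInstanceState(outState);
        // getCurrentPosition() reports the playback position in milliseconds.
        outState.putInt(PLAYBACK_TIME, mVideoView.getCurrentPosition());
    }
}
```

Saving the position in onSaveInstanceState means a device rotation destroys and recreates the activity without losing the user's place in the video.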
You start by playing a video file embedded with the app's resources. In the second task, you modify the app to buffer and play a video file from the internet. Play video with VideoView The simplest way to play video in your app is to use a VideoView object, which is part of the Android platform.
Although it does not provide a lot of features or customization, the VideoView class implements a lot of the basic behavior you need to play a video in your app. Your app can play media files from a variety of sources, including files embedded in the app's resources, stored on external media such as an SD card, or streamed from the internet.
In this example, to keep things simple, you embed the video file in the app itself. The MediaController view provides a set of common media-control buttons that can control a media player the VideoView object. In this task you build the first iteration of the SimpleVideoView app, which plays a video clip.
13.1: Playing video with VideoView
Create a new Android project. Call it SimpleVideoView and use the Empty activity template. You can leave all other defaults the same. Create a new raw resource directory to hold the video file the app will play. Change both the Directory name and Resource type to "raw".
Leave all other options as is and click OK. Download the Tacoma Narrows MP4 video file. This constraint keeps the aspect ratio of the video the same and prevents the view from being stretched too far in either direction, depending on how the device is rotated. In onCreate, get a reference to the VideoView in the layout. Calling stopPlayback() on the VideoView stops the video from playing and releases all the resources the VideoView holds.
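A minimal sketch of these steps, assuming the VideoView has the id videoview and the downloaded clip was saved as res/raw/tacoma (both names are assumptions, not from the practical's listings):

```java
private VideoView mVideoView;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    // Get a reference to the VideoView defined in the layout.
    mVideoView = findViewById(R.id.videoview);
}

private void initializePlayer() {
    // Build a Uri pointing at the video embedded in res/raw.
    Uri videoUri = Uri.parse("android.resource://" + getPackageName()
            + "/" + R.raw.tacoma);
    mVideoView.setVideoURI(videoUri);
    mVideoView.start();
}

private void releasePlayer() {
    // stopPlayback() stops the video and releases the resources
    // (including the internal MediaPlayer) held by the VideoView.
    mVideoView.stopPlayback();
}
```

Wrapping the setup and teardown in initializePlayer() and releasePlayer() helpers keeps the lifecycle methods in the next steps short.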
It is critical that your app completely release those resources when it's not using them, even if the app is paused in the background.
Implement the Activity lifecycle method for onStop to release the media resources when the app is stopped. Implement onStart to initialize the resources when the app is started again. Override the onStart method and call initializePlayer. If the app is running on an older version of Android, pause the VideoView here. In older versions of Android, onPause was the end of the visual lifecycle of your app, and you could start releasing resources when the app was paused.
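Those two lifecycle overrides can be sketched like this, assuming the initializePlayer() and releasePlayer() helper names used in this practical:

```java
@Override
protected void onStart() {
    super.onStart();
    // Re-acquire the media resources when the activity becomes visible.
    initializePlayer();
}

@Override
protected void onStop() {
    super.onStop();
    // Release the media resources once the activity is no longer visible,
    // even though the app itself may still be running in the background.
    releasePlayer();
}
```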
In newer versions of Android, your app may be paused but still visible on the screen, as with multi-window or picture-in-picture (PIP) mode. In those cases the user likely wants the video to continue playing in the background. If the video is being played in multi-window or PIP mode, then it is onStop that indicates the end of the visible lifecycle of the app, and your video playback should indeed stop at that time.
If you only stop playing your video in onStop, as in the previous step, then on older devices there may be a few seconds where, even though the app is no longer visible on screen, the video's audio track continues to play while onStop catches up.
This test for older versions of Android pauses the actual playback in onPause to prevent the sound from playing after the app has disappeared from the screen. Build and run the app. When the app starts, the video file is opened and decoded, begins playing, and plays to the end. There is no way to control the media playback, for example using pause, play, fast-forward, or rewind.
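One way to express that version check is a sketch like the following; on devices before Android N (API 24) there is no multi-window or PIP mode, so onPause marks the end of the visual lifecycle:

```java
@Override
protected void onPause() {
    super.onPause();
    // Before API 24 there is no multi-window or PIP mode, so pause
    // playback here to silence the audio track as soon as the app
    // leaves the screen.
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
        mVideoView.pause();
    }
}
```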
You add these capabilities in the next task. Older versions of the emulator support fewer types of video formats, and may also suffer from degraded performance during playback. If you run the app on a physical device, even one that runs a version of Android older than API 23, you should not have either of these problems. The Android platform provides a way to control media using the MediaController view, which is in the android.widget package. A MediaController view combines the most common media-control UI elements (buttons for play, pause, fast-forward, and rewind, as well as a seek or progress bar) with the ability to control an underlying media player, such as a VideoView.
To use a MediaController view, you don't define it in your layout as you would other views. Instead you instantiate it programmatically in your app's onCreate method and then attach it to a media player. The controller floats above your app's layout and enables the user to start, stop, fast-forward, rewind, and seek within the video. The figure above shows:
- A VideoView, which includes a MediaPlayer to decode and play video, and a SurfaceView to display the video on the screen
- A MediaController view, which includes UI elements for video transport controls (play, pause, fast-forward, rewind, progress slider) and the ability to control the video

In this task you add a MediaController to the SimpleVideoView app.
Locate the onCreate method in MainActivity. When you add the MediaController class, make sure that you import android.widget.MediaController. Use setMediaController to make the reverse connection, that is, to tell the VideoView that this MediaController will be used to control it. As before, the video begins to play when the app starts.
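Putting the two-way connection together in onCreate looks roughly like this (a sketch; the videoview id is an assumed name). VideoView implements MediaController.MediaPlayerControl, so it can be passed to setMediaPlayer directly:

```java
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    mVideoView = findViewById(R.id.videoview);

    // Instantiate the controller in code (not in the layout).
    MediaController controller = new MediaController(this);
    // Tell the controller which player it controls...
    controller.setMediaPlayer(mVideoView);
    // ...and tell the VideoView which controller will control it.
    mVideoView.setMediaController(controller);
}
```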
Tap the VideoView to make the MediaController appear. You can then use any of the elements in that controller to control media playback.