MediaCodec | use MediaCodec to encode/decode and pass samples
kandi X-RAY | MediaCodec Summary
This example showcases use cases of MediaCodec. It can be valuable for applications that do encoding, decoding, and transferring of samples in H.264 (for example) over a network, etc.
Top functions reviewed by kandi - BETA
- Called when the surface is initialized
- Starts the scheduler
- Called when data has been destroyed
- Stop the worker
- Initialize the view
- Start rendering
- Stops the rendering process
- Configures the surface with a csd0
- Decodes a single sample from an array of bytes
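These functions follow the standard Android MediaCodec decode path. As a rough illustration, here is a minimal sketch (not the repository's actual code) of configuring an H.264 decoder with a csd-0 buffer and decoding single samples; the class name, buffer handling, and timeouts are illustrative assumptions:

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;

// Hypothetical sketch of an H.264 decoder fed with raw samples; csd0 (SPS/PPS),
// width/height and the output Surface are assumed to come from the caller.
class SampleDecoder {
    private MediaCodec codec;

    void configure(Surface surface, byte[] csd0, int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        format.setByteBuffer("csd-0", ByteBuffer.wrap(csd0));
        codec = MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        codec.configure(format, surface, null, 0);
        codec.start();
    }

    void decodeSample(byte[] sample, long presentationTimeUs) {
        int inIndex = codec.dequeueInputBuffer(10_000 /* us */);
        if (inIndex >= 0) {
            ByteBuffer input = codec.getInputBuffer(inIndex);
            input.clear();
            input.put(sample);
            codec.queueInputBuffer(inIndex, 0, sample.length, presentationTimeUs, 0);
        }
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = codec.dequeueOutputBuffer(info, 10_000 /* us */);
        if (outIndex >= 0) {
            // Render the decoded frame directly to the configured Surface.
            codec.releaseOutputBuffer(outIndex, true);
        }
    }
}
```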
Community Discussions
Trending Discussions on MediaCodec
QUESTION
I am building an Android application that lists all camera devices. I want to list all the camera devices, allow the user to use them all, and also let the user play with the resolution as they want, so I followed this link: https://developer.android.com/training/camera2/camera-enumeration
It recommends only using the camera devices with the flag:
...ANSWER
Answered 2022-Mar-30 at 00:06
BACKWARD_COMPATIBLE devices are ones that support YUV and JPEG output and a bunch of basic camera behavior.
In general, only very few camera types won't list BACKWARD_COMPATIBLE; one such example is a pure depth camera, which won't produce JPEGs. For such devices, you have to manually check what output formats are actually supported via 'getOutputFormats', since it's likely something like JPEG won't be listed, or it may only support monochrome output and not color, which may make it impossible to use it with a video recorder.
If you're seeing a lot of devices get filtered out by excluding BACKWARD_COMPATIBLE, it'd be interesting to know, since in my experience they're very rare.
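For reference, a minimal sketch of enumerating cameras with camera2 and checking whether each one reports the BACKWARD_COMPATIBLE capability; the CameraLister class and the log tag are illustrative, not part of the question:

```java
import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;

public class CameraLister {
    // Lists camera IDs and whether each reports the BACKWARD_COMPATIBLE capability.
    public static void listCameras(Context context) throws CameraAccessException {
        CameraManager manager =
                (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        for (String id : manager.getCameraIdList()) {
            CameraCharacteristics chars = manager.getCameraCharacteristics(id);
            int[] caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
            boolean backwardCompatible = false;
            if (caps != null) {
                for (int c : caps) {
                    if (c == CameraCharacteristics
                            .REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE) {
                        backwardCompatible = true;
                        break;
                    }
                }
            }
            android.util.Log.d("CameraLister",
                    "Camera " + id + " backwardCompatible=" + backwardCompatible);
        }
    }
}
```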
QUESTION
I'm trying to load videos from Firebase Storage into my RecyclerView with ExoPlayer. But the problem is that the video sometimes does not play, and if I hit the play button it gives me the error below
...ANSWER
Answered 2022-Mar-27 at 19:27
I had to release the player when the RecyclerView items are recycled.
I found the answer in this question here
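A minimal sketch of what that release looks like in a RecyclerView adapter, assuming a hypothetical PlayerViewHolder that owns its own ExoPlayer instance:

```java
import androidx.annotation.NonNull;
import androidx.recyclerview.widget.RecyclerView;
import com.google.android.exoplayer2.ExoPlayer;

// Hypothetical ViewHolder that owns its own ExoPlayer instance.
class PlayerViewHolder extends RecyclerView.ViewHolder {
    ExoPlayer player;
    PlayerViewHolder(@NonNull android.view.View itemView) { super(itemView); }
}

abstract class VideoAdapter extends RecyclerView.Adapter<PlayerViewHolder> {
    @Override
    public void onViewRecycled(@NonNull PlayerViewHolder holder) {
        super.onViewRecycled(holder);
        // Release the player so its underlying MediaCodec instances are freed
        // before the view is reused for another item.
        if (holder.player != null) {
            holder.player.release();
            holder.player = null;
        }
    }
}
```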
QUESTION
I'm trying to write a simple TvInputService for Android TV using ExoPlayer. On the emulator everything works fine, but on a Sony TV (KDL-43WF804) I get an IllegalStateException from the video codec after a few seconds of playback. What am I doing wrong?
Logs:
...ANSWER
Answered 2022-Feb-10 at 19:35
I figured it out. In my case, this exception is caused by a crash of the system TV application, which owns the Surface object. The codec goes into the Error state when the Surface becomes invalid, and at that same moment ExoPlayer tries to work with the codec's buffers, not knowing that the codec has moved from the Executing state to the Error state.
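As a side note, a minimal sketch (not from the answer) of how a drain loop can react when the codec has slipped into the Error state, for example because its output Surface became invalid:

```java
import android.media.MediaCodec;

class CodecErrorHandler {
    // Sketch: drain one output buffer and react if the codec has slipped
    // into the Error state (e.g. its output Surface became invalid).
    static void drainOnce(MediaCodec codec, MediaCodec.BufferInfo info) {
        try {
            int index = codec.dequeueOutputBuffer(info, 10_000 /* us */);
            if (index >= 0) {
                codec.releaseOutputBuffer(index, /* render= */ true);
            }
        } catch (MediaCodec.CodecException e) {
            if (e.isRecoverable()) {
                // Per the MediaCodec docs: stop(), configure(), start() again.
                codec.stop();
            } else {
                // Non-recoverable: release the codec and create a new one.
                codec.release();
            }
        } catch (IllegalStateException e) {
            // The codec was not in the Executing state; release and recreate.
            codec.release();
        }
    }
}
```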
And the reason for the crash of the system TV app was the following exception:
QUESTION
I am trying to use the video_player, but I am getting the below error. I have also added an MRE (minimum reproducible example).
I have used an emulated Pixel 4, an emulated Pixel 4 XL, and an emulated Pixel 5 with the Android Studio Beta, but none of them worked.
The below error was when I was using a Pixel 4 XL, but the error was the same with all of them.
Error:
...ANSWER
Answered 2022-Jan-11 at 08:53
It can indeed be a bug in that Flutter package. Have you tried creating an issue on that package's GitHub?
Secondly, during my development I have seen several cases where emulators just fail while real devices always work. The solution I used was simply not to test on emulators. Real users never use emulators, do they?
It can be a bug in the library when running on the x86 architecture (the one emulators use); then nobody with a real device (ARM architecture) will ever see the bug.
Thirdly, consider using "cloud real devices" to test whether it works on the real Pixel devices you are worried about. There are many platforms that host real devices which you can connect to via a web page and test your app on.
QUESTION
For testing purposes I am creating a new video from an existing one by using MediaExtractor and MediaMuxer. I expect the new video to have exactly the same duration as the original one, but that is not the case. The new video's duration is slightly shorter than the original's.
...ANSWER
Answered 2022-Jan-06 at 10:38
Not a full answer, but some inputs that may help:
I believe MediaExtractor and MediaMuxer are vendor-owned; Google has the default C++ implementation, but vendors can override it. You can review the Google implementation here: https://cs.android.com/android/platform/superproject/+/master:frameworks/av/media/libstagefright/MediaMuxer.cpp;l=173
It helped me solve one of the voodoo bugs in the engine related to a "last frame" / time mismatch. NOTE: make sure you are looking at the right version (check the blame tool to the right).
The media can contain any metadata value that was pushed while creating the file, so you can either calculate it again or just use what you get.
Did you try taking the video you created and running the test again with that file as the input?
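For context, a minimal remux sketch with MediaExtractor and MediaMuxer along the lines the question describes; srcPath and dstPath are placeholders and error handling is omitted:

```java
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import java.nio.ByteBuffer;

class Remuxer {
    // Copies every sample of every track from srcPath into a new MP4 at dstPath.
    static void remux(String srcPath, String dstPath) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(srcPath);
        MediaMuxer muxer = new MediaMuxer(dstPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);

        int[] dstTrack = new int[extractor.getTrackCount()];
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            dstTrack[i] = muxer.addTrack(format);
            extractor.selectTrack(i);
        }
        muxer.start();

        ByteBuffer buffer = ByteBuffer.allocate(1 << 20);
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            info.offset = 0;
            info.size = extractor.readSampleData(buffer, 0);
            if (info.size < 0) break;                    // no more samples
            info.presentationTimeUs = extractor.getSampleTime();
            info.flags = extractor.getSampleFlags();
            muxer.writeSampleData(dstTrack[extractor.getSampleTrackIndex()], buffer, info);
            extractor.advance();
        }
        muxer.stop();
        muxer.release();
        extractor.release();
    }
}
```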
QUESTION
I am using flutter_ffmpeg and trying to save the output video to local storage, but I am getting an error. Please help me; I have tried many solutions but none of them worked. Thank you :)
I am getting the output path via the path_provider package
ANSWER
Answered 2021-Dec-27 at 12:10
I found the solution, so I am answering here. The issue is not with flutter_ffmpeg; it is caused by the app not having permission to write to external storage. To resolve this, add "MANAGE_EXTERNAL_STORAGE" to the AndroidManifest.xml file and set the output path to File('storage/emulated/0/my_folder/o.mp4').path, and everything works fine.
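Depending on the target SDK, the MANAGE_EXTERNAL_STORAGE manifest entry may also require the user to grant "all files access" at runtime. A minimal sketch of checking and requesting it on Android 11+ (the helper name is illustrative):

```java
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.os.Build;
import android.os.Environment;
import android.provider.Settings;

class StoragePermissionHelper {
    // Opens the "All files access" settings screen if the app does not have it yet.
    static void ensureAllFilesAccess(Activity activity) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R
                && !Environment.isExternalStorageManager()) {
            Intent intent = new Intent(
                    Settings.ACTION_MANAGE_APP_ALL_FILES_ACCESS_PERMISSION,
                    Uri.parse("package:" + activity.getPackageName()));
            activity.startActivity(intent);
        }
    }
}
```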
QUESTION
I am getting an error when capturing video
java.lang.NullPointerException: Attempt to invoke virtual method 'void android.media.MediaCodec.reset()' on a null object reference
Even though I have given the RECORD_AUDIO permission. I am using CameraX version 1.0.2
...ANSWER
Answered 2021-Dec-17 at 12:46
For those who are struggling with a MediaCodec exception while recording video: have a look at CameraX video capture. It solved the issue for me. Hope it will be helpful to someone.
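For illustration, a minimal sketch of binding the newer CameraX video-capture use case from the camera-video artifact; the method name and the lifecycle-owner wiring are assumptions:

```java
import androidx.camera.core.CameraSelector;
import androidx.camera.lifecycle.ProcessCameraProvider;
import androidx.camera.video.Recorder;
import androidx.camera.video.VideoCapture;
import androidx.lifecycle.LifecycleOwner;

class VideoCaptureBinder {
    // Builds a Recorder-backed VideoCapture use case and binds it to the lifecycle.
    static void bindVideoCapture(ProcessCameraProvider cameraProvider,
                                 LifecycleOwner lifecycleOwner) {
        Recorder recorder = new Recorder.Builder().build();
        VideoCapture<Recorder> videoCapture = VideoCapture.withOutput(recorder);
        cameraProvider.unbindAll();
        cameraProvider.bindToLifecycle(
                lifecycleOwner, CameraSelector.DEFAULT_BACK_CAMERA, videoCapture);
    }
}
```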
QUESTION
I am getting irregular newBufferInfo.presentationTimeUs values, and because of that, if I use Thread.sleep to slow down the playback, a lot of frames are dropped.
Actually, with a Surface the frame timestamps are synchronized automatically with the system timestamp without sleeping; however, that does not work when sending the output to OpenGL ES. https://developer.android.com/reference/android/media/MediaCodec#releaseOutputBuffer(int,%20long)
I thought mExtractor.getSampleTime() was the problem, but even after removing it, the problem is still there.
ANSWER
Answered 2021-Nov-19 at 05:04
I've noticed a couple of problems in your code.
First, you shouldn't compute the presentation time yourself: if the video has B-frames, the presentation times of frames might not always come in increasing order, which makes the frame timestamps appear to be out of order.
https://ottverse.com/i-p-b-frames-idr-keyframes-differences-usecases/
Second, you shouldn't drop a frame based on the time you passed it into the decoder. The decoder sometimes needs multiple input frames before it outputs a new frame, so it may take a while to decode a particular frame. Instead, you should calculate the render time based on the first frame rendered.
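A minimal sketch of that approach: pace each output buffer against the wall-clock time elapsed since the first rendered frame rather than sleeping a fixed amount per frame (the OutputPacer class is illustrative):

```java
import android.media.MediaCodec;

// Sketch: pace rendering against the first rendered frame instead of
// sleeping a fixed amount per frame.
class OutputPacer {
    private long firstPtsUs = -1;
    private long startNanos;

    void render(MediaCodec codec, int outputIndex, MediaCodec.BufferInfo info)
            throws InterruptedException {
        if (firstPtsUs < 0) {
            firstPtsUs = info.presentationTimeUs;
            startNanos = System.nanoTime();
        }
        // How far into the stream this frame is, relative to the first rendered frame.
        long mediaElapsedUs = info.presentationTimeUs - firstPtsUs;
        long wallElapsedUs = (System.nanoTime() - startNanos) / 1_000;
        long waitUs = mediaElapsedUs - wallElapsedUs;
        if (waitUs > 0) {
            Thread.sleep(waitUs / 1_000, (int) ((waitUs % 1_000) * 1_000));
        }
        codec.releaseOutputBuffer(outputIndex, /* render= */ true);
    }
}
```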
QUESTION
I am trying to overlay a PNG over a transparent GIF using FFmpeg. The problem is that the command runs flawlessly, but the output file converts transparent pixels into black or white.
I am using the following command.
...ANSWER
Answered 2021-Oct-15 at 15:44
QUESTION
I am using OpenGLES2 output to display to a SurfaceView or encode to mp4 using MediaCodec.
However, I can only do one at a time. I can obviously draw using OpenGLES2 onto two separate surfaces but that would be a really inefficient use of the GPU.
What I want is to use some sort of reference counting to reuse the buffer to both draw on the screen and encode the single OpenGLES2 output. Like how camera service does in the Shared Surfaces concept.
How can one do both display and encoding of a buffer? Is there some sort of tee element (like in GStreamer) in Android?
...ANSWER
Answered 2021-Aug-13 at 11:45
You can't make your SurfaceView bigger than the screen. There are multiple ways to achieve this in different manners, but you can't directly reuse the SurfaceView's surface for encoding after displaying it on screen.
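One common workaround (not from this answer) is the "draw twice" pattern used in Google's Grafika samples: render the frame into a texture once, then draw that texture to both the on-screen EGL surface and an EGL surface wrapping the encoder's input Surface. A rough sketch, with drawTexture() and the EGL setup assumed to exist elsewhere:

```java
import android.media.MediaCodec;
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;
import android.view.Surface;

class EncoderSurfaceHelper {
    // The encoder is assumed to be configured with CONFIGURE_FLAG_ENCODE already.
    static EGLSurface createEncoderSurface(EGLDisplay display, EGLConfig config,
                                           MediaCodec encoder) {
        Surface inputSurface = encoder.createInputSurface();
        int[] attribs = { EGL14.EGL_NONE };
        return EGL14.eglCreateWindowSurface(display, config, inputSurface, attribs, 0);
    }

    // Per frame (pseudo-flow, drawTexture() is assumed to exist):
    //   EGL14.eglMakeCurrent(display, screenSurface, screenSurface, context);
    //   drawTexture();
    //   EGL14.eglSwapBuffers(display, screenSurface);
    //
    //   EGL14.eglMakeCurrent(display, encoderSurface, encoderSurface, context);
    //   drawTexture();
    //   EGLExt.eglPresentationTimeANDROID(display, encoderSurface, frameTimeNanos);
    //   EGL14.eglSwapBuffers(display, encoderSurface);
}
```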
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install MediaCodec
You can use MediaCodec like any standard Java library. Please include the jar files in your classpath. You can also use any IDE, and you can run and debug the MediaCodec component as you would any other Java program. Best practice is to use a build tool that supports dependency management, such as Maven or Gradle. For Maven installation, please refer to maven.apache.org. For Gradle installation, please refer to gradle.org.