CaptureDevice | Adobe Air Native Extension for video capturing from cameras | Camera library
kandi X-RAY | CaptureDevice Summary
Adobe Air Native Extension for video capturing from cameras
Community Discussions
Trending Discussions on CaptureDevice
QUESTION
Here is the snippet of the code:
...ANSWER
Answered 2022-Feb-01 at 22:05
You are accessing the local variable captureDevice, not the struct's captureDevice variable. Just replace the line causing the error with the following:
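As a rough illustration (not the original poster's code), here is a minimal Swift sketch of the shadowing issue, using hypothetical names:

```swift
import AVFoundation

struct CameraConfigurator {
    // The struct's stored property.
    var captureDevice: AVCaptureDevice?

    mutating func configure() {
        // This local constant shadows the property of the same name.
        let captureDevice = AVCaptureDevice.default(for: .video)

        // Referring to plain `captureDevice` here means the local one;
        // qualify with `self.` to reach the struct's variable.
        self.captureDevice = captureDevice
    }
}
```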
QUESTION
I am trying to run code to show the video output of my webcam and all I am getting is a single picture. Here is my code:
...ANSWER
Answered 2021-Oct-05 at 10:01
gray = cv2.cvtColor(gray, cv2.COLOR_BGR2GRAY)
QUESTION
I'm trying to change the exposure in my camera app according to a certain point in the image.
I'm using the following code, which is triggered when the user taps the screen. For now I simply try to expose for the center.
...ANSWER
Answered 2021-Aug-05 at 05:11
I actually ran into this issue yesterday. Turns out there's a problem with using exactly (0.5, 0.5). When I use (0.51, 0.51) it works every time 🤷
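For context, a minimal AVFoundation sketch of setting the exposure point; the function name and `device` parameter are illustrative, not from the question:

```swift
import AVFoundation

func expose(at point: CGPoint, on device: AVCaptureDevice) {
    guard device.isExposurePointOfInterestSupported else { return }
    do {
        try device.lockForConfiguration()
        // The point is in the normalized (0,0)-(1,1) device coordinate space.
        device.exposurePointOfInterest = point
        if device.isExposureModeSupported(.continuousAutoExposure) {
            device.exposureMode = .continuousAutoExposure
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}

// Per the answer above, use (0.51, 0.51) rather than exactly (0.5, 0.5):
// expose(at: CGPoint(x: 0.51, y: 0.51), on: someCaptureDevice)
```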
QUESTION
I want to broadcast a live match with some overlays over it, such as sponsor images in the top corners of the screen and a scorecard at the bottom of the screen. Can someone help me or guide me toward a way of implementing this? I use this pod (HaishinKit), but it is not serving the purpose. I use the rtmpstream.attachScreen function for broadcasting my UIView, but this function does not pick up my camera view (AVCaptureVideoPreviewLayer); only the scorecard and sponsor images are broadcast. I want to broadcast my camera screen along with the scorecard and the other images, together with the audio.
...ANSWER
Answered 2021-Aug-04 at 09:32
I have found a way to live-stream the camera view with overlays on it by creating two RTMPStream objects: one for attaching the camera and the second one for attachScreen. The following is the code.
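The original snippet is not reproduced above; the following is a rough sketch of the two-stream idea against a HaishinKit-style API. The server URL, stream keys, and the exact attachScreen parameters are assumptions and vary between HaishinKit versions:

```swift
import AVFoundation
import HaishinKit

let connection = RTMPConnection()

// Stream 1: live camera plus microphone audio.
let cameraStream = RTMPStream(connection: connection)
cameraStream.attachAudio(AVCaptureDevice.default(for: .audio))
cameraStream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera,
                                                  for: .video,
                                                  position: .back))

// Stream 2: the UIView overlays (scorecard, sponsor images) via attachScreen.
let overlayStream = RTMPStream(connection: connection)
// overlayStream.attachScreen(...)  // screen-capture source; signature differs by version

connection.connect("rtmp://your-server/live")  // placeholder URL
cameraStream.publish("camera")                 // placeholder stream keys
overlayStream.publish("overlays")
```

Whether both streams share one RTMPConnection or use separate connections is a design choice; the answer above only states that two RTMPStream objects are used.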
QUESTION
I have code to capture preview images from the app for image processing. I need to control the frame rate of the preview for that purpose, but the code to set the frame rate has no effect on the preview stream. Here is the code:
...ANSWER
Answered 2021-Jul-08 at 10:43
Check if the device supports the desired FPS using this:
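The answer's snippet is not shown above; as a stand-in, here is a hedged AVFoundation sketch that checks the supported frame-rate ranges and then applies the rate (the function name and `device` parameter are illustrative):

```swift
import AVFoundation

func setFrameRate(_ fps: Double, on device: AVCaptureDevice) throws {
    // Check whether the active format supports the desired FPS.
    let supported = device.activeFormat.videoSupportedFrameRateRanges
        .contains { ($0.minFrameRate...$0.maxFrameRate).contains(fps) }
    guard supported else {
        print("Active format does not support \(fps) fps")
        return
    }
    try device.lockForConfiguration()
    let duration = CMTime(value: 1, timescale: CMTimeScale(fps))
    device.activeVideoMinFrameDuration = duration
    device.activeVideoMaxFrameDuration = duration
    device.unlockForConfiguration()
}
```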
QUESTION
I'd like to use RxSwift to process video frames captured from the iPhone camera. I'm using a community maintained project, https://github.com/RxSwiftCommunity/RxAVFoundation, which bridges AVFoundation (used to capture camera output) and RxSwift.
I'm trying to just print a dummy log statement whenever new video frames get written to the output buffer. The following is my ViewController. I configure the AVCaptureSession, set up the Rx chain, then start the session. However, the print statement in the .next case is never triggered. I reached out to the project owner. Is the below code correct? Here's the Reactive extension for the AVCaptureSession class from the community maintained project: https://github.com/RxSwiftCommunity/RxAVFoundation/blob/master/RxAVFoundation/AVCaptureSession%2BRx.swift
...ANSWER
Answered 2021-Jun-29 at 08:46
Because you've defined your DisposeBag locally inside viewDidLoad, all the subscriptions added to the bag are disposed as soon as viewDidLoad finishes. Declare your DisposeBag as an instance variable of the ViewController to fix this:
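A minimal sketch of the fix (the view controller body is illustrative, not taken from the question):

```swift
import RxSwift
import UIKit

final class ViewController: UIViewController {
    // Keep the bag alive for the lifetime of the view controller,
    // not just for the duration of viewDidLoad.
    private let disposeBag = DisposeBag()

    override func viewDidLoad() {
        super.viewDidLoad()
        Observable<Int>.interval(.seconds(1), scheduler: MainScheduler.instance)
            .subscribe(onNext: { value in
                print("still subscribed: \(value)")
            })
            .disposed(by: disposeBag)  // disposed when the view controller deallocates
    }
}
```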
QUESTION
On macOS, is it possible to see a virtual Camera, such as OBS, as a CaptureDevice? I see that, for example, Google Chrome or Zoom can use this camera, but using AVCaptureDevice.DiscoverySession I am unable to see it.
Am I doing something wrong?
...ANSWER
Answered 2021-Jun-29 at 05:25
Try setting the DiscoverySession mediaType to .video, and make sure your OBS virtual camera is working: you should be able to select it in the Camera menu in Photo Booth.app.
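For reference, a hedged macOS sketch of a discovery session that can surface external/virtual cameras such as the OBS one; `.externalUnknown` is the pre-macOS 14 device type for such devices:

```swift
import AVFoundation

// Include external/virtual devices alongside the built-in camera.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .externalUnknown],
    mediaType: .video,
    position: .unspecified
)

for device in discovery.devices {
    print(device.localizedName)  // the OBS Virtual Camera should be listed here
}
```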
QUESTION
Is there any way to play audio directly into a capture device in C#? In my project I will later have to feed a virtual capture driver with audio so I can use it in other programs and play the desired audio anywhere else, but I'm not sure it is possible in C#. I tried to do this with NAudio (which is truly amazing):
...ANSWER
Answered 2021-May-31 at 07:07
You cannot push audio to a device that generates audio on its own, a "capture device".
Loopback mode means that you can get a copy of the audio stream from a rendering device, but it does not work the other way around.
The way things can work more or less as you assumed is when you have a special implementation of an audio capture device (custom or third-party, since no stock implementation of the kind exists), designed to generate audio supplied by an external application such as yours pushing the payload audio data via an API.
Switching to C++ will be of no help with this challenge.
QUESTION
I've been making an iOS camera app and have been trying to solve this problem for two days (but cannot solve it).
What I'm working on now is changing the focus and exposure automatically depending on where the user tapped. Sometimes it works fine (maybe about 20% of the time), but mostly it fails, especially when I try to focus on a far object (5+ metres away) or when there are two objects and I try to switch the focus from one object to the other. The image below is an example.
The yellow square marks where the user tapped; even though I tapped the black cup in the first picture, the camera still focuses on the red cup.
...ANSWER
Answered 2021-Apr-09 at 09:58
Thanks to @Artem I was able to solve the problem. All I needed to do was convert the absolute coordinate to the value used in focusPointOfInterest (from min (0, 0) to max (1, 1)).
Thank you, Artem!!
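A minimal sketch of that conversion, assuming the tap arrives in the coordinate space of an AVCaptureVideoPreviewLayer (function and parameter names are illustrative):

```swift
import AVFoundation
import UIKit

func focus(at layerPoint: CGPoint,
           previewLayer: AVCaptureVideoPreviewLayer,
           device: AVCaptureDevice) {
    // Convert the tap from layer coordinates to the normalized (0,0)-(1,1)
    // device coordinate space expected by focusPointOfInterest.
    let devicePoint = previewLayer.captureDevicePointConverted(fromLayerPoint: layerPoint)

    guard device.isFocusPointOfInterestSupported else { return }
    do {
        try device.lockForConfiguration()
        device.focusPointOfInterest = devicePoint
        if device.isFocusModeSupported(.autoFocus) {
            device.focusMode = .autoFocus
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```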
QUESTION
I'm using two separate iOS libraries that make use of the device's camera.
The first one, is a library used to capture regular photos using the camera. The second one, is a library that uses ARKit to measure the world.
Somehow, after using the ARKit code, the regular camera (with the exact same settings and initialization code) renders a much lower-quality preview and captured image (a lot of noise in the image; it looks like post-processing is missing). A full app restart is required to return the camera to its original quality.
I know this may be vague, but here's the code for each library (more or less). Any ideas what could be missing? Why would ARKit permanently change the camera's settings? I could easily fix it if I knew which setting is getting lost/changed after ARKit is used.
Code sample for iOS image capture (removed error checking and boilerplate):
...ANSWER
Answered 2021-Mar-02 at 01:59
It happens because ARKit's maximum output resolution is lower than the camera's. You can check ARWorldTrackingConfiguration.supportedVideoFormats for a list of ARConfiguration.VideoFormat values to see all available resolutions for the current device.
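To illustrate, a short sketch that lists the supported formats and picks the highest-resolution one (the selection criterion is illustrative):

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// Every resolution/frame-rate combination ARKit can deliver on this device.
for format in ARWorldTrackingConfiguration.supportedVideoFormats {
    print(format.imageResolution, format.framesPerSecond)
}

// Optionally pick the largest available resolution for the session.
if let best = ARWorldTrackingConfiguration.supportedVideoFormats
    .max(by: { $0.imageResolution.width < $1.imageResolution.width }) {
    configuration.videoFormat = best
}
```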
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported