SwiftyCam | A Snapchat Inspired iOS Camera Framework written in Swift | Camera library
kandi X-RAY | SwiftyCam Summary
SwiftyCam is a simple, Snapchat-style iOS camera framework for easy photo and video capture. SwiftyCam allows users to capture both photos and videos from the same session with very little configuration. Configuring a camera view controller in AVFoundation can be tedious and time-consuming; SwiftyCam is a drop-in view controller that gives complete control of the AVSession.
Trending Discussions on SwiftyCam
QUESTION
I am using two pods: one to record clips (SwiftyCam) and one to merge the recorded clips (Swift Video Generator).
However, I am encountering a problem which is starting to seriously bother me. I also opened an issue on this. If you would like to read it, here's the link: Last video in array of multiple videos dictates whether previous videos are mirrored. Please note (before reading the summarized description of the problem) that all videos are recorded in portrait orientation and that those recorded using the front camera are supposed to be mirrored (as single clips but also in the merged video).
To summarize it: If I only record clips with one camera, the merged video looks alright (e.g. only with the front camera: every single clip is mirrored, and this doesn't change when merging them). However, if I use multiple clips from both cameras, say I record one with the front camera and afterward another one with the back camera, then the first video (front camera) will be "unmirrored" in the merged video. The opposite occurs if the last clip was recorded using the front camera: in that case, all clips from the back camera are mirrored in the merged video.
I tried looking into the video generator's code and found this (in Swift Video Generator, VideoGenerator.swift, l. 309):
...ANSWER
Answered 2019-Jul-23 at 09:14
With some research and some help from raywenderlich.com I managed to find a solution to this problem, and even discovered another, deeper one caused by the other pod I mentioned (SwiftyCam). Due to that SwiftyCam issue I had to tweak the solution presented here a bit, i.e. I had to change the translations of the CGAffineTransform, which usually shouldn't be necessary (EDIT: I also put this code in the solution now; you might need it, you might not, so you'll have to try and see. I currently cannot explain why it is sometimes needed and sometimes not).
Solution:
First of all, we need two helper functions from raywenderlich.com. This one gives us information about the video's orientation and whether or not it's portrait. The mirrored UIImage.Orientation cases are actually missing in the original function, but I needed rightMirrored (the first else if), too:
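For reference, the raywenderlich.com-style helper extended with the extra mirrored case looks roughly like the sketch below. The transform comparisons follow the commonly used tutorial helper and are an assumption, not code from the question, so verify them against your actual footage:

```swift
import UIKit
import AVFoundation

// Sketch of the orientation helper, extended with the .rightMirrored case for
// portrait front-camera clips. The matrix comparisons are the standard
// raywenderlich.com-style checks; treat them as an assumption to verify.
func orientationFromTransform(_ transform: CGAffineTransform)
    -> (orientation: UIImage.Orientation, isPortrait: Bool) {
    var assetOrientation = UIImage.Orientation.up
    var isPortrait = false
    if transform.a == 0 && transform.b == 1.0 && transform.c == -1.0 && transform.d == 0 {
        // Portrait, back camera
        assetOrientation = .right
        isPortrait = true
    } else if transform.a == 0 && transform.b == 1.0 && transform.c == 1.0 && transform.d == 0 {
        // Portrait, front camera (mirrored) - the case missing from the original helper
        assetOrientation = .rightMirrored
        isPortrait = true
    } else if transform.a == 0 && transform.b == -1.0 && transform.c == 1.0 && transform.d == 0 {
        assetOrientation = .left
        isPortrait = true
    } else if transform.a == 1.0 && transform.b == 0 && transform.c == 0 && transform.d == 1.0 {
        assetOrientation = .up
    } else if transform.a == -1.0 && transform.b == 0 && transform.c == 0 && transform.d == -1.0 {
        assetOrientation = .down
    }
    return (assetOrientation, isPortrait)
}
```

The idea is to inspect each clip's `preferredTransform` with this helper when building the composition, so that a mirrored front-camera clip keeps its own layer-instruction transform instead of inheriting whatever the last clip in the array used.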
QUESTION
I am using SwiftyCam and would like to give some sort of animated feedback when using the tap to focus feature.
SwiftyCam comes with a didFocusAtPoint function which is called when the screen is tapped and provides you with the point of the tap.
How can I now create and show an animation at that point?
...ANSWER
Answered 2018-Dec-17 at 22:18
The DemoSwiftyCam project on GitHub already has an implementation of this feature in ViewController.swift:
QUESTION
I'm developing an app using NativeScript with Angular. I can run it on the simulator, but when I open the project in Xcode and try to create the archive file I get the following error under Apple Mach-O Linker:
...ANSWER
Answered 2018-Sep-14 at 05:10
It looks like this is your problem: ld: framework not found BEMCheckBox. You are missing the BEMCheckBox framework in your project, which is causing the build to fail.
Check out https://github.com/Boris-Em/BEMCheckBox#installation for how to install the framework. If you're not sure how to install it, check out CocoaPods (https://guides.cocoapods.org/using/using-cocoapods). It's a great tool for installing and managing frameworks.
QUESTION
I have created my camera view following the demo project on GitHub from SwiftyCam. Everything lays out correctly and works well; however, when the camera button is pushed I get a message in the console saying "[SwiftyCam]: Cannot take photo. Capture session is not running". Other people have had this problem with Swift 4, and you can find that discussion here. I have gone through the whole framework line by line, but for some reason I can't figure it out. I would really appreciate it if someone could look at the framework and the documentation and help me out. The way I'm doing it is just about exactly how it's done in the Swift 4 demo project, so that would be the code reference.
Thank you in advance
EDIT: Below is the code when setting up the SwipeNavigationController after the WelcomeVC
...ANSWER
Answered 2018-Aug-10 at 04:37
public enum Position {
case center
case top
case bottom
case left
case right
}
enum ActivePanDirection {
case undefined
case horizontal
case vertical
}
public protocol SwipeNavigationControllerDelegate: class {
func swipeNavigationController(_ controller: SwipeNavigationController, willShowEmbeddedViewForPosition position: Position)
func swipeNavigationController(_ controller: SwipeNavigationController, didShowEmbeddedViewForPosition position: Position)
}
open class SwipeNavigationController: SwiftyCamViewController, SwiftyCamViewControllerDelegate {
@IBOutlet fileprivate var currentXOffset: NSLayoutConstraint!
@IBOutlet fileprivate var currentYOffset: NSLayoutConstraint!
open fileprivate(set) weak var activeViewController: UIViewController!
public weak var delegate: SwipeNavigationControllerDelegate?
open fileprivate(set) var centerViewController: UIViewController!
open var topViewController: UIViewController? {
willSet(newValue) {
self.shouldShowTopViewController = newValue != nil
guard let viewController = newValue else {
return
}
addEmbeddedViewController(viewController, previousViewController: topViewController, position: .top)
}
}
open var bottomViewController: UIViewController? {
willSet(newValue) {
self.shouldShowBottomViewController = newValue != nil
guard let viewController = newValue else {
return
}
addEmbeddedViewController(viewController, previousViewController: bottomViewController, position: .bottom)
}
}
open var leftViewController: UIViewController? {
willSet(newValue) {
self.shouldShowLeftViewController = newValue != nil
guard let viewController = newValue else {
return
}
addEmbeddedViewController(viewController, previousViewController: leftViewController, position: .left)
}
}
open var rightViewController: UIViewController? {
willSet(newValue) {
self.shouldShowRightViewController = newValue != nil
guard let viewController = newValue else {
return
}
addEmbeddedViewController(viewController, previousViewController: rightViewController, position: .right)
}
}
open override var shouldAutomaticallyForwardAppearanceMethods: Bool {
get {
return false
}
}
@IBOutlet fileprivate var mainPanGesture: UIPanGestureRecognizer!
fileprivate var previousNonZeroDirectionChange = CGVector(dx: 0.0, dy: 0.0)
fileprivate var activePanDirection = ActivePanDirection.undefined
fileprivate let verticalSnapThresholdFraction: CGFloat = 0.15
fileprivate let horizontalSnapThresholdFraction: CGFloat = 0.15
fileprivate var centerContainerOffset: CGVector!
fileprivate var topContainerOffset: CGVector!
fileprivate var bottomContainerOffset: CGVector!
fileprivate var leftContainerOffset: CGVector!
fileprivate var rightContainerOffset: CGVector!
open var shouldShowTopViewController = true
open var shouldShowBottomViewController = true
open var shouldShowLeftViewController = true
open var shouldShowRightViewController = true
open var shouldShowCenterViewController = true
fileprivate let swipeAnimateDuration = 0.2
public init(centerViewController: UIViewController) {
super.init(nibName: nil, bundle: nil)
shouldShowTopViewController = false
shouldShowBottomViewController = false
shouldShowLeftViewController = false
shouldShowRightViewController = false
self.centerViewController = centerViewController
addChildViewController(centerViewController)
centerViewController.didMove(toParentViewController: self)
}
public func swiftyCamSessionDidStartRunning(_ swiftyCam: SwiftyCamViewController) {
print("Session did start running")
captureButton.buttonEnabled = true
}
public func swiftyCamSessionDidStopRunning(_ swiftyCam: SwiftyCamViewController) {
print("Session did stop running")
captureButton.buttonEnabled = false
}
public func swiftyCam(_ swiftyCam: SwiftyCamViewController, didTake photo: UIImage) {
let newVC = PhotoViewController(image: photo)
self.present(newVC, animated: true, completion: nil)
}
public func swiftyCam(_ swiftyCam: SwiftyCamViewController, didBeginRecordingVideo camera: SwiftyCamViewController.CameraSelection) {
print("Did Begin Recording")
captureButton.growButton()
hideButtons()
}
public func swiftyCam(_ swiftyCam: SwiftyCamViewController, didFinishRecordingVideo camera: SwiftyCamViewController.CameraSelection) {
print("Did finish Recording")
captureButton.shrinkButton()
showButtons()
}
public func swiftyCam(_ swiftyCam: SwiftyCamViewController, didFinishProcessVideoAt url: URL) {
let newVC = VideoViewController(videoURL: url)
self.present(newVC, animated: true, completion: nil)
}
public func swiftyCam(_ swiftyCam: SwiftyCamViewController, didFocusAtPoint point: CGPoint) {
print("Did focus at point: \(point)")
focusAnimationAt(point)
}
public func swiftyCamDidFailToConfigure(_ swiftyCam: SwiftyCamViewController) {
let message = NSLocalizedString("Unable to capture media", comment: "Alert message when something goes wrong during capture session configuration")
let alertController = UIAlertController(title: "AVCam", message: message, preferredStyle: .alert)
alertController.addAction(UIAlertAction(title: NSLocalizedString("OK", comment: "Alert OK button"), style: .cancel, handler: nil))
present(alertController, animated: true, completion: nil)
}
public func swiftyCam(_ swiftyCam: SwiftyCamViewController, didChangeZoomLevel zoom: CGFloat) {
print("Zoom level did change. Level: \(zoom)")
print(zoom)
}
public func swiftyCam(_ swiftyCam: SwiftyCamViewController, didSwitchCameras camera: SwiftyCamViewController.CameraSelection) {
print("Camera did change to \(camera.rawValue)")
print(camera)
}
public func swiftyCam(_ swiftyCam: SwiftyCamViewController, didFailToRecordVideo error: Error) {
print(error)
}
public required init?(coder aDecoder: NSCoder) {
super.init(coder: aDecoder)
}
// lazy var (rather than let) so that `self` is a valid target when the closure runs
lazy var flipCameraButton: UIButton = {
let button = UIButton()
let image = UIImage(named: "cameraSwitch")
button.setImage(image, for: .normal)
button.addTarget(self, action: #selector(cameraSwitchTapped), for: .touchUpInside)
return button
}()
lazy var captureButton: SwiftyRecordButton = {
let button = SwiftyRecordButton(frame: CGRect(x: 150, y: 572, width: 75, height: 75))
let image = UIImage(named: "focus")
button.setImage(image, for: .normal)
button.addTarget(self, action: #selector(cameraTapped), for: .touchUpInside)
return button
}()
lazy var orangeButton: UIButton = {
let button = UIButton()
let image = UIImage(named: "OrangeIcon")
button.setImage(image, for: .normal)
button.addTarget(self, action: #selector(goToOrange), for: .touchUpInside)
return button
}()
lazy var greenButton: UIButton = {
let button = UIButton()
let image = UIImage(named: "GreenIcon")
button.setImage(image, for: .normal)
button.addTarget(self, action: #selector(goToGreen), for: .touchUpInside)
return button
}()
lazy var pinkButton: UIButton = {
let button = UIButton()
let image = UIImage(named: "pinkCameraIcon")
button.setImage(image, for: .normal)
button.addTarget(self, action: #selector(goToPink), for: .touchUpInside)
return button
}()
@objc func goToOrange() {
//orangeVC.navigationController?.setNavigationBarHidden(false, animated: true)
self.containerSwipeNavigationController?.showEmbeddedView(position: .left)
}
@objc func goToGreen() {
//greenVC.navigationController?.setNavigationBarHidden(false, animated: true)
self.containerSwipeNavigationController?.showEmbeddedView(position: .right)
}
@objc func goToPink() {
self.containerSwipeNavigationController?.showEmbeddedView(position: .bottom)
}
@objc func cameraSwitchTapped() {
switchCamera()
}
@objc func cameraTapped() {
print("CAMERA TAPPED")
takePhoto()
}
// MARK: - Functions
open override func viewDidLoad() {
super.viewDidLoad()
self.cameraDelegate = self
if currentXOffset == nil && currentYOffset == nil {
view.addSubview(centerViewController.view)
centerViewController.view.isHidden = true
centerViewController.view.translatesAutoresizingMaskIntoConstraints = false
self.currentXOffset = alignCenterXConstraint(forItem: centerViewController.view, toItem: view, position: .center)
self.currentYOffset = alignCenterYConstraint(forItem: centerViewController.view, toItem: view, position: .center)
view.addConstraints([self.currentXOffset, self.currentYOffset])
view.addConstraints(sizeConstraints(forItem: centerViewController.view, toItem: view))
}
testViewdid()
assert(currentXOffset != nil && currentYOffset != nil, "both currentXOffset and currentYOffset must be set")
if mainPanGesture == nil {
mainPanGesture = UIPanGestureRecognizer(target: self, action: #selector(onPanGestureTriggered(sender:)))
view.addGestureRecognizer(mainPanGesture)
}
let frameWidth = view.frame.size.width
let frameHeight = view.frame.size.height
centerContainerOffset = CGVector(dx: currentXOffset.constant, dy: currentYOffset.constant)
topContainerOffset = CGVector(dx: centerContainerOffset.dx, dy: centerContainerOffset.dy + frameHeight)
bottomContainerOffset = CGVector(dx: centerContainerOffset.dx, dy: centerContainerOffset.dy - frameHeight)
leftContainerOffset = CGVector(dx: centerContainerOffset.dx + frameWidth, dy: centerContainerOffset.dy)
rightContainerOffset = CGVector(dx: centerContainerOffset.dx - frameWidth, dy: centerContainerOffset.dy)
activeViewController = centerViewController
}
open override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
activeViewController.beginAppearanceTransition(true, animated: animated)
}
open override func viewDidAppear(_ animated: Bool) {
super.viewDidAppear(animated)
activeViewController.endAppearanceTransition()
captureButton.delegate = self
}
open override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
activeViewController.beginAppearanceTransition(false, animated: animated)
}
open override func viewDidDisappear(_ animated: Bool) {
super.viewDidDisappear(animated)
activeViewController.endAppearanceTransition()
}
func testViewdid() {
view.addSubview(flipCameraButton)
view.addSubview(captureButton)
view.addSubview(orangeButton)
view.addSubview(greenButton)
view.addSubview(pinkButton)
shouldPrompToAppSettings = true
cameraDelegate = self
maximumVideoDuration = 10.0
shouldUseDeviceOrientation = true
allowAutoRotate = true
audioEnabled = true
captureButton.buttonEnabled = true
navigationController?.isNavigationBarHidden = true
UIApplication.shared.statusBarStyle = .lightContent
navigationController?.isNavigationBarHidden = true
//setupViews()
}
// Let UIKit handle rotation forwarding calls
open override func shouldAutomaticallyForwardRotationMethods() -> Bool {
return true
}
// MARK: - Containers
open func showEmbeddedView(position: Position) {
weak var disappearingViewController: UIViewController?
let targetOffset: CGVector
switch position {
case .center:
if !activeViewController.isEqual(centerViewController) {
disappearingViewController = activeViewController
}
activeViewController = centerViewController
targetOffset = centerContainerOffset
case .top:
activeViewController = topViewController
targetOffset = topContainerOffset
case .bottom:
activeViewController = bottomViewController
targetOffset = bottomContainerOffset
case .left:
activeViewController = leftViewController
targetOffset = leftContainerOffset
case .right:
activeViewController = rightViewController
targetOffset = rightContainerOffset
}
if !activeViewController.isEqual(centerViewController) {
disappearingViewController = centerViewController
}
currentXOffset.constant = targetOffset.dx
currentYOffset.constant = targetOffset.dy
disappearingViewController?.beginAppearanceTransition(false, animated: true)
activeViewController.beginAppearanceTransition(true, animated: true)
delegate?.swipeNavigationController(self, willShowEmbeddedViewForPosition: position)
UIView.animate(withDuration: swipeAnimateDuration, animations: {
self.view.layoutIfNeeded()
}) { (finished) in
self.delegate?.swipeNavigationController(self, didShowEmbeddedViewForPosition: position)
self.activeViewController.endAppearanceTransition()
disappearingViewController?.endAppearanceTransition()
}
}
open func isContainerActive(position: Position) -> Bool {
let targetOffset: CGVector
switch position {
case .center:
targetOffset = centerContainerOffset
case .top:
targetOffset = topContainerOffset
case .bottom:
targetOffset = bottomContainerOffset
case .left:
targetOffset = leftContainerOffset
case .right:
targetOffset = rightContainerOffset
}
return (currentXOffset.constant, currentYOffset.constant) == (targetOffset.dx, targetOffset.dy)
}
open func lock() {
self.mainPanGesture.isEnabled = false
}
open func unlock() {
self.mainPanGesture.isEnabled = true
}
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldReceiveTouch touch: UITouch) -> Bool {
return true
}
@IBAction fileprivate func onPanGestureTriggered(sender: UIPanGestureRecognizer) {
switch sender.state {
case .began:
if isContainerActive(position: .top) || isContainerActive(position: .bottom) {
activePanDirection = .vertical
} else if isContainerActive(position: .left) || isContainerActive(position: .right) {
activePanDirection = .horizontal
} else {
activePanDirection = .undefined
}
case .changed:
let translationInMainView = sender.translation(in: view)
if translationInMainView.x != 0 {
previousNonZeroDirectionChange.dx = translationInMainView.x
}
if translationInMainView.y != 0 {
previousNonZeroDirectionChange.dy = translationInMainView.y
}
switch activePanDirection {
case .undefined:
activePanDirection = fabs(translationInMainView.x) > fabs(translationInMainView.y) ? .horizontal : .vertical
case .horizontal:
let isCurrentlyShowingRightViewController = currentXOffset.constant < centerContainerOffset.dx
let isCurrentlyShowingLeftViewController = currentXOffset.constant > centerContainerOffset.dx
let minX = isCurrentlyShowingRightViewController || shouldShowRightViewController ? rightContainerOffset.dx : centerContainerOffset.dx
let maxX = isCurrentlyShowingLeftViewController || shouldShowLeftViewController ? leftContainerOffset.dx : centerContainerOffset.dx
if shouldShowCenterViewController {
currentXOffset.constant = min(max(minX, currentXOffset.constant + translationInMainView.x), maxX)
}
case .vertical:
let isCurrentlyShowingBottomViewController = currentYOffset.constant < centerContainerOffset.dy
let isCurrentlyShowingTopViewController = currentYOffset.constant > centerContainerOffset.dy
let minY = isCurrentlyShowingBottomViewController || shouldShowBottomViewController ? bottomContainerOffset.dy : centerContainerOffset.dy
let maxY = isCurrentlyShowingTopViewController || shouldShowTopViewController ? topContainerOffset.dy : centerContainerOffset.dy
if shouldShowCenterViewController {
currentYOffset.constant = min(max(minY, currentYOffset.constant + translationInMainView.y), maxY)
}
}
// reset translation for next iteration
sender.setTranslation(CGPoint.zero, in: view)
case .ended:
/*
* Handle snapping here
*/
switch activePanDirection {
case .horizontal:
if currentXOffset.constant > 0.0 {
// within range of center container
if currentXOffset.constant < (horizontalSnapThresholdFraction * view.frame.size.width) {
showEmbeddedView(position: .center)
}
// within range of left container
else if currentXOffset.constant > ((1.0 - horizontalSnapThresholdFraction) * view.frame.size.width) {
showEmbeddedView(position: .left)
}
// center region: depends on inertia direction
else {
// pulled right
if previousNonZeroDirectionChange.dx > 0.0 {
showEmbeddedView(position: .left)
}
// pulled left
else {
showEmbeddedView(position: .center)
}
}
}
else if currentXOffset.constant < 0.0 {
// within range of center container
if currentXOffset.constant > (horizontalSnapThresholdFraction * -view.frame.size.width) {
showEmbeddedView(position: .center)
}
// within range of right container
else if currentXOffset.constant < ((1.0 - horizontalSnapThresholdFraction) * -view.frame.size.width) {
showEmbeddedView(position: .right)
}
// center region: depends on inertia direction
else {
// pulled left
if previousNonZeroDirectionChange.dx < 0.0 {
showEmbeddedView(position: .right)
}
// pulled right
else {
showEmbeddedView(position: .center)
}
}
}
case .vertical:
if currentYOffset.constant > 0.0 {
if currentYOffset.constant < (verticalSnapThresholdFraction * view.frame.size.height) {
showEmbeddedView(position: .center)
}
else if currentYOffset.constant > ((1.0 - verticalSnapThresholdFraction) * view.frame.size.height) {
showEmbeddedView(position: .top)
}
else {
if previousNonZeroDirectionChange.dy > 0.0 {
showEmbeddedView(position: .top)
}
else {
showEmbeddedView(position: .center)
}
}
}
else if currentYOffset.constant < 0.0 {
if currentYOffset.constant > (verticalSnapThresholdFraction * -view.frame.size.height) {
showEmbeddedView(position: .center)
}
else if currentYOffset.constant < ((1.0 - verticalSnapThresholdFraction) * -view.frame.size.height) {
showEmbeddedView(position: .bottom)
}
else {
if previousNonZeroDirectionChange.dy < 0.0 {
showEmbeddedView(position: .bottom)
}
else {
showEmbeddedView(position: .center)
}
}
}
case .undefined:
break
}
default:
break
}
}
func addEmbeddedViewController(_ viewController: UIViewController, previousViewController: UIViewController?, position: Position) {
if viewController.isEqual(previousViewController) {
return
}
previousViewController?.beginAppearanceTransition(false, animated: false)
previousViewController?.view.removeFromSuperview()
previousViewController?.endAppearanceTransition()
previousViewController?.willMove(toParentViewController: nil)
previousViewController?.removeFromParentViewController()
addChildViewController(viewController)
view.addSubview(viewController.view)
view.sendSubview(toBack: viewController.view)
viewController.view.translatesAutoresizingMaskIntoConstraints = false
viewController.didMove(toParentViewController: self)
view.addConstraint(alignCenterXConstraint(forItem: viewController.view, toItem: centerViewController.view, position: position))
view.addConstraint(alignCenterYConstraint(forItem: viewController.view, toItem: centerViewController.view, position: position))
view.addConstraints(sizeConstraints(forItem: viewController.view, toItem: centerViewController.view))
}
func alignCenterXConstraint(forItem item: UIView, toItem: UIView, position: Position) -> NSLayoutConstraint {
let offset = position == .left ? -self.view.frame.width : position == .right ? toItem.frame.width : 0
return NSLayoutConstraint(item: item, attribute: .centerX, relatedBy: .equal, toItem: toItem, attribute: .centerX, multiplier: 1, constant: offset)
}
func alignCenterYConstraint(forItem item: UIView, toItem: UIView, position: Position) -> NSLayoutConstraint {
let offset = position == .top ? -self.view.frame.height : position == .bottom ? toItem.frame.height : 0
return NSLayoutConstraint(item: item, attribute: .centerY, relatedBy: .equal, toItem: toItem, attribute: .centerY, multiplier: 1, constant: offset)
}
func sizeConstraints(forItem item: UIView, toItem: UIView) -> [NSLayoutConstraint] {
let widthConstraint = NSLayoutConstraint(item: item, attribute: .width, relatedBy: .equal, toItem: toItem, attribute: .width, multiplier: 1, constant: 0)
let heightConstraint = NSLayoutConstraint(item: item, attribute: .height, relatedBy: .equal, toItem: toItem, attribute: .height, multiplier: 1, constant: 0)
return [widthConstraint, heightConstraint]
}
func hideButtons() {
UIView.animate(withDuration: 0.25) {
self.flipCameraButton.alpha = 0.0
}
}
func showButtons() {
UIView.animate(withDuration: 0.25) {
self.flipCameraButton.alpha = 1.0
}
}
func focusAnimationAt(_ point: CGPoint) {
let focusView = UIImageView(image: #imageLiteral(resourceName: "focus"))
focusView.center = point
focusView.alpha = 0.0
view.addSubview(focusView)
UIView.animate(withDuration: 0.25, delay: 0.0, options: .curveEaseInOut, animations: {
focusView.alpha = 1.0
focusView.transform = CGAffineTransform(scaleX: 1.25, y: 1.25)
}) { (success) in
UIView.animate(withDuration: 0.15, delay: 0.5, options: .curveEaseInOut, animations: {
focusView.alpha = 0.0
focusView.transform = CGAffineTransform(scaleX: 0.6, y: 0.6)
}) { (success) in
focusView.removeFromSuperview()
}
}
}
func toggleFlashAnimation() {
if flashEnabled == true {
//flashButton.setImage(#imageLiteral(resourceName: "flash"), for: UIControlState())
} else {
//flashButton.setImage(#imageLiteral(resourceName: "flashOutline"), for: UIControlState())
}
}
}
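For completeness, a minimal, hypothetical sketch of how this controller might be wired up (plain UIViewController instances stand in for the real orange/green/pink screens; none of this comes from the question itself):

```swift
import UIKit

// Hypothetical wiring, e.g. in the app delegate. The side view controllers
// are placeholders; substitute your own screens.
let camera = SwipeNavigationController(centerViewController: UIViewController())
camera.leftViewController = UIViewController()   // orange screen
camera.rightViewController = UIViewController()  // green screen
camera.bottomViewController = UIViewController() // pink screen

// Presenting it as the window's root lets the pan gesture own the full screen.
// window?.rootViewController = camera
```

Since the class subclasses SwiftyCamViewController and sets `cameraDelegate = self` in viewDidLoad, the capture session should start once the controller's view appears; the "Capture session is not running" message typically means the button was tapped before `swiftyCamSessionDidStartRunning` re-enabled it.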
QUESTION
Hope you are well.
I am using your SwiftyCam and it's very handy for creating any simple camera app. The only problem I'm having is that I cannot find any way to save the recorded video to the gallery.
I tried this: UISaveVideoAtPathToSavedPhotosAlbum(url, nil, nil, nil)
But it's giving me this error: Cannot convert value of type 'URL' to expected argument type 'String'
Can you suggest any solution?
...ANSWER
Answered 2018-Aug-03 at 14:20
According to the signature
func UISaveVideoAtPathToSavedPhotosAlbum(_ videoPath: String, _ completionTarget: Any?, _ completionSelector: Selector?, _ contextInfo: UnsafeMutableRawPointer?)
you need to supply the video path as a String, not a URL, so you can use url.path
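A minimal sketch of the fix, assuming `url` is the local file URL SwiftyCam hands you in its `didFinishProcessVideoAt` delegate callback:

```swift
import UIKit

// Assumes `url` is the local file URL from SwiftyCam's
// swiftyCam(_:didFinishProcessVideoAt:) delegate callback.
func saveVideoToGallery(url: URL) {
    // UISaveVideoAtPathToSavedPhotosAlbum expects a String path, not a URL,
    // so convert with url.path. The compatibility check guards against
    // formats the photo library won't accept.
    guard UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(url.path) else { return }
    UISaveVideoAtPathToSavedPhotosAlbum(url.path, nil, nil, nil)
}
```

Note that saving to the photo library also requires an NSPhotoLibraryUsageDescription (or, for write-only access on iOS 11+, NSPhotoLibraryAddUsageDescription) entry in Info.plist, or the save will fail silently or crash.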
QUESTION
I have an app (enterprise, distributed OTA) that among other things records video clips. All of a sudden, we started getting video uploads that were missing audio, and this issue now seems to be totally reproducible. I've been using the PBJVision library, which had seemed to work great, but I have also tested this with SwiftyCam (another AVFoundation-based library) with the same results. It's unclear exactly when this was introduced, but I've checked the following:
- Ensure that a NSMicrophoneUsageDescription is set in the target .plist
- Ensure that camera and microphone permissions are showing as granted in system settings
- Try disabling microphone permissions in settings (app correctly prompts the user to re-enable permissions)
- Try earlier releases of the video capture library in case of regression
- Try different video capture library
- Explicitly set audio enabled and bitrate for PBJVision/SwiftyCamera, and ensure that the session is at least reporting that it has audio in the logs (that is, the library and AVFoundation think there's an input set up, with an input stream that's being handled)
- Take a video with the system camera, and upload through the app — in this case, audio does work (it's not a problem with the hardware)
- Reset all content and permissions on a device, to make sure there isn't some kind of cached permission hanging out
- Make sure volume is not muted
The copy that gets saved to the camera roll is also silent, so it's not happening when the video gets uploaded. I also started to implement recording using just AVFoundation, but don't want to waste time if that will produce the same results. What could be causing a particular app not to record audio with video? I have looked at related questions, and none of the solutions provided address the problem I'm having here.
EDIT:
Here are the logs that appear when starting, recording, and stopping a PBJVision session:
...ANSWER
Answered 2017-Aug-28 at 16:41
It turns out that this was actually due to using another library to play a sound after starting the video recording. This apparently preempts the audio channel for the recording, which ends up being empty (see Record Audio/Video with AVCaptureSession and Playback Audio simultaneously?). It does not appear to matter whether the other sound's playback starts before or after the video recording. This is a good cautionary case about using multiple libraries that all touch the same system APIs: in some cases, like this one, they interact in undesirable ways.
In this case, the solution is to make sure that the two sources aren't using conflicting AVAudioSession categories, so they don't preempt each other.
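One hedged sketch of reconciling the two is to configure a single shared session that allows recording and playback to coexist; the category, mode, and options shown here are an assumption about what fits such an app, not taken from the answer:

```swift
import AVFoundation

// Sketch: one shared session configured for simultaneous record + playback,
// so a sound-effect library doesn't preempt the capture session's audio input.
// Call this before starting the capture session and before playing any sounds.
func configureSharedAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .videoRecording,
                            options: [.mixWithOthers, .defaultToSpeaker])
    try session.setActive(true)
}
```

The key point is that both the capture library and the playback library end up agreeing on one session configuration, rather than each activating its own category and deactivating the other's.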
QUESTION
Sorry, I have been away from programming for a while (since before Swift 3) and even back then I wasn't very good at it. Anyway, I was playing around with SwiftyCam:
https://github.com/Awalz/SwiftyCam
And I noticed these squares in the code. Does anyone know what they are? I never saw this before, and I'm not sure if it's new with Swift 3 or if I just haven't come across them. Example:
...ANSWER
Answered 2017-Apr-14 at 06:40
The little squares are actually text. They just appear in Xcode as little squares with the name of the image after them, e.g.
As you can see in your pasted code, it's just this in text form:
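The squares are Xcode's inline rendering of image literals; in the raw source file they are plain text. A small illustrative example (the resource name "focus" is just the one used elsewhere in SwiftyCam's demo):

```swift
import UIKit

// What Xcode displays as a small square thumbnail is this literal in the file:
let focusImage = #imageLiteral(resourceName: "focus")

// It behaves like UIImage(named:) but produces a non-optional UIImage:
let sameImage = UIImage(imageLiteralResourceName: "focus")
```

If the named resource is missing from the asset catalog, an image literal crashes at runtime rather than returning nil, so it's best suited to assets bundled with the app.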
QUESTION
I'll post below the two sections of code from my two .swift files:
Master .swift (partial code)
...ANSWER
Answered 2017-Mar-07 at 12:06
Change your secondary .swift to:
Community Discussions, Code Snippets contain sources that include Stack Exchange Network