timelapse | Native macOS app for recording timelapse videos
kandi X-RAY | timelapse Summary
A little macOS app that records your screen to make a timelapse.
Top functions reviewed by kandi - BETA
- Called when the application finishes
- Create a menu
- Check if ffmpeg is found
- Set the status bar
- Loads the images
- Run the screenshot
- Take the screen capture
- Return the screen with the given mouse location
- Get the filename of the screenshot
- Start recording
- Create a new directory
- Return the path to the base directory
- Run ffmpeg
- Displays a notification with the given text
- Stop recording
- Get recording time
timelapse Key Features
timelapse Examples and Code Snippets
Community Discussions
Trending Discussions on timelapse
QUESTION
This is a continuation of a post where I could not get a working answer, so here I go again.
I am trying to display images using data returned by a MySQL query. The query returns the names of image files and a time lapse, which is a number of seconds. I need to display each image for the number of seconds returned in TimeLapse.
With help from my first post I am able to display static images, but not images from the result of the query.
The static method works and displays all the images one after the other.
...ANSWER
Answered 2021-Jun-02 at 10:36
The problem is that the PHP loop which creates the div elements is in the wrong place. You repeat the entire .outer structure multiple times, with each copy containing a single .banner-container.
You should instead loop inside the single .outer and create multiple .banner-container elements within it, like this:
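The answer's example did not survive extraction; a minimal sketch of the corrected structure, assuming hypothetical column names ImageName and TimeLapse and a mysqli result in $result, could look like:

```php
<div class="outer">
  <?php while ($row = $result->fetch_assoc()) { ?>
    <div class="banner-container" data-timelapse="<?php echo (int)$row['TimeLapse']; ?>">
      <img src="images/<?php echo htmlspecialchars($row['ImageName']); ?>" alt="">
    </div>
  <?php } ?>
</div>
```

The JavaScript that cycles the banners can then read each container's data-timelapse attribute to decide how many seconds to show it.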
QUESTION
I have a folder with thousands of jpgs at 1024x768 that I want to convert into a single video for playback.
The error I get is Error initializing output stream 73:0 -- Error while opening encoder for output stream #73:0 - maybe incorrect parameters such as bit_rate, rate, width or height Conversion failed!
Here's my input $ ffmpeg -i Timelapse/*.jpg -c:v libx264 -preset ultrafast -crf 0 output.mkv -y
What is strange is that it errors on a specific numbered output stream; it seems to be either 71:0, 72:0, or 73:0. I thought something was wrong with the file it attempts to process in the given stream, but the resolution is the same for all images (I've seen errors when it's not divisible by 2). I've deleted the 71st-73rd images in case one was somehow corrupted, but that doesn't help either. I've also ensured my libx264 is installed correctly.
Any suggestions?
Terminal output example
...ANSWER
Answered 2021-Apr-04 at 17:23
You forgot the -pattern_type glob input option. As a result the shell expanded the wildcard (*), and ffmpeg interpreted image0000.jpg as the only input and all of the following images as outputs. The command was executed as:
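A reconstruction of what effectively ran, followed by the fixed command, which quotes the pattern so ffmpeg's own glob support expands it instead of the shell:

```shell
# What effectively ran (reconstruction): the shell expanded the glob, so only
# the first file was treated as an input and every later file as an output:
#   ffmpeg -i Timelapse/image0000.jpg Timelapse/image0001.jpg ... output.mkv -y

# Fixed: quote the pattern and pass -pattern_type glob so ffmpeg expands it
ffmpeg -pattern_type glob -i 'Timelapse/*.jpg' -c:v libx264 -preset ultrafast -crf 0 output.mkv -y
```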
QUESTION
Say I have a db containing events:
...ANSWER
Answered 2021-Mar-30 at 16:23
What about this?
QUESTION
I want to make my Raspberry Pi camera timelapse boxes take long exposures at night and shorter exposures during the day. I've gathered a list of sunset and sunrise times for my location like this (Script here).
It has the times of day when the sun rises, reaches noon, and sets (or the value "sun": never_rises or never_sets for polar winter and polar summer, where I live).
My Picamera has a maximum exposure of 6 s, i.e. 6000000 microseconds, needed to get usable images at night. In daytime, I get a good exposure at around 4000 microseconds.
I want to make a script that calculates the exposure time, from max 6000000 down to min 4000, based on the current time, every minute.
I'm thinking something like: until 2 hours before sunrise it's dark, so use max exposure.
In the 2 hours from there to daylight, get a value going from max to min.
Min continues through the day.
Then go from min back to max over the two hours from sunset onwards, when it's dark again.
But my math skills are weak. How could I calculate a smooth transition from max to min, each minute, during those two hours?
If the sun set and rose at the same time every day, I could make an Excel sheet of usable values, but as the sun sets and rises at a different time every day, it gets tricky.
Scripts used to make the timelapses can be found here.
...ANSWER
Answered 2021-Feb-03 at 19:34
This takes some time to run, so there may be a better way to implement it. It assumes 1-microsecond steps for the exposure times, so if that is different, just modify the np.linspace line.
I have it starting at the max exposure, 2 hours before sunrise exposure begins to linearly decrease to the min value at sunrise. At sunset it begins to linearly increase to the max value at 2 hours past sunset. If the time between sunset and sunrise was less than 4 hours, I decided to scale the max exposure based on the time difference. There might be better outcomes using different scales (i.e. log or geometric), but that might come more from experimentation than the coding itself.
Personally, I think the timing assumptions are a little off, but that's more a question of appropriate exposure settings based on time of day than the question posed here and the script can be adjusted as such.
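The linear ramp itself can be sketched directly. A hedged example: the exposure limits and the 2-hour window come from the question, while the function and variable names are mine:

```python
MAX_EXP = 6_000_000  # microseconds, full night (from the question)
MIN_EXP = 4_000      # microseconds, daylight (from the question)
RAMP = 2 * 3600      # 2-hour transition window, in seconds

def exposure(now, sunrise, sunset):
    """Exposure in microseconds for `now` (all times in seconds since midnight)."""
    if sunrise <= now <= sunset:
        return MIN_EXP                      # daylight
    if sunrise - RAMP <= now < sunrise:     # ramp down before sunrise
        frac = (sunrise - now) / RAMP       # 1 at start of ramp, 0 at sunrise
        return MIN_EXP + frac * (MAX_EXP - MIN_EXP)
    if sunset < now <= sunset + RAMP:       # ramp up after sunset
        frac = (now - sunset) / RAMP        # 0 at sunset, 1 two hours later
        return MIN_EXP + frac * (MAX_EXP - MIN_EXP)
    return MAX_EXP                          # full night

# e.g. one hour before an 08:00 sunrise -> halfway down the ramp
print(exposure(7 * 3600, 8 * 3600, 16 * 3600))  # 3002000.0
```

Re-evaluating this once a minute against each day's sunrise and sunset times replaces the per-day Excel sheet the question mentions.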
QUESTION
I have an MP4 video file (Amundsen.mp4) that I created in Google Earth Engine, a timelapse, and a list of the dates of each image (dates.txt); not all are consecutive days.
I want to use this list of dates to timestamp each frame in the video in Python. Could someone please suggest how, or point me towards a tutorial that does this? I have not found resources on how to work with a video in this way.
...ANSWER
Answered 2020-Dec-17 at 05:48
Thanks to the comments I succeeded. This was my successful code:
QUESTION
I have a problem with CoreData... I've managed to get it to work, but it won't show my title property (which is the task name). I'm pretty new at this; I hope someone can help me out.
MainUI <--this is a pic of what the UI looks like and where the problem is.
This is the main ViewController
...ANSWER
Answered 2020-Oct-31 at 19:10
It looks like inside your CellView you're not closing the switch statement; this probably causes your Text to be displayed only when the checkbox is checked (completionState is true). Place the Text that contains your title outside the switch statement:
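The code that followed was lost in extraction; a hedged SwiftUI sketch of the idea, with property names assumed rather than taken from the original:

```swift
// Sketch only: keep the title Text outside the switch so it renders
// regardless of completionState (task, title, completionState are assumed names).
var body: some View {
    HStack {
        switch task.completionState {
        case true:
            Image(systemName: "checkmark.square")
        case false:
            Image(systemName: "square")
        }
        Text(task.title ?? "Untitled")  // always visible now
    }
}
```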
QUESTION
I wrote a simulation involving two scripts running in different consoles. The scripts send messages to each other via websocket. Messages are sent at defined intervals and contain a timestamp (currently I use datetime.utcnow). Now when I speed up the simulation, the datetime timestamps are naturally unaffected by this, which means they are out of sync with the simulation time. Is there a way to "speed up" the system time, or do I have to write my own timestamp function whose speed I can determine?
Edit: Since I can't change the system time, I wrote a script with a clock that runs at a speed I can determine and which can generate timestamps. However, I can't find a way to run this clock in the background without blocking the console. I thought I could use asyncio, but it is not working as I expected. This is the code:
...ANSWER
Answered 2020-Oct-24 at 09:06
I think this is a two-part question: first, how to track the timelapse, and second, how to work with non-blocking background tasks and asyncio.
For the first, it looks like you're just adding simtime += 1*TIMELAPSE every second. It may be cleaner here to just set system time at the start of your script in a variable, and then when you want to check the current simtime you can check system time again and subtract the starting system time, then multiply the result by your TIMELAPSE. That should be equivalent to what's going on with simtime in this script, but is much simpler.
For the second part of this question, dealing with asyncio and non-blocking, I think you should use asyncio.run_in_executor to execute the background task and then you can use run_until_complete to execute your foreground task as below. Note that I removed the async coroutine from the clock function here and just use time.sleep for the background task since the executor doesn't need it to be a coroutine for that synchronous part. You can fiddle with the sleep time in clock to verify it doesn't block on the async some_other_task coroutine.
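The first suggestion, deriving the simulated time from a stored start timestamp, can be sketched like this (the 10x speed factor is an assumption for illustration):

```python
import time

class SimClock:
    """Simulation clock that runs `speed` times faster than wall-clock time."""
    def __init__(self, speed):
        self.speed = speed
        self.start = time.monotonic()

    def now(self):
        # Elapsed wall time, scaled; computed on demand, so nothing ticks.
        return (time.monotonic() - self.start) * self.speed

clock = SimClock(speed=10)   # assumed speed-up factor
time.sleep(0.1)              # 0.1 s of real time...
print(clock.now())           # ...is about 1 simulated second
```

Because now() computes the value on demand, no background task is needed at all, which sidesteps the blocking problem; run_in_executor is only required if the clock must actively push timestamps somewhere.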
QUESTION
I used MediaMuxer and MediaCodec to generate an mp4 video.
The video is playable after I call mMediaMuxer.stop().
However, when the user quits the app before I get the chance to call the stop() method, I am left with a big mp4 file that is not playable.
Is there any way to repair this mp4 file to make it playable?
Edit: Here is one example of a corrupted mp4 file.
And I was able to repair the file using this online tool but this tool asked to upload a non-corrupted video as reference.
Here is the non-corrupted mp4 video that I used as reference. When I uploaded this video, the tool repaired my broken mp4 file.
So it is possible to repair the file but how did they do it?
If useful, here is the code I used to generate both the corrupted and non-corrupted files.
...ANSWER
Answered 2020-Oct-13 at 12:02
In general, MP4 is not a good recording format. Usually the sample table is kept in memory and written on close, so in case of a power loss or an application bug you lose the recording. Use an MPEG-2 Transport Stream or a fragmented MP4; then most of the written media remains playable. Most likely your file contains just an MP4 'ftyp' and 'mdat' atom with the audio and video interleaved. With some educated guessing and knowledge about the video stream, there is a chance to extract the audio and video. https://fix.video seems to do it.
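The top-level layout the answer describes is easy to inspect yourself: each MP4 atom starts with a 4-byte big-endian size and a 4-character type. A small sketch that walks those headers (the sample bytes are synthetic, for illustration):

```python
import struct

def top_level_atoms(data: bytes):
    """Yield (type, size) for each top-level MP4 atom in `data`."""
    offset = 0
    while offset + 8 <= len(data):
        size, kind = struct.unpack_from(">I4s", data, offset)
        if size < 8:      # sizes 0/1 mean "to end of file" / 64-bit; skipped for brevity
            break
        yield kind.decode("ascii"), size
        offset += size

# A corrupted recording typically shows only 'ftyp' and 'mdat':
fake = struct.pack(">I4s", 16, b"ftyp") + b"isom\x00\x00\x02\x00" \
     + struct.pack(">I4s", 12, b"mdat") + b"\x00" * 4
print(list(top_level_atoms(fake)))  # [('ftyp', 16), ('mdat', 12)]
```

A playable file would also show a 'moov' atom, which holds the sample table the answer mentions; its absence is what makes the interrupted recording unplayable.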
QUESTION
We have a stream that is stored in the cloud (Amazon S3) as individual H264 frames. The frames are stored as framexxxxxx.264; the numbering doesn't start from 0 but rather from some larger number, say 1000 (so, frame001000.264).
The goal is to create an mp4 clip which is either a timelapse or just faster for inspection and other checking (much faster, compressing around 3 hours of video down to less than 20 minutes). This also requires that we overlay the frame number (the filename) on the frame itself.
At first I was creating a timelapse by pulling from S3 only the keyframes (I-frames? I'm still rather new to codecs and such), overlaying the filename on them, and saving them as PNG (which probably isn't needed, but that's what I did), using the following command inside a Python script:
...ANSWER
Answered 2020-Oct-04 at 05:35
I ended up using https://github.com/DaWelter/h264decoder and not ffmpeg directly. It has a very simple interface; I just called
QUESTION
I am running into an odd issue with passing an argument that contains spaces to a Python 3 script.
For example:
...ANSWER
Answered 2020-Sep-03 at 15:08
Thanks to @puffin, the issue was with the alias, not the argument. The alias has to have its argument variable quoted. For example:
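Python's shlex module, which follows shell quoting rules, shows why the quoting matters (the command line here is a made-up example, not the original alias):

```python
import shlex

# Unquoted: the shell splits the spaced value into two separate arguments.
unquoted = shlex.split('script.py --name two words')
print(unquoted)   # ['script.py', '--name', 'two', 'words']

# Quoted: the value survives as a single argument.
quoted = shlex.split('script.py --name "two words"')
print(quoted)     # ['script.py', '--name', 'two words']
```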
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install timelapse
Make sure that you have ffmpeg installed (e.g. run brew install ffmpeg).
Download timelapse
Unzip and start the app. If you get a warning about the app being unsigned, go to System Preferences > Security & Privacy and allow the app to run.
A new icon appears in your menubar; start and stop the screen recording from there.