raspberry-pi-camera | raspberry pi camera with 3d printed case
kandi X-RAY | raspberry-pi-camera Summary
Making a "real" Raspberry Pi mirrorless interchangeable-lens camera (MILC) with a 3D-printed case.
Top functions reviewed by kandi - BETA
- Takes a photo.
- Records a video.
- Toggles the preview.
- Runs a film simulation.
raspberry-pi-camera Key Features
raspberry-pi-camera Examples and Code Snippets
Community Discussions
Trending Discussions on raspberry-pi-camera
QUESTION
For my company, we need a device that takes pictures locally and stores them locally as well. No internet or wireless connection is available within this machine. This is an industrial setting where the machines (and thus their control components/sensors) move a lot.
I have written an algorithm that takes images as inputs and maps them to output values used for control commands. However, we now need to interface this software with the appropriate hardware (a camera plus a computer/microcontroller) to test and use this algorithm.
Online research suggests that there are plenty of industrial cameras with additional software/SDKs supplied for programmatic use on an arbitrary OS. However, because of our space and mechanical constraints for the camera (it must fit within ~100 mm in one direction, must be water resistant, etc.), it is very hard to find a camera that fits.
Because of these limitations, our current idea is to use an (industrial) smartphone, which also offers some supplementary advantages (like additional sensors, which may be used for different applications later on). The smartphone would then be connected by cable (USB-C or micro-USB, depending on the connector) to a Raspberry Pi. We are flexible in the exact hardware: for example, we can buy a Linux smartphone if required, or use a different computer/microcontroller if needed, so an answer may suggest a different smartphone and computer type if necessary.
Our currently available hardware, though, is an Android smartphone and a Raspberry Pi 2. Based on the above assumptions, my question is:
Is there some software/method available that enables the Raspberry Pi to access a smartphone's camera (and potentially other sensors) such that you can control it to capture images?
The preferred programming language is Python, but I imagine other languages may be required for such a task.
An online search reveals that usually people look to do it the other way around: They either seek to control the Pi with their smartphone, or they access the camera wirelessly.
If anything is unclear, please suggest improvements/additions and I will edit the question!
...ANSWER
Answered 2019-Jun-17 at 12:05
I propose you write a small app for this that connects to a webserver / API running on your Raspberry Pi. The app listens for commands from the webserver / API and executes what it is instructed to do (e.g. take a picture and send it).
Because there is no connectivity out of the box (as you said), you can enable USB tethering on the smartphone; by connecting the smartphone to the Raspberry Pi with a USB cable (and installing the required drivers), the two devices gain network connectivity to each other, and the app can communicate directly with the webserver / API on the Raspberry Pi.
[EDIT] You could also use a USB webcam. Since the smartphone would be connected via USB anyway, you could just use a USB webcam directly. Find one that is waterproof, or a rugged one, and communicate with the webcam directly from the Raspberry Pi instead of having to write an app in between (which would greatly increase development costs). This method is also cheaper in terms of hardware.
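A minimal sketch of the Raspberry Pi side of the proposed setup, using only Python's standard library: an HTTP API that stores image bytes POSTed by a phone app. The /upload endpoint, the port, and the in-memory storage are illustrative assumptions (a real deployment would write files to disk), and the phone-side app is not shown.

```python
# Sketch: Raspberry Pi HTTP endpoint that receives and stores images
# POSTed by the (hypothetical) smartphone app over the USB-tethered link.
from http.server import BaseHTTPRequestHandler, HTTPServer

STORED = []  # illustrative; a real deployment would write to disk


class UploadHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/upload":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        data = self.rfile.read(length)  # raw image bytes from the phone app
        STORED.append(data)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"stored")

    def log_message(self, fmt, *args):
        pass  # silence per-request logging


def serve(port=8000):
    """Run the upload API (blocking); port 8000 is an assumption."""
    HTTPServer(("127.0.0.1", port), UploadHandler).serve_forever()
```

The phone app would then POST each captured frame to http://<pi-address>:8000/upload over the tethered connection.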
QUESTION
I'm trying to make a live stream from a Raspberry Pi to Android over the internet.
I searched the web and I'm able to stream from the Raspberry Pi, and to read the stream on the phone when the phone is directly connected to the Pi.
But to make it work over the internet, I'm missing how to "pipe" this stream through another server.
So mainly I want to know how to push the stream to a server, and how to retrieve it on a mobile device in real time.
I already checked the following :
HTTP Live Streaming with the Apache web server
https://docs.peer5.com/guides/setting-up-hls-live-streaming-server-using-nginx/
...ANSWER
Answered 2018-Mar-04 at 10:47
You have to forward your port to an external port on a web server.
There are tutorials you can find by searching for these keywords:
QUESTION
Following step 6 of Adrian's guide and some others, I managed to stream 320x240 frames at 10 fps with 0.1 s latency from my Raspberry Pi to my laptop. The problem is that when I test this system in my lab (which is equipped with an antique router), it can only stream 1-2 fps with 1-1.5 seconds of latency, which is totally unacceptable for what I intend to do with those frames.
Right now, my method is simple and straightforward: the server on the Raspberry Pi captures a frame and stores it as a 320x240x3 matrix as in the guide mentioned above, then pickles that matrix and keeps pumping it over a TCP socket. The client on the laptop keeps receiving these frames, does some processing on them, and finally shows the result with imshow. My code is rather long for a post (around 200 lines) so I would rather avoid showing it if I can.
Is there any way to reduce the size of each frame's data (the pickled 320x240x3 matrix is about 230 kB), or is there a better way to transmit that data?
EDIT:
Okay guys, the exact length of the pickled array is 230563 bytes, and the payload data should be at least 230400 bytes, so the overhead is no more than 0.07% of the total package size. I think this narrows the problem down to the wireless connection quality and the method of encoding the data to bytes (pickling seems to be slow). The wireless problem can be solved by creating an ad-hoc network (sounds interesting, but I have not tried this yet) or simply by buying a better router, and the encoding problem can be solved with Aaron's solution. Hope this helps future readers :)
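The arithmetic above is easy to check: a 320x240x3 uint8 frame holds exactly 320 * 240 * 3 = 230400 bytes of pixel data, and pickling adds only a small fixed header on top. A quick sketch comparing the two sizes (the exact pickled length varies slightly with the pickle protocol):

```python
# Compare the raw payload of one frame with its pickled size.
import pickle

import numpy as np

frame = np.zeros((320, 240, 3), dtype=np.uint8)
raw = frame.tobytes()          # exactly the pixel payload
pickled = pickle.dumps(frame)  # payload plus pickle's framing overhead

print(len(raw))      # 230400
print(len(pickled))  # slightly larger, due to the pickle header
```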
...ANSWER
Answered 2018-Apr-12 at 20:34

tl;dr: struct is actually slow. Instead of pickle, use np.ndarray.tobytes() combined with np.frombuffer() to eliminate overhead.

I'm not well versed in opencv, which is probably the best answer, but a drop-in approach to speeding up the transfer could be to use struct to pack and unpack the data sent over the network instead of pickle.

Here's an example of sending a numpy array of known dimensions over a socket using struct:
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install raspberry-pi-camera
You can use raspberry-pi-camera like any standard Python library. You will need to make sure that you have a development environment consisting of a Python distribution including header files, a compiler, pip, and git installed. Make sure that your pip, setuptools, and wheel are up to date. When using pip it is generally recommended to install packages in a virtual environment to avoid changes to the system.