camera: Rename libcamera-* to rpicam-* #3272

Merged 1 commit on Nov 15, 2023
@@ -59,10 +59,10 @@ Run the code on the Pico, and set the camera running:

[,bash]
----
-libcamera-hello -t 0 --qt-preview --shutter 3000
+rpicam-hello -t 0 --qt-preview --shutter 3000
----

A frame should now be generated every time that the Pico pulses the pin. Variable framerate is acceptable, and can be controlled by simply
-varying the duration between pulses. No options need to be passed to libcamera-apps to enable external trigger.
+varying the duration between pulses. No options need to be passed to rpicam-apps to enable external trigger.

NOTE: When running libcamera apps, you will need to specify a fixed shutter duration (the value does not matter). This will ensure the AGC does not try adjusting camera's shutter speed, which is controlled by the external trigger pulse.
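
To record the externally triggered frames instead of just previewing them, a command along the following lines should also work: a minimal sketch, assuming `rpicam-vid` accepts the same `--shutter` and `--qt-preview` options, with an arbitrary output filename.

[,bash]
----
# Record externally triggered frames; the fixed shutter value stops the AGC
# from adjusting an exposure time that the trigger pulse already controls.
rpicam-vid -t 0 --qt-preview --shutter 3000 -o triggered.h264
----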
@@ -40,13 +40,13 @@ exit
Start the sink running:
[,bash]
----
-libcamera-vid --frames 300 --qt-preview -o sink.h264
+rpicam-vid --frames 300 --qt-preview -o sink.h264
----

Start the source running
[,bash]
----
-libcamera-vid --frames 300 --qt-preview -o source.h264
+rpicam-vid --frames 300 --qt-preview -o source.h264
----

Frames should be synchronous. Use `--frames` to ensure the same number of frames are captured, and that the recordings are exactly the same length.
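
To double-check that both recordings contain the expected number of frames, they can be counted with `ffprobe` (a sketch, assuming FFmpeg is installed):

[,bash]
----
# Count the decoded video frames in each raw h.264 recording
ffprobe -v error -count_frames -select_streams v:0 \
    -show_entries stream=nb_read_frames -of csv=p=0 sink.h264
ffprobe -v error -count_frames -select_streams v:0 \
    -show_entries stream=nb_read_frames -of csv=p=0 source.h264
----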
@@ -75,14 +75,14 @@ On the boards that you wish to act as sinks, solder the two halves of the MAS pa
Start the sink running:
[,bash]
----
-libcamera-vid --frames 300 -o sync.h264
+rpicam-vid --frames 300 -o sync.h264
----
Allow a delay before you start the source running (see note below). Needs to be roughly > 2 seconds.

Start the source running:
[,bash]
----
-libcamera-vid --frames 299 -o sync.h264
+rpicam-vid --frames 299 -o sync.h264
----
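
If the sink and source cameras sit on two different boards, the delay can be scripted from a third machine. The sketch below assumes hypothetical hostnames `sink-pi.local` and `source-pi.local` with SSH access, and adds `-n` because no preview window is available over SSH:

[,bash]
----
# Start the sink first, wait comfortably longer than the ~2 second minimum,
# then start the source.
ssh sink-pi.local 'rpicam-vid -n --frames 300 -o sync.h264' &
sleep 3
ssh source-pi.local 'rpicam-vid -n --frames 299 -o sync.h264'
wait
----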

[NOTE]
2 changes: 1 addition & 1 deletion documentation/asciidoc/computers/camera/camera_usage.adoc
@@ -10,4 +10,4 @@ Further details on the camera modules can be found in the xref:../accessories/ca

All Raspberry Pi cameras are capable of taking high-resolution photographs, along with full HD 1080p video, and can be fully controlled programmatically. This documentation describes how to use the camera in various scenarios, and how to use the various software tools.

-Once you've xref:../accessories/camera.adoc#installing-a-raspberry-pi-camera[installed your camera module], there are various ways the cameras can be used. The simplest option is to use one of the provided camera applications, such as `libcamera-still` or `libcamera-vid`.
+Once you've xref:../accessories/camera.adoc#installing-a-raspberry-pi-camera[installed your camera module], there are various ways the cameras can be used. The simplest option is to use one of the provided camera applications, such as `rpicam-still` or `rpicam-vid`.
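
For example, a first photograph and a short video clip can be captured with commands along these lines (output filenames are arbitrary):

[,bash]
----
rpicam-still -o test.jpg           # capture a single JPEG photograph
rpicam-vid -t 10000 -o test.h264   # record 10 seconds of h.264 video
----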
12 changes: 6 additions & 6 deletions documentation/asciidoc/computers/camera/gstreamer.adoc
@@ -1,12 +1,12 @@
=== Using Gstreamer

-_Gstreamer_ is a Linux framework for reading, processing and playing multimedia files. There is a lot of information and many tutorials at the https://gstreamer.freedesktop.org/[_gstreamer_ website]. Here we show how `libcamera-vid` can be used to stream video over a network.
+_Gstreamer_ is a Linux framework for reading, processing and playing multimedia files. There is a lot of information and many tutorials at the https://gstreamer.freedesktop.org/[_gstreamer_ website]. Here we show how `rpicam-vid` can be used to stream video over a network.

-On the server we need `libcamera-vid` to output an encoded h.264 bitstream to _stdout_ and can use the _gstreamer_ `fdsrc` element to receive it. Then extra _gstreamer_ elements can send this over the network. As an example we can simply send and receive the stream on the same device over a UDP link. On the server:
+On the server we need `rpicam-vid` to output an encoded h.264 bitstream to _stdout_ and can use the _gstreamer_ `fdsrc` element to receive it. Then extra _gstreamer_ elements can send this over the network. As an example we can simply send and receive the stream on the same device over a UDP link. On the server:

[,bash]
----
-libcamera-vid -t 0 -n --inline -o - | gst-launch-1.0 fdsrc fd=0 ! udpsink host=localhost port=5000
+rpicam-vid -t 0 -n --inline -o - | gst-launch-1.0 fdsrc fd=0 ! udpsink host=localhost port=5000
----
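
To keep a local copy of the bitstream while streaming, one possible variation is to interpose the standard `tee` utility (a sketch, not required for the streaming itself):

[,bash]
----
# Stream over UDP as before, but also save the raw h.264 bitstream to a file
rpicam-vid -t 0 -n --inline -o - | tee capture.h264 | gst-launch-1.0 fdsrc fd=0 ! udpsink host=localhost port=5000
----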

For the client (type this into another console window) we can use:
@@ -22,7 +22,7 @@ To stream using the RTP protocol, on the server you could use:

[,bash]
----
-libcamera-vid -t 0 -n --inline -o - | gst-launch-1.0 fdsrc fd=0 ! h264parse ! rtph264pay ! udpsink host=localhost port=5000
+rpicam-vid -t 0 -n --inline -o - | gst-launch-1.0 fdsrc fd=0 ! h264parse ! rtph264pay ! udpsink host=localhost port=5000
----

And in the client window:
@@ -36,7 +36,7 @@ We conclude with an example that streams from one machine to another. Let us ass

[,bash]
----
-libcamera-vid -t 0 -n --inline -o - | gst-launch-1.0 fdsrc fd=0 ! h264parse ! rtph264pay ! udpsink host=192.168.0.3 port=5000
+rpicam-vid -t 0 -n --inline -o - | gst-launch-1.0 fdsrc fd=0 ! h264parse ! rtph264pay ! udpsink host=192.168.0.3 port=5000
----

If the client is not a Raspberry Pi it may have different _gstreamer_ elements available. For a Linux PC we might use:
@@ -48,7 +48,7 @@ gst-launch-1.0 udpsrc address=192.168.0.3 port=5000 caps=application/x-rtp ! rtp

==== The `libcamerasrc` element

-`libcamera` provides a `libcamerasrc` _gstreamer_ element which can be used directly instead of `libcamera-vid`. On the server you could use:
+`libcamera` provides a `libcamerasrc` _gstreamer_ element which can be used directly instead of `rpicam-vid`. On the server you could use:

[,bash]
----

This file was deleted.

@@ -1,10 +1,10 @@
=== Differences compared to _Raspicam_ Apps

-Whilst the `libcamera-apps` attempt to emulate most features of the legacy _Raspicam_ applications, there are some differences. Here we list the principal ones that users are likely to notice.
+Whilst the `rpicam-apps` attempt to emulate most features of the legacy _Raspicam_ applications, there are some differences. Here we list the principal ones that users are likely to notice.

* The use of Boost `program_options` doesn't allow multi-character short versions of options, so where these were present they have had to be dropped. The long form options are named the same, and any single character short forms are preserved.

-* `libcamera-still` and `libcamera-jpeg` do not show the capture image in the preview window.
+* `rpicam-still` and `rpicam-jpeg` do not show the capture image in the preview window.

* `libcamera` performs its own camera mode selection, so the `--mode` option is not supported. It deduces camera modes from the resolutions requested. There is still work ongoing in this area.

@@ -25,15 +25,15 @@ Whilst the `libcamera-apps` attempt to emulate most features of the legacy _Rasp

* There are some differences in the metering, exposure and AWB options. In particular the legacy apps conflate metering (by which we mean the "metering mode") and the exposure (by which we now mean the "exposure profile"). With regards to AWB, to turn it off you have to set a pair of colour gains (e.g. `--awbgains 1.0,1.0`).

-* `libcamera` has no mechanism to set the AWB into "grey world" mode, which is useful for "NOIR" camera modules. However, tuning files are supplied which switch the AWB into the correct mode, so for example, you could use `libcamera-hello --tuning-file /usr/share/libcamera/ipa/rpi/vc4/imx219_noir.json` (for Pi 4 and earlier devices) or `libcamera-hello --tuning-file /usr/share/libcamera/ipa/rpi/pisp/imx219_noir.json` (Pi 5 and later devices).
+* `libcamera` has no mechanism to set the AWB into "grey world" mode, which is useful for "NOIR" camera modules. However, tuning files are supplied which switch the AWB into the correct mode, so for example, you could use `rpicam-hello --tuning-file /usr/share/libcamera/ipa/rpi/vc4/imx219_noir.json` (for Pi 4 and earlier devices) or `rpicam-hello --tuning-file /usr/share/libcamera/ipa/rpi/pisp/imx219_noir.json` (Pi 5 and later devices).

* There is support for setting the exposure time (`--shutter`) and analogue gain (`--analoggain` or just `--gain`). There is no explicit control of the digital gain; you get this if the gain requested is larger than the analogue gain can deliver by itself.

* libcamera has no understanding of ISO, so there is no `--ISO` option. Users should calculate the gain corresponding to the ISO value required (usually a manufacturer will tell you that, for example, a gain of 1 corresponds to an ISO of 40), and use the `--gain` parameter instead.

* There is no support for setting the flicker period yet.

-* `libcamera-still` does not support burst capture. In fact, because the JPEG encoding is not multi-threaded and pipelined it would produce quite poor framerates. Instead, users are advised to consider using `libcamera-vid` in MJPEG mode instead (and `--segment 1` can be used to force each frame into a separate JPEG file).
+* `rpicam-still` does not support burst capture. In fact, because the JPEG encoding is not multi-threaded and pipelined it would produce quite poor framerates. Instead, users are advised to consider using `rpicam-vid` in MJPEG mode instead (and `--segment 1` can be used to force each frame into a separate JPEG file).

* `libcamera` uses open source drivers for all the image sensors, so the mechanism for enabling or disabling on-sensor DPC (Defective Pixel Correction) is different. The imx477 (HQ cam) driver enables on-sensor DPC by default; to disable it the user should, as root, enter

@@ -1,6 +1,6 @@
=== Known Issues

-We are aware of the following issues in `libcamera` and `libcamera-apps`.
+We are aware of the following issues in `libcamera` and `rpicam-apps`.

* On Raspberry Pi 3 (and earlier devices) the graphics hardware can only support images up to 2048x2048 pixels which places a limit on the camera images that can be resized into the preview window. In practice this means that video encoding of images larger than 2048 pixels across (which would necessarily be using a codec other than h.264) will not support, or will produce corrupted, preview images. For Raspberry Pi 4 the limit is 4096 pixels. We would recommend using the `-n` (no preview) option for the time being.
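
As an illustration of the `-n` advice above, a wide capture with the preview disabled might look like this: a sketch assuming a sensor (such as the HQ camera) that supports the requested resolution.

[,bash]
----
# Encode frames wider than 2048 pixels with MJPEG and no preview window
rpicam-vid -n -t 10000 --width 4056 --height 3040 --codec mjpeg -o wide.mjpeg
----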
@@ -1,6 +1,6 @@
=== Python Bindings for `libcamera`

-The https://github.com/raspberrypi/picamera2[Picamera2 library] is a libcamera-based replacement for Picamera, which was a Python interface to Raspberry Pi's legacy camera stack. Picamera2 presents an easy to use Python API.
+The https://github.com/raspberrypi/picamera2[Picamera2 library] is a rpicam-based replacement for Picamera, which was a Python interface to Raspberry Pi's legacy camera stack. Picamera2 presents an easy to use Python API.

Documentation about Picamera2 is available https://github.com/raspberrypi/picamera2[on Github] and in the https://datasheets.raspberrypi.com/camera/picamera2-manual.pdf[Picamera2 Manual].

23 changes: 0 additions & 23 deletions documentation/asciidoc/computers/camera/libcamera_raw.adoc

This file was deleted.

4 changes: 2 additions & 2 deletions documentation/asciidoc/computers/camera/qt.adoc
@@ -1,10 +1,10 @@
=== Using _libcamera_ and _Qt_ together

-_Qt_ is a popular application framework and GUI toolkit, and indeed _libcamera-apps_ optionally makes use of it to implement a camera preview window.
+_Qt_ is a popular application framework and GUI toolkit, and indeed _rpicam-apps_ optionally makes use of it to implement a camera preview window.

However, _Qt_ defines certain symbols as macros in the global namespace (such as `slot` and `emit`) and this causes errors when including _libcamera_ files. The problem is common to all platforms trying to use both _Qt_ and _libcamera_ and not specific to Raspberry Pi. Nonetheless we suggest that developers experiencing difficulties try the following workarounds.

-1. _libcamera_ include files, or files that include _libcamera_ files (such as _libcamera-apps_ files), should be listed before any _Qt_ header files where possible.
+1. _libcamera_ include files, or files that include _libcamera_ files (such as _rpicam-apps_ files), should be listed before any _Qt_ header files where possible.

2. If you do need to mix your Qt application files with libcamera includes, replace `signals:` with `Q_SIGNALS:`, `slots:` with `Q_SLOTS:`, `emit` with `Q_EMIT` and `foreach` with `Q_FOREACH`.
