Diffstat:
 Documentation/media/uapi/v4l/selection-api-005.rst | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)
diff --git a/Documentation/media/uapi/v4l/selection-api-005.rst b/Documentation/media/uapi/v4l/selection-api-005.rst
index 94731a13efdb..5b47a28ac6d7 100644
--- a/Documentation/media/uapi/v4l/selection-api-005.rst
+++ b/Documentation/media/uapi/v4l/selection-api-005.rst
@@ -16,19 +16,19 @@ cropping from an image inside a memory buffer. The application could
configure a capture device to fill only a part of an image by abusing
the V4L2 API. Cropping a smaller image from a larger one is achieved by
setting the field ``bytesperline`` in struct
-:ref:`v4l2_pix_format <v4l2-pix-format>`.
+:c:type:`v4l2_pix_format`.
Introducing an image offset could be done by modifying the field ``m_userptr``
in struct
-:ref:`v4l2_buffer <v4l2-buffer>` before calling
+:c:type:`v4l2_buffer` before calling
:ref:`VIDIOC_QBUF`. Those operations should be avoided because they are not
portable (endianness), and do not work for macroblock and Bayer formats
and mmap buffers. The selection API deals with configuration of buffer
cropping/composing in a clear, intuitive and portable way. Next, with
the selection API, the concepts of the padded target and constraint
-flags are introduced. Finally, struct :ref:`v4l2_crop <v4l2-crop>`
-and struct :ref:`v4l2_cropcap <v4l2-cropcap>` have no reserved
+flags are introduced. Finally, struct :c:type:`v4l2_crop`
+and struct :c:type:`v4l2_cropcap` have no reserved
fields. Therefore there is no way to extend their functionality. The new
-struct :ref:`v4l2_selection <v4l2-selection>` provides a lot of place
+struct :c:type:`v4l2_selection` provides plenty of room
for future extensions. Driver developers are encouraged to implement
only the selection API. The former cropping API would be simulated using
the new one.
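
For context only (not part of this patch): a minimal sketch of how an
application could request cropping through the selection API instead of
abusing ``bytesperline`` or ``m_userptr``. The file descriptor ``fd`` and
the rectangle values below are hypothetical.

    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    /* Ask the driver to crop a 640x480 window at offset (64, 32). */
    static int set_capture_crop(int fd)
    {
        struct v4l2_selection sel;

        memset(&sel, 0, sizeof(sel));
        sel.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        sel.target = V4L2_SEL_TGT_CROP;  /* active cropping rectangle */
        sel.r.left = 64;                 /* hypothetical coordinates */
        sel.r.top = 32;
        sel.r.width = 640;
        sel.r.height = 480;

        /* The driver may adjust sel.r to satisfy hardware constraints. */
        return ioctl(fd, VIDIOC_S_SELECTION, &sel);
    }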