
They say elsewhere that the quality and usability of the hardware encoders did not meet what they considered reasonable for the mm^2 of die space required.


Ah, yes, I do agree on the quality – it certainly can't compete in any way with high-quality software or dedicated encoders.

But arguably it doesn't have to; many real-time applications (e.g. surveillance cameras) have local bandwidth to spare and/or don't care that much about quality, and compatibility with older viewing devices without re-encoding is a priority.


If compatibility is a priority and bandwidth isn't an issue, MJPEG is still the way to go and is by far the most common encoding I see in security cameras.


On the other hand, you could get cool stuff from the video encoding hardware, such as access to motion vectors on the cheap: https://github.com/osmaa/pinymotion
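
A minimal sketch of what that looks like with the legacy picamera stack (so a Pi 4 or earlier, where the hardware H.264 encoder exists; the class and field names are picamera's, the thresholds are made up for illustration):

    import numpy as np
    import picamera
    import picamera.array

    class MotionDetector(picamera.array.PiMotionAnalysis):
        def analyse(self, a):
            # 'a' is one record array per frame, one entry per macroblock,
            # with int8 'x'/'y' motion vectors and a uint16 'sad' field
            mag = np.sqrt(a['x'].astype(float) ** 2 + a['y'].astype(float) ** 2)
            if (mag > 60).sum() > 10:
                print('motion')

    with picamera.PiCamera(resolution=(1280, 720), framerate=30) as camera:
        # the vectors are a free by-product of H.264 encoding, so the video
        # itself can go to /dev/null if motion data is all you want
        camera.start_recording('/dev/null', format='h264',
                               motion_output=MotionDetector(camera))
        camera.wait_recording(30)
        camera.stop_recording()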

I wonder, though, whether OpenGL ES 3.1 compute shaders could be used for this purpose on it.


Oh wow, this is such a great hack!


I think nobody has used MJPEG in the last 15 years.


Well, GP just shared that it's still widely used in security cameras?

It's also the only compressed format supported by many popular webcams, e.g. Logitech's C900 series ever since they removed H.264.

Digital cinema projectors apparently also use it (well, the JPEG2000 variant) exclusively since MPEG-2 was deprecated: https://en.wikipedia.org/wiki/Digital_Cinema_Package


If bandwidth isn't much of an issue, you can just dump frames to a separate device for encoding. As an aside, I can't imagine someone using a Pi 5 just for camera duty (aside from projects that need cameras).


I was hoping to use an RPi 5 as an NVR with image detection for home automation.

The Frigate add-on for Home Assistant can publish events based on image detection for automations, and it supports RPi 3 / RPi 4 hardware acceleration.

https://docs.frigate.video/configuration/hardware_accelerati...


https://pipci.jeffgeerling.com

Honestly, I'd get a home server and run HA through Docker; it's what got me into home servers.


An ESP32-CAM might be good enough.


Maybe not only for cameras, but still also for cameras. And when the Pi is doing other things simultaneously, a video encoder that doesn't hog all the processing power becomes even more important.


Last time I tried to transcode a video from H.264 to HEVC (hardware-accelerated) on Linux, it was such a pain to get working that I eventually gave up and simply accepted the performance hit. I'm sorry, but I'm not gonna recompile ffmpeg just so that it works on my machine. Considering that most RPi users probably run a Linux-based OS, this is IMHO a sensible decision.
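
For context, the VAAPI route on a typical desktop GPU is supposed to be something along these lines (assuming a render node at /dev/dri/renderD128 and an ffmpeg built with VAAPI support; filenames are placeholders):

    ffmpeg -vaapi_device /dev/dri/renderD128 -hwaccel vaapi -hwaccel_output_format vaapi \
        -i input.mp4 -c:v hevc_vaapi -c:a copy output.mp4

Whether that actually works depends on the driver and on how the distro built ffmpeg, which is usually where it falls apart.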


Well, it's not that uncommon to use a version of ffmpeg with more features enabled for a specific purpose. For instance, my Jellyfin server uses jellyfin-ffmpeg [0] to do hardware acceleration, even on my Pi.

[0]: https://github.com/jellyfin/jellyfin-ffmpeg


The default ffmpeg I got from Raspberry Pi OS supports hardware acceleration out of the box these days, as far as I remember.
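
That would be the V4L2 mem2mem path: on a Pi 4, the stock ffmpeg exposes the hardware H.264 encoder as h264_v4l2m2m, so something like this works without a custom build (filenames and bitrate are placeholders):

    ffmpeg -i input.mp4 -c:v h264_v4l2m2m -b:v 4M -c:a copy output.mp4

The Pi 5 discussed here drops that encoder block, though, which is what this thread is about.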


For hardware acceleration, it's probably easier to use GStreamer, depending on what device you are using to decode. But then you have a whole new problem.
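
For example, a hardware decode pipeline built on the V4L2 stateful decoder element looks roughly like this (whether v4l2h264dec is actually available depends on which GStreamer plugins your distro ships, which is the "whole new problem"):

    gst-launch-1.0 filesrc location=input.mp4 ! qtdemux ! h264parse ! v4l2h264dec ! videoconvert ! autovideosink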


If it is a vanilla Broadcom SoC, then die space is not really their consideration or decision. But maybe it is customised for the RPi?

Can't find the datasheet for the BCM2712 right now.


Well, to quote Gordon Hollingworth on the original post:

"In future we’ll have to do something, but for Pi 5 we feel the hardware encode is a mm^2 too far."

Also, the Raspberry Pi Foundation and Broadcom have been working closely together on successors since the... BCM2837 in the Raspberry Pi 3, if I remember correctly? Broadcom still reserves the right to sell to anyone, but Raspberry Pi is still the primary customer for those specific chips now.



