
Feature Request: EOS Webcam Utility to present Sound Interface

livestreamer
Enthusiast
# Summary

Currently, Canon EOS Webcam Utility presents a *video* interface to the Operating System, enabling a video signal to be consumed by an application (such as OBS, FaceTime, Zoom, etc.).

This feature request proposes that Canon EOS Webcam Utility *also* present an *audio* interface, providing an audio signal, synchronised with the video signal, that can be consumed by applications running on the host machine.

# Implementation

This ought to be straightforward: the Utility would source a stereo audio signal (from the camera's internal mic, or microphone-in) and present it to the operating system as a standard audio interface, which applications can then select.
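For illustration, here is a minimal sketch (Python, using the `sounddevice` library) of how any application could pick up such an interface once it exists; the device name "EOS Webcam Utility" is purely hypothetical at this point.

```python
# Hypothetical sketch: how an application could consume the proposed audio
# interface if EOS Webcam Utility exposed one. The device name below is an
# assumption, not a device that exists today.
import sounddevice as sd

def find_eos_audio_device(name_hint="EOS Webcam Utility"):
    """Return the index of the first stereo input device whose name matches the hint."""
    for index, device in enumerate(sd.query_devices()):
        if name_hint.lower() in device["name"].lower() and device["max_input_channels"] >= 2:
            return index
    return None

device_index = find_eos_audio_device()
if device_index is None:
    print("No EOS audio interface found (it does not exist yet - that is the feature request).")
else:
    # Open a stereo input stream, exactly as OBS or Zoom would via the OS audio APIs.
    with sd.InputStream(device=device_index, channels=2, samplerate=48000) as stream:
        frames, overflowed = stream.read(48000)  # read one second of audio
        print(f"Captured {len(frames)} stereo frames, overflow={overflowed}")
```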

# Use Case - Wireless Livestream Publisher

An EOS R sends synchronised 4K video + stereo WAV audio (synced in camera) over WiFi to EOS Utility, to be consumed as a single Source in OBS.

The camera operator is then able to control precisely what people hear (via the headphone monitor) AND precisely what people see (via the camera viewfinder).

This would eliminate the synchronisation / sound-drifting issues that come from trying to sync "video only" from the camera with audio from another source (trust me, it's painful, especially over WiFi / Bluetooth, but also over USB).
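To put rough numbers on that drift (illustrative figures, not measured Canon specs): even a small clock mismatch between the camera and a separate audio device adds up over a long stream.

```python
# Back-of-the-envelope drift from unsynchronised clocks. The 50 ppm figure is
# an assumed, typical crystal tolerance - not a measured Canon value.
audio_rate_nominal = 48_000          # Hz, what the separate audio device claims
clock_error_ppm = 50                 # assumed mismatch between camera and audio clocks
audio_rate_actual = audio_rate_nominal * (1 + clock_error_ppm / 1_000_000)

stream_minutes = 60
drift_seconds = stream_minutes * 60 * (audio_rate_actual / audio_rate_nominal - 1)
print(f"After {stream_minutes} min: audio leads video by {drift_seconds * 1000:.0f} ms")
# -> roughly 180 ms after an hour, easily enough to break lip sync,
#    which is why syncing audio in-camera matters.
```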

It would also be an immensely powerful tool in the hands of experimental device owners... Camera to Computer to Livestream... NO CABLES!!!!!

# Conclusion

Please put this on your roadmap, and prioritise it. I can also help fund this if you have a way for me to donate.

Waddizzle
Legend

# Summary

Once upon a time, personal computers had neither audio nor video signal sources.  Microphone inputs and headphone outputs were added to desktop computers.  People recorded and played back pop music.  You could buy headphones and listen to music as you worked on your PC!

 

# Implementation

Over time, flat screen technology developed to the point that laptops were possible, and they too were given mic inputs and headphone output jacks.  Laptop portability allowed new uses of the mic and headphone jacks.  One business use was to record audio at meetings for later documentation as meeting notes.

 

# Use Case

Eventually, laptops were given built-in video cameras.  Users could wear headphones with a built-in microphone, and make video calls to one another.  Some users would wear a hidden microphone, and use a hidden speaker in their ear.  Apps like Skype were invented, which allowed remote audio/video conferencing by letting multiple users join one video call.

 

Manufacturers began selling "webcam" devices, which plugged into USB ports.  These devices most commonly provided only video and audio input signals, with no audio output.  These integrated devices were not suitable for high-quality live streaming across the internet.

 

# Conclusion

External devices were created that captured both audio and video.  This brought the power of a video studio to the masses.  People could create their own high quality videos and edit them.  Social media apps allowed people to broadcast these video streams to social media platforms in real time.  

--------------------------------------------------------
"The right mouse button is your friend."

livestreamer
Enthusiast
# What happened next?

And for some time, creators of beautiful video and audio art were content to give away their beautiful live art in real-time to social media platforms.

They received fame, in the form of "Likes", "Views" and "Followers", and some of them even managed to make some money out of it.

Meanwhile, the social media platforms grew rich off advertising revenues, with attention-grabbing content being displayed alongside, before, and (even) during playback - trying to distract Consumers away from the live artwork. They needed to cover their mounting costs of Transcoding, Distribution and Operations... along with bureaucracy, politics and business expenses.

Until one day, some projects in the experimental "Blockchain" community collaborated to connect live video content with digital currencies (extensions of Bitcoin, if you will), such that anyone with the desire can Publish livestreaming content, safe in the knowledge that whoever is watching (the Consumer) is paying them directly, per second, in real time.

With the advertising-driven intermediaries disintermediated out of the way, Consumers could connect directly with Publishers, supporting each other with "likes that equal money" and "followers that equal paid subscriptions".

And without the drag of post-production (aka "faking it"), the Publishers grew their skills in the creation of live content... and the whole world started living much more "in the now", instead of in yesteryear (or even yesterday!).

The beauty and truth started to flow more freely in the world. Sounds and Images could be shared in perfect synchronisation from Canon's cameras.

Shared by artists, musicians, journalists, teachers, lifeloggers, videographers, dancers, cameramen, clowns, public speakers, performers... all getting paid for what they're making, by people who are watching, in real-time, so that they can pay their rent and feed themselves.

The End

(or The Beginning?)

livestreamer
Enthusiast
Upon further research, the stereo sound signal from the EOS R is already coming into EOS Utility Live View Shoot.

Should be super simple to add that to the EOS Webcam Utility as a Sound Interface for the Operating System.

Can't seem to get Webcam Utility working over WiFi though... but still trying.


@livestreamer wrote:
Upon further research, the stereo sound signal from the EOS R is already coming into EOS Utility Live View Shoot.

Should be super simple to add that to the EOS Webcam Utility as a Sound Interface for the Operating System.

Can't seem to get Webcam Utility working over WiFi though... but still trying.

Did your research include the video resolution of the Live View signal?  They are able to include audio because the low resolution of the display leaves enough spare signal bandwidth for the audio.

 

Take your pick.  High resolution video without audio.  Or, low resolution video with audio.  
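A rough back-of-the-envelope comparison (assuming an uncompressed 8-bit 4:2:2 feed at 30 fps and 16-bit stereo PCM audio; the real Live View stream is compressed, so these are upper bounds, not Canon's actual figures):

```python
# Rough bandwidth check under the stated assumptions, not measured values.
def video_mbps(width, height, fps=30, bits_per_pixel=16):
    """Uncompressed video bitrate in Mbit/s for 8-bit 4:2:2 (16 bits/pixel)."""
    return width * height * fps * bits_per_pixel / 1e6

audio_mbps = 48_000 * 16 * 2 / 1e6   # 48 kHz, 16-bit stereo PCM ~= 1.5 Mbit/s

for label, (w, h) in {"576p": (1024, 576), "1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{label}: video {video_mbps(w, h):7.0f} Mbit/s  + audio {audio_mbps:.1f} Mbit/s")

print("USB 2.0: ~480 Mbit/s theoretical, roughly 320 Mbit/s in practice")
# Note: stereo PCM is ~1.5 Mbit/s, well under 1% of the video bandwidth
# at any of these resolutions.
```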

 

Video capabilities in computers came along MUCH later than audio.  They are completely independent devices, just like a mouse and keyboard.  Computers had keyboards long before they had a mouse; they are clearly independent devices.  The audio and video inputs are the same way.

 

The hardware that converts audio to digital sits on the mic inputs.

--------------------------------------------------------
"The right mouse button is your friend."

livestreamer
Enthusiast
So, I can understand this in a USB 2.0 world, where you'd struggle to send a 720p stream over a cable. And in this case, I personally would actually take 360p with synced audio over 576p without audio.

I like music you see, and the synchronization is so vital. Even milliseconds of drift ruin the real feel of being at a live concert...

But... stop the press... am I naive to expect that with USB-C and Gigabit LANs and 5G, we might ever be able to take a full 4K signal from an EOS R, into OBS using EOS Webcam Utility, and out to the world, uncompressed...

Or is there some bottleneck somewhere that I'm not aware of?


@livestreamer wrote:
So, I can understand this in a USB 2.0 world, where you'd struggle to send a 720p stream over a cable. And in this case, I personally would actually take 360p with synced audio over 576p without audio.

I like music you see, and the synchronization is so vital. Even milliseconds of drift ruin the real feel of being at a live concert...

But... stop the press... am I naive to expect that with USB-C and Gigabit LANs and 5G, we might ever be able to take a full 4K signal from an EOS R, into OBS using EOS Webcam Utility, and out to the world, uncompressed...

Or is there some bottleneck somewhere that I'm not aware of?

Remember, the cameras only have USB 2.0 ports.  Connecting them to a USB 3.0 port will not speed them up.  There are a LOT of bottlenecks that you seem to be unaware of, have overlooked, or otherwise have not considered.

--------------------------------------------------------
"The right mouse button is your friend."

livestreamer
Enthusiast
(the aforementioned USB-C signal ought to be able to deliver at least a 320 kbps MP3-quality audio stream synchronised with the 4K uncompressed video feed.)
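A quick sanity check of that claim (nominal link speeds and an 8-bit 4:2:0 feed assumed; real-world throughput is lower):

```python
# Which pipes could actually carry an uncompressed 4K30 feed plus the audio?
# Link speeds are nominal maxima, not sustained real-world throughput.
links_mbps = {
    "USB 2.0 (camera port today)": 480,
    "Gigabit Ethernet": 1_000,
    "USB 3.2 Gen 1 (5 Gbit/s USB-C)": 5_000,
}

# 3840x2160 @ 30 fps, 8-bit 4:2:0 -> 12 bits per pixel (an optimistic case)
uncompressed_4k_mbps = 3840 * 2160 * 30 * 12 / 1e6   # ~2986 Mbit/s
mp3_audio_mbps = 0.32                                 # the 320 kbps stream mentioned above

for name, capacity in links_mbps.items():
    fits = uncompressed_4k_mbps + mp3_audio_mbps <= capacity
    print(f"{name:32s} {capacity:6.0f} Mbit/s  uncompressed 4K30 fits: {fits}")
# Only the 5 Gbit/s USB-C link has headroom for uncompressed 4K30; WiFi and
# Gigabit LAN would need the camera to compress (e.g. H.264/H.265) first.
```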

livestreamer
Enthusiast
The EOS R has a USB-C port, which is substantially faster than USB 2.0.

Furthermore, if connecting over WiFi, the USB 2.0 bottleneck ought to be irrelevant.

I really hope that the Webcam Utility is at some point going to be capable of taking a 4K stream from the EOS R over USB-C, otherwise it's going to be hamstrung, and Sony will continue to outcompete on this front.

@Waddizzle - I wonder if you have a response on this one?

As far as I am aware, USB-C is more than capable of carrying high-bandwidth data.

 

My theory is that this target of 576p is due to limitations of previous camera models such as the 60D, which only had a USB 2.0 port (only capable of transporting 720p) - and so Webcam Utility Beta is hardcoded to take a 1024x576 input.

 

I am not as knowledgeable as you about the potential bottlenecks within a Canon EOS R, but I wonder whether the camera might currently be downscaling the 4K (3840x2160) stream from the camera's sensor to a 576p (1024x576) video feed to present to Webcam Utility Beta - which might explain the camera overheating (if it's transcoding on the device - ouch!).

Thus, I wonder whether the EOS R could just present the 3840x2160 stream to Webcam Utility Beta without transcoding it on the device (or better, just present the format set in the camera for recording onto a card). The USB-C cable and host computer ought to be able to handle this data rate.

I'm stuck with this one - it feels like a limitation of the firmware, only sending 1024x576 (this is what `gphoto2` on Linux is receiving) - and this camera would be far more useful as a 4K streaming device than a 576p one.
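For anyone who wants to reproduce that observation, a minimal sketch (assuming gphoto2 and Pillow are installed and the camera is connected over USB; the output file name is chosen here, not a gphoto2 default) that grabs one Live View preview frame and reports its size:

```python
# Check what resolution the camera is actually handing out over USB.
# Assumes the gphoto2 CLI and Pillow are installed and the camera is connected.
import subprocess
from PIL import Image

subprocess.run(
    ["gphoto2", "--capture-preview", "--filename", "preview.jpg", "--force-overwrite"],
    check=True,
)
with Image.open("preview.jpg") as frame:
    print(f"Live View preview frame: {frame.width}x{frame.height}")
    # This should match the 1024x576 figure reported above.
```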

What do you think?
