
Is there a reason OP can't get themselves a $50 USB capture card and a $20 HDMI cable, and use OBS to capture the feed from the HDMI-out in the camera? Most decent capture cards also expose themselves as cameras to almost all applications. This is my setup, and it works perfectly. Nikon D7500 as a webcam. More professional setups use Atomos monitors with built-in NVMe drives mounted directly to the camera.

I generally find the camera manufacturers' in-house programs absolutely terrible. Nikon's webcam utility is free[1], but has significant limitations over the capture card setup. Likewise for Sony. Both have considerable resolution and framerate limits, and I'd rather feed a 4K 60 FPS stream into my meeting program and let it handle the compression than have an XGA 1024×768 15 FPS output from the camera.

[1]: https://downloadcenter.nikonimglib.com/en/products/548/Webca...



This is hugely dependent on whether the camera supports clean HDMI output - that is, without overlays. My Canon camera for example insists on showing a focus square over HDMI no matter what, and it is impossible to disable.


You can remove it by installing magic lantern. It lets me use my old 650D as a second camera.


Unfortunately there is no port of ML to my specific model. I did some porting work myself by running the camera firmware in QEMU, but to be able to run it on hardware I apparently needed some signing key that only the Magic Lantern lead dev has. By the time I was doing all of this he was busy with real world stuff so ultimately I just borrowed a friend's Nikon camera.


ML doesn't work on a lot of cameras - yet. It's quite far behind the last generation of SLRs and stays away from the flagship models.


The particular camera he's talking about, the G5 X Mark II, does support clean HDMI out. I used to use it as my webcam.


> Is there a reason OP can't get themselves a $50 USB capture card and a $20 HDMI cable, and use OBS to capture the feed?

This is how I've used my Sony camera since COVID. It works great.

I wasn't sure at first if OP was trying to do something nonstandard, because you get video to your computer with a video cable, plus a way for your computer to capture it; for me that's a Cam Link.

Honestly, I'm surprised there's a relevant manufacturer app at all. Not surprised that it costs money.

This is a bit like not having power in your home to charge your camera with and asking the manufacturer for a generator. They may have a solution, but the price will be bad.


OP wants to just use the USB cable, which makes sense for me.


USB 2.0, that bog-standard version from 2000 that is assumed to be the lowest common denominator possible for any new hardware...

Edit: 4am math correction...

480 Mbit/sec transfer; uncompressed, that's ~333,333 pixels per frame at 60 FPS, before even considering overhead. But https://en.wikipedia.org/wiki/USB_video_device_class 1.1, supported since 2005, includes Motion JPEG (low compression, with all patents probably expired given it was developed in the '90s) and MPEG-2 (also old enough to be unencumbered now).
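A quick sanity check of that arithmetic (just a sketch; it ignores USB protocol overhead entirely):

```python
# Back-of-envelope: how many uncompressed 24-bit pixels per frame
# fit through USB 2.0 at 60 fps, ignoring all protocol overhead.
USB2_BITS_PER_SEC = 480_000_000
BYTES_PER_PIXEL = 3  # 24-bit RGB
FPS = 60

pixels_per_frame = USB2_BITS_PER_SEC / 8 / BYTES_PER_PIXEL / FPS
print(f"{pixels_per_frame:,.0f} pixels per frame")  # 333,333 pixels per frame

# For comparison: XGA (1024x768) is 786,432 pixels, so even XGA
# doesn't fit raw at 60 fps -- hence the low-res, low-fps output.
print(1024 * 768)  # 786432
```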

However, if they'd used USB 3.0 at ~5 Gbps, ideally over a USB-C port, the connection would be more modern and easily able to handle even 4K video with patent-free, well-supported compression algorithms.


The camera indeed has a USB-C port, USB 3.2 Gen 1.


I like the new marketing names they finally settled on.

https://en.wikipedia.org/wiki/USB#Connector_type_quick_refer...

USB ${N}Gbps instead of the confusing old labels and operation mode classifications.

I'll assume you meant the 5 Gbps version of the link, which (5,000,000,000 / 8 / 3 / 60) can drive about 3.47M 24-bit pixels per frame at 60 fps, even raw. https://en.wikipedia.org/wiki/List_of_common_display_resolut... It looks like 4K mode would require the use of VP8 (likely no hardware support included) or H.264. The patent license issues are soon to expire (though some BS one won't until 2030, not sure how that's even possible), but that's no remedy for older models https://en.wikipedia.org/wiki/Advanced_Video_Coding#Licensin... drat.
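The same envelope calculation at 5 Gbps (again a sketch, ignoring protocol overhead):

```python
# Raw 24-bit pixels per frame through a 5 Gbps link at 60 fps.
USB3_BITS_PER_SEC = 5_000_000_000
pixels_per_frame = USB3_BITS_PER_SEC / 8 / 3 / 60
print(f"{pixels_per_frame:,.0f}")  # 3,472,222

# 1080p (2,073,600 px) fits raw; 4K UHD (3840*2160 = 8,294,400 px)
# doesn't, so 4K over this link needs on-camera compression.
print(1920 * 1080 < pixels_per_frame)  # True
print(3840 * 2160 < pixels_per_frame)  # False
```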


Why should the manufacturer raise the price of the camera for you and me just to implement something extra OP wants that they can already do through HDMI?


It is already implemented, otherwise they wouldn't be able to enable it once the subscription is active.

Why should the OP need to pay a subscription to enable a feature that is built into the camera, that is a standard feature on other cameras, and that imposes no ongoing costs on the manufacturer¹? This is an example of gouging, pure and simple.

----

[1] unless they are forcing the user to use their hosted service for streaming the webcam output, in which case there is some bandwidth and perhaps other processing cost, but that is on them for not having just implemented a standard that enables local-only recording


Also, why does it have to be a subscription in the first place? If it's a nonstandard use that requires extra software, and you want to separate those costs from users who don't need it, then at least make it a one-time payment.

Subscriptions make sense when you have ongoing costs, like significant load on servers needed to provide the service. But not for a piece of software you write once and are more or less done with (minus some small patches).


That’s the really egregious thing. I think a bunch of programmers should be able to see the merit in charging money for software. It’s a bit of a bitter pill in a product that we mentally categorize as “device” rather than “computer” but it’s at least somewhat sensible. Software costs money to make, that money has to come from customers, and getting it from the customers who use it makes sense.

But requiring a subscription is such a blatant “fuck you, we want more profit without doing any work, and you’re going to provide it.”


> Why should the OP need to pay a subscription to enable a feature that is build into the camera

Getting video into your computer through USB is _not_ built into the camera. Else why is OP downloading an app to do it?

The app is part of the implementation, and it costs money. I have no problem with the manufacturer charging separately for that. The rest of us can use a video cable to get video into our computers.


You are entirely ignoring the subscription for what should, at most, be a one-off cost.

> The app is part of the implementation

Given that other cameras can do it, there has been a standard for it since 2003, and there are F/OSS implementations for others, maybe I'm asking the wrong question and instead should have asked “why should the OP pay a subscription for their bad choice of how to implement the feature?”.


The company can charge whatever they want for this feature. Most people who can afford to use a good camera as their webcam will never use it, because they know the quality is worse and they'd rather use industry-standard HDMI.

If I asked Sony for a power generator to charge my camera's battery, they could charge me a million a month if they'd like. Hopefully that would signal to me that there are better and more standard options.


> The company can charge whatever they want for this feature.

They can. But that doesn't mean everyone is forced to be happy about it, and doesn't mean it can't be talked about so other people who might not be happy about it can use the information to choose a different camera from a different manufacturer instead of discovering the issue post-purchase.

> they'd rather use industry-standard HDMI

Or the industry standards for video-over-USB, that this manufacturer chose not to implement because they couldn't easily gouge a subscription out of it.


OP bought a camera not sold as a webcam and is trying to use it as a webcam. Fair enough, I've done the same.

A standard way of doing that is to use a video cable to get video output and plug that into a capture card on your computer. OP doesn't want to do that and would prefer that the manufacturer included webcam functionality out of the box.

Also fair enough! But if that's the requirement, buy a camera that meets that requirement, and understand that it's not a standard feature in these cameras.

I get subscription fatigue, but this is not a good hill to die on. It's getting outraged over expecting a camera to do what it wasn't designed to do, when there are already simple and standard ways of making it do that.


Requiring a separate kit made of an HDMI capture box and two USB cables (assuming the box can be powered via USB) also means Canon creates further e-waste. And that's only because they're greedy; the needed hardware is already inside. Nothing on the camera's product page https://www.canon-europe.com/cameras/powershot-g5-x-mark-ii/... gives any indication that such an external app and subscription would be required.


> That's only because they're greedy

No, it's because Canon didn't sell OP a webcam. There's no expectation for them to provide webcam software.

If someone wants an external camera that doubles as a webcam with no adapters, that's totally fine for them! They should shop with that in mind.


OP bought a camera, and the camera can be used as a webcam - but deliberately not with standard protocols. Pretending the limitation is of technical nature rather than a result of corporate greed is both delusional and harmful to consumer rights.


Nothing about this is deceptive or a violation of consumer rights. Far from it.

This is common for cameras. My Sony works in the same way. It can be used as a webcam using HDMI and a capture card. Canon clearly states this in their marketing for OP's camera.

OP apparently didn't understand this, but the solution is simple -- get an HDMI cable and a capture card.


> OP bought a camera not sold as a webcam and is trying to use it as a webcam. Fair enough, I've done the same. A standard way of doing that is…

And another standard way, supported by at least some cameras, without even a single extra charge, never mind a subscription⁰, is apparently video over USB.

> I get subscription fatigue, but this is not a good hill to die on.

No users are dying on this hill¹. OP is just stating, in an exasperated tone admittedly, what the state of affairs is with this camera. Some of us are agreeing with him that it seems off and is part of the ongoing enshittification of the software and hardware worlds. Others can use this information to help guide their choice of camera (or supplier of other equipment), or not, their choice.

----

[0] Which implies they could decide to discontinue the feature at a whim later, no matter how much the user has paid between now and then.

[1] I'll refrain² from mentioning that you are putting up quite a determined fight for the “nah, this sort of thing is fine, really” hill.

[2] Oops, I tell a lie…


Indeed, we should be glad they don‘t charge us for each picture we take …


They have already implemented it, otherwise it wouldn't work.


The app is part of the implementation. And they're apparently recouping its cost by charging separately for it.

Drop the fee and that's now baked into the camera's base price.


OP expects the camera comes with some decent convenience at that price.


OP is using a camera as a webcam that's not sold as a webcam. That's fine, I do the same with mine, but it's also fine for the manufacturer to allow for that by simply providing A/V interfaces instead of trying to account for every use case.


Canon advertises their cameras as webcams.


No, they don't. They advertise that the camera can "turn into" a webcam with the right software or through HDMI out.


That’s basically the same though, isn’t it?


That's what the marketers wanted us to think, sure.

It's like me trying to sell you a car that can "turn into" a boat with the right attachments. Notice I didn't say how much the attachments cost.


Respectfully, you're just making things up.


I resent that accusation. We should have a Zoom call on two Canon webcams to hash this out.


Pay my subscription for the next decade and we have a deal.


You're simply mistaken.


The marketing material for OP's model:

> Use the EOS Utility Webcam Beta Software (Mac and Windows) to turn your Canon camera into a high-quality webcam, or do the same using a clean HDMI output.


Marketing material changes over time and varies between models and regions. Canon customers bought their cameras because Canon advertised a set of features. I bought mine because Canon advertised that I could use it as a webcam. I don't think you're making a persuasive argument.


You still can use it as a webcam. It's right there in the marketing materials. Clean HDMI out. That lets you use it as a webcam.

Canon advertising its potential to be used as a webcam doesn't mean it's a webcam. It means you can adapt it for use as one. And you still can. The adapter is an HDMI cable or software, which may or may not be free.


Convenience is always extra


Exactly. But why does he need to buy a USB capture card and HDMI cable? He can just hire someone to come and record the videos for him. They'll also do the post-processing.

Why does he even record the videos himself? He can just hire actors to do what he wants, probably a lot better.

And what's the whole thing with buying a camera? He should just buy a studio and hire a crew to manage all that stuff.


Buying usb capture cards is a standard accessory for content creators. It's not a big deal.


Not a big deal at all.

The outrage in this thread is incredible. Buying a couple A/V adapters to adapt a non-webcam camera into a webcam is somehow seen as a terrible burden.

If someone doesn't want to do that, perhaps they should buy...a webcam. No adapters needed.

A camera comes with more power at the cost of simplicity for this use case.


This is what's called a slippery slope.

A capture card and HDMI cable together cost less than $100. Hiring someone will be at least an order of magnitude more expensive—and more so the more people you hire.


Whoosh.

That was the entire point of the comment: to point out the slippery slope in the HDMI/capture card argument in the first place.


Eh? There's no slippery slope. A capture card is standard equipment for any broadcaster.

There is an absolutely massive gulf between a one-time supplementary purchase of $100 versus a multi-million dollar studio, film crew, actors. The comment was ludicrous, sorry.


> There's no slippery slope. A capture card is standard equipment for any broadcaster.

Guess what. OP isn't a broadcaster. Neither is the commenter. And everybody else here and their moms find it ludicrous that someone has to spend another X, where X is literally any amount over 0, to do something that the camera can already clearly do. In high quality. As evidenced by the existence of the software in the first place.

> The comment was ludicrous, sorry.

Yeah no shit Sherlock. That was the whole point. Are you actually this dense in real life or are you playing it up to be a troll?


You don't even need OBS for this - capture cards show up as digital cameras in macOS


You do: capture cards introduce latency, somewhere around 30-50 ms (at least the cheaper ones), and if you're using a non-built-in mic you need to resync everything.


Indeed.


Would this approach also give you control of camera settings? I think in OP's situation, he wanted that.


How easy and slick this setup will be depends on the camera.

For example, my camera can't operate and charge over USB at the same time, so you need a supplemental power supply. And it won't autofocus continuously or keep the exposure and white balance stable unless you're recording a video. And videos can only be so long.

So I've got an HDMI-to-USB converter, a special HDMI cable, a special power brick and adaptor, a special tripod so all those cables don't pull the whole setup over, and I've got to restart video recording every 30 minutes or so, and wipe the microSD card regularly.

Your camera's probably better suited to this than mine :)


I've been using a Sony mirrorless (anything above a5100 will work) for over 6 years now; it needed a "dummy battery", and an HDMI capture card (about $25 for noname brands, or $80+ for Elgato, BlackMagic etc). It auto-focuses, doesn't write to microsd, and works flawlessly.

Even if you aren't buying Elgato, you can use Elgato's compatibility page to know which cameras work well: https://www.elgato.com/us/en/s/cam-link-camera-check


A word of warning on capture cards: I first bought a no-name off Amazon, thinking to save money. The video quality was abysmal. Artifacts everywhere.

I returned it and got an Elgato, which has worked great from day one.


Weirdly, I had the exact opposite experience. Elgato always felt laggy. I bought a no-name USB-stick-format card and it looked great (once I got my camera settings dialed in), but it would disconnect when I bumped my desk. I cracked the case open, soldered a USB cable I'd cut in half to the pads, 3D-printed a new case, and it's been rock solid for the last four years. The only problem is that once in a blue moon, when I need to use Teams, my video gets horizontally squished and I can't seem to fix it.


Same setup here, down to the brand.

For those who don't know, the dummy battery is a power cable with a battery-shaped adapter that plugs in where the battery would go to provide continuous power.


What camera do you have? Why can't it autofocus when it's not recording?

I believe you, but that's very silly.


I can force my (Canon) camera to autofocus while not recording, but usually you want to avoid that. It really hits the battery because the lens is permanently adjusting.

Most mirrorless cameras are hybrids, and you usually don't need this feature while taking stills.


Makes sense. On my Sony camera (a7iv), it does continuous autofocus in video mode. You don't need to hit the record button; just set the focus mode to AF-C (continuous autofocus) and it does its thing.

I also just tried connecting it as a webcam over USB, and it does continuous autofocus when set up like that too. I'm sure it uses more power, but the camera can power itself over the USB port while connected, so that's not a problem.


None of my stills cameras focused continuously out of the box, probably to save power (moving potentially heavy lens elements around requires energy). My Olympus mirrorless can be told to focus all the time, but it's not the default.


They -can-, they just don't, unless you specifically enable it for power reasons.


No offense, but this sounds like a terrible camera for your use case. It sounds like you know that.

My Sony that I've been using as a webcam since COVID can do that, and it was 6 years old when I bought it. Upgrade when you can!


To be fair, I also have the dummy battery + HDMI capture + desktop clamp mount + live view faff for my D7500, but once you set it up it's just... there. I don't need to fiddle with it much further. It's a bit of a cable mess but I intend to upgrade to the Z6iii together with an upgrade to a desktop (so I can have a PCIe capture card), which will cut down the number of dongles all over.


I have that setup too. I was referring to this:

> it won't autofocus continuously or keep the exposure and white balance stable unless you're recording a video

That basically defeats our setup as now they're worrying about their recording time running out in the middle of a meeting.


Sure. I can do anything. It's the principle of the thing.


The principle is to use the right tool for the job.

USB can do just about anything. Video out is one possibility. But HDMI can already do that.

It doesn't make sense to expect the manufacturer to provide a free app to make USB do something you can already do over HDMI, and for which HDMI is intended.

This article is rage bait where there's no real cause for outrage. But it's adjacent enough to "right to repair" and "subscription fatigue" that it sounds outrageous.


The right tool for the job is most certainly not HDMI.

The video feed should (depending on usecase, sure) be compressed on the device and sent over USB.

Sending uncompressed video just to be badly compressed in a capture device is most definitely not the right tool for the job.


Realtime compression in a portable device with limited processing power is going to reduce quality. It is better to transport uncompressed video and let the receiver decide how to manage it. USB-3 has adequate bandwidth for doing this. USB-C lets you switch to DisplayPort if the receiver can handle it.


The camera already does realtime compression to the sdcard. It has dedicated hardware for this. USB-2 has adequate bandwidth for compressed audio+video.

Your HDMI capture device (which is a cheap portable device with limited processing power) is probably going to do a much worse job.

Sending uncompressed video over usb is absurd.


On these cameras, HDMI is the right tool for the job. USB video quality is often poor where it's supported, and HDMI is there for video output.

These cameras are not made to be webcams. OP is using theirs as one, and that's fine; I do too. But device-side compression for USB video out, a webcam app, etc. are webcam features. They come at a cost, and many camera buyers don't need them.

For those of us using these cameras in these nonstandard ways, we can reach for HDMI, which is the right tool for this particular job.


The camera already has high-quality compression, since it needs that to store video. If latency is poor, or other reasons exist not to use it, then fine. HDMI can be a workaround, but it's still an insanely bad tool for the job.


It's a workaround for the camera not bundling all the features that it needs to be a webcam, absolutely.

The standalone cameras I've used haven't included free webcam functionality and I don't think that's outrageous, but apparently many people here who've been downvoting me disagree.

Personally, I think HDMI is great for A/V tasks that a camera doesn't support out of the box since it's a widely supported standard.


This is exactly what I do. I'm also confused by this article...


It's rage bait. People hate subscriptions, understandably so, and people without A/V experience might expect a camera not sold as a webcam to easily double as a webcam since they both can capture video.

It's just a really poor reason to be outraged at Canon (or Sony or any of the other companies whose non-webcam cameras don't seamlessly turn into webcams without some standard A/V adapters).


Canon's webcam software was, until recently, free. It was the sole reason I bought a Canon camera. This is a rug pull.


That's upsetting, but my point is the article itself is rage bait. It's not outrageous for Canon to charge for webcam software when there's HDMI video-out on the camera.


Imagine stanning for nickel-and-diming.


I can imagine supporting my right to charge my customers for the software I build for them, absolutely. And I support Canon's right to do the same.

Many of us have had the experience of clients telling us to "just" write code that they think is easy, but we know how that can go in reality.

There's already a simple solution here in HDMI. I don't see a reason to be outraged at Canon over not providing another solution that most buyers will never even use.


When I was trying to get back into photography, the fact that Sony's camera has built-in webcam capabilities played a small (but not trivial) part in choosing to invest in Sony's ecosystem. They're just great cameras overall, but I can't say it didn't play a part.

USB camera feeds work out of the box with Sony mirrorless cameras.

So ultimately, if Canon wants to play these games, let's see if the market of NEW buyers like me respond in a way that will make Canon change their minds.


> a $50 USB capture card and a $20 HDMI cable

Are there any USB-connectable capture devices that can process 4K?

Everything I see tries really hard to hide the fact that while they can input 4K, they can only produce 1920x1080.


Elgato Cam Link 4K


Oh, good to know! I didn't manage to find this one. But this one is $100 in the US, and $120 in some other places. Which is quite a bit of additional money to pay on top of your camera, which already has USB and should just provide a video stream there…

(this dongle is also USB-A, unfortunately)


There are cheaper ones but as noted above, most devices are not real 4K or poor quality and "unbranded". You should be able to convert USB-A to USB-C.


I’ve been using this with a Fuji X-T4 as a webcam for the last 2 years, and it's working great. Though for stuff like Google Meet, I usually set it to 1080p 60 fps, since that’s the max resolution most meeting software will accept anyway, and frame rate is more important than resolution for a live meeting.


At least with my camera the feed is low resolution and has the on screen overlays on it.


I mean, why invest $70 (and a lot of resources) in hardware when, in theory, you have everything you need and the software is just locked behind a paywall?


But you generally don't have everything you need. As I've mentioned most cameras' USB webcam output (if at all present) is quite bad, even via the official programs or gphoto. The 'correct' way to access video output is through their, well, video-out port (usually HDMI), which almost necessitates a capture card or monitor.


Evidently these cameras are capable of exporting high quality video via USB, if you pay 5 bucks a month. This doesn't sound like a hardware problem. It also has a control channel, unlike HDMI.


> these cameras are capable of exporting high quality video via USB

No, they are not. The USB port is (usually) USB 2.0 and the video output, even though the application might claim 1080p 30 FPS, is a 'digital upscale'[1] from XGA or 720p. That in my view is decidedly not 'high quality'. My monitor has eight times that resolution and more than four times the framerate, totalling more than a 32× increase in bandwidth, and it is from 2021.

If users want high-quality video out from their pro cameras, use a capture card or monitor. That's how it's always been. As another commenter said, this article is rage-bait because the OP has purposely chosen a decidedly poorly-supported way to use their camera's functionality instead of the industry standard.

[1]: https://www.reddit.com/r/canon/comments/1e32r51/canon_eos_we...


Did you bother to look up literally any part of your comment?

The Canon G5 X II has USB 3.1 over a USB-C port. All of what you said does not in any way shape or form apply to the topic at hand.


Without loss of generality, the USB spec doesn't matter. The overwhelming majority of dedicated cameras are not set up to supply high-quality video output through their USB ports. Mass storage definitely and MTP, maybe, but video out? No, and even if you get it, it is badly nerfed. The industry standard is HDMI capture, or writing to disks in slots—whether SD, CF Express, or even M.2 NVMe.

This entire comment section is a massive storm in a teacup by photography and AV amateurs who don't know the ins and outs of AV work flows and assume that stuff should just work the way they want it to.


No, the industry standard, as has been pointed out to you multiple times by multiple different people, is video-over-USB. Something that a camera with a USB 3.1 port is more than capable of handling.

Just because you didn't bother to do any amount of research whatsoever before going on your unhinged rants does not suddenly make them any more relevant when talking about this specific camera. We're not talking about cameras in general ("loss of generality", ok Jan), we're talking about a specific blog post from a specific person about a specific camera.

The fact that you still don't understand this shows you clearly are not responding in any type of good faith here. It's your way or the highway even if your way is wildly outdated and not relevant.


Fair enough -- though I'd take 720p from my DSLR over 1080p+ from my webcam every time; that's enough pixels for a meeting. And I also had to get a capture card for it, because the USB access was locked behind proprietary/shitty software (Sony).



