Why not use the already open DisplayPort and make it better?
Noo, we need yet another standard!
Lock-in.
This must be for commercial displays where it is beneficial for installation to have power and data over a single cable.
I can’t think why I would want power delivery to my PC monitor over the display cable. It would just put extra thermal load on the GPU.
I think it’s aimed at TVs in general, not computer monitors. Many people mount their TVs to the wall, and having a single cable to run hidden in the wall would be awesome.
I wonder what the use case is for 480W though. Gigantic 80" screens generally draw something like 120W. If you’re going bigger than that, I would think the mounting/installation would require enough hardware and labor that running a normal outlet/receptacle to it would be trivial.
Headroom and safety factor. Current screens may draw 120w, but future screens may draw more, and it is much better to be drawing well under the max rated power.
Sound for an 80" screen? Not for home systems.
Projector
In-wall power cables need to be rated for it to prevent fire risk. This will need to have thick insulation or be made of a fire-resistant material.
Even in that scenario it will complicate the setup. Now your Roku will also have to power your TV? No, any sane setup will have a separate power cable for the TV.
I don’t think you’d ever have a peripheral power the tv. The use case I’m envisioning is power and data going to the panel via this single connector from a base box that handles AC conversion, as well as input (from Roku etc) and output (to soundbar etc.). Basically standardizing what some displays are already doing with proprietary connectors.
The popular use for power delivery through a display cable is charging a laptop from your monitor; it’s already very common with Thunderbolt or USB-4 monitors. But 480W seems a bit overkill for that.
Nah, it’s for powering the 1000w RTX 6090.
> It would just put extra thermal load on the GPU.

Passing power through doesn’t have to put noticeable load on the GPU. The main problem I see there is getting even more power to the GPU - Nvidia’s top cards are already at the melting point for their power connector.
> Passing power through doesn’t have to put noticeable load on the GPU.

I specifically said thermal load. Power delivery always causes heat dissipation due to I²R losses.
That’s what I meant. Compared to the power the GPU is actually using, transmission losses for a pass-through should be negligible. If you have a good way to get it to the card in the first place.
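To put rough numbers on that (a quick Python back-of-envelope; the 48V rail and the ~10 mΩ pass-through resistance are just assumed figures for illustration):

```python
# I^2 * R loss for passing 480 W through at an assumed 48 V,
# with an assumed ~10 milliohm connector/trace resistance on the pass-through path.
voltage = 48.0      # V, assumed supply voltage
power = 480.0       # W being passed through
resistance = 0.010  # ohms, assumed pass-through path resistance

current = power / voltage          # -> 10 A
loss = current ** 2 * resistance   # -> ~1 W dissipated in the pass-through
print(f"{current:.0f} A through the path, ~{loss:.1f} W of extra heat")
```

About a watt of extra heat next to a card already dumping 300-600 W, so under those assumptions the pass-through really is negligible.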
~~Why is that better than USB-C?~~
Wait… Power the other way. Whoops, I get it.
That already kinda allows this, and the actual load is pretty small.
Even a big 30-inch display is maybe 20 watts.
Well, power delivery goes several times that. Laptops are another very useful case for it. It’s nice to be able to just have a single connector for display and power.
You can do this to an extent today.
Today I learned DisplayPort 2.1 can carry 240W.
They fixed it.
Running that much power next to a data line sounds like a terrible idea for signal integrity, especially if something shorts to said data lines. It just sounds sketchy, or filled with so many asterisks that it’s functionally impossible to reach their claimed throughput.
USB standard is up to what, 40Gbps and 240W? That’s pushing the envelope already. We’ll see if this new standard can prove itself, anyways.
USB4v2 can do 80Gbps and 240W.
It can also do 120Gbps/40Gbps asymmetric.
See, IDK anything about data and power and cables but I dislike the vibe when I dock my laptop with that itty bitty USB-C connector that does power and 2x monitors and networking and peripherals.
I did buy the bonkers expensive proper cable from Lenovo, and it does generally just work, but maybe once every few weeks I have to unplug & re-plug.
More power and more data through the same cable just seems daft.
Loved automobiles with 4 wheels? Chinese cars have 13! In your face suckers!
Even an 80” TV only uses around 150W, if my research is correct. Surely this must be thinking about massive displays.
If you’re gonna release a new standard, may as well have the headroom for future growth so it’s not outdated too soon in the future.
Won’t this heat up like a motherfucker?
It depends on the voltage used. If they run 48V, which seems to be supported by USB-C EPR, then the cable only has to carry the same 5A it’s capable of today, and the heat is the same.
When it comes to their own new connector/cable they can use an even higher voltage or more/thicker conductors for power.
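A small sketch of that arithmetic (the 0.05 Ω round-trip cable resistance is just an assumed figure; real cables vary):

```python
# Cable heating scales with current squared (I^2 * R), not with the power delivered,
# so a higher voltage moves more power without more heat in the same cable.
R_CABLE = 0.05  # ohms, assumed round-trip cable resistance (illustrative only)

def cable_heat_w(power_w: float, voltage_v: float) -> float:
    current_a = power_w / voltage_v
    return current_a ** 2 * R_CABLE

print(cable_heat_w(240, 20))  # 240 W at 20 V -> 12 A -> 7.2 W of cable heat
print(cable_heat_w(240, 48))  # 240 W at 48 V ->  5 A -> 1.25 W, same current as EPR today
print(cable_heat_w(480, 48))  # 480 W at 48 V -> 10 A -> 5.0 W
```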
Not really that impressive, since it seems to be about four times as wide as USB-C.
If it’s physically more stable and reliable than HDMI, then count me in
Power delivery by itself could be a useful standard for ebike and power-station charging (battery-to-battery charging too). 480W is the most I’ve seen, but maybe USB is working on better, or 240W and more flexible/cheaper cables could work. HDMI providing 54V output would be great for charging the most common battery systems, and dual/triple BMSs for 2x and 3x ports/charging would be awesome.
We already have an alternative; it’s called the Thunderbolt port.
No, we don’t. Apple proprietary nonsense isn’t worth the metal it’s made of.
If it’s not USB-C, it’s banned in the EU. Because we stopped there and we won’t go forward.
> the GPMI cable comes in two flavors — a Type-B that seems to have a proprietary connector and a Type-C that is compatible with the USB-C standard
I actually copied this from the article to come here to the comments and have a whinge about all the different USB-C standards, and here you are explaining the reason why.
Don’t get so excited. Read my comment again.
Please don’t make stuff up.
Other stuff isn’t banned and the law already has allowances for emerging standards.
I think you could have a second connector in addition to a main USB-C.
Honestly we need higher capacity for PC screen cables. Both HDMI and DisplayPort are limiting performance because of their low 40-80 Gbps bandwidth. Their performance maxes out at 4K 120Hz with uncompressed HDR color. You can’t use 8K screens or multiple 4K screens without lowering quality.
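Rough math behind that, assuming 10-bit RGB (30 bits per pixel), no chroma subsampling or DSC, and ignoring blanking/encoding overhead:

```python
# Uncompressed video bandwidth = width * height * refresh rate * bits per pixel.
def gbps(width: int, height: int, hz: int, bits_per_pixel: int = 30) -> float:
    return width * height * hz * bits_per_pixel / 1e9

print(gbps(3840, 2160, 120))  # 4K 120 Hz -> ~29.9 Gbps, fits today's 40-80 Gbps links
print(gbps(3840, 2160, 240))  # 4K 240 Hz -> ~59.7 Gbps
print(gbps(7680, 4320, 120))  # 8K 120 Hz -> ~119.4 Gbps, needs compression on current cables
```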
Where I work, everyone has 2 4k screens. You can use two cables to connect them, you know…
And every one of them has either put their scaling up to 150% or simply set them to 2k, because you cannot read a damn thing on them.
More than 4k is a theoretical need for a veeeery small market
Graphics cards only come with one HDMI port though. The LG OLED is popular for 4K screens because it ticks all the boxes and is much cheaper than equivalent gaming monitors, but it doesn’t support DP.
And it means that you have to upgrade the graphics card just for the cable, even if it is still relatively new. The point is that we shouldn’t be held back by just a cable.
My graphics card has 3 HDMI ports and two DP ports… You cannot use all at once, but three screens are supported simultaneously…
Actually? I don’t know much about that legislation. Does it really not have room built-in for tech improvements?
It does! If there’s a good alternative it can be proposed, or at least that’s what I read here on Lemmy.