Why did Firewire die? It was fast and comfy as fuck. You could use it to connect two computers, or even dozens of HDDs.

What happened?

Attached: two-firewire-ports-BJK4J6.jpg (1300x956, 305K)

Apple charged licensing fees. Everyone bailed on it.

The connectors were a pita - nothing to stop the smallest yank pulling them out

slower than usb 3

i managed to plug one upside down into an external hdd caddy and killed the port
it had usb too though

apple happened. same reason HDMI is getting phased out: sony owns it and charges people to use it. apple is too fucking greedy to let a single IP go

I didn't have anything that used it. Had plenty of shit that used usb.

it existed way before USB 3

Intel considered adding it to their 440BX chipset.
Intel didn't and it only saw limited adoption.

Attached: harumi sicp.jpg (768x576, 83K)

Apple wanted money for every port and every device, and tried to squeeze every company shipping FireWire in their products.

The licensing was such a massive asshole arrangement that in the end basically only Apple's own FW gear and Sony camcorders used it.

>using computers as very large external HDDs and optical drives
>daisychaining hard drives
>IP over FireWire
I miss it

There are standards, never really implemented, for up to 3.2 Gb/s and even for running over fibre optics

>You could use it to connect two computers,
How do I move large porn folders from one computer to another?
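
If you bring up IP over the FireWire link (Mac OS X and Windows XP both supported it), any network file-transfer tool works. A minimal Python sketch of the idea, streaming a folder as a tar archive over a TCP socket; the addresses, port, and folder names here are made up for illustration:

    # receiver.py - run on the destination machine
    import socket
    import tarfile

    HOST, PORT = "0.0.0.0", 9000  # hypothetical port; listen on the FireWire link's IP

    with socket.create_server((HOST, PORT)) as srv:
        conn, _addr = srv.accept()
        with conn, conn.makefile("rb") as stream:
            # read the tar stream straight off the socket and unpack it
            with tarfile.open(fileobj=stream, mode="r|") as tar:
                tar.extractall("incoming")

    # sender.py - run on the source machine
    import socket
    import tarfile

    with socket.create_connection(("192.168.2.1", 9000)) as sock:  # peer's IP, hypothetical
        with sock.makefile("wb") as stream:
            # pack the folder into a tar stream and push it down the socket
            with tarfile.open(fileobj=stream, mode="w|") as tar:
                tar.add("big_folder")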

>HDMI is getting phased out
what is HDMI getting replaced with?

It was prohibitively expensive for cheap stuff like mice, did not have enough advantages over USB for more expensive stuff, the connectors were a mess (large 1394a and 1394b connectors were physically non-interchangeable despite the protocol being backwards compatible, while the small connector you'd want to put in compact devices did not have power), and the power specification was a shitshow (unregulated 5-30V with no set current limit).

DP?

why not HDBaseT?

Too big for laptops

>mfw still using VGA

the same reason SCSI went away (in consumer machines)... cost.

The basic architecture required every device to have a certain amount of processing power, making the devices more expensive.
Cheap-and-works won over expensive-and-works-better... as always.

I thought one of FireWire's advantages over USB (and why in practice FW400 was faster than USB 2.0's quoted 480 Mb/s max) was that FireWire always had a dedicated controller for the heavy lifting, whereas USB piggybacked on the CPU

It's called thunderbolt now and it still has the same flaws that FireWire did.

Attached: 1556717828795.jpg (574x382, 23K)

And you don't want a dedicated controller doing heavy lifting for every $5 keyboard or Zip diskette

because the average consumer only used it for their camcorder

FireWire 800 was roughly twice as fast as USB 2.0 in practice, and you could daisy chain.
It was replaced by Thunderbolt, which is at least twice as fast as USB 3.0 and can also daisy chain.
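
Quick sanity check on those ratios, using nominal signaling rates (a rough sketch; real throughput is lower across the board):

    # Nominal signaling rates in Mbit/s (actual throughput is lower,
    # especially for USB 2.0, which rarely sustains anywhere near 480)
    rates = {
        "USB 2.0": 480,
        "FireWire 800": 800,
        "USB 3.0": 5000,
        "Thunderbolt 1": 10000,
        "Thunderbolt 3": 40000,
    }

    print(f"FW800 vs USB2: {rates['FireWire 800'] / rates['USB 2.0']:.2f}x")   # 1.67x on paper
    print(f"TB1   vs USB3: {rates['Thunderbolt 1'] / rates['USB 3.0']:.2f}x")  # 2.00x
    print(f"TB3   vs USB3: {rates['Thunderbolt 3'] / rates['USB 3.0']:.2f}x")  # 8.00x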

We didn't actually regress, we just switched protocols to support more features and better connectors.

We regressed in a way: the new standard is developed by Intel, creating some vendor lock-in. It's only now being opened up, to the point that AMD can have it in the form of USB 4.

>FireWire always has a dedicated controller for the heavy lifting whereas USB piggybacked on the CPU
Exactly. So the cost of every peripheral would be at least a bit higher.

I was under the impression that the controller only added cost on the host side, and on the peripheral side it'd be no different from requiring hardware to interface with USB.
Would FireWire have been a suitable interface for things like keyboards and mice? I've always seen USB as a peripheral port and FireWire as a high-speed data port.

Firewire was developed by Apple and had licensing fees, how is that different from Thunderbolt?

>HDMI getting phased out
I’m not seeing it
>muh licensing fees
No one gives a shit about licensing fees; the cost is basically nothing for devices made en masse.
There is a reason the Raspberry Pi, whose entire purpose was to be a cheap SBC, used HDMI instead of DP or even DVI: even with the licensing fees, it's still cheaper to implement HDMI than other video outputs, because HDMI is so mass-produced.

I never used firewire; the only thing I ever had with a firewire port was a sound card, back when onboard audio still sucked balls and we still bought soundcards.

Firewire was symmetrical, not host-device like USB. You needed a pretty complex controller in every peripheral.

All I'm saying is Thunderbolt's been around for over a decade and I've never seen a single AMD device with it. Meanwhile FireWire reached a point of ubiquity across devices from all sorts of manufacturers. If I want to use Thunderbolt right now, I'd have to buy a new motherboard and CPU. With FireWire I could simply buy an expansion card, as I should be able to with Thunderbolt.

I still use it at least once a week.
DAW pc with a TI card connected to an M-Audio NRV10.

>Would FireWire have been a suitable interface for things like keyboards and mice?

No, because it did not have an equivalent to USB Low Speed (1.5 Mbps) and Full Speed (12 Mbps); the minimum speed was 98 Mbps, which in the early 00s wasn't trivial to implement. FireWire also required a ~$0.20-0.50 licensing fee per device, which is significant when you're aiming to sell your mouse for $5 at retail.

Licensing fees, more expensive controller, several incompatible connectors, USB being "good enough", etc.
It's a good thing it never took off, though. FireWire, by virtue of its design, can dump the host's memory through the controller at any time. On one hand this is great for debugging kernel panics; on the other, it's a security nightmare.

But you CAN buy a PCIe Thunderbolt card.

Shit really? And they work with AMD hardware?

>All I'm saying is Thunderbolt's been around for over a decade and I've never seen a single AMD device with it.
Maybe because the spec has only been open for barely one year, and it really depends on board vendors not sitting on their asses?
You fucking retard, at least know the facts before you try to construct an argument.

Apparently there are some hoops to jump through, but it isn't completely impossible.

anandtech.com/show/11847/gigabyte-announces-x399-designare-ex

>Maybe because the spec has only been open for barely one year
That's my whole point, you fucking dolt

Oh, sorry, you're right. I didn't read the whole chain.

>the controller only added cost on the host side and on the peripheral side it'd be no different than requiring hardware to interface with USB.
No, it added controllers to both sides. There's no "host" and "peripheral"; every device is independent and interfaces with the others. That's why you could daisy chain them.

>Would FireWire have been a suitable interface for things like keyboards and mice?
It would just be overkill, both in terms of speed and cost. It's like implementing USB 3.0 on a floppy drive.

>I've always seen USB as a peripheral port and FireWire as a high speed data port
Yeah, that's basically it. It was used extensively in media workstations and audio production, where transferring big files was routine, and in many prosumer (and higher) camcorders and other external devices like audio recorders. And obviously the original iPod too (which is why the iPod was practically a Mac exclusive when it was first introduced).

The symmetrical protocol made firewire better when you would be connecting more than a dozen devices (a bunch of cameras, disk drives, etc.), cause it didn't tax the CPU at all, so lag was not an issue.
In its late days, they even came up with FireWire S1600 (1.6 Gb/s) and S3200 (3.2 Gb/s) to get something faster without having to use USB. Of course, with the introduction of Thunderbolt, that's a moot point, cause Thunderbolt does *all* that FireWire did, better and faster. The only downside is the Intel lock-in (which is pure cancer), and secondarily more expensive controllers and cables, but the latter is to be expected for a faster, more feature-rich protocol.

Unless you did digital video and a few other professional applications, you didn't really need the sustained throughput of Firewire. For most plebs, USB2 was "good enough."

Eventually video transitioned away from tape based media, eliminating the need to keep up with the data stream. Systems could now treat the source like a drive and re-request missed packets.

Conspiracy theory is that Jobs purposefully put a license fee on FireWire so Macs would be *the* system for people to do video on. Yes, you could get FW on Windows machines (many laptops actually had it, including my deal-of-the-week Compaq), but Macs were ready to go out of the box.

Attached: inst.jpg (1080x1349, 187K)

Originally the fee was $50K to license the FireWire name, which I believe was per company, with unlimited ports/devices. When Apple was dying, they tried going patent troll and changed the fee to $1/port. Intel told them to fuck off and removed FireWire support from their chipsets. That killed it from ever being adopted by PCs. Microsoft could have stepped in to keep Apple from making that change, but for some reason they didn't.

FW kicks ass for audio work

>Not seeing it
All of my company's notebooks only have DP and VGA.

He is correct. Many enterprise PCs are exclusively DP and VGA. Monitors with HDMI inputs are already being phased out in favor of DP- and VGA-only connections. Since you can buy a DP-to-HDMI/DVI/VGA cable, it is pointless to have HDMI on any computer.

My old GPU had one DVI, two DisplayPorts and two HDMI ports. My new GPU has three DisplayPorts and only one HDMI port. I actually have to buy adapters for legacy display hardware. In a few years there won't be HDMI on GPUs at all, I do expect them to get USB 3 though.

I mean USB type C.

>I've never seen a single AMD device with it
You might. Intel's dropped the licensing fees. It's still not free or cheap to add to a board but at least there is no license-tax now.

It wasn't an Apple standard, you idiot, it was IEEE1394.

And the reason it failed was simple:
1. More expensive than USB
2. Most PCs skipped that port; practically only Macs cared to have it
3. USB was good enough for normies' still cameras and printers; IEEE1394 was only needed for video.

(((what happened)))

Lolwut? My 290x had a Mini DisplayPort.

HDMI will stick around on GPUs; it would be really stupid for it to just disappear, especially given the horde of true 4K 120Hz TVs about to be released

... which HDMI doesn't have the capacity to actually support.

All ports will eventually disappear, just like optical drives (CD/DVD/Blu-ray) did on most computers.

HDMI 2.1 does.
It might take a new gen of GPUs to actually get native support, and that is why I say it would be really stupid for them to drop HDMI.

miniDP is not Thunderbolt, they just share a connector.

Or you could use displayport and skip all that bullshit.

HDMI 2.1 is faster than DP 1.4.
DP 1.4 can't do 4K 120Hz without chroma subsampling or DSC, while HDMI 2.1 can do 4K 120Hz even with 10bpc color and 4:4:4 chroma.
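
Rough math behind that claim (raw pixel rate only, ignoring blanking overhead; payload figures assume DP 1.4 HBR3 with 8b/10b line coding and HDMI 2.1 FRL at 48 Gb/s with 16b/18b line coding):

    # 4K 120 Hz, 10 bits per channel, RGB / 4:4:4 (three full-resolution channels)
    width, height, hz, bpc, channels = 3840, 2160, 120, 10, 3
    needed = width * height * hz * bpc * channels / 1e9  # ~29.86 Gb/s of raw pixel data

    # Usable payload after line coding
    dp14 = 32.4 * 8 / 10     # HBR3: 32.4 Gb/s raw, 8b/10b  -> 25.92 Gb/s
    hdmi21 = 48.0 * 16 / 18  # FRL:  48.0 Gb/s raw, 16b/18b -> ~42.67 Gb/s

    for name, payload in [("DP 1.4", dp14), ("HDMI 2.1", hdmi21)]:
        verdict = "fits" if payload >= needed else "needs DSC or chroma subsampling"
        print(f"{name}: {payload:.2f} Gb/s payload vs {needed:.2f} Gb/s needed -> {verdict}")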