[opendtv] Re: The tragedy of FireWire: Collaborative tech torpedoed by corporations
- From: Craig Birkmaier <brewmastercraig@xxxxxxxxxx>
- To: opendtv@xxxxxxxxxxxxx
- Date: Tue, 27 Jun 2017 09:17:29 -0400
On Jun 26, 2017, at 11:04 PM, Manfredi, Albert E <albert.e.manfredi@xxxxxxxxxx> wrote:
I can't understand how you can write so voluminously, only to respond to the
fact that your points were factually wrong, Craig.
They were factually correct, Bert.
You are simply grasping at straws, as you do not have a clue about why FireWire
succeeded and was ultimately replaced by newer, more capable technology.
The article looks at more than 20 years of history related to the standard,
both the positives and the negatives. Similar articles could be written about
almost any technology and the politics of adoption/replacement.
We have two items: where the article placed the blame, and USB-C. Once
again: the Ars Technica article placed most of the blame for the demise of
FireWire on Apple. It's obvious if you bothered to read the article. I even
went to the trouble of quoting it. So, here we are for the third time.
Again, there is no blame here - the technology was successful for its intended
purpose for more than a decade - maybe too successful, as I will relate later
in this post.
I didn't invent what the article said, Craig. This is something you do
*incessantly*. If you disagree with the article, then say so. But don't
INVENT what you want the article to say.
Here is what the article said, Bert - the part that tells the real story, not
the half-truth story you are promoting:
Today FireWire is fading into memory. Thunderbolt took its place at the high
end of the market. And at the volume end, USB 2.0 has given way to the much
faster USB 3.0, which is now being replaced by USB-C—a standard being led and
championed by Apple. It has smaller, simpler connectors that can be plugged
in upside down, along with twice the theoretical speed (10 Gbps) of USB 3.0
and far more versatility. It can power HDMI and DisplayPort, natively, over
an adapter, along with the full gamut of USB devices, from 1.0 up to 3.1.
1. FireWire was never a mass market interconnect standard - most commodity PC
users never needed the capabilities it offered for synchronous, high-speed data
transfer.
2. Thunderbolt supports a superset of standards that USB 3.0 and USB 3.1 are
not designed to support. Again, many of these capabilities are NOT NEEDED for
commodity PCs - the volume end of the market.
3. Size matters - Apple is now supporting USB-C for the reasons described
above, and several important reasons not discussed in the article.
In 2012, Apple introduced the Lightning connector for iOS devices, replacing
the larger 30-pin connector used on earlier generation iPods, iPhones, and
iPads. Lightning introduced the small-format, reversible connector concept that
led to the development of USB-C, introduced in 2014. This was needed to deal
with ever smaller product designs - the same issue that is causing USB-C to
become the connector of choice for new slim laptop designs.
Meanwhile, the "friendly" competition Czars in Europe, started putting pressure
on Apple and others to create a common standard for connectors on smart phones
- "one connector to rule them all." So now we have competitive industry
standards in play and political pressure to choose one standard.
It appears that Apple is "caving to the pressure," but in a manner that still
allows for significant product differentiation, as is the case with many
products moving to USB-C.
This article compares Lightning and USB-C and discusses the trade-offs:
Will we ever have a standard?
And by “standard” I mean “one cable to rule them all”. I don’t think it’s a
certainty, but there is potential. Apple has a lot invested in Lightning, but
USB C is incredibly flexible. The connector is even able to pull double duty
to support Thunderbolt connections. USB C was the single connector chosen for
all peripheral and charging activities on the newest MacBook, and as Apple is
making the claim that our iPads can be productivity devices, a standard
connector across all devices makes some sense.
The article notes that moving to USB-C does not ensure that devices will
support the full feature set of USB 3.1; many devices are moving to the USB-C
connector but only support USB 2.0. Such is the nature of a connector that
supports multiple standards, including Thunderbolt.
Equally important in terms of the demise of FireWire, the Ars Technica article
sums the situation up as follows:
Speeds across networks of all sizes are now so high that there's also little
need for something like FireWire. "The packets can arrive way before it's
needed, because it's so fast," Sirkin noted. "So you don't need to worry
about being synchronous any more." Even so, it's interesting to consider just
how close FireWire came to ubiquity—but for the short-sighted actions of the
computing and consumer electronics industry's most innovative company.
So blame Apple for the demise of FireWire if you like, but no standard remains
frozen in time.
The other thing is, I was personally involved in multiple such "holy wars,"
and always found the zealots to be truly strange. Most of them, professional
meeting attenders who didn't seem to get any perspective on the technologies,
i.e. comprehend alternatives. Most of the arguments they made were completely
overtaken by events. The usual evolution is, the simplest technology is
easiest to upgrade, so it ultimately wins. Not the fanciest. Best not to get
all Religious True Believer about these things.
In this case there are two winners:
The volume market USB standards continue to evolve, driven by innovations like
the Lightning connector and the new political imperative to have a common
standard for all mobile devices.
The same connector can now support the high performance standard - Thunderbolt
- needed for the much smaller market of high performance products that need
capabilities NOT supported (or needed) by the volume market.
NO, BERT. They are NOT merging into one. For some reason YOU cannot
grasp what "superset" means.
This is what I mean by lacking perspective, Craig. First off, even with USB
2.0, USB was no longer hampered by the strict master/slave architecture.
Either device could take on either role. So here's how this will play out.
USB-C devices negotiate. If they both have Thunderbolt 3 capability, they can
use it. If they have DisplayPort, they can use that. If they have USB 3.1,
which they all must, they can use that. The lowest common denominator between
the devices gets enabled. That's what it means when it says: "Thunderbolt 3
is a superset solution which includes USB 3.1 (10Gbps), and adds 40Gbps
Thunderbolt and DisplayPort 1.2 from a single USB-C port."
Correct. The connector is the only commonality.
What is supported varies based on the requirements for the device. And by the
way, devices that use the USB-C connector DO NOT need to support USB 3.1, as
was clearly noted in the pocketnow article.
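For what it's worth, the "lowest common denominator" negotiation described above
can be sketched in a few lines of Python. This is only an illustration of the
idea - not the actual USB Power Delivery alternate-mode protocol - and the mode
names are placeholders:

# A minimal sketch of "lowest common denominator" mode selection.
# Illustrative only; the mode names are placeholders, not real alternate-mode IDs.
MODES = ["thunderbolt3", "displayport", "usb3.1", "usb2.0"]  # most to least capable

def negotiate(host_modes, device_modes):
    """Return the most capable mode both ends support, or None."""
    for mode in MODES:
        if mode in host_modes and mode in device_modes:
            return mode
    return None

# A Thunderbolt 3 laptop and a USB-C drive that only implements USB 2.0
# still connect, but only at the drive's lower capability.
print(negotiate({"thunderbolt3", "usb3.1", "usb2.0"}, {"usb2.0"}))  # -> usb2.0

Either way, the point stands: the connector is common, but what actually runs
over it depends on what both ends implement.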
Nothing you say suggests that USB will evolve to support everything supported
by Thunderbolt - there is no reason to encumber lower performance products with
capabilities they do not need.
The more important question is whether USB will be needed at all.
Other than for power and the occasional USB memory stick, I rarely use USB to
move bits around. My mice are wireless. Most of our keyboards are wireless,
although the Logitech Create keyboard I am typing this on uses the built-in
three-conductor connector added to the iPad Pro tablets. My data is backed up
on a WiFi router with a 1 TB drive or to the cloud. Our iPhones and iPads use
the cloud to share files, photos, e-mail, and all manner of links created and
shared by the Apps we use.
The more expensive implementations will include the greater feature set. So,
USB 3.1 will be improved as USB has always done, to the point that the USB-C
connector will be supporting bit rates most likely well beyond 40 Gb/s, and
any mention of BOTH Thunderbolt AND DisplayPort will be just for the history
books.
Computers will continue to have HDMI for some time, but soon enough, what's
the point? USB 3.0 was **already** designed to support displays, migrating up
from USB's beginnings as keyboard, mouse, and printer interface. USB-C just
more so. The fact that bit rates go up and up is hardly new news, Craig.
Yup. As technology evolves, the connectors we use must evolve as well. And as
the speed of the digital links increases, it becomes possible to support/emulate
multiple standards across the same connector.
As noted above, it remains to be seen how much further USB will evolve. The
next big thing is WIRELESS:
Wireless networking (WiFi)
IoT - devices that use wireless connections to connect to local networks and
the Internet
In the meantime we can expect to see the USB-C connector support many
standards across a wide range of devices.
(But, of course, trust Apple to come up with another incompatible interface,
before this happens. It's *always* been thus.)
The "volume market" MacBook, introduced in 2015 has one connector - USB-C.
The MacBook Pro models all use Thunderbolt - two ports on the base model and
four ports on the higher end models. All of these laptops can support multiple
4K displays and raid arrays: USB-C cannot support this level of performance. I
expect iPhones and iPads to migrate to USB-C over the next two years (maybe
sooner if the EU demands it).
Apple typically leads in terms of innovation - the MagSafe power connector was
a very popular "safety" feature on Apple laptops. It is no longer needed, as
USB-C connectors can now handle the higher power requirements for laptop
charging AND the connector pulls out easily, like the MagSafe connector.
There is a reason so many companies copy Apple's innovations...
Now one last, but very important comment about FireWire, before we put this
thread to rest, as Bert seems to love typing "End of Thread."
When we created the Desktop Video industry in the '90s, the cost to become a
"Video Professional" was reduced substantially, but it was still an expensive
undertaking. A broadcast quality camera still cost $30,000 or more. A Mac-based
Desktop Editing system cost another $30,000 or more, depending on the investment
in disk arrays. These systems used "light touch" intra-frame compression
(typically Motion JPEG); the typical user upgraded storage frequently as drive
capacities and interconnect standards evolved.
Then the bottom fell out...
The CE industry developed the DV format for digital consumer camcorders.
The Canon XL-1 DV camcorder was introduced in 1997. It used intra-frame
compression, recording standard definition video to tape. And it had a
FireWire port to transfer the image data to a computer.
Suddenly the cost to shoot and edit high quality video dropped dramatically. At
the same time Sony introduced similar camcorders using their proprietary
version of FireWire - i.LINK.
A brief aside. Bert loves to criticize Apple as a walled garden manufacturer -
but it was Sony that corrupted the FireWire standard with their own Walled
Garden.
The HDV consortium was formed in 2004, and soon multiple CE vendors were
selling High Definition camcorders at a fraction of the price of their
broadcast quality cameras - e.g. Panasonic's DVCPRO and Sony's XDCAM. HDV used
inter-frame compression and FireWire to transfer the bits to a computer.
By this time most desktop computers had the performance needed to edit DV and
HDV files using applications like Adobe Premiere and Apple's iMovie. Many video
professionals, who had invested in the broadcast quality Desktop Video systems,
found they were now competing with very low cost "videographers." Many friends
took a beating when this happened; many went out of business.
But that was not the final insult. For a good part of the previous decade, I
wrote multiple stories in Videography, Television Broadcast, and later
Broadcast Engineering, predicting that the video camcorder business would be
largely replaced by digital still cameras and new mobile platforms.
In 2008 Canon introduced its first EOS digital SLR with video recording
capabilities. Like the earlier HDV camcorders, these cameras could support any
Canon system lens, allowing video producers the same flexibility to choose the
right lens for each shot, as Hollywood had done for decades with Prime lenses.
But the DSLRs were cheaper, and not fenced in by the need to shoot the 720P and
1080i HD formats that were already being challenged by new 4K cameras like Red.
These cameras rode the CMOS imaging curve and soon passed 4K, moving on to
next-gen 8K formats.
Canon's top of the line DSLRs now have sensors with more than 20 million pixels
and shoot 4K at 60P. The $699 EOS Rebel shoots all of the 1080i and 720P HD
formats. These cameras now use various forms of flash memory for recording
media and different flavors of USB for direct connection to a PC.
Other than TV station news crews and cameras at live sporting events, I rarely
see camcorders anymore. DSLRs are far more versatile and cheaper. And we all
carry HD camcorders in our pockets - e.g. smartphones. We can even edit HD
video on our phones or stream video live to the Internet (typically lower
resolution progressive formats).
Stressing about a legacy digital interconnection standard that helped to enable
HD for the masses ignores the reality that everything in our digital world is
evolving rapidly, often destroying not just technologies but entire industries
that came before...
End of Thread