[opendtv] Re: NAB 2016

  • From: cooleman@xxxxxx
  • To: opendtv@xxxxxxxxxxxxx
  • Date: Fri, 29 Apr 2016 11:48:36 +0200

Daniel Grimes wrote on 28-04-2016 20:55:

Regarding the AJA KONA, AJA does, in fact, have an IP card version,
Kona IP, announced at NAB:

https://www.aja.com/en/news/top-stories/452

$2500 without the SFPs.

I suppose IP-video technology is still too new for off-the-shelf,
enterprise-level network cards to take in the signal.

Dan

Haven't read it all, but from today's SMPTE email:



Hot Button Discussion
IP for Content Creators
By Michael Goldman


Much of the focus regarding media’s thrust into the world of IP-based data transmission has revolved around the issues of how to build Ethernet foundations for broadcast plants, and the various methods for delivering content efficiently to consumers using Internet Protocol (IP)-based methodologies and systems. But beyond broadcasters, content creation companies of all sizes and types are also currently in various stages of melding into the IP universe. For them, “there is a definite push to make the entire content industry a data-centric industry, and we are in that transition right now, but it’s not yet a done deal,” explains John Footen, Partner in the Information, Media and Entertainment Business Consulting Practice at Cognizant Technology Solutions. Footen, who has previously served as co-chair of SMPTE standards committees TC-32NF (Network/Facilities Architecture) and TC-34CS (Media Systems, Control, and Services), says “we are arguably at the beginning of that transition, not even at the halfway point yet.”

The reason for that, he elaborates, is that the term “content creation” casts a wide net, encapsulating the worlds of production and post-production as well as live broadcasting. As such, “a multitude of workflows” are currently at play on the landscape, with a variety of equipment coming and going. Some of this hardware is already interoperable, thanks to strides made with maturing IP video standards for content, which pair nicely with wider IT industry standards for the actual transport of data. However, some of it is not interoperable, since “a host of challenges still exist” in terms of how production and post-production workflows have traditionally operated, compared to how the IT world functions. Quite simply, Footen suggests, the rise of IP currently fits better with some workflows than with others.

“Workflows are continuing to evolve in a world where IP needs to be an interconnection method for equipment in the production environment,” Footen says. “But for production, there are so many different kinds of workflows, with the most challenging of those being live TV. The use of IP technology typically involves additional latencies that, in a live TV situation, can be problematic. We have new challenges in dealing with those kinds of things that we didn’t have with SDI [Serial Digital Interface] technology, or other such technologies from the past.

“In actual [non-live] production on a stage, companies are primarily still using some sort of physical media [to record or store content], because the bandwidth and realtime requirements of capturing content are such that connecting to, say, a cloud-based system, while it can be done, is currently challenging and expensive. That said, as costs decrease over time, we will see more of that kind of production work. And in the meantime, every organization that I am aware of is at least investigating migrating completely from physical media, at some point, to cloud-based storage capabilities for in-house media in the post-production process.”

At the simplest level, Footen explains, the involvement of the Internet Protocol in content creation simply means “evolving historical digital mechanisms we used to use, like SDI, into another means of encapsulating data and moving it around from place to place, including original capture for live production and post-production.” He specifies that this is different from the issue of data storage, for which the industry has already developed reams of highly sophisticated solutions. “And remember, once the file—the data itself—is stored, it no longer involves IP,” he clarifies. The purpose of IP is only to move the data around; once the data has been received and stored, the IP-related information goes away, and it is just a file containing whatever you are storing, he adds.
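As a rough illustration of that point, here is a minimal sketch (in Python, with the port number and file name invented for the example) of a receiver that stores incoming media payloads: the operating system's network stack consumes the IP and UDP headers, and what lands on disk is just a file, with no trace of how it travelled.

    import socket

    MEDIA_PORT = 5004            # hypothetical port, for illustration only
    OUTPUT_FILE = "capture.raw"  # hypothetical output file

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", MEDIA_PORT))

    with open(OUTPUT_FILE, "wb") as f:
        for _ in range(1000):    # take a fixed number of datagrams, then stop
            payload, sender = sock.recvfrom(65535)
            # The IP/UDP headers were stripped by the network stack;
            # 'payload' is just the essence bytes. Once written, the file
            # carries no IP-related information at all.
            f.write(payload)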

But for “moving data around”—the actual transport of data—IP is clearly more efficient, or at least will be, once the industry has fully transitioned to it. That’s because IP methods already involve “what is a more standardized way of encapsulating data for movement, and so, hardware is already available from the IT industry for this purpose,” he says.

“[Many manufacturers] are already making compatible hardware for the transmission of video or audio data, or both, plus metadata, that has been placed in an IP encapsulation,” Footen explains. “And that hardware does not care what data is being transmitted, so you can move it around without needing all the specialized hardware that was required in the SDI world, simply by using the IT industry’s technology. Until recently, we didn’t want to use IP for [content creation], because it involves sending data packets in ways that can result in out-of-order delivery, and in the early days the buffering required to reassemble those packets in order was considered unacceptable. But then things got much faster in IP [thanks to advances in processing and broadband], and so, that buffer is now small enough to fit within the relatively low latency that is required today.”
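The reordering-versus-latency trade-off Footen describes is, in essence, a reorder (or jitter) buffer: hold back a small window of packets, release them by sequence number, and accept the delay the window implies. A minimal sketch, with the packet format, window depth, and gap-skipping policy all invented for illustration:

    import heapq

    class ReorderBuffer:
        """Releases packets in sequence order despite out-of-order arrival.

        'depth' is how many packets may be held back before forced release;
        a deeper buffer tolerates more reordering but adds more latency.
        """
        def __init__(self, depth=8):
            self.depth = depth
            self.heap = []        # min-heap keyed on sequence number
            self.next_seq = None  # next sequence number owed to the consumer

        def push(self, seq, payload):
            heapq.heappush(self.heap, (seq, payload))
            if self.next_seq is None:
                self.next_seq = seq
            released = []
            # Release everything that is in order, or force a release once
            # the buffer exceeds the latency budget (a late packet is then
            # treated as a lost packet and skipped).
            while self.heap and (self.heap[0][0] == self.next_seq
                                 or len(self.heap) > self.depth):
                s, p = heapq.heappop(self.heap)
                self.next_seq = s + 1
                released.append(p)
            return released

    # Packets arriving out of order come out in order:
    buf = ReorderBuffer(depth=4)
    for seq in [0, 2, 1, 3, 5, 4]:
        for frame in buf.push(seq, f"frame-{seq}"):
            print(frame)   # prints frame-0 through frame-5, in order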

Furthermore, Footen adds, “costs of storage, costs of bandwidth, costs of software” have all been decreasing in recent years. That means even smaller post-production facilities can use IP with greater sophistication than would have been possible even a few years ago, he says.

Another factor easing the path toward interoperability, and therefore toward easier integration of IP systems into production and post-production workflows, is, as Footen puts it, that “SMPTE and other organizations have developed functioning IP video standards, which makes interoperability less of an issue and allows small facilities to have more sophisticated setups with less effort. We have a much greater level of interoperability than we saw only five to 10 years ago.”

Much of the video-over-IP momentum in terms of standards and initiatives within the industry has been built on top of SMPTE ST 2022, the family of standards describing how common digital video formats can travel over an IP network. In the past year or so, two initiatives were announced for developing an uncompressed video streaming standard that could improve interoperability across the industry. The first, utilizing SMPTE 2022, is the Alliance for IP Media Solutions (AIMS), founded by Grass Valley and supported by several major manufacturers. The competing approach, Adaptive Sample Picture Encapsulation (ASPEN), is led by Evertz with support from several other companies and is built on a proprietary Evertz methodology using MPEG-2 transport streams for video over IP. Both concepts rely on IP protocols for switching and delivery but take different approaches to video synchronization and other details. A TV Technology article examines both approaches, as does a Broadcast Bridge piece.
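Whichever camp prevails, the packets on the wire look much the same: media payloads ride inside RTP datagrams over UDP/IP (SMPTE 2022-6, which the AIMS roadmap builds on, wraps SDI this way). As a rough sketch, here is how the standard 12-byte RTP fixed header defined in RFC 3550 can be pulled out of a received datagram; a real 2022-6 receiver would also parse a media-specific payload header, which is omitted here, as are the optional CSRC list and header extension.

    import struct

    def parse_rtp_header(datagram: bytes) -> dict:
        """Decode the 12-byte RTP fixed header (RFC 3550)."""
        if len(datagram) < 12:
            raise ValueError("datagram too short for an RTP header")
        b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", datagram[:12])
        return {
            "version":      b0 >> 6,          # always 2 for RTP
            "marker":       bool(b1 & 0x80),  # e.g. flags the end of a frame
            "payload_type": b1 & 0x7F,        # identifies the media mapping
            "sequence":     seq,              # drives reordering/loss detection
            "timestamp":    timestamp,        # media clock, used for sync
            "ssrc":         ssrc,             # identifies the stream source
            "payload":      datagram[12:],    # the encapsulated media data
        }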

Meanwhile, the industry’s ongoing work on IT-based networking strategies is continuing at a rapid pace on many fronts, including work by the Joint Task Force on Networked Media (JT-NM), a partnership of the European Broadcasting Union (EBU), the Advanced Media Workflow Association (AMWA), the Video Services Forum (VSF), and SMPTE. In April, JT-NM announced its move to Phase 3 of its initiative to find the best way to achieve interoperability across the industry to further the pursuit of IT-based media facilities. The new phase will include identification of common approaches toward key elements of interoperability and how best to adopt them efficiently, among other things.

Footen emphasizes that whichever standard or initiative for video over IP prevails within the media world, one key point is that the nature of the data content being transported doesn’t really matter from an IT point of view. The standards used for data transport, he says, “are generic—they move data. Because of that, if we want to make changes to our use cases in video, we now only have to focus on what specific changes or designs would be related to our area of expertise—in this case, video. But the underlying mechanism of transport itself will now be evolved by the IT industry.”

Therefore, he points out, relevant IT standards like the Internet Protocol, the Transmission Control Protocol (TCP), and the User Datagram Protocol (UDP), among others, ensure that the foundation is already built, in a sense, for production and post-production companies that are in the midst of transitioning, or still preparing to transition, to IP infrastructure. And that means, as the industry dives further into radical upgrades to picture and sound with immersive audio, 4K, 8K, 3D, higher dynamic range, higher frame rates, and much more, each requiring periodic upgrades or alterations to workflows, engineers won’t have to spend much time on the transport part of the equation.
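That payload-agnosticism is visible right at the socket API: the transport layer accepts opaque bytes whether they represent 4K video, audio, or metadata. A minimal sketch contrasting the two usual choices (the destination address and payload are invented for illustration): UDP for latency-sensitive live streams, TCP for reliable file movement.

    import socket

    ESSENCE = b"\x00" * 1400    # stand-in for one packet's worth of media data
    DEST = ("127.0.0.1", 5004)  # hypothetical receiver, for illustration

    # UDP: no handshake, no retransmission -- minimal latency, but packets
    # may be lost or reordered. The usual carrier for live essence
    # (RTP rides on top of it).
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp.sendto(ESSENCE, DEST)   # fire and forget

    # TCP: connection-oriented and reliable -- every byte arrives, in order,
    # at the cost of retransmission delay. The usual choice for moving files.
    tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        tcp.connect(DEST)       # needs a listener at DEST
        tcp.sendall(ESSENCE)
    except OSError:
        print("no TCP listener at", DEST)  # expected when run standalone
    finally:
        tcp.close()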

“One of the good things about using IP is that the IP stack itself is not something we have to spend time specifying,” he explains. “That means our standards [developers] can spend time focused on the essence of what is the ‘media related’ part of the standard, instead of the parts that can be handled by the IT industry. I think this gives our engineers who are involved in standardization a bit more focus than they might have had in the past, where they can now [concentrate] on more important factors, such as establishing more accuracy and efficiency as they develop the next set of standards that will incorporate HDR, HFR, 4K, and such. In other words, [we] won’t need to start from scratch every time. This is a huge advantage, and [the reason why] the media industry is moving in this direction. After all, the media industry is too small, compared to the rest of the IT world, to try to spend time developing [protocols and standards] for specific products. We need to use the [basic foundation], so engineers can work on making sure the underlying media is right, without worrying about transport. They will have to work out how best to describe the data, but once they do, we can use the transport layer that IP provides to move it around.”

Still, there are specific issues that concern media professionals regarding this transition; the issue of security when relying on cloud services, storage, or distribution sits at the top of that list of concerns, which was covered in the August 2013 Newswatch. Footen notes that headlines about high-profile hacks of major studios, and the ongoing battles over online piracy and content distribution during the past couple of years, make this understandable. In the long run, however, he believes the issue will be largely moot, given that virtually the entire business world is moving in this direction, including major entities with far larger security concerns than the entertainment industry will ever have.

“People are starting to realize that various cloud service providers like Amazon and Microsoft have far more, higher quality security experts focused on securing the data in their clouds than any [entertainment company] could ever hire,” Footen says. “At this point, with organizations like the CIA and others storing very critical and sensitive—even top-secret data—in clouds, it is hard to argue that the media industry has higher security requirements than those organizations. I think, at this point, while it is not impossible that there will be problems in the cloud, the probability of those problems being worse than the problems that people would have in their own on-premises facilities is lower. In that sense, I think the question of whether to use the cloud from a business or security standpoint is kind of a dead issue. The remaining issues with the use of the cloud [involve] access to the cloud, and performance of the cloud for certain applications, such as for live broadcasting or live production, in general. Those are the applications that require tight latencies and tight performance characteristics for which the cloud is less well adapted today. But I don’t think there are many people arguing not to use it at all anymore.”

A related issue involves Service-Oriented Architecture (SOA), discussed in the December 2011 Newswatch: an architectural approach that permits businesses “to organize their technologies and people into a series of independent services that can be orchestrated into workflows,” as Footen describes it. He adds that media companies, including content companies, are already deploying SOA services in their supply chains for distribution and other kinds of services, but not yet much in production. He believes it is only a matter of time before SOA is adopted for that kind of work as well.
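As a loose sketch of the SOA idea in a media supply chain (the service names and logic here are invented stand-ins), each service is independent and replaceable behind a narrow interface, and the orchestrator knows only the sequence; in a real deployment these would be networked services rather than local functions.

    # Each "service" is independent and replaceable; the orchestrator
    # knows only the sequence, not the internals.

    def ingest(source: str) -> dict:
        return {"asset": source, "steps": ["ingest"]}

    def transcode(job: dict, profile: str) -> dict:
        job["profile"] = profile
        job["steps"].append("transcode")
        return job

    def qc_check(job: dict) -> dict:
        job["qc_passed"] = True  # a real service would inspect the media
        job["steps"].append("qc")
        return job

    def orchestrate(source: str) -> dict:
        """A workflow is just an ordered composition of services."""
        return qc_check(transcode(ingest(source), profile="h264-1080p"))

    print(orchestrate("master_0001.mxf"))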

Another dynamic that makes the post-production industry, in particular, ripe for a smooth transition into the IP universe is that it faces, in Footen’s view, fewer cultural obstacles to transforming its infrastructure than its broadcasting brethren have faced. This is because the nature of post-production differs from that of broadcasting, and is therefore more closely aligned with the natural engineering goals of IT professionals than broadcast engineers are accustomed to.

“The cultural differences between IT and broadcast engineering are substantive,” Footen says. “A good way to describe the IT versus the broadcast engineering mentality is that in IT, when a problem occurs, you want to ticket it, study it, and then figure out how to stop it from ever happening again as your first priority. In broadcast, the first priority, as a general rule, is to get the system back on air, back online—reboot it if you have to, even if you lose the information about what went wrong.” Footen concedes that this may not be an entirely fair characterization, because some engineers in both industries are able to think both ways. “But culturally, this has always been a problem in bringing broadcast and IT together. The production and post-production spaces have been file-based for a longer period of time, and their processes are less often in realtime, so they can afford a little more of the IT-style mentality. They have adopted and adapted their personnel and ways of doing things for IT-style activities more than perhaps the broadcast industry has done to date. That said, as time goes on, this is all converging, and we will see more of it in all sectors.”


----------------------------------------------------------------------
You can UNSUBSCRIBE from the OpenDTV list in two ways:

- Using the UNSUBSCRIBE command in your user configuration settings at FreeLists.org
- By sending a message to opendtv-request@xxxxxxxxxxxxx with the word "unsubscribe" in the subject line.
