
Interesting, but this misses perhaps the most embarrassing part: They're using Avid and not FCP.

I also don't buy the author's rationale for remote editing; it's oddly archaic: "high-end video production is quite storage-intensive, which is why your favorite YouTuber constantly talks about their editing rigs and network-attached storage. By putting this stuff offsite, they can put all this data on a real server."

Storage is cheap now, and desktop computers are more than powerful enough for any video editing. Any supposed advantage of remote "real servers" is going to be squandered by having to send everything over the Internet. The primary benefit of remote editing (and the much-hyped "camera to cloud") is fast turnaround, which you need for stuff like reality TV and news. But a dramatic series like Severance?

It is pretty baffling that Apple would create a PR vehicle that impugns its products like this. It would be better to say nothing. After Apple acquired Shake, they splashed Lord of the Rings, King Kong, and other major tentpoles on the Apple homepage at every opportunity... of course not mentioning that Weta was rendering those movies on hundreds of Linux servers instead of Macs. But at least Shake was the same product across all platforms, and it really was the primary effects tool on all those movies.

"they do not mention the use of Jump Desktop, which seems like a missed opportunity to promote a small-scale Mac developer. C’mon Apple, do better.)

Oh boy, this is just a minor infraction in Apple's history of disrespect toward developers. They do this, and worse, to major development partners too. I'm not going to name names, but after one such partner funded the acquisition of material on its own equipment and that material was used in a major product keynote... Apple not only neglected to credit or even mention that partner, but proceeded to show the name of a totally uninvolved competitor in its first slide afterward. The level of betrayal there was shocking.



The storage requirements are still massive. I would guess the raw footage for something like Severance (and they probably shoot in at least 4K) is going to be in the area of a petabyte for the entire season.

Even today it's not close to practical to have an entire episode's worth of raw footage (of which there'll be many, many takes, from many, many angles) entirely on an editor's workstation.

The surprising aspect is that they don't use proxies for editing rather than remote desktop.


Ben Stiller claims just 83 terabytes for editing. Maybe this is the size of proxies. https://www.youtube.com/watch?v=TXNQ01Sy6Xw&t=45s


83 terabytes of raw footage for one episode (the S2 finale). That was the longest episode (which doesn't necessarily correspond to the amount of footage shot), but for a 10-episode 4K HDR series, 1 PB is in the ballpark for a season.
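Quick back-of-envelope on that (assuming, purely for illustration, that every episode shoots about as much as the S2 finale):

    # Back-of-envelope: season raw-footage estimate from the 83 TB figure.
    # Assumption (not from the article): every episode shoots roughly as
    # much footage as the S2 finale did.
    tb_per_episode = 83                 # raw footage for the S2 finale
    episodes = 10                       # episodes in a Severance season
    season_tb = tb_per_episode * episodes
    print(f"~{season_tb} TB = {season_tb / 1000:.2f} PB per season")
    # -> ~830 TB = 0.83 PB, i.e. 1 PB really is "in the ballpark"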


And yet they still can't stream me a 4K HDR file without noticeable banding all over those white hallways. It's funny how the highest-quality productions end up with the lowest-quality defects, due to fundamental limitations of digital systems.


It is not a "fundamental limitation of digital systems". It is a limitation of streaming services.

If you had more throughput, more bit depth, etc., you would have enough colors not to see banding. But the bitrates required (if you insist on 4K) are tough on SSD/HDD I/O, to say nothing of your network connection. And even if you have the best connection ever, most people don't, and streaming services will push the bitrate as low as possible as long as the average viewer isn't too upset: delivering higher bitrates costs the service real money, and most customers don't have the connection for it and don't care about banding.
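To put rough numbers on the banding: what matters is how many distinct code values land inside a subtle gradient. A minimal sketch with assumed, illustrative signal levels (not measurements of any actual stream):

    # How many code values fall inside a subtle near-white gradient?
    # The 80%-90% signal range is an assumed illustration, not a measurement.
    lo, hi = 0.80, 0.90
    for bits in (8, 10, 12):
        levels = 2 ** bits
        steps = int(hi * levels) - int(lo * levels)
        print(f"{bits}-bit: ~{steps} code values across the gradient")
    # 8-bit:  ~26 steps  -> each band is wide enough to see on a big wall
    # 10-bit: ~102 steps -> bands get four times narrower
    # Aggressive bitrate compression then quantizes away even those steps,
    # which is why the stream bands while the master doesn't.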


> The surprising aspect is that they don't use proxies for editing rather than remote desktop.

In my experience it is way easier to scale storage bandwidth than compute, at least locally.

There have been times when I've been able to cut a shoot from the raw files before proxies were available, and this has been corroborated by other editors.

So it took less time to cut and submit for review than to actually generate the proxy media.

Sure, if your workflow has a decent gap between shooting and post, then generating proxies is trivial. But sometimes a little more storage and memory bandwidth goes a very long way.
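A sketch of that trade-off, with hypothetical numbers:

    # Hypothetical: transcode-first vs cut-from-raw for one shoot day.
    footage_hours = 6          # assumed total footage across cameras
    encode_speed = 3.0         # assumed proxy encode rate, 3x realtime
    proxy_wait_h = footage_hours / encode_speed
    print(f"Proxy generation alone: {proxy_wait_h:.1f} hours of dead time")
    # -> 2.0 hours before editing can even start; with enough raw
    #    storage bandwidth, a rough cut can be under review by then.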


> The surprising aspect is that they don't use proxies for editing rather than remote desktop.

Who says they're not using proxies and remote desktop?


Sorry, this take is not good.

Yes, attaching many terabytes of video is cheap now.

But scrubbing through that high-res raw video isn't (just) size-intensive; it's throughput-intensive. Size is to throughput as energy density is to power density. You can get a pretty good all-SSD NAS, but over a 40 Gbps (5 GB/s, minus overhead) Thunderbolt 4 link it's still going to be OK but not stellar. A single desktop SSD can triple that!
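Ballpark figures behind that (assumed, not benchmarks):

    # Assumed, ballpark throughput numbers -- not benchmarks.
    tb4_gbs = 40 * 0.8 / 8     # Thunderbolt 4: 40 Gbps minus ~20% overhead, in GB/s
    nvme_gbs = 12.0            # a fast desktop PCIe NVMe SSD, GB/s
    stream_gbs = 0.3           # one 4K camera-raw stream, ~300 MB/s (codec-dependent)
    print(f"TB4:  ~{tb4_gbs:.0f} GB/s -> ~{tb4_gbs / stream_gbs:.0f} concurrent raw streams")
    print(f"NVMe: ~{nvme_gbs:.0f} GB/s -> ~{nvme_gbs / stream_gbs:.0f} concurrent raw streams")
    # Multicam scrubbing pulls several streams at once, which is where
    # the external link pinches long before local SSDs do.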

I can fully see the desire to remote-stream. Being able to encode AV1 on the fly to your local editing station, or even H.265, at reduced quality, while still having the full bit depth available for editing, sounds divine.


What "take" are you talking about?

You're saying Thunderbolt 4 is going to struggle with something, and then touting a desktop SSD as "tripling" TB 4 throughput... but finally declaring that "remote streaming" is somehow better than both of those?

What an absolute crock.


These takes:

> I also don't buy the author's rationale for remote editing; it's oddly archaic

> Any supposed advantage of remote "real servers" is going to be squandered by having to send everything over the Internet

Remote streaming is far better. A 2 Mbps or 20 Mbps connection to a powerful editing station is awesome. A compressed-down H.265 stream with HDR will still let you edit very well, while the remote machine handles intensive editing tasks with ease.
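The asymmetry, with assumed numbers (the raw media never crosses the internet; only a compressed view of it does):

    # Assumed figures: remote-desktop stream vs the raw media behind it.
    stream_mbps = 20             # H.265 desktop stream to the editor
    raw_mbps = 300 * 8           # ~300 MB/s of raw 4K media, in Mbps
    print(f"Raw-to-stream ratio: {raw_mbps / stream_mbps:.0f}x")
    # -> 120x: the footage stays on the server; the editor pulls a
    #    thin compressed window onto it.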

This really isn't hard at all; the advantages and wins are huge, and remote desktops have been amazing for decades now. I struggle to see how you continue to justify being so far up a creek, other than exhibiting pathology.


Again, you are contradicting yourself and haven't been able to cite all of these "amazing wins." You're claiming that you're going to struggle with scrubbing over TB 4... and pushing remote editing instead! That's laughable.

Also I don't think you understand compression. Interframe-compressed codecs like H.265 are a bigger computational pain in the ass than ProRes (for example).
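The seek cost is easy to sketch (GOP length assumed for illustration):

    # Why scrubbing long-GOP H.265 costs more CPU than all-intra ProRes.
    import random
    gop = 48                               # assumed keyframe interval
    seek_frame = random.randrange(10_000)  # a random scrub position
    to_decode = seek_frame % gop + 1       # decode forward from last keyframe
    print(f"Long-GOP seek decodes ~{to_decode} frames; all-intra decodes 1")
    # Averaged over random seeks that's ~gop/2 frames of decode work per
    # scrub position, versus exactly one frame for an intra-only codec.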

And "remote desktops have been amazing for decades..." What? Irrelevant. In the '90s people were still buying heavily optimized turnkey systems with SCSI arrays just to be able to capture and edit SD video at broadcast quality; and you couldn't even stream VHS (6-hour mode) quality over the Internet. Come on, man. Why shill so hard for your pet workflow, and berate other people who don't want or need it?

I can scrub my 4K video just fine over Thunderbolt 2. Maybe you need to defrag, bro!


I really struggle to understand where you are coming from, or what reef you've so clearly beached yourself on, to so resolutely not get it.

> scrubbing over TB 4... and pushing remote editing instead! That's laughable.

You seem incapable of grasping the basic premise of what desktop streaming is. A modern video card will give you a pretty good hardware-accelerated 10-bit 4:2:2 (or 4:4:4, or 4:2:0) H.265/HEVC and AV1 encoder that is just sitting there for use and consumes no other resources; for all intents and purposes, free.

You connect to your render workstation's desktop and scrub there, on its many-GB/s SSD array.

Even better, instead of buying everyone on the team their own high-end desktop or beastly laptop and their own SSD array, anyone can connect to a virtual desktop as they need. There are actually three different hardware encoders even on regular consumer GPUs! A 64-core AMD 7R13 (Milan) is $1000 and will let you load up absurd numbers of GPUs and SSDs; that'll host a whole team very effectively.
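Sketching the sharing math with hypothetical numbers:

    # Hypothetical sizing: one shared edit host vs per-seat hardware.
    editors = 6                  # assumed team size
    raw_tb = 83                  # one shared copy of the raw footage
    seat_stream_mbps = 20        # assumed desktop-stream bitrate per editor
    print(f"Storage: {raw_tb} TB once vs {raw_tb * editors} TB duplicated per seat")
    print(f"Host uplink: {editors * seat_stream_mbps} Mbps serves {editors} editors")
    # One copy of the media, one beefy box, thin streams out; per-seat
    # SSD arrays and workstations drop out of the budget.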

Really confused why the internet scares you so, and how you've missed the premise of this article entirely. Maybe you should try booting Sunshine and Moonlight some day, as an easy-to-DIY, low-latency, low-bandwidth VDI.


Re: Apple not using Final Cut Pro (FCP). I feel like Apple made an intentional decision to abandon the high-end production market when they released FCP X in 2011. They dropped multicam, XML import/export, etc. I heard they eventually brought most of these features back, but it seems clear Apple isn't focusing on this part of the market.


FCP 7 was garbage; Apple had bought the original product from Macromedia. It was never "high end."

The new FCP could have righted many wrongs, but Apple turned its development over to people who didn't even understand industry-standard terms... and who rejected input from experts Apple had hired years earlier. But that's Apple's standard behavior. They just don't learn.


> It is pretty baffling that Apple would create a PR vehicle that impugns its products like this.

I'm struggling to see any of this, frankly. Of course Apple uses non-Apple software. It'd be pretty weird if they didn't.

All this marketing bullshit reinforces the value of refusing to engage with marketing. What a massive waste of time and effort for all societies and cultures involved.


Struggling to see any of what?


How this "impugns" Apple at all?


Did you read the article? It points out several ways.



