Posted by shortformblog 6 days ago
He said, "We don't collaborate at Apple because of the (perceived) risk of leaks. None of our tools are built for collaboration". Apple is famously closed about information sharing, to the point where on some floors every office requires its own badge, and sometimes even the cabinets within.
So it doesn't surprise me that their video editing tools are designed for a single user at a time.
Edit: This happened about six years ago; they have since added some collaboration tools. However, this is more about the attitude at Apple in general and why their own tools lag on collaboration.
Edit 2: After the replies I thought I was going crazy. I actually checked my message history and found the discussion. I knew this happened pre-COVID, but it was actually in 2013, 12 years ago. I didn't think it was that long ago.
Opinions are my own and do not reflect those of my employer.
I love reading articles that purport to tell the public how things at Apple work. They're almost always laughably full of shit.
That implies every floor is different, which matches what you are saying.
Most of the stories that have come out felt like they were the image Apple wanted to project. It started with Apple going after a missing iPhone that was left at a bar. We've heard that those working on the latest design for the next iPhone were sequestered away from the rest of the company. I've always thought it was marketing spin, and I'm glad we have an ex-Apple employee confirming this. Back in the 'Lisa' days Apple did split and silo divisions, and Apple did closely guard new iPhone designs with very few leaks, but the rest of the mythology is more marketing.
It isn't absurd as what GP mentions was imported into Amazon by Dave Limp, a former Apple C-suite. It was a terrible culture shock for most of the ICs in my team being reorg'd reporting in to Limp, after Steve Kessel (of Kindle fame), the previous leader, went on a sabbatical.
I worked for a company that did some work for the federal government. Boring stuff. Their compliance rules essentially required that we firewall the folks with operational access to their data from the rest of the company. We included the physical offices in that to avoid certain expenses and controls companywide.
By putting that, they decrease the likelihood of repercussions in the workplace for things said outside of the workplace.
You can still get in hot water for anything you say that ties back to you or the company, regardless of whether you disclose who your employer is.
This is the grey-area that corporations typically carve out in a social-media policy so that employees can engage in discussions around their employer without being on behalf of their employer.
It's still a perilous position to put yourself in as an employee. Innocent and innocuous things can always be misunderstood or misinterpreted.
What happens when you use that disclaimer and are self-employed though?
That's the main issue. But also this happened about six years ago.
> A Mac with macOS 14.0 or later and Keynote 14.3 or later
>
> An iPhone with iOS 17.0 or later and Keynote 14.3 or later
>
> An iPad with iPadOS 17.0 or later and Keynote 14.3 or later
Those OSes were released around June of 2023, so a little over a year?
I think it's kinda bizarre to accuse someone of not "researching" their own recollections. How would one accomplish that?
Back in the day, Keynote files would just be passed around via a shared server so you and the people you were collaborating with could make and merge changes between them. E.g., I'd do one part of a presentation, Rick would do another part, and we'd copy our slides out of and paste them into each other's decks to get a complete version for rehearsing with. If we had notes for each other, we'd give the notes out of band rather than just directly change each other's slides.
There’s a lot of mythology that people just make up about how secrecy works at Apple. It’s mostly sensible.
The severed floor.
That sentence, by itself, is more or less correct (from my 26 years at Apple). However, it suggests/implies things that are not correct.
1) In case you got the impression: Apple certainly does not design software to be non-collaborative simply because collaboration would enable sharing/leaking when used within Apple. I would say that Apple has been focused since Day 1 on a mindset where one computer equals one user. The mindset really stayed that way until Jobs was fired, discovered UNIX, and then returned with logins and permissions. To this day, though, I think collaboration is often an afterthought.
So too do they seem to be focused on the singular creative. I suspect Google's push into Web-based (and collaborative) productivity apps (Google Docs, etc.) forced Apple's hand in that department — forced Apple to push collaborative features in their productivity suite.
2) Of course Apple collaborates internally. But to be sure it is based on need-to-know. No one on the hardware team is going to give an open preso in an Apple lunchroom on their hardware roadmap. But you can bet there are private meetings with leads from the Kernel Team on that very roadmap.
That internal secrecy, under which engineers from different teams could no longer just hang out in the cafeteria and chat about what they were working on, arrived when Jobs came back. It probably goes without saying that it was rigorously enforced when the iPhone was a twinkle in Apple's eye.
The internal secrecy was sold to employees as preserving the "surprise and delight" when a product is finally unveiled. At the same time, as Apple moved to the top of the S&P 500, there were a lot of outsiders who very definitely wanted to know Apple's plans.
3) Lastly, yes, plenty of floors and wings of buildings are accessible only to those with the correct badge permissions. I could not, for example, as an engineer, badge in to the Design floor.
Individual cabinets needing badge access? I have no idea about that. I am aware of employees hanging black curtains in their office windows when secret hardware would come out of their (key-locked) drawers. (On a floor that is locked down to only those disclosed, obviously the black curtains become unnecessary.)
My badge only worked where I had explicitly been given access, and desks were to be kept clear and all prototypes or hardware had to be locked in drawers and/or covered with black cloths. Almost every door was a blind door with a second door inside, so that if the outer one opened, it was not possible to see into the inner space.
Both are designed to replicate the same functionality as Concurrence and Quantrix (itself a clone of Lotus Improv) both by Lighthouse Design, who made lots of apps for NeXTSTEP and were purchased by Sun.
Steve Jobs used Concurrence on a ThinkPad and also a Toshiba laptop to make presentations prior to Keynote (which I believe was created internally for him at first) even while back at Apple.
Real-time collaboration was added in Keynote 7.0 released in Sept 2016.
https://www.macworld.com/article/228811/keynote-pages-and-nu...
The editors of Severance are actually using Avid. For music composition they're using Ableton. Neither is an Apple product. The remote desktop product they're using is Jump Desktop.
While the show is an Apple TV+ show, and they happen to use Macs in the process, this has shockingly little to do with Apple tools or products.
There are far too many things that can go wrong with such a setup.
Yes, the fire marshal has also thought of the first thing you just thought of to post. They aren't stupid.
And if you want to make that scenario terrifying, imagine being there on a weekend or holiday.
> I can very easily imagine what could have happened in those buildings if a badge was required to leave a conference room.
The facilities team and fire marshal are also easily capable of imagining this, already have, and you can ask them about it.
In this case the doors would fail open, or are made of glass and can be broken down. It's not a /really/ secure location. It's just a tech company that likes to seem secure during work hours. After hours of course the janitors get to see everything.
In an effort to steelman your comment: you may have incorrectly interpreted the earlier "I wonder how well that is tested" as "this is unsafe and illegal," rather than as "among the many things wrong with this, it increases the number of things that can go wrong, and is less safe on an absolute scale, whether or not it's strictly legal and up to code." You may then have assumed everything else in the subsequent comments was about fire safety, rather than a series of points in support of the idea that locking people into a building is a bad idea.
You are asserting the competence of the fire marshal, as an argument in a conversation about locking employees and interviewees and visitors inside a company's office rooms.
What you may think was happening here: "heh, nerds think they're smarter than the fire marshal and nobody involved thought of this until they came along; of course there'd be a way for sufficiently capable humans to get out of a room if something went wrong, and of course this will have been made to pass fire code, which is the only thing being talked about here".
What was actually happening here: While with sufficient analysis (which has most likely been done) it is possible to provide a sufficient degree of fire safety to make it not against fire code to lock people into a building, that doesn't make it right or zero-cost or risk-free, nor does it alleviate the stress and potential problematic-but-non-fatal situations that could arise. At no point was the primary purpose of the comment "people might burn in a fire", even though the risk of that is not zero at any time and has likely been raised (within presumably-acceptable-to-fire-code levels) by such a setup.
When I said "I can very easily imagine what could have happened", I was not imagining a fire burning down the people with the building inside. I was imagining how few failures it would require to end up with people being trapped in a room for long enough to reach the level of stress required to physically break out of a room, compounded by having worked in labs where the air conditioning was sometimes woefully insufficient.
It takes a lot of stress to get normal people to the point that they're willing to break windows or doors or walls in order to escape a room, and nobody should be subjected to such things, because there's zero security justification for a company locking people inside at any time.
Any random local government staffer is the most powerful person in the universe and obeying them is a religious edict. Apple has zero power to disobey anything in the fire code and they're probably not even capable of imagining doing so. That's why the random suburb they're in has the best public schools in the country and all the houses are like $5 million.
As an example there's currently a big empty lot next to said HQ where the mall used to be, because a random woman on the city council has blocked apartment construction for the last decade, because she thinks Apple employees will move in and molest local high school students.
I know some people will say this is because of age, but I want to suggest that I often think of the COVID years, 2019 to 2023, as a single year / event, for reasons I can't quite fathom. So when I think of 2015, I count 2023-2019 as one step, then 2018, 2017, 2016: so around 4-5 years ago.
Video is a harder game due to the processing and data requirements, but I know that there are a lot of startups trying to make it collaborative first. I’m really excited for that to be the default.
Keynote works just fine with multiple simultaneous users. I work at Apple (for now) and do it all the time with managers/EPMs etc.
There is plenty of collaboration in the company but it's typically constrained to the current project you're working on. And working in enterprise companies today it is no different.
You can even read any accounts of famous shipped products to back up that cross functional collaboration has been their culture for many decades. Jobs mentioned it many times, and many articles have been written about it.
Additionally, Keynote (and the entire iWork suite) has had collaborative editing for years now.
I suspect your friend is misinformed or not reliable.
I’m reminded of my friend in grade school that had “an uncle that worked at Nintendo”.
Not saying he didn’t, but just because someone works there doesn’t mean they know what’s going on.
> We are seeking an experienced Software Architect specialized in source control systems to join our dynamic team. The ideal candidate will have expertise in designing, implementing, and managing systems like GitHub, GitLab, Perforce, Bitbucket, and Artifactory.
Many of which take time to be migrated into the mothership.
It was quite common to have remote desktop cards on high-end machines so that you could hide them away somewhere quiet. The edit stations/Flame/Baselight machines all had a fucktonne of 15k SAS drives in them, so were really noisy.
You couldn't invite a director to see what you were doing when all you could hear was disk/fan whine.
They were quite expensive because they needed to be able to encode and send 2K video at decent bit depth (i.e. not 4:2:0 but 4:4:4) and low latency. Worse still, they needed to be calibratable so that you could make sure that the colour you saw was the colour on the other end.
Alas, I can't remember what they are called, thankfully, because they are twats to manage.
Every time I've seen higher end workstations, the actual workstation itself was always in a separate room, and there's been some kind of remote KVM solution used. The workstation was always very noisy and generated a lot of heat. It's also just... a lot of money to shove under a desk where people kick it all afternoon.
https://www.rackmountsolutions.net/24u-ucoustic-soundproof-s...
I'm afraid of trying something really invasive like liquid because I live in Portugal and getting weird stuff takes forever.
> Or are you aware of a more effective enclosure?
No, not really: I have some heavy blankets hanging, which helps a bit. My musician friend told me I should glue some egg cartons to the blankets, so I'm going to try that soon.
Teradici came on the scene and started running everything over IP. Hardware at first (old EVGA pyramids were everywhere), where you had to route the video out into a custom card that then put out the signal via IP.
Now it's all software, with the leaders being Teradici (merged with HP Anyware, which came from IBM), NICE DCV (Amazon), Parsec, and a few others.
The big advantage in content production over something like VNC/RDP was color fidelity, local cursor termination, and support for hardware like Wacom tablets. You can even do 7.1 audio and multiple monitors. It turns out that when you are an artist, having a local-like feel is incredibly important. 60fps is about 16ms per frame, so even with virtual workstations on AWS you want to deploy them in a region that is relatively close to the end user.
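To put that frame budget in numbers, a quick sketch (the transit-time figures below are illustrative assumptions, not measurements of any particular provider):

```python
# Frame-time budget for a 60 fps remote desktop stream.
# Capture, encode, network transit, decode, and display all have
# to fit inside one frame interval for a "local-like" feel.

fps = 60
frame_budget_ms = 1000 / fps  # ~16.7 ms per frame

# Illustrative one-way transit times in ms (assumed, not measured):
transit_ms = {"same metro": 2, "same region": 10, "cross-country": 35}

for location, ms in transit_ms.items():
    remaining = frame_budget_ms - ms
    verdict = "fits" if remaining > 0 else "blows the budget"
    print(f"{location}: {ms} ms transit, "
          f"{remaining:.1f} ms left for encode/decode ({verdict})")
```

Which is why deploying close to the end user matters: past a certain distance, no amount of encoder tuning gets the round trip back inside one frame.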
So there are a couple of options, depending on the hardware. If it kicked out HD-SDI you could just patch the display into the coax in the building and be done with it.
But that only worked if you were in the same building and your machine kicked out HD-SDI
Most machines either shat out dual-link DVI or, worse, some custom shit. Getting a cable that could reliably transport dual-link DVI more than 10 meters was difficult and expensive. Worse still, it had a habit of dropping back to single link, or some other failure mode that was ever so annoying to debug. Moreover, 10 meters often isn't far enough, especially if the room had a projector (so might be a >5m throw).
Now, that's the simple case. The hard case is multi-building. Say you have an operator working in London and the director in New York, and you want to give them the highest-quality picture possible. The only way to do that at the time was with one of these cards, or some nasty SDI hardware h264 transcoder (hugely expensive at the time).
I really wish I could remember what they were called. They appear to have fallen out of favour.
Now you'd just use cineSync, as your laptop can encode video in real time now (https://www.backlight.co/product/cinesync). Also, rumour has it that the Wolverine movie was leaked because a producer got coked up and left an unencrypted laptop on a plane, rather than using cineSync to show an edit to someone important. Alas, I can't verify that.
This is exactly the insight I was hoping for. Thank you.
That's the one! Many thanks.
For passive cables, that makes sense. But with repeaters, wouldn't you be able to go further? Maybe cable repeaters like that are newer than I imagine.
By about 2014 hardware encoders were good enough to send decent quality video over gigabit.
Thanks though, will keep it in mind. These days Steam Link is decent enough for what I wanted, but ya never know.
I think what I was referring to are repeaters you put between cables, which amplify the signal. You connect the device between two HDMI cables (say 5m each) and connect it to power for it to actually extend the distance the signal can travel.
I'm not sure what an active HDMI cable would be, maybe circuitry inside the cable that draws power from the HDMI port?
https://www.chargerlab.com/the-way-to-metaverse%EF%BC%9Ftear...
Pretty much:
https://www.cablematters.com/Blog/HDMI/how-do-active-hdmi-ca...
Perhaps they're better now but, yeah, for me it didn't work out as intended.
Also I believe there are displayport cables that do the data transfer over optical fiber for long runs as well.
There are a slew of HDMI extension systems, some that even use ethernet with hardware encoding/decoding. Grandparent commenter hasn't worked in the industry in at least a decade if they're talking about DVI.
(It's "passthrough" and not "uncompressed" because DisplayPort may use DSC depending on the resolution and frame rate.)
US$500 for an optical cable can be a lot cheaper than paying for HDMI extender sender and receiver boxes.
There's plenty of HDMI2.0 compliant "video over ethernet cable or fiber things" which are the ordinary COTS products that may not be sufficient for serious video editing needs.
People on video editing workstations these days are using higher end monitors that can be trusted to work in 10bit color and to match a certain color space grading.
On the other hand it's a lot easier these days to have a relatively quiet video editing workstation that has 8 to 16TB of local, pci-express bus attached NVME storage for work space, and that same workstation can have a not-very-expensive 100GbE NIC in it attached to some large/noisy storage elsewhere.
We had those; the problem is that they lose bit depth. They were also fucking unreliable. We had a lot of HDMI extenders and they worked for 1920x1080, and sometimes 2K if you were lucky.
We used them for the "prosumer" LCD projectors we had in the review rooms. They didn't work so well for the massive Christie projectors. (I seem to recall those abused 3G-SDI to get the resolution.)
> Geoffrey Richman reviews season two finale footage. In his at-home edit bay (not pictured), he works on iMac, which remotes into a separate Mac mini that runs Avid from a post-production facility in Manhattan’s West Village.
Yeah. That would be a horrible experience.
Had Sony bothered to follow its own rules, it wouldn't have been hacked and had all its data leaked in 2014...
But to answer the question, we had a shit tonne of networking, so as far as I'm aware it was just on the vanilla network. Might have been a separate VLAN though.
I wanted to be a compositor, but failed the rotoscoping test at the company I was working at. So I fell back on my technical skills, and became an infra engineer. I left VFX in about 2015, and sadly no matter how much I want to go back, I don't see much of a future in it. GenAI is really going to do a number on it.
I don't think generative AI will make entire industries disappear, but rather make people within those industries do more with less. Seeing as you somewhat see what the future of the industry is, and assuming you're right, that puts you in a good position to gain the skills you think will be sought after. You seemingly have the technical skills too. Just an idea; I'm not working in either area, so take it with a grain of salt, I suppose.
I also don't buy the author's rationale for remote editing; it's oddly archaic: "high-end video production is quite storage-intensive, which is why your favorite YouTuber constantly talks about their editing rigs and network-attached storage. By putting this stuff offsite, they can put all this data on a real server."
Storage is cheap now, and desktop computers are more than powerful enough for any video editing. Any supposed advantage of remote "real servers" is going to be squandered by having to send everything over the Internet. The primary benefit of remote editing (and the much-hyped "camera to cloud") is fast turnaround, which you need for stuff like reality TV and news. But a dramatic series like Severance?
It is pretty baffling that Apple would create a PR vehicle that impugns its products like this. It would be better to say nothing. After Apple acquired Shake, they splashed Lord of the Rings, King Kong, and other major tentpoles on the Apple homepage at every opportunity... of course not mentioning that Weta was rendering those movies on hundreds of Linux servers instead of Macs. But at least Shake was the same product across all platforms, and it really was the primary effects tool on all those movies.
> "(they do not mention the use of Jump Desktop, which seems like a missed opportunity to promote a small-scale Mac developer. C’mon Apple, do better.)"
Oh boy, this is just a minor infraction in Apple's history of disrespect toward developers. They do this, and worse, to major development partners too. I'm not going to name names, but after one such partner funded the acquisition of material on its own equipment and that material was used in a major product keynote... Apple not only neglected to credit or even mention that partner, but proceeded to show the name of a totally uninvolved competitor in its first slide afterward. The level of betrayal there was shocking.
Even today it's not close to practical to have an entire episode's worth of raw footage (of which there'll be many many takes, many many angles) entirely on an editor's workstation.
The surprising aspect is that they don't use proxies for editing rather than remote desktop.
If you had more throughput, more bit depth, etc., you would have enough colors to not see banding. But the bitrates required (if you insist on 4K) are tough on SSD/HDD I/O, to say nothing of your network connection. And even if you have the best connection ever, most people don't, and streaming services will want the bitrate to be as low as possible as long as the average viewer is not too upset, because delivering higher bitrate costs the service real money, and most customers don't have the connection for it and don't care about banding.
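For a sense of the scale involved, a back-of-the-envelope calculation (the resolution, bit depth, and frame rate here are illustrative, not tied to any particular production):

```python
# Uncompressed bitrate for 10-bit 4K video, back of the envelope.
width, height = 3840, 2160
bits_per_channel = 10   # 10-bit color, enough levels to avoid banding
channels = 3            # full RGB / 4:4:4, no chroma subsampling
fps = 24

bits_per_second = width * height * bits_per_channel * channels * fps
print(f"~{bits_per_second / 1e9:.1f} Gbit/s uncompressed")
print(f"~{bits_per_second / 8 / 1e6:.0f} MB/s of sustained disk I/O")
# Compare against a ~20 Mbit/s streaming-service delivery bitrate:
print(f"roughly {bits_per_second / 20e6:.0f}x a 20 Mbit/s stream")
```

That gap of a couple of hundred times between what the editor touches and what the service delivers is exactly why the delivery bitrate, not banding, wins the argument.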
In my experience it is way easier to scale storage bandwidth than compute, at least locally.
There have been times when I've been able to cut a shoot from the raw files before proxies were available, and this has been corroborated by other editors.
So it took less time to cut and submit for review than to actually generate the proxy media.
Sure, if your workflow has a decent gap between shooting and post, then generating proxies is trivial, but sometimes a little more storage and memory bandwidth goes a very long way.
Who says they're not using proxies and remote desktop?
Yes, attaching many terabytes of video is cheap now.
But scrubbing through that high-res raw video isn't (just) size intensive. It's throughput intensive. Size is to throughput as energy density is to power density. You can get a pretty good all-SSD NAS, but over a 40Gbps (5GB/s, minus overhead) Thunderbolt 4 link it's still going to be OK but not stellar. A single desktop SSD can triple that!
I can fully see the desire to remote stream. Being able to AV1-encode on the fly to your local editing station, or even H.265, at reduced quality, while still having the full bit depth available for editing, sounds divine.
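The interface comparison above, made concrete (these are nominal link rates and ballpark drive figures, not benchmarks; real-world throughput varies with protocol overhead and drive model):

```python
# Nominal sequential throughput of the paths being compared, in GB/s.
# Ballpark assumptions, not measured benchmarks.
paths = {
    "Thunderbolt 4 link (40 Gbit/s)": 40 / 8,  # ~5 GB/s before overhead
    "PCIe 4.0 x4 NVMe SSD (typical)": 7.0,
    "PCIe 5.0 x4 NVMe SSD (typical)": 14.0,
}
for name, gb_per_s in paths.items():
    print(f"{name}: ~{gb_per_s:.1f} GB/s")
```

The point being that a drive sitting on the internal PCIe bus never has to squeeze through the external link at all, which is where the "triple that" claim comes from.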
You're saying Thunderbolt 4 is going to struggle with something, and then touting a desktop SSD as "tripling" TB 4 throughput... but finally declaring that "remote streaming" is somehow better than both of those?
What an absolute crock.
> I also don't buy the author's rationale for remote editing; it's oddly archaic
> Any supposed advantage of remote "real servers" is going to be squandered by having to send everything over the Internet
Remote streaming is far better. A 2Mbps or 20Mbps connection to a powerful editing station is awesome. A compressed-down H.265 stream with HDR will still let you edit very well, and you'll be able to do intensive editing tasks with ease.
This really isn't hard at all, the advantages & wins are amazing, remote desktops have been amazing for decades now. I struggle to see how you continue to justify being so far up a creek, other than exhibiting pathology.
Also I don't think you understand compression. Interframe-compressed codecs like H.265 are a bigger computational pain in the ass than ProRes (for example).
And "remote desktops have been amazing for decades..." What? Irrelevant. In the '90s people were still buying heavily optimized turnkey systems with SCSI arrays just to be able to capture and edit SD video at broadcast quality; and you couldn't even stream VHS (6-hour mode) quality over the Internet. Come on, man. Why shill so hard for your pet workflow, and berate other people who don't want or need it?
I can scrub my 4K video just fine over Thunderbolt 2. Maybe you need to defrag, bro!
> scrubbing over TB 4... and pushing remote editing instead! That's laughable.
You seem incapable of grasping the basic premise of what desktop streaming is. A modern video card will give you a pretty-good-quality 10-bit 4:2:2 (or 4:4:4 or 4:2:0) hardware-accelerated H.265/HEVC and AV1 encoder that is just sitting there for use and will consume no other resources; for all intents and purposes, free.
You connect to your render workstation's desktop and scrub there, on its many-GB/s SSD array.
Even better, instead of buying everyone on the team their own high-end desktop or beastly laptop and their own SSD array, anyone can connect to a virtual desktop as they need. There are actually three different hardware encoders even on regular consumer GPUs! A 64-core AMD 7R13 Milan is $1000 and will let you load up absurd numbers of GPUs and SSDs; that'll host a whole team very effectively.
Really confused why the internet is scaring you so, and how you've missed the premise of this article entirely. Maybe you should try booting Sunshine and Moonlight some day, as an easy-to-DIY low-latency, low-bandwidth VDI.
The new FCP could have righted many wrongs, but Apple turned its development over to people who didn't even understand industry-standard terms... and who rejected input from experts Apple had hired years earlier. But that's Apple's standard behavior. They just don't learn.
I'm struggling to see any of this, frankly. Of course apple uses non-apple software. It'd be pretty weird if they didn't.
All this marketing bullshit reinforces the value of refusing to engage with marketing. What a massive waste of time and effort for all societies and cultures involved.
It's a clever way to have your media centralized and yet have access to editors all over the world.
And a modern AVID system does not struggle with a few editors accessing the same footage.
First of all, it's usually a proxy format, and secondly, the storage can deliver a combined 800MB/s per box sustained for x number of editors at the same time.
Yes, I use Avid; feel free to ask.
AVID hasn't been at the forefront of video editing since the Avid/1 / ABVB days. They sell a reasonably usable program with horrible hardware (since Meridien hardware - it's good they finally let us use other hardware such as BlackMagic), but never truly fix large problems. People therefore stay on a specific version of the software for ages, because everyone is scared of new and different bugs.
AVID's shared media offerings are tenfold the cost of other storage options simply because they have a flag on the mounted volumes that tells Media Composer to allow project and media sharing. "800MB pr box sustained" means nothing because anyone can do that easily with commodity hardware.
In other words, AVID is milking their cash cow and they really don't innovate or even try to offer a good product.
Apple, on the other hand, destroyed their professional editing products, then replaced them with decent tools, but ones that are worlds different. Many people have mixed feelings about this. On the other hand, if you want to edit 8K ProRes, Final Cut Pro makes it simple on any ARM-based Mac.
It's their dependency on Blackmagic that's been their biggest problem the past 5 years.
Meridien was light years ahead of the competition. The FireWire-based Adrenaline sucked.
And you won't find anyone complaining about their DX series; it's just too bad they dropped that.
And you're really not understanding the way Avid NEXIS works if you think it's just a flag.
I think you've been sold a bunch of ideas. For instance, Avid has no dependency on Blackmagic. They use Open IO, which means you can use any card that supports Open IO, whether Aja, Blackmagic, Bluefish, Matrox, whatever.
NEXIS / ISIS isn't special. The flag is literally just a flag that tells Media Composer to enable bin and media sharing. It can be enabled on any kind of share: NFS, AFS, SMB, et cetera. For example, check out the Mimiq software for enabling it wherever you want.
The NEXIS hardware/software isn't just a flag; another visit to Google is in order.
Second, please tell me how the fact that you can use no video interface card or Aja means you're dependent on Blackmagic.
Third, please tell me how bin locking on ISIS / Nexis is different than bin locking on third party shared storage with the AVID sharing flag turned on.
You've offered literally no searchable facts. If I search for anecdotes about how ISIS / Nexis are different, I'm only going to get marketing fluff.
So offer something of substance. Claiming someone is wrong without even saying what they're wrong about is not how any of this works.
Now please, before you make more fact-based claims: I have used and still use the 3rd-party "bin lock" solutions when I have special cases, and I can promise you, an ordinary file server does not compare when many clients are hitting the storage.
Substance delivered; let's see if there is a chance of someone learning something.
1) How does AVID selling Blackmagic hardware make that a dependency? You can just as easily buy Aja hardware. "This depends on that" means it requires it. AVID systems do not require Blackmagic hardware at all. If you think they do, please explain.
2) Bin locking and media sharing (client specific paths in "OMFI MediaFiles" and "Avid MediaFiles") is a flag that is either off or on. That has nothing to do with all of the other things that have been sold to you as "special" about AVID storage.
For instance, link aggregation has been built in to macOS since the early days of Mac OS X. Also, it really doesn't matter. If you want something that literally does 4 gigabytes a second, you can do that all sorts of ways with current Macs - no need for multiple NICs.
Anyhow, speed is largely irrelevant for editing systems. The only time speed matters is if the storage can't keep up. You're not watching video at 2,400 frames per second as you're scrubbing through video at 100x speed, so people who are concerned with "800 MB" (you're not even saying per second, or anything like that) are no different than the people who want the wanna-be muscle car that puts out 500 horsepower but that are just going to and from the store. Who cares? If you care, you know. If you have the need, you know. If you're working on 4K uncompressed, you're not doing it on shared storage, anyway - that's just silly. But if you REALLY need to do 4K uncompressed on shared storage, guess what? You're not using AVID, because it can't support that :)
Otherwise, "800 MB" is just a sales number. I just built a NAS for less than $2,500 that does 1.2 gigabytes per second, and I wasn't even trying to make it fast.
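The bandwidth arguments on both sides are easy to sanity-check with back-of-envelope arithmetic. A minimal Python sketch, assuming 10-bit 4:2:2 sampling (roughly 20 bits per pixel); the figures are illustrative estimates, not vendor specs:

```python
# Back-of-envelope data rates for uncompressed video.
# All numbers are approximate and for illustration only.

def data_rate_mb_s(width, height, fps, bits_per_pixel=20):
    """Uncompressed data rate in MB/s (~20 bpp for 10-bit 4:2:2)."""
    bits_per_second = width * height * fps * bits_per_pixel
    return bits_per_second / 8 / 1_000_000

# 1080p25, 10-bit 4:2:2, uncompressed
hd = data_rate_mb_s(1920, 1080, 25)
# UHD (3840x2160) 25p, 10-bit 4:2:2, uncompressed
uhd = data_rate_mb_s(3840, 2160, 25)

print(f"1080p25 uncompressed: ~{hd:.0f} MB/s")
print(f"UHD 25p uncompressed: ~{uhd:.0f} MB/s")
# A single uncompressed UHD stream already eats most of an
# "800 MB/s" shared-storage budget; mezzanine codecs like
# ProRes or DNxHR are roughly an order of magnitude lighter.
```

Run it and the point makes itself: one uncompressed HD stream is around 130 MB/s and UHD around 520 MB/s, so the headline throughput number only matters relative to what you actually put on the wire.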
> I have used and still use third-party "bin lock" solutions when I have special cases, and I can promise you, an ordinary file server does not compare when many clients are hitting the storage.
Those are two different things. If you choose to conflate them, that's up to you, but I can easily show shared storage that makes AVID's look outright pokey, particularly with twenty clients, just as I can show you software that turns the AVID bin locking off or on, so you're not fooling anyone by trying to suggest that all bin locking file servers are somehow inferior, or that they're inferior because they support bin locking, or whatever way you want others to think they're connected.
They're separate things. You do understand that, right?
I hope you take away from this that there are more products than just the half dozen that are most common, and that products outside of the post world often make products in the post world look ridiculous, in part because the ones in the post world are a generation older and multiple times the price. But because people in the post world don't know any better, they more often than not spend literally ten times the going price to get something with an AVID sticker on it, even when you can show them that the AVID product is just a rebadged Seagate storage array or whatever.
But at least you used a car reference: AVID is the Ferrari, the "I CAN BUILD A NAS" box is the useless muscle car. And of course they have gear that can handle uncompressed footage; please, they are the standard for winning an Oscar.
I take away from this that you have no idea how this works. It's not called "bin locking servers"; that's just software. And it's not the hardware that makes a NEXIS special, it's the AVIDFS. I am only writing this for others who might read this so they understand.
Please keep this going, I am entertained
The way you develop & manage the proxies appears to be the biggest part of the battle in making things go fast. There's no reason for editor workstations to be operating with the full-res native material unless there's a targeted reason to do so.
And I mean that in a completely positive "it's awesome" way. Just... not the problems anyone else should be facing.
With Covid remote access became the norm and the online/proxy workflow more or less died. Avid still has a working version (better than the original) but it's not widely used.
Proxies are used for several reasons, expensive storage, heavy codecs at high bitrates or multicams.
They are typically avoided whenever you can because the online part of a proxy based workflow can be a challenge. And especially if you have tight deadlines you want all the variables out of the way.
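To put rough numbers on that trade-off, here is a hedged Python sketch; `NATIVE_MBPS` and `PROXY_MBPS` are ballpark mezzanine and proxy bitrates chosen for illustration, not vendor figures:

```python
# Rough illustration of why proxy workflows exist: compare an
# approximate camera-native mezzanine bitrate with a proxy bitrate.
# Both bitrate constants are ballpark assumptions, not specs.

NATIVE_MBPS = 730   # roughly ProRes 422 HQ at UHD (approximate)
PROXY_MBPS = 45     # roughly ProRes Proxy at 1080p (approximate)

def storage_gb(bitrate_mbps, hours):
    """Storage needed for `hours` of footage at `bitrate_mbps`, in GB."""
    return bitrate_mbps / 8 * 3600 * hours / 1000

hours_of_rushes = 100  # hypothetical shoot
native = storage_gb(NATIVE_MBPS, hours_of_rushes)
proxy = storage_gb(PROXY_MBPS, hours_of_rushes)

print(f"Native: ~{native:,.0f} GB, proxy: ~{proxy:,.0f} GB "
      f"({native / proxy:.0f}x smaller)")
# The proxy set fits on a laptop SSD or uploads overnight, which is
# why "camera to cloud" pipelines push proxies up first.
```

With these assumed bitrates, 100 hours of rushes is roughly 33 TB native versus about 2 TB of proxies, a factor of ~16, which is the whole argument for proxies in one division.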
"With Covid remote access became the norm and the online/proxy workflow more or less died"
No; remote access DEMANDS a proxy workflow, since you're not going to edit full-resolution files over the Internet. So it did not "die;" just the opposite. Witness the entire "camera to cloud" marketing mania that swept NAB a few years ago. That's based entirely on the rapid upload of proxy files to begin editing ASAP.
From NAB last year: “We introduced the [Blackmagic Camera] iPhone app a little while ago,” said Bob Caniglia, director of sales for the company in North America. “You can shoot with that phone, work with the cloud service, share proxies. The camera to Blackmagic cloud to Resolve workflow started with the camera app. The Ursa Broadcast G2 [camera] is now in beta for that software too. That's a good direction on where we're going.”
Does that sound like it died? Or https://blog.frame.io/2024/04/11/visit-us-at-nab-2024/
But back to your assertions: "Proxies are used for several reasons, expensive storage, heavy codecs at high bitrates or multicams. They are typically avoided whenever you can because the online part of a proxy based workflow can be a challenge"
That makes absolutely no sense. You just claimed that proxies are used to avoid "heavy codecs at high bitrates" but then claim "the online part of a proxy based workflow can be a challenge." But you neglected to provide a single example of what's so "challenging" about it, especially when you just cited proxies as an advantage.
Thus, since you pushed the issue, we see that in fact it is you who has no idea what you're talking about. But hey, keep insulting other users.
Or... if you prefer to be informed: https://filmmakermagazine.com/120946-new-remote-tools-workfl...
https://www.tvtechnology.com/news/cameras-support-expanding-...
and many many more...
- Composer/Nexis all hosted in the cloud (AWS): fine, but pricey, and the Nexis experience is meh
- Composer hosted Cloud/Nexis hosted on Prem: actually works well, but you need to have a direct-connect to AWS (the network can be pricey)
- Composer on on-premise VDI/Nexis hosted on Prem: works really well, and I have a bias towards this over fully in cloud, not only for security reasons but also because the TCO is lower
- Composer Cloud (or whatever they call it today; it used to be Composer Sphere): this is a setup where instead you stream real-time proxy to the Composer from MediaCentral. You can download hi-res media if you need to. It works OK, but it's more suited for news workflows. Security is a thing with this solution.
- Adobe/OpenDrives on AWS: I mention this, because we do this too. This has all sorts of things to talk about, and is pretty good, but, again, you gotta know what you are doing.
For the on-premise ones, VMware is our hypervisor of choice, and, yup, we are looking for other options. And we have all the usual IT problems: domain management, updates, roaming desktops, etc.
If you are looking for third-monitor image viewing (like in the old days with hardware), you can swing NDI or 2110. NDI is OK, and for 2110 you need a network and router that can handle it.
If you have time to expand on the "bandwidth and latency thing", I'd love to hear more. Even a "you need to be geographically within X miles of the instance" ballpark figure would be wonderful to know.
It worked... kind of.
One of the main reasons it's used in larger post houses is the worldwide hardware and software support, with people on site if needed.
My home internet is symmetric 3 Gb/s fiber, up and down. Tucked away under the staircase is where my fiber ONT terminates, and it is my server room. I have half a dozen boxes running various things: four 2012 i7 Mac minis running Linux KVM and hosting various critical services (Pi-hole, home automation, HomeKit Secure Video, etc.).
Then there's a giant former gaming PC with 7 HDD bays running the entire storage backend for a whole load of GoPro/Osmo/Insta360 videos I capture, with rclone to Google Photos for backup. I don't edit any videos; it's just there to capture memories so that at some point, when AI tools get good enough, I can have them generate clips. The same box runs my Plex server with HW transcoding.
Then there is the actual gaming PC, a mini-ITX box running Steam Remote Play. It has power, a network cable, and a fake HDMI dongle that emulates a monitor to trick the GPU into thinking something is actually plugged in.
Basically everything I do with desktop PCs at home is via some sort of remote interface.
Remote gaming is probably the most demanding of all of these. Low-latency, HW-accelerated solutions, e.g. Parsec and Steam Link, are incredible technologies.
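One way to see why these tools are impressive is to lay the pipeline against the frame interval. A rough latency-budget sketch; every stage timing below is an assumption for illustration, not a measurement:

```python
# Rough end-to-end latency budget for game streaming at 60 fps.
# Stage timings are illustrative assumptions, not measurements.

frame_time_ms = 1000 / 60  # ~16.7 ms between displayed frames

budget = {
    "capture + encode (HW)": 5.0,  # hardware H.264/HEVC encoder
    "network (wired LAN)": 2.0,    # local hop only
    "decode (HW)": 3.0,            # hardware decoder on the client
    "display + vsync": 8.0,        # worst case, waiting on refresh
}

total = sum(budget.values())
print(f"Frame interval: {frame_time_ms:.1f} ms, pipeline: {total:.1f} ms")
# On a LAN the whole pipeline can land near one added frame of
# latency; over the internet the network term alone can grow to
# 20-60+ ms, which is why fast hardware codecs matter so much.
```

The design takeaway is that software encoding (tens of ms per frame) would blow the whole budget by itself, which is why Parsec and Steam Link lean on the GPU's encoder.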
I carry an AppleTV + PS5 controllers to friends' houses and play the latest games across the internet.
Be honest—you're just playing Factorio
You can just follow that thread, no write-up needed tbh.
Moonlight works great (over ethernet at least) locally though.
iPhones already do this today. I'm often surprised how well made they are
Sounds like this author didn't watch the whole video. They are completely open about the fact that the editing team collaborated through remoting. At 5:20 an editor specifically says they "remoted into the Mac mini."
The second half of the post raises an arguably good question about the need for fancy Macs when cloud-based workflows only require glorified terminals. But that too may be misplaced here -- it's entirely possible that the team members each do local editing work and then host their own collaboration sessions.
The extent of my 'cloud' involvement with Apple is the operating system software update mechanism and having an account to download Xcode, so that I can install a compiler + MacPorts on a new machine.
There’s also so much inefficient, bloated crap that ships with modern macOS that I would never pick it for a proper workstation these days. I have CPU meters in the system tray, and there’s always some stupid process gobbling up all my spare cycles. The other day it was some automatic iPhone backup process. (Why was that using so much CPU, Apple?) Sometimes it’s indexing my hard drive, or looking for faces in photos, or who knows what stupid thing. It’s always something, and it’s almost always first-party software.
In comparison, the cores on my Linux workstation are whisper quiet, and usually idle at 0%. The computer waits for me to give it work.
(Namely background QoS, it only runs on the efficiency cores, and more expensive activities stop when the user is active.)
If you're having an actual specific problem report it with Feedback Assistant. If you aren't, I recommend removing all that useless monitoring stuff and getting an outdoor hobby.
As an actual performance engineer, I've basically never in my life gotten a useful report from someone looking at those every day. Although other vibes-based bugs like "I feel like my battery life is bad lately" often do find something.
Are these processes behaving properly or is it in some stupid infinite loop? I can't tell. Is it considered acceptable by apple for background processes to make my efficiency cores sit at 100% utilisation more or less all the time - even when I'm on battery power? How much will that reduce my laptop's battery life?
I can't tell. I have no way to tell. It's all an opaque jungle of processes running processes. Half of them are buggy half the time, and I don't know which half. It gets more complex and stupid every year.
I swear, macOS seemed to run better 10 years ago when I had a computer that was many times slower. Strangely, at the time, there were no constant background processes chewing up CPU all the time like this. Tell me, how is any of this stuff making my computing experience better?
I think my preferred computer has a fast, modern CPU and software from a decade or two ago. Off the top of my head, I can't name a single feature added to macOS in the last decade that I actually care about. (Excluding support for modern hardware.)
If the battery life is less than you expect then there's a problem. I think that's pretty easy to notice.
It sounds like that's a bug though, you should report it. Posting on random forums about it won't cause it to get fixed.
The only way I could tell that my battery life has gone down would be by doing actual tests - but those are notoriously difficult - because I can't use my laptop at the same time. (Or, I guess I can - but I'd need to use it the same way across tests). It sounds like days of work to test my battery life with and without transient background tasks. I don't even know how I'd test that - because I don't know how to turn all that stuff off for the control.
I'm also not going to post an issue on Apple's bug tracker saying I have an intuition that my battery life is worse than it could be. That'd get deleted instantly.
I hear you that complaining online probably won't help. But I can't see how complaining about battery life in Feedback Assistant would help either. The situation is crappy.
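For what it's worth, a crude comparison doesn't take days: two idle samples per configuration give a rough extrapolation. A sketch with made-up readings (the actual sampling, e.g. parsing `pmset -g batt` on macOS, is left out, and the numbers are hypothetical):

```python
# Crude battery-drain comparison: log (hours, percent) pairs while the
# machine sits idle in each configuration, then extrapolate linearly.
# All readings below are made-up illustrative data.

def estimated_runtime_hours(samples):
    """Extrapolate full-charge runtime from (hours, percent) samples."""
    (t0, p0), (t1, p1) = samples[0], samples[-1]
    drain_per_hour = (p0 - p1) / (t1 - t0)
    return 100 / drain_per_hour

idle_baseline = [(0.0, 95.0), (2.0, 87.0)]    # 4 %/h drain
with_background = [(0.0, 95.0), (2.0, 81.0)]  # 7 %/h drain

print(f"baseline: ~{estimated_runtime_hours(idle_baseline):.0f} h")
print(f"with background churn: ~{estimated_runtime_hours(with_background):.0f} h")
```

It's not rigorous (linear drain is an approximation, and you'd want longer windows), but a gap like 25 h vs 14 h in the hypothetical data above would be big enough to put in a bug report without a lab.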
Don't worry, I am literally telling you to do this. Apple is made entirely out of bug reports. It's their job to handle them.
I would say that you shouldn't put too much effort into it, simply because of burnout.
The “bug” here is system activity I’m not deliberately invoking.
System activity can certainly cause problems like paging out all the file cache pages you wanted to use when you get back to the machine. It doesn't have to though.
Are those programs written well, or are they using so many cycles because they’re inefficient and slow? And when did I ever opt in to this? Spotlight has slowly gotten more and more horrible over time. Half the time I use it to invoke system preferences it can’t find it. Or it can’t find the applications folder. If Spotlight is this terrible, why is the hard disk indexer so busy? Is it any better engineered than spotlight? I doubt it. Likewise, I don’t want photoanalysisd looking at my photos. I don’t use that “photos by person” feature. Why does it use hour upon hour of cpu time to make this feature available - just in case I use it later I guess? Get lost.
I really wish Apple stopped adding random crappy features to macOS that I don’t use - but which burn cpu cycles. Instead, fix your shit. Indexing is fine if you make spotlight actually be good again. Photo analysis is useful if I decide it’s useful and turn it on. And maybe if Xcode and SwiftUI weren’t such a buggy, crash ridden, undocumented mess, then maybe, maybe, I’d trust you more to run random background processes.
As it stands, I don’t trust Apple - particularly their application teams - to be good custodians of my cpu.
https://forums.dolphin-emu.org/Thread-wiimote-pairing-macos-...
The bluetooth stack in Windows and Linux does.
Apple's official position is that my wiimotes are the problem and I should buy an Apple supported bluetooth game controller.
Presumably one of these four controllers:
When the people using your tools hate the tools, that isn't a good sign.
They own GitHub, they make Visual Studio Code, they made C#/.NET open-source and cross-platform, they added Linux support to Windows (twice), and they created WinGet, just off the top of my head.
Our project supports the three major desktop operating systems. I have Windows and Linux VMs that I can switch to when I need to test something on those OSes. No serious corporation is going to risk Hackintoshes.
I’m fairly sure they don’t use iCloud, which is why some of that stuff is still less than desirable.
We can probably assume that Microsoft uses some kind of Exchange set up and Google will use a version of Gmail.
Whenever I meet with people from Apple, it’s over WebEx.
I heard a rumor that they use some Oracle enterprise groupware, which is presumably https://en.m.wikipedia.org/wiki/Oracle_Beehive
I don’t find it all that surprising:
- Sun/NeXT were doing stuff together before Apple and NeXT merged
- Lots of Java stuff at Apple immediately following the merger including a Cocoa-Java bridge and WebObjects is rewritten in Java
- Oracle/Sun stuff doesn’t need to be run on Windows
- Steve Jobs and Larry Ellison were good friends
https://www.apple.com/newsroom/2025/02/apple-will-spend-more...
Opening a New Manufacturing Facility in Houston
As part of its new U.S. investments, Apple will work with manufacturing partners to begin production of servers in Houston later this year. A 250,000-square-foot server manufacturing facility, slated to open in 2026, will create thousands of jobs.
Previously manufactured outside the U.S., the servers that will soon be assembled in Houston play a key role in powering Apple Intelligence, and are the foundation of Private Cloud Compute, which combines powerful AI processing with the most advanced security architecture ever deployed at scale for AI cloud computing. The servers bring together years of R&D by Apple engineers, and deliver the industry-leading security and performance of Apple silicon to the data center.
They have redundancy and backups.
> Unless the insurance is anti-piracy insurance?
This is a big part of it, actually. Content that leaks prior to launch can reduce revenues significantly. Both from lost viewership due to people already having seen it, and from negative reviews of the unfinished early edits. Many movies change significantly for the better from early cuts.
[0] Remember that one time Pixar rm -rf'd their server and almost lost Toy Story 2 but for one manager who had a local copy of the project at home?
I wish this were an internet rule we could repeat ad infinitum.
Lucifer, Minority Report, Blindspot, and Carmichael were all leaked, and those shows were on different networks, which means it was likely a third-party company doing effects in post. I don't recall if it was ever sussed out what exactly happened and how they all got leaked, but it definitely made the industry a bit warier.
Most studios are hyper paranoid about their movie or TV show getting leaked.
> he works on iMac, which remotes into a separate Mac mini that runs Avid
So the conjecture from the article that the Mac mini isn't powerful enough is false.
> In other words, little of the horsepower being used in this editing process is actually coming from the Mac Mini on this guy’s desk. Instead, it’s being driven by another Mac on the other side of a speedy internet connection
And based on other comments here, this is a pretty common way to do things.
Why the sensationalism?
[0] https://www.apple.com/newsroom/2025/03/how-the-mind-splittin...
Not what the article says... and that doesn't follow anyways. The remote experience was terrible and the non-remote experience wasn't shown at all. How fast the Mac Mini theoretically is doesn't matter at all once you have such an insane bottleneck.
> And based on other comments here, this is a pretty common way to do things.
And? The industry is making a mistake by knee-capping its editors. It's going into seconds-per-frame territory in the video, it's as close to unusable as you can get. The article seemed to definitively prove its case that someone desperately needs to step back and look at what the requirements for these editors actually are, rather than ramming conflicting demands into each other to appease the anti-piracy insurance mobsters.
I just wish it didn't require an internet connection for authentication
Sunshine is a self-hosted game stream host for Moonlight. Offering low latency, cloud gaming server capabilities with support for AMD, Intel, and Nvidia GPUs for hardware encoding. Software encoding is also available.
1: https://github.com/LizardByte/Sunshine
I'm kinda surprised you've managed to be on HN for 5 years and never come across the concept of a "LAN" or "VPN" before, but I guess you're one of today's lucky 10000. To the first: sometimes you have machines (or VMs) local to your own network but in another physical location that you'd like to be able to access from your own system. It's a fairly significant use case, and one where no internet connection is involved whatsoever. For example, it's generally desirable to locate powerful (and in turn generally loud) servers and associated gear (including environmental control, redundant power, etc.) in physically isolated locations away from where the humans are working, for noise reasons if nothing else, though security and efficiency are important as well. While it's possible to pipe raw video over IP, a quality remote desktop solution will generally be more flexible and scalable, and doesn't require special (expensive) extra hardware and potentially additional fiber.
And for systems located on other LANs remote from your own, you can use a VPN to link them securely as if they had a direct physical (though higher latency/more jittery) link, again avoiding any exposure to the public net. That then reduces to the above. In both cases it's desirable to have zero unnecessary 3rd party dependencies.
What is clear to me is that Parsec belongs to a newer breed of remote tools, in line with TeamViewer and AnyDesk, that primarily respond to the needs of the post-ISP-firewall era, where ports are blocked by default, so peer-to-peer remote tooling becomes harder to install and administer; these have a client-server-client architecture. And Parsec builds upon this architecture by placing some secret lag-reducing sauce on their server instead of just authenticating and forwarding.
My guess is that they have a proprietary predictive and interpolation-based algorithm tightly coupled to the OS UI, and this secret sauce lives, closed source, on their backend, so you would kind of need to host a third server in the middle. Maybe we will see a competitor for the VPN niche, or an open-source alternative.
If an open-source solution arises, I bet it would require installing a server, and it would probably start with tight coupling to X11 or Wayland.
Unnecessary snark.
----
This would make sense in VPN environments though.