Posted by radeeyate 6 days ago
There were some perspective and lighting issues, and a few small glitchy artifact moments. They probably showed about 5 minutes of footage from various parts of the movie.
I managed to pull 8K from some 70mm ride footage. I think it might have stretched to 10K if I'd had the equipment.
Wizard of Oz is 3x35mm for the color scenes, though. So, in theory you could use some clever algorithm to almost certainly create 8K transfers from it by using the tiny differences between the three frames.
> From my experience, 4K was the absolute edge you could pull from the best 35mm films... I managed to pull 8K from some 70mm ride footage.
This matches the expert opinion I've heard elsewhere, as well as what was done when scanning Blade Runner (a historically important film now in the national archives). They scanned the 35mm camera negatives at 4K and the 65mm VFX camera negatives at 8K.

I'd add that in film scanning, extended color depth is also very important to capturing the full fidelity of what's on the celluloid. An adjacent high-impact factor is how the grain structure of film is handled in post-scan processing. This entails artfully balancing a tricky set of trade-offs because the highest-frequency edge and chroma details are down in the noise floor. Extracting more signal is a knife-edge balance between amplifying noise on one side and suppressing filmic texture on the other. Ultimately, these choices move beyond technical correctness to aesthetic judgement, which I imagine has to be the sweat-inducing part of the job.

Anyone really interested in resolution, film scanning and digital cinema should see this in-depth comparison, shot by an expert ASC cinematographer specifically to be a definitive set of proofs for the industry. He compares (and pixel-peeps) identical shots from a range of top digital cinema cameras (Arri Alexa, RED, etc.) against a high-quality 6K film scan: https://www.yedlin.net/ResDemo/ResDemoPt2.html. I found the results pretty surprising.
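To make that knife-edge concrete, here's a toy 1D experiment (purely illustrative; the names and parameters are mine, not anyone's actual restoration pipeline). Smoothing a noisy "scan" with progressively wider Gaussian kernels suppresses grain-like noise, but it flattens the sharp edge at the same time:

```python
import numpy as np

def gaussian_kernel(sigma):
    """Normalized 1D Gaussian kernel, truncated at ~3 sigma."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

rng = np.random.default_rng(1)
n = 512
clean = np.where(np.arange(n) < n // 2, 0.0, 1.0)  # one hard edge
noisy = clean + rng.normal(0.0, 0.15, n)           # edge plus "grain"

for sigma in (0.5, 2.0, 8.0):
    smooth = np.convolve(noisy, gaussian_kernel(sigma), mode="same")
    grain = np.std((smooth - clean)[: n // 4])   # residual noise, flat region
    slope = np.max(np.abs(np.diff(smooth)))      # edge sharpness
    print(f"sigma={sigma}: residual grain {grain:.3f}, edge slope {slope:.3f}")
```

Wider kernels drive the residual grain down and the edge slope down together; no setting wins on both axes. Real grain-management tools are far more sophisticated (frequency- and motion-aware), but they face the same underlying trade-off.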
> Wizard of Oz is 3x35mm for the color scenes, though. So, in theory you could use some clever algorithm to almost certainly create 8K transfers from it by using the tiny differences between the three frames.
Yes, I really hope that whoever handled the source preparation contributes a technical paper or note on what they did and learned in the process. What I've found so far is far too vague technically, but it implies they started with the 3-strip camera negatives (perhaps from an earlier scan). As you mentioned, that would enable interesting possibilities for noise reduction by leveraging the differences between the three strips. I kind of hope that sort of noise reduction was actually done by film preservationists in an earlier scan (perhaps for a Blu-ray or UHD release), as the archival source scans deserve the highest fidelity.
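For what it's worth, here's a minimal numpy sketch of the kind of multi-frame fusion the parent suggests. This is an assumption-laden toy, not the preservationists' actual method: it pretends all strips record the same channel and uses integer-pixel phase correlation, where a real pipeline would register to sub-pixel accuracy and account for each negative carrying a different color record:

```python
import numpy as np

def estimate_shift(ref, img):
    # Integer-pixel displacement of img relative to ref via phase correlation.
    R = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    corr = np.fft.ifft2(R / (np.abs(R) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = []
    for p, size in zip(peak, corr.shape):
        p = p - size if p > size // 2 else p  # wrap to a signed offset
        shift.append(-p)                      # peak sits at minus the shift
    return tuple(shift)

def fuse_strips(strips):
    # Align every strip to the first and average. Grain is uncorrelated
    # between negatives, so it averages down, while shared image structure
    # (which is correlated) is preserved.
    ref = strips[0].astype(float)
    aligned = [ref]
    for s in strips[1:]:
        dy, dx = estimate_shift(ref, s)
        aligned.append(np.roll(s, (-dy, -dx), axis=(0, 1)).astype(float))
    return np.mean(aligned, axis=0)
```

With N well-registered strips, uncorrelated grain variance drops by roughly a factor of N, and the sub-pixel misalignments between strips are exactly what super-resolution methods exploit to recover detail below a single strip's grain floor.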
As I said in my other, longer post in this thread, I really hope that this effort for a Sphere version will also yield a less extreme 2.39:1 (CinemaScope), 2.76:1 (Ultra Panavision) or even IMAX version from the 4:3 aspect original. I appreciate and value wide-format cinema but personally think the Sphere FOV is so incredibly extreme it has more negatives than positives for cinematic storytelling. FOVs as extreme as Sphere's are really only suitable for theme park attractions, amusement park rides or head-mounted immersive stereo 3D. I realize the opportunity to collect $100 a ticket at the Sphere was too good for Warner to pass up, but we're talking about a cinematic film that's become a cultural artifact. I'm not so purist as to balk at the idea of extending such a classic film with VFX and AI assist, but a more "serious" cinematic version would be wonderful in addition to the over-the-top Sphere format. Hopefully, the Sphere version is a limited-time release window and there's a subsequent IMAX release (15 perf / 70mm please) and an eventual 2.4:1 UHD (which I'd definitely buy).
I mean, it's perfectly watchable on a 1080p display, but in 4K it's nowhere near the kind of 4K quality you get with modern digital cameras and modern lenses. Sure, you can watch its 4K scan (heck, they scanned it in 8K), but most of those pixels above 1080p are just showing you film grain and blur. Which is nice in terms of looking like it would with a real projector! But it's not "4K quality" in terms of resolvable detail.
It really needs to be watched, because opinions will vary based on aesthetic interpretation, but in broad strokes it's pretty clear that above well-shot, high-quality 2K (aka 1080p), other variables beyond resolution begin to matter more (color depth, film stock, shot style, etc.). This supports your contention to an extent, but the test also shows that beyond well-shot, high-quality 2K, going to 4K doesn't matter very much either. To be clear, it can matter sometimes, but other factors can easily be equally or more important than resolution once beyond 2K.
From my own experience and experimentation, I already felt many in the industry over-weight the importance of 4K resolution vs 2K in overall image quality (assuming both are equally high-quality and well-photographed), but this convinced me 4K over 2K makes even less difference than I thought. And this was under controlled test conditions with pixel-peeping. At real-world theatrical or couch viewing distances, it's hard to think 4K matters much at all over good 2K. I already knew I will NEVER upgrade to 8K for theatrical or couch-distance viewing of cinematic content (and I have $50K of gear in a dedicated custom-built, soundproof, lightproof home theater room with a high-end laser projector and 150-inch screen), but this shows even the idea of 8K is ridiculous overkill and a complete waste of money, bandwidth and tech for cinematic content viewing. (Note before the inevitable objections: 8K can be useful for 2-ft-distance desktop viewing of synthetic content and can be necessary for 2-inch-distance head-mounted VR.)
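The usual acuity arithmetic backs this up. Assuming the common ~1 arcminute resolution limit of 20/20 vision (a rule of thumb; real perception also depends on contrast and content), you can compute the distance beyond which individual pixels stop being resolvable. The function and variable names below are just illustrative:

```python
import math

def max_useful_distance(screen_width_in, horizontal_pixels, acuity_arcmin=1.0):
    """Distance (inches) beyond which one pixel subtends less than
    acuity_arcmin of visual angle, i.e. finer pixels buy nothing."""
    pixel_in = screen_width_in / horizontal_pixels
    theta = math.radians(acuity_arcmin / 60.0)  # arcminutes -> radians
    return pixel_in / (2 * math.tan(theta / 2))

# A 150-inch diagonal 16:9 screen is about 130.7 inches wide.
width = 150 * 16 / math.hypot(16, 9)
for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: pixel structure vanishes beyond "
          f"~{max_useful_distance(width, px) / 12:.1f} ft")
```

For a 150-inch screen this works out to roughly 19.5 ft for 1080p, 9.8 ft for 4K and 4.9 ft for 8K. In other words, 8K can only be distinguished from 4K by sitting closer than about 10 ft to a 150-inch screen, and its own pixel grid is invisible past roughly 5 ft, so at any normal couch distance and screen size, 8K sits below the acuity floor.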
Most of all, getting real 4K detail requires a level of precision in camera focus that frequently just isn't there. Whether it's an anamorphic lens intentionally blurring everything except the center of the shot, or a shallow depth of field whose out-of-focus falloff becomes even more obvious in 4K. You can get some astonishing detail with fixed landscape establishing shots, but it really depends on the lens. The only place you consistently get full 4K sharpness is in 3D animation.
That being said, if you watch on a projector, 4K can be nice because you really can't see the square pixels anymore. You don't usually get more detail than with 1080p, but it does feel like the detail limit comes from the sharpness of the lens rather than the pixel grid. Even so, it barely matters. Spending money on brightness and contrast is 100x more important.
100% agree. Relevant examples: "The Batman" (2022), a gorgeous film shot with ARRI ALFA anamorphic lenses intentionally modified and detuned to achieve a specific, "dirty" look and character. And "Blade Runner 2049" (2017), with Roger Deakins winning the Oscar for Best Cinematography even though he intentionally chose to shoot primarily on the Arri Alexa XT, a native 3.4K-resolution camera. The BR 2049 4K UHD is widely considered a reference-grade disc and one of the best examples of what the 4K UHD format can achieve, despite technically being a 3.4K upscale.
And a personal example: this year's Super Bowl broadcast, which was produced and distributed in 1080p HDR10, looked shockingly good on my 150-inch calibrated screen, enabled by a combination of best-in-class cameras, lenses and production talent with a high bit-rate throughout the distribution chain. It turns out when your primary cameras have $85,000 Canon glass on the front, it makes a difference (especially when Canon has a team of techs on-site all week to custom tune the lenses). Canon also loaned them a couple of prototype super-long lenses able to instantly switch to a shallow depth-of-field mode with zero change in aperture or shutter. One of those created a shot that made me literally rewind my DVR and pause just to appreciate it (and then go look up how they did it later). It was a beautiful full-screen close-up of one player's face with perfect soft bokeh, from a camera at least 150 feet away. Incredible.
> if you watch on a projector, 4K can be nice
It's interesting because I upgraded from a very high-quality native 1080P DLP projector (with pseudo-4K pixel shift) to high-end native 4K laser only about 18 months ago (fairly late). I never saw pixels on the 1080P projector except when displaying test signals during calibration (and even then, barely). I'm glad I waited a few years for 4K laser to more fully mature because, while there was an uplift in overall image quality going to native 4K, it was almost disappointingly minor. Even having waited so long, I feel like a meaningful portion of the 4K upgrade cost was wasted. Frankly, I'd still be perfectly happy with the (very good) perfectly calibrated 1080P DLP projector that's now nearly 7 years old.
If I were going to put more money into my AV system (and I'm not... because it wouldn't make anything look or sound any better), I'd invest in an even better projector lens. But that wouldn't matter unless Hollywood started releasing higher-fidelity content. And even if they did, it wouldn't matter because there's no way to get much of that higher-fidelity content on my screen. Hollywood's not releasing anything beyond 4K HDR10+ UHD discs anytime soon (and maybe ever), and streamers don't want to distribute content at scale beyond 20-25 Mbps. OTA broadcast could deliver slightly better fidelity (in theory) once (if) ATSC 3.0 ever rolls out widely, but broadcasters have already made it clear they're going to divide any extra bandwidth the FCC gives them into more shopping channels at low bitrates.

Frankly, I think we're pretty much at the end game for source quality in cinematic content for theater or couch-mode viewing. Now the only remaining battle is getting more people upgraded to gear, calibration, signal path and source content which can fully and correctly display what properly encoded high bit-rate 4K HDR10+ already provides (I include Dolby Vision in that because, in practice, it's pretty much the same quality as well-encoded HDR10+).
I also agree with you that it's hard to even see a noticeable difference between good 2K and 4K except with synthetic content like CGI animation. I think people who report a significant difference are either experiencing a placebo effect or had something wrong with their 2K setup, signal chain or content, so 4K looked better not because of the resolution increase but because it fixed whatever was wrong with their 2K chain.
> Spending money on brightness and contrast is 100x more important.
Indeed. And also HDR, higher-quality source content (i.e., less compressed), avoiding digital keystone correction or any other subtle degradation in the signal chain and, of course, ambient light control. I always tell new projector owners to first leave their projector turned off, darken the room to viewing mode and then sit there in the dark staring at their blank white screen. Then I point out, "that white is the blackest black you'll ever see". And that will (sometimes) finally convince them to get serious about light control. :-)
And then I walk by the row of 65- and 75-inch screens at Costco advertising "Amazing 8K Upgrade" to people who watch streaming movies from a couch 8 feet away, and just shake my head. They're hawking grade-A homeopathic bullshit to nice people who don't know any better, and they should feel bad for lying to sell people something they don't need.
It’s possible the sources I was looking at were thinking of later eras of film, which I’d assume would have higher quality (and the lenses certainly improved).
Is this real resolution? Or is it "YouTube" resolution?
Also, did they upscale the frame rate to 48fps??? Hard to tell from the video, but it looks like that's the case. If so, these people are ridiculous hacks and should never be allowed to touch another classic film. ("24fps would be disorienting on such a large screen!" Maybe that's a sign this project should never have left the boardroom.)
Doesn't seem like it'd be an unreasonable choice to me if it's done well (I didn't notice any interpolation artifacts, checking through frames). People going to see a film at the Sphere probably aren't format purists expecting to see an entirely faithful and untouched version of the film, but rather a version adapted to take full advantage of the unusual medium.
Probably isn't done yet
This sounds awful, as if Google thinks direction and editing are technological limitations rather than artistic choices. This entire project seems fit for a culture that loves content and hates art. If these tasteless jokers ever do this to Stalker I will probably riot. (“Carefully composed shots is for anachronistic dweebs, our society demands an Experience Like No Other.”)
> Every change, Hays notes, was made in close collaboration with Warner Bros., to ensure continuity with the spirit of the original.
Considering Warner Bros. Discovery’s track record is 3 years of selling out their own directors and writers, I would be more confident if Google were left to their own devices! DeepMind likely has more respect for the film than Warner Bros. management.
Last time I heard this kind of talk, they butchered Lion King with the CGI remake. Getting similar vibes here.
Or when DVDs introduced that feature where you could switch camera angles. Well, turns out you don't want to do that because selecting the angle is part of the storytelling. So the feature basically died.
The "bottleneck" for movies is not so much the realism. There are strongly diminishing returns going from filmstrips* to movies to color films to HD to 4K to 3D, at each step you gain less and less. The core is still the story, the characters, the worldbuilding etc.
We see the same thing with video games. Until sometime in the 2000s, we would always ask about a new game, "How are the graphics?", and we'd marvel at new graphics capabilities. That's not really a big thing anymore. Going from GTA 3 to 4 to 5 didn't increase the fun in proportion to the graphics quality. I mean, just look at the popularity of Minecraft.
The real problem is that the entertainment industry is bankrupt creativity-wise. They have no idea how to make really new stuff. Everything is a remake. Note that they aren't introducing this new medium with a new story, they are refurbishing an old movie.
*filmstrips used to be kinda like a slideshow where you'd insert film into a projector and manually twist a knob to go to the next one, each slide showing a still frame and some text, telling a story. Fun times as a kid, not sure if it was as big in places other than the Eastern Bloc countries. Like here https://kultura.hu/uploads/media/default/0003/04/thumb_20396...
I agree. For cinematic content viewed in a theater or living-room couch context, going from analog SD to digital HD was huge. Going from HD (2K) to 4K can be good, but the quality gain comes mostly from more bits being devoted to the compression than from the extra pixels. Most theatrical digital presentation is in DCP format and still 2K, and people think it looks great.
Other video engineering things most people don't know:
* Well-done HDR10 (or Dolby Vision) can contribute hugely to image quality. For example, I'd choose a 2K movie in HDR10 every time over the same movie in 4K without HDR.
* Theatrical 3D presentation is generally pretty bad and should be avoided if you care about visual quality. It's often around half the brightness and half the resolution, and yet costs more. Even done ideally, the end effect doesn't deliver anything like how your eyes actually see a real environment. The details get technical, but theatrical 3D projection is an unnatural, artificially constructed effect that's just weird. People claiming "It looked just like reality!" just shows the power of suggestion and placebo, because objectively measured, it's just not. Famous directors don't usually come out against theatrical 3D the way they do against some other things, because 3D is a pricey upsell that's mostly margin for studios and exhibitors, so it generates serious money. Privately, though, many of them despise 3D from both a quality and an aesthetic viewpoint.
It's about financial risk calculation. They aren't willing to take a bath on a flop. They're doing these things because it's financially less risky.
David Lynch used ML to remaster Inland Empire, which was shot on a digital camcorder and was simply too dark and blurry. This was an excellent use of the technology. Blowing up The Wizard of Oz for the sake of tech bros and tourists is a terrible use of the technology.
“Google” isn’t making the artistic decisions here - there’s a full production staff from the studio doing that. Google is making what they ask for.
>I am … saying that these edits to the film are tasteless trash
Making such a claim with zero knowledge or experience is a pretty bold move - how are you so confident here?
While I didn’t get to see the private preview shown at the sphere this week, I’ve spoken to about a dozen people who did and they were all very positive about it.
1. reading dialog cards, especially since I can lip read a bit and that's not what the characters are saying
2. the truly awful "period" music added
3. the grainy black and white
I've seen some work on the 1929 film "Wings" that was remastered in high definition, with sound effects added and some scenes lightly colorized.
I'd like to take that further:
1. remove graininess and fuzziness with AI, and of course all scratches, etc.
2. add foley sound effects
3. replace dialog cards with dubbed dialog, lip read if possible
4. add a decent music sound track (Like what was done with "Metropolis")
5. colorize it!
These are all very doable today, as I've seen each done in a small way.
Why should we suffer all the limitations of silent movies? I just want to enjoy it, not suffer.
Besides, how many movies in the last 50 years are made in B+W, use title cards, have a soundtrack of someone plinking on a pipe organ, are fuzzy and grainy, etc?
Most people won't even watch a B+W movie, let alone a silent one.
P.S. Reddit has a subreddit for colorized photos. There is some amazing work done there. It really makes the photos more interesting. My wallpaper is showing one right now.
The biggest technical problem is that a 165-degree FOV is not just ill-suited to theatrical storytelling, it's actively harmful, because it significantly constrains the compositional and creative choices a director and cinematographer can make. Historically, Hollywood has experimented with a variety of wider (or taller) than typical cinematic formats including Omnivision, Circle-Vision 360, MagnaVision, Cinerama, IMAX and many others. Over many decades of experimentation, it became clear that, for cinematic storytelling, formats up to around 2.5:1 were mostly upside, assuming the costs and space could be supported. Extra-wide formats like Cinerama and IMAX had creatively useful upsides but came with some significant downsides which could be minimized with careful handling. Ultra-wide formats like Omnivision, Circle-Vision and, now, Sphere, were primarily useful only for theme parks and short "You Are There"-type features such as Disneyland's Circle-Vision Grand Canyon tour. It can be helpful to refer to the reference chart on this page showing the different FOVs in a typical theater overlaid with the SMPTE, THX and 20th Century Fox recommended FOVs: https://acousticfrontiers.com/blogs/articles/home-theater-vi...
Experimentation showed that ultra-wide formats, which are more than double typical cinematic FOVs and were originally developed for world's fairs, are simply ill-suited to cinematic storytelling. Beyond audience fatigue during longer run-times, significant optical-distortion challenges and high costs, perhaps the worst drawback was the director losing much of the ability to signal to the audience what's important through framing and composition. This signal channel between director and audience usually goes unnoticed by most viewers, but it's a profoundly important storytelling tool for directors and cinematographers. To be fair, I do think immersive/surround visual formats can be useful in the context of a theme park attraction, amusement ride, VR headset or interactive gaming. They just don't work well for cinematic storytelling - like Wizard of Oz. It's a good tool being used for the wrong job.
Recently, I screened the "Postcards from Earth" movie at the Sphere. This movie was created specifically to launch the Sphere and is their featured demo. And they indeed struggled mightily with the issues I've outlined. Ultimately, they chose to mostly not use more than about a 60 degree slice from the center of their 165 degree canvas, at least for anything significant to the story. All that very expensive, compromise-causing extended FOV was relegated to ambient scenic support except for a few brief "stunts" where some large object would arrive from overhead or an edge. But even those would quickly move from being at the edges (and too close/big), to exist in the center 60 degrees like everything else. Also, Sphere content must strictly limit any camera panning, tilting or side dollying to avoid causing motion sickness.
The issues I've outlined above are primarily "production" problems with ultra-wide FOVs; the extreme format also causes significant "presentation" problems. These come from the Sphere going all-in on creating such a large, extremely wide-angle, wrap-around presentation that fills the entire visual field for 17,000 seats. Unfortunately, making that "feature" the top priority requires other important aspects of visual presentation, like contrast, dynamic range and resolution, to be significantly compromised. A key problem is that a 165-degree wrap-around screen illuminates itself, nuking the contrast: the sides down near the horizon line face each other and shine directly at one another.

Another significant issue is that it's almost impossible to shoot or present real-world camera content able to fill the entire 165-degree Sphere screen with a single natural image. As near as I could tell, the entire Postcards from Earth movie doesn't contain a single full-screen frame shot with one camera. It's all CGI, with occasional real-world camera shots composited into small frames within the wrap-around CGI field. This is because it's incredibly challenging (if not entirely impossible) to photograph a single image that wide and tall while keeping the perspective from being severely distorted. During the "Planet Earth"-type scenes, the director clearly had to go to great effort to keep any real-world object with straight lines from getting too big.

On top of the significant cinematic, compositional and technical issues, the content of Postcards from Earth is also weak. The story was trite, shop-worn and heavy-handed. The acting, music, cinematography, etc. were overall weak - basically what I'd call "pretty good for a video-game cut scene but certainly not AAA cinema grade."
In terms of nice things to say... well, the audio presentation wasn't bad. By which I mean it wasn't great, but it was impressively good for that huge a space dominated by a massive non-parabolic reflector. Basically, the massive size and unusual shape of the space make it extremely challenging to deliver a decent audio field to the majority of seats. IMHO, the audio engineering team over-achieved in the degree to which they were able to address many of those challenges. There is a huge number of tuned speakers hidden behind the acoustically semi-transparent screen, driven by a lot of DSP power. Very expensive and technically quite difficult. Unfortunately, the resulting audio, while technically impressive given the significant constraints, still isn't nearly as good as a well-tuned flagship Dolby Atmos, THX-certified theater in Hollywood or Manhattan. Another unfortunate aspect: the perforations in the screen required for acoustic transparency further reduce the screen's peak brightness.
If you want to experience today's highest-fidelity theatrical imagery for cinematic storytelling, the Sphere isn't it. Sadly, it's not even close (which makes the >$100 price for a bad movie, presented poorly, especially egregious). The best you can experience today is visiting one of the 31 real IMAX cinemas (out of 1,700 IMAX-branded screens, aka "LieMAX" screens), but only when they are showing a movie A) distributed in the full 15-perf/70mm 2D IMAX format, and B) specifically shot to utilize the full 1.43:1 aspect ratio of the full IMAX format. Unfortunately, that's a small minority of what's shown on those 31 screens. Most films shown on IMAX screens weren't really shot for the full 1.43 IMAX format (which is super tall compared to normal cinema aspect), so they're just open-gate (unmasked) versions of films primarily framed and lit for typical wide-format exhibition (like 2.39:1). Also, avoid anything shown in 3D, because the 3D projection process (IMAX or not) always significantly reduces brightness, contrast and resolution vs 2D projection.
* There are specific reasons for this correlation driven by the additional on-set shooting requirements placed on the cinematographer, lighting director and camera team to support VFX. This takes time and focus away from artfully lighting and lensing the live action elements and producers don't always budget sufficient time for both. When something has to give, it isn't going to be shooting the VFX plates.
* There are specific aesthetic reasons driving this. On a film with 2000 VFX elements that will later be layered into almost every live action scene, the cinematographer and lighting director are understandably reluctant to light or lens expressively beautiful shots because matching those shots can be more challenging for the VFX teams. It's easier (and faster) for everyone if the cinematographer lights and lenses with a more flat look. I recently saw an interview with the cinematographer on the Marvel series Loki (Season 2) and he said that he talked extensively with the VFX director specifically about this issue and the VFX director was unique in telling him, "Nope, shoot it how you really want to and we'll match you." Obviously, this was enabled in-part by the Marvel-level budget and talent. But it's interesting that it's rare enough to be notable.
* Ultimately, there's good evidence that VFX-heavy films don't have to look "flat HDR boring". We have counter-examples like "The Batman" (2022) proving a VFX-soaked superhero movie with beautifully expressive lighting and principal photography is possible. It's just hard, requires more time and money, and you need highly skilled people.
The thread is here and includes links to several good videos with example clips: https://news.ycombinator.com/item?id=43560111
On the outpainting: I think in the documentary you can see the "bones" of one of the off-screen characters in a window, so it looks like they have some sort of adaptation, perhaps to manually animate or constrain the AI model.
The last film I've seen in an ultra-wide format like this was the original 3-strip version of How The West Was Won, which I actually really enjoyed. I think it's important to remember that most of the important action is constrained to the center, and the edges are really just there to fill in your field of view. At least, that's how this should work.
I definitely agree. I saw Lawrence of Arabia in Hollywood presented in the original Super Panavision 70 (2.20:1) and it was sensational. However, the Sphere format is far wider than even that. Plus the Sphere format is both wide and tall, and the horizontal curvature yields a wrap-around that's nearly 180 degrees. And the shape of the Sphere's internal screen is weird, sort of like sitting inside an empty, cavernous NFL football helmet (note: the internal screen shape is very different from the shape of the Sphere's exterior dome). The internal screen is made of multiple complex compound curves. This inconsistency makes rectilinear-esque projection mapping even more obviously wrong, especially with any camera motion. So don't confuse your appreciation for "How the West Was Won" in wide format with the theatrical abomination that is the Vegas Sphere.
I think there's lots of reason for optimism around Extra-Wide formats like IMAX and Panavision Super 70. However, I struggle to find anything positive in the trade-offs baked into the much more extreme Sphere format. All engineering in this area is full of trade-offs but with the Sphere it's clear they optimized almost exclusively for huge capacity (up to 20,000 people), the highest possible "First 90-seconds 'Wow!' impact on naive audiences", and the most impressive external and internal conceptual 'curb-appeal'. They wanted to be obviously, overwhelmingly "biggest, roundest, tallest, widest screen EVAH!" And to over-achieve on that dimension they intentionally chose to reject all aspects of technical balance which might mitigate the worst quality trade-offs. In the design brief they went all-in on FUCK the contrast, brightness, parallax and any hope of cinematic storytelling. Just make the screen BIGGER, WIDER, TALLER (and cram MOAR seats under it).
In that sense, I guess they understood their market, because it fits the worst aspects of the traditional Vegas low-brow aesthetic of big, garish, tasteless, attention-seeking spectacle focused on exterior curb-appeal and shallow initial impact yet ultimately delivering low-quality disappointment. And it looks like it's working, because they're clearly making money. They're even talking about replicating the Sphere in other major cities around the world, although I suspect the format may not have staying power outside the unique context and aesthetic of Vegas. And to be fair, the Sphere may be a fine concert and event arena (I didn't see it in that form and the reviews are mixed). I only consider the Sphere an abject failure in trying to be a theatrical venue for high-quality visual presentation of cinematic storytelling.
I remain open-minded and even hopeful regarding other potential advances in wider-format and alternative-format theatrical exhibition. I'm also open-minded about new storytelling potential that might be enabled by immersive head-mounted stereo VR. But I simply love the medium of cinematic storytelling too much to remain silent when the Sphere markets itself to the naive masses as being the ultimate in high-quality theatrical presentation when in reality the format chose to be so over-the-top extreme, it's literally broken for the purposes of high-quality theatrical presentation of cinematic storytelling. And I went to the Sphere sincerely hoping to be blown away by an unexpected miracle of modern day engineering overcoming the challenges of the format, but alas...
Wait, what? When did Google DeepMind open an Atlanta office? That seems like a news story in and of itself... maybe I’ve been under a rock the last few years.
https://blog.google/inside-google/company-announcements/atla...
In this blog post from 2013, they refer to receiving ISO 50001 certification for an already existing data center in Douglas County, which is apparently considered part of the Atlanta metro area.
https://blog.google/outreach-initiatives/environment/pushing...
To your point about Google DeepMind’s Atlanta location’s history, I asked Google Gemini, since I assumed it might have tuning specifically for Google info. According to Gemini, Google Brain had offices in Atlanta, while DeepMind did not pre-merger.
Here’s a list of all their locations I found: