display screens star wars factory

If you’ve ever wished you could recreate the iconic Star Wars scene in which Luke Skywalker discovers a holographic message from Princess Leia, pretty much kicking off the entire franchise, you can now get a lot closer to that dream. Unfortunately, you won’t be using an R2-D2 astromech droid to do so, but rather a new type of display developed by Looking Glass Factory, a Brooklyn-based startup.

Today, July 24, the company announced the eponymous Looking Glass, a completely self-contained display for viewing three-dimensional, holographic content. Think of it as a second screen for your computer that lets you view objects in three dimensions without needing to put on a clunky headset.

The Looking Glass costs $600 for an 8.9-inch display (around what a decent VR setup currently costs) and $3,000 for a 15.6-inch version. The displays require only two cables (HDMI and USB) and a decently powered laptop, and can be controlled either with a mouse and keyboard, like a regular screen, or with a wireless controller.

Looking Glass cofounder Shawn Frayne said he expects the display to find a home in commercial settings before it finds a way into our homes, the eventual goal. He’s had conversations with companies, game-makers, and artists, and believes that, although there is enjoyment and value to be found in VR and AR headsets, their benefits are often outweighed by how isolating the experiences are, and how lazy we can be.

By creating something that looks similar to the computer screens we’ve plopped ourselves in front of for the last 20 years, we wouldn’t have to radically change our consumption behavior to enjoy a completely new way of looking at content. Looking Glass’s displays can also be viewed by multiple people at once, unlike a VR headset, making them far more useful for collaborative work in 3D.


Behind the scenes on the set of The Mandalorian, the storytellers and visual effects engineers creating the first live-action Star Wars television series collaborated to crack the code for what has become a game-changing creation: StageCraft, a technological marvel that immerses the cast and production crew inside their CG environments in real time with the help of a massive wraparound LED screen.

“Jon Favreau found the breakthrough that George [Lucas] was always looking for when he first explored the idea of a live-action TV show,” adds Richard Bluff, the visual effects supervisor on the acclaimed series. With StageCraft, the innovators at Industrial Light & Magic, Lucasfilm, and their collaborative partners on the project have achieved the planet-hopping magic of Star Wars storytelling that transports viewers to the galaxy far, far away, setting The Mandalorian in a largely computer-generated, photo-real environment that wraps around physical sets and real actors to create a seamless effect.

It’s staggering to watch the final effect unfold, as a team of artists and engineers known as the Brain Bar act as mission control just a few yards away from the Volume, a curved cocoon of glowing LED screens ready to transport those standing inside of it literally anywhere. “It’s exactly the same sort of technology as the large LED screens you see in Times Square,” Bluff says. “What we wanted to do was shoot on a small stage with small physical sets that could be wheeled in and out fairly quickly and then extend those physical sets on the wrap-around LED wall.” And by moving visual effects to the beginning of the filming process, StageCraft enriches the performance of the actors, and the experience for directors and cinematographers using the new methodology for more precise storytelling in a completely fabricated galaxy.

It was just about six months before filming began that showrunner Jon Favreau, executive producer Dave Filoni, and DP Greig Fraser joined forces with ILM, Epic Games (maker of the Unreal Engine), and production technology partners Golem Creations, Fuse, Lux Machina, Profile Studios, NVIDIA, and ARRI to unlock this innovative achievement. While ILM had pioneered virtual production tools and had worked successfully with LED technology on previous Star Wars films, StageCraft was still very much in its infancy at the time, a virtual reality platform that helped storytellers scout fabricated environments to set up their shots. Propelled by the support of Lucasfilm president Kathleen Kennedy and the sheer will of Favreau, who was always pushing the collaborative team to try new things and bringing together some of the brilliant minds capable of making it happen, the crew took their next steps into the larger world by crafting the prototype of the Volume.

Fresh off projects like The Jungle Book and The Lion King, Favreau was passionate about employing new technology to enrich storytelling as he began his work on The Mandalorian. But the scheduling constraints of a television show — and a Star Wars show that had to satisfy the planet-hopping scope fans have come to expect while making it feel entirely authentic and accessible, no less — meant whatever the team came up with had to appear realistic and be able to be shot on a Los Angeles soundstage without the traditional challenges of location shooting. “One of the things we wanted to do is move away from green screens and make the scale of a Star Wars TV show work,” Bluff says. “And we knew that we needed a technological innovation to push the boundaries and provide a solve for the production. Through the collaboration with Jon Favreau, Greig Fraser, ILM, Epic Games, and others we landed on the idea of utilizing video wall technology.”

Bluff, like many fans, discovered Star Wars as a child enthralled by the galaxy far, far away and the stories that unfolded there. “I fell in love with Star Wars as a kid. I would watch the movies over and over again and I would lie in front of the TV drawing the characters. I desperately wanted to get into any sort of creative environment, particularly the film world, through drawing.” When Jurassic Park broke new ground with realistic computer graphics animation, Bluff saw his opening. “I understood that the doors were about to be blown off the industry and it was going to be a race to learn this technology because it was in its infancy. So I took that opportunity with both hands!”

“It was that pioneering spirit from the early days of ILM and the birth of computer graphics and the birth of the Pixar computers that really got me into it,” adds Kim Libreri, the chief technical officer of Epic Games, who collaborated closely with Bluff and the crew. “Honestly, ILM and George’s legacy had a big influence on me choosing my career path.” One night when he was studying in college in the UK, Libreri had a dream about a realistic video game that involved flying a snowspeeder over Hoth during the battle with the AT-ATs from Star Wars: The Empire Strikes Back. “It was totally real. I was in the movie, but it was a game. I thought, one day it will be possible that a video game will look as good as a movie. And that has driven me my entire career.”

Bluff went on to land a job with ILM as part of the team of matte artists on Star Wars: Revenge of the Sith; Libreri also went on to work as a visual effects supervisor with ILM before moving on. And they’re both thrilled to be working on new Star Wars stories. “For me coming back to Star Wars in the role that I’m in advancing technology, I hope it inspires kids that were my age when I [discovered Star Wars],” Bluff says.

Nearly two decades after his first role working behind the scenes building the galaxy, Bluff recalls those early StageCraft tests in the spring of 2018, when the crew working on The Mandalorian had mere weeks to come up with the wraparound screen prototype. “We got the go-ahead to start actively developing the tech sometime in the end of March 2018. The idea was that on the 10th of June the same year we would do a test on the stage with a much smaller LED Volume wraparound screen with our picture camera, a prototype of a costume, and many different LED environments to test the technology.”

Months later, the Los Angeles set of The Mandalorian was a bustling hive of activity. At the epicenter, the Volume, a curved, 20-foot-high, 270-degree LED video wall made up of 1,326 individual LED screens topped with an LED ceiling. Inside the 75-foot-diameter performance space, a variety of physical props and partial sets, created by production designer Andrew Jones and his team, could be swapped out to match the screens and quickly transform the innovative arena into a variety of planets and interior locations.
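As a rough sanity check on those published figures, here is a quick sketch (my own back-of-the-envelope arithmetic, not from the production; it assumes the wall is a simple 270-degree cylindrical band, and the per-panel area is derived rather than quoted):

```python
import math

# Figures quoted in the article
height_ft = 20.0     # wall height
diameter_ft = 75.0   # performance-space diameter
arc_degrees = 270.0  # wraparound coverage
panel_count = 1326   # individual LED screens in the wall

# Approximate the wall as a 270-degree cylindrical band
arc_length_ft = math.pi * diameter_ft * (arc_degrees / 360.0)
wall_area_sqft = arc_length_ft * height_ft

# Derived (not quoted): average area each panel would cover
area_per_panel_sqft = wall_area_sqft / panel_count

print(f"arc length: {arc_length_ft:.0f} ft")          # ~177 ft
print(f"wall area: {wall_area_sqft:.0f} sq ft")       # ~3,534 sq ft
print(f"per panel: {area_per_panel_sqft:.1f} sq ft")  # ~2.7 sq ft
```

Roughly 3,500 square feet of LED surface, or under three square feet per panel, which is consistent with the wall being tiled from many small, individually replaceable modules.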

StageCraft’s genius lies in the way it brings together the collaborative creative forces on the first Star Wars live-action series and allows them to work together in real time. The green- and blue-screens that were once standard for shooting real actors and props before digitally swapping out the background for computer-generated environments would have been a nightmare for the post-production effects team with a central character like the Mandalorian, whose gleaming beskar armor would have reflected back the very hues intended to be cut out of the final frame. In that way, StageCraft is a boon for cast, crew, and fans alike. “Wraparound green screens cause confusion for both actors and crew, limiting spontaneity and on-the-fly creativity,” Bluff says.

But beyond the time and cost savings of eliminating the step of meticulously replacing the green screens, StageCraft allows for better lighting on set, with environments reflected back in Mando’s armor without laborious post-production detail work. And StageCraft’s magic effectively allows creators to produce high-resolution visual effects in real time thanks to the game engine technology. “Everything we are doing we would have always done in post-production and it would have always looked exactly the same,” Bluff says. “That was the goal. We never wanted to compromise the quality of the show. In fact, to the contrary, we wanted to improve on the quality because our challenge was a main character with a fully reflective costume.

“This approach was a game-changer for us, not only by eliminating the green screens but by providing everyone on set with a VFX set extension that typically isn’t created until weeks after the shoot wraps. Instead it was there on the day, people could interact with it and immerse themselves — the actors, the camera operators, the director of photography. Everybody was standing on a desert or in a forest or in some hangar somewhere. There was no need for the questions of: where am I going? What is this location? How deep does this go back? How tall is the ceiling?

“It’s like we’ve put them inside a video game,” Libreri adds, although the photo-realistic images go beyond the scope of current gaming technology. “The fact that we’re able to go to these locations, I think they got better content, better performances. It’s like going back to the days when George [Lucas] was out there in Tunisia,” or more recent Star Wars films that took the crew of Star Wars: The Rise of Skywalker to the sands of Jordan. “You get that feeling back,” Libreri says. “It feels like classic Star Wars…and it’s going to have resonance for many years.”

Such groundbreaking innovations are often the culmination of years of small, hard-won victories that propel filmmaking technology ahead inch by inch. For instance, Favreau and other filmmakers had already been experimenting with the technology on films like Favreau’s The Jungle Book and, before it, Gravity, and LED screens were first used by Lucasfilm for lighting on Rogue One: A Star Wars Story. At the time, the quality of the images meant most of the screens still had to be replaced by higher-resolution effects in post-production on the Star Wars standalone film. “We built on what was done for Rogue One; we both utilized a wraparound LED screen, but at the time they were limited to playing back pre-rendered environments from a single camera perspective,” Bluff says. “Their intention was to focus on capturing the interactive lighting from the screens, knowing that the content playing back would be replaced in post-production with a high-fidelity version which would maintain the exact same lighting.”
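The jump Bluff describes, from playing back a fixed, pre-rendered view to rendering the wall live for the tracked picture camera, comes down to an off-axis (generalized) perspective projection: as the camera moves, the viewing frustum through the fixed screen rectangle skews, so the wall shows correct parallax from the camera’s point of view. A minimal sketch of the idea (a hypothetical illustration, not ILM’s actual pipeline):

```python
def off_axis_frustum(eye, screen_left_m, screen_right_m,
                     screen_bottom_m, screen_top_m, near=0.1):
    """Near-plane frustum bounds for a screen lying in the z=0 plane,
    viewed from eye = (x, y, z) with z > 0 (in front of the screen)."""
    ex, ey, ez = eye
    scale = near / ez  # similar triangles: project screen edges onto near plane
    left = (screen_left_m - ex) * scale
    right = (screen_right_m - ex) * scale
    bottom = (screen_bottom_m - ey) * scale
    top = (screen_top_m - ey) * scale
    return left, right, bottom, top

# Camera dead-center in front of a 4 m x 2 m screen: symmetric frustum
print(off_axis_frustum((0.0, 0.0, 2.0), -2.0, 2.0, -1.0, 1.0))
# Camera shifted 1 m right: the frustum skews, producing parallax on the wall
print(off_axis_frustum((1.0, 0.0, 2.0), -2.0, 2.0, -1.0, 1.0))
```

Recomputing these bounds every frame from the tracked camera position is what lets a flat LED wall read as a deep, three-dimensional environment through the lens.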

A longtime caretaker of Star Wars storytelling who worked at the right hand of Lucas himself when he first signed on to helm the animated series Star Wars: The Clone Wars, “Dave [Filoni] brings a connection to George’s original vision,” Bluff says. “Dave is incredibly creative and is a wonderful storyteller in his own right, but the fact that he was schooled by George himself keeps everybody honest. You feel like you’ve got an open door back to what George was trying to do in the original movies. He guides us. He inspires us. He makes us laugh every day on set. He’s a wonderful, wonderful human. And everybody loves working with him.”

Libreri first got involved in discussions when Filoni was exploring using Unreal Engine and similar platforms for animated storytelling. “We talked about the idea that a TV show like Clone Wars and some episodic animated content could be done in engine,” Libreri says. Tracking physical cameras in a virtual environment was something that Favreau had helped advance with his work on The Jungle Book and, more recently, The Lion King. “As you start to look at all these different projects, Jon was always in the conversation and always pushing for what this next advance was going to be.” For years, “nobody quite had the answer,” Bluff says. “It took everybody coming together and talking about their experiences and what Jon wanted to do, whether it was Greig Fraser the DP, whether it was Kim Libreri at Epic, myself from ILM, or of course Jon and Dave the filmmakers themselves. And it was through those conversations that we came up with this idea of how we wanted to try to shoot The Mandalorian.”

“They are basically air traffic control,” Bluff says of the Brain Bar, although, huddled together just feet from the Volume, they resemble NASA’s stoic mission control room. “They are the ones that are operating the massive screens. They are bringing up all the different environments that you would see, that you would shoot against. They’re able to move mountains quite literally. They can rotate the world. They can move us from one end of the hangar to the other end of the hangar. They can add extra lighting into the scene that of course would appear to have an effect on the actors on the stage so they do many, many, many things to continue to make the camera believe the magic trick.”

“I don’t believe that the technology would have worked as well as it did without the special place that Star Wars has in everybody’s heart,” Bluff adds. “You can see it in the work that they did and the love that they had for everything that we did. Everybody brought something to this project that you rarely find anywhere else and it all goes back to George Lucas and the wonderful movies he made decades ago.”

Associate Editor Kristin Baver is a writer, host of This Week! In Star Wars, and all-around sci-fi nerd who always has just one more question in an inexhaustible list of curiosities. Sometimes she blurts out “It’s a trap!” even when it’s not. Do you know a fan who’s most impressive? Hop on Twitter and tell @KristinBaver all about them.


MOC-33070 A Galaxy Far Far Away Modular Displays – Series 2 Star Wars by Antbill from MOC FACTORY is a set in the Star Wars collection, designed by Antbill. MOC-33070 comes with PDF instructions, making it easy to build and to find any missing pieces. The set contains 1,713 pieces and was first released in 2021.


Amazon.com: Samsung Galaxy Note 10+ Plus Star Wars Special Edition Factory Unlocked Cell Phone with 256GB (U.S. Warranty), Aura Black Note10+: Electronics


Koensayr Manufacturing, also known as Koensayr Corporation, was a major manufacturer of starship components; nearly one-fifth of all starships in operation had at least one component from the company. It was especially well known for its excellent engine, weapon, shield, and sensor designs. Parts sales made up nearly 72 percent of the company’s profits.

Koensayr also produced a small number of its own vehicles, most notably the venerable Y-wing starfighter, the Sigma-class shuttle, and the K-wing bomber. It also developed a personal starskiff prior to the Swarm War.


Contrary to appearances, Geonosis’ massive droid factory was not actually destroyed during the engagement shown in Attack of the Clones, but later, in an episode of Star Wars: The Clone Wars, after the planet was briefly recaptured and then lost by the Separatists. Although this factory, and others on the desert planet, produced much of the initial stock of the Confederacy’s battle droids, including new models like the B2-series super battle droids, it was just one example of many such complexes. Baktoid Combat Automata, the company responsible for the production of the vast majority of Separatist battle droids, was owned by the Techno Union, which, like the other major corporations that supported the Confederacy, was interplanetary and boasted a presence in many systems.

The wealth of possible locations for additional Separatist droid factories is a good reminder that the Clone Wars were farther-reaching than could ever be depicted in the films or even the show. Because the construction of a droid army would have aroused the suspicion of the Galactic Republic, the factories responsible would have been hidden away on distant planets loyal to the Separatist cause from the outset. And given that one of the founding principles behind the formation of the Confederacy of Independent Systems was a disillusionment with the influence of the Core Worlds over the rest of the galaxy, many of their member systems would be so far-flung as to be outside of the effective reach of the Republic even when the war began. The resources and locations available to the Separatists for further factories would have only grown as they recruited more systems to their cause during the Clone Wars.

The potential for confusion over the destruction of this particular facility may be owing to the fact that it differs from the usual focus of the films. While the Battle of Geonosis was significant in that it began the Clone Wars proper, it was also indicative of many similar kinds of battles that would be fought over the coming years as both sides of the conflict burned through considerable resources all across the galaxy.


The VP3268-4K display is amazing. My work is known for its temperature of light and shadow, and this display brings that to a whole new level. The colors are extremely vibrant. This display ROCKS.


Research into volumetric displays is even older than Star Wars itself. And the approach has a crucial advantage over holograms, because it requires much less computational power. But despite decades of effort, free-space displays are still limited to small, crude drawings, and they have struggled to get off the ground commercially, says Daniel Smalley. Still, he is hopeful that work bringing together different and more-practical technologies, including acoustic levitation, will help volumetric display to find its killer app. It might be used in detailed interactive mock-ups for medical trainees, perhaps, or to give people the ability to chat with distant relatives rendered in 3D. And the Sussex team’s acoustic method wouldn’t necessarily require a long phase of development to make its way out of the laboratory, says Smalley. “I would make a bet on this technology becoming commercial before many of the other technologies we work on.”

A globe in a volumetric display. This was shot using exposure times of 0.025–20 seconds. (Only images drawn within 0.1 seconds appear as continuous images to the human eye.) Credit: Eimontas Jankauskis/Univ. Sussex

The Sussex technique does have a drawback: it needs speakers on two sides of the display, which restricts a viewer’s ability to interact with the display and limits its size. But with hardware upgrades, Subramanian says it could be possible to use a different kind of acoustic wave to create images with speakers on just one side. The researchers are also working to improve their understanding of how the bead responds to the forces acting on it, which would allow them to move it faster, to draw more-complex images by levitating multiple beads at once and to integrate sight and touch more closely. In the current set-up, the tactile sensation and image don’t occur in exactly the same place, because the fields needed to create them can interfere with each other. Ochiai’s group has already found a way to bring together touch and sight by using fields that do not interfere: an acoustic field for tactile feedback and a laser to draw tiny images in plasma. The group has used the approach to draw braille dots in the air.
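The speed requirement follows directly from persistence of vision: the bead must retrace the entire drawing path within roughly 0.1 seconds, so the required average speed scales linearly with path length. A back-of-the-envelope sketch (my own illustration; the 0.1-second window comes from the article, but the path lengths are made-up examples):

```python
# Persistence-of-vision window noted in the article; the full image
# must be redrawn within roughly this time to appear continuous.
PERSISTENCE_WINDOW_S = 0.1

def required_speed_m_per_s(path_length_m: float) -> float:
    """Average bead speed needed to trace the whole path once per window."""
    return path_length_m / PERSISTENCE_WINDOW_S

# Hypothetical path lengths: a small glyph, a butterfly-sized figure,
# and a large drawing.
for path_m in (0.05, 0.2, 1.0):
    print(f"{path_m:4.2f} m path -> {required_speed_m_per_s(path_m):4.1f} m/s")
```

Even a modest one-metre path already demands a bead moving at ten metres per second, which is why faster, better-controlled bead motion (or multiple beads) is the route to more-complex images.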

Inevitably, any 3D display gets compared to Star Wars holograms. Sussex’s technique makes bigger images than previous similar methods and incorporates sound, so it brings us closer to recreating that, says Qiong-Hua Wang at Beihang University in Beijing, who works on 3D display devices. But the images are still tiny and far from photorealistic. Creating the kind of 3D effect in Star Wars by any means could take ten years, or even longer, she says.

But Barry Blundell, a physicist specializing in 3D technologies at the University of Derby, UK, cautions against trying to use volumetric technology to make rich, photorealistic displays. “No one would look at a sculpture and compare it to a painting,” he says. He adds that efforts to compete with holograms have often led to commercial dead-ends, and that the displays are best suited to applications that would be impossible in other media but don’t require high detail, such as interactive displays capable of showing complex 3D movements.

Interactivity could be powerful, Smalley says. Surgeons in training might use such displays to practise threading a catheter through the vessels of the heart, for instance. With one million moving particles, he adds, “you can have a disembodied face — do face-to-face telepresence”. Creating avatars of people in a space could give a stronger sense of presence than a photorealistic image seen through virtual reality, he says.

At the Sussex lab, a million-particle display seems a long way off. Only time will tell if the group’s approach will pave the way to such numbers. After showing off his sphere’s limited repertoire of tricks, Hirayama shuts off power to the speakers. The flapping butterfly vanishes, and the bead that created it drops and bounces gently on the display’s base. Hirayama picks it up and sets it in a box with hundreds of others, ready at any time to create magic in thin air.