FUTURE OF FILM
IMMERSIVE BROADCAST
For the past few decades, we have lived in a digital 2D world, delivering and absorbing information through screens of all shapes and sizes. Times have changed, and new technologies have emerged that give us the ability to communicate through wearable devices and advanced sensor systems. With these resources, a new way to offer entertainment has arisen.
EMBRACING CINEMA
We’ve always turned to the big screen for the stars and scenery we love. Now it’s time to take the foundations of those elements and merge them into 3D environments. By combining digital storytelling with the interactivity of the future web, we aim to capture the true meaning of film in an entirely new kind of experience. With the help of those who love to lead the way, a lot of greatness is in store.
HARDWARE FUNDAMENTALS
When people think of the Metaverse, they automatically assume it involves some kind of VR headset. Although a headset is definitely the most complete way to experience it in full force, there are all kinds of ways to deliver visual extensions for user enjoyment. Building on the sci-fi fundamentals we have dreamed about, some of those realities are not far off from taking place. It all comes down to the practicality of how things actually work versus the magic of computer graphics. Below are a few foundations we aim to leverage for both influencers and consumers.
MIXED REALITY
The most common way to consume the different avenues of the immersive web is through virtual and augmented reality headsets. These are commonly available, and we are constantly improving their usability as time goes on. Users will be able to join live broadcasts and shows to share and enjoy brand-related events. As we continue to roll out more features and broader availability of this kind of hardware, you will start seeing the opportunities and use cases expand, opening doors waiting to be explored.
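As a rough illustration only, the sketch below shows how a headset user might enter a live broadcast from a browser using the standard WebXR Device API. The function and scene names are placeholders rather than our production player, and it assumes the @types/webxr type definitions are installed.

```ts
// Minimal sketch: request an immersive VR session and start a per-frame render loop.
// joinImmersiveBroadcast and drawBroadcastScene are hypothetical names for illustration.
async function joinImmersiveBroadcast(canvas: HTMLCanvasElement): Promise<void> {
  const xr = navigator.xr;
  if (!xr || !(await xr.isSessionSupported("immersive-vr"))) {
    console.warn("Immersive VR not available; fall back to a flat 2D viewer.");
    return;
  }

  // An XR-compatible WebGL context backs the headset's render layer.
  const gl = canvas.getContext("webgl2", { xrCompatible: true })!;
  const session = await xr.requestSession("immersive-vr", {
    optionalFeatures: ["local-floor"],
  });
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
  const refSpace = await session.requestReferenceSpace("local-floor");

  // Per-frame loop: query the viewer pose and draw the broadcast scene for each eye.
  const onFrame = (_time: number, frame: XRFrame) => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      for (const view of pose.views) {
        // drawBroadcastScene(gl, session.renderState.baseLayer, view); // placeholder renderer
      }
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}
```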
HOLO DISPLAYS
A more futuristic approach that has conceptually been around for a long time is the use of holographic displays. This technology is actually more challenging than you’d expect, because projected light needs to bounce or reflect off of something before the human eye can see it. To mimic the effect, we project the image onto thin transparent filaments suspended in space, or sometimes onto smoke or fog particles, to give the impression that a 3D image is actually there. A good example of this is a kiosk or central resource providing information.
PROJECTION MAPPING
On the other hand, using technology similar to holograms, we are able to project images onto solid surfaces that already exist in the real world. The way this works is that we stretch, skew, and warp the image’s UVs so it wraps around the surface’s mesh; in turn, the projection acts like a digital texture on top of its real-world counterpart, as the sketch below illustrates. The best example of this is the castle show Disney World puts on every night around park closing. It’s real-world magic when you can bring a static object to life with animation and motion effects.
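For the simplest flat-surface case (a wall or facade), the per-pixel warp is a projective transform. The sketch below, assuming four measured corner correspondences between the content frame and the surface, computes that 3×3 homography from scratch; the corner values in the usage example are hypothetical measurements, not data from a real installation.

```ts
type Pt = [number, number];

// Solve A·x = b with partial-pivot Gaussian elimination (A is n×n, row-major).
function solve(A: number[][], b: number[]): number[] {
  const n = b.length;
  for (let col = 0; col < n; col++) {
    // Pivot on the largest remaining entry in this column for numerical stability.
    let pivot = col;
    for (let r = col + 1; r < n; r++) {
      if (Math.abs(A[r][col]) > Math.abs(A[pivot][col])) pivot = r;
    }
    [A[col], A[pivot]] = [A[pivot], A[col]];
    [b[col], b[pivot]] = [b[pivot], b[col]];
    for (let r = col + 1; r < n; r++) {
      const f = A[r][col] / A[col][col];
      for (let c = col; c < n; c++) A[r][c] -= f * A[col][c];
      b[r] -= f * b[col];
    }
  }
  const x: number[] = new Array(n).fill(0);
  for (let r = n - 1; r >= 0; r--) {
    let s = b[r];
    for (let c = r + 1; c < n; c++) s -= A[r][c] * x[c];
    x[r] = s / A[r][r];
  }
  return x;
}

// Compute the 3×3 homography H (row-major, h8 fixed to 1) that maps four source
// corners onto four corners measured on the real surface, so the projected image
// wraps onto the facade instead of spilling past its edges.
function computeHomography(src: Pt[], dst: Pt[]): number[] {
  const A: number[][] = [];
  const b: number[] = [];
  for (let i = 0; i < 4; i++) {
    const [x, y] = src[i];
    const [u, v] = dst[i];
    A.push([x, y, 1, 0, 0, 0, -u * x, -u * y]);
    b.push(u);
    A.push([0, 0, 0, x, y, 1, -v * x, -v * y]);
    b.push(v);
  }
  return [...solve(A, b), 1];
}

// Apply H to one point: this is the per-pixel warp a projection-mapping pass performs.
function warpPoint(H: number[], [x, y]: Pt): Pt {
  const w = H[6] * x + H[7] * y + H[8];
  return [(H[0] * x + H[1] * y + H[2]) / w, (H[3] * x + H[4] * y + H[5]) / w];
}

// Example: map a 1920×1080 content frame onto four corners measured on the facade
// (the destination values here are hypothetical).
const srcCorners: Pt[] = [[0, 0], [1920, 0], [1920, 1080], [0, 1080]];
const dstCorners: Pt[] = [[102, 80], [1831, 35], [1790, 1010], [64, 958]];
const H = computeHomography(srcCorners, dstCorners);
const center = warpPoint(H, [960, 540]); // where the frame's center lands on the wall
```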
DEPTH SENSORS
The previous three technologies are a brief overview of how you will be able to view 3D assets. To capture and display the participants in an environment, we use special sensors that not only capture multiple frames of the subject but also rely on infrared lasers to measure the depth, or distance, of each part of the object being captured. Similar to projection mapping and its UV-based texturing methodology, we take the same approach when deploying the captured texture on top of the generated 3D mesh, as sketched below.
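A minimal sketch of that last step, assuming a standard pinhole camera model, a depth frame in meters, and a color frame already aligned to it; the Intrinsics fields and function names are illustrative, not our actual capture pipeline.

```ts
// Pinhole intrinsics: fx/fy are focal lengths in pixels, (cx, cy) the principal point.
interface Intrinsics { fx: number; fy: number; cx: number; cy: number; }

interface TexturedPoint { x: number; y: number; z: number; u: number; v: number; }

// Turn a depth frame (meters, row-major) into a textured point cloud: each pixel is
// back-projected along its camera ray, and its normalized (u, v) coordinate is kept
// so the aligned color frame can be laid over the reconstructed mesh as a texture.
function depthToPointCloud(
  depth: Float32Array, width: number, height: number, k: Intrinsics
): TexturedPoint[] {
  const points: TexturedPoint[] = [];
  for (let row = 0; row < height; row++) {
    for (let col = 0; col < width; col++) {
      const z = depth[row * width + col];
      if (z <= 0) continue; // zero depth = no infrared return for this pixel
      points.push({
        x: ((col - k.cx) / k.fx) * z, // back-project using the pinhole model
        y: ((row - k.cy) / k.fy) * z,
        z,
        u: col / (width - 1), // texture coordinate into the aligned RGB frame
        v: row / (height - 1),
      });
    }
  }
  return points;
}
```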
To learn more about immersive broadcasting experiences, check out our DFX Pitch Deck.