Software and the Second Renaissance

Painters like Vermeer and Caravaggio were long considered “masters” of their art, with an ability to capture light better than any of their predecessors and most of their successors. Modern artists still struggle to use paint to translate a setting onto the canvas precisely as they see it.

For centuries, this was chalked up to some sort of artistic enlightenment. Recently, however, a few skeptical artists and academics have proposed a different theory as to why that pearl earring glistens so authentically in the famous Vermeer.

They suggest that the old masters might have used an opto-mechanical device to help them craft their masterpieces.

The masters would have built a box or a tent fitted with a pinhole and a mirror. Light from the scene outside hits the mirror, tilted at 45 degrees, and is projected through the pinhole onto a canvas. The image is flipped and inverted, but the colors remain the same. From there, the masters composed a paint-by-numbers masterpiece.

If this theory is correct (it’s still hotly debated), this type of camera obscura would have been a device that changed art history. Before this invention, Proto-Renaissance artists could accurately represent proportions and could convey texture with different brush strokes, but without the ability to capture light, their work lacked depth.

The camera obscura enabled painters to finally understand light and transpose what they were looking at. The technology let them imprint their vision onto a canvas.

For the next four centuries, artists and creatives across industries faced a challenge similar to that of the Proto-Renaissance painters: they lacked the tools to share their creative vision with the world. Until recently, that is.

The Second Renaissance

From Schubert’s unfinished symphonies to Pessoa’s manuscripts to Orson Welles’ uncut masterpiece, history is filled with unfinished works that never met the standards of their makers. And the same goes for the unrecognized artists, writers, and musicians who spent their lives trying to share their artistic vision with the world.

When software started to eat the world, however, everything began to change.

All of a sudden, creatives were better equipped than ever before to bring their vision to life:

  • Point-and-click, software-enabled cameras and Photoshop enable photographers to see their work instantly and make hundreds of minute adjustments until the photo matches their pre-imagined masterpiece.
  • Sophisticated playback equipment and sound-editing software enable musical artists to adjust the pitch, tone, and octave of their music until it matches what they’ve been hearing in their heads.
  • Composition software and a world wide web of assets enable graphic designers to easily create and adjust colors, curves, and text until they’ve achieved the aesthetic they were striving for.

Software lets creatives manipulate their art in high fidelity. They can hear their song and see their picture take shape before them. Artists are no longer painting on a blank canvas; they’re starting with a sculpture to build upon or chisel away at.

Software has thus both elevated these art forms and increased their accessibility.

Professionals can create better work, faster, and amateurs can create something substantial without decades of experience. A musical artist like J. Cole can easily compose, record, and edit his own tracks using a cloud-based digital recording system, something that previously required full editing teams, months of time, and complicated hardware. Similarly, a high school student who loves music can start creating tracks and mixing music in his basement for free.

And these are only a few of the creative pursuits that software has improved. Software is redefining the creative workflow for engineering, architecture, fashion design, art direction, industrial design, set design, interior decorating, and dozens of other fields.

The Renaissance of the 15th and 16th centuries has been described as an era in which art was produced in greater quantity and at higher quality than ever before. Today, we’re living through a second Renaissance. There are more pieces of high-quality art, literature, and music available for public consumption now than at any other time in history. And this is only the beginning.

The Opportunity in Film

Software companies have drastically improved the workflows of individual artists. A photographer has better control over his photo, a singer over her song, a fashion designer over his clothing design, and an architect over his building design.

But an art form like filmmaking, which requires the artistic talents of multiple collaborators, is still rife with opportunity.

Filmmaking requires the visual talents of cinematographers (DPs), the logistical chops of producers, the narrative vision of writers and directors, and the acting talents of thespians. On paper, the director has the final say, but in reality, what ends up on screen is often a combination of many different, and often competing, visions.

Software has improved the workflow for a number of individuals throughout the pipeline:

  • Cinematographers, rather than just envisioning lighting setups, can use tools like Cine Designer R3 to pre-visualize and experiment with how best to light a scene.
  • Editors, rather than getting assistants to organize hundreds of hours of footage, can use Adobe Premiere Pro to store, edit, and sift through all that material while maintaining control over the cut.
  • VFX supervisors, rather than working with simplistic storyboards, can use pre-visualization software such as FrameForge to maintain control over the eventual look of every shot.

More precise control lets filmmakers better execute their vision, but it also leaves room for experimentation at each stage, and in the film as a whole. Cinematographers who can better communicate their vision can also quickly try a deviation. Editors can compare alternative cuts more easily. Production designers can visualize and model out three different sets before selecting the best one.

But one major point of friction remains: how do directors maintain creative control over everything that’s in frame?

Broaden the Purview of the Artist

The director is faced with a problem of scope. He must control the direction of a project that involves about 600 people, thousands of pieces of camera gear, and dozens of locations. And he must do it over the span of several months or years.

The director cannot feasibly move every camera, put together every rig, or adjust every source of light. He must convey his vision to groups of people whose job it is to execute what he has envisioned. But words, still images, and even storyboards often fail to get that message across, which is why even a 10-second scene can take hundreds of retakes and dozens of man-hours to pull off.

But what if we gave the director the tools to instantly create the scene he was imagining? What if you could take the movie set, shrink it down, and then give the director the ability to physically build and manipulate the frame of each camera?

This is the problem that 3D modeling and CAD software solved for designers: architects, who for ages struggled to communicate their designs clearly to owners and building crews, and fashion designers, who had to personally explain how their two-dimensional sketches should look and feel.

Now, my team has set out to solve this problem for filmmakers.

Control the Real Through the Augmented

Our software, Vermeer, shrinks down the filming environment and puts it into the hands of the filmmaker. Using AR, a 3D model of the physical world is displayed through a phone or tablet screen. From there, the filmmaker can physically push the device through the landscape to orchestrate the shots that he or she would like the team to execute.

Using this technique, a filmmaker can effectively craft a pre-vis in real time and communicate his or her vision instantly. Just like the modern musical artist, and just like the modern photographer, a filmmaker can work with his or her craft in high fidelity.

We began tackling this huge problem with one type of shot: the aerial shot.

We’re diminishing the distance between the filmmaker and his or her camera. Vermeer is designed to give a user the ability to move a camera fluidly through three dimensions. He or she can design the camera path in a miniature AR landscape, using natural movements and hand gestures. From there, the drone executes the shot autonomously.
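To make the idea concrete, here is a minimal Swift sketch of how a path traced at miniature scale could be mapped to full-size flight waypoints. Everything in it is illustrative: the Waypoint type, the fullSizePath helper, and the 1:100 scale are assumptions for the example, not Vermeer’s actual API.

```swift
import simd

// A hypothetical waypoint: a camera position (in metres) plus the point it should face.
struct Waypoint {
    var position: SIMD3<Float>
    var lookAt: SIMD3<Float>
}

/// Scale a camera path sketched over a miniature AR scene up to full-size coordinates.
/// - Parameters:
///   - miniaturePath: device poses captured while the filmmaker moved the phone
///     through the tabletop model, expressed relative to the model's origin.
///   - scale: ratio of real-world size to miniature size (e.g. 100 for a 1:100 model).
///   - worldOrigin: where the model's origin sits in the real location's coordinate frame.
func fullSizePath(from miniaturePath: [Waypoint],
                  scale: Float,
                  worldOrigin: SIMD3<Float>) -> [Waypoint] {
    miniaturePath.map { wp in
        Waypoint(position: worldOrigin + wp.position * scale,
                 lookAt: worldOrigin + wp.lookAt * scale)
    }
}

// Example: a short arc sketched over a 1:100 tabletop model becomes a 100x larger
// path that a drone's flight controller could then be asked to execute.
let sketched = [
    Waypoint(position: [0.10, 0.05, 0.00], lookAt: [0, 0, 0]),
    Waypoint(position: [0.08, 0.08, 0.05], lookAt: [0, 0, 0]),
    Waypoint(position: [0.00, 0.10, 0.08], lookAt: [0, 0, 0]),
]
let flightPath = fullSizePath(from: sketched, scale: 100, worldOrigin: [0, 0, 0])
```

In practice the scaled path would be handed off to the drone’s flight controller and smoothed before execution, but the core scale-and-offset mapping is the same idea.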

Check out the Vermeer beta by downloading it from the iOS App Store.
