Friday, June 26, 2009

Dogma


About 10 years ago the Dogme movement emerged from Denmark, attempting to assert a new stripped-down aesthetic in filmmaking. Filmmakers such as Lars von Trier and Thomas Vinterberg embraced a straightforward, honest (and presumably cheaper) mode of filmmaking which forbade constructed sets and props, post-synced sound, and optical effects, all in an attempt to strip away the over-determined rules that inflected (and infected) normal picture making.

Only natural lighting environments were allowed to be filmed; no extra lights could be added. And only existing objects in real locations could be used. No props or guns or other genre elements to add visual "interest." It all had to be present and available for the filmmakers... or anyone. The idea was to capture the truth as it happened in front of the camera and record it un(pre)mediated as it occurred, with no subjective manipulation, no trickery, no egos. Truth at 24 frames a second.

They were unsuccessful for the most part. While this is an interesting approach to making films - especially ones that aren't documentaries - it makes for difficult, overly mannered yet loosely structured, and finally rather restrictive results. Such avant-garde narratives - without artifice or production values - are an acquired taste. Without most of the tools of 100 years of filmmaking at their disposal, the Dogme-ists paint themselves into an ascetic conundrum in which flights of cinematic fancy are by default precluded.

The last successful Dogme film was 2001's "Italian For Beginners" (and there's consensus that even that didn't follow the Dogme Vow of Chastity to the letter). Yet the spontaneous, no-production-value aesthetic has been embraced by a new generation of filmmakers. It's a reflection of our familiarity with streaming videos on YouTube and our small personal devices, lo-fi but authentic. Such above-ground hits as "The Blair Witch Project" and "Quarantine" (by way of "REC") appropriate (if they don't rigorously follow) the Dogme ideals of hand-held cameras and off-the-cuff shooting in natural, real-world settings with a documentary narrative drive. The J.J. Abrams-produced "Cloverfield" also uses the videocam reality-TV model to great effect, tapping into our voyeuristic tendencies.

(Although it's likely 80% of that film is fake, manufactured by CGI in post.)

Interestingly, and tellingly, all these are horror films.

The Dogme '95 movement was an articulated attempt to capture the spectacle of the real, in unmediated and unfiltered visual terms. It turns out that mode of filmmaking is discomforting.

We like a little artifice between us and reality. The spectacle the camera captures, when allowed to film uninhibited and unfiltered, is truthful, perhaps - but also (or therefore) profound, scary, intense, forbidden, and a bit horrifying.

An unintended progression from those Danes of 10 years ago.

Saturday, June 20, 2009

Independent Days


What I really want to do is direct.

What everyone wants to do is direct. Everyone's a closet moviemaker. Everyone's a comedian. Everyone has a screenplay in their bottom drawer, but no one's heard of anyone they know actually making it in Hollywood.

I went the independent production route myself. You get some friends together, scare up a couple thousand dollars and a film camera, and shoot your clever Tarantino/Linklater pastiche, convinced that since it costs so little, there's no way it can't make money. The video store is full of them. Why not add to the noise?

We've all heard of the independent filmmaker success stories. Make a film in a weekend (or over 3 years) and it sells at Sundance for $3 million, and the next thing you know you're hired to direct the Luke Cage remake. They know what you can do with pocket change, so if only you had some real money....

It's an elegant theory. But it's disingenuous. For every Bryan Singer, there are a thousand Jacob Freydont-Atties. For every David Gordon Green who (eventually) gets pulled into the majors, a dozen JP Allens remain unknown. Hundreds of films get submitted to each of over 200 festivals in the US every year (and that's just the features), and even for the ones that are selected, it's likely their first, best, and last showings are at these festivals, never getting a distribution deal, or even ending up on DVD except as souvenir home burns for the cast and crew.

There are more movies out there than you can ever find out about. More people want to make movies than the industry can possibly gainfully employ. If you don't believe me, ask yourself how many times you've heard someone say something along the lines of "You know what would make a great movie?"

You've said it yourself. Everyone's got an opinion, and you know what they say about that. We think we can do it better, and perhaps we can. We'd do anything to be in pictures. But it's not just about having a better idea. It's about being in the right place - at the right time, with the right people surrounding you, and often with the right amount of money sitting on the table orphaned and waiting to be invested.

Financing is all - more projects come to fruition because they've been paid for than because they need to be told. Independent films always have a hard and schizophrenic life. They're born of passion and necessity and wear their sponsor-less authenticity as a badge of honor, the entire time putting on airs to convince us they're more than the backyard make-believe they are. They push the envelope and defiantly resist categorization and (often) coherence, because that would be selling out.

Yet they exude a needy greed to be loved, because ultimately they can't afford to piss off their audiences or their producers, and they end up playing to the cheap seats, simultaneously wishing for and fearing a state-funded co-opting, or at least the perceived notion of one - pursuing and risking a Kurt Cobain-ian reduction of street cred as the zeros multiply on the residual checks.

Even Ron Howard started as a seat-of-the-pants, go-for-broke exploitation director, which in a way is still reflected in his gilded work on "Angels and Demons," done not for art but to assert his position in the industry. A $200 million budget, completely competent and completely forgettable, reminding us that there are never enough resources of the right kind on any picture. It's the difference between the first "Terminator" and the second: a budget 10 times the size, so that aesthetic challenges aren't so much solved as financed to death.

Ridley Scott makes one movie a year, and while we can discuss the vagaries of his depiction of the CIA in "Body of Lies" or of con men in "Matchstick Men," we'll never see the mad independence of "The Duellists" again. In today's environment, the list of directors able to generate a meaningful body of work is extremely short. Bad penny Terry Gilliam and rock star Martin Scorsese still can't put together the projects they really want to do. Scorsese has defaulted to music documentaries, which is probably the level of fight he's willing to take on nowadays.

And what about the filmmakers who didn't have the fortune of having worked with Robert De Niro early in their careers? Who had a unique voice but couldn't sell a ticket? They've moved on to shooting cable shows. Or pulling cable.

Or writing for cable. Or writing work orders for cable installation.

Being independent comes with a price. By the time someone offers to pay the bill, you're already face-down in the pool.

Friday, June 19, 2009

Data, Metadata, and Statistics


Digital objects exist in a different way from mere objects in the physical world. When they're created, the information by which they're described is added to the object itself, so it can be found.

This is "metadata" - kind of like the stuff that gets stuck to your shoe that you simply can't rub off.

Every digital object collects this as it moves, gets copied, is altered - even deleted. No fingerprints remain invisible. (Yes, even an object that isn't there still declares itself, if only by virtue of the fact that it is no longer present.) Lots of times metadata is intentionally added to an object. Titles, dates, to-do lists ("Delete after end of quarter," "Save for blog," "unused takes").
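To make that concrete, here's a minimal sketch in Python (standard library only, purely illustrative) of the two kinds of baggage described above: the system metadata a file accumulates on its own just by existing, and the descriptive metadata someone deliberately attaches. The sidecar convention ("clip.mov.json") and the file names are hypothetical examples, not any real standard.

import json
import os
from datetime import datetime, timezone

def system_metadata(path):
    """Metadata the filesystem keeps whether anyone asked for it or not."""
    st = os.stat(path)
    return {
        "size_bytes": st.st_size,
        "modified": datetime.fromtimestamp(st.st_mtime, timezone.utc).isoformat(),
    }

def descriptive_metadata(path):
    """Metadata a person chose to add - titles, dates, to-do notes -
    stored here in a hypothetical JSON sidecar sitting next to the object."""
    sidecar = path + ".json"
    if not os.path.exists(sidecar):
        return {}
    with open(sidecar, "r", encoding="utf-8") as f:
        return json.load(f)

if __name__ == "__main__":
    path = "clip.mov"  # a hypothetical digital object
    if os.path.exists(path):
        print(system_metadata(path))       # collected automatically
        print(descriptive_metadata(path))  # someone had to write it down

The point of the split is the point of the paragraph above: one pile of information shows up uninvited, the other only exists because a person bothered to add it.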

But just because this digital object has collected all this extra descriptive information doesn't mean it's the better for it. The object becomes larger as it travels, and it costs time and energy to preserve all this stuff on the object, not just the object itself.

And just because there's all this new information on it doesn't mean it's good info. Much of it may be wrong. Or incomplete. Or mean different things to different people, programs, or systems.

The signal-to-noise ratio begins to change. And just because it's all info about the object itself doesn't mean that it's metadata, either. Maybe the info is part of the object's creation, but doesn't actually describe it. It might not be about the object at all, just riding along, attached accidentally or through someone's ulterior or altruistic motives.

Once an object collects information about itself, that doesn't mean it should all be preserved with the object. But figuring out what belongs, what might be needed in the future, and what's merely a parasitic piece of code costs resources to deduce.

Not all metadata is created equal. It has a lifecycle, and some of it becomes obsolete at a certain point in the various iterations of the object, as it moves from VHS to laserdisc to DVD to Blu-ray, for example. Just because you got it, just because it's right, doesn't mean anyone's gonna give a damn.
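A toy example of that lifecycle, continuing the hypothetical sidecar idea from above: when the object migrates to a new carrier, some fields still describe the work itself and travel with it, while carrier-specific fields stop meaning anything and can be dropped. Every field name here is invented for illustration.

# Fields that describe a particular carrier, not the work itself.
CARRIER_SPECIFIC = {"vhs_tracking_notes", "laserdisc_side_breaks", "dvd_region_code"}

def migrate_metadata(old):
    """Carry forward only the fields that still describe the work."""
    return {key: value for key, value in old.items() if key not in CARRIER_SPECIFIC}

record = {
    "title": "Kind of Blue",
    "creator": "Miles Davis",
    "dvd_region_code": 1,             # meaningless once the DVD is gone
    "laserdisc_side_breaks": [27.5],  # ditto
}

print(migrate_metadata(record))  # {'title': 'Kind of Blue', 'creator': 'Miles Davis'}

Deciding which bucket a field falls into is exactly the work that, as noted above, costs real resources and real salaries.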

Metadata lives and dies, and people get paid a lot to create, preserve, and migrate it. But it's invisible and of unknown value, so we spend more time worrying about it than about what it's describing. We shouldn't lose sight of the underlying artistic creation that makes it necessary in the first place in this world of digital access: a page of poetry or a cut of music, a clip of film that people fell in love with 100 years ago. And maybe 100 years from now.

In the future there will be no record players. You'll want to be able to find Miles Davis, won't you?

Saturday, June 13, 2009

Vistas


As I catch up on all the old films I've never seen (and I could never catch up even if I watched 6 a day and limited myself to what was made before 1945), the differences between then and now are striking. Besides a mode of cutting (much slower) and a level of discourse (most of the golden-age writers came from Broadway), there's a sense of place that is so resolutely real and beautiful.

Films used actual locations as part of their production value, often as part of the spectacle of visual entertainment. Movies had traded from the very beginning on showing something you had never seen before, whether it was a train coming into the station head-on, the beheading of Ferdinand, Death playing chess, or whatever images were kicking around inside Fellini's head.

Film, being a photographic and (deceptively) real representation of what happened in front of the camera, brought the world to us.

Producers would, as part of their production strategy, cast the location as well. Even small family dramas such as "Rebel Without A Cause" used striking locations to give a sense of scope and natural beauty. (Hitchcock, perversely, would fake some of his up, but that's an epistemological discussion for another time.)

Of course filmmakers used trickery, intercutting, and stunt doubles to convince us that they travelled to Rome (and not Bronson Canyon) to shoot their ancient saga. But the setting was considered an important character in the film.

We saw vistas used to dramatic purpose as late as the '90s - I remember "The River Wild" being more interesting to look at than to listen to. But in the current age of CGI, producers have gotten used to, and audiences have grown to expect, whatever spectacular and fantastic visual fireworks can be created with computer graphics.

So we get the nuanced cityscapes of "The Dark Knight," the miles of weather-ruined vistas of "The Day After Tomorrow," and the virgin-sand, blue-sky chicanery of "Cast Away."

It's all spectacular and beautiful. And it's fake.

They can do anything. 90% of what you see in "Star Trek" doesn't exist in the real world. Everything down to the lens flares coming off the lights on the bridge was created in a room of computers months after shooting, and I'm not entirely sure about Chris Pine either.

TV, with its 100-cuts-a-minute CSI-style editing, feeds into our need to see something shinier, faster, brighter than the last half-hour. We expect it now; we've culturally forgotten that locations and place used to be an important part of the texture of movies.

Movies used to show us people, real people, in big places doing interesting things.

No wonder people don't fall in love with the movies anymore. "Casino Royale" attempted to reclaim this dynamic after the cartoony "Die Another Day." And that's why I prefer "The Eiger Sanction," which seems to have been shot entirely on a mountain, to "Cliffhanger," which only tries to convince me for certain key scenes. And even when "The Eiger Sanction" got boring, which it certainly did, I could always let my mind wander and look at the scenery.

Thursday, June 4, 2009

Something/Anything?


Readers going back past the last couple of posts here will note that I've clearly gone through various stages of thinking about what this blog discusses.

I started out bemoaning the de-sacredification (new word) of the movie-going experience (such as in Hollywood Ending and Nice Things Destroyed). I was conscious that a new generation wouldn't fall in love with film, and it was a ripe and sitting target close to my heart at the beginning.

As I've continued with grad school, I've come to a new understanding of what archiving means, and how the past intersects with the future. What do you save, why do you save it, and for what purpose? Not everything should be saved, even if it's becoming obsolete or out of favor.

That exploration led to a kind of phenomenological investigation of the indexical qualities of film vs. video. Around December of last year I struggled into an amateur Lacanian discourse about aura and semiotics, while trying not to use those words too much. I'm glad I got that out of my system (in Worked Matter and The Real and the True), but I still believe our cognitive relationship to what's filmed and projected, versus what is captured and streamed, is very different, and that it shapes whether we are transported by what we see or merely amused.

So, I've realized how deeply we're steeped in a new age of spectatorship and reception. By January, I'd accepted the Border between Calm and Catastrophe. New Moguls and Post Modern engage with moving images and films not as objects, not as site-specific performance events (such as a movie theatre), but as hypermodern events, infinitely accessible and duplicable.

Although importantly, not with the same qualities. The future of moving images is as a stream. That matters in the long run as we figure out what we will be saving, preserving, restoring, and archiving for some mythical future audience: the actual object, a digital version of the analog object (which won't have the same stability or longevity), an analog copy of the digital object (which won't act in the same interactive, dynamic ways), or merely a proxy that may approximate the look and feel as a faded souvenir.

I've always been aware of the impulse to fetishize the film object (Fin de Cine) and am guilty of it myself. But more and more moving images will be born digital and be delivered that way, never enjoying a status as a physical object. Without existing in some repository, they will remain around only as long as people copy and share them.

When they stop being used and migrated, they will deteriorate, lost to the past, in a cloud of memory.

This realization of the shift in the quality of moving images will be reflected in where I hope this blog is going.