Is CGI Used Too Much Nowadays?

I’m generally not a fan of blockbusters, and unless the context is right, my only contact with them remains through memes I stumble across on social media and fail to understand. Which is not much of a problem, as the hype train usually goes by rather quickly. However, I do have my soft spots, and Star Wars is one of them – you just cannot say no to an experience which brings about childhood nostalgia, no matter how unpolished or outright bad it may be (yes, The Last Jedi, I’m looking at you).

So, when watching the latest Star Wars installment – the Han Solo spinoff – I noticed that something was off, visually speaking. Not just the bland colours and uninspired lighting, which I had not expected from a Ron Howard movie, but something more to do with motion, cohesion and a sense of dimension. It’s hard to explain, but something bugged me. Yesterday, I remembered that feeling, typed something to do with CGI into YouTube, and stumbled across a substantial number of videos whose authors seem to have reached the same conclusion I had. Something is off with CGI nowadays.

A case of too much

Remember the old-school sci-fi films from the 70s and 80s, or the very first Jurassic Park? They may look rather dated now, but make no mistake: they were technical marvels back in the day. And in no small part thanks to CGI – when costumes, models and oddly-shaped cardboard boxes could no longer make certain scenes credible, films turned to ever more powerful computer software to generate what they couldn’t fully replicate in the real world. Fast forward a few decades, and CGI is no longer reserved for a few key scenes, but used in most, if not all of them. That most certainly escalated quickly.

Yes, the first 300 film did it in an extra obvious and tongue-in-cheek way. I still applaud it for that, and will always love that CGI-heavy film, even in full acknowledgement of its corniness. However, what was merely an experiment quickly got out of hand, and now most major blockbusters – mainly in the action, superhero and sci-fi genres – use it throughout the entire project. It’s no longer an exciting opportunity to lend more believability to a particular object or environment; it has become a lazy one-size-fits-all method for designing elements and building scenes. I would argue that such excess takes away all the fun.

A case of too little

How difficult is it to proofread and improve a page of text? Easy peasy lemon squeezy, right? How about 120 pages of text, in exactly the same very limited time? Not so easy anymore, is it? Production times have shrunk nowadays, especially when it comes to blockbusters, for several reasons. First of all, every minute on set is extremely expensive. Second, there’s so much marketing pressure to churn out one movie after another that it inevitably leads to corner cutting. This means that although we have more VRAM in our computers, and more capable technology, effects are arguably getting… worse.

If you take a look at the various video essays on YouTube that delve into the problem of modern CGI, you’ll find substantiated complaints showing that (still subjectively speaking, but with solid evidence) effects nowadays simply look uglier than before. And it’s not necessarily a question of photorealism, but of design: how those objects are placed into the larger environment, and how they interact both with each other and with said environment. One explanation is that because many of these scenes are no longer set in actual locations, there’s no realism anchor to make them tangible and believable, so they end up feeling off. Objects no longer have weight, they don’t obey the laws of physics, and so part of the charm disappears, even if they’re flawlessly designed otherwise. Which they often aren’t – just compare the beautiful world of The Lord of the Rings trilogy with the far more artificial-looking Hobbit trilogy. The aforementioned time pressure shows its dark side: less time for refinement.
