Gaming’s Ancestry and Legitimacy
One thing to be grateful for is that the heated “Are video games art?” debates are basically over and done with at this point. Of course video games are art; they have as much right to be considered “art” as movies or literature. A video game, on its own, is capable of communicating ideas, which qualifies it as “art,” and games, like any other medium, now have a lineage and a tradition of sorts. Mind you, this medium is still quite young (only about half a century old, less than half the age of Cinema), but it has already grown up considerably.
A problem now facing video games is something of an identity crisis. When Cinema was very young, it took after theater and photography, two much older and better-defined mediums. While it would be reductive to narrow an entire medium’s ancestry down to two sources, Cinema started out as a crossbreed between pictures and the stage. Of course, Cinema has since proven itself to be much more than that, with the help of advancements in technology as well as technique; you would never say that a movie is just a bunch of still photographs masquerading as a theatrical production. Yet video games, particularly AAA games (the gaming equivalent of Hollywood blockbusters), often try to masquerade as movies or TV shows.
Whereas Cinema was able to distance itself from its closest precursors as it evolved, video games seem to have gone in the opposite direction, becoming more like Cinema as graphics grow more photorealistic and development teams expand, pouring thousands of hours into what often turns out to be little more than an interactive movie. This development worries me, all the more because it was all but inevitable, and it puts big-budget games in a precarious position.
Adventures in Three Dimensions
There are about as many types of games as there are colors the human eye can perceive, but let’s be reductive once again and split the enormous breadth of gaming into two categories: game-y and cinematic. From the early 1970s to about the mid-’90s, video games were generally pretty game-y. What do I mean by this? In this first category, we have games that run almost (if not entirely) on game logic: the player is fully engaged in the experience, constantly being asked to make decisions or else risk being punished. If you play something as simple as ‘Pac-Man’, you’re constantly on the move, gathering pellets and power-ups while anticipating enemy movements. One mistake means death. Arcade games of old, like ‘Pac-Man’ and ‘Donkey Kong’, score maximum points for player engagement, although, conversely, they’re not terribly interesting to watch.
Arcade games, and games released on consoles only capable of rendering 2D movement (left, right, up, and down, but not forward or backward), refrained from “cinematic” conventions largely out of necessity. The reason basically comes down to the fact that the “camera” in a 2D game is more or less static; it can only track the player from one angle, which the player has no direct control over. Any 2D platformer worth its salt keeps the “camera” at a wide angle, viewing the player character from the side and covering the areas behind, in front of, directly above, and directly below them. The “camera,” therefore, is a nonentity as far as the player is concerned. Games from the ’80s and early ’90s also generally refrained from cutscenes, wherein control is taken away from the player to show off a key event with dramatic angles, pauses, and a non-diegetic score. It wouldn’t take long, however, for technology to advance enough that games could have cutscenes, as well as cameras the player would have to control.
When ‘Super Mario 64’ was released in 1996, it was one of the very first platformers with a 3D field of action. Playing it now, it can be jarring to sit through a quick tutorial explaining how the in-game camera works. In ‘Super Mario 64’, you can use the camera to view the action from a wide angle, at a medium distance, or close up; you can turn the camera so that it aligns directly behind Mario, making certain platforming challenges easier. The early years of 3D gaming saw various experiments with cameras, ranging from the flexible camera of ‘Super Mario 64’ to the fixed, pre-rendered camera angles enforced in ‘Final Fantasy VII’. Once the player could control, quite literally, how they watched a video game, there was no going back.
Interactive and Non-Interactive Cutscenes – Video Games
What exactly is the problem here, you may ask? Cameras in video games sound like a good thing, and they are; after all, engagement has not decreased. Once cutscenes enter the equation, however, the problems of player engagement and player agency rear their ugly heads. A cutscene, to be precise, is a short film the game plays (either with pre-rendered graphics or using the game’s actual engine) that usually serves to connect the dots in a narrative. There are two types of cutscenes: interactive and non-interactive. Most video game cutscenes are non-interactive; the cutscene plays out exactly the same way on each playthrough, regardless of player input, and in some cases, the player can’t even press a button to skip it.
Non-interactive cutscenes are a contentious topic in gaming circles, since they not only deny players both engagement and agency, but can also be time-wasters if they’re unskippable. If you’re playing a cutscene-heavy game like ‘Metal Gear Solid’ (possibly the most infamous example of a “cinematic” video game), you’ll spend about as much time sitting back and watching cutscenes as actually playing the thing. While the poor cutscene-to-gameplay ratio might be worth the struggle on one’s first playthrough, it can prove quite tedious on repeat playthroughs. A cutscene may provide ample time for the player to get a cup of coffee or take a bathroom break, but it still runs into the problem that the player is not doing what they ought to be doing as close to 100% of the time as possible: playing the game.
One solution to the tedium of non-interactive cutscenes is, of course, to make them interactive. But how? An early (and durable) option was the “quicktime event,” in which the player is faced with button prompts during what appears to be a non-interactive cutscene and must respond fast enough or else be punished. ‘Shenmue’, released in 1999 and itself an early example of an open-world game (wherein the player is presented with a virtual sandbox), had quicktime events, but the most famous early example has to be ‘Resident Evil 4’ from 2005, which would blindside the player during cutscenes with quicktime events that could kill them in an instant. ‘Resident Evil 4’ is an action-horror game in which the player is nearly always expected to be on edge, so it makes sense that (in a game filled to the brim with horrific enemies) even the cutscenes are bloodthirsty.
We’re All Just Actors on a Stage
This is all well and good, but it still leaves the problem of player agency, which brings us to a recurring issue with AAA design trends. While video games have deliberately taken cues from Cinema since the days of ‘Super Mario 64’ (and likely even earlier), the past decade or so has seen an explosion of “cinematic” games with photorealistic graphics that virtually always trade gameplay inventiveness for a heavier reliance on cutscenes. ‘God of War’ (the 2018 reboot of that franchise, not the 2005 original) keeps its camera locked to the player character’s viewpoint at all times, even during cutscenes or when there would normally be pauses for the game to load content. How does this benefit the gameplay? I’m not sure. Both ‘The Last of Us’ and its sequel (two of the most critically acclaimed games of the 2010s) are renowned for their storytelling, compelling characters, and grimy “cinematic” visuals, but very little attention is given to their generic action-survival gameplay.
Player choice, beyond menial things like crafting items, is not much of a concern for modern AAA games. There ought to be a genre that encompasses titles like ‘The Last of Us’, ‘Horizon Zero Dawn’, and the ‘God of War’ reboot, but “third-person action” is the closest label I can find. These games all lean heavily on either non-interactive cutscenes or interactive cutscenes that are both unskippable and undemanding in terms of player response. While these games look nice, with talented voice actors behind every character and every blade of grass vividly rendered in high definition, the gameplay is all but interchangeable, and often forgettable. AAA games now casually achieve a certain “cinematic” effect, but they have also funneled into a gelatinous blob of undemanding gameplay that kneecaps both player engagement and the capacity for players to make meaningful decisions.
Rescuing Player Choice from the Jaws of Cinema
The relationship between video games and movies is complicated and sometimes toxic. Filmmakers, of course, have tried to adapt video games into movies over the years, often with quite negative results. Video game adaptations of movies are hardly reliable either, but they have fared better; indeed, some of the most celebrated games in the medium’s history were based on movies. Transitioning from interactive to non-interactive proves far more challenging to execute successfully than vice versa, and I have to assume this is because video games have unique merits that movies lack.
When you adapt ‘Sonic the Hedgehog’, a game all about fast movement and platforming challenges, into a movie, the thing that makes ‘Sonic the Hedgehog’ inherently exciting (i.e., the capacity for players to traverse labyrinthine levels at lightning speed) must be lost. What would it be replaced with? A movie that adapts a novel will (assuming the movie does its job well) compensate for material cut for the sake of practicality with some other virtue, such as stellar visual effects and art direction. How, then, would video games benefit from taking cues from movies?
The potential for interactivity sadly does not match up with the reality of the situation. Despite AAA games released in 2022 being leagues more advanced, and no doubt harder to program, than their ancestors from two decades ago, they don’t allow for more interactivity; if anything, many of them are more restrictive. This is not to say that video games, as a medium, are doomed to evolve further into Frankenstein monsters (not entirely satisfying strictly as games, but forever flawed as interactive movies), since games produced for the indie market show a good deal more promise. Sadly, as far as the video game equivalent of Hollywood productions goes, things aren’t looking too good right now. The bright side is that, as young as video games are as a medium, they also change rapidly, both in visual complexity and in design philosophy. I’m not sure what the most expensive and highest-grossing games will look like a decade from now, but I at least hope they will have learned from the current generation’s mistakes.
By Brian Collins