The gaming industry is closely monitoring the technological advancements in cloud computing and artificial intelligence, with major companies like Microsoft investing heavily in the cloud and bringing it to upcoming games such as Crackdown 3.
Electronic Arts, however, recently revealed a huge project it is working on, titled Project Atlas. It is not a new sci-fi title or even a new game; it is an integrated “game engine and services” development platform aimed at unlocking the full potential of cloud computing and artificial intelligence, or at least making noticeable progress with those technologies. Chief Technology Officer Ken Moss shared the company’s vision for the project in a post on Medium, where he discussed the capabilities Project Atlas will bring. As one example, Madden NFL commentators, instead of reading pre-recorded lines, would react to your gameplay with “contextual, real-time commentary,” giving the game “a greater level of contextual and experiential realism.”
Power Made Easy
With Project Atlas, we are starting to put the power of AI in the creative’s hands. In one example, we are using high-quality LIDAR data about real mountain ranges, passing that data through a deep neural network trained to create terrain-building algorithms, and then creating an algorithm which will be available within the platform’s development toolbox. With this AI-assisted terrain generation, designers will be able to generate, within seconds, not just a single mountain but a series of mountains and all the surrounding environment with the realism of the real world. Channeling my 15-year-old self as a burgeoning game maker, I’m especially excited about what all this means for developers large and small. And this is just one example of dozens or even hundreds where we can apply advanced technology to help game teams of all sizes scale to build bigger and more fun games.
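EA has not published how the learned terrain algorithm works, so the following is only a toy sketch of the idea described above: a single function call that stands in for a trained model and produces an entire mountain-ridge profile in one shot, rather than an artist sculpting each peak by hand. The midpoint-displacement method used here is a classic procedural technique, not EA's; the function name and parameters are hypothetical.

```python
import random

def generate_ridge(length, roughness=0.5, seed=None):
    """Toy stand-in for a learned terrain algorithm: 1-D midpoint
    displacement yields a whole ridge of peaks from one call.

    length must be a power of two; roughness controls how jagged
    the result is (smaller = smoother)."""
    rng = random.Random(seed)
    heights = [0.0] * (length + 1)
    heights[0], heights[-1] = rng.random(), rng.random()
    step, scale = length, roughness
    while step > 1:
        # Displace the midpoint of every segment, then recurse on halves.
        for i in range(0, length, step):
            mid = i + step // 2
            heights[mid] = (heights[i] + heights[i + step]) / 2 \
                + rng.uniform(-scale, scale)
        step //= 2
        scale *= roughness  # finer detail gets smaller displacement
    return heights

ridge = generate_ridge(64, seed=42)  # 65 height samples along the ridge
```

A real pipeline of the kind Moss describes would presumably replace the random displacement with a network trained on LIDAR heightmaps, but the designer-facing shape is the same: one call, a full terrain.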
Applying AI to Unlock Creativity
Leveraging AI and machine learning will also give game makers the ability to craft in-game interactions with non-playable characters, or NPCs, in a way that is virtually indistinguishable from a human interaction. So, instead of a pre-scripted, pattern-based logic for NPC behavior, this would make it possible for an NPC to engage in a way that is dynamic, contextual and absolutely believable. For example, imagine that you’re playing Madden, and you’ve just thrown your second interception of the game against the same cover 2 defense that caused the first turnover. Instead of the commentator simply stating that you threw a pick, the AI enables contextual, real-time commentary to reference the fact that you’re throwing to the sideline against a cover 2 defense and should have thrown against the weak zone over the middle to your tight end, who was open on the route. This would certainly push the game into a greater level of contextual and experiential realism. The AI is working with your gameplay. It’s responding to your needs as a player.
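The difference between pre-scripted and contextual lines is easy to see in miniature. The sketch below is purely illustrative (EA's actual commentary system is not public, and the event fields are invented): the line is assembled from the play's context rather than picked from a fixed pool of recordings.

```python
def commentary(event: dict) -> str:
    """Toy contextual commentary: build the line from the play's
    details instead of returning a canned recording."""
    if event.get("result") == "interception" and event.get("coverage") == "cover 2":
        return (
            f"That's interception number {event['count']} against Cover 2 -- "
            f"your {event['open_receiver']} was open in the weak zone "
            f"over the middle."
        )
    return "Nice play."  # generic fallback, the 'pre-scripted' case

line = commentary({
    "result": "interception",
    "coverage": "cover 2",
    "count": 2,
    "open_receiver": "tight end",
})
```

A production system would drive this from a model over the full game state rather than a handful of if-statements, but the shift it illustrates is the one Moss describes: commentary as a function of what you actually did.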
With Project Atlas, we’re now working to optimize cloud distribution of engine services to process the rendering, physics, and simulation of a game instead of being entirely constrained to the specs of a single client-side computing device. With Project Atlas, which is cloud native, we’ll have the ability to break from the limitations of individual systems. Previously, any simulation or rendering of in-game action was either limited to the processing performance of the player’s console or PC, or to a single server that interacted with your system. By harnessing the power of the cloud, players can tap into a network of many servers, dedicated to computing complex tasks, working in tandem with their own devices, to deliver things like hyper-realistic destruction within new HD games that is virtually indistinguishable from real life — we’re working to deploy that level of gaming immersion on every device.
Integrating distributed networks at the rendering level means infinite scalability from the cloud. In typical multi-player games today, game performance is a balancing act of the demands of different resources and quality constraints — memory, CPU, GPU, fidelity, resolution, and framerate. Today, the balancing act of all those different constraints generally tops out at about 100 players competing at the same time on a map of a few dozen square kilometers. But the cloud starts to erase those limitations. Thousands of players could compete on a single map hundreds or thousands of kilometers wide, in a game session that could last for days, weeks, or years and with the progression and persistence of realistic seasons and campaigns. Technical limits expand exponentially and game designers get to focus on maximizing fun.
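One common way to make a map "hundreds or thousands of kilometers wide" tractable — and a plausible reading of the scaling claim above, though EA has not described its architecture — is spatial partitioning: the world is divided into a grid of cells, and each cell's simulation is owned by its own server, so capacity grows with the number of cells rather than being capped by one machine. The function and cell size below are hypothetical.

```python
def assign_cell(x_km: float, y_km: float, cell_km: float = 10.0) -> tuple:
    """Toy spatial partition: map a player's world position to the
    grid cell (and hence the server) responsible for that region."""
    return (int(x_km // cell_km), int(y_km // cell_km))

# A 500 km x 500 km world split into 10 km cells gives 2,500 regions,
# each of which can be simulated independently; a player at (123, 456)
# is handled by cell (12, 45).
cell = assign_cell(123.0, 456.0)
```

Real deployments add hand-off and interest management at cell borders, but the core point stands: per-machine limits stop being the ceiling once the world is sharded across many servers.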
This is extremely interesting. Let us know what you think.